
/tech/ - Technology


File: d40517633f36ee9⋯.jpg (30.6 KB, 400x353, 400:353, 1496311758567.jpg)


Because it is 2018 and people have forgotten how to static HTML. All that lovely rendering code in your browser goes unused as people rewrite it in Javascript, badly; or even in C++ and then compile to web assembly, that’s all the rage now, because why stick with one layer of abstraction when you can involve three?

Oh, and of course everyone rolls their own, so that website with the dancing pigs not only has its own stack, but also one for every ad and tracking network it pulls in. It takes a whole lot of javascript to provide that transparent pixel!

The situation is no better on the server side, of course. Why write static HTML for your pageful of text when you can have a web framework pulling C++ from a SQL database, compiling it to web assembly on the fly and vomiting it forth to the browser just in time, because someone might want their flying unicorn background to have different colour sparkles, and pointing those people at a different static location is just gauche.

No, I’m not bitter, far from it. It all makes work for the working man to do.


File: bae0ef1c3fceab5⋯.png (214.61 KB, 688x409, 688:409, 179316753.png)

Because pajeet is cheaper, and because bouncy flashy round cornered rainbow boxes and funny interactive flipping sliding panels keep normies engaged even if the page has no valuable or meaningful content whatsoever.



It's because the web is no longer focused on serving documents.


Do we really need another thread about this?


File: 14300e02a9b382f⋯.png (212.74 KB, 1068x720, 89:60, 1527318364.png)



Probably because of the poor workflow that plain html pages provide.


>tfw can't download pdf forms from the government website without javascript

>tfw the form doesn't open and prompts me to download Adobe reader

>turns out that the pdf form is encoded in a way that only adobe can decode

>tfw find out a javascript from adobetdm.com is stuck into the government website

>tfw can't do basic citizenship duties without using proprietary JS and Adobe's obsolete reader


I unironically think we should bring Gopher back.

<Why are websites so shit?

Because they need all that (((telemetry))) crap to spy on you to sell your data to advertisers and governments. Gopher and static html would go against all that.


You think things are bad now? (((WebAssembly))) is going to make the whole web proprietary.


Situations like this make neo-Luddites seem sane.



Does Gopher have encryption yet?


File: bd6db7291f23479⋯.jpg (284.06 KB, 1280x720, 16:9, jim.jpg)

Speaking of shit websites. The /pol/ board owner is looking for moderators. It would be nice if /pol/ had competent moderators for a change.



Competent and JQ-heavy.



not all websites are like this

maybe you were just visiting some shit websites


File: ae9f8eebdb81286⋯.png (284.93 KB, 1024x680, 128:85, 4586890002_51f3dbd6fd_o.png)

Because people must work, because UBI is way to JAR, because all this is just foreplay either to space exploration, which will never happen until true internationalization, or total darwinian genocide of all non-rich not-a-bee citizens.

This is not about "normies"; it's about keeping people occupied.



I don't know what websites you visit. The last fad of the month is static websites hosted on GitHub Pages (basically the new GeoCities), Google Analytics as a visitor counter and Disqus as a guestbook. All made with love and detached CMSs written in python, go or other hipster languages. WYSIWYG html editors are soooo 1992; everything is done in markdown and vim these days.

But you'd better tell me why there are absolutely no imageboards that implement all their functions in pure html+css. None of them, from kusaba to lynxchan, works free of javascript at all. JS is still needed for: rendering post replies, adding second and third file upload forms, lightbox image previews, post deletion and moderation, reporting, hiding, and so on. Even fucking captcha needs javascript 90% of the time I see it.


Stop using insecure CIAnigger (((interconnected networks))).

Use Tor, i2p or virtual intranets like cjdns that respect your freedoms and right to privacy.



>But you'd better tell me why there are absolutely no imageboards that implement all their functions in pure html+css. None of them, from kusaba to lynxchan, works free of javascript at all. JS is still needed for: rendering post replies, adding second and third file upload forms, lightbox image previews, post deletion and moderation, reporting, hiding, and so on. Even fucking captcha needs javascript 90% of the time I see it.

The question is whether it is even a good idea to have imageboards run in the browser. Would it not be better to just download imageboard software and run it on your computer? Then you could just have a standard html+css page with some information about the software and some links.



>I unironically think we should bring Gopher back.

Agreed. Either that or BBS over ssh.

Telnet was a joy, in simpler times.



>lets turn the desktop into an app store where every website has its own app!

How about you kill yourself



>lets write software instead of websites



File: 5fd143e2beb4a7c⋯.png (6.61 KB, 640x400, 8:5, 167980-pimpwars-dos-screen….png)


There are a lot of BBSes around, but most of them are telnet. You could wrap that with SSL, or use ssh instead.

Gopher is pretty limited, but it's very useful if you just want to provide a "file dump" type site. It has less overhead than http and is easier to deal with than ftp. It allows a small amount of server-side scripting too. A gopher interface to git/svn/cvs repos would make a lot more sense than all the nasty html+js most "open source" sites use these days. That stuff is pretty ironic, since it basically shackles the user to web browsers that need amd64 with gigs of ram and a GPU in order to run well, whereas gopher runs fine on a 30-year-old computer or the puniest ARM SBC without a GPU blob.
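The "less overhead" claim is easy to see in the wire format itself (RFC 1436): a request is nothing but a selector followed by CRLF, and a menu comes back as tab-separated lines. A minimal sketch, with helper names made up for illustration rather than taken from any real client:

```javascript
// A gopher request is just the selector plus CRLF -- no headers, no methods.
function buildRequest(selector) {
  return selector + "\r\n";
}

// A menu line is one item-type character, then tab-separated fields:
// display string, selector, host, port.
function parseMenuLine(line) {
  const type = line[0];
  const [display, selector, host, port] = line.slice(1).split("\t");
  return { type, display, selector, host, port: Number(port) };
}
```

A real client would just open a TCP connection to port 70, write `buildRequest(selector)`, and read until the connection closes; there is no status code or content negotiation to implement.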



>lets turn websites into apps

Yes, horror indeed.


File: 59f66c8afd3b696⋯.png (74.37 KB, 320x180, 16:9, numale_lunduke.png)


I just dislike companies going all in on numale JS frameworks like Angular or React. I feel it all goes against the original vision of hypertext. Just plain old linked documents. Modern websites are nearly impossible to navigate in text browsers too.



are websites not already apps? (but in the browser for some weird reason)



That's because nobody wants to download and install some botnet onto their computer for every website they visit.

The web browser is like a file browser that browses files from the internet instead of your hard drive.



A suckless imageboard would be pretty cool desu.




>download botnet software

>download botnet website(software)

why do you think the 2nd is better than the first?


File: a49e01d0063a885⋯.png (15.46 KB, 560x225, 112:45, Untitled.png)


Try doing this to the former.



File: 99d8a3c9a0f3804⋯.png (563.38 KB, 1024x768, 4:3, cf7bb304e3439b44ecb808789c….png)


Good point. Websites are easier to filter than software.

Though the main problem is that the website/software needs to be filtered in the first place.

For good software that does not need filtering I don't see use in running it in browser, but for botnet software that needs filtering running software on the websites is probably the lesser evil, you are right in this case.


>Why are websites so shit?

It's because of the UNIX philosophy, which encourages programs with their own memory managers because C sucks and UNIX IPC sucks. If we were using Lisp machines, there would be no JVM or JavaScript because Lisp already does all those things better. It would be insulting to make a language like that. The suckless philosophy is even further in the wrong direction than the usual UNIX philosophy. Read this quote about why GNU Emacs is larger than the original PDP-10 version.

>All that lovely rendering code in your browser goes unused as people rewrite it in Javascript, badly; or even in C++ and then compile to web assembly, that’s all the rage now, because why stick with one layer of abstraction when you can involve three?

That's more UNIX bullshit. A lot of languages were made so UNIX weenies don't have to use C, such as C++, sh, awk, Java, Go, Python, JavaScript, Perl, and Ruby. They would all be useless on a Lisp machine. They all have their own memory managers, type systems, and packaging systems and everything beyond C is completely incompatible. On a Lisp machine, there's one GC and one common type system so any program can use any data given to it by any other program. You can pass a function, class, or package between programs on Lisp machines more easily than you can pass strings in these languages on UNIX. Lisp would also be much smaller because there's a lot less duplicated code.

With respect to Emacs, may I remind you that the original
version ran on ITS on a PDP-10, whose address space was 1
moby, i.e. 256 thousand 36-bit words (that's a little over 1
Mbyte). It had plenty of space to contain many large files,
and the actual program was a not-too-large fraction of that

There are many reasons why GNU Emacs is as big as it is
while its original ITS counterpart was much smaller:

- C is a horrible language in which to implement such things
as a Lisp interpreter and an interactive program. In
particular any program that wants to be careful not to crash
(and dump core) in the presence of errors has to become
bloated because it has to check everywhere. A reasonable
condition system would reduce the size of the code.

- Unix is a horrible operating system for which to write an
Emacs-like editor because it does not provide adequate
support for anything except trivial "Hello world" programs.
In particular, there is no standard good way (or even any in
many variants) to control your virtual memory sharing

- Unix presents such a poor interaction environment to users
(the various shells are pitiful) that GNU Emacs has had to
import a lot of the functionality that a minimally adequate
"shell" would provide. Many programmers at TLA never
directly interact with the shell, GNU Emacs IS their shell,
because it is the only adequate choice, and isolates them
from the various Unix (and even OS) variants.



This is some really interesting criticism of current technology. I'd like to see how the web would've ended up looking, from a technology standpoint, if Lisp machines had taken over instead of UNIX.


File: c4cd493272438f9⋯.png (305.74 KB, 640x360, 16:9, smug026.png)


I love how we have all these people from the UNIX-HATERS crowd crying about how lisp machines are dead, but not a single one of you has the technical expertise to fix this by writing a new lisp OS.



Not bad, but now you have to do everything manually, including copy & pasting quotes, post numbers etc.



>Because it is 2018 and people have forgotten how to static HTML.

You're probably young and don't remember the 1990s where all the web designers were using Dreamweaver and Frontpage. Those were the first giant doses of cancer that was injected into the Web.




the web was always shit. it just gradually gets shittier over time, and lots of time has passed. the web is like some stupid corporate idea of a platform, like flash. it has no bearing on anything. literally the only good idea in the web is static HTML with hyperlinks, but even that is ghetto as fuck. like how do you even parse that shit. for starters it would be a great step forward if you had some markup format that was actually machine-parsable, as opposed to some crap with no standard aside from one that nobody follows


this nigga knows some shit


it would have looked like AIDS because LISP is AIDS when you start using libraries from 3000 different people. you really want something with a real type system like SML but for the entire OS



as long as you remember /pol/ is now an edgy GOP forum, r/t_D part deux



So why the fuck doesn't somebody make a modern LISP machine? That one faggot's been sitting on Genera forever, maybe somebody can cattle prod his ass and we can get it released?

For that matter where's my Smalltalk machine?



But turning a well-established communication medium into a dedicated application is a good thing. Look at already existing messaging protocols like email, XMPP or Matrix: they are designed to run in dedicated clients. The web is indeed a shit way of delivering bulletin boards. Yes, telnet and ssh are shit too.

We need something like Slack, but with more asynchronicity in mind, without going full usenet. Basically, it is possible to make a simple protocol faster than HTTP that covers everything from anonymous textboards to full reddit-style megaforums and even blog comments. And if you need le peer-to-peer meme, connect it to a proxy that deals with such networks and acts as a virtual server for the software. On top of that, it would finally be possible to add encryption to forums via Axolotl, which is impossible on the web without running botnet javascript or third-party addons that are already like dedicated software themselves.


>hurr durr le original hypertext

Implement Xanadu using HTML 4.0 only. Go on, let us see.

Back in the 80's and early 90's, people from the warez and demoscene crowds used to share diskmagazines. Every one of them was a unique program, not just a boring txt file or an index like gopher. Treat the web as works of art with hypertext capabilities.



>So why the fuck doesn't somebody make a modern LISP machine?

Do you write OSes in your spare time? Are you Terry? No matter how simple whatever you're doing is, it's not simple enough if you're not shitting out boilerplate. And unfortunately people don't generally take solitary hobbies as their main hobby. Otherwise we would have a ton of hobby OSes and software projects lying around.



I threw my hat in, but chances are I won't be drawn.



We actually do have a ton of toy OSes. These toys are nothing more than proofs of concept and educational exercises. Most of them don't advance beyond that.



That doesn't matter if the end result is browsable with plain HTML browser like Lynx or Mosaic. Speaking of which, a lot of webmasters tested their shit in Lynx back then. And for every bozo who used frontpage extensions or dreamweaver, there was another guy doing it by hand and putting "made in vi" button on his site. You don't see that much anymore now. There was also the "any browser" campaign that had a lot of traction. You don't see that stuff anymore because the poz has taken over almost every part of the web.

Hell even the javascript back then was mostly used to implement non-essential fluff like animations, but the page still worked without them. And the few pages I remember having trouble with in the late 90's due to hardcoded javascript links (as opposed to using normal href tags) were easily dealt with by looking at "page source" and copy/pasting the javascript link url into browser's url bar. So even the few broken pages back then could be dealt with easily, if you really needed to.

These days shit's all obfuscated with url generated at runtime, so no such luck to have an easy workaround. Plus there's just too many sites that do that shit these days, whereas back then most at least gave you a graceful plain html fallback mode if they even used javascript at all.



Sure, but the main complaint in those days was that it was no longer possible for a *human* to know what the code was doing. We've come a long way.



daily reminder that you faggots actually care more about features than whether something is a bloated piece of shit, which is why you are posting using a 20-million-line kernel, on a CPU with billions of transistors, with a web browser millions of lines long, on a shitty php image board.



/tech/ status:

you didn't have to be so mean anon



this is a complete and total straw man

it's like saying you can't care about the environment because you drive a car



People actually do care more about getting to work than they care about the environment. People don't actually give a big enough shit about the world to change how they act. At best they recycle and post about it on facebook.



Don't defend your worthless shitpost, just kill yourself instead.



You are just like those supposed environmentalists who complain about muh pollution all day while they drive around in their private cars and consume hundreds of gallons of water every day with their hot showers. It's all about the bitching and not the cause.



>going full retard

OK anon



bitch all day, glug down that soy milk, never change



I 100% agree with you.



The Lisp systems language was better than JavaScript is now and it had "scripting language" features like closures and hash tables built in, but Lisp machines are not the only thing that's better than UNIX. UNIX "has caused huge redundancies between software packages" by leaving out important features. The duplicate GCs and object systems in scripting languages are a small part of the huge redundancy in UNIX. The UNIX Haters figured all this out more than 20 years ago.


Lisp machines have a CPU designed to run Lisp, with tag checking built in to speed up dynamic typing and GC. On other hardware Lisp is slower and needs a lot more code. Remaking a Lisp machine is hard because we're not using the right hardware.



It would work with SML and Smalltalk.

This explains how the Lisp machines work.


Once one strips away the cryptology, the issue is control.
UNIX is an operating system that offers the promise of
ultimate user control (ie: no OS engineer's going to take
<feature> away from ME!), which was a good thing in its
infancy, less good now, where the idiom has caused huge
redundancies between software packages. How many B*Tree
packages do we NEED? I think that I learned factoring in
high school; and that certain file idioms are agreed to in
the industry as Good Ideas. So why not support certain
common denominators in the OS?

Just because you CAN do something in user programs does not
mean it's a terribly good idea to enforce it as policy. If
society ran the same way UNIX does, everyone who owned a car
would be forced to refine their own gasoline from barrels of


Lisp is too expensive, and hardware companies cannot be trusted in this age to create reliable and secure Lisp machines.



Because lazy programming.




Because neither HTML, CSS, JavaScript nor the web in general was designed to be a GUI toolkit/platform.



Welcome to CY+3. Only now can we run functional languages with bearable performance.



That's fearmongering. Lisp machine microcode could be changed by the user.


They're lazy because the systems programmers are lazy. Nobody cares if things work when the OS doesn't work. That way of thinking goes all the way from the kernel to the shell scripts.


You're missing the point. The web has huge redundancy and bloat because it does everything the OS does (or should do). Every programming language has to do the same things from scratch because the OS doesn't do it. Lisp machines have hash tables, closures, bignums, strings, arrays, exceptions, OOP, streams, and GC all there already and it's standard and every program can use it. You can pass a closure or hash table from one program to another without any kind of serialization or other bullshit. A Lisp machine has a systems language with the high level of scripting languages. I don't think everything should use GC and most OSes that I consider good don't have GC, but on the desktop we're using Python and JS so we already depend on the GC. Lisp machines are just taking away the bloat and redundancy that we don't need and making these features more powerful.

Up until recently, we owned everything from the
hardware to the microcode to the applications. We
could fix anything that broke at any level; we could
evolve wonderful new systems. How do we "fix" the X11
releases or the SMTP protocol or SunRPC??

In my opinion, things got the way they are because
market forces completely overwhelmed technological
forces. Because UNIX was free (or nominally licensed)
it came into wide use, first in CS and EE departments
and later in the world. To some, moving from MS-DOS or
worse, it seemed like a win. To those of us who have
been around for a while and are aware of the
alternatives, it seemed like a nightmare. We thought
it would go away when users came to their senses. We
were naive. Sigh. Meanwhile, thanks to BSD, UNIX grew
like Topsy, or more like barnacles encrusting a sunken
ship. Ultimately, UNIX began to be viewed by decision
makers who were not technically competent as a panacea
for competing technologies.
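The serialization point above is easy to see even in the browser's own language. This is just a toy illustration of the claim, not a Lisp machine demo: JSON, the web's usual interchange format, silently drops anything with behavior attached, so only flat data survives the trip between programs.

```javascript
// An object carrying both data and behavior.
const counter = { n: 1, next: () => 2 };

// Serializing for "the wire": JSON.stringify omits function-valued
// properties entirely, so the closure is simply gone.
const overTheWire = JSON.stringify(counter);

// The receiving side gets the data back, but not the behavior.
const received = JSON.parse(overTheWire);
```

On a system with one shared object model and GC, `counter` could be handed to another program as-is; here the round trip quietly strips it down to `{"n":1}`.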



>And the few pages I remember having trouble with in the late 90's due to hardcoded javascript links (as opposed to using normal href tags) were easily dealt with by looking at "page source" and copy/pasting the javascript link url into browser's url bar. So even the few broken pages back then could be dealt with easily, if you really needed to.

What I've been working on is a bunch of Greasemonkey scripts that fix all these sorts of javascript pages (greasemonkey scripts will work even with javascript.enabled = 0). For instance, I have one that removes all opacity=0, hidden=true, display:none from every image on the page, to deal with pages such as these: http://www.brokenbrowser.com/revealing-the-content-of-the-address-bar-ie/

Another one detects youtube embeds on the page (iframes, which I block with umatrix) and converts them to links to the video. Another one does the same thing with embed.ly frames, converting them to an img tag. I have other site-specific ones that convert redirect tracking links into direct links.

One script I'm planning on making will create a video tag using the direct media source from youtube-dl. I'm going to transpile it from python to javascript. I would have done it already, but I ran into trouble because the functions in youtube-dl for retrieving the media url are too tightly coupled with the rest of the program, and the files containing them import modules, directly or transitively, that can't be transpiled or that are inappropriate in a browser context (such as the os module). So I have to write a script to crawl the import tree to retrieve just the necessary code for these functions. But hopefully when it's done it will mean all the video sites youtube-dl supports can be used without javascript. I'll probably clean these up and start putting them up on gitgud or something.
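The core of an un-hiding script like the first one described can be sketched in a few lines. The helper name and the exact property list here are my own guesses at what such a script would cover, not the anon's actual code:

```javascript
// Strip the inline style declarations commonly used to hide images.
function stripHidingStyles(styleText) {
  return styleText
    .split(";")
    .map(s => s.trim())
    .filter(s =>
      s && !/^(opacity\s*:\s*0|display\s*:\s*none|visibility\s*:\s*hidden)$/i.test(s)
    )
    .join("; ");
}

// In a browser, apply it to every image and drop the hidden attribute too.
if (typeof document !== "undefined") {
  for (const img of document.querySelectorAll("img")) {
    img.removeAttribute("hidden");
    img.setAttribute("style", stripHidingStyles(img.getAttribute("style") || ""));
  }
}
```

Stripping inline styles this way won't defeat hiding done via external stylesheets; a fuller version would also overwrite `style.opacity` and friends directly.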



Because cucks are coders now.

Double Trips.




hahahahaha, you think the commie MIT crew is new?



Web pages weren't as shiny before Web 2.0, but they were functional and they flew. If you load a website with Javascript disabled and it's a blank slate, you're a shit web developer and part of the problem. Web browsers should be simple programs, and they were at one point in time. The majority of what I see out there is like looking at 400lbs whales that they show on TLC and people are happy and patting themselves on the back for needing 2GB to display a page.




>rendering post replies


>adding second, third file upload forms

You can upload multiple files with a single file input (the multiple attribute)

>lightbox image previews


>post deletion

not implemented yet

>and moderation





not implemented yet


on the form with JS, on a separate page without JS



Can't you make a Lisp machine with an FPGA board?




Choose one and only one



>on a separate page

Absolute state of niggerware in current year.

Is it so hard to put the captcha iframe right under the reply box?

>written in Django

oh my. here goes the blazing speed



>javascript isn't a programming language



>everyone who owned a car would be forced to refine their own gasoline from barrels of


That would be an improvement. It would make the actual car operators masters of their craft and weed out the retards on the roads.



Because that breaks the workflow. The captcha is an extension; it cannot add iframes to the post form without JS.


JavaScript ruined everything



We had websites made in Flash and Java before js/html5 took over.



Java wasn't used much, and the flash stuff wasn't nearly as widespread as the JS/HTML5 crap today. If a site used that shit, I could pretty much just write it off and not care. I even used non-JS browser for my credit union and ebay around 1999-2003. I would turn off JS in Netscape anyway because it crashed so much with that on. But I could pretty much browse most places in Lynx those days and it was comfy.



Dreamweaver had some pretty neat ways of generating clean static webpages with templates, though. And Dreamweaver was a 'dream' compared to anything FrontPage rendered.



>browse most places in Lynx those days and it was comfy.

Sounds very suspicious, like something a terrorist would do. I'm going to have to bring you into the station for a chat.




>Desktop App

>In Standard HTML+CSS

Basically a web browser, minus javascript, minus the url bar, for each app. Servers will serve this standard html to your 'desktop app' via http because it's standard and widespread. What should we call this desktop app? I'm going to call it a 'single-purpose webbrowser', but you can just call it electron.
