Forgetting What Fast Means

In all of this "progress" we have developers, mired in their workday, compiling and deploying atop all of this modern hardware and bloated software. The time between making a change and seeing it work keeps growing, and it is brutally slow. Slow enough to leave room for distraction and loss of flow.

A lot of young developers come into the field knowing that computers are fast, but perhaps not understanding how fast they actually are, since the scale of modern computation relative to human thought is impossible to grasp without some reference. FOSS marketing employs a lot of modern advertising techniques: characterizing something as "blazing fast" is common among real and bullshit software projects alike. No one without context can square the advertising promises with real-world, non-trivial examples. Your mileage may vary.

So let's come up with a way to understand why stuff is, by design, inherently slower than it could be.

[Graph of CPU speed over last 2 decades]
[Graph of hard drive speed]
[Graph of frontend load speeds]
[Graph of internet transfer speeds]
[Graph of framework adoption]

In the past 2 decades, CPU speed has increased ??? times.

If anything, that speed may have bought us wider adoption. There is something to be said for starting out inside a framework as a beginner. But those technologies for babies now dominate the internet, and their proliferation is making the software community dumber.

Language within a language

So let's imagine a programming language we use to build a web browser. Let's call it C++. C++ has concepts like loops and variables and static types, compiles to actual machine code that runs on a real machine, and is "fast," or as fast as it gets. Now, inside that web browser compiled from fast C++ code, you write another language with the same concepts: loops, variables, types. But this language does more guessing about what a type is, and if it guesses wrong it tries again. It also automatically cleans up all the memory those variables take, so your computer doesn't crash. And all of it runs in a software abstraction called a virtual machine, on top of your actual, fast machine. It does this to create a security boundary: to keep the code within a certain range of memory and a certain set of operating system calls. Let's call this language Javascript.
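To make the guessing concrete, here's a toy sketch in C++ of what a dynamically typed value costs under the hood. This is not how V8 or any real engine works, just the minimal bookkeeping: a type tag that has to be checked at run time, on every operation, before any actual work happens.

```cpp
#include <cstdio>
#include <string>

// A toy dynamic value: a type tag plus storage for every possibility.
struct Value {
    enum class Tag { Number, Text } tag;
    double num{};
    std::string str;
};

// Even "a + b" needs runtime branches to discover what the types are.
// On a mismatch it "guesses again" by coercing both sides to text,
// roughly what Javascript does when you write 2 + "2".
Value add(const Value& a, const Value& b) {
    if (a.tag == Value::Tag::Number && b.tag == Value::Tag::Number)
        return {Value::Tag::Number, a.num + b.num, {}};
    auto text = [](const Value& v) {
        if (v.tag == Value::Tag::Text) return v.str;
        char buf[32];
        std::snprintf(buf, sizeof buf, "%g", v.num);
        return std::string(buf);
    };
    return {Value::Tag::Text, 0.0, text(a) + text(b)};
}

int main() {
    Value x{Value::Tag::Number, 2.0, {}};
    Value y{Value::Tag::Text, 0.0, "2"};
    std::printf("%s\n", add(x, y).str.c_str()); // prints "22"
}
```

The C++ compiler knows a double is a double at compile time and emits a single add instruction. The language inside the language carries that tag around and branches on it forever, unless a just-in-time compiler can prove the check away.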

This language has developer conveniences that, the theory goes, allow for more adoption of programming and thus more programs in the world. And that's certainly been the case. We've seen that the true cost, a loss in performance, bought a gain in popularity. Somewhere down the line, it was decided that memory management was the main blocker to productivity amongst new programmers. And types were also too complicated. So we have lots of code papering over those concepts in what are called "scripting languages."

These conveniences became popular on the server (or systems) side as well, leading some to wonder if they could write an operating system in their scripting language of choice. Spoiler: they could not. And the reason why not comes down to the tradeoffs those conveniences carry. Are automatic memory management and types determined at run time the right conveniences, and are they worth the tradeoffs? The answer, after this thirty-odd-year experiment, is probably no, because the downside has no final surface area. The downside is people who build on technology now unable to think in a way fundamental to computing. They can't think in terms of data types or memory management. Let's build software anyway, and rest billion-dollar companies on that software. What could go wrong?
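For contrast, here's what "thinking in memory" looks like in the fast language. A minimal sketch, nothing fancy, just the discipline the scripting languages decided we could skip: ownership is explicit, and cleanup happens at a known time, at a known cost.

```cpp
#include <vector>

// Ownership is explicit: this function owns `samples`, full stop.
void process_block() {
    std::vector<float> samples(48'000); // one allocation, known size, known type
    // ... fill and use samples ...
}   // freed right here, deterministically; no collector decides when

int main() {
    for (int i = 0; i < 1000; ++i)
        process_block(); // steady cost per call, no GC pause lurking
}
```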

The Web made it worse

Starting with templating languages on the server side, we added another layer to the mix. No longer happy with munging strings together, we needed to express HTML using another language, this one again with loops and variables and whatever other abstractions its designers wanted. But that had to be parsed and executed as well, and things slowed down again. You're doing work to interpret a template, which is then interpreted by a virtual machine, which finally runs code in C++ or another systems language without automatic memory management and dynamic types. Something that can actually talk to a computer.
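Here's a toy version of that templating layer in C++, with a made-up {{name}} syntax. The point isn't the syntax; it's that every render re-walks the template text before a single byte of HTML exists, and in production that scan runs inside the virtual machine, not in compiled code like this.

```cpp
#include <cstdio>
#include <map>
#include <string>

// Replace every {{key}} in the template with its value. A real template
// engine does this plus conditionals, loops, escaping, and inheritance.
std::string render(const std::string& tmpl,
                   const std::map<std::string, std::string>& vars) {
    std::string out;
    size_t i = 0;
    while (i < tmpl.size()) {
        size_t open = tmpl.find("{{", i);
        if (open == std::string::npos) { out += tmpl.substr(i); break; }
        size_t close = tmpl.find("}}", open);
        if (close == std::string::npos) { out += tmpl.substr(i); break; }
        out += tmpl.substr(i, open - i);
        std::string key = tmpl.substr(open + 2, close - open - 2);
        auto it = vars.find(key);
        if (it != vars.end()) out += it->second;
        i = close + 2;
    }
    return out;
}

int main() {
    std::string page = render("<h1>Hello, {{user}}</h1>", {{"user", "Ada"}});
    std::printf("%s\n", page.c_str()); // <h1>Hello, Ada</h1>
}
```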

Three levels of languages, all needing to be built. Shrinking back, we decided caching everything would help: removing that last language parse and replacing it with those strings munged together, buying back speed at the cost of another layer of complexity. Cache invalidation. And what else could we do? If you start from a scripting language, and thus haven't learned why they're slow (like you're doing now), your imagination of what to do is limited. You reach for complexity instead of rewriting in something that doesn't throttle on garbage collection.
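A sketch of that trade, with hypothetical names: the cache makes the read path fast by skipping the template work entirely, and in exchange every write path in the whole application inherits the job of knowing which cached pages just went stale.

```cpp
#include <cstdio>
#include <map>
#include <string>

std::map<std::string, std::string> page_cache;

// Stand-in for the expensive template/VM pipeline described above.
std::string render_page(const std::string& url) {
    return "<h1>rendered: " + url + "</h1>";
}

std::string get_page(const std::string& url) {
    auto it = page_cache.find(url);
    if (it != page_cache.end()) return it->second; // fast: pre-munged string
    return page_cache[url] = render_page(url);     // slow: full render
}

// The new complexity: every code path that changes data has to remember to
// call this, for every page that data might appear on. Miss one and users
// see stale pages; clear too much and the speed win evaporates.
void invalidate(const std::string& url) {
    page_cache.erase(url);
}

int main() {
    std::printf("%s\n", get_page("/home").c_str()); // render + fill cache
    std::printf("%s\n", get_page("/home").c_str()); // served from cache
    invalidate("/home");                            // data changed somewhere
}
```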

Why does it matter?

In game development, the why is a lot clearer. As we code, the amount of stuff a computer has to do before the frame can render goes up. If we code too much, the frame rate goes down. If it goes down too far, the game is lame. In web development, we have a network roundtrip that often obscures the speed of the code the developer has written on either side of the wire. This "slack," granted through years of low expectations of internet speed, gave rise to the idea that a language like Javascript was good enough for what it was supposed to do. It gave rise to the idea that a language like Perl or PHP was fast enough if it could be parsed and run in less time than it takes to get out of a typical network or render to a typical screen.
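The game loop makes the budget impossible to ignore. A bare-bones sketch: at 60 frames per second you get roughly 16.7 milliseconds for everything, and any code you add spends from that same account.

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    const auto budget = std::chrono::microseconds(16'667); // ~60 fps

    for (int frame = 0; frame < 3; ++frame) {
        auto start = clock::now();

        // update(); render(); -- all the work this frame, however much
        // the codebase has accumulated, must finish inside the budget.

        auto elapsed = clock::now() - start;
        if (elapsed < budget)
            std::this_thread::sleep_for(budget - elapsed); // made it: 60 fps
        else
            std::puts("overran the budget: frame rate drops, game is lame");
    }
}
```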

The time humans will wait to see the result of an action seems only to be going down. Generally, somewhere under 1 or 2 seconds between action and first paint, or the time it takes for the interface to update, is where humans land. But of course, that threshold isn't universal at all, because if we were playing a video game and the man jumped a second after we pushed the button, we'd throw the fucking game out. It matters because, as a matter of fun and interaction, fast/snappy user experiences win. If you tie what web-based game developers do to what web-based marketing and enterprise software expects, well, the game developers are going to suffer the consequences.

So the question of what the web becomes over time is answered by the technology's limitations and by the expectations of users who just want to read their spreadsheets in peace. The window of possibility on the world's biggest connected network closes before it ever really opened. The window that decides whether it becomes the slightly better version of Microsoft Word, just another place to fill out forms, or something more, emotional, for lack of a better word.

Maybe you think the Web is emotional enough, and it's true, the limitations of comment systems do lead the human mind down a certain road. But perhaps I'm thinking about something more akin to what VR and video games bring. Or movies. Something more like what I imagined it could be as a teenager. And for something to be more involved,