7 Comments

To compete with Apple and Microsoft, you'd have to do what Apple does: build your own hardware.

author

Yep, if it wasn't clear from what I said at the end, I believe that fully eliminating the monopoly on the home computer will require a new open hardware stack that is drastically simplified and does not require millions of lines of code to realistically support.


Well, when you do it, you will have a customer here.

What you said about big corporations liking complexity in hardware and software reminded me of something someone once told me about government regulations: the more complex they are, the more big corporations like them, because they can afford the lawyers and accountants to deal with them, while small businesses can't.

author

100% agreed. The best friend of a large corporation is a large government, and vice versa. A large corporation will gladly use a large state to exercise power over competitors and gain an unearned advantage in the market, and a large state will gladly use large corporations as the means by which it exerts force. The idea that the state protects the equality of rights for all, irrespective of income, status, class, and many other factors, is the kind of utopian, fantastical propaganda that only government schools could invent.


This was an amazing article. I was planning on writing something about the centralizing effect of unnecessary complexity in software, but I doubt I could write it half as well as this. Some projects I've come across that give me hope for fixing this: George Hotz's tinygrad, the QBE compiler backend, and Rasmus Andersson's Playbit project.

Looking really far out, I believe the barriers to entry in building custom hardware are falling. With the creation of ergonomic HDLs like Amaranth and better tooling like Andreas Olofsson's SiliconCompiler project (https://github.com/siliconcompiler/siliconcompiler/), designing custom chips is becoming easier and easier; I feel the last major barrier is bringing down the cost of chip fabrication. But this is just very far out stuff I hope comes to fruition.
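To give a sense of what I mean by ergonomic, here's a rough sketch of what a tiny Amaranth design can look like: a free-running counter whose top bit blinks an LED. This is from memory and meant purely as an illustration, so treat the exact imports and names as approximate rather than authoritative.

```python
# Rough Amaranth sketch (illustrative; check the current docs for exact APIs):
# a free-running counter whose most significant bit drives an LED.
from amaranth.hdl import Elaboratable, Module, Signal


class Blinker(Elaboratable):
    def __init__(self, width=24):
        self.counter = Signal(width)  # free-running counter register
        self.led = Signal()           # output driven by the counter's top bit

    def elaborate(self, platform):
        m = Module()
        # Increment the counter every clock cycle (synchronous domain).
        m.d.sync += self.counter.eq(self.counter + 1)
        # Drive the LED from the most significant bit (combinational domain).
        m.d.comb += self.led.eq(self.counter[-1])
        return m
```

As far as I know, a class like this can then be simulated in plain Python or converted to Verilog for the open-source toolchains, which is exactly the kind of low barrier to entry I hope spreads to the rest of the hardware stack.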

I'd love to subscribe to your substack but I'm a broke Nigerian undergrad 😅.


Great article.

I would add that, beyond the OS, there are five social phenomena right now that have caused technology to become highly centralized. I've been considering fleshing these out into a longer piece.

1. The transition away from ownership.

Right now, the popular consensus is that it is too complicated and too risky to manage your own compute stack, media in particular: the emotional cost of losing childhood photos, wedding photos, and so on is high. Corporate clouds promise redundancy and availability that consumer products cannot match. The Drobo server is one of the few products that tries, but it is still far from the dumbed-down, hands-off user experience that big tech has cultivated. Keeping complexity alive in the computing stack, as you note, is how people came to accept this as the only path forward. More broadly, American culture, once obsessed with self-sufficiency, is now more than happy to trade personal liberty for convenience. I discuss this shift in this post: https://subdivided.substack.com/p/give-me-liberty-or-give-me-lattes

2. Management of computing infrastructure

Back in my childhood, and I'm sure yours, there were no plug-and-play solutions for infrastructure. Building a video game server used to be an exercise in self-reliance: the server code was distributed freely, as was the case with the Valve classics. I learned Linux by running Counter-Strike: Source servers, setting up Ventrilo servers, hosting web forums, and so on. Now the newest multiplayer games handle all the infrastructure for you. Fortnite, Apex Legends, Valorant: they run the servers (because they want to centralize and profit off personalization). Chat and forums are outsourced to Discord. I worry that Minecraft is the last still-relevant game of that era that encourages ownership and agency.

Even the enterprise computing stack is largely centralized on cloud providers. People who would otherwise have learned how to set up bare-metal networking, hypervisors, and disk management are instead directed to write Terraform: the most prosaic form of automation, where you simply tell Amazon or Google or Microsoft to provision resources for you.

3. The Appleization of the Computing Experience

Highly curated App Stores are favored over the open Internet. Every edge in computing is now rounded off for a "safe" experience. Indeed, Apple has been able to smuggle in anti-user practices under the guise of security. Code signing, for instance, makes distributing an indie application to normal users infinitely more annoying, contributing to (1) and (2). No more chancing it on LimeWire or BitTorrent, or any of the other things that used to be common a decade ago. Now your computer cannot hurt you; as a result, you cannot use all of its capabilities. Windows is where everyone who wanted a more holistic computing platform would go, but releases 10 and 11 look like poor imitations of Apple's design.

4. Creating impenetrable black boxes

The iPad, to me, epitomizes your point about "[..] absurdly complex black boxes produced by major corporations in order to control and dominate the industry." The iPad is hardly a computer. Rather, it is a window into something that looks and feels like a computer. Black boxes are good; indeed, it is arguable that the black box is the ideal form of technology. But when it is made impossible to peel back those layers of abstraction, you are prevented from learning anything about the device beyond what is exposed to the user. It is a bummer that the iPad is what Generation Alpha understands as a computer. It is a computing platform which, by and large, is used solely for consumption.

5. The decline in unbridled personalization

Computing in the 1990s and 2000s encouraged a degree of personalization that is now a distant memory. Myspace and Neopets, for instance, pushed users to learn HTML/JS to express themselves to the fullest of their abilities. Instagram, Twitter, and Facebook, on the other hand, are highly constrained interfaces by design. Expression is limited, and so is the learning that used to come with it. Personalization was not just an expression of individuality; it was also an avenue for learning and innovation.


Very nice article
