6 Comments

To compete with Apple and Microsoft, you'd have to do what Apple does: build your own hardware.

Great article.

I would add that there are five social phenomena right now, beyond the OS, that have caused technology to become highly centralized. I've been considering fleshing these out into a longer piece.

1. The transition away from ownership.

Right now, the popular consensus is that it is too complicated and too risky to manage your own compute stack, media in particular: the emotional cost of losing childhood photos, wedding photos, and so on is high. Corporate clouds promise redundancy and availability that consumer products cannot match. The Drobo server is one of the few products that tries, but it is still far from the dumbed-down, hands-off user experience that big tech has cultivated. Keeping complexity alive in the computing stack, as you note, is how people came to accept this as the only path forward. More broadly, American culture, once obsessed with self-sufficiency, is now more than happy to trade personal liberty for convenience. I discuss this shift at greater length in this post: https://subdivided.substack.com/p/give-me-liberty-or-give-me-lattes

2. Management of computing infrastructure

Back in my childhood, and I'm sure yours, there were no plug-and-play solutions for infrastructure. Building a video game server used to be an exercise in self-reliance: the server code was distributed freely, as was the case with the Valve classics. I learned Linux by running Counter-Strike: Source servers, setting up Ventrilo servers, hosting web forums, and so on. Now the newest multiplayer games handle all the infrastructure for you. Fortnite, Apex Legends, Valorant -- all the new games run their own servers (because they want to centralize and profit off personalization). Chat and forums are outsourced to Discord. I worry that Minecraft is the last still-relevant game of that era that encourages ownership and agency.

Even the enterprise computing stack is largely centralized on cloud providers. People who would otherwise have learned how to set up a bare-metal networking stack, hypervisors, and disk management are instead directed to write Terraform -- the most prosaic form of automation, where you simply direct Amazon or Google or Microsoft to provision resources for you.
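To illustrate how prosaic that is, here is a minimal Terraform sketch (the bucket name is made up for illustration): there are no disks, filesystems, or machines to understand, just a declaration that Amazon should create a named storage bucket for you.

```hcl
# Hypothetical sketch: "provisioning" reduced to declaring resources
# that a cloud provider creates on your behalf.
provider "aws" {
  region = "us-east-1"
}

# No hardware, no RAID, no filesystem -- just a named bucket.
# The bucket name below is a made-up example.
resource "aws_s3_bucket" "family_photos" {
  bucket = "example-family-photos"
}
```

Running `terraform apply` against this hands the entire storage problem to AWS; nothing about how the data is actually stored is ever exposed to the operator.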

3. The Appleization of the Computing Experience

Highly curated app stores are favored over the open Internet. Every edge in computing is now rounded off for a "safe" experience. Indeed, Apple has been able to smuggle anti-user practices in under the guise of security. Code signing, for instance, makes distributing an indie application to normal users infinitely more annoying, contributing to (1) and (2). No more chancing it on LimeWire or BitTorrent, or any of the other things that were common a decade ago. Now your computer cannot hurt you; as a result, you cannot use all of its capabilities. Windows used to be where everyone who wanted a more holistic computing platform would go, but releases 10 and 11 look like poor imitations of Apple's design.

4. Creating impenetrable black boxes

The iPad, to me, epitomizes your point about "[..] absurdly complex black boxes produced by major corporations in order to control and dominate the industry." The iPad is hardly a computer. Rather, it is a window into something that looks and feels like a computer. Black boxes are good; it is arguable that the black box is the ideal form of technology. But when it is made impossible to peel back those layers of abstraction, users cannot learn anything about the device beyond what is exposed to them. It is a bummer that the iPad is what Generation Alpha understands as a computer: a computing platform which, by and large, is used solely for consumption.

5. The decline in unbridled personalization

Computing in the 1990s and 2000s encouraged a kind of personalization that is now a distant memory. MySpace and Neopets, for instance, encouraged users to learn HTML and JavaScript to express themselves to the fullest of their abilities. Instagram, Twitter, and Facebook, on the other hand, are highly constrained interfaces by design. Expression is limited, and so is the learning that used to come with it. Personalization was not just an expression of individuality; it was also an avenue for learning and innovation.

Very nice article