Building Infrastructure in the Enshittocene

This post provides big-picture context for future posts and will evolve over time. An occasional post full of rants should be fine.

We are in a never-ending fight against the second law of thermodynamics, the natural tendency toward increasing entropy and disorder in a closed system. The corporate world has its own tendency, Enshittification, where something enriching and positive for people evolves into something destructive and unpleasant: power and greed taking the place of dreams and real innovation. Humanity today, circa 2025, is entering a new geological epoch dominated by human-caused destruction and an environment unpleasant for life: the Enshittocene [1], or the even less flattering Anthropocene.

Stepping back for now from the real global problem: how can we ever build and keep nice small things, such as this blog? What are the key principles for building things with software that last longer and are more resilient? How do we fight impermanence?

Oldest known cave painting, about 40,000 years old, at Petta-kere, South Sulawesi. Photo by Sanjay P.K. under CC license.


Considerations

The following are considerations related to impermanence.

Informed Abstraction

Abstraction is powerful, but it can also make systems more fragile and bloated, and it encourages ignorance about inner workings.

Questions to ask: Do I understand at least the general architecture and the basic processes running behind the scenes? Can I get more information easily? Who controls the project and technology? Are there ways to reduce my reliance or even eliminate this dependency?

Avoid Lock-In

All kinds of lock-in contribute to Enshittification and lead to impermanence of solutions, and ultimately of happiness. Convenience is nice, but it's not worth it in the end. Always have an escape strategy.

Questions to ask: Do I own my data? How easy and costly might it be to get my data out? Do I really own my hardware? How quickly can I adapt if needed? Can I take measures to increase control? Is it really worth the trouble?
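One concrete escape hatch against lock-in is keeping your data exportable in open formats at all times. A minimal Python sketch, where the `notes` records and the `export` directory are hypothetical stand-ins for whatever data a vendor holds for you:

```python
import csv
import json
from pathlib import Path

def export_records(records, outdir):
    """Write the same records to both JSON and CSV so that no single
    tool or vendor is needed to read them back later."""
    outdir = Path(outdir)
    outdir.mkdir(parents=True, exist_ok=True)
    (outdir / "records.json").write_text(json.dumps(records, indent=2))
    with open(outdir / "records.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=records[0].keys())
        writer.writeheader()
        writer.writerows(records)

# Hypothetical data standing in for your real records.
notes = [
    {"id": 1, "title": "groceries", "body": "eggs, milk"},
    {"id": 2, "title": "ideas", "body": "write blog post"},
]
export_records(notes, "export")
```

Running an export like this regularly (and versioning the result) turns "how costly is it to get my data out?" into a question you already know the answer to.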

Avoid License or Patent Risk

Nothing can be more disruptive and costly than licensing and patent issues. Unfortunately, they are a risk even for selfless non-profit activities. Checking licensing and copyright is a good habit.

Questions to ask: Does my work rely on patents, licenses, or copyrighted material? Who owns the patents? What do the licenses allow me to do? What are my responsibilities? Is it really worth it?
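Part of that checking habit can be automated. A minimal Python sketch that lists installed Python packages whose declared license metadata does not mention a known free-license family, using only the standard library; the `ALLOWED` set is an assumption to adapt to your own policy, not a legal judgment:

```python
from importlib import metadata

# License families treated as acceptable in this sketch -- adjust to taste.
ALLOWED = {"MIT", "BSD", "Apache", "ISC", "MPL", "LGPL", "GPL", "PSF"}

def suspicious_licenses():
    """Flag installed packages whose license metadata mentions none of
    the allowed license families (or declares no license at all)."""
    flagged = []
    for dist in metadata.distributions():
        lic = dist.metadata.get("License") or ""
        classifiers = [c for c in (dist.metadata.get_all("Classifier") or [])
                       if c.startswith("License ::")]
        text = " ".join([lic] + classifiers)
        if not any(name in text for name in ALLOWED):
            flagged.append((dist.metadata.get("Name", "?"), lic or "unknown"))
    return flagged

for name, lic in suspicious_licenses():
    print(f"{name}: {lic}")
```

This only inspects self-declared metadata, so it is a first filter, not a substitute for reading the actual license texts.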

Solutions and Practices

The following are smaller or bigger elements of solutions or helpful starting points for acting on the considerations in the previous section.

Prefer technologies with good documentation

Documentation should start with at least one diagram of the architecture, clearly differentiate the technology from similar ones, be extensive, and be kept up to date.

Free software licenses

It is important to first find out who owns a technology or project, the licensing policies for open-source contributions, and the track record of the project's leadership or owner. For example, some companies take advantage of open source without giving back, or stop giving back to the community. In such cases, strongly consider using a community fork. Avoid OSS projects requiring a CLA (Contributor License Agreement).

Some examples: Debian over Ubuntu, Rocky Linux over RHEL/CentOS, Codeberg over GitHub, the Forgejo hard fork over Gitea Enterprise (and Gitea over Gogs), the Incus hard fork over LXD, OpenTofu over Terraform. The list goes on and on.

It's tough to stay ahead of each Enshittification wave, but it helps to avoid corporations and to transition to community forks in a timely manner, in the early days while the move is still as seamless as possible.

Follow Hobbyists, Academia and Retirees

Hobbyists, academics, and retirees are more likely to spend their efforts driven by an interest in progress and contribution. That is unlikely to last forever, but it is better than entities seeking fame and profit. Groups of such people are even better for continuity.

Everything under Version Control

Everything should be under version control. It preserves the value of all efforts made over time, avoids reinventing the wheel, enables automation, and, of course, is a great safeguard against losing work.
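Putting a directory under version control can itself be automated, so that snapshots happen without relying on discipline. A minimal sketch driving git from Python's subprocess module; the commit identity and message are placeholders, not a convention from this blog:

```python
import subprocess
from pathlib import Path

def ensure_versioned(path):
    """Initialize a git repository in `path` if none exists, then
    commit the current contents so nothing gets lost silently."""
    path = Path(path)
    if not (path / ".git").exists():
        subprocess.run(["git", "init"], cwd=path, check=True)
    subprocess.run(["git", "add", "-A"], cwd=path, check=True)
    # Commit only when something actually changed.
    status = subprocess.run(["git", "status", "--porcelain"],
                            cwd=path, capture_output=True, text=True,
                            check=True)
    if status.stdout.strip():
        # Placeholder identity so the sketch works without global config.
        subprocess.run(["git", "-c", "user.name=snapshot",
                        "-c", "user.email=snapshot@localhost",
                        "commit", "-m", "snapshot"],
                       cwd=path, check=True)
```

Run from cron or a timer against your notes, configs, and documents directories, this gives the "everything under version control" habit a safety net.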


Local-First Applications

Applications should be designed following the idea of Local-First [2]. Keeping things close to the user, and always under the user's control, is essentially what we had before the Internet and the cloud. The Internet and the cloud can support this model through asynchronous syncing of data and by offloading computing to the cloud without making it a single point of failure. The rise of the cloud led to the extreme of doing everything in the cloud, which was convenient but came at the great cost of losing control. We can do much better now by using both local and cloud.
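The core of the local-first idea can be boiled down to: the local write is the source of truth, and syncing to a server is a separate, failure-tolerant step. A toy Python sketch of that shape (my illustration, not the paper authors' implementation):

```python
import json
from pathlib import Path

class LocalFirstStore:
    """Toy local-first store: every write lands on local disk first;
    pushing to any server is a deferrable, retryable afterthought."""

    def __init__(self, root):
        self.root = Path(root)
        self.root.mkdir(parents=True, exist_ok=True)
        self.pending = []  # keys written locally but not yet pushed

    def put(self, key, value):
        # The local write succeeds with no network involved at all.
        (self.root / f"{key}.json").write_text(json.dumps(value))
        self.pending.append(key)

    def get(self, key):
        return json.loads((self.root / f"{key}.json").read_text())

    def sync(self, push):
        """Push pending entries via the caller-supplied `push(key, value)`;
        failures simply leave entries queued for the next attempt."""
        still_pending = []
        for key in self.pending:
            try:
                push(key, self.get(key))
            except OSError:
                still_pending.append(key)
        self.pending = still_pending
```

A real local-first application would add conflict resolution (e.g. CRDTs, as the paper discusses), but even this skeleton shows the inversion: the cloud augments local data instead of owning it.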

Globally Distributed Computing and Storage

Storing data locally must always be accompanied by redundancy and backups. The same applies, perhaps less critically for individual users, to computing. Both require rather complex architectures with conventional technology such as servers and the Internet. Blockchain technologies are an example of a new approach to both computing and storage, but only for very specific applications. The IPFS protocol aims to enable globally distributed storage. The SAFE Network by MaidSafe is an even more ambitious project to distribute storage while guaranteeing security (anonymity, encryption, reliability), perpetuity of data, and prevention of any form of censorship by default. Longer term, such technologies could replace the cloud and servers as we know them. Decentralized compute seems less advanced, but interesting projects such as Fluence are popping up.

Take Ownership of Proprietary Hardware at Receipt

Purchasing free hardware, and chips in particular, is getting increasingly difficult. A recent example is the Raspberry Pi: despite the appearance of transparency, the schematics are incomplete, apparently because of policies of chip maker Broadcom, and even then openness is not always a priority, as the Pi 4's initial USB-C design showed. Another example is Brother, which pushed a firmware update ensuring its printers only recognize more recent ink cartridges; my old cartridges, with plenty of ink in them, became useless overnight. Disable Brother's "Firmware Auto Check", but stay informed about any genuine security updates.

Download everything you can get from support sites at the time of purchase: software, documentation, repair manuals. Make a backup of the firmware if possible (this may not always be possible). Try to keep a downgrade path open. Disable automatic updates and perform them manually. Limit other conveniences offered by the vendor that might take control away or send more data to the vendor's servers. If you failed to do this in time, you can try your luck with the Wayback Machine to travel back in time.
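Archiving support files at purchase time is easy to script, and a script is also easy to re-run when the vendor posts updates. A minimal Python sketch; the URLs are hypothetical placeholders, to be replaced with your device's actual firmware, driver, and manual links:

```python
import urllib.request
from pathlib import Path

# Hypothetical URLs standing in for a vendor's support downloads.
DOWNLOADS = [
    "https://support.example.com/firmware/model-x_v1.2.bin",
    "https://support.example.com/manuals/model-x_service_manual.pdf",
]

def archive_support_files(urls, dest="vendor-archive"):
    """Save each file into a local archive directory, skipping anything
    already downloaded so re-runs are cheap."""
    dest = Path(dest)
    dest.mkdir(parents=True, exist_ok=True)
    for url in urls:
        target = dest / url.rsplit("/", 1)[-1]
        if target.exists():
            continue
        with urllib.request.urlopen(url) as resp:
            target.write_bytes(resp.read())
        print(f"archived {target}")
```

Keep the resulting archive directory under version control and in your backups, and the vendor's support site going away stops being your problem.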


An enshittified SEO plugin tells me to do it, so let's add an annoying "so there you have it": now more than ever, it is important to spend significant time on planning before spending time on implementation. The same applies to purchases of services and hardware. The considerations and solutions above should help with that: building and keeping nice things with minimal maintenance and cost over time. Be ready for the Enshittocene.



2. Martin Kleppmann, Adam Wiggins, Peter van Hardenberg, and Mark McGranaghan. Local-first software: you own your data, in spite of the cloud. 2019 ACM SIGPLAN International Symposium on New Ideas, New Paradigms, and Reflections on Programming and Software (Onward!), October 2019, pages 154–178.

update: 20240300, 20240403, 20240420
