We hide complexity to scale utility and expand power. Convenient access to utility is an attractive oversimplification of our lifestyle. In exchange for that utility, it pulls us into dependence on things we don’t understand. We revel in the comfort of cultivated ignorance, able to achieve everything without having to understand anything. Freed from the burden of knowing the pesky details of how something works, we keep expanding what we can do.
What applies in the small to software applies in the large to capitalism. You don’t need to know how a product available on the market is produced, where it comes from, or who supplies it. You just have to pay for it. If something breaks, it’s often more convenient to replace it with another of its kind than to repair the broken one. And who knows how to repair it anyway?
That’s the pinnacle of technological progress. We keep creating more utility, with more convenient access, forever, indefinitely, because we can stand on the shoulders of giants. And because we don’t want to deal with too many giants standing on the shoulders of other giants, we “abstract them away” and just declare the current pair of shoulders the new ground.
As software is eating everything, websites and apps become the primary interfaces for our agency in the world. Most people have no clue what exactly happens when they push the buttons provided. But you only need to know which button to press. We have normalized cluelessness. You don’t need to know. It just works. Trust us. Oh, and don’t forget to pay us for this convenience.
In exchange for ubiquitous, convenient utility, we celebrate our ignorance and accept our dependence on hopefully resilient processes and on the few experts who still know what’s going on so we don’t have to. For most of us the world just works, mostly, and we don’t know how, because we don’t need to. We feel disconnected, because we can only feel connected to something that makes sense to us. We cannot feel at home in a place we don’t understand. And the modern world becomes more and more difficult to understand.
Those who design and build these systems don’t feel as clueless. Yet. We are still in control, at least over the systems we helped build, aren’t we? We can see through the user interfaces into the code that makes everything work, can’t we? Except that we can’t possibly know every detail about every system. It’s just too complex. And there are too many of them. More often than not, even we have to give up and accept the magic workings somebody else has put together for us. We do so regularly when we choose a library or framework as a dependency for our project.
If you have ever taken over a codebase that wasn’t yours, you know what it’s like to feel foreign in a place you are not familiar with. And rarely do we accept everything in that place as we slowly become familiar with it. Things have been done in obviously terrible ways, which we of course would’ve done differently. And as soon as we get a chance to “fix” them, we will do so. Modifying it slowly makes it ours. We start to feel more at home. It makes a little bit more sense to us now. We repaired a little bit of our world and learned a bit about how it works in the process.
As software designers, we have taken this ability away from our users. The clue is in the name: Users just use. They’re not supposed to design anything. They have to accept what we build for them. They have to learn to live in the environment we provide to them. If they don’t feel welcome, if they can’t belong, if it doesn’t make sense to them, that’s not really our problem, is it? At least as long as there is no better alternative and they keep paying us.
In a consumerist society, our agency to shape our own environment has largely degraded into merely choosing one of a few available environments that have already been shaped for a target audience. Not for us, but for a group of individuals with similar requirements. Each of these environments on the market has to cater to a large audience to be successful. It needs to be universal. We can’t have something specific that’s perfectly adapted to our needs. Instead, we have to adapt to it.
What a sad state of advanced mechanization we have reached. Computers and algorithms tell us what is possible and allowed, what is visible and accessible, and what is hidden and irrelevant. Even if we have a reasonable request that has just not been considered in the design of the system, if the “computer says no”, we have to accept that. Or we can switch to one of the other available options.
If there is a problem and something doesn’t work, we most likely can’t fix it ourselves. So we call support. And support treats us as if we are clueless and “holding it wrong” — which in most cases, for most people, is most likely correct. And, just like everything else, the support process has been designed to work for the most likely scenarios. Maybe support passes our problem on to the capable few who understand just enough about the system to make changes — but will they ever get to it? Is our individual request that important?
Scalable, convenient utility means ordinary, mass-produced, replaceable products that only ever mostly fit our needs, never completely. Personal interactions turn into impersonal transactions. Such a world is functional and perhaps highly efficient. But we cannot truly connect to it. Fewer things in such a world are unique and meaningful to us; most of them are generic and replaceable. Most of it does not make sense to us. We feel foreign, unwelcome, alienated, looking for meaning where there is none.
This world is our home. But we don’t feel at home.
“But, look, all that utility and convenience we have achieved this way!”
Progress!
“Are you afraid that someday machines will become just like us?”
“I’m afraid that someday we realize that we have become just like them.”
If you liked this post, you might also like:
Mind or machine? Who cares?
Are machines becoming more like us? Will they take over soon? Before we get too worried, let’s look at what separates minds from machines.

Does generative AI make our lives simpler?
We are more productive than ever, but we are also more dependent than ever on the technologies that make us so productive.

A short parable about technology
How many programmers do you need to change a lightbulb?*
Mirror of the Self is a series of essays investigating the connection between creators and their creations, trying to understand the process of crafting beautiful objects, products, and art.
Using recent works of cognitive scientist John Vervaeke and design theorist Christopher Alexander, we embark on a journey to find out what enables us to create meaningful things that inspire awe and wonder in the people that know, use, and love them.
Series: Mirror of the Self • On simplicity… • Voices on software design
Presentations: Finding Meaning in The Nature of Order
“We only need to fear being replaced by robots if we live like robots.”
“The costs to the environment, people and communities that we have externalised and that are neither visible nor accounted for might become more tangible once we learn to make and repair some things for ourselves again. These endeavours become even more empowering if they’re done in community. Total self-sufficiency is hard, and to me also not desirable. The exchange of knowledge, skills, and goods is a vital element of a more equitable and fulfilling future.”
https://cityquitters.substack.com/p/the-home-as-a-place-of-production