🌀 On tech and non-binary thinking (v0.1)
In short, resist your innate urge to think in black and white, good and bad. One way to do that is to think in values. When you put a technology out into the world: which values is it realising, and which (equally legitimate) values is it taking away? Think in values, and you think in a non-binary way.
Ethically speaking, hardly anything is either good or bad.
Take the lockdowns imposed around the world in response to COVID. They saved lives, at the cost of people’s jobs and mental health. Did that make them ethical? Well, yes. And no.
That nothing is only good, or only bad, should be almost too obvious to write about. But when we face questions concerning technology in real life, it’s the first thing that gets forgotten, or lost.
One reason is our innate need to judge. We rush to form ethical decisions that frame things in black and white (x is good, y is bad). Then, we cling to those judgements, wanting to be consistent. We stop considering all the information we have, let alone everything we don’t know.
Sure, some things are bad, while others are less bad. A drone shooting an unarmed civilian is worse than a drone delivering life-saving medicines. But neither case is straightforward. Are some civilian deaths justified in wars on ‘evil’ regimes? Does a health delivery drone take away local jobs?
And once we know it’s there, how can we systematically dig into this grey space between the good-bad binary?
If you’re trying to reach an ethical decision, considering both the good and the bad is a good start. You could ask: what are the pros and cons? Here, you might make a classic, two-column list. Let’s try it for the health delivery drone we mentioned earlier, carrying vaccines and drugs to hard-to-reach places¹.
| Pros – health delivery drone | Cons – health delivery drone |
| --- | --- |
| ✅ get things from A to B reliably | ❌ take away income from local couriers |
| ✅ get life-saving drugs to patients quickly | ❌ high upfront costs |
This is a great start – we can see there are reasons for and against using drones. But this list is still rooted in your point of view. To make it more inclusive, you could instead ask: who benefits from this, and who is harmed? Exercising that empathy muscle.
| Benefits (and for whom) | Harms (and for whom) |
| --- | --- |
| 😇 People in hard-to-reach places get better (quicker, more reliable) access to life-saving medicines | 👿 Couriers of vaccines/drugs and their households suffer from losing income, as their job is now being done by a drone |
| 😇 Drone companies (predominantly Western) get access to new users and customers | 👿 Local drone startups suffer from being ‘crowded out’ in their local market |
| 😇 Ministry of Health in the host country gets better health for its citizens | 👿 People on drone flight paths suffer the presence of an alien, unfamiliar object |
Rather than just you, the list is now considering many of the different players in the system. More voices and points-of-view now have a say. We know who benefits, who’s harmed, and can average these out to get at how good or bad something is.
This has moved us on nicely, and in my experience it’s the end point for most important ethical decisions like this one. It’s summed up by utilitarian development-sector phrases like ‘net positive impact’. The ethical thing to do is to give the greatest benefit – or the least harm – to the greatest number. Getting this far is a big step, not least because we’ve moved from the interests of whoever is powerful, or whoever is deciding, to everyone’s.
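To make the ‘net positive impact’ idea concrete, here is a toy tally: score each benefit and harm per stakeholder, then sum. Every stakeholder and number below is invented purely for illustration – and, as the rest of this piece argues, choosing those numbers is itself a value judgement.

```python
# Toy utilitarian tally for the (hypothetical) health delivery drone.
# Positive scores are benefits, negative scores are harms.
# All stakeholders and scores are illustrative, not real data.
impacts = [
    ("patients in hard-to-reach places", +3),
    ("local couriers and their households", -2),
    ("drone companies", +1),
    ("people on drone flight paths", -1),
]

net_impact = sum(score for _, score in impacts)
print(net_impact)  # prints 1: 'net positive' – but only given these scores
```

Notice how the conclusion flips if you weight the couriers’ lost income at −3 instead of −2. The arithmetic is trivial; everything contested lives in the scores.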
But at the intersection of technology and ethics, there is one level deeper we can go. That is to ask:
What values does this thing realise, and what values does it take away?
| Values realised | Values taken away |
| --- | --- |
| ⚖️ Accessible healthcare | ⚖️ Local agency over outside intrusion |
| ⚖️ The most effective (quickest, most reliable) solution in a global free market | ⚖️ Protecting the most economically marginalised from loss of income |
| | ⚖️ Peace and calm, based on what’s familiar |
What I love about this question is the word ‘values’. It asks us to talk about the values that underpin benefits and harms.
For instance, we go from benefits to a drone company (new customers and users) to the value of global markets, and prioritising the most effective solution in the global free market. We go from harm to local communities suffering alien objects, to the value of local agency, and the value of peace and calm².
Moving away from benefits and harms is moving away from just thinking about people’s interests, useful though that is. It’s moving us much deeper, towards value systems, and how they play out in a given situation. The values might be political, social, cultural, economic, or some combination. Looking at them helps you interrogate what it means for something to benefit or harm, to be good or bad. It’s the result of taking a benefit or harm, and asking ‘Why?’.
And – here’s the most important bit – asking “whose values are being realised?” turns everyone into an agent, and reveals questions concerning technology to be always a balancing act of many, equally legitimate values and those who care about them.
Tech introduces new, foreign objects into places and starts new behaviours around them. Not only does it create material benefits and harms, it realises some sets of values and takes away from others. Once you look at it this way, you realise most situations don’t have a black and white ‘right’ answer based on greatest benefit. Just one value set they align with, and another (equally legitimate) value set they go against. In other words, you’re no longer the all-knowing judge, objectively calculating net positive impact. You are now opting for one set of values over another. Most likely, you’re opting for the set of values that feels familiar, over those outside your social or cultural context.
So, how to act ethically? I think a good start is to understand and respect all the values at play. Then, it’s to be clear on the values you are realising, and just as clear on the values you are taking away. Is it the most powerful whose values are being realised? What part of your approach to the tech product or service takes away values, and could it be adapted? Who’s comfortable with the give-get of values, and who’s not?
And, of course – how much is your own value system leading you?
So, to end with a plea for those who build, deploy, or work with tech:
Nothing is either good or bad. Think pros and cons, benefits and harms, and especially values realised and taken away³.
¹ Thinking through ethics in the context of introducing and testing health delivery drones is a journey I’ve gone on myself, in different countries. My good fortune has been having such talented people to help work through it. If you’re reading this, you know who you are 😉.
² How can you hope to arrive at enough understanding of the different values at play? Answer – it’s very difficult, very messy, and takes time. What exactly that means is one for a future piece.
³ There’s so much more to say here. Dear reader, I was in two minds about whether to publish this piece. But I hope, by putting the kernel of an idea into the world, I can spark ideas, reflections and responses in others. To grow together.
🎬 Thanks to Constanza Robles Fumarola for looking at drafts of this post.
🤔 Got thoughts? Don’t keep them to yourself. Email me on firstname.lastname@example.org. Let’s figure this out together.
If you enjoyed this, sign up to get pieces just like it straight to your inbox by clicking the link below. One email, towards the middle of each month (and nothing else).
Banner is Rodin’s ‘The Thinker’ (2007), from Wikimedia Commons, the free media repository