
⛑ How can I help?
TL;DR – if you want to make people’s lives better through technology, ask yourself three questions:
1/ Are you helping someone who really needs it?
2/ Would they prefer to be left alone?
3/ If you want to help, how can you make sure you’re not muscling in against the values they’ve built up for themselves?
1/ Are you helping someone who really needs it?
A couple of years ago, I was in Kenya working with an amazing team building ‘smart’ vending machines dispensing contraception.
We were trying to solve for people, particularly women, not being able to get condoms or morning-after pills when they needed them. It’s a huge problem everywhere, but particularly in remote parts of the country. The vending machines came with a 3G/4G connection, so they could send data when they were broken or out of stock. The automated machines + internet connection combined to make contraception more accessible.
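As an aside for the technically minded, here’s a minimal sketch of what that kind of status report might look like. Everything in it is hypothetical – the endpoint, the field names, the machine ID – illustrative of the idea, not the protocol our machines actually used.

```python
import json
import urllib.request

# Hypothetical endpoint; the real backend isn't described here.
ENDPOINT = "https://example.org/machines/status"

def report_status(machine_id, stock_levels, fault=None):
    """Send a heartbeat over the machine's 3G/4G link, so the team
    knows when a machine needs restocking or repair."""
    payload = {
        "machine_id": machine_id,
        "stock_levels": stock_levels,  # units remaining per product
        "fault": fault,                # None when the machine is healthy
    }
    request = urllib.request.Request(
        ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    # Fire and forget; a real machine would queue and retry over a
    # flaky mobile connection.
    with urllib.request.urlopen(request) as response:
        response.read()

# Example: machine KE-042 is out of morning-after pills.
report_status("KE-042", {"condoms": 14, "morning_after_pills": 0})
```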
The first question we asked ourselves: did the women we were trying to help really need it?
It’s easy to assume someone needs your help. And it can be patronising too, especially when you also assume your superiority vis-à-vis the person you’re helping. It happens a lot. Especially in the international development sector, where helping ‘developing countries’ to ‘develop’ is literally baked into the sector’s language. It also happens in the tech world, where products get shipped globally with a sense of ‘knowing what’s best’ for everyone, everywhere.
In Kenya, we wanted to know whether people needed our help before coming in. We tried to make sure in three ways.
We asked if people wanted our help. And not in a leading, please-don’t-turn-us-down kind of way. Working with a local partner, we asked women what their experiences of buying contraception were like today. We heard about how they walked past pharmacies several times before going in, to make sure nobody they knew was in there. We also made sure that Swahili-speaking women carried out the interviews. They could pick up on the small cues, the hidden references, the unspoken words between what was said. And they were much more likely to get honest answers.
It was only after going deep on their lived experience that we asked the women if our vending machines might help. We learnt they might, if they were both accessible and hidden from public view, because of the stigma that comes with buying contraception. That gave us meaningful feedback – and enough confidence – to move to the next stage. We set up some prototype vending machines, and got direct advice from the women we’d interviewed on how they should look and work.
In pursuit of structural justice, Panthea Lee talks about how true co-creation means investing in it at each step in the journey. We’ll work in this way even as we enter full-on ‘build’ mode. Always asking and finding out honestly if our help is wanted, and always matching what we build to a real, lived need.
We saw data that validated the need. 60% of pregnancies in Kenya are unintended, according to one large 2019 study. What’s more, 60% of those unintended pregnancies are among girls aged 15-19. At that age, a pregnancy can narrow a woman’s agency and trap her in grinding poverty and poor health. For instance, one study from 2013 found that 13,000 Kenyan women drop out of school every year because they are pregnant.
Unintended pregnancies can also be lethal. 1 in 200 births leads to a woman’s death in Kenya. In the UK, it’s 1 in 10,000.
We asked if our work aligned with a universally acknowledged ‘good’. At some level, there are things all people want regardless of culture and disposition. Things like reciprocity, fairness, and group loyalty, according to some anthropologists. You (and your tech idea) are helping to the extent that you help someone realise those parts of humanity’s common moral code.
In my time working with tech around the world, the closest I’ve come to finding something that always makes people’s lives better is this: giving people the agency to shape their life, and its trajectory. Better access to contraception, via automation and internet connectivity, does this. It gives women more choice, and more freedom. It gives them the power to shape their lives.
It also helps a marginalised group, giving them better access to something that might help them survive or succeed. It fills an equity gap; it makes the world fairer. So, the vending machines help realise some universal values, like fairness and agency.
Having said that, the majority of Kenya is Christian and socially conservative. Many, including many young people, disapprove of contraception. So while we believed we were on the side of some fundamental moral principles, we were also disputing some deeply held values.
That’s important, and it brings me to my second question¹.
2/ Would they prefer to be left alone?
You might think it’s safe to assume that if something you build solves a problem for someone who really needs it, then it helps them.
This is mostly true. Helping someone is a good thing. But it’s not the whole story. In some cases, your product can solve a problem for someone who you’ve worked out really needs it, and still not truly help them. Leaving them alone would be better.
Here’s why.
Tech products – including our ‘smart’ vending machines – embed a set of values. Introducing the product somewhere means introducing those values. And that means directing people to do as you think they should do, and as you do. This is ethical imperialism. While there are such things as shared values (like the ones we talked about above), it’s dangerous to assume the values embedded in your tech product are shared by the community you’re trying to help.
In Kenya, the team spoke to landlords who refused to put the vending machines on their premises. They spoke to local authorities who refused to allow the vending machines near bus stops. They were anxious about what widespread contraception meant for family values in their home country. They were worried about the social tension the machines might cause. They were unhappy about their own loss of agency, having these new, internet-connected, automated vendors on their doorstep.
Stoking these legitimate feelings wasn’t helping the communities we were working in.
And, ultimately, this friction against the values of a place would mean that we wouldn’t truly help the women who needed it either.
We could force our product in, and women might use it, but it wouldn’t be trusted, and so it wouldn’t be sustainable. The machines, and the in-country system needed to stock and maintain them, would break down (even with our 3G/4G powered data flows). People wouldn’t care for them. They wouldn’t become part of the community’s fabric or rhythm. They might even prompt backlash against women, and their right to contraception. They wouldn’t scale, and even if they did, they wouldn’t help anyone in the long run, even if they solved an acute problem in the short term.
Here, a precautionary approach would be to stick to non-interference. To leave people alone, rather than impose on them. In a lot of cases, that’s the most ethical approach.
But what about those who need your help? How do we not turn our backs on them, while navigating our ethical imperialism?
That brings us to our third question.
3/ If you want to help, how can you make sure you’re not muscling in against the values people have built up for themselves?
Weighing up the values at stake, we felt we couldn’t stay away.
To recap: women had asked for our help, and helped design our prototypes. Data backed up the desperate need for accessible contraception, via our ‘smart’ vending machines. Yet, we had to proceed mindful that the values embedded in those machines ran counter to many people’s values in the communities we worked in.
Here are some ways we proceeded, with caution and with respect.
We started with students. Our first vending machines didn’t go to rural Kenya, or to the slums where unintended pregnancies were most common, but to student halls and university campuses.
Here, the conservative values that rubbed up against our product weren’t as prominent. It wasn’t where our help was most needed. Many students could already access contraception by other means, and without stigma. But it was a start. We could scale slowly, while building trust and learning what it took to run the machines well, from both a technical and an ethical standpoint.
We stocked the machines with crisps and drinks, as well as condoms and morning after pills. We realised that a vending machine stocked with nothing but contraception screamed blatant disregard for some people’s values. To them, it was a visual affront, crude and blunt. So we placed crisps and drinks at eye level in our machines, with contraception on lower shelves.
We didn’t place our machines in public. Public space reflects a country’s values, and we hadn’t yet earnt the right to shape it. So we didn’t put the vending machines in bus shelters, or public streets, or housing estates. We focused instead on private accommodation, and bars and restaurants. It made contraception less accessible, but easier to hide from public view, and less offensive as a result. Again, it was a starting point, from which we could go on without muscling in.
Values evolve. A society’s values, captured now, won’t stay the same. But they change slowly, over years and generations; values that communities have built up over time won’t shift overnight. If there’s a theme to these actions, it’s that we moved gently. We didn’t thrust our contraceptive vending machines – and the values embedded in them – into people’s faces. We didn’t move fast, or break things. Moving slowly gave us a shot at acceptance, at fitting into a community’s fabric. It could even lead, in time, to evolving the values of those who were anxious, unhappy, or angry at our work.
What made this hard was the urgency of need. We couldn’t optimise for engagement and impact, as we would otherwise. But we could build something that – in the long run – was the best, most sustainable way to help.
¹ Answering all three questions is important. Just asking people might give you false positives, especially if you’re already invested in your idea. People won’t want to hurt your feelings. Thanks to Google, you can also normally find data to back what you want. Universal values, by nature broad, can be retro-fitted onto the impact of any technology. But if you can convincingly answer all three, that’s a good sign.
🎬 Thanks to Rika Goldberg, Camila Mirabal, Jilian Anthony, and Russell Smith for looking at drafts of this post.
🤔 Got thoughts? Don’t keep them to yourself. Email me on asad@asadrahman.io. Let’s figure this out together.
If you enjoyed this, subscribe to get pieces just like it straight to your inbox. One email, towards the middle of each month (and nothing else).
Banner image: a gauze bandage. From Wikimedia Commons.