The purpose of the Tech For Good series of interviews is to create a platform that showcases and champions companies, products and technologists who are using technology as a force for positive change in the world.
Clark: I’m Clark and I oversee Moonshot’s work with tech platforms. The work is varied but the unifying aim is to make our online public spaces safer places for us all to spend time. That doesn’t just mean identifying harmful content – we also help platforms build safety mechanisms directly into their products – for example, when someone searches for a piece of illegal content, they might get a warning combined with an offer of help. Just because the content has already been removed doesn’t mean the desire to find it has disappeared, so rather than present people with the absence of something (which will likely see them move to another platform), we try to meet that desire with something positive.
Moonshot’s broader portfolio of work covers a wide range of online harms, including violent extremism, gender-based violence, hate speech, disinformation, and child sexual exploitation and abuse. We’re unusual in that we offer the full stack of intervention services, from identifying and understanding harmful spaces and behaviours, to reaching both the victims and perpetrators of harms with different offers of help and, where we can, connecting them with real-world support services.
Clark: Back in 2015-16, our co-founders (Vidhya Ramalingam and Ross Frenett) recognised the need to innovate with the same freedom and at the same speed as violent extremist groups. At the time, ISIS were becoming increasingly adept at using online platforms to recruit. Vidhya and Ross set up Moonshot as a private company specifically to give themselves the necessary flexibility and resources to move quickly and disrupt recruitment efforts and the spread of violent extremist propaganda in mainstream online spaces.
That effort led to the creation of the Redirect Method – the use of online advertising to offer people an alternative path. For example, a search for “join [group]” or “how to make a bomb” would trigger an advert which, if the person chose to click it, would redirect them to content curated by us. That might be videos featuring former violent extremists debunking the myths on which recruitment narratives depend, or content that recognises the importance of patriotism, community and belonging but rejects the need for violence to achieve those ends.
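The core of the approach described above – risky search phrases triggering an offer of alternative content – can be sketched in a few lines. This is purely illustrative: the indicator phrases, URLs, and function names below are invented placeholders, not Moonshot's actual indicator lists, campaigns, or tooling.

```python
# Minimal sketch of the redirect idea: check a search query against known
# indicator phrases and, on a match, return a link to curated alternative
# content (e.g. counter-narrative videos) instead of the harmful result.
# All phrases and URLs here are hypothetical placeholders.

REDIRECT_CAMPAIGNS = {
    "join [group]": "https://example.org/former-members-speak",
    "how to make a bomb": "https://example.org/help-and-support",
}

def match_campaign(query: str):
    """Return a redirect URL if the query contains a known indicator phrase,
    otherwise None. No personal data about the searcher is stored."""
    normalised = query.lower().strip()
    for phrase, url in REDIRECT_CAMPAIGNS.items():
        if phrase in normalised:
            return url
    return None
```

In practice this matching is done by advertising platforms' own keyword-targeting systems rather than custom code, which is one reason the work depends on cooperation with the platforms themselves.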
Since then, we’ve grown from 2 to around 70 people and expanded our work way beyond countering violent extremism. But the fundamental insight – that you can use existing tech and existing content to help people who are otherwise at risk from harm – remains and still informs all of our work.
Clark: Advertising technology is usually designed to reach large, Western audiences with a relatively low degree of precision. A lot of it is also concerned with collecting data on people so that they can be retargeted in future. None of this is helpful when you’re trying to reach niche populations who are sympathising with terrorist groups in multiple languages, none of which are English. We don’t want any of their personal information – all we care about is that their online behaviour indicates they could be at risk and we want to offer them help.
This would be relatively easy if we owned the platforms on which these behaviours took place (and that’s one reason why we adapted to working with, rather than just on, social platforms).
If you don’t own the digital real estate on which dangerous behaviours are taking place, but you have committed yourselves to making those spaces safer in a way that’s both ethical and compliant with human rights, the challenge is this: how do you reach people who are – you think, with a high degree of confidence – at risk from harm, with content and offers of help that you think could be impactful, and measure the impact of that intervention, all while remaining ‘for good’, i.e. without stigmatising them, without creating any backfire effect, and without collecting any personally identifiable information?
Clark: Because today, most tech professionals live in urban areas where making big, meaningful life decisions and settling down has been made prohibitively difficult by people who should know better. Too tired to mount any kind of effective rebellion, we instead take the pressure off ourselves by postponing those decisions until we think we can take them on, which is sometimes never.
This leaves a relatively new but potentially infinite expanse of time in which we look for meaning elsewhere – often from work. And if you’re not working somewhere with even a vaguely positive mission, you’ll probably be doing endless amounts of largely meaningless work purely for the benefit of someone else. This will eventually cause an existential crisis that manifests as a series of important 4am Google searches like, “is the lonely whale okay now” and “how to make money from spoons”.
I think that’s why purpose-driven opportunities are a growing trend.
Clark: ‘Tech for evil’ is easy, but ‘tech for good’ isn’t hard IF platforms co-operate. I honestly believe the ones that go beyond the regulations and make safety-by-design a feature rather than a chore will be the ones who, in the long run, make the most money. This shouldn’t be controversial. Is a city’s most expensive housing in the dangerous area or the trendy, nice area?
As for our target audiences, anyone online who we consider to be at risk, either of suffering harm or of perpetrating it, is fundamentally no different from us. Even when we’re talking about people who might be at risk of joining a violent extremist movement – but for a few different life events, that could be me or you, so let’s treat them as we’d want to be treated.