Kinzen’s CEO Mark Little on the fight against disinformation

In this disinformation age, troll farms, fake news, and hate speech run amok, and the truth is harder and harder to spot. Can content moderation help us protect online communities from the consequences?

Fake news isn’t a recent phenomenon, but social media platforms have provided the perfect echo chamber for it to fester and spread at unprecedented magnitude and with very real consequences. Yet traditional fact-checking and automated content filters are no match for the power behind disinformation and misinformation. It’s going to take a whole new solution to tackle the problem, and Mark Little is one of the people leading the charge.

Mark is an award-winning journalist and digital media innovator with a career in broadcast news spanning two decades – from his very first report on a prison riot in Dublin to his award-winning reporting from war-torn Afghanistan. In 2010, after noticing how young people were using social media to report what was happening around them, he founded Storyful, the world’s first social media newswire, which combined a traditional news agency with authenticated user-generated content. In 2015, after selling Storyful to News Corp, Mark joined Twitter as Vice President of Media and Partnerships in Europe.

A year later, Mark resigned from his position as he realized a new threat was emerging – what had begun as a democratic wave of social media awakening was being corrupted. The algorithms and business models behind these platforms were being weaponized to produce and disseminate propaganda and conspiracy theories. Together with Áine Kerr, a fellow journalist who, at the time, was managing global journalism partnerships for Facebook, he started working on a response to that problem, and Kinzen was born.

“To tackle the challenge, they’re using a combination of machine learning and human analysis that can scale the response to a global level”

Ever since, they’ve been working to get ahead of this threat and protect online communities from dangerous misinformation and hate speech that create real-world harm. To tackle the challenge, they’re using a combination of machine learning and human analysis that can scale the response to a global level. Mark remains an optimist about the democratic potential of social media networks, but he’s the first to admit they desperately need a redesign. And that’s where more precise content moderation comes in.

In today’s episode, we sat down with Mark to talk about the evolution of journalism, the rise of misinformation, and what we can do to protect online communities from harmful content.

Here are some of our favorite takeaways from the conversation:

  • Failure is built into the process of starting your own business. The secret of being a great entrepreneur is not survival – it’s resilience.
  • In the past couple of years, we’ve seen the rise of “algospeak”, which happens when online communities change certain words to avoid being flagged by content moderation algorithms.
  • Mark doesn’t advocate for legislation banning disinformation but rather more precise content moderation that detects harmful content while allowing maximum freedom of speech.
  • Over the next few years, Mark believes more platforms will try to decentralize power for people to set their own filters for what they want to see online.
  • By combining human analysis and machine learning, Kinzen can detect things like irony, slang, or “algospeak” and scale that detection to a global level.

If you enjoy our discussion, check out more episodes of our podcast. You can follow along on iTunes, Spotify, and YouTube, or grab the RSS feed in your player of choice. What follows is a lightly edited transcript of the episode.


Political awakening

Liam Geraghty: Mark, thank you so much for joining us. You’re very welcome to the show.

Mark Little: My pleasure, Liam. Thanks for having me.

Liam: You’ve had a remarkable journey to this point before founding Kinzen. Where did your interest in journalism come from?

“I was obsessed with how the world worked and why certain people seemed to see around the corner to what was going to happen next”

Mark: Well, I was one of those very precocious kids. At the age of maybe six or seven, I used to fight for The Irish Times in the morning with my father. And at about nine or 10, I realized I’d never have enough natural talent to pursue my true passion, which was to be center forward for Liverpool. And basically, the thing I remember falling back on was what someone, I suppose my religion teacher, pointed out to me in a report card when I was about 14 years old. He said I was prematurely cynical, had a fierce curiosity about the world, a certain skepticism, and a passion for change.

I was obsessed with how the world worked and why certain people seemed to see around the corner to what was going to happen next in politics or business. And that was the kernel that began with journalism and kind of stayed with me throughout my career.

Liam: Was politics big in your family?

Mark: Absolutely. It was front and center. Growing up in the 1970s and 1980s, if you were not interested in politics, you weren’t aware or awake. We were living in Ireland, which was still a very regressive, church-dominated society. We were still the poor people of Europe. Emigration was at a record high; unemployment was at a record high. And looking abroad, when I was growing up as a student activist in the eighties… everything was going on. All over the world, it felt like there were tectonic shifts.

Everything felt so consequential, to the point where nuclear apocalypse was something I thought about deeply through the ’80s until the fall of the Berlin Wall. That’s just to give you a sense of how consequential it felt to be alive in a period of huge anxiety – and, if you were politically minded, of huge excitement and challenge as well.

Social unrest in London

Liam: And did I read right that your first gig was in the advertising department of the Communist Party’s magazine in the UK?

Mark: Yeah, it was a funny thing. Back then, there were a lot of people on the left who were very much into culture and were influenced by what was happening with Glasnost and Perestroika. There was this group of people who would’ve been called the new left or the Euro communists, and I was fascinated by that magazine. It was called Marxism Today.

I was working in a car park and at McDonald’s, and I just happened to chance my arm and say, “Hey, any jobs going in Marxism Today?” And it turned out that one of the jobs was not even selling the advertising – it was collecting the advertising. I’d have to ring up granola producers, people producing futons, and companies advertising holidays in Bulgaria and slightly threaten them, demanding the money they had pledged to pay for advertising. That was summer in London in ’87. I got to see Red Wedge and Billy Bragg. The miners’ strike had just ended, and there was a real sense that Thatcher was the biggest person on the international scene.

“Politics was in everything at that time. It kind of felt like the oxygen that surrounded us”

Living in London in ’87 and being part of a left movement – not necessarily a communist party, which was kind of on the way out – was a real eye-opener because I was a student radical. I got to be in places like the Trades Union Congress and meet some of the cutting-edge people who really had created change in Britain. I also learned how to flip burgers and cook the perfect chicken McNugget.

That same summer of ’87 was when the IRA campaign was at its height. I worked in this car park, and one of my jobs was to drive around the long-term car park at about 4:00 or 5:00 in the morning to check if there were cars there for more than a week, because they feared the IRA were going to plant a bomb there. And so there I was, a young Irish fellow with a Pakistani colleague, and both of us were responsible for reporting to the Special Branch officers who came in the morning which cars had been there too long. Again, just a reminder that politics was in everything at that time. It kind of felt like the oxygen that surrounded us.

“I was so young that, when I went on to get my big job in Washington, I was told to maybe grow a mustache or wear blue, which apparently makes you look older”

Liam: Wow, that’s nuts. All of that makes sense in terms of your career in journalism and your move to the national broadcaster. I mean, you pretty much went to work for the national broadcaster in Ireland straight out of college.

Mark: I was so young that, when I went on to get my big job in Washington, I was told to maybe grow a mustache or wear blue, which apparently makes you look older. Coming out of college, I realized I didn’t want to be in politics. I wasn’t particularly ideological at the end of the day. I wasn’t partisan. And I was, as I say, fascinated by change. I went to DCU to do a journalism course, and before that course even finished, RTÉ was advertising for people to join them, so I did.

Within about 24 hours, my first report was of a prison riot in Phibsborough, North Dublin. I was on top of a roof and had to broadcast into the six o’clock news on a mobile phone, which, believe me, was a massive technological innovation at the time. I was so bad that when I came back to the office, my boss said, “Don’t listen back to that. That won’t be inspiring you for the future.” But at that stage, being given the chance to report on big issues was a dream come true.

A new wave of journalism

Liam: Your career in journalism spanned almost 20 years, am I right?

Mark: That’s right.

Liam: What prompted you to move out of journalism after a hugely successful career, being a Washington correspondent and presenting Prime Time, one of the big shows in Ireland?

Mark: Well, I started to realize that things had changed, to use an old phrase. The means of production of journalism and news had changed – it was no longer just people like me, the man on the telly, standing in war zones. I remember being in Kandahar, in the south of Afghanistan, and it was very frustrating because I was sitting there listening to somebody interpreting for me what was going on on the ground. And suddenly, I realized that the old golden age of journalism was very undemocratic. It was people like me, the gatekeepers, telling people at home who sat down at a certain time at night to listen to me, the man on the telly, telling them what was true, what was real.

“What would happen if we could combine old-fashioned storytelling, truth-telling, and journalism with this revolutionary new democratic awakening on these platforms?”

At the same time, I saw Twitter and YouTube emerging, and I remember the disputed election in Iran in 2009. Every foreign correspondent has a story that gets under their skin – Iran was my story. I remember watching 17 or 18-year-olds using social media to deliver the most visceral, authentic reporting of what was going on and thinking, “Oh my God, this is going to change everything.”

Most contemporaries of mine were scared and fearful of this democratic revolution. But I saw this opportunity. What would happen if we could combine old-fashioned storytelling, truth-telling, and journalism with this revolutionary new democratic awakening on these platforms? I looked forward 25 years and thought, “If I don’t do this now, I will regret this for the rest of my life.” You have to think about what the Talmud, the great old Jewish religious text, says, “If not me, who? If not now, when?” I had that moment and there was no way back.

Liam: So you went off and created Storyful. I love the great tagline from Storyful, “news from the noise of social media.” What was it like going from journalism to founding a business?

Mark: It was like watching the water in winter. You think, “Wouldn’t it be great to go for a swim?” And then you jump in and you are paralyzed by the cold. A year into the venture with Storyful, we couldn’t raise money. I thought we were going out of business. It was brutal. I had staked everything on this venture – my reputation, all my money – and it looked like it was all about to go under.

“The secret of being a great entrepreneur is not survival. It’s resilience. And failure is built into the model”

I remember one Christmas Eve, driving down to see my family and feeling the weight of the world on me, and I confronted my worst thought: “We’ll go out of business, but I’ll get a job and recover from that.” And I learned that when you confront your worst fear, particularly in an early startup, it will never haunt you again because you’ve confronted it. That’s the worst that can happen.

I started to realize that, as a journalist, I was about survival. I had to be a war correspondent. I remembered days when I was a fraction away from being killed or wounded, but I had never thought about resilience, about what happens when you’ve got to get up every day and it’s hard. And that was the big difference. The secret of being a great entrepreneur is not survival. It’s resilience. And failure is built into the model – it’s something you have to put up with. There were many similarities between the two careers, but a real change in mindset was required.

Liam: What was it like? Because, as you said, a lot of journalists were kind of fearful of it. But this was an entirely new form of journalism, and we probably didn’t know it at the time.

“We started developing a collaboration with the people on the ground that were eyewitnesses to history”

Mark: Well, we could kind of see it happening in practice while people were trying to work it in theory. For example, during the Arab uprisings, which started in 2010 in Tunisia and went through Egypt and Syria, we were seeing, in our tiny startup in Ireland, democratic activists on the ground trying to get the story out from places like Aleppo. They started to realize that we were watching them and would do things like tilt up the camera to show a minaret, which would help us geolocate that image. They would help us by putting up newspapers and telling us what day it was and what location they were in. And we started developing a collaboration with the people on the ground that were eyewitnesses to history.

And, of course, we were bringing journalistic rigor. When Osama bin Laden was killed in Pakistan, we were immediately in with satellite imagery to analyze the nature of the helicopter that had crashed in the compound. We could direct journalists to the correct physical location because it had initially been incorrectly identified.

This new form of open source journalism was democratic because our primary sources were people right there on the ground and not other journalists. But at the same time, it had the same rigor as the investigative, accountability journalism that came before it. We certainly didn’t plan to create a new form of journalism – it evolved organically from the awakening that was the dominant force in the first wave of social media. Today, many of the Storyful alumni are working in big news organizations like the New York Times, CNN, the BBC, or the Washington Post, and they’re bringing with them that new form of journalism we were part of at Storyful.

Weaponization of social media

Liam: In 2013, you sold Storyful to News Corp and moved to New York to help with the transition, but eventually came back to Dublin to take on the role of managing director of Twitter Dublin. Were you kind of longing for the next big thing to do off the back of Storyful?

Mark: Not really. I mean, when you think about it, I’d had a career working for Marxism Today, Rupert Murdoch, Jack Dorsey, and the Irish taxpayer. I had a pretty good range from, at least, an ideological perspective. And I came out of Storyful really fascinated by Twitter. I had fallen in love with Twitter, it had changed my life, and I wanted to get into the heart of the machine. I didn’t have a desire to do another startup. In fact, I asked a contemporary of mine, another media startup founder, for permission not to do it again.

At that stage, Twitter was a fairly large corporation. I wanted to see if I could help bring some change and energy into the business of Twitter. That’s why I chose to do that. It was a real chance to get inside a platform that had changed everything in my business and see if I could have an impact. It was only by accident that it meant moving from New York, where I was very happy, back to Dublin, which happened to be the international headquarters. But I wanted to take a break from startup life and see what it would look like to be inside a big corporation.

“I had seen many of the problems we had spotted in the Arab uprisings, where people were using social platforms not as a democratic tool but as a weapon”

Liam: Where did Kinzen come out of? It seems to be born out of the flames of Storyful, to a certain extent.

Mark: It was born from a frustration with that big corporation called Twitter. I loved working there, but it wasn’t particularly well run. And in the end, the media partnerships team basically got disbanded. I could have stayed on in a nice corporate job, but meanwhile, it was 2016 and the US presidential election had just happened. I had seen many of the problems we had spotted, in their earliest incarnation, in the Arab uprisings, where people were using social platforms not as a democratic tool but as a weapon. They were using the virality of video on places like YouTube and Twitter to engineer false stories that were either propaganda or conspiracy theories.

The first wave of the internet was a democratic awakening. And then, when 2016 happened, I realized, “Holy shit! This is being turned into a weapon.” Not just because of Donald Trump – there were deeper issues where, suddenly, the virality, the business model, and the algorithms were being hijacked by people opposed to democracy. That was the birthplace of the idea that led me to go back to Áine Kerr, my most trusted colleague, who was at Facebook at the time, and say, “What if we did something to give power back to citizens, to allow them to protect themselves against this emerging threat?” And we both jumped.

We set out, to begin with, to give people a newsfeed they could control, but obviously, as we moved on, like every startup, the idea evolved. The first wave of democracy was being replaced by a new dark force on the internet and that was the inspiration behind Kinzen.

Liam: For listeners who don’t know, what is Kinzen? Who is the user of Kinzen?

Mark: We help big tech platforms and emerging platforms protect the world’s conversations from information risks. And by that, we mean dangerous misinformation that creates real-world harm, organized disinformation, and hateful and violent speech. Our customers are trust and safety professionals, policy professionals, and the people inside these companies desperately trying to get ahead of these information threats and risks instead of relying on reactive fact-checking.

“We’re scaling the human solution to this particular information crisis”

We’re using a combination of good old-fashioned human analysis and the latest advances in machine learning to solve the wicked problem these platforms face: number one, humans can’t scale to match the problem, and number two, machines don’t have the insight to detect these information risks across multiple languages and multiple formats. That’s the problem Kinzen is trying to solve. We’re scaling the human solution to this particular information crisis.

Liam: So, you’re kind of taking those editorial skills from Storyful and coding them into the machines to give them those values.

Mark: Exactly. We follow a human-in-the-loop machine learning approach, which has only really become possible in the last couple of years since we’ve had access to these big language models. Our analysts create machine-readable data in a variety of languages, which is pushed into the machine. The machine is transcribing, translating, and trying to understand, and the human data is helping it learn quicker. It’s a beautiful feedback loop between a small expert group and really advanced machine learning systems that are exponentially greater in their capacity now than they were even four years ago.
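
To make that feedback loop concrete, here is a minimal sketch of human-in-the-loop moderation. It is illustrative only, not Kinzen’s actual system: the model choice, thresholds, and seed data are all assumptions. The idea is that confident predictions are handled automatically, uncertain ones are routed to analysts, and each analyst decision becomes new training data.

```python
# A minimal human-in-the-loop moderation sketch – NOT Kinzen's real system.
# Model choice, thresholds, and seed data are illustrative assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Seed labels created by human analysts (1 = information risk, 0 = benign)
texts = ["the election was rigged", "we are campaigning for clean elections",
         "lovely weather today", "great match last night"]
labels = [1, 1, 0, 0]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

def triage(post, low=0.35, high=0.65):
    """Act on confident predictions; route the uncertain middle to analysts."""
    p_risky = model.predict_proba([post])[0][1]
    if p_risky >= high:
        return "flag"
    if p_risky <= low:
        return "allow"
    return "human_review"

def learn_from_analyst(post, label):
    """The feedback loop: every human decision becomes new training data."""
    texts.append(post)
    labels.append(label)
    model.fit(texts, labels)  # retrain (in practice, batched and periodic)
```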

Curbing misinformation

Liam: Just how big is the scale of misinformation at the moment?

“The key issue right now is not whether people are saying the wrong thing on the internet; it’s not about disputes between people over politics; it’s not even about Donald Trump or what he gets on Twitter”

Mark: Well, I think what’s happening right now is that it’s getting worse before it gets better. Not everybody online is going to see it, but misinformation and disinformation are most problematic in places where it’s life or death. Right now, in India, we see almost genocidal levels of rhetoric coming from supporters of the government toward Muslims. We’re seeing organized extremists, far-right groups, and neo-Nazis in Europe using the virality of these platforms to spread their message.

And obviously, we’re seeing it in conspiracy theories around health topics, not just COVID, but people trying to promote conspiracy thinking in mainstream conversations. The key issue right now is not whether people are saying the wrong thing on the internet; it’s not about disputes between people over politics; it’s not even about Donald Trump or what he gets on Twitter. What we are looking at is the multitude of languages and threats where there is actual real-world harm and possibly life-or-death situations.

We currently cover 13 languages, and soon we’ll cover 26. The threat is acute in places like Brazil, which will have a hugely consequential election in October that may be a rerun of what happened in 2020 in the United States. Many people think we’re trying to sort out what’s true or false. It’s not just about that, or even primarily about that – it’s about where we can stop something happening online that will have a real-world impact and, potentially, a life-or-death impact.

Liam: You mentioned the Brazilian elections there, and I heard Áine Kerr on another podcast talking about disinformation actors coming to realize that phrases like “election fraud” and “rigged election” were alerting content moderators who could take down their false claims, so these actors began swapping those phrases for ones like “we are campaigning for clean elections.” And that’s where human moderators can come in to spot those changes and help the authorities intercept these messages.

“We’re constantly seeing words being amended and changed. And, of course, machines aren’t able to keep up with that”

Mark: We’re seeing what we call the rise of “algospeak”, which is when communities realize there may be content moderation algorithms looking at what they’re saying and try to avoid them. During the pandemic, we saw anti-vax activists using the word “panini” instead of “pandemic”. More recently, in Germany, we saw the anti-vax community using the word “smurf” for vaccine because, in German, the pronunciation of that word sounds very much like the name of the children’s TV character. Last year, in Scandinavia, we saw a neo-Nazi group changing a word associated with a traditional children’s festival into a racial slur.

We’re constantly seeing words being amended and changed. And, of course, machines aren’t able to keep up with that. When something like the pandemic happens, suddenly, we have all these scientific terms coming into our language. The machine just hasn’t caught up, and that’s where human analysis is important – to correct the machine.

And so, more and more, inside Kinzen, we’re starting to see the machine picking up on the evolution of language. One word connected with another. Smurf with vaccine. Or for example, in places like India, we see anti-Muslim activists using words that may seem totally innocuous, but we can see they’re part of a pattern of intimidation and hate speech.
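
One way to picture how a machine might surface those connections: track which words suddenly appear alongside known risk terms in content analysts have already flagged, and hand the candidates back to humans for review. A rough sketch under those assumptions – the seed list is hypothetical, and a real system would normalize against background word frequency:

```python
# Illustrative sketch of surfacing "algospeak" candidates – not Kinzen's method.
# Idea: words that suddenly co-occur with known risk topics in flagged posts
# are surfaced for human analysts to review.
from collections import Counter
import re

KNOWN_RISK_TERMS = {"vaccine", "vaccination", "jab"}   # assumed seed list

def tokens(post):
    return re.findall(r"[a-zà-ÿ]+", post.lower())

def algospeak_candidates(flagged_posts, min_count=2):
    """Count words that appear alongside known risk terms in flagged posts."""
    co_occurring = Counter()
    for post in flagged_posts:
        words = set(tokens(post))
        if words & KNOWN_RISK_TERMS:
            co_occurring.update(words - KNOWN_RISK_TERMS)
    # In practice you'd normalize against background frequency; here we
    # simply return unusually frequent companions for analyst review.
    return [w for w, c in co_occurring.most_common() if c >= min_count]
```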

This is why the human part is so important. We don’t advocate for more content moderation. We don’t advocate for laws banning misinformation. I think that’s absolutely the wrong way to go. We’re looking for more precise content moderation that can pick out the dangerous needle in the haystack while allowing maximum freedom of speech. And that’s the long-term goal for us. We cannot have safety by decree, with governments banning content. We need to redesign the platforms so that the machines and humans working in content moderation get more exact, more precise, and more timely – getting ahead of the problem instead of reacting to it.

A blend of machines and humans

Liam: Is misinformation a tech problem, a human problem, or a bit of both?

Mark: Listen, that’s one of those existential questions where people think technology made them do it or made the world dumb. I don’t go for that. I remain an evangelist for the democratic potential of the technologies that we now take for granted or are starting to dislike intensely. The business model and the way the algorithms were initially primed in social media do take advantage of our worst instincts, but I think we can redesign this technology to liberate our best intentions. And part of that is having much better filters that the ordinary person can access when they browse the internet.

“The average user will have more power and precision in content moderation”

Right now, we work with centralized moderation teams inside the tech platforms. My guess is that, over the next two to three years, more and more platforms will try to decentralize power so the ordinary person can set their own filters. They can say, “Listen, I don’t want to hear extreme language in my feed,” or someone else might say, “Well, I do. I want to see what the other side is thinking.” The average user will have more power and precision in content moderation. But it will take time, and we’re only at the beginning. Remember, the big platforms that exist today will be replaced by new ones being incubated by some 17-year-olds right now.
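
As a thought experiment, that kind of decentralized moderation could be as simple as attaching a per-user preference object to the feed layer. This is purely a hypothetical sketch of the idea, not any platform’s API – the names and category labels are invented:

```python
# Hypothetical sketch of user-set content filters – not a real platform API.
from dataclasses import dataclass, field

@dataclass
class FilterPrefs:
    hide_categories: set = field(default_factory=set)  # e.g. {"extreme_language"}

def visible(post_labels, prefs):
    """A post is shown unless it carries a category this user filters out."""
    return not (post_labels & prefs.hide_categories)

# One user hides extreme language; another opts in to "see the other side".
cautious = FilterPrefs(hide_categories={"extreme_language", "conspiracy"})
curious = FilterPrefs()  # no filters set

labels = {"extreme_language"}
assert visible(labels, curious) and not visible(labels, cautious)
```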

We have to be aware of some developments that could make the problem worse before it gets better. At Kinzen, we are pioneers of audio content moderation. We’ve been analyzing the way misinformation, disinformation, and hate speech spread through live audio and podcasts. And we think the spoken layer of the internet is an area we need to focus on with a high degree of urgency over the next couple of years.

Liam: I was going to ask you about that before we wrap up. There are just countless hours of audio. What problem is that presenting to you and fact-checkers?

Mark: It’s like a perfect storm. Obviously, with live audio, you’ve got the speed. Looking at podcasts, many of these are hours and hours long. And then, the most important and biggest challenge is language. There are thousands of languages spoken. If you’re analyzing something in India, you might be listening to someone speaking in Hindi, but they’ll jump into English now and again. This is where machine learning is so fascinating. You have automatic speech recognition, but you can tweak that if you know what you’re looking for with some human signals. We can allow the language model to get better, to be optimized to listen out for not just the sound of a word but the relationship between languages.
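
To illustrate the shape of that problem, here is a hypothetical sketch of one pass of audio moderation: a speech recognition system (stubbed out here) produces time-stamped segments tagged with a detected language, and risk terms are matched per language so code-switching – say, Hindi speech with English phrases mixed in – is handled segment by segment. The data structures and term lists are assumptions, not Kinzen’s pipeline:

```python
# Hypothetical sketch of audio moderation over ASR output – not a real system.
from dataclasses import dataclass

@dataclass
class Segment:
    start: float     # seconds into the audio
    text: str        # transcript produced by a (stubbed) ASR model
    language: str    # detected language code, e.g. "hi" or "en"

RISK_TERMS = {                      # assumed per-language seed lists,
    "en": {"rigged election"},      # maintained and updated by analysts
    # "hi": {...}, and so on as language coverage grows
}

def flag_segments(segments):
    """Return (timestamp, text) for segments matching a known risk term."""
    hits = []
    for seg in segments:
        terms = RISK_TERMS.get(seg.language, set())
        if any(term in seg.text.lower() for term in terms):
            hits.append((seg.start, seg.text))
    return hits

# e.g. a speaker switching from Hindi into an English phrase mid-stream
episode = [Segment(12.0, "yeh sab theek hai", "hi"),
           Segment(15.5, "this was a rigged election", "en")]
print(flag_segments(episode))  # -> [(15.5, 'this was a rigged election')]
```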

“Can we start to pair machines and humans to interpret the meaning of language, and not just what they’re saying, but how they’re saying it?”

I’m a journalist, right? I’m not a techie by training. A lot of this stuff fills me with wonder as I try to keep up with it. But I think what’s important for audio is that we are looking out for the way people speak – their tone of voice. Arabic, for example, is one language when you write it down, but multiple dialects when you speak it. And that, for me, is what’s most troubling and also most exciting. Can we start to pair machines and humans to interpret the meaning of language, and not just what they’re saying, but how they’re saying it? That is the big challenge of audio moderation, but I also think it’s one of the most exciting challenges to be working on. Only humans are able to detect things like irony, sarcasm, slang, or “algospeak”. And that’s why our approach is not just more effective but ultimately more democratic. We want the least possible intervention but the highest level of precision, and that’s where the mix of human and machine is just so vital.

Liam: Brilliant. Well, Mark, thanks very much for joining me today.

Mark: Liam, it’s my pleasure. Let’s do it again soon.
