Areto Labs – Lana Cuthbertson, CEO, and Kasey Machin, COO
As Lana Cuthbertson and Kasey Machin worked to encourage women to run for elected office, they noticed a disturbing trend: worries about vile online abuse were rising rapidly.
“Every time we had coffee with someone who was thinking about running, that excuse for not doing it was further up the list until it was the number one reason,” Cuthbertson says. “They’d ask, ‘What would it do to their Twitter feed? How would they and their staff cope with it? There are no solutions, so how do we manage this?’”
Machin and Cuthbertson, who both worked with Equal Voice and were a part of the ParityYEG start-up, decided to work on a solution.
Cuthbertson, a Top 40 Under 40 in 2018, invented ParityBOT, a Twitter bot that uses artificial intelligence to detect and score the toxicity of tweets aimed at women in public office or leadership. The bot does not respond to each negative tweet; instead, it sends encouraging words to the candidate’s account, so the feed isn’t entirely toxic.
It has sent out more than 12,000 tweets so far. An example: During the recent American election, it automatically sent out the tweet, “To #womeninpolitics: You’re stronger than anyone who criticizes you” to candidates subjected to abusive language online.
The pair later founded Areto Labs. Cuthbertson is the CEO, Machin, the COO.
Earlier in her career, Machin had a front-row seat to online abuse directed at elected officials. She worked as a policy and communications adviser to Edmonton City Councillor Andrew Knack, who has an active social media presence.
“I had a wonderful time there, but when you’re an elected official, you are subject to really terrible language and some people say some really offensive things. That got draining,” says Machin.
Things came to a head when Knack received a death threat online. For the first time, Machin was worried about safety.
“Those messages kept me up at night,” she says. “For social-media managers it can be a real struggle. That experience, along with our work with ParityYEG, is what led us to start Areto.”
They launched ParityBOT during the Alberta provincial election in March 2019. It has been active in four elections, including the United States election in the fall of 2020.
During the U.S. election, the company’s AI tracked and analyzed every tweet sent to each woman candidate for Congress and the Senate over several weeks.
“Out of the six million tweets we’ve tracked, analyzed and scored for toxicity, more than 4,000 of them are … the most toxic, vile tweets you can imagine,” Cuthbertson wrote in a late October social media post.
Toxicity isn’t always as explicit as it might seem. Cuthbertson and Machin have learned it can be insidious, building up over time to poison an online community or company culture.
Areto’s sophisticated machine learning and its ability to detect toxicity aimed at a particular group of people lend themselves to applications in other arenas.
“It’s not often the stark, obvious ones where you’d say, ‘Yep, it’s time to call the police about this,’” says Cuthbertson. “It’s usually the collection over time of maybe slightly more subtle and less obvious toxicity that builds up. That’s where we’re seeing the value of detecting that pattern and automatically intervening.”
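The pattern Cuthbertson describes — individually subtle messages that cross a line only in aggregate — could be sketched as a rolling window of per-author scores. The class name, window size and threshold below are all hypothetical assumptions, not Areto’s actual method:

```python
from collections import deque


class PatternDetector:
    """Flags accounts whose mildly toxic messages accumulate over time.

    A single message may stay under any per-message threshold, but the
    sum of scores over an author's last `window` messages can still
    warrant an automatic intervention.
    """

    def __init__(self, window: int = 20, cumulative_threshold: float = 3.0):
        self.window = window
        self.cumulative_threshold = cumulative_threshold
        self.history: dict[str, deque] = {}

    def record(self, author: str, toxicity: float) -> bool:
        """Record one scored message; return True when intervention is warranted."""
        scores = self.history.setdefault(author, deque(maxlen=self.window))
        scores.append(toxicity)
        return sum(scores) >= self.cumulative_threshold
```

For example, ten messages each scoring a mild 0.35 would individually pass unnoticed, yet the rolling sum crosses the 3.0 threshold on the ninth message and the detector fires.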
While ParityBOT first focused on women in politics, the company has expanded with other bot products tailored for public relations and human resources.
“Companies and businesses experience toxicity in their brand communities externally online, when they’re trying to engage customers on places like Twitter and Instagram, and companies also experience this internally in places like Slack and Google and Microsoft Teams when they’re trying to engage their team members,” Cuthbertson says.
Ultimately, Areto Labs wants to highlight the positive instead of the negative. As Machin notes, that’s as much about all the people watching the ugly Twitter exchange as it is about the exchange itself.
“It’s less about the one-on-one interaction. It’s more, for us, that this is a cultural intervention we’re trying to make.”
This article appears in the Winter 2021 issue of Edify