Despite the proliferation of online discussion forums and social media, many agencies have struggled to embrace online dialog as a functional part of the public process. Their hesitation is simple – curating these spaces has historically meant significant investments of time and worry over what might be posted.
It’s not a problem limited to public organizations. The New York Times has grappled with the challenge of managing online comments for years, and now employs a team of 14 moderators to review approximately 11,000 daily comments.
For organizations like the Times, this often means making the difficult decision to provide fewer opportunities for the public to engage.
Google ‘Perspective’ Developed for Online News Comments
Recognizing this challenge as a great opportunity for technology to aid human moderators, Google’s Jigsaw group launched ‘Perspective’ in 2017.
What is Perspective exactly? Simply put, it’s an online service that instantly scores comments by how likely they are to be ‘toxic’ to a conversation.
Perspective’s scoring system was trained using hundreds of thousands of human-moderated comments to identify patterns that make a comment “toxic.”
For the purposes of civil dialog, “toxic” means “a rude, disrespectful, or unreasonable comment that is likely to make you leave a discussion.”
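As a rough illustration of how a service like this is consumed, the sketch below builds a request body in the shape the public Perspective API expects and pulls the toxicity score out of a response. The helper names are ours, and the API key and HTTP call are omitted; this only shows the data going in and out.

```python
# Illustrative sketch of a Perspective API exchange. The endpoint and
# payload shape follow the public Comment Analyzer API docs; in practice
# you would POST the body to PERSPECTIVE_URL with an API key from a
# Google Cloud project.
PERSPECTIVE_URL = (
    "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze"
)

def build_analyze_request(comment_text):
    """Build the JSON body for an AnalyzeComment request scoring TOXICITY."""
    return {
        "comment": {"text": comment_text},
        "languages": ["en"],
        "requestedAttributes": {"TOXICITY": {}},
    }

def toxicity_score(response):
    """Extract the 0-1 summary score from a response and scale it to 0-100."""
    value = response["attributeScores"]["TOXICITY"]["summaryScore"]["value"]
    return round(value * 100)
```

The API returns probabilities between 0 and 1; scaling to 0–100 matches the scoring range described in this article.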
Perspective saw its first applications with organizations like the New York Times, Wikipedia, The Guardian, and The Economist, providing instant comment scores from 0 to 100 for factors such as:
- Attack on author
- Attack on commenter
Whereas prior moderation tools have been mostly limited to detecting profanity, this nuanced, quantified approach allowed human moderators to focus attention on a smaller subset of comments. For outlets like the Times, that meant freeing up staff time and supporting more comment opportunities.
According to NYT editors:
The Times hopes that the project will expand viewpoints, provide a safe platform for diverse communities to have diverse discussions and allow readers’ voices to be an integral part of nearly every piece of reporting. The new technology will also free up Times moderators to engage in deeper interactions with readers.
Those outcomes are great for news, but could have an even more profound implication for resource-starved local agencies. Freeing up staff capacity while providing increased access for diverse voices is a big win-win.
Testing Perspective in the Public Sector
To glimpse how tools like this might be applied, PublicInput.com analyzed engagement data from the City of Virginia Beach and the North Carolina Department of Transportation.
Both agencies recently conducted outreach on high-profile projects and received thousands of public comments – 1,403 comments on Virginia Beach’s Dome Site project and 2,561 comments on NCDOT’s I-440 Walnut to Wade project.
To gauge Perspective’s potential impact, we retrospectively ran comment data from PublicInput.com through the Perspective API to see which comments would have been deemed ‘toxic’, and how that would have affected staff moderation efforts.
Virginia Beach and NCDOT: Bad Actors Represented Less Than 1%
While one might assume that online ‘trolls’ make up a large share of participants, in the samples we analyzed, fewer than 1% of commenters were responsible for virtually all of the toxic comments.
In the case of Virginia Beach, only 12 of 1,403 comments had a toxicity score of 50 or higher. That pattern also held true on NCDOT’s I-440 project, where 18 of 2,561 comments were found to be potentially toxic.
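The moderation workflow this enables is simple to sketch: score every comment, then route only the small flagged subset to a human. The function and sample comments below are hypothetical, with made-up scores standing in for Perspective’s output.

```python
# Minimal triage sketch: given (comment, toxicity_score) pairs, split out
# the small subset that needs human review. Scores are hypothetical.
def triage(scored_comments, threshold=50):
    """Return (needs_review, auto_approved) using a 0-100 toxicity score."""
    needs_review = [c for c, s in scored_comments if s >= threshold]
    auto_approved = [c for c, s in scored_comments if s < threshold]
    return needs_review, auto_approved

sample = [
    ("Please add a crosswalk at Main St.", 3),
    ("This plan ignores transit riders.", 12),
    ("Anyone who supports this is an idiot.", 78),
]
flagged, approved = triage(sample)
# flagged holds one comment for staff review; approved holds the other two
```

With the Virginia Beach numbers above, a threshold of 50 would have routed just 12 of 1,403 comments to staff.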
The infographics below show the number of comments by toxicity score, where comments to the left are considered generally healthy dialog, and those to the right are potentially unhealthy. Specific examples are included at various toxicity levels to provide context.
Keeping a Handful of People from Stifling the Process
In the case of NCDOT’s I-440 project, an average reader moving at 250 words per minute would need 628 minutes to manually moderate every comment.
With a tool like Perspective, manual moderation time would have been reduced from 628 minutes to about 15 minutes, even if staff chose to review anything with a toxicity score over 25%. That’s a big deal if you care about freeing up staff time and saying ‘yes’ to more opportunities to listen to residents.
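The arithmetic behind those figures is easy to check. The average comment length below is an assumption we back out from the article’s own numbers (628 minutes × 250 words per minute ÷ 2,561 comments is roughly 61 words per comment).

```python
# Back-of-envelope check of the review-time figures above. The average
# comment length (~61 words) is an assumption derived from the cited
# 628-minute total, not a measured value.
READING_SPEED_WPM = 250

def review_minutes(n_comments, avg_words_per_comment):
    """Minutes needed to read every comment at a fixed reading speed."""
    return n_comments * avg_words_per_comment / READING_SPEED_WPM

full_pass = review_minutes(2_561, 61)  # ~625 minutes, in line with the cited 628
```

At that pace, a roughly 15-minute review corresponds to reading only about 60 flagged comments, which is why trimming the queue to a small subset matters so much.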
Nudging Behavior: Providing Feedback Before They Press ‘Submit’
Part of the beauty of face-to-face dialog is receiving real-time feedback on how our words might affect others. Subtle body language and facial movements tell us when we’re on track. That feedback helps us adjust our words and tone to make sure we’re fully understood.
So here’s an intriguing question: If we know why a sentence is toxic, why not provide feedback to a participant before they hit submit?
To explore this, we’ve created a beta implementation of real-time comment feedback for anyone to experiment with in the embedded comment section below. As you type, pause every now and then to see real-time feedback like this:
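One way to picture the nudge is a simple mapping from toxicity score to the message a participant sees while typing. The thresholds and wording below are illustrative only, not PublicInput.com’s actual rules.

```python
# Hypothetical sketch of a pre-submit nudge: map a 0-100 toxicity score
# to feedback shown as the participant types. Thresholds and wording
# are illustrative assumptions, not the production implementation.
def nudge_message(score):
    """Return real-time feedback text for a given toxicity score."""
    if score < 25:
        return "Looks like healthy dialog."
    if score < 50:
        return "Some readers may find this harsh - consider rewording."
    return "This reads as hostile and may be held for moderation."
```

In a real deployment, the comment text would be re-scored on a short delay as the user pauses, and the message updated in place.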
Test It Out and Share Your Thoughts
We’re interested to know how this technology could address challenges you face. Here are a few questions to spark your thoughts:
- How might this be helpful in a social media context? Could there be ways to simplify monitoring and management?
- Could smart suggestions reduce the need for manual follow-up? (e.g., “Sounds like you’re referencing short-term rentals – here’s our policy.”)
- Would you want to tailor your use of automatic moderation to times when you are more or less available (e.g., nights and weekends)?
About the Author
Jay Dawkins is the co-founder and CEO of PublicInput.com, a provider of community engagement software used by over 50 public agencies, including the cities of Raleigh, San Diego, and Austin, Texas.
Jay comes from a family with a deep tradition of local government service, and prior to PublicInput.com worked as a transportation engineering consultant. Today he lives in downtown Raleigh, NC with his wife Sarah and orange cat, Zeus.