
4 Best Practices for Content Moderation That Elon Musk Is Failing At, and Why Brands Should Keep Their Twitter Accounts (For Now)

December 8, 2022
Posted by: Peter Friedman, Founder, Chairman & CEO

Elon Musk has bought Twitter and sought to remake it in his own image. That was his first mistake. Every move since then flies in the face of content moderation best practices and is destabilizing Twitter. Still, a good understanding of best practices can save Twitter. Meanwhile, brands can and must continue to make Twitter work for them and their customers.

Let’s start with the often-used metaphor of a town square: a place where people come together to engage with and enjoy each other, talking, debating topics, and having conversations. This all works well when the people share a common understanding of their cultural norms of behavior. If someone breaks those rules, some form of law enforcement, or the group itself, corrects the behavior in an orderly way or removes the offender from the square. In social media, the people who maintain the cultural norms and rules are the content moderators. Without understood norms and rules, and without some kind of policing or moderation, town squares, whether real or digital, descend into chaos, an environment that cannot support free and reasoned discourse. At that stage, most people don’t want any part of it.

4 Content Moderation Best Practices

1) Human moderation, supported by technology. The essence of effective moderation is a team of human moderators who are trained, skilled, and empathetic. They consider context and nuance and deal with users as humans. Technology can empower these human moderators by organizing and prioritizing content (a minimal sketch of this division of labor appears after this list). But even the best AI technology cannot replace human judgment in such a socially dynamic environment. And even if tech one day can, it can’t provide the empathy, reassurance, and coaching that users want, nor can it satisfy their demand to be treated with respect by another human who cares about them. Most social networks, founded by and culturally centered on software engineers, have never understood this. They’ve relied primarily on technology, only reluctantly building up teams of human moderators. Generally, moderation has become a stepchild organization, with little respect and nowhere near the support needed. The reality is that social networks can’t just tech their way out of this one.

2) Manage, support, and just treat your human moderators like people. As moderators are the front line of user experience, smart leaders want them to treat users well—intelligently, with empathy, just as any company wants its customer service team, salespeople, or baristas to do. A business fundamental is to treat your people as you expect them to treat your customers. That’s all your people, not just the moderators.

3) Clear, consistent guidelines. Guidelines are the rules, cultural norms, escalation paths, and consequences for bad behavior in our social media town square. They must be published, accessible, and understandable. They have to be managed with feedback and with effective mechanisms for users to appeal decisions. It’s also important to manage the guidelines with some stability: moderators and users cannot absorb guidelines that change daily or are otherwise volatile. Ever-changing guidelines undermine a good environment rather than supporting it.

4) Prioritize quality over quantity. A workable, well-moderated social media town square can’t be grown by simply stamping out more copies like a manufactured product. Unlike even complex manufactured products with thousands of standard parts, a social network is made of human users, and if we think of them as parts, they are parts whose behavior changes constantly and often unpredictably. Cultural norms that enable quality at scale take time to develop, and trying to grow or change such a dynamic, organic environment too fast destabilizes it. Alas, most Internet sites, including social networks, have sacrificed quality of environment for quantity of users. This choice comes about because user counts drive stock valuations, and volume drives advertising revenue. The founding CEO of one of the biggest Internet sites was once advised that his chat rooms needed moderation, especially with rapid growth; without it, bad behavior would become entrenched and harder to tame later on. He responded that while that might be right, his team felt such an approach would impair volume. Clearly they cared about volume above all else.
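To make the “technology organizes, humans decide” idea in point 1 concrete, here is a minimal sketch in Python. The names (FlaggedPost, risk_score, ReviewQueue) and the crude scoring heuristic are hypothetical illustrations, not any platform’s actual moderation system; the point is simply that software triages and prioritizes flagged content while a trained human moderator makes the final call.

# A hypothetical sketch: software scores and orders flagged posts,
# humans review and decide. Not any platform's real moderation API.
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class FlaggedPost:
    priority: float                     # lower value = reviewed sooner
    post_id: str = field(compare=False)
    text: str = field(compare=False)
    user_reports: int = field(compare=False)

def risk_score(text: str, user_reports: int) -> float:
    """Stand-in for an ML classifier: a crude keyword and report-count heuristic."""
    risky_terms = ("scam", "impersonating", "attack")
    hits = sum(term in text.lower() for term in risky_terms)
    return hits + 0.5 * user_reports

class ReviewQueue:
    """Orders flagged content so human moderators see the riskiest items first."""
    def __init__(self):
        self._heap = []

    def add(self, post_id: str, text: str, user_reports: int) -> None:
        score = risk_score(text, user_reports)
        # Negate the score so higher-risk posts pop first from the min-heap.
        heapq.heappush(self._heap, FlaggedPost(-score, post_id, text, user_reports))

    def next_for_human_review(self):
        # Nothing is removed automatically; a trained moderator makes the call.
        return heapq.heappop(self._heap) if self._heap else None

if __name__ == "__main__":
    queue = ReviewQueue()
    queue.add("1", "Great product, love it!", user_reports=0)
    queue.add("2", "This account is a scam impersonating the brand", user_reports=7)
    first = queue.next_for_human_review()
    print(f"Moderator reviews post {first.post_id} first: {first.text!r}")

The design choice worth noting is that the software never takes the removal action itself; it only decides what a human sees first.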

 

Leaders of the organic, fluid environment that is a social network simply cannot know how users will act and react until the product is in use by real people. Smart social media leaders test products before any launch, then test them with small sets of users, then with more, scaling in phases. Each step of the way they change, tune, and even roll back features based on live experience. Witness how many times Facebook and the others have rolled something out only to find the users had another idea; they’ve learned to go in phases. eBay built a massive ecommerce marketplace in part by focusing on the quality of its trust and feedback culture, along with the content moderation that provided its foundation.

These best practices drive the overall cultural model of a social network or online community. After all, the essence of a social network is social: the conversations it hosts among people, the relationships that result, and the way individuals treat each other.


What Elon Musk Is Doing Wrong with Twitter

Elon Musk is failing on all four of these content moderation best practices. His first error was assuming he can do better at something he knows little or nothing about, not realizing how complex a social network is. Sure, cars and space rockets are complicated, with lots of moving parts. (Well, electric cars, not so many.) But social media isn’t a discrete item that one can change with easily predicted results and fixes. Every user is a moving part, and Twitter counts over 200 million such parts. It’s a game of Whack-A-Mole, with 200 million moles.

He’s cut the human moderator staff dramatically. The front line of moderating the user experience has been shown the door, and with that Musk has drastically limited the ability to maintain or effectively change the town square’s rules and cultural norms. In this context, when Eli Lilly escalated a problematic fake account, it took Twitter six hours to take it down, and that was before the latest round of cuts. Along with those cuts, he’s mistreated and insulted the moderators, arbitrarily dismissing people, discounting their value, and showing no respect. He’s done the same to the rest of the employees, to advertisers, and to users. Advertisers like to advertise where users have a good experience; a model built on insults doesn’t provide that. Under Musk’s direction, Twitter has been changing by the minute and becoming an unstable place. By example and action, Musk has told stakeholders that Twitter is a place where anything goes and there is no need or interest for anyone to behave well.

He is also making the same novice mistake that several social network leaders have made: thinking it’s all about code and features. Certainly improvements can be made via tech, especially better tools. But a tech-centric approach misses that a social network is really all about people, conversations, and relationships. He believes the solution is to make Twitter more engineering driven, with great code as the answer, and he plans to automate at the expense of the nuance and context that effective moderation requires. He is blind to the people dynamics and the critical role of moderation.

In total disregard for quality, he rolls out and changes features by the hour, skipping the forethought, testing, and phasing needed to carefully manage change in a fluid environment of over 200 million different customers. One hopes he tests Tesla cars and SpaceX rockets more carefully before putting them on the road and into space. With his verified-users fiasco, he has proven himself a total rookie in social media.

Perhaps he intends to replace Twitter with something different that isn’t really social media. He talks about a payment service and other aspects of a WeChat wannabe. But WeChat, too, is a complex social media ecosystem. Mark Zuckerberg decided to build another WeChat on top of Messenger and WhatsApp, and even with Facebook’s greater leverage, economics, and experience, that didn’t turn out to be so easy.

Can Twitter be saved? Yes, if the above best practices are followed, with a priority focus on people and culture and less on technology. Community transformation models have been deployed successfully elsewhere, such as at HBO (post-Sopranos), Burger King, and others. Musk has to turn leadership over to someone who actually has knowledge and experience running a social network, who understands the importance and best practices of moderation, and who can focus 100% on Twitter. He has even admitted he has too much on his plate.

Brands should stick with Twitter for now. First, while the town square may have some extra chaos, it’s still filled with hundreds of millions of people. The brand’s consumers, patients, doctors, and influencers are still there. According to ZoomRx, 75% of physicians who were active during oncology conferences continued to post on Twitter over a week in mid-November. Musk’s proclamations of unfettered speech have been mostly talk; the rules have not changed much so far. Twitter is big and pervasive enough, with over fifteen years of history, that it isn’t going away in a snap. And even if Musk didn’t care about the $44 billion he paid (he does), that purchase was largely made with other people’s money, money from people who will not stand for their investment being flippantly tossed out the window.

Even if Twitter overall is unstable, a brand has a great deal of influence over, and a vested interest in, its own Twitter presence. Customers will continue to talk about the brand and their product experiences. People will still seek and expect customer service from the brand on Twitter. If the brand isn’t present, it’s abdicating the customer experience to fake accounts, antagonists, misinformation, and disinformation. Put simply, much of today’s marketing and customer support game is played out on Twitter, and brands cannot afford to skip that game. Fortunately, the brand’s customer experience is only partly a function of the overall service; more of it is driven by the brand’s own presence. The brand still decides what content it posts. The brand can still moderate its own presence and user engagement. The brand can still comb Twitter for comments related to its reputation and products. Brands should watch carefully as the situation evolves, using these guidelines to continue interacting productively with their customers.

4 brand actions to effectively manage Twitter:

1) Keep your Twitter accounts active, which is critical to supporting your customers and mitigating damage from imposter accounts, antagonists, and misinformation.

2) Post and engage. Publish meaningful content multiple times a week to daily. Engage actively with users, regularly confirming your authenticity, and directing them to your website for additional validation and more content.

3) Step up content moderation to mitigate risk by managing and engaging with users. Identify and act on fake accounts as well as mis- and disinformation (a minimal monitoring sketch follows this list).

4) If continuing to advertise, do so with an engagement model rather than a broadcast model. Whether to continue advertising is a case-by-case decision. Some brands may feel they need to hold off on advertising as a statement regarding Musk’s behavior and policies and the rise of problematic content on Twitter. With Twitter’s staff cutbacks, a brand may also find it doesn’t get the account service and ad program support it needs. If continuing to advertise, it’s important to take an engagement approach, which enables the brand to be more authentic, verify itself as the real thing, direct users to website content and apps, and build trusted relationships.
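As a concrete illustration of the monitoring in points 1 and 3, here is a minimal sketch, assuming the brand has Twitter API v2 access and uses the tweepy Python library. The brand name, search query, and environment variable are placeholders, and Twitter’s access tiers and rate limits change over time.

import os
import tweepy

# Placeholder credentials; supply your own bearer token via an environment variable.
client = tweepy.Client(bearer_token=os.environ["TWITTER_BEARER_TOKEN"])

# Pull recent, non-retweet mentions of the brand for human review and engagement.
response = client.search_recent_tweets(
    query='"YourBrand" -is:retweet lang:en',
    max_results=25,
    tweet_fields=["created_at", "author_id"],
)

for tweet in response.data or []:
    # Route each mention to the team handling engagement, support, or escalation.
    print(tweet.created_at, tweet.author_id, tweet.text[:80])

From there, human team members decide which mentions warrant a reply, an escalation, or a report of an imposter account, consistent with the human-first moderation approach described above.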

The author, Peter Friedman, is the founder and CEO of LiveWorld, the longest-standing social media-related company in the world, with the most years of content moderation experience. With a combination of human moderators, technology platforms, and digital agency services, LiveWorld has provided hundreds of programs over more than 27 years to Fortune 500 brands in the healthcare, financial services, CPG, auto, entertainment, and Internet categories. Prior to founding LiveWorld, Mr. Friedman was Vice President and General Manager of Apple’s Internet services division, including its moderated online communities.