Integrity work offers a different solution. We may be in the spotlight now, but we have a long history in the industry. We learn a lot from anti-spam approaches in email and search engines, and we borrow many concepts from computer security.
One of the best integrity strategies we’ve found is to bring some real-world friction back into online interactions. I’ll focus on a couple of examples to explain this, but there are many other mechanisms: group size limits, karma or reputation systems (like Google’s PageRank), an indicator of how long you’ve belonged to a neighborhood, structures that encourage good conversation, and a less powerful Share button. For now, let’s talk about two ideas that integrity workers have developed. We’ll call them test drives and speed bumps.
First, we need to make it harder for people to operate fake accounts. Imagine if, after being arrested for a crime, someone could walk out of jail and completely disguise themselves as a brand-new person. Imagine if it were impossible to tell whether you were talking to a group of people or one person changing disguises quickly. That kind of uncertainty corrodes trust. At the same time, we have to remember that pseudonymous accounts are not always bad. Perhaps the person behind the nickname is a gay teen who isn’t out to their family, or a human-rights activist living under an oppressive regime. We don’t need to ban all fake accounts. But we can raise their costs.
One solution resembles how driving works: in many countries you cannot drive a car until you have learned to operate it under supervision and passed a driving test. Likewise, new accounts should not get instant access to every feature in an app. To unlock the features most abused for spam and harassment, an account should first pay some cost in time and effort. Maybe it just needs time to mature. Maybe it needs enough goodwill accumulated in some karma system. Maybe it needs to do things that are hard to automate. Once the account has qualified with this “driving test,” it is trusted with access to the rest of the app.
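As a rough sketch, such a “driving test” could be a simple gate checked before an account uses an abuse-prone feature. The thresholds and field names below are illustrative assumptions, not any platform’s actual policy:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical thresholds -- a real service would tune these empirically.
MIN_ACCOUNT_AGE = timedelta(days=14)
MIN_KARMA = 50

class Account:
    def __init__(self, created_at, karma=0, passed_challenge=False):
        self.created_at = created_at          # when the account was registered
        self.karma = karma                    # goodwill earned from the community
        self.passed_challenge = passed_challenge  # completed a hard-to-automate task

def can_use_restricted_feature(account, now=None):
    """An account unlocks abuse-prone features only once it has 'matured':
    enough age, enough accumulated goodwill, and a completed challenge."""
    now = now or datetime.now(timezone.utc)
    return (
        now - account.created_at >= MIN_ACCOUNT_AGE
        and account.karma >= MIN_KARMA
        and account.passed_challenge
    )
```

A brand-new account would fail all three checks; an account that has been around for a month, earned some karma, and completed a challenge would pass.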
Spammers can, of course, jump through those hoops. In fact, we expect them to. After all, we don’t want to make things too hard for legitimate pseudonymous users. But by requiring some effort to create each new “disguise,” we bring some physics back into the equation. Three fake accounts can be managed by hand. Hundreds or thousands become very difficult to maintain.
Online, the worst damage almost always comes from power users. This makes intuitive sense: social apps generally encourage their members to post as much as possible, and power users can post more often, to more audiences, and more simultaneously than would ever be possible in real life. In the cities of old, the harm one person could do was limited by the physical need to be in one place and to address one audience at a time. That is not true online.
Online, some actions are perfectly reasonable in moderation but become suspicious in volume. Think of creating twenty groups at once, commenting on a thousand videos an hour, or posting every minute for an entire day. When we see someone using a feature that heavily, we suspect they’re doing something like driving at an unsafe speed. We have a solution: a speed bump. Temporarily stop them from doing that thing. There is no value judgment here; it’s not a penalty, it’s a safety feature. Measures like this can be an easy way to make things safer for everyone while inconveniencing only a small fraction of people.
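The speed-bump idea maps naturally onto a rate limiter. The sliding-window sketch below is a minimal illustration under assumed limits, not any platform’s actual implementation:

```python
import time
from collections import deque

class SpeedBump:
    """A sliding-window rate limiter: allow at most `max_actions` per
    `window_seconds`. Excess actions are simply deferred, not punished."""

    def __init__(self, max_actions, window_seconds):
        self.max_actions = max_actions
        self.window = window_seconds
        self.timestamps = deque()  # times of recently allowed actions

    def allow(self, now=None):
        """Return True if the action may proceed right now."""
        now = time.monotonic() if now is None else now
        # Forget actions that have aged out of the window.
        while self.timestamps and now - self.timestamps[0] > self.window:
            self.timestamps.popleft()
        if len(self.timestamps) < self.max_actions:
            self.timestamps.append(now)
            return True
        return False  # hit the speed bump: try again later
```

For example, `SpeedBump(max_actions=20, window_seconds=3600)` would let someone create twenty groups in an hour but quietly pause the twenty-first, which barely affects ordinary users while blunting bulk abuse.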