Over the past few days, two articles piqued my interest. Both concern the “wisdom of crowds”, but they look at it from different angles. First, there is Greg Linden (who has been on a roll lately), who writes about the madness of a growing crowd. In his post, which references a few other examples, he observes:
There is a repeating pattern with community sites. They start with great buzz and joy from an enthusiastic group of early adopters, then fill with crud and crap as they attract a wider, less idealistic, more mainstream audience.
To put it more bluntly, he quotes a Xeni Jardin article in Wired:
Web 2.0 is very open, but all that openness has its downside: When you invite the whole world to your party, inevitably someone pees in the beer.
The main concern in all of this is that self-policing does not seem to scale very well. This is also implied by Nicholas Carr’s post on crowd control at eBay. So there are obviously a few problems here. First, the trolls and negativity suggested by Xeni’s “pee in the beer” comment tend to cause “conversations” to devolve into screaming matches or “flame wars”. This takes something away from the sense of community on social sites. Then there are the blog spam, comment spam, and porn spam that come with wider mainstream acceptance of a site. You can go to any major social media site and find plenty of all three.
So how do you get around these types of problems? There are several approaches, and they are the subject of another recent article at Slate about the wisdom of chaperones. Slate describes the editorial processes of Wikipedia and Digg as the “myth of democracy in web 2.0”:
Social-media sites like Wikipedia and Digg are celebrated as shining examples of Web democracy, places built by millions of Web users who all act as writers, editors, and voters. In reality, a small number of people are running the show.
Given that I am an active member of some social media sites, this is of significant interest to me. Part of the self-policing idea is simply declining to vote up a bad story. That only works while the site is still fairly small; once it grows beyond a small group of early adopters, voting becomes secondary as a policing mechanism. Some sites, including Mixx, allow users to report other users or submissions that violate the terms of service. Typically, these violations are various forms of spam or porn. However, spam reports require some sort of administrative intervention from the site in order to determine whether there is a violation and take corrective action. This, again, does not scale very well. The superuser approaches at Wikipedia and Digg are an attempt to alleviate this. The issue at Digg is that the superuser concept was kept secret and seems almost underhanded. People do not like to be lied to. Mixx recently added a superuser concept as well, but its superusers are meant to link similar stories so that the same story from various sources can be grouped together. This allows the duplicate story detection algorithms to remain fairly simple, but it does not address the scalability concerns regarding spam submissions.
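One way to reduce the administrative burden of report handling is to let the reports themselves trigger a provisional action, so that a human only reviews items the community has already flagged in numbers. None of the sites above have published how they do this, so the sketch below is purely hypothetical: the class name, threshold, and auto-hide behavior are all illustrative assumptions, not any site's actual mechanism.

```python
# Hypothetical sketch: auto-hide a submission once enough *distinct*
# users report it, leaving a human moderator to review only the
# already-flagged cases. Threshold and names are illustrative.

REPORT_THRESHOLD = 3  # distinct reporters needed to auto-hide


class Submission:
    def __init__(self, url):
        self.url = url
        self.reporters = set()  # user ids who flagged this item
        self.hidden = False

    def report(self, user_id):
        # A set ignores duplicate reports from the same user,
        # so one angry user cannot bury a story alone.
        self.reporters.add(user_id)
        if len(self.reporters) >= REPORT_THRESHOLD:
            self.hidden = True  # pulled from listings pending review


post = Submission("http://example.com/story")
for uid in ["alice", "bob", "bob", "carol"]:  # bob's repeat is ignored
    post.report(uid)
print(post.hidden)  # True: three distinct reporters
```

The obvious trade-off is that a coordinated group of trolls can abuse the threshold, which is why a hybrid with trusted superusers or weighted reporter reputations keeps coming up.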
This does not leave us with many good options. Eliminating spam is always a difficult problem. One idea is to use something like the Akismet plugin for WordPress. It does a very good job of holding comments that look like spam, but I do not know of an equivalent that would work for a site like Digg. Giving power to the users is a decent idea as long as the approach is well executed. Digg failed miserably by hiding things, but Mixx seems to have learned from this and announced its program on its own blog. The wisdom of crowds has proven to be a decent concept, and I would hate to see spammers and trolls kill it.
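For what an Akismet-style "hold for moderation" step might look like on a submission site, here is a toy sketch. To be clear, this is not Akismet's actual logic (which is a proprietary service); the keyword list, link-count rule, and threshold are all made-up assumptions, just to show the shape of the idea: score a submission, publish the clean ones, and queue the suspicious ones for a human.

```python
# Hypothetical sketch of holding suspicious submissions for moderation,
# in the spirit of what Akismet does for WordPress comments.
# The scoring rules below are toy examples, not Akismet's real method.

import re

SPAM_WORDS = {"viagra", "casino", "free money"}  # toy blacklist


def spam_score(text):
    """Crude spamminess score: blacklisted phrases plus link count."""
    lowered = text.lower()
    score = sum(2 for word in SPAM_WORDS if word in lowered)
    score += len(re.findall(r"https?://", lowered))  # link-heavy = suspicious
    return score


def triage(text, hold_at=3):
    """Return 'publish' or 'hold' (queued for moderator review)."""
    return "hold" if spam_score(text) >= hold_at else "publish"


print(triage("Great post, thanks!"))                               # publish
print(triage("FREE MONEY at http://a.example http://b.example"))   # hold
```

The point of the "hold" bucket is the same scaling trick as the report threshold: administrators never look at the bulk of clean submissions, only at the small queue the filter is unsure about.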
What other methods could be used to deter spammers and trolls? Any ideas? Anyone?