Wisdom or Madness in the Crowd?

Over the past few days, two articles piqued my interest. Both are about the “wisdom of crowds”, but they look at it from different angles. First, there is Greg Linden (who has been on a roll lately), who talks about the madness of a growing crowd. In his post, he references a few other posts and writes:

There is a repeating pattern with community sites. They start with great buzz and joy from an enthusiastic group of early adopters, then fill with crud and crap as they attract a wider, less idealistic, more mainstream audience.

To put it more bluntly, he quotes a Xeni Jardin article in Wired:

Web 2.0 is very open, but all that openness has its downside: When you invite the whole world to your party, inevitably someone pees in the beer.

The main concern in all of this is that self-policing does not seem to scale very well. This is also implied by Nicholas Carr’s post on crowd control at eBay. So there are obviously a few problems here. First, the trolls and negativity suggested by Xeni’s “pee in the beer” comment tend to cause “conversations” to devolve into screaming matches or “flame wars”. This definitely takes something away from the sense of community on some of the social sites. Then there is the blog spam, regular spam and porn spam that come with wider mainstream acceptance of a site. You can go to any major social media site and find plenty of it.

So how do you get around these types of problems? There are several ways, and they are the topic of another recent article at Slate on the wisdom of chaperones. Slate calls the editorial processes at Wikipedia and Digg the “myth of democracy in Web 2.0”:

Social-media sites like Wikipedia and Digg are celebrated as shining examples of Web democracy, places built by millions of Web users who all act as writers, editors, and voters. In reality, a small number of people are running the show.

Given that I am an active member of some social media sites, this is of significant interest to me. Part of the self-policing idea is simply declining to vote up a story. That will only work while the site is still fairly small. Once the site grows beyond a small number of early adopters, voting becomes secondary in terms of policing. Some sites, including Mixx, allow for the reporting of users or submissions that violate the terms of service. Typically, these violations are various forms of spam or porn. However, spam reports require some sort of administrative intervention from the site in order to determine whether there is a violation and take corrective action. This, again, does not scale very well. The approaches by Wikipedia and Digg are an attempt to alleviate this. The issue at Digg is that the superuser concept was kept secret, which seems almost underhanded. People do not like to be lied to. Mixx recently added a superuser concept as well, but its superusers are meant to link similar stories so that the same story from various sites can be grouped together. This allows the duplicate story detection algorithms to remain fairly simple, but it does not address the scalability concerns regarding spam submissions.
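To see why superusers who link stories keep the algorithmic side simple, consider what automated duplicate detection actually has to do: reduce each submitted URL to a canonical key so that resubmissions of the same story collide. A minimal sketch (the function names and the choice of what to strip are my own assumptions, not anything Mixx has published):

```python
from urllib.parse import urlparse, parse_qsl

def canonical_key(url):
    """Reduce a submitted URL to a key so that duplicate
    submissions of the same story end up identical."""
    parts = urlparse(url.lower())
    host = parts.netloc.removeprefix("www.")
    # Drop tracking parameters (e.g. utm_*) that vary between submissions.
    query = tuple(sorted((k, v) for k, v in parse_qsl(parts.query)
                         if not k.startswith("utm_")))
    return (host, parts.path.rstrip("/"), query)

def group_submissions(urls):
    """Bucket submitted URLs by canonical key; each bucket is one story."""
    groups = {}
    for url in urls:
        groups.setdefault(canonical_key(url), []).append(url)
    return list(groups.values())
```

Anything this misses (the same story hosted at two different outlets, say) is exactly the case the human superusers handle by linking stories manually.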

This does not leave us with many good options. Eliminating spam is always a difficult problem. One idea is to use something like the Akismet plugin for WordPress. It does a very good job of holding comments that look like spam, but I do not know of an equivalent for a site like Digg. Giving power to the users is a decent idea as long as the approach is executed well. Digg failed miserably by hiding things, but Mixx seems to have learned from this and announced its superusers on its own blog. The wisdom of crowds has proven to be a decent concept, and I would hate to see spammers and trolls kill it.
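For a Digg-style site, one could imagine the same hold-for-review pattern Akismet uses: score each submission and queue anything suspicious for a human moderator instead of rejecting it outright. A toy sketch of that idea (the signal phrases, weights, and threshold below are all made up for illustration; a real system would learn them from reported spam):

```python
# Hypothetical signal phrases and weights; a real filter would train these.
SPAM_SIGNALS = {"free meds": 3, "casino": 3, "click here": 2, "cheap": 1}
LINK_WEIGHT = 1  # each hyperlink adds to the score

def spam_score(text):
    """Crude additive score: signal phrases plus link count."""
    lower = text.lower()
    score = sum(w for phrase, w in SPAM_SIGNALS.items() if phrase in lower)
    score += LINK_WEIGHT * lower.count("http://")
    return score

def moderate(text, threshold=3):
    """Hold anything at or above the threshold for human review,
    rather than deleting it automatically."""
    return "held" if spam_score(text) >= threshold else "published"
```

The point of holding rather than deleting is that false positives only cost a moderator's click, which keeps the administrative intervention bounded even as the site grows.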

What other methods could be used to deter spammers and trolls? Any ideas? Anyone?

5 thoughts on “Wisdom or Madness in the Crowd?”

  1. I have no ideas, but I know it pushes me to the limit of sanity to see the spammers and trolls on Mixx who don’t get banned.

    But, as I said in a recent post on the forum, not all people consider the same thing to be spam.


  2. @honestape,

    First, thanks for the comment as I always appreciate the conversation. I think the first step for any of the social media sites is to limit the amount of blatant spam and trolling. That may be algorithmic at its core. Some of the other content is much harder to discern and dumping it into an algorithm means you need to keep an exceptions list for those sites that trigger the algorithm but are not actually spam.


  3. Very insightful post. I tend to think of this whole mess this way:

    As time goes on, more and more of these sites are adding in Digg-like features such as voting down something. Now, if you think that gaming a system is easy, it gets even easier when you have up voting AND down voting.

    The usual trend of the web in these types of matters is:
    1) Site “A” brings big traffic.
    2) Spammers/Gamers join in bulk.
    3) The best users get tired and move on to another site.
    4) Site “A” owners figure out an algorithm to beat out most spam.
    5) It’s too little, too late, since no one cares about the site as much anymore.
    6) We all chuckle because it ends up just being spammers spamming each other, and not affecting as many “normal” web users.


  4. @Pinny Cohen
    We are seeing a lot of that right now, especially with the way Digg is handling things. The only problem is that I do not think we ever get to #6, or we have not gotten there yet. I think we will see many more niche Digg-clones this year, because what is the point of spamming with “free meds” on a sports social media site?

    General purpose/interest sites will always be popular, but they may not be as useful in the long run.


Comments are closed.