The Web Is Finally Evolving To A More Semantic Future

Three companies have been making changes to the way people use the web. I am not talking about mobile devices, as they still use the same basic internet that desktops and laptops do. I am talking about Google, Facebook, and Twitter. Each of these companies is trying to own something in the future infrastructure of the internet. Each is trying to own the data passing down the pipes. And each is trying to do this in a completely different way.

First, let me present some background information. A few months ago, Google announced several different protocols and products (Wave, SPDY, Go, PubSubHubbub, Public DNS, Salmon and WebFinger). I have been talking about Twitter becoming infrastructure for over a year, and they recently announced plans for their SMS services. This is in addition to the fact that most social services now hook into Twitter. Last week, Facebook announced a number of things regarding the infrastructure of the social web and real-time streams. If you look at all of this news, you may realize that these companies are not direct competitors. Why would I say this? Let me give you an analogy that is very technical in nature: these companies are rebuilding the web, but they are attacking different layers of the network protocol stack.

If you try to relate this to the OSI stack image on the right, Google is playing with the “Lower Layers”. Maybe this is the geek culture, or the fact that their core product, Search, is a very low-level feature. Roughly speaking, they are defining the Physical Layer with things like SPDY and Public DNS, and the Network Layer with Wave, Salmon and WebFinger. They are defining the way that sites will talk to one another.

Facebook is focused on the “Upper Layers” and is now defining the Application and Presentation Layers with their developer platform. They are also getting into the Session and Transport Layers with Open Graph and the various widgets and APIs they provide for third-party sites. Twitter spans both the upper and lower layers. Their focus on SMS keeps them in the Network Layer, and the Twitter API spans the Transport and Session Layers. That API has enabled an entire ecosystem of applications in the Presentation and Application Layers.
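To make the Open Graph piece of this concrete: the protocol works by embedding machine-readable meta tags in ordinary HTML pages. As a rough sketch (the sample page and its values below are made up, but the og: property names are real), a few lines of Python can pull that structured data out of a document:

```python
from html.parser import HTMLParser

class OpenGraphParser(HTMLParser):
    """Collects Open Graph <meta property="og:..."> tags from an HTML page."""
    def __init__(self):
        super().__init__()
        self.tags = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        prop = attrs.get("property", "")
        if prop.startswith("og:") and "content" in attrs:
            self.tags[prop] = attrs["content"]

# A hypothetical page head using real Open Graph property names:
page = """
<html><head>
  <meta property="og:title" content="The Web Is Finally Evolving" />
  <meta property="og:type" content="article" />
  <meta property="og:url" content="http://example.com/post" />
</head><body></body></html>
"""

parser = OpenGraphParser()
parser.feed(page)
print(parser.tags["og:title"])  # The Web Is Finally Evolving
```

This is the sense in which Facebook sits in the upper layers: any third-party site that adds a handful of these tags turns its pages into typed objects that other applications can read.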

Setting aside the privacy issues that come with owning these layers (and each company has them), we are seeing something that could change the way the entire web works. Google sees bits being transferred faster. Twitter sees messages being passed to whatever device or application you happen to use. Facebook sees the relationships among all of the activity that users create. The evolution of this new web has taken a different form, and it will quickly take shape due to the number of combined users these sites have.
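As a simplified illustration of what “seeing the relationships among activity” buys you, here is a sketch that aggregates activity-stream entries in the actor/verb/object shape used by the Activity Streams format (the entries themselves are invented for this example):

```python
from collections import Counter

# Hypothetical activity-stream entries; the actor/verb/object shape
# follows the Activity Streams idea, but the data here is made up.
activities = [
    {"actor": "alice", "verb": "post",  "object": "article:42"},
    {"actor": "bob",   "verb": "share", "object": "article:42"},
    {"actor": "carol", "verb": "share", "object": "article:42"},
    {"actor": "bob",   "verb": "post",  "object": "photo:7"},
]

# Any single entry carries little information, but aggregating across
# users surfaces structure: which objects attract the most activity.
popularity = Counter(a["object"] for a in activities)
print(popularity.most_common(1))  # [('article:42', 3)]
```

One entry by itself says almost nothing; it is the aggregation across millions of users that turns these snippets into something semantically useful.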

The original plans for the semantic web came largely from academics, and it was a grand plan. However, there was no way to realize that vision without thousands of developers pushing in the same direction. Now, the combined forces of Facebook, Google and Twitter are making the evolution to a semantic web much simpler. Much to the chagrin of many bloggers, users are already part of this evolution. This is the only way to make it happen: by moving along users who are typically resistant to change. Privacy issues can be resolved, and the core features and relationships delivered by these companies let developers create applications that are genuinely helpful, not just another tool for another job.

I doubt many people thought that a social network, a search engine and a microblogging application would eventually form the backbone of a semantic web. However, they are leading the charge. Now, let’s see what developers can make out of this.

The OSI Protocol Stack image is borrowed from an external source.

5 thoughts on “The Web Is Finally Evolving To A More Semantic Future”

  1. I think that the emergence of social and, in particular, activity streams necessitates the use of better semantic tools. Activity streams tend to contain very little data, and it’s unstructured to boot. Applying semantic analysis to these small data sets is a great way to augment the value of the data contained in these conversational snippets.



  2. Bill

    If various standards, like activity streams, get adopted, there could be a lot of information flowing between sites in known formats. You mention that “activity streams tend to contain very little data”, but the real power of those streams is the aggregation of that information. Maybe not from one person, but millions of people create a lot of data. As you say, applying semantic analysis can make this data very interesting.


  3. One of the challenges of creating a semantic web is the ability to provide meaningful relationships between the social objects on the net.

    There is not enough data to create a semantic search for the web. (Btw Rob, can I create a link to another web page? There is a good post on RWW that discusses this issue.) There is also the problem of metadata abuse. These are some of the challenges in creating a meaningful semantic web.

    Putting these issues aside, I personally feel excited by the recent moves by Twitter, Facebook, and Google to promote the semantic web.


  4. Zainul

    Semantic search has been the holy grail for years. The real question is whether a semantic web backed by a social network will work better than a pure semantic search engine. There is a lot of data to start with in things like Freebase, but the Twitter annotations and new Facebook features could open up the data floodgates.

    Just so you know, you can put a URL in your comments if you choose to. If you put too many links, then it just gets moderated.


Comments are closed.