Google Makes A Bid To Control The Internet

A few weeks ago, I wrote about the possibility of Google going evil. At that time, I was talking mostly about the fact that Google has free or cheap applications in many different areas. However, I did not really talk about standards they were developing. One commenter, Ed Richardson, mentioned this as a reason why they were not evil:

The interesting situation is whether they abuse their position in relation to the setting of standards for the industry…

At that time, there really was no evidence of this kind of thing except for their development of Google Wave, PubSubHubbub and the new Salmon protocol. I did not think these items leaned much toward “evil” until I read some news this week.

First, we have the release of the Google Go programming language. Go is described as “a systems programming language, expressive, concurrent, garbage-collected”. In the announcement, they talk about what Go can do:

Go is a great language for systems programming with support for multi-processing, a fresh and lightweight take on object-oriented design, plus some cool features like true closures and reflection.

Want to write a server with thousands of communicating threads?
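To give a feel for what that pitch means in practice, here is a minimal sketch of my own (not code from the announcement) of a tiny Go server: the standard library serves each request in its own goroutine, and the handler is a closure that communicates with a counter-owning goroutine over a channel.

```go
// Minimal illustration of Go's "thousands of communicating threads" pitch:
// net/http serves each request in its own goroutine, and the handler is a
// closure that talks to a counter-owning goroutine over a channel.
package main

import (
	"fmt"
	"log"
	"net/http"
)

func main() {
	hits := make(chan int)

	// One goroutine owns the counter; everyone else communicates with it.
	go func() {
		count := 0
		for {
			count++
			hits <- count
		}
	}()

	// The handler is a true closure over the hits channel.
	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintf(w, "hello, you are visitor %d\n", <-hits)
	})

	log.Fatal(http.ListenAndServe(":8080", nil))
}
```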

New languages are created all the time, so you would not think this is a big deal. However, it comes from a company that has a vested interest in the way you develop applications and websites. Even so, it did not seem like a big deal until I put it together with the announcement of SPDY:

SPDY is at its core an application-layer protocol for transporting content over the web. It is designed specifically for minimizing latency through features such as multiplexed streams, request prioritization and HTTP header compression.

So over the last few months, a few of us here at Google have been experimenting with new ways for web browsers and servers to speak to each other, resulting in a prototype web server and Google Chrome client with SPDY support.
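You cannot speak SPDY from a stock toolchain today, but the latency problem it attacks is easy to demonstrate. Here is a rough sketch in Go over plain HTTP, with placeholder URLs, comparing sequential fetches (each one pays its own round-trip cost) with concurrent fetches (roughly the effect multiplexed streams aim for inside a single SPDY connection):

```go
// Rough illustration of why multiplexing matters: fetch a set of resources
// sequentially, then concurrently, and compare the elapsed time. SPDY pushes
// this kind of parallelism down into a single connection.
package main

import (
	"fmt"
	"net/http"
	"sync"
	"time"
)

// Placeholder URLs, just for illustration.
var urls = []string{
	"http://example.com/",
	"http://example.com/a",
	"http://example.com/b",
}

// fetch issues a GET and discards the response; errors are ignored here
// because we only care about elapsed time in this sketch.
func fetch(url string) {
	resp, err := http.Get(url)
	if err == nil {
		resp.Body.Close()
	}
}

func main() {
	// One request after another: every fetch pays its own round-trip latency.
	start := time.Now()
	for _, u := range urls {
		fetch(u)
	}
	fmt.Println("sequential:", time.Since(start))

	// All requests in flight at once.
	start = time.Now()
	var wg sync.WaitGroup
	for _, u := range urls {
		wg.Add(1)
		go func(u string) {
			defer wg.Done()
			fetch(u)
		}(u)
	}
	wg.Wait()
	fmt.Println("concurrent:", time.Since(start))
}
```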

This definitely puts Google into uncomfortable territory. If Google were not a major search engine that cares about how quickly it can crawl sites, or a major web application provider that cares about how quickly its applications respond to user actions, I would feel much better. However, they have a very significant interest in how quickly things move on the web. I can understand that everyone benefits from this, but Google probably benefits more than the user. This still does not really push them into evil territory, though.

At the end of the week, I read about the one thing that makes all of this very possibly, damned close to evil. Your PageRank may soon depend on how fast your site loads:

Not long after that Google’s Matt Cutts has an interview with Mike McDonald from WebProNews where Matt lets the world know that Google is seriously looking at making the page load time a part of the algorithm used to calculate a web site’s PageRank. The idea being that the faster your site loads, the better that will affect your overall score as calculated by Google and the better placement you’ll have in search results.
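If load time really does start to feed into ranking, the first thing a site owner needs is a number to watch. Here is a crude sketch of timing a full page fetch in Go; it measures server response and transfer only, not browser rendering, and the target URL is just this blog as an example.

```go
// Crude page-speed check: time a full GET of a page, including reading the
// body. Browser rendering time is not covered, but this is a useful first
// number to keep an eye on.
package main

import (
	"fmt"
	"io"
	"log"
	"net/http"
	"time"
)

func main() {
	// Example target; swap in whatever page you want to measure.
	const page = "https://regulargeek.com/"

	start := time.Now()
	resp, err := http.Get(page)
	if err != nil {
		log.Fatal(err)
	}
	// Drain the body so the timing covers the full transfer, not just headers.
	n, _ := io.Copy(io.Discard, resp.Body)
	resp.Body.Close()

	fmt.Printf("%s: %d bytes in %v\n", page, n, time.Since(start))
}
```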

Now let’s look at Ed Richardson’s comment again: “The interesting situation is whether they abuse their position in relation to the setting of standards for the industry.” So, if you own over 70% of the search market, you are releasing a new language that lets you build faster, more concurrent servers that all speak this new, faster SPDY protocol, and you are saying that speed will be part of your ranking algorithm for search results, then you are swinging the monopoly hammer. There is another side issue to this problem that comes from a quote in the Inquisitr post:

Douglas Karr from Marketing Tech Blog doesn’t think so and he goes as far as to suggest that this is leaning to being evil.

How does a small personal blog hosted on GoDaddy for a few dollars compete with a company hosted on a platform that costs thousands of dollars with loadsharing, caching, web acceleration or cloud technologies?

This means that smaller bloggers may be pushed down in the rankings because of these new developments and not given the same chance to compete against larger companies in search. Now, I am not Google-hating; I am trying to raise awareness of what is happening. Google is using its dominant position in search, and on the internet in general, to move things in the direction it sees fit. We need to be very careful about this trend.

And this does not even touch on the Chrome browser, which is focused on making JavaScript run faster, or the new Chrome OS, an internet-focused operating system likely to run on netbooks and other devices. I know people may consider all of this conspiracy theorizing, or just hating on a dominant player, but it is a disturbing trend. I just thought everyone should know 🙂

Disclosure: I use Google Search, GMail, Google Reader and Google Apps on a regular basis. I also do not see this changing anytime soon as they are the best products for me right now.

23 thoughts on “Google Makes A Bid To Control The Internet”

  1. hello,

    just my 2 cents!

    Well, these technologies that Google has been developing are based on standards and use FLOSS licences, so I really don’t think there is any kind of danger; everybody can use and implement them. One or more of the creators of Google’s Go language also created B, C and UNIX.

    And what about On2 Technologies, recently bought by Google? Those codecs could be what the HTML5 video tag needs to get rid of Flash and Silverlight. With a FLOSS licence, of course.

    Looking at it from a certain angle, it could in fact be very positive for the whole web and all the standards-compliant browsers and sites, and very, very bad for Microsoft.

    And I really don’t agree about the eventual rankings problem, because right now we have the same situation: some sites are faster, some sites are slower.


  2. I think you’re taking the wrong perspective on this whole Google issue…

    First, look at it from Google’s perspective:
    * their main interest is indexing the largest amount of content possible, and doing it fast so the content’s relevant
    * the majority of the modern internet, everything from servers to browsers to protocols, is at least ten years old at the core
    * Google’s results are only good if they’re relevant and sorted properly, so they have a vested interest in attempting to make it so

    Now, look at their behaviors:
    * “Webmaster Tools” to help Google better access sites you own, arguably the first attempt by Google to make things faster/better on their own
    * Chrome takes marketshare away from IE, which is beneficial to all mankind
    * JS engine developments are a good thing – especially when they’re spread back to the community. JS is notoriously slow in some browsers, and even in the fastest of them can still be sluggish
    * the HTML standards have been bogged down in meetings for years and got nothing done – so Google has begun pushing new HTML features on its own, rather than wait for the W3C, who could take many more years to produce the same results
    * Google has contributed heavily to languages and tools like Python, PHP, and MySQL because of their intense usage of them – creating a new language on top of that is only a half step away from helping optimize pre-existing languages
    * wanting to augment the HTTP protocol, and presumably make it backwards compatible, is a good goal: HTTP is pretty slow sometimes, especially over things like 3G networks. If it can be expanded to do more, or do things better, that benefits everyone

    Ok, in case that isn’t easily pieced all together, here’s the gist: Google may be doing lots of things out of semi-self-interest, but you need to remember that they’re not only doing these things in the open, they’re licensing many (most? all?) of them so that they aren’t in total control. A new language that could be used to rewrite/replace Apache, in combination with a new web protocol, may seem dubious, but not so much when you consider their licensing.

    The comparison to Microsoft and other “monopolies” (to reference your last topic ;-)) is easy: Microsoft participated in similar activities (IE, ActiveX, VB.Net, Windows Server, etc.), but did so either behind closed doors or with strict licensing. The fact that Google is opening the door to Go in the same manner that PHP or Python does is an extremely crucial distinction, and they’re very careful about doing this for all of their potentially game-changing infrastructure inventions.

    Now if Google were to do all of this but not distribute source code, or sue a bunch of people for using it in a way they didn’t expect or don’t appreciate, that would be different. But they’re not, and they’re not going to – even if they don’t have control over creations like Go, they’ll be happy to see it used because it benefits them too.

    But you can’t exactly force a new programming language or any of these other products on people – they have to choose to use them, a decision that Microsoft’s products have never really needed.

    –Kyle


  3. Ricardo

    Even though everything is open source, they have a very big chance to change the direction people choose. That is my real concern. The problem with page load times is that they are not considered right now, but could be in the future. That would penalize people using smaller hosts or people not using the fastest technology.


  4. Kyle

    I am thinking you should just write a blog post about this 🙂

    I do understand the idea that all of this is open source. However, they stand to benefit much more than anyone else implementing the technology. By itself it seems OK, but I am trying to raise awareness that Google is in some uncomfortable territory now.

    Also, thanks for the lengthy comment, I really appreciate it.


  5. You’re forgetting Google’s Native Client …
    🙂

    I’m more concerned by the total scope of knowledge gathered by Google about any of us…

    Translation, Semantic Web and Facial recognition are key to total control…

    Google robots are scanning almost any digital format and are thus capable of assembling the best reports on any individual on earth, far more detailed than any intelligence service in the past.

    The net also remembers anything ever put on it (except for my earliest homepage, that is), in an unseen way, even when we have forgotten about it.

    Is this Google’s fault ? No, it’s inherent to digital information and the net itself.

    But I have to say that Wave, with only one copy of any document, makes me think of ‘1984’ and that the rewriting of history is quite possible…

    It’s hard to destroy all 5000 hard copies of a book, but it was Amazon – if I’m not mistaken – that killed off that number of copies of ‘1984’ in one glimpse…

    If there is only one copy… risks of total erasability are immense.

    And I should know: I’ve lost my link collection not once, not even twice, but three times…

    [favoritesanywhere.com disappeared, murl.com crashed and mybookmarks.com did reset 🙂 ]

    I’m now quite wary of preserving links in only one place… all is well until the service has a major glitch…


  6. “How does a small personal blog hosted on GoDaddy for a few dollars” manage to keep up with the traffic if they are put on equal footing in page rankings with much larger sites hosted on much more robust infrastructure?

    Is it necessarily in the best interest of these smaller sites to be overexposed?

    If the level of traffic that might come with higher page rankings is of importance to the owner of such a site, there’s obviously a path they can take to achieve that – ensure their site can handle the user load and pages load quickly. It’s not as if they are permanently relegated to obscurity by this possible change in page ranking strategy.


  7. ersouza,

    As an example, this blog is a PR5, I think. There is one post of mine (https://regulargeek.com/2009/02/11/what-programming-language-should-i-learn/) that is the second result when searching for “what programming language should i learn”. My page takes almost a full second to load. That post is a big reason why my blog has grown, and obviously I want that visibility from Google search results. If page load time becomes a part of PageRank, my ranking will drop, maybe even off the first page of results. That would be a huge change for me.


  8. I don’t think a “full second” is going to get you significantly lowered in the page ranking here. Even bigger sites can’t typically promise much better than that.


    Multi-national corporations, and especially Google, remind me of the story about frogs and boiling water. As long as we give them power gradually, few will realize how dangerous it is until they start using it more blatantly.

    I’ve written many posts in the past about privacy and data mining that can be found in a category of the same name in my blog. Few realize how much information they have already revealed that is being merged into databases across many sources.

    Google has all the power they need right now to put any online business under that they choose because they already control most of the traffic ecommerce sites receive.

    Right now they primarily use it to increase their earnings. They already Google-dance organic listings and play games with PPC ad positions – especially during the holiday shopping season.

    They can pull the plug on anyone’s traffic any time they wish and their personalized search makes it really challenging to verify who is seeing what. (The drop in your visitors will be painfully obvious.)

    For those who don’t believe me I encourage you to read my post on Why You Can NOT Rely Solely on Organic Search Listings for Traffic which links to three separate occasions that Darren Rowse wrote about his ProBlogger traffic dropping during the holidays. I linked it to this comment so you can find it easily.

    Rob makes another excellent point about load times. Those of us who prefer to write long, complex posts and interact continually with our readers through comments would be penalized because that type of page is going to take longer to load.

    Do we want to make it harder and harder to locate those capable of thinking and discussion in favor of sound-bite Web sites?

    I recommend we all keep track of our favorite sites so we can always find them without using search engines. Best to do this in online tools like StumbleUpon and Delicious AND offline tools that we control.

    I use the keyword searchable free Tomboy Notes app to compile my research and just started using BackLinkReporter to keep track of the blogs I most want to participate in. If you don’t want to keep track yourself be sure to know a few people who do.


  10. InternetStrategist,

    You hit one of the main topics that I alluded to, people use organic search as a major source of traffic in many cases. Whenever they make a fairly significant change to their algorithm, people quickly feel the pain. A change like load times would have people scurrying everywhere looking to squeeze another 100ms of performance out of their site.


  11. “This means that smaller bloggers may be pushed down in the rankings because of these new developments and not given the same chance to compete against larger companies when it comes to search.”

    True, but Google already offers filters on their search results for things like blogs and forums. With these filters, sites end up being ranked against the same kind of sites (which may still have some downsides, but nothing close to competing against larger companies)! I can only imagine they’ll come out with more filtering options so that everyone gets to see the results they want 🙂


  12. theeresa,

    Generally, the filtering options sound like a good idea, but there have been studies that show a large percentage of people never look at filtering options or advanced options (I don’t have a link to the research readily available). Also, would you want a large portion of your traffic dependent upon someone looking at the “blog filter” for search results? I am a firm believer that people are generally lazy and will likely only look through the first page of basic search results.


    I’m becoming more and more concerned about the scammers phoning me 5-6 times a day promising to put me on the first page of Google at £100 a time! Try to get their email addresses and phone numbers and they hang up on you; they’re just a modern version of thieves. Let’s try to make a concerted effort to get them put away. Even the police won’t trace their numbers any more; I think they’ve given up because they’re understaffed and can’t be bothered.

    Also, if you receive a letter from the Revenue offering you a rebate, they’ll probably ask for your account and contact details and then milk you for all you’re worth!


  14. Uhhh,

    While I am wary of Google and am not rushing to use a lot of their free stuff (I use the search features, their web advertising (both directions on that) & webmaster tools), I have always understood that page load time, while not the most important factor in SEO, is a factor to consider.

    In fact, there are a number of WordPress plugins that focus on speeding up the serving of WordPress posts and pages, with at least one of those plugins creating a static page for every post and page in a WordPress-run system.


  15. Google manipulates public opinion in Brazil

    In 2010 there will be an election in Brazil, and one of the candidates, the governor of Sao Paulo, Jose Serra of the PSDB (the conservative party, a rival of President Lula), has always been protected by the media (TV, newspapers and magazines in Brazil are in the hands of conservatives).

    On 13/11/2009 there was a serious accident at one of the governor’s public works projects, called the Rodoanel (like “Ring Road”). There are many allegations of corruption around this project.

    I saw on a site that Serra had ordered Google not to publish the accident at the Rodoanel. At first I thought it was a joke. I like Google and did not believe there was this kind of manipulation. I always knew of the manipulation in newspapers and magazines, but in Google? I could not believe it…

    Well, try to find a picture of the Rodoanel accident there. I tried, but I could not find anything. Not a single image. The complaint is on this site (Portuguese): http://cloacanews.blogspot.com/2009/11/serra-mandou-google-obedeceu-sistema-de.html

    Isn’t this scary?
    (sorry for my poor English)


    Interesting take on Google’s market position. Clearly, the faster the web is, the more money Google makes. It does seem that eventually the only way to have a decent PageRank will be either to have your site hosted by Google (with the associated implications) or to have a giant company backing you with the tech for a fast page. However, there is a slightly less evil take: given two search results that would normally have equivalent PageRank, should the faster page rank higher? Clearly, this could lead to a latency ‘arms race’ across large sectors of the web in order to stay on the top page of results.

    The real issue for me is that this could very negatively impact the ability of people and small companies to grow their online presence. Starting with a small budget would no longer be viable, as increasing demand would decrease site speed and thus could create a cycle by which a site’s PageRank stays low. The rapid growth possible on the web means that it is nearly impossible to catch up to demand if you get behind. A good example might be Twitter and its early abysmal performance and reliability. The solution? Google will gladly let you run your site on their servers in App Engine! Oh, wait….

