Louis Gray wrote two posts this weekend that were related, but were not planned to be so intertwined. The first post was about the lack of interoperability between applications that use the same basic data. As an example, take a basic address book program: you likely cannot export its data in order to import it into another program. So, if you are moving from an iPhone to a new Droid, you could run into issues transferring basic address book data from one device to the other. Yes, there are standard data formats and specifications for much of this type of information, but support is sometimes spotty or lagging. There is also the time lost shuttling all of this data around. Louis proposes the idea of a personal cloud where all of your data is stored and your applications are configured to use that data.
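Those standard formats do exist for contacts: vCard is the usual interchange format for address book entries. As a minimal sketch (the `Contact` structure and field choices here are illustrative, not any vendor's actual API), exporting a contact so another program could import it looks roughly like this:

```python
from dataclasses import dataclass

@dataclass
class Contact:
    full_name: str
    phone: str
    email: str

def to_vcard(contact: Contact) -> str:
    """Serialize one contact as a minimal vCard 3.0 block."""
    # Each contact is a BEGIN/END VCARD block; the spec uses CRLF line endings.
    lines = [
        "BEGIN:VCARD",
        "VERSION:3.0",
        f"FN:{contact.full_name}",
        f"TEL;TYPE=CELL:{contact.phone}",
        f"EMAIL:{contact.email}",
        "END:VCARD",
    ]
    return "\r\n".join(lines) + "\r\n"

card = to_vcard(Contact("Jane Doe", "+1-555-0100", "jane@example.com"))
```

The real friction Louis describes is not the format itself but that applications implement it unevenly, or not at all.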

The idea of a personal cloud is interesting and should be pursued, or at least researched further. It was a matter of circumstance that his second post described a case where reliance on cloud data transfer failed, or was at least delayed. In this case, a delay in the Reader2Twitter service made Louis' Google Reader shared-items Twitter account go quiet. Granted, this was a case where there was no revenue to lose and the implications of the information not flowing properly were minimal. The question remains: what if this data were tied to revenue, or you could not access your contact list?

The idea of a personal cloud is not entirely new. Google has been trying to build it for a few years, Yahoo has followed the same path and Facebook is not far behind. However, a few problems remain with this scenario. First, what about connectivity? Do you always have enough bandwidth to really keep all of your data in the cloud? Are you actually always connected? Losing connectivity or having limited bandwidth is fairly common, even in the US. You cannot follow a major technical conference without hearing about the 3G failures of AT&T.

So, if you want to mitigate the risks associated with loss of connectivity, data must be available or backed up locally. This type of setup is already fairly common in the email server world. Many people are familiar with the delays of Microsoft Outlook as it synchronizes with a Microsoft Exchange server. The synchronization allows you to work in your client application while disconnected and have few issues when the data needs to be merged again. So, living purely in the cloud is not really desirable, but if there were a standard synchronization process, you could get the best of both desktop and cloud applications.
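The core of such a synchronization process can be sketched simply. Assume each store maps a record ID to a (timestamp, value) pair; a last-writer-wins merge then reconciles offline edits with the cloud copy. Real protocols such as Exchange ActiveSync are far more involved (conflict prompts, deletions, partial sync), so this is only the idea, not an implementation:

```python
def merge(local: dict, cloud: dict) -> dict:
    """Last-writer-wins merge of two stores mapping record_id -> (timestamp, value)."""
    merged = dict(cloud)  # start from the cloud copy
    for record_id, (ts, value) in local.items():
        # Keep the local record only if it is new or strictly newer.
        if record_id not in merged or ts > merged[record_id][0]:
            merged[record_id] = (ts, value)
    return merged

local = {"a": (5, "edited offline"), "b": (1, "old")}
cloud = {"a": (3, "stale"), "b": (2, "updated in cloud"), "c": (4, "new")}
merged = merge(local, cloud)
# merged: {'a': (5, 'edited offline'), 'b': (2, 'updated in cloud'), 'c': (4, 'new')}
```

The point of standardizing something like this is that any client and any personal cloud service could interoperate, instead of each vendor inventing its own sync.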

Another issue that needs to be handled very carefully is security and identity. Obviously, the data needs to be secure, but its location also needs to be well known. Most people will not be able to remember that 10.2.1.123/5673957390123 is the location of their personal cloud. Most cloud services have already anticipated this and have moved to personalized URLs like http://facebook.com/robdiana. With known URLs for personal data, verifying identity must be very secure but must remain simple in order to spread to the masses. When securing networks, the machine ID or MAC address is used to filter for known devices. Mobile devices have similar identifiers, but we cannot depend on the typical consumer to deal with those IDs. Something similar to OAuth must be developed, perhaps an SMS challenge/response model. I am not proposing this as a solution; there are people already devoted to digital identities who could work through the issues better than I could.
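To make the SMS challenge/response idea concrete, here is a minimal sketch of the server side: issue a short-lived one-time code, deliver it out of band (an SMS gateway is assumed but not shown), and verify the user's reply. This is only an illustration of the flow, not a vetted authentication design:

```python
import hmac
import secrets
import time

CODE_TTL_SECONDS = 300  # code expires after five minutes

def issue_challenge() -> tuple[str, float]:
    """Generate a six-digit one-time code and its expiry time."""
    code = f"{secrets.randbelow(1_000_000):06d}"
    # In a real system, the code would now be sent to the user's phone via SMS.
    return code, time.time() + CODE_TTL_SECONDS

def verify(expected_code: str, expires_at: float, reply: str) -> bool:
    """Accept the reply only if it matches and the code has not expired."""
    # compare_digest is a constant-time comparison, avoiding timing leaks.
    return time.time() < expires_at and hmac.compare_digest(expected_code, reply)

code, expires = issue_challenge()
ok = verify(code, expires, code)  # the correct reply, within the window
```

Even this toy version shows why the identity people should own the problem: code delivery, replay, and phone-number takeover are all outside what the happy path covers.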

The combination of standard data synchronization processes, standard data formats, and standard security and identity verification processes would be a very powerful model. You could quickly switch between applications as well as personal cloud services. However, what would the benefit be for companies developing such applications? There is targeted advertising, but revenue growth from that model is extremely difficult to sustain. Even if a solid non-ad revenue model were developed, what differentiation could one service provide over the others? If all of these standards existed, we would have made such services a commodity. Reliability can be a differentiator, but only up to a point. Even at 99.9% uptime, that is only about 43 minutes of downtime per month, and 99.99% uptime is about 4 minutes. Anything more reliable is not a concern for the mass consumer.
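Those downtime figures fall out of simple arithmetic, here worked out for a 30-day month:

```python
MINUTES_PER_MONTH = 30 * 24 * 60  # 43,200 minutes in a 30-day month

def downtime_minutes(uptime_pct: float) -> float:
    """Minutes of allowed downtime per month at a given uptime percentage."""
    return MINUTES_PER_MONTH * (1 - uptime_pct / 100)

downtime_minutes(99.9)   # ~43.2 minutes per month
downtime_minutes(99.99)  # ~4.3 minutes per month
```

A 31-day month pushes the 99.9% figure to roughly 45 minutes, which is why quoted "three nines" numbers vary slightly.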

When something becomes a commodity, the main difference becomes price. If you look at desktop and laptop PCs, the differences are mostly limited to processor speed, disk storage, RAM and price. Because the differences between those specifications are hard for the average buyer to really understand, price becomes the major driving factor. If the only differentiation is price, innovation slows down, because innovation is more expensive than commodity services or parts.

If we lose innovation, what happens to the services that we all rely on? Where does the next big breakthrough come from? It would require another major technological shift like the Internet, Google and now mobile technology have provided. Those shifts are rare, often quite disruptive and normally extremely difficult to complete successfully.

I will admit that many of the standards-driven ideas are solid. Assuming companies can still innovate on top of the standards, we would not have the typical innovation problems, because competition would remain. However, we need to be careful what we wish for, as we might just get it.

http://blog.louisgray.com/2010/01/future-operating-system-and-application.html