Thursday, April 23, 2009

Relentless Innovation

The mantra at Google is 'release early, iterate often'. This constant evolution inevitably leads to constant innovation. It would be easy to argue that their ability to innovate is a natural result of their size and financial resources, but that conclusion is shortsighted. Joining companies like 3M and Apple as icons of creativity, Google continues to shape business and society all over the world.

Some innovations might seem small at times, but that doesn't mean they are not relevant. For instance, after adding Java support to the AppEngine, yesterday they released a minor upgrade to their Python SDK with new libraries for cryptography. Now, that might not sound like a big deal; in fact, some will claim it should have been there a year ago. Either way, this small step continues to enable developers to create more powerful and more secure applications. One step at a time.
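
As a taste of what those libraries enable, here is a minimal sketch of AES encryption with PyCrypto, which I understand is the library behind the new support; the key and IV below are throwaway demo values, not something you would ever hard-code in a real application.

# Minimal sketch: symmetric encryption with PyCrypto.
# The key and IV are demo values only; manage real keys properly.
from Crypto.Cipher import AES

key = '0123456789abcdef'     # 16-byte AES key (demo value)
iv = 'fedcba9876543210'      # 16-byte initialization vector (demo value)

def pad(text, block_size=16):
    """Pad text to a multiple of the AES block size."""
    extra = block_size - (len(text) % block_size)
    return text + chr(extra) * extra

cipher = AES.new(key, AES.MODE_CBC, iv)
ciphertext = cipher.encrypt(pad('user session token'))

# Decrypt with a fresh cipher object (CBC mode keeps internal state).
padded = AES.new(key, AES.MODE_CBC, iv).decrypt(ciphertext)
plaintext = padded[:-ord(padded[-1])]    # strip the padding again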

Meanwhile on 4/21 they launched the Google Analytics Data Export API. Granted, other web analytics vendors (e.g. Omniture) have had APIs for a long time but they are not that easy to use nor are they free. 
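
To give a sense of how approachable it is, here is a rough sketch of a data feed query using nothing but the standard Python libraries. The profile id and auth token are placeholders, and I am going from the GData documentation for the endpoint and parameter names, so treat the details as illustrative.

# Rough sketch of a query against the Analytics Data Export API data feed.
# 'ga:12345678' and YOUR_TOKEN_HERE are placeholders; a real call needs a
# ClientLogin/AuthSub token for the analytics service.
import urllib
import urllib2

params = urllib.urlencode({
    'ids': 'ga:12345678',
    'dimensions': 'ga:source',
    'metrics': 'ga:visits',
    'start-date': '2009-04-01',
    'end-date': '2009-04-21',
})
request = urllib2.Request('https://www.google.com/analytics/feeds/data?' + params)
request.add_header('Authorization', 'GoogleLogin auth=YOUR_TOKEN_HERE')
feed = urllib2.urlopen(request).read()    # an Atom XML feed of visits by traffic source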

OK, so you might still think 'two little APIs, what is the big deal?' Well, the big deal is that by adding two or three APIs every other day, you can accumulate significant functionality in a couple of months. However, the real value is the way Google continues to enhance every single one of their products every single day. Release early, test, measure, improve, release again. This is non-stop. For a slightly more impressive announcement, just take a look at the following post regarding the web standard for 3D graphics. The video is stunning, but remember it is running in a web browser.

Thank you Google for pushing the envelope. I can only hope this approach to innovation permeates through private and public organizations alike.
 

Tuesday, April 21, 2009

Booz Allen Comments on McKinsey's Cloud Report

Booz Allen makes several interesting remarks about McKinsey's Cloud Report. The following two jumped out at me. First:
They state that cloud offerings “are most attractive” to small and medium-sized businesses, and “there are significant hurdles to the adoption of cloud services by large enterprises.” That would come as quite a shock to Target, Eli Lilly, the New York Stock Exchange, the American Stock Exchange, NASDAQ, Toyota, E*Trade, Computer Associates, and a host of other large enterprises that have been in the cloud for a couple of years.
Second:
Where this example appears to break down is that, for the data center, they are calculating the cost per core, while for Amazon they are calculating the cost of a Large EC2 instance, which is four cores. On a single-core basis, an EC2 Small instance is only $72 a month, running non-stop. Assuming the same 10% utilization used in other examples, the comparison should be $48/month for the data center and $7.20/month for EC2.
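
The arithmetic behind that correction is simple enough to spell out, assuming the 2009 EC2 Small rate of roughly $0.10 per instance-hour:

# Back-of-the-envelope version of the per-core comparison above.
# Assumes the 2009 EC2 Small instance rate of roughly $0.10/hour.
HOURS_PER_MONTH = 720                    # 30 days x 24 hours
ec2_small_rate = 0.10                    # USD per instance-hour, single core

ec2_full_time = ec2_small_rate * HOURS_PER_MONTH   # ~$72/month running non-stop
ec2_at_10_pct = ec2_full_time * 0.10               # ~$7.20/month at 10% utilization
data_center_core = 48.00                           # per-core data center cost cited above

print 'EC2 Small, non-stop:    $%.2f/month' % ec2_full_time
print 'EC2 Small, 10%% util:    $%.2f/month' % ec2_at_10_pct
print 'Data center, per core:  $%.2f/month' % data_center_core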

Thursday, April 16, 2009

McKinsey and the Cloud

McKinsey & Company just released an interesting document on Cloud Computing: Clearing the air on cloud computing. Very interesting thoughts. I agree with the idea that over-hyping Cloud Computing (or any other new technology) is risky and, when done on purpose, irresponsible. I also liked their Cloud definition; it seemed pragmatic, down to earth.

I was surprised by their conclusion that AWS would not be cost-effective for large corporations. I know AMZN has some large customers, and I'm sure they will have some follow-up commentary. In terms of the cost analysis, I think the author is missing two points. First, the effort to initiate or further deploy virtualization in the corporate data center has a non-zero cost, starting with training and support, and it obviously does not happen overnight either. Second, and more important in my opinion, is the opportunity cost. I believe that the financial rewards offered by the Cloud's speed to market far outweigh the potential incremental cost (assuming they are correct and it is more expensive for large corporations--I have my doubts).

For example, take a hypothetical multi-billion-dollar media company; that would be a large corporation in my mind. They need to analyze 4 to 6 TB of clickstream data every month to fine-tune their advertising efforts. The ability to execute on that strategy could easily bring additional revenues in the eight-digit range. They have two options: 1) go with their current data center, or 2) deploy MapReduce/Hadoop at AWS. Option 1 would easily take 6 to 8 months to complete. Option 2 could be up and running in days at most. To me that speed to market is priceless, in the short and long term.
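
To make the Hadoop option concrete, here is a minimal Hadoop Streaming sketch that tallies clicks per ad campaign from the clickstream logs. The log format (tab-separated lines with the campaign id in the third column) and the script name are hypothetical, purely for illustration.

# clickstream_mr.py -- minimal Hadoop Streaming sketch.
# Assumes tab-separated log lines with the campaign id in the third column;
# the format is hypothetical and only meant to illustrate the pattern.
import sys

def mapper():
    for line in sys.stdin:
        fields = line.rstrip('\n').split('\t')
        if len(fields) >= 3:
            print '%s\t1' % fields[2]                 # emit (campaign_id, 1)

def reducer():
    # Hadoop delivers the mapper output sorted by key, so we can stream totals.
    current, count = None, 0
    for line in sys.stdin:
        campaign, value = line.rstrip('\n').split('\t')
        if campaign != current:
            if current is not None:
                print '%s\t%d' % (current, count)     # total clicks for the campaign
            current, count = campaign, 0
        count += int(value)
    if current is not None:
        print '%s\t%d' % (current, count)

if __name__ == '__main__':
    if sys.argv[1] == 'map':
        mapper()
    else:
        reducer()

Pointed at log files in S3 and run as a streaming job with -mapper 'clickstream_mr.py map' and -reducer 'clickstream_mr.py reduce', this kind of script runs on Elastic MapReduce (or any Hadoop cluster) without touching the corporate data center.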

Tuesday, April 14, 2009

Blue Analytics

IBM is launching a new consulting organization to focus on Business Analytics. This is a very significant move that should bring a lot of positive developments to the industry. IBM has tremendous experience in business consulting and unmatched technology assets to deliver a complete and actionable solution.

I am convinced that Analytics will be a key differentiator in years to come. Companies will need to compete on Analytics to remain competitive. The technology is available, and the current economic conditions - along with the need for better risk management - will foster unprecedented innovation. Welcome to the Analytics generation.

Thursday, April 9, 2009

SDC is what really matters

Two days ago Google announced several enhancements to the AppEngine. The support for Java grabbed most of the headlines. It was the number one feature request from developers, and it certainly opens new possibilities for JRuby, Scala and others. Personally, I prefer dynamically typed languages like Python, but I digress.

During this announcement Google also introduced the Secure Data Connector (SDC) to access data behind the firewall. This, I think, is more significant and will have a bigger impact on the Cloud Computing landscape. Establishing a secure yet simple-to-set-up link between the Cloud and corporate data assets will prove to be a game changer. Microsoft knows this and has been developing its Cloud platform to interconnect public and private clouds as well. It seems that many companies are going to be publishing connectors in the near future, among them Oracle.

One step at a time, the Cloud continues to evolve and mature. Each evolution delivers new capabilities and removes obstacles. The future is exciting, without a doubt.

Friday, April 3, 2009

AMZN AWSome


Well, Amazon strikes again. MapReduce (Hadoop) on demand. Although AMZN already offered some pre-configured Hadoop AMIs, the simplicity of this new packaging makes getting started much easier. Furthermore, it is synergistic with EC2 and S3.

I have been using Amazon Web Services for close to a year now, and they continue to surpass my expectations. I wouldn't be surprised if AMZN spun off AWS and filed for an IPO sometime next year. It is not easy to isolate AWS's revenues from AMZN's financial statements, but with customers in 96 countries and a super-scalable business model, I have to believe this is a cash machine for them. These folks are brilliant.

Many people relate the Cloud to raw storage and CPUs, as in pure hosting. AMZN goes up one level and provides application services. SimpleDB and SQS are good examples; now Elastic MapReduce is another one. These are higher-level application services on demand, industrial strength and world class.
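
To appreciate how little plumbing is involved, here is a minimal sketch using the open-source boto library to drop a message on an SQS queue and read it back. The queue name and message body are just placeholders; the credentials come from your own AWS account.

# Minimal sketch: sending and receiving a message with Amazon SQS via boto.
# The queue name is a placeholder; credentials are read from the environment
# (AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY) or passed to connect_sqs().
import boto
from boto.sqs.message import Message

conn = boto.connect_sqs()
queue = conn.create_queue('clickstream-events')   # idempotent if it already exists

msg = Message()
msg.set_body('page_view\t/products/1234')
queue.write(msg)

received = queue.read()                           # None if the queue is empty
if received is not None:
    print received.get_body()
    queue.delete_message(received)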

A quote from Spiderman comes to mind: "... with great power comes great responsibility". What would you do with all this power?


Thursday, April 2, 2009

Cloud Manifesto



The Open Cloud Manifesto was published earlier this week. I have been following the development of this Manifesto, along with the activity in the Cloud Interoperability Forum, for several weeks now. Inevitably, I have mixed feelings about a lot of the concepts being discussed.

What does it really mean to have an Open Cloud and why does that matter?

Advocates of Cloud Interoperability would like to be able to switch from one Cloud provider to another quickly and easily if their business requires it. They would also like to see a common API for provisioning services and applications - for instance, something like "ODBC" for Cloud repositories. Standardization efforts and industry cooperation always remind me of the development of GSM for wireless communication in Europe. One school of thought believes that it is better to let individual companies create their own standards and allow open competition to select the best one. The other school of thought sees too much friction in that model and believes that cooperation by market leaders can ultimately produce a better solution faster.
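
To make the "ODBC for Cloud repositories" idea concrete, here is a purely hypothetical sketch of what a common storage interface with provider-specific adapters could look like. None of these class or method names correspond to any real standard or vendor API; they only illustrate the shape of the abstraction.

# Purely hypothetical "ODBC for the Cloud" style abstraction: a common
# interface with provider-specific adapters behind it. None of these names
# correspond to a real standard or vendor API.
class CloudStore(object):
    """Minimal common interface for cloud blob storage."""
    def put(self, key, data):
        raise NotImplementedError
    def get(self, key):
        raise NotImplementedError

class S3Store(CloudStore):
    """Adapter that would wrap Amazon S3 calls (omitted here)."""
    def put(self, key, data):
        pass  # would call the S3 REST API
    def get(self, key):
        pass  # would call the S3 REST API

class AzureStore(CloudStore):
    """Adapter that would wrap Azure blob storage calls (omitted here)."""
    def put(self, key, data):
        pass  # would call the Azure storage API
    def get(self, key):
        pass  # would call the Azure storage API

def archive_report(store, report_id, contents):
    """Application code depends only on the common interface."""
    store.put('reports/%s' % report_id, contents)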

In the history of technology innovation, standards have always followed the establishment of a dominant design. For the cloud, it is way too early for that. There are just too many viable offerings with clearly distinct functionality. First with ODBC, then with J2EE, I have heard many people claim that standards reduce their risk because they could "easily" change providers (i.e. database or application servers). But was this ever the case? In spite of supporting a common connectivity layer, each RDBMS offers so many unique features that porting applications can be completely impractical. The same is true of application servers, I believe. Vanilla functionality can be migrated from WebLogic to WebSphere relatively easily, but the best-performing, mission-critical, strategic applications are most often optimized for a specific platform.

The Cloud seems pretty open already. It is very accessible, easy to get started with, well documented, and already based on industry standards (HTTP, XML, REST, SOAP). I can see how a number of vendors could expand their products' markets if they did not have to rewrite them for each Cloud provider, but are we trying to boil the ocean?

It seems the Cloud is doing quite well, and although it needs to continue to mature, why fix it if it ain't broken?