Thursday, October 29, 2009

Decision Management and the Cloud

Last week Predictive Analytics World brought together a fantastic collection of minds to share case studies and expertise. A theme echoed in multiple sessions and conversations was that analytics are a necessary but not sufficient ingredient for success. To succeed, it is critical to have strong alignment and integration of technology, processes and corporate strategy. This is not an easy task, but the ROI tends to be irresistible, and mastering this balancing act is at the core of Decision Management.

What does Cloud Computing have to do with effective Decision Management? Well, one of the most obvious aspects is better analytics. Cloud computing offers more storage and more processing power at lower costs. More computing power and more data for less money seems quite attractive. Furthermore, these characteristics make predictive analytics accessible to many more organizations and applications. I like to think about it as the democratization of predictive analytics made possible by Cloud computing.

But that is just one part of the story. Predictive Analytics on the cloud will succeed because of the Cloud's standards and open platforms. Operationalizing analytics behind the firewall of a large corporation can require the custom integration of several layers of expensive software and in-house applications, e.g.: point-of-sale systems, call centers, databases, rule/workflow engines, analytic engines/models, etc. The cost and complexity of these projects can easily challenge the most optimistic ROI models.

Fortunately the Cloud is being designed precisely to simplify these scenarios. For example, let's consider Salesforce, one of the leading Cloud platforms available today (along with Netsuite and Intuit, among others). Salesforce simplifies integration because it has a rich API based on open standards. It also offers a powerful workflow engine to automate business processes, a flexible data repository, and native support for email, social and mobile channels. All of this functionality is available on demand, on a pay-as-you-go basis. Similar functionality is becoming more and more popular among cloud platforms, but getting it on-premises is far more complicated.

The Cloud can deliver more powerful analytics and also help make them actionable. Now we just need alignment with the Corporate Strategy but we'll leave that for a future post.

Wednesday, September 2, 2009

Doing more with less

Christopher Musico wrote a well-timed piece for DestinationCRM describing how some companies are leveraging predictive analytics to become more competitive. It comes as no surprise that during challenging economic times companies try to be more efficient by making better use of their assets. And what better asset than their internal databases?

I think it is safe to say that by now most companies (large and small) have deployed relational databases and reporting software of varying sophistication. These reporting tools have been asked to answer questions about historical events: questions about 'what happened', about 'the impact (how much, how many, how often)', and even more specific questions that require drilling down to granular levels of information to understand exactly where the event took place (place, product, point in time). Some more advanced companies have even deployed alerting systems to be notified as soon as certain conditions occur. The natural next step in developing competitive advantages is to deploy analytic solutions that can predict and optimize future events.

This conclusion might seem obvious to many. Christopher's article includes the results of a survey where more than 50% of the respondents intend to deploy predictive analytics in the next 6 months. Predictive Analytics can bring many benefits but effective deployment is no easy task. The software cost, skills shortage and infrastructure complexity can be significant barriers to entry. Not to mention the necessary changes in culture and business processes.

Traditionally, the successful deployment of predictive analytical solutions has been reserved for a handful of large companies with vast resources. It is nice to see organizations like USTA succeeding with these projects. I believe this is a sign of things to come. I think the intersection of Cloud Computing and predictive analytics will create new possibilities for powerful and accessible insight.

Monday, August 31, 2009

What if the Cloud never makes it to the enterprise?

David Linthicum wrote an interesting piece about the cloud reality setting in. The enterprise is cautious and a bit skeptical about the Cloud, as it should be. Frederic Paul writes about the Forrester report that asks "Why not run with the Cloud". There seems to be general agreement that when it comes to the Cloud the question is not IF but rather WHEN and HOW.

But what if the answer for the enterprise is never? What would happen if the enterprise never adopts the Cloud (at least the Cloud as I like to think about it)? What are the risks of inaction? How competitive would those enterprises be?

I believe history would repeat itself. No matter how large or mighty, those corporations would collapse, unable to compete against more efficient and more innovative competitors. Just like Chrysler, GM or the print media industry. I echo David's opinion: 'there are pros and cons, let's understand both' (paraphrasing), and I would like to add that companies need to go one step further and start focusing on the HOW. The ones that do will deliver more value to customers and shareholders for a longer period of time.

Friday, July 31, 2009

Gartner and the Cloud

Back in May there was some controversy over Gartner's estimates of the Cloud Computing market size. Earlier this month Gartner announced the results of a survey of SaaS users whose findings did not exactly provide a ringing endorsement of SaaS. Needless to say, these results also caused controversy and a very articulate response from Jeff Kaplan of THINKstrategies.

Gartner's press release about the SaaS survey mentions that TCO, ease of integration and technical requirements are challenges faced by SaaS solutions. No surprise. And that is why, when I look at the most successful SaaS company on the planet, I can understand that success is never a coincidence. Salesforce is a true role model for Cloud Computing and SaaS startups. Ten years ago they understood the importance of technology integrity, innovation and excellent customer service.

When it comes to the future, I prefer to focus on the ones that are shaping it and not just talking about it.

More Blue Analytics

IBM continues to position itself as the absolute leader in Business Analytics and Optimization. With the acquisition of SPSS, IBM has the deepest portfolio of products and services available under one roof. Oracle also has quality products, but it lacks IBM's depth in business consulting. On the other hand, SAS's software and consulting are top notch, but IBM's offering is superior because it can enhance its analytics platform with database, hardware and decision management software. It is almost too much for a single company to handle effectively. Some new buyers will surely be concerned with vendor lock-in, but I believe most IBM customers will be glad to have access to all of these capabilities from a trusted advisor.

While Cloud Computing has grabbed most of the headlines this year, the developments in Predictive Analytics should not be ignored. I am convinced that the intersection of Cloud Computing and Predictive Analytics will provide the perfect stage for the future of business innovations.

Thursday, June 25, 2009

BI and the Cloud

Wayne Eckerson from The Data Warehousing Institute has an interesting post about implementing BI in the Cloud. He mentions that BI in the Cloud faces four constraints:

1) Customization or application fit
2) Ongoing cost of transferring data to the Cloud
3) Data Security
4) Vendor viability

Wayne wraps up his post with the following conclusion:
BI for SaaS offers a lot of promise to reduce costs and speed deployment but only for companies whose requirements are suitable to cloud-based computing. Today, these are companies that have limited or no available IT resources, little capital to spend on building compute-based or software capabilities inhouse, and whose BI applications don’t require significant, continuous transfers of data from source systems to the cloud.
I tend to agree with the following high level thoughts:

1) BI on the Cloud is not for everybody (yet)
2) Due diligence is necessary to reduce risks on data security and vendor viability

But Wayne's post raised several questions in my mind:

Integration. There are several ways to customize an application. For example, a multi-tenant architecture like Salesforce's offers endless possibilities to customize and extend every single instance. Are these customizations unprofitable? No, they are not: they are part of the application and they do not require changes to the underlying code. Can the same level of customization apply to BI? Absolutely. Wayne briefly mentions Platform as a Service, but his focus is on custom application development (although his chart shows "DW as a Service"). An intriguing approach to offering a BI platform as a service would be to set up something like MicroStrategy and configure it in a multi-tenant fashion. The underlying IaaS layer would support the data repository, while the SaaS layer on top could support ad-hoc reporting, vertical applications or full customizations on top of the API. Would this be unprofitable? Not at all. Wayne makes another good point regarding integration:
So, unless the SaaS vendor supports a broad range of integrated functional applications, it’s hard to justify purchasing any SaaS application.
But in my experience, successful enterprise-wide deployments need to focus on integrating subject areas at the data level. This is an architecture and design challenge. A well-integrated data repository will support integrated functional applications seamlessly. It is about the underlying data, not only the application.
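To make the multi-tenant idea above concrete, here is a minimal sketch (purely illustrative, not any vendor's actual implementation) of how one shared repository can serve tenant-scoped, tenant-customized views without any changes to the underlying code:

```python
# Minimal multi-tenancy sketch: one shared physical store, per-tenant
# scoping and per-tenant custom fields, no per-tenant code changes.

class MultiTenantStore:
    def __init__(self):
        self.rows = []           # shared physical storage
        self.custom_fields = {}  # tenant_id -> extra fields that tenant defined

    def define_field(self, tenant_id, field_name):
        # Tenants extend the schema declaratively, not by changing code.
        self.custom_fields.setdefault(tenant_id, set()).add(field_name)

    def insert(self, tenant_id, row):
        self.rows.append(dict(row, tenant_id=tenant_id))

    def query(self, tenant_id, predicate=lambda r: True):
        # Every query is transparently scoped to the caller's tenant.
        return [r for r in self.rows
                if r["tenant_id"] == tenant_id and predicate(r)]

store = MultiTenantStore()
store.define_field("acme", "region")  # a customization owned by one tenant
store.insert("acme", {"account": "A1", "region": "West"})
store.insert("other", {"account": "B1"})

print(store.query("acme"))  # only acme's rows, including its custom field
```

The same pattern applies whether the rows are CRM records or the fact tables of a hosted BI platform; the customization lives in data and metadata, which is why it does not erode the provider's margins.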

Ongoing Data Transfer Costs. Is this really a significant constraint? How much data does the typical Data Warehouse have to incorporate on a daily basis? The cost to transfer data to Google's App Engine is $0.10 per GB. Moving a TB a day would cost around $3,000 per month (I'm not suggesting using BigTable as a DWH repository yet). As Data Warehouse costs go, this does not seem unreasonable. Amazon is running a promotion right now that would bring that cost down to $1,000; hardly a deal breaker. Latency and complexity can complicate these data transfers, which is to be expected because 99% of source systems were not designed with the Cloud in mind. Which brings me to my final point.
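The back-of-the-envelope arithmetic above is easy to verify (using the quoted $0.10/GB rate and decimal TB/GB):

```python
# Rough monthly cost to move 1 TB/day into the cloud at $0.10 per GB.
price_per_gb = 0.10   # USD per GB inbound, the 2009 App Engine rate quoted above
gb_per_day = 1000     # 1 TB/day in decimal units
days_per_month = 30

monthly_cost = gb_per_day * days_per_month * price_per_gb
print(f"${monthly_cost:,.0f} per month")  # roughly the $3,000/month cited above
```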

I mentioned using MicroStrategy as a BI platform on the cloud as an example to make a point. I believe that successful Cloud applications need to do more than just clone their on-premise counterparts. They need to leverage the Cloud's inherent qualities, for example elastic computing power. The nature of the Cloud can enable ongoing ETL: receive a copy of each transaction on the fly via a web hook, then cleanse, transform and aggregate in real time, or at least a few times a day. How about MapReduce? I think this technique will make it possible to create more powerful analysis over more data, faster and easier.
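To illustrate the shape of a MapReduce computation, here is a toy sketch in plain Python that aggregates page hits from clickstream records. A real deployment would use Hadoop across many machines, but the map/shuffle/reduce structure is the same:

```python
from itertools import groupby
from operator import itemgetter

clicks = ["/home", "/pricing", "/home", "/signup", "/home"]

# Map: emit (key, 1) pairs for each record.
mapped = [(page, 1) for page in clicks]

# Shuffle: bring identical keys together (sort, then group).
mapped.sort(key=itemgetter(0))
grouped = groupby(mapped, key=itemgetter(0))

# Reduce: sum the counts for each key.
hits = {page: sum(count for _, count in pairs) for page, pairs in grouped}
print(hits)  # -> {'/home': 3, '/pricing': 1, '/signup': 1}
```

Because the map step is independent per record and the reduce step is independent per key, both phases parallelize trivially across elastic Cloud capacity, which is exactly the quality on-premise clones fail to exploit.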

Rigid applications built with yesterday's patterns will struggle to survive, in the Cloud or elsewhere. The Cloud is an open environment by definition; its openness will facilitate the integration of multiple data sources from inside and outside the corporate firewall. This integration will support a next generation of cross-functional applications. Bandwidth and storage costs continue to drop very rapidly and will cease to be a major consideration in the near future. New design principles (e.g. scale out vs. scale up) will enable more sophisticated analysis over ever larger datasets (Google analyzes over a petabyte of data every day). With over $1B in sales, Salesforce is the most successful SaaS provider: they host more than 55k customers and well over 1M users, and every day they execute more than 30M lines of customer code. If they can do it, I'm convinced the next BI leader in the Cloud will do it as well. That is how I see it.

Tuesday, June 23, 2009

Federated Applications

Intuit recently launched an ambitious new program to deliver Federated Applications on their platform. These applications will be available to Intuit customers via a "marketplace" (my term, not Intuit's). Intuit has a number of SDKs for their Cloud platform (Flex seems to be a strategic technology for them), but Federated Applications will not be constrained to a specific technology because they will be able to use a REST API. From a distance this program looks similar to Salesforce's successful AppExchange (rapidly approaching 1,000 published applications), but after a closer look it is important to highlight two intriguing differences:

1) Intuit takes care of the billing. It collects the monthly fees, pays the application developer and keeps a percentage. To many developers this is a nice service: they just know that every month money will be deposited in their bank accounts, and they don't have to worry about paying Intuit. Intuit gets paid right away.

2) Federated applications have the ability to read data from Intuit applications but ALSO from any other Federated application in use by the current user. To make this happen, Intuit requires that all applications meet certain integration requirements. While this is an extra step for most application developers, the amount of work does not appear to be excessive (i.e. weeks, not months). These requirements help deliver a more consistent user experience and make it easier for customers to try new applications. Another interesting consequence is the potential network effect among applications. Federated applications are expected to publish their data objects to an open and shared environment. This means that application A can read/write data not only from Intuit applications but also from federated application B. This data exchange can increase the value of individual applications significantly (the more connected, the more valuable). Of course, each application would need to know what type of objects to expect and what to do with them, but given a common framework it should not be too difficult.
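A tiny sketch of the data-exchange idea: if every federated application publishes its objects in a common envelope, application A can consume application B's data without knowing anything else about it. The field names and object types below are my invention for illustration, not Intuit's actual schema:

```python
# Hypothetical shared environment: each published object carries a type,
# its source application, and a payload. Schema is illustrative only.
published = [
    {"type": "Invoice", "app": "AppB",   "data": {"customer": "C1", "amount": 120.0}},
    {"type": "Expense", "app": "AppB",   "data": {"vendor": "V9", "amount": 45.0}},
    {"type": "Invoice", "app": "Intuit", "data": {"customer": "C2", "amount": 80.0}},
]

def read_objects(obj_type):
    """What application A would do: consume all objects of a known type,
    regardless of which federated application published them."""
    return [o["data"] for o in published if o["type"] == obj_type]

# Application A can now report across Intuit data AND application B's data.
invoices = read_objects("Invoice")
total_billed = sum(i["amount"] for i in invoices)
print(total_billed)  # combines invoices from two different source applications
```

The common framework is what makes the network effect possible: each new application that publishes "Invoice" objects makes every invoice-consuming application more valuable.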

Compared to other Cloud providers (e.g. Salesforce), Intuit would seem to be off to a slow start, but I find their vision intriguing and ambitious. More importantly, if Intuit succeeds in recruiting useful applications, this strategy could become very profitable for all involved.

Thursday, June 4, 2009

The Cloud is Crossing the Chasm

Cloud Computing is crossing the chasm. Every other week you hear about the deployment of Cloud Computing applications to thousands of users by leading corporations and governments. These deployments go hand in hand with the announcement of new Cloud offerings by leading technology vendors and service providers. You can still find disagreements about the size of the market or the actual definition but there is no doubt that Cloud Computing is having a great year and the best is yet to come. These are some of the many news/announcements that caught my eye:

1) This is a couple of months old, but I have to mention Oracle's acquisition of Sun. It was interesting to see Oracle make the move after all the talk about IBM buying Sun, but the implications for the industry are far more intriguing. How is this acquisition going to impact HP? Will IBM see any impact from Oracle 'owning' Java? Will ORCL be able to manage hardware and software effectively? Oracle has been a bit slow to announce a compelling Cloud strategy. Sun made a late announcement about their software strategy and more recently another one about consulting services for the Cloud. This is an intriguing combination; it might seem like the odds are against them, but you should never underestimate Oracle.

2) TIBCO launches TIBCO Silver, an application development platform for AMZN's AWS. I'm intrigued by this announcement. Although some have labeled it "Amazon for Dummies", I'm curious to see if TIBCO will be able to leverage some of its sophisticated analytical technology for intelligent scaling and SLAs.

3) In close partnership with Red Hat (RHT), Verizon (VZ) launched a new service offering called Computing as a Service (CaaS): "With CaaS, you have access to bandwidth, servers, storage, and firewalls with dynamic real-time control over what, when, and how those resources are deployed". This service will put pressure on traditional/smaller hosting providers looking to transition to the Cloud. It will also help enterprise customers feel more comfortable about embracing the Cloud.

4) The brilliant Ray Ozzie elaborates on Microsoft's view of the Cloud. You know it is serious when Microsoft gets serious about it. It is hard not to imagine a future with public and private clouds, unless, of course, networking technology evolves to a point where the difference is irrelevant. Given MSFT's impressive footprint in the enterprise (Exchange, SharePoint, SQL Server, etc.), they seem to be in a perfect position to dominate that hybrid world with Azure. If you can offer greater scalability, lower costs and greater flexibility leveraging the same skills and technology you already have, then you have a winning recipe. The question is: will they be able to execute?

5) CSC to offer cloud services. Now, if it wasn't enough that Microsoft is dead serious about Cloud Computing, CSC, one of the largest and most successful government contractors, will leverage its security and strategy expertise to offer cloud services. I'm expecting Lockheed, EDS and others to follow with similar offerings. These services will certainly lower the risk for CIOs in the public sector.

6) More fuzzy math, this time courtesy of Gartner in one of their estimates of the size of the Cloud Computing industry. Inconsistent estimates are the consequence of inconsistent market definitions; I can see that. I just hope that there are no hidden agendas.

7) Google Apps is profitable and growing. Although not a direct replacement for MS Office, Google Apps continues to capture market share one enterprise customer at a time. While adoption is mostly driven by cost savings in email, you can expect the adoption of Google Docs to increase as new and improved versions continue to roll out over the next year (including, of course, my favorites: App Engine and SDC).

With so much activity it is hard to believe that the best is yet to come but trust me, it is. Hang on tight.

Thursday, May 28, 2009


Today Google dominated the spotlight with the announcement of Google Wave. The idea and implementation appear superb, and the approach of opening much of its code is very intriguing. However, what I found just as important was HTML5 and how much Google is embracing it.

Google has an impressive portfolio of APIs and products for developers. They have created and embraced a large number of open standards and are working hard to bring everything together as a comprehensive platform: a platform that depends heavily on "Modern Open Source Browsers" supporting HTML5 and Javascript.

Microsoft has a dominant footprint in the enterprise and Silverlight continues to gain popularity. Adobe's Flash is as ubiquitous as HTML, and Flex is also gaining momentum. But when I look at the power and simplicity of Google's App Engine, along with the dozens of open APIs (from visualization to online analytics to document scripting), it is hard to believe that the status quo will resist this powerful 'wave' of innovation.

Thursday, May 21, 2009

Are we answering the wrong questions?

Lyndsay Wise wrote a good article about different types of Business Intelligence (BI) and how organizations adopt them based on their level of BI maturity. It reminded me of a recent user group meeting where several people from a Fortune 500 company discussed their difficulty managing the ever-growing list of sales reports requested by their users. The business users were not finding the answers they were looking for, and they hoped that having more reports would help answer their questions. Quantity over quality.

I think this is a reflection of how the evolution of BI has been driven by IT and not by Business. This same reason is a common obstacle for BI projects and a contributor to elusive ROI. Looking around, it is easy to find many vendors offering "Sales Dashboards". You can get them in a browser with AJAX or Flash, you can get them on your iPhone, and you can even integrate them into your favorite SFA platform or portal.

This is nice, but when you look at the actual reports, they are still pretty basic. These dashboards show charts such as: Revenue and Win Rate Trends, Revenue by Industry/Region/Quarter, Variances over Plan, Count of Deals by Age, etc. These address important questions, but companies have been looking at similar reports for the past 20 years. The technical delivery has improved (faster, better, easier) but the actual business content is still lagging.

What would I like to see instead? Well, if I were a sales executive I would be looking for information that can drive action. Something to tell me "what to do" and "what to stop doing" (beyond 'pick up the phone and call a Region Manager to ask why he/she is behind plan'). Knowing that my win rate for last quarter was 25% is fine, but I want to know why. What are the key contributing factors? How do I increase it to 30%?

Technology has had its 15 minutes (years?) in the spotlight. I think it is time to turn our focus on Business. This new focus will drive innovation and will ultimately make companies more effective and more competitive. Of course, I think the solution is Predictive Analytics and I will explain why in a future post.

Thursday, May 7, 2009

Customer Experience beyond Customer Feedback

Customer Experience analysis and management has gained a lot of popularity as a business intelligence application. This popularity is due in part to advances in text processing technology as well as the exponential growth of unstructured data (e.g. blogs, email, IM, Twitter, etc.).

A common analysis parses customer feedback to identify problems or causes of dissatisfaction, for example measuring the sentiment (positive or negative) of a hotel guest after a stay. This is an important metric for hotel management, along with identifying the root cause of that sentiment. However, I hope that we do justice to these applications and consider all of their capabilities and potential. Otherwise our vision could be too narrow, and a narrow vision is risky for adopters, providers and the industry in general.

The vendors of Customer Experience software and methodologies offer depth that goes beyond the simplistic example of customer sentiment. In my opinion Customer Experience needs to be analyzed in the context of Customer Life-cycle and Customer Value. 

Once we have identified and ranked the key factors that drive customer sentiment, we need to look at those rankings across a number of dimensions, including time and geography but most importantly customer segment. After tracking sentiment (both positive and negative) across customer segments, we need to overlay financial metrics at the customer level and at the company level. How is this sentiment affecting profitability, and how big is the impact? For instance: "... because the A/C was too loud in these locations, our business traveler segment reduced their number of stays by X, which caused a drop in margins of Y ..." This type of analysis would offer a clear and meaningful ROI case to justify and champion initiatives to manage and improve Customer Experience. The next step, of course, is enhancing these analyses with predictive analytics to create stronger leading indicators and react before problems appear.
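The segment-level overlay I describe can be sketched in a few lines: group feedback by customer segment, then join the sentiment rates to a financial metric to see where negative sentiment is most expensive. The data and numbers below are invented for illustration:

```python
# Invented sample data: per-stay sentiment score plus a revenue figure.
feedback = [
    {"segment": "business", "sentiment": -1, "revenue": 5000},
    {"segment": "business", "sentiment": -1, "revenue": 7000},
    {"segment": "leisure",  "sentiment":  1, "revenue": 1200},
    {"segment": "leisure",  "sentiment": -1, "revenue": 900},
]

# Aggregate sentiment and revenue by customer segment.
summary = {}
for f in feedback:
    s = summary.setdefault(f["segment"], {"n": 0, "neg": 0, "revenue": 0})
    s["n"] += 1
    s["neg"] += f["sentiment"] < 0
    s["revenue"] += f["revenue"]

for segment, s in summary.items():
    # The revenue-weighted view: where is negative sentiment most expensive?
    print(segment, f"{s['neg'] / s['n']:.0%} negative,",
          f"${s['revenue']:,} at stake")
```

Even this trivial version shows why the segment dimension matters: a 50% negative rate in a low-revenue segment may matter far less than the same rate among business travelers.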

Comment on this blog or email me if you have any thoughts on this topic. If you are a vendor and have a case study that touches on these topics, let me know as well; I'd love to write about it in this space.

Thursday, April 23, 2009

Relentless Innovation

The mantra at Google is 'release early, iterate often'. This constant evolution inevitably leads to constant innovation. It would be easy to argue that their ability to innovate is a natural result of their size and financial resources, but that conclusion is short-sighted. Joining companies like 3M and Apple as icons of creativity, Google continues to shape business and society all over the world.

Some innovations might seem small at times, but that doesn't mean they are not relevant. For instance, after adding Java support to App Engine, yesterday they released a minor upgrade to their Python SDK with new libraries for cryptography. Now, that might not sound like a big deal; in fact, some will claim it should have been there a year ago. Either way, this small step continues to enable developers to create more powerful and more secure applications. One step at a time.

Meanwhile on 4/21 they launched the Google Analytics Data Export API. Granted, other web analytics vendors (e.g. Omniture) have had APIs for a long time but they are not that easy to use nor are they free. 

OK, so you might still think 'two little APIs, what is the big deal?' Well, the big deal is that by adding 2 or 3 APIs every other day you can accumulate significant functionality in a couple of months. However, the real value is the way Google continues to enhance every single one of their products every single day. Release early, test, measure, improve, release again. This is non-stop. For a slightly more impressive announcement, just take a look at the following post regarding the web standard for 3D graphics. The video is stunning, but remember it is running in a web browser.

Thank you Google for pushing the envelope. I can only hope this approach to innovation permeates through private and public organizations alike.

Tuesday, April 21, 2009

Booz Allen Comments on McKinsey's Cloud Report

Booz Allen makes several interesting remarks about McKinsey's Cloud Report. The following two jumped out at me. First:
They state that cloud offerings “are most attractive” to small and medium sized business, and “there are significant hurdles to the adoption of cloud services by large enterprises.” That would come as quite a shock to Target, Eli Lilly, the New York Stock Exchange, the American Stock Exchange, NASDAQ, Toyota, E*Trade, Computer Associates, and a host of other large enterprises that have been in the cloud for a couple of years.
And second:
Where this example appears to break down is that, for the data center, they are calculating the cost per core, while for Amazon they are calculating the cost of a Large EC2 instance, which is four cores. On a single-core basis, an EC2 Small instance is only $72/month, running non-stop. Assuming the same 10% utilization used in other examples, the comparison should be $48/month for the data center and $7.20/month for EC2.

Thursday, April 16, 2009

McKinsey and the Cloud

McKinsey & Company just released an interesting document on Cloud Computing: Clearing the air on cloud computing. Very interesting thoughts. I agree with the idea that over-hyping Cloud Computing (or any other new technology) is risky and, when done on purpose, irresponsible. I also liked their Cloud definition; it seemed pragmatic and down to earth.

I was surprised by their conclusion that AWS would not be cost effective for large corporations. I know AMZN has some large customers, and I'm sure they will have some follow-up commentary. In terms of the cost analysis, I think the authors are missing two points. First, the effort to initiate or further deploy virtualization in the corporate data center has a non-zero cost, starting with training and support, and it obviously does not happen overnight either. Second, and more important in my opinion, is the opportunity cost. I believe that the financial rewards offered by the Cloud's speed to market far outweigh the potential incremental cost (assuming they are correct and it is more expensive for large corporations; I have my doubts).

For example, let's take a hypothetical multi-billion dollar media company; that would be a large corporation in my mind. They need to analyze 4 to 6 TB of clickstream data every month to fine-tune their advertising efforts. The ability to execute on their strategy could easily bring additional revenues in the 8-digit range. They have two options: 1) go with their current data center, or 2) deploy MapReduce/Hadoop on AWS. Option 1 would easily take 6 to 8 months to complete. Option 2 could be up and running in days at most. To me that speed to market is priceless, in the short and long term.

Tuesday, April 14, 2009

Blue Analytics

IBM is launching a new consulting organization focused on Business Analytics. This is a very significant move that should bring a lot of positive developments to the industry. IBM has tremendous experience in business consulting and unmatched technology assets to deliver a complete and actionable solution.

I am convinced that Analytics will be a key differentiator in the years to come. Companies will need to compete with Analytics to remain competitive. The technology is available, and the current economic conditions, along with the need for better risk management, will foster unprecedented innovation. Welcome to the Analytics generation.

Thursday, April 9, 2009

SDC is what really matters

Two days ago Google announced several enhancements to App Engine. The support for Java grabbed most of the headlines. It was the number one feature request from developers, and it certainly opens new possibilities for JRuby, Scala and others. Personally, I prefer dynamically typed languages like Python, but I digress.

During this announcement Google also introduced the Secure Data Connector (SDC) to access data behind the firewall. This, I think, is more significant and will have a bigger impact on the Cloud Computing landscape. Establishing a secure yet simple-to-set-up link between the Cloud and corporate data assets will prove to be a game changer. Microsoft knows this, and it has been developing its Cloud platform to interconnect public and private clouds as well. It seems that many companies are going to be publishing connectors in the near future, among them Oracle.

One step at a time the Cloud continues to evolve and mature. Each evolution delivers new capabilities and removes obstacles. The future is exciting; without a doubt.

Friday, April 3, 2009


Well, Amazon strikes again: MapReduce (Hadoop) on demand. Although AMZN already offered some pre-configured Hadoop AMIs, the simplicity of this new packaging makes it much easier to use. Furthermore, it is synergistic with EC2 and S3.

I have been using Amazon Web Services for close to a year now, and they continue to surpass my expectations. I wouldn't be surprised if AMZN spun off AWS and filed for an IPO sometime next year. It is not easy to isolate AWS's revenues in AMZN's financial statements, but with customers in 96 countries and a super-scalable business model, I have to believe this is a cash machine for them. These folks are brilliant.

Many people relate the Cloud to pure storage and CPUs, as in pure hosting. AMZN goes up one level and provides application services. SimpleDB and SQS are good examples, and now Elastic MapReduce is another. These are higher-level application services on demand: industrial strength and world class.

A quote from Spiderman comes to mind: "... with great power comes great responsibility". What would you do with all this power?

Thursday, April 2, 2009

Cloud Manifesto

The Open Cloud Manifesto was published earlier this week. I have been following the development of this Manifesto, along with the activity in the Cloud Interoperability Forum, for several weeks now. Inevitably, I have mixed feelings about a lot of the concepts being discussed.

What does it really mean to have an Open Cloud and why does that matter?

Advocates of Cloud interoperability would like to be able to switch from one Cloud provider to another quickly and easily if their business requires it. They would also like to see a common API for the provisioning of services and applications; something like an "ODBC" for Cloud repositories, for instance. Efforts at standardization and industry cooperation always remind me of the development of GSM for wireless communication in Europe. One school of thought believes that it is better to let individual companies create their own standards and allow open competition to select the best one. The other school of thought sees too much friction in that model and believes that cooperation by market leaders can ultimately produce a better solution faster.

In the history of technology innovation, standards have always followed the establishment of a dominant design. For the cloud, it is way too early. There are just too many viable offerings with clearly distinct functionality. First with ODBC and then with J2EE, I have heard many people claim that standards reduce their risk because they could "easily" change providers (i.e. databases or application servers). But was this ever the case? In spite of supporting a common connectivity layer, each RDBMS offers so many unique features that porting applications can be completely impractical. The same goes for application servers, I believe. Vanilla functionality can be migrated from WebLogic to WebSphere relatively easily. But the best performing, mission critical, strategic applications are most often optimized for a specific platform.

The Cloud seems pretty open already. It is very accessible, easy to get started with, well documented, and already based on industry standards (HTTP, XML, REST, SOAP). I can see how a number of vendors could expand their products' markets if they did not have to rewrite them for each Cloud provider, but are we trying to boil the ocean?

It seems the Cloud is doing quite well and although it needs to continue to mature, why fix it if it ain't broken?

Tuesday, March 31, 2009

Seybold and the Cloud

Andrew Seybold wrote an interesting piece on Cloud Computing: Cloud Computing -- a new version of an old idea. He identifies a few similarities between the centralized nature of cloud computing and the mainframes from many decades ago. Andrew's comments are centered mostly around centralizing data and making it available to a variety of clients. One of Andrew's biggest concerns is that he might not have network access ALL the time.

Network access is a valid concern, and while wifi and broadband coverage continues to grow every day, it is still not EVERYWHERE. Google has made offline versions of several of its products, most recently Gmail. However, I think Andrew's mainframe comparison is not quite right. The Cloud is much more than web-based email and web-based document storage. Openness and accessibility are the Cloud's key disruptive drivers.

Openness: the programmable Web lists over 1200 APIs available to any software developer. Most of them are free, or free to get started with. These are 1200 intelligent services that can be mashed up together to create new applications. They are open to anyone in the world without the need for any high-end equipment. Technologies like HTTP, XML, SOAP, REST, and JavaScript are making this possible.

Accessibility: The programmable Web is open to anyone with the right skills but you don't need to be a rocket scientist to benefit from it. Kids in high school are building mashups to share pictures and communicate with each other but so is the Federal government. This is unprecedented. Sure there is a digital divide but the gap can't be compared to the mainframe days. Furthermore, anyone from their home computer can have access to Google's massive computing infrastructure for free. And once they go beyond their free quota, they can still serve an application with more than 5M page views a day for less than $50 a month. These conditions will power unprecedented innovation over the next 5 years.
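The building blocks behind these mashups are simple: fetch one API's response and join it with another's lookup. A minimal sketch, with canned data standing in for the live HTTP calls a real mashup would make (the feed format, field names and map URL are all invented for illustration):

```python
import json

def top_story_with_map_link(stories_json, geocode):
    # Join a news feed (API #1) with a geocoding lookup (API #2) --
    # the essence of a mashup.
    stories = json.loads(stories_json)
    top = stories["results"][0]          # take the first story in the feed
    lat, lng = geocode(top["city"])      # second API: city -> coordinates
    return {"title": top["title"],
            "map": f"https://maps.example.com/?q={lat},{lng}"}

# Canned inputs in place of real HTTP requests.
feed = json.dumps({"results": [{"title": "Cloud cuts costs", "city": "Austin"}]})
fake_geocoder = lambda city: (30.27, -97.74)   # stand-in for a geocoding API
print(top_story_with_map_link(feed, fake_geocoder))
```

Swap the canned data for two real REST endpoints and this handful of lines becomes a working application; that is the low barrier to entry the paragraph above describes.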

Andrew says that the Cloud is not being pushed by IT professionals. I don't completely agree. I see a lot of CIOs looking at Cloud-based solutions where possible to reduce operating costs. But in my experience the real people pushing the Cloud are actual business users who are exposed to the innovation delivered by consumer services like Facebook and wonder why they can't have the same type of tools at work.

The Cloud is successful because it represents the ultimate democratization of technology: driven by end users and open to all.

Sunday, March 29, 2009

SAS, analytics and the cloud

SAS got a lot of press coverage when it announced it would invest $70M in a new data center. There is no doubt SAS is a world-class company and a clear market leader. In a 2008 survey by Rexer Analytics, 45% of respondents reported that they use SAS. Almost 1 in 2 data miners? Not bad.

I did not know SAS had a hosting service. As they said, their hosting business has grown with almost no advertising, but late last year I heard they were pitching to one of the top media companies. I was surprised at first, but it really makes a lot of sense, and their recent announcement confirms the solid traction that business is getting.

After thinking about it a little more I'm very curious to see how their SaaS offering will play out. I believe the Analytics market is ripe for a big disruption. The market is dominated by a handful of companies with relatively closed technology (at least in one direction) and significant profit margins. Open Source projects like R have shaken the game a bit but nothing earth shattering yet. I believe the Cloud -with its limitless storage and CPU power- will bring a more disruptive wave. Will SAS take the lead? Can they embrace the power of Hadoop and in-memory databases to take their business to the next level? Or will they play conservatively, milking their current cash cow and using their market dominance to crush their smaller competitors?

This will be a fascinating race.

 thoughts, 2nd part

This is a short follow-up to a previous post about the (SFDC) development platform. The more I work with this platform the more impressed I become: SFDC is fantastic. I have talked about their documentation and support in the past. Well, their technology is quite impressive as well.

The platform has been built on a very consistent and predictable multi-tenant architecture; a highly efficient one, by the way. Reports indicate that all of SFDC runs on 1,000 servers, and since those are mirrored, that is a total of only 500 unique machines! This is quite remarkable.

Their development environment has all the features an enterprise developer could ask of a cloud provider. SFDC uses Visualforce pages to render GUIs. These pages use a proprietary markup language that is quite similar to those of other frameworks like Django. By using this framework they enable (and enforce) a mainstream and efficient Model-View-Controller design pattern.

In addition to the Visualforce pages, SFDC recently added APEX, a Java-based scripting language for data manipulation operations such as triggers, akin to traditional stored procedures in relational databases. Java developers should feel right at home with APEX: it is strongly typed and the syntax is nearly identical to Java's. APEX supports inheritance, unit testing and access to web services. Some other operations are restricted for security reasons, very similar to Google's approach with App Engine and Python's libraries.

Finally, the integration and customization features of SFDC are very useful and easy to use. Objects can be extended with new attributes, users can have different security profiles that apply specific privileges to different applications, custom applications can be packaged, managed and published with just a few clicks, and their security requirements should put any corporation at ease. A winner all in all. What I find most impressive is how clearly all of these features are customer driven. is definitely a role model for any other aspiring Cloud company.


Cloudera has created quite a buzz with its recent launch: great talent and great backers without a doubt. Business plan? I'm just not as excited as everybody else. I like Hadoop, and what I have read about Cloudera makes it look like a solid business. Game changer? I find it hard to believe. This launch and press coverage reminds me of other startups from many years ago (I'm sure there must be dozens of examples): great talent, tons of press, overblown expectations. I hope Cloudera has a better future.

Tuesday, March 17, 2009

Customer Experience and Analytics

Last year I spent some time learning about Customer Experience and Customer Interaction Management. I enjoy both topics quite a bit. More recently, due to my work with predictive analytics, I have come across a number of Text Analytics companies that are focusing on improving the Customer Experience. From my experience in the Automotive and Pharmaceutical industries I know there are vast repositories of unstructured data buried in most corporations, and more is generated every day.

I have found, however, a bit of a disconnect between decision management, text/predictive analytics and customer experience management. I think the effective management of the customer experience starts with a clear strategy that defines what the desired experience should be across customer segments. This strategy needs to be aligned with the corporate goals and have corresponding financial indicators. Once a strategy is in place, analytics can be used to detect specific behaviors or recommend the next best action. These recommendations can be delivered as a predictive score or as an alert to a customer service agent, sales representative or other customer touchpoint. But then there is the need for a decision management system that takes the analytics' results and acts on them. This decision management system needs to have rules managed by the business users responsible for the Customer Experience strategy.
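The pipeline described above, a predictive score feeding a business-rules layer that picks the next best action, can be sketched in a few lines. The scoring logic, thresholds and action names are all invented for illustration; a real system would call a trained model and a managed rules engine:

```python
def churn_score(customer):
    # Stand-in for a real predictive model's output.
    score = 0.0
    if customer["support_calls"] > 3:
        score += 0.4
    if customer["months_since_purchase"] > 6:
        score += 0.4
    return min(score, 1.0)

# Rules the business user would own: first matching rule wins.
RULES = [
    (lambda s, c: s >= 0.7 and c["segment"] == "premium", "route_to_retention_team"),
    (lambda s, c: s >= 0.7, "offer_discount"),
    (lambda s, c: s >= 0.4, "send_checkin_email"),
]

def next_best_action(customer):
    score = churn_score(customer)
    for condition, action in RULES:
        if condition(score, customer):
            return action
    return "no_action"

print(next_best_action({"support_calls": 5,
                        "months_since_purchase": 8,
                        "segment": "premium"}))
```

The point of the separation is that analysts own `churn_score` while the Customer Experience team edits `RULES`, without touching each other's layer.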

Today I see a chasm where vendors focus just on the analytics or just on the strategy or just on the rules management. It seems to me that there is a big opportunity to bring all those pieces together in a proactive, money making, recession proof solution.

Monday, March 16, 2009

Success story

Here is a short success story about how Appirio runs its entire business on the cloud.

Sunday, March 15, 2009

Weekly roundup

The following announcements from last week caught my attention:
  • The New York Times keeps publishing new APIs, now for state legislature. Just like the Guardian in the UK it seems they are truly looking to reinvent themselves.
  • Google enables behavioral ad targeting. This is a brilliant maneuver. It is simple and powerful in classic Google style.


Earlier this week I wrote about the idea of using text analytics to parse and analyze citizens' sentiment along with government communications. Seth Grimes suggested that OpenCalais could be used to build this mashup. I had heard of OpenCalais not long ago but I have not had a chance to play with it yet. Coincidentally, I came across MediaCloud, and it seems they are doing something similar but with a focus on media coverage. I am not sure I will have the time to research OpenCalais but it is certainly encouraging to see some of these ideas in action. The next natural step would be to combine OpenCalais with the Twitter API to enable the structured analysis of people's activity streams.

Wednesday, March 11, 2009

Text Analytics

Sid Banerjee from Clarabridge wrote an interesting post on text analytics. I don't know if Clarabridge has a RESTful API. If it does, I think it would be interesting to use it to create a mashup with the White House blog. The Obama administration has asked the public to submit comments and ideas on a wide variety of topics ranging from health care to the economy. I believe in some cases they have received tens of thousands of emails. I wonder how they are processing them.

With a Clarabridge mashup we could analyze each blog posting from the Obama administration, but more importantly we could use it to analyze the citizens' feedback. We could then use this analysis to track the government's progress and responsiveness.

In addition to the WH blog, I would like to use the API to process the administration's RSS feeds. Talk about efficiency and transparency; I can't think of a better way of doing it.
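As a toy illustration of the kind of analysis such a mashup would perform, here is a naive keyword tally over a batch of comments. This is nowhere near what a real text-analytics engine like Clarabridge does, and the word lists are invented, but it shows the shape of the output:

```python
# Invented seed word lists; a real engine uses trained models, not lookups.
POSITIVE = {"support", "great", "hope", "agree"}
NEGATIVE = {"oppose", "angry", "waste", "disagree"}

def sentiment(comment):
    # Score one comment by counting positive vs negative keywords.
    words = set(comment.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

def tally(comments):
    # Aggregate a feed of citizen comments into a sentiment summary.
    counts = {"positive": 0, "negative": 0, "neutral": 0}
    for comment in comments:
        counts[sentiment(comment)] += 1
    return counts

print(tally(["I support this plan", "What a waste", "Tell me more"]))
```

Run against tens of thousands of emails, even a summary this crude would give the administration a daily pulse of the feedback.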

Sunday, March 8, 2009

News Roundup

The past couple of weeks have been full of interesting news and announcements. Here are the ones that caught my attention, in no particular order:
  • Vivek Kundra to the WSJ: "If I went to the coffee shop, I would have more computing power than the police department". Long live the cloud.
  • From Newsweek Magazine the People's Data, expanding on the idea of publishing government information through web APIs.
  • Martin Fowler had a couple of great blog postings. One on Contradictory Observations, a simple and powerful thought about our common approach to information management. Another titled Technical Debt, right on target with the current economic times.

Wednesday, March 4, 2009

Gmail and GFail

A few days ago Gmail was unavailable for a few hours and many people were outraged. I understand the pain of not being able to access email but was all the criticism really justified? Are people holding Google to unrealistic expectations? Are the standards too high? What is the actual impact of not having email for 2 or 4 hours?

I believe in the age of the cloud there will be many very public service interruptions. I would not consider Email a trivial service, particularly not for Gmail's size. However, there are already many more complex applications starting to become available. What can we realistically expect in terms of Service Level Agreements?

I believe these new applications will have to offer many alternatives for handling exceptions when something goes wrong, particularly when there is little control over the component that fails. Nevertheless, businesses and individuals will have to put processes in place to mitigate some of these risks.

Friday, February 27, 2009

Scalable Data Stores

This week I came across two great articles about scalable data repositories built on MySQL. The first article is from Bret Taylor, where he explains how they use MySQL at FriendFeed. The second one is from Jurriaan Persyn, and he discusses database sharding at Netlog. Both are fantastic reads. These are two examples of creative approaches to handling massive scalability requirements. They join a number of other projects, like Cassandra and Voldemort, looking for ways to scale out to meet the requirements of the most demanding applications. A lot of these approaches might today seem more like best practices than bleeding edge, although I say this with reservation because these techniques are still not easily embraced by the majority of the corporate world.
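The core technique behind articles like Netlog's can be sketched in a few lines: hash a user id to pick one of N database shards, so all of that user's rows live together. The shard names below are illustrative:

```python
import hashlib

SHARDS = ["db-shard-0", "db-shard-1", "db-shard-2", "db-shard-3"]

def shard_for(user_id):
    # A stable hash keeps a given user on the same shard across requests;
    # md5 here is for distribution, not security.
    digest = hashlib.md5(str(user_id).encode()).hexdigest()
    return SHARDS[int(digest, 16) % len(SHARDS)]

# All queries for one user hit a single shard; cross-user queries must
# fan out to every shard, which is the price of this design.
print(shard_for(42), shard_for(43))
```

The hard parts the articles discuss, rebalancing when shards are added and avoiding cross-shard joins, all follow from this one routing function.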

However, what truly amazes me is the true genius of companies like Inktomi, Yahoo and Google that came to the same conclusions 10 years ago. They had the vision to understand the challenges ahead and the courage to follow the path less traveled. They did not settle for suboptimal solutions using commercially available technology. Instead they chose to innovate: writing their own file system, their own distributed storage and their own algorithms. True genius, the courageous kind.  

Wednesday, February 25, 2009

 thoughts

A few weeks ago I started looking at, Salesforce's Cloud development platform. Initially I was a bit skeptical because I had never thought of Salesforce as a platform provider. I have used their SFA and CRM applications many times, and although I have always liked their simplicity and performance, I did not know what to expect from the underlying infrastructure as a development platform.

I must admit that I have been very pleasantly surprised. Salesforce is well known for its aggressive marketing but the quality and depth of their technology should not be overlooked. However what has impressed me the most is their pragmatic approach to software development and the quality of their documentation and developer support.

What do I mean by a pragmatic approach to software development? I mean that Salesforce has developed technology that is truly focused on solving a business problem. As you look around their API, data models, tools and configuration options, it is easy to see what specific problem they address. You will not find technology for technology's sake.

The quality of their documentation and developer support is quite remarkable. Salesforce truly makes it easy for developers to get started. Their support turnaround time is terrific and their user and developer communities are active and vibrant. I will discuss the platform in a future post, but as of now I consider a leading Cloud platform that should be carefully considered by anyone looking to build an application on the Cloud.

Monday, February 23, 2009

Open Data Mashups

A few days ago the smart fellows from JuiceAnalytics published an interesting Treemap using data from a draft of the economic stimulus bill. I am a big fan of Treemaps and I think they executed very well. 
This exercise reminded me of several comments on Open Government. I believe it was President Obama who a few weeks ago was talking about taking data from several government agencies and offering it to the public, Open Source style. His theory was that citizens would take it upon themselves to analyze the data and hold their government accountable.
I find the concept fascinating. Maybe we could take the tax returns of individuals in power and pass them through a TurboTax Web Service to detect irregularities. At least it would help narrow down the field of candidates for cabinet positions.

Sunday, February 22, 2009

Weekly roundup

These are the Cloud Computing announcements that I found most interesting this past week:
  • Juniper and IBM partner for Cloud management. The Cloud ecosystem continues to grow, but now the emphasis is on networking and operational processes. The effort and awareness generated by these giants will continue to accelerate Cloud developments and deployments. However, I get a bit nervous when too much attention is centered on private clouds. Few cases will justify the investment in a private Cloud; for most, I think it defeats the purpose. In my mind, embracing public Clouds securely will create a more open world.

Tuesday, February 17, 2009

The Cloud and the RDB

One of the promises of the Cloud is limitless disk storage. This storage is often delivered by massively distributed data engines. The scalability and simplicity of these engines come with some compromises, which tend to surprise people with relational database backgrounds. Tony Blain at RWW does a fantastic job explaining the differences, threats and opportunities. This is the architecture area of Cloud Computing where I find the most concerns and confusion. A superficial understanding of the application requirements can lead to performance bottlenecks or costly architecture mistakes. Some applications will be able to use key-value data stores, others might need to deploy a partitioned row/column relational database, while others might need a combination. Tony does a great job outlining the high-level differences. The decision you make can affect your Cloud partners, your team's skill set and your development approach. When looking at data repositories for your cloud application, do the deep dive; it will be time well spent.
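A concrete way to see the compromise that surprises relational folks: key-value stores have no JOIN, so data is denormalized at write time. A sketch with plain dicts standing in for a store like SimpleDB (the record layout is invented):

```python
# An in-memory dict stands in for a distributed key-value store.
kv = {}

def save_order(order_id, customer, items):
    # Denormalized: the customer's name is copied into every order record,
    # trading storage and update cost for a single-key read later.
    kv[f"order:{order_id}"] = {"customer_name": customer["name"],
                               "items": items}

def get_order(order_id):
    # One lookup, no join -- but renaming a customer now means
    # rewriting every one of their order records.
    return kv[f"order:{order_id}"]

save_order(1, {"name": "Acme Corp"}, ["widget"])
print(get_order(1))
```

In a relational design the name would live once in a customers table and be joined at read time; the key-value design inverts that trade, which is exactly the kind of decision the deep dive should surface.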

Monday, February 16, 2009

Cloud Success Story

Cindy Waxer wrote a nice article for Fortune Small Business about the experience of a couple of companies using Cloud Computing for mission critical applications. It makes a lot of sense that the majority of early adopters tend to be medium or small companies. It could be that large corporations are more skeptical, that they have resources to spare, or that simple job security stands in the way of innovation.

Cloud Computing Paper

This is an interesting paper about Cloud Computing from the University of California at Berkeley. A lot of its content is focused on AMZN's AWS but I think it does a very good job at explaining the high level opportunities for Cloud Computing as well as the challenges it faces. Don't expect deep technical analysis of architecture, protocols or patterns. Instead you will find solid examples that will satisfy a broad audience. Given all of the fragmented content on Cloud Computing available online, I consider this guide a very solid introduction to Cloud Computing's past, present and future.

Wednesday, February 11, 2009

Weekly roundup

These are the headlines that caught my attention this past week:
  • IBM launches AMZN AMIs. Glad to see IBM move in this direction. Customers will still need to have their own licenses but the provisioning and setup should be much faster. This would have been very handy for a former customer of mine in the automotive industry. We had a 45 day project delay because the production server was not ready.
  • Experian launches QAS Pro On Demand, address verification and standardization on demand. I am a big fan of on demand data quality/enhancement services. In house solutions could be very expensive and cumbersome to put together. 
  • YHOO announces the pricing structure for BOSS. I am all for this because nothing can be free forever. I hope other vendors follow suit and charge users for extended use of their services. I also wish everybody (AMZN AWS included) offered a limited set of services at no charge.
  • The New York Times "big" API is now available: 2.8 million articles from 1981 to date.

Tuesday, February 10, 2009

Reading Radar

I came across the wonderful Reading Radar mashup by John Herren. It is a great example of the power of mashups: simplicity and elegance. Of course, John's neat packaging helps enhance the overall experience.

I really liked this mashup for 3 reasons: time to market, maintainability and business angle. Time to market: per John's blog, the mashup was very simple to put together, maybe a handful of days. Granted, John is very talented, but the end product is well worth 2 or 3 weeks of effort; still fairly quick to put together. Maintainability: I love the fact that the content is managed by AMZN and the NYTimes. With automatic updates, this application requires no administration overhead. Finally, the business angle: by leveraging AMZN's associate program, John now has the opportunity to profit from this application with minimal ongoing costs (e.g. hosting).

This example helps to underline the tremendous potential of the programmable web. A few dozen reusable services can potentially power thousands of applications and if these applications expose their own APIs then they become building blocks on their own to create millions of new services. I believe this formidable domino effect will create a new virtual marketplace with almost unlimited economic power.

Saturday, February 7, 2009

Weekly Roundup

These are the headlines that caught my attention this past week:
  • Sun announces Cloud Computing Service. This is a somewhat late entry from a struggling company. I look forward to more details; I think they face an uphill battle against more established players like AMZN and GOOG and against more recent offerings from IBM and MSFT.
  • Mosso puts pressure on the pricing of cloud storage and content delivery, directly challenging Amazon. I don't know how these services compare but I am sure Mosso will not be the last vendor to go after AMZN with aggressive pricing. I have never been a big fan of competing on price in a market that is still so young. I would rather see additional functionality be the key differentiator. Particularly when there are so many opportunities to innovate. I hope this is not a sign that the Mosso team is running low on ideas.

Wednesday, February 4, 2009

Cloud Availability

Recently I was talking to an IT executive about Cloud Computing. He was concerned about vendor reliability and SLAs. This is a recurring topic among Cloud skeptics. I often hear them quote the Gmail outages from a few months ago.

Downtime tolerance certainly varies by application and there is a school of thought that proposes that applications with high availability requirements will never move to the Cloud. 

"Independent Applications" that control all of its components should have no problems.  Services like Amazon's EC2 provide so much flexibility and control that administrators and developers can feel right at home. Traditional recovery and redundant approaches can be put in place but with the benefit of having limitless (virtual) resources that can be launched on demand. These are the key ingredients to satisfy even the most demanding environments.

Applications that rely on external services face a bigger challenge. These applications will be as strong as their weakest link. Whether using public services or from partners these Mashups or composite applications will have to develop new patterns to handle exceptions and recovery. Maybe through data caching if timeliness requirements permit or maybe by re-routing to secondary providers. Whatever the approach might be, it seems that the first order of priority is working with IT leaders to ensure a smooth migration for the "Independent Applications". 
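One of the patterns suggested above, re-routing to a secondary provider with cached data as a last resort, can be sketched as follows. The provider functions are stand-ins for real external service calls:

```python
class CompositeService:
    def __init__(self, primary, secondary):
        self.providers = [primary, secondary]
        self.cache = {}

    def fetch(self, key):
        for provider in self.providers:
            try:
                value = provider(key)
                self.cache[key] = value   # refresh the cache on any success
                return value
            except Exception:
                continue                  # this link failed; try the next one
        # Last resort: possibly stale cached data beats a full outage.
        return self.cache.get(key)

def down(key):
    # Simulates the weakest link being unavailable.
    raise RuntimeError("provider unavailable")

svc = CompositeService(down, lambda key: f"{key}-from-backup")
print(svc.fetch("quote"))
```

Whether stale data is acceptable depends on the timeliness requirements mentioned above; for a stock quote it may not be, for a product catalog it usually is.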

Monday, February 2, 2009

Cloud Computing Definition

William Hurley's post about the need to have a common definition for Cloud Computing made me think for a couple of days; particularly this quote:
"What matters is whether or not the community can get together, collaborate on a definition, and support that definition."
To me Cloud Computing is the most disruptive technology of the past 10 years (wireless communications was the previous one) and contrary to popular wisdom, the lack of a common definition will not hamper its evolution nor its adoption.

For many years the Internet was just as difficult to define but its clarity and effectiveness took the world by storm. Cloud Computing will be the same. The Cloud will continue to find its way into our every day life, at home and at work. Call it Gmail, AWS, Twitter, GAE or Virtualization. Individuals and businesses of any size will be convinced by the Cloud's simplicity, efficiency and economics.

While Vendors, Press and Analysts might struggle a bit, the Cloud will continue to impact the way we communicate and process information. We are witnessing a technology tsunami. It is here to stay and it needs no formal definition. 

Saturday, January 31, 2009

Weekly roundup

These are the Cloud Computing news or announcements that I found most interesting this past week, in no particular order:
  • IBM's announcements at Lotusphere. I think this is a brilliant move by IBM: incrementally expanding their current offering to leverage leading cloud services.
  • The New York Times reports that CISCO plans to launch a computer server with virtualization software. This is an intriguing move. I'm not sure I can see CISCO being very successful in this market but they are a world class company and I'm looking forward to see how it plays out.
  • This article on the Business Mirror describes how the US Department of Defense is looking to leverage the lessons learned from leading Cloud Computing providers.
  • The New York Times released their Best Sellers API. This is a nice addition to their Congress, Movie Reviews, Campaign Finance and Community APIs. 

Thursday, January 29, 2009

Challenges for Cloud Computing adoption

Bernard Golden wrote a great article on "The Case Against Cloud Computing". In this article he cites several interesting challenges slowing the adoption of Cloud Computing in the enterprise, along with his thoughts on how they will be addressed. 

Several months ago I was watching a presentation by Google's Jeff Ragusa, and he made a point that really caught my attention. Many years ago the information technology available to corporations or governments was significantly superior to anything available to consumers. But over the past 10 or 15 years there has been a significant change. Open standards and economies of scale have put powerful technology in the hands of consumers and small businesses. Today, as Jeff points out, there are many instances where small businesses or individuals are able to deploy solutions faster, easier and cheaper than many leading corporations.

This brings me back to some of Bernard's points. I agree with him that most of the challenges that he mentions will be overcome over time. But I think the solutions will be the result of different approaches, different laws and different expectations. I think the technical evolution will foster a business evolution. Processes, regulations and business models will change. These changes will come not as a way to adapt to the cloud but as a way to harness its power.

Wednesday, January 28, 2009

Cloud Computing and SOA

In a blog posting earlier this week David Linthicum writes about the synergies between SOA and Cloud Computing. He mentions that "... the real value of cloud computing is the ability to identify services, data, and processes that can exist outside the firewall in somebody else's data center".

This is an interesting thought. While I tend to agree that one of the benefits of Cloud Computing is the ability to outsource resource management (in this case data center capacity), in my mind the true value of cloud computing goes far beyond. 

The true value of cloud computing is that it enables the creation of never before seen applications. The scalability, openness and speed to market are already reshaping the traditional monolithic and rigid corporate applications. This will be a transformational change. Enabling and supporting this transformation is the true value of Cloud Computing.

David also mentions that "... one can consider Cloud Computing the extension of SOA out to cloud delivered resources ...". The way I see it, Cloud Computing is the ultimate instantiation of Service Oriented Architecture. I think software architectures like REST and programming frameworks like Ruby on Rails will drive the evolution of SOA towards ROA (Resource Oriented Architecture), as Sam Ruby and Leonard Richardson like to call it. Not really an extension, but rather an evolution.

Tuesday, January 27, 2009

Google Developer Conference

Google has opened up registration for Google I/O, Google's yearly event for software developers. This year the key topics will include Android, App Engine, Chrome, Google Web Toolkit and the AJAX APIs, among others.

Here you can find session videos and slides from last year's event. I have found them to be a very valuable resource. Enjoy.

About this blog

Over the past several months I have become extremely interested in cloud computing. I marvel at the potential of the programmable web and elastic computing resources. I believe these technologies will enable a fundamental transformation of corporate IT applications and infrastructure. This blog will contain informal thoughts and observations about Cloud Computing, Mashups and Analytics. From time to time I will post links to news articles and resources that I find interesting.

The views and opinions expressed in these pages are strictly mine.