Why cloud compute will be free

Today at Dell, I was presenting to our storage teams about cloud storage (aka the “storage banana”) and Dave “Data Gravity” McCrory reminded me that I had not yet posted my epiphany explaining “why cloud compute will be free.”  This realization derives from other topics that he and I have blogged about but never stated quite so simply.

Overlooking the fact that compute is already free at Google and Amazon, you must understand that it’s a cloud eat cloud world out there where losing a customer places your cloud in jeopardy.  Speaking of Jeopardy…

Answer: Something sought by cloud hosts to make profits (and further the agenda of our AI overlords).

Question: What is lock-in?

Hopefully, it’s already obvious to you that clouds are all about data.  Cloud data takes three primary forms:

  1. Data in transformation (compute)
  2. Data in motion (network)
  3. Data at rest (storage)

These three forms combine to create cloud-architected applications (service oriented, externalized state).

The challenge is to find a compelling charge model that both:

  1. Makes it hard to leave your cloud AND
  2. Encourages customers to use your resources effectively (see #1 in Azure Top 20 post)

While compute demand is relatively elastic, storage demand is consistent, predictable, and constantly growing.  Data is easily measured and difficult to move.  In this way, data is the perfect anchor for cloud customers (model rule #1).  A host with a growing data consumption footprint will have a long-term, predictable revenue base.

However, storage consumption alone does not encourage model rule #2.  Since storage is the foundation of the cloud, hosts can fairly judge resource use by measuring data egress, ingress and sidegress (attrib @mccrory 2/20/11).  This means tracking not only data in and out of the cloud, but also data transacted between the provider’s own cloud services.  For example, Azure charges for both data at rest ($0.15/GB/mo) and data in motion ($0.01/10K).
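To make that charge model concrete, here is a back-of-the-envelope sketch in Python using the Azure-style rates cited above.  The rates come from the post; the workload figures (500 GB stored, 20 million transactions) are made up purely for illustration.

```python
# Back-of-the-envelope estimate of a monthly cloud storage bill using the
# Azure-style rates cited in the post. Workload figures are illustrative only.

RATE_AT_REST_PER_GB_MONTH = 0.15   # $ per GB stored per month (data at rest)
RATE_PER_10K_TRANSACTIONS = 0.01   # $ per 10,000 transactions (data in motion)

def monthly_storage_bill(gb_stored: float, transactions: int) -> float:
    """Estimated monthly charge for data at rest plus data in motion."""
    at_rest = gb_stored * RATE_AT_REST_PER_GB_MONTH
    in_motion = (transactions / 10_000) * RATE_PER_10K_TRANSACTIONS
    return at_rest + in_motion

# Hypothetical customer: 500 GB stored, 20 million transactions in the month.
print(f"${monthly_storage_bill(500, 20_000_000):.2f}")  # -> $95.00
```

Note how the bill is dominated by data at rest, which keeps growing month over month, while the transaction charge rewards (or penalizes) how effectively the customer actually uses the data.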

Consequently, the financially healthiest providers are the ones with most customer data.

If hosting success is all about building a larger, persistent storage footprint, then service providers will give away services that drive data at rest and/or in motion.  Giving away compute eliminates the barrier for customers to set up web sites, develop applications, and build their business.  As these accounts grow, they will deposit data in the cloud’s data bank and ultimately deposit dollars in the host’s piggy bank.

However, there is a no-free-lunch caveat:  free compute will not have a meaningful service level agreement (SLA).  The host will continue to charge customers who need their applications to operate consistently.  I expect that we’ll see free compute (or “spare compute” from the cloud provider’s perspective) used heavily for early life-cycle (development, test, proof-of-concept) and background analytic applications.

The market is starting to wake up to the idea that cloud is not about IaaS – it’s about who has the data and the networks.

Oh, dem golden spindles!  Oh, dem golden spindles!

2 thoughts on “Why cloud compute will be free”

  1. My take on this is that in general these comments are spot on, but in some cases compute can cost more than data storage. For example, to test a microprocessor simulation you can spin up 200 nodes for 4 hours and then bring them down; the input data more or less remains constant and the outputs are not heavy either. In these scenarios cloud is a great fit because it cuts the time to test a microprocessor simulation from 4 days to 4 hours (for example’s sake).

     I don’t think stickiness of clients will be due to their data volume. I think stickiness will be due to a mix of several factors (of which data can be one), and these will differ based on verticals and use cases. A few things which are important to enterprises are: quality of service, a consultative approach to service delivery, integration with existing enterprise infrastructure, the vendor’s ability to innovate, and the vendor’s understanding of the client’s requirements/challenges.

     Having said that, I strongly agree that cloud providers which throw out some free services (testing the waters) will do much better in attracting and retaining customers. After all, humans will pick which cloud to go to, and if they are well versed in one vendor’s technology/process/offerings they will prefer that vendor over others. The technique of throwing out a free evaluation license (for six months or so) was mastered by Microsoft, and that was the secret behind the dominance of languages like VB in the 1990s. Later this technique was copied by other vendors like Oracle, and they are benefiting from it (SAP ignored it, and see where they are now).

     Another twist I would like to add is a vendor’s treatment of the developer community, which is closely related to my point on free service for a limited time: vendors who make it easy for developers to write code to their cloud services (from integration with new applications through programmable infrastructure to platform as a service) will win in the long term. Keep in mind: what makes data valuable is the processing of data, which is done through applications written by developers at large.


    • Thanks for the comments! The nuance about free compute is that free would have no SLA. For the compute-intensive case you describe, my expectation is that you’d be willing to pay to ensure that your job gets priority.

