Or “Darth Vader vs Godzilla”
Way way back in January, I’d heard loud and clear that companies were not expecting to mix cloud computing loads. I was treated like a three-eyed Japanese tree slug for suggesting that we could mix HPC and Analytics loads with business applications in the same clouds. The consensus was that companies would stand up independent clouds for each workload. The analysis work was too important to interrupt and the business applications too critical to risk.
It has always rankled me that all those unused compute cycles (“the dark cycles”) go to waste when they could be put to good use. It appeals to my eco-geek side to make the best possible use of all those idle servers. Dave McCrory and I even wrote some cloud patents around this.
However, I succumbed to the scorn and accepted the separation.
Now, all of a sudden, this idea seems to be playing Godzilla to a Tokyo-shaped cloud data center. I see several forces converging to resurrect mixing workloads:
- Hadoop (and other map-reduce analytics tools) are becoming required business tools
- Public clouds are making it possible to quickly (if not cheaply) set up analytic clouds
- Governance of virtualization is getting better
- Companies want to save some $$$
This trend will only continue as Moore’s Law improves the compute density of hardware. Since architectures are trending toward scale-out designs that distribute applications over multiple nodes, it is not practical to expect an application to consume all the power of a single computer.
That leaves a lot of lonely dark cycles looking for work.
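To make the idea concrete, here is a minimal toy sketch (not from the post, and far simpler than a real cluster scheduler) of harvesting dark cycles: given per-node CPU utilization, batch analytics tasks are placed only on nodes with headroom, leaving busy business-application nodes alone. The node names, the 60% utilization threshold, and the task list are all hypothetical.

```python
def find_dark_cycles(utilization, threshold=0.60):
    """Return names of nodes whose CPU utilization leaves room for batch work."""
    return [node for node, load in sorted(utilization.items()) if load < threshold]

def schedule_batch_tasks(tasks, utilization, threshold=0.60):
    """Round-robin batch tasks onto under-utilized nodes; empty plan if none."""
    idle = find_dark_cycles(utilization, threshold)
    if not idle:
        return {}  # no headroom: leave the business apps alone
    return {task: idle[i % len(idle)] for i, task in enumerate(tasks)}

if __name__ == "__main__":
    # Hypothetical snapshot: one busy business-app node, two mostly idle ones.
    utilization = {"app-01": 0.85, "app-02": 0.30, "app-03": 0.10}
    plan = schedule_batch_tasks(["map-1", "map-2", "reduce-1"], utilization)
    print(plan)  # analytics tasks land on app-02 and app-03 only
```

A real scheduler would also watch utilization continuously and preempt or migrate batch work when the business load spikes, which is exactly the governance problem the virtualization point above speaks to.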