If Apple is Disney then is the iPad Miley Cyrus?

Or is Apple’s walled garden more like Disney World?

With the iPad frenzy, I’ve been hearing a lot about Apple’s success with its walled garden approach.  I objected to their proprietary closed stance on principle for a long time.  When I finally caved in, I came to understand something fundamentally true about consumers: predictability matters to the mainstream.

This is really no surprise.   Walt Disney figured this out with his amusement parks a long time ago.

Disney World is the ultimate walled garden.  They relentlessly control every facet of the experience in their parks, and my family loves it.  We happily pay a premium for the experience because we know that going to Disney World will be smooth and our fun is assured.

However, we less willingly pay a second price for our Disney experience: it’s homogenous and bland.  It lacks the spontaneity and vibe of the Austin City Limits music festivals.   At festivals, the content is raw and fresh and things can go wonderfully wrong.  You may be delighted by Vampire Weekend when you’d planned to see Bob Dylan.

And so, Apple provides the quality control and censorship to Disney-ify our smart phones and tablets.  They’ve created a safe place to show off their impressive innovations.  They’ve created a limited market where they can control the spotlights.  In this way, Apple reminds me of how Disney manipulates its media outlets to create multi-talent superstars like Miley Cyrus.  They craft personas for their actors and ensure that they can sing, dance, and act.  This maximizes the appeal for Disney’s platform but blocks out other talented singers, dancers, and actors.

Back when Britney Spears was a Disney property, there was room left for other (better, truer) singers like Avril Lavigne.  Today, the sanitized Miley Cyrus talent trifecta effectively blots out the sun.

So far, the iPhone has been a platform for innovation.  Please ignore the fact that developers had to buy Apple computers to write applications for it.  Please ignore the fact that developers must pass through Apple’s QA and censors.  Please ignore the fact that you must purchase an Apple device.  Please ignore the fact that you can only purchase applications through the iTunes store.  They are a platform trifecta with hardware, software, and distribution.  This is the price you pay to ride Space Mountain: you must enter Apple’s iPark.

I’m hearing about some interesting new products emerging that will challenge Apple’s technology; however, I’m not sure if consumers are ready to leave the park and go to the festival.  I hope they are.

Disclaimer: I am a Dell employee.  We have products (based on Android) that compete with Apple’s smart phones and tablets.

Buy virtual goods at Seven-11! Zynga offers a MafiaWars burrito.

Even in the cloud provider business, we sometimes scratch our heads about how much people are willing to pay for virtual products.  A colleague was ranting enviously about a $20 virtual horse offered in World of Warcraft that sold thousands of units in the first day it was offered.  That’s over two million dollars of revenue for a vanity accessory made of brightly colored pixels!

In some ways, this is a generational challenge because I want to see real commodities in return for my cash.  Last week, my elementary-age daughter did a grueling hour of yard work so that she could purchase some brightly colored, phoenix-shaped pixels on Webkinz.  Normally, she’d have to buy a stuffed animal to get the unlock code, but now she can bypass the plush closet dweller.  When I asked if she wanted the toy that normally accompanies the virtual goods, she looked at me with the “Daddy, you are stupid but I love you anyway” look.  To her, the virtual item WAS the commodity and the toy was disposable packaging.  Upon reflection, I realized that this is a much better economic model than requiring her to purchase landfill fodder transported from sweatshops on the other side of the planet.

But I digress….

I was pumping gas today and noticed that Seven-11 is pimping concessions that are co-marketed with Zynga.  This is not just a Zynga advertising campaign – it is fully integrated physical-for-virtual-goods marketing genius.  Here’s the deal: if you buy physical food from Seven-11, then I suspect you get codes to unlock things like virtual food in FarmVille, yoyos in YoVille, and Seven-11s to rob in MafiaWars.  They even appear to target specific foods to individual games – the MafiaWars burrito was simultaneously spooky and inspiring.

I suspect that ultimately these items will only be available by purchasing goods at Seven-11.  We’re already seeing applications like Gowalla that hope to bundle physical experiences (visiting specific stores) with coupons (free Starbucks).  It’s a logical step to assume that we’ll soon be directed to physical activities (buying a slurpee) to shape virtual experiences (bumping off a crime boss).   Since it seems like a marketer’s dream come true, I’m absolutely certain that you’ll see it coming to a social network near you.

So now I’m watching for the day when having physical lunch with my virtual Facebook friends may earn us some useful currency.  I wonder what that currency will be.

Are Clouds using Dark Cycles?

Or “Darth Vader vs Godzilla”

Way way back in January, I’d heard loud and clear that companies were not expecting to mix cloud computing loads.  I was treated like a three-eyed Japanese tree slug for suggesting that we could mix HPC and analytics loads with business applications in the same clouds.  The consensus was that companies would stand up independent clouds for each workload.  The analysis work was too important to interrupt and the business applications too critical to risk.

It has always rankled me that all those unused compute cycles (“the dark cycles”) go to waste when they could be put to good use.  It appeals to my eco-geek side to make the best possible use of all those idle servers.   Dave McCrory and I even wrote some cloud patents around this.

However, I succumbed to the scorn and accepted the separation.

Now all of a sudden, this idea seems to be playing Godzilla to a Tokyo-shaped cloud data center.  I see several forces merging together to resurrect mixing workloads.

  1. Hadoop (and other map-reduce Analytics) are becoming required business tools
  2. Public clouds are making it possible to quickly (if not cheaply) setup analytic clouds
  3. Governance of virtualization is getting better
  4. Companies want to save some $$$

This trend will only continue as Moore’s Law improves the compute density of hardware.  Since our designs are trending toward scale-out architectures that distribute applications over multiple nodes, it is not practical to expect an application to consume all the power of a single computer.

That leaves a lot of lonely dark cycles looking for work.

Green Clouds?

This is an interesting take on clouds by the Guardian.  Dell’s new cloud offerings are more power efficient; however, we are racking lots and lots of servers.  It’s like everyone in China buying fuel-efficient cars – they are better than Hummers, but still going to use gas.

We’re clearly entering an age where compute consumed per person is going up dramatically.  They are correct that the cost and environmental impact of that compute is hidden from the consumer.  I have a front row seat to these cloud data centers and I can verify that lots and lots of new servers are being brought online every day. 

Welcome back to 2001.

McCrory: Zoo Tycoon = DC Management post

One of my past colleagues and fellow virtual server cloud inventor, Dave McCrory, posted an interesting comparison between Zoo Tycoon and the challenge of running a virtualized data center.

Way back in 2000, we talked about using this type of interface (or better, an FPS!) to help manage data centers.  It still seems practical that the game dynamic of herding animals (VMs), visitors (users), and attractions (infrastructure) maps very well to the daily activity of running a data center.

Looking at the landscape 10 years later, I think we’re still a long way from seeing that happen.  I don’t think it’s just a question of APIs – vendors need to look at the problem differently if we want to use this type of solution model.

Now, which start-up has the guts and vision to give this approach a try?

IEEE “Pragmatic Clouds” Presentation (2/24/10)

Tonight I presented “Pragmatic Clouds” to the IEEE Central Texas Consultants Network Meeting.

The abstract was, “Cloud computing seems to mean everything! We’ll talk in specifics about how application development and delivery models are changing based on new “cloud” commercial models. We’ll also explore how the plumbing behind the curtains is changing to reflect these new models.”

In the presentation, I covered drivers for cloud business models and how it impacts creating applications for clouds. I also described how to write a future-proof application that can work for IaaS clouds today and PaaS clouds tomorrow.

Time vs. Materials: $1,000 printer power button

Or why I teach my kids to solder

I just spent four hours doing tech support over a $0.01 part on an $80 inkjet printer.  According to my wife, my hours were a drop in the bucket: I was just the latest in a long line of comrades-in-geekdom who had been trying to get her office printer printing.  All told, at least $1,000 worth of experts’ time was invested.

It really troubles me when the ratio of purchase cost to support cost exceeds 10x for a brand new device.

In this case, a stuck power button cover forced the printer into a cryptic QA test mode.  It was obvious that the button was stuck, but not so obvious that it effectively crippled the printer.   Ultimately, my 14-year-old stripped the printer down, removed the $0.01 button cover, accidentally stripped a cable, soldered it back together, and finally repaired the printer.

From a cost perspective, my wife’s office would have been exponentially smarter to dump the whole thing into the trash and get a new one.   Even the effort of returning it to the store was hardly worth the time lost dealing with the return.

This thinking really, really troubles me.

I have to wonder what it would cost our industry to create products that were field maintainable, easier to troubleshoot, and less likely to fail.  The automotive industry seems to be ahead of us in some respects.  They create products that are reliable, field maintainable, and conform to standards (given Toyota’s recent woes, do I need to reconsider this statement?).  Unfortunately, they are slow to innovate and have become highly constrained by legislative oversight.  Remember the old “If Microsoft made cars” joke?

For the high tech industry, I see systemic challenges driven by a number of market pressures:

  1. Pace of innovation: our use of silicon is just graduating from crawling to baby steps.  Products from the 90s look like stone tablets compared to 2010’s offerings.   This is not just lipstick; these innovations disrupt design processes, making it expensive to maintain legacy systems.
  2. Time to market: global competitive pressures to penetrate new markets give new customer acquisition design precedence.
  3. Lack of standards: standards can’t keep up with innovation and market pressures.  We’re growing to accept the consensus model for ad hoc standardization.  Personally, I like this approach, but we’re still learning how to keep it fair.
  4. System complexity: To make systems feature rich and cost effective, we make them tightly coupled.  This is great at design time, but eliminates maintainability because it’s impossible to isolate and replace individual components.
  5. Unequal wealth and labor rates:  Our good fortune and high standard of living make it impractical for us to spend time repairing or upgrading.  We save this labor by buying new products made in places where labor is cheap.  These cheap goods often lack quality and the cycle repeats.
  6. Inventory costs: Carrying low-demand, non-standard goods in inventory is expensive.   I can buy a printer with thousands of resistors soldered onto a board for $89, while buying the same resistors alone would cost more than the whole printer.  Can anyone afford to keep the parts needed for maintenance in stock?
  7. Disposable resources: We deplete limited resources as if they were unlimited.  Not even going to start on this rant…

Looking at these pressures makes the challenge appear overwhelming, but we need to find a way out of this trap.

That sounds like the subject for a future post!

WhatTheBus, Day 1: memcached roundtrip

Today I got the very basic bus data collection working using Cucumber TDD.  That means that I wrote the basic test I wanted to prove BEFORE I wrote the code that makes the test pass.

The Cucumber feature test looks like this:

Feature: Mobile Access
In order to ensure that location updates are captured
School Bus Location providers
want to have data they send stored on the site

Scenario: Update Location
When bus named "lion" in the "eanes" district with an id of "1234" goes to "32,-97"
When I go to the bus "1234" page
Then json has an object called "buses"
And json has a record "1234" in "buses" with "lat" value "32"
And json has a record "1234" in "buses" with "lng" value "-97"

There is some code behind this feature that calls the web page and gets the JSON response back.  The code that actually does the work in the bus controller is even simpler:
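The step code itself isn’t shown here, but the heart of those “json has a record” checks is easy to sketch in plain Ruby.  This helper is hypothetical (not the actual step definitions), assuming only that the page body is the JSON the controller renders:

```ruby
require 'json'

# Hypothetical check mirroring the 'json has a record ... with ... value' steps:
# parse the response body, then verify a field on a named record in a collection.
def json_record_value?(body, collection, id, field, expected)
  data = JSON.parse(body)
  data.key?(collection) &&
    data[collection].key?(id) &&
    data[collection][id][field].to_s == expected
end

body = '{"buses":{"1234":{"lat":"32","lng":"-97"}}}'
json_record_value?(body, "buses", "1234", "lat", "32")
```

Each Cucumber step would just call a check like this against the last response.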

The at routine takes location updates, parses the parameters, and stuffs them into our cache.  For now, we’ll ignore names and district data.

def at
  # Store everything as one raw comma-joined string, expiring stale buses
  Rails.cache.write params[:id],
    "#{params[:lat]},#{params[:lng]},#{params[:name]},#{params[:district]}",
    :raw => true, :expires_in => 5.minutes
  render :nothing => true
end

The code that returns the location (index) pulls the string out of the cache and returns the value as simple JSON.

def index
  # Check for a cache miss before splitting, so a missing bus doesn't raise
  data = Rails.cache.read(params[:id], :raw => true)
  if data.nil?
    render :nothing => true
  else
    lat, lng = data.split(',')
    render :json => { :buses => { params[:id].to_sym => { :lat => lat, :lng => lng } } }
  end
end
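Since the cache entry is just one raw comma-joined string, the encode/decode round trip can be illustrated in plain Ruby (sample values taken from the feature above):

```ruby
# at joins the four fields into one raw string for the cache...
entry = ["32", "-97", "lion", "eanes"].join(",")

# ...and index splits them back out; only lat/lng are returned as JSON.
lat, lng, name, district = entry.split(",")
```

Keeping the value a raw string means memcached stores it without Ruby marshaling overhead.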

Not much to it!  It’s handy that Rails has memcached support baked right in!  I just had to add a line to the environment.rb file and start my memcached server.
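For reference, that one configuration line looks something like this in Rails 2.x; the host and port here are assumptions (memcached’s default), so adjust to wherever your server runs:

```ruby
# config/environment.rb -- point Rails.cache at a memcached server
# (hypothetical endpoint; memcached listens on 11211 by default)
config.cache_store = :mem_cache_store, "localhost:11211"
```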