Cleantech and history of software

I’m on my way to Linz, Austria to participate in a panel at an event called the Cleantech Venture Forum. We’ve spent a lot of the last two years looking at ways to save power in data centers and mini-data centers (which may be a term nobody else uses). If we look at the history of software, the key thing to optimize changes over time:

  1. Alan Turing Era – minimize processor time and memory use (machines are few and very expensive)
  2. Gene Amdahl/Seymour Cray Era – maximize throughput (big batch jobs in expensive machines)
  3. Gordon Bell Era – maximize responsiveness (suddenly we have terminals and impatient people screaming at them.)
  4. Bill Gates Era (client server era) – minimize programmer time and maximize functionality.  This is the era marked by three inspiring principles:

    1. Can’t we get these 4000 poorly educated, poorly treated code monkeys to produce something valuable?
    2. No matter how crappy this is, it will work better after more memory and more processors.
    3. Get it out the door and let the customers debug it (or “Ready, Fire, Aim” as a former boss used to put it before our company went bankrupt!).
  5. Al Gore Era – power use per unit of functionality becomes important.

Something like that.