The Innovation Conundrum and Why Architecture Matters

A number of items in the financial and business news last week set me thinking about why architecture matters to innovation. Both IBM and Apple announced their second quarter results. IBM's revenue for Q2 2011 was $26.7B, up 12% on the same quarter last year, and Apple's revenue for the same quarter was $24.67B, an incredible 83% jump on the same quarter last year. As I'm sure everyone now knows, IBM is 100 years old this year whereas Apple is a mere 35 years old. It looks like both Apple and IBM will become $100B companies this year if all goes to plan (IBM having missed joining the $100B club by a mere $0.1B in 2010). Coincidentally a Forbes article also caught my eye. Forbes listed the top 100 innovative companies. Top of the list was salesforce.com, Apple were number 5 and IBM were, er, not in the top 100! So what's going on here? How can a company that pretty much invented the mainframe and the personal computer, helped put a man on the moon, invented the scanning tunnelling microscope and used it to write the letters IBM onto a nickel crystal one atom at a time, and, most recently, took artificial intelligence a giant leap forward with Watson, not be classed as innovative?

Perhaps the clue lies in how innovation is being measured. The Forbes article measures innovation by an "innovation premium", which it defines as:

A measure of how much investors have bid up the stock price of a company above the value of its existing business based on expectations of future innovative results (new products, services and markets).
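
Stripped of the jargon, the premium is just the gap between what investors are paying for the stock and what the existing business is worth on its own, expressed as a fraction of the latter. Here's a back-of-the-envelope sketch (the figures, and the simple formula itself, are my own illustration; Forbes' actual methodology is rather more involved):

    # A rough sketch of an "innovation premium" (illustrative only).
    # existing_value: estimated worth of the business as it stands today.
    # market_cap: what investors are actually paying for the company.

    def innovation_premium(market_cap: float, existing_value: float) -> float:
        """Fraction by which investors bid the stock above the existing business."""
        return (market_cap - existing_value) / existing_value

    # Entirely invented numbers, in $B:
    premium = innovation_premium(market_cap=120.0, existing_value=80.0)
    print(f"Innovation premium: {premium:.0%}")  # -> Innovation premium: 50%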

So it would appear that, going by this definition of innovation, investors don't expect IBM to bring any innovative products or services to market, whereas the world will no doubt be inundated with all sorts of shiny iThingys over the course of the next year or so. But is that really all there is to being innovative? I would venture not.

The final article that caught my eye was about Apple's cash reserves. Depending on which source you read, these stand at around $60B, and as anyone with cash to invest knows, sitting on it is not the best way of getting good returns! Companies generally have a few options when they amass this much cash: pay out higher dividends to shareholders, buy back their own shares, invest more in R&D, or go on a buying spree and acquire companies that fill holes in their portfolio. Whilst the last of these is a good way of quickly entering markets a company may not be active in, it tends to backfire on the innovation premium as mergers and acquisitions (M&A) are not, at least initially, seen as bringing anything new to market. M&A has been IBM's approach over the last decade or so. As well as the big software brands like Lotus, Rational and Tivoli, IBM has more recently bought lots of smaller software companies such as Cast Iron Systems, SPSS and Netezza.

A potential problem with this approach is that people don't want to buy a "bag of bits" and have to assemble their own solutions Lego-style. What they want are business solutions that address the very real and complex (wicked, even) problems they face today. This is where the software architect comes into his or her own. The role of the software architect is to take existing components and assemble them "in interesting and important ways". To that I would add innovative ways as well. Companies no longer want the same old solutions (ERP system, contact management system etc.) but new and innovative systems that solve their business problems. This is why we have one of the more interesting jobs out there today!

The Legacy Issue

Much of the work we do as architects involves dealing with the dreaded "legacy systems". Of course legacy actually means the last system built, not necessarily one that is 5, 10, 20 or more years old. As soon as a system goes into production it is basically "legacy". As soon as new features get added, that legacy system gets harder to maintain and more difficult to understand; entropy (in the sense of disorder or randomness) sets in.

Apple have recently been in the news again for the wrong reasons because some of the latest iPods do not work with previous versions of Mac OS X. Users have been complaining that they are being forced to upgrade to the latest version of OS X in order to get their shiny new iPods to work. To make matters worse, however, Apple do support the relatively ancient Windows XP. Apple have always taken a fairly hard line when it comes to legacy, not supporting backwards compatibility particularly well when their OS gets upgraded. The upside is that the operating system does not suffer from the "OS bloat" that Windows seems to (the last version of OS X actually had a smaller footprint than the previous one).

As architects it is difficult to focus both on maintaining legacy systems and on figuring out how to replace them. As Seth Godin says: "Driving with your eyes on the rearview mirror is difficult indeed". At some point you need to work out whether it is better to abandon the legacy system and replace it, or soldier on supporting an ever harder to maintain system. There comes a point where the effort and cost of maintaining legacy is greater than that needed to replace the system entirely. I'm not aware of any formal methods that help answer this particularly hard architectural question, but it's one I think any architect should try to answer before embarking on a risky programme of updating existing systems.
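
For what it's worth, even a crude cost model can at least frame the keep-or-replace decision. The sketch below (every figure, and the simple compound-growth assumption, is mine, purely for illustration) finds the first year in which the cumulative cost of keeping the legacy system overtakes the cost of replacing it:

    # A back-of-the-envelope keep-vs-replace model (all numbers assumed).

    def breakeven_year(maint_cost: float, maint_growth: float,
                       replace_cost: float, new_maint_cost: float,
                       horizon: int = 20):
        """First year in which keeping the legacy system has cost more than
        replacing it and maintaining the new one, or None if it never does."""
        keep, replace = 0.0, replace_cost
        for year in range(1, horizon + 1):
            # Legacy maintenance compounds as entropy sets in.
            keep += maint_cost * (1 + maint_growth) ** (year - 1)
            replace += new_maint_cost
            if keep > replace:
                return year
        return None

    # Maintenance starts at $0.5M/yr and grows 15% a year; a replacement
    # costs $2M up front and then $0.2M/yr to maintain.
    print(breakeven_year(0.5, 0.15, 2.0, 0.2))  # -> 5

Garbage in, garbage out, of course: the answer is only as good as the estimates, but writing the model down at least forces the estimates out into the open.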

Open vs. Closed Architectures

There has been much Apple bashing of late in cyberspace, as well as in the 'dead-wood' parts of the press, to the extent that some people are now turning on those who own one of Apple's wunder-devices (the iPad), accusing them of being "selfish elites". Phew! I thought it was a typically British trait to knock anything and anyone that was remotely successful, but it now seems the whole world has it in for Mr Jobs' empire. Back in the pre-Google days of 1994 Umberto Eco declared that:

the Macintosh is Catholic and that DOS is Protestant. Indeed, the Macintosh is counter-reformist and has been influenced by the ratio studiorum of the Jesuits. It is cheerful, friendly, conciliatory; it tells the faithful how they must proceed step by step to reach — if not the kingdom of Heaven — the moment in which their document is printed.

The big gripe most people have with Apple is their closed architecture, which controls not only who is allowed to write apps for their operating systems but also who can produce devices that actually run those operating systems (er, that would be Apple). It's one of life's great anomalies that Apple is so successful in building products with closed architectures when most people would agree that open architectures and systems are ultimately the way to go since, in the end, they lead to greater innovation, wider usage and, presumably, more profit for those involved. The classic case of an open architecture leading to widespread usage is that of the original IBM Personal Computer. Because IBM wanted to fast-track its introduction, many of the parts were, unusually for IBM, provided by third parties including, most significantly, the processor (from Intel) and the operating system (from the fledgling Microsoft). This, together with the fact that the technical information on the innards of the computer was made publicly available, essentially made the IBM PC 'open'. This more than anything gave it an unprecedented penetration into the marketplace, allowing many vendors to provide IBM PC 'clones'.

There is of course a 'dark side' to all of this. Thousands of vendors all providing hardware add-ons and extensions, as well as applications, resulted in huge inter-working problems, which in the early days at least required you to be something of a computer engineer if you wanted to get everything working together. This is where Apple stepped in. As Umberto Eco said, Apple guides the faithful every step of the way. What they sacrifice in openness and choice they gain in everything working out of the box, sometimes in three simple steps.

So, is open always best when it comes to architecture or does it sometimes pay to have a closed architecture? What does the architect do when faced with such a choice? Here’s my take:

  • Know your audience. The early PCs, like it or not, were bought by technophiles who enjoyed technology for the sake of technology. The early Macs were bought by people who just wanted to use computers to get the job done. In those days both had a market.
  • Know where you want to go. Apple stuck solidly to creating user-friendly (not to mention well-designed) devices that people would want to own and use. The plethora of PC providers (of which there soon were many) by and large couldn't give a damn about design. They just wanted to sell as many devices as possible and let others worry about how to stitch everything together. This in itself generated a huge industry which, in a strangely self-fulfilling way, led to more devices and world domination for the PC, leaving Apple in a niche market. Openness certainly seemed to be paying.
  • Know how to capitalise on your architectural philosophy. Ultimately openness leads to commoditisation. When anyone can do it, price dominates and the cheapest always wins. If you own the space then you control the price. Apple's recent success has come not from capitalising on an open architecture but from capitalising on good design, which has enabled it to create high-value, desirable products, showing that good design trounces an open architecture.

So how about combining the utility of an open architecture with the significance of a well-thought-through architecture to create a great design? Which, funnily enough, is what Dan Pink meant by this:

Significance + Utility = Design

Huh, beaten to a good idea again!

New Dog, Old Tricks

I can't believe this, but today I have observed no fewer than three people using the latest wonder-gadget from Apple (the iPad) to play solitaire, Tetris and some other game which seemed to involve nothing more than poking the screen at moving shapes! Having just bought my own iPad, and being convinced it conforms to Arthur C. Clarke's third law (any sufficiently advanced technology is indistinguishable from magic), I am aghast that such a technological wonder is being used for such mind-numbing activities; just dust off your ZX Spectrums, guys!