All Watched Over by Machines of Loving Grace?

 

The Watching “Eye” of the HAL 9000 Computer from 2001: A Space Odyssey

I like to think
(it has to be!)
of a cybernetic ecology
where we are free of our labors
and joined back to nature,
returned to our mammal
brothers and sisters,
and all watched over
by machines of loving grace.

The last verse of Richard Brautigan’s 1967 poem, All Watched Over by Machines of Loving Grace, has a particular resonance during these dark and uncertain times caused by the COVID-19 pandemic[1].

The poem, which was also the name of a BBC documentary series by Adam Curtis[2], speaks of a time when we can return to nature and that mammals and computers will live together in “mutually programming harmony” with machines taking care of all our needs.

Things haven’t quite turned out like that, have they?

In some kind of warped way maybe our machines are taking care of our needs, but are they needs we really want taken care of? If by “meeting our needs” we mean machines whose algorithms predict and dictate our shopping choices (Amazon), influence our voting behaviour (Facebook), satisfy our sexual preferences (Tinder, Grindr) or find us cheap rides and accommodation (Uber and Airbnb) then yes, maybe we have reached a mutually programmed harmony. I’m not sure that is exactly what Brautigan had in mind though.

If the “machines of loving grace” part of the poem has not quite happened in the way Brautigan predicted, however, the “all watched over” part may be about to become only too true.

China, where the current coronavirus, SARS-CoV-2, originated, was already building the world’s largest social credit system, whereby all citizens are given points from which the authorities make deductions for bad behaviour like traffic violations, and add points for good behaviour such as donating to charity. The full system is being rolled out during this decade, at which point all citizens will be forced into using it and everything from creditworthiness to political allegiance will be ‘measured’, not just by the system but by your peers as well. If trust is broken in one place, restrictions will be imposed elsewhere, meaning the untrustworthy will have reduced access to everything from jobs, to foreign travel, to bank loans and the internet.

Now, as a way of tracking people’s freedom of movement as its citizens come out of the coronavirus lockdown, the government has, through the ubiquitous Alipay and WeChat platforms, developed a “health code” service. This assigns users a colour-coded status based on their health and travel history, plus a QR code that can be scanned by the authorities. If you have a green code you are allowed to travel relatively freely. A yellow code indicates that the holder should be in home isolation, and a red code says the user is a confirmed COVID-19 patient and should be in quarantine. In China, which is not exactly known for its liberal attitude toward privacy, this may be accepted as the price to pay for relative freedom of movement; however, as talk of such apps being rolled out in western liberal democracies starts to make the news, their citizens may not be quite as accepting of such uses of private data.
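As a rough illustration (the actual Alipay/WeChat rules have not been published, so the inputs and thresholds below are entirely invented), the colour assignment might be sketched like this:

```python
from datetime import date, timedelta
from typing import Optional

# Purely hypothetical rules: the real health code algorithm is not public,
# so these inputs and the 14-day threshold are invented for illustration.
def health_code(confirmed_case: bool, close_contact: bool,
                hotspot_visit: Optional[date], today: date) -> str:
    """Assign a colour-coded status from health and travel history."""
    if confirmed_case:
        return "red"      # confirmed COVID-19 patient: quarantine
    visited_recently = (hotspot_visit is not None
                        and today - hotspot_visit < timedelta(days=14))
    if close_contact or visited_recently:
        return "yellow"   # home isolation advised
    return "green"        # relatively free movement

print(health_code(False, False, None, date(2020, 4, 20)))  # green
```

The point is less the rules themselves than that an opaque classifier not much more sophisticated than this ends up gating a person’s freedom of movement.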

A similar system in South Korea that sends emergency virus text alerts has already produced some embarrassing revelations about infected people’s private lives. One text read: “A woman in her 60s has just tested positive. Click on the link for the places she visited before she was hospitalised.” For many people the texts, whilst intended to be helpful, are creating a climate of concern by revealing a little too much personal information, including revelations about extra-marital affairs.

At a country level there are already plentiful supplies of open data that allow apps such as this one to track COVID-19 statistics by country. The fact that we have systems and organisations that publish such data is to be applauded, providing us all (if we can be bothered to look) with plentiful amounts of data to help us come to our own conclusions and combat the unfortunately equally plentiful supply of fake news about COVID-19 that abounds on social media. However, once such data starts to get more personal, it becomes a different matter.

Dominic Cummings, the Prime Minister’s chief advisor, hosted a meeting at Downing Street on 11 March with technology company leaders to see how they could help develop an app to tackle COVID-19, and on Easter Sunday the UK government confirmed plans for an app that will warn users if they have recently been in close proximity to someone suspected to be infected with the coronavirus. Meanwhile, Apple and Google have announced a system for tracking the spread of the new coronavirus, allowing users to share data through Bluetooth technology.
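The Apple/Google design is notable for being decentralised: phones broadcast short-lived random tokens over Bluetooth, keys stay on the device, and matching happens locally against keys later published by diagnosed users. The real protocol derives its Rolling Proximity Identifiers with AES and HKDF; the toy sketch below is my own simplification of the idea, not the actual specification:

```python
import hashlib
import os

def daily_key() -> bytes:
    """A random key generated on, and normally never leaving, the phone."""
    return os.urandom(16)

def proximity_token(key: bytes, interval: int) -> bytes:
    # The broadcast token changes every interval, so passers-by cannot
    # link one broadcast to the next, or to a person.
    return hashlib.sha256(key + interval.to_bytes(4, "big")).digest()[:16]

def exposed(heard: set, published_keys: list, intervals: int = 144) -> bool:
    # Matching happens on-device: regenerate every token a diagnosed
    # user's published keys could have produced and look for overlap.
    candidates = {proximity_token(k, i)
                  for k in published_keys for i in range(intervals)}
    return any(t in candidates for t in heard)

alice = daily_key()
bob_heard = {proximity_token(alice, 42)}   # Bob's phone logged this token
print(exposed(bob_heard, [alice]))         # True once Alice's key is published
```

No central server ever learns who met whom; it only relays the keys of those who choose to report a diagnosis, which is precisely the property the privacy debate turns on.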

Four questions immediately arise from this situation.

  1. Should we trust corporations (especially Apple and Google) to be handling location data identifying where we have travelled and who we might have been close to?
  2. Can we trust the government to handle this data sensitively and with due regard to our privacy?
  3. What happens if not enough people use these apps?
  4. Once the pandemic is over can we trust the government and corporations to disable these functions from our phones and our lives?

Let’s take these one at a time.

First, are Google and Apple to be trusted with our private data? Historically, neither exactly has a clean slate when it comes to protecting private data. In 2014 third-party software was used to steal intimate photos of celebrities from Apple’s cloud service iCloud, forcing the company to expand its two-step authentication service. More recently, Hacker News revealed that Apple suffered a possible privacy breach in 2018 due to a bug in its platform that might have exposed iCloud data to other users.

Google’s failed social networking site Google+, which had already suffered a massive data breach in 2018 that exposed the private data of more than 500,000 Google+ users to third-party developers, was shut down earlier than planned in April 2019 following the discovery by Google engineers of another critical security vulnerability.

Despite the breaches of security suffered by these companies, it is probably true to say that they have a deeper understanding of their platforms than most companies and government agencies. Putting something temporary in place during this potentially existential threat to society is probably not a bad thing; what happens once the pandemic is over, however, then becomes critical.

Can we trust governments to behave properly with how they handle this data? Again, governments do not have a good track record here. Edward Snowden, in his memoir Permanent Record, reveals the extent of the mass surveillance of US citizens by the National Security Agency from 2010 onwards. If even democratically elected governments do this, what chance for the dictatorial regimes of Russia and China? Even during these unprecedented times we should not be too hasty to give away the freedoms we enjoy today without knowing the extent to which our data could be compromised. As John Naughton explains here, there are ways of doing non-intrusive tracking of COVID-19, but to do so our smartphones have to be a bit, well, smarter. This is also a good reason why, here in the UK, parliament should be recalled, even in virtual form, to ensure decisions being made in this area are challenged and subject to proper scrutiny.

Next, what happens if not enough people use the apps, either because they don’t trust the government, because not everyone has a smartphone, or because they simply can’t be bothered to install the app and keep it active? It is estimated that for this to work there must be at least a 60% take-up of the app. Can governments somehow enforce its usage and penalise users in some way if they don’t? Maybe they rule that only those who have smartphones with the app installed and active will be allowed freedom of movement to work, socialise and meet with other family members. Whilst this may encourage some to install the app, it would also put a huge burden on the police, the authorities and maybe even your employer, as well as shops, bars and restaurants, to ensure people moving around or entering their buildings have the app installed. Also, what about people who don’t have smartphones? Smartphone ownership here in the UK varies massively by age. In 2019, 96% of 25-34 year olds owned smartphones, whereas only 55% of 55-64 year olds and only 16% of people over 65 (figures only available for 2015) owned these devices. How would they be catered for?
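A quick back-of-the-envelope calculation shows why those ownership figures matter. The ownership rates below come from the figures quoted above; the population shares per age band are illustrative placeholders, not real demographics:

```python
# Ownership rates from the 2019/2015 figures quoted above;
# the population shares are invented purely for illustration.
ownership = {"25-34": 0.96, "55-64": 0.55, "65+": 0.16}
population_share = {"25-34": 0.40, "55-64": 0.35, "65+": 0.25}  # assumed

# Even if every single smartphone owner installed the app, take-up
# could not exceed this weighted ceiling.
ceiling = sum(ownership[g] * population_share[g] for g in ownership)
print(f"Maximum possible take-up: {ceiling:.0%}")  # ~62%, barely above 60%
```

On these (made-up) weights the ceiling only just clears the 60% threshold, and that is before a single person declines to install the app.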

Finally, what happens when the pandemic is over and we return to relative normality? Will these emergency measures be rolled back or will the surveillance state have irrevocably crept one step closer? Recent history (think 9/11) does not provide much comfort here. As Edward Snowden says about the US:

“The two decades since 9/11 have been a litany of American destruction by way of American self-destruction, with the promulgation of secret policies, secret laws, secret courts, and secret wars, whose traumatising impact – whose very existence – the US government has repeatedly classified, denied, disclaimed, and distorted.”

Will our governments not claim that there will always be a zoonotic-virus threat, that the war against such viruses, just like the “war on terror”, will therefore be never-ending, and that we must never drop our guard (for which read: we must keep everyone under constant surveillance)?

An open letter published by a group of “responsible technologists” calls upon the NHSX leadership and the Secretary of State for Health and Social Care to ensure that new technologies used in the suppression of coronavirus follow ethical best practice, warning that if corners are cut, the public’s trust in the NHS will be undermined. The writer Yuval Noah Harari, who is quoted in the open letter by the data campaigners, warns that such measures have a nasty habit of becoming permanent. But he also says this: “When people are given a choice between privacy and health, they will usually choose health.”

Once the surveillance genie has been let out of its bottle it will be very difficult to squish it back in again and return to times of relative freedom. If we are not careful, the machines watching over us may not be ones of loving grace but rather ones of mass surveillance and constant monitoring of our movements, making us all a little less free and a little less human.

  1. COVID-19 is the disease caused by the 2019 novel coronavirus or, to give it its World Health Organisation designated name, severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2).
  2. No longer available on the BBC iPlayer but can be found here.

The Art of What’s Possible (and What’s Not)

One of the things Apple are definitely good at is giving us products we didn’t know we needed (e.g. the iPad). Steve Jobs, who died a year ago this week, famously said “You’ve got to start with the customer experience and work back to the technology — not the other way around” (see this video at around 1:55 as well as this interview with Steve Jobs in Wired).

The subtle difference from the “normal” requirements gathering process here is that, rather than asking what the customer wants, you are looking at the customer experience you want to create and then trying to figure out how available technology can realise that experience. In retrospect, we can all see why a device like the iPad is so useful (movies and books on the go, a cloud-enabled device that lets you move data between it and other devices, mobile web on a screen you can actually read, etc.). Chances are, however, that it would have been very difficult to elicit a set of requirements from someone that would have ended up with such a device.

Jobs goes on to say “you can’t start with the technology and try to figure out where you’re going to try and sell it”. In many ways this is a restatement of the well known “golden hammer” anti-pattern (to a man with a hammer, everything appears as a nail) from software development, the misapplication of a favored technology, tool or concept in solving a problem.

Whilst all this is true and would seem to make sense, at least as far as Apple is concerned, there is still another subtlety at play when building truly successful products that people didn’t know they wanted. As an illustration of this consider another, slightly more infamous Apple product, the Newton Message Pad.

In many ways the Newton was an early version of the iPad or iPhone (see above for the two side by side), some 25 years ahead of its time. One of its goals was to “reinvent personal computing”. There were many reasons why the Newton did not succeed (including its large, clunky size and poor handwriting recognition system), but one of them must surely have been that the device was just too far ahead of the technology available at the time in terms of processing power, memory, battery life and display technology. Sometimes ideas can be really great but the technology is just not there to support them. So, whilst Jobs is right in saying you cannot start with the technology and then decide how to sell it, equally you cannot start with an idea if the technology is not there to support it, as was the case with the Newton. So what does this mean for architects?

A good understanding of technology, how it works and how it can be used to solve business problems is, of course, a key skill of any architect. Equally important, however, is an understanding of what is not possible with current technology. It is sometimes too easy to be seduced by technology and to overstate what it is capable of. Looking out for this, especially when there may be pressure on to close a sale, is something we must all do, and we must be forceful in calling it out when we think something is not possible.

Why We Need STEM++ Graduates

The need for more STEM (that’s Science, Technology, Engineering and Maths) skills seems to be on the agenda more and more these days. There is a strong feeling that the so-called developed nations have depended too much on financial and other services to grow their economies and as a result have “lost” their ability to design, develop and manufacture goods, largely because we are not producing enough STEM graduates. Whilst I would see software as falling fairly and squarely into the STEM skillset (even if it also underpins nearly all of the modern financial services industry), as this blog post by Jessica Benjamin from IBM points out, STEM skills alone won’t solve the really hard problems that are out there. With respect to the particular problems around big data, Jessica succinctly says:

All the skills it takes to tell a good story, to compose a complete orchestra, are the skills it takes to put the pieces of this big data world together. If data is just data until its information, what’s a lot of information without the thought and skill of pulling all the chords together?

The need for right- as well as left-brained thinkers to solve the world’s really, really hard business problems is something that has been recognised for some time now by several prominent business leaders. Indeed, the intersection of technology (left-brained) and design (right-brained) has certainly played a part in much of what technology companies like IBM and Apple have done and has made them successful.

So we need not just STEM skills but STEM++ skills, where the addition of “righty” skills like arts, humanities and design helps us build not just a smarter world but one that is better to live in. For more on this check out my other (joint) blog The Versatilist Way.

Giving Users What They Want (Maybe)

Tradition has it that users come up with a set of requirements which architects and designers take and turn into “a solution”: a combination of bespoke and off-the-shelf hardware and software components, assembled in such a way that they address all the requirements (non-functional as well as functional). Another point of view is that users don’t actually know what they want and therefore need to be guided toward solutions they never knew they needed or indeed knew were possible. Two famous proponents of this approach were Henry Ford, who supposedly said:

If I had asked people what they wanted, they would have said faster horses.

which is debunked here, and of course Steve Jobs and Apple, whose “Eureka” moments continue to give us gadgets we never knew we needed. As Adrian Slywotzky points out here, however, the magic that Jobs and Apple seem to regularly perform is actually based on highly focused and detailed business design, continuous refinement through prototyping and a manic attention to the customer experience. In other words it really is 90% perspiration and 10% inspiration.

One thing that both Henry Ford and Steve Jobs/Apple did have in common was a deep understanding of the technology in their chosen fields of expertise and, more importantly, of where that technology was heading.

If, as an architect, you are to have a sensible conversation with users (AKA customers, clients, stakeholders et al) about how to create an architecture that addresses their needs, you not only need a good understanding of their business, you also need a deep understanding of what technology is available and where that technology is heading. This is a tall order for one person’s brain, which is why the job of an architect is uniquely fascinating (but also hard work). It’s also why, if you’re good at it, you’ll be in demand for a while yet.

Remember that, even though they may not know it, users are looking to you to guide them not only on the knowns but also on the unknowns. In other words, it’s your job to understand the art of the possible, not just the art of the common-or-garden.

What Can Architects Learn from Steve Jobs?

I’ve just finished reading Steve Jobs by Walter Isaacson. In case there is anyone out there who doesn’t know it yet, this is the authorised biography that Jobs asked Isaacson to write, completed a few weeks before Jobs’s untimely death, aged 56, last month. Jobs insisted that Isaacson would have complete control over the contents of the book, saying he would not even read it before it was published and adding “I don’t have any skeletons in my closet that can’t be allowed out”. Jobs was clearly a very complex personality: on the one hand a creative genius whose zen-like focus on simplicity and efficiency helped create some of the most beautiful and useful gadgets of our time (some of which we never even knew we needed), whilst on the other a bully and a tyrant who knew exactly how to “size people up, understand their inner thoughts, and know how to relate to them, cajole them, or hurt them at will”. One of Jobs’s girlfriends, who later went on to found a mental health resource network in California, even went so far as to say that she thought Jobs suffered from Narcissistic Personality Disorder (NPD), in which the individual is described as being excessively preoccupied with issues of personal adequacy, power, prestige and vanity.

Whilst it is to be hoped that NPD is not a prerequisite for being a software architect, Jobs did have a vision and understanding of IT that we as architects can learn from. Six big ideas stand out in this respect:

  1. Engineering matters. When Jobs met with President Obama in 2011 he implored the President to reform the US education system and to create more engineering students. Jobs said “if you could educate these engineers we could move more manufacturing plants here”. Whilst there was always an uneasy tension between engineering and design at Apple, Jobs recognised and valued the importance of having an engineering-led rather than sales-led team at the top of the company, berating companies like Microsoft (under Ballmer), IBM (under Akers) and HP (under their last several CEOs) for putting sales people in charge rather than engineers. For software architects, engineering clearly translates to being intimately knowledgeable about the technology you are using and knowing how to put the working parts together. The best architects I know are passionate about technology.
  2. Artistry and design matter just as much as engineering. This is a theme that Jobs emphasised over and over again, from when he dropped out of college and instead took a course on calligraphy, to his sometimes maniacal focus on the smallest details of design to make the product as satisfying and aesthetically pleasing as possible. He even insisted that circuit boards, which no one would ever see once the product was fully assembled, should be laid out in as clean and uncluttered a way as possible. It is this aspect of design that matters most for architects. Provided that functionally a system does what it is meant to do within the required constraints and system qualities, one could argue it does not matter how messily the software is assembled. Who’s going to see it anyway? This misses the point though. Great design, as opposed to just good-enough design, means the system will be easier to maintain, take less effort to learn and generally be more enjoyable for those who need to carry on working on it once the architects and developers have moved on.
  3. Simple is better than complex. Apple had a design mantra: “Simplicity is the ultimate sophistication”, or as Jobs said, “very simple, and we’re really shooting for Museum of Modern Art quality”. Jobs felt that design simplicity should be linked to making products easy to use. So much of the software we create today is far too complex and feature-rich, and as a result is very hard to use. People will often say that it has to be like that: just look at all the features you are getting. Unfortunately, a lot of the time many of those features are not needed but add to the general bloat of the systems we build, making them hard to use as well as difficult to maintain. Sadly, building a complex system is often easier than building a simple one, and few architects see value in stripping out functionality rather than adding it.
  4. An unremitting focus on detail is key to creating a great product. Jobs was unique in that he was able to hold the big picture view as well as zoom in to fine details. He would often sweat over the smallest detail until he was satisfied it was just right. This could be anything from the colour of a screw on the back plate of the iPod to the angle of the bevel on the iPad that makes someone want to pick it up. This capacity for holding the big picture view whilst also being able to zoom right down and question low-level details is probably one of the hardest things architects have to do, but being able to do so gives a definite advantage and enables greater integrity as well as better execution of vision.
  5. Customers don’t always know what they want. In September 1982, when Jobs and his team were designing the original Macintosh, he held a retreat for the Mac team near Monterey where he gave a presentation on his thoughts for the Mac. At the end someone asked whether or not they should do some market research to find out what customers wanted. “No”, replied Jobs, “because people don’t know what they want until we’ve shown them”. He then pulled out a device the size of a desk diary and flipped it open; it turned out to be a mock-up of a computer that could fit into your lap, with a keyboard and screen hinged together like a notebook. “This is my dream of what we will be making in the mid- to late eighties”, Jobs said. Apple supposedly never did do any market research, preferring to follow the approach of Henry Ford, who said he never asked what people wanted because they would have just asked for a better horseless carriage. Whilst it is probably the case that people can often see how to make incremental improvements to products, they usually cannot see how to make disruptive changes that introduce a whole new way of doing things, possibly making everything that went before redundant. It is the job of the architect to show what is in the realms of the possible by creating new and innovative systems.
  6. Putting things together in new and creative ways is sometimes more important than inventing things. Jobs was not the first to market with an MP3 player, a mobile phone or a tablet computer. Others had already innovated and built these things. What Jobs and Apple did was to tweak things that already existed. As Isaacson says, “he had noticed something odd about the cell phones on the market: They all stank, just like portable music players used to”. Jobs applied his design skills to these and came up with a (far) better product and in fact a whole new platform as well (i.e. the computer as the digital hub). Architects too need to learn that it is often putting together existing components in new and innovative ways that really counts and gives a competitive and business advantage.

Steve Jobs 1955 – 2011

During the coming days and weeks millions of words will be written about Steve Jobs, many of them on devices he created. Why does the world care so much about an American CEO and computer nerd? For those of us that work with technology, and hope to use it to make the world a better place, the reason Steve Jobs was such a role model is that he not only had great vision and a brilliant understanding of design but also knew how to deliver technology in a form that was usable by everyone, not just technophiles, nerds and developers. Steve Jobs and Apple have transformed the way we interact with data, and the way that we think about computing, moving it from the desktop to the palm of our hands. As IT becomes ever more pervasive we could all learn from that and maybe even hope to emulate Steve Jobs a little.

Five Software Architectures That Changed The World

Photo by Kobu Agency on Unsplash

“Software is the invisible thread and hardware is the loom on which computing weaves its fabric, a fabric that we have now draped across all of life”.

Grady Booch

Software, although an “invisible thread”, has certainly had a significant impact on our world and now pervades pretty much all of our lives. Some software, and in particular some software architectures, has had a significance beyond just the everyday and has truly changed the world.

But what constitutes a world changing architecture? For me it is one that meets all of the following:

  1. It must have had an impact beyond the field of computer science or a single business area and must have woven its way into people’s lives.
  2. It may not have introduced any new technology but may instead have used some existing components in new and innovative ways.
  3. The architecture itself may be relatively simple, but the way it has been deployed may be what makes it “world changing”.
  4. It has extended the lexicon of our language, either literally (as in “I tried googling that word”) or indirectly in what we do (e.g. the way we now use App stores to get our software).
  5. The architecture has emergent properties and has been extended in ways the architect(s) did not originally envisage.

Based on these criteria here are five architectures that have really changed our lives and our world.

World Wide Web
When Tim Berners-Lee published his innocuous sounding paper Information Management: A Proposal in 1989 I doubt he could have had any idea what an impact his “proposal” was going to have. This was the paper that introduced us to what we now call the world wide web and has quite literally changed the world forever.

Apple’s iTunes
There has been much talk in cyberspace and in the media in general on the effect and impact Steve Jobs has had on the world. When Apple introduced the iPod in October 2001, although it had the usual cool Apple design makeover, it was, when all was said and done, just another MP3 player. What really made the iPod take off and changed everything was iTunes. It not only turned the music industry upside down and inside out but gave us the game-changing concept of the ‘App Store’ as a way of consuming digital media. The impact of this is still ongoing and is driving the whole idea of cloud computing and the way we will consume software.

Google
When Google was founded in 1998 it was just another company building a search engine. As Douglas Edwards says in his book I’m Feeling Lucky, “everybody and their brother had a search engine in those days”. When Sergey Brin was asked how he was going to make money (out of search) he said “Well…, we’ll figure something out”. Clearly they have since figured out that something and become one of the fastest growing companies ever. What Google did was not only create a better, faster, more complete search engine than anyone else but also figure out how to pay for it, and all the other Google applications, through advertising. They have created a new market and value network (in other words, a disruptive technology) that has changed the way we seek out and use information.

Wikipedia
Before Wikipedia there was a job called an encyclopedia salesman, who walked from door to door selling knowledge packed between bound leather covers. Now such people have been banished to the great redundancy home in the sky, along with typesetters and comptometer operators.

If you do a Wikipedia on Wikipedia you get the following definition:

Wikipedia is a multilingual, web-based, free-content encyclopedia project based on an openly editable model. The name “Wikipedia” is a portmanteau of the words wiki (a technology for creating collaborative websites, from the Hawaiian word wiki, meaning “quick”) and encyclopedia. Wikipedia’s articles provide links to guide the user to related pages with additional information.

From an architectural point of view Wikipedia is “just another wiki”; however, what it has brought to the world is community participation on a massive scale and an architecture to support that collaboration (400 million unique visitors monthly, with more than 82,000 active contributors working on more than 19 million articles in over 270 languages). Wikipedia clearly meets all of the above criteria (and more).

Facebook
To many people Facebook is social networking. Not only has it seen off all competitors, it makes it almost impossible for new ones to join. Whilst the jury is still out on Google+, it is difficult to see how it can ever reach the 800 million people Facebook has. Facebook is also the largest photo-storing site on the web and has developed its own photo storage system to store and serve its photographs. See this article on Facebook architecture as well as this presentation (slightly old now but interesting nonetheless).

I’d like to thank both Grady Booch and Peter Eeles for providing input to this post. Grady has been doing great work on software archeology and knows a thing or two about software architecture. Peter is my colleague at IBM as well as co-author on The Process of Software Architecting.

Creative Leaps and the Importance of Domain Knowledge

Sometimes innovation appears to come out of nowhere. Creative individuals, or companies, appear to be in touch with the zeitgeist of the times and develop a product (or service) that does not just satisfy an unknown need but may even create a whole new market that didn’t previously exist. I would put James Dyson (the bagless vacuum cleaner) as an example of the former and Steve Jobs/Apple (the iPad) as an example of the latter. Sometimes the innovation may even be a disruptive technology that creates a new market where one previously did not exist and may even destroy existing markets. Digital photography and its impact on the 35mm film producing companies (Kodak and Ilford) is a classic example of such a disruptive technology.

Most times, however, creativity comes from simply putting together existing components in new and interesting ways that meet a business need. For us merely mortal software architects to do this, we need not only a good understanding of what those components do but also of how the domain we are working in really, really works. You need not only to be curious about your domain (whether it be financial services, retail, public sector or whatever) but to be able to ask the hard questions that no one else thought to ask. Sometimes this means not following the herd: being completely unfashionable rather than fashionable. As Paul Arden, the Creative Director of Saatchi and Saatchi, said in his book Whatever You Think, Think The Opposite:

People who create work that fashionable people emulate do the very opposite of what is in fashion. They create something unfashionable, out of time, wrong. Original ideas are created by original people, people who either through instinct or insight know the value of being different and recognise the commonplace as a dangerous place to be.

So do you want to be fashionable or unfashionable?

The Innovation Conundrum and Why Architecture Matters

A number of items in the financial and business news last week set me thinking about why architecture matters to innovation. Both IBM and Apple announced their second quarter results. IBM’s revenue for Q2 2011 was $26.7B, up 12% on the same quarter last year, and Apple’s revenue for the same quarter was $24.67B, an incredible 83% jump on the same quarter last year. As I’m sure everyone now knows, IBM is 100 years old this year whereas Apple is a mere 35 years old. It looks like both Apple and IBM will become $100B companies this year if all goes to plan (IBM having missed joining the $100B club by a mere $0.1B in 2010). Coincidentally, a Forbes article also caught my eye. Forbes listed the top 100 innovative companies. Top of the list was salesforce.com, Apple was number 5 and IBM was, er, not in the top 100! So what’s going on here? How can a company that pretty much invented the mainframe and the personal computer, helped put a man on the moon, invented the scanning tunneling microscope and spelled out the letters IBM on a nickel crystal one atom at a time, and, most recently, took artificial intelligence a giant leap forward with Watson not be classed as innovative?

Perhaps the clue is in what the measure of innovation is. The Forbes article measures innovation by an “innovation premium” which it defines as:

A measure of how much investors have bid up the stock price of a company above the value of its existing business based on expectations of future innovative results (new products, services and markets).

So it would appear that, going by this definition of innovation, investors don’t expect IBM to bring any innovative products or services to market, whereas the world will no doubt be inundated with all sorts of shiny iThingys over the course of the next year or so. But is that really all there is to being innovative? I would venture not.

The final article that caught my eye was about Apple’s cash reserves. Depending on which source you read, these stand at around $60B and, as anyone who has cash to invest knows, sitting on it is not the best way of getting good returns! Companies generally have a few options when they amass so much cash: pay out higher dividends to shareholders, buy back their own shares, invest more in R&D, or go on a buying spree and acquire companies that fill holes in their portfolio. Whilst acquisition is a good way of quickly entering markets a company is not yet active in, it tends to backfire on the innovation premium since mergers and acquisitions (M&A) are not, at least initially, seen as bringing anything new to market. M&A has been IBM’s approach over the last decade or so. As well as the big software brands like Lotus, Rational and Tivoli, IBM has more recently bought lots of smaller software companies such as Cast Iron Systems, SPSS and Netezza.

A potential problem with this approach is that people don’t want to buy a “bag of bits” and have to assemble their own solutions Lego-style. What they want are business solutions that address the very real and complex (wicked, even) problems they face today. This is where the software architect comes into his or her own. The role of the software architect is to “take existing components and assemble them in interesting and important ways”. To that I would add innovative ways as well. Companies no longer want the same old solutions (ERP systems, contact management systems and so on) but new and innovative systems that solve their business problems. This is why we have one of the more interesting jobs out there today!

The Legacy Issue

Much of the work we do as architects involves dealing with the dreaded “legacy system”. Of course, legacy actually means the last system built, not necessarily one that is 5, 10, 20 or more years old. As soon as a system goes into production it is basically “legacy”. As new features get added, that legacy system gets harder to maintain and more difficult to understand; entropy (in the sense of an expression of disorder or randomness) sets in.

Apple have recently been in the news again for the wrong reasons because some of the latest iPods do not work with previous versions of Mac OS X. Users have been complaining that they are being forced to upgrade to the latest version of OS X in order to get their shiny new iPods to work. To make matters worse, however, Apple do support the relatively ancient Windows XP. Apple have always taken a fairly hard line when it comes to legacy, not supporting backwards compatibility particularly well when their OS gets upgraded. The upside is that the operating system does not suffer from the “OS bloat” that Windows seems to (the latest version of OS X actually had a smaller footprint than its predecessor).

As architects it is difficult to focus both on maintaining legacy systems and on figuring out how to replace them. As Seth Godin says: “Driving with your eyes on the rearview mirror is difficult indeed”. At some point you need to decide whether it is better to abandon the legacy system and replace it, or to soldier on supporting an ever-harder-to-maintain system. There comes a point where the effort and cost of maintaining the legacy system exceed what it would cost to replace it entirely. I’m not aware of any formal methods that would help answer this particularly hard architectural question, but it’s one I think any architect should try to answer before embarking on a risky upgrade programme that involves updating existing systems.
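In the absence of a formal method, one informal way to frame the decision is a simple break-even calculation: if legacy maintenance costs grow each year as entropy sets in, at what point does cumulative spend on the old system overtake the cost of a rebuild plus the (lower) maintenance of its replacement? The sketch below is purely illustrative; the function name and all the figures are hypothetical, and real estimates would have to come from your own defect rates, change-request effort and team costs.

```python
# Illustrative maintain-vs-replace break-even sketch.
# All names and numbers are hypothetical, not a formal method.

def breakeven_year(maintain_cost, maintain_growth,
                   replace_cost, new_maintain_cost, horizon=15):
    """Return the first year in which cumulative spend on the legacy
    system exceeds cumulative spend on a replacement, or None if the
    replacement never pays for itself within the horizon."""
    legacy_total = 0.0
    replacement_total = replace_cost  # one-off rebuild cost, paid up front
    for year in range(1, horizon + 1):
        # Legacy maintenance grows each year as the system decays.
        legacy_total += maintain_cost * (1 + maintain_growth) ** (year - 1)
        # The replacement is assumed to have flat, lower maintenance.
        replacement_total += new_maintain_cost
        if legacy_total > replacement_total:
            return year
    return None

# Example: $500k/yr maintenance growing 15%/yr, versus a $2M rebuild
# that costs $200k/yr to maintain afterwards.
year = breakeven_year(500_000, 0.15, 2_000_000, 200_000)
print(year)  # the replacement pays for itself in year 5
```

Crude as it is, making the numbers explicit forces the conversation away from gut feel: a modest growth rate in maintenance cost compounds quickly, while a rebuild that never lowers the ongoing cost never breaks even at all.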