This week I’ve been attending a cloud architecture workshop. Not to architect a cloud for anyone in particular but to learn what the approach to architecting clouds should be. This being an IBM workshop there was, of course, lots of Tivoli this, WebSphere that and Power the other. Whilst the workshop was full of good advice I couldn’t help thinking of this cartoon from 2008:
Courtesy geekandpoke.typepad.com
Just replace the word ‘SOA’ with ‘cloud’ (as ‘SOA’ could have been replaced by ‘client-server’ in the early nineties) and you get the idea. As software architects it is very easy to get seduced by technology, especially when it is new and your vendors, consultants and analysts are telling you this really is the future. However if you cannot explain to your client why you’re building him a cloud and what business benefit it will bring him then you are likely to fail just as much with this technology as people have with previous technology choices.
A report on the BBC Today programme this morning argues that the “Facebook generation needs better IT skills” and that UK schools should be providing courses in programming at GCSE. The report bemoaned the fact that so-called Information and Communications Technology (ICT) GCSEs did little more than teach students how to use Microsoft Office programs such as Word and Excel and did not prepare students for a career in IT. The backers of this report were companies like Google and Microsoft.

This raises an interesting question of who should be funding such education in these austere times. Is it the role of schools to provide quite specific skills like programming, or should they be providing the basics of literacy and numeracy as well as the more fundamental skills of creativity, communication and collaboration, and leave the specifics to the industries that need them? Here are some of the issues related to this:
Skills like computer programming are continuously evolving and changing. What is taught at 14–16 today (the age of GCSE students in the UK) will almost certainly be out of date when these students hit the workforce at 21+.
The computer industry, just like manufacturing before it, long ago sent out the message to students that programming skills (in Western economies at least) were commoditised and better performed by the low-cost economies of the BRIC nations (and now, presumably, the CIVETS).
To most people computers are just tools. Like cars, washing machines and mobile phones they don’t need to know how they work, just how to use them effectively.
Why stop at a computer programming GCSE? Why not teach the basics of plumbing, car mechanics, cookery and hairdressing, all of which are still in great demand and needed by their respective industries?
Public education (which essentially did not exist before the 19th century, certainly not for the masses) came about to meet the needs of industrialism and as such demanded left-brained, logical thinking skills rather than right-brained, creative skills (see Sir Ken Robinson’s TED talk on why schools kill creativity). As a result we have a system that rewards the former rather than the latter (as in “there’s no point in studying painting or music, you’ll never get a job in that”).
In an ideal world we would all be given the opportunities to learn and apply whatever skills we wanted (both at school and throughout life) and have that learning funded by the tax payer on the basis it benefits society as a whole. Unfortunately we don’t live in that ideal world and in fact are probably moving further from it than ever.
Back in the real world, therefore, industry must surely fund the acquisition of those skills. Unfortunately in many companies education is the first thing to be cut when times are hard. The opposite should be the case. One of the best things I ever did was to spend five weeks (yes, that’s weeks not days), funded entirely by IBM, learning object-oriented programming and design. Whilst five weeks may seem like a long time for a course, I know it has paid for itself many, many times over through the work I have been able to do for IBM in the 15 years since attending it. Further, I suspect that five weeks of intensive learning was easily equivalent to at least a year’s worth of learning in an educational establishment.
Of course such skills are more vital to companies like Google, Microsoft and IBM than ever before. Steve Denning, in an article in Forbes this month called Why Big Companies Die, quotes from an article by Peggy Noonan in the Wall Street Journal (called A Caveman Won’t Beat a Salesman). Denning uses a theory from Steve Jobs that big companies fail when salesmen and accountants who don’t know anything about the product or service the company makes, or how it works, are put in charge. Denning says:
The activities of these people [salesmen and accountants] further dispirit the creators, the product engineers and designers, and also crimp the firm’s ability to add value to its customers. But because the accountants appear to be adding to the firm’s short-term profitability, as a class they are also celebrated and well-rewarded, even as their activities systematically kill the firm’s future.
Steve Jobs showed that there was another way. Namely, to keep playing the offense and focus totally on adding value for customers by creating new and innovative products. By doing that you can make more money than the companies that are milking their cash cows and focused on making money rather than products.
Companies like Google and Microsoft (and IBM and Apple) need people fully trained in the three C’s (creativity, communication and collaboration) who can then apply these to whatever task is most relevant to the company’s bottom line. It’s the role of those companies, not government, to train people in the specifics.
Interestingly, Seymour Papert (who co-invented the Logo programming language) used programming as a tool to improve the way that children think and solve problems. Papert drew on Piaget’s work on cognitive development (which showed how children learn) and used Logo as a way of improving their creativity.
Finally, to see how students themselves view all this, see the article by Nikhil Goyal (a 16-year-old junior at Syosset High School in New York), who states: “for the 21st century American economy, all economic value will derive from entrepreneurship and innovation. Low-cost manufacturing will essentially be wiped out of this country and shipped to China, India, and other nations” and goes on to propose that “we institute a 21st century model of education, rooted in 21st century learning skills and creativity, imagination, discovery, and project-based learning”. Powerful stuff for one so young; there may yet be hope for us.
I’ve just finished reading Steve Jobs by Walter Isaacson. In case there is anyone out there who doesn’t know it yet, this is the authorised biography that Jobs asked Isaacson to write, completed a few weeks before Jobs’ untimely death, aged 56, last month. Jobs insisted that Isaacson would have complete control over the contents of the book, saying he would not even read it before it was published and adding “I don’t have any skeletons in my closet that can’t be allowed out”.

Jobs was clearly a very complex personality: on the one hand a creative genius whose zen-like focus on simplicity and efficiency helped create some of the most beautiful and useful gadgets of our time (some of which we never even knew we needed), whilst on the other he was a bully and a tyrant who knew exactly how to “size people up, understand their inner thoughts, and know how to relate to them, cajole them, or hurt them at will”. One of Jobs’ girlfriends, who later went on to found a mental health resource network in California, even went so far as to say that she thought Jobs suffered from Narcissistic Personality Disorder (NPD), in which the individual is described as being excessively preoccupied with issues of personal adequacy, power, prestige and vanity.
Whilst it is to be hoped that NPD is not a prerequisite for being a software architect, Jobs did have a vision and understanding of IT that we as architects can learn from. Six big ideas stand out in this respect:
Engineering matters. When Jobs met with President Obama in 2011 he implored the President to reform the US education system and to create more engineering students. Jobs said “if you could educate these engineers we could move more manufacturing plants here”. Whilst there was always an uneasy tension between engineering and design at Apple, Jobs recognised and valued the importance of having an engineering-led rather than sales-led team at the top of the company, berating companies like Microsoft (under Ballmer), IBM (under Akers) and HP (under their last several CEOs) for putting salespeople in charge rather than engineers. For software architects, engineering clearly translates to being intimately knowledgeable about the technology you are using and knowing how to put the working parts together. The best architects I know are passionate about technology.
Artistry and design matter just as much as engineering. This is a theme that Jobs emphasised over and over again, from when he dropped out of college and instead took a course on calligraphy, to his sometimes maniacal focus on the smallest details of design to make the product as satisfying and aesthetically pleasing as possible. He even insisted that circuit boards, which no one would ever see once the product was fully assembled, should be laid out in as clean and uncluttered a way as possible. It is this aspect of design that most matters for architects. Provided that a system functionally does what it is meant to do within the required constraints and system qualities, one could argue it does not matter how messily the software is assembled. Who’s going to see it anyway? This misses the point though. Great design, as opposed to just good enough design, means the system will be easier to maintain, take less effort to learn and generally be more enjoyable for those that need to carry on working on it once the architects and developers have moved on.
Simple is better than complex. Apple had a design mantra: “Simplicity is the ultimate sophistication”, or as Jobs said, “very simple, and we’re really shooting for Museum of Modern Art quality”. Jobs felt that design simplicity should be linked to making products easy to use. So much of the software that we create today is far too complex and feature-rich, and as a result is very hard to use. People will often say that it has to be like that; just look at all the features you are getting. Unfortunately a lot of the time many of those features are not needed but add to the general bloat of the systems we build, making them hard to use as well as difficult to maintain. Sadly, building a complex system is often easier than building a simple one, and few architects see value in stripping out functionality rather than adding it.
An unremitting focus on detail is key to creating a great product. Jobs was unique in that he was able to hold the big picture view as well as zoom in to fine details. He would often sweat over the smallest detail until he was satisfied it was just right. This could be anything from the colour of a screw on the back plate of the iPod to the angle of the bevel on the iPad that makes someone want to pick it up. This capacity for holding the big picture view whilst also being able to zoom right down and question low-level details is probably one of the hardest things architects have to do, but being able to do so gives a definite advantage and enables greater integrity as well as better execution of the vision.
Customers don’t always know what they want. In September 1982, when Jobs and his team were designing the original Macintosh, he held a retreat for the Mac team near Monterey where he gave a presentation on his thoughts for the Mac. At the end someone asked whether or not they should do some market research to find out what customers wanted. “No”, replied Jobs, “because people don’t know what they want until we’ve shown them”. He then pulled out a device the size of a desk diary and flipped it open; it turned out to be a mock-up of a computer that could fit into your lap, with a keyboard and screen hinged together like a notebook. “This is my dream of what we will be making in the mid- to late eighties”, Jobs said. Apple supposedly never did do any market research, preferring to follow the approach of Henry Ford, who said he never asked people what they wanted because they would just have asked for a better horseless carriage. Whilst people can often see how to make incremental improvements to products, they usually cannot see how to make disruptive changes that introduce a whole new way of doing things, possibly making everything that went before redundant. It is the job of the architect to show what is in the realms of the possible by creating new and innovative systems.
Putting things together in new and creative ways is sometimes more important than inventing things. Jobs was not the first to market with an MP3 player, a mobile phone or a tablet computer. Others had already innovated and built these things. What Jobs and Apple did was to tweak things that already existed. As Isaacson says, “he had noticed something odd about the cell phones on the market: They all stank, just like portable music players used to”. Jobs applied his design skills to these and came up with a (far) better product and in fact a whole new platform as well (i.e. the computer as the digital hub). Architects too need to learn that it’s often putting together existing components in new and innovative ways that really counts and gives a competitive and business advantage.
In my previous post on five architectures that changed the world I left out a couple that didn’t fit my self-imposed criteria. Here, therefore, are two more, the first of which is a bit too techie to be a part of everyone’s lives but is nonetheless hugely important and the second of which has not changed the world yet but has pretty big potential to do so.
IBM System/360
Before the System/360 there was very little interchangeability between computers, even from the same manufacturer. Software had to be created for each type of computer, making them very difficult to develop applications for, as well as to maintain. The System/360 practically invented the concept of architecture as applied to computers, in that it had an architecture specification that made no assumptions about the implementation itself but rather described the interfaces and the expected behaviour of an implementation. The System/360 was the first family of computers designed to cover the complete range of applications, from small to large, both commercial and scientific. The development of the System/360 cost $5 billion back in 1964 (that’s $34 billion in today’s money) and almost destroyed IBM.
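The separation the System/360 pioneered, an interface and behaviour specification that says nothing about how any particular model implements it, is the same idea we now express routinely in code. Purely as an illustration (the class and method names below are invented for this sketch, not anything from the System/360 itself), here is a minimal Python sketch of a program written against an ‘architecture’ that runs unchanged on two quite different ‘implementations’:

```python
from abc import ABC, abstractmethod

class Processor(ABC):
    """The 'architecture': interfaces and expected behaviour only,
    with no assumptions about the implementation behind them."""

    @abstractmethod
    def add(self, a: int, b: int) -> int: ...

    @abstractmethod
    def store(self, address: int, value: int) -> None: ...

    @abstractmethod
    def load(self, address: int) -> int: ...


class SmallModel(Processor):
    """One implementation: a sparse dictionary standing in for a modest machine."""
    def __init__(self):
        self._memory = {}

    def add(self, a, b):
        return a + b

    def store(self, address, value):
        self._memory[address] = value

    def load(self, address):
        return self._memory.get(address, 0)


class LargeModel(Processor):
    """A different implementation of the same architecture: pre-allocated memory."""
    def __init__(self, size=1024):
        self._memory = [0] * size

    def add(self, a, b):
        return a + b

    def store(self, address, value):
        self._memory[address] = value

    def load(self, address):
        return self._memory[address]


def run_program(cpu: Processor) -> int:
    """An 'application' written against the architecture alone;
    it runs unchanged on either implementation."""
    cpu.store(0, cpu.add(2, 3))
    return cpu.load(0)

assert run_program(SmallModel()) == run_program(LargeModel()) == 5
```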
Watson
Unless you are American you had probably never heard of the TV game show called Jeopardy! up until the start of 2011. Now we know that it is a show that “uses puns, subtlety and wordplay” that humans enjoy but which computers would get tied up in knots over. This, it turns out, was the challenge that David Ferrucci, the IBM scientist who led the four-year quest to build Watson, had set himself: to compete live against humans on the TV show.
IBM has “form” on building computers to play games! The previous one (Deep Blue) won a six-game match by two wins to one with three draws against world chess champion Garry Kasparov in 1997. Chess, it turns out, is a breeze to play compared to Jeopardy! Here’s why.
Chess…
• Is a finite, mathematically well-defined search space.
• Has a large but limited number of moves and states.
• Makes everything explicit and has unambiguous mathematical rules, which computers love.
Games like Jeopardy!, however, play on the subtleties of human language, which is…
• Ambiguous, contextual and implicit.
• Grounded only in human cognition.
• Able to express the same meaning in a seemingly infinite number of ways.
According to IBM, Watson is “built on IBM’s DeepQA technology for hypothesis generation, massive evidence gathering, analysis, and scoring.” Phew! The point of Watson, however, is not its ability to play a game show but its potential to “weave its fabric” into the messiness of our human lives, where data is not kept in nicely ordered relational databases but is unstructured and seemingly unrelated, yet can sometimes yield new and undiscovered meaning. One obvious application is in medical diagnosis, but it could also be used in a vast array of other situations, from help desks through to sorting out what benefits you are entitled to. So, not world changing yet, but definitely watch this space.
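IBM has not, of course, reduced DeepQA to a few lines of code, so the following is no more than a toy sketch of the general shape described above: generate candidate answers (hypotheses), gather evidence for each, score it and rank the results. The corpus and the single word-overlap scorer are invented stand-ins for DeepQA’s massive evidence gathering and its many parallel analytics.

```python
# A toy corpus standing in for "massive evidence gathering";
# all of the data here is invented purely for illustration.
EVIDENCE = {
    "isaac newton": ["formulated the laws of motion", "worked on optics"],
    "albert einstein": ["developed general relativity", "won the 1921 Nobel Prize"],
}

def generate_hypotheses(clue: str) -> list:
    """Hypothesis generation: every entity in the corpus is a candidate answer."""
    return list(EVIDENCE)

def score_hypothesis(clue: str, candidate: str) -> float:
    """Evidence analysis and scoring: count clue words that appear in passages
    about the candidate (a crude stand-in for DeepQA's many scorers)."""
    clue_words = set(clue.lower().split())
    passage_words = " ".join(EVIDENCE[candidate]).lower().split()
    return sum(1 for word in passage_words if word in clue_words) / len(passage_words)

def answer(clue: str) -> str:
    """Rank the hypotheses by score and phrase the best one as a question."""
    best = max(generate_hypotheses(clue), key=lambda c: score_hypothesis(clue, c))
    return f"Who is {best.title()}?"

print(answer("He developed the theory of general relativity"))
# -> Who is Albert Einstein?
```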
Software, although an “invisible thread”, has certainly had a significant impact on our world and now pervades pretty much all of our lives. Some software, and in particular some software architectures, have had a significance beyond the everyday and have truly changed the world.
But what constitutes a world changing architecture? For me it is one that meets all of the following:
It must have had an impact beyond the field of computer science or a single business area and must have woven its way into people’s lives.
It may not have introduced any new technology but may instead have used some existing components in new and innovative ways.
The architecture itself may be relatively simple, but the way it has been deployed may be what makes it “world changing”.
It has extended the lexicon of our language, either literally (as in “I tried googling that word”) or indirectly in what we do (e.g. the way we now use app stores to get our software).
The architecture has emergent properties and has been extended in ways the architect(s) did not originally envisage.
Based on these criteria here are five architectures that have really changed our lives and our world.
World Wide Web
When Tim Berners-Lee published his innocuous sounding paper Information Management: A Proposal in 1989 I doubt he could have had any idea what an impact his “proposal” was going to have. This was the paper that introduced us to what we now call the world wide web and has quite literally changed the world forever.
Apple’s iTunes
There has been much talk in cyberspace, and in the media in general, on the effect and impact Steve Jobs has had on the world. When Apple introduced the iPod in October 2001, although it had the usual cool Apple design makeover, it was, when all was said and done, just another MP3 player. What really made the iPod take off and changed everything was iTunes. It not only turned the music industry upside down and inside out but gave us the game-changing concept of the ‘App Store’ as a way of consuming digital media. The impact of this is still ongoing and is driving the whole idea of cloud computing and the way we will consume software.
Google
When Google was founded in 1998 it was just another company building a search engine. As Douglas Edwards says in his book I’m Feeling Lucky, “everybody and their brother had a search engine in those days”. When Sergey Brin was asked how he was going to make money (out of search) he said “Well…, we’ll figure something out”. Clearly, more than a decade later, they have figured out that something and become one of the fastest growing companies ever. What Google did was not only create a better, faster, more complete search engine than anyone else but also figure out how to pay for it, and all the other Google applications, through advertising. They have created a new market and value network (in other words, a disruptive technology) that has changed the way we seek out and use information.
Wikipedia
Before Wikipedia there was a job called an Encyclopedia Salesman, who walked from door to door selling knowledge packed between bound leather covers. Now such people have been banished to the great redundancy home in the sky, along with typesetters and comptometer operators.
If you do a Wikipedia on Wikipedia you get the following definition:
Wikipedia is a multilingual, web-based, free-content encyclopedia project based on an openly editable model. The name “Wikipedia” is a portmanteau of the words wiki (a technology for creating collaborative websites, from the Hawaiian word wiki, meaning “quick”) and encyclopedia. Wikipedia’s articles provide links to guide the user to related pages with additional information.
From an architectural point of view Wikipedia is “just another wiki”; however, what it has brought to the world is community participation on a massive scale and an architecture to support that collaboration (400 million unique visitors monthly, more than 82,000 active contributors working on more than 19 million articles in over 270 languages). Wikipedia clearly meets all of the above criteria (and more).
Facebook
To many people Facebook is social networking. Not only has it seen off all competitors, it makes it almost impossible for new ones to enter the market. Whilst the jury is still out on Google+, it is difficult to see how it can ever reach the 800 million people Facebook has. Facebook is also the largest photo-storing site on the web and has developed its own photo storage system to store and serve its photographs. See this article on Facebook architecture as well as this presentation (slightly old now but interesting nonetheless).
I’d like to thank both Grady Booch and Peter Eeles for providing input to this post. Grady has been doing great work on software archeology and knows a thing or two about software architecture. Peter is my colleague at IBM as well as co-author on The Process of Software Architecting.
A number of items in the financial and business news last week set me thinking about why architecture matters to innovation. Both IBM and Apple announced their second quarter results. IBM’s revenue for Q2 2011 was $26.7B, up 12% on the same quarter last year, and Apple’s revenue for the same quarter was $24.67B, an incredible 83% jump on the same quarter last year. As I’m sure everyone now knows, IBM is 100 years old this year whereas Apple is a mere 35 years old. It looks like both Apple and IBM will become $100B companies this year if all goes to plan (IBM having missed joining the $100B club by a mere $0.1B in 2010). Coincidentally, a Forbes article also caught my eye. Forbes listed the top 100 innovative companies. Top of the list was salesforce.com, Apple were number 5 and IBM were, er, not in the top 100! So what’s going on here? How can a company that pretty much invented the mainframe and the personal computer, helped put a man on the moon, invented the scanning tunnelling microscope and scratched the letters IBM onto a nickel crystal one atom at a time, and, most recently, took artificial intelligence a giant leap forward with Watson not be classed as innovative?
Perhaps the clue is in what the measure of innovation is. The Forbes article measures innovation by an “innovation premium” which it defines as:
A measure of how much investors have bid up the stock price of a company above the value of its existing business based on expectations of future innovative results (new products, services and markets).
So it would appear that, going by this definition of innovation, investors don’t expect IBM to bring any innovative products or services to market, whereas the world will no doubt be inundated with all sorts of shiny iThingys over the course of the next year or so. But is that really all there is to being innovative? I would venture not.
The final article that caught my eye was about Apple’s cash reserves. Depending on which source you read this is around $60B and, as anyone who has any cash to invest knows, sitting on it is not the best way of getting good returns! Companies generally have a few options for what to do when they amass so much cash: pay out higher dividends to shareholders, buy back their own shares, invest more in R&D, or go on a buying spree and buy some companies that fill holes in their portfolio. Whilst the last of these is a good way of quickly entering markets a company may not be active in, it tends to backfire on the innovation premium, as mergers and acquisitions (M&A) are not, at least initially, seen as bringing anything new to market. M&A has been IBM’s approach over the last decade or so. As well as the big software brands like Lotus, Rational and Tivoli, IBM has more recently bought lots of smaller software companies such as Cast Iron Systems, SPSS and Netezza.
A potential problem with this approach is that people don’t want to buy a “bag of bits” and have to assemble their own solutions Lego-style. What they want are business solutions that address the very real and complex (wicked, even) problems they face today. This is where the software architect comes into his or her own. The role of the software architect is to “take existing components and assemble them in interesting and important ways“. To that I would add innovative ways as well. Companies no longer want the same old solutions (ERP system, contact management system etc) but new and innovative systems that solve their business problems. This is why we have one of the more interesting jobs out there today!
The first three of these blog posts (here, here and here) have looked at the process behind developing business processes and services that could be deployed into an appropriate environment, including a cloud (private, public or hybrid). In this final post I’ll take a look at how to make this ‘real’ by describing an architecture that could be used for developing and deploying services, together with some software products for realising that architecture.

The diagram below shows both the development-time and run-time logical-level architecture of a system that could be used for both developing and deploying business processes and services. It has been created using the sketching capability of Rational Software Architect.
Here’s a brief summary of what each of the logical components in this architecture sketch does (i.e. its responsibilities); a small illustrative code sketch of how the runtime pieces might fit together follows the list:
SDLC Repository – The description of the SDLC goes here: that is, the work breakdown structure, a description of all the phases, activities and tasks, the work products to be created by each task and the roles used to create them. This would be created and modified by the actor Method Author using the SDLC Developer tool. The repository would typically include guidance (examples, templates, guidelines etc) that shows how the SDLC is to be used and how to create the work products.
SDLC Developer – The tool used by the Method Author to compose new, or modify existing, processes. This tool publishes the SDLC into the SDLC Repository.
Development Artefacts Repository – This is where the work products that are created on an actual project (i.e. ‘instances’ of the work products described in the SDLC) get placed.
Business Process Developer – The tool used to create and modify business processes.
IT Service Developer – The tool used to create and modify services.
Development Repository – This is where ‘code’ level artefacts get stored during development. This could be a subset of the Development Artefacts Repository.
Runtime Services Repository – Services get published here once they have been certified and can be released for general use.
Process Engine – Executes the business process.
Enterprise Service Bus – Runs the services and provides adapters to external or legacy systems.
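To make those responsibilities a little more concrete, here is a deliberately simplified sketch of how the runtime components might fit together. The class and service names simply mirror the logical components above; they are not the API of any of the products discussed below.

```python
class RuntimeServicesRepository:
    """Runtime Services Repository: certified services are published here."""
    def __init__(self):
        self._services = {}

    def publish(self, name, service):
        self._services[name] = service

    def lookup(self, name):
        return self._services[name]


class EnterpriseServiceBus:
    """Enterprise Service Bus: resolves and invokes services, acting as the
    single point of access (and the place for adapters to legacy systems)."""
    def __init__(self, repository):
        self._repository = repository

    def invoke(self, service_name, payload):
        service = self._repository.lookup(service_name)
        return service(payload)


class ProcessEngine:
    """Process Engine: executes a business process, here reduced to an
    ordered list of service invocations."""
    def __init__(self, esb):
        self._esb = esb

    def execute(self, process, payload):
        for step in process:
            payload = self._esb.invoke(step, payload)
        return payload


# Wiring it together for an invented order-handling process.
repository = RuntimeServicesRepository()
repository.publish("validate_order", lambda order: {**order, "valid": True})
repository.publish("price_order", lambda order: {**order, "total": 42.0})

engine = ProcessEngine(EnterpriseServiceBus(repository))
print(engine.execute(["validate_order", "price_order"], {"item": "widget"}))
```

In a real deployment the process definition would of course be a BPMN or BPEL model and the services would sit behind the ESB’s adapters, but the division of responsibilities is the same.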
Having described the logical components, the next step is to show how these can be realised using one or more vendors’ products. No surprise that I am going to show how these map to products from IBM’s portfolio; however, your own particular requirements (including who’s on your preferred vendor list, of course) may dictate that you choose other vendors’ products. Nearly all the IBM product links allow you to download trial versions that you can use to try out this approach.
Rational Method Composer – This enables you to manage, author, evolve, measure and deploy effective processes (SDLCs) tailored to your project needs. It is based on Eclipse. Rational Method Composer allows publishing to a web site and so effectively covers the needs of both the SDLC Repository and SDLC Developer components.
IBM Business Process Manager – This is the latest name for IBM’s combined development and runtime business process server. As well as a business process runtime, ESB and BPM repository it also includes design tools for building processes and services. The Process Designer allows business authors to build fully executable BPMN processes that include user interfaces for human interaction. The Integration Designer enables IT developers to develop services that easily plug into processes to provide integration and routing logic, data transformation and straight-through BPEL subprocesses. See this whitepaper for more information or click here for the IBM edition of the book BPM for Dummies. IBM Business Process Manager realises the components: Business Process Developer, IT Service Developer, Development Repository, Process Engine and Enterprise Service Bus.
WebSphere Service Registry and Repository – Catalogs and organizes assets and services, allowing customers to get a handle on what assets they have and making them easy to locate and distribute. It also enables policy management across the SOA lifecycle, spanning various policy domains including runtime policies as well as service governance policies. Included in the Advanced Lifecycle Edition is Rational Asset Manager, which provides lifecycle management capabilities to manage asset workflow across concept, development, build, deployment and retirement, as well as Build Forge integration. WebSphere Service Registry and Repository realises the Development Artefacts Repository as well as the Runtime Services Repository.
So, there it is. An approach for developing services as well as an initial architecture allowing for the development and deployment of both business processes and services together with some actual products to get you started. Please feel free to comment here or in any of my links if you have anything you’d like to say.
There has been much Apple bashing in cyberspace, as well as in the ‘dead-wood’ parts of the press, of late; to the extent that some people are now turning on those that own one of Apple’s wunder-devices (an iPad), accusing them of being “selfish elites“. Phew! I thought it was a typically British trait to knock anything and anyone that was remotely successful, but it now seems that the whole world has it in for Mr Jobs’ empire.

Back in the pre-Google days of 1994 Umberto Eco declared that “the Macintosh is Catholic and that DOS is Protestant. Indeed, the Macintosh is counter-reformist and has been influenced by the ratio studiorum of the Jesuits. It is cheerful, friendly, conciliatory; it tells the faithful how they must proceed step by step to reach — if not the kingdom of Heaven — the moment in which their document is printed.”
The big gripe most people have with Apple is their closed architecture, which controls not only who is allowed to write apps for their OSs but who can produce devices that actually run those OSs (er, that would be Apple). It is one of life’s great anomalies as to why Apple is so successful in building products with closed architectures when almost everyone would agree that open architectures and systems are ultimately the way to go as, in the end, they lead to greater innovation, wider usage and, presumably, more profit for those involved. The classic case of an open architecture leading to widespread usage is that of the original IBM Personal Computer. Because IBM wanted to fast-track its introduction, many of the parts were, unusually for IBM, provided by third parties, including, most significantly, the processor (from Intel) and the operating system (from the fledgling Microsoft). This, together with the fact that the technical information on the innards of the computer was made publicly available, essentially made the IBM PC ‘open’. This more than anything gave it an unprecedented penetration into the marketplace, allowing many vendors to provide IBM PC ‘clones’.
There is of course a ‘dark side’ to all of this. Thousands of vendors all providing hardware add-ons and extensions as well as applications resulted in huge inter-working problems, which in the early days at least required you to be something of a computer engineer if you wanted to get everything working together. This is where Apple stepped in. As Umberto Eco said, Apple guides the faithful every step of the way. What they sacrifice in openness and choice they gain in everything working out of the box, sometimes in three simple steps.
So, is open always best when it comes to architecture or does it sometimes pay to have a closed architecture? What does the architect do when faced with such a choice? Here’s my take:
Know your audience. The early PCs, like it or not, were bought by technophiles who enjoyed technology for the sake of technology. The early Macs were bought by people who just wanted to use computers to get the job done. In those days both had a market.
Know where you want to go. Apple stuck solidly with creating user-friendly (not to mention well-designed) devices that people would want to own and use. The plethora of PC providers (which there soon were) couldn’t, by and large, give a damn about design. They just wanted to sell as many devices as possible and let others worry about how to stitch everything together. This in itself generated a huge industry which, in a strange self-fulfilling way, led to more devices and world domination for the PC, and left Apple in a niche market. Openness certainly seemed to be paying.
Know how to capitalise on your architectural philosophy. Ultimately openness leads to commoditisation. When anyone can do it, price dominates and the cheapest always wins. If you own the space then you control the price. Apple’s recent success has come not from capitalising on an open architecture but from capitalising on good design, which has enabled it to create high-value, desirable products, showing that good design trounces an open architecture.
So how about combining the utility of an open architecture with the significance of a well-thought-through architecture to create a great design? Which, funnily enough, is what Dan Pink meant by this:
Thomas J. Watson Sr, CEO and founder of IBM (100 years old this year). Currently has a computer named after him.
Alan Turing, mathematician and computer scientist (100 years old next year). Has a famous test named after him.
Arthur C. Clarke, scientist and writer (100 years old in 2017). Has a set of laws named after him (and is also the creator of the fictional HAL computer in 2001: A Space Odyssey).
Unless you have moved into a hut, deep in the Amazon rain forest you cannot have missed the publicity over IBM’s ‘Watson’ computer having competed in, and won, the American TV quiz show Jeopardy. I have to confess that until last week I’d not heard of Jeopardy, possibly because a) I’m not a fan of quizzes, b) I’m not American and c) I don’t watch that much television. To those as ignorant as me on these matters the unique thing about Jeopardy is that contestants are presented with clues in the form of answers, and must phrase their responses in the form of a question.
This, it turns out, is what makes this particular quiz such a hard nut for a computer to crack. The clues in the ‘question’ rely on subtle meanings, puns and riddles; something humans excel at and computers do not. Unlike IBM’s previous game challenger, Deep Blue, which defeated chess world champion Garry Kasparov, it is not sufficient to rely on raw computing ‘brute force’; this time the computer has to interpret meaning and the nuances of human language. So has Watson achieved, met or passed the Turing test (which is basically a measure of whether a computer can demonstrate intelligence)?
The answer is almost certainly ‘no’. Turing’s test is a measure of a machine’s ability to exhibit human intelligence. The test, as originally proposed by Turing, was that a questioner should ask a series of questions of both a human being and a machine and see whether he can tell which is which through the answers they give. The idea being that if the two are indistinguishable then the machine and the human must appear to be as intelligent as each other.
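Described that way, the protocol of the test itself is simple enough to sketch. The following is only an illustration of the set-up rather than a serious harness; the questioner, respondents and questions are placeholders you would have to supply yourself.

```python
import random

def imitation_game(questioner, human, machine, questions):
    """A sketch of Turing's test as described above: the questioner puts the
    same questions to a human and a machine without knowing which is which,
    then guesses which respondent was the machine."""
    # Hide the identities behind anonymous labels, in a random order.
    respondents = {"A": human, "B": machine}
    if random.random() < 0.5:
        respondents = {"A": machine, "B": human}

    # Each respondent answers every question; the questioner sees only the labels.
    transcript = {label: [respond(q) for q in questions]
                  for label, respond in respondents.items()}

    guess = questioner(transcript)  # the questioner names "A" or "B" as the machine
    actual = next(label for label, r in respondents.items() if r is machine)
    return guess != actual          # True if the machine fooled the questioner

# Example with trivial placeholder respondents and a questioner who always says "A":
# imitation_game(lambda t: "A", lambda q: "yes", lambda q: "yes", ["Can machines think?"])
```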
As far as I know Turing never stipulated any constraint on the range or type of questions that could be asked, which leads us to the nub of the problem. Watson is supremely good at answering Jeopardy-type questions, just as Deep Blue was good at playing chess. However, neither could do what the other does (at least as well); they have been programmed for that given task. Given that Watson is actually a cluster of POWER7 servers, any suitably general-purpose computer that could win at Jeopardy, play chess and exhibit the full range of human emotions and frailties needed to fool a questioner would presumably occupy the area of several football pitches and consume the power of a small city.
That, however, misses the point completely. The ability of a computer to almost flawlessly answer a range of questions, phrased in a particular way, on a range of different subject areas, blindingly fast, has enormous potential in the fields of medicine, law and other disciplines where questions based on a huge foundation of knowledge built up over decades need to be answered quickly (for example in accident and emergency, where a quick diagnosis may literally be a matter of life and death). This indeed is one of IBM’s Smarter Planet goals.
Which brings us to Clarke’s third law, which states that “any sufficiently advanced technology is indistinguishable from magic”. This is surely something that is attributable to Watson. The other creation of Clarke’s of course is HAL, the computer aboard the spaceship Discovery One on a trip to Saturn, which becomes overwhelmed by guilt at having to keep secret the true nature of the spaceship’s mission and starts killing members of the crew. The point of Clarke’s story (or one of them) is that the downside of a computer that is indistinguishable from a human being is that the computer may also end up mimicking human frailties and weaknesses. Maybe it’s a good job Watson hasn’t passed Turing’s test then?
A previous entry suggested hiring an architect was a good idea because architects take existing components and assemble them in interesting and important ways. So how should you “think architecturally” in order to create things that are not only interesting but also solve practical, real-world problems? Architectural thinking is about balancing three opposing “forces”: what people want (desirability), what technology can provide (feasibility) and what can actually be built given the constraints of cost, resource and time (viability).
It is basically the role of the architect to help resolve these forces by assembling components “in interesting ways”. There is however a fourth aspect which is often overlooked but which is what separates great architecture from merely good architecture. That is the aesthetics of the architecture.
Aesthetics is what separates a MacBook from a Dell, the Millau Viaduct in France from the Yamba Dam Bridge in Japan, and 30 St Mary Axe from the Ryugyong Hotel in North Korea. Aesthetics is about good design, which is what you get when you add ‘significance’ (aesthetic appeal) to ‘utility’ (something that does the job). IBM, the company I work for, is 100 years old this year (check out the centennial video here) and Thomas Watson Jr famously said that “good design is good business”. Watson knew what Steve Jobs, Tim Brown and many other creative designers know: aesthetics is not only good for the people that use or acquire these computers/buildings/systems, it’s also good for the businesses that create them. In a world of over-abundance, good design/architecture both differentiates companies and gives them a competitive advantage.