Disruptive Technologies, Smarter Cities and the New Oil

Last week I attended the Smart City and Government Open Data Hackathon in Birmingham, UK. The event was sponsored by IBM and my colleague Dr Rick Robinson, who writes extensively on Smarter Cities as The Urban Technologist, gave the keynote session to kick off the event. The idea of this particular hackathon was to explore ways in which various sources of open data, including the UK government’s own open data initiative, could be used in new and creative ways to improve the lives of citizens and make our cities smarter as well as generally better places to live in. There were some great ideas discussed, including how to predict future jobs as well as how to identify citizens who had not claimed benefits to which they were entitled (those benefits then going back into the local economy through purchases of goods and services).

The phrase “data is the new oil” is by no means a new one. It was first used by Michael Palmer in 2006 in this article. Palmer says:

Data is just like crude. It’s valuable, but if unrefined it cannot really be used. It has to be changed into gas, plastic, chemicals, etc to create a valuable entity that drives profitable activity; so must data be broken down, analyzed for it to have value.

Whilst this is a nice metaphor I think I actually prefer the slight adaptation proposed by David McCandless in his TED talk, The beauty of data visualization, where he coins the phrase “data is the new soil”. The reason being that data, like soil, needs to be worked and manipulated, just as a good farmer looks after his land, to get the best out of it. In the case of the work done by McCandless this involves creatively visualizing data to reveal new understandings or interpretations and, as Hans Rosling says, to let the data set change your mind set.

One way data is most definitely not like oil is that it is growing at exponential rates rather than rapidly diminishing. But it’s not only data. The new triumvirate of data, cloud and mobile is forging a whole new mega-trend in IT, nicely captured in this equation proposed by Gabrielle Byrne at the start of this video:

e = mc(imc)²

Where:

  • e is any enterprise (or city, see later)
  • m is mobile
  • c is cloud
  • imc is in-memory computing, or stream computing: the instant analysis of masses of fast-changing data

This new trend is characterized by a number of incremental innovations that have taken place in IT over recent years in each of the three areas, nicely captured in the figure below.

Source: CNET – Where IT is going: Cloud, mobile and data

In his blog post: The new architecture of smarter cities, Rick proposes that a Smarter City needs three essential ‘ingredients’ in order to be really characterized as ‘smart’. These are:

  • Smart cities are led from the top
  • Smart cities have a stakeholder forum
  • Smart cities invest in technology infrastructure

It is this last attribute that, when built on a suitable cloud-mobility-data platform, promises to fundamentally change not only enterprises but also cities and even whole nations. However it’s not just any old platform that needs to be built. In this post I discussed the concept behind so-called disruptive technology platforms and the attributes they must have. Namely:

  • A well-defined set of open interfaces (see the sketch after this list).
  • A critical mass of both end users and service providers.
  • Scalability combined with extreme robustness.
  • An intrinsic value which cannot be obtained elsewhere.
  • The ability for users to interact amongst themselves, maybe in ways that were not originally envisaged.
  • The right level of contract for service providers, allowing them to innovate without actually breaking the platform.
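
To make the first of these attributes concrete, an open interface can be as simple as a published, versioned data API that any service provider can build against. Here is a minimal sketch in Python using Flask; the route, dataset and field names are invented purely for illustration:

```python
# Minimal sketch of an open city-data interface (hypothetical endpoint).
from flask import Flask, jsonify

app = Flask(__name__)

# Invented sample data standing in for a real sensor feed
AIR_QUALITY = [
    {"sensor": "bham-001", "no2_ugm3": 41.2},
    {"sensor": "bham-002", "no2_ugm3": 38.7},
]

@app.route("/api/v1/air-quality")
def air_quality():
    # A stable, versioned, documented URL is what makes the interface
    # 'open': any third party can build a service on top of it.
    return jsonify(AIR_QUALITY)

if __name__ == "__main__":
    app.run()
```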

So what might a disruptive technology platform for a whole city look like, and what innovations might it provide? As an example of such a platform IBM has developed something it calls the Intelligent Operations Center or IOC. The idea behind the IOC is to use information from a number of city agencies and departments to make smarter decisions based on rules that can be programmed into the platform. The idea, then, is that the IOC will be used to anticipate problems, to minimize the impact of disruptions to city services and operations, and to assist in the mobilization of resources across multiple agencies. The IOC allows aggregated data to be visualized in ways that the individual data sets cannot, and new insights to be obtained from that data.
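
IBM has not published the IOC’s internals, but the idea of programmable cross-agency rules is easy to sketch: events from different departments flow into one place and rules watch for combinations that no single agency would spot on its own. Everything below (the event fields and the example rule) is an invented illustration, not the IOC’s actual design:

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class CityEvent:
    agency: str    # e.g. "transport", "water", "emergency"
    kind: str      # e.g. "road_closure", "burst_main"
    location: str

# A rule inspects all current events and may propose an action
Rule = Callable[[list], Optional[str]]

def burst_main_near_closure(events: list) -> Optional[str]:
    """Example cross-agency rule: a burst water main at a location that
    already has a road closure should trigger a coordinated response."""
    closures = {e.location for e in events if e.kind == "road_closure"}
    for e in events:
        if e.kind == "burst_main" and e.location in closures:
            return f"Dispatch joint transport/water crew to {e.location}"
    return None

# Two events from different agencies that only make sense together
events = [
    CityEvent("transport", "road_closure", "A38 Bristol Rd"),
    CityEvent("water", "burst_main", "A38 Bristol Rd"),
]
for rule in (burst_main_near_closure,):
    action = rule(events)
    if action:
        print(action)
```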

Platforms like the IOC are only the start of what is possible in a truly smart city. They are just beginning to make use of mobile technology, data in the cloud and huge volumes of fast-moving data analysed in real time. Whether these platforms turn out to be really disruptive remains to be seen, but if this really is the age of “new oil” then only the limits of our imagination restrict how we will use that data to gain valuable new insights into building smart cities.

Architect or Architecting?

A discussion has arisen on one of the IBM forums about whether the verb that describes what architects do (as in “to architect” or “architecting”) is valid English or not. The recommendation in the IBM word usage database has apparently always been that when you need a verb to describe what an architect does you should use “design,” “plan,” or “structure”. Needless to say this has generated quite a bit of comment (145 at the last count) including:

  • Police are policing, judges are judging, dancers are dancing, why then aren’t architects architecting?
  • Architects are not “architecting” because they design.
  • I feel a need to defend the term ‘architecting’. Engineers do engineering, architects do architecting. We have the role of software or system architect and the term describes what they do. There is a subtle but useful distinction between a software designer and a software architect that was identified about 30 years ago by the then IBMer Fred Brooks in his foundational text, The Mythical Man Month.
  • From a grammatical point of view use of “architecting” as a verb or gerund is as poor as using leverage as a verb… and as far as meaning is concerned, as poor as any platitude used when knowledge of precise content and detail is lacking.

As someone who has co-authored a book called The Process of Software Architecting I should probably declare more than a passing interest in this, and I feel that the verb ‘architecting’ or ‘to architect’ is perfectly valid. Whether it is strictly correct English or not I will leave to others far better qualified to pass judgment on. My defence of using architect as a verb is that there is a, sometimes subtle, difference between architecture and design (Grady Booch says “all architecture is design but not all design is architecture”) and although architects do perform elements of design, that is not all they do. I, for one, would not wish to see the two confused.

The definition of architecting we use in the book The Process of Software Architecting comes from the IEEE standard 1471-2000, which defines architecting as:

The activities of defining, documenting, maintaining, improving, and certifying proper implementation of an architecture.

As a related aside, on whether adding ‘ing’ to a noun to turn it into a verb is correct English or not, it is interesting to see that the ‘verbing’ of nouns is picking up pace at the London Olympics, where we now seem to have ‘medaling’ and ‘platforming’ entering the English language.

Hassle Maps and Expert Integrated Systems

In his book Demand: Creating What People Love Before They Know They Want It the business thinker and management consultant Adrian Slywotzky defines the concept of a Hassle Map thus:

Hassle Map (HA-sul map) noun 1. a diagram of the characteristics of existing products, services and systems that cause people to waste time, energy, money 2. (from a customer’s perspective) a litany of the headaches, disappointments and frustrations one experiences 3. (from a demand creator’s perspective) an array of tantalising opportunities.

Documenting, either literally or mentally, the hassle map for a product, service, system or process is the first step on the way to improving it and to creating something that people will love and want. A key part of the hassle map is finding out what users of an existing product or service find most annoying and what stops them from buying it in great quantities. For Steve Jobs this was the inadequacies of existing mobile phones; for Reed Hastings, CEO of Netflix, it was the ‘hassle’ of having to walk to the video store to rent a movie (and being fined when you forgot to take it back on time); for Jeff Bezos, founder of Amazon, it was not just building an e-reader device (the Kindle) with a great interface but also one which had a massive catalogue of books that could be downloaded in ‘one-click’. The list goes on.

One way of drawing up a hassle map is to think of what the world would be like without the hassle; a sort of idealized view of life. A hassle map consists of a number of hassles and, for each, a view of what life would be like without it. I once worked with a client who was fond of using the phrase “imagine a world where…” Well, the solution part of a hassle map is the world where that hassle no longer exists.
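
Viewed as a data structure, a hassle map is just that pairing of pains with imagined worlds. A minimal sketch (the class and field names here are my own, purely illustrative):

```python
from dataclasses import dataclass, field

@dataclass
class Hassle:
    pain: str            # the headache as the customer experiences it
    imagined_world: str  # "imagine a world where..." the hassle is gone

@dataclass
class HassleMap:
    product: str
    hassles: list = field(default_factory=list)

    def add(self, pain: str, imagined_world: str) -> None:
        self.hassles.append(Hassle(pain, imagined_world))

# One entry from the Netflix story above
video_rental = HassleMap("video rental")
video_rental.add(
    "Walking to the video store, and late fees when you forget to return",
    "Imagine a world where movies come to you and there are no due dates",
)
```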

Expert integrated systems, as manifested by IBM’s PureFlex and PureApplication Systems, are an attempt at addressing the hassle maps currently felt by businesses when building IT systems. Here are 10 hassles that these systems are trying to overcome.

Hassle: IT is increasingly seen as a constraint on business innovation rather than an enabler.
Solution: Expert integrated systems enable delivery of new capabilities faster, allowing IT resources to be moved from ‘running the business’ to ‘changing the business’.

Hassle: Software and hardware have to be ordered separately, taking days or weeks to arrive.
Solution: The system arrives as a single integrated hardware and software package, ready to be turned on.

Hassle: Components arrive as a “bag of parts” requiring integration and optimization.
Solution: Components are pre-installed, integrated and optimized.

Hassle: Specification of the deployment environment requires specialist skills and can be brittle and error prone.
Solution: ‘Patterns of expertise’ capture proven best practices for complex tasks, learned from decades of client and partner engagements, lab tested and built into the system.

Hassle: Systems require time-consuming optimization by experts on site.
Solution: Systems are pre-optimized by experts in the factory before shipment.

Hassle: Deployment takes weeks.
Solution: Deployment takes minutes.

Hassle: There are multiple management consoles, one for each middleware software product.
Solution: A single point of management across all middleware products.

Hassle: Lack of dynamic elasticity results in cumbersome re-allocation of resources.
Solution: Repeatable self-service provisioning, integrated and elastic application and data runtimes, and application-aware workload management.

Hassle: It takes weeks or months for a development or test environment to be built, and non-standard configurations can cause errors and delay production deployments by weeks.
Solution: Self-service development, test and production environments, provisioned, secured and managed in adherence to corporate policies through customizable pre-defined patterns.

Hassle: Upgrades involve days of downtime.
Solution: Zero downtime upgrades.

Of course, ‘hassles’ are really only high-level requirements stated in the way that business folk really care about, that is, in terms of what is causing them pain. These are the right sort of requirements, and the sort we IT folk must take most notice of if we are to build systems that solve ‘real-world’ business problems.

What Does IBM’s PureSystems Announcement Mean for Architects?

On April 11th IBM announced what it is referring to as a new category of systems: expert integrated systems. As befits a company like IBM when it makes an announcement such as this, a fair deluge of information has been made available, including this expert integrated systems blog as well as an expert integrated systems home at ibm.com.

IBM says expert integrated systems are different because of three things: built-in expertise, integration by design and a simplified experience. In other words they are more than just a static stack of software and hardware components – a server here, some database software there, serving a fixed application at the top. Instead, these systems have three unique attributes:

  • Built-in expertise. Expert integrated systems represent the collective knowledge of thousands of deployments, established best practices, innovative thinking, IT industry leadership, and the distilled expertise of solution providers, captured into the system in a deployable form, from the base system infrastructure through to the application.
  • Integrated by design. All the hardware and software components are integrated and tuned in the lab and packaged in the factory into a single ready-to-go system. All of the integration is done for you, by experts.
  • Simplified experience. Expert integrated systems are designed to make every part of the IT lifecycle easier, from the moment you start designing what you need to the time you purchase, set up, operate, maintain and upgrade the system. Expert integrated systems provide a single point of management as well as integrated monitoring and maintenance.

At launch IBM has announced two models: PureFlex System and PureApplication System. IBM PureFlex System provides a factory-integrated and optimized system infrastructure with integrated systems management, whilst IBM PureApplication System provides an integrated and optimized application-aware platform which captures patterns of expertise as well as providing simplified management via a single management console.

For a good, detailed and independent description of the PureSystems announcement see Timothy Prickett Morgan’s article in The Register. Another interesting view, from James Governor on RedMonk, is that PureSystems are IBM’s “iPad moment”. Governor argues that just as the iPad has driven a fundamental break with the past (tablets rather than laptops or even desktops), IBM wants to do the same thing in the data center. Another similarity with the iPad is IBM’s push to have application partners running on the new boxes at launch. The PureSystems site includes a catalog of third party apps customers can buy pre-installed.

What I’m interested in here is not so much what expert integrated systems are but what exactly the implications are for architects, specifically software architects. As Daniel Pink says in his book A Whole New Mind:

…any job that depends on routines – that can be reduced to a set of rules, or broken down into a set of repeatable steps – is at risk.

So are expert integrated systems, with built-in expertise and that are integrated by design, about to put the job of the software architect at risk?

In many ways the advent of the expert integrated system is really another step on the path of increasing levels of abstraction in computing, a path that started when the first assembler languages did away with the need to write complex and error-prone machine language instructions in the 1950s. Since then the whole history of computing has really been about adding additional layers of abstraction on top of the raw processors of the computers themselves. Each layer has allowed the programmers of such systems to worry less about how to control the computer and more about the actual problems to be solved. As we move toward trying to solve increasingly complex business problems the focus has to be more on business than on IT. Expert integrated systems therefore have the potential (and it’s early days yet) to let the software architect focus on understanding how application software components can be combined in new and interesting ways (the true purpose of a software architect in my view) to solve complex and wicked problems, rather than focusing too much on the complexities of which middleware components work with which, and how all of these work with different operating systems and computer platforms.

So, rather than being the end of the era of the software architect, I see expert integrated systems as being the start of a new era, even an age of enlightenment, when we can focus on the really interesting problems rather than the tedious ones brought about by the technology we have inherited over the last six decades or so.

Why We Need STEM++ Graduates

The need for more STEM (that’s Science, Technology, Engineering and Maths) skills seems to be on the agenda more and more these days. There is a strong feeling that the so-called developed nations have depended too much on financial and other services to grow their economies and as a result have “lost” their ability to design, develop and manufacture goods, largely because we are not producing enough STEM graduates to do this.

Whilst I would see software as falling fairly and squarely into the STEM skillset (even if it is also used to underpin nearly all of the modern financial services industry), as this blog post by Jessica Benjamin from IBM points out, STEM skills alone won’t solve the really hard problems that are out there. With respect to the particular problems around big data Jessica succinctly says:

All the skills it takes to tell a good story, to compose a complete orchestra, are the skills it takes to put the pieces of this big data world together. If data is just data until its information, what’s a lot of information without the thought and skill of pulling all the chords together?

The need for right- as well as left-brained thinkers to solve the world’s really, really hard business problems is something that has been recognised for some time now by several prominent business leaders. Indeed the intersection of technology (left-brained) and design (right-brained) has certainly played a part in much of what technology companies like IBM and Apple have done and has helped make them successful.

So we need not just STEM skills but STEM++ skills, where the addition of “righty” skills like arts, humanities and design helps us build not just a smarter world but one that is better to live in. For more on this check out my other (joint) blog The Versatilist Way.

You’re Building Me a What?

This week I’ve been attending a cloud architecture workshop. Not to architect a cloud for anyone in particular but to learn what the approach to architecting clouds should be. This being an IBM workshop there was, of course, lots of Tivoli this, WebSphere that and Power the other. Whilst the workshop was full of good advice I couldn’t help thinking of this cartoon from 2008:

Courtesy geekandpoke.typepad.com

Just replace the word ‘SOA’ with ‘cloud’ (as ‘SOA’ could have been replaced by ‘client-server’ in the early nineties) and you get the idea. As software architects it is very easy to get seduced by technology, especially when it is new and your vendors, consultants and analysts are telling you this really is the future. However if you cannot explain to your client why you’re building him a cloud and what business benefit it will bring him then you are likely to fail just as much with this technology as people have with previous technology choices.

Educating an IT Workforce for the 21st Century

A report on the BBC Today programme this morning argues that the “Facebook generation needs better IT skills” and that UK schools should be providing courses in programming at GCSE. The report bemoaned the fact that so-called Information and Communications Technology (ICT) GCSEs did little more than teach students how to use Microsoft Office programmes such as Word and Excel and did not prepare students for a career in IT. The backers of this report were companies like Google and Microsoft.

This raises an interesting question of who should be funding such education in these austere times. Is it the role of schools to provide quite specific skills like programming, or should they be providing the basics of literacy and numeracy as well as the more fundamental skills of creativity, communication and collaboration, and leave the specifics to the industries that need them? Here are some of the issues related to this:

  1. Skills like computer programming are continuously evolving and changing. What is taught at 14–16 today (the age of GCSE students in the UK) will almost certainly be out of date when these students hit the workforce at 21+.
  2. The computer industry, just like manufacturing before it, long ago sent out the message to students that programming skills (in Western economies at least) were commoditised and better performed by the low-cost economies of the BRIC nations (and now, presumably, the CIVETS).
  3. To most people computers are just tools. Like cars, washing machines and mobile phones, they don’t need to know how they work, just how to use them effectively.
  4. Why stop at a computer programming GCSE? Why not teach the basics of plumbing, car mechanics, cookery and hairdressing, all of which are still in great demand and needed by their respective industries?
  5. Public education (which essentially did not exist before the 19th century, certainly not for the masses) came about to meet the needs of industrialism and as such demanded left-brained, logical thinking skills rather than right-brained, creative skills (see Sir Ken Robinson’s TED talk on why schools kill creativity). As a result we have a system that rewards the former rather than the latter (as in “there’s no point in studying painting or music, you’ll never get a job in that”).

In an ideal world we would all be given the opportunities to learn and apply whatever skills we wanted (both at school and throughout life) and have that learning funded by the tax payer on the basis it benefits society as a whole. Unfortunately we don’t live in that ideal world and in fact are probably moving further from it than ever.

Back in the real world, therefore, industry must surely fund the acquisition of those skills. Unfortunately in many companies education is the first thing to be cut when times are hard. The opposite should be the case. One of the best things I ever did was to spend five weeks (yes, that’s weeks not days), funded entirely by IBM, learning object-oriented programming and design. Whilst five weeks may seem like a long time for a course, I know it has paid for itself many, many times over through the work I have been able to do for IBM in the 15 years since attending it. Further, I suspect that five weeks’ intensive learning was easily equivalent to at least a year’s worth of learning in an educational establishment.

Of course such skills are more vital to companies like Google, Microsoft and IBM than ever before. Steve Denning, in an article called Why Big Companies Die in Forbes this month, quotes from an article by Peggy Noonan in the Wall Street Journal (called A Caveman Won’t Beat a Salesman). Denning uses a theory from Steve Jobs that big companies fail when salesmen and accountants who don’t know anything about the product or service the company makes, or how it works, are put in charge. Denning says:

The activities of these people [salesmen and accountants] further dispirit the creators, the product engineers and designers, and also crimp the firm’s ability to add value to its customers. But because the accountants appear to be adding to the firm’s short-term profitability, as a class they are also celebrated and well-rewarded, even as their activities systematically kill the firm’s future.

Steve Jobs showed that there was another way: to keep playing offense and focus totally on adding value for customers by creating new and innovative products. By doing that you can make more money than the companies that are milking their cash cows and focused on making money rather than products.

Companies like Google and Microsoft (and IBM and Apple) need people fully trained in the three C’s (creativity, communication and collaboration) who can then apply these to whatever task is most relevant to the company’s bottom line. It’s the role of those companies, not government, to train people in the specifics.

Interestingly, Seymour Papert (who co-invented the Logo programming language) used programming as a tool to improve the way that children think and solve problems. Papert drew on Piaget’s work on cognitive development (which showed how children learn) and used Logo as a way of improving their creativity.
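
Logo’s turtle lives on: Python ships with a turtle module directly inspired by it, so the kind of exercise Papert had in mind, reasoning about angles and repetition to draw a shape, still takes only a few lines:

```python
import turtle

def square(side: int) -> None:
    """Draw a square by repeating the same move-and-turn step four times."""
    for _ in range(4):
        turtle.forward(side)  # move 'side' pixels in the current direction
        turtle.right(90)      # turn 90 degrees; four turns bring us home

square(100)
turtle.done()  # keep the window open until it is closed
```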

Finally, to see how students themselves view all this, see the article by Nikhil Goyal (a 16-year-old junior at Syosset High School in New York) who states: “for the 21st century American economy, all economic value will derive from entrepreneurship and innovation. Low-cost manufacturing will essentially be wiped out of this country and shipped to China, India, and other nations” and goes on to propose that “we institute a 21st century model of education, rooted in 21st century learning skills and creativity, imagination, discovery, and project-based learning”. Powerful stuff for one so young; there may yet be hope for us.

What Can Architects Learn from Steve Jobs?

I’ve just finished reading Steve Jobs by Walter Isaacson. In case there is anyone out there who doesn’t know it yet, this is the authorised biography that Jobs asked Isaacson to write, which was completed a few weeks before Jobs’ untimely death, aged 56, last month. Jobs insisted that Isaacson would have complete control over the contents of the book, saying he would not even read it before it was published and adding “I don’t have any skeletons in my closet that can’t be allowed out”.

Jobs was clearly a very complex personality: on the one hand a creative genius whose zen-like focus on simplicity and efficiency helped create some of the most beautiful and useful gadgets of our time (some of which we never even knew we needed), whilst on the other he was a bully and a tyrant who knew exactly how to “size people up, understand their inner thoughts, and know how to relate to them, cajole them, or hurt them at will”. One of Jobs’ girlfriends, who later went on to found a mental health resource network in California, even went so far as to say that she thought Jobs suffered from Narcissistic Personality Disorder (NPD), in which the individual is described as being excessively preoccupied with issues of personal adequacy, power, prestige and vanity.

Whilst it is to be hoped that NPD is not a prerequisite for being a software architect, Jobs did have a vision and an understanding of IT that we as architects can learn from. Six big ideas stand out in this respect:

  1. Engineering matters. When Jobs met with President Obama in 2011 he implored the President to reform the US education system and to create more engineering students. Jobs said “if you could educate these engineers we could move more manufacturing plants here”. Whilst there was always an uneasy tension between engineering and design at Apple, Jobs recognised and valued the importance of having an engineering-led rather than sales-led team at the top of the company, berating companies like Microsoft (under Ballmer), IBM (under Akers) and HP (under their last several CEOs) for putting sales people in charge rather than engineers. For software architects, engineering clearly translates to being intimately knowledgeable about the technology you are using and knowing how to put the working parts together. The best architects I know are passionate about technology.
  2. Artistry and design matter just as much as engineering. This is a theme that Jobs emphasised over and over again, from when he dropped out of college and instead took a course on calligraphy to his sometimes maniacal focus on the smallest details of design to make the product as satisfying and aesthetically pleasing as possible. He even insisted that circuit boards, which no one would ever see once the product was fully assembled, should be laid out in as clean and uncluttered a way as possible. It is this aspect of design that most matters for architects. Provided that functionally a system does what it is meant to do within the required constraints and system qualities, one could argue it does not matter how messily the software is assembled. Who’s going to see it anyway? This misses the point though. Great design, as opposed to just good enough design, means the system will be easier to maintain, take less effort to learn and generally be more enjoyable for those who need to carry on working on it once the architects and developers have moved on.
  3. Simple is better than complex. Apple had a design mantra, “Simplicity is the ultimate sophistication”, or as Jobs said, “very simple, and we’re really shooting for Museum of Modern Art quality”. Jobs felt that design simplicity should be linked to making products easy to use. So much of the software that we create today is far too complex and feature-rich and as a result is very hard to use. People will often say that it has to be like that: just look at all the features you are getting. Unfortunately a lot of the time many of those features are not needed but add to the general bloat of the systems we build, making them hard to use as well as difficult to maintain. Sadly, building a complex system is often easier than building a simple one, and not many architects see value in stripping out functionality rather than adding it.
  4. An unremitting focus on detail is key to creating a great product. Jobs was unique in that he was able to hold the big picture view whilst also zooming in on fine details. He would often sweat over the smallest detail until he was satisfied it was just right. This could be anything from the colour of a screw on the back plate of the iPod to the angle of the bevel on the iPad that makes someone want to pick it up. This capacity for holding the big picture view whilst also being able to zoom right down and question low-level details is probably one of the hardest things architects have to do, but being able to do so gives a definite advantage and enables greater integrity as well as better execution of vision.
  5. Customers don’t always know what they want. In September 1982, when Jobs and his team were designing the original Macintosh, he held a retreat for the Mac team near Monterey where he gave a presentation on his thoughts for the Mac. At the end someone asked whether or not they should do some market research to find out what customers wanted. “No”, replied Jobs, “because people don’t know what they want until we’ve shown them”. He then pulled out a device the size of a desk diary and flipped it open; it turned out to be a mock-up of a computer that could fit into your lap, with a keyboard and screen hinged together like a notebook. “This is my dream of what we will be making in the mid- to late eighties”, Jobs said. Apple supposedly never did do any market research, preferring to follow the approach of Henry Ford, who said he never asked what people wanted because they would have just asked for a better horseless carriage. Whilst it is probably the case that people can often see how to make incremental improvements to products, they usually cannot see how to make disruptive changes that introduce a whole new way of doing things, possibly making everything that went before redundant. It is the job of the architect to show what is in the realms of the possible by creating new and innovative systems.
  6. Putting things together in new and creative ways is sometimes more important than inventing things. Jobs was not the first to market with an MP3 player, a mobile phone or a tablet computer. Others had already invented and built these things. What Jobs and Apple did was to tweak things that already existed. As Isaacson says, “he had noticed something odd about the cell phones on the market: They all stank, just like portable music players used to”. Jobs applied his design skills to these and came up with a (far) better product and in fact a whole new platform as well (i.e. the computer as the digital hub). Architects too need to learn that it is often putting together existing components in new and innovative ways that really counts and gives a competitive and business advantage.

Plus Two More

In my previous post on five architectures that changed the world I left out a couple that didn’t fit my self-imposed criteria. Here, therefore, are two more, the first of which is a bit too techie to be a part of everyone’s lives but is nonetheless hugely important and the second of which has not changed the world yet but has pretty big potential to do so.

IBM System/360
Before the System/360 there was very little interchangeability between computers, even from the same manufacturer. Software had to be created for each type of computer, making applications very difficult to develop as well as maintain. The System/360 practically invented the concept of architecture as applied to computers, in that it had an architecture specification that did not make any assumptions about the implementation itself but rather described the interfaces and expected behavior of an implementation. The System/360 was the first family of computers designed to cover the complete range of applications, from small to large, both commercial and scientific. The development of the System/360 cost $5 billion back in 1964 (around $34 billion in today’s money) and almost destroyed IBM.

Watson
Unless you are American you had probably never heard of the TV game show Jeopardy! before the start of 2011. Now we know that it is a show that “uses puns, subtlety and wordplay” that humans enjoy but which computers would get tied up in knots over. This, it turns out, was the challenge that David Ferrucci, the IBM scientist who led the four-year quest to build Watson, had set himself: to compete live against humans in the TV show.

IBM has “form” when it comes to building computers to play games! The previous one (Deep Blue) won a six-game match by two wins to one with three draws against world chess champion Garry Kasparov in 1997. Chess, it turns out, is a breeze to play compared to Jeopardy! Here’s why.
Chess…

  • Is a finite, mathematically well-defined search space.
  • Has a large but limited number of moves and states.
  • Makes everything explicit and has unambiguous mathematical rules which computers love.
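
It is exactly this finite, explicit structure that lets a computer play such games by brute search. The sketch below uses a deliberately tiny game (Nim: take 1–3 stones, last stone wins) rather than chess, but the minimax idea is the same, just over a vastly larger space:

```python
def legal_moves(stones: int) -> list:
    """A move in this toy game takes 1, 2 or 3 stones."""
    return [stones - take for take in (1, 2, 3) if take <= stones]

def minimax(stones: int, maximizing: bool) -> int:
    """Return +1 if the maximizing player can force a win, -1 otherwise.

    The entire game tree is searched; the rules are explicit and
    unambiguous, which is why computers love games like this."""
    if stones == 0:
        # The previous player took the last stone and won
        return -1 if maximizing else +1
    scores = [minimax(s, not maximizing) for s in legal_moves(stones)]
    return max(scores) if maximizing else min(scores)

# With 12 stones the first player loses under perfect play (12 % 4 == 0)
print(minimax(12, True))  # -1
```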

Games like Jeopardy!, however, play on the subtleties of human language, which…

  • Is ambiguous, contextual and implicit.
  • Is grounded only in human cognition.
  • Can have a seemingly infinite number of ways to express the same meaning.

According to IBM, Watson is “built on IBM’s DeepQA technology for hypothesis generation, massive evidence gathering, analysis, and scoring.” Phew! The point of Watson, however, is not its ability to play a game show but its potential to “weave its fabric” into the messiness of our human lives, where data is not kept in nice ordered relational databases but is unstructured and seemingly unrelated, yet can sometimes yield new and undiscovered meaning. One obvious application is in medical diagnosis but it could also be used in a vast array of other situations, from help desks through to sorting out what benefits you are entitled to. So, not world changing yet, but definitely watch this space.
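
IBM has not released DeepQA’s code, but its published pipeline, generate candidate answers, gather evidence for each, then score and rank, can be caricatured in a few lines. The scorer below is a toy of my own invention, nothing like Watson’s hundreds of evidence scorers:

```python
def evidence_score(candidate: str, passage: str, question: str) -> float:
    """Toy scorer: how much of the question a passage mentioning
    the candidate answer covers."""
    if candidate.lower() not in passage.lower():
        return 0.0
    q_words = set(question.lower().split())
    return len(q_words & set(passage.lower().split())) / len(q_words)

def rank_answers(question: str, candidates: list, corpus: list) -> list:
    """Each candidate is a hypothesis; its confidence is the best
    evidence score found anywhere in the corpus."""
    scored = [
        (c, max(evidence_score(c, p, question) for p in corpus))
        for c in candidates
    ]
    return sorted(scored, key=lambda cs: cs[1], reverse=True)

corpus = [
    "Watson was built by IBM Research to play Jeopardy!",
    "Deep Blue beat Garry Kasparov at chess in 1997.",
]
print(rank_answers("Which IBM system played Jeopardy!?",
                   ["Watson", "Deep Blue"], corpus))
```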

Five Software Architectures That Changed The World

Photo by Kobu Agency on Unsplash

“Software is the invisible thread and hardware is the loom on which computing weaves its fabric, a fabric that we have now draped across all of life”.

Grady Booch

Software, although an “invisible thread”, has certainly had a significant impact on our world and now pervades pretty much all of our lives. Some software, and in particular some software architectures, have had a significance beyond just the everyday and have truly changed the world.

But what constitutes a world changing architecture? For me it is one that meets all of the following:

  1. It must have had an impact beyond the field of computer science or a single business area and must have woven its way into people’s lives.
  2. It may not have introduced any new technology but may instead have used some existing components in new and innovative ways.
  3. The architecture itself may be relatively simple, but the way it has been deployed may be what makes it “world changing”.
  4. It has extended the lexicon of our language either literally (as in “I tried googling that word”) or indirectly in what we do (e.g. the way we now use App stores to get our software).
  5. The architecture has emergent properties and has been extended in ways the architect(s) did not originally envisage.

Based on these criteria here are five architectures that have really changed our lives and our world.

World Wide Web
When Tim Berners-Lee published his innocuous sounding paper Information Management: A Proposal in 1989 I doubt he could have had any idea what an impact his “proposal” was going to have. This was the paper that introduced us to what we now call the world wide web and has quite literally changed the world forever.

Apple’s iTunes
There has been much talk in cyberspace and in the media in general on the effect and impact Steve Jobs has had on the world. When Apple introduced the iPod in October 2001, although it had the usual cool Apple design makeover, it was, when all was said and done, just another MP3 player. What really made the iPod take off and changed everything was iTunes. It not only turned the music industry upside down and inside out but gave us the game-changing concept of the ‘App Store’ as a way of consuming digital media. The impact of this is still ongoing and is driving the whole idea of cloud computing and the way we will consume software.

Google
When Google was founded in 1998 it was just another company building a search engine. As Douglas Edwards says in his book I’m Feeling Lucky, “everybody and their brother had a search engine in those days”. When Sergey Brin was asked how he was going to make money (out of search) he said “Well…, we’ll figure something out”. Clearly, more than a decade later, they have figured out that something and become one of the fastest growing companies ever. What Google did was not only create a better, faster, more complete search engine than anyone else but also figure out how to pay for it, and all the other Google applications, through advertising. They have created a new market and value network (in other words a disruptive technology) that has changed the way we seek out and use information.

Wikipedia
Before Wikipedia there was a job called Encyclopedia Salesman: someone who walked from door to door selling knowledge packed between bound leather covers. Now such people have been banished to the great redundancy home in the sky, along with typesetters and comptometer operators.

If you do a Wikipedia on Wikipedia you get the following definition:

Wikipedia is a multilingual, web-based, free-content encyclopedia project based on an openly editable model. The name “Wikipedia” is a portmanteau of the words wiki (a technology for creating collaborative websites, from the Hawaiian word wiki, meaning “quick”) and encyclopedia. Wikipedia’s articles provide links to guide the user to related pages with additional information.

From an architectural point of view Wikipedia is “just another wiki”; however, what it has brought to the world is community participation on a massive scale and an architecture to support that collaboration (400 million unique visitors monthly, more than 82,000 active contributors and more than 19 million articles in over 270 languages). Wikipedia clearly meets all of the above criteria (and more).

Facebook
To many people Facebook is social networking. Not only has it seen off all competitors, it makes it almost impossible for new ones to join. Whilst the jury is still out on Google+, it is difficult to see how it can ever reach the 800 million people Facebook has. Facebook is also the largest photo-storing site on the web and has developed its own photo storage system to store and serve its photographs. See this article on Facebook architecture as well as this presentation (slightly old now but interesting nonetheless).

I’d like to thank both Grady Booch and Peter Eeles for providing input to this post. Grady has been doing great work on software archeology and knows a thing or two about software architecture. Peter is my colleague at IBM as well as co-author on The Process of Software Architecting.