Forty years of Mac

Screenshot from Apple’s “1984” ad directed by Sir Ridley Scott

Forty years ago today (24th January 1984) a young Steve Jobs took to the stage at the Flint Center in Cupertino, California, to introduce the Apple Macintosh desktop computer, and the world found out “why 1984 won’t be like ‘1984’”.

The Apple Macintosh, or ‘Mac’, boasted cutting-edge specifications for its day: a 9-inch monochrome display with a resolution of 512 x 342 pixels, a 3.5-inch floppy disk drive, and 128 KB of RAM. The 16/32-bit Motorola 68000 microprocessor powered this compact yet powerful machine, which set new standards for graphical user interfaces and ease of use.

The original Apple Macintosh
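
To put those specifications in perspective, here is a quick back-of-the-envelope calculation (my own arithmetic, not a published Apple figure) showing how much of the 128 KB of RAM the 1-bit display buffer alone consumed:

```python
# Sketch: share of the original Mac's RAM taken by its framebuffer.
WIDTH, HEIGHT, BITS_PER_PIXEL = 512, 342, 1
TOTAL_RAM = 128 * 1024  # bytes

framebuffer = WIDTH * HEIGHT * BITS_PER_PIXEL // 8  # bytes
print(f"Framebuffer: {framebuffer} bytes ({framebuffer / 1024:.1f} KB)")
print(f"Share of RAM: {framebuffer / TOTAL_RAM:.1%}")
# Output:
# Framebuffer: 21888 bytes (21.4 KB)
# Share of RAM: 16.7%
```

In other words, roughly a sixth of the machine’s entire memory went on the screen before a single application was loaded, which makes what the Mac’s system software achieved all the more remarkable.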

The Mac had been gestating in Steve Jobs’ restless and creative mind for at least five years, but its difficult birth did not begin until 1981, when Jobs recruited a team of talented individuals including visionaries like Jef Raskin, Andy Hertzfeld, and Bill Atkinson. The collaboration of these creative minds produced a computer that not only revolutionized the industry but also left an indelible mark on the way people interact with technology.

The Mac was one of the first personal computers to feature a graphical user interface (Microsoft Windows 1.0 was not released until November 1985) as well as the use of icons, windows, and a mouse for navigation instead of a command-line interface. This approach significantly influenced the development of GUIs across various operating systems.

Possibly of more significance is that lessons learned from the Mac have influenced, and continue to influence, the development of subsequent Apple products. Steve Jobs’ (and later Jony Ive’s) commitment to simplicity and elegance in design became a guiding principle for products like the iPod, iPhone, iPad, and MacBook, and is what really makes the Apple ecosystem what it is (as well as allowing Apple to charge the prices it does).

One of the pivotal moments in the Mac’s launch was the now famous “1984” ad, which had its one and only public airing two days earlier during a Super Bowl XVIII commercial break and built huge anticipation for the groundbreaking product.

I was a relatively late convert to the cult of Apple, not buying my first Apple computer (a MacBook Pro) until 2006. I still have that machine and periodically start it up for old times’ sake. It still works perfectly, albeit very slowly and running a now very old version of macOS.

A more significant event, for me at least, was that a year after the Mac launch I moved to Cupertino to take a job as a software engineer at a company called ROLM, a telecoms provider that had just been bought by IBM and was looking to move into Europe. ROLM was on a recruiting drive to hire engineers from Europe who knew how to develop products for that marketplace, and I had been lucky enough to have the right skills (digital signalling systems) at the right time.

At the time of my move I had some awareness of Apple, but I got to know the company better as I ended up living only a few blocks from its HQ on Mariani Avenue, Cupertino (just off Stevens Creek Boulevard, which at that time was chock-full of car dealerships).

The other slight irony is that IBM (ROLM’s owner) was of course “Big Brother” in Apple’s ad, and the young woman with the sledgehammer was out to break its then virtual monopoly on personal computers. IBM no longer makes personal computers, whilst Apple has obviously gone from strength to strength.

Happy Birthday Mac!

What Have We Learnt from Ten Years of the iPhone?

Ten years ago this week (on 9th January 2007) the late Steve Jobs, then at the height of his powers at Apple, introduced the iPhone to an unsuspecting world. The history of that little device (which has got both smaller and bigger in the intervening ten years) is writ large across the entire Internet, so I’m not going to repeat it here. However, it’s worth watching the launch keynote video on YouTube, not just to remind yourself what a monumental moment in tech history this was, even though few of us realised it at the time, but also to see a masterpiece in how to launch a new product.

Within two minutes of Jobs walking on stage he has the audience shouting and cheering as if he’s a rock star rather than a CEO. At around 16:25, when he has unveiled his new baby and shows for the first time how to scroll through a list on a screen (hard to believe that ten years ago no one knew this was possible), they are practically eating out of his hand, and he still has over an hour to go!

This iPhone keynote, probably one of the most important in the whole of tech history, is a case study in how to deliver a great presentation. Indeed, Nancy Duarte, in her book Resonate, uses it as one of her case studies for how to “present visual stories that transform audiences”. In the book she analyses the whole event to show how Jobs uses all the classic techniques of storytelling: establishing what is and what could be, building suspense, keeping the audience engaged, making them marvel, and finally showing them a new bliss.

The iPhone product launch, hugely important though it was, is not what this post is about. Rather, it’s about how, ten years later, the iPhone has kept pace with innovations in technology to not only remain relevant (and much copied) but also continue to influence (for better and worse) the way people interact, communicate and indeed live. A number of enabling ideas and technologies, some introduced at launch and some since, have made this possible. What are they, and how can we learn from, and improve on, the example set by Apple?

Open systems generally beat closed systems

At its launch the iPhone came with a small set of native apps created by Apple; third-party developers could not build their own. According to Jobs, it was an issue of security. “You don’t want your phone to be an open platform,” he said. “You don’t want it to not work because one of the apps you loaded that morning screwed it up. Cingular doesn’t want to see their West Coast network go down because of some app. This thing is more like an iPod than it is a computer in that sense.”

Jobs soon went back on that decision, which is one of the factors that led to the overwhelming success of the device. There are now 2.2 million apps available for download in the App Store, with over 140 billion downloads made since 2007.

As has been shown time and time again, opening a system up and allowing access to third-party developers nearly always beats keeping it closed and locked down.

Open systems need easy-to-use ecosystems

Claiming your system is open does not mean developers will flock to extend it; that will only happen if doing so is both easy and potentially profitable. Further, the second of these is unlikely to happen unless the first is in place.

Today, with new systems being built around cognitive computing, the Internet of Things (IoT) and blockchain, companies both large and small are vying with each other to provide easy-to-use but secure ecosystems that allow these new technologies to flourish and grow, hopefully to the benefit of business and society as a whole. There will be casualties along the way, but what matters is this competition, together with the recognition that systems need to be built right, not just that we build the right system at the time.

Open systems must not mean insecure systems

One of the reasons Jobs gave for not initially making the iPhone an open platform was his concern over security, and that hackers would break into the system and wreak havoc. These concerns have not gone away; they have become even more prominent. IoT and artificial intelligence, when embedded in everyday objects like cars and kitchen appliances, as well as in our logistics and defence systems, have the potential to cause their own unique and potentially disastrous type of destruction.

The average cost of a data breach alone is estimated at $3.8 to $4 million, and that’s without even considering the wider reputational loss companies face. Organisations need to monitor how security threats are evolving year on year and get well-informed insights about the impact they can have on their business and reputation.

Ethics matter too

With all the recent press coverage of how fake news may have affected the US election and may impact the upcoming German and French elections, as well as the implications of driverless cars making life-and-death decisions for us, the ethics of cognitive computing is becoming an ever more serious topic for public discussion, as well as for potential government intervention.

In October last year the White House released a report called Preparing for the Future of Artificial Intelligence. The report looked at the current state of AI, its existing and potential applications, and the questions that progress in AI raises for society and public policy, and made a number of recommendations for further action. These included:

  • Prioritising open training data and open data standards in AI.
  • Industry working with government to keep it updated on the general progress of AI in industry, including the likelihood of milestones being reached.
  • The Federal government prioritising basic and long-term AI research.

Partly in answer to the White House report, this week a group of private investors, including LinkedIn co-founder Reid Hoffman and eBay founder Pierre Omidyar, launched a $27 million research fund called the Ethics and Governance of Artificial Intelligence Fund. The group’s purpose is to foster the development of artificial intelligence for social good by approaching technological developments with input from a diverse set of viewpoints, such as policymakers, faith leaders, and economists.

I have discussed before how transformative technologies like the world wide web have impacted all of our lives, and not always for the good. I hope that initiatives like that of the US government (which will hopefully continue under the new administration) will enable a good and rational public discourse on how we allow these new systems to shape our lives for the next ten years and beyond.

What Makes a Tech City? (Hint: It’s Not the Tech)

Matthew Boulton, James Watt and William Murdoch

The above photograph is of a statue in Centenary Square, Birmingham in the UK. The three figures in it, Matthew Boulton, James Watt and William Murdoch, were the tech pioneers of their day, living in and around Birmingham and belonging to a loosely knit group who referred to themselves as the Lunar Society. The history of the Lunar Society and the people involved is captured in the book The Lunar Men by Jenny Uglow.

“Amid fields and hills, the Lunar men build factories, plan canals, make steam-engines thunder. They discover new gases, new minerals and new medicines and propose unsettling new ideas. They create objects of beauty and poetry of bizarre allure. They sail on the crest of the new. Yet their powerhouse of invention is not made up of aristocrats or statesmen or scholars but of provincial manufacturers, professional men and gifted amateurs – friends who meet almost by accident and whose lives overlap until they die.”

From The Lunar Men by Jenny Uglow

You don’t have to live in the UK to have heard that Birmingham, like many of the other great manufacturing cities of the Midlands and Northern England, has somewhat lost its way over the century or so since the Lunar Men were creating their “objects of beauty and poetry of bizarre allure”. It’s now sometimes hard to believe that these great cities were the powerhouses and engines of the industrial revolution that changed not just England but the whole world. This was neatly summed up by Steven Knight, creator of the BBC television programme Peaky Blinders, set in the lawless backstreets of Birmingham in the 1920s. In a recent interview in the Guardian, Knight said:

“It’s typical of Brum that the modern world was invented in Handsworth and nobody knows about it. I am trying to start a “Make it in Birmingham” campaign, to get high-tech industries – film, animation, virtual reality, gaming – all into one place, a place where people make things, which is what Birmingham has always been.”

Likewise Andy Street, Managing Director of John Lewis and Chair of the Greater Birmingham & Solihull Local Enterprise Partnership, had this to say about Birmingham in his University of Birmingham Business School Advisory Board guest lecture last year:

“Birmingham was once a world leader due to our innovations in manufacturing, and the city is finally experiencing a renaissance. Our ambition is to be one of the biggest, most successful cities in the world once more.”

Andy Street CBE – MD of John Lewis

If Birmingham and cities like it, not just in England but around the world, are to become engines of innovation once again, they need to make a step change in how they go about it. The lesson to be learned from the Lunar Men is that they did not wait for grants from central Government or the European Union, or for some huge corporation to move in and take things in hand; they drove innovation from their own passion and inquisitiveness about how the world worked, or could work. They basically got together, decided what needed to be done and got on with it. They literally designed and built the infrastructure that was to form the foundations of innovation for the next 100 years.

Today we talk of digital innovation and how the industries of our era are disrupting traditional ones (many of them formed by the Lunar Men and their descendants) for better and for worse. Now every city wants a piece of that action and wants to emulate the shining light of digital innovation and disruption, Silicon Valley in California. Is that possible? According to the Medium post To Invent the Future, You Must Understand the Past, the answer is no. The post concludes by saying:

“…no one will succeed because no place else — including Silicon Valley itself in its 2015 incarnation — could ever reproduce the unique concoction of academic research, technology, countercultural ideals and a California-specific type of Gold Rush reputation that attracts people with a high tolerance for risk and very little to lose.”

So can this really be true? A high tolerance for risk (and failure) is certainly one of the traits that makes for a creative society, and no amount of tax breaks or university research programmes is going to fix that problem. Taking the example of the Lunar Men, though, one thing that cities can do to disrupt themselves from within is to effect change from the bottom up rather than the top down. Cities are made up of citizens, after all, and they are the very people who not only know what needs changing but are also best placed to bring about that change.


With this in mind, an organisation in Birmingham called Silicon Canal (see here if you want to know where that name comes from) of which I am a part, has created a white paper putting forward our ideas on how to build a tech and digital ecosystem in and around Birmingham. You can download a copy of the white paper here.

The paper not only identifies the problem areas but also suggests how things can be improved, offering potential solutions to grow the tech ecosystem in the Greater Birmingham area so that it can compete on an international stage. Download the white paper and read it; if you are based in Birmingham, join in the conversation, and if you’re not, use the research it contains to look at your own city and how you can help change it for the better.

The paper was launched at an event this week in the new iCentrum building at Innovation Birmingham, a great space that is starting to address one of the issues highlighted in the white paper: bringing together two key elements of a successful tech ecosystem, established companies and entrepreneurs.

Another event taking place in Birmingham next month is TEDx Brum – The Power of US, which promises lots of inspiring talks by local people who are already effecting change from within.

As a final comment, if you’re still not sure that you have the power to make a difference, here are some words from the late Steve Jobs:

“Everything around you that you call life was made up by people that were no smarter than you and you can change it, you can influence it, you can build your own things that other people can use.”

Steve Jobs

What Have Architects Ever Done for Us?

For a while I’ve been thinking about blogging on what value architects bring to the table in an age of open source software, commoditised hardware and agile development. I’ve finally been spurred into action by re-discovering the famous Monty Python sketch What have the Romans ever done for us? (I often find that thinking of a name for a blog post helps me formulate the content and structure what I want to say.) Here’s the video in case you haven’t seen it.

So, picture the scene…

You are in a meeting with the chief information officer (CIO) of a public or private sector enterprise who has been tasked with aligning IT with the new business strategy to “deliver real business value”. The current hot technologies, namely social media, mobile, big data/analytics and cloud, are all being mooted as the things the organisation needs to leapfrog the competition and deliver something new and innovative to its customers. The CIO, however, has been burnt before by an architecture team that seems to spend most of its time discussing new technology, drawing fine-looking pictures that adorn their cubicle walls and attending conferences sponsored by vendors. She struggles to see the value these people bring and asks in a frustrated tone, “What have architects ever done for us?” What’s your response? Here’s what I think architects should be doing to support the CIO and help her achieve the enterprise’s goals.

  1. Architects bring order from chaos. The world of IT continues to get ever more challenging. Each new architectural paradigm adds more layers of complexity onto an organisation’s already overstretched IT infrastructure. As more technologies get thrown into the mix, often to solve immediate and pressing business problems but without being part of any overall strategic vision, IT systems sink into an ever more chaotic state. One of the roles of an architect is not only to attempt to prevent this happening in the first place (see number 2) but also to describe a future “to-be” state, together with a road map for how to get to this new world. Some will say that this form of enterprise-level architecting is fundamentally flawed; however, I would argue it still has great value, provided it is done at the right level of abstraction (not everything is enterprise level) and recognises that change will be continuous and that true nirvana will never be achieved.
  2. Architects don’t jump on the latest trend and forget what went before. When a new technology comes along it’s sometimes easy to forget that it is just that: a new technology. Whilst the impact on end users may be different, the way enterprises go about integrating that technology into their business still needs to follow tried and tested methods. Remember: don’t throw the baby out with the bath water.
  3. Architects focus on business value rather than the latest technology. Technologies come and go; some change the world, some don’t. Unless a technology can provide some tangible benefit to the way a business operates it is unlikely to gain a foothold. Architects know that the key is identifying the business value of technology and realising that value through robust solutions built on it. Technology for the sake of technology no longer works (and probably never did).
  4. Architects know how to apply technology to bring innovation. This is subtly different from 3. It is about using technology not just to provide incremental improvements in the way a business operates but to provide disruptive innovation that causes a major shift in the way it operates. Such disruptions often cause some businesses to disappear, but at the same time can cause others to be created.
  5. Architects know the importance of “shipping”. According to Steve Jobs, “real artists ship”. Delivering something (anything) on time and within budget is one of the great challenges of software development; time or money (or both) usually run out before anything is delivered. Good architects know the importance of working within the constraints of time and money and work with project managers to ensure shipping takes place on time and within budget.

So there you have it, my take on the value of architects and what you hopefully do for your organisation or clients. Now, if only we could do something about bringing world peace…

A Tale of Two Presentations

Popular consensus would seem to have it that the 2007 presentation by Steve Jobs at Macworld, where he unveiled the iPhone, is one of the best business presentations of all time, not just in terms of the delivery but also in terms of the impact it had on the world.

As a stark contrast, according to Ron Galloway in the Huff Post Business Blog, a recent presentation by Sony introducing the PS4 will likely go down as one of the worst business presentations ever. I’ve not seen the Sony presentation, but according to Wired they held reporters hostage for two hours and never actually showed them the new console, just the controller, revealing very little about what the console would be like.

It is amazing that a company as large and influential as Sony can make so many fundamental presentation mistakes, but it is a salutary lesson to us all, I think.

There is some very good presentation advice at the end of the Huff Post blog, by the way. It is so useful it’s worth cutting out and sticking to your presentation notes:

  1. Respect your audience and their time.
  2. Get on stage.
  3. Make your assertion.
  4. Support it with visual evidence.
  5. Repeat your assertion.
  6. Leave the stage.

The Art of What’s Possible (and What’s Not)

One of the things Apple are definitely good at is giving us products we didn’t know we needed (e.g. the iPad). Steve Jobs, who died a year ago this week, famously said “You’ve got to start with the customer experience and work back to the technology — not the other way around”  (see this video at around 1:55 as well as this interview with Steve Jobs in Wired).

The subtle difference from the “normal” requirements-gathering process here is that, rather than asking what the customer wants, you are looking at the customer experience you want to create and then trying to figure out how available technology can realise that experience. In retrospect, we can all see why a device like the iPad is so useful (movies and books on the go, a cloud-enabled device that lets you move data between it and other devices, the mobile web on a screen you can actually read, and so on). Chances are, however, that it would have been very difficult to elicit from anyone a set of requirements that would have ended up describing such a device.

Jobs goes on to say “you can’t start with the technology and try to figure out where you’re going to try and sell it”. In many ways this is a restatement of the well-known “golden hammer” anti-pattern from software development (to a man with a hammer, everything appears to be a nail): the misapplication of a favoured technology, tool or concept in solving a problem.

Whilst all this is true and would seem to make sense, at least as far as Apple is concerned, there is still another subtlety at play when building truly successful products that people didn’t know they wanted. As an illustration, consider another, slightly more infamous Apple product: the Newton MessagePad.

In many ways the Newton was an early version of the iPad or iPhone, years ahead of its time. One of its goals was to “reinvent personal computing”. There were many reasons why the Newton did not succeed (including its large, clunky size and poor handwriting recognition system), but one of them must surely have been that the device was just too far ahead of the technology available at the time in terms of processing power, memory, battery life and display technology. Sometimes ideas can be really great but the technology is just not there to support them. So, whilst Jobs was right in saying you cannot start with the technology and then decide how to sell it, equally you cannot start with an idea if the technology is not there to support it, as was the case with the Newton. So what does this mean for architects?

A good understanding of technology, how it works and how it can be used to solve business problems is, of course, a key skill of any architect. Equally important, however, is an understanding of what is not possible with current technology. It is sometimes too easy to be seduced by technology and to overstate what it is capable of. Watching out for this, especially when there is pressure to close a sale, is something we must all do, and we must be forceful in calling it out when we think something is not possible.

Giving Users What They Want (Maybe)

Tradition has it that users come up with a set of requirements which architects and designers take and turn into “a solution”: that is, a combination of bespoke and off-the-shelf hardware and software components, assembled in such a way that they address all the requirements (non-functional as well as functional). Another point of view is that users don’t actually know what they want and therefore need to be guided toward solutions they never knew they needed or indeed knew were possible. Two famous proponents of this approach were Henry Ford, who supposedly said:

If I had asked people what they wanted, they would have said faster horses.

which is debunked here, and of course Steve Jobs and Apple, whose “Eureka” moments continue to give us gadgets we never knew we needed. As Adrian Slywotzky points out here, however, the magic that Jobs and Apple seemed to perform so regularly is actually based on highly focused and detailed business design, continuous refinement through prototyping and a manic attention to the customer experience. In other words it really is 90% perspiration and 10% inspiration.

One thing that Henry Ford and Steve Jobs/Apple did have in common was a deep understanding of the technology in their chosen fields and, more importantly, of where that technology was heading.

If, as an architect, you are to have a sensible conversation with users (AKA customers, clients, stakeholders et al) about how to create an architecture that addresses their needs, you not only need a good understanding of their business, you also need a deep understanding of what technology is available and where that technology is heading. This is a tall order for one person’s brain, which is why the job of an architect is uniquely fascinating (but also hard work). It’s also why, if you’re good at it, you’ll be in demand for a while yet.

Remember that, even though they may not know it, users are looking to you to guide them not only on what the knowns are but also on what the unknowns are. In other words, it’s your job to understand the art of the possible, not just the art of the common-or-garden.

Educating an IT Workforce for the 21st Century

A report on the BBC Today programme this morning argues that the “Facebook generation needs better IT skills” and that UK schools should be providing courses in programming at GCSE. The report bemoaned the fact that so-called Information and Communications Technology (ICT) GCSEs did little more than teach students how to use Microsoft Office programmes such as Word and Excel and did not prepare students for a career in IT. The backers of this report were companies like Google and Microsoft.

This raises an interesting question of who should be funding such education in these austere times. Is it the role of schools to provide quite specific skills like programming, or should they be providing the basics of literacy and numeracy as well as the more fundamental skills of creativity, communication and collaboration, and leave the specifics to the industries that need them? Here are some of the issues:

  1. Skills like computer programming are continuously evolving and changing. What is taught at 14–16 (the age of GCSE students in the UK) will almost certainly be out of date by the time those students hit the workforce at 21+.
  2. The computer industry, just like manufacturing before it, long ago sent out the message to students that programming skills (in Western economies at least) were commoditised and better performed by the low-cost economies of the BRIC nations (and now, presumably, the CIVETS).
  3. To most people computers are just tools. Like cars, washing machines and mobile phones they don’t need to know how they work, just how to use them effectively.
  4. Why stop at a computer programming GCSE? Why not teach the basics of plumbing, car mechanics, cookery and hairdressing, all of which are still in great demand and needed by their respective industries?
  5. Public education (which essentially did not exist before the 19th century, certainly not for the masses) came about to meet the needs of industrialism, and as such demanded left-brained, logical thinking skills rather than right-brained, creative ones (see Sir Ken Robinson’s TED talk on why schools kill creativity). As a result we have a system that rewards the former rather than the latter (as in “there’s no point in studying painting or music, you’ll never get a job in that”).

In an ideal world we would all be given the opportunities to learn and apply whatever skills we wanted (both at school and throughout life) and have that learning funded by the tax payer on the basis it benefits society as a whole. Unfortunately we don’t live in that ideal world and in fact are probably moving further from it than ever.

Back in the real world, therefore, industry must surely fund the acquisition of those skills. Unfortunately, in many companies education is the first thing to be cut when times are hard. The opposite should be the case. One of the best things I ever did was to spend five weeks (yes, weeks, not days), funded entirely by IBM, learning object-oriented programming and design. Whilst five weeks may seem like a long time for a course, I know it has paid for itself many, many times over through the work I have been able to do for IBM in the 15 years since. Further, I suspect that five weeks of intensive learning was easily equivalent to at least a year’s worth of learning in an educational establishment.

Of course such skills are more vital than ever to companies like Google, Microsoft and IBM. Steve Denning, in an article in Forbes this month called Why Big Companies Die, quotes from an article by Peggy Noonan in the Wall Street Journal (called A Caveman Won’t Beat a Salesman). Denning draws on a theory from Steve Jobs that big companies fail when salesmen and accountants, who don’t know anything about the product or service the company makes or how it works, are put in charge. Denning says:

The activities of these people [salesmen and accountants] further dispirit the creators, the product engineers and designers, and also crimp the firm’s ability to add value to its customers. But because the accountants appear to be adding to the firm’s short-term profitability, as a class they are also celebrated and well-rewarded, even as their activities systematically kill the firm’s future.

Steve Jobs showed that there was another way: namely, to keep playing offense and focus totally on adding value for customers by creating innovative new products. By doing that you can make more money than the companies that are milking their cash cows and focused on making money rather than products.

Companies like Google and Microsoft (and IBM and Apple) need people fully trained in the three C’s (creativity, communication and collaboration) who can then apply these to whatever task is most relevant to the company’s bottom line. It’s the role of those companies, not government, to train people in the specifics.

Interestingly, Seymour Papert (who co-invented the Logo programming language) used programming as a tool to improve the way that children think and solve problems. Papert drew on Piaget’s work on cognitive development (which showed how children learn) and used Logo as a way of improving their creativity.
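
For readers who never met Logo, Python’s turtle module is a direct descendant of Papert’s turtle graphics, so a small sketch can give a flavour of the kind of exercise he had in mind (this is my own illustrative example, not one of Papert’s): a child defines a simple procedure, then discovers that repeating it with a small turn produces something unexpected.

```python
import turtle

def square(t, side):
    # The canonical first turtle-graphics exercise:
    # four sides, four right-angle turns.
    for _ in range(4):
        t.forward(side)
        t.right(90)

t = turtle.Turtle()
t.speed(0)  # draw as fast as possible

# Repeating the square with a small rotation each time turns a
# trivial procedure into a surprising rosette pattern: discovery
# through programming, which was exactly Papert's point.
for _ in range(18):
    square(t, 100)
    t.right(20)

turtle.done()
```

The point is not the pattern itself but that the child, in writing, debugging and varying the procedure, is reasoning explicitly about angles, repetition and decomposition.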

Finally, to see how students themselves view all this, read the article by Nikhil Goyal (a 16-year-old junior at Syosset High School in New York), who states: “for the 21st century American economy, all economic value will derive from entrepreneurship and innovation. Low-cost manufacturing will essentially be wiped out of this country and shipped to China, India, and other nations” and goes on to propose that “we institute a 21st century model of education, rooted in 21st century learning skills and creativity, imagination, discovery, and project-based learning”. Powerful stuff for one so young; there may yet be hope for us.

What Can Architects Learn from Steve Jobs?

I’ve just finished reading Steve Jobs by Walter Isaacson. In case there is anyone out there who doesn’t know it yet, this is the authorised biography that Jobs asked Isaacson to write, completed a few weeks before Jobs’ untimely death, aged 56, last month. Jobs insisted that Isaacson would have complete control over the contents of the book, saying he would not even read it before it was published and adding “I don’t have any skeletons in my closet that can’t be allowed out”.

Jobs was clearly a very complex personality: on the one hand a creative genius whose zen-like focus on simplicity and efficiency helped create some of the most beautiful and useful gadgets of our time (some of which we never even knew we needed); on the other, a bully and a tyrant who knew exactly how to “size people up, understand their inner thoughts, and know how to relate to them, cajole them, or hurt them at will”. One of Jobs’ girlfriends, who later went on to found a mental health resource network in California, even went so far as to say that she thought Jobs suffered from Narcissistic Personality Disorder (NPD), in which the individual is excessively preoccupied with issues of personal adequacy, power, prestige and vanity.

Whilst it is to be hoped that NPD is not a prerequisite for being a software architect, Jobs did have a vision and an understanding of IT that we as architects can learn from. Six big ideas stand out in this respect:

  1. Engineering matters. When Jobs met with President Obama in 2011 he implored the President to reform the US education system and to create more engineering students. Jobs said “if you could educate these engineers we could move more manufacturing plants here”. Whilst there was always an uneasy tension between engineering and design at Apple, Jobs recognised and valued the importance of having an engineering-led rather than sales-led team at the top of the company, berating companies like Microsoft (under Ballmer), IBM (under Akers) and HP (under their last several CEOs) for putting sales people in charge rather than engineers. For software architects, engineering clearly translates to being intimately knowledgeable about the technology you are using and knowing how to put the working parts together. The best architects I know are passionate about technology.
  2. Artistry and design matter just as much as engineering. This is a theme that Jobs emphasised over and over again, from when he dropped out of college and instead took a course on calligraphy, to his sometimes maniacal focus on the smallest details of design to make a product as satisfying and aesthetically pleasing as possible. He even insisted that circuit boards, which no one would ever see once the product was fully assembled, should be laid out in as clean and uncluttered a way as possible. It is this aspect of design that matters most for architects. Provided that a system functionally does what it is meant to do, within the required constraints and system qualities, one could argue it does not matter how messily the software is assembled. Who’s going to see it anyway? This misses the point, though. Great design, as opposed to just good-enough design, means the system will be easier to maintain, take less effort to learn and generally be more enjoyable for those who need to carry on working on it once the architects and developers have moved on.
  3. Simple is better than complex. Apple had a design mantra, “Simplicity is the ultimate sophistication”, or as Jobs said, “very simple, and we’re really shooting for Museum of Modern Art quality”. Jobs felt that design simplicity should be linked to making products easy to use. So much of the software we create today is far too complex and feature-rich, and as a result is very hard to use. People will often say that it has to be like that: just look at all the features you are getting. Unfortunately, a lot of the time many of those features are not needed; they simply add to the general bloat of the systems we build, making them hard to use as well as difficult to maintain. Sadly, building a complex system is often easier than building a simple one, and few architects see value in stripping out functionality rather than adding it.
  4. An unremitting focus on detail is key to creating a great product. Jobs was unusual in being able to hold the big-picture view whilst also zooming in on fine details. He would often sweat over the smallest detail until he was satisfied it was just right, whether it was the colour of a screw on the back plate of the iPod or the angle of the bevel on the iPad that would make someone want to pick it up. This capacity for holding the big-picture view whilst also being able to zoom right down and question low-level details is probably one of the hardest things architects have to do, but being able to do so gives a definite advantage, enabling greater integrity as well as better execution of the vision.
  5. Customers don’t always know what they want. In September 1982, when Jobs and his team were designing the original Macintosh, he held a retreat for the Mac team near Monterey where he gave a presentation on his thoughts for the Mac. At the end, someone asked whether or not they should do some market research to find out what customers wanted. “No,” replied Jobs, “because people don’t know what they want until we’ve shown them.” He then pulled out a device the size of a desk diary and flipped it open; it turned out to be a mock-up of a computer that could fit in your lap, with a keyboard and screen hinged together like a notebook. “This is my dream of what we will be making in the mid- to late eighties,” Jobs said. Apple supposedly never did do any market research, preferring to follow the approach of Henry Ford, who supposedly never asked what people wanted because they would just have asked for faster horses. Whilst people can often see how to make incremental improvements to products, they usually cannot see the disruptive changes that introduce a whole new way of doing things, possibly making everything that went before redundant. It is the job of the architect to show what is in the realms of the possible by creating new and innovative systems.
  6. Putting things together in new and creative ways is sometimes more important than inventing things. Jobs was not the first to market with an MP3 player, a mobile phone or a tablet computer; others had already invented and built these things. What Jobs and Apple did was to tweak things that already existed. As Isaacson says, “he had noticed something odd about the cell phones on the market: They all stank, just like portable music players used to”. Jobs applied his design skills to these and came up with a (far) better product and, in fact, a whole new platform as well (i.e. the computer as the digital hub). Architects, too, need to learn that it is often putting together existing components in new and innovative ways that really counts and gives a competitive and business advantage.

Steve Jobs 1955 – 2011

During the coming days and weeks millions of words will be written about Steve Jobs, many of them on devices he created. Why does the world care so much about an American CEO and computer nerd? For those of us that work with technology, and hope to use it to make the world a better place, the reason Steve Jobs was such a role model is that he not only had great vision and a brilliant understanding of design but also knew how to deliver technology in a form that was usable by everyone, not just technophiles, nerds and developers. Steve Jobs and Apple have transformed the way we interact with data, and the way that we think about computing, moving it from the desktop to the palm of our hands. As IT becomes ever more pervasive we could all learn from that and maybe even hope to emulate Steve Jobs a little.