The story so far…

 

Photo by Joshua Sortino on Unsplash

It’s hard to believe that this year is the 30th anniversary of Tim Berners-Lee’s great invention, the World-Wide Web, and that much of the technology that enabled his creation is still less than 60 years old. Here’s a brief history of the Internet and the Web, and how we got to where we are today, in ten significant events.

 

#1: 1963 – Ted Nelson begins developing a model for creating and using linked content he calls hypertext and hypermedia. Hypertext is born.

#2: 1969 – The first message is sent over the ARPANET from computer science Professor Leonard Kleinrock’s laboratory at the University of California, Los Angeles to the second network node at Stanford Research Institute. The Internet is born.

#3: 1969 – Charles Goldfarb, leading a small team at IBM, develops the first markup language, called Generalized Markup Language, or GML. Markup languages are born.

#4: 1989 – Tim Berners-Lee, whilst working at CERN, publishes his paper, Information Management: A Proposal. The World Wide Web (WWW) is born.

#5: 1993 – Mosaic, a graphical browser aiming to bring multimedia content (images and text on the same page) to non-technical users, is invented by Marc Andreessen. The web browser is born.

#6: 1995 – Jeff Bezos launches Amazon, “earth’s biggest bookstore”, from a garage in Seattle. E-commerce is born.

#7: 1998 – The Google company is officially launched by Larry Page and Sergey Brin to market Google Search. Web search is born.

#8: 2003 – Facebook (then called FaceMash but changed to The Facebook a year later) is founded by Mark Zuckerberg with his college roommate and fellow Harvard University student Eduardo Saverin. Social media is born.

#9: 2007 – Steve Jobs launches the iPhone at Macworld Expo in San Francisco. Mobile computing is born.

#10: 2018 – Tim Berners-Lee instigates act II of the web when he announces a new initiative called Solid, to reclaim the Web from corporations and return it to its democratic roots. The web is reborn?

I know there have been countless events that have enabled the development of our modern Information Age and you will no doubt think others should be included in preference to some of my suggestions. Also, I suspect that many people will not have heard of my last choice (unless you are a fairly hardcore computer type). The reason I have added this one is that I think/hope it will start to address what is becoming one of the existential threats of our age, namely how we survive in a world awash with data (our data) that is being mined and used without us knowing, much less understanding, the impact of such usage. Rather than living in an open society in which ideas and data are freely exchanged and used to everyone’s benefit, we instead find ourselves in an age of surveillance capitalism which, according to this source, is defined as being:

…the manifestation of George Orwell’s prophesied Memory Hole combined with the constant surveillance, storage and analysis of our thoughts and actions, with such minute precision, and artificial intelligence algorithmic analysis, that our future thoughts and actions can be predicted, and manipulated, for the concentration of power and wealth of the very few.

In her book The Age of Surveillance Capitalism, Shoshana Zuboff provides a sweeping (and worrying) overview and history of the techniques that the large tech companies are using to spy on us in ways that even George Orwell would have found alarming. Not least because we have voluntarily given up all of this data about ourselves in exchange for what are sometimes the flimsiest of benefits. As Zuboff says:

Thanks to surveillance capitalism the resources for effective life that we seek in the digital realm now come encumbered with a new breed of menace. Under this new regime, the precise moment at which our needs are met is also the precise moment at which our lives are plundered for behavioural data, and all for the sake of others’ gain.

Tim Berners-Lee invented the World-Wide Web then gave it away so that all might benefit. Sadly some have benefited more than others, not just financially but also by knowing more about us than most of us would ever want or wish. I hope, for all our sakes, that the work Berners-Lee and his small group of supporters is doing makes enough progress to reverse the worst excesses of surveillance capitalism before it is too late.

Educating an IT Workforce for the 21st Century

A report on the BBC Today programme this morning argues that the “Facebook generation needs better IT skills” and that UK schools should be providing courses in programming at GCSE. The report bemoaned the fact that so-called Information and Communications Technology (ICT) GCSEs did little more than teach students how to use Microsoft Office programs such as Word and Excel and did not prepare students for a career in IT. The backers of this report were companies like Google and Microsoft.

This raises an interesting question of who should be funding such education in these austere times. Is it the role of schools to provide quite specific skills like programming, or should they be providing the basics of literacy and numeracy as well as the more fundamental skills of creativity, communication and collaboration, and leave the specifics to the industries that need them? Here are some of the issues related to this:

  1. Skills like computer programming are continuously evolving and changing. What is taught at 14–16 today (the age of GCSE students in the UK) will almost certainly be out of date when these students hit the workforce at 21+.
  2. The computer industry, just like manufacturing before it, long ago sent out the message to students that programming skills (in Western economies at least) were commoditised and better performed by the low-cost economies of the BRIC nations (and now, presumably, the CIVETS).
  3. To most people computers are just tools. As with cars, washing machines and mobile phones, they don’t need to know how they work, just how to use them effectively.
  4. Why stop at a computer programming GCSE? Why not teach the basics of plumbing, car mechanics, cookery and hairdressing, all of which are still in great demand and needed by their respective industries?
  5. Public education (which essentially did not exist before the 19th century, certainly not for the masses) came about to meet the needs of industrialism and as such demanded left-brained, logical thinking skills rather than right-brained, creative skills (see Sir Ken Robinson’s TED talk on why schools kill creativity). As a result we have a system that rewards the former rather than the latter (as in “there’s no point in studying painting or music, you’ll never get a job in that”).

In an ideal world we would all be given the opportunity to learn and apply whatever skills we wanted (both at school and throughout life) and have that learning funded by the taxpayer on the basis that it benefits society as a whole. Unfortunately we don’t live in that ideal world and in fact are probably moving further from it than ever.

Back in the real world, therefore, industry must surely fund the acquisition of those skills. Unfortunately, in many companies education is the first thing to be cut when times are hard. The opposite should be the case. One of the best things I ever did was to spend five weeks (yes, that’s weeks not days), funded entirely by IBM, learning object-oriented programming and design. Whilst five weeks may seem like a long time for a course, I know it has paid for itself many, many times over through the work I have been able to do for IBM in the 15 years since attending it. Further, I suspect that five weeks of intensive learning was easily equivalent to at least a year’s worth of learning in an educational establishment.

Of course such skills are more vital to companies like Google, Microsoft and IBM than ever before. Steve Denning, in an article in Forbes this month called Why Big Companies Die, quotes from an article by Peggy Noonan in the Wall Street Journal (called A Caveman Won’t Beat a Salesman). Denning draws on a theory from Steve Jobs that big companies fail when salesmen and accountants who know nothing about the product or service the company makes, or how it works, are put in charge. Denning says:

The activities of these people [salesmen and accountants] further dispirit the creators, the product engineers and designers, and also crimp the firm’s ability to add value to its customers. But because the accountants appear to be adding to the firm’s short-term profitability, as a class they are also celebrated and well-rewarded, even as their activities systematically kill the firm’s future.

Steve Jobs showed that there was another way: namely, to keep playing offense and focus totally on adding value for customers by creating new and innovative products. By doing that you can make more money than the companies that are milking their cash cows and focused on making money rather than products.

Companies like Google and Microsoft (and IBM and Apple) need people fully trained in the three C’s (creativity, communication and collaboration) who can then apply these to whatever task is most relevant to the company’s bottom line. It’s the role of those companies, not government, to train people in the specifics.

Interestingly, Seymour Papert (who co-invented the Logo programming language) used programming as a tool to improve the way that children think and solve problems. Papert drew on Piaget’s work on cognitive development (which showed how children learn) and used Logo as a way of improving their creativity.
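
To see what that looks like in practice, here is a minimal sketch of the kind of exercise Papert had children work through, written here with Python’s standard turtle module (which borrows Logo’s turtle-graphics model rather than being Logo itself). The point is less the square than getting a child to break the drawing down into steps and notice the repetition:

    # Illustrative only: the classic "draw a square" exercise, done with
    # Python's turtle module rather than Logo itself.
    import turtle

    def square(t, side):
        # Four repetitions of the same two steps: move forward, then turn.
        for _ in range(4):
            t.forward(side)   # move forward by 'side' units
            t.right(90)       # turn 90 degrees clockwise

    t = turtle.Turtle()
    square(t, 100)            # draw a square with sides of 100 units
    turtle.done()             # keep the window open until it is closed

From there a child can change the angle or the repeat count and discover polygons and stars for themselves, which is exactly the sort of exploratory learning Papert described in Mindstorms.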

Finally, to see how students themselves view all this, see the article by Nikhil Goyal (a 16-year-old junior at Syosset High School in New York) who states: “for the 21st century American economy, all economic value will derive from entrepreneurship and innovation. Low-cost manufacturing will essentially be wiped out of this country and shipped to China, India, and other nations” and goes on to propose that “we institute a 21st century model of education, rooted in 21st century learning skills and creativity, imagination, discovery, and project-based learning”. Powerful stuff for one so young; there may yet be hope for us.

What Can Architects Learn from Steve Jobs?

I’ve just finished reading Steve Jobs by Walter Isaacson. In case there is anyone out there who doesn’t know it yet, this is the authorised biography that Jobs asked Isaacson to write, which was completed a few weeks before Jobs’ untimely death, aged 56, last month. Jobs insisted that Isaacson would have complete control over the contents of the book, saying he would not even read it before it was published and adding “I don’t have any skeletons in my closet that can’t be allowed out”.

Jobs was clearly a very complex personality: on the one hand a creative genius whose Zen-like focus on simplicity and efficiency helped create some of the most beautiful and useful gadgets of our time (some of which we never even knew we needed), whilst on the other he was a bully and a tyrant who knew exactly how to “size people up, understand their inner thoughts, and know how to relate to them, cajole them, or hurt them at will”. One of Jobs’ girlfriends, who later went on to found a mental health resource network in California, even went so far as to say that she thought Jobs suffered from Narcissistic Personality Disorder (NPD), in which the individual is described as being excessively preoccupied with issues of personal adequacy, power, prestige and vanity.

Whilst it is to be hoped that NPD is not a prerequisite for being a software architect, Jobs did have a vision and understanding of IT that we as architects can learn from. Six big ideas stand out in this respect:

  1. Engineering matters. When Jobs met with President Obama in 2011 he implored the President to reform the US education system and to create more engineering students. Jobs said “if you could educate these engineers we could move more manufacturing plants here”. Whilst there was always an uneasy tension between engineering and design at Apple, Jobs recognised and valued the importance of having an engineering-led rather than sales-led team at the top of the company, berating companies like Microsoft (under Ballmer), IBM (under Akers) and HP (under their last several CEOs) for putting salespeople in charge rather than engineers. For software architects, engineering clearly translates to being intimately knowledgeable about the technology you are using and knowing how to put the working parts together. The best architects I know are passionate about technology.
  2. Artistry and design matter just as much as engineering. This is a theme that Jobs emphasised over and over again, from when he dropped out of college and instead took a course on calligraphy to his sometimes maniacal focus on the smallest details of design to make the product as satisfying and aesthetically pleasing as possible. He even emphasised that circuit boards, which no one would ever see once the product was fully assembled, should be laid out in as clean and uncluttered a way as possible. It is this aspect of design that most matters for architects. Provided that functionally a system does what it is meant to do within the required constraints and system qualities, one could argue it does not matter how messily the software is assembled. Who’s going to see it anyway? This misses the point though. Great design, as opposed to just good enough design, means the system will be easier to maintain, take less effort to learn and generally be more enjoyable for those that need to carry on working on it once the architects and developers have moved on.
  3. Simple is better than complex. Apple had a design mantra: “Simplicity is the ultimate sophistication”, or as Jobs said, “very simple, and we’re really shooting for Museum of Modern Art quality”. Jobs felt that design simplicity should be linked to making products easy to use. So much of the software that we create today is far too complex and feature-rich and as a result is very hard to use. People will often say that it has to be like that because just look at all the features you are getting. Unfortunately, a lot of the time many of those features are not needed but add to the general bloat of the systems we build, making them hard to use as well as difficult to maintain. Sadly, building a complex system is often easier than building a simple one, and few architects see value in stripping out functionality rather than adding it.
  4. An unremitting focus on detail is key to creating a great product. Jobs was unique in that he was able to hold the big-picture view as well as zoom in to fine details. He would often sweat over the smallest detail until he was satisfied it was just right. This could be anything from the colour of a screw on the back plate of the iPod to the angle of the bevel on the iPad that makes someone want to pick it up. This capacity for holding the big-picture view whilst also being able to zoom right down and question low-level details is probably one of the hardest things architects have to do, but being able to do so gives a definite advantage and enables greater integrity as well as better execution of vision.
  5. Customers don’t always know what they want. In September 1982, when Jobs and his team were designing the original Macintosh, he held a retreat for the Mac team near Monterey where he gave a presentation on his thoughts for the Mac. At the end someone asked whether or not they should do some market research to find out what customers wanted. “No”, replied Jobs, “because people don’t know what they want until we’ve shown them”. He then pulled out a device the size of a desk diary and flipped it open; it turned out to be a mock-up of a computer that could fit into your lap, with a keyboard and screen hinged together like a notebook. “This is my dream of what we will be making in the mid- to late eighties”, Jobs said. Apple supposedly never did do any market research, preferring to follow the approach of Henry Ford, who said he never asked what people wanted because they would have just asked for a faster horse. Whilst it is probably the case that people can often see how to make incremental improvements to products, they usually cannot see how to make disruptive changes that introduce a whole new way of doing things, possibly making everything that went before it redundant. It is the job of the architect to show what is in the realms of the possible by creating new and innovative systems.
  6. Putting things together in new and creative ways is sometimes more important than inventing things. Jobs was not the first to market with an MP3 player, a mobile phone or a tablet computer. Others had already innovated and built these things. What Jobs and Apple did was to tweak things that already existed. As Isaacson says, “he had noticed something odd about the cell phones on the market: They all stank, just like portable music players used to”. Jobs applied his design skills to these and came up with a (far) better product and in fact a whole new platform as well (i.e. the computer as the digital hub). Architects too need to learn that it’s often putting together existing components in new and innovative ways that really counts and gives a competitive and business advantage.

You Can No Longer Call Yourself an Architect When…

Ten behaviours that might mean you can no longer call yourself an Architect…

  1. The majority of your output is created using Microsoft’s Office suite.
  2. Your days are spent fighting the system rather than creating a system.
  3. You’re fitting business to technology rather than the other way around.
  4. You think reuse is what you do with your shirt when you unexpectedly have to spend one extra day with the client (possibly wasting your time doing 2 above).
  5. The only stakeholders you deal with are your non-vegetarian colleagues during your evening meals at the Angus Steakhouse.
  6. You can no longer remember the difference between a For loop and a While loop.
  7. You think a view is something you don’t normally get from your room in the tourist class hotels your company puts you up in.
  8. Your definition of a non-functional requirement is a functional requirement that isn’t.
  9. You spend more time in the box than out of it.
  10. Add your favourite reason below.

This is probably my last post of the year and most certainly the last one before Christmas. Merry Christmas to everyone out there and, as it used to say on the front of The Beano at this time of year:

Happy New Year to All My Readers!