Why it’s different this time


John Templeton, the American-born British stock investor, once said: “The four most expensive words in the English language are, ‘This time it’s different.’”

Templeton was referring to people and institutions who had invested in the next ‘big thing’ believing that this time it was different, the bubble could not possibly burst and their investments were sure to be safe. But then, for whatever reason, the bubble did burst and fortunes were lost.

Take as an example the tech boom of the late 1980s and 1990s. Investors poured their money into previously unimagined technologies that seemed as though they could never fail. Then it all collapsed, and many fortunes were lost as the Nasdaq dropped 75 percent.

It seems to be an immutable law of economics that busts will follow booms as surely as night follows day. The trick, then, is to predict the boom and exit your investment at the right time – not too soon and not too late, to paraphrase Goldilocks.

Most recently, the phrase “this time it’s different” has been applied to the wave of AI technology hitting our shores, especially since the widespread release of the large language model (LLM) technology that underpins current AI tools like OpenAI’s ChatGPT, Google’s PaLM and Meta’s LLaMA.

Which brings me to the book The Coming Wave by Mustafa Suleyman.

Suleyman was the co-founder of DeepMind (now owned by Google) and is currently CEO of Inflection, an AI ‘studio’ that, according to its company blurb, is “creating a personal AI for everyone”.

The Coming Wave not only provides an overview of the capabilities of current AI systems but also contains a warning, which Suleyman refers to as the containment problem. If our future is to depend on AI technology (which it increasingly looks like it will, given that, according to Suleyman, LLMs are the “fastest-diffusing consumer models we have ever seen”), how do you make it a force for good rather than evil, when a bunch of ‘bad actors’ could imperil our very existence? In other words, how do you monitor, control and limit (or even prevent) this technology?

Suleyman’s central premise in this book is that the coming technological wave of AI is different from any that have gone before, for five reasons that make containment very difficult (if not impossible). In summary, these are:

  • Reason #1: Asymmetry – the potential imbalances or disparities caused by artificial intelligence systems being able to transfer extreme power from state to individual actors.
  • Reason #2: Exponentiality – the phenomenon where the capabilities of AI systems, such as processing power, data storage, or problem-solving ability, increase at an accelerating pace over time. This rapid growth is often driven by breakthroughs in algorithms, hardware, and the availability of large datasets.
  • Reason #3: Generality – the ability of an artificial intelligence system to apply its knowledge, skills, or capabilities across a wide range of tasks or domains.
  • Reason #4: Autonomy – the ability of an artificial intelligence system or agent to operate and make decisions independently, without direct human intervention.
  • Reason #5: Technological Hegemony – the malignant concentrations of power that inhibit innovation in the public interest, distort our information systems, and threaten our national security.

Suleyman’s book goes into each of these attributes in detail and I do not intend to repeat any of that here (buy the book or watch his explainer video). Suffice it to say, however, that collectively these attributes mean this technology is about to deliver us nothing less than a radical proliferation of power which, if unchecked, could lead to one of two possible (and equally undesirable) outcomes:

  1. A surveillance state (which China is currently building and exporting).
  2. An eventual catastrophe born of runaway development.

Other technologies have had one or maybe two of these capabilities, but I don’t believe any have had all five, certainly not at the level AI has. For example, electricity was a general-purpose technology with multiple applications, but even now individuals cannot easily build their own generators, and there is certainly no autonomy in power generation. The internet comes closest to having all five attributes, but it is not currently autonomous (though AI itself threatens to change that).

To be fair, Suleyman does not just present us with what, by any measure, is a truly wicked problem; he also offers a ten-point plan for how we might begin to address the containment problem and at least dilute the effects the coming wave might have. These measures stretch from built-in safeguards to prevent AI from acting autonomously in an uncontrolled fashion, through regulation by governments, right up to cultivating a culture around this technology that treats it with caution from the outset rather than adopting Mark Zuckerberg’s “move fast and break things” philosophy. Again, get the book to find out more about what these measures might involve.

My more immediate concerns are not based solely on the five features described in The Coming Wave but on a sixth feature I have observed, one which I believe is equally important and increasingly overlooked in our rush to embrace AI. This is:

  • Reason #6: Techno-paralysis – the state of being overwhelmed or paralysed by the rapid pace of technological change.

As with the impact of the five features of Suleyman’s coming wave, I see two equally undesirable outcomes of techno-paralysis:

  1. People become so overwhelmed and fearful because of their lack of understanding of these technological changes that they choose to withdraw from their use entirely. Maybe not just “dropping out” in an attempt to return to what they see as a better world, one where they had more control, but violently protesting and attacking the people and the organisations they see as being responsible for this “progress”. I’m talking about the Luddites here, but on a scale that can be achieved using the organisational capabilities of our hyper-connected world.
  2. Rather than fighting against techno-paralysis, we become irretrievably sucked into the systems that are creating and propagating these new technologies and, to coin a phrase, “drink the Kool-Aid”. The former Greek finance minister and maverick economist Yanis Varoufakis refers to these systems, and the companies behind them, as the technofeudalists. We have become subservient to these tech overlords (i.e. Amazon, Alphabet, Apple, Meta and Microsoft) by handing over our data to their cloud spaces. By spending all of our time scrolling and browsing digital media we are acting as ‘cloud-serfs’ – working as unpaid producers of data to disproportionately benefit these digital overlords.

There is a reason why the big-five tech overlords are spending hundreds of billions of dollars between them on AI research, LLM training and acquisitions. For each of them this is the next beachhead that must be conquered and occupied, the spoils of which will be huge for those who get there first. Not just in terms of potential revenue but also in terms of new cloud-serfs captured. We run the risk of AI becoming the new tool of choice for weaponising the cloud to capture ever larger portions of our time in servitude to these companies, which produce ever more ingenious ways of controlling our thoughts, actions and minds.

So how might we deal with this potentially undesirable outcome of the coming wave of AI? Surely it has to be through education? Not just of our children but of everyone who has a vested interest in a future where we control our AI and not the other way round.

Last November the UK government’s Department for Education (DfE) released the results from a Call for Evidence on the use of GenAI in education. The report highlighted the following benefits:

  • Freeing up teacher time (e.g. on administrative tasks) to focus on better student interaction.
  • Improving teaching and education materials to aid creativity by suggesting new ideas and approaches to teaching.
  • Helping with assessment and marking.
  • Adaptive teaching that analyses students’ performance and pace and tailors educational materials accordingly.
  • Better accessibility and inclusion, e.g. for SEND students, whose teaching materials could be more easily and quickly differentiated for their specific needs.

whilst also highlighting some potential risks including:

  • An over-reliance on AI tools (by students and staff) which would compromise their knowledge and skill development by encouraging them to passively consume information.
  • Tendency of GenAI tools to produce inaccurate, biased and harmful outputs.
  • Potential for plagiarism and damage to academic integrity.
  • Danger that AI will be used for the replacement or undermining of teachers.
  • Exacerbation of digital divides and problems of teaching AI literacy in such a fast changing field.

I believe that to address these concerns effectively, legislators should consider implementing the following seven point plan:

  1. Regulatory Framework: Establish a regulatory framework that outlines the ethical and responsible use of AI in education. This framework should address issues such as data privacy, algorithm transparency, and accountability for AI systems deployed in educational settings.
  2. Teacher Training and Support: Provide professional development opportunities and resources for educators to effectively integrate AI tools into their teaching practices. Emphasize the importance of maintaining a balance between AI-assisted instruction and traditional teaching methods to ensure active student engagement and critical thinking.
  3. Quality Assurance: Implement mechanisms for evaluating the accuracy, bias, and reliability of AI-generated content and assessments. Encourage the use of diverse datasets and algorithms to mitigate the risk of producing biased or harmful outputs.
  4. Promotion of AI Literacy: Integrate AI literacy education into the curriculum to equip students with the knowledge and skills needed to understand, evaluate, and interact with AI technologies responsibly. Foster a culture of critical thinking and digital citizenship to empower students to navigate the complexities of the digital world.
  5. Collaboration with Industry and Research: Foster collaboration between policymakers, educators, researchers, and industry stakeholders to promote innovation and address emerging challenges in AI education. Support initiatives that facilitate knowledge sharing, research partnerships, and technology development to advance the field of AI in education.
  6. Inclusive Access: Ensure equitable access to AI technologies and resources for all students, regardless of their gender, socioeconomic background or learning abilities. Invest in infrastructure and initiatives to bridge the digital divide and provide support for students with special educational needs and disabilities (SEND) to benefit from AI-enabled educational tools.
  7. Continuous Monitoring and Evaluation: Regularly monitor and evaluate the implementation of AI in education to identify potential risks, challenges, and opportunities for improvement. Collect feedback from stakeholders, including students, teachers, parents, and educational institutions, to inform evidence-based policymaking and decision-making processes.

The coming AI wave cannot be another technology that we let wash over and envelop us. Indeed Suleyman himself towards the end of his book makes the following observations…

Technologists cannot be distant, disconnected architects of the future, listening only to themselves.

Technologists must also be credible critics who…
…must be practitioners. Building the right technology, having the practical means to change its course, not just observing and commenting, but actively showing the way, making the change, effecting the necessary actions at source, means critics need to be involved.

If we are to avoid widespread techno-paralysis caused by this coming wave then we need a 21st-century education system that is capable of creating digital citizens who can live and work in this brave new world.

Enchanting Minds and Machines – Ada Lovelace, Mary Shelley and the Birth of Computing and Artificial Intelligence

Today (10th October 2023) is Ada Lovelace Day. In this blog post I discuss why Ada Lovelace (and indeed Mary Shelley, who was indirectly connected to Ada) is as relevant today as she was in her own time.

Villa Diodati, Switzerland

In the summer of 1816 [1], five young people holidaying at the Villa Diodati near Lake Geneva in Switzerland found their vacation rudely interrupted by a torrential downpour which trapped them indoors. Faced with the monotony of confinement, one member of the group proposed an ingenious idea to break the boredom: each of them should write a supernatural tale to captivate the others.

Among these five individuals were some notable figures of their time: Lord Byron, the celebrated English poet, and his friend and fellow poet, Percy Shelley. Alongside them were Shelley’s wife, Mary; her stepsister Claire Clairmont, who happened to be Byron’s mistress; and Byron’s physician, Dr. Polidori.

Lord Byron, burdened by the legal disputes surrounding his divorce and the financial arrangements for his newborn daughter, Ada, found it impossible to fully engage in the challenge (despite having suggested it). However, both Dr. Polidori and Mary Shelley embraced the task with fervour, creating stories that not only survived the holiday but continue to thrive today. Polidori’s tale would later appear as The Vampyre: A Tale, serving as the precursor to many of the modern vampire movies and TV programmes we know today. Mary Shelley’s story, which had come to her in a haunting nightmare that very night, gave birth to the core concept of Frankenstein, published in 1818 as Frankenstein: or, The Modern Prometheus. As Jeanette Winterson asserts in her book 12 Bytes [2], Frankenstein is not just a story about “the world’s most famous monster; it’s a message in a bottle.” We’ll see later why this message resounds even more today.

First though, we must shift our focus to another side of Lord Byron’s tumultuous life and his divorce settlement with his wife, Annabella Milbanke. In this settlement, Byron expressed his desire to shield his daughter from the allure of poetry – an inclination that suited Annabella perfectly, as one poet in the family was more than sufficient for her. Instead, young Ada received a mathematics tutor, whose duty extended beyond teaching mathematics and included eradicating any poetic inclinations Ada might have inherited. Could this be an early instance of the enforced segregation between the arts and STEM disciplines, I wonder?

Ada excelled in mathematics, and her exceptional abilities, combined with her family connections, earned her an invitation, at the age of 17, to a London soirée hosted by Charles Babbage, the Lucasian Professor of Mathematics at Cambridge. Within Babbage’s drawing room, Ada encountered a model of his “Difference Engine”, a contraption that so enraptured her that she spent the evening engrossed in conversation with Babbage about its intricacies. Babbage, in turn, was elated to have found someone who shared his enthusiasm for his machine and generously shared his plans with Ada. He later extended an invitation for her to collaborate with him on the successor to the machine, known as the “Analytical Engine”.

A Model of Charles Babbage’s Analytical Engine

This visionary contraption boasted the radical notion of programmability, utilising punched cards like those employed in weaving machines of that era. In 1842, Ada Lovelace (as she had become by then) was tasked with translating a French transcript of one of Babbage’s lectures into English. However, Ada went above and beyond mere translation, infusing the document with her own groundbreaking ideas about Babbage’s computing machine. These contributions proved to be more extensive and profound than the original transcript itself, solidifying Ada Lovelace’s place in history as a pioneer in the realm of computer science and mathematics.

In one of these notes, she wrote an ‘algorithm’ for the Analytical Engine to compute Bernoulli numbers – the first published algorithm (AKA computer program) ever! Although Babbage’s engine was too far ahead of its time and could not be built with the technology of the day, Ada is still credited as being the world’s first computer programmer. But there is another twist to this story that brings us closer to the present day.
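For a flavour of what her algorithm was doing, here is a short JavaScript sketch – a modern reconstruction of the mathematics only, not Lovelace’s actual program, which was expressed as a table of engine operations – using the standard recurrence for the Bernoulli numbers: B0 = 1 and, for m ≥ 1, Bm = −(1/(m+1)) × Σ C(m+1, j) Bj, with the sum running over j = 0 to m−1.

```javascript
// Binomial coefficient C(n, k), computed iteratively.
function binomial(n, k) {
  var result = 1;
  for (var i = 1; i <= k; i++) {
    result = result * (n - i + 1) / i;
  }
  return result;
}

// Returns the Bernoulli numbers B_0 .. B_n using the recurrence above.
function bernoulli(n) {
  var B = [1]; // B_0 = 1
  for (var m = 1; m <= n; m++) {
    var sum = 0;
    for (var j = 0; j < m; j++) {
      sum += binomial(m + 1, j) * B[j];
    }
    B.push(-sum / (m + 1));
  }
  return B;
}

console.log(bernoulli(8));
// ≈ [1, -0.5, 0.1667, 0, -0.0333, 0, 0.0238, 0, -0.0333]
```

Lovelace worked values like these out by hand, showing operation by operation how the engine would reproduce the calculation.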

Fast forward to the University of Manchester, 1950. Alan Turing, the now feted but ultimately doomed mathematician who led the team that cracked intercepted, coded messages sent by the German navy in WWII, has just published a paper called Computing Machinery and Intelligence [3]. This was one of the first papers ever written on artificial intelligence (AI) and it opens with the bold premise: “I propose to consider the question, ‘Can machines think?’”.

Alan Turing

Turing did indeed believe computers would one day (he thought in about 50 years’ time, in the year 2000) be able to think, and devised his famous “Turing Test” as a way of verifying his proposition. In his paper Turing also felt the need to “refute” arguments he thought might be made against his bold claim, including one made by none other than Ada Lovelace over one hundred years earlier. In the same notes where she wrote the world’s first computer algorithm, Lovelace also said:

It is desirable to guard against the possibility of exaggerated ideas that might arise as to the powers of the Analytical Engine. The Analytical Engine has no pretensions whatever to originate anything. It can do whatever we know how to order it to perform. It can follow analysis, but it has no power of anticipating any analytical relations or truths.

Although Lovelace might have been optimistic about the power of the Analytical Engine, should it ever be built, thinking creatively wasn’t one of the things she believed it would excel at.

Turing disputed Lovelace’s view on the grounds that she could have had no idea of the enormous speed and storage capacity of modern (remember, this was 1950) computers, which made them a match for the human brain and thus, like the brain, capable of processing their stored information to arrive at sometimes “surprising” conclusions. To quote Turing directly from his paper:

It is a line of argument we must consider closed, but it is perhaps worth remarking that the appreciation of something as surprising requires as much of a ‘creative mental act’ whether the surprising event originates from a man, a book, a machine or anything else.

Which brings us bang up to date with the current arguments raging about whether systems like ChatGPT, DALL-E or Midjourney are creative, or even sentient in some way. Has Turing’s prophecy finally been fulfilled, or was Ada Lovelace right all along: computers can never be truly creative, because creativity requires not just a reconfiguration of what someone else has made but original thought based on actual human experience?

One undeniable truth prevails in this narrative: Ada was good at working with what she didn’t have. Not only was Babbage unable to build his machine, meaning Lovelace never had one to play with, she also didn’t have male privilege or a formal education – a scarce commodity for women and a stark reminder of the limitations imposed on her gender during that time.

Have things moved on today for women and young girls? A glimpse into the typical composition of a computer science classroom, be it at the secondary or tertiary level, might beg the question: Have we truly evolved beyond the constraints of the past? And if not, why does this gender imbalance persist?

Over the past five or more years there have been many studies and reports published on the problem of too few women entering STEM careers, and we seem to be gradually homing in on not just what the core issues are but also how to address them. What seems to be lacking is the will, or the funding (or both), to make it happen.

So, what to do? First, some facts:

  1. Girls lose interest in STEM as they get older. A report from Microsoft back in 2018 found that confidence in coding wanes as girls get older, highlighting the need to connect STEM subjects to real-world people and problems by tapping into girls’ desire to be creative [4].
  2. Girls and young women do not associate STEM jobs with being creative. Most girls and young women describe themselves as being creative and want to pursue a career that helps the world. They do not associate STEM jobs with doing either of these things [4].
  3. Female students rarely consider a career in technology as their first choice. Only 27% of female students say they would consider a career in technology, compared to 61% of males, and only 3% say it is their first choice [5].
  4. Most students (male and female) can’t name a famous female working in technology. A lack of female role models is also reinforcing the perception that a technology career isn’t for them. Only 22% of students can name a famous female working in technology. Whereas two thirds can name a famous man [5].
  5. Female pupils feel STEM subjects, though highly paid, are not ‘for them’. Female Key Stage 4 pupils perceived that studying STEM subjects was potentially a more lucrative choice in terms of employment. However, when compared to male pupils, they enjoyed other subjects (e.g., arts and English) more [6].

The solutions to these issues are now well understood:

  1. Increasing the number of STEM mentors and role models – including parents – to help build young girls’ confidence that they can succeed in STEM. Girls who are encouraged by their parents are twice as likely to stay in STEM, and in some areas like computer science, dads can have a greater influence on their daughters than mums yet are less likely than mothers to talk to their daughters about STEM.
  2. Creating inclusive classrooms and workplaces that value female opinions. It’s important to celebrate the stories of women who are in STEM right now, today.
  3. Providing teachers with more engaging and relatable STEM curriculum, such as 3D and hands-on projects, the kinds of activities that have proven to help keep girls’ interest in STEM over the long haul.
  4. Multiple interventions, starting early and carrying on throughout school, are important ways of ensuring girls stay connected to STEM subjects. Interventions are ideally done by external people working in STEM who can repeatedly reinforce key messages about the benefits of working in this area. These people should also be able to explain the importance of creativity and how working in STEM can change the world for the better [7].
  5. Schoolchildren (all genders) should be taught to understand how thinking works, from neuroscience to cultural conditioning; how to observe and interrogate their thought processes; and how and why they might become vulnerable to disinformation and exploitation. Self-awareness could turn out to be the most important topic of all [8].

Before we finish, let’s return to that “message in a bottle” that Mary Shelley sent out to the world over two hundred years ago. As Jeanette Winterson points out:

Mary Shelley may be closer to the world that is to become than either Ada Lovelace or Alan Turing. A new kind of life form may not need to be human-like at all and that’s something that is achingly, heartbreakingly, clear in ‘Frankenstein’. The monster was originally designed to be like us. He isn’t and can’t be. Is that the message we need to hear? [2]

If we are to heed Shelley’s message from the past, the rapidly evolving nature of AI means we need people from as diverse a set of backgrounds as possible. These should include people who can bring constructive criticism to the way technology is developed and who have a deeper understanding of what people really need rather than what they think they want from their tech. Women must become essential players in this. Not just in developing, but also guiding and critiquing the adoption and use of this technology. As Mustafa Suleyman (co-founder of DeepMind) says in his book The Coming Wave [10]:

Credible critics must be practitioners. Building the right technology, having the practical means to change its course, not just observing and commenting, but actively showing the way, making the change, effecting the necessary actions at source, means critics need to be involved.

As we move away from the mathematical nature of computing and programming to one driven by so-called descriptive programming [9], it is going to be important that we include those who are not technical but are creative, empathetic to people’s needs and maybe even understand the limits we should place on technology. The four C’s (creativity, critical thinking, collaboration and communication) are skills we all need to be adopting, and ones which women in particular seem to excel at.

On this, Ada Lovelace Day 2023, we should not just celebrate Ada’s achievements all those years ago but also recognise how Ada ignored and fought back against the prejudices and severe restrictions on education that women like her faced. Ada pushed ahead regardless and became a true pioneer and founder of a whole industry that did not really get going until over 100 years after her pioneering work. Ada, the world’s first computer programmer, should be the role model par excellence that all girls and young women look to for inspiration, not just today but for years to come.

References

  1. Mary Shelley, Frankenstein and the Villa Diodati, https://www.bl.uk/romantics-and-victorians/articles/mary-shelley-frankenstein-and-the-villa-diodati
  2. 12 Bytes – How artificial intelligence will change the way we live and love, Jeanette Winterson, Vintage, 2022.
  3. Computing Machinery and Intelligence, A. M. Turing, Mind, Vol. 59, No. 236. (October 1950), https://www.cs.mcgill.ca/~dprecup/courses/AI/Materials/turing1950.pdf
  4. Why do girls lose interest in STEM? New research has some answers — and what we can do about it, Microsoft, 13th March 2018, https://news.microsoft.com/features/why-do-girls-lose-interest-in-stem-new-research-has-some-answers-and-what-we-can-do-about-it/
  5. Women in Tech – Time to close the gender gap, PwC, https://www.pwc.co.uk/who-we-are/her-tech-talent/time-to-close-the-gender-gap.html
  6. Attitudes towards STEM subjects by gender at KS4, Department for Education, February 2019, https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/913311/Attitudes_towards_STEM_subjects_by_gender_at_KS4.pdf
  7. Applying Behavioural Insights to increase female students’ uptake of STEM subjects at A Level, Department for Education, November 2020, https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/938848/Applying_Behavioural_Insights_to_increase_female_students__uptake_of_STEM_subjects_at_A_Level.pdf
  8. How we can teach children so they survive AI – and cope with whatever comes next, George Monbiot, The Guardian, 8th July 2023, https://www.theguardian.com/commentisfree/2023/jul/08/teach-children-survive-ai
  9. Prompt Engineering, Microsoft, 23rd May 2023, https://learn.microsoft.com/en-us/semantic-kernel/prompt-engineering/
  10. The Coming Wave, Mustafa Suleyman, The Bodley Head, 2023.

Tech skills are not the only type of skill you’ll need in 2021


Whilst good technical skills continue to be important, these alone will not be enough to enable you to succeed in the modern, post-pandemic workplace. At Digital Innovators, where I am Design and Technology Director, we believe that skills with a human element are equally, if not more, important if you are to survive in the changed working environment of the 2020s. That’s why, if you attend one of our programmes during 2021, you’ll also learn these and other people-focused, transferable skills.

1. Adaptability

The COVID-19 pandemic has changed the world of work, not just in the tech industry but across other sectors as well. Those organisations most able to thrive during the crisis were ones that were able to adapt quickly to new ways of working, whether that is full-time office work in a new, socially distanced way, a combination of both office and remote working, or a completely remote environment. People have had to adapt to these ways of working whilst continuing to be productive in their roles. This has meant adopting different work patterns, learning to communicate in new ways and dealing with a changed environment where work, home (and, for many, school) have all merged into one. Having the ability to adapt to these new challenges is a skill which will be more important than ever as we embrace a post-pandemic world.

Adaptability also applies to learning new skills. Technology has undergone exponential growth in just the last 20 years (there were no smartphones in 2000) and has been adopted in new and transformative ways by nearly all industries. To keep up with such a rapidly changing world you need to be continuously learning new skills to stay current with industry trends.

2. Collaboration and Teamwork

Whilst there are still opportunities for the lone maverick, working away in his or her bedroom or garage, to come up with new and transformative ideas, for most of us, working together in teams and collaborating on ideas and new approaches is the way we work best.

In his book Homo Deus – A Brief History of Tomorrow, Yuval Noah Harari makes the observation: “To the best of our knowledge, only Sapiens can collaborate in very flexible ways with countless numbers of strangers. This concrete capability – rather than an eternal soul or some unique kind of consciousness – explains our mastery over planet Earth.”

On our programme we encourage, indeed require, our students to collaborate from the outset. We give them tasks to do (like drawing how to make toast!) early on, then build on these, leading up to a major 8-week project where students work in teams of four or five to define a solution to a challenge set by one of our industry partners. Students tell us this is one of their favourite aspects of the programme as it allows them to work with new people from a diverse range of backgrounds to come up with new and innovative solutions to problems.

3. Communication

Effective communication skills, whether written, spoken or aural, as well as the ability to present ideas well, have always been important. In a world where we are increasingly communicating through a vast array of different channels, we need to adapt our core communication skills to thrive in a virtual as well as an offline environment.

Digital Innovators teach their students how to communicate effectively using a range of techniques including a full-day, deep dive into how to create presentations that tell stories and really enable you to get across your ideas.

4. Creativity

Pablo Picasso famously said “Every child is an artist; the problem is staying an artist when you grow up”.

As Hugh MacLeod, author of Ignore Everybody, And 39 Other Keys to Creativity says: “Everyone is born creative; everyone is given a box of crayons in kindergarten. Then when you hit puberty they take the crayons away and replace them with dry, uninspiring books on algebra, history, etc. Being suddenly hit years later with the ‘creative bug’ is just a wee voice telling you, ‘I’d like my crayons back please.’”

At Digital Innovators we don’t believe that it’s only artists who are creative. We believe that everyone can be creative in their own way, they just need to learn how to let go, be a child again and unlock their inner creativity. That’s why on our skills programme we give you the chance to have your crayons back.

5. Design Thinking

Design thinking is an approach to problem solving that puts users at the centre of the solution. It includes proven practices such as building empathy, ideation, storyboarding and extreme prototyping to create new products, processes and systems that really work for the people that have to live with and use them.

For Digital Innovators, Design Thinking is at the core of what we do. As well as spending a day-and-a-half teaching the various techniques (which our students learn by doing), we use Design Thinking at the beginning of, and throughout, our 8-week projects to ensure the students deliver solutions that are really what our employers want.

6. Ethics

The ethical aspects of the use of digital technology in today’s world are something sadly missing from most digital technology courses. We may well churn out tens of thousands of developers a year from UK universities alone, but how many of these people ever give anything more than a passing thought to the ethics of the work they end up doing? Is it right, for example, to build systems of mass surveillance and collect data about citizens that most have no clue about? Having some kind of ethical framework within which we operate is more important today than ever before.

That’s why we include a module on Digital Ethics as part of our programme. In it we introduce a number of real-world, as well as hypothetical case studies that challenge students to think about the various ethical aspects of the technology they already use or are likely to encounter in the not too distant future.

7. Negotiation

Negotiation is a combination of persuasion, influencing and confidence as well as being able to empathise with the person you are negotiating with and understanding their perspective. Being able to negotiate, whether it be to get a pay rise, buy a car or sell the product or service your company makes is one of the key skills you will need in your life and career, but one that is rarely taught in school or even at university.

As Katherine Knapke, the Communications & Operations Manager at the American Negotiation Institute says: “Lacking in confidence can have a huge impact on your negotiation outcomes. It can impact your likelihood of getting what you want and getting the best possible outcomes for both parties involved. Those who show a lack of confidence are more likely to give in or cave too quickly during a negotiation, pursue a less-aggressive ask, and miss out on opportunities by not asking in the first place”. 

On the Digital Innovators skills programme you will work with a skilled negotiator from The Negotiation Club to practise and hone your negotiation skills in a fun but safe environment which allows you to learn from your mistakes and improve.

What Makes a Tech City? (Hint: It’s Not the Tech)

Matthew Boulton, James Watt and William Murdoch

The above photograph is of a statue in Centenary Square, Birmingham in the UK. The three figures in it – Matthew Boulton, James Watt and William Murdoch – were the tech pioneers of their day, living in and around Birmingham and associated with a loosely knit group who referred to themselves as The Lunar Society. The history of the Lunar Society and the people involved has been captured in the book The Lunar Men by Jenny Uglow.

“Amid fields and hills, the Lunar men build factories, plan canals, make steam-engines thunder. They discover new gases, new minerals and new medicines and propose unsettling new ideas. They create objects of beauty and poetry of bizarre allure. They sail on the crest of the new. Yet their powerhouse of invention is not made up of aristocrats or statesmen or scholars but of provincial manufacturers, professional men and gifted amateurs – friends who meet almost by accident and whose lives overlap until they die.”

From The Lunar Men by Jenny Uglow

You don’t have to live in the UK to have heard that Birmingham, like many of the other great manufacturing cities of the Midlands and Northern England, has somewhat lost its way over the century or so since the Lunar Men were creating their “objects of beauty and poetry of bizarre allure”. It’s now sometimes hard to believe that these great cities were the powerhouses and engines of the industrial revolution that changed not just England but the whole world. This is something that was neatly summed up by Steven Knight, creator of the BBC television programme Peaky Blinders, set in the lawless backstreets of Birmingham in the 1920s. In a recent interview in the Guardian, Knight says:

“It’s typical of Brum that the modern world was invented in Handsworth and nobody knows about it. I am trying to start a “Make it in Birmingham” campaign, to get high-tech industries – film, animation, virtual reality, gaming – all into one place, a place where people make things, which is what Birmingham has always been.”

Likewise Andy Street, Managing Director of John Lewis and Chair of the Greater Birmingham & Solihull Local Enterprise Partnership, had this to say about Birmingham in his University of Birmingham Business School Advisory Board guest lecture last year:

“Birmingham was once a world leader due to our innovations in manufacturing, and the city is finally experiencing a renaissance. Our ambition is to be one of the biggest, most successful cities in the world once more.”

Andy Street CBE – MD of John Lewis

If Birmingham, and cities like it not just in England but around the world, are to become engines of innovation once again then they need to make a step change in how they go about doing that. The lesson to be learned from the Lunar Men is that they did not wait for grants from central Government or the European Union, or for some huge corporation to move in and take things in hand; they drove innovation from their own passion and inquisitiveness about how the world worked, or could work. They basically got together, decided what needed to be done and got on with it. They literally designed and built the infrastructure that was to form the foundations of innovation for the next 100 years.

Today we talk of digital innovation and how the industries of our era are disrupting traditional ones (many of them formed by the Lunar Men and their descendants) for better and for worse. Now every city wants a piece of that action and wants to emulate the shining light of digital innovation and disruption, Silicon Valley in California. Is that possible? According to the Medium post To Invent the Future, You Must Understand the Past, the answer is no. The post concludes by saying:

“…no one will succeed because no place else — including Silicon Valley itself in its 2015 incarnation — could ever reproduce the unique concoction of academic research, technology, countercultural ideals and a California-specific type of Gold Rush reputation that attracts people with a high tolerance for risk and very little to lose.”

So can this really be true? High tolerance to risk (and failure) is certainly one of the traits that makes for a creative society. No amount of tax breaks or university research programmes is going to fix that problem. Taking the example of the Lunar Men though, one thing that cities can do to disrupt themselves from within is to effect change from the bottom up rather than the top down. Cities are made up of citizens after all and they are the very people that not only know what needs changing but also are best placed to bring about that change.


With this in mind, an organisation in Birmingham called Silicon Canal (see here if you want to know where that name comes from), of which I am a part, has created a white paper putting forward our ideas on how to build a tech and digital ecosystem in and around Birmingham. You can download a copy of the white paper here.

The paper not only identifies the problem areas but also shows how things can be improved, suggesting potential solutions to grow the tech ecosystem in the Greater Birmingham area so that it competes on an international stage. Download the white paper and read it; if you are based in Birmingham, join in the conversation, and if you’re not, use the research contained within it to look at your own city and how you can help change it for the better.

This paper was launched at an event this week in the new iCentrum building at Innovation Birmingham which is a great space that is starting to address one of the issues highlighted in the white paper, namely to bring together two key elements of a successful tech ecosystem, established companies and entrepreneurs.

Another event that is taking place in Birmingham next month is TEDx Brum – The Power of US which promises to have lots of inspiring talks by local people who are already effecting change from within.

As a final comment if you’re still not sure that you have the power to make changes that make a difference here are some words from the late Steve Jobs:

“Everything around you that you call life was made up by people that were no smarter than you and you can change it, you can influence it, you can build your own things that other people can use.”

Steve Jobs

Getting Started with Blockchain

In an earlier post I discussed the UK government report on distributed ledger technology (AKA ‘blockchain‘) and how the government’s Chief Scientific Advisor, Sir Mark Walport, was doing the rounds advocating the use of blockchain for a variety of (government) services.

Blockchain is a shared, trusted, public ledger that everyone can inspect, but which no single user controls. The participants in a blockchain system collectively keep the ledger up to date: it can be amended only according to strict rules and by general agreement. For a quick introduction to blockchain, this article in the Economist is a pretty good place to start.
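To make that “strict rules” idea concrete, here is a minimal sketch in JavaScript (my own illustration, not any particular product’s implementation) of the property that makes a blockchain tamper-evident: each block stores a hash of its predecessor, so altering any historical entry invalidates every block that follows. Real blockchains add distributed consensus on top of this chaining.

```javascript
var crypto = require('crypto');

// Hash a block's contents, including the hash of the previous block.
function hashBlock(block) {
  return crypto.createHash('sha256')
    .update(block.index + block.previousHash + JSON.stringify(block.data))
    .digest('hex');
}

// Append a new block whose 'previousHash' links it to the chain so far.
function createBlock(chain, data) {
  var previous = chain[chain.length - 1];
  var block = {
    index: chain.length,
    previousHash: previous ? previous.hash : '0',
    data: data
  };
  block.hash = hashBlock(block);
  chain.push(block);
  return block;
}

// Build a tiny ledger of transactions.
var ledger = [];
createBlock(ledger, { from: 'alice', to: 'bob', amount: 10 });
createBlock(ledger, { from: 'bob', to: 'carol', amount: 5 });

// Verify the ledger: recompute every hash and check each link.
var valid = ledger.every(function (block, i) {
  return block.hash === hashBlock(block) &&
    (i === 0 || block.previousHash === ledger[i - 1].hash);
});
console.log('Ledger valid?', valid); // true, until someone edits a block
```

Change the amount in the first block and the check at the end reports the ledger as invalid; in a real blockchain it is the participants, following the agreed rules, who decide which new blocks get added.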

Blockchains are going to be useful wherever there is a need for a trustworthy record, something which is pretty vital for transactions of all sorts, whether in banking, for legal documents or for registries of things like land or high-value artworks. Startups such as Stampery are looking to use blockchain technology to provide low-cost certification services. Blockchain is not just for pure startups, however. Twenty-five banks are part of the blockchain company R3 CEV, which aims to develop common standards around this technology. R3 CEV’s Head of Technology is Richard Gendal Brown, an ex-colleague of mine from IBM.

IBM recently announced that, together with Intel, J.P. Morgan and several large banks, it was joining forces to create the Open Ledger Project with the Linux Foundation, with the goal of re-imagining supply chains, contracts and other ways information about ownership and value are exchanged in a digital economy.

As part of this IBM is creating some great tools, using its Bluemix platform, to get developers up and running on the use of blockchain technology. If you have a Bluemix account you can quickly deploy some applications and study the source code on GitHub to see how to start making use of blockchain APIs.

This service is intended for developers who consider themselves early adopters and want to get involved with IBM’s approach to business networks that maintain, secure and share a replicated ledger using blockchain technology. It shows how you can:

  • Deploy and invoke simple transactions to test out IBM’s approach to blockchain technology.
  • Learn and test out IBM’s novel contributions to the blockchain open source community, including the concept of confidential transactions, containerized code execution etc.

It provides some simple demo applications you can quickly deploy into Bluemix to play around with this technology.

Marbles Running in IBM Bluemix

This service is not production ready. It is pre-alpha and intended for testing and experimentation only. There are additional security measures that still must be implemented before the service can be used to store any confidential data. That said, it’s still a great way to learn about the use of, and potential for, this technology.


Hello, World (from IBM Bluemix)

“The only way to learn a new programming language is by writing programs in it. The first program to write is the same for all languages: Print the words ‘hello, world’.”

So started the introduction to the book The C Programming Language by Brian Kernighan and Dennis Ritchie back in 1978. Since then many a programmer learning a new language has heeded those words of wisdom by writing their first program to put up those immortal words on their computer screens. Even the White House is now in on the game.

You can find a list of how to write “hello, world” in pretty much any language you have ever heard of (as well as some you probably haven’t) here. The idea of writing such a simple program is not so much that it will teach you anything about the language syntax, but that it will teach you how to get to grips with the environment that the code (whether compiled or interpreted) runs in. Back in 1978, when C ran under Unix on hardware like Digital Equipment Corporation’s PDP-11, the environment was a relatively simple affair consisting of a processor, some storage and a rudimentary cathode ray tube (CRT) terminal. Then the ‘environment’ amounted to locating the compiler, making sure the right library was provided to the program and figuring out the options for running the compiler and the binary files it output. Today things are a bit more complicated, which is why the basic premise of getting the most simple program possible working (i.e. writing ‘hello, world’ to a screen) is still very relevant as a way of learning the environment.

All of this is by way of an introduction to how to get ‘hello, world’ to work in the IBM Bluemix Platform as a Service (PaaS) environment. In case you haven’t heard, IBM Bluemix is a platform, based on the open-source Cloud Foundry project, that provides developers with a complete set of DevOps tools to develop, deploy and maintain web and mobile applications in the cloud with minimal hassle. Bluemix-hosted applications have access to the capabilities of the underlying cloud infrastructure to support the type of non-functional requirements (performance, availability, security etc.) that are needed by enterprise applications. Bluemix also provides a rich set of services to extend your applications with capabilities like analytics, social, Internet of Things and even IBM Watson cognitive services. The Bluemix platform frees developers and organisations from worrying about infrastructure-related plumbing details so they can focus on what matters to their organisations – business scenarios that drive better value for their customers.

IBM Bluemix

Because Bluemix supports a whole range of programming languages and services, the options for creating ‘hello, world’ are many and varied. Here though are the basic instructions for creating this simplest of programs in JavaScript using the Node.js runtime. Follow these steps to get up and running on Bluemix.

Step 1: Sign Up for a Free Bluemix Trial

You can sign up for a free Bluemix trial (and get an IBM ID if you don’t have one) here. You’ll need to do this before you do anything else. The remainder of this tutorial assumes you have Bluemix running and you are logged into your account.

Step 2: Download the Cloud Foundry Command Line Interface

You can write code and get it up and running in numerous ways in Bluemix including within Bluemix itself, using Eclipse tools or with the Cloud Foundry command line interface (CLI). As this example uses the latter you’ll need to ensure you have the CLI downloaded on your computer. To do that follow the instructions here.

Step 3: Download the Example Code

You can download the code for this example from my GitHub here. Thanks to Carl Osipov over at Clouds with Carl for this code. Once you have downloaded the zip file, unpack it into a convenient folder. You will see there are three files (plus a readme).

  • main.js – the JavaScript source code. The code returns a ‘hello, world’ message to any HTTP request sent to the web server running the code (see the sketch below).
  • package.json – which tells Bluemix it needs a Node.js runtime.
  • manifest.yml – this file is used when you deploy your code to Bluemix using the command line interface. It contains the values that you would otherwise have to type on the command line when you ‘push’ your code to Bluemix. I suggest you edit this and change the ‘host’ parameter to something unique to you (e.g. change my name to yours).
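For a flavour of what you have just downloaded, the heart of the example looks something like this (a minimal sketch of a Bluemix-era Node.js app; the actual code is in the GitHub repo linked above):

```javascript
// main.js (sketch): a bare Node.js HTTP server that answers every request
// with 'hello, world'. Bluemix supplies the port to listen on via an
// environment variable; fall back to 3000 when running locally.
var http = require('http');

var port = process.env.VCAP_APP_PORT || 3000;

http.createServer(function (request, response) {
  // Every request, whatever the path, gets the same plain-text reply.
  response.writeHead(200, { 'Content-Type': 'text/plain' });
  response.end('hello, world\n');
}).listen(port);

console.log('hello-world app listening on port ' + port);
```

You can try it out locally with node main.js and a browser pointed at http://localhost:3000 before pushing anything to Bluemix.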

Step 4: Deploy and Run the Code

Because all your code and the instructions for deploying it are contained in the three files just downloaded deploying into Bluemix is simplicity itself. Do the following:

  1. Open a Command Prompt (or terminal) window.
  2. Change to the directory that you unpacked the source code into by typing: cd your_directory.
  3. Connect to Bluemix by typing: cf api https://api.ng.bluemix.net.
  4. Login to Bluemix with your IBM ID credentials by typing: cf login -u user-id -p password -s dev. Here dev is the Bluemix space you want to use (‘dev’ by default).
  5. Deploy your app to Bluemix by typing: cf push.

That’s it! It will take a while to upload, install and start the code and you will receive a notification when it’s done. Once you get that response back on the command line you can switch to your Bluemix console, where you should see this.

IBM Bluemix Dashboard

To show the program is working you can either click on the ‘Open URL’ widget (the square with the right-pointing arrow in the hello-world-node-js application) or type the URL ‘hello-world-node-js-your-name.mybluemix.net’ into a browser window (your-name is whatever you set ‘host’ to in the manifest file). The words ‘hello, world’ will magically appear in the browser. Congratulations, you have written and deployed your first Bluemix app. Pour yourself a fresh cup of coffee and bask in your new-found glory.

If you live in the UK and would like to learn more about the IBM Bluemix innovation platform then sign up for this free event in London at the Rainmaking Loft on Thursday 25th June 2015 here.

Is the Raspberry Pi the New BBC Microcomputer?

There has been much discussion here in the UK over the last couple of years about the state of tech education and what should be done about it. The concern being that our schools are not doing enough to create the tech leaders and entrepreneurs of the future.

The current discussion kicked off in January 2011 when Microsoft’s director of education, Steve Beswick, claimed that in UK schools there is much “untapped potential” in how teenagers use technology. Beswick said that a Microsoft survey had found that 71% of teenagers believed they learned more about information technology outside of school than in formal information and communication technology (ICT) lessons. An interesting observation, given that one of the criticisms often levelled at these ICT classes is that they just teach kids how to use Microsoft Office.

The discussion moved on in August 2011, this time at the Edinburgh International Television Festival, where Google chairman Eric Schmidt said he thought education in Britain was holding back the country’s chances of success in the digital media economy. Schmidt said he was flabbergasted to learn that computer science was not taught as standard in UK schools, despite what he called the “fabulous initiative” in the 1980s when the BBC not only broadcast programmes for children about coding, but shipped over a million BBC Micro computers into schools and homes.

January 2012 saw even the schools minister, Michael Gove, say that the ICT curriculum was “a mess” and must be radically revamped to prepare pupils for the future (Gove suspended the ICT curriculum in September 2012). All well and good, but as some have commented, “not everybody is going to need to learn to code, but everyone does need office skills”.

In May 2012 Schmidt was back in the UK again, this time at London’s Science Museum where he announced that Google would provide the funds to support Teach First – a charity which puts graduates on a six-week training programme before deploying them to schools where they teach classes over a two-year period.

So, what now? With the new ICT curriculum not due out until 2014, what are the kids who are about to start their GCSEs to do? Does it matter that they won’t be able to learn ICT at school? The Guardian’s John Naughton proposed a manifesto for teaching computer science in March 2012 as part of his paper’s digital literacy campaign. As I’ve questioned before, should it be the role of schools to teach the very specific programming skills being proposed – skills that might be out of date by the time the kids learning them enter the workforce? Clearly something needs to be done otherwise, as my colleague Dr Rick Robinson says, where will the next generation of technology millionaires come from?

Whatever shape the new curriculum takes, one example (one that Eric Schmidt himself used) of a success story in the learning of IT skills is that of the now almost legendary BBC Microcomputer, a project started 30 years ago this year. For those too young to remember, or who were not around in the UK at the time, the BBC Microcomputer got its name from a project devised by the BBC to enhance the nation’s computer literacy. The BBC wanted a machine around which they could base a series called The Computer Programme, showing how computers could be used, not just for computer programming but also for graphics, sound and vision, artificial intelligence and controlling peripheral devices. To support the series the BBC drew up a spec for a computer that could be bought by people watching the programme to actually put into practice what they were watching. The machine was built by Acorn, the spec of which you can read here.

The BBC Micro was not only a great success in terms of the television programme, it also helped spur on a whole generation of programmers. On turning the computer on you were faced with the screen on the right. The computer would not do anything unless you fed it instructions using the BASIC programming language, so you were pretty much forced to learn programming! I can vouch for this personally: although I had just entered the IT profession at the time, this was in the days of million-pound mainframes hidden away in backrooms, guarded jealously by teams of computer operators who only gave access via time-sharing for minutes at a time. Having your own computer which you could tap away on and get instant results was, for me, a revelation.

Happily it looks like the current gap in the IT curriculum may be about to be filled by the humble Raspberry Pi computer. The idea behind the Raspberry Pi came from a group of computer scientists at the University of Cambridge’s Computer Laboratory back in 2006. As Eben Upton, founder and trustee of the Raspberry Pi Foundation, said:

Something had changed the way kids were interacting with computers. A number of problems were identified: the colonisation of the ICT curriculum with lessons on using Word and Excel, or writing webpages; the end of the dot-com boom; and the rise of the home PC and games console to replace the Amigas, BBC Micros, Spectrum ZX and Commodore 64 machines that people of an earlier generation learned to program on.

Out of this concern at the lack of programming and computer skills in today’s youngsters was born the Raspberry Pi computer, which began shipping in February 2012. Whilst the on-board processor and peripheral controllers on this credit-card-sized, $25 device are orders of magnitude more powerful than anything the BBC Micro or Commodore 64 had, in other ways this computer is even more basic than any of those machines. It comes with no power supply, screen, keyboard, mouse or even operating system (Linux can be installed via an SD card). There is quite a learning curve just to get up and running, although what the Raspberry Pi has going for it that the BBC Micro did not is the web and its already large number of help pages, ideas for projects and even the odd Raspberry Pi Jam (get it?). Hopefully this means these ingenious devices will not become just another piece of computer kit lying around in our school classrooms.

The Computer Literacy Project (CLP) which was behind the idea of the original BBC Micro and “had the grand ambition to change the culture of computing in Britain’s homes” produced a report in May of this year called The Legacy of the BBC Micro which, amongst other things, explores whether the CLP had any lasting legacy on the culture of computing in Britain. The full report can be downloaded here. One of the recommendations from the report is that “kit, clubs and formal learning need to be augmented by support for individual learners; they may be the entrepreneurs of the future“. 30 years ago this support was provided by the BBC as well as schools. Whether the same could be done today in schools that seem to be largely results driven and a BBC that seems to be imploding in on itself is difficult to tell.

And so to the point of this post: can the Raspberry Pi do for the next generation what the BBC Micro did for the last, spurring on programmers who spread their wings and went on to create the tech boom (and, let’s not forget, the odd bust) of the past 30 years? More to the point, is that what the world needs right now? Computers are getting far smarter “out of the box”. IBM’s recent announcements of its PureSystems brand promise a “smarter approach to IT” in terms of installation, deployment, development and operations. Who knows what stage so-called expert integrated systems will be at by the time today’s students begin to hit the workforce in 5–10 years’ time? Does the Raspberry Pi have a place in this world? A world where many, if not most, programming jobs continue to be shipped to low-cost regions: currently the BRIC and MIST countries and soon, I am sure, the largely untapped African continent.

I believe that the fact that the Raspberry Pi is a computer, and that yes, with a bit of effort, you can program it, is to some extent an irrelevance. What’s important is that the Raspberry Pi ignites an interest in a new generation of kids that gets them away from merely consuming computing (playing games, reading Facebook entries, browsing the web and so on) to actually creating something instead. It’s this creative spark that is needed now and as we move forward; no matter what computing platforms we have in 5, 10 or 50 years’ time, we will always need creative thinkers to solve the world’s really difficult business and technical problems.

And by the way my Raspberry Pi is on order.

Why We Need STEM++ Graduates

The need for more STEM (that’s Science, Technology, Engineering and Maths) skills seems to be on the agenda more and more these days. There is a strong feeling that the so-called developed nations have depended too much on financial and other services to grow their economies and as a result have “lost” their ability to design, develop and manufacture goods, largely because we are not producing enough STEM graduates to do this. Whilst I would see software as falling fairly and squarely into the STEM skillset (even if it is also used to underpin nearly all of the modern financial services industry), as this blog post by Jessica Benjamin from IBM points out, STEM skills alone won’t solve the really hard problems that are out there. With respect to the particular problems around big data, Jessica succinctly says:

All the skills it takes to tell a good story, to compose a complete orchestra, are the skills it takes to put the pieces of this big data world together. If data is just data until it’s information, what’s a lot of information without the thought and skill of pulling all the chords together?

The need for right- as well as left-brained thinkers to solve the world’s really, really hard business problems is something that has been recognised for some time now by several prominent business leaders. Indeed, the intersection of technology (left-brained) and design (right-brained) has certainly played a large part in the success of technology companies like IBM and Apple.

So we need not just STEM skills but STEM++ skills, where the addition of “righty” skills like arts, humanities and design helps us build not just a smarter world but one that is better to live in. For more on this check out my other (joint) blog, The Versatilist Way.

Computing: The Human Experience

Grady Booch, IBM Fellow and Chief Scientist for Software Engineering in IBM Research, has kicked off an initiative to produce a documentary on the history of computing called Computing: The Human Experience. This is a crowdfunding initiative for which Grady is trying to raise $25,000 by January 2nd to get the project underway. It’s an all-or-nothing model: the project must be fully funded before time expires or no money changes hands.
I guess you may ask why you should contribute funds to an initiative like this in these austere times, when there are far better causes that could take care of your $$$$. Here are three reasons:

  1. If you are reading this blog you are almost certainly involved at some level in computing. You have helped, or are still helping, change the world in fundamental and unprecedented ways, ways that affect pretty much everyone who walks the face of the planet right now. Isn’t it time that story was told?
  2. Computing, more than any other industry, has its roots at a very personal level. How many great computing ideas have started in kids’ bedrooms, dormitories or their parents’ garages? You can now help by making your own personal contribution.
  3. You can donate as little as one dollar, a lot less than your first latte of the day or your final drink of the evening. Forgo that and spend it on this instead; you could even get a hand-written letter of thanks from Grady.

If you do donate, or even if you don’t, make sure you tweet it, blog it, Tumblr it or Facebook it so all your friends know about this.

Educating an IT Workforce for the 21st Century

A report on the BBC Today programme this morning argues that the “Facebook generation needs better IT skills” and that UK schools should be providing courses in programming at GCSE. The report bemoaned the fact that so-called Information and Communications Technology (ICT) GCSEs did little more than teach students how to use Microsoft Office programs such as Word and Excel and did not prepare students for a career in IT. The backers of this report were companies like Google and Microsoft. This raises an interesting question of who should be funding such education in these austere times. Is it the role of schools to provide quite specific skills like programming, or should they be providing the basics of literacy and numeracy, as well as the more fundamental skills of creativity, communication and collaboration, and leaving the specifics to the industries that need them? Here are some of the issues:

  1. Skills like computer programming are continuously evolving and changing. What is taught at 14–16 today (the age of GCSE students in the UK) will almost certainly be out of date by the time these students hit the workforce at 21+.
  2. The computer industry, just like manufacturing before it, long ago sent out the message to students that programming (in Western economies at least) had been commoditised and was better done in the low-cost economies of the BRIC nations (and now, presumably, the CEVITS).
  3. To most people, computers are just tools. As with cars, washing machines and mobile phones, they don’t need to know how they work, just how to use them effectively.
  4. Why stop at a computer programming GCSE? Why not teach the basics of plumbing, car mechanics, cookery and hairdressing, all of which are still in great demand and needed by their respective industries?
  5. Public education (which essentially did not exist before the 19th century, certainly not for the masses) came about to meet the needs of industrialism and as such demanded left-brained, logical thinking skills rather than right-brained, creative skills (see Sir Ken Robinson’s TED talk on why schools kill creativity). As a result we have a system that rewards the former rather than the latter (as in “there’s no point in studying painting or music, you’ll never get a job in that”).

In an ideal world we would all be given the opportunity to learn and apply whatever skills we wanted (both at school and throughout life) and have that learning funded by the taxpayer, on the basis that it benefits society as a whole. Unfortunately we don’t live in that ideal world, and in fact are probably moving further from it than ever.

Back in the real world, therefore, industry must surely fund the acquisition of those skills. Unfortunately, in many companies education is the first thing to be cut when times are hard. The opposite should be the case. One of the best things I ever did was to spend five weeks (yes, that’s weeks not days), funded entirely by IBM, learning object-oriented programming and design. Whilst five weeks may seem like a long time for a course, I know it has paid for itself many, many times over through the work I have been able to do for IBM in the 15 years since attending it. Further, I suspect that five weeks’ intensive learning was easily equivalent to at least a year’s worth of learning in an educational establishment.

Of course such skills are more vital to companies like Google, Microsoft and IBM than ever before. Steve Denning, in a Forbes article this month called Why Big Companies Die, quotes from an article by Peggy Noonan in the Wall Street Journal (called A Caveman Won’t Beat a Salesman). Denning draws on a theory from Steve Jobs that big companies fail when salesmen and accountants who know nothing about the product or service the company makes, or how it works, are put in charge. Denning says:

The activities of these people [salesmen and accountants] further dispirit the creators, the product engineers and designers, and also crimp the firm’s ability to add value to its customers. But because the accountants appear to be adding to the firm’s short-term profitability, as a class they are also celebrated and well-rewarded, even as their activities systematically kill the firm’s future.

Steve Jobs showed that there was another way: keep playing offense and focus totally on adding value for customers by creating new and innovative products. By doing that you can make more money than the companies that are milking their cash cows, focused on making money rather than products.

Companies like Google and Microsoft (and IBM and Apple) need people fully trained in the three Cs (creativity, communication and collaboration) who can then apply these to whatever task is most relevant to the company’s bottom line. It’s the role of those companies, not government, to train people in the specifics.

Interestingly, Seymour Papert (who co-invented the Logo programming language) used programming as a tool to improve the way that children think and solve problems. Papert drew on Piaget’s work on cognitive development (which showed how children learn) and used Logo as a way of improving their creativity.
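
Python’s built-in turtle module is a direct descendant of Papert’s Logo turtle, so it is easy to show the flavour of what he was doing. A minimal sketch, using the canonical first exercise of drawing a square:

    # Logo-style turtle graphics via Python's standard turtle module,
    # a descendant of Papert's Logo turtle.
    import turtle

    t = turtle.Turtle()
    for _ in range(4):   # draw a square, the canonical first exercise
        t.forward(100)   # move 100 units in the current direction
        t.left(90)       # turn 90 degrees anticlockwise

    turtle.done()        # keep the window open until it is closed

The point, for Papert, was never the square but the thinking: the child walks the shape out in their head and debugs their own reasoning when the drawing goes wrong.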

Finally, to see how students themselves view all this, see the article by Nikhil Goyal (a 16-year-old junior at Syosset High School in New York) who states: “for the 21st century American economy, all economic value will derive from entrepreneurship and innovation. Low-cost manufacturing will essentially be wiped out of this country and shipped to China, India, and other nations” and goes on to propose that “we institute a 21st century model of education, rooted in 21st century learning skills and creativity, imagination, discovery, and project-based learning”. Powerful stuff for one so young; there may yet be hope for us.