There have been many, many reports, both globally and within the UK, bemoaning the lack of digital skills in today's workforce. The term "digital skills" is somewhat amorphous, however, and can mean different things to different people.
To more technical types it can mean the ability to write code, develop new computer hardware or have deep insights into how networks are set up and configured. To less digitally savvy people it may just mean the ability to operate digital technology such as tablets and mobile phones, to find information on the world wide web, or even just to fill out forms on websites (e.g. to apply for a bank account).
A recent report from the CBI, Delivering Skills for the New Economy, which comes up with a number of concrete steps on how the UK might address a shortage of digital skills, suggests the following way of categorising these skills, which is useful if we are to find ways in which to address their scarcity.
Basic digital skills: Businesses define basic digital skills in similar terms. For most businesses this means computer literacy such as familiarity with Microsoft Office; handling digital information and content; core skills such as communication and problem-solving; and understanding how digital technologies work. This understanding of digital technologies includes understanding how data can be used to glean new insights, how social media provides value for a business or how an algorithm or piece of digitally-enabled machinery works.
Advanced digital skills: Businesses also broadly agree on the definitions of advanced digital skills. For most businesses, these include software engineering and development (77%), data analytics (77%), IT support and system maintenance (81%) and digital marketing and sales (72%). Businesses have highlighted their increasing need for specific advanced digital skills, including programming, visualisation, machine learning, data analytics, app development, 3D printing expertise, cloud awareness and cybersecurity.
It is important that a good grounding in the basic (core) skills is given to as many people as possible. The so-called digital natives or "Gen Zs" (at least in first-world countries) have grown up knowing nothing but the world wide web, touch-screen technology and pervasive social media. Older generations, less so. All need this information if they are to operate effectively in the "New Economy" (or know enough to actively disengage from it if they choose to do so).
The basic skills will also allow for a more critical assessment of which advanced digital skills to consider when making choices about jobs, which social media companies people should or should not be using, or how artificial intelligence might affect their career prospects.
I would argue that a basic level of advanced digital knowledge is also a requirement so that everyone can play a more active role in this modern economy and understand the implications of technology.
The genius of Tim Berners-Lee when he invented the World Wide Web back in 1989 was that he brought together three arcane technologies (hypertext, markup languages and internet communication protocols) in a way no one had thought of before, and literally transformed the world. Could blockchain do the same thing? Satoshi Nakamoto, in the 2008 paper that introduced the world to Bitcoin nearly 20 years later, also combined three existing ideas (distributed databases, public key or asymmetric cryptography, and proof-of-work) to show how a peer-to-peer version of electronic cash would allow online payments to be sent directly from one party to another without going through a financial institution.
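Proof-of-work, the third of Nakamoto's three ingredients, is simple enough to sketch in a few lines of Python. This is a toy illustration only, not Bitcoin's actual implementation (which double-hashes a binary block header): we simply search for a nonce that makes the hash of the block data start with a chosen number of zeros.

```python
import hashlib

def proof_of_work(block_data: str, difficulty: int = 4) -> tuple[int, str]:
    """Search for a nonce whose SHA-256 hash of (data + nonce) starts with
    `difficulty` leading zero hex digits. Finding the nonce is costly;
    verifying it is a single hash, which is the asymmetry mining relies on."""
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest
        nonce += 1

nonce, digest = proof_of_work("send 1 coin from A to B", difficulty=4)
print(nonce, digest)
```

The difficulty parameter is what Bitcoin adjusts over time: each extra leading zero multiplies the expected work by sixteen, while verification cost stays constant.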
The documentary The Blockchain and Us, made by Manuel Stagars in 2017, interviews software developers, cryptologists, researchers, entrepreneurs, consultants, VCs, authors, politicians and futurists from around the world, and poses a number of questions such as: How can the blockchain benefit the economies of nations? How will it change society? What does it mean for each of us? The intent of the film is not to explain the technology but to give views on it and encourage a conversation about its potential wider implications.
Since I began to focus my architecture efforts on blockchain I often get asked the question that is the title of this blog post. According to Gartner, blockchain has gone through the 'peak of inflated expectations' and is now sliding down into the 'trough of disillusionment'. The answer to the question, as is the case for most new technologies, is that it is "good for" some things but not everything.
As a technologist it pains me to say this but in the business world technology itself is not usually the answer to anything on its own. As the Venture Capitalist Jimmy Song said at Consensus earlier this year*, “When you have a technology in search of a use, you end up with the crap that we see out there in the enterprise today.” Harsh words indeed but probably true.
Instead, what is needed is the business and organisational change that drives new business models in which technology is, if required, slotted in at the right time and place. Business people talk about the return on investment of tech and the fact that technology often "gobbles up staff time and money, without giving enough back". Blockchain runs the risk of gobbling up too much time and money if the right considerations are not given to its use and applicability to business.
If we are to ensure blockchain has a valid business use and gets embedded into the technology stack that businesses use, then we need to ensure the right questions get asked when considering its use. As a start you could do worse than consider this set of questions from the US Department of Homeland Security.
Many blockchain projects are still at the proof of technology stage although there are some notable exceptions. The IBM Food Trust is a collaborative network of growers, processors, wholesalers, distributors, manufacturers, retailers and others enhancing visibility and accountability in each step of the food supply chain whilst the recently announced TradeLens aims to apply blockchain to the world’s global supply chain. Both of these solutions are built on top of the open source Hyperledger Fabric blockchain platform which is one of the projects under the umbrella of the Linux Foundation.
What these and other successful blockchain systems are showing is that another question should be tagged onto the flowchart above (probably as the first question). This would be something like: "Are you willing to be part of a collaborative business network and to share information on a need-to-know basis?" The thing about permissioned networks like Hyperledger Fabric is that people don't need to trust everyone on the network, but they do need to agree who will be a part of it. Successful blockchain business networks are proving to be the ones whose participants understand this and are willing to collaborate.
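To make that distinction concrete, here is a minimal, hypothetical sketch in Python of the two ideas a permissioned network combines: a membership list agreed up front, and a hash-chained ledger shared between those members. The class and participant names are invented for illustration; this is not how Hyperledger Fabric is actually implemented.

```python
import hashlib
import json

class PermissionedLedger:
    """Toy permissioned ledger: only agreed members may append, and each
    block carries the hash of its predecessor, so history is tamper-evident."""

    def __init__(self, members: set[str]):
        self.members = members  # participants agreed before the network starts
        self.chain = [{"data": "genesis", "prev": ""}]

    def _hash(self, block: dict) -> str:
        # Canonical JSON so the same block always hashes identically
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    def append(self, submitter: str, data: str) -> None:
        if submitter not in self.members:
            raise PermissionError(f"{submitter} is not part of the network")
        self.chain.append({"data": data, "prev": self._hash(self.chain[-1])})

# A food-supply-style network with membership fixed up front
ledger = PermissionedLedger({"grower", "distributor", "retailer"})
ledger.append("grower", "batch 42 harvested")
ledger.append("retailer", "batch 42 received")
# ledger.append("outsider", "...")  # would raise PermissionError
```

The point of the sketch is that trust is replaced by two things: agreement on who participates, and a shared tamper-evident history, with no proof-of-work needed because membership is known.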
Today (12th March, 2018) is the World Wide Web's 29th birthday. Sir Tim Berners-Lee (the "inventor of the world-wide web"), in an interview with the Financial Times and in this Web Foundation post, has used this anniversary to raise awareness of how the web behemoths Facebook, Google and Twitter are "promoting misinformation and 'questionable' political advertising while exploiting people's personal data". Whilst I hugely admire Tim Berners-Lee's universe-denting invention, it has to be said he himself is not entirely without fault in the way he bequeathed us his invention. In his defence, hindsight is a wonderful thing of course; no one could have possibly predicted at the time just how the web would take off and transform our lives, both for better and for worse.
If, as Marc Andreessen famously said in 2011, software is eating the world, then many of those powerful tech companies are consuming us (or at least our data, and I'm increasingly unsure there is any difference between us and the data we choose to represent ourselves by).
Here are five recent examples of some of the negative ways software is eating up our world.
Over the past 40+ years the computer software industry has undergone some fairly major changes. Individually these were significant (to those of us in the industry at least) but if we look at these changes with the benefit of hindsight we can see how they have combined to bring us to where we are today. A world of cheap, ubiquitous computing that has unleashed seismic shocks of disruption which are overthrowing not just whole industries but our lives and the way our industrialised society functions. Here are some highlights for the 40 years between 1976 and 2016.
I have written before about how I believe that we, as software architects, have a responsibility, not only to explain the benefits (and there are many) of what we do but also to highlight the potential negative impacts of software’s voracious appetite to eat up our world.
This is my 201st post on Software Architecture Zen (2016/17 were barren years in terms of updates). This year I plan to spend more time examining some of the issues raised in this post and look at ways we can become more aware of them and hopefully not become so seduced by those sirenic entrepreneurs.
Ten years ago this week (on 9th January 2007) the late Steve Jobs, then at the height of his powers at Apple, introduced the iPhone to an unsuspecting world. The history of that little device (which has got both smaller and bigger in the intervening ten years) is writ large over the entire Internet so I'm not going to repeat it here. However, it's worth looking at the above video on YouTube, not just to remind yourself what a monumental and historical moment in tech history this was, even though few of us realised it at the time, but also to see a masterpiece in how to launch a new product.
Within two minutes of Jobs walking on stage he has the audience shouting and cheering as if he's a rock star rather than a CEO. At around 16:25, when he has unveiled his new baby and shows for the first time how to scroll through a list on a screen (hard to believe that ten years ago no one knew this was possible), they are practically eating out of his hand and he still has over an hour to go!
This iPhone keynote, probably one of the most important in the whole of tech history, is a case study in how to deliver a great presentation. Indeed, Nancy Duarte, in her book Resonate, has this as one of her case studies for how to "present visual stories that transform audiences". In the book she analyses the whole event to show how Jobs uses all of the classic techniques of storytelling: establish what is and what could be, build suspense, keep your audience engaged, make them marvel and finally show them a new bliss.
The iPhone product launch, though hugely important, is not what this post is about. Rather, it's about how ten years later the iPhone has kept pace with innovations in technology, not only to remain relevant (and much copied) but also to continue to influence (for better and worse) the way people interact, communicate and indeed live. There are a number of enabling ideas and technologies, both introduced at launch as well as since, that have made this happen. What are they, what can we learn from the example set by Apple, and how can we improve on them?
Open systems generally beat closed systems
At its launch Apple had created a small set of native apps, the development of which was not open to third-party developers. According to Jobs, it was an issue of security. "You don't want your phone to be an open platform," he said. "You don't want it to not work because one of the apps you loaded that morning screwed it up. Cingular doesn't want to see their West Coast network go down because of some app. This thing is more like an iPod than it is a computer in that sense."
Jobs soon went back on that decision which is one of the factors that has led to the overwhelming success of the device. There are now 2.2 million apps available for download in the App Store with over 140 billion downloads made since 2007.
Claiming your system is open does not mean developers will flock to it to extend your system unless it is both easy and potentially profitable to do so. Further, the second of these is unlikely to happen unless the first enabler is put in place.
Today, with new systems being built around cognitive computing, the Internet of Things (IoT) and blockchain, companies both large and small are vying with each other to provide easy-to-use but secure ecosystems that allow these new technologies to flourish and grow, hopefully to the benefit of business and society as a whole. There will be casualties on the way, but this competition, and the recognition that systems need to be built right rather than us just building the right system at the time, is what matters.
Open systems must not mean insecure systems
One of the reasons Jobs gave for not initially making the iPhone an open platform was his concern over security and the potential for hackers to break into those systems, wreaking havoc. These concerns have not gone away but have become even more prominent. IoT and artificial intelligence, when embedded in everyday objects like cars and kitchen appliances as well as our logistics and defence systems, have the potential to cause their own unique and potentially disastrous type of destruction.
The average cost of a data breach alone is estimated at $3.8 to $4 million, and that's without even considering the wider reputational loss companies face. Organisations need to monitor how security threats are evolving year to year and get well-informed insights about the impact they can have on their business and reputation.
In October last year the White House released a report called Preparing for the Future of Artificial Intelligence. The report looked at the current state of AI, its existing and potential applications, and the questions that progress in AI raises for society and public policy, and made a number of recommendations for further action. These included:
Prioritising open training data and open data standards in AI.
Industry should work with government to keep government updated on the general progress of AI in industry, including the likelihood of milestones being reached.
The Federal government should prioritize basic and long-term AI research.
In response to the White House report, a group of private investors, including LinkedIn co-founder Reid Hoffman and eBay founder Pierre Omidyar, this week launched a $27 million research fund called the Ethics and Governance of Artificial Intelligence Fund. The group's purpose is to foster the development of artificial intelligence for social good by approaching technological developments with input from a diverse set of viewpoints, such as policymakers, faith leaders and economists.
I have discussed before how transformative technologies like the world wide web have impacted all of our lives, and not always for the good. I hope that initiatives like that of the US government (which will hopefully continue under the new leadership) will enable a good and rational public discourse on how we allow these new systems to shape our lives for the next ten years and beyond.
As I sit here typing this, I look out of the window at my garden; the sun is shining and nothing much seems to have changed since yesterday. For my generation, the one that had free university education, final salary pensions and the ability to get on the housing ladder fairly easily, probably not a lot will change. In the short term our investments will go down, our houses may decrease in value and our German cars may become more expensive, but in what time we have left on this earth I'm pretty sure we will not find ourselves starving or homeless.
For the millennial and subsequent generations, however, this may not be the case. This is the generation that is already drowning in student debt, with little ability to buy their own houses and secure their future. As a parting blow to that generation* we have now taken away their right to freedom of movement to live and work in 27 other European countries. We are about to remove the protections they have from European laws covering their human and working rights, and we are threatening to cut off the free flow of immigration that has contributed both economically and culturally to the lifeblood of this country, certainly in my lifetime. All for what? To save ourselves £8 billion a year, which even for a higher-rate taxpayer only equates to something like £100 a year in income tax and National Insurance.
So what to do? As my friend Jeremy Walker says in this post, let's use this time to take stock of where we are and where we want to go as a nation. Let's not allow the nationalists and "little Englanders" to dictate our future. As this referendum has shown, politics is important and impacts all of our lives. One week ago a British politician was murdered because what she believed in did not tie in with the beliefs of someone else. Hopefully that is an isolated incident that will not be repeated. As a nation we now need to work together more than ever if we are to navigate our way through the choppy waters we are all going to face in the coming months and years.
Just a few short weeks ago I and several hundred other people attended TEDx Brum, where the theme was the Power Of Us. In both the speakers and the attendees it was heartening to see such an array of ages, genders, races, genres and opinions – diversity in every spectrum – all feeding into the aim of the conference. After yesterday's historic and game-changing referendum result we need, more than ever, "the power of us" to pull together as a nation and to work with, rather than against, each other.
Last Saturday (11th June) Birmingham held its very own TED conference, TEDx Brum – Power of Us, at its Town Hall in Victoria Square. To say this was one of the most incredibly well organised events I have ever attended is a major, major understatement. Everything about TEDx Brum was just superbly well designed; from the beautifully laid out and printed programme of events (below) to the military-like precision of the event itself, where a continuous stream of speakers and performers came out on stage and wowed the audience with their passion and the power of their messages.
Lauren Currie, one of the speakers at this year's conference, has summarised here why this event was so different and amounted to so much more than its #PowerOfUs hashtag. For me, her point 'no painting-by-numbers' really sums up why this was such a different conference from ones I, and I'm sure many others at the event, have attended before.
“It was a conference that wasn’t about ‘meeting new people’ or ‘learning new things’ – which are very middle-class objectives for actions. Nobody had an objective of getting new business cards. No speaker had slides full of ‘tweetable wisdom’. These weren’t presentations that had been done a thousand times before to a thousand different conference halls – this was new and real. There was no existing structures justifying themselves. Only the new, the vibrant and the experimental – at a stage where we can start to test and adjust and adapt and copy.”
Anyone who has watched a TED talk at ted.com will know that the presentation skills of the speakers are absolutely top-notch and something any of us who does public speaking, no matter how small or large the audience, aspires to. I can honestly say that every single one of the speakers and performers at TEDx Brum could easily have presented at a full-blown TED and exceeded the very considerable speaking skills of those presenters. Whether it was @AdnanSharif1979 telling us about the horrors of forced organ donation (and why we should all sign up to be organ donors), @AnisaHaghdadi, founder of @beatfreeks, telling us we needed to "build the thing that builds more things", or the heartfelt and incredibly brave talk by @JayneHardy, founder of Blurt, who got a standing ovation for speaking about her own struggles with depression, everyone spoke with total and absolute passion and dedication to their own cause as well as the wider one of unleashing the #PowerOfUs.
As @ImmyKaur, the curator of TEDx Brum, says in her introduction to this year's conference:
“Birmingham is an archetype of the future many cities face. This future will not come without hard work, disruption and genuine collaboration. We will need to come together across our traditional sectors and divides to create, imagine and build the future together. We must unleash the true #PowerOfUs to catalyse this transformation.”
There are lots of truly amazing things happening in Birmingham right now. I was part of an event a few weeks ago whose aim is to pull together the tech community in Birmingham and its wider surrounds. All of these strands need to come together to make the change that this great city deserves and which is long overdue. Here’s to the #PowerOfUs and all the great people in Birmingham that are making this happen.
The above photograph is of a statue in Centenary Square, Birmingham in the UK. The three figures in it: Matthew Boulton, James Watt and William Murdoch were the tech pioneers of their day, living in and around Birmingham and being associated with a loosely knit group who referred to themselves as The Lunar Society. The history of the Lunar Society and the people involved has been captured in the book The Lunar Men by Jenny Uglow.
“Amid fields and hills, the Lunar men build factories, plan canals, make steam-engines thunder. They discover new gases, new minerals and new medicines and propose unsettling new ideas. They create objects of beauty and poetry of bizarre allure. They sail on the crest of the new. Yet their powerhouse of invention is not made up of aristocrats or statesmen or scholars but of provincial manufacturers, professional men and gifted amateurs – friends who meet almost by accident and whose lives overlap until they die.”
You don't have to live in the UK to have heard that Birmingham, like many of the other great manufacturing cities of the Midlands and Northern England, has somewhat lost its way over the century or so since the Lunar Men were creating their "objects of beauty and poetry of bizarre allure". It's now sometimes hard to believe that these great cities were the powerhouses and engines of the industrial revolution that changed not just England but the whole world. This is something that was neatly summed up by Steven Knight, creator of the BBC television programme Peaky Blinders, set in the lawless backstreets of Birmingham in the 1920s. In a recent interview in the Guardian, Knight says:
“It’s typical of Brum that the modern world was invented in Handsworth and nobody knows about it. I am trying to start a “Make it in Birmingham” campaign, to get high-tech industries – film, animation, virtual reality, gaming – all into one place, a place where people make things, which is what Birmingham has always been.”
Likewise Andy Street, Managing Director of John Lewis and Chair of the Greater Birmingham & Solihull Local Enterprise Partnership had this to say about Birmingham in his University of Birmingham Business School Advisory Board guest lecture last year:
“Birmingham was once a world leader due to our innovations in manufacturing, and the city is finally experiencing a renaissance. Our ambition is to be one of the biggest, most successful cities in the world once more.”
Andy Street CBE – MD of John Lewis
If Birmingham and cities like it, not just in England but around the world, are to become engines of innovation once again then they need to make a step change in how they go about doing so. The lesson to be learned from the Lunar Men is that they did not wait for grants from central Government or the European Union, or for some huge corporation to move in and take things in hand; they drove innovation from their own passion and inquisitiveness about how the world worked, or could work. They basically got together, decided what needed to be done and got on with it. They literally designed and built the infrastructure that was to form the foundations of innovation for the next 100 years.
Today we talk of digital innovation and how the industries of our era are disrupting traditional ones (many of them formed by the Lunar Men and their descendants) for better and for worse. Now every city wants a piece of that action and wants to emulate the shining light of digital innovation and disruption, Silicon Valley in California. Is that possible? According to the Medium post To Invent the Future, You Must Understand the Past, the answer is no. The post concludes by saying:
“…no one will succeed because no place else — including Silicon Valley itself in its 2015 incarnation — could ever reproduce the unique concoction of academic research, technology, countercultural ideals and a California-specific type of Gold Rush reputation that attracts people with a high tolerance for risk and very little to lose.”
So can this really be true? A high tolerance of risk (and failure) is certainly one of the traits that makes for a creative society. No amount of tax breaks or university research programmes is going to fix that problem. Taking the example of the Lunar Men though, one thing that cities can do to disrupt themselves from within is to effect change from the bottom up rather than the top down. Cities are made up of citizens after all, and they are the very people who not only know what needs changing but are also best placed to bring about that change.
With this in mind, an organisation in Birmingham called Silicon Canal (see here if you want to know where that name comes from) of which I am a part, has created a white paper putting forward our ideas on how to build a tech and digital ecosystem in and around Birmingham. You can download a copy of the white paper here.
The paper not only identifies the problem areas but also how things can be improved and suggests potential solutions to grow the tech ecosystem in the Greater Birmingham area so that it competes on an international stage. Download the white paper, read it and if you are based in Birmingham join in the conversation and if you’re not use the research contained within it to look at your own city and how you can help change it for the better.
This paper was launched at an event this week in the new iCentrum building at Innovation Birmingham which is a great space that is starting to address one of the issues highlighted in the white paper, namely to bring together two key elements of a successful tech ecosystem, established companies and entrepreneurs.
I’ve recently been spending a fair bit of time in hospital. Not, thankfully, for myself but with my mother who fell and broke her arm a few weeks back which has resulted in lots of visits to our local Accident & Emergency (A&E) department as well as a short stay in hospital whilst they pinned her arm back in place.
Anyone who knows anything about the UK also knows how much we value our National Health Service (NHS). So much so that when it was our turn to run the Olympic Games back in 2012 Danny Boyle’s magnificent opening ceremony dedicated a whole segment to this wonderful institution featuring doctors, nurses and patients dancing around beds to music from Mike Oldfield’s Tubular Bells.
The NHS was created out of the ideal that good healthcare should be available to all, regardless of wealth. When it was launched by the then minister of health, Aneurin Bevan, on July 5 1948, it was based on three core principles:
that it meet the needs of everyone
that it be free at the point of delivery
that it be based on clinical need, not ability to pay
These three principles have guided the development of the NHS over more than 60 years, remain at its core and are embodied in its constitution.
NHS net expenditure (resource plus capital, minus depreciation) has increased from £64.173 billion in 2003/04 to £113.300 billion in 2014/15. Planned expenditure for 2015/16 is £116.574 billion.
Health expenditure (medical services, health research, central and other health services) per capita in England has risen from £1,841 in 2009/10 to £1,994 in 2013/14.
The NHS net deficit for the 2014/15 financial year was £471 million (£372m underspend by commissioners and a £843m deficit for trusts and foundation trusts).
Current expenditure per capita for the UK was $3,235 in 2013. This can be compared to $8,713 in the USA, $5,131 in the Netherlands, $4,819 in Germany, $4,553 in Denmark, $4,351 in Canada, $4,124 in France and $3,077 in Italy.
The NHS also happens to be the largest employer in the UK. In 2014 the NHS employed 150,273 doctors, 377,191 qualified nursing staff, 155,960 qualified scientific, therapeutic and technical staff and 37,078 managers.
So does it work?
From my recent experience I can honestly say yes. Whilst it may not be the most efficient service in the world, the doctors and nurses managed to fix my mother's arm and hopefully set her on the road to recovery. There have been setbacks, and I'm sure there will be more, but given her age (she is 90) they have done an amazing job.
Whilst sitting in those A&E departments whiling away the hours (I did say they could be more efficient) I had plenty of time to observe and think. By its very nature the health service is hugely people-intensive. Whilst there is an amazing array of machines beeping and chirping away, most activities require people, and people cost money.
The UK’s health service, like that of nearly all Western countries, is under a huge amount of pressure:
The UK population is projected to increase from an estimated 63.7 million in mid-2012 to 67.13 million by 2020 and 71.04 million by 2030.
The UK population is expected to continue ageing, with the average age rising from 39.7 in 2012 to 42.8 by 2037.
The number of people aged 65 and over is projected to increase from 10.84m in 2012 to 17.79m by 2037. The number of over-85s is estimated to more than double from 1.44 million in 2012 to 3.64 million by 2037.
The number of people of State Pension Age (SPA) in the UK exceeded the number of children for the first time in 2007, and by 2012 the disparity had reached 0.5 million (though this is projected to reverse).
There are an estimated 3.2 million people with diabetes in the UK (2013). This is predicted to reach 4 million by 2025.
In England the proportion of men classified as obese increased from 13.2 per cent in 1993 to 26.0 per cent in 2013 (peak of 26.2 in 2010), and from 16.4 per cent to 23.8 per cent for women over the same timescale (peak of 26.1 in 2010).
The doctors and nurses who looked after my mum so well are going to come under increasing pressure as this ageing and less healthy population sucks ever more resources out of an already stretched system. So why, given the passion everyone has about the NHS, isn't there more of a focus on getting technology to ease the burden on these overworked healthcare providers?
Part of the problem of course is that historically the tech industry hasn't exactly covered itself with glory when it comes to delivering technology to the healthcare sector (I'm thinking of the NHS National Programme for IT and the US HealthCare.gov system as two high-profile examples). Whilst some of this may be due to the blunders of government, much of it is down to a combination of mis-communication between the providers and consumers of healthcare IT and a failure to understand the real requirements that such complex systems tend to have.
Have a clear monetization strategy and understand your customers’ willingness-to-pay.
Know the rules and regulations.
Figure out what your unfair competitive advantage is.
Of course, these are strategies that apply to any industry when trying to bring about innovation and disruption; they are not unique to healthcare. I would say that when it comes to the healthcare industry the reason there has been no Uber is that the tech industry is ignoring the generation most in need of benefiting from technology, namely the post-65 age group. This is the age group that struggles most with technology, either because they are more likely to be digitally disadvantaged or because they simply find it too difficult to get to grips with.
“Venture capitalists are too busy investing in Uber and things that get virality. The reality is that selling to older people is harder, and if venture capitalists detect resistance, they don’t invest.”
Matters are not helped by the fact that most tech entrepreneurs are between the ages of 20 and 35 and have interests in life far removed from the problems faced by the aged. As this article by Kevin Maney in the Independent points out:
“Entrepreneurs are told that the best way to start a company is to solve a problem they understand. It makes sense that those problems range from how to get booze delivered 24/7 to how to build a cloud-based enterprise human resources system – the tangible problems in the life and work of a 25- or 30-year-old.”
If it really is the case that entrepreneurs only look at problems they understand, or that are on their immediate event horizon, then clearly we need more entrepreneurs of my age group (let’s just say 45+). We are the people who either have elderly parents, like my mum, facing the very real problems of old age and poor health, or who will ourselves very soon be facing the same issues.
“For healthcare in particular, the timing for a game changer couldn’t be better. The industry is coping with upheaval triggered by varied economic, societal and industry influences. Empowered consumers living in an increasingly digital world are demanding more from an industry that is facing growing regulation, soaring costs and a shortage of skilled resources.”
At SXSW, which is running this week in Austin, Texas, IBM is providing an exclusive look at its cognitive technology, Watson, and showcasing a number of inspiring as well as entertaining applications of that technology. In particular, on Tuesday 15th March there is a session called Ageing Populations & The Internet of Caring Things where you can take a look at accessible technology and how it will create a positive impact on an ageing person’s quality of life.
Also at SXSW this year President Obama gave a keynote interview where he called for action in the tech world, especially for applications to improve government IT. The President urged the tech industry to solve some of the nation’s biggest problems by working in conjunction with the government. “It’s not enough to focus on the cool, next big thing,” Obama said, “It’s harnessing the cool, next big thing to help people in this country.”
It is my hope that, with the vision people such as Obama have set out, the experience of getting old will be radically different 10 or 20 years from now, and that cognitive and IoT technology will make all of our lives not only longer but also more pleasant.
* Unicorns are companies whose valuation has exceeded $1 billion.
This week (Monday 25th) I gave a lecture about IBM’s Watson technology platform to a group of first year students at Warwick Business School. My plan was to write up the transcript of that lecture, with links for references and further study, as a blog post. The following day, when I opened up my computer to start writing the post, I saw that, by a sad coincidence, Marvin Minsky, the American cognitive scientist and co-founder of the Massachusetts Institute of Technology’s AI laboratory, had died only the day before my lecture. Here is that blog post, now updated with some references to Minsky and his pioneering work on machine intelligence.
First though, let’s start with Alan Turing, sometimes referred to as “the founder of computer science”, who led the team that developed a programmable machine to break the Nazi’s Enigma code, which was used to encrypt messages sent between units on the battlefield during World War 2. The work of Turing and his team was recently brought to life in the film The Imitation Game starring Benedict Cumberbatch as Turing and Keira Knightley as Joan Clarke, the only female member of the code breaking team.
Sadly, instead of being hailed as a hero, Turing was persecuted for his homosexuality and committed suicide in 1954, having undergone a course of hormonal treatment to reduce his libido rather than serve a term in prison. It seems utterly barbaric and unforgivable that such an action could have been brought against someone who did so much to affect the outcome of WWII. It took nearly 60 years for his conviction to be overturned when, on 24 December 2013, Queen Elizabeth II signed a pardon for Turing, with immediate effect.
In 1949 Turing became Deputy Director of the Computing Laboratory at Manchester University, working on software for one of the earliest computers. During this time he worked in the emerging field of artificial intelligence and proposed an experiment which became known as the Turing test having observed that: “a computer would deserve to be called intelligent if it could deceive a human into believing that it was human.”
The idea of the test was that a computer could be said to “think” if a human interrogator could not tell it apart, through conversation, from a human being.
Turing’s test was supposedly ‘passed’ in June 2014 when a computer called Eugene fooled several of its interrogators into believing it was a 13-year-old boy. There has been much discussion since as to whether this was a valid run of the test, with many arguing that the so-called “supercomputer” was nothing but a chatbot, a script made to mimic human conversation. In other words, Eugene could in no way be considered intelligent. Certainly not in the sense that Professor Marvin Minsky would have defined intelligence, at any rate.
In the early 1970s Minsky, working with the computer scientist and educator Seymour Papert, developed the ideas that Minsky later published in his book The Society of Mind, which combined their insights from the fields of child psychology and artificial intelligence.
Minsky and Papert believed that there was no real difference between humans and machines. Humans, they maintained, are actually machines of a kind whose brains are made up of many semiautonomous but unintelligent “agents.” Their theory revolutionized thinking about how the brain works and how people learn.
Despite the widespread accessibility of apparently intelligent machines through programs like Apple’s Siri, Minsky maintained that there had been “very little growth in artificial intelligence” in the past decade, saying that current work had been “mostly attempting to improve systems that aren’t very good and haven’t improved much in two decades”.
Minsky also thought that large technology companies should not get involved in the field of AI, saying: “we have to get rid of the big companies and go back to giving support to individuals who have new ideas because attempting to commercialise existing things hasn’t worked very well.”
Whilst much of the early AI research certainly came out of organisations like Minsky’s AI lab at MIT, it seems slightly disingenuous to believe that the commercialisation of AI, as being carried out by companies like Google, Facebook and IBM, is not going to generate new ideas. The drive for commercialisation (and profit), just like war in Turing’s time, is after all one of the ways, at least in the capitalist world, that innovation is created.
Which brings me nicely to Watson.
IBM Watson is a technology platform that uses natural language processing and machine learning to reveal insights from large amounts of unstructured data. It is named after Thomas J. Watson, the first CEO of IBM, who led the company from 1914 to 1956.
IBM Watson was originally built to compete on the US television quiz show Jeopardy! On 14th February 2011 IBM entered Watson into a special three-day version of the program in which the computer was pitted against two of the show’s all-time champions. Watson won by a significant margin. So what is the significance of a machine winning a game show, and why was this a “game changing” event in more than the literal sense of the term?
Today we’re in the midst of an information revolution. Not only is the volume of data and information we’re producing dramatically outpacing our ability to make use of it, but the sources and types of data that inform the work we do and the decisions we make are broader and more diverse than ever before. Although businesses are implementing more and more data-driven projects using advanced analytics tools, they’re still only reaching 12% of the data they have, leaving the other 88% to go to waste. That’s because this 88% of data is “invisible” to computers: it’s encoded in language and unstructured information, in the form of text (books, emails, journals, blogs, articles and tweets) as well as images, sound and video. If we are to avoid such “data waste” we need better ways to make use of that data and to generate new knowledge from it. We need, in other words, to be able to discover new connections, patterns and insights in order to draw new conclusions and make decisions with more confidence and speed than ever before.
For several decades we’ve been digitizing the world; building networks to connect the world around us. Today those networks connect not just traditional structured data sources but also unstructured data from social networks and increasingly Internet of Things (IoT) data from sensors and other intelligent devices.
These additional sources of data mean that we’ve reached an inflection point: the sheer volume of information generated is now so vast that we no longer have the ability to use it productively. The purpose of cognitive systems like IBM Watson is to process the vast amounts of information stored in both structured and unstructured formats and help turn it into useful knowledge.
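To make the idea of “invisible” unstructured data a little more concrete, here is a deliberately naive Python sketch of my own. It bears no resemblance to Watson’s actual NLP pipeline; it simply shows the basic move cognitive systems make at vastly greater sophistication and scale: turning free text into structured data a program can act on, in this case a ranked keyword count.

```python
import re
from collections import Counter

# Words carrying little meaning that we exclude from the ranking.
STOP_WORDS = {"the", "a", "is", "of", "to", "and", "in", "that", "it", "after"}

def extract_keywords(text, top_n=3):
    """Turn unstructured free text into structured data:
    a ranked list of (word, count) pairs, ignoring stop words."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w not in STOP_WORDS)
    return counts.most_common(top_n)

# A snippet of the kind of text that is "invisible" to traditional analytics.
notes = (
    "The patient reports chest pain. Chest pain worsened after exercise. "
    "Patient denies shortness of breath."
)
print(extract_keywords(notes))
# → [('patient', 2), ('chest', 2), ('pain', 2)]
```

Real systems go far beyond word counts, of course, extracting entities, relationships and sentiment, but the principle of deriving structure from text is the same.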
There are three capabilities that differentiate cognitive systems from traditional programmed computing systems.
Understanding: Cognitive systems understand as humans do, whether the input is natural language or the written word, vocal or visual.
Reasoning: They can not only understand information but also the underlying ideas and concepts. This reasoning ability can become more advanced over time. It’s the difference between the reasoning strategies we used as children to solve mathematical problems, and then the strategies we developed when we got into advanced math like geometry, algebra and calculus.
Learning: They never stop learning. As a technology, this means the system actually gets more valuable with time. They develop “expertise”. Think about what it means to be an expert: it’s not about executing a mathematical model. We don’t consider our doctors to be experts in their fields because they answer every question correctly. We expect them to be able to reason, to be transparent about their reasoning, and to expose the rationale for why they came to a conclusion.
The idea of cognitive systems like IBM Watson is not to pit man against machine but rather to have both reasoning together. Humans and machines have unique characteristics and we should not be looking for one to supplant the other but for them to complement each other. Working together with systems like IBM Watson, we can achieve the kinds of outcomes that would never have been possible otherwise:
IBM is making the capabilities of Watson available as a set of cognitive building blocks delivered as APIs on its cloud-based, open platform Bluemix. This means you can build cognition into your digital applications, products, and operations, using any one or combination of a number of available APIs. Each API is capable of performing a different task, and in combination, they can be adapted to solve any number of business problems or create deeply engaging experiences.
So what Watson APIs are available? Currently there are around forty, which you can find here together with documentation and demos. Examples of the Watson APIs you will find at this link include the ability to:
Understand someone’s personality from what they have written.
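As a rough sketch of what consuming one of these APIs looks like: in practice you POST the author’s text to the service endpoint with your Bluemix credentials and get JSON back, which your application then interprets. The fragment below works with an illustrative response modelled loosely on the personality service; the field names here are my own assumption, not the documented schema, so check the API reference on Bluemix for the real contract.

```python
import json

# A trimmed, illustrative JSON reply in the style of a personality
# profiling service (field names are assumed for illustration).
sample_response = json.dumps({
    "word_count": 1250,
    "personality": [
        {"trait": "Openness", "percentile": 0.81},
        {"trait": "Conscientiousness", "percentile": 0.63},
    ],
})

def dominant_trait(response_json):
    """Pick the highest-scoring trait from the service's JSON reply."""
    profile = json.loads(response_json)
    best = max(profile["personality"], key=lambda t: t["percentile"])
    return best["trait"]

print(dominant_trait(sample_response))
# → Openness
```

The point is that each API does one cognitive task and returns ordinary JSON, so combining several of them in one application is straightforward web programming.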
It’s never been easier to get started with AI using these cognitive building blocks. I wonder what Turing would have made of this technology, and how soon someone will be able to piece together current and future cognitive building blocks to really pass his famous test?
You can always tell when a technology has reached a certain level of maturity when it gets its own slot on the BBC Radio 4 news program ‘Today‘, which runs here in the UK every weekday morning from 6am to 9am.
Yesterday (Tuesday 19th January) morning saw the UK government’s Chief Scientific Advisor, Sir Mark Walport, talking about blockchain (AKA distributed ledger) and advocating its use for a variety of (government) services. The interview was to publicise a new government report on distributed ledger technology (the Blackett review) which you can find here.
The report has a number of recommendations including the creation of a distributed ledger demonstrator and calls for collaboration between industry, academia and government around standards, security and governance of distributed ledgers.
As you would expect there are a number of startups as well as established companies working on applications of distributed ledger technology, including R3CEV, whose head of technology is Richard Gendal Brown, an ex-colleague of mine from IBM. Richard tweets on all things blockchain here and has a great blog on the subject here. If you want to understand blockchain you could take a look at Richard’s writings on the topic here. If you want an extremely interesting weekend read on the current state of bitcoin and blockchain technology this is a great article.
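For readers wondering what the “ledger” part actually is: at its core a blockchain is a list of records in which each entry commits to the hash of its predecessor, so tampering with any earlier entry invalidates everything after it. The toy Python sketch below is my own illustration of just that linking property; it deliberately omits the distributed consensus, peer-to-peer replication and digital signatures that make real systems like bitcoin work.

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash for the first block's predecessor

def make_block(data, prev_hash):
    """A block commits to its payload and to the hash of the previous block."""
    body = json.dumps({"data": data, "prev": prev_hash}, sort_keys=True)
    return {"data": data, "prev": prev_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def build_chain(entries):
    chain, prev = [], GENESIS
    for entry in entries:
        block = make_block(entry, prev)
        chain.append(block)
        prev = block["hash"]
    return chain

def is_valid(chain):
    """Recompute every hash; any tampering breaks the chain of commitments."""
    prev = GENESIS
    for block in chain:
        body = json.dumps({"data": block["data"], "prev": prev}, sort_keys=True)
        if block["prev"] != prev or \
           hashlib.sha256(body.encode()).hexdigest() != block["hash"]:
            return False
        prev = block["hash"]
    return True

ledger = build_chain(["Alice pays Bob 5", "Bob pays Carol 2"])
print(is_valid(ledger))                   # → True
ledger[0]["data"] = "Alice pays Bob 50"   # tamper with history
print(is_valid(ledger))                   # → False
```

It is this tamper-evidence, combined with replication across many parties, that makes distributed ledgers attractive for the kinds of government services the Blackett review discusses.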
IBM, recognising the importance of this technology and the impact it could have on society, is throwing its weight behind the Linux Foundation project that looks to advance the technology following the open source model.
From a software architecture perspective I think this topic is going to be huge and is ripe for some first-mover advantage. Those architects who can steal a march on not only understanding but also explaining this technology are going to be in high demand, and if you can help with applying it in new and innovative ways you are definitely going to be a rockstar!