Is the Raspberry Pi the New BBC Microcomputer?

There has been much discussion here in the UK over the last couple of years about the state of tech education and what should be done about it. The concern is that our schools are not doing enough to create the tech leaders and entrepreneurs of the future.

The current discussion kicked off in January 2011 when Microsoft's director of education, Steve Beswick, claimed that in UK schools there is much "untapped potential" in how teenagers use technology. Beswick said that a Microsoft survey had found that 71% of teenagers believed they learned more about information technology outside of school than in formal information and communication technology (ICT) lessons. An interesting observation, given that one of the criticisms often leveled at these ICT classes is that they just teach kids how to use Microsoft Office.

The discussion moved on in August 2011, this time at the Edinburgh International Television Festival, where Google chairman Eric Schmidt said he thought education in Britain was holding back the country's chances of success in the digital media economy. Schmidt said he was flabbergasted to learn that computer science was not taught as standard in UK schools, despite what he called the "fabulous initiative" in the 1980s when the BBC not only broadcast programmes for children about coding but shipped over a million BBC Micro computers into schools and homes.

January 2012 saw even the education secretary, Michael Gove, say that the ICT curriculum was "a mess" and must be radically revamped to prepare pupils for the future (Gove suspended the ICT curriculum in September 2012). All well and good, but as some have commented, "not everybody is going to need to learn to code, but everyone does need office skills".

In May 2012 Schmidt was back in the UK again, this time at London’s Science Museum where he announced that Google would provide the funds to support Teach First – a charity which puts graduates on a six-week training programme before deploying them to schools where they teach classes over a two-year period.

So, what now? With the new ICT curriculum not due out until 2014, what are the kids who are about to start their GCSEs to do? Does it matter that they won't be able to learn ICT at school? The Guardian's John Naughton proposed a manifesto for teaching computer science in March 2012 as part of his paper's digital literacy campaign. As I've questioned before, should it be the role of schools to teach the very specific programming skills being proposed; skills that might be out of date by the time the kids learning them enter the workforce? Clearly something needs to be done; otherwise, as my colleague Dr Rick Robinson says, where will the next generation of technology millionaires come from?

Whatever shape the new curriculum takes, one example (one that Eric Schmidt himself used) of a success story in the learning of IT skills is that of the now almost legendary BBC Microcomputer, a project started 30 years ago this year. For those too young to remember, or who were not around in the UK at the time, the BBC Microcomputer got its name from a project devised by the BBC to enhance the nation's computer literacy. The BBC wanted a machine around which they could base a series called The Computer Programme, showing how computers could be used not just for computer programming but also for graphics, sound and vision, artificial intelligence and controlling peripheral devices. To support the series the BBC drew up a spec for a computer that people watching the programme could buy to put into practice what they were watching. The machine was built by Acorn; you can read the spec here.

The BBC Micro was not only a great success in terms of the television programme, it also helped spur on a whole generation of programmers. On turning the computer on you were faced with the screen on the right. The computer would not do anything unless you fed it instructions using the BASIC programming language, so you were pretty much forced to learn programming! I can vouch for this personally: although I had just entered the IT profession at the time, this was in the days of million-pound mainframes hidden away in back rooms, guarded jealously by teams of computer operators who gave access via time-sharing for only minutes at a time. Having your own computer which you could tap away on and get instant results was, for me, a revelation.

Happily it looks like the current gap in the IT curriculum may be about to be filled by the humble Raspberry Pi computer. The idea behind the Raspberry Pi came from a group of computer scientists at the University of Cambridge's Computer Laboratory back in 2006. As Eben Upton, founder and trustee of the Raspberry Pi Foundation, said:

Something had changed the way kids were interacting with computers. A number of problems were identified: the colonisation of the ICT curriculum with lessons on using Word and Excel, or writing webpages; the end of the dot-com boom; and the rise of the home PC and games console to replace the Amigas, BBC Micros, Spectrum ZX and Commodore 64 machines that people of an earlier generation learned to program on.

Out of this concern at the lack of programming and computer skills in today's youngsters was born the Raspberry Pi computer (see below), which began shipping in February 2012. Whilst the on-board processor and peripheral controllers on this credit-card-sized, $25 device are orders of magnitude more powerful than anything the BBC Micros and Commodore 64 machines had, in other ways this computer is even more basic than any of those machines. It comes with no power supply, screen, keyboard, mouse or even operating system (Linux can be installed via an SD card). There is quite a learning curve just to get up and running, although what the Raspberry Pi has going for it that the BBC Micro did not is the web, with its already large number of help pages, ideas for projects and even the odd Raspberry Pi Jam (get it?). Hopefully this means these ingenious devices will not become just another piece of computer kit lying around in our school classrooms.
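For the curious, that first hurdle of "no operating system" boils down to writing a downloaded Linux image onto the SD card from another machine, typically with `dd`. Here is a minimal sketch of the idea; the filenames are made up, and it rehearses on ordinary files rather than a real card so nothing gets overwritten:

```shell
# Rehearsal of the SD-card flashing step using plain files. With a real card
# you would replace fake-sd.img with the card's device node (something like
# /dev/sdX -- triple-check which device is your card before running dd!).
dd if=/dev/zero of=fake.img bs=1M count=4 2>/dev/null   # stand-in for the downloaded OS image
dd if=fake.img of=fake-sd.img bs=1M 2>/dev/null          # the actual "flash" step
cmp fake.img fake-sd.img && echo "image written byte-for-byte"
sync                                                     # flush writes before removing the card
```

The same two-command pattern (dd, then sync) is what most of the help pages on the Raspberry Pi site walk you through, just with the real image file and the real device in place of the stand-ins.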

The Computer Literacy Project (CLP), which was behind the idea of the original BBC Micro and "had the grand ambition to change the culture of computing in Britain's homes", produced a report in May of this year called The Legacy of the BBC Micro which, amongst other things, explores whether the CLP had any lasting legacy on the culture of computing in Britain. The full report can be downloaded here. One of the recommendations from the report is that "kit, clubs and formal learning need to be augmented by support for individual learners; they may be the entrepreneurs of the future". 30 years ago this support was provided by the BBC as well as schools. Whether the same could be done today, in schools that seem to be largely results-driven and by a BBC that seems to be imploding in on itself, is difficult to tell.

And so to the point of this post: is the Raspberry Pi the new BBC Micro, in the way it spurred on a generation of programmers who spread their wings and went on to create the tech boom (and let's not forget the odd bust) of the last 30 years? More to the point, is that what the world needs right now? Computers are getting far smarter "out of the box". IBM's recent announcement of its PureSystems brand promises a "smarter approach to IT" in terms of installation, deployment, development and operations. Who knows what stage so-called expert integrated systems will be at by the time today's students begin to hit the workforce in 5-10 years' time? Does the Raspberry Pi have a place in this world? A world where many, if not most, programming jobs continue to be shipped to low-cost regions: currently the BRIC and MIST countries, and soon, I am sure, the largely untapped African continent.

I believe that, to some extent, the fact that the Raspberry Pi is a computer and, yes, with a bit of effort, you can program it, is largely an irrelevance. What's important is that the Raspberry Pi ignites an interest in a new generation of kids that gets them away from just consuming computing (playing games, reading Facebook entries, browsing the web and so on) to actually creating something instead. It's this creative spark that is needed now and as we move forward; no matter what computing platforms we have in 5, 10 or 50 years' time, we will always need creative thinkers to solve the world's really difficult business and technical problems.

And, by the way, my Raspberry Pi is on order.

Oh Dear, Here We Go Again!

So, here we go again. The BBC today report that "IT giants are 'ripping off' Whitehall, say MPs". As I presumably work for one of those "IT giants", I will attempt to comment on this in as impartial a way as possible.

  • As long as we have 'IT projects' rather than 'business improvement' or 'business change' projects in government, or anywhere else come to that, we (and it is 'we' as tax payers) will continue to get 'ripped off'. Buying IT because it is 'sexy' is always going to end in tears. IT is a tool that may or may not fix a business problem. Unless you understand the true nature of that business problem, throwing IT at it is doomed to failure. This is what software architects need to focus on. I'm coming to the conclusion that the best architects are actually technophobes rather than technophiles.
  • It's not Whitehall that is being 'ripped off' here. It's you and me as tax payers (assuming you live in the UK and pay taxes to the UK government, of course). Whether you work in IT or anywhere else, this affects you.
  • It's not only understanding the requirements that is important, it's also challenging those requirements, as well as the business case that led to them in the first place. I suspect that many, many projects have been dreamt up as someone's fantasy, nice-to-have system rather than having any real business value.
  • Governments should be no different from anyone else when it comes to buying IT. If I’m in the market for a new laptop I usually spend a little time reading up on what other buyers think and generally make sure I’m not about to buy something that’s not fit for purpose. One of the criticisms leveled at government in this report is the “lack of IT skills in government and over-reliance on contracting out”. In other words there are not enough experienced architects who work in government that can challenge some of the assumptions and proposed solutions that come from vendors.
  • Both vendors and government departments need to learn how to make agile work on large projects. We have enough experience now to know that multi-year, multi-person, multi-million-pound projects that aim to deliver 'big-bang' fashion just do not work. Bringing a more agile approach to the table, delivering a little but more often so users can verify and feed back on what they are getting for their money, is surely the way to go. This approach depends on more trust between client and supplier, as well as better and more continuous engagement throughout the project's life.

Ethics and Architecture

If you've not seen the BBC2 documentary All Watched Over By Machines of Loving Grace, catch it now on the BBC iPlayer while you can (it doesn't work outside the UK, unfortunately). You can see a preview of the series (another two programmes to go) on the web site of Adam Curtis (the film maker) here. The basic premise of the first programme is as follows.

Back in the 1950s a small group of people took up the ideas of the novelist Ayn Rand, whose philosophy of Objectivism advocated reason as the only means of acquiring knowledge and rejected all forms of faith and religion. They saw themselves as a prototype for a future society where everyone could follow their own selfish desires. One of the Rand 'disciples' was Alan Greenspan.

Cut to the 1990s, where several Silicon Valley entrepreneurs, also followers of Rand's philosophy, believed that the new computer networks would allow the creation of a society where everyone could follow their own desires without there being any anarchy. Alan Greenspan, by now Chairman of the Federal Reserve, also became convinced that the computers were creating a new kind of stable capitalism, and convinced President Bill Clinton of a radical approach to cutting the United States' huge deficit. He proposed that Clinton cut government spending and reduce interest rates, letting the markets control the fate of the economy, the country and ultimately the world. Whilst this approach appeared to work in the short term, it set off a chain of events which, according to Curtis' hypothesis, led to 9/11, the Asian financial crash of 1997/98, the current economic crisis and the rise of China as a superpower that will soon surpass the United States. What happened was that the "blind faith" we put in the machines that were meant to serve us led us to a "dream world" in which we trusted the machines to manage the markets for us, when in fact they were operating in ways we could not understand, with outcomes we could never predict.

So what the heck has this got to do with architecture? Back in the mid-80s when I worked in Silicon Valley, I remember reading an article in the San Jose Mercury News about a programmer who had left his job because he didn't like the uses the software he'd been working on was being put to (something of a military nature, I suspect). Quite a noble act, you might think (though given where he worked I suspect the guy didn't have too much trouble finding another job pretty quickly). I wonder how many of us really think about the uses the software systems we are working on are being put to?

Clearly, if you are working on the control software for a guided missile, it's pretty clear cut what the application is going to be used for. But what if you are creating some piece of generic middleware? Yes, it could be put to good use in hospital information systems or food-aid distribution systems, but the same software could be used for the ERP system of a tobacco company or for controlling surveillance systems that "watch over us with loving grace".

Any piece of software can be used for both good and evil, and the developers of that software can hardly have it on their conscience to worry about what that end use will be. Just as nuclear power leads to both good (nuclear reactors; okay, okay, I know that's debatable given what's just happened in Japan) and bad (bombs), it is the application of a particular technology that decides whether something is good or bad. However, here's the rub. As architects, aren't we the ones who are meant to be deciding how software components are put together to solve problems, for both better and for worse? Is it not within our remit, therefore, to control those 'end uses' and to walk away from those projects that will result in systems being built for bad rather than good purposes?

We all have our own moral compass, and it is up to us as individuals to decide which way we point it. From my point of view, I would hope that I never get involved in systems that in any way lead to an infringement of a person's basic human rights, but how do I decide or know this? I doubt the people who built the systems that are the subject of the Adam Curtis films ever dreamed they would be used in ways which have almost led to the economic collapse of our society. I guess it is incumbent on all of us to research and investigate as much as we can the systems we find ourselves working on, and to decide for ourselves whether we think we are creating machines that watch over us with "loving grace" or with more sinister intents. As ever, Arthur C. Clarke predicted this several decades ago, and if you have not read his short story Dial F for Frankenstein, now might be a good time to do so.