How Cloud is Changing Roles in IT

Every major change in technology comes with an inevitable upheaval in the job market. New jobs appear, existing ones go away and others morph into something different. When the automobile came along and gradually replaced the horse-drawn carriage, I’m sure carriage designers and builders were able to apply their skills to designing the new horseless carriage (at least initially), whilst engine design was a completely new discipline that had to be invented. The role of the blacksmith, however, declined rapidly as far fewer horses were needed to pull carriages.
Smedje i Hornbæk, 1875

The business of IT has clearly gone through several transformational stages since the modern age of commercial computing began 55 years ago with the introduction of the IBM 1401, the world’s first fully transistorized computer. By the mid-1960s almost half of all computer systems in the world were 1401-type machines.
IBM 1401

During the subsequent 50 years we have gone through a number of different ages of computing, each corresponding to the major, underlying architecture which was dominant during that period. The ages, with their (very) approximate time spans, are:

  1. The Age of the Mainframe (1960 – 1975)
  2. The Age of the Mini Computer (1975 – 1990)
  3. The Age of Client-Server (1990 – 2000)
  4. The Age of the Internet (2000 – 2010)
  5. The Age of Mobile (2010 – 20??)

Of course, the technologies from each age have never completely gone away; they are just not the predominant driving force in IT any more. For example, there are still estimated to be some 15,000 mainframe installations world-wide, so mainframe programmers are not about to see the end of their careers any time soon. Similarly, there are other technologies bubbling under the surface, running alongside and actually overlapping these major waves. For example, networking has evolved from providing the ability to connect a “green screen” to a centralised mainframe (and then mini) to the ability to connect thousands, then millions and now billions of devices. The client-server and internet ages were dependent on cheap and ubiquitous desktop personal computers, whilst the age of mobile is driven by offspring of the PC, now unshackled from the desktop, which run the same applications (and much, much more) on smaller and smaller devices.

The current mobile age is about far more than the ubiquitous smart devices which we now all own. It’s also driven by the technologies of cloud, analytics and social media but, more than anything, it’s about how these technologies are coming together to form a perfect storm. That storm promises to take us beyond computing as just a utility, which serves up traditional corporate data from systems of record, to systems of engagement, where our devices become an extension of ourselves, anticipating our needs and helping us get what we want, when we want it. If the first three ages helped us define our systems of record, the last two have not just moved us to systems of engagement; they have also created what has been termed the age of context – an always-on society where pervasive computing is reshaping our lives in ways that would not have been possible as little as ten years ago.

For those of us who work in IT, what does this new contextual age mean in terms of our jobs and the roles we play in interacting with our peers and our clients? Is the shift to cloud, analytics, mobile and social just another technology change, or does it represent something far more fundamental in how we go about doing the business of IT?

In 2012 the IBM Distinguished Engineer John Easton produced a thought-leadership white paper, Exploring the impact of Cloud on IT roles and responsibilities, which used an IBM-patented technique called Component Business Modeling to map out the key functions of a typical IT department and look at how each of these might change when the delivery of IT services is moved to a cloud provider. Unsurprisingly, John’s paper concluded that “many roles will move from the enterprise to the cloud provider” and that “the responsibilities and importance of the surviving IT roles will change in the new world.”

As might be expected, the roles that are likely to be no longer needed are the ones that today involve the building and running of IT systems: those to do with the development and deployment aspects of IT and those in ancillary functions like support, operations and planning.

Some functions, whilst they will still exist, are likely to be dramatically reduced in scope. Things like risk and compliance, information architecture and security, privacy and data protection fall into this category. These are all functions in which the enterprise at least needs to have some say, but which will largely be dictated by the cloud provider and have to be accepted or not depending on the service levels needed by the enterprise.

The most interesting category of functions affected by moving to the cloud is those that grow in importance. These, by and large, are in the competencies of customer relationship and business strategy and administration. They cover areas like enterprise architecture, portfolio & service management and demand & performance planning. In other words, the areas that are predicted to grow in importance are those that involve IT talking to the business to understand what it wants, both in terms of functionality and service levels, as well as ensuring the enterprise has a vision of how it can use IT to maintain competitive advantage.

Back in 2005 the research firm Gartner predicted that demand for IT specialists could shrink by as much as 40 percent within the next five years. It went on to coin the term “IT versatilist” for people who are not only specialized in IT but who demonstrate business competencies by handling multidisciplinary assignments. According to the research firm, businesses will increasingly look to employ versatilists: “the long-term value of today’s IT specialists will come from understanding and navigating the situations, processes and buying patterns that characterize vertical industries and cross-industry processes”. In 2005 the concept of cloud computing was still in its infancy; the term did not really enter popular usage until a year later, when Amazon introduced the Elastic Compute Cloud. What had been talked about before this was the concept of utility computing and, indeed, as far back as 1961 the computer scientist John McCarthy predicted that “computation may someday be organized as a public utility.”

Fast forward to 2014 and cloud computing is very much here to stay. IT professionals are in the midst of a fundamental change that, just as with the advent of the “horseless carriage” (AKA the motor car), is going to remove some job roles altogether but at the same time open up new and exciting opportunities that allow us to focus on our clients’ real needs and craft IT solutions that provide new and innovative ways of doing business. The phrase “may you live in interesting times” has been taken to mean “may you experience much disorder and trouble in your life”. I prefer to interpret it as “may you experience much disruption and amazement in your life”, for that is most certainly what this age of context seems to be creating.

A slightly edited version of this post also appears here.

Software Architecture Zen is Five Years Old

So Software Architecture Zen is five years old (actually on 28th July, I missed my own birthday).

I started this blog on the Blogger platform and moved to WordPress earlier this year. Whilst my WordPress following has not built up to the same level I had on Blogger, I far prefer the tools and the whole look and feel offered by WordPress, so I do not regret the move.

I’ve had just over 103,000 hits in total across both platforms in five years. Not quite up there with the Joel on Software blogs of the world but not too shabby either I think. Here are my top five posts of the last five years:

  1. Architecture vs. Design
  2. How to Create Effective Technical Presentations
  3. On Thinking Architecturally
  4. Two Diagrams All Software Architects Need
  5. The Moral Architect

I’m pleased the last one is up there as it’s a topic dear to my heart and one I plan to blog on more in the future. Here’s to the next five years!


Complexity is Simple

I was taken with this cartoon and the comments put up by Hugh Macleod last week over at his gapingvoid.com blog, so I hope he doesn’t mind me reproducing it here.

Complexity is Simple (c) Hugh Macleod 2014

Complex isn’t complicated. Complex is just that, complex.

Think about an airplane taking off and landing reliably day after day. Thousands of little processes happening all in sync. Each is simple. Each adds to the complexity of the whole.

Complicated is the other thing, the thing you don’t want. Complicated is difficult. Complicated is separating your business into silos, and then none of those silos talking to each other.

At companies with a toxic culture, even what should be simple can end up complicated. That’s when you know you’ve really got problems…

I like this because it resonates perfectly with a blog post I put up almost four years ago called Complex Systems versus Complicated Systems, where I make the point that “whilst complicated systems may be complex (and exhibit emergent properties) it does not follow that complex systems have to be complicated”. A good architecture avoids complicated systems by building them out of lots of simple components whose interactions can certainly create a complex system, but not one that needs to be overly complicated.
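To make the distinction concrete, here is a minimal (and entirely illustrative) Python sketch of that idea: each component does one simple, easily tested thing, and the richer behaviour of the whole comes purely from composing them.

```python
from collections import Counter

# Three deliberately simple components, each doing one thing.
def strip_blank(lines):
    """Drop empty or whitespace-only lines."""
    return (line for line in lines if line.strip())

def lowercase(lines):
    """Normalise case so 'Hello' and 'hello' count as one word."""
    return (line.lower() for line in lines)

def count_words(lines):
    """Tally word frequencies across all remaining lines."""
    counts = Counter()
    for line in lines:
        counts.update(line.split())
    return counts

def word_frequencies(lines):
    # The 'system' is just the composition of its simple parts.
    return count_words(lowercase(strip_blank(lines)))

print(word_frequencies(["Hello hello", "   ", "hello world"]))
# → Counter({'hello': 3, 'world': 1})
```

No single piece here is complicated, and each can be replaced or tested in isolation; the interesting behaviour lives entirely in the interactions.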

Avoiding the Legacy of the Future

Kerrie Holley is a software architect and IBM Fellow. The title of IBM Fellow is not won easily. Fellows are a group that includes a Kyoto Prize winner and five Nobel Prize winners, and they have fostered some of IBM’s most stunning technical breakthroughs – from the Fortran computing language, to the systems that helped put the first man on the moon, to the Scanning Tunneling Microscope, the first instrument to image atoms. These are people who are big thinkers and don’t shy away from tackling some of the world’s wicked problems.

Listening to Kerrie give an inspirational talk called New Era of Computing at an IBM event recently, I was struck by a comment he made, which is exactly the kind of hard question I would expect an IBM Fellow to ask:

The challenge we have is to avoid the legacy of the future. How do we avoid applications becoming an impediment to business change?

Estimates vary, but it is reckoned that most organizations spend between 70% and 80% of their IT budget on maintenance and only 20% to 30% on innovation. When 80% of a company’s IT budget is being spent just keeping the existing systems running, how is it to deploy new capabilities that keep it competitive? That, by any measure, is surely an “impediment to business change”. So, what to do? Here are a few things that might help avoid the legacy of the future and show how we as architects can play our part in addressing the challenge posed in Kerrie’s question.

  1. Avoid (or reduce) technical debt. Technical debt is what you get when you release not-quite-right code out into the world. Every minute spent on not-quite-right code counts as interest on that debt. Entire engineering organizations can be brought to a standstill under the debt load of an error-prone deployment. Reducing the amount of technical debt clearly reduces the amount of money you have to spend on finding and patching buggy code. Included here is code that is “buggy” because it is not doing what was intended of it. The more you can do to ensure “right-first-time” deployments, the lower your maintenance costs and the more of your budget you’ll have to spend on innovation rather than maintenance. Agile is one tried and tested approach to producing better, less buggy code that meets the original requirements, but traditionally agile has only focused on part of the software delivery lifecycle: the development part. DevOps uses the best of the agile approach but extends it into operations. DevOps works by engaging and aligning all participants in the software delivery lifecycle – business teams; architects, developers and testers; and IT operations and production – around a single, shared goal: sustained innovation, fueled by continuous delivery and shaped by continuous feedback.
  2. Focus on your differentiators. It’s tempting for CIOs and CTOs to think all of the technology they use is somehow going to give them competitive advantage and must therefore be bespoke, or at least highly customised, packages. This means more effort in supporting those systems once they are deployed. Better is to focus on those aspects of the business’ IT which truly give real business advantage and focus IT budget on those. For the rest, use COTS packages or put as much as possible into the cloud, and standardise as much as possible. One of the implications of standardisation is that your business needs to change to match the systems you use rather than the other way around. This can often be a hard pill for a business to swallow, as businesses tend to think their processes are unique. Rarely is this the case, however, so recognising this and adopting standard processes is a good way of freeing up IT time and budget to focus on the stuff that really is novel.
  3. Adopt open standards and componentisation. Large monolithic packages which purport to do everything, with appropriate levels of customisation, are not only expensive to build in the first place but are also likely to be more expensive to run, as they cannot easily be updated in a piecemeal fashion. If you want to upgrade the user interface or open up the package to different user channels, it may be difficult if interfaces are not published or the packages themselves do not have replaceable parts. Very often you may have to replace the whole package or wait for the vendor to come up with the updates. Building applications from a mix of COTS and bespoke components and services which talk through an open API allows more of a mix-and-match approach to procuring and operating business systems. It also makes it easier to retire services that are no longer required or used. The term API economy is usually used to refer to how a business can expose its business functions (as APIs) to external parties; however, there is no reason why an internal API economy should not exist. This allows for the ability to quickly subscribe to or unsubscribe from business functionality, making the business more agile by driving a healthy competition for business function.
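
As a sketch of what “replaceable parts” behind a published interface looks like in practice, here is a short Python example. The service and method names are hypothetical, invented purely for illustration; the point is that consumers depend only on the published interface, so a COTS-backed implementation can later be swapped for a bespoke or cloud-hosted one without the consumers changing at all.

```python
from abc import ABC, abstractmethod

class PaymentService(ABC):
    """A published interface for one business function.

    Consumers depend on this contract, never on the package behind it.
    (The names here are hypothetical, not from any real product.)
    """
    @abstractmethod
    def charge(self, account_id: str, amount_pence: int) -> str:
        """Charge an account and return a transaction reference."""

class VendorPaymentService(PaymentService):
    # Might wrap a COTS package today...
    def charge(self, account_id: str, amount_pence: int) -> str:
        return f"VENDOR:{account_id}:{amount_pence}"

class InHousePaymentService(PaymentService):
    # ...and can be replaced piecemeal later, without touching consumers.
    def charge(self, account_id: str, amount_pence: int) -> str:
        return f"INHOUSE:{account_id}:{amount_pence}"

def checkout(payments: PaymentService) -> str:
    # Written purely against the interface.
    return payments.charge("acct-42", 999)

print(checkout(VendorPaymentService()))   # VENDOR:acct-42:999
print(checkout(InHousePaymentService()))  # INHOUSE:acct-42:999
```

Retiring or replacing a service then becomes a matter of changing which implementation is wired in – exactly the mix-and-match procurement described above.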

Businesses will always need to devote some portion of their IT budget to “keeping the lights on”; however, there is no reason why, with the adoption of one or more of these practices, the split between maintenance and innovation budgets should not be closer to 50:50 than the current, highly imbalanced 70:30 or worse!

The Wicked Problems of Government

The dichotomy of our age is surely that, as our machines become more and more intelligent, the problems we need them to solve are becoming ever more difficult and intractable. They are truly wicked problems, no more so than in our offices of power, where the addition of political and social ‘agendas’ would seem to make some of the problems we face even more difficult to address.

A Demonstration Against the Infamous ‘Poll Tax’

In their book The Blunders of Our Governments the authors Anthony King and Ivor Crewe recall some of the most costly mistakes made by British governments over the last three decades. These include policy blunders such as the so-called poll tax, introduced by the Thatcher government in 1990, which led to rioting on the streets of many UK cities (above). Like the poll tax, many – in fact most – of the blunders recounted are not IT-related; however, the authors do devote a whole chapter (chapter 13, rather appropriately) to the more egregious examples of successive governments’ IT blunders. These include:

  • The Crown Prosecution Service, 1989 – A computerised system for tracking prosecutions. Meant to be up and running by 1993-94, abandoned in 1997 following a critical report from the National Audit Office (NAO).
  • The Department of Social Security, 1994 – A system to issue pensions and child benefits using swipe cards rather than the traditional books which were subject to fraud and also inefficient. The government cancelled the project in 1999 after repeated delays and disputes between the various stakeholders and following another critical report by the NAO.
  • The Home Office (Immigration and Nationality Directorate), 1996 – An integrated casework system to deal with asylum, refugee and citizenship applications. The system was meant to be live by October 1998 but was cancelled in 1999 at a cost to the UK taxpayer of at least £77 million. The backlog of asylum and citizenship cases which the system was meant to address got worse, not better.

Whilst the authors don’t offer any cast-iron solutions to these problems, they do highlight a number of factors the blunders had in common. Many of these were also highlighted in a joint Royal Academy of Engineering and British Computer Society report, published 10 years ago this month, called The Challenges of Complex IT Projects. The major reasons found for why complex IT projects fail included:

  • Lack of agreed measures of success.
  • Lack of clear senior management ownership.
  • Lack of effective stakeholder management.
  • Lack of project/risk management skills.
  • Evaluation of proposals driven by price rather than business benefits.
  • Projects not broken into manageable steps.

In an attempt to address at least some of the issues around the procurement and operation of government IT systems (which are not restricted to the UK, of course), in particular those citizen-facing services delivered over the internet, the coalition government that came to power in May 2010 commissioned a strategic review of its online delivery of public services by the UK Digital Champion, Martha Lane Fox. Her report, published in November 2010, recommended:

  • Provision of a common look and feel for all government departments’ transactional online services to citizens and business.
  • The opening up of government services and content, using application programme interfaces (APIs), to third parties.
  • Putting a new central team in Cabinet Office that is in absolute control of the overall user experience across all digital channels and that commissions all government online information from other departments.
  • Appointing a new CEO for digital in the Cabinet Office with absolute authority over the user experience across all government online services and the power to direct all government online spending.

Another government report, published in July 2011 by the Public Administration Select Committee and entitled Government and IT – “a recipe for rip-offs” – time for a new approach, proposed 33 recommendations on how government could improve its woeful record for delivering IT. These included:

  • Developing a strategy to either replace legacy systems with newer, less costly systems, or open up the intellectual property rights to competitors.
  • Contracts to be broken up to allow for more effective competition and to increase opportunities for SMEs.
  • The Government must stop departments specifying IT solutions and ensure they specify what outcomes they wish to achieve.
  • Having a small group within government with the skills to both procure and manage a contract in partnership with its suppliers.
  • Senior Responsible Owners (SROs) should stay in post to oversee the delivery of the benefits for which they are accountable and which the project was intended to deliver.

At least partly as a result of these reports and their recommendations the Government Digital Service (GDS) was established in April 2011 under the leadership of Mike Bracken (previously Director of Digital Development at The Guardian newspaper). GDS works in three core areas:

  • Transforming 25 high volume key exemplars from across government into digital services.
  • Building and maintaining the consolidated GOV.UK website, which brings government services together in one place.
  • Changing the way government procures IT services.

For the large corporates that have traditionally provided IT software, hardware and services to government, GDS has had a big impact on how they do business. Not only does most business now have to be transacted through the government’s own CloudStore, but GDS also encourages a strong bias in favour of:

  • Software built on open source technology.
  • Systems that conform to open standards.
  • Using the cloud where it makes sense to do so.
  • Agile based development.
  • Working with small to medium enterprises (SMEs) rather than the large corporates, seen as “an oligarchy that is ripping off the government”.

There can be no doubt that the sorry litany of public sector IT project failures has, rightly or wrongly, caused the pendulum to swing strongly in the direction that favours the above approach when procuring IT. However, some argue that the pendulum has now swung a little too far. Indeed, the UK Labour party has launched its own digital strategy review, led by shadow Cabinet Office minister Chi Onwurah. She talks about a need to be more context-driven, rather than transaction-focused, saying that while the GDS focus has been on redesigning 25 “exemplar” transactions, Labour feels this misses the complexity of delivering public services to the individual. Labour is also critical of GDS’s apparent hostility to large IT suppliers, saying it is an “exaggeration” that big IT suppliers are “the bogeymen of IT”. While Labour supports competition and creating opportunities for SMEs, Onwurah said that large suppliers “shouldn’t be locked out, but neither should they be locked in”.

The establishment of GDS has certainly provided a wake-up call for the large IT providers. However – and here I agree with the views expressed by Ms Onwurah – context is crucial, and it’s far too easy to take an overly simplistic approach to trying to solve government IT issues. A good example of this is open source software. Open source software is certainly not free, and often not dramatically cheaper than proprietary software (which is often built using some elements of open source anyway) once support costs are taken into account. The more serious problem with open source is where the support for it comes from. As the recent Heartbleed security issue with OpenSSL has shown, there are dangers in entrusting mission-critical enterprise software to people who are not accountable (and may even be unknown).

One aspect of ‘solving’ wicked problems is to bring more of a multi-disciplinary approach to the table. I have blogged before about the importance of a versatilist approach in solving such problems. Like it or not, the world cannot be viewed in high-contrast, black and white terms. One of the attributes of a wicked problem is that there is often no right or wrong answer, and addressing one aspect of the problem can often introduce other issues. Understanding context and making smart architecture decisions is one aspect of this. Another is whether the so-called SMAC (social, mobile, analytics and cloud) technologies can bring a radically new approach to the way government makes use of IT. This is something for discussion in future blog posts.

“I’ll Send You the Deck”

Warning, this is a rant!

I’m sure we’ve all been here. You’re in a meeting, or on a conference call, or just having a conversation with a colleague, discussing some interesting idea or proposal which he or she has previous experience of, and at some point they utter the immortal words “I’ll send you the deck”. The “deck” in question is usually a presentation of (at least) 20 pages, maybe with lots of diagrams so quite large, some of which may, if you’re lucky, relate to what you were actually talking about but most of which won’t. Now, I’m not sure about you, but I find this hugely annoying for several reasons. Here are some:

  1. A presentation is for, well, presenting. It’s not for relaying information after the event with no speaker to justify its existence; that’s what documents are for. We need to make careful decisions about the tools we use for conveying information, recognising that the choice of tool can enhance as well as detract from the information being presented.
  2. Sending a presentation in an email just clogs up your inbox with useless megabytes of data. Not only that, but you are then left with the dilemma of what to do with the presentation. Do you detach it and store it somewhere, in the hope you will find it later, or just leave it in the email, ultimately to be lost or forgotten?
  3. Chances are that only a small part of the presentation is actually relevant to what was being discussed, so you are left trying to work out which parts of the presentation are important and which are largely irrelevant.

So, what is the alternative to “sending a deck”? In this age of social, the alternatives are almost overwhelming, but here are a few.

  • If your presentation contains just a few core ideas then take the time to extract the relevant ones and place them in the email itself.
  • If the information is actually elsewhere on the internet (or your company intranet) then send a link. If it’s not commercially sensitive and available externally to your organisation why not use Twitter? That way you can also socialize the message more widely.
  • Maybe the content you need to send is actually worth creating as a blog post for a wider, and more permanent distribution (I actually create a lot of my posts like that).
  • Many large organisations are now investing in enterprise social software. Technology such as IBM Connections provides on-premise, hybrid and cloud-based software that not only seamlessly integrates email, instant messaging, blogs, wikis and files but also delivers the information to virtually any mobile device. Enterprise social software allows people to share content and collaborate in new and more creative ways, and avoids the loss of information in the ‘tar pits’ of our hard drives and mail inboxes.

Finally, here’s the last word from Dilbert, who is spot on the money as usual.

Dilbert PowerPoint

(c) 2010 Scott Adams Inc

Architect Salary Survey

The first ever architecture-specific salary survey has just been published in the UK by FMC Technology. In total, over 1,000 architects responded to the survey. The report looks at architect roles in six main areas:

  • Architecture Management
  • Enterprise Architecture
  • Business Architecture
  • Information Architecture
  • Application Architecture
  • Technology Architecture

It also looks at pay rises (in 2013) and regional and industry differences, as well as motivational factors.

The results can be viewed here.

How to Deal with the TED Effect

Nancy Duarte, CEO of Duarte Design and author of the books Resonate: Present Visual Stories that Transform Audiences and slide:ology: The Art and Science of Creating Great Presentations, has written a great blog post about what she refers to as the TED effect: the impact that the TED conferences have had on all of us who need to present as part of our daily lives.

Nancy’s basic assertion is that “in public speaking it’s no longer okay to be boring”. In the years BT (before TED) it was okay to deliver boring presentations because no one really knew whether you were being boring or not; most people’s bar for what constituted a good presentation was pretty low anyway. In the dark years of BT we would all just sit stoically through presentations that bored us to death and missed the point completely, because bad presentations were an occupational hazard we all had to learn to deal with. If nothing else, they gave us time to catch up on our email or quietly chatter away to a colleague in the back row.

Now, though, everything has changed! Anyone who has seen more than half a dozen TED talks knows that if we are not engaged within the first 30 seconds we are ready to walk. Not only that: if we feel you are wasting our time, we go onto Twitter or Facebook and tell the rest of the world how boring you were. If, however, you did engage us and managed to get your idea across in 18 minutes or under (the maximum length of a TED talk), then we will reward you by spreading your ideas and helping you get them adopted and funded.

As technical people, software architects often struggle with presentations. We are communicating technology so, we assume, the presentation must be complicated, take loads of time, and be full of slides containing densely packed text or diagrams that cannot be read unless you are sitting less than a metre from the screen. But, as Nancy Duarte has explained countless times in her books and her blog, it needn’t be like that, even for a die-hard techno-geek.

Here’s my take on how to deal with the TED effect:

  1. Just because you are given an hour to present, don’t think you have to spend that whole time talking. Use the TED 18-minute rule and try to condense your key points into that time. Use the rest of the time for discussion and exchange of ideas.
  2. Use handouts for providing more detail. Handouts don’t just have to be documents given out during the presentation. Consider writing up the detail in a blog post or similar and provide a link to this at the end of your talk.
  3. Never, ever present slides someone else has created. If a presentation is worth doing then it’s worth investing the time to make it your presentation.
  4. Remember the audience is there to see you speak and hear your ideas. Slides are an aid to get those ideas across and are not an end in their own right. If you’re just reading what’s on the presentation then so can the audience so you may as well not be there.
  5. The best talks are laid out like a book or a movie. They have a beginning, a middle and an end. It often helps to think of the end first (what is the basic idea or point you want to get across) and work backwards from there. As Steven Pressfield says in the book Do the Work, “figure out where you want to go; then work backwards from there”.
  6. Finally, watch as many TED talks as you can to see how the speakers engage with the audience and get their ideas across. One of the key attributes you will see in all the great speakers is that they are passionate about their subject, and this really shines through in their talks. Maybe, just maybe, if you are not really passionate about your subject you should not be talking about it in the first place?

The Times They Are A-Changin’

Come senators, congressmen
Please heed the call
Don’t stand in the doorway
Don’t block up the hall
For he that gets hurt
Will be he who has stalled
There’s a battle outside and it is ragin’
It’ll soon shake your windows and rattle your walls
For the times they are a-changin’

So sang Bob Dylan in The Times They Are a-Changin’ from his third album of the same name released in early 1964 which makes it 50 years old this year.

These are certainly epoch-changing times, as we all try to understand the combined impact that social, mobile, analytics and cloud computing are going to have on the world, and how we as software architects should react to it.

You may have noticed a lack of posts on this blog recently. This is partly due to my own general busyness, but also because I have been trying to understand and assimilate what impact these changes are likely to have on this profession of ours. Is it more of the same, just with the underlying technology changing (again), or is it really a fundamental change in the way the world is going to work from now on? Whichever it is, these are some of the themes I will be covering in upcoming posts on this (hopefully) reinvigorated blog.

I’d like to welcome you to the new home of Software Architecture Zen on the WordPress blogging platform. I’ve been running this blog over on Blogger for getting on for five years now, but have decided this year to gradually move over here. I hope my readers will follow me, but for now I aim to put posts in both places.

The Art of the Possible

This is an edited version of a talk I recently gave to a client. The full talk used elements of my “Let’s Build a Smarter Planet” presentation which you can find starting here.

The author, entrepreneur, marketer, public speaker and blogger Seth Godin has a wonderful definition for what architects do:

Architects take existing components and assemble them in interesting and important ways.

Software architects today have at their disposal a number of ‘large grain’ components, the elements of which we can assemble in a multitude of “interesting and important” ways to make fundamental changes to the world and truly build a smarter planet. These components are shown in the diagram below.

The authors Robert Scoble and Shel Israel in their book Age of Context describe the coming together of these components (actually their components are mobile, social, data, sensors and location) as a perfect storm comparing them with the forces of nature that occasionally converge to whip up a fierce tropical storm.

Of course, like any technological development, there is a down side to all this. As Scoble and Israel point out in their book:

The more the technology knows about you, the more benefits you will receive. That can leave you with the chilling sensation that big data is watching you…

I’ve taken a look at some of this myself here.

Predicting the future is, of course, a notoriously tricky business. As the late, great science fiction author Arthur C. Clarke said:

When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong.

The future, even five years hence, is likely to be very different from the present, and predicting what might or might not be, even that far ahead, is not an exact science. Despite the perils of making such predictions, IBM Research’s so-called 5 in 5 predictions for this year describe five innovations that will change the way we live, from classrooms that learn to cyber guardians, within the next five years. Here are five YouTube videos that describe these innovations. Further information on 5 in 5 can be found here.

  1. The classroom will learn you.
  2. Buying local will beat online.
  3. Doctors will routinely use your DNA to keep you well.
  4. The city will help you live in it.
  5. A digital guardian will protect you online.

We already have the technology to make our planet ‘smarter’. How we use that technology is limited only by our imagination…