The Fall and Rise of the Full Stack Architect


Almost three years ago to the day, I wrote a post on here called Happy 2013 and Welcome to the Fifth Age! The ‘ages’ of (commercial) computing discussed there were:

  • First Age: The Mainframe Age (1960 – 1975)
  • Second Age: The Mini Computer Age (1975 – 1990)
  • Third Age: The Client-Server Age (1990 – 2000)
  • Fourth Age: The Internet Age (2000 – 2010)
  • Fifth Age: The Mobile Age (2010 – 20??)

One of the things I wrote in that article was this:

“Until a true multi-platform technology such as HTML5 is mature enough, we are in a complex world with lots of new and rapidly changing technologies to get to grips with as well as needing to understand how the new stuff integrates with all the old legacy stuff (again). In other words, a world which we as architects know and love and thrive in.”

So, three years later, are we any closer to having a multi-platform technology? Where does cloud computing fit into all of this and is multi-platform technology making the world get more or less complex for us as architects?

In this post I argue that cloud computing is actually taking us to an age where, rather than having to spend our time dealing with the complexities of the different layers of architecture, we can be better utilised focussing on delivering business value in the form of new and innovative services. In other words, rather than having to specialise as layer architects we can become full stack architects who create value rather than unwanted or misplaced technology. Let’s explore this further.

The idea of the full stack architect.

Vitruvius, the Roman architect and civil engineer, defined the role of the architect thus:

“The ideal architect should be a [person] of letters, a mathematician, familiar with historical studies, a diligent student of philosophy, acquainted with music, not ignorant of medicine, learned in the responses of jurisconsults, familiar with astronomy and astronomical calculations.”

Vitruvius also believed that an architect should focus on three central themes when preparing a design for a building: firmitas (strength), utilitas (functionality), and venustas (beauty).

Vitruvian Man by Leonardo da Vinci

For Vitruvius then, the architect was a multi-disciplined person knowledgeable in both the arts and sciences. Architecture was not just about functionality and strength but beauty as well. If such a person actually existed then they had a fairly complete picture of the whole ‘stack’ of things that needed to be considered when architecting a new structure.

So how does all this relate to IT?

In the first age of computing (roughly 1960 – 1975) life was relatively simple. There was a mainframe computer hidden away in the basement of a company, managed by a dedicated team of operators who guarded their prized possession with great care and controlled who had access to it and when. What you could do with these systems was limited not only by cost and availability but also by their fixed architectures and the limited choice of programming languages (COBOL, PL/I and assembler come to mind) available to make them do things. The architect (should such a role have actually existed then) had a fairly simple task, as their options were relatively limited and the architectural decisions that needed to be made were correspondingly straightforward. Like Vitruvius’ architect, one could fairly readily understand the full compute stack upon which business applications needed to run.

Indeed, as the understanding of these computing engines increased you could imagine that the knowledge of the architects and programmers who built systems around these workhorses of the first age reached something of a ‘plateau of productivity’*.

Architecture Stacks 3

However things were about to get a whole lot more complicated.

The fall of the full stack architect.

As IT moved into its second age and beyond (i.e. with the advent of mini computers, personal computers, client-server, the web and early days of the internet) the breadth and complexity of the systems that were built increased. This is not just because of the growth in the number of programming languages, compute platforms and technology providers but also because each age has built another layer on the previous one. The computers from a previous age never go away, they just become the legacy that subsequent ages must deal with. Complexity has also increased because of the pervasiveness of computers. In the fifth age the number of people whose lives are now affected by these machines is orders of magnitude greater than it was in the first age.

All of this has led to niches and specialisms that were inconceivable in the early age of computing. As a result, architecting systems also became more complex giving rise to what have been termed ‘layer’ architects whose specialities were application architecture, infrastructure architecture, middleware architecture and so on.

Architecture Stacks

Whole professions have been built around these disciplines leading to more and more specialisation. Inevitably this has led to a number of things:

  1. The need for communications between the disciplines (and for them to understand each other’s ‘language’).
  2. As more knowledge accrues in one discipline, and people specialise in it more, it becomes harder for inter-disciplinary understanding to happen.
  3. Architects became hyper-specialised in their own discipline (layer). This led to a kind of ‘peak of inflated expectations’* (at least amongst practitioners of each discipline) as to what they could achieve using the technology they were so well versed in, followed by something of a ‘trough of disillusionment’* for the business (who paid for those systems) when they did not deliver the expected capabilities and came in over cost and behind schedule.

Architecture Stacks 4

So what of the mobile and cloud age which we now find ourselves in?

The rise of the full stack architect.

As the stack we need to deal with has become more ‘cloudified’ and we have moved from Infrastructure as a Service (IaaS) to Platform as a Service (PaaS) it has become easier to understand the full stack as an architect. We can, to some extent, take for granted the lower, specialised parts of the stack and focus on the applications and data that are the differentiators for a business.

Architecture Stacks 2

We no longer have to worry about what type of server to use or even what operating system or programming environments have to be selected. Instead we can focus on what the business needs and how that need can be satisfied by technology. With the right tools and the right cloud platforms we can hopefully climb the ‘slope of enlightenment’ and reach a new ‘plateau of productivity’*.

Architecture Stacks 5

As Neal Ford, Software Architect at Thoughtworks says in this video:

“Architecture has become much more interesting now because it’s become more encompassing … it’s trying to solve real problems rather than play with abstractions.”

 

I believe that the fifth age of computing really has the potential to take us to a new plateau of productivity and hopefully allow all of us to be the kind of architect described by this great definition from the author, marketeer and blogger Seth Godin:

“Architects take existing components and assemble them in interesting and important ways.”

What interesting and important things are you going to do in this age of computing?

* Diagrams and terms borrowed from Gartner’s hype cycle.

It’s that time of year…

… for everyone to predict what will be happening in the world of tech in 2016. Here’s a roundup of some of the cloud and wider IT predictions that have been hitting my social media feeds over the last week or so.

First off is Information Week with 8 Cloud Computing Predictions for 2016.

  1. Hybrid will become the next-generation infrastructure foundation.
  2. Security will continue to be a concern.
  3. We’re entering the second wave of cloud computing where cloud native apps will be the new normal.
  4. Compliance will no longer be such an issue meaning barriers to entry onto the cloud for most enterprises, and even governments, will be lowered or even disappear.
  5. Containers will become mainstream.
  6. Use of cloud storage will grow (companies want to push the responsibility of managing data, especially its security, to third parties).
  7. Momentum of IoT will pick up.
  8. Use of hyper-converged (software defined infrastructure) platforms will increase.

Next up, IBM’s Thoughts on Cloud site has a whole slew of predictions including 5 reasons 2016 will be the year of the ‘new IT’ and 5 digital business predictions for 2016. In summary, these two sets of predictions hold that the business will increasingly “own the IT” as web-scale architectures become available to all and there is increasing pressure on CIOs to move to a consumption-based model. At the fore of all CxOs’ minds will be the mantras of digital business strategy, corporate innovation and the digital customer experience. More ominous is the prediction that there will be a cyber attack or data breach in the cloud during 2016 as more and more data is moved to that environment.

No overview of the predictors would be complete without looking at some of the analyst firms of course. Gartner did their 2016 predictions back in October but hedged their bets by saying they were for 2016 and beyond (actually until 2020). Most notable, in my view, of Gartner’s predictions are:

  1. By 2018, six billion connected things will be requesting support.
  2. By 2018, two million employees will be required to wear health and fitness tracking devices as a condition of employment.
  3. Through 2020, 95 percent of cloud security failures will be the customer’s fault.

Forrester also hedged their predictive bets a little by talking about shifts rather than hard predictions.

  • Shift #1 – Data and analytics energy will continue to drive incremental improvement.
  • Shift #2 – Data science and real-time analytics will collapse the insights time-to-market.
  • Shift #3 – Connecting insight to action will only be a little less difficult.

To top off the analysts we have IDC. According to IDC Chief Analyst Frank Gens:

“We’ll see massive upshifts in commitment to DX [digital transformation] initiatives, 3rd Platform IT, the cloud, coders, data pipelines, the Internet of Things, cognitive services, industry cloud platforms, and customer numbers and connections. Looked at holistically, the guidance we’ve shared provides a clear blueprint for enterprises looking to thrive and lead in the DX economy.”

Predictions are good fun, especially if you actually go back to them at the end of the year and see how many things you actually got right. Simon Wardley, in his excellent blog Bits or pieces?, has his own predictions here, with the added challenge that these are predictions for things you absolutely should do but will ignore in 2016. Safe to say none of these will come true then!

With security being of ever greater concern, especially with the serious uptake of Internet of Things technology, what about security (or maybe the lack of it) in 2016? Professional Security Magazine Online, in its Culture Predictions for 2016, predicts that:

  1. The role of the Security Chief will include risk and culture.
  2. Process, process, process will become a fundamental aspect of your security strategy.
  3. Phishing-Data Harvesting will grow in sophistication and catch out even more people.
  4. The ‘insider threat’ continues to haunt businesses.
  5. Internet of Things and ‘digital exhaust’ will render the ‘one policy fits all’ approach defunct.

Finally here’s not so much a prediction but a challenge for 2016 for possibly one of the most hyped technologies of 2015: Why Blockchain must die in 2016.

So what should we make of all this?

In a world of ever tighter cost control and IT having to be more responsive than ever before, it’s not hard to imagine that the business will be seeking more direct control of infrastructure so it can deploy applications faster and be more responsive to its customers. This will accentuate more than ever two-speed IT, where legacy systems are supported by the traditional IT shop and new web, mobile and IoT applications get delivered on the cloud by the business. For this to happen the cloud must effectively ‘disappear’. To paraphrase a quote I read here,

“Ultimately, like mobile, like the internet, and like computers before that, Cloud is not the thing. It’s the thing that enables the thing.”

Once the cloud really does become a utility (and I’m not just talking about the IaaS layer here but the PaaS layer as well) then we can really focus on enabling new applications faster, better, cheaper and not have to worry about the ‘enabling thing’.

Part of making the cloud truly utility like means we must implicitly trust it. That is to say it will be secure, it will recognise our privacy and will always be there.

Hopefully 2016 will be the year when the cloud disappears and we can focus on enabling business value in a safe and secure environment.

This leaves us as architects with a more interesting question, of course. In this brave new world where the business is calling the shots and IT is losing control over more and more of its infrastructure, as well as its people, where does that leave the role of the humble architect? That’s a topic I hope to look at in some upcoming posts in 2016.

Happy New Year!

2015-12-31: Updated to add reference to Simon Wardley’s 2016 predictions.

Is the Cloud Secure?

I’ve lost track of the number of times I’ve been asked this question over the last 12 months. Everyone from CIOs of large organisations to small startups and entrepreneurs, academics and even family members has asked me this when I tell them what I do. Not surprisingly it gets asked a lot more when hacking is on the 10 o’clock news, as it has been a number of times over the last year or so with attacks on companies like TalkTalk, iCloud, Fiat Chrysler and, most infamously, Ashley Madison.

I’ve decided therefore to research the facts around cloud and security and even if I cannot come up with the definitive answer (the traditional answer from an architect about any hard question like this usually being “it depends”) at least point people who ask it to somewhere they can find out more information and hopefully be more informed. That is the purpose of this post.

First of all it helps to clarify what we mean by “the Cloud” or at least cloud computing. Let’s turn to a fairly definitive source on this, namely the definition given in the National Institute of Standards and Technology (NIST) Definition of Cloud Computing. According to the official NIST definition:

“Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.”

Note that this definition makes no statement about who the cloud service provider actually is. It allows for clouds to be completely on premise (that is, within a company’s own data centre) and managed by companies whose business is not primarily IT, just as much as it covers the big ‘public’ cloud service providers such as Microsoft, IBM, Amazon and Google, to name but four. As long as there is network access and resources can be rapidly provisioned then it is a cloud as far as NIST is concerned. Of course I suspect the subtleties around this are lost when most people ask questions about security and the cloud. What they are really asking is “is it safe to store my data out on the internet?”, to which the answer very much is “it depends”.

So, let’s try to get some hard data on this. The website Hackmageddon tracks cyber attacks around the world and publishes twice-monthly statistics on who is being hacked by whom (if known). Taking at random the month of August 2015, Hackmageddon recorded 79 cyber attacks (which, as the website points out, could well be the tip of a very large iceberg as many companies do not report attacks). Of these, none seem to have been on systems provided by public cloud service providers, but the rub here of course is that it is difficult to know who is actually hosting a site and whether or not they are clouds in the NIST definition of the word.

To take one example from the August 2015 data, the UK website Mumsnet suffered both a distributed denial of service (DDoS) attack and a hack where some user data was compromised. Mumsnet is built and hosted by the company DSC, a hosting company rather than a provider of cloud services according to the NIST definition. Again, this is probably academic as far as the people affected by this attack are concerned. All they know is that their data may have been compromised and the website was temporarily offline during the DDoS attack.

Whilst looking at one month of hacking activity is by no stretch of the imagination representative, it does seem that most attacks identified were against private or public companies, that is organisations or individuals that either manage their own servers or use a hosting provider. The fact is that when you give your data away to an organisation you have no real way of knowing where they will store that data or how much security that organisation has in place (or even who they are). As this post cites, the biggest threat to your privacy can often come from the (mis)practices of small (and even not so small) firms who are not only keeping sensitive client information on their own servers but also moving it onto the cloud, even though some haven’t the foggiest notion of what they’re doing.

As individuals and companies start to think more about storing information out in the cloud they should really be asking how cloud service providers are using people, processes and technology to defend against attackers and keep their data safe. Here are a few things you should ask or try to find out about your cloud service provider before entrusting them with your data.

Let’s start with people. According to IBM’s 2014 Cyber Security Intelligence Index, 95% of all security incidents involve human error. These incidents tend to be security attacks from external agents who exploit “human weakness” in order to lure insiders within organisations into unwittingly providing them with access to sensitive information. A white paper from the data security firm Vormetric says that the impacts of successful security attacks involving insiders are exposure of sensitive data, theft of intellectual property and the introduction of malware. Whilst human weakness can never be completely eradicated (well, not until humans themselves are removed from data centres), there are security controls that can be put in place. For example, insider threats can be protected against by adopting best practice around:

  • User activity monitoring
  • Proactive privileged identity management
  • Separation-of-duty enforcement
  • Implementing background checks
  • Conducting security training
  • Monitoring suspicious behaviour
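To make the first and last of those controls concrete, here is a minimal sketch of a user-activity monitor in Python. The event schema, the set of privileged roles and the working-hours window are all illustrative assumptions of mine, not any particular product's API; real monitoring tools are far more sophisticated.

```python
from dataclasses import dataclass

# Hypothetical audit-log event; real tools define their own, richer schemas.
@dataclass
class Event:
    user: str
    action: str
    hour: int   # hour of day the action occurred (0-23)
    role: str   # role the user acted under

PRIVILEGED_ROLES = {"admin", "dba"}   # illustrative assumption
WORKING_HOURS = range(8, 18)          # 08:00-17:59, illustrative assumption

def flag_suspicious(events):
    """Return events worth a closer look: actions taken under a
    privileged role, or any activity outside normal working hours."""
    return [e for e in events
            if e.role in PRIVILEGED_ROLES or e.hour not in WORKING_HOURS]

events = [
    Event("alice", "read_report", 10, "analyst"),
    Event("bob",   "drop_table",   3, "dba"),      # privileged AND out of hours
    Event("carol", "login",       22, "analyst"),  # out of hours
]
suspicious = flag_suspicious(events)
```

Flagging is of course only the start; in practice such events would feed into privileged identity management and security training rather than being an end in themselves.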

Next, cloud providers need to have effective processes in place to ensure that the correct governance, controls, compliance and risk management approaches are taken to cloud security. Ideally these processes will have evolved over time and taken into account multiple different types of cloud deployment to be as robust as possible. They also need to be continuously evolving. As you would expect, there are multiple standards (e.g. ISO 27001, ISO 27018, CSA and PCI) that must be followed, and good cloud providers will publish which standards they adhere to as well as how they comply.

Finally, what about technology? It’s often been said that internet security is a bit like an arms race, where the good guys have to continuously play catch-up to make sure they have better weapons and defences than the bad guys. As hacking groups get better organised, better financed and more knowledgeable, so security technology must be continuously updated to stay ahead of the hackers. At the very least your cloud service provider must:

  • Manage Access: Multiple users spanning employees, vendors and partners require quick and safe access to cloud services but at the same time must have the right security privileges and only have access to what they are authorised to see and do.
  • Protect Data: Sensitive data must be identified and monitored so developers can find vulnerabilities before attackers do.
  • Ensure Visibility: To remain ahead of attackers, security teams must understand security threats happening within cloud services and correlate those events with activity across traditional IT infrastructures.
  • Optimize Security Operations: The traditional security operations center (SOC) can no longer operate by building a perimeter firewall to keep out attackers as the cloud by definition must be able to let in outsiders. Modern security practices need to rely on things like big data analytics and threat intelligence capabilities to continuously monitor what is happening and respond quickly and effectively to threats.
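The "Manage Access" point above boils down to least privilege: every user only gets the actions they have been explicitly granted. A toy sketch of that idea follows; the role names and action names are made up for illustration, and real cloud IAM systems (roles, policies, conditions, auditing) are far richer than this.

```python
# Hypothetical grant table: role -> set of actions that role may perform.
GRANTS = {
    "developer": {"deploy_app", "read_logs"},
    "auditor":   {"read_logs"},
}

def is_authorised(role: str, action: str) -> bool:
    """Deny by default: unknown roles and ungranted actions are refused."""
    return action in GRANTS.get(role, set())
```

The key design choice is the default: anything not explicitly granted is denied, so adding a new user or service grants nothing until someone deliberately decides what it may do.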

Hopefully your cloud service provider will have deployed the right technology to ensure all of the above are adequately dealt with.

So how do we summarise all this and condense the answer into a nice sentence or two that you can say when you find yourself in the dreaded elevator with the CIO of some large company (preferably without saying “it depends”)? How about this:

The cloud is really a data centre that provides network access to a pool of resources in a fast and efficient way. Like any data centre it must ensure that the right people, processes and technology are in place to protect those resources from unauthorised access. When choosing a cloud provider you need to ensure they are fully transparent and publish as much information as they can about all of this so you can decide whether they meet your particular security requirements.

Ping. Floor 11.