Is cloud having a negative impact on IT spend and jobs?

I have always been a performance and efficiency geek. Throughout my career, I have worked on making software run faster and more efficiently, on automating many development and business processes, and on helping organizations increase their application delivery agility. Occasionally, I encountered resistance to increasing efficiency, typically within large organizations where people were worried about job security, but I rarely thought of any downsides to this efficiency-enhancing work.

When I was in the Israeli Air Force, we made a game-changing enhancement to developer productivity. We offered it for free to our US supplier, since we knew our work would benefit future deliverables. I was quite shocked that the guys who had been doing the same work for many years had no interest in adopting a better way. Clearly, being more productive can be seen as a threat to job security, and that seemed to be the reason for their lack of interest.

Later, when I spent time working with a variety of organizations to adopt a continuous delivery methodology, I found many had an interest in increasing efficiency, but I also met significant resistance from the IT ops teams, who had no desire to move towards a DevOps culture. Possible reason? Some of the teams didn’t want to automate themselves out of a job. They clearly hadn’t heard one of my favorite sayings: “the most irreplaceable people are the ones who make themselves replaceable.”

Currently I work for Amazon Web Services, where we strive to maximize the value delivered to customers by offering the best, most cost-effective services. We are able to do that by leveraging innovation and economies of scale, delivering unprecedented value at customer-friendly costs. This model is different from what we have seen in the past. There is significantly less hardware waste, because we are able to dramatically improve hardware utilization, and the cost savings go back to the customer. There is no shelfware, since customers pay only for what they use and therefore incur less wasted spend. And customers benefit from not needing their IT staff to manually handle many tedious tasks.

But this raises the big question of whether cloud will erase a huge amount of IT spend and jobs, due to increased efficiencies and lower costs. Is progress actually having a negative impact on my field, and am I contributing to its demise? Am I innovating myself out of a job? Should IT go back to buying hardware, manually plugging in cables, and buying, installing, and self-managing software?

In the past couple of years, I encountered a similar question related to PHP. We were getting ready to release PHP 7, a major new version of the most widely deployed web development language. The new version promised to be at least twice as fast, with significantly lower memory usage than prior versions. On average, one needed at most half the machines (and often fewer) to drive the same amount of workload. At the time, we were collaborating with Intel on performance and efficiency enhancements, and one person was pondering whether the overall server monetization opportunity (software and hardware) would be negatively impacted.

An Intel engineering manager pointed to the Jevons paradox. This paradox occurs when technological progress increases the efficiency with which a resource is used, but the rate of consumption of that resource rises because of increasing demand. In fact, consistent with the Jevons paradox, the manager suggested that with PHP being so much more efficient, companies would likely use more of it, derive additional use-cases for the language, and therefore cause the overall applicable market to grow. Initially, I had to scratch my head; it was easier to think that 2-3x the density on a given server meant at least 50% fewer servers! But the more I read about the Jevons paradox, the more I realized there was a credible case that performance and efficiency progress (in our case, typically 3x more throughput for a given server) could grow overall consumption rather than shrink it.
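To make the paradox concrete, here is a toy calculation. All the numbers are hypothetical assumptions for illustration, not actual PHP 7 or AWS figures: if per-server throughput triples but the lower cost per request unlocks enough new demand (a four-fold increase is assumed below), the total number of servers consumed actually goes up, not down.

```python
# Toy illustration of the Jevons paradox. Every number here is an assumption
# chosen for illustration, not a measurement from PHP 7 or any real deployment.

def servers_needed(total_requests_per_sec, requests_per_sec_per_server):
    """Servers required to handle a workload at a given per-server throughput."""
    return total_requests_per_sec / requests_per_sec_per_server

# Baseline (assumed): 90,000 requests/sec of demand, 1,000 requests/sec per server.
baseline_demand = 90_000
baseline_throughput = 1_000
print(servers_needed(baseline_demand, baseline_throughput))    # 90 servers

# Naive view: throughput triples, demand stays fixed -> roughly a third of the servers.
improved_throughput = 3 * baseline_throughput
print(servers_needed(baseline_demand, improved_throughput))    # 30 servers

# Jevons view: the lower cost per request unlocks new use-cases, so demand
# grows faster than the efficiency gain (assumed 4x growth here).
expanded_demand = 4 * baseline_demand
print(servers_needed(expanded_demand, improved_throughput))    # 120 servers, more than before
```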

I am convinced that the Jevons paradox holds true for cloud computing and the value we deliver at Amazon Web Services. By not only delivering significantly improved resource and cost efficiencies but also making the resources easier to use, we see IT consuming more software and hardware than ever before. The cost of entry is low, so organizations are able to test and adopt new capabilities quickly, without long and tedious decision cycles and without the risk of wasted shelfware. We are also seeing customers build out new use-cases that were previously difficult and/or typically reserved for a select few companies. New use-cases are driving an increase in IT innovation in both breadth and depth, which adds to increased consumption.

But what about the IT staff whose job security rested on waste and manual work? The growth in cloud adoption, increased consumption, and new use-cases appears to be driving demand for even more talent that can leverage these new technologies and fully automate the next generation of applications; there’s plenty of work!

All signs point to more, not less, demand for IT talent, along with increased resource consumption. This is an amazing time to be in IT!

Disclaimer: The opinions in this post are solely my own and do not represent the opinion of my employer. My opinions are based on what I see in the broad industry and my own personal experiences.
