Over recent years, I’ve noticed that pretty much every macro-level keynote in our industry covers the same three horsemen of the digital apocalypse:
- Software is eating the world!
- Businesses must change or die!
- Buy our products/services (or die)!
Few things rankle me more than a business meme I’m forced to hear over and over. I’m left to delight myself with whatever new clip-art the presenter uses. Corporate clip-art, deep as that well is, can only sate me for so long, though, and I’ve recently been curious to investigate the validity of the horsemen. The third one, of course, has to be evaluated on a case-by-case basis, but the first two are pitched as applying to everyone. Usually, if you follow the citation chain for the first two, you get lost in a dense forest of leprechauns, if there are any citations at all.
So, I thought I’d start checking in on these two bedrock ideas more:
- What does “software is eating the world” really mean, and is it really on a tear like a starved man at a Shoney’s?
- Are businesses really facing such an existential crisis that they need to dramatically change? Should we expect them to change?
Below is what I’ve found so far. I’d love to hear what y’all are seeing out there. Comments welcome!
Software is Eating the World (SIETW)
My biggest question around “software eating the world” is “hasn’t software always been eating the world?” Perhaps the point is more about the magnitude of that hunger. Or, as I’ve begun to suspect, it’s more about how often people use software on a daily basis, indicating that businesses should start “doing” software more. Let’s look at these ideas through a few data points.
More Devices Means More Software
Each computational device out there needs software. So, we might look at the idea of software eating the world simply as the rise of devices. There are certainly more networked devices out there. For example, mobile, as a category of computing, has grown incredibly fast, and the total footprint of software used across these devices is huge. Just compare PCs, smartphones, and tablets in Gartner data over the past three years:
[Chart: PCs, smartphones, and tablets. Source: Gartner 2012, 2013; 2014 to 2016.]
This growth would drive a significant increase in the amount of software in the world. The mobile operating systems, apps, and all the back-end services involved must add up to a lot of net-new code. More importantly, the growth in non-PC devices certainly drives more use of software, since we have “a supercomputer in our pocket” at all times, ready to go for scooting numbers together, making slides, or price-checking steak sales.
Back When It Was Just “Things”: Embedded Software
In addition to the obvious rise of new device types, there’s a whole category of software us application-minded folks barely think about: embedded software. That’s the stuff that runs your TV, factory robots, and other industrial devices. Surely there’s a lot of this software around already? Sized by spend, the embedded systems market was around €160bn back in 2009, and each of those systems had a pretty big chunk of code behind it.
Of course, those figures date from well over five years ago, before software started all that eating. Imagine how much code there is now. More recent estimates (from IDC, it seems) put the embedded market at around €1.5 trillion.
I bring up embedded software because it’s the very real ancestor of what we’ve been calling “The Internet of Things.” The numbers there get equally massive as more and more devices (“things”) not only have software embedded in them but are always on the network, coordinating with each other, with “regular” computing devices, and with us humans. Adam gave a great example of this in his recent post on connected cars. To measure by market forecasts, IDC estimates that the IoT market will be $7.1 trillion(!) by 2020(!). With the hire of Ben Black, we’re starting up a lab to tackle this space head-on. Interested in helping out?
First Course: The Back-Office
When the SIETW meme was unleashed in 2011, it was clear that software was already a chunky fellow. Indeed, if you treat yourself to all 5 hours and 46 minutes of Goldratt’s Beyond the Goal, you can hear some of the breathless hopes and dreams about how software chewed through the enterprise back office over the past few decades, plus plenty of advice on avoiding pitfalls that still applies today. We see this in our individual lives when we sign up for health insurance, set up direct deposit, go through corporate training, and otherwise deal with “paperwork” for the companies we work for. And while the joke may be that every company’s finances really run on Excel, accounting and finance departments have long had systems of record in place that have been feeding software’s hunger.
Only Use Shows Value
It should be clear that there are already more lines of code out in the world than anyone could count, and that the sheer bulk of software is growing daily. If measured by size, software long ago became morbidly obese from all that eating. However, I don’t think sheer poundage is really the intent of “software is eating the world.” As any modestly aged developer will tell you, measuring the pure volume of software (lines of code, object instructions, etc.) is a poor yardstick. You want lines of code? I can get 20,000 LoC easy, believe me, by 3pm! What you want to measure is something more like “benefit achieved,” or features, or even just straight-up applications.
I would argue that our quest to track software’s “caloric intake” should be measured by the number of times software shows up in your life each day—how often you use software. Instead of paying for a coffee with a knuckle dragger, I use Apple Pay; instead of asking advice from a person at a camera store, I look up reviews on my phone; instead of tracking my mileage with pen and paper, I just look at my logs in Automatic. And so forth.
In this framing, I’m pretty sure “software eating the world” means businesses need to write more software: rewriting existing applications as mobile apps, writing net-new applications to replace “analog” processes, and coding the brains out of whatever the Internet of Things pans out to be (our friends at GE have some great ideas beyond dimming light bulbs). Importantly, businesses also realize that software is the primary point of interaction with customers: storefronts, email, chat, app usage, IVRs, EDI, etc. All of these are enabled by computers moving beyond the desktop and the server room to your pocket, your environment, and industrial equipment.
With software sinking deeper roots into products, services, and customer interactions, businesses will use more software in the front-office and do more data analysis in the back-end. Uber has become the most voracious software-augmented eater in recent years. While it’s a tired business meme by now, Uber’s phenomenal momentum, with rumors of net revenue (cash after paying the drivers) growing from $400m in 2014 to an expected $2bn in 2015, is instructive for other would-be SIETW hopefuls. If something as boring as the taxi industry can go all topsy-turvy from software, it’s easier to imagine how many other industries could become equally interesting after a healthy injection of more software.
Now The Real Gorging Can Begin
With the table set, we can see that there are more networked devices than ever, with even more IoT courses on the way! Software is clearly being used for more customer engagement and analytics as well, and together these drive a new, sharp rise in the actual use of software, not only in the consumer space but also in the business sector. For me, this is where things get interesting. I’m deeply interested in how enterprises use technology to establish competitive advantage, in particular, how they use custom-written software to drive profits. This SIETW path of “software use” translates directly into profit: companies are using software more than ever to compete and win customer cash. And if they’re not doing so, they certainly have the potential to do so.
Better business has always been the promise of IT, but advances in programming languages, cloud infrastructure, and agile practices have made software development quicker and more efficient in recent years. In an application of Jevons Paradox, that efficiency means businesses are (likely) creating even more software, not less. From what I can tell so far, software really is consuming more and more of our daily lives. It’s not just the sheer bulk of software that drives the SIETW conclusion, since we’ve always had lots of software around us, but rather the amount of quality time that we spend with software.
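To make that Jevons logic concrete, here’s a toy back-of-the-envelope calculation. Every number in it (the efficiency gain, the demand elasticity) is invented purely for illustration; the point is only the shape of the effect:

```python
# A toy sketch of Jevons Paradox applied to software delivery.
# All figures are hypothetical, picked only to show the mechanism.

cost_per_app = 1_000_000     # rough cost to deliver an app, pre-cloud/agile
efficiency_gain = 2.0        # assume cloud + agile halve that cost
demand_elasticity = 1.4      # assume cheaper apps unlock outsized demand

new_cost_per_app = cost_per_app / efficiency_gain       # $500k per app
apps_multiplier = efficiency_gain ** demand_elasticity  # ~2.6x more apps built
spend_multiplier = apps_multiplier / efficiency_gain    # ~1.3x total spend

print(f"cost per app: ${new_cost_per_app:,.0f}")
print(f"apps built: {apps_multiplier:.1f}x, total spend: {spend_multiplier:.1f}x")

# When demand elasticity is greater than 1, making software cheaper to
# produce increases *total* consumption: more apps get built, and overall
# spend on software goes up, which is Jevons Paradox in a nutshell.
```

If that elasticity assumption holds, efficiency gains don’t shrink the software bill; they grow the total amount of software in the world, which is exactly the SIETW effect.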
If you buy into the notion that you should use and create more software as one of the core enablers of your business (“become a technology company” to use another trope), then it’s likely that you must “change, or die,” as so many keynote slides have said over the years.
Change Isn’t For Everyone
As we run around imploring companies to change, to become more like “technology companies,” and to submit to SIETW, are we being realistic? Change isn’t easy, and there’s little evidence that companies have gotten much better at mastering change since, you know, back when we did accounting on clay tablets.
In recent years, there’s been an ongoing refinement and explanation of disruption theory, one of the better bodies of work on the need for this kind of change in business. I look to Horace Dediu as the person who can bob and weave an explanation best. Disruption theory amounts to the idea that you, an existing business, will be blind-sided by a competitor that starts with a cheaper alternative to your product, builds a base, and slowly, then all of a sudden, overtakes you. (The bobbing and weaving comes in with companies like Apple, which seem to get the same results by following seemingly different tactics; Ben Thompson is good at tweaking the theory to account for disruption mutants like Apple.)
In the tech industry, the pattern is well known, but companies are so often trapped by past success and existing revenue that they don’t change. There’s also a Halo Effect of focusing only on the disruptive winners and ignoring all the failed would-be disrupters. In the tech community, we follow the old “only the paranoid survive” mantra, but my time in strategy and M&A showed me a distressing pattern: large tech companies find it incredibly hard to self-disrupt or even innovate! In fact, they often use acquisitions to fill that need.
If tech companies find it so hard to change, how do non-tech companies fare? Well, from one study, we might infer that they do poorly.
One must tread carefully here; the amount of work it would take to draw strong causation between “change is hard” and these companies failing or rising is almost impossibly large. However, it does suggest that mighty companies find it hard to stay agile enough to remain on top. One has to think “inability to change fast enough” contributes to their downfalls somehow. I don’t think any management team wants to fail, and they’d be happy to change if it kept them afloat.
At the very least, this rate of decay in large companies points towards the introduction of new companies in the future. One prediction estimates that, by 2027, nearly 75% of the members of the S&P 500 will be new, previously unheard-of companies. Going industry by industry, anecdotally, you can see the patterns in place for this to happen, often with software enabling the disruption. Again, Uber provides the Uber pattern. Mary Meeker’s annual data-rich presentation last year highlighted how several other industries, like education, finance, healthcare, and even grocery shopping, are entering the SIETW pattern.
When you see these kinds of slides, they’re not really predicting success (the logos of all the funny-named companies come and go each year). Instead, what you’re supposed to take away is that a previously disruption-proof industry now has cracks in its wall. The leaders in that industry can no longer be asleep at the wheel of their cash barges. For disruption nerds: the hotel industry famously resisted disruption for decades, but the rise of AirBnB, HomeAway, and others is now showing that even that very analog industry is susceptible to SIETW-based competitors.
Narrowing down to how IT departments are dealing with change: in our 451 DevOps studies, we found that moving to the next stage of DevOps was often held back by people’s resistance to change. A more recent wet-finger-in-the-wind poll from Gartner’s Tom Bittman illustrates the same point, that change is hard. He surveyed 140 conference attendees about their private cloud projects and found that 95% of them were unsatisfied with the progress they were making; much of the blame fell on not addressing operations concerns enough, setting expectations wrong, and otherwise not changing enough. This is reflected in broader studies, like the McKinsey surveys on “going digital.” Consistently, changing organizational structures and processes helps IT-based business projects, while failing to change hurts them.
So, the answer here is more grim: change is hard, often dramatically so, even for technology companies, who are supposed to be masters of change in their paranoid, disruption-craving states. This horseman is more of an imperative: change or die. And while I find that annoyingly hyperbolic (which means a lot coming from someone who’s neurotically hyperbolic in his own rhetoric), it sure seems to be the case.
Moving Beyond The Blinking Cursor
Recently, I had the chance to hear many Pivotal customer stories at our annual sales kickoff meeting, which proved this point anecdotally, for sure. As Stacey covered earlier this week, in most cases, companies were struggling, failing even, if they didn’t change enough. Just installing and using our lovely products wasn’t enough. They had to learn new methods of not only doing software but running their businesses. It didn’t happen in every story, but most of them had a dramatic point where a team or leader took a bold step to try something new, to change.
If you’ve heard Warner Music’s Jonathan Murray talk over the years, you’ll be familiar with the idea of building a “software factory.” This is a major part of the change and another meaning of “software is eating the world.” Organizations are hiring more and more developers as they realize they need to do more than just upgrade to the latest version of their ERP suite or maintain years-old internal applications. Doing that kind of new development with traditional IT has often been too expensive, but agile companies are seeking to change how they operate. They are eager to build out IT platforms that let them rapidly experiment and learn: to become technology companies, as one of our customers put it recently.
The first step, as any developer will tell you, is to get yourself a platform that up-levels the part of the stack you pay attention to. You don’t want to spend your time and money standing up raw infrastructure just to get a blinking cursor. You want to do the work that comes after the blinking cursor: write applications, deploy them to your customers, observe how your customers use them, and repeat the cycle to perfect the apps and drive more revenue. At Pivotal, our goal is to encapsulate as much of that platform as possible in Pivotal Cloud Foundry. This way, you can focus on the applications that actually bring in revenue instead of on getting that blinking cursor up on the screen.
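To make “the work after the blinking cursor” concrete, here’s a minimal sketch of the kind of app that is the actual point of all this plumbing. The app itself is hypothetical, a bare Python web app; the one platform-specific detail is that Cloud Foundry hands your app the port to listen on via the PORT environment variable:

```python
# app.py: a hypothetical, minimal web app, i.e., the "work after the
# blinking cursor." The platform supplies everything beneath it:
# OS, runtime, routing, and logs.
import os

from flask import Flask

app = Flask(__name__)

@app.route("/")
def hello():
    # In a real app, this is where the revenue-driving features live.
    return "Hello from the platform!"

if __name__ == "__main__":
    # Cloud Foundry injects the port the app must bind to via $PORT.
    port = int(os.environ.get("PORT", 8080))
    app.run(host="0.0.0.0", port=port)
```

With a one-line requirements.txt declaring flask, a `cf push my-app` from the Cloud Foundry CLI (the app name here is made up) stages, containerizes, and routes the app; everything below that line is the platform’s problem, not yours.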
Learn More at Cloud Platform Roadshow
See Cote’s presentation, “Software Kept Eating the World,” along with sessions on Cloud Foundry development, operations, architecture, continuous delivery, and microservices at the Pivotal Cloud Foundry Roadshow in Cincinnati (Mar 10) and Columbus (Mar 12).
Recommended Reading:
- Change and Transformation with Pivotal Labs
- Making Decisions to Build or Buy a PaaS and Where to Prioritize Development Efforts
- The Benefits and Capabilities Behind Pivotal Cloud Foundry
- More Cloud Foundry blog articles