(“In Case You Missed It Monday” is my chance to showcase something that I wrote and published in another venue, but is still relevant. This week’s post originally appeared on IT Pro Portal)
Cloud computing costs continue to grow, and organizations are now spending more of their budgets on cloud than ever before. While part of the cloud computing market’s growth can be attributed to digital transformation, the pandemic has also forced businesses to make it easier for their employees to access company documents and files while working remotely.
To learn more about the current state of cloud computing and how rising cloud costs are affecting businesses, ITProPortal spoke to SolarWinds’ Head Geek Leon Adato.
Can you tell us a bit more about what being a Head Geek™ at SolarWinds entails?
There are two sides to the role: the outward-facing job and the inward-facing one.
I like to frame my customer-facing work as being a “storyteller.” I get to talk about the problems I’ve encountered as an IT professional in general and as one who’s focused on the subdiscipline of monitoring. I can contextualise these problems or situations to help folks understand whether they might be on the verge of an issue and point to potential actions capable of helping them solve—or preferably avoid—the problem.
How I tell this story varies based on the best medium for the message. I might tell it in a blog post, an eBook, as a talk on stage at a conference, in the booth at a convention, on a podcast, or in a video.
Turning inward, Head Geeks take all the stories and reactions we hear and the conversations they spark and present them as “voice of the customer” insights.
In this way, the things SolarWinds builds drive the stories I tell, which affect the conversations I have, which become the feedback I share with our developers, which complete the circle to become the next generation of features and tools we build.
The price of using public cloud continues to increase for businesses. What advice would you give to companies looking to lower their cloud spend, and should they consider moving some of their data back on-premises?
I want to clarify a bit: the cost of public cloud may or may not increase, depending on which platform customers choose, which service(s) they’re leveraging, and what workloads they’re putting into it. In some cases, the cost of cloud has gone down. But what’s undeniably going up is the amount of money companies are spending on cloud. Some of this is good business; other times it’s an embarrassing dumpster fire of false assumptions and throwing good money after bad.
Of course, the part every company wants to discuss is the former, while pundits, journalists, and Corey Quinn like to focus on the latter. I’m not saying we’re inventing the story of rising cloud costs out of thin air (or vapor, as it were), but “man pets dog, dog wags tail” is not a “Film at 11!” story.
Many organisations move workloads to the cloud without completely understanding what they’re getting into. They either don’t understand their workload, don’t understand the cloud platform in general, or don’t understand the tier of service they’ve chosen.
Not understanding the workload boils down to a “lift and shift” of an existing (but unmonitored) server or application, paired with the magical belief that all things are easier/cheaper/better in the cloud. If you have no statistics on how a system operates on-premises, you can’t hope to understand (much less contain) the cost of putting it on a platform designed to automatically scale (for a price) based on usage. I’m not saying lift and shift is universally bad. But if you can’t document the resources an application used while in the DC, you’re asking for a nasty shock about a month down the road when the first bill comes due.
Not understanding the cloud platform in general means not being cognisant of which activities will generate cost (moving data from one system to another, for example), the optimal way applications should be set up in the cloud, and how this differs from on-premises best practices.
Finally, not understanding the specifics of the chosen tier of service boils down to ensuring you’ve got the right “tool” for the job. Some tiers were built for microservices and will cost an arm and a leg if you house a persistent container or VM there. In other cases, the exact opposite is true.
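To make that concrete, here’s a back-of-the-envelope sketch in Python. Every rate in it is a hypothetical placeholder rather than any real provider’s price sheet; the only honest inputs are the usage numbers you pull from monitoring the workload before it moves: request volume, per-request run time, memory footprint, and egress. Change the tier or the assumptions and the “cheap” option can flip.

```python
# A rough cost comparison between an always-on VM tier and a per-invocation
# ("serverless") tier for the same workload. Every rate here is a HYPOTHETICAL
# placeholder, not real pricing; replace them with a current price sheet and
# replace the usage figures with numbers from your own monitoring.

HOURS_PER_MONTH = 730

# Hypothetical rates (illustrative only)
VM_RATE_PER_HOUR = 0.10                     # always-on VM, per instance-hour
SERVERLESS_RATE_PER_GB_SECOND = 0.0000166   # per GB-second of execution
SERVERLESS_RATE_PER_MILLION_REQUESTS = 0.20
EGRESS_RATE_PER_GB = 0.09                   # data leaving the platform, per GB


def vm_monthly_cost(instances: int) -> float:
    """Cost of keeping N instances running around the clock for a month."""
    return instances * VM_RATE_PER_HOUR * HOURS_PER_MONTH


def serverless_monthly_cost(requests: int, avg_seconds: float, memory_gb: float) -> float:
    """Cost of the same workload on a per-invocation tier."""
    compute = requests * avg_seconds * memory_gb * SERVERLESS_RATE_PER_GB_SECOND
    invocations = (requests / 1_000_000) * SERVERLESS_RATE_PER_MILLION_REQUESTS
    return compute + invocations


def egress_monthly_cost(gb_out: float) -> float:
    """Cost of the data the workload sends out of the platform each month."""
    return gb_out * EGRESS_RATE_PER_GB


if __name__ == "__main__":
    # Usage figures you would pull from monitoring, not guess at:
    requests = 2_000_000    # requests handled per month
    avg_seconds = 0.3       # average handling time per request
    memory_gb = 0.5         # memory footprint per request
    egress_gb = 500         # data sent out per month

    print(f"Always-on VMs (2 instances): ${vm_monthly_cost(2):,.2f}/month")
    print(f"Per-invocation tier:         ${serverless_monthly_cost(requests, avg_seconds, memory_gb):,.2f}/month")
    print(f"Egress, on top of either:    ${egress_monthly_cost(egress_gb):,.2f}/month")
```

The exact numbers don’t matter; what matters is that you can’t even fill in the inputs without a baseline from the workload as it runs today.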
Still, when you say, “The price of using public cloud continues to increase for businesses,” it’s functionally true—the line-item cost labelled “cloud” on corporate budget sheets is getting larger. This is due, in equal parts (when you look at it across the spectrum of industries), to some companies buying the wrong thing, buying the right thing and using it wrong, or buying the right thing, using it right, and choosing to use it more.
Do you think rising cloud computing costs could lead to cloud economists who are able to help businesses get the best deals possible?
Oh God, I hope so. On a recent ActualTech Media webcast, my colleague Kevin Sparenberg and I toyed with the idea of a potentially attractive new role for many IT professionals who have a head for tech and a nose for business and are looking to pivot their careers to a new challenge: call it CFO (cloud finance officer), oCFO (the “other” CFO), or (as you more correctly labelled it) cloud economist.
This role cannot be pure finance, as the technical aspect of the role isn’t trivial. And it can’t be pure tech because the level of integration with business goals is equally complex and important.
I hope to see businesses embrace the idea they need a cloud economist on staff, even as external consultants like Corey Quinn ply their trade. This would follow the same model as internal vs. external legal teams. There’s enough work to support both.
What steps are companies taking to be more environmentally conscious? What is the IT industry doing for sustainability beyond improving efficiency?
Though this isn’t an area I spend a lot of time focusing on, I think there’s a level of effort being made. It’s not enough, and there’s plenty of fruit—both low-hanging and higher up the tree—ripe for the taking.
Still, I think many companies are more aware of their waste management policies, disposing of everything from paper and toner to old hardware more thoughtfully. I think the heyday of bitcoin mining raised awareness among many IT folks of the global consequences of power consumption choices, and that awareness has changed behaviour in everything from lighting automation in buildings to the way power and heat/cooling distribution systems are designed in the NOC.
In one respect, the desire to be more sustainable is a minor aspect of the drive to cloud. The idea (which may in some cases be a naive hope) of consolidating our compute workloads into a single mega-NOC (i.e., the cloud) and therefore reducing overall power and cooling consumption is attractive to many of the environmentally-minded folks in IT.
How can organisations improve their cyber hygiene, and have more companies taken steps to do so to better protect their data and systems?
Looking back across the 30+ years I’ve worked in IT, I feel like “security is everyone’s responsibility” is a mantra repeated frequently and ignored just as often.
The plain truth is there are plenty of flashy (and expensive) security procedures companies can employ, and there’s no lack of poorly designed systems from a security perspective.
But despite all this, far and away the largest area of vulnerability is the end user. If we as IT professionals don’t commit to understanding our end users’ needs, to listening to their goals, hearing their concerns, and building procedures that don’t penalise them in terms of getting their work done, then we’ll continue to fail to secure our environments.
Should IT pros take the time to learn the language of business, and how could this help them in their roles? Has business literacy become a requirement for leading IT roles?
Honestly, it never stopped being a requirement. We (IT folks) have gotten so good at (or so adamant about) avoiding it that the business has stopped asking us to be part of the conversation.
I often refer to learning the language of business like this: it’s like learning any other language. Just because I learn French doesn’t mean I stop knowing how to speak English. And learning French doesn’t somehow magically turn me into a French person. What it does (through the process of learning the vocabulary and syntax) is impart an appreciation for aspects of French culture.
So in asking IT folks to learn the language of business, we’re not somehow implying that they’ll lose their technical edge or become management. They’ll just develop an understanding and appreciation of the business culture within their companies.
And here’s why it’s important: if I stood on a street in Paris and screamed at people in English, not only would they not understand me, they wouldn’t want to help me. I think most of us can intuitively understand this concept. And yet IT people have no problem speaking in “tech” and becoming frustrated when our business colleagues not only fail to understand us but refuse to engage with us.
Business leaders aren’t going to stop speaking business-ese. This is the lingua franca of corporate activity. We have to make the effort to learn and adapt.
How do you think the ongoing pandemic has affected the demand for cloud computing?
I want to address three ways it has, and they speak to the wide appeal of the platform and why the advent of cloud computing has transformed our industry.
First, I think this global crisis and the resulting push to a more remote workforce have created a greater need, appreciation, and demand for cloud-based software as a service (SaaS) offerings, covering everything from email and document creation and management to collaboration and scheduling tools. There are several reasons for this: it lowers the overall support load for already overtaxed internal IT staff, it eliminates the need to install and maintain software on the local machine, and it generally reduces the system requirements for operation, giving the equipment a longer life.
Second, the pandemic has shone a light on the need for companies to shift their business-critical applications (whether they’re customer-facing or internal systems) to a more scalable and available platform than the NOC-based version.
Finally, this health crisis has sparked an appreciation for the way cloud-based data modelling and machine learning techniques can identify, gather, process, and present vast quantities of rapidly shifting data sets. Businesses are realising the processes involved in the Johns Hopkins COVID map can be equally applied to sales patterns, supply chain logistics, and more.
What advice would you give to businesses turning to cloud for the first time in response to the need for remote working?
Understand that not every workload works well in the cloud (in fact, the 2017 IT Trends Report showed 20% of the workloads that moved to the cloud moved back within six months). But businesses can only make this assessment if they can tell whether current performance is suboptimal vis-à-vis the on-premises version. So monitoring is key. I know this sounds self-serving because I work for a monitoring software vendor, but I’ve been doing this for 20 years with various tools, so I mean it.
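To show what I mean by a baseline (and this is just a minimal sketch, not a substitute for a real monitoring product, ours or anyone else’s), here’s a small Python script that samples host utilisation once a minute and appends it to a CSV. It assumes the third-party psutil package is installed, and the output file name is only a placeholder. The point is simply that you want weeks of numbers like these in hand before you try to predict what a workload will cost, or how it will behave, once it’s lifted and shifted.

```python
# A minimal baseline collector: sample host-level utilisation at a fixed
# interval and append it to a CSV, so there are real numbers to compare
# against (and to size cloud resources with) before anything moves.
# Assumes the third-party psutil package is installed (pip install psutil);
# the output file name is a placeholder.

import csv
import os
import time
from datetime import datetime, timezone

import psutil

SAMPLE_INTERVAL_SECONDS = 60
OUTPUT_FILE = "baseline_metrics.csv"  # hypothetical output path

FIELDS = [
    "timestamp", "cpu_percent", "memory_percent",
    "disk_read_bytes", "disk_write_bytes",
    "net_bytes_sent", "net_bytes_recv",
]


def sample() -> dict:
    """Take one snapshot of CPU, memory, disk, and network counters."""
    disk = psutil.disk_io_counters()
    net = psutil.net_io_counters()
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "cpu_percent": psutil.cpu_percent(interval=1),
        "memory_percent": psutil.virtual_memory().percent,
        "disk_read_bytes": disk.read_bytes,
        "disk_write_bytes": disk.write_bytes,
        "net_bytes_sent": net.bytes_sent,
        "net_bytes_recv": net.bytes_recv,
    }


def main() -> None:
    new_file = not os.path.exists(OUTPUT_FILE)
    with open(OUTPUT_FILE, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        while True:
            writer.writerow(sample())
            f.flush()
            time.sleep(SAMPLE_INTERVAL_SECONDS)


if __name__ == "__main__":
    main()
```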
Remember, the “cloud” continues to change. Providers are constantly updating with new offerings and making fundamental changes to their existing ones. You have to be prepared to shift with the changing tides.
Appoint a cloud economist. As we already covered, “cloud” is equal parts technical and financial. Someone on your team needs to be comfortable with both aspects, or you (and your business) will be disappointed.
How else do you anticipate the cloud computing industry will be affected by coronavirus in the long term?
Apart from the changing demands outlined in the previous question, I think cloud is fundamentally going to continue its current trajectory. The move of businesses to the cloud has been inexorable for at least four years, and it shows no sign of slowing; people will continue to leverage cloud-based resources and architectures in the building of modern applications.
If anything, the coronavirus will cause cloud offerings to improve, because more IT people will be supporting more applications, services, and workloads running in it, and they’ll demand a higher level of functionality, flexibility, and innovation from it, just as they have from every other new technology to come down the road.