Over the years, I’ve been able to bounce around various areas within the IT world: virtualization, cloud design and architecture, the data center, and now application development and DevOps. Throughout that time, there’s always been a fairly clear line separating many of these IT functions. Sure, they would interact, but it didn’t happen often and was mainly out of necessity.
Let’s fast-forward to today. In having conversations with several business and data center leaders, it’s clear that these lines have become pretty blurred. I think that’s a really good thing. Data center and IT leaders are working much more closely with application development and DevOps professionals. Why? DevOps is paving the way for next-generation solutions, services, and delivery methods. These are the folks creating AI engines, big data analytics systems, machine learning capabilities, and so much more. Through their advancements, we’ve seen even further utilization of both cloud and data center systems.
Many of these new and evolving DevOps solutions are born in the cloud. However, there’s still a lot of development that’s very terrestrial. In some situations – because of data locality or sovereignty, network performance requirements, or even security – DevOps has to take place at a data center or colocation facility. For example, running an AI engine on a virtual desktop infrastructure (VDI) that’s housed on-premises. Remember, these aren’t just your typical VMs. In some cases, you’ll need to factor new design considerations into your architecture, like processing requirements, storage, and much more. For heavy compute cycles, you may even need to attach a graphics processing unit, or GPU (think NVIDIA), to support advanced functionality and processing.
I’m getting ahead of myself … we’ll get to talk about integration between data center and DevOps in just a minute.
Before we go on, it’s important to note that there’s actual investment and deployment happening in the data center world to support DevOps operations. For example, in the latest AFCOM State of the Data Center Industry study, Linux container management and orchestration tools came third in a list covering the top 5 data center strategy trends around technology implementation. This shows that data center and infrastructure leaders are poised to support DevOps, as it can actually help them and their business move forward in the market.
Application containers can include runtime components from security to monitoring, orchestration, networking, and storage – basically the code required to execute and alter a piece of software – in a single package, regardless of the infrastructure it sits on. Containers go hand in hand with continuous development, agile, and DevOps practices. This means that modern data centers and the infrastructure behind them must support new container technologies like Kubernetes, Docker Swarm, and Mesosphere, among others. Leveraging direct connections into cloud and colocation providers, you can extend container management into Google Cloud Platform, AWS, or Azure.
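To make the packaging-and-orchestration idea concrete, here’s a minimal sketch of how a containerized workload might be declared for Kubernetes. Every name in it – the deployment, the image, the registry, the port, and the resource limits – is hypothetical and purely illustrative:

```yaml
# Hypothetical Kubernetes Deployment: declares the desired state
# (image, replica count, resources) and lets the orchestrator keep
# the workload running anywhere the cluster lives.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: legacy-api
spec:
  replicas: 3                       # orchestrator maintains 3 copies
  selector:
    matchLabels:
      app: legacy-api
  template:
    metadata:
      labels:
        app: legacy-api
    spec:
      containers:
      - name: legacy-api
        image: registry.example.com/legacy-api:1.0   # hypothetical image
        ports:
        - containerPort: 8080
        resources:
          limits:
            cpu: "500m"             # half a CPU core
            memory: 256Mi
```

Because the same declaration runs on any conformant cluster, this is exactly what makes the “regardless of the infrastructure it sits on” promise portable across the data center, colocation, and public cloud.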
DevOps Addresses Legacy Components
This is such a critical point to understand. Gartner recently pointed out that legacy infrastructure and operations (I&O) practices and traditional data center architectures are not sufficient to meet the demands of digital business. Digital transformation requires IT agility and velocity that outstrips classical architectures and practices. In fact, through 2018, 90 percent of organizations will lack a postmodern application integration strategy and execution ability, resulting in integration disorder, complexity, and cost.
In working with a few DevOps teams, I’ve seen them do some really amazing things with legacy applications, code, web portals, backend and frontend systems, and so much more. I’m seeing how DevOps can work with industries like healthcare to revolutionize patient engagement and even population health. I’m also seeing new DevOps solutions help optimize and modernize entire manufacturing floors.
Before you get too excited, you have to identify which parts of your business are actually slowing you down. Even if the “code is keeping the lights on,” you have to consider the time, money, and resources being spent to keep that legacy code operational. Let me put it into perspective – if you’re afraid to make the smallest changes or even log into the machine, maybe it’s time to review your applications.
DevOps Can Improve Efficiency
Here’s the other big factor to consider: legacy applications can make your data center less efficient. Can these apps run on converged systems? Do they require older operating systems to function? Maybe they’re written in legacy code, which makes them less compatible with new systems. Whichever way you look at this, legacy can really slow things down.
A DevOps-driven approach can help you abstract these legacy applications with APIs, better data management solutions, and even integration with cloud. The big point here is that DevOps teams can help re-write and even fork-lift these applications into a continuous delivery approach. This means they’ll be modernized and integrated with newer solutions.
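The API-abstraction idea above can be sketched in a few lines of Python. The “legacy” routine below is entirely hypothetical – it stands in for an old system that returns raw, pipe-delimited text – and the facade shows the pattern of putting a clean, stable interface in front of it so newer services never call the legacy code directly:

```python
# Sketch of abstracting a legacy application behind a modern API.
# legacy_lookup() is a hypothetical stand-in for an old backend
# routine that returns raw pipe-delimited text.

def legacy_lookup(raw_record: str) -> str:
    """Stand-in for a legacy routine; returns uppercased raw text."""
    return raw_record.upper()

class CustomerAPI:
    """Modern facade: returns structured data instead of raw strings.

    New services depend on this interface, not on the legacy format,
    so the backend can be modernized later without breaking callers.
    """
    def get_customer(self, record: str) -> dict:
        fields = legacy_lookup(record).split("|")
        return {"id": fields[0], "status": fields[1], "updated": fields[2]}

api = CustomerAPI()
print(api.get_customer("1001|active|2017-03-02"))
# {'id': '1001', 'status': 'ACTIVE', 'updated': '2017-03-02'}
```

The design point is the seam: once callers go through the facade, the legacy routine behind it can be rewritten, containerized, or moved to the cloud incrementally, which is what makes a continuous delivery approach feasible.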
You’ll need to work with good infrastructure and DevOps professionals to really understand your backend systems, how applications are coded, and if there are proprietary solutions you have to work with. Some applications might be easier to modernize than others; but the key is to get started!
There are a lot of benefits in working with DevOps to help modernize the infrastructure and develop new application and data capabilities. However, there are definitely some challenges to be aware of.
The DevOps, Data Center, and Infrastructure Divide
Remember that line I talked about earlier? It’s still there, and it can definitely slow things down. There’s a new breed of IT experts who can communicate with DevOps as well as work with infrastructure and data center teams. These folks are able to translate DevOps needs into cloud and even data center design.
Let’s look back at my earlier example of running an AI engine on a VM. It’s quite possible that a DevOps staffer can design a simple VM to run their AI. But is it really running optimally? Was the storage tuned properly? Maybe there needed to be special network configurations to optimize traffic flow. The point is that it’s really key to be able to work with all necessary teams to bring a solution forward.
In a recent survey by Quali about DevOps and cloud computing, 14% of respondents reported that company culture stood in the way of implementing DevOps. Other issues included testing automation (13%), legacy systems (12%), application complexity (11%), and finally budget constraints (11%). Other responses revolved around challenges with limited IT skills and even a lack of executive buy-in.
Here’s the statistic that really brings my point to light: The study found that respondents are still burdened by complex applications that make the transition to cloud and DevOps challenging. Over 44% of applications in traditional environments were considered complex for cloud.
“While the term DevOps is often associated with leading-edge projects, mastering DevOps isn’t only about innovating on the ‘cool’ technologies faster; it’s also about building the capabilities to perform modern application development across the board,” wrote Diego Lo Giudice of Forrester Research in the December 2016 report, Master DevOps For Faster Delivery of Software Innovation. “For many companies, staying ahead of disruption means not only delivering new innovations but also modernizing current software and systems.”
Figuring out the Next Steps
I know this may be a lot to take in. However, there’s a good chance that your organization has applications that could use an update. Whether they’re proprietary solutions or custom-coded applications, it’s critical to review the viability of those applications and the value they’re bringing to your business.
Modern DevOps organizations are equipped to have conversations that range from the data center to the cloud. This can equip you to simplify the application modernization process. Either way, you have to start somewhere, and the starting point is usually an audit and assessment of your application, user, and data environment.
Remember, your ability to innovate directly revolves around how you leverage applications and data. Please don’t adopt a “just because it works, we won’t touch it” mentality. This is the kind of thinking that will actually put you behind the innovation curve and weaken your competitive position.
For those folks in the data center and infrastructure space, make sure to have conversations with DevOps professionals. This should include discussion around use cases, ways to optimize both the code and the underlying systems that the apps sit on, and how you can continuously, and positively, impact your end users. All of this can translate to a functioning business technology engine working to help your organization stay relevant in an ever-evolving digital market.