Six Key Variables to Consider When Setting Up IT Workloads

June 2, 2016
In this week’s Voices of the Industry, Chris Sharp, CTO, Digital Realty, gives insights on a strategy for examining your various IT workloads and ensuring that each is set up in the ideal environment.

CHRIS SHARP, CTO, Digital Realty

In today’s world of social, mobile, analytics, cloud and content, the data center is no longer just a white floor where organizations store their servers. The data center is now a hub for cloud and network connectivity, responsible for liberating information collected on servers to help an organization’s community – made up of customers, partners and employees – easily and quickly exchange information to drive revenue and growth.

However, not all IT workloads are created equal. To participate in today’s exchange economy, each IT workload needs to be carefully assessed and managed in the right environment. For instance, one workload may require high data transfer rates, while another may need extra protection because it handles sensitive data. Further, certain workloads – such as virtual desktops and hybrid storage – might benefit from sitting next door to major cloud service providers, such as Amazon Web Services (AWS) or IBM SoftLayer, for greater efficiency.

The first step IT teams must take when designing their systems is to sort out the workloads based on their requirements and rules. Below are six key variables every IT team needs to consider, along with strategic questions that need to be answered:

1. Performance: What is the required time to complete a unit of work (e.g., page load, transaction)? What does this mean in terms of location, latency and bandwidth?
2. Capacity: What resources (e.g., compute, memory, storage, network bandwidth) are needed to deliver a unit of work (e.g., transaction, session)?
3. Read vs. write: What is the proportion of data that is read from a data source compared with what is written? This has implications for storage design and capacity growth. It also affects where data needs to be located.
4. Security: How securely must information be stored, transmitted and used? Considerations include compliance, data sovereignty and encryption.
5. Variability: How constant or variable is the workload?
6. Reliability: What happens if the service is unavailable for a period of time? What does this imply for the corresponding workload? For example, how should an emergency response service be designed so that it is always reachable?
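The six-variable checklist above can be captured in code. The sketch below is a minimal illustration, not a published framework: the field names, the threshold values, and the placement heuristic are all assumptions chosen for the example.

```python
from dataclasses import dataclass

@dataclass
class WorkloadProfile:
    """Answers to the six assessment questions for one workload.

    Field names and thresholds are illustrative assumptions only.
    """
    name: str
    max_latency_ms: float        # Performance: acceptable time per unit of work
    peak_vcpus: int              # Capacity: compute needed at peak
    read_write_ratio: float      # Read vs. write: reads per write
    sensitive_data: bool         # Security: compliance / sovereignty concerns
    variability: str             # Variability: "steady" or "bursty"
    downtime_tolerance_min: int  # Reliability: acceptable outage window

def recommend_environment(w: WorkloadProfile) -> str:
    """A very rough placement heuristic based on the six variables."""
    if w.sensitive_data:
        return "private cloud"                   # regulated data stays in-house
    if w.variability == "bursty":
        return "public cloud"                    # elastic capacity for spiky demand
    if w.max_latency_ms < 10:
        return "colocation near cloud on-ramps"  # minimize network hops
    return "hybrid"

vdi = WorkloadProfile("virtual desktop", 8.0, 64, 5.0, False, "steady", 5)
print(recommend_environment(vdi))  # colocation near cloud on-ramps
```

In practice the answers feed a weighted decision, not a simple if-chain, but even a crude profile like this forces each workload's requirements to be written down before placement is chosen.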

To deliver the right workload in the right place at the right value, many businesses today are turning to hybrid cloud architectures. According to RightScale’s 2016 State of the Cloud Report, 82 percent of enterprises have a hybrid cloud strategy in place. Common deployment methods include:
• Distribute data across both private and public cloud storage, depending on its risk classification or its latency and bandwidth needs
• Federate private and public cloud storage, using public cloud storage for archive, backup, disaster recovery, or workflow sharing and distribution
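The first deployment method above, distributing data by risk classification or latency need, amounts to a simple routing policy. The sketch below is an assumption-laden illustration: the "high"/"low" risk labels and the 20 ms cutoff are invented for the example, not drawn from any vendor's policy.

```python
def storage_tier(risk: str, latency_need_ms: float) -> str:
    """Route a dataset to private or public cloud storage.

    `risk` labels and the 20 ms latency cutoff are illustrative
    assumptions for this sketch.
    """
    if risk == "high":
        return "private"   # sensitive or regulated data stays private
    if latency_need_ms <= 20:
        return "private"   # latency-critical data stays close to compute
    return "public"        # everything else can federate to public storage

print(storage_tier("high", 100.0))  # private
print(storage_tier("low", 100.0))   # public
```

A real policy would also encode the federation roles from the second bullet (archive, backup, disaster recovery, distribution), each with its own retention and bandwidth requirements.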

As a result, private cloud consumption and interconnection to public cloud providers are critical considerations when architecting an elastic hybrid cloud. There is also a growing set of services that customers want to pull together and use: approximately 99 percent of all services deployed today are a mash-up or combination of other services.

Additionally, given that workloads are growing larger and more sensitive to latency, it’s important to be mindful of the interconnection capabilities an operator offers. Many hybrid cloud architectures today are being re-architected due to a lack of network connectivity between public and private clouds.

About the Author

Voices of the Industry

Our Voice of the Industry feature showcases guest articles on thought leadership from sponsors of Data Center Frontier. For more information, see our Voices of the Industry description and guidelines.
