Six Key Variables to Consider When Setting Up IT Workloads

June 2, 2016
In this week’s Voices of the Industry, Chris Sharp, CTO of Digital Realty, covers six key variables to consider when setting up IT workloads, offering a strategy for examining each workload and making sure its data is set up in the ideal environment.

CHRIS SHARP, CTO, Digital Realty

In today’s world of social, mobile, analytics, cloud and content, the data center is no longer just a white floor where organizations store their servers. The data center is now a hub for cloud and network connectivity, responsible for liberating information collected on servers to help an organization’s community – made up of customers, partners and employees – easily and quickly exchange information to drive revenue and growth.

However, not all IT workloads are created equal. To participate in today’s exchange economy, each IT workload needs to be carefully assessed and managed in the right environment. For instance, one workload may require high data transfer rates, while another may need extra protection because it handles sensitive data. Further, certain workloads – such as virtual desktops and hybrid storage – might benefit from sitting next door to major cloud service providers, such as Amazon Web Services (AWS) or IBM SoftLayer, for greater efficiency.

The first step IT teams must take when designing their systems is to sort workloads based on their requirements and rules. Below are six key variables every IT team needs to consider, along with the strategic questions that need to be answered (a simple way to record those answers is sketched after the list):

1. Performance: What is the required time to complete a unit of work (e.g., page load, transaction)? What does this mean in terms of location, latency and bandwidth?
2. Capacity: What resources (e.g., compute, memory, storage, network bandwidth) are needed to deliver a unit of work (e.g., transaction, session)?
3. Read vs. write: What is the proportion of data that is read from a data source compared with what is written? This has implications for storage design and capacity growth. It also affects where data needs to be located.
4. Security: How must information be stored, transmitted and used? Considerations include compliance, data sovereignty and encryption.
5. Variability: How constant or variable is the workload?
6. Reliability: What happens if the service is unavailable for a period of time? What does this imply for the corresponding workload? For example, how should an emergency response service be designed so that it is always reachable?
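One lightweight way to make this assessment repeatable is to record each workload’s answers as a structured profile that can be compared and sorted before placement decisions are made. The Python sketch below is illustrative only; the field names, thresholds and placement rule are assumptions, not a method prescribed by the author.

```python
# A minimal sketch of capturing the six variables per workload.
# Field names and thresholds are hypothetical examples.
from dataclasses import dataclass


@dataclass
class WorkloadProfile:
    name: str
    max_latency_ms: float        # 1. Performance: acceptable time per unit of work
    vcpus: int                   # 2. Capacity: compute needed per unit of work
    memory_gb: int               # 2. Capacity: memory needed per unit of work
    read_write_ratio: float      # 3. Read vs. write: reads divided by writes
    data_sensitivity: str        # 4. Security: e.g. "public", "internal", "regulated"
    peak_to_average_load: float  # 5. Variability: how bursty the demand is
    max_downtime_min: float      # 6. Reliability: tolerable unavailability per month


def needs_private_environment(w: WorkloadProfile) -> bool:
    """Flag workloads whose security or reliability needs rule out shared infrastructure."""
    return w.data_sensitivity == "regulated" or w.max_downtime_min < 5


if __name__ == "__main__":
    vdi = WorkloadProfile("virtual-desktop", 30, 8, 32, 4.0, "internal", 3.0, 60)
    erp = WorkloadProfile("payments", 100, 16, 64, 1.5, "regulated", 1.2, 1)
    for w in (vdi, erp):
        print(w.name, "-> private" if needs_private_environment(w) else "-> public/hybrid")
```

A profile like this also makes it easier to revisit placement decisions later, since the original requirements are captured explicitly rather than buried in a design document.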

To deliver the right workload in the right place at the right value, many businesses today are turning to hybrid cloud architectures. According to RightScale’s 2016 State of the Cloud Report, 82 percent of enterprises have a hybrid cloud strategy in place. Common deployment methods, illustrated in the sketch after the list, include:
• Distribute data across both private and public cloud storage, depending on its risk classification or its latency and bandwidth needs
• Federate private and public cloud storage, using public cloud storage for archive, backup, disaster recovery, or workflow sharing and distribution
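For illustration, a placement rule along these lines can be expressed as a small decision function. The risk classes, tier names and ordering below are assumptions made for the sketch, not recommendations from the report.

```python
# Illustrative only: route data to private or public cloud storage based on
# risk classification, latency needs and archival use, as described above.
def choose_storage_tier(risk_class: str, latency_sensitive: bool, is_archive: bool) -> str:
    if risk_class in ("regulated", "confidential"):
        return "private-cloud"              # keep sensitive data on private storage
    if is_archive:
        return "public-cloud-archive"       # federate: public cloud for archive/backup/DR
    if latency_sensitive:
        return "private-cloud"              # keep latency-critical data close to compute
    return "public-cloud"


print(choose_storage_tier("internal", latency_sensitive=False, is_archive=True))
# -> public-cloud-archive
```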

As a result, private cloud consumption and interconnection to public cloud providers are critical considerations when architecting an elastic, hybrid cloud. There is also a growing set of services that customers want to pull together and consume: approximately 99 percent of all services deployed today are a mash-up of other services.

Additionally, given that workloads are growing larger and more latency-sensitive, it’s important to be mindful of the interconnection capabilities an operator offers. A majority of hybrid cloud architectures today are being re-architected due to a lack of network connectivity between public and private clouds.

About the Author

Voices of the Industry

Our Voices of the Industry feature showcases guest thought leadership articles from sponsors of Data Center Frontier. For more information, see our Voices of the Industry description and guidelines.
