Raul Martynek, CEO at DataBank, explores why a campus data center that showcases the importance a school places on science and technology calls for an expert data center partner that higher education institutions can trust.
Universities commonly operate data centers that have been built over time in an ad hoc fashion. Very often these facilities house critical IT gear that supports important research programs yet runs on antiquated data center infrastructure. It’s time for a new model for enabling high-performance computing (HPC) systems in higher education, one in which universities upgrade outdated data center infrastructure and future-proof their research activities for the next ten years.
University researchers, especially in engineering and scientific fields, are generating astounding amounts of data, often measured in petabytes. This “computational research” is replacing “wet lab” environments and is being driven by advances in machine learning, artificial intelligence and high-performance computing systems. The data often underpins progress on important initiatives such as research into climate change, cancer, human health and astrophysics.
For example, the NASA Center for Climate Simulation carries out climate research using HPC, giving scientists access to large data sets and powerful compute resources. The MIT Lincoln Laboratory Supercomputing Center is focused on accelerating U.S. research in machine learning, device physics and autonomous systems. And Frontera, the University of Texas at Austin’s new National Science Foundation-funded supercomputer, currently the fifth most powerful in global rankings, will support breakthroughs in earthquake forecasting and astrophysics by leveraging the cluster’s massive-scale compute power.
The new computing paradigm for higher education institutions means abandoning aging infrastructure and embracing an entirely new approach that advances the goals of supercomputing and, by extension, the university. To accomplish this, institutions must solve the challenges of heat dissipation, cooling, space, and managing massive workloads. HPC requires highly specialized IT infrastructure and data center designs because the performance demands are staggeringly high.
A New Paradigm for Enabling HPC in Higher Education
Building a high-performance computing center is an expensive undertaking, requiring both specialized talent and significant capital to construct and operate a cutting-edge facility. For this reason, a university developing an HPC strategy must weigh many considerations around IT infrastructure and data center planning.
The complexities involved can be so overwhelming that partnering with an advanced data center provider, specifically one well-versed in the IT challenges of higher education and HPC, makes clear sense. Very often such a partnership eliminates both the capital outlay needed to maximize supercomputing capabilities and the burden of maintaining infrastructure. This allows universities to redirect their focus to core research activities and stretch their budgets as a result.
It’s best to start by walking technology decision-makers through the existing infrastructure and evaluating a number of factors to determine what changes need to be made. Among these are the existing IT assets, the potential to repurpose them, power density and reliability needs, and the computing demand profile.
A data center partner is also valuable in the context of higher-ed HPC initiatives because the right provider will address a university’s security and compliance concerns. For example, universities focused on medical or government agency research can gain access to an HPC expert with security and compliance expertise spanning FedRAMP, HIPAA and GDPR.
Whether an educational institution handles data subject to regulatory oversight or simply needs to protect valuable information, it’s possible to gain the proper power, cooling, and space for the environment while securing it at the same time. Backups and disaster recovery are also critical considerations: research moves fast, and backing it up regularly and properly protects findings as they evolve.
Georgia Tech recently built a new HPC system that supports data-driven research in astrophysics, computational biology, health sciences, computational chemistry, materials and manufacturing, and numerous other projects. It will also be used for research that improves the energy efficiency and performance of HPC systems themselves. A system like this has specific requirements for heat dissipation, cooling, space and managing massive workloads. Last fall, DataBank built a high-density solution for Georgia Tech to solve for these very issues. By partnering with DataBank, Georgia Tech was also able to reduce its upfront costs while still achieving its goal of a world-class facility.
Supercomputers have the potential to improve human lives through discoveries and advancements in science and research. This is why it’s so critical for universities to have access to a data center and infrastructure provider with experience in helping higher education institutions create roadmaps for their current and future demands, from configurations to reliability.
A campus data center should be an opportunity to showcase the importance a school places on science and technology. But make no mistake: entrusting the critical infrastructure that underpins its research efforts to an outside provider is no small commitment for a university, and it calls for an expert data center partner that higher education institutions can trust. If a university technology team wants to pursue high-performance computing initiatives but is struggling to figure out support and financing, consulting with a specialized data center provider is the best first step in its journey.
Raul Martynek is CEO at DataBank.