Leveraging Virtual Machines and Containers to Create a Powerful Edge Computing Solution
In a recent interview, Rudy de Anda, Head of Strategic Alliances at Penguin Solutions, spoke to Matt Vincent, Editor-in-Chief of Data Center Frontier. They discussed how Penguin Solutions’ Stratus ztC Edge platform, combined with Kubernetes management, creates a powerful, low-maintenance edge computing solution.
Matt Vincent:
Can you share a real-world example of how combining virtual machines, or VMs, and containers improved scalability or fault tolerance in an edge environment?
Rudy de Anda:
Sure! We've worked with upstream oil and gas customers who needed to consolidate workloads across departments to meet cybersecurity requirements. To do that, they consolidated cybersecurity, SCADA, and historian applications onto a single, fault-tolerant device, which is what the Stratus platforms from Penguin Solutions deliver.
The consolidated infrastructure also enabled these companies to run a virtual machine hosting containerized applications for AI-powered video analytics. This setup lets them handle tasks like personal protective equipment (PPE) compliance monitoring, leak detection, and a variety of other video-based monitoring tasks from a single platform.
Vincent:
How would you explain the main differences between VMs and containers? Why do they work so well together in hybrid setups?
de Anda:
Virtual machines predate containers and represent a full virtualization of an operating system and machine. Think of a VM as a complete replica of your PC at home. It’s a complete system including the operating system, applications, and all the services needed to make your computer work.
Containerized solutions, on the other hand, are a completely different framework. They're more like the apps on your smartphone: they package applications with only the dependencies they need, without virtualizing an entire machine. This makes them significantly lighter and easier to install, giving users the ability to customize and build their own stack.
Essentially, the biggest difference between VMs and containerized applications is that virtual machines are a little more locked down, bigger, slower to start and stop, and less portable. Containerized applications share many underlying services and are more portable. Together, they complement each other in hybrid setups, with VMs providing stability and containers offering agility.
Vincent:
When you compare VMs and containers, what's the real trade-off in startup time and issues like resource use and scalability?
de Anda:
Startup time is a significant factor. Restarting a virtual machine is a bigger task than restarting a specific application or container; bringing an entire operating system back up can take quite a bit of time.
Containers are designed for rapid updates. Once your microservices are up and running, patches can be applied in near real time: you make the update, and the next transaction uses the latest version. Updates are very easy to push out with containers, whereas virtual machines require a complete restart of the operating system for an update to take effect.
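This rolling-update pattern is what Kubernetes standardizes with a Deployment object. A minimal sketch of how it can be declared; the workload name, image, and replica count below are hypothetical:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: analytics              # hypothetical edge workload
spec:
  replicas: 3
  strategy:
    type: RollingUpdate
    rollingUpdate:
      maxUnavailable: 1        # keep at least 2 of 3 replicas serving during the update
      maxSurge: 1              # start a new-version container before stopping an old one
  selector:
    matchLabels:
      app: analytics
  template:
    metadata:
      labels:
        app: analytics
    spec:
      containers:
      - name: analytics
        image: registry.example.com/analytics:1.1   # bump this tag to roll out a patch
```

Changing the image tag triggers the rollout: new containers start alongside the old ones and take over traffic as they become ready, so no full-system restart is needed.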
On the other end of that, virtual machines excel in resource dedication. By locking down resources, a VM delivers more reliable and repeatable performance from the applications running on it, because they aren't competing for resources.
In a containerized environment, as more containers are added and demand increases, there is competition for those shared resources, potentially degrading the performance of the containers.
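One common way to bound that competition in Kubernetes is to give each container explicit resource requests and limits, which approximates the dedicated allocation a VM enjoys. A minimal sketch; the container name, image, and values are illustrative:

```yaml
containers:
- name: video-inference        # hypothetical container name
  image: registry.example.com/inference:2.0
  resources:
    requests:                  # the minimum the scheduler guarantees this container
      cpu: "500m"
      memory: 512Mi
    limits:                    # a hard ceiling on what it may consume
      cpu: "1"
      memory: 1Gi
```

The request is what the scheduler reserves; at the limit, CPU is throttled and memory overuse gets the container killed, so one noisy container cannot starve its neighbors.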
The trade-off lies in balancing the need for agility with the need for stability, depending on the workload.
Vincent:
When it comes to edge workloads, how do you decide when to prioritize containers over VMs or vice versa?
de Anda:
It often depends on the type of workload. In the control world, where physical resources are connected, you'd likely use virtual machines so you can lock them down, enhance cybersecurity, limit access, and ensure consistent application performance and output.
In the IoT world, where data is collected, analyzed, and used to make decisions or recommendations that are then pushed to control, these applications tend to run in containers rather than virtual machines, because you're doing multiple iterations, updates, and patches.
Vincent:
What's one tool or strategy that simplifies integrating VMs and containers at the edge?
de Anda:
A robust orchestration tool is a must. It allows you to manage containers running within a VM and orchestrate the entire system. But if you also have a platform with strong APIs, like Penguin Solutions Stratus ztC Edge, you can use that same orchestration tool to manage both the VM layer and the applications running in those virtual machines, as well as your entire fleet across your enterprise.
Vincent:
How is Kubernetes changing the way organizations deploy apps across both VMs and containers?
de Anda:
Kubernetes has really risen to be one of the top ways to manage containerized applications. It simplifies and standardizes the deployment, makes applications more portable, and ensures they’re always available.
Kubernetes, within its environment, does a great job of helping you adopt a DevOps approach. It allows for constant iteration and improvement, which is critical in fast-paced environments. It's established a standard for working with containerized environments.
Vincent:
Where do you see containerized microservices taking virtualization technologies in the next few years?
de Anda:
I don't think it’s about one technology replacing the other. There are trade-offs with each one. It's more of a symbiotic relationship. VMs and containers each have their strengths, and the key is understanding which technology is best for the applications you're using.
For your data science and DevOps teams, who are constantly updating, patching and learning about your business, containers are the way to go. However, for your traditional, locked-down applications like cybersecurity or control systems, VMs are the right choice.
It’s all about understanding the trade-offs and leveraging the strengths of each technology. When you do that, you can build a solution that’s both powerful and flexible.
About the Author

Rudy de Anda
Rudy de Anda is Head of Strategic Alliances at Penguin Solutions.