Testing Virtualization: What’s Really Going on in There?

The bottom line is that virtualization promises many benefits for providers and customers

It's no secret what's driving the move to virtualization in data centers. The demand for new and expanded software systems is growing, but the geographic and carbon footprint required for scaling underutilized dedicated servers is too costly on many levels.

This pressure has driven the maturation of the virtual server: increased reliability and stability mean reduced risk, making virtualization of the data center a viable solution. Virtualization makes it possible to replace physical servers running at 10 percent capacity with fewer, more powerful servers running at 60 percent capacity or more.

Figure 1: A-Network cloud connected to four servers running one process each. B-Network cloud connected to two servers each running many processes.
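As a rough illustration of that consolidation math, the sketch below estimates how many lightly loaded dedicated servers one virtualized host can absorb. The 10 and 60 percent utilization figures come from the scenario above; the fleet size and hypervisor overhead are illustrative assumptions.

```python
# Back-of-the-envelope server consolidation estimate.
# Utilization figures (10% -> 60%) come from the text; the fleet size and
# per-host hypervisor overhead are illustrative assumptions.

def consolidation_ratio(old_util: float, new_util: float, overhead: float = 0.10) -> float:
    """How many lightly loaded dedicated servers one virtualized host can
    absorb, leaving some headroom for hypervisor overhead."""
    usable = new_util * (1.0 - overhead)
    return usable / old_util

if __name__ == "__main__":
    ratio = consolidation_ratio(old_util=0.10, new_util=0.60)
    fleet = 200  # hypothetical number of dedicated servers
    print(f"Each virtualized host replaces ~{ratio:.1f} dedicated servers")
    print(f"A {fleet}-server fleet shrinks to ~{fleet / ratio:.0f} hosts")
```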

Legacy virtualization platforms support up to four or eight virtual machine (VM) instances per physical server. New virtualization platforms can support 32 VMs or more, with a virtual switch directing traffic from the network interface to the appropriate VM.
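To make the role of the virtual switch concrete, here is a minimal conceptual sketch of the forwarding step it performs: learning which MAC address belongs to which virtual port and steering frames arriving on the physical interface to the right VM. This models the logic only; it is not any hypervisor's actual implementation.

```python
# Conceptual sketch of virtual-switch forwarding: learn source MACs per
# virtual port, then send frames arriving on the physical NIC to the VM
# whose vNIC owns the destination MAC.

from dataclasses import dataclass, field

@dataclass
class VirtualSwitch:
    mac_table: dict = field(default_factory=dict)  # MAC address -> virtual port

    def learn(self, src_mac: str, port: str) -> None:
        self.mac_table[src_mac] = port

    def forward(self, dst_mac: str) -> str:
        # Unknown unicast is flooded to every VM port, just as a
        # physical learning switch would do.
        return self.mac_table.get(dst_mac, "flood")

vswitch = VirtualSwitch()
vswitch.learn("52:54:00:00:00:01", "vnet0")   # VM 1's vNIC
vswitch.learn("52:54:00:00:00:02", "vnet1")   # VM 2's vNIC
print(vswitch.forward("52:54:00:00:00:02"))   # -> vnet1
print(vswitch.forward("52:54:00:00:00:99"))   # -> flood
```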

This growth has implications for the performance of the physical and virtual servers and for the network.

When working with a dedicated server, it's a fairly straightforward process to characterize performance and isolate the factors that affect it. A test system can emulate realistic users, traffic, and network conditions and measure the application's response times.
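A minimal sketch of that kind of dedicated-server test might look like the following: a handful of emulated users issue requests concurrently and the response times are summarized. The target URL, user count, and request count are placeholders, not values from the article.

```python
# Minimal load-test sketch against a dedicated server: emulate concurrent
# users and record application response times. The endpoint and test sizes
# below are hypothetical.

import statistics
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

TARGET = "http://server-under-test.example/app"  # hypothetical endpoint

def one_request(_):
    start = time.perf_counter()
    with urlopen(TARGET, timeout=10) as resp:
        resp.read()
    return time.perf_counter() - start

def run(users: int = 10, requests: int = 100):
    with ThreadPoolExecutor(max_workers=users) as pool:
        latencies = sorted(pool.map(one_request, range(requests)))
    print(f"median   {statistics.median(latencies) * 1000:.1f} ms")
    print(f"95th pct {latencies[int(0.95 * len(latencies))] * 1000:.1f} ms")

if __name__ == "__main__":
    run()
```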

When working with dozens of virtual servers in a single physical server, it's not as straightforward. A connection from the test system to the physical server won't provide the granularity of testing required for meaningful results. The single physical interface handles traffic for many VM instances, making it difficult to isolate and measure the performance of each VM instance or the performance of the virtual switch.

Figure 2: A-Test system connected to four servers running one process each. B-Test system connected to two servers running many processes each, with a virtual switch.

Visibility into the performance of each VM is needed, but how can we get it? By generating traffic and capturing results from within the virtual server instance itself. We need a virtual test system inside the virtual server.

What exactly is a virtual test system?

A virtual server is implemented in a virtual machine, a protected partition in a physical server, but to the user it looks and behaves exactly as if it were a dedicated, physical server. In the same way, a virtual test system is a software-based test system implemented in a virtual machine, but to the network devices under test, it looks and behaves exactly as if it were a hardware test system.

Figure 3: Test box connected to a server running 12 processes. These processes contain virtual test systems.

A virtual test system supports the same capabilities as a physical test system. It generates traffic between any number and any combination of virtual and physical interfaces. It reports results per instance and aggregate results for the system under test. It supports realistic layer-2 through layer-7 traffic on virtual and physical interfaces.
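Conceptually, the reporting side can be pictured as below: an agent inside each VM returns its own counters, and a controller rolls them up into an aggregate view for the system under test. The field names and example figures are assumptions for illustration.

```python
# Conceptual sketch of result reporting from virtual test agents: each agent
# running inside a VM reports its own counters; the controller produces both
# per-instance and aggregate results.

from dataclasses import dataclass

@dataclass
class AgentResult:
    vm_name: str
    frames_sent: int
    frames_received: int
    avg_latency_us: float

def aggregate(results):
    total_sent = sum(r.frames_sent for r in results)
    total_recv = sum(r.frames_received for r in results)
    return {
        "frames_sent": total_sent,
        "frames_received": total_recv,
        "loss_pct": 100.0 * (total_sent - total_recv) / total_sent,
        "avg_latency_us": sum(r.avg_latency_us for r in results) / len(results),
    }

per_vm = [
    AgentResult("vm01", 1_000_000, 999_800, 42.0),
    AgentResult("vm02", 1_000_000, 998_500, 55.5),
]
for r in per_vm:
    print(r)                 # per-instance results
print(aggregate(per_vm))     # aggregate result for the system under test
```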

The only way to get the granular test results required to reveal the true performance of a virtual server implementation is through a virtualized test system running in VMs on the server.

Another area of concern in virtualizing the data center is the performance of the virtual switches. Ethernet switches are the workhorses of the data center. They're not flashy, but they consistently and reliably deliver line-rate throughput, low latency, and low jitter. In fact, they are often taken for granted: test engineers don't expect switches to introduce problems during testing.

By contrast, virtual switches haven't yet achieved hardware-level performance and reliability. They share CPU resources between switching frames and the applications running in the VMs, so under heavy loads they may not be as reliable as physical switches.

However, that doesn't mean that virtual switches are not ready for business applications. The key is testing. Knowing in advance the performance limits of the virtual switch on your platform reduces or eliminates the risk of downtime or performance problems. It's the difference between a confident deployment and the fear of customer-service issues.
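One common way to establish such a limit is an RFC 2544-style throughput search, sketched below: binary-search for the highest offered load the virtual switch forwards with zero loss. The `offer_load` hook is a placeholder for whatever traffic generator you use; here it is simulated so the sketch runs standalone.

```python
# Sketch of an RFC 2544-style throughput search for a virtual switch:
# binary-search the highest offered load that is forwarded without loss.

def offer_load(rate_mbps: float) -> float:
    """Placeholder for your traffic generator: drive traffic at rate_mbps
    for a fixed trial and return the measured loss percentage. Simulated
    here with an assumed 6.4 Gbps zero-loss ceiling."""
    return 0.0 if rate_mbps <= 6400 else 5.0

def throughput_search(line_rate_mbps: float, resolution_mbps: float = 10.0) -> float:
    low, high = 0.0, line_rate_mbps
    best = 0.0
    while high - low > resolution_mbps:
        mid = (low + high) / 2
        if offer_load(mid) == 0.0:   # zero loss: try a higher rate
            best, low = mid, mid
        else:                        # loss observed: back off
            high = mid
    return best  # highest zero-loss rate found, in Mbps

if __name__ == "__main__":
    print(f"Throughput: ~{throughput_search(10_000):.0f} Mbps")
```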

Beyond performance testing, the virtual switch must support the same protocols and functions as a physical switch. Functional testing is just as important for virtual switches as performance and scalability testing, and should be a part of any data center test methodology.

A network manager can reasonably expect any modern Ethernet switch to support features such as virtual LANs (VLANs), access control lists (ACLs), and the Internet Group Management Protocol (IGMP) for forwarding multicast traffic. These protocols, and others, are often included in physical switch performance testing and should be included when testing virtual switches.
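As one example of a functional check, the sketch below uses scapy (a third-party packet library, run with root privileges) to confirm that VLAN-tagged frames are forwarded to a port in the same VLAN. The interface names, VLAN ID, and addresses are placeholders.

```python
# Functional-check sketch: verify the virtual switch forwards VLAN-tagged
# frames to a port in the same VLAN. Requires scapy and root; interface
# names, MAC/IP addresses, and VLAN ID are placeholders.

import time
from scapy.all import AsyncSniffer, Dot1Q, Ether, IP, UDP, sendp

SEND_IF = "vnet0"   # test port attached to the virtual switch
RECV_IF = "vnet1"   # port expected to receive VLAN 100 traffic
VLAN_ID = 100

def vlan_forwarding_check() -> bool:
    sniffer = AsyncSniffer(
        iface=RECV_IF,
        lfilter=lambda p: p.haslayer(Dot1Q) and p[Dot1Q].vlan == VLAN_ID,
    )
    sniffer.start()
    time.sleep(0.5)  # let the capture come up before sending
    probe = (Ether(dst="ff:ff:ff:ff:ff:ff") / Dot1Q(vlan=VLAN_ID) /
             IP(dst="192.0.2.1") / UDP(dport=9))
    sendp(probe, iface=SEND_IF, count=5, verbose=False)
    time.sleep(1.0)
    captured = sniffer.stop()
    return len(captured) > 0

if __name__ == "__main__":
    print("VLAN 100 forwarding:", "PASS" if vlan_forwarding_check() else "FAIL")
```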

Finally, there is the impact of virtualization on the network itself. A standard rack holds numerous physical servers, or dozens of blade servers, and some data centers house hundreds or thousands of racks. With the number of server instances per rack potentially increasing by an order of magnitude, the increase in network traffic can be significant.

In addition, a VM instance often uses more bandwidth than a physical server, due to the traffic generated by managing a virtual machine from a central location. Also, some platforms move virtual machines between physical servers, enhancing uptime and reliability, but also generating considerable network load.
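To get a rough sense of that load: a live migration has to move the VM's memory image, plus pages dirtied during the copy, across the network. The sketch below estimates how long a single migration keeps a link busy; the memory sizes, dirty-page overhead, and link speed are illustrative assumptions.

```python
# Rough estimate of the network burst caused by one live migration.
# Memory sizes, dirty-page overhead, and link speed are illustrative.

def migration_seconds(vm_memory_gib: float, link_gbps: float,
                      dirty_overhead: float = 1.2) -> float:
    bits_to_move = vm_memory_gib * (2 ** 30) * 8 * dirty_overhead
    return bits_to_move / (link_gbps * 1e9)

for mem in (8, 32, 64):  # GiB of VM memory
    secs = migration_seconds(mem, link_gbps=10)
    print(f"{mem:>3} GiB VM over 10 GbE: ~{secs:.0f} s of near line-rate traffic")
```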

The implications for the data center infrastructure are clear. Virtualization requires testing not only the VM solution, including the virtual switch, but also the transport network and the infrastructure elements that support functions such as security and load balancing. Establishing performance limits for the components of the network avoids unpleasant surprises and enables planning for growth as the data center expands.

The bottom line is that virtualization promises many benefits for providers and customers, from better margins to better prices to a greener network. Proper testing verifies that it will deliver on those promises.

More Stories By Jurrie van den Breekel

Jurrie van den Breekel is a product marketing manager at Spirent Communications and serves as market segment lead for the company’s Enterprise and Data Center switching test solutions. Previously he served as a technical marketing manager where he was responsible for the market development of Spirent’s Performance Analysis Broadband division in the EMEA region. Jurrie has also held a variety of other positions at Spirent. Prior to joining Spirent in 2000, he spent several years at the Dutch-based system integrator, TrueCom, as product manager for telecom test systems.
