Recession-Proofing IT via Virtualization and Cloud Computing

Recessions are about as appealing as a root canal, but they do force us to think differently

Recessions are about as appealing as a root canal, but they do force us to think differently. Now that the recession is official, it's an ideal time to explore how virtualization and cloud computing can help "recession-proof" IT by transforming yesterday's costly and rigid computing model to one that puts costs under control and sets applications free.

The National Bureau of Economic Research recently declared that the U.S. has been in a recession since December 2007. The news would be darkly amusing if it weren't so utterly painful. But now that the recession is official, it seems an ideal time to explore how virtualization and cloud computing can help recession-proof IT. Consider the following four tips:

1. Virtualize infrastructure to increase capacity utilization.

Traditional server infrastructure tightly couples applications to hardware, wasting computing capacity whenever applications utilize less than 100 percent of system resources. Virtualized infrastructure decouples applications from hardware, freeing excess capacity for use by other applications. A single virtualized server can often support 5X the workload of a non-virtualized server. This allows IT to consolidate server infrastructure, which reduces capital costs associated with server acquisition and datacenter infrastructure, as well as operating costs associated with management, maintenance, and energy consumption.
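
To see where the consolidation savings come from, here is a back-of-the-envelope sketch in Python. The fleet size, the 15 percent average utilization, and the 75 percent target ceiling are assumptions chosen only to illustrate a roughly 5:1 consolidation ratio like the one cited above; substitute your own measurements.

# Rough consolidation estimate: how many hosts remain after virtualizing
# a fleet of lightly utilized servers. All figures are illustrative.
import math

physical_servers = 100        # assumed current fleet size
avg_utilization = 0.15        # assumed average utilization per physical server
target_utilization = 0.75     # assumed safe ceiling per virtualized host

# Total work being done, expressed in "fully busy server" units.
effective_load = physical_servers * avg_utilization

# Hosts needed when each virtualized server runs near the target ceiling.
virtual_hosts = math.ceil(effective_load / target_utilization)

print(f"{physical_servers} servers consolidate onto {virtual_hosts} hosts "
      f"(~{physical_servers / virtual_hosts:.0f}:1)")

The point of the arithmetic is simple: the lower the average utilization of the existing fleet, the larger the consolidation ratio and the larger the capital and operating savings.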

2. Use external clouds to offset capital infrastructure expense.

While virtualized infrastructure can reduce capital expenses, IT may have the opportunity to eliminate those expenses altogether by using the variable compute model of external clouds like Amazon's Elastic Compute Cloud (Amazon EC2). In this model, compute capacity becomes elastic, allowing lines of business to align the cost of application consumption to actual demand. Swapping the traditional datacenter for an external cloud provides effectively unlimited, on-demand capacity and the ability to align cost to value received.
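
A minimal sketch of the cost-alignment argument, using hypothetical prices rather than any vendor's actual rates: a purchased server costs the same every hour whether it is busy or idle, while elastic capacity is billed only for the hours the workload actually runs.

# Compare a fixed capital purchase against pay-as-you-go capacity for a
# workload that runs only part of the time. Prices are hypothetical.
server_capex = 6000.0                                  # assumed purchase price, 3-year life
owned_cost_per_hour = server_capex / (3 * 365 * 24)    # paid whether busy or idle
cloud_rate_per_hour = 0.40                             # assumed on-demand rate
busy_hours_per_month = 200                             # assumed hours of real demand

monthly_owned = owned_cost_per_hour * 30 * 24               # every hour is billed
monthly_cloud = cloud_rate_per_hour * busy_hours_per_month  # only busy hours are billed

print(f"Owned capacity: ${monthly_owned:,.2f}/month regardless of demand")
print(f"Elastic cloud:  ${monthly_cloud:,.2f}/month at {busy_hours_per_month} busy hours")

With steady, near-constant demand the owned server can win; the elastic model pays off when demand is intermittent or spiky, which is exactly when capital infrastructure sits idle.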

3. Virtualize applications to accelerate and simplify deployment.

Packaging and deploying application workloads as virtual images can close the "deployment gap" that adds cost and delay to the rollout of enterprise applications. The virtualized application is separated from its operating infrastructure and becomes a self-contained unit that includes the just-enough operating system (JeOS), databases, and middleware required to run the software in production. These bits travel with the application package and allow it to run as an image in any virtualized or cloud-based execution environment without manual setup, tuning, configuration, or certification. Suddenly, applications are set free and deployment cycles are compressed from months to minutes. This equates to cost savings and improved business agility.
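
To make the self-contained image idea concrete, here is a hypothetical sketch; the manifest fields, package names, and deploy helper are invented for illustration and do not represent rPath's actual formats or tooling.

# Hypothetical manifest for a self-contained application image: the app plus
# only the OS, runtime, and middleware it needs. All field names are invented.
app_image = {
    "name": "order-service",
    "version": "1.4.2",
    "jeos": ["kernel", "libc", "openssl"],              # just enough operating system
    "middleware": ["apache-httpd", "jre"],
    "application": {"artifact": "order-service.war", "entrypoint": "start.sh"},
    "targets": ["vmware", "xen", "ec2"],                 # runs unchanged on each
}

def deploy(image, target):
    """Pretend deployment: everything the app needs ships inside the image,
    so no per-environment setup, tuning, or certification happens here."""
    if target not in image["targets"]:
        raise ValueError(f"{target} is not supported by this image")
    print(f"Booting {image['name']} {image['version']} on {target} -- no manual configuration")

deploy(app_image, "ec2")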

4. Construct virtual applications for simplified management and automated maintenance.

The reality is that this new approach to application delivery can create new costs and risks. Taking the friction out of application deployment will lead to an onslaught of volume and demand, resulting in what is often called "VM sprawl." What organizations must recognize is that they may be exchanging one cost and management burden for another as physical machines become virtual machines. In fact, virtual sprawl is likely to far outstrip any physical sprawl you've witnessed heretofore. As such, organizations need a scalable approach for managing and maintaining application images. Adding headcount isn't an option, so the answer is finding ways to do more with less. Here, that means architecting application images for management and control, trading manual, one-at-a-time updates for seamless changes implemented en masse. It also means complete lifecycle control and transparency wherever the application runs: datacenter or cloud, internal or external.
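
As a rough illustration of trading one-at-a-time patching for en-masse changes, the sketch below reconciles a hypothetical image inventory against a single set of approved versions, regardless of where each instance runs; every name and version in it is invented.

# Hypothetical sketch: reconcile every running image against one set of
# approved versions instead of patching machines one at a time.
inventory = [
    {"id": "vm-001", "image": "order-service", "version": "1.4.1", "location": "datacenter"},
    {"id": "vm-002", "image": "order-service", "version": "1.4.1", "location": "ec2"},
    {"id": "vm-003", "image": "billing",       "version": "2.0.0", "location": "datacenter"},
]
approved = {"order-service": "1.4.2", "billing": "2.0.0"}   # single source of truth

def reconcile(vm):
    """Bring one virtual machine to its approved image version, wherever it runs."""
    wanted = approved[vm["image"]]
    if vm["version"] != wanted:
        print(f"{vm['id']} ({vm['location']}): {vm['image']} {vm['version']} -> {wanted}")
        vm["version"] = wanted
    else:
        print(f"{vm['id']} ({vm['location']}): already current")

for vm in inventory:            # the same policy applied en masse
    reconcile(vm)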

Recessions are about as appealing as a root canal. But they do force us to think differently: to take inventory of costs, retool, and reinvent. The reality is that this recession coincides with a fundamental inflection point in IT. The friction and economics of the traditional computing model no longer add up. This is why organizations must embrace virtualization and cloud, both to weather the storm of a down economy and to transform yesterday's costly and rigid computing model to one that puts costs under control and sets applications free.

More Stories By Jake Sorofman

Jake Sorofman is chief marketing officer of rPath, an innovator in system automation software for physical, virtual and cloud environments. Contact Jake at [email protected]
