Cloud Computing and Virtual Images

A look at several different cloud computing solutions will reveal a technological enabler present in almost every one

A look at several different cloud computing solutions will reveal a technological enabler present in almost every one. The enabler I'm talking about is the use of virtual images. I cannot think of many, if any, cloud computing solutions that provide software elements (i.e., more than just servers, storage, and memory) and do not use virtual images in some form or fashion.

Of course, one of the reasons virtual images form the backbone of many cloud solutions is obvious: they provide the benefits of server virtualization. We can activate many virtual images on the same physical machine, thus achieving multi-tenancy (multiple operating systems and software stacks installed on the same physical machine). Besides driving higher hardware utilization rates, virtualization also gives us the ability to run heterogeneous software environments on the same piece of hardware. This both enables and encourages the creation of a shared pool of compute resources, which is a key characteristic of cloud computing environments.

Server virtualization may be the first thing that comes to mind when we think about virtual images, but at least in the context of cloud computing, I do not believe it is the most important benefit. If we look at cloud computing as a means to quickly and consistently provision software environments, then virtual images deliver something even more valuable than server virtualization: a medium through which we can templatize the configuration of our software environments.

Consider the case of a fairly basic application serving environment. In this environment, you are likely to install an operating system, an application server, and probably some type of load balancing solution. Each of these typically requires a different piece of software, different installation procedures, and, finally, integration with the other components. Installing these into a typical environment without the use of virtual images means that you either maintain scripts that account for each piece of software and for the integration between components, or a person manually installs and integrates the pieces each time you need an environment. Either process is time-consuming and costly to maintain over time.

Enter the use of virtual images. With a virtual image, you can install and integrate all three components once, and then capture the resultant environment as a virtual image. From that point on, when an application environment is needed, the virtual image can simply be activated on top of a hypervisor platform. The application environment is typically available much more quickly than if it were manually installed and integrated, because the installation and configuration have already been captured in the virtual image.

From what I described above, you may have caught on to what would seem like a drawback of using virtual images to templatize software environments. Specifically, it may seem that you need a distinct virtual image for every unique configuration of your application environment. If this were the case, management of your virtual image library would soon become a nightmare and the resulting cost (in both resource and time) would likely outweigh the original benefits. However, thanks to a relatively new standards-based approach to virtual images, this is not necessarily a problem.

The standard I'm talking about is the Open Virtualization Format (OVF), published by the Distributed Management Task Force (DMTF). In the standard's own words, OVF "describes an open, secure, portable, efficient and extensible format for the packaging and distribution of software to be run in virtual machines." Of particular relevance to our discussion here is the part of the standard that describes the use of an ovf-env.xml file within the virtual image.
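As a rough illustration, a minimal ovf-env.xml might look like the sketch below. The element names and namespace follow the OVF environment document format; the property keys and values (hostname, cell.name, cluster.size) are invented for this example and are not part of the standard.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Environment xmlns="http://schemas.dmtf.org/ovf/environment/1"
             xmlns:oe="http://schemas.dmtf.org/ovf/environment/1">
  <PropertySection>
    <!-- Hypothetical keys and values supplied at image activation time -->
    <Property oe:key="hostname" oe:value="appserver01"/>
    <Property oe:key="cell.name" oe:value="devCell"/>
    <Property oe:key="cluster.size" oe:value="3"/>
  </PropertySection>
</Environment>
```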

This file is essentially a key-value style XML file that describes desired aspects of the environment. Keys and values can be supplied during image activation, and configuration scripts that run during virtual image activation can read information from the file and react appropriately. Thus, instead of supplying N different virtual images for N different software environments, you can supply one virtual image and use the ovf-env.xml file, in conjunction with configuration scripts within the image, to produce N different environments. This mechanism delivers the capability to templatize software environments without sacrificing flexibility or encouraging unsustainable virtual image proliferation.
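To make the activation-time flow concrete, here is a minimal sketch of a configuration script that reads property values from an ovf-env.xml document. The element and attribute names follow the OVF environment format; the sample document is embedded as a string so the sketch is self-contained, and the property keys (hostname, cluster.size) are invented for illustration.

```python
# Sketch of an activation-time configuration script reading ovf-env.xml.
# The property keys and values below are hypothetical examples.
import xml.etree.ElementTree as ET

OVF_ENV_NS = "http://schemas.dmtf.org/ovf/environment/1"

# Embedded sample; a real script would read the file from the guest,
# e.g. after it is surfaced to the VM by the hypervisor platform.
SAMPLE_OVF_ENV = """<?xml version="1.0" encoding="UTF-8"?>
<Environment xmlns="http://schemas.dmtf.org/ovf/environment/1"
             xmlns:oe="http://schemas.dmtf.org/ovf/environment/1">
  <PropertySection>
    <Property oe:key="hostname" oe:value="appserver01"/>
    <Property oe:key="cluster.size" oe:value="3"/>
  </PropertySection>
</Environment>"""

def read_ovf_properties(xml_text):
    """Return the key/value pairs found in every PropertySection."""
    root = ET.fromstring(xml_text)
    props = {}
    # Attributes carry the oe: prefix, so they are namespace-qualified too.
    for prop in root.iter(f"{{{OVF_ENV_NS}}}Property"):
        key = prop.get(f"{{{OVF_ENV_NS}}}key")
        value = prop.get(f"{{{OVF_ENV_NS}}}value")
        props[key] = value
    return props

props = read_ovf_properties(SAMPLE_OVF_ENV)
print(props["hostname"])      # the script could now set the guest host name
print(props["cluster.size"])  # ...or size the cluster accordingly
```

A script like this would typically run once on first boot, applying the supplied values to the operating system and middleware so that one image yields many differently configured environments.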

In WebSphere, we utilize the model outlined in the OVF standard when packaging our WebSphere Application Server Hypervisor Edition virtual images. This allows our WebSphere CloudBurst product to provision these images and create many different types of WebSphere Application Server environments from a single virtual image (read this article for more information). I expect the use of this standard and the mechanisms it provides to become quite prevalent in the near future. Now if we could just get virtual disk formats standardized as well. But that's an entirely different topic.

More Stories By Dustin Amrhein

Dustin Amrhein joined IBM as a member of the development team for WebSphere Application Server. While in that position, he worked on the development of Web services infrastructure and Web services programming models. In his current role, Dustin is a technical specialist for cloud, mobile, and data grid technology in IBM's WebSphere portfolio. He blogs at http://dustinamrhein.ulitzer.com. You can follow him on Twitter at http://twitter.com/damrhein.
