Cloud Computing Service Models

A look at the three different service models for cloud computing as defined by NIST

In this post I will look at the three cloud computing service models defined by NIST. More specifically, I will look at the management and operations overhead of each model and compare it to the traditional on-premises model.

Traditional Model

Let's look at how things have been done in the past. Traditionally, enterprises have been responsible for managing their own IT infrastructure as well as the software stack that runs their applications. For small companies that meant hiring polyglot employees with a wide range of skills, from low-level networking to high-level application support. For larger ones that could afford more staff, it meant creating specialized teams responsible only for the networking infrastructure, or only for storage, or for servers and virtualization. However, for a lot of those enterprises the core business has never been managing IT infrastructure - the only thing they are interested in is managing their line-of-business (LOB) applications.

[Figure: Cloud Service Models - management responsibilities across the application stack in the traditional, IaaS, PaaS and SaaS models]

Here are just some of the tasks that enterprise IT teams have been required to perform in the past:

  • Build racks with servers and wire them into the network
  • Build storage arrays and wire them into the network
  • Configure routers
  • Configure firewalls and DMZ zones
  • Install operating system software on the servers
  • Create virtual machines (if virtualization is utilized)
  • Install operating system software on the virtual machines
  • Install databases, set up replication and backups
  • Install middleware used for hosting the application code
  • Patch and update operating system software
  • Patch and update databases
  • Patch and update middleware
  • Patch and update runtime
  • Install application software
  • Patch and update application software

Although long, this is by no means the complete list of tasks that IT personnel have been responsible for. From the list above, only the last two (installing and updating the application software) are essential to the core business of the enterprise. In addition to the IT operational costs (OpEx), enterprises also incurred significant capital expenditures (CapEx) to procure the necessary hardware.

More than a decade ago hosting providers recognized the need to help businesses with those tasks, allowing them to outsource the build-out of infrastructure and concentrate on managing their applications. Although hosting providers helped enterprises with OpEx and CapEx, they still lacked some of the essential cloud characteristics - on-demand self-service, rapid elasticity and measured service - as outlined in Essential Cloud Computing Characteristics.

Infrastructure-as-a-Service (IaaS) Model
The IaaS model was the first that complied with the NIST cloud computing characteristics. In essence it offers a cloud computing environment consisting of virtual machines. It provides a self-service portal where you can start a virtual machine on demand with your preferred operating system; it is broadly accessible; it is elastic (you can easily start an identical virtual machine or shut down an existing one); it draws on a pool of virtual machines collocated on common hardware; and it measures your usage of those virtual machines.
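
To make the self-service and on-demand aspects concrete, here is a minimal sketch of provisioning a virtual machine programmatically. It assumes the AWS SDK for Python (boto3) and configured credentials; the image ID and instance type are placeholders, and other IaaS providers expose equivalent APIs.

```python
# Minimal sketch: on-demand VM provisioning on an IaaS platform (assumes boto3
# is installed and AWS credentials are configured; IDs below are placeholders).
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Start a virtual machine with a chosen operating system image.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder image ID
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print("Started instance:", instance_id)

# Elasticity: shutting the instance down on demand is just another API call.
# ec2.terminate_instances(InstanceIds=[instance_id])
```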

If you look at the picture above you will see that the IaaS model provides automation in the lower layers of the application stack (up to the virtualization layer). That means tasks like starting the virtual machine, adding it to the network, configuring the routing and the firewalls, and attaching storage to it are done automatically by the automation software. The vendor that provides the service is also responsible for handling any hardware failures and servicing the underlying hardware.

As you have already noticed, the IaaS model provides cloud services up to the virtualization layer of the application stack. However, as a consumer of the IaaS service you are still responsible for managing the virtual machine. Hence you are still responsible for patching and updating the operating system on the VM, and for installing and maintaining any databases or middleware your application uses, in addition to maintaining your actual application.
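
As a rough illustration of that remaining responsibility, the sketch below loops over a list of VM addresses and runs the OS package update over SSH. It is only a sketch under assumptions: the hostnames are placeholders, the guests run a Debian/Ubuntu image, and key-based SSH access with passwordless sudo is already set up; in practice you would more likely use a configuration-management tool.

```python
# Sketch only: the consumer of an IaaS service still patches the guest OS.
# Assumes Debian/Ubuntu guests, passwordless sudo, and key-based SSH access;
# the addresses are placeholders.
import subprocess

vm_hosts = ["10.0.0.11", "10.0.0.12"]  # placeholder addresses of your IaaS VMs

for host in vm_hosts:
    print(f"Patching {host} ...")
    subprocess.run(
        ["ssh", f"admin@{host}",
         "sudo apt-get update && sudo apt-get -y upgrade"],
        check=True,
    )
```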

IaaS is very similar to the traditional hosting model with the added benefits of self-service, elasticity and metering.

Platform-as-a-Service (PaaS) Model
With PaaS you have far fewer things to worry about. As you can see from the picture, the whole stack your application needs is managed by the vendor. Your only responsibilities are your application and the data it uses. In addition to the tasks handled in the IaaS case, the vendor (or, in the case of a private PaaS, the platform owner) is also responsible for patching and updating the operating system and for installing and maintaining the middleware as well as the runtime your application uses.

One important thing you need to be aware of when using PaaS is that the automatic updates the vendor performs may sometimes have a negative impact on your application. Why is that? Very often OS and middleware vendors make incompatible changes between versions of their software. If your application depends on any underlying OS or middleware functionality, it may break between platform updates. And because you are not in control of those updates, you may end up with your application being down.

The premise of PaaS, though, is not only to offer a maintenance-free application stack but also additional services that you can use in your application. Very often PaaS providers expose middleware and databases as services and abstract the connectivity to them through APIs, freeing developers from the need to locate the actual systems. Additional services can include authentication and authorization, video encoding, location-based services and so on. Using the PaaS services allows you to decouple your application from the underlying stack, and as long as the APIs are kept intact it will be protected from failures between platform updates.
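
As a small, hedged illustration of that abstraction, the sketch below reads a platform-provided connection string from an environment variable instead of hard-coding a database host. The variable name DATABASE_URL and the get_db_settings helper are assumptions for illustration; the exact binding mechanism differs from platform to platform.

```python
# Sketch only: a PaaS typically injects service bindings (here, an assumed
# DATABASE_URL environment variable) so the app never hard-codes server names.
import os
from urllib.parse import urlparse


def get_db_settings():
    """Hypothetical helper: parse the platform-provided connection string."""
    url = urlparse(os.environ.get("DATABASE_URL", "postgres://localhost:5432/app"))
    return {
        "host": url.hostname,
        "port": url.port,
        "database": url.path.lstrip("/"),
        "user": url.username,
        "password": url.password,
    }


if __name__ == "__main__":
    # The application code stays the same even if the platform moves the
    # database during an update, as long as the binding contract is kept.
    print(get_db_settings())
```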

Software-as-a-Service (SaaS) Model
SaaS is the model with the highest level of abstraction and offers the most maintenance-free option. As a SaaS consumer you are simply using the software offered by the vendor. As depicted in the picture, the whole stack is maintained by the vendor; this also includes updates to the application as well as management of the application data. The SaaS model is very similar to the off-the-shelf software model, where you go and buy the CD, install the software and start using it.

Traditionally, one of the hardest problems application developers had to deal with was data migration between versions. SaaS vendors are also responsible for migrating your data and keeping it consistent; similar to the off-the-shelf software model, you can rely on being able to access and read your data once you upgrade to a new version.

The SaaS model is the most resource-efficient model because it utilizes application multi-tenancy, which means that the same application instance serves multiple user organizations (tenants). This is good for both the vendor and the customer, because better resource utilization brings the maintenance costs down, and hence the price of the service. On the other hand, tenant data is commingled, and there is the security risk of one tenant accidentally getting access to another tenant's data.
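
To make the multi-tenancy point concrete, here is a minimal sketch of a shared table where every row carries a tenant identifier and every query filters on it. It uses an in-memory SQLite database purely for illustration; real SaaS platforms layer far stronger isolation (row-level security, per-tenant keys, auditing) on top of this basic idea.

```python
# Sketch only: one application instance, one shared table, many tenants.
# Every row is tagged with a tenant_id and every query filters on it, which is
# the basic guard against one tenant reading another tenant's data.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (tenant_id TEXT, number TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO invoices VALUES (?, ?, ?)",
    [("acme", "INV-1", 120.0), ("acme", "INV-2", 80.0), ("globex", "INV-9", 45.5)],
)


def invoices_for(tenant_id):
    # The tenant filter must be applied on every access path.
    rows = conn.execute(
        "SELECT number, amount FROM invoices WHERE tenant_id = ?", (tenant_id,)
    )
    return rows.fetchall()


print(invoices_for("acme"))    # [('INV-1', 120.0), ('INV-2', 80.0)]
print(invoices_for("globex"))  # [('INV-9', 45.5)]
```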

Although not exhaustive, the explanation of the cloud computing service models above should be enough to kick-start the initial discussion about your cloud strategy.

More Stories By Toddy Mladenov

Toddy Mladenov has more than 15 years of experience in software development and technology consulting at companies like Microsoft, SAP and 3Com. Currently he is the CTO of Agitare Technologies, Inc. - a boutique consulting company that specializes in cloud computing and big data solutions. Before Agitare Tech, Toddy spent a few years with the PaaS startup Apprenda and more than six years working on Microsoft's cloud computing platform Windows Azure, as well as Windows Client and MSN/Windows Live. During his career at Microsoft he managed different aspects of the software development process for Windows Azure and Windows Services. He also evangelized Microsoft cloud services among open source communities like PHP and Java. In the past he developed enterprise software for the German software giant SAP and several startups in Europe, and managed technical sales for 3Com in the Balkan region.

With his broad industry experience, international background and end-user point of view, Toddy has a unique approach to technology. He believes that technology should be developed to improve people's lives, and he is eager to share his knowledge on topics like cloud computing, mobile and web development.
