A Brief History of Cloud Computing: Is the Cloud There Yet?

A look at the Cloud's forerunners and the problems they encountered

Paul Wallis's Blog

Nick Carr recently commented on IBM's new initiative, Project Kittyhawk, which sets out to use the company's Blue Gene technology. The project aspires to create a “global-scale shared computer capable of hosting the entire Internet as an application”.

There has been a range of online discussion off the back of the article as, once again, Nick Carr has managed to hit more than a couple of raw nerves.

The premise of the article is that IBM's Blue Gene technology is creating computers of such power that data centres can offer vast amounts of computational capacity, which businesses can plug into and use according to their needs at any particular time.

These supercomputers can emulate many individual smaller servers (virtualisation), so businesses can migrate their IT services to this new model.

Rather than data centres just offering a place to put your own servers, they can start to offer virtual servers or services, enabling new business models to be adopted.

The IBM technology is, reportedly, fast enough for Project Kittyhawk to emulate the entire Internet.

In the past, there have been two ways of creating a supercomputer. The first is the Blue Gene-style approach, which creates a massive computer with thousands (or hundreds of thousands) of CPUs. The other approach, as adopted by Google, is to take hundreds of thousands of small, low-cost computers and hook them together in a “cluster” in such a way that they all work together as one large computer.

Basically, supercomputers have many processors plugged into a single machine, sharing common memory and I/O, while clusters are made up of many smaller machines, each of which contains fewer processors and has its own local memory and I/O.
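
To make the distinction concrete, here is a toy Python sketch (purely illustrative, with made-up data and worker counts): the first function mimics the shared-memory model, where every worker reads the same in-memory data, while the second mimics a cluster, where each worker owns a private shard and only results travel between “machines”.

    # Toy contrast of the two architectures (illustrative only). The
    # "supercomputer" uses threads sharing one address space; the
    # "cluster" gives each worker process its own private shard and
    # moves only the partial results between them.
    from concurrent.futures import ThreadPoolExecutor
    from multiprocessing import Pool

    DATA = list(range(1_000_000))

    def shared_memory_sum(workers: int = 4) -> int:
        chunk = len(DATA) // workers
        with ThreadPoolExecutor(max_workers=workers) as ex:
            parts = ex.map(lambda i: sum(DATA[i * chunk:(i + 1) * chunk]),
                           range(workers))
        return sum(parts)

    def cluster_sum(workers: int = 4) -> int:
        chunk = len(DATA) // workers
        shards = [DATA[i * chunk:(i + 1) * chunk] for i in range(workers)]
        with Pool(processes=workers) as pool:     # one process per "machine"
            parts = pool.map(sum, shards)         # only results come back
        return sum(parts)

    if __name__ == "__main__":
        assert shared_memory_sum() == cluster_sum() == sum(DATA)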

There have always been advocates on both sides of the fence, and Nick Carr's article has done a fine job of stirring them into action again - but this time it has become clear that the concept of “The Cloud” is gaining momentum, a concept whose origins lie in clustering and grid computing.

John Willis seeks to 'demystify' clouds and has received some interesting comments. James Urquhart is an advocate of cloud computing and thinks that, as with any disruptive change, some people are in denial about The Cloud. He has responded to some criticism of his opinions. Bob Lewis, one of Urquhart's “deniers”, has written a few posts on the subject and offers a space for discussion of Nick Carr's arguments.

In order to discuss some of the issues surrounding The Cloud concept, I think it is important to place it in historical context. Looking at the Cloud's forerunners, and the problems they encountered, gives us the reference points to guide us through the challenges it needs to overcome before it is adopted.

In the past, computers were clustered together to form a single larger computer. This was a common technique in the industry, used by many IT departments. Clustering allowed you to configure computers to talk with each other using specially designed protocols, balancing the computational load across the machines. As a user, you didn't care which CPU ran your program, and the cluster management software ensured that the “best” CPU at that time was used to run the code.
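
As a sketch of that idea (all names here are hypothetical, and real cluster schedulers weigh far more than a single load figure), the manager below simply routes each submitted job to the least-loaded node; the user never chooses a CPU:

    from dataclasses import dataclass, field

    @dataclass
    class Node:
        name: str
        load: float = 0.0            # pending work units on this node
        jobs: list = field(default_factory=list)

    class ClusterManager:
        """Routes work to the "best" CPU - here, simply the least loaded."""
        def __init__(self, nodes):
            self.nodes = nodes

        def submit(self, job: str, cost: float) -> str:
            best = min(self.nodes, key=lambda n: n.load)
            best.load += cost
            best.jobs.append(job)
            return best.name         # the caller never picked the node

    cluster = ClusterManager([Node("cpu-a"), Node("cpu-b"), Node("cpu-c")])
    for i, cost in enumerate([5, 1, 3, 2, 4]):
        print(f"job-{i} -> {cluster.submit(f'job-{i}', cost)}")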

In the early 1990s, Ian Foster and Carl Kesselman came up with a new concept: “The Grid”. The analogy used was the electricity grid, where users plug in and consume a metered utility service. If companies don't build their own power stations, but instead access a third-party electricity supply, why can't the same apply to computing resources? Plug into a grid of computers and pay for what you use.

Grid computing extends the techniques of clustering: multiple independent clusters, which are not located within a single domain, act together as one grid.

A key to efficient cluster management was engineering where the data was held, known as “data residency”. The computers in the cluster were usually physically connected to the disks holding the data, meaning that the CPUs could quickly perform I/O to fetch, process and output the data.

One of the hurdles that had to be jumped in the move from clustering to grid was data residency. Because of the distributed nature of the Grid, the computational nodes could be situated anywhere in the world. It was fine having all that CPU power available, but the data on which the CPU performed its operations could be thousands of miles away, causing a delay (latency) between data fetch and execution. CPUs need to be fed and watered with different volumes of data depending on the tasks they are processing. Running a data-intensive process with disparate data sources can create an I/O bottleneck, causing the CPU to run inefficiently and affecting economic viability.
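
A back-of-envelope sketch makes the bottleneck visible. The figures below are illustrative assumptions rather than measurements: a job that ships 100 GB over a 100 Mbit/s WAN link but needs only a few CPU-seconds per gigabyte spends most of its life waiting for data.

    # Rough model of remote data residency: time spent moving data to
    # the CPU versus time spent computing on it. All numbers are
    # illustrative assumptions.
    def job_times(gigabytes: float, wan_mbps: float,
                  cpu_seconds_per_gb: float) -> tuple[float, float]:
        transfer = gigabytes * 8_000 / wan_mbps    # seconds moving the data
        compute = gigabytes * cpu_seconds_per_gb   # seconds using it
        return transfer, compute

    t, c = job_times(100, wan_mbps=100, cpu_seconds_per_gb=10)
    print(f"transfer {t:.0f}s vs compute {c:.0f}s")   # transfer 8000s vs compute 1000s
    # The CPU idles roughly 8x longer than it works - the I/O
    # bottleneck that undermines economic viability.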

Storage management, security provisioning and data movement became the nuts that had to be cracked in order for grid to succeed. A toolkit, called Globus, was created to address these issues, but the available infrastructure hardware has still not progressed to a level where true grid computing can be wholly achieved.

But more important than these technical limitations was the lack of business buy-in. The nature of Grid/Cloud computing means a business has to migrate its applications and data to a third-party solution, which creates huge barriers to uptake.

In 2002 I had many long conversations with the European grid specialist for the leading vendor of grid solutions. He was tasked with gaining traction for the grid concept with the large financial institutions and, although his company had the computational resource needed to process the transactions from many banks, his company could not convince them to make the change.

Each financial institution needed to know that the grid company understood their business, not just the portfolio of applications they ran and the infrastructure they ran upon. This was critical to them. They needed to know that whoever supported their systems knew exactly what effect any change could have on their shareholders.

The other bridge that had to be crossed was that of data security and confidentiality. For many businesses, their data is the most sensitive, business-critical thing they possess. Handing it over to a third party was simply not going to happen. Banks were happy to outsource part of their services, but wanted to remain in control of the hardware and software - basically using the outsourcer as an agency for staff.

Traditionally, banks do not like to take risks. In recent years, as the market sector has consolidated and they have had to become more competitive, they have experimented outwith their usual lending practice, only to be bitten by sub-prime lending. Would they really risk moving to a totally outsourced IT solution under today's technological conditions?

Taking grid further into the service offering is “The Cloud”, which takes the concepts of grid computing and wraps them up in a service offered by data centres. The most high-profile of the new “cloud” services is Amazon's S3 (Simple Storage Service), a third-party storage solution. Amazon's solution provides developers with a web service to store data: any amount of data can be read, written or deleted on a pay-per-use basis.
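
The programming model is deliberately simple. A minimal sketch using the modern boto3 SDK (which post-dates this article; the bucket name is hypothetical and credentials are assumed to come from the environment):

    import boto3

    s3 = boto3.client("s3")       # credentials picked up from the environment
    BUCKET = "example-bucket"     # hypothetical; must already exist

    # Write: store any blob of bytes under a key.
    s3.put_object(Bucket=BUCKET, Key="reports/q1.csv", Body=b"id,total\n1,42\n")

    # Read: fetch it back over HTTP.
    body = s3.get_object(Bucket=BUCKET, Key="reports/q1.csv")["Body"].read()

    # Delete: remove it when finished - every call is metered, pay per use.
    s3.delete_object(Bucket=BUCKET, Key="reports/q1.csv")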

EMC plans to offer a rival data service. EMC's solution creates a global network of data centres, each with massive storage capabilities. They take the approach that no one can afford to place all their data in one place, so data is distributed around the globe. Their cloud monitors data usage and automatically shunts data around to load-balance data requests and internet traffic, self-tuning to react to surges in demand.

However, the recent problems at Amazon S3, which suffered a “massive” outage in February, have only served to highlight the risks involved in adopting third-party solutions.

So is The Cloud a reality? In my opinion, we are not yet there with either the technology or the economics required to make it all hang together.

In 2003, the late Jim Gray published a paper, “Distributed Computing Economics”:

Computing economics are changing. Today there is rough price parity between (1) one database access, (2) ten bytes of network traffic, (3) 100,000 instructions, (4) 10 bytes of disk storage, and (5) a megabyte of disk bandwidth. This has implications for how one structures Internet-scale distributed computing: one puts computing as close to the data as possible in order to avoid expensive network traffic.

The recurrent theme of this analysis is that “On Demand” computing is only economical for very cpu-intensive (100,000 instructions per byte or a cpu-day-per gigabyte of network traffic) applications. Pre-provisioned computing is likely to be more economical for most applications - especially data-intensive ones.

If telecom prices drop faster than Moore's law, the analysis fails. If telecom prices drop slower than Moore's law, the analysis becomes stronger.
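
Gray's break-even figure can be reproduced with a few lines of arithmetic. This sketch takes the quoted break-even at face value: if a CPU-day and a gigabyte of WAN traffic cost about the same (a dollar, say), and a ~2003-era CPU executes around 10^9 instructions per second (the instruction rate is my assumption), the threshold falls out directly:

    # Worked version of Gray's break-even arithmetic. The price parity
    # ($1 per WAN-gigabyte, $1 per CPU-day) follows the quote above;
    # the instruction rate is an assumed 2003-era figure.
    DOLLARS_PER_WAN_GB = 1.0
    DOLLARS_PER_CPU_DAY = 1.0
    INSTRUCTIONS_PER_SECOND = 1e9

    ops_per_dollar = 86_400 * INSTRUCTIONS_PER_SECOND / DOLLARS_PER_CPU_DAY
    bytes_per_dollar = 1e9 / DOLLARS_PER_WAN_GB

    # Shipping a byte out for remote processing breaks even only when
    # the remote CPU does at least this much work on that byte:
    print(f"{ops_per_dollar / bytes_per_dollar:,.0f} instructions per byte")
    # -> 86,400: Gray's ~100,000 instructions per byte, a CPU-day per GB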

When Jim published this paper, the fastest supercomputers were operating at a speed of 36 TFLOPS. A new Blue Gene/Q, planned for 2010-2012, will operate at 10,000 TFLOPS, outstripping Moore's Law by a factor of 10. Telecom prices have fallen and bandwidth has increased, but more slowly than processing power, leaving the economics worse than in 2003.

I'm sure that advances will appear over the coming years to bring us closer, but at the moment there are too many issues and costs around network traffic and data movement to allow it to happen for all but a select set of processor-intensive applications, such as image rendering and finite element modelling.

There has been talk of a two-tier internet where businesses pay for a particular Quality of Service, and this will almost certainly need to happen for The Cloud to become a reality. Internet infrastructure will need to be upgraded, and newer, faster technologies will need to be created to ensure that data clouds can speak to supercomputer clouds with enough efficiency to keep the CPUs working. This will push telecoms costs higher rather than bringing them in line with Moore's Law, making the economics less viable.

Then comes the problem of selling to the business. Many routine tasks that are neither processor-intensive nor time-critical are the most likely candidates to be migrated to cloud computing, yet these are the least economical to transfer to that architecture. Recently we've seen the London Stock Exchange fail, undersea data cables cut in the Gulf, espionage in Lithuania and the failure of the most modern and well-known data farm at Amazon.

In such a climate, asking the business to find solid footing in the cloud for mission-critical applications requires a leap of faith.

And that is never a good way to sell to the business.

[This appeared originally here and is republished by kind permission of the author, who retains copyright.]

More Stories By Paul Wallis

Paul Wallis is Chief Technology Officer at Stroma Software Limited. He blogs at www.keystonesandrivets.com, where he tries to bridge the understanding gap between business and IT.
