The Economics of Cloud Computing Analyzed

Addressing the Benefits of Infrastructure in the Cloud

The President's budget for fiscal year 2010 (FY10) includes $75.8B in information technology (IT) spending, which is a 7-percent increase from FY09. Of this, at least $20B will be spent on IT infrastructure investments. [1] The FY11 budget for IT is projected to be nearly $88B. The government is actively seeking ways to reduce IT costs, and the FY10 budget request highlights opportunities for the federal government to achieve significant long-term cost savings through the adoption of cloud computing technologies:

"Of the investments that will involve up-front costs to be recouped in outyear savings, cloud-computing is a prime case in point. The Federal Government will transform its Information Technology Infrastructure by virtualizing data centers, consolidating data centers and operations, and ultimately adopting a cloud-computing business model. Initial pilots conducted in collaboration with Federal agencies will serve as test beds to demonstrate capabilities, including appropriate security and privacy protection at or exceeding current best practices, developing standards, gathering data, and benchmarking costs and performance. The pilots will evolve into migrations of major agency capabilities from agency computing platforms to base agency IT processes and data in the cloud. Expected savings in the outyears, as more agencies reduce their costs of hosting systems in their own data centers, should be many times the original investment in this area." [2]

The language in the budget makes three key points: (1) up-front investment will be made in cloud computing, (2) long-term savings are expected, and (3) the savings are expected to be significantly greater than the investment costs.

Booz Allen Hamilton has created a detailed cost model that can create life-cycle cost (LCC) estimates of public, private, and hybrid clouds. We used this model, and our extensive experience in economic analysis of IT programs, to arrive at a first-order estimate of each of the three key points in the President's budget. Overall, it appears likely that the expectations highlighted in the budget can be met, but several factors could affect the overall degree of economic benefit.

Economic Implications
The government's adoption of this new IT model warrants careful consideration of the model's broad economic implications, including the potential long-term benefits in terms of cost savings and avoidance as well as the near-term costs and other impacts of a transition from the current environment. Factors such as the number and rate of federal agencies adopting cloud computing, the length of their transitions to cloud computing, and the cloud computing deployment model (public, private, or hybrid) all will affect the total costs, potential benefits, and time required for the expected benefits to offset the investment costs.

Booz Allen developed a first-order economic analysis by considering how agencies might migrate to a cloud-based environment and what the costs and potential savings might be under a variety of scenarios. Specifically, given long-standing efforts to protect the privacy and security of the federal government's data and systems, a key variable will be whether agencies take advantage of public clouds, build their own private clouds, or adopt a hybrid approach. The focus was on cloud computing infrastructure services, as these tend to represent a relatively consistent set of costs, investments, and operating requirements across all agencies. We made some high-level, simplifying assumptions in our initial analysis:

  1. An existing data center (or data centers) is already operational and serves as the baseline for economic comparison with migration to a cloud environment.
  2. Existing application software will migrate with the infrastructure to the cloud. Application software support costs remain out of scope.
  3. Migration decisions will be made at the department or agency (rather than bureau) level in order to aggregate demand and drive scale efficiencies.
  4. We assume the perceived sensitivity of an agency's mission and data will be a primary factor (though by no means the only factor) driving its decisions on which path to follow.

Next, we developed three high-level scenarios that represent potential migration paths. The three scenarios are as follows:

Scenario 1: Public Cloud Adopters
Key Agency Characteristic:
Migrates low-sensitivity data to an existing public cloud.

Assumptions: Transition to the new cloud environment will occur steadily over 3 years; workload remains constant (i.e., no increase in capacity demand).

Scenario 2: Hybrid Cloud Adopters
Key Agency Characteristic:
Uses a private cloud solution to handle the majority of its IT workload; also uses a public cloud solution to provide "surge" support and/or support for low-sensitivity data.

Assumptions: Seventy-five percent of the IT server workload will migrate to a private cloud, and the remaining 25 percent will transition to a public cloud; transition to the new cloud environments will occur steadily over 3 years; existing facilities will be used (i.e., no new investment is required in physical facilities); workload remains constant (i.e., no increase in capacity demand).

Scenario 3: Private Cloud Adopters
Key Agency Characteristic:
Builds its own private cloud solution or participates in an interagency cloud solution (i.e., community cloud). Broad mission sensitivity results in the need to maintain control of infrastructure and data.

Assumptions: Transition to the new cloud environment will occur steadily over 3 years; existing facilities will be used (i.e., no new investment is required in physical facilities); workload remains constant (i.e., no increase in capacity demand).

Agencies publicly report only their "consolidated" IT infrastructure expenditures, which include end-user support systems (e.g., desktops, laptops) and telecommunications. Additional spending on application-specific IT infrastructure is typically rolled up into individual IT investments. In an effort to isolate data center costs, we extrapolated findings based on our experience with actual federal data centers. Specifically, we developed a "representative" agency data center profile that serves as a useful proxy for other agencies and enables us to explore the potential savings of a migration to cloud computing under the scenarios described above. Although agencies of similar size can have very different IT infrastructure profiles, we modeled an agency with a classic standards-based web application infrastructure. For our representative agency, we began with an assumption that a Status Quo (SQ) data center containing 1,000 servers with no virtualization is already operational. [3]  The results at different scales are shown in our analysis.

Using a Booz Allen proprietary cloud computing cost and economic model that employs data collected internally, data from industry, and parametric estimating techniques, we estimated the LCCs for our representative agency to migrate its IT infrastructure (i.e., its server hardware and software) to the cloud under each of the three scenarios described above. We compared these costs to the LCCs of the SQ scenario (i.e., no cloud migration). [4] We also calculated three common metrics to analyze each scenario's potential economic benefits. These metrics allowed us to evaluate the three elements of the business case in the President's budget and estimate the absolute and relative benefits, as well as the time over which the outyear savings will pay back the investment costs.

The three key metrics in our analysis are as follows:

  • Net Present Value (NPV) is calculated as each cloud scenario's discounted net benefits (i.e., the cloud scenario's reduced operations and support [O&S] costs relative to the SQ environment's O&S costs) minus the cloud's discounted one-time investment costs. A positive dollar figure indicates a positive economic benefit versus the SQ environment. NPV is an absolute economic metric.
  • Benefit-to-Cost Ratio (BCR) is calculated as each cloud scenario's discounted net benefits divided by its discounted investment costs. A number greater than 1.0 indicates a positive economic benefit versus the SQ environment. BCR is a relative economic metric.
  • Discounted Payback Period (DPP) reflects the number of years (from FY10) it takes for each scenario's accumulated annual benefits to equal its total investment costs. (A simple illustrative calculation of all three metrics follows this list.)
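
As a purely illustrative aid, the sketch below computes these three metrics for a hypothetical 13-year cash-flow profile with a 3-year investment phase. The dollar figures and discount rate are invented placeholders, not outputs of the Booz Allen model.

```python
# Minimal sketch of NPV, BCR, and DPP as defined above.
# All figures are hypothetical placeholders, not Booz Allen model outputs.

def discount(values, rate):
    """Discount a stream of annual values (index 0 = FY10) to present value."""
    return [v / (1 + rate) ** t for t, v in enumerate(values)]

# Hypothetical 13-year profile ($M): 3-year investment phase, then steady state.
investment_costs = [10, 8, 6] + [0] * 10       # one-time cloud transition costs
net_o_and_s_savings = [0, 3, 6] + [12] * 10    # O&S savings relative to the SQ baseline
rate = 0.03                                    # illustrative discount rate

pv_benefits = discount(net_o_and_s_savings, rate)
pv_costs = discount(investment_costs, rate)

npv = sum(pv_benefits) - sum(pv_costs)   # absolute metric: positive = net benefit
bcr = sum(pv_benefits) / sum(pv_costs)   # relative metric: > 1.0 = net benefit

# DPP: first year in which cumulative discounted benefits cover the cumulative
# discounted investment costs.
cumulative, dpp = 0.0, None
for year, (b, c) in enumerate(zip(pv_benefits, pv_costs)):
    cumulative += b - c
    if dpp is None and cumulative >= 0:
        dpp = year

print(f"NPV = ${npv:.1f}M, BCR = {bcr:.2f}, DPP = FY{10 + dpp}")
```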

The top portion of Exhibit 1 shows the analysis results. This exhibit presents the one-time investment phase costs as well as the recurring O&S phase costs for each scenario with a 13-year life cycle (3-year investment phase and 10-year steady-state O&S phase) from FY10 through FY22.

Assuming a 3-year transition period for each scenario, investment costs are expected to be incurred from FY10 to FY12 and include (depending on the scenario) hardware procurement and commercial off-the-shelf (COTS) software license fees; contractor labor required for installation, configuration, and testing; and technical and planning support (i.e., system engineering and program management costs) before and during the cloud migration. Because the SQ reflects an operational steady state, no investment costs are estimated for that scenario. Although the public cloud scenario does not present any up-front investment costs for hardware or software procurement, it does require program planning and technical support, support for porting applications over to the new cloud environment, and testing support to ensure programs and applications are working correctly in the new environment.

Recurring O&S costs "ramp up" for all cloud scenarios beginning in FY10 and enter steady state in FY13, continuing through FY22. For private clouds, these costs include hardware and software maintenance, periodic replacement/license renewal costs, system operations labor support costs, and IT power and cooling costs. For hybrid clouds, the O&S costs include the same items as the private cloud (albeit on a reduced scale), as well as the unit consumption costs of IT services procured from the public cloud. For public cloud scenarios, the O&S costs are the unit costs of services procured from the cloud provider and a small amount of IT support labor for the cloud provider to communicate any service changes or problems. In all three cloud scenarios, a significant portion of the O&S costs are incurred while phasing out the SQ environment during the transition. The SQ phase-out costs "ramp down" from FY10 to FY12, dove-tailing with the ramp up of the new clouds' O&S costs. Not surprisingly, the total LCCs are lowest for the public cloud scenario and highest for the private cloud scenario, with the hybrid cloud scenario's LCCs falling in the middle.

The economic analysis confirms that the projected NPV and BCR for all three scenarios are significant relative to the SQ environment. Once the cloud migrations are completed, our model estimates annual O&S savings in the 65-85 percent range, with the lower end corresponding to the private cloud scenario and the upper end corresponding to the public cloud scenario. These percentages can be applied to overall federal IT spending for data centers to estimate the potential absolute savings across the federal government. (As part of the Information Technology Infrastructure Line of Business [ITI LoB] initiative, General Services Administration [GSA] is coordinating a benchmarking effort across the government. If those figures are made public, a total dollar savings estimate will be possible).

Our model shows that the net benefits and payback periods for agencies adopting the hybrid cloud scenario are closer to those for the private cloud than the public cloud. This variation is largely a result of our assumption that 75 percent of the current server workload would migrate to a private cloud and only 25 percent would transition to the public cloud. If we were instead to assume the opposite mix (i.e., 25 percent of the workload migrating to a private cloud and 75 percent to a public cloud), the hybrid scenario economic results would be closer to the public cloud results.

We conducted a sensitivity analysis on several of the variables in our cost model to determine the major drivers for cloud economics. The two most influential factors driving the economic benefits are (1) the reduction in hardware as a smaller number of virtualized servers in the cloud replace physical servers in the SQ data center and (2) the length of the cloud migration schedule. Exhibits 2, 3, and 4 show the results of varying these factors.

In practice, several factors could cause agencies to realize lower economic benefits than our estimates suggest. One factor is underestimation of the costs associated with the investment or O&S phase for the cloud scenarios. Another factor is server utilization rates (both in the current environment and the new cloud environment). Our analysis assumes an average utilization rate of 12 percent of available CPU capacity in the SQ environment and 60 percent in the virtualized cloud scenarios. This difference in server utilization, in turn, enables a large reduction in the number of servers (and their associated support costs) required in a cloud environment to process the same workload relative to the SQ environment. Agencies with server utilization rates that are already relatively high should expect lower potential savings from a virtualized cloud environment.
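
A rough back-of-the-envelope sketch of that consolidation effect follows; the utilization figures are the assumptions stated above, while the arithmetic itself is ours and is not drawn from the proprietary cost model.

```python
import math

# Back-of-the-envelope server consolidation arithmetic (illustrative only).
sq_servers = 1000          # representative agency data center described earlier
sq_utilization = 0.12      # assumed average CPU utilization in the SQ environment
cloud_utilization = 0.60   # assumed average utilization of virtualized cloud servers

# Delivered capacity is roughly servers x utilization, so the cloud needs only
# enough virtualized servers to supply the same utilized capacity.
cloud_servers = math.ceil(sq_servers * sq_utilization / cloud_utilization)
print(f"{sq_servers} SQ servers -> about {cloud_servers} virtualized servers")   # ~200

# An agency already running at, say, 40% utilization sees much less consolidation.
print(math.ceil(sq_servers * 0.40 / cloud_utilization))                          # ~667 servers
```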

The charts indicate two key takeaways:

  • Scale is important: The economic benefit increases as virtualized servers replace larger numbers of underutilized servers.
  • Time is money: Because of the cost of parallel IT operations (i.e., cloud and non-cloud), the shorter the server migration schedule, the greater the economic benefits.

These findings, in turn, lead us to the following recommendations for agencies and policymakers contemplating a cloud migration:

  • It is more cost-effective to group smaller existing data centers together into as large a cloud as possible, rather than creating several smaller clouds.
  • To reduce the cost of running parallel operations, organizations should properly plan for and then migrate to the new cloud environment as quickly as possible. The three lines in Exhibit 5 show (in this case, for the public cloud) that the BCR drops rapidly and the DPP increases as the transition time lengthens. (A rough sketch of this effect follows the list.)
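
The sketch below illustrates the parallel-operations effect with invented numbers: the longer a linear migration takes, the longer the agency pays for both environments and the smaller the net savings. The annual cost figures, one-time migration cost, and linear migration profile are assumptions for illustration only.

```python
# Illustrative-only sketch of why stretching the migration schedule erodes savings:
# during the transition the agency pays for both the shrinking SQ environment and
# the growing cloud environment.

def total_cost(migration_years, sq_cost=20.0, cloud_cost=5.0, one_time=15.0):
    """Total spend ($M) over a 13-year horizon with a linear migration."""
    total = one_time
    for t in range(13):
        moved = min(1.0, (t + 1) / migration_years)   # fraction of workload migrated
        total += (1 - moved) * sq_cost                # residual legacy (SQ) operations
        total += moved * cloud_cost                   # new cloud operations
    return total

status_quo = 13 * 20.0   # cost of never migrating
for years in (2, 3, 5, 7):
    print(f"{years}-year migration: net savings of about ${status_quo - total_cost(years):.0f}M")
```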

Budgeting Implications
A few agencies are already moving quickly to explore cloud computing solutions and are even redirecting existing funds to begin implementations. However, for most of the federal government, the timeframe for redirecting IT funding to support cloud migrations is likely to be at least 1-2 years, given that agencies formulate budgets 18 months before receiving appropriations.

Specifically, an agency develops IT investment requests each spring and submits them to the Office of Management and Budget (OMB) in September, along with the agency's program budget request, for the following government fiscal year. OMB reviews agency submissions in the fall and can implement funding changes via passback decisions (generally in late November) before submitting the President's budget to the Congress in February. Theoretically, the earliest opportunity for OMB to push agencies to revise their IT budgets to support a transition to the cloud will be fall 2009; however, agencies typically only have about 1 month to incorporate changes to their IT portfolios during passback. To give GSA and OMB time to develop more detailed guidance, as well as necessary procurement mechanisms and vehicles, it is more likely that OMB will direct or encourage agencies to plan for cloud migrations during the FY12 budget cycle (starting in the spring of 2010).

Other Considerations with Potential Economic Effects
When deciding whether to move to the cloud, agencies need to consider some additional technical aspects of cloud computing and their potential impact on the organization. Such areas include, but are not limited to, data security, software migration, technical architectures, and the skill set of the IT workforce.

Data Security

All government organizations struggle with ensuring that their data remains secure and adheres to current policies and regulations. Because data security is such a critical issue, cloud providers will be required to address it in their products and services, and should be able to tailor the level of security to meet demand. Additionally, by centralizing data and servers, a cloud environment allows for easier detection and investigation of incidents, enabling IT staff to replicate and address them efficiently.

However, there are currently no security standards for cloud computing. Until such standards have been developed and used effectively to measure provider services and enforce accountability, responsibility for any failures will fall on the agency's in-house IT organization. Given this reality, organizations should be careful about putting mission-critical and core processes into a public cloud, and private cloud architectures should be designed to minimize security concerns while realizing the benefits of cloud optimization.

Service Oriented Architecture
As the government moves toward embracing Service-Oriented Architecture (SOA), cloud computing will maximize the benefits of those investments. Cloud computing is inherently service oriented, and implementing private clouds will provide more control over data, security, and privacy.

Migration of Applications to the Cloud
This article focuses on the financial benefits of migrating IT infrastructure to the cloud; the costs of migrating application software itself are outside the scope of this analysis (see assumption 2 above) and should be assessed separately by each agency.

Workforce
Cloud architectures and service delivery models will change the technical skills that agencies' IT workforces need. CIOs will need to plan to conduct or refresh workforce assessments and training, and to set aside the necessary funding, to ensure technical staff are trained on cloud architecture, implementation, and operations.

Economic Influence on Policy
From an economic perspective, GSA and OMB can take a number of steps to maximize the probability that the cloud computing business model can work in the federal government; i.e., that it can achieve its objective of enabling significant cost savings. These steps promote information sharing and transparency in the realistic costs and benefits of various cloud models, as well as establishing the necessary policy and contracting frameworks. Because scale is a key variable affecting both costs and benefits, policy guidance regarding scale considerations will be particularly critical (e.g., determining how much flexibility, if any, agencies and departments have to create private clouds at the bureau and/or interagency level).

As a cloud "storefront," GSA should conduct due diligence reviews to establish that public cloud providers, once identified, indeed offer highly efficient, highly scalable (both up and down) usage-based pricing beyond traditional managed services (e.g., by comparing proposed rates against commercial benchmarks). GSA should also work with potential providers to ensure agencies can readily understand service definitions, service levels, terms, conditions, and pricing. These steps will provide transparency to facilitate agencies' ability to compare potential provider pricing against their legacy operations costs, an essential component of building a credible business case for any type of cloud migration. In earlier shared services initiatives, such as financial management, the lack of such standardized information on pricing and service levels in the first few years proved a major impediment to progress, as agencies faced decisions about alternative solutions that were often based on unreliable cost data from potential vendors.

Finally, GSA will need to establish and communicate its own schedule for cloud services, based on the pricing for those services from different cloud vendors.

Summary of Key Observations

Although cloud computing offers potentially significant savings to federal agencies by reducing their expenditures on server hardware and associated support costs, chief information officers, policymakers, and other interested parties should bear in mind a number of practical considerations:

  • It will take, on average, 18-24 months for most agencies to redirect funding to support this transition, given the budget process.
  • Some up-front investment will be required, even for agencies seeking to take advantage of public cloud options.
  • Implementations may take several years, depending on the size of the agency and the complexity of the cloud model it selects (i.e., public, private, or hybrid).
  • It could take as long as 4 years for the accumulated savings from agency investments in cloud computing to offset the initial investment costs; this timeframe could be longer if implementations are improperly planned or inefficiently executed.

Given these observations, we offer the following recommendations:

  • OMB, GSA, and other organizations, such as the National Institute of Standards and Technology (NIST), should provide timely, well-coordinated support (in the form of necessary standards, guidance, policy decisions, and issue resolution) to ensure agencies have the necessary tools to efficiently plan and carry out migrations to cloud environments. As the length of the migration period increases, the potential economic benefits of the migration decrease.
  • OMB and GSA should seek to identify those agencies with the highest near-term IT costs and expedite their migration to the cloud.
  • To encourage steady progress, OMB should establish a combination of incentives and disincentives; e.g., consider allowing agencies to retain a small percentage of any savings realized from cloud computing for investments in future initiatives. To monitor progress and heighten transparency and accountability, OMB could incorporate cloud-related metrics into the new government-wide IT dashboard.
  • Agencies should consider which of the high-level scenarios described in this article best suits their needs, with the understanding that regardless of scenario chosen, proper planning and efficient execution are critical success factors from an economic perspective.
  • Given the significant impact of scale efficiencies, agencies selecting a private cloud approach should fully explore the potential for interdepartmental and interagency collaboration and investment (consistent with emerging OMB and GSA guidance). This, in effect, leads to the fourth cloud deployment model-the community cloud. A community cloud is a collaboration between private cloud operators to share resources and services.
  • Agencies should identify the aspects of their current IT workload that can be transitioned to the cloud in the near term to yield "early wins" to help build momentum and support for the migration to cloud computing.

Cloud computing has received executive backing and offers clear opportunities for agencies to significantly reduce their growing data center and IT hardware expenditures. However, for the government to achieve the envisioned savings, organizations charged with oversight, such as OMB, NIST, and GSA, will have to facilitate progress, and departments and agencies will have to carefully select and plan for future cloud scenarios that yield the best tradeoffs among their respective costs, benefits, and risks.

References

  1. Figures from INPUT data for the FY10 President's budget; of the $20B in expenditures categorized as office automation and IT infrastructure spending, about $12.2B is spent on major IT investments, with the remainder on non-majors. Additional expenditures on application-specific IT infrastructure are typically reported as part of individual IT investments.
  2. President's budget, FY10 (Analytical Perspectives).
  3. The 1,000 servers are broken down in our cost model by server processing capacity (small, medium, and large) based on proportions consistent with our experience.
  4. Our model focuses on the costs that a cloud migration will most likely directly affect; i.e., costs for server hardware (and associated support hardware, such as internal routers and switches, rack hardware, cabling, etc.), basic server software (OS software, standard backup management, and security software), associated contractor labor for engineering and planning support during the transition phase, hardware and software maintenance, IT operations labor, and IT power/cooling costs. It does not address other costs that would be less likely to vary significantly between cloud scenarios, such as storage, application software, telecommunications, or WAN/LAN. In addition, it does not include costs for government staff. Further, for simplicity we removed facilities costs from the analysis.

More Stories By Ted Alford

Ted Alford, an Associate at Booz Allen Hamilton, has 20 years of professional experience providing cost and economic analysis support to federal government clients, including the National Security Agency, Department of Defense, Department of Labor, Federal Aviation Administration, and Defense Logistics Agency. He has specifically focused on estimating the costs and benefits and analyzing the economics of information technology projects. Over the years, Mr. Alford has been the lead analyst supporting the development of analyses of alternatives, program office estimates, economic analyses, and cost benefit analyses. In supporting these efforts, he has developed life-cycle cost estimates, estimated quantifiable benefits, analyzed cost and schedule risks, and analyzed justification of investment decisions.

More Stories By Gwen Morton

Gwen Morton is a Senior Associate in Booz Allen Hamilton’s economic and business analysis practice. She has more than 16 years of experience supporting both government and commercial clients in conducting financial, economic, and market analyses to support executive decision-making, with particular expertise in IT capital planning, benefits estimation, and performance measures and management. Ms. Morton’s major clients include the Department of Treasury, Department of Agriculture, Social Security Administration, Department of the Interior, and General Services Administration.

Most Recent Comments
jhbeil 10/21/09 03:51:00 PM EDT

so when is "cloudonomics" going to hit the bookshelves?

Phillip Hallam-Baker 10/20/09 09:30:00 PM EDT

Looking at the numbers in the article a little further, it is assumed that the utilization rate will increase from 16% to 60% and that the reduction in the number of machines is the reason for the purported 60% cost saving.

The only way I can make those numbers work is if it is assumed that 80% of the costs in a data center are driven by nothing more than the number of machines in the data center that are powered.

This seems to be an absurdly high assumption to me.

Phillip Hallam-Baker 10/19/09 05:07:00 PM EDT

I found the basic assumptions in this article to be unsupported. It is really easy to assume 65% savings from an infrastructure change if you ignore most of the costs of making the change.

I examine this in more detail on my blog.

I think this type of article will do great damage to cloud computing as it sets out claims that are simply ludicrous and will not be believed. It is entirely credible that newly deployed software services will be cheaper when designed for cloud deployment. It is not credible that anyone should expect to save a single dollar by taking a deployed application that does not otherwise need changing and throwing it into the cloud.

Once hardware costs are sunk, they are sunk. Thus there are no savings to be won through 'migration' if you are a large corporation or a government agency. There will be real savings, but they will be modest and come gradually.

The savings from cloud computing will be for the smaller enterprise right down to the small business which does not even have a machine room let alone a data center. There the savings are real and dramatic. But let's not get cloud computing dismissed as hype with unsupported claims.
