COTS Cloud security reference design and related NIST workshop

By Bob Gourley

Since the beginning of the modern Cloud movement (which we trace to November 2006; see here if you want to know why), technologists have been seeking ways to mitigate key risks. At the top of our list are:

1) The increased risk due to multi-tenancy

2) The mission need for availability (including an always-available path to resources)

3) New and at times nuanced challenges regarding data confidentiality

4) New challenges regarding data integrity

There are many other policy-related risks that planners must consider, including how to establish the best user authentication methods and how to ensure compliance with the regulations and laws of the geography that holds the data. But for a technologist, the four above are a continual concern, and mitigating those technical concerns makes the others far easier to address.

That is why we read with such great pleasure a recent announcement that NIST is continuing to work with industry to ensure advancements are being made in cloud security. The NIST National Cybersecurity Center of Excellence (NCCoE) in Rockville, MD is a focal point for many great industry/government interactions, including a workshop at their facility on January 14 that we are especially excited about.

This workshop is on the topic of Trusted Geolocation in the Cloud. It covers a proof-of-concept implementation built on technology that has proven to be among the most scalable on the globe: Intel processors. Technologists presenting and discussing these developments come from Intel, EMC-RSA, NIST and the NCCoE. This will be a great workshop that includes hands-on demonstrations of this technology, and we believe it will show ways to help mitigate all four of the challenges we list above.

Following the workshop, the NCCoE will host a two-day cloud computing event (details can be found here).

From the workshop flyer:

An upcoming workshop will be held at the NIST National Cybersecurity Center of Excellence (NCCoE) facility in Rockville, MD on Monday, January 14, on Trusted Geolocation in the Cloud: Proof of Concept Implementation.

NIST and private industry will present a very interesting workshop to a technical audience next week on Monday the 14th, covering a cloud use case that addresses the security challenges of Infrastructure as a Service (IaaS) cloud computing technologies and geolocation.

The motivation behind this use case is to improve the security of cloud computing and accelerate the adoption of cloud computing technologies by establishing an automated hardware root of trust method for enforcing and monitoring geolocation restrictions for cloud servers. A hardware root of trust is an inherently trusted combination of hardware and firmware that maintains the integrity of the geolocation information and the platform. This information is accessed using secure protocols to assert the integrity of the platform and confirm the location of the host.

At the heart of the solution is a reference design built from commercial off-the-shelf (COTS) products provided by Intel, VMware and RSA Archer. The use case is of significant relevance to US Federal agencies in solving the security problem in question: improving the security of virtualized infrastructure cloud computing technologies by enforcing geolocation restrictions.

NIST now moves in conjunction with private industry in a workshop specific to this research (attached to this email) that explains and details how to implement this trusted cloud solution on January 14th at the NIST National Cybersecurity Center of Excellence (NCCoE).

Audience 

This workshop and IR document have been created for security researchers, cloud computing practitioners, system integrators, and other parties interested in techniques for solving the security problem in question: improving the security of virtualized infrastructure cloud computing technologies by enforcing geolocation restrictions.

Agenda:

2:00 PM – 2:15 PM  NCCoE Introduction (NIST)
2:15 PM – 2:30 PM  Trusted Cloud Description (NIST)
2:30 PM – 2:45 PM  Trusted Geolocation in the Cloud Implementation: Trusted Measurement and Remote Attestation (Intel Corporation)
2:45 PM – 3:00 PM  Trusted Geolocation in the Cloud: Monitoring of Measurements in a Governance, Risk, and Compliance Dashboard (EMC-RSA)
3:00 PM – 3:15 PM  Trusted Cloud Demonstration (Intel, EMC-RSA, and NIST)
3:15 PM – 4:00 PM  Questions and Answers / Hands-on Session (Intel, EMC-RSA, and NIST)

 

Participation from all parties is welcomed. To register for this workshop, please send an email with the attendee’s name, affiliation, and email address in the body of the message to [email protected], with the subject “Trusted Location in the cloud” by January 13, 2013.

This workshop is now part of their Big Data and Cloud Computing Workshop to be held at the NIST HQ in Gaithersburg, MD on January 15-17. http://www.nist.gov/itl/cloud/cloudbdworkshop.cfm

The importance of this secure cloud computing proof of concept can be seen in the NIST draft publication linked below, which details this reference design and clearly delineates how to stand up this secure cloud structure. The NIST Interagency Report (NISTIR) is a public/private collaboration with co-authors from both NIST and private industry and is now taking public comments: http://csrc.nist.gov/publications/drafts/ir7904/draft_nistir_7904.pdf

____________________________________________________________________________________

Background Information taken from NISTIR 7904:

Shared cloud computing technologies are designed to be very agile and flexible, transparently using whatever resources are available to process workloads for their customers. However, there are security and privacy concerns with allowing unrestricted workload migration. Whenever multiple workloads are present on a single cloud server, there is a need to segregate those workloads from each other so that they do not interfere with each other, gain access to each other’s sensitive data, or otherwise compromise the security or privacy of the workloads. Imagine two rival companies with workloads on the same server; each company would want to ensure that the server can be trusted to protect their information from the other company.

Another concern with shared cloud computing is that workloads could move from cloud servers located in one country to servers located in another country. Each country has its own laws for data security, privacy, and other aspects of information technology (IT). Because the requirements of these laws may conflict with an organization’s policies or mandates (e.g., laws, regulations), an organization may decide that it needs to restrict which cloud servers it uses based on their location. A common desire is to use only cloud servers physically located within the same country as the organization. Determining the approximate physical location of an object, such as a cloud computing server, is generally known as geolocation. Geolocation can be accomplished in many ways, with varying degrees of accuracy, but traditional geolocation methods are not secured and are enforced through management and operational controls that cannot be automated or scaled; as a result, traditional geolocation methods cannot be trusted to meet cloud security needs.
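The placement restriction described above can be sketched as a simple policy filter. This is a hypothetical illustration only, not part of the NIST reference design: the host names, geotags, and attestation flags are invented for the example, and a real deployment would obtain these values from hardware-based attestation rather than a static list.

```python
# Hypothetical sketch: restrict workload placement to hosts whose
# attested geolocation falls inside an allowed set of countries.
ALLOWED_COUNTRIES = {"US"}

# Invented example inventory; real values would come from attestation.
hosts = [
    {"name": "cloud-host-1", "geotag": "US", "attested": True},
    {"name": "cloud-host-2", "geotag": "DE", "attested": True},
    {"name": "cloud-host-3", "geotag": "US", "attested": False},
]

def eligible_hosts(hosts, allowed):
    """Keep only hosts that are both attested and in an allowed country."""
    return [h["name"] for h in hosts
            if h["attested"] and h["geotag"] in allowed]

print(eligible_hosts(hosts, ALLOWED_COUNTRIES))  # ['cloud-host-1']
```

A scheduler applying this filter would simply refuse to migrate a workload to any host that fails either check, which is the automated enforcement the use case calls for.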

The motivation behind this use case is to improve the security of cloud computing and accelerate the adoption of cloud computing technologies by establishing an automated hardware root of trust method for enforcing and monitoring geolocation restrictions for cloud servers. A hardware root of trust is an inherently trusted combination of hardware and firmware that maintains the integrity of the geolocation information and the platform. The hardware root of trust is seeded by the organization, with the host’s unique identifier and platform metadata stored in tamperproof hardware. This information is accessed using secure protocols to assert the integrity of the platform and confirm the location of the host.
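The measurement-and-comparison step can be sketched as follows. This is a hypothetical illustration, not the NISTIR's actual protocol: the known-good values, host identifier, and hash inputs are invented, and a real system would verify a signed TPM quote over a secure channel rather than compare bare hashes.

```python
import hashlib
import hmac

# Invented known-good values the organization would provision in
# tamperproof hardware; here they live in a plain dict for illustration.
KNOWN_GOOD = {
    "host-42": {
        "measurement": hashlib.sha256(b"trusted-firmware-v1").hexdigest(),
        "geotag": "US",
    }
}

def verify_host(host_id, reported_measurement, reported_geotag):
    """Accept a host only if its platform measurement and geotag both
    match the provisioned known-good values."""
    expected = KNOWN_GOOD.get(host_id)
    if expected is None:
        return False
    # Constant-time comparison of the measurement hash.
    measurement_ok = hmac.compare_digest(
        expected["measurement"], reported_measurement)
    return measurement_ok and expected["geotag"] == reported_geotag

good = hashlib.sha256(b"trusted-firmware-v1").hexdigest()
print(verify_host("host-42", good, "US"))  # True
print(verify_host("host-42", good, "DE"))  # False
```

The point of the sketch is only the shape of the check: integrity of the platform and location of the host are asserted together, so a workload is trusted only when both match.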



For media interviews and comments, please contact:

Kevin Fiftal

Intel Corporation

 


More Stories By Bob Gourley

Bob Gourley writes on enterprise IT. He is a founder and partner at Cognitio Corp and publisher of CTOvision.com
