Government Can Influence Cloud Interoperability | @CloudExpo #API #SaaS #Cloud #FedRAMP

Why government agencies could lead the way in demanding inter-public cloud interoperability and standardization

The next BriefingsDirect thought leadership panel discussion explores how public-sector organizations can gain economic benefits from cloud interoperability and standardization.

Our panel comes to you in conjunction with The Open Group Paris Event and Member Meeting October 24 through 27, 2016 in France, with a focus on the latest developments in eGovernment.

As government agencies move to the public cloud computing model, the use of more than one public cloud provider can offer economic benefits through competition and choice. But are the public clouds standardized sufficiently for true interoperability, and can the large government contracts in the offing for cloud providers have an impact on the level of maturity around standardization?

To learn how to best procure multiple cloud services as eGovernment services at low risk and high reward, we're joined by our panel: Dr. Chris Harding, Director for Interoperability at The Open Group; Dave Linthicum, Senior Vice President at Cloud Technology Partners; and Andras Szakal, Vice President and Chief Technology Officer at IBM U.S. Federal. The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Andras, I've spoken to some people in the lead-up to this discussion about the level of government-sector adoption of cloud services, especially public cloud. They tell me that it’s lagging the private sector. Is that what you're encountering, that the public sector is lagging the private sector, or is it more complicated than that?

Szakal: It's a bit more complicated than that. Born-on-the-cloud adoption in the private sector is probably much greater than in the public sector, and that's where the two differ. The industry at large, from a born-on-the-cloud point of view, is very much ahead of the public-sector, government implementation of born-on-the-cloud applications.

What really drove that was innovations like the Internet of Things (IoT), gaming systems, and platforms, whereas the government environment was more about taking existing citizen-to-government and government-to-government shared services and putting them into the cloud environment.

When you're talking about public cloud, you have to be very specific about the public sector and government, because most governments have their own industry instance of the cloud. In the federal government space, they're acutely aware of the FedRAMP-certified public-cloud environments. Those range from FedRAMP Moderate, where you have access to the yummy goodness of the entire cloud industry, to FedRAMP High, which isolates these clouds into their own environments in order to increase the level of protection and lower the risk to the government.

So, the cloud service providers (CSPs) have created instances of their commercial clouds that are fit for purpose for the federal government. In that case, if we're talking about enterprise applications shifting to the cloud, we're seeing the public-sector, government side, at the national level, move very rapidly compared to some of the commercial enterprises, which are more leery about what the implications of that movement may be over a period of time. Nobody is mandating by law that commercial enterprises make that move, whereas that is the case on the government side.
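
To make the idea of a fit-for-purpose government cloud instance a bit more concrete, here is a minimal Python sketch that points a standard SDK (boto3) at an isolated AWS GovCloud region rather than the commercial endpoints. The profile name, region, and bucket listing are illustrative assumptions, not details from the discussion.

```python
# Minimal sketch: pointing an SDK at an isolated, FedRAMP-authorized
# cloud instance instead of the commercial one. The profile and region
# names are illustrative assumptions.
import boto3

# GovCloud credentials are separate from commercial accounts, so a
# dedicated profile (assumed to exist in ~/.aws/credentials) is used.
session = boto3.Session(profile_name="govcloud", region_name="us-gov-west-1")
s3 = session.client("s3")

# The call pattern is identical to the commercial cloud; only the
# authorization boundary (endpoint, credentials, region) changes.
for bucket in s3.list_buckets().get("Buckets", []):
    print(bucket["Name"])
```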

Attracting contracts

Gardner: Dave, it seems that if I were a public cloud provider, I couldn't think of a better customer, a better account in terms of size and longevity, than some major government agencies. What are we seeing from the cloud providers in trying to attract the government contracts and perhaps provide the level of interoperability and standardization that they require?

Linthicum: The big three -- Amazon, Google and Microsoft -- are really making an effort to get into that market. They all have federal sides to their house. People are selling into that space right now, and I think that they're seeing some progress. The FAA and certainly the DoD have been moving in that direction.

However, they do realize that they have to build a net new infrastructure, a net new way of doing procurement to get into that space. In the case where the US is building the world’s biggest private cloud at the CIA, they've had to change their technology around the needs of the government.

They see it as really the "Fortune 1." They see it as the largest opportunity that’s there, and they're willing to make huge investments in the billions of dollars to capture that market when it arrives.

Gardner: It seems to me, Chris, that we might be facing a situation where we have cloud providers offering a set of services to large government organizations, but perhaps a different set to the private sector. From an interoperability and standardization perspective, that doesn’t make much sense to me.

What’s your perspective on how public cloud services and standardization are shaping up? Where did you expect things to be at this point?

Harding: The government has an additional dimension beyond that of the private sector when it comes to procurement, in terms of the need to be transparent and to spend the money that's entrusted to them by the public in a wise manner. One of the issues with a lack of standardization is that it makes it more difficult for them to show that they're visibly getting the best deals for taxpayers when they come to procure cloud services.

In fact, The Open Group produced a guide to cloud computing for business a couple of years ago. One of the things that we argued in that was that, when procuring cloud services, the enterprise should model the use that it intends to make of the cloud services and therefore be able to understand the costs that they were likely to incur. This is perhaps more important for government, even more than it is for private enterprises. And you're right, the lack of standardization makes it more difficult for them to do this.
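
As a rough illustration of the usage modeling Harding describes, the sketch below multiplies an assumed usage profile by made-up unit rates to compare two hypothetical providers; none of the numbers come from a real price list.

```python
# Minimal sketch of a procurement-time cost model: multiply an expected
# usage profile by per-unit rates to compare candidate providers.
# All figures are illustrative placeholders, not real prices.

usage = {
    "compute_hours": 2000,      # VM hours per month
    "storage_gb_month": 500,    # data stored
    "egress_gb": 200,           # data transferred out
}

providers = {
    "provider_a": {"compute_hours": 0.10, "storage_gb_month": 0.023, "egress_gb": 0.09},
    "provider_b": {"compute_hours": 0.12, "storage_gb_month": 0.020, "egress_gb": 0.08},
}

def monthly_cost(rates, profile):
    """Sum of usage multiplied by unit price for each metered dimension."""
    return sum(profile[item] * rates[item] for item in profile)

for name, rates in providers.items():
    print(f"{name}: ${monthly_cost(rates, usage):,.2f}/month")
```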

Gardner: Chris, do you think that interoperability is of a higher order of demand in public-sector cloud acquisition than in the private sector, or should there be any differentiation?

Need for interoperability

Harding: Both really have the need for interoperability. The public sector perhaps has a greater need, simply because it’s bigger than a small enterprise and it’s therefore more likely to want to use more cloud services in combination.

Gardner: We've certainly seen a lot of open-source platforms emerge in private cloud as well as hybrid cloud. Is that a driving force yet in the way that the public sector is looking at public cloud services acquisition? Is open source a guide to what we should expect in terms of interoperability and standardization in public-cloud services for eGovernment?

Szakal: Open source, from an application-implementation point of view, is one part of the question you're asking, but are you also suggesting that somehow these cloud platforms will be reconsidered or implemented via open source? There's truth to both of those statements.

IBM is the number-two cloud provider in the federal government space, if you look at hybrid and commercial cloud, for which we provide three major cloud environments. All of those cloud implementations are based on open source -- OpenStack and Cloud Foundry are key pieces of this -- as well as the entire DevOps lifecycle.

So, open source is important, and if you think of open source as what we call in The Open Group environment "Executable Standards," it is a way to ensure interoperability.

That’s more important at the cloud-stack level than it is between cloud providers, because between cloud providers you're really going to be talking about API-driven interoperability, and we have that down pretty well.

So, the economy of APIs and the creation of these composite services are going to be very, very important elements. If they're closed and not open to following the normal RESTful approaches defined by the W3C and other industry consortia, then it's going to be difficult to create these composite clouds.
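
To show what API-driven interoperability can look like at the code level, here is a hedged Python sketch of a composite service that issues the same RESTful call against two providers; the endpoints, paths, and response fields are invented for illustration and do not belong to any real cloud API.

```python
# Sketch of a composite service built on plain RESTful APIs from two
# providers. Endpoints, paths, and response fields are hypothetical;
# the thin HTTP/JSON layer is where inter-cloud interoperability happens.
import requests

PROVIDERS = {
    "cloud_a": "https://api.cloud-a.example/v1",  # hypothetical endpoint
    "cloud_b": "https://api.cloud-b.example/v1",  # hypothetical endpoint
}

def list_instances(provider, token):
    """Issue the same logical call against whichever provider is asked for."""
    resp = requests.get(
        f"{PROVIDERS[provider]}/instances",
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("instances", [])

def composite_inventory(tokens):
    """Merge inventories from every configured cloud into one view."""
    inventory = []
    for provider, token in tokens.items():
        for item in list_instances(provider, token):
            inventory.append({"provider": provider, **item})
    return inventory
```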

Gardner: We saw that OpenStack had its origins in a government agency, NASA. In that case, clearly a government organization, at least in the United States, was driving the desire for interoperability and standardization, a common platform approach. Has that been successful, Dave? Why wouldn’t the government continue to try to take that approach of a common, open-source platform for cloud interoperability?

Linthicum: OpenStack has had some fair success, but I wouldn't call it excellent success. One of the issues is that the government left it dangling out there and, while using some aspects of it, hasn't driven as much adoption around that open standard as I expected, for lots of reasons.

So, they have to hack the operating systems to meet very specific needs around security, governance, and compliance. They have special use cases, such as DoD real-time weapons-control systems, and some IoT work that the government would like to move into. So, that's out there as an opportunity.

In other words, the ability to work with some of the distros out there (and there are dozens of them) and get to a special government version of that operating system, supported openly by the government integrators and providers, is something they really should take advantage of. It hasn't happened so far, and that's a bit disappointing.

Insight into Europe

Gardner: Do any of you have any insight into Europe and some of the government agencies there? They haven’t been shy in the past about mandating certain practices when it comes to public contracts for acquisition of IT services. I think cloud should follow the same path. Is there a big difference in what’s going on in Europe and in North America?

Szakal: I just got off the phone a few minutes ago with my counterpart in the UK. The nice thing about the way the UK government is approaching cloud computing is that they're doing so by taking the handcuffs off the vendors while making sure that they are standards-based. The vendors have to meet a certain quality of service, but the government isn't mandating, through policy and by law, the structure of their cloud. So, it allows us, at least within IBM, to take advantage of this incredible industry ecosystem on the commercial side, without having to consider lifting and shifting all of this very expensive infrastructure over to these industry clouds.

The EU is following a similar practice. Obviously, data sovereignty is a really important element for most governments. So, you see a lot of focus on data sovereignty and data portability, more so than on strict requirements to follow a particular set of security controls or standards that would lock you in and make it more difficult for you to evolve over a period of time.
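
One small, concrete expression of that data-sovereignty focus is pinning an object store to an in-jurisdiction region at creation time. The sketch below uses boto3's standard create_bucket call; the bucket name, region, and the guard function are illustrative assumptions.

```python
# Minimal sketch: enforce data residency by creating the object store in
# a specific jurisdiction and refusing any other region. The bucket name
# and region are illustrative assumptions.
import boto3

ALLOWED_REGION = "eu-central-1"  # e.g. an EU region chosen for sovereignty

def create_sovereign_bucket(name, region=ALLOWED_REGION):
    """Create a bucket only if it stays inside the approved jurisdiction."""
    if region != ALLOWED_REGION:
        raise ValueError(f"Data must stay in {ALLOWED_REGION}, got {region}")
    s3 = boto3.client("s3", region_name=region)
    return s3.create_bucket(
        Bucket=name,
        CreateBucketConfiguration={"LocationConstraint": region},
    )
```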

Gardner: Chris Harding, to Andras’ point about data interoperability, do you see that as a point on the arrow that perhaps other cloud interoperability standards would follow? Is that something that you're focused on more specifically than more general cloud infrastructure services?

Harding: Cloud is a huge spectrum, from the infrastructure services at the bottom, up through the business services and application services, to software as a service (SaaS), and data interoperability sits on top of that stack.

I'm not sure that we're ready to get real data interoperability yet, but the work that's being done on trying to establish common frameworks for understanding data, for interpreting data, is very important as a basis for gaining interoperability at that level in the future.

We also need to bear in mind that the nature of data is changing. It’s no longer a case that all data comes from a SQL database. There are all sorts of ways in which data is represented, including human forms, such as text and speech, and interpreting those is becoming more possible and more important.

This is the exciting area, where you see the most interesting work on interoperability.
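
A common framework for interpreting data often begins as nothing grander than mapping each source's records into one agreed shape. In the sketch below, the two agencies, their field names, and their formats are hypothetical.

```python
# Sketch of data-level interoperability: two sources describe the same
# fact (a citizen service request) in different shapes, and a small
# mapping layer normalizes both into one agreed record. The agencies,
# field names, and formats are hypothetical.
from datetime import datetime, timezone

def from_agency_a(rec):
    """Agency A uses nested JSON and epoch seconds."""
    return {
        "request_id": rec["id"],
        "citizen": rec["applicant"]["name"],
        "submitted": datetime.fromtimestamp(rec["ts"], tz=timezone.utc),
    }

def from_agency_b(rec):
    """Agency B uses flat fields and ISO-8601 timestamps."""
    return {
        "request_id": rec["request_number"],
        "citizen": rec["citizen_name"],
        "submitted": datetime.fromisoformat(rec["submitted_at"]),
    }

# Once normalized, downstream services only ever see one schema.
unified = [
    from_agency_a({"id": "A-17", "applicant": {"name": "J. Doe"}, "ts": 1477267200}),
    from_agency_b({"request_number": "B-42", "citizen_name": "M. Roe",
                   "submitted_at": "2016-10-24T09:00:00+00:00"}),
]
```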

Gardner: Dave Linthicum, one of the things that some of us who have been proponents of cloud for a number of years now have looked to is the opportunity to get something that couldn’t have been done before, a whole greater than the sum of the parts.

It seems to me that if you have a common cloud fabric with a sufficient amount of interoperability for data, applications, and infrastructure services, and that cuts across both the public and the private sector, then many longstanding difficulties could be eased. The interoperability and communication problems between health-insurance payers and providers, and the sharing of government services and data with the private sector -- many of the things that have been blamed on bureaucracy and technical backwardness -- could in some ways be solved if a common public cloud approach were adopted by the major public cloud providers. It seems to me that a very significant benefit could be drawn when the public and private sectors have a commonality that owning your own data centers, as in the past, just couldn't provide.

Am I chewing on too much pie in the sky here, Dave, or is there actually something to be said about the cloud model, not just between government to government agencies, but the public and private sectors?

Getting more savvy

Linthicum: The public-cloud providers out there, the big ones, are getting more savvy about providing interoperability, because they realize that it's going to be multi-cloud. There are going to be different private and public cloud instances and different kinds of technologies, and you have to work and play well with all of them.

However, to be a little bit more skeptical, over the years, I've found out that they're in it for their own selfish interests, and they should be, because they're corporations. They're going to basically try to play up their technology to get into a market and hold on to the market, and by doing that, they typically operate against interoperability. They want to make it as difficult as possible to integrate with the competitors and leverage their competitors’ services.

So, we have that kind of dynamic going on, and it's incredibly frustrating, because we can certainly stand up, have the discussion, and reveal the concepts. You just did a really good job of describing that Nirvana, and we should start moving in this direction. You will typically get lots of head-nodding from the public-cloud providers and the private-cloud providers, but actions speak louder than words, and thus far, it's been very counterproductive.

Interoperability is occurring, but it's in dribs and drabs, and nothing holistic.

Gardner: Chris, it seems as if the earlier you try to instill interoperability and standardization, in both technical and methodological terms, the better you're able to carry that into the future, so that we don't simply repave cow paths -- replacing highly non-interoperable data centers with equally non-interoperable clouds that just happen to sit outside a building you control.

What do you think is going to be part of the discussion at The Open Group Paris Event, October 24, around some of these concepts of eGovernment? Shouldn’t they be talking about trying to make interoperability something that's in place from the start, rather than something that has to be imposed later in the process?

Harding: Certainly this will be an important topic at the forthcoming Paris event. My personal view is that the question of when you should standardize something to gain interoperability is a very difficult balancing act. If you do it too late, then you just get a mess of things that don’t interoperate, but equally, if you try to introduce standards before the market is ready for them, you generally end up with something that doesn’t work, and you get a mess for a different reason.

Part of the value of industry events, such as The Open Group events, is for people in different roles in different organizations to be able to discuss with each other and get a feel for the state of maturity and the directions in which it's possible to create a standard that will stick. We're seeing a standard paradigm, the API paradigm, that was mentioned earlier. We need to start building more specific standards on top of those, and certainly in Paris and at future Open Group events, those are the things we'll be discussing.

Gardner: Andras, you wear a couple of different hats. One is Chief Technology Officer at IBM U.S. Federal, but you're also very much involved with The Open Group; I think you're on the Board of Directors. How do you see the progression of what The Open Group has been able to do in other spheres around standardization, both methodological -- such as the TOGAF® enterprise architecture framework, an Open Group standard -- and in the implementation and enforcement of standards? Is what The Open Group has done in the past something you expect to be applicable to these cloud issues?

Szakal: IBM has a unique history, being one of the only companies in the technology arena that is more than 100 years old and has been able to retain great value for its customers over that long period of time. We shifted from a fairly closed computing environment to this idea of open interoperability and freedom of choice.

That's our approach for our cloud environment as well. What drives us in this direction is that our customers require it of IBM. We're a common infrastructure and a glue that binds together many of the largest enterprise, financial, banking, and healthcare institutions in the world, and we have to ensure that they can interoperate with other vendors.

As such, we were one of the founders of The Open Group, which has been at the forefront of helping facilitate this discussion about open interoperability. I'm totally with Chris as to when you would approach that. As I said before, my view is that you interoperate at the service level, in the economy of APIs. That suggests there are some other elements to it, not just the API itself, but the ability to effectively manage credentials and security, and some other common services, like being able to manage object stores in the place where you would like to store your information, so that data sovereignty isn't an issue. These are all things that will occur over a period of time.
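
A hedged sketch of those common services might look like the following: one object-store contract with provider-specific adapters hiding credential handling behind a single factory. The provider names and methods are placeholders, not any vendor's actual API.

```python
# Sketch of common services for a composite cloud: one storage contract,
# with per-provider adapters behind a factory. Providers and methods are
# hypothetical placeholders.
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """Minimal common contract a composite cloud could code against."""

    @abstractmethod
    def put(self, key, data):
        ...

    @abstractmethod
    def get(self, key):
        ...

class CloudAStore(ObjectStore):
    def __init__(self, credentials):
        self._creds = credentials  # provider-specific credential handling

    def put(self, key, data):
        print(f"[cloud-a] storing {len(data)} bytes at {key}")

    def get(self, key):
        return b""  # placeholder for a real provider API call

class CloudBStore(ObjectStore):
    def __init__(self, credentials):
        self._creds = credentials

    def put(self, key, data):
        print(f"[cloud-b] storing {len(data)} bytes at {key}")

    def get(self, key):
        return b""

def store_for(provider, credentials):
    """Hide the credential management and storage choice behind one factory."""
    return {"cloud_a": CloudAStore, "cloud_b": CloudBStore}[provider](credentials)
```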

Early days

It's early, heady days in the cloud world, and we're going to see all of that goodness come to pass as we go forward. In reality, we talk about cloud as if it's a thing. Its true value isn't so much in the technology, but in creating these new disruptive business capabilities and business models. Openness of the cloud by itself doesn't create those new business models.

That's where we need to focus. Are we able to actually drive these new collaborative models with our cloud capabilities? You're going to be interoperating with many CSPs, not just two, three, or four, especially as you see different sectors grow into the cloud. It won't matter where they operate their cloud services from; it will matter how they actually interoperate at that API level.

Gardner: It certainly seems to me that interoperability is the killer application of the cloud. It can really foster greater interdepartmental collaboration and synergy -- government to government, state to federal, and across the EU, for example -- as well as with the private sector, where healthcare, monetary, banking, and finance concerns are deeply entrenched in both the public and private spheres. So, we hope that that's where the openness leads.

Chris, before we wrap up, it seems to me that there's a precedent that has been set successfully with The Open Group, when it comes to security. We've been able to do some pretty good work over the past several years with cloud security using the adoption of standards around encryption or tokenization, for example. Doesn’t that sort of give us a path to greater interoperability at other levels of cloud services? Is security a harbinger of things to come?

Harding: Security certainly is a key aspect that needs to be incorporated in the standards as we build on the API paradigm. But some people talk about the move to digital transformation, the digital enterprise. Cloud and other things, like IoT and big-data analysis, are all coming together, and a key underpinning requirement for that is platform integration. That's where the Open Platform 3.0™ Forum of The Open Group is focusing: on the possibilities for platform interoperability to enable digital platform integration. Security is a key aspect of that, but there are other aspects too.
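
As a small illustration of the encryption side of that security work, here is a sketch of client-side encryption before data ever reaches a cloud store, using the widely available cryptography package. The key handling is deliberately simplified; in practice the key would live behind a key-management service.

```python
# Sketch: encrypt data client-side before handing it to any cloud store,
# so the provider only ever holds ciphertext. Key management is
# simplified for illustration.
from cryptography.fernet import Fernet

key = Fernet.generate_key()              # in practice, kept in a key service
fernet = Fernet(key)

record = b'{"citizen_id": "12345", "benefit": "housing"}'
ciphertext = fernet.encrypt(record)      # what the cloud provider would store
plaintext = fernet.decrypt(ciphertext)   # only key holders can read it

assert plaintext == record
```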

Gardner: I am afraid we will have to leave it there. We've been discussing the latest developments in eGovernment and cloud adoption with a panel of experts. Our focus on these issues comes in conjunction with The Open Group Paris Event and Member Meeting, October 24-27, 2016 in Paris, France, and there is still time to register at www.opengroup.org and find more information on that event, and many others coming in the near future.


