Why Federal Government IT Leaders Like Data Virtualization

Better data integration is key to effective governance

With money tighter than ever, federal agencies are under increasing pressure to improve efficiency, share information more readily and execute on ever-expanding congressional mandates.

But this is far easier said than done in IT environments with government-sized volumes, decades of existing systems and myriad new requirements being layered on top.

Is there a better way for federal government IT leaders to meet these challenges?

There is.  It is called data virtualization.

Step 1 - Align with Service-Oriented Architecture Mandates
Recognizing the need for high-level, cross-agency direction for addressing these challenges, the federal government wisely identified a number of top-down service-oriented architecture (SOA) initiatives including Netcentricity, Civilian Centered Services, e-Government and more.

These broad strategies address system integration - that is, the leveraging and advancement of already running systems - in an intelligent yet high-level way. They do not prescribe detailed implementations, however, so agencies have the freedom, or the curse, of identifying specific technologies and vendors themselves.

Step 2 - Simplify the Problem to Accelerate the Solution
Most new development leverages existing transaction processing and information reporting systems. Each is very different by nature, and so are the tools, techniques and, most important, the staff who develop and support them.

Consider data integration. How many ways are there to integrate data?

  • Hand code direct, point-to-point integrations
  • FTP files from one system to another
  • Replicate data from one database to another
  • Consolidate data in a warehouse or mart using ETL
  • Query data on demand using data virtualization

Data virtualization, especially well suited to SOA-style architectures, is the newest approach, having emerged from earlier technologies such as enterprise information integration (EII) and federated query. At this point, Gartner, Forrester, and TDWI all recommend that organizations keep all of these options at their disposal, selecting one or more on an application-by-application basis.
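The last option on the list can be sketched in a few lines. The sketch below is illustrative only - the source databases, table names, and the virtual_site_status function are hypothetical stand-ins for real agency systems - but it shows the essential idea: the virtual layer joins live sources at query time rather than replicating rows into a warehouse first.

```python
import sqlite3

# Two independent "agency" source systems, each owning its own data.
permits_db = sqlite3.connect(":memory:")
permits_db.execute("CREATE TABLE permits (site_id INTEGER, permit_no TEXT)")
permits_db.executemany("INSERT INTO permits VALUES (?, ?)",
                       [(1, "A-100"), (2, "A-101")])

inspections_db = sqlite3.connect(":memory:")
inspections_db.execute("CREATE TABLE inspections (site_id INTEGER, passed INTEGER)")
inspections_db.executemany("INSERT INTO inspections VALUES (?, ?)",
                           [(1, 1), (2, 0)])

def virtual_site_status():
    """Federated join executed in the virtualization layer at request
    time; neither source is copied or kept in sync, so results always
    reflect the live data."""
    passed = dict(inspections_db.execute(
        "SELECT site_id, passed FROM inspections"))
    return [(permit_no, bool(passed[site_id]))
            for site_id, permit_no in
            permits_db.execute("SELECT site_id, permit_no FROM permits")]

print(virtual_site_status())  # → [('A-100', True), ('A-101', False)]
```

Compare this with the ETL option, where the same join would run in a batch job and land in a warehouse table that immediately begins to age.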

Step 3 - Try Data Virtualization
Numerous government agencies across the civilian, defense, and intelligence communities are already using data virtualization to provide the diverse, distributed information required by a range of mission-critical IT projects.

Data virtualization is often a critical success factor in a federal agency's ability to be agile and cost-effective when integrating the key information required by its customers - citizens, industry, and other agencies or departments. Significant benefits include:

  • Increased Responsiveness - Accelerate time-to-solution for new information requirements
  • Improved Productivity - Ensure civilian, military, and intelligence staff can access all the information they require
  • Reduced Costs - Avoid long data integration development cycles and excess data replication
  • Decreased Risk - Provide complete visibility across agencies
  • Better Compliance - Meet FEA and SOA information standards

Data Virtualization Supports SOA
New service-oriented approaches, including XML structures, the XQuery language, SOAP, REST, JMS and more, have greatly complicated data integration over the past five years. Gone are the days when the SQL language and relational structures alone could support nearly all of an agency's information reporting needs.

Simply wrapping existing data sources doesn't scale. And applying ESBs and other transaction-centric integration approaches is an awkward way to meet more complex data-centric service needs, where data modeling and query optimization are critical capabilities.

Data virtualization platforms, such as the Composite Data Virtualization Platform, provide a complete set of development and runtime tools to build and deploy sharable data services that meet a variety of agency information needs.

Delivering data to consuming applications via SOAP, JMS and REST, data services can be applied to multiple projects, so government IT leaders can achieve their agility and reuse objectives. Data virtualization platforms even allow agencies to repurpose "non-SOA" code such as relational views into sharable data services in just a few clicks.
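As a rough illustration of republishing a relational view as a REST data service, the sketch below uses only the Python standard library; the parks table, busy_parks view, and /busy_parks endpoint are invented for the example and are not part of any real platform's API.

```python
import json
import sqlite3
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# An existing "non-SOA" relational asset: a table plus a SQL view.
db = sqlite3.connect(":memory:", check_same_thread=False)
db.execute("CREATE TABLE parks (name TEXT, attendance INTEGER)")
db.executemany("INSERT INTO parks VALUES (?, ?)",
               [("Yosemite", 1200), ("Zion", 800)])
db.execute("CREATE VIEW busy_parks AS "
           "SELECT name, attendance FROM parks WHERE attendance > 1000")

class DataService(BaseHTTPRequestHandler):
    """Read-only REST facade over the relational view."""
    def do_GET(self):
        if self.path == "/busy_parks":
            rows = [{"name": n, "attendance": a} for n, a in
                    db.execute("SELECT name, attendance FROM busy_parks")]
            body = json.dumps(rows).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo quiet

server = HTTPServer(("127.0.0.1", 0), DataService)  # port 0 = any free port
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Consume the view as a service rather than as SQL.
demo = json.loads(urlopen(f"http://127.0.0.1:{port}/busy_parks").read())
print(demo)  # → [{'name': 'Yosemite', 'attendance': 1200}]
```

A production data virtualization platform layers on the data modeling, security, caching, and query optimization this sketch omits, but the service boundary looks the same to consumers.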

And because data virtualization platforms work in conjunction with other SOA tools such as Enterprise Service Buses (ESBs), registries, and application servers, agencies can leverage existing technology investments, even those originally acquired for transaction-focused integration applications.

Data Virtualization Supports the Federal Enterprise Architecture (FEA) Model
According to the FEA Program Management Office, "Federated Data Management is an architecture for managing and accessing information data and metadata across physical boundaries, which may be system to system, department to department, or enterprise to enterprise boundaries."

Data services built and deployed using data virtualization are key enablers for the following FEA components:

  • Service Component Reference Model
  • Technical Reference Model
  • Data Reference Model

How Data Virtualization Enables Cross-agency Data Sharing
Data virtualization can help the government share data held in disparate IT systems and groups, presenting the information to multiple agency applications in a uniform fashion. For example, government agencies can virtually gather and share data concerning experimentation with, and distribution of, hazardous substances from the Centers for Disease Control, the U.S. Department of Agriculture, federally funded university projects, and more.

How Data Virtualization Simplifies Website Development
To provide the websites that government agencies and citizens require, data virtualization can simplify the process of turning disparate source data into easy-to-consume data for internal and external web portals. For example, a Department of the Interior agency can combine disparate systems to provide citizens with a comprehensive view of park attendance, weather reports, upcoming park activities, and more.

How Data Virtualization Delivers a Single View of a "Person of Interest"
Today, several defense and intelligence agencies use data virtualization to gain real-time or near-real-time access to data across multiple underlying systems. Rather than discuss specifics, consider how such a capability might allow the U.S. Department of Homeland Security to pull seemingly innocuous information from multiple partners throughout the intelligence community, the Transportation Security Administration, Customs and Border Protection, and Immigration and Customs Enforcement to effectively profile persons of interest.
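In the simplest terms, such a "single view" is a merge of partial records keyed on a shared identifier. Everything in the sketch below - the agency record layouts, field names, and the single_view function - is hypothetical, invented purely to show the shape of the consolidation.

```python
# Hypothetical partial records held by three partner systems,
# each keyed on the same identifier (a passport number here).
tsa_watch = {"P123": {"name": "J. Doe", "no_fly": False}}
cbp_crossings = {"P123": {"last_entry": "2010-03-14", "port": "JFK"}}
ice_cases = {"P123": {"open_case": True}}

def single_view(passport_no):
    """Merge whatever each source knows about the identifier into one
    consolidated record; sources with no match contribute nothing."""
    view = {"passport_no": passport_no}
    for source in (tsa_watch, cbp_crossings, ice_cases):
        view.update(source.get(passport_no, {}))
    return view

print(single_view("P123"))
```

In a real deployment the virtualization layer would run equivalent joins against the live partner systems at query time, so the consolidated record is only as stale as its sources.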

How Data Virtualization Improves Operations
By providing on-demand capture and analysis of operational data, data virtualization helps government agencies make better decisions and improve operations. For example, agencies may be overpaying on disaster claims. Data virtualization can simplify on-demand access to, and integration of, claims and payment systems, giving agencies greater visibility into fraudulent cases while avoiding payments on expired claims.
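A minimal sketch of that kind of on-demand check, assuming invented claims and payments records: join the two systems at query time and flag any payment that posts after its claim's expiry or duplicates an earlier payment.

```python
import datetime

# Hypothetical rows from two separate systems of record.
claims = [
    {"claim_id": "C1", "expires": datetime.date(2023, 1, 1)},
    {"claim_id": "C2", "expires": datetime.date(2025, 1, 1)},
]
payments = [
    {"claim_id": "C1", "paid_on": datetime.date(2023, 6, 1), "amount": 500},
    {"claim_id": "C2", "paid_on": datetime.date(2024, 2, 1), "amount": 300},
    {"claim_id": "C2", "paid_on": datetime.date(2024, 2, 1), "amount": 300},  # duplicate
]

def suspect_payments():
    """Virtual join of claims and payments: flag payments made after a
    claim expired, plus exact duplicates of an earlier payment."""
    expiry = {c["claim_id"]: c["expires"] for c in claims}
    seen, flagged = set(), []
    for p in payments:
        key = (p["claim_id"], p["paid_on"], p["amount"])
        if p["paid_on"] > expiry[p["claim_id"]] or key in seen:
            flagged.append(p)
        seen.add(key)
    return flagged

print(len(suspect_payments()))  # → 2 (one expired claim, one duplicate)
```

Because the check runs against live source data rather than a periodically loaded warehouse, a flagged payment can be caught before the next batch cycle.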

Step 4 - Get Started on the Data Virtualization Path
The heat is on. New data integration approaches are needed.  Now is the time for government IT leaders to consider data virtualization.

Fortunately, the path to successful data virtualization adoption is well understood, and a number of proven tools are readily available.

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard, and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive-level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Master of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
