
@CloudExpo: Blog Feed Post

MaaS – The Solution to Design, Map, Integrate and Publish Open Data

Data models can be shared, off-line tested and verified to define data design requirements, data topology, performance, placement and deployment

Open Data is data that can be freely used, reused and redistributed by anyone – subject only, at most, to the requirement to attribute and share alike (Open Software Service Definition – OSSD). As a consequence, Open Data should create value and can have a positive impact in many different areas, such as government (how tax money is spent), health (medical research, hospital admissions by pathology) and quality of life (the air breathed in our cities, pollution), and it can influence public decisions such as investments, public economy and expenditure. We are talking about services: Open Data is the set of services needed to connect the community with public bodies. However, the required Open Data should first be designed and then integrated, mapped, updated and published in a form that is easy to use. MaaS is the Open Data driver and enables Open Data portability into the Cloud.

Introduction
Data models used as a service mainly address the following points:

  • Implementing and sharing data structure models;
  • Verifying data model properties against private and public cloud requirements;
  • Designing and testing new query types, since specific query classes are needed to support heterogeneous data;
  • Designing the data storage model, which should enable query processing directly against databases to preserve privacy and to secure changes arising from data updates and revisions;
  • Modeling data to predict usage “early”;
  • Portability, a central property when data is shared among fields of application;
  • Sharing, redistribution and participation of data among datasets and applications.
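The points above can be made concrete with a minimal, hypothetical sketch (all names are illustrative and not taken from any specific MaaS product): a data structure model defined as a plain object that can be shared and tested off-line against sample rows before any data is published.

```python
from dataclasses import dataclass

# Hypothetical sketch: a shareable data model with off-line verification.
@dataclass
class Column:
    name: str
    dtype: str          # e.g. "text", "integer", "date"
    nullable: bool = True

@dataclass
class DataModel:
    name: str
    columns: list
    version: str = "1.0"

    def verify(self, rows):
        """Check sample rows against the model before publication."""
        errors = []
        for i, row in enumerate(rows):
            for col in self.columns:
                if row.get(col.name) is None and not col.nullable:
                    errors.append(f"row {i}: missing required '{col.name}'")
        return errors

# Off-line test of the model against sample data
air_quality = DataModel(
    name="air_quality",
    columns=[Column("station", "text", nullable=False),
             Column("pm10", "integer")],
)
print(air_quality.verify([{"station": "Milan-01", "pm10": 42},
                          {"pm10": 55}]))   # second row lacks 'station'
```

Because the model is an ordinary object, it can be versioned, redistributed and re-verified by anyone, which is exactly the sharing and participation property listed above.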

As a consequence, the data should be available as a whole, at no more than a reasonable reproduction fee, preferably by finding, navigating and downloading it over the Cloud. It should also be available in a usable and modifiable form. This means modeling Open Data and then using the models to map location and usage, configuration, integration and changes along the Open Data lifecycle.

What is MaaS
Data models can be shared, off-line tested and verified to define data design requirements, data topology, performance, placement and deployment. This means the models themselves can be supplied as a service, allowing providers to verify how and where data has to be designed to meet the Cloud service’s requirements: this is MaaS (Model as a Service). As a consequence, by using MaaS, Open Data designers can verify “on-premise” how and why datasets meet Open Data requirements. With this approach, Open Data models can be tuned on real usage and then mapped “on-premise” to the public body’s service. Further, MaaS inherits all of the defined service’s properties, so the data model can be reused, shared and classified for new Open Data design and publication.
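One way to picture a model being “supplied as a service” is as a plain, serializable artifact that travels ahead of the data, so a provider can verify it against its own service requirements before anything is deployed. The following sketch is hypothetical; the field names and requirement checks are illustrative only.

```python
import json

# Hypothetical sketch: a model shipped as plain JSON, verified by a
# provider against its cloud service requirements before any data moves.
model = {
    "name": "hospital_admissions",
    "columns": [
        {"name": "pathology", "dtype": "text"},
        {"name": "admissions", "dtype": "integer"},
    ],
    "deployment": {"region": "eu", "storage": "relational"},
}

def meets_requirements(model, allowed_regions, allowed_types):
    """Check a shared model against a provider's service requisites."""
    if model["deployment"]["region"] not in allowed_regions:
        return False
    return all(c["dtype"] in allowed_types for c in model["columns"])

shared = json.loads(json.dumps(model))   # the model travels as plain JSON
print(meets_requirements(shared, {"eu"}, {"text", "integer", "date"}))
```

The verification happens entirely “on-premise”, on the model alone: no dataset has to be moved to discover that a design does not fit the target service.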

Open Data implementation is MaaS (Model as a Service) driven
Open Data is fully supported by data modeling, and MaaS in turn fully supports Open Data. MaaS should be the first practice applied, helping to tune the analysis and the Open Data design. Furthermore, because data models govern design, deployment, storage, changes and resource allocation, MaaS supports:

  • Applying Best Practice for Open Data design;
  • Classifying Open Data field of application;
  • Designing Open Data taxonomy and integration;
  • Guiding Open Data implementation;
  • Documenting data maturity and evolution by applying the DaaS lifecycle.
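The classification and taxonomy points above can be sketched as a small model library that tags each model with its fields of application, so models can be found and reused across datasets. This is a hypothetical illustration; the registry structure and field names are assumptions, not part of any MaaS specification.

```python
# Hypothetical sketch: a model library that classifies Open Data models
# by field of application, supporting discovery and reuse.
registry = {}

def register(model_name, fields, version="1.0"):
    """Add a model version, tagged with its fields of application."""
    registry.setdefault(model_name, []).append(
        {"version": version, "fields": set(fields)})

def find_by_field(field):
    """Return model names whose latest version covers the given field."""
    return sorted(name for name, entries in registry.items()
                  if field in entries[-1]["fields"])

register("air_quality", ["environment", "health"])
register("tax_expenditure", ["government"])
register("hospital_admissions", ["health"])
print(find_by_field("health"))   # ['air_quality', 'hospital_admissions']
```

Keeping every version in the library also gives the documentation of maturity and evolution that the DaaS lifecycle calls for.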

Accordingly, MaaS provides “on-premise” properties supporting Open Data design and publication:

  1. Analysis – What data are you planning to make open? When working with MaaS, a data model is used to perform the data analysis. This means the Open Data designer can return to this step to correct, update and improve the incoming analysis: the designer always works on an “on-premise” data model. Analysis performed through the model helps in identifying data integration and interoperability, which in turn assists in choosing what data has to be published and in defining open datasets;
  2. Design – The design is carried out during the analysis step, and it can be changed and traced along the Open Data lifecycle. Remember that with MaaS the model is a service, and the data opened offers the designed service;
  3. Data security – Data security becomes the key property for ruling data access and navigation. MaaS plays a crucial role here: the models contain all the infrastructure properties and include the information needed to classify accesses, classes of users, perimeters and risk-mitigation assets. Models are the central means of enabling data protection within the Open Data device;
  4. Participation – Because the goal is “everyone must be able to use Open Data”, participation includes people and groups without any discrimination or restriction. Models contain data access rules and accreditations (open licensing);
  5. Mapping – The MaaS mapping property is important because many people reach the data only after long navigation and several “bridges” connecting different fields of application. Here MaaS helps the Open Data designer define the best initial “route” of transformation and aggregation linking the different areas. Continually engaging citizens, developers, sector experts, managers and others then helps in modifying the model to better update and scale the Open Data contents: the easier it is for outsiders to discover data, the faster new and useful Open Data services will be built;
  6. Ontology – Defining the metadata vocabulary for describing ontologies. Starting from standard naming definitions, data models provide grouping and reorganization of the vocabulary for further metadata reuse, integration, maintenance, mapping and versioning;
  7. Portability – Models contain all the properties belonging to the data, so that MaaS can enable the Open Data service’s portability to the Cloud. The model is portable by definition and can be generated to different databases and infrastructures;
  8. Availability – The DaaS lifecycle assures structure validation in terms of MaaS accessibility;
  9. Reuse and distribution – Open Data can be merged with additional datasets belonging to other fields of application (for example, medical research vs. air pollution). Open Data built by MaaS has this advantage: merging open datasets means merging models, comparing and synchronizing old and new versions if needed;
  10. Change management and history – Data models are organized in libraries to preserve Open Data changes and history. Changes are traced and maintained so that models and/or datasets can be restored if necessary;
  11. Redesign – Redesigning Open Data means redesigning the model it belongs to: the model drives the history of the changes;
  12. Fast BI – Publishing Open Data is strictly related to the BI process. Redesigning and publishing Open Data are two automated steps starting from the design of the data model and from its successive updates.
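The portability property – one model generated to different databases and infrastructures – can be sketched as follows. This is a simplified, hypothetical illustration: the dialect names and type mappings are assumptions for the example, not an actual model-generation tool.

```python
# Hypothetical sketch of portability: one model, generated as DDL for
# different database dialects (type maps are illustrative only).
TYPE_MAPS = {
    "postgresql": {"text": "TEXT", "integer": "INTEGER"},
    "sqlserver":  {"text": "NVARCHAR(255)", "integer": "INT"},
}

def to_ddl(model, dialect):
    """Generate a CREATE TABLE statement for the target dialect."""
    types = TYPE_MAPS[dialect]
    cols = ", ".join(f'{c["name"]} {types[c["dtype"]]}'
                     for c in model["columns"])
    return f'CREATE TABLE {model["name"]} ({cols})'

model = {"name": "air_quality",
         "columns": [{"name": "station", "dtype": "text"},
                     {"name": "pm10", "dtype": "integer"}]}
print(to_ddl(model, "postgresql"))
# CREATE TABLE air_quality (station TEXT, pm10 INTEGER)
```

Because the model, not any one database, is the source of truth, the same definition can be regenerated for a new target whenever the Open Data service moves.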

Conclusion
MaaS is the emerging solution for Open Data implementation. Open Data is publicly and privately accessible data, designed to connect the social community with public bodies. This data should be made available without restriction, although it is placed under security and open licensing. In addition, Open Data is always up to date, and transformation and aggregation have to be simple and time-saving for inexperienced users. To achieve these goals, the Open Data service has to be model-driven by design, providing data integration, interoperability, mapping, portability, availability, security and distribution – all properties assured by applying MaaS.

References
[1] N. Piscopo - ERwin® in the Cloud: How Data Modeling Supports Database as a Service (DaaS) Implementations
[2] N. Piscopo - CA ERwin® Data Modeler’s Role in the Relational Cloud
[3] N. Piscopo - DaaS Contract templates: main constraints and examples, in press
[4] D. Burbank, S. Hoberman - Data Modeling Made Simple with CA ERwin® Data Modeler r8
[7] N. Piscopo – Best Practices for Moving to the Cloud using Data Models in the DaaS Life Cycle
[8] N. Piscopo – Using CA ERwin® Data Modeler and Microsoft SQL Azure to Move Data to the Cloud within the DaaS Life Cycle
[9] The Open Software Service Definition (OSSD) at opendefinition.org

More Stories By Cloud Best Practices Network

The Cloud Best Practices Network is an expert community of leading Cloud pioneers. Follow our best practice blogs at http://CloudBestPractices.net
