Washington to Put $200 Million into Big Data R&D

To improve the tools and techniques needed to access, organize and glean discoveries from huge volumes of digital data.

The Obama Administration Thursday unveiled a Big Data Research and Development Initiative that will see six federal agencies and departments put $200 million or more into Big Data R&D.

These new commitments are supposed to improve the tools and techniques needed to access, organize and glean discoveries from huge volumes of digital data.

Dr. John Holdren, director of the White House Office of Science and Technology Policy, said, "In the same way that past federal investments in information technology R&D led to dramatic advances in supercomputing and the creation of the Internet, the initiative we are launching today promises to transform our ability to use Big Data for scientific discovery, environmental and biomedical research, education and national security."

It's seen as being that important.

The major initiative is supposed to advance state-of-the-art core technologies, apply them to accelerate the pace of discovery in science and engineering, transform teaching and learning, and expand the workforce needed to develop and use Big Data technologies.

It's a response to recommendations by the President's Council of Advisors on Science and Technology, which last year concluded that the federal government was under-investing in Big Data technologies.

As a result, the National Science Foundation (NSF) and the National Institutes of Health (NIH) will be implementing a long-term strategy that includes new methods to derive knowledge from data; infrastructure to manage, curate and serve data to communities; and new approaches to education and workforce development.

As a start, NSF will be funding a $10 million Expeditions in Computing project based at Berkeley to integrate machine learning, cloud computing and crowdsourcing.

It will also provide the first round of grants to support EarthCube, a system that lets geoscientists access, analyze and share information about the planet; issue a $2 million award for a research training group in which undergraduates will apply graphical and visualization techniques to complex data; and provide $1.4 million to support a focused research group of statisticians and biologists determining protein structures and biological pathways.

NIH is particularly interested in imaging, molecular, cellular, electrophysiological, chemical, behavioral, epidemiological, clinical and other data sets related to health and disease.

It said the world's largest set of data on human genetic variation - produced by the international 1000 Genomes Project - is now available on Amazon's cloud. At 200TB - the equivalent of 16 million file cabinets filled with text, or more than 30,000 standard DVDs - the current 1000 Genomes Project data set, derived from 1,700 people, is a prime example of Big Data.

AWS is storing the 1000 Genomes Project data on S3 and in Amazon Elastic Block Store (EBS) as a publicly available data set, free of charge; researchers pay only for the EC2 and Elastic MapReduce (EMR) services they use for disease research. Previously they had to download publicly available data sets from government data centers to their own systems, or have the data physically shipped to them on disks. The current aim of the project is to sequence 2,600 individuals from 26 populations around the world. (See http://aws.amazon.com/1000genomes.)
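Because the data set is public, it can be browsed anonymously over plain HTTPS without an AWS account or SDK. As a minimal sketch (the bucket name `1000genomes` comes from the AWS page above; the helper function names are our own, and actually listing the bucket requires network access), S3's ListObjectsV2 REST interface can be queried with nothing but the Python standard library:

```python
from urllib.request import urlopen
import xml.etree.ElementTree as ET

# XML namespace used by S3 listing responses
S3_NS = "{http://s3.amazonaws.com/doc/2006-03-01/}"

def bucket_list_url(bucket, prefix="", limit=10):
    """Build the anonymous ListObjectsV2 URL for a public S3 bucket."""
    return (f"https://{bucket}.s3.amazonaws.com/"
            f"?list-type=2&max-keys={limit}&prefix={prefix}")

def list_public_bucket(bucket, prefix="", limit=10):
    """Return up to `limit` object keys from a public-read S3 bucket.

    Works only for buckets that allow anonymous reads, like the
    1000 Genomes data set; no credentials or signing involved.
    """
    with urlopen(bucket_list_url(bucket, prefix, limit)) as resp:
        tree = ET.parse(resp)
    return [el.text for el in tree.iter(f"{S3_NS}Key")]
```

A call like `list_public_bucket("1000genomes", limit=5)` would return the first few object keys in the bucket; heavy analysis is meant to run on EC2/EMR next to the data rather than after a bulk download.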

The Defense Department will be investing around $250 million a year (with $60 million available for new research projects) in a series of programs that use Big Data in new ways to bring together sensing, perception and decision support to make autonomous systems that can maneuver and make decisions on their own.

The agency also wants a 100-fold increase in the ability of analysts to extract information from texts in any language, and a similar increase in the number of objects, activities and events an analyst can observe.

DARPA, the Defense Advanced Research Projects Agency, is beginning an XDATA program that will invest about $25 million a year for four years to develop computational techniques and software tools for analyzing large volumes of data, both semi-structured (tabular, relational, categorical, metadata) and unstructured (text documents, message traffic).

That means developing scalable algorithms for processing imperfect data in distributed data stores and creating effective human-computer interaction tools to facilitate rapidly customizable visual reasoning for diverse missions.
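XDATA's goals are stated only at the program level, but "processing imperfect data" has a concrete everyday meaning: analysis jobs that tolerate malformed or incomplete records instead of aborting. As a toy sketch of that idea (the field names and records here are invented for illustration; think of it as the per-partition step of a larger distributed job):

```python
import csv
from collections import Counter
from io import StringIO

def tally_events(lines, field="event"):
    """Tally a categorical field across semi-structured records.

    Rows that are missing the field are counted and skipped rather
    than crashing the whole job -- the 'imperfect data' case.
    """
    counts, skipped = Counter(), 0
    for row in csv.DictReader(lines):
        value = row.get(field)
        if not value:
            skipped += 1
            continue
        counts[value.strip().lower()] += 1  # normalize case/whitespace
    return counts, skipped

# Hypothetical message-traffic sample with one defective record
raw = StringIO(
    "event,source\n"
    "login,radio\n"
    "LOGIN ,cable\n"
    ",cable\n"          # missing event field -> skipped, not fatal
    "logout,radio\n"
)
counts, skipped = tally_events(raw)
```

Here `counts` comes back as `{"login": 2, "logout": 1}` with `skipped == 1`; a real XDATA-scale tool would apply the same tolerance across a distributed store rather than a single in-memory stream.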

The XDATA program will employ open source toolkits for software development so users can process large volumes of data in timelines "commensurate with mission workflows of targeted defense applications."

The Energy Department will kick in $25 million in funding to establish a Scalable Data Management, Analysis and Visualization (SDAV) Institute under Lawrence Berkeley National Laboratory.

It's supposed to bring together the expertise of six national laboratories and seven universities to develop new tools that help scientists manage and visualize data on the agency's supercomputers, streamlining the path to discovery at the agency's research facilities. The department said new tools are needed because the simulations running on its supercomputers have grown in size and complexity.

Lastly, the US Geological Survey will incubate Big Data projects that address issues such as species response to climate change, earthquake recurrence rates and the next generation of ecological indicators.

More Stories By Maureen O'Gara

Maureen O'Gara, the most-read technology reporter of the past 20 years, is the Cloud Computing and Virtualization News Desk editor of SYS-CON Media. She is the publisher of the famous "Billygrams" and was the editor-in-chief of "Client/Server News" for more than a decade. One of the most respected technology reporters in the business, Maureen can be reached by email at maureen(at)sys-con.com or paperboy(at)g2news.com, and by phone at 516 759-7025. Twitter: @MaureenOGara

