
Case Study: Accelerate - Academic Research | @CloudExpo @DDN_limitless #Cloud #Storage

UCL transforms research collaboration and data preservation with scalable cloud object storage appliance from DDN

University College London (UCL), consistently ranked as one of the top five universities in the world, is London's leading multidisciplinary university, with more than 10,000 staff, over 26,000 students and more than 100 departments, institutes and research centers. With 25 Nobel Prize winners and three Fields Medalists among UCL's alumni and staff, the university has attained a world-class reputation for the quality of its teaching and research across the academic spectrum.

As London's premier research institution, UCL has 5,000 researchers committed to applying their collective strengths, insights and creativity to overcome problems of global significance. The university's innovative, cross-disciplinary research agenda is designed to deliver immediate, medium and long-term benefits to humanity. UCL Grand Challenges, which encompass Global Health, Sustainable Cities, Intercultural Interaction and Human Wellbeing, are a central feature of the university's research strategy.

According to Dr. J. Max Wilkinson, Head of Research Data Services for the UCL Information Services Division, sharing and preserving project-based research results is essential to the scientific method. "I was brought in to provide researchers with a safe and resilient solution for storing, sharing, reusing and preserving project-based data," he explains. "Our goal is to remove the burden of managing project data from individual researchers while making it more available over longer periods of time."

The Challenge
The opportunity to improve the sharing and accessibility of project-based research presented several unique technical and cultural challenges. On the technical side, the team had to accommodate many different types of data, growing in both volume and velocity. In some cases, a small amount of data is so valuable to a research team that six discrete copies are retained on separate USB drives or removable hard drives kept in different locations. In other instances, UCL researchers produce copious amounts of very well-defined data that pass between the compute algorithms underpinning their research.

In addition to solving technical problems, the research data services team was faced with the opportunity to support researchers in a new 'data-intensive' world by making it safe and easy to follow best practices in data management and to use best-in-class storage solutions. "We discovered the valuable data underpinning most research projects were stuck on a hard drive or disc, never to be seen again," adds Wilkinson. "If we could provide a framework over which people could share and preserve data confidently, we could minimize this behavior and improve research by making the scholarly record more complete."

To accomplish this, UCL needed to provide an enterprise-class foundation for data management that met the needs of its diverse user community. While some researchers thought 100GB was a large amount of data, others clamored for more than 100TB to support a particular project. There was also an expectation that up to 3,000 individuals from UCL's total base of 5,000 active researchers and collaborators would require services within the next 18 to 24 months.

"We had a simple services proposition that would eliminate the need for research teams to manage racks of servers and data storage devices," says Wilkinson. "Of course, this meant we'd need a highly scalable storage infrastructure that could grow to 100PB without creating a large storage footprint or excessive administrative overhead."

Additionally, the team had to address long-term data retention needs that extended well beyond the duration of the research projects themselves. UCL, along with many other research-intensive UK institutions, faces increasingly stringent requirements for the management of project data outputs from the UK Research Councils and other funding bodies in the United Kingdom. As grant funding in the UK supports best practice, it was critical to have a proven data management plan that documented how UCL would preserve data, sometimes for decades, while ensuring maximum appropriate access and reuse by third parties.

The Solution
In seeking a scalable, resilient storage foundation, UCL issued an RFP to solicit insight on different approaches for consolidating the university's research data storage infrastructure. Each of the 21 RFP respondents was asked to provide examples of large-scale deployments, which produced far-ranging answers, including how providers addressed sheer data volume, reduced increasingly complex environments or delivered overarching data management frameworks.

UCL's RFP covered a diverse set of requirements to determine each potential solution provider's respective strengths and limitations. "We asked for more than we thought possible from a single vendor, from asynchronous file sharing to a high-performance parallel file system and highly scalable, resilient storage that would be simple to manage," notes Daniel Hanlon, Storage Architect for Research Data Services at University College London. "We wanted to cover our bases while determining what was practical and doable for researchers."

Recommendations encompassed a broad storage spectrum, including NAS, SAN, HSM, object storage, asset management solutions and small amounts of spinning disk with lots of back-end tape. "Because we had such broad requirements, we ruled out any vendor that was bound to a particular hardware platform," explains Wilkinson. "It was important to be both data and storage agnostic so we would have the flexibility to support all data and media types without being locked into any particular hardware platform."

With its ability to support virtually unlimited scalability, object storage appealed to UCL, especially since it also would be much easier to manage than alternatives. Still, object storage was seen as a relatively new technology and UCL lacked hands-on experience with large-scale deployments within the university's ecosystem. In addition to evaluating the different technologies, UCL also assessed each provider's understanding of their environment, as it was critically important to accommodate UCL's researcher requirements in order to drive acceptance. "Some of the RFP respondents didn't understand the difference between the corporate and academic worlds, and the fact that universities by nature generally have to avoid being tied into particular closed technologies," adds Hanlon. "Many of the RFP respondents were eliminated, not because of their technical response, but because they didn't really get what we were trying to do."

As a result, the universe of prospective solutions was reduced to a half-dozen recommendations. As the team took a closer look at the finalists, they considered each vendor's academic track record, ability to scale without overburdening administrators and experience with open-source technology. "We wanted to work with a storage solutions provider that took advantage of open-source solutions," Hanlon notes. "This would enable us to partner with them and also with other academic institutions trying to do similar things."

In the final analysis, UCL wanted a partner with equal enthusiasm for freeing researchers from the burden of data storage so they could maximize the impact of their projects. "We were very interested in building a relationship with a strong storage partner to fill our technology gap," says Wilkinson. "After a thorough assessment, DataDirect™ Networks (DDN) met our technical requirements and shared our data storage vision. In evaluating DDN, we agreed that their solution had a simple proposition, high performance and low administration overhead."

The proposed solution, which included the GRIDScaler massively scalable parallel file system and Web Object Scaler (WOS), also provided the desired scalability and management simplicity. Another plus for WOS storage was its tight integration with the integrated Rule-Oriented Data System (iRODS). This open-source data management platform is well suited to research collaboration because it makes it easier to organize, share and find collections of data stored in local and remote repositories.
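As a rough sketch of how a data service built on iRODS can register, tag and later find project data, the snippet below uses the open-source python-irodsclient package. The hostname, zone, collection paths and metadata keys are illustrative assumptions, not details of UCL's actual deployment.

```python
# Minimal sketch (not UCL's configuration) of cataloguing project data in iRODS.
from irods.session import iRODSSession
from irods.models import DataObject, DataObjectMeta
from irods.column import Criterion

# Connection details below are placeholders for illustration only.
with iRODSSession(host='irods.example.ac.uk', port=1247,
                  user='researcher', password='secret',
                  zone='exampleZone') as session:
    # Create a shared project collection and upload a result file into it.
    session.collections.create('/exampleZone/projects/grand_challenge')
    session.data_objects.put(
        'results.csv',
        '/exampleZone/projects/grand_challenge/results.csv')

    # Attach descriptive metadata so collaborators can discover the data later.
    obj = session.data_objects.get(
        '/exampleZone/projects/grand_challenge/results.csv')
    obj.metadata.add('project', 'grand_challenge')
    obj.metadata.add('retention', '10y')

    # Query the catalogue for every data object tagged with this project.
    results = (session.query(DataObject.name, DataObjectMeta.value)
               .filter(Criterion('=', DataObjectMeta.name, 'project'))
               .filter(Criterion('=', DataObjectMeta.value, 'grand_challenge')))
    for row in results:
        print(row[DataObject.name])
```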

"It was important that DDN's solution gave us multiple ways to access the same storage, so we could be compatible with existing application codes," says Hanlon. "The tendency with other solutions was to give us bits of technology that had been developed in different spaces and that didn't really fit our problem."

The Benefits
During a successful pilot implementation involving a half-petabyte of storage, UCL gained first-hand insight into the advantages of DDN's turnkey distributed storage and collaboration solution. "The main attraction of DDN WOS is the combination of an efficient object store with edge appliances to ease integration with other storage infrastructure," says Hanlon. Another big plus for UCL is DDN's high-density storage, which allows many more disks to fit into existing racks. That density is crucial to growing capacity while maintaining a small footprint at UCL's highly congested, expensive central London location.

As researchers are often reluctant to give up control of their data storage solutions, the team also has been pleased to discover early adopters who see the value of using the new service to protect and preserve current data assets. In fact, the new research data service already is getting high marks for performance reliability, data durability, data backup and disaster recovery capabilities.

UCL predicts that as traction for the new service increases, there will be greater interest in leveraging it to further extend how current research is reused and exploited to drive more impactful outcomes. By taking this innovative approach, the UCL Research Data Services team is embracing the open data movement while enlisting leading-edge technologies to deliver reliable, flexible data access that maximizes appropriate sharing and re-use of research data.

Additionally, UCL plans to add a scalable archive to its dynamic storage service offering, taking the worry of meeting increasingly stringent expectations from funding organizations off researchers' shoulders. "We'll be able to tell researchers that if they use our services, they'll be compliant with UCL, UK Research Council and other UK and international funding bodies' policies and requirements," Wilkinson says. "They won't have to worry about it because we will."

By providing a framework over which UCL researchers can store and share data confidently, UCL expects to achieve significant bottom-line cost savings. Early projections for the initial phase of the infrastructure build-out put savings at upwards of hundreds of thousands of pounds, simply by eliminating the need for thousands of researchers to acquire and maintain their own storage hardware. "DDN is empowering us to deliver performance and cost savings through a dramatically simplified approach; in doing so we support UCL researchers, their collaborators and partners in maintaining first-class research at London's global university," concludes Wilkinson. "Add in the fact that DDN's resilient, extensible storage solution provided evidence of seamless expansion from half a petabyte to 100PB, and we found exactly the foundation we were looking for."

More Stories By Pat Romanski

News Desk compiles and publishes breaking news stories, press releases and latest news articles as they happen.
