DevOps Intelligence Changes the Game
By Lisa Wells

One of my favorite parts of the novel The Phoenix Project is when Bill Palmer, DevOps hero and VP of IT Operations for the fictional company “Parts Unlimited,” has a light bulb moment about the central importance of IT to the business.

The moment comes as the company’s CFO lays out for Bill how he strives to align the goals of his department with the goals of the business. It’s here Bill starts to realize he must take a similar approach with IT. He ultimately turns to data about his delivery process to improve IT’s effectiveness and save his team from outsourcing—and a DevOps team is born.

Okay, so real-world situations might not be as dire as the fictional drama at Parts Unlimited. Still, many IT teams that are transforming to DevOps have yet to take the next step—using “DevOps Intelligence” to make data-driven decisions that help them improve software delivery.

What Is DevOps Intelligence?
DevOps intelligence is all about providing the insight companies need to deliver software more efficiently, with less risk, and with better results. Making it part of your process is becoming crucial as the demand for better software, delivered faster, keeps growing alongside the complexity of application development. As incentive for getting started, below are seven benefits of making DevOps intelligence a top priority in 2017 and beyond.

1. Faster Release Cycles
End-to-end intelligence about your delivery pipeline lets you optimize your processes and accelerate release cycles. With the real-time, actionable information that DevOps intelligence provides, you can identify waste, such as bottlenecks in the pipeline. You can quickly find out how systems are performing with new changes, monitor the success rate of deployments, get insight into the cycle times for each team, and see which processes are working well and which are negatively impacting time to delivery.
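
To make that concrete, here is a minimal sketch of the kind of pipeline metrics involved, computed from invented release data; the field names, teams, and numbers are purely illustrative and not tied to any particular tool.

```python
from datetime import datetime, timedelta
from statistics import mean

# Hypothetical pipeline events: one record per release, per team.
releases = [
    {"team": "payments", "committed": datetime(2017, 3, 1, 9, 0),
     "deployed": datetime(2017, 3, 3, 15, 0), "succeeded": True},
    {"team": "payments", "committed": datetime(2017, 3, 6, 10, 0),
     "deployed": datetime(2017, 3, 10, 11, 0), "succeeded": False},
    {"team": "storefront", "committed": datetime(2017, 3, 2, 8, 0),
     "deployed": datetime(2017, 3, 2, 17, 0), "succeeded": True},
]

def cycle_time_hours(release):
    # Time from commit to deployment, in hours.
    return (release["deployed"] - release["committed"]) / timedelta(hours=1)

for team in sorted({r["team"] for r in releases}):
    team_releases = [r for r in releases if r["team"] == team]
    avg_cycle = mean(cycle_time_hours(r) for r in team_releases)
    success_rate = sum(r["succeeded"] for r in team_releases) / len(team_releases)
    print(f"{team}: avg cycle time {avg_cycle:.1f}h, "
          f"deployment success rate {success_rate:.0%}")
```

A long average cycle time or a dipping success rate for one team is exactly the kind of bottleneck this data surfaces.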

2. Higher Quality Software
DevOps intelligence enables feedback loops, which are the foundation of iterative development. Feedback loops allow for creativity and are extremely valuable for doing things like trying out new features or changes to an interface to make sure you’re building more of what customers want. Feedback loops can become an integral part of the software development and delivery process because failures are fast, cheap, and small.
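
As a rough illustration, the sketch below wires a hypothetical interface change into a feedback loop: expose it to a small slice of users, measure a simple engagement metric, and keep or drop the change based on the data. The rollout mechanism, metric, and numbers are all invented.

```python
import hashlib

def sees_new_checkout(user_id, rollout_percent=10):
    # Stable bucketing: the same user always lands in the same group.
    bucket = int(hashlib.md5(user_id.encode()).hexdigest(), 16) % 100
    return bucket < rollout_percent

# Hypothetical usage events: (user_id, completed_checkout)
events = [(f"user-{i}", i % 3 != 0) for i in range(1000)]

variant = [done for uid, done in events if sees_new_checkout(uid)]
control = [done for uid, done in events if not sees_new_checkout(uid)]

print(f"new checkout conversion: {sum(variant) / len(variant):.1%}")
print(f"current checkout conversion: {sum(control) / len(control):.1%}")
# If the variant underperforms, the failure is fast, cheap, and small:
# roll the change back before it reaches every customer.
```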

3. Increased Business Value of Software
DevOps intelligence allows you to quickly get actionable information about things like which features customers are using, which processes they’re abandoning, or whether they’re changing their behavior. DevOps intelligence can also be mined after a release to support impact analysis so you can find out whether what you’re delivering is actually of value to your customers and make smarter decisions about future offerings.
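
For example, a simple post-release impact analysis might compare how often customers abandon a feature before and after a change ships. The sketch below shows the idea with invented event data.

```python
from collections import Counter

def abandonment_rates(events):
    # events: (feature, outcome) pairs, outcome is "completed" or "abandoned".
    counts = Counter(events)
    features = {feature for feature, _ in events}
    return {
        f: counts[(f, "abandoned")]
           / (counts[(f, "completed")] + counts[(f, "abandoned")])
        for f in features
    }

# Invented usage events from before and after a release.
before = [("wishlist", "completed")] * 40 + [("wishlist", "abandoned")] * 60
after = [("wishlist", "completed")] * 75 + [("wishlist", "abandoned")] * 25

before_rates = abandonment_rates(before)
for feature, rate in abandonment_rates(after).items():
    print(f"{feature}: abandonment {before_rates[feature]:.0%} -> {rate:.0%}")
```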

4. Greater Transparency
Insight into the entire pipeline provides end-to-end transparency. Clear, real-time visibility into the process makes it easier for you to understand why you are (or are not) hitting your goals, justify requests for additional time and resources, and make the case that readiness rather than calendar dates should drive releases. Transparency also means that non-IT stakeholders can easily track progress at any given point in the process and feel empowered to make business decisions based on real-time data without having to go through IT.

5. Addition of Proactive and Predictive Management to the Delivery Process
DevOps intelligence gives you access to both real-time and historical information about your applications, people, environments, and more. Real-time, actionable insight delivers advantages such as early warning of what might fail so you can prevent it rather than wasting time firefighting. Historical data lets you analyze trends and predict behavior based on past results.
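
One way to picture the predictive side: compare today's numbers against the recent trend and raise an early warning when they drift too far. The sketch below does this for a hypothetical feed of daily deployment failure rates; the data and threshold are invented.

```python
from statistics import mean, stdev

# Invented daily deployment failure rates; the last entry is "today".
daily_failure_rates = [0.04, 0.05, 0.03, 0.06, 0.04, 0.05, 0.04, 0.12]

history, today = daily_failure_rates[:-1], daily_failure_rates[-1]
baseline, spread = mean(history), stdev(history)

# Flag anything well outside the recent trend before it becomes a fire.
if today > baseline + 2 * spread:
    print(f"early warning: failure rate {today:.0%} vs. recent trend "
          f"{baseline:.0%} (+/- {spread:.1%})")
```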

6. Better Support of Compliance Requirements
Data collected about your processes shows not just how those processes can be optimized, but what happened when, in an auditable fashion. Were processes followed? Who did what and when? What failed? What steps were taken, by whom, when, and were they correct? DevOps intelligence helps you stay on top of your compliance requirements and fix problems that might threaten your ability to meet them.
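
In practice, that means every pipeline step leaves an auditable record of who did what, when, and with what outcome. The sketch below shows the shape of such a record with invented actors and events; real tooling would persist it in durable, append-only storage.

```python
from datetime import datetime, timezone

audit_log = []  # in practice: durable, append-only storage

def record(actor, action, target, outcome):
    audit_log.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "actor": actor,
        "action": action,
        "target": target,
        "outcome": outcome,
    })

record("jsmith", "approve-release", "payments-2.4.1", "approved")
record("ci-bot", "deploy", "payments-2.4.1", "failed")
record("mlee", "rollback", "payments-2.4.1", "succeeded")

# "What failed, who was involved, and when?"
for entry in audit_log:
    if entry["outcome"] == "failed":
        print(f"{entry['timestamp']}: {entry['actor']} ran {entry['action']} "
              f"on {entry['target']} and it failed")
```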

7. Stronger DevOps Culture
Intelligence about your delivery process helps strengthen your DevOps culture by empowering people, both inside and outside of IT, to effect change and be part of efforts to improve processes and products. DevOps intelligence provides insight that shines a light on their accomplishments so they can be celebrated. The ability to share data with people across the business reinforces the fact that they have an important role in making impactful decisions that help the company.

As companies improve their DevOps maturity and implement release orchestration, they’re building the infrastructure they need to automatically capture and analyze DevOps data and turn it into actionable information. Armed with this intelligence, IT will be well-positioned to fully support the goals of the business.

