Is Your Log Data Presentation Worthy?

This article was originally posted on the Logentries Blog.

You're in a quarterly meeting where everyone is armed with slide decks, some good, some mind-numbingly bad. It's your turn, and you have a concise five-slide deck composed of targeted, well-designed, and insightful graphs. The graphs are clear, and the room is able to discuss the highs and lows reported in the data.

While you leave the meeting satisfied that you communicated clearly...

The complex data you turned into consumable information was no trivial task

Whether it took a few hours or a few days to prepare the slides, it took significant effort.

Taking large volumes of information (sometimes abstract information) and turning it into something easily digestible can be challenging, both mentally and in terms of resources. The information first has to be collected, then correlated and analyzed, insights identified, and finally everything organized in a consumable way. Sometimes an intimate knowledge of the system is required to make sense of connected components, and that knowledge then has to be translated for a non-specialized audience.

Today's log platforms allow you to write complex queries to get the information that matters to your business. However, with traditional tools, the query design alone can take days.

Once you have the information, you need to manually normalize the CSV or XML results into a format that your visualization engine can ingest, whether that is a huge spreadsheet or a more advanced tool like Graphite. Then, once the data is loaded, you have to format the graphs so they make sense to your target audience of peers.
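
To make the manual effort concrete, here is a minimal sketch of the kind of glue script teams end up writing: it takes the CSV export of a log query and reshapes each row into Graphite's plaintext protocol, which expects one "metric.path value timestamp" line per data point. The file name, column names, metric prefix, and Carbon endpoint are illustrative assumptions, not part of any specific product.

```python
# Illustrative glue script: reshape a CSV export of log-query results into
# Graphite's plaintext protocol ("metric.path value timestamp").
# The file name, columns, metric prefix, and endpoint are assumptions.
import csv
import socket
from datetime import datetime, timezone

GRAPHITE_HOST = "graphite.example.com"  # assumed Carbon endpoint
GRAPHITE_PORT = 2003                    # default Carbon plaintext port

def rows_to_metrics(csv_path, metric_prefix="app.errors"):
    """Yield plaintext lines from a CSV with 'timestamp', 'host', and 'count' columns."""
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            # Assumes ISO-8601 timestamps in UTC.
            ts = int(datetime.fromisoformat(row["timestamp"])
                     .replace(tzinfo=timezone.utc).timestamp())
            yield f"{metric_prefix}.{row['host']} {row['count']} {ts}"

def push(lines):
    """Send the formatted lines to Carbon over TCP."""
    with socket.create_connection((GRAPHITE_HOST, GRAPHITE_PORT), timeout=5) as sock:
        sock.sendall(("\n".join(lines) + "\n").encode())

if __name__ == "__main__":
    push(list(rows_to_metrics("query_export.csv")))
```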

The goal of your presentation is to educate the organization


You want to paint a clear picture of what is going on and what direction to take. There is a chance the results won't go further than the conference room, evaporating your effort and, more importantly, the time you could have spent on other tasks.

Tasks like consuming the data in real time to manage applications and systems. Speed is key here, so the data needs to be presented in a way that, if something goes wrong, lets you identify issues at a glance and respond quickly by drilling down into specific incident details.

Data feeds insights, and insights feed action

Wouldn't it be nice if the intelligence of log analysis fed the visualizations of a charting platform automatically? Your job would be monitoring real-time dashboards of valuable information, picking the best charts, and downloading or screenshotting them to build presentations.

It sounds like a pipe dream, but the technology exists

Modern log analysis platforms like Logentries are moving away from the query-first, analyze-second approach, turning it on its head with automated, effortless insights and query engines reserved for drill-down. Visualization engines offer unlimited customization with clear visualizations that your team, and the teams around you, can understand at a glance.

The good news is that it has finally happened, with the Logentries integration with Hosted Graphite.

[Chart: Logentries data visualized in Hosted Graphite]

Hosted Graphite, based on the popular open-source Graphite platform, is a SaaS-based service for visualizing key operations and application metrics.
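
For context, getting a single data point into a Graphite-style backend usually looks something like the sketch below, assuming a Hosted Graphite-like plaintext listener and an API-key-prefixed metric name; the key, endpoint, and metric path are placeholders rather than real account details.

```python
# Minimal sketch: push one data point to a Graphite-style plaintext listener.
# The API key, endpoint, and metric name are placeholders, not real credentials.
import socket
import time

API_KEY = "YOUR-HOSTED-GRAPHITE-API-KEY"        # placeholder
ENDPOINT = ("carbon.hostedgraphite.com", 2003)  # assumed plaintext endpoint

def send_metric(path, value, timestamp=None):
    """Send 'APIKEY.path value timestamp' as one plaintext line over TCP."""
    ts = int(timestamp or time.time())
    line = f"{API_KEY}.{path} {value} {ts}\n"
    with socket.create_connection(ENDPOINT, timeout=5) as sock:
        sock.sendall(line.encode())

send_metric("webapp.response_time_ms", 142)
```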

Logentries is a powerful, modern log management and analysis platform that takes log analysis one step further with machine learning, intelligent tagging, and team annotation capabilities. Both services provide excellent tools to help operations and development teams better understand their application ecosystem and activity.

However, Hosted Graphite's powerful visualizations are lost without strong data: not just any data, but data that has been synthesized into useful insights.

Information overload is a rampant disease in many organizations, and one to avoid

Logentries' powerful machine learning, with real-time anomaly and inactivity alerting, feeds Hosted Graphite graphs with near real-time data about your systems in an easily consumable format, without placing the manual effort on the shoulders of operations teams.
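
The sketch below is not the integration itself; it simply illustrates the kind of plumbing this removes: a small webhook receiver that turns an incoming anomaly-alert payload into a Graphite data point. The endpoint, payload fields, and metric naming are assumptions for illustration only.

```python
# Illustration only: a tiny webhook receiver that converts an incoming
# anomaly-alert payload into a Graphite data point. The payload fields
# ("alert_name", "value") and the metric naming are assumptions, not the
# actual Logentries/Hosted Graphite contract.
import json
import socket
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

GRAPHITE = ("carbon.example.com", 2003)  # assumed Carbon plaintext endpoint

class AlertHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers["Content-Length"])
        payload = json.loads(self.rfile.read(length))
        line = f"alerts.{payload['alert_name']} {payload['value']} {int(time.time())}\n"
        with socket.create_connection(GRAPHITE, timeout=5) as sock:
            sock.sendall(line.encode())
        self.send_response(204)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), AlertHandler).serve_forever()
```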

A perfect combination of robust data analysis, and data visualization means:

  • Unshackled application and operations teams
  • Faster reaction time to key events, and less time spent on trivial activity
  • Faster and easier communication with your team and others
  • Targeted, insight-driven visualizations that avoid being over-inclusive
  • Modern log analysis in existing Graphite environments

Because both platforms are cloud-based, setup takes days, not weeks or months. If you are already a Logentries user, your visualizations are immediately ready to go; only some bedazzling is required. If you are an existing Hosted Graphite customer, the data you are already feeding the engine will become more valuable and less time-consuming to work with.

If you are new to both, be prepared for how humbling the additional insights will be.

The next time you are in a meeting or working with your team, you can maintain the integrity of the data behind your insights, knowing that great product design combined with a powerful integration between two platforms got you there, without wasted effort.

Interested in learning more about the integration with Hosted Graphite? Watch the recorded webinar.

More Stories By Trevor Parsons

Trevor Parsons is Chief Scientist and Co-founder of Logentries. Trevor has over 10 years' experience in enterprise software and, in particular, has specialized in developing enterprise monitoring and performance tools for distributed systems. He is also a research fellow at the Performance Engineering Lab Research Group and was formerly a Scientist at the IBM Center for Advanced Studies. Trevor holds a PhD from University College Dublin, Ireland.
