Many Enterprise IT organizations have been focused on planning and implementing new infrastructure to enable business operations to turn Big Data into competitive advantage. Now Enterprise IT is turning the tables on itself and finding success in using Big Data analytics to improve its own operations. A new generation of tools – so-called IT Operations Analytics tools – is on the rise and is being used by an increasing number of IT operations professionals to increase service availability, velocity and efficiency.
“IT Operations Analytics”
Gartner, a leading IT analyst firm, coined the term IT Operations Analytics (ITOA) to describe a new class of technologies now used by IT teams to “discover complex patterns in high volumes of often ‘noisy’ IT system availability and performance data.” In other words, ITOA technologies rely heavily on Big Data analytics to perform crucial IT operational functions, such as capacity planning, incident detection and performance optimization.
ITOA tools take a new approach to old problems and are rapidly gaining adoption given the dynamic and complex nature of modern IT infrastructures and continuous deployment processes. This so-called “new era of IT” has evolved to accommodate both the emergence of customer-facing applications as a competitive advantage and data-driven business requirements operating across a layered integration of cloud, mobile and open software infrastructures.
To gain visibility into these dynamic and complex architectures, IT Ops and DevOps teams deploy monitoring mechanisms extensively across the IT stack, from the application software down to the physical equipment. As a result, the volume of monitoring data has grown dramatically and will continue to expand as environments scale. Given this data deluge, ITOA technologies are in high demand because they can make sense of the Big Data generated by IT itself. They automate the analysis and quickly turn this information into useful, actionable insights that help IT support work more efficiently.
Time Warp: Past-Time vs. Real-Time Big Data
The efficiency and velocity benefits that ITOA tools can bring to IT operations teams are numerous. ITOA technologies can generally be divided into three categories. The first generation of tools analyzes Big Data in the traditional way, scouring past-time (i.e. historical) data archived over days, weeks and even years. Much of this data is unstructured, typically in the form of log files. Two broad categories of ITOA technologies work in this way:
- Log Search & Inference
- Statistical Predictive Analysis
The third category of tools is the newest generation of ITOA technology and is able to analyze Big Data streams in real time (i.e. as it happens):
- Algorithmic Event Processing
Each of these technologies has its appropriate use case and purpose.
Operating in the Past
IT Ops and DevOps teams realize that there is insight to be gleaned by analyzing past data. Problems are often intermittent, so it is useful to look back in time and drill down on the data for troubleshooting purposes. Analyzing historical data is also useful for forensic activities, such as security audits and determining “who changed what when.” Log Search & Inference technology is ideal for these use cases. More commonly known as Log Analysis, these tools simplify the search process across large amounts of unstructured data, and help find what you’re looking for. At the same time, these tools are limited if you aren’t sure where to start looking, as they do not provide any early warning as to what may be going wrong in the current production environment.
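To make this concrete, here is a minimal sketch of the kind of search a Log Analysis tool automates at far larger scale: filtering unstructured, timestamped log lines by keyword within a known incident window. The file name, log format and timestamps are assumptions for illustration only, not any vendor's actual implementation.

```python
import re
from datetime import datetime

# Assumed (hypothetical) log format: "2015-06-01 12:34:56 ERROR payment-api timeout calling db-primary"
LINE_RE = re.compile(r"^(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) (\w+) (\S+) (.*)$")

def search_logs(path, keyword, start, end):
    """Return archived log lines mentioning `keyword` between `start` and `end`."""
    hits = []
    with open(path) as f:
        for line in f:
            match = LINE_RE.match(line)
            if not match:
                continue  # skip lines that don't fit the assumed format
            ts = datetime.strptime(match.group(1), "%Y-%m-%d %H:%M:%S")
            if start <= ts <= end and keyword in line:
                hits.append(line.rstrip())
    return hits

# Drill down on a known incident window, e.g. for a "who changed what when" audit
for hit in search_logs("app.log", "timeout",
                       datetime(2015, 6, 1, 12, 0), datetime(2015, 6, 1, 13, 0)):
    print(hit)
```

The key point the sketch illustrates is that you must already know what to search for and roughly when it happened – exactly the limitation noted above.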
Another insight that can be gained from analyzing past data is the capability to identify patterns of anomalous behavior and use this to try to predict and plan for the future. This is the basis of Statistical Predictive Analysis technology, and it has been successful in proactively preventing some classes of problems. It can also be useful for assessing longer term trends, such as usage growth and how that may influence capacity planning. But there is a caveat. Unlike what your financial advisor will tell you, Statistical Predictive Analysis assumes that past performance is indicative of future performance. This means that “zero-day” incidents – those that haven’t occurred before – won’t be predicted well. Such zero-day problems are becoming the norm as IT environments become more dynamic and complex with cloud, mobile, and virtualization.
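To make the “past predicts future” assumption concrete, the sketch below shows the kind of arithmetic such tools build on, using invented usage numbers: a simple growth trend for capacity planning and a baseline-derived threshold for anomaly detection. Production tools use far richer statistical models; this is illustration only.

```python
import statistics

# Hypothetical daily storage usage in GB over two weeks (invented numbers)
history = [410, 415, 423, 431, 440, 446, 455, 462, 470, 479, 488, 495, 503, 512]

# Capacity planning: learn the average day-over-day growth from the past
daily_growth = statistics.mean(b - a for a, b in zip(history, history[1:]))
forecast_30d = history[-1] + 30 * daily_growth
print(f"Projected usage in 30 days: {forecast_30d:.0f} GB")

# Anomaly detection: flag readings far outside the historical baseline
mean, stdev = statistics.mean(history), statistics.stdev(history)
threshold = mean + 3 * stdev
new_reading = 980  # well outside anything the history would predict
if new_reading > threshold:
    print(f"Anomaly: {new_reading} GB exceeds learned threshold of {threshold:.0f} GB")
```

The caveat from the paragraph above applies directly: the forecast and the threshold are only as good as the history they were learned from.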
Looking back at historical Big Data is a helpful way for Ops and DevOps teams to drill down on what is already known. Big Data archived after the fact can help explain why something has already happened. But what about analyzing Big Data in real time to understand what’s happening now, before customers start complaining?
Operating in the Now
IT Ops and DevOps teams also need to be aware of what’s happening “now,” which is why receiving early warnings of service-affecting incidents is crucial. Early warning means remediation actions can be taken more quickly, restoring application and service availability faster.
The ability to ingest distinct data streams in real time from across the entire IT stack enables automated correlation and clustering of related data events into distinct service-affecting situations. This is what Algorithmic Event Processing technology is about: the real-time application of IT Big Data, and the final element of the ITOA puzzle. The Big Data is analyzed in real time, in the form of events and alerts streamed by applications, middleware, orchestration, cloud, and data center equipment – the entire IT stack.
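As a rough, hypothetical sketch of the idea (not any vendor’s actual algorithm), the snippet below groups alerts that arrive close together in time into a single “situation” a responder can act on; the alert stream and the simple time-window rule are assumptions for illustration.

```python
# Hypothetical real-time alert stream: (seconds since start, source, message)
stream = [
    (0,  "db-primary",   "replication lag high"),
    (4,  "payment-api",  "latency above SLA"),
    (7,  "payment-api",  "5xx error rate rising"),
    (9,  "db-primary",   "disk I/O saturated"),
    (65, "mail-gateway", "queue backlog growing"),
]

WINDOW = 30  # seconds: alerts this close together are treated as related

def cluster_events(events, window=WINDOW):
    """Greedy time-window clustering: alerts within `window` seconds join one situation."""
    situations, current = [], []
    for ts, source, message in events:
        if current and ts - current[-1][0] > window:
            situations.append(current)
            current = []
        current.append((ts, source, message))
    if current:
        situations.append(current)
    return situations

for i, situation in enumerate(cluster_events(stream), start=1):
    sources = sorted({src for _, src, _ in situation})
    print(f"Situation {i}: {len(situation)} alerts across {sources}")
```

Real Algorithmic Event Processing engines correlate on many more dimensions – topology, text similarity, temporal patterns – but the principle of reducing a noisy event stream to a handful of actionable situations is the same.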
Think of these tools as the “first responder” to IT problems as they occur, alerting domain experts to come together and collaborate on the contextualized Big Data output. Based on what they see and share, IT Ops and DevOps can then drill down using their historical Big Data tools, once they have the insight on where to look. In fact, the combination of real-time and past-time Big Data tools provides a complete ITOA toolkit for today’s IT environments.
Final Thoughts
As Enterprises migrate toward the new era of IT, the use of ITOA tools will be the only way to support the dynamic, complex nature of customer-facing applications and services running over hybrid infrastructures. And just as business operations are turning Big Data into competitive advantage, IT Ops and DevOps teams will automate the analysis of the massive data created by the IT stack into actionable insights for themselves. With these tools, IT teams can receive real-time early warnings of incidents as they take place, and can use forensic tools to troubleshoot deep into a specific area or to understand longer-term trends.
As Chief Marketing Officer at Moogsoft, Rob is responsible for global outbound marketing, sales support, and product management. Rob’s 25-year career is all about building successful IT service assurance businesses. Prior to Moogsoft, he was SVP of Sales and Marketing at VSS Monitoring and later Danaher after its acquisition. Rob was also a co-founder of Visual Networks, which grew from $0 to $100M in 5 years and had a successful IPO. He was CEO of Agito Networks (now Shoretel) and Network Chemistry (now Aruba), and also served in senior executive positions at Empirix and iBahn. Rob holds a BSEE degree from Carnegie Mellon University.