The Analytics Buzz
It’s an exciting time! The IT security industry is buzzing with vendors touting analytics as the solution to the never-ending cat-and-mouse game of “catch the bad guy.” The idea is that analytics puts a combination of brains and horsepower in the box. Massive amounts of activity data coupled with supercomputing power have the potential to detect and highlight anomalies. And these anomalies will supposedly illuminate the bad guy like a spotlight in a prison camp.
History Repeats Itself
They say history repeats itself, and if you’ve been in the business long enough, you’ve likely seen something like this before. In the late ’80s and ’90s, as the desktop PC evolved into the back-end server and new PC-based server applications were popping up everywhere, there was a need to monitor the health and safety of these boxes and applications. This was a booming market, and vendors didn’t waste time jumping in to seize the opportunity. Software companies built the ultimate all-in-one solutions to monitor everything using a “framework.” These frameworks sold into many of the biggest companies with multi-million-dollar price tags. In the end, they seldom worked, and when they did, they provided such shallow insight into applications that they were often of little value. Applications would hang, users would complain, and the NOC would report that everything was green. These frameworks were part of the “shelfware” legacy of the ’90s: a great idea on the drawing board that failed miserably in the field. For many years, even into the new century, IT pros who had lived through it would shudder when they heard the term “framework.”
The problem was that these frameworks were a mile wide and an inch deep. They could cover every platform and application but provided little value, because they lacked critical insight into the internals of the applications they were watching; they were not domain experts.
Today’s analytics face a similar risk. They are often built on top of other data warehouses, frequently a SIEM. SIEM captures a tremendous amount of data but has little in the way of deep application intelligence; it relies on other sources, and in the world of Windows applications and security, that data can range from uninformative to overly verbose. Translating logs into intelligence is a real problem, and it is guaranteed to happen “after the fact,” making it impossible to gather further data that might be needed. Thus, SIEM offers up mountains of data: sometimes very valuable, sometimes not, but always after the fact and disconnected. What’s missing is the deep domain expertise and context that would highlight risk areas invisible to SIEM.
I’ll coin the phrase “hierarchical analytics,” because current analytics architectures tend to be very flat. It’s like a CEO having to make every decision in the organization from a recording of the day’s data. First, it’s not likely to be efficient, even if you give him super processing powers. Second, and more important, good decisions often come from asking a few more questions, and you can’t do that from a report, or, in the case of SIEM, a log.
If we agree that analytics requires good data, and we agree that good data comes from expertise, proximity to the event, timeliness, and the ability to gather context, then we must logically agree that a flat approach of data feeds into SIEM coupled with “analytics on top” is likely problematic.
Instead, we need data from products that are close to the bare metal and that feed domain intelligence to SIEM: domain-expert products that watch in real time, gather additional context, add domain expertise with their own lightweight analytics, and pass this higher-level knowledge up to SIEM. With this more intelligent, contextual data, SIEM analytics become both better informed and more timely; the “who,” “what,” “where,” and “when” are provided and the noise is removed. These events are designed for SIEM and thus contain the data SIEM needs to link together the big picture.
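To make the idea concrete, here is a minimal sketch of what a domain-expert monitor’s enrichment step might look like. The event ID, user directory, and lookup tables are all hypothetical stand-ins: the point is only that a terse raw event goes in, and a contextual who/what/where/when record comes out, ready for a SIEM to consume.

```python
from datetime import datetime, timezone

# Hypothetical raw event as a low-level log source might emit it:
# terse, numeric IDs, no context.
RAW_EVENT = {"event_id": 4672, "sid": "S-1-5-21-1004", "object": "\\\\HR\\payroll.xlsx"}

# A domain-expert monitor sits close to the application, so it can
# resolve IDs to meaning. These lookup tables are assumptions for
# illustration, not a real directory or event catalog.
USER_DIRECTORY = {"S-1-5-21-1004": "jdoe (Finance)"}
EVENT_MEANINGS = {4672: "special privileges assigned to new logon"}

def enrich(raw: dict) -> dict:
    """Translate a raw event into a SIEM-ready record that answers
    who, what, where, and when -- the 'hierarchical' step."""
    return {
        "who": USER_DIRECTORY.get(raw["sid"], raw["sid"]),
        "what": EVENT_MEANINGS.get(raw["event_id"], f"event {raw['event_id']}"),
        "where": raw["object"],
        "when": datetime.now(timezone.utc).isoformat(),
        "source": "domain-expert-monitor",
    }

record = enrich(RAW_EVENT)
print(record["who"], "->", record["what"])
```

A flat architecture would forward `RAW_EVENT` as-is and leave the translation to downstream analytics; the hierarchical approach does the translation where the context still exists.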
As in the early days of framework monitoring, successful implementations resulted from smart IT people linking domain-expert monitors into their framework backbone. So it is today with SIEM and analytics.
And as real estate professionals say, “Location, Location, Location.” I would translate that to “Proximity, Proximity, Proximity.” Proximity to the application and to the actual event, in real time, is the key to obtaining quality data and to making decisions that might require additional context. Recognizing this and implementing the appropriate hierarchy is the surest way to succeed with analytics.