Journalism 2.0 … Blatant Advertising!
For those of you who know me, I have been on a kick about the fundamental transformation of PR and marketing over the last two years. If you have seen the piece “Log and event management appliances improve compliance, security, operations” by Chris Peterson, Network World, 03/19/2008, it is just one more manifestation of that same thread of fundamental change in PR & marketing. The article runs under the heading of Network Management, which at least identifies the viewpoint up front, but still … this is not news, it is an ad.
“current approaches to log and security event management force customers to purchase and integrate two or more products for each discipline”. I am not sure where he is going with this, but if I want network event management (SIM/SEM), there are several vendors that offer complete and compelling solutions. Single vendor. Single product. Whatever the intention behind this statement, it seems misleading to me.
What is more, I advocate that we should have two systems, at least in the short term. One to monitor, assess and audit network infrastructure, and one that does the same for data/applications. Why? The type of information relevant and available to each is different, and both security & compliance challenges are different. It would be great to have one, but I think all too often securing one is seen as securing the other, and that horrid fallacy continues to persist.
“To unlock the value of logs, a new class of appliance has emerged that combines universal log-data collection, analysis, event management, automated report distribution and incident response”. Wow. I have many problems with that statement, but the list is too long to really cover, so I will just mention a couple of points.
“Since log formats are as varied as the log sources themselves, once logs are collected they must be normalized.” Normalization is NOT mandatory. Normalization is very useful if you want to report on very large quantities of data: migrating everything into a lowest-common-denominator format simplifies the reporting challenge and reduces the computational cost of report generation. But precisely because “log formats are as varied as the log sources themselves”, normalization means losing whatever information does not fit within the normalized template. That lowers the value of the data itself, unless you keep both the original record and the normalized record. And if you keep both, the data management and processing overhead doubles (storage, reporting, archive & restore, etc.), and you need some reference structure to link the normalized and original records together.
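To make the trade-off concrete, here is a minimal sketch of that keep-both approach. The field names and the syslog-ish input are invented for illustration, not any vendor's actual schema; the point is that you end up storing two records per event, tied together by a shared ID.

```python
import hashlib

def normalize(raw_line):
    """Map a raw log line into a lowest-common-denominator record,
    keeping the original alongside it (hypothetical schema)."""
    record_id = hashlib.sha1(raw_line.encode()).hexdigest()[:12]
    parts = raw_line.split(" ", 3)
    normalized = {
        "id": record_id,
        "timestamp": parts[0],
        "host": parts[1],
        "source": parts[2].rstrip(":"),
        "message": parts[3] if len(parts) > 3 else "",
        # Anything device-specific that doesn't fit this template is
        # simply gone -- the information-loss problem described above.
    }
    # Second copy, linked by the same ID: storage, reporting and
    # archive overhead roughly doubles.
    original = {"id": record_id, "raw": raw_line}
    return normalized, original

norm, orig = normalize("2008-03-19T10:02:11 fw01 pf: block in on em0")
```

Drop the `original` record and you halve the overhead, but you can never answer a question the template did not anticipate.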
Under the topic of Event Management: “One way to do that is to assign a priority from 1 to 100 based on the type of event; …”. There is no such thing as a universal priority. I will grant you that some events, like failed logins, are interesting to security, compliance, IT and development groups alike. Perhaps not the executive team, but it is safe to say most groups want that information. But save for one or two event types, almost no two groups share the same priority list. In fact, Internal Audit and IT often have opposite lists.
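A toy example of why a single 1-to-100 scale breaks down: give each group its own ranking and look at where the same event type lands. The group names, event types and numbers are made up for illustration.

```python
# Hypothetical per-group priority tables (1-100). Note that ops and
# audit rank a link flap at opposite ends of the scale, while nearly
# everyone agrees failed logins matter.
PRIORITIES = {
    "security":       {"failed_login": 90, "config_change": 70, "link_flap": 10},
    "internal_audit": {"failed_login": 85, "config_change": 95, "link_flap": 5},
    "network_ops":    {"failed_login": 20, "config_change": 40, "link_flap": 90},
}

def priority(group, event_type, default=0):
    """Look up how one group ranks one event type."""
    return PRIORITIES.get(group, {}).get(event_type, default)

# Every group cares about failed logins to some degree...
assert all(priority(g, "failed_login") > 0 for g in PRIORITIES)
# ...but no single number could serve both of these consumers:
assert priority("network_ops", "link_flap") > priority("internal_audit", "link_flap")
```

Collapse those tables into one column and somebody's view of "critical" gets buried.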
Under the topic of Log Collection: “They can be pulled from Windows hosts (event logs) and any database compliant with Open Database Connectivity.” ODBC? For database log collection? Really?
If 25% of IT data created is log files (according to the author this is a SANS statistic), doesn’t this indicate a problem unto itself? Does it not suggest that filtering out events not pertinent to core business, security and compliance might be the core way to deal with such a steep growth curve? I actually suspect this number is in fact far higher, but let’s go with 25% for the sake of argument. Is an appliance the appropriate way to grow with this data set? Are you really going to house, analyze, manage and report on terabytes of data in remote appliances? Or consider the corollary: think about your investment in hardware & software for applications like SAP or databases like Oracle.
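The filtering idea is not complicated. A minimal sketch of dropping the chatter at the collection point, before anything has to be housed, indexed or reported on; the noise list and event types are made up, and a real deployment would drive this from policy:

```python
# Hypothetical drop-list of event types with no security, compliance
# or business relevance.
NOISE = {"debug", "heartbeat", "poll_ok"}

def keep(event):
    """Retain only events worth storing and reporting on."""
    return event["type"] not in NOISE

stream = [
    {"type": "heartbeat"}, {"type": "failed_login"},
    {"type": "poll_ok"}, {"type": "config_change"}, {"type": "debug"},
]
retained = [e for e in stream if keep(e)]
# Five raw events in, two relevant events kept -- a 60% cut before
# any appliance has to scale to hold them.
```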
Are you willing to spend 25% yet again to watch the underlying log files? Is log file aggregation really that wise an investment? Does it deserve this level of infrastructure and cost, given that event management tends to be more forensic in nature? Is it really an effective way to gather and report on security and compliance data? Not to me, and the picture being painted here is not a compelling story for SEM.
I am a fan of well-executed SIM/SEM, and this product may fall into that category. But since it is clear this is not a general advocacy piece, two things need to happen here. First, this article really needs a giant flashing ‘Advertisement’ banner so I don’t choke on my coffee again when I read stuff like this. Second, if there really is “a new class of appliance” out there, the author needs to explain exactly why, because you are not going to learn it from this article. This is exactly the same feature set I was reading about and saw demoed by other vendors in 2004. I know Network World wants new and exciting content, but they need to do better than this.