Security analytics has become a “key” tool for implementing effective security breach detection, Gartner has advised, as the recent launch of several new managed security services (MSS) offerings brings new hosted security and analytics capabilities to the Australian market.
While many security intelligence and event management (SIEM) tools could already collate and analyse security data, Gartner believes those tools will increasingly be differentiated by their ability to apply analytics proactively, automatically, and in business-relevant ways.
“Breach detection is top of mind for security buyers and the field of security technologies claiming to find breaches or detect advanced attacks is at an all-time noise level,” wrote Gartner research director Eric Ahlm, who labelled the security industry as being “immature in the application of analytics”.
Specialised analytics tools would increasingly focus on areas such as user behaviour analytics (UBA), in which large quantities of user behaviour data are collected and correlated to identify important security events. UBA systems are already expanding in scope to incorporate data such as device details and user location, but Gartner advised that “there is still an opportunity to enhance the analytics to include even more data points that can increase the accuracy of detecting a breach.”
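The correlation step UBA systems perform can be illustrated with a minimal sketch. The code below is a hypothetical example, not any vendor's actual implementation: it builds a per-user baseline of login hours and flags logins that deviate sharply from that baseline, the same pattern-of-life idea extended by the richer data points (device, location) Gartner mentions.

```python
from collections import defaultdict
from statistics import mean, stdev

def build_baselines(events):
    """Group historical login hours by user to form a per-user baseline.

    `events` is an iterable of (username, hour_of_day) tuples.
    """
    hours = defaultdict(list)
    for user, hour in events:
        hours[user].append(hour)
    return hours

def is_anomalous(baselines, user, hour, threshold=3.0):
    """Flag a login whose hour deviates more than `threshold` standard
    deviations from that user's historical mean login hour."""
    history = baselines.get(user, [])
    if len(history) < 2:
        return False  # not enough history to judge
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return hour != mu
    return abs(hour - mu) / sigma > threshold
```

A real UBA product would correlate many such signals (device fingerprint, geolocation, access patterns) rather than a single feature, but the scoring principle — compare new behaviour against a learned baseline — is the same.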
Akamai, for its part, this month released a pair of services based on its Cloud Security Intelligence, a data processing engine within the company's Akamai Intelligent Platform. The new services include Kona Client Reputation, which claims an 8-fold improvement in malicious site blocking by creating a reputation score for every IP address crossing Akamai's content delivery network, and Akamai's Improved Kona Rule Set, claimed to reduce false positives and negatives by around 30 percent.
Both tools leverage Akamai's massive and accumulating repository of security event information – which already informs the company's quarterly State of the Internet reports – to provide a consistent and meaningful source of security information structured for standalone analysis or feeding into a SIEM system.
Similarly, BT recently launched BT Assure Cyber, a security monitoring offering that combines monitoring and analytics services to correlate data from a range of platforms and applications. The service's 'super correlation' engine applies anomaly-detection algorithms to automatically identify events, while a risk modelling engine evaluates how serious a particular threat may be to the organisation.
Effective analytics systems also needed to incorporate long-term views, looking back across days' or weeks' worth of seemingly innocuous data that may, in retrospect, be a leading indicator of a looming security compromise.
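The value of that long-term view can be sketched with a simple sliding-window scan. This is an illustrative example of the general idea, not a description of any product named above: it flags sources whose cumulative count of low-severity events over a multi-day window crosses a threshold, even though no single day looks suspicious on its own.

```python
from collections import defaultdict
from datetime import date, timedelta

def slow_burn_sources(events, window_days=14, threshold=20):
    """Flag sources whose low-severity event count within any sliding
    window of `window_days` exceeds `threshold`.

    `events` is an iterable of (source_ip, event_date) tuples.
    """
    by_source = defaultdict(list)
    for src, day in events:
        by_source[src].append(day)

    flagged = set()
    for src, days in by_source.items():
        days.sort()
        start = 0
        for end, d in enumerate(days):
            # Shrink the window until it spans at most `window_days`.
            while d - days[start] > timedelta(days=window_days):
                start += 1
            if end - start + 1 > threshold:
                flagged.add(src)
                break
    return flagged
```

A source generating just two innocuous events a day would never trip a daily alert, but over two weeks its cumulative count crosses the threshold and surfaces for human review.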
This sort of analysis may still require extensive human involvement to apply meaningful heuristics to the observed data, but analytics tools will also need to collate and present this information more effectively to support that analysis.
“Ultimately, how actual human users interface with the outputs of large data analytics will greatly determine if the technology is adopted or deemed to produce useful information in a reasonable amount of time,” Ahlm said.
“Like other disciplines that have leveraged large data analytics to discover new things or produce new outputs, visualization of that data will greatly affect adoption of the technology.”
This article is brought to you by Enex TestLab, content directors for CSO Australia.