Few organisations can afford to be purely reactive when it comes to security; the goal now is to respond rapidly to security risks through ongoing, real-time monitoring, with analytics to spot anomalies and potential threats.
Designing an adaptive security architecture can help your organisation provide a continuous response to threats.
Continuous monitoring and analytics means watching all assets and devices across a distributed network and acting on potential threats in an automated way. A system may automatically identify an indication of risk and immediately act on it with a set of incident response procedures. Risk scoring algorithms may also be used to identify which potential threats pose the greatest risk and need acting on straight away.
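As a minimal sketch of how such risk scoring and triage might look, the following combines an alert's severity with the criticality of the affected asset into a single score, then surfaces the alerts that exceed a threshold. The field names, weights, and threshold are illustrative assumptions, not a standard schema.

```python
# Illustrative sketch: weighted risk scoring and triage of alerts.
# Field names (severity, asset_criticality) and weights are assumptions.

def risk_score(alert):
    """Combine severity (1-5) and asset criticality (1-5) into a 0-100 score."""
    return min(100, alert["severity"] * alert["asset_criticality"] * 4)

def triage(alerts, threshold=60):
    """Return alerts needing immediate action, highest risk first."""
    scored = [(risk_score(a), a) for a in alerts]
    return [a for s, a in sorted(scored, key=lambda x: -x[0]) if s >= threshold]

alerts = [
    {"id": "a1", "severity": 5, "asset_criticality": 4},
    {"id": "a2", "severity": 2, "asset_criticality": 3},
    {"id": "a3", "severity": 4, "asset_criticality": 4},
]
print([a["id"] for a in triage(alerts)])  # highest-risk alerts first
```

In a real system the score would draw on many more signals (exposure, exploit availability, business impact), but the shape is the same: score, rank, act on the top of the list first.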
Here are the four key pillars of continuous monitoring and analytics systems which can enhance your security footing.
A continuous monitoring and analytics system needs to be able to piece together all the data coming from the organisation's mobile devices, applications and hardware. These items could have different ways of identifying themselves – either through IP address, hostname or MAC address, for example – which could make it challenging to bring all the data together. One way to solve this challenge is by using cross-referencing or a defined master table that contains all identifiers for an entity.
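The master-table approach above can be sketched in a few lines: build a lookup from every known identifier to a canonical entity name, then resolve incoming events through it. The table contents and field names here are illustrative.

```python
# Illustrative sketch: resolve events keyed by different identifiers
# (IP, hostname, MAC) to one canonical entity via a master table.

MASTER = [
    {"entity": "laptop-042", "ip": "10.0.1.7",
     "hostname": "lt042.corp.example", "mac": "aa:bb:cc:00:11:22"},
]

# Build a lookup from every known identifier to the canonical entity.
LOOKUP = {}
for row in MASTER:
    for key in ("ip", "hostname", "mac"):
        LOOKUP[row[key]] = row["entity"]

def resolve(identifier):
    """Map any identifier to its canonical entity, or 'unknown'."""
    return LOOKUP.get(identifier, "unknown")

print(resolve("10.0.1.7"))           # laptop-042
print(resolve("aa:bb:cc:00:11:22"))  # laptop-042
```

Whether the master table is maintained by hand or derived by cross-referencing feeds, the payoff is the same: every data source ends up speaking about the same entity under one name.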
The system also needs to be able to handle multiple data formats such as Comma Separated Values (CSV) files, Extensible Markup Language (XML), and log files, as well as different access routes. Access could be through APIs (application programming interfaces), directly through a database, or by manually exporting the data. When there are multiple tools with different ways of accessing the data, the system needs to be able to integrate all of them.
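A minimal sketch of that kind of multi-format ingestion: three small adapters normalise CSV, XML, and log-line feeds into the same record shape. The field names, sample payloads, and log format are assumptions for illustration.

```python
import csv
import io
import re
import xml.etree.ElementTree as ET

# Illustrative sketch: normalise three feed formats into one record shape.

def from_csv(text):
    return [dict(r) for r in csv.DictReader(io.StringIO(text))]

def from_xml(text):
    return [{"host": e.get("host"), "event": e.get("event")}
            for e in ET.fromstring(text).iter("alert")]

LOG_RE = re.compile(r"host=(?P<host>\S+)\s+event=(?P<event>\S+)")

def from_log(text):
    matches = (LOG_RE.search(line) for line in text.splitlines())
    return [m.groupdict() for m in matches if m]

records = (
    from_csv("host,event\nweb01,login_fail")
    + from_xml('<alerts><alert host="db02" event="port_scan"/></alerts>')
    + from_log("Jan 01 host=app03 event=malware_hit")
)
print(records)  # three records, one common shape
```

Once every source is reduced to a common record shape, the downstream consolidation and correlation logic only has to be written once.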
If there’s a database or repository taking in data from hundreds of sensors or devices then being able to consolidate data from all these sources is also crucial in a continuous monitoring and analytics system. This requires a model that can transform the data, consolidate it and correlate it.
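One way such a consolidation model might work, sketched under illustrative assumptions (the timestamp and field names are made up): merge all per-sensor records into a single view per host, letting later readings supersede earlier ones.

```python
from collections import defaultdict

# Illustrative sketch: consolidate per-sensor records into one view per
# host, keeping the most recent value for each field.

def consolidate(records):
    merged = defaultdict(dict)
    for rec in sorted(records, key=lambda r: r["ts"]):  # oldest first
        for key, value in rec.items():
            if key not in ("host", "ts"):
                merged[rec["host"]][key] = value  # later values win
    return dict(merged)

records = [
    {"host": "web01", "ts": 1, "os": "ubuntu-20.04"},
    {"host": "web01", "ts": 5, "os": "ubuntu-22.04", "patched": True},
    {"host": "db02",  "ts": 3, "os": "rhel-9"},
]
print(consolidate(records))
```

The same pattern scales from a handful of records to a pipeline stage fed by hundreds of sensors: transform each record, fold it into the entity's consolidated state, then correlate across entities.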
The system also needs to feed the data to a real-time dashboard, which could be done through a dimensional database and Online Analytical Processing (OLAP) cubes.
This is probably the most important pillar, as these systems depend on the analytics capability to spot potential threats and give them a risk score.
Sophisticated algorithms are needed to deal with missing data, inconsistent data, and the different time intervals at which sensors and devices generate data. The system needs to be able to fill in missing data with the mean or most frequent value, or predict the value from other relevant data. The analytics also need to be able to look across different time windows and understand which data is most recent and which has been superseded.
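The two imputation strategies mentioned above can be sketched directly: fill numeric gaps with the mean and categorical gaps with the most frequent value. The sample data is illustrative.

```python
from statistics import mean, mode

# Illustrative sketch: fill gaps in sensor readings with the mean
# (numeric) or the most frequent value (categorical).

def impute_numeric(values):
    """Replace None entries with the mean of the known values."""
    known = [v for v in values if v is not None]
    fill = mean(known)
    return [fill if v is None else v for v in values]

def impute_categorical(values):
    """Replace None entries with the most frequent known value."""
    known = [v for v in values if v is not None]
    fill = mode(known)
    return [fill if v is None else v for v in values]

print(impute_numeric([10, None, 14]))                  # gap filled with the mean
print(impute_categorical(["up", None, "up", "down"]))  # gap filled with "up"
```

Predicting a missing value from other fields (the third strategy the text mentions) would typically use a regression or nearest-neighbour model instead, but the mean and mode fills are often a good-enough baseline.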
With large volumes of data coming from many different devices and assets, scalability also needs to be taken into account with a continuous monitoring and analytics system. Some organisations have thousands to millions of assets and devices, with hundreds to thousands of software applications, patches, configuration settings, and so on.
Being able to process this data efficiently is crucial, so that there's no window of opportunity for a threat to go undetected while data is still being processed. One way to solve this challenge is to use fast streaming XML parsers to write the data to a database quickly, with separate jobs handling the data consolidation and correlation.
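A minimal sketch of that streaming-parse-then-write pattern, using Python's built-in `iterparse` and SQLite. The element names, columns, and sample feed are illustrative; in production the feed would be a socket or large file and the consolidation jobs would run separately against the table.

```python
import io
import sqlite3
import xml.etree.ElementTree as ET

# Illustrative sketch: stream an XML feed with iterparse and write rows
# to a database without loading the whole document into memory.

XML = b"""<events>
  <event host="web01" type="login_fail"/>
  <event host="db02" type="port_scan"/>
</events>"""

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE events (host TEXT, type TEXT)")

for _, elem in ET.iterparse(io.BytesIO(XML), events=("end",)):
    if elem.tag == "event":
        db.execute("INSERT INTO events VALUES (?, ?)",
                   (elem.get("host"), elem.get("type")))
        elem.clear()  # free parsed elements as we go

db.commit()
print(db.execute("SELECT COUNT(*) FROM events").fetchone()[0])  # 2
```

Because the parser emits elements as it reads, ingestion keeps pace with the feed, and the heavier consolidation and correlation work can run as separate jobs over the database.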
Ask yourself: should I be looking into ways of being more proactive with threat mitigation?