From Obsolete to Impregnable: Modernising SIEM with Data Streaming for Better Cybersecurity

By Fred Crehan, Area Vice President, Emerging Markets at Confluent

Human society has emerged victorious from many battles of late – a pandemic, a global economic downturn, supply chain disruptions, and more. Even among the problems that remain, there is a sense that relief is around the corner. For the GCC’s cybersecurity teams, however, that hope seems dim at best. This year, for example, organisations in the UAE have faced more than 50,000 cyber-attacks a day, overwhelming the country’s IT security professionals and denting morale in SOCs across the region.

Things can get better, however, and security information and event management (SIEM) is central to that improvement. Naysayers will be quick to dismiss this claim, arguing that SIEM does not work in the modern IT environment amid a flurry of sophisticated, adaptive, stealthy attack methods. They would point out that while many SIEM products are undeniably usable and rich in their presentation of information, they deliver insights after the fact because they receive raw data in batches. Real-time insight – necessary to thwart today’s fast-moving threats – becomes impossible, and critics would justifiably conclude that SIEM is of little use to the modern SOC.

And there the discussion might end, but for one addendum. It is possible to put an end to the operational fragmentation, slow response, and high data costs associated with SIEM and to update it for the cloud-first era by introducing data streaming. The data sources normally attended to by collection agents in each SIEM product – forwarders in Splunk, connectors in ArcSight, Beats in Elastic – are instead served by real-time data feeds. These streams of information augment the SIEM system rather than replacing it. They supply each installed SIEM product with the data it needs, allowing the intelligent analysis engine within to do what it does best. With this simple tweak – moving from batch processing to real-time analysis and investigation – alerts come faster and the SOC becomes immeasurably more agile.
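To make that concrete, here is a minimal sketch – using Confluent’s Python client – of publishing a security event once to a shared stream that any SIEM tool can then subscribe to. The broker address, topic name, and event fields are illustrative assumptions, not a reference configuration.

```python
# Minimal sketch: publish a security event to one shared stream instead of
# shipping it through a product-specific collection agent. Broker address,
# topic name, and event fields are illustrative assumptions.
import json
from confluent_kafka import Producer  # pip install confluent-kafka

producer = Producer({"bootstrap.servers": "localhost:9092"})

event = {
    "source": "firewall-03",
    "action": "deny",
    "src_ip": "203.0.113.7",
    "dest_port": 22,
    "timestamp": "2024-05-01T09:14:33Z",
}

# Every SIEM tool (and any other consumer) can read from this same topic,
# so the data is collected once and used many times.
producer.produce("security-events", key=event["source"], value=json.dumps(event))
producer.flush()
```

The point of the pattern is that the event is collected once and consumed many times, rather than being shipped separately through each product’s own agent.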

Bathing in the stream

Data streaming maximises the ROI of SIEM by allowing the combined system to surface potential threats for analysis at different levels of granularity across multiple tools. Satisfied that each SIEM tool has all the data it needs for comprehensive analysis, security response teams can get down to the real work of detecting threats. Previously, these teams would have spent arduous hours weeding out irrelevant data; stream-processing platforms eliminate that grind.

What follows is, quite simply, a more robust security posture. The data-streaming approach centralises information, which breaks down the silos that otherwise restrict enterprise-wide visibility. With contextually rich data, the SOC gains awareness of everything within its digital jurisdiction. The team’s tools are ingesting real-time data streams from all relevant data sources and forwarding enriched streams to the sink of their choosing. This also returns a lot of control to security professionals, as stream-processing platforms allow new threat detection rules to be applied to data streams on the fly, which improves operational agility. And, as an added bonus, costs of ownership go down even as the coverage of SIEM tools increases.
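A minimal sketch of that consume-enrich-forward loop follows, again using Confluent’s Python client; the topic names, the asset inventory, and the enrichment field are assumptions for illustration. Purpose-built stream processors such as Kafka Streams or ksqlDB do this job natively, but the shape of the work is the same.

```python
# Minimal sketch of a consume-enrich-produce loop: read raw events, attach
# context, and forward the enriched stream to a sink topic. Topic names,
# the asset table, and event fields are illustrative assumptions.
import json
from confluent_kafka import Consumer, Producer  # pip install confluent-kafka

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "siem-enricher",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["security-events"])
producer = Producer({"bootstrap.servers": "localhost:9092"})

# Hypothetical asset inventory used to add context to each event.
ASSET_CRITICALITY = {"firewall-03": "high", "dev-laptop-112": "low"}

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        event = json.loads(msg.value())
        event["asset_criticality"] = ASSET_CRITICALITY.get(event.get("source"), "unknown")
        # The enriched topic becomes the sink that downstream SIEM tools consume.
        producer.produce("security-events-enriched", value=json.dumps(event))
        producer.poll(0)  # serve delivery callbacks
finally:
    consumer.close()
    producer.flush()
```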

The data-streaming approach has broad applicability. SIEM systems and observability pipelines can be built on industry-standard Apache Kafka to deliver a next-generation solution that consolidates, categorises, and enriches log and database feeds, along with data captured from real-time events, and makes it all available for real-time monitoring and forensics. Each analytics engine consumes just the right amount of data for its use case, leading to faster insights.
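As a small illustration of that “right amount of data” principle, the sketch below gives real-time monitoring and forensics their own consumer groups against the same hypothetical enriched topic, so each use case reads the stream independently; all names are assumptions.

```python
# Minimal sketch: the same enriched topic feeds both a real-time monitoring
# consumer and a forensics consumer, each in its own consumer group, so each
# use case reads exactly the data it needs. Names are illustrative assumptions.
from confluent_kafka import Consumer  # pip install confluent-kafka

def make_consumer(group_id: str) -> Consumer:
    # Each consumer group gets its own independent position in the stream.
    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",
        "group.id": group_id,
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["security-events-enriched"])
    return consumer

monitoring = make_consumer("realtime-monitoring")  # drives live dashboards and alerting
forensics = make_consumer("forensic-replay")       # replays history for investigations
# Each consumer would then run its own poll() loop, as in the earlier sketch.
```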

Back to life

So, as we have now seen, SIEM’s obsolescence has been greatly exaggerated. Paired with data streaming technology, it can become the real-time cybercop the SOC needs to fight its daily fight, and the team can evolve from batch processing to real-time analysis at scale. Data streaming is a true partner to SIEM – its real-time information pipeline. It supercharges the technology and gets it ready for modern warfare, makes SIEM capable of faster iteration and response, and delivers agility and better incident detection through the enrichment of in-flight event data.

And stream processing goes further still. It allows threats to be detected in live streams of data, which is great news for a business’s finance professionals because much of this data is costly to store and index in a SIEM; detecting in the stream means only the events that matter need to be indexed. This goes a long way towards eliminating the agony of having to choose between lower costs and more granular visibility.
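A minimal sketch of that cost-control pattern, under the same illustrative assumptions as before: every event lands in cheap archival storage, while only events that match a detection condition are forwarded to the costly SIEM index.

```python
# Minimal sketch of detecting in the stream to control indexing cost: every
# event goes to cheap archival storage, but only suspicious events are sent
# on to the (expensive) SIEM index. Topic names and the detection condition
# are illustrative assumptions.
import json
from confluent_kafka import Consumer, Producer  # pip install confluent-kafka

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "siem-cost-router",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["security-events-enriched"])
producer = Producer({"bootstrap.servers": "localhost:9092"})

def looks_suspicious(event: dict) -> bool:
    # Illustrative condition: denied traffic against a high-criticality asset.
    return event.get("action") == "deny" and event.get("asset_criticality") == "high"

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        event = json.loads(msg.value())
        producer.produce("events-archive", value=json.dumps(event))  # cheap storage
        if looks_suspicious(event):
            producer.produce("siem-index", value=json.dumps(event))  # costly index
        producer.poll(0)
finally:
    consumer.close()
    producer.flush()
```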

Of course, you may be wondering where data streaming services and cybersecurity cross over. Are generic stream-processing engines set up for SIEM-specific logic out of the box? The short answer is “no”, but we get help here from a translator of sorts called Sigma, an open, text-based rule format that describes patterns in log events in a way that any cybersecurity analytics engine can act on. Because it is a vendor-neutral industry standard, it enables detection rules to be exchanged between organisations, leading to greater collaboration on threat intelligence.

Sigma allows cybersecurity rules to be written once and applied to raw data streams from many different sources, and because it is an open-source project, a community has grown up around it to improve it over time. It also works with Kafka Streams, and because the Kafka and Sigma communities overlap, enterprises that adopt the data-streaming approach for their SIEM systems benefit from new Sigma rules being published into Kafka topics and applied to live streams as they appear.
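To show what such a rule looks like in practice, here is a minimal sketch that parses a simplified Sigma-style rule and applies it to a couple of illustrative log events. In a real deployment this matching would be done by a stream processor over live topics; the rule body, event fields, and matching logic here are deliberately reduced assumptions.

```python
# Minimal sketch: a Sigma-style rule, written as plain text, is parsed and
# applied to illustrative log events. The rule body, event fields, and the
# simplified matching logic are assumptions for illustration only.
import yaml  # pip install pyyaml

SIGMA_RULE = """
title: Failed Windows logon
logsource:
  product: windows
  service: security
detection:
  selection:
    EventID: 4625
  condition: selection
level: medium
"""

rule = yaml.safe_load(SIGMA_RULE)
selection = rule["detection"]["selection"]

def matches(event: dict) -> bool:
    # True when every field in the rule's selection matches the event.
    return all(event.get(field) == value for field, value in selection.items())

events = [
    {"EventID": 4624, "TargetUserName": "svc-backup"},     # successful logon
    {"EventID": 4625, "TargetUserName": "administrator"},  # failed logon: matches
]

for event in events:
    if matches(event):
        print(f"ALERT ({rule['title']}): {event}")
```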

SIEM still has an important place in our security ecosystems. Data streaming has returned it to prominence by overcoming its one great disadvantage in the hybrid, multi-cloud space: its reliance on batch processing. A stream-fed SIEM is a well-equipped cybercop for the modern age.