Proof of Concept - Simple Correlation with Logger!

Created by pbrettle on Oct 8, 2012. Last modified Oct 9, 2012 (Version 7).

Introduction

 

One of the big differentiators between Logger and ESM / Express is that Logger doesn't do correlation. It has some simple alerting (well, actually it's quite advanced) and this has been extended in recent releases with saved searches. But when we are looking at simple use cases, or ones where we are up against traditional log management competitors, it can be difficult to show the value of Logger.

 

**** THIS IS NOT SUPPORTED ****

 

Don't even try to get support for this, because the tech support guys won't be able to help. This information is shared as a proof of concept only and not something that we provide as a feature - so the usual caveats apply, and no, don't call me either.

 

Transaction

 

This all came from the introduction of the Transaction operator and the power that it has within a search. This example is shamelessly stolen from Rob, so thanks Rob:

 

Let's take a simple search for failed logins, based on the following:

 

[Screenshot: simple search1.gif]

categoryDeviceGroup = "/Operating System" and categoryBehavior = "/Authentication/Verify" AND categoryOutcome = "/Failure" and sourceAddress is not null and destinationUserName is not null

 

From here, we get a nice search showing the number of failed logins, but we have no aggregation on the same source and the same username, which is what we need for some form of correlation as we know it. The transaction operator, however, allows us to group things together:

 

[Screenshot: simple search 2.gif]

 

categoryDeviceGroup = "/Operating System" and categoryBehavior = "/Authentication/Verify" AND categoryOutcome = "/Failure" and sourceAddress is not null and destinationUserName is not null | transaction sourceAddress destinationAddress destinationUserName maxspan=2m | where eventcount >=5 | dedup transactionid

 

When we run the search we get the following:

 

[Screenshot: tranaction id 3.gif]

 

We can now see that there is a unique transaction ID for each line and that a number of events are bunched together. The transaction operator is taking the sourceAddress, destinationAddress and destinationUserName and ensuring that they are the same. So in effect we are aggregating on these fields and matching them. This is the bit that we couldn't do previously with Logger, and the transaction operator now does it for us.

 

Additionally, we can specify a time span for the transaction, so here we are aggregating these three fields over a 2 minute period - the time frame we want for the simple correlation. We can then pass the result to the where operator and ensure that this particular set of search criteria occurs 5 or more times. So it's just like the standard correlation rule for a brute force login attempt - 5 login failures over a 2 minute period where the username, source and destination are the same! The dedup on the transactionid then ensures that each transaction is reported only once.
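Tuning the rule is then just a matter of changing the maxspan value and the where threshold. As an untested sketch using exactly the same operators, requiring 10 failures over a 5 minute window would look like this:

categoryDeviceGroup = "/Operating System" and categoryBehavior = "/Authentication/Verify" AND categoryOutcome = "/Failure" and sourceAddress is not null and destinationUserName is not null | transaction sourceAddress destinationAddress destinationUserName maxspan=5m | where eventcount >=10 | dedup transactionid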

 

Next comes the difficult part: setting this up as a saved search and then a scheduled alert.

 

Scheduled Alert - the configuration process

 

First we need to save the search by clicking the little floppy disk icon:

 

[Screenshot: save query.gif]

 

Note that we cannot save it as an alert at this point because the UI will stop you. So save it as a search, and we will create the alert from the Configuration -> Settings -> Saved Search menu. Go to the saved searches, locate your search and then click the edit icon:

 

[Screenshot: edit alert.gif]

 

From here, you need to remove everything after the "|" - the transaction part of the query. Odd, I know, but the UI stops you creating a saved search with an operator like this, so you need to remove it and save. The search should look as follows:

 

[Screenshot: edit search 2.gif]
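In other words, the saved search should contain just the base filter from the start of this document, with everything after the "|" removed:

categoryDeviceGroup = "/Operating System" and categoryBehavior = "/Authentication/Verify" AND categoryOutcome = "/Failure" and sourceAddress is not null and destinationUserName is not null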

 

You can now save the search and create a schedule for it. Click the Scheduled Search / Alerts tab and click the Add button to add one:

[Screenshot: scheduled alert 1.gif]

 

Add in an event count, a threshold (I am not sure what to put here - I have been using 5, but it doesn't seem to make a difference) and an email address (or other alerting mechanism) for this saved alert. Once you have finished, save it and activate it:

 

[Screenshot: activate alert.gif]

But to add the transaction bit back into the search, we need to return to the saved search we created and edited above. So go back to the Saved Search tab, locate the search we created, and edit it to add the removed transaction part back in. A simple cut and paste from earlier will be fine:

 

[Screenshot: edit search and save.gif]
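For reference, this is the part that goes back on the end of the saved search, exactly as in the original query above:

| transaction sourceAddress destinationAddress destinationUserName maxspan=2m | where eventcount >=5 | dedup transactionid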

 

Now save the search and it will accept it. If you try to add an alert while the search still contains the transaction operator, you will actually get an error message. The logic of the UI stops you, but technically there is no limitation in the underlying search processing / saved search processing - it's just the UI getting in the way. Save the search and then go to the Alerts view to see what is happening. I took this example because it does work with the standard Logger demo and it does fire nicely. To show the alerts, click on the Analyze tab, select Alerts and then make the view changes:

 

[Screenshot: alert view options.gif]

 

Remember that we have the search running every 15 minutes, so do wait until it runs. But once it has run, you can see the alert and see what is going on:

 

[Screenshot: failed login alert.gif]

 

There is our alert and it is triggering correctly on the underlying events!

 

The Use Cases?

 

The next obvious question is: what is the use case, and can we re-use it? Unfortunately there is no direct way to export scheduled alerts at the moment, only real-time ones, so creating a package for this is going to be difficult. I guess we could create a package and work from there, but that is beyond what I have done for this proof of concept. I will look into the options later, but I wanted to get something out now.

 

What does this mean? Unfortunately it means that you have to create your searches by hand (or copy and paste) and go through this painful process of editing the search to get it to do what you want. This isn't all bad though, as it gives us the capability to create simple correlation rules (aggregation-based rules) on Logger to hit a number of use cases around simple log collection, raw log collection, or even off-loading some processing before the events get to ESM / Express. It should also help show what we can do with Logger when we are up against Splunk and its "correlation". One variation is sketched below.
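As an untested sketch of another aggregation-based rule, using only the operators and fields shown above: dropping destinationUserName from the transaction would catch a single source failing logins across many different accounts, rather than hammering one account:

categoryDeviceGroup = "/Operating System" and categoryBehavior = "/Authentication/Verify" AND categoryOutcome = "/Failure" and sourceAddress is not null | transaction sourceAddress destinationAddress maxspan=2m | where eventcount >=5 | dedup transactionid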

 

Finally, you can of course extend the search and get it to do a chart:

 

[Screenshot: correlation chart.gif]

 

And this was generated by the following search:

 

categoryDeviceGroup = "/Operating System" and categoryBehavior = "/Authentication/Verify" AND categoryOutcome = "/Failure" and sourceAddress is not null and destinationUserName is not null | transaction sourceAddress destinationAddress destinationUserName maxspan=2m | where eventcount >=5 | dedup transactionid | chart count(transactionid) as Thresholds_Exceeded by destinationUserName | sort - Thresholds_Exceeded
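And if you want the view by attacking source rather than by targeted account, swapping the chart grouping field should work along the same lines (again, an untested sketch):

categoryDeviceGroup = "/Operating System" and categoryBehavior = "/Authentication/Verify" AND categoryOutcome = "/Failure" and sourceAddress is not null and destinationUserName is not null | transaction sourceAddress destinationAddress destinationUserName maxspan=2m | where eventcount >=5 | dedup transactionid | chart count(transactionid) as Thresholds_Exceeded by sourceAddress | sort - Thresholds_Exceeded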

 

That's it - thanks to Rob for discovering this, and to Steve for extending it and working through the options. And yes, Logger can do simple correlation, which is way beyond what a lot of our competitors can do! Now the question is how to get this supported as a feature and built into the product going forward!
