Introduction
Correlation Search Tutorial
This tutorial is for users who are comfortable with the Splunk Search Processing
Language (SPL) and who understand data models and the Splunk App for
Common Information Model.
You will learn how to create a correlation search using the guided search
creation wizard.
Correlation searches allow you to search across one or more types of data and
identify patterns that could indicate suspicious or malicious activity in your
environment.
When to use a correlation search
Use a correlation search to identify patterns in your data that can indicate a
security risk.
• Know when high-risk users log in to machines infected with
malware.
• Identify vulnerability scanning behavior in your network.
• Validate that your access control deprovisioning process is working as
expected by monitoring inactive and expired account activity.
• Look for compromised accounts by identifying geographically impossible
logins.
Develop a use case that you want the search to address before you start creating
the search. This tutorial walks you through creating the Excessive Failed Logins
search, which is designed to detect brute force access attempts.
For example, a security analyst wants to know all the users that attempted to log
in to an application and failed to type their passwords correctly at least six times.
The Excessive Failed Logins correlation search included in Splunk Enterprise
Security captures that use case and performs the following functions:

• Count the failed login events.
• If there are six or more failed login attempts, create an alert.

As another example, a security analyst wants to know if more than ten computers
on the network failed to update their virus signatures for a week. The High
Number of Hosts Not Updating Malware Signatures correlation search included
in Splunk Enterprise Security captures that use case and performs the following
functions:

• Count the collected events.
• If there are more than 10 collected events, create an alert.
After you determine the security use case that you want your correlation search
to address, determine which data sources are relevant to the use case.
In this case, the Excessive Failed Logins search looks for data related to
logins, so it uses the Authentication data model as the data source. By using a
data model rather than searching a specific source type directly, the correlation
search can search a wide variety of data sources related to authentication, such
as operating systems, applications, or RFID badge readers, without needing to
be changed. Relying on data models in correlation searches allows you to write
one search for multiple types of data.
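For example, the data model search below (the same dataset used later in this tutorial as the basis for the drill-down search) returns failed logins from every source type mapped to the Authentication data model, whereas a source-type-specific search covers only one source. The index and sourcetype names in the second search are hypothetical:

| from datamodel:"Authentication"."Failed_Authentication"

index=oslogs sourcetype=linux_secure "Failed password"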
Next step
Create a search
Create a search

5. In the App drop-down list, select the app that aligns with the type of
search that you plan to build. If you have a custom app for your
deployment, you can store the correlation search there.
6. In the UI Dispatch Context drop-down list, select None. This is the app
used by links in email and other adaptive response actions. The app must
be visible for links to work.
7. In the Description field, type a description of what the correlation search
looks for, and the security use case addressed by the search. For
example, Detects excessive number of failed login attempts (this is
likely a brute force attack).
If you disable or remove the app where the search is stored, the correlation
search is disabled. The app context does not affect how the search runs or the
data that it searches.
Next Step
Create the correlation search in guided mode

After you create the correlation search, choose the mode that you want to use to
build it. The best way to build a correlation search with syntax that parses and
works as expected is to use guided search creation mode.
1. From the correlation search editor, click Guided for the Mode setting.
2. Click Continue to open the guided search editor.
Select a data source

1. For the Data source field, select the source for your data.
♦ Select Data Model if your data is stored in a data model. The data
model defines which objects, or datasets, the correlation search
can use as a data source.
♦ Select Lookup File if your data is stored in a lookup. If you select a
lookup file for the Source, then you need to select a lookup file by
name.
To recreate the Excessive Failed Logins search, select Data Model.
2. In the Data model list, select the data model that contains the
security-relevant data for your search. Select the Authentication data
model because it contains login-relevant data.
3. In the Dataset list, select the Failed_Authentication dataset. The
Excessive Failed Logins search is looking for failed logins, and that
information is stored in this data model dataset.
4. For the Summaries only field, click Yes to restrict the search to
accelerated data only.
5. Select a Time range of events for the correlation search to scan for
excessive failed logins. Select a preset relative time range of Last 60
minutes. The time range depends on the security use case for the search.
Excessive failed logins are more of a security issue if they occur during a
one hour time span, whereas one hour might not be a long enough time
span to catch other security incidents.
6. Click Preview to review the first portion of the search.
7. Click Next to continue building the search.
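Because the data source is a data model and Summaries only is set to Yes, the previewed first portion of the search is a tstats search over the accelerated data model, similar to this minimal sketch (shown here with a simple count; the exact string in the editor can differ):

| tstats summariesonly=true count from datamodel=Authentication where nodename=Authentication.Failed_Authentication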
Filter the data that the correlation search examines for a match using a where
clause. The search applies the filter before applying statistics.
The Excessive Failed Logins search by default does not include any where
clause filters, but you can add one if you want to focus on failed logins for
specific hosts, users, or authentication types.
The search preview shows you if the correlation search string can be parsed.
The search string appends filter commands as you type them, letting you see if
the filter command is a valid where clause. You can run the search to see if it
returns the results that you expect. If the where clause filters on a data model
dataset such as Authentication.dest, enclose the data model dataset with
single quotes. For example, a where clause that excludes authentication events
where the destination is local host would look as follows: | where
'Authentication.dest'!="127.0.0.1".
Analyze your data with statistical aggregates

Each aggregate is a function that applies to a specific attribute in a data model
or field in a lookup file. Use the aggregates to identify the statistics that are
relevant to your use case.

For example, the Excessive Failed Logins correlation search uses four
statistical aggregate functions to surface the important data points needed to
define alerting thresholds. For this search, the aggregates identify the
authentication tags associated with the events, a distinct count of users, a
distinct count of destination devices, and a count of the matching events.

First, add an aggregate that applies the values function to the
Authentication.tag field, with an alias of tag. This aggregate retrieves all the
values for the Authentication.tag dataset.

Next, add an aggregate for the user count:

1. Select the dc function from the Function list.
2. Select Authentication.user from the Field list.
3. Type user_count in the Alias field.

This aggregate retrieves a distinct count of users attempting to log in.

Then add an aggregate that applies the dc function to the Authentication.dest
field, with an alias of dest_count. This aggregate retrieves a distinct count of
devices that are the destination of authentication activities. Finally, add an
aggregate that applies the count function to count the matching events.
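Taken together, the four aggregates correspond to the statistical portion of the generated tstats search, similar to the following sketch:

| tstats summariesonly=true values(Authentication.tag) as tag, dc(Authentication.user) as user_count, dc(Authentication.dest) as dest_count, count from datamodel=Authentication where nodename=Authentication.Failed_Authentication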
Fields to split by
Identify the fields that you want to split the aggregate results by. Split-by fields
define the fields that you want to group the aggregate results by. For example,
you care more about excessive failed logins if the users were logging into the
same application and from the same source. In order to get more specific notable
events and to avoid over-alerting, define split-by fields for the aggregate search
results.
Split the aggregates by application and source: add Authentication.app and
Authentication.src as the split-by fields.
You can find information on split-by fields in the Splunk platform documentation.
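With the split-by fields defined, the fields appear in a by clause at the end of the tstats command. A minimal sketch showing only the count aggregate:

| tstats summariesonly=true count from datamodel=Authentication where nodename=Authentication.Failed_Authentication by Authentication.app, Authentication.src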
Identify the criteria that define a match for the correlation search. The correlation
search performs an action when the search results match predefined conditions.
Define the statistical function to use to look for a match.
For Excessive Failed Logins, when a specific user has six or more failed logins
from the same source while attempting to log in to the same application, the
correlation search identifies a match and takes action.
1. In the Field list, select count. The Field list is populated by the attributes
used in the aggregates and the fields used in the split-by.
2. In the Comparator list, select Greater than or equal to.
3. In the Value field, type 6.
4. Click Next.
The guided mode wizard ensures that your search string parses and produces
events. You can run the search to see if it returns the preliminary results that you
expect.
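Assembled from the data source, aggregates, split-by fields, and match condition, the generated search string is similar to the following sketch. The rename of the split-by fields to app and src matches the field names used later for throttling and for the notable event tokens; the exact quoting, ordering, and presence of the rename in the editor can differ:

| tstats summariesonly=true values(Authentication.tag) as tag, dc(Authentication.user) as user_count, dc(Authentication.dest) as dest_count, count from datamodel=Authentication where nodename=Authentication.Failed_Authentication by Authentication.app, Authentication.src | rename "Authentication.app" as app, "Authentication.src" as src | where count>=6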
1. Open a new tab in your browser and navigate to the Splunk platform
Search page.
2. Run the correlation search to validate that it produces events that match
your expectations.
1. If your search does not parse, but parsed successfully on the
filtering step, return to the correlation search guided editor
aggregates and split-bys to identify errors.
2. If your search parses but does not produce events that match your
expectations, adjust the elements of your search as needed.
3. After you validate your search string on the search page, return to the
guided search editor and click Done to return to the correlation search
editor.
Next Step
Configure a schedule for the correlation search
Because excessive failed logins matter most when you learn about them quickly,
select a real-time schedule for the search. If you care more about identifying all
excessive failed logins in your environment, you can select a continuous
schedule for the search instead.
Optionally, you can set a schedule window and a schedule priority for the search.
The schedule priority setting overrides the schedule window setting, so you do
not need to set both.
When many scheduled reports are set to run at the same time, specify a
schedule window to allow the search scheduler to delay running this search in
favor of higher-priority searches. When detecting excessive failed logins, time
matters, but other searches are more important, so use the automatic setting to
rely on the search scheduler.
If this search is more important to run and see results from than other searches,
you can change the schedule priority to "Higher" or "Highest" instead of the
default. Detecting excessive failed logins is a priority, but not higher than other
potential security incidents.
Define trigger conditions for the alerts
You can choose to trigger an alert based on a number of factors associated with
the search. By default, the trigger conditions are set to alert you when the
number of results is greater than zero. For this search, leave the default value.
Set up throttling to limit how many alerts the search generates:

1. Type a Window Duration of 1 and select day(s) from the drop-down list
to throttle alerts to 1 per day.
2. Type app and src as Fields to group by. Select the same fields here
that you used to split the aggregates by.
This means that no matter how many Excessive Failed Logins correlation search
matches there are in one day that contain the same app and source field values,
only one alert is created.
Next Step
Part 5: Choose available adaptive response actions for the correlation search.
The Excessive Failed Logins search creates a notable event alerting security
analysts to the fact that a host has a large number of failed logins, and modifies
the risk score of the host by 60 to ensure that analysts are able to identify that it
is a host that people are attempting (and failing) to log in to.
1. Click Add New Response Action and select Notable to add a notable
event.
2. Type a Title of Excessive Failed Logins - Tutorial.
3. Type a Description of The system $src$ has failed $app$
authentication $count$ times using $user_count$ username(s)
against $dest_count$ target(s) in the last hour.
4. Select a security domain of Access.
5. Select a Severity of medium.
6. Leave the Default Owner and Default Status set to system default.
7. Type a Drill-down name of View all login failures by system $src$ for
the application $app$.
8. Type a Drill-down search of
| from datamodel:"Authentication"."Failed_Authentication" | search src="$src$" app="$app$"
This search shows the contributing events for the notable event. An
example with the drill-down tokens filled in appears after these steps.
9. Type a Drill-down earliest offset of $info_min_time$ to match the
earliest time of the search.
10. Type a Drill-down latest offset of $info_max_time$ to match the latest
time of the search.
11. (Optional) Add Investigation Profiles that apply to the notable event.
For example, add an investigation profile that fits a use case of "Malware"
to malware-related notable events.
12. Add the src, dest, dvc, and orig_host fields in Asset Extraction to add
the values of those fields to the investigation workbench as artifacts when
the notable event is added to an investigation.
13. Type the src_user and user fields in Identity Extraction to add the values
of those fields to the investigation workbench as artifacts when the notable
event is added to an investigation.
14. (Optional) Add Next Steps for an analyst to take when triaging this
notable event. For example, Review user activity on the Identity
Investigator dashboard.
15. (Optional) Add Recommended Actions for an analyst to run when
triaging this notable event.
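When the alert triggers, Splunk Enterprise Security substitutes field values from the matching result for the $...$ tokens in the title, description, and drill-down search. For example, assuming a hypothetical match where src is 10.11.36.20 and app is sshd, the drill-down search from step 8 runs as:

| from datamodel:"Authentication"."Failed_Authentication" | search src="10.11.36.20" app="sshd"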
Create a second response action to increase the risk score of the system on
which the failed logins occurred.

1. Click Add New Response Action and select Risk Analysis.
2. Type a Risk Score of 60.
3. Type a Risk Object Field of src.
4. Select a Risk Object Type of System.
Next Step
Data sources

For information on choosing a data model as a data source, see What data
models are included in the Splunk Add-on for Common Information Model.
For information about choosing a lookup table as a data source, see Introduction
to lookup configuration in the Knowledge Manager Manual and Create and
manage lookups in Splunk Enterprise Security in Use Splunk Enterprise Security.
Statistical functions

• For Splunk Enterprise, see Use the stats command and functions in the
Splunk Enterprise Search Manual.
• For Splunk Cloud, see Use the stats command and functions in the Splunk
Cloud Search Manual.
Time ranges

• For Splunk Enterprise, see Specify time modifiers in your search in the
Splunk Enterprise Search Manual.
• For Splunk Cloud, see Specify time modifiers in your search in the Splunk
Cloud Search Manual.
Search scheduling
For information on real-time and continuous search scheduling, see the Splunk
platform documentation.
Alerting conditions
• For Splunk Enterprise, see Configure alert trigger conditions in the Splunk
Enterprise Alerting Manual.
• For Splunk Cloud, see Configure alert trigger conditions in the Splunk
Cloud Alerting Manual.
Notable event fields

For details about how to make sure that additional fields appear in the notable
event details, see Change notable event fields in Administer Splunk Enterprise
Security.