Setting Up Splunk for Log Management on Kali Linux for Your Cybersecurity Home Lab

IritT
19 min read · Nov 28, 2024


Splunk is not just a tool for managing logs — it’s a gateway to understanding real-world cybersecurity scenarios. In a cybersecurity lab, Splunk helps you simulate threats, monitor suspicious activities, and analyze logs just like professionals do in enterprise environments. With Splunk, you can detect failed login attempts, track system events, and build actionable insights that enhance your skills in security monitoring.

In a cybersecurity lab, Splunk plays a critical role in simulating real-world scenarios. It allows you to:

  1. Detect and respond to potential threats.
  2. Automate log monitoring and analysis.
  3. Gain hands-on experience with tools and techniques used in enterprise environments.

Whether you’re monitoring failed login attempts, analyzing system performance, or investigating security incidents, Splunk provides the tools to centralize and make sense of all your data. This guide will walk you through installing, configuring, and using Splunk on Kali Linux, step by step, as part of your cybersecurity lab.

Step 1: Downloading Splunk

  1. Go to the Splunk official website.
  2. Navigate to the Products section and click Free Trials & Downloads.
  3. Scroll down to the Splunk Enterprise section and click Get My Free Trial.
  4. Fill out the registration form with your email, name, and job title to access the download.
  5. Once registered, choose the .deb package for Linux.

Step 2: Downloading the Splunk Package on Kali Linux

  1. Open the terminal on your Kali Linux machine.
  2. Use the following wget command to download the Splunk package, then verify the download:

wget -O splunk-9.3.1-0b8d769cb912-linux-2.6-amd64.deb "https://download.splunk.com/products/splunk/releases/9.3.1/linux/splunk-9.3.1-0b8d769cb912-linux-2.6-amd64.deb"
ls

3. Change to the root user:

sudo su

4. Make the downloaded file executable and check file permissions to confirm the change:

chmod +x splunk-9.3.1-0b8d769cb912-linux-2.6-amd64.deb
ls -la
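Beyond listing the file with ls, you can also verify the package’s integrity against the SHA-512 hash linked next to each package on Splunk’s download page. A minimal sketch — the expected hash below is a placeholder you would paste from the site:

```shell
# Sketch: compare a file's SHA-512 digest against an expected value.
# Splunk's download page links the published hash next to each package;
# the value in the commented usage line is a placeholder, not a real hash.
verify_sha512() {
  local file="$1" expected="$2"
  local actual
  actual=$(sha512sum "$file" | awk '{print $1}')
  if [ "$actual" = "$expected" ]; then
    echo "checksum OK"
  else
    echo "checksum MISMATCH"
  fi
}

# Usage (replace the second argument with the published hash):
# verify_sha512 splunk-9.3.1-0b8d769cb912-linux-2.6-amd64.deb "<published-sha512>"
```

A mismatch usually means a corrupted or tampered download, so re-download before installing.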

Step 3: Installing Splunk

Use the dpkg -i command to install the .deb package, which unpacks and sets up Splunk on the system.

dpkg -i splunk-9.3.1-0b8d769cb912-linux-2.6-amd64.deb 

Step 4: Starting Splunk

After installation, navigate to the Splunk installation directory (/opt/splunk/bin), where Splunk’s main executable files are located. Then start Splunk for the first time and accept the license agreement:

cd /opt/splunk/bin
./splunk start --accept-license

You’ll be prompted to create an administrator username and password. Use these credentials to manage your Splunk instance.

Username: admin

Password: Create a secure password.

During this first start, Splunk installs and validates its components and also generates cryptographic keys and certificates to ensure its communications are encrypted and secure.

Splunk has been successfully installed, all prerequisites have been validated, and the Splunk web interface is now available at http://127.0.0.1:8000. You can open a browser on your Kali Linux machine and access this URL to complete further configurations.
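Optionally, you can have Splunk start automatically whenever the lab machine boots. A sketch using Splunk’s boot-start command (run as root; assumes the default /opt/splunk install path used in this guide):

```shell
# Install an init script so Splunk starts at boot.
# -user sets the account Splunk runs under (root, matching this lab setup).
/opt/splunk/bin/splunk enable boot-start -user root

# To undo it later:
# /opt/splunk/bin/splunk disable boot-start
```

This saves you from manually running ./splunk start every time you power on the lab VM.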

Managing Splunk Services

Here are some useful commands for managing the Splunk service:

  1. Getting Help
sudo /opt/splunk/bin/splunk help

2. Starting Splunk

sudo /opt/splunk/bin/splunk start

3. Stopping Splunk

sudo /opt/splunk/bin/splunk stop

4. Restarting Splunk

sudo /opt/splunk/bin/splunk restart

5. Checking Splunk Status

sudo /opt/splunk/bin/splunk status

6. Adding a Single Event (Oneshot) — Adds one specific event to Splunk’s index (useful for testing).

This command uploads the content of a specific log file (at /path/to/logfile) into Splunk for one-time indexing. It means Splunk will process and analyze the file’s current data, but it will not track future changes to the file. This is useful for analyzing static log files or performing a one-off investigation.

sudo /opt/splunk/bin/splunk add oneshot /path/to/logfile

7. Searching Logs

sudo /opt/splunk/bin/splunk search "error"

8. Searching within a specific index in Splunk

sudo /opt/splunk/bin/splunk search "index=_internal error"

9. Searching logs within a specific time range in Splunk

sudo /opt/splunk/bin/splunk search "error earliest=-15m latest=now"

10. Get a list of all indexes in Splunk

sudo /opt/splunk/bin/splunk list index

Step 5: Adding a Desktop Shortcut for Splunk (optional)

  1. Navigate to the Desktop directory:
cd ~/Desktop

2. Create a new shortcut file using nano:

nano splunk.desktop

3. Add the following content to the file:

[Desktop Entry]
Name=Splunk
Comment=Launch Splunk Web Interface
Exec=xdg-open http://kali:8000
Icon=/opt/splunk/share/splunk/search_mrsparkle/exposed/img/splunk.ico
Terminal=false
Type=Application
Categories=Network;

Name: Sets the name for the shortcut (Splunk).

Exec: Command to open Splunk’s web interface (http://kali:8000).

Icon: Path to the Splunk icon (splunk.ico).

Terminal: false, indicating that this shortcut does not require a terminal.

Type: Defines it as an application.

Categories: Specifies the category (Network).

4. Save and exit nano (Ctrl + O, Enter, Ctrl + X).

5. Make the shortcut executable:

sudo chmod +x splunk.desktop

6. Right-click the desktop icon and select Allow Launching.

This option permits the shortcut to be launched directly from the Desktop.

Launching Splunk interface

Log In: Use the username admin and the password we set during the installation.

This will take us to the Splunk home page, where you can start configuring data sources, creating dashboards, and exploring other features.

Alternatively, open the web interface directly in a browser, as described in Step 6.

Step 6: Accessing the Splunk Web Interface

  1. After starting Splunk, it will display the URL for the web interface, typically: http://127.0.0.1:8000

2. Open a web browser on your Kali Linux machine and enter the URL.

http://127.0.0.1:8000

3. Log in using the administrator username and password you just created.

4. After accessing the Splunk interface, you are now on the Splunk Home page. This page provides an overview and quick links to essential Splunk tools and apps, such as Search & Reporting, Splunk Secure Gateway, and Upgrade Readiness App.

5. You may see a security risk warning within the Splunk Web interface. The issue pertains to an empty allowedDomainList in the alert_actions.conf configuration file. This means that no restrictions are currently in place on which email domains can be used for sending email alerts, posing a security risk.

6. In a Kali terminal window, edit the alert_actions.conf file.

Without allowedDomainList, users can send email alerts to any domain, which can be exploited for spam or phishing.

Specifying domains ensures email alerts are restricted to trusted domains.

nano /opt/splunk/etc/system/local/alert_actions.conf

7. Add or update the [email] stanza to include allowedDomainList:

[email]
allowedDomainList = yourcompany.com, example.com

8. Save and exit nano

9. Restart Splunk:

/opt/splunk/bin/splunk restart
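After the restart, you can confirm the setting took effect with btool, Splunk’s configuration-inspection utility. A sketch, assuming the default /opt/splunk path:

```shell
# Print the effective, merged [email] configuration and check the new key.
/opt/splunk/bin/splunk btool alert_actions list email | grep allowedDomainList
```

If the grep returns your domain list, the configuration file was read correctly.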

Step 7: Configuring Splunk to Monitor Logs

The “Add Data” feature in Splunk brings data from a wide range of sources (like files, directories, network protocols, and cloud services) into Splunk for indexing, which organizes the data; searching, which allows users to query the data; and analysis, which helps in visualizing trends, identifying issues, and monitoring in real-time.

  1. In the Splunk web interface, go to Settings > Add Data.

2. A welcome prompt asks if we would like to take a quick tour of the data onboarding process. This message appears the first time we access the Add Data page.

Options Available:

Skip: We can skip the tour if we are already familiar with adding data in Splunk.

Continue to Tour: Selecting this will guide us through the steps for adding data sources to Splunk, explaining different options for data onboarding.

3. Next, Splunk presents the different options for adding data to the platform. It highlights three main methods for bringing data into Splunk:

Upload: This option allows us to upload files directly from our computer into Splunk. It’s commonly used for structured files, like CSVs or log files, which we want to analyze without setting up continuous monitoring.

Upload is suitable for one-time data imports or non-continuous log files.

Monitor: The Monitor option enables us to monitor files and ports on the Splunk instance continuously. This means that Splunk will ingest new data in real-time as it appears in the specified files or on the designated ports.

Common sources for monitoring include log files, HTTP traffic, TCP/UDP ports, and scripts that generate data.

Monitor is ideal for real-time data sources, like system or application logs, that you want to track continuously.

Forward: The Forward option allows us to receive data from other Splunk forwarders, which are lightweight agents that collect data from remote systems and forward it to the main Splunk instance.

Forward is useful for distributed environments where data is gathered from multiple machines or devices and sent to a central Splunk instance for analysis.

Select Monitor to continuously monitor log files or directories.

4. Choose Files & Directories and specify the directory path to monitor (/var/log/).

This is a common directory for log files on Linux systems, including system logs, application logs, and security logs. By monitoring this directory, Splunk will ingest and index data from all log files in /var/log.
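The same input can also be created from the terminal instead of the web interface. A sketch using Splunk’s add monitor command (assumes the default /opt/splunk path; the -index flag is optional):

```shell
# Add /var/log as a continuously monitored input, then confirm it exists.
sudo /opt/splunk/bin/splunk add monitor /var/log -index main
sudo /opt/splunk/bin/splunk list monitor
```

This is handy for scripting the lab setup instead of clicking through the Add Data wizard.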

5. Click Next to proceed to the Input Settings step, where you can configure the source type, set permissions, and apply indexing settings:

Source Type: The Source Type field allows us to specify what kind of data we are indexing. It’s an important setting because it helps Splunk understand and correctly parse the incoming data.

Automatic: Splunk will automatically detect the data type and apply a source type based on the content.

Select: You can choose from predefined source types in Splunk, such as syslog, access_combined, etc., depending on the type of log data.

New: If none of the existing source types fit, we can create a custom source type to specify how Splunk should interpret the data.

Specifying the correct source type is essential for data formatting, field extraction, and search optimization.

App Context: App Context determines the application context in which this data input will be available. In Splunk, we can organize data inputs, configurations, and searches by “apps” or “contexts.”

Host: The Host field assigns a “host” identifier to each event in the data. This value usually represents the machine name or the source from which the data originated.

There are three options for setting the host value:

Constant value: The host value will remain fixed, as specified in the Host field value (in this case, “kali”).

Regular expression on path: Allows us to extract the host value dynamically based on a regular expression applied to the file path.

Segment in path: Uses a specific segment of the file path as the host value, which is useful if we have organized log files by hostnames in directories.

In this configuration, Constant value is selected with kali as the host field value. This means that each event indexed from this data source will have “kali” set as its host, making it easy to identify which machine generated the data.

At the top, the progress bar shows that you’re on the Input Settings step. The next steps are Review and Done.

6. Review your settings and click Submit to start monitoring.

Splunk will start monitoring /var/log and indexing data in real time based on the configurations you’ve set.

7. Next, you will see a message “File input has been created successfully” indicating that Splunk has successfully set up the monitoring of our specified directory (/var/log) and is ready to start indexing data from it.

Next Steps Options

Start Searching: Clicking on Start Searching will take us to the Search & Reporting app in Splunk, where we can begin searching, filtering, and analyzing the data from the new data input.

This is typically the next step after setting up a data input, as it allows us to verify that data is being indexed correctly and is available for queries.

Add More Data: If we have additional data sources to configure, we can select Add More Data to repeat the process for another input source.

Download Apps: This option allows us to install apps from Splunk’s app marketplace. Apps provide pre-built dashboards, reports, and tools tailored for specific use cases or data sources (security monitoring, IT operations).

Build Dashboards: We can proceed to Build Dashboards to visualize our data, create real-time monitoring panels, or design custom views for specific metrics or logs.

For now, click on Start Searching to verify that data from /var/log is being ingested correctly. In the Search & Reporting app, we can use Splunk’s search language to filter, analyze, and visualize the log data.

8. Welcome Tour Prompt

Welcome, Administrator: This prompt asks if we would like to take a tour of the Search & Reporting app.

Skip: Will bypass the tour if you’re already familiar with the interface.

Continue to Tour: This will guide us through the main features and functions of the Search & Reporting app, which is helpful if we are new to Splunk.

Select Skip tour

We are now at the Search page in Splunk, the main interface for querying, filtering, and analyzing data. From here, we can:

Enter search queries using SPL.

Choose time ranges and sampling to focus on specific datasets.

Access resources to learn more about search capabilities.

Create Table Views for an easier, visual approach to data analysis.

This page is central to Splunk’s functionality, allowing users to explore their indexed data in depth, run analytics, and generate insights in real time.

Key Components of the Search Page

  1. Search Bar (top center): This is where we enter search queries to retrieve and analyze data. We can use Splunk’s Search Processing Language (SPL) to create queries and filter specific data, generate statistics, or create visualizations.

For example, a basic search query like index=_internal would search the internal index where Splunk stores its logs.

2. Time Range Selector (next to the search bar): The Last 24 hours dropdown allows us to specify the time range for our search. We can choose pre-set ranges (like Last 24 hours, Last 7 days) or define a custom time range.

Filtering by time helps us focus on relevant data and improves search efficiency.

3. Event Sampling (left of the search bar): Event Sampling allows us to sample a percentage of the total data instead of all available data. This can be useful for quickly reviewing large datasets without waiting for a full search.

Options include sampling percentages like 1 in 100 or 1 in 10 events.

4. Search History: The Search History section (expandable) shows us recent search queries, allowing us to re-run or modify them easily.

This is helpful for tracking queries we’ve recently executed, making it easy to revisit or refine previous searches.

5. How to Search: This section provides links to resources to help us get started with searching in Splunk, including:

Documentation: Splunk’s official documentation for search commands and syntax.

Tutorial: A guided tutorial for using the search feature.

Data Summary: A summary of data sources available in Splunk, helping you understand the data you’re working with.

6. Analyze Your Data with Table Views: The Table Views section on the right provides an alternative to SPL for analyzing data. You can use a point-and-click interface to create and modify data tables.

Create Table View: Clicking this button allows us to create custom table views, useful for organizing data visually without writing SPL commands.

7. Dark Mode Introduction Banner (bottom):

The banner at the bottom highlights the dark mode feature in the Search & Reporting app. It provides instructions to enable dark mode by going to Profile > Preferences > Theme.

Step 8: Searching Logs in Splunk

  1. Use the search bar to query logs:
source="/var/log/*" host="kali"

After running the query, the results are displayed in the following format:

Search Summary: A summary of the events that match the specified query criteria.

Time Range: The query retrieves events within the Last 24 hours.

Event Results: Each row represents an individual log event from the /var/log/ directory.

Key fields:

_time: Shows the timestamp of each event, helping you see when each log entry was created.

Event: Shows the specific log message text for each entry. For instance, you see events like:

status installed splunk:amd64 9.3.1

status half-configured splunk:amd64

These messages indicate package management actions, likely showing Splunk’s installation steps.

host: Identified as “kali,” the host that generated the log.

source: Indicates the specific log file in /var/log/ where the event originates (here, /var/log/dpkg.log).

sourcetype: Indicates the type of log file, such as “dpkg” for package management logs.

Field Sidebar: On the left, we see Selected Fields and Interesting Fields:

Selected Fields include host, source, and sourcetype, which are relevant for all events in this search.

Interesting Fields like date_hour, date_minute, index, and linecount offer more details that could be useful for further analysis, especially in narrowing down specific event attributes.

Timeline and Pagination: A Timeline visualization shows the distribution of events over time, allowing us to see peaks or patterns in log entries.

Pagination allows us to navigate through multiple pages of events, making it easier to go through all matching entries.

Events can be expanded to reveal detailed information about their fields and values, which is helpful for:

Detailed Analysis: Understanding the context and metadata of specific events, such as timestamps, sources, and types.

Field Exploration: Identifying useful fields for further filtering or analysis. For example, we may want to create a filter based on the sourcetype or specific timestamp ranges.

Actionable Insights: Adding significant events to dashboards or creating alerts if they represent critical system changes or errors.

This detailed view provides a comprehensive understanding of each event, allowing us to analyze logs deeply for troubleshooting or monitoring purposes.

Creating Alerts in Splunk: Monitoring Failed Authentication Attempts

Alerts in Splunk automate log monitoring by running searches on a schedule and notifying you when specific conditions are met.

Why Create Alerts:

Security Monitoring: Alerts provide real-time notification of suspicious activities.

Proactive Response: By automating this process, you can act quickly on potential security threats.

Efficiency: Automated monitoring reduces the need for manual log analysis.

Step 1: Create a Search Query

  1. Open the Splunk web interface.
  2. Navigate to the Search & Reporting app.
  3. In the search bar, enter the following query to look for failed authentication events:

source="/var/log/*" host="kali" "failed authentication"

  4. Set the Time Range to Last 24 hours to focus on recent events.

source="/var/log/*": Specifies the log files to search in the /var/log/ directory.

host="kali": Limits the search to logs generated by the “kali” host.

"failed authentication": Searches for events containing this specific phrase, indicating login failures.

This query helps detect recent failed login attempts, which could indicate unauthorized access or brute-force attempts.
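To see the kind of raw lines this query matches, you can reproduce the detection outside Splunk with grep. A small sketch using a made-up auth-log snippet (the entries below are fabricated examples):

```shell
# Build a tiny sample log, then count the failed-login lines the way
# the Splunk query would match them.
cat > /tmp/sample_auth.log <<'EOF'
Nov 28 10:01:02 kali sshd[1201]: Failed password for invalid user admin from 192.0.2.10 port 50314 ssh2
Nov 28 10:01:05 kali sshd[1201]: Failed password for root from 192.0.2.10 port 50316 ssh2
Nov 28 10:02:11 kali sshd[1210]: Accepted password for irit from 192.0.2.20 port 50400 ssh2
EOF

grep -ci "failed password" /tmp/sample_auth.log   # prints 2
```

Splunk automates exactly this kind of matching on a schedule, which is what the alert in the next step does.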

Step 2: Save the Query as an Alert

  1. Once you’ve confirmed that the query retrieves relevant results, go to the Save As dropdown (above the search bar) and select Alert.

2. Splunk will prompt you to configure the alert settings.

Step 3: Configure the Alert

In the Alert Settings section, fill out the following fields:

  1. Title: Enter a descriptive name, such as “Failed Authentication Attempts”.
  2. Description: Optionally, add a note like “Login attempts failed”.
  3. Permissions: Choose between:

Private: Only you can view this alert.

Shared in App: Make it accessible to other users within the same Splunk app.

4. Alert Type:
Select one of the following:

Scheduled: Run the alert on a recurring schedule (daily or weekly).

Real-time: Trigger the alert instantly when conditions are met.

Frequency (for Scheduled Alerts):
Specify how often the alert runs. For example:

Run every day at 6:00 AM to get daily updates.

5. Expires:
Set an expiration period if the alert is temporary or for testing purposes (optional).

In the Trigger Conditions section, specify when the alert should activate:

6. Trigger Alert When:

Choose Number of Results.

Set the condition to greater than 0 to trigger the alert if any failed authentication events are detected.

Higher Sensitivity:

Keep the threshold at greater than 0 if you want to be notified of every failed attempt. Use this for high-security systems.

Lower Sensitivity:

Set greater than 5 if you want to reduce false positives, focusing only on repeated failed attempts. Use this for systems with high user activity.

If you’re monitoring a public-facing server, set a low threshold (2–3) to quickly detect brute-force attempts.

For internal systems, a higher threshold (5–10) may be more appropriate to avoid frequent alerts.

7. Trigger Frequency:

Once: Trigger the alert once when conditions are met.

For Each Result: Trigger the alert for every matching log event (useful for detailed tracking).
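The threshold logic above can be sketched at the shell level: count failed attempts per source IP and flag only IPs that cross the threshold. The log entries are fabricated, and the threshold of 2 mirrors the “greater than” condition:

```shell
# Count failed attempts per source IP and report IPs above a threshold.
cat > /tmp/auth_sample.log <<'EOF'
Failed password for root from 203.0.113.5 port 4001 ssh2
Failed password for root from 203.0.113.5 port 4002 ssh2
Failed password for root from 203.0.113.5 port 4003 ssh2
Failed password for admin from 198.51.100.7 port 4004 ssh2
EOF

awk '/Failed password/ { for (i = 1; i <= NF; i++) if ($i == "from") ip[$(i+1)]++ }
     END { for (a in ip) if (ip[a] > 2) print a, ip[a] }' /tmp/auth_sample.log
# prints: 203.0.113.5 3
```

Only the repeat offender crosses the threshold, which is exactly how a higher trigger condition suppresses one-off failures.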

Step 4: Save the Alert

  1. Review the alert configuration and click Save.
  2. The alert titled “Failed Authentication Attempts” is now active.

Step 5: Test and Monitor the Alert

  1. Simulate a failed authentication attempt from Kali to Windows (enter incorrect SSH credentials):

ssh invalid_user@<Target_IP>
ssh invalid_user@192.168.19.12

  2. Wait for the alert to trigger based on the conditions you set.
  3. Verify that the alert notifies you via the specified method (email, dashboard notification, etc.).

If you are also ingesting Windows Event Logs, related searches include:

index=* sourcetype="WinEventLog:Security"
index=* sourcetype="WinEventLog:Security" EventCode=4625
index=main sourcetype=syslog error OR failed

Step 6: How to Respond to a Failed Authentication Alert

When your alert triggers, you can take several actions to investigate and mitigate the issue:

  1. Review the Details of the Alert:

Use Splunk’s Search & Reporting app to analyze the logs. Look for patterns, such as repeated failed attempts from a single IP address or a specific time frame.

source="/var/log/*" host="kali" "failed authentication"

2. Block Suspicious IP Addresses:

If you identify a malicious IP address, use your firewall (pfSense) to block it.

In pfSense:

Navigate to Firewall > Rules.

Add a rule to block the suspicious IP.

Apply the changes.

You can also create a dedicated alert for failed SSH password attempts with a query such as:

source="/var/log/*" host="kali" password failed

Creating a Custom Source Type

If Splunk doesn’t automatically recognize the structure of your logs, you can create a custom source type to define how the data should be interpreted.

Custom source types improve the accuracy of field extractions, making it easier to analyze logs and create meaningful alerts.

  1. Navigate to Settings > Source Types in the Splunk web interface.
  2. Click New Source Type and follow these steps:

Name: Give your source type a meaningful name, such as custom_authentication_logs.

Preview: Upload a sample log file to view how Splunk parses the data.

Timestamp Extraction: Define the field containing the timestamp and specify its format (%Y-%m-%d %H:%M:%S).

Field Extraction: Use regular expressions to extract custom fields like username, IP address, or error message.

3. Save the source type and assign it to the logs in the Add Data workflow.
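As a rough illustration of what regex-based field extraction does, here is the same idea at the shell level with grep -oP; the log line and field names are made up for the example:

```shell
# Extract "fields" from a sample log line with Perl-compatible regexes.
# \K discards everything matched so far, leaving only the captured value.
line='2024-11-28 10:01:02 auth failure user=admin src=192.0.2.10'
echo "$line" | grep -oP 'user=\K\S+'    # prints: admin
echo "$line" | grep -oP 'src=\K[\d.]+'  # prints: 192.0.2.10
```

A custom source type applies the same kind of patterns automatically at index or search time, so fields like username and source IP become directly searchable.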

By setting up these alerts, you automate the monitoring of critical security events, enabling real-time detection and response to potential threats.

Final Tips to Maximize Splunk’s Potential

  1. Create daily reports summarizing key activities, such as failed login attempts or system errors.
  2. Build dashboards for real-time monitoring of your environment. Include charts, alerts, and log summaries.
  3. Explore integrating additional data sources, such as network traffic logs, to gain a broader view of your system’s security.
  4. Use Splunk’s search language (SPL) to create advanced queries, such as identifying specific attack patterns or correlating events across multiple logs.

By incorporating these practices, you’ll develop a deeper understanding of Splunk’s capabilities and strengthen your cybersecurity skills.
