Eyeglass Solutions Publication

ElasticSearch SIEM Zero Trust alert integration


Overview

Elastic SIEM (Security Information and Event Management) is a powerful security platform built on top of the Elastic Stack, which includes Elasticsearch, Logstash, and Kibana. It provides a centralized way to collect, normalize, and analyze security-related data from various sources, offering real-time threat detection and response capabilities.


Support Statement

  1. NOTE: This documentation is provided "as is" without support for 3rd party software. The level of support for this integration guide is best effort, with no SLA on response time. Superna cannot provide 3rd party product support directly; 3rd party components require their own support contracts.

Limitations

  1. None

Solution Overview

Superna Defender Zero Trust API receives webhook alerts and parses the key data into HTTP API payload events that are sent to the ElasticSearch SIEM endpoint URL.  ElasticSearch SIEM is a modular architecture that provides real-time visibility into your IT infrastructure, which you can use for threat detection and prioritization.  The ElasticSearch solution uses the Custom HTTP Endpoint Logs integration.


Advanced Zero Trust Capabilities

  1. Webhook to native HTTP endpoint with json custom payload

What is ElasticSearch SIEM?

Elastic SIEM (Security Information and Event Management) is a powerful security platform built on top of the Elastic Stack, which includes Elasticsearch, Logstash, and Kibana. It provides a centralized way to collect, normalize, and analyze security-related data from various sources, offering real-time threat detection and response capabilities.


Integration Architecture



Solution Configuration in ElasticSearch SIEM and Data Security Edition Zero Trust

Prerequisites

  1. Installed Security Edition
  2. Eyeglass OS appliance version 15.5
    1. cat /etc/os-release
  3. License key for the Zero Trust API 
  4. ElasticSearch SIEM


Configuration in ElasticSearch SIEM

Follow the steps below to create a collector endpoint.

  1. Log in to create the HTTP custom collector
  2. Select Project settings and Integrations
  3. Select Custom HTTP endpoint logs
  4. Click the install option and follow the guide to install an agent. 
    1. https://www.elastic.co/docs/reference/fleet/install-fleet-managed-elastic-agent
    2. Deploy and register the agent on an on-premise host that has IP reachability to the Eyeglass VM, and open the firewall path Eyeglass --> TCP 8080 --> Elastic Agent host.  
    3. NOTE: Record the IP address of the Elastic Agent; it will be used in the Eyeglass integration code to send formatted JSON payload alerts to the Elastic Agent over port 8080
    4. Click Add Custom HTTP Endpoint Logs integration
    5. Change the listening address from localhost to 0.0.0.0 for the agent.
      1. Leave all other defaults to create the agent and HTTP policy
    6. The integration is complete once the agent is running.
    7. NOTE: The HTTP endpoint agent accepts HTTP on port 8080 with a JSON payload that is indexed as json.fieldname within the index. The Superna Zero Trust field names are indexed as follows:
      1. {
        "@timestamp": [
        "2025-06-12T19:20:52.293Z"
        ],
        "agent.ephemeral_id": [
        "866a659a-00e7-409c-942c-3407d17897a6"
        ],
        "agent.id": [
        "f5e19748-ba67-40b9-ae86-dc5646434d88"
        ],
        "agent.name": [
        "rapid7"
        ],
        "agent.name.text": [
        "rapid7"
        ],
        "agent.type": [
        "filebeat"
        ],
        "agent.version": [
        "9.0.2"
        ],
        "data_stream.dataset": [
        "http_endpoint.generic"
        ],
        "data_stream.namespace": [
        "default"
        ],
        "data_stream.type": [
        "logs"
        ],
        "ecs.version": [
        "8.0.0"
        ],
        "elastic_agent.id": [
        "f5e19748-ba67-40b9-ae86-dc5646434d88"
        ],
        "elastic_agent.snapshot": [
        false
        ],
        "elastic_agent.version": [
        "9.0.2"
        ],
        "event.agent_id_status": [
        "verified"
        ],
        "event.dataset": [
        "http_endpoint.generic"
        ],
        "event.ingested": [
        "2025-06-12T19:21:01.000Z"
        ],
        "event.module": [
        "http_endpoint"
        ],
        "input.type": [
        "http_endpoint"
        ],
        "json.action": [
        "Lockout | Comment | Comment | Comment | Comment | Comment | Comment | Comment | Comment | Comment | Comment | Comment"
        ],
        "json.Alert_url": [
        "https://172.31.1.102/rsw/alerts/25:24"
        ],
        "json.client_ip": [
        "172.31.1.45"
        ],
        "json.detected": [
        "Jan 8, 2025, 10:18:34 AM"
        ],
        "json.detected_time": [
        1742410571000
        ],
        "json.device_event_class_id": [
        "security"
        ],
        "json.device_product": [
        "Data Security Edition"
        ],
        "json.device_vendor": [
        "Superna"
        ],
        "json.device_version": [
        "V1"
        ],
        "json.event": [
        "25:24"
        ],
        "json.event_type": [
        "threat_detection"
        ],
        "json.files": [
        "\\\\production\\data\\ifs\\data\\dfs\\my data - Copy - Copy - Copy - Copy - Copy - Copy - Copy - Copy - Copy (2) - Copy - Copy.locky; \\\\production\\data\\ifs\\data\\dfs\\my data - Copy - Copy - Copy - Copy - Copy - Copy - Copy - Copy - Copy (11) - Copy.locky; \\\\production\\data\\ifs\\data\\dfs\\my data - Copy - Copy - Copy - Copy - Copy - Copy - Copy - Copy - Copy - Copy (5).locky; \\\\production\\data\\ifs\\data\\dfs\\my data - Copy - Copy - Copy - Copy - Copy - Copy - Copy - Copy - Copy (6) - Copy - Copy - Copy.locky; \\\\production\\data\\ifs\\data\\dfs\\my data - Copy - Copy - Copy - Copy - Copy - Copy - Copy - Copy - Copy (3) - Copy - Copy - Copy.locky; \\\\production\\data\\ifs\\data\\dfs\\my data - Copy - Copy - Copy - Copy - Copy - Copy - Copy - Copy - Copy (10) - Copy.locky; \\\\production\\data\\ifs\\data\\dfs\\my data - Copy - Copy - Copy - Copy - Copy - Copy - Copy - Copy - Copy (4) - Copy - Copy - Copy - Copy.locky; \\\\production\\data\\ifs\\data\\dfs\\my data - Copy - Copy - Copy - Copy - Copy - Copy - Copy - Copy - Copy (11) - Copy - Copy.locky; \\\\production\\data\\ifs\\data\\dfs\\my data - Copy - Copy - Copy - Copy - Copy - Copy - Copy - Copy - Copy (10) - Copy - Copy.locky; \\\\production\\data\\ifs\\data\\dfs\\my data - Copy - Copy - Copy - Copy - Copy - Copy - Copy - Copy - Copy (12) - Copy.locky"
        ],
        "json.nes": [
        "production"
        ],
        "json.protocol": [
        "SMB2"
        ],
        "json.severity": [
        "CRITICAL"
        ],
        "json.shares": [
        "marketing; igls-dfs-dfsprod; dfsprod; marketing"
        ],
        "json.state": [
        "LOCKED_OUT"
        ],
        "json.timestamp": [
        "2025-03-19T18:56:11.000Z"
        ],
        "json.user": [
        "demouser@adcto1.test"
        ],
        "json.version": [
        "1.0"
        ],
        "tags": [
        "forwarded"
        ],
        "_id": "AZdllgSvSrYFFhOni4Sm",
        "_ignored": [
        "json.files"
        ],
        "_index": ".ds-logs-http_endpoint.generic-default-2025.06.12-000001",
        "_score": null
        }
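The json.fieldname indexing above can be exercised directly. The sketch below, using only the standard library, builds a minimal alert payload in the same shape as the sample document and posts it to the agent's HTTP endpoint; the agent address is a placeholder, and `build_event`/`send_event` are illustrative helper names, not part of the product.

```python
import json
import urllib.request

# Placeholder address for illustration; substitute the Elastic Agent IP
# recorded during the integration install.
AGENT_URL = "http://192.0.2.10:8080/"

def build_event(user, severity, state):
    """Assemble a minimal alert payload; each top-level key is indexed
    by the agent as json.<fieldname> (compare the sample document above)."""
    return {
        "device_vendor": "Superna",
        "device_product": "Data Security Edition",
        "event_type": "threat_detection",
        "user": user,
        "severity": severity,
        "state": state,
    }

def send_event(event, url=AGENT_URL):
    """POST the event as JSON; the agent acknowledges accepted documents
    with HTTP 200."""
    req = urllib.request.Request(
        url,
        data=json.dumps(event).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Calling `send_event(build_event("demouser@adcto1.test", "CRITICAL", "LOCKED_OUT"))` from a host with reachability to the agent should return 200, and the event should then appear in the index under the json.* fields.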

         



Configuration Steps on Eyeglass Virtual Machine

High Level steps

  1. Create the Python location to run the application on the Eyeglass VM
  2. Create the Python main application script
  3. Create a Linux systemd service and set it to auto start
  4. Create the Zero Trust configuration in Data Security Edition
  5. Update the main script to customize it with the ElasticSearch SIEM Python code
  6. Test that the script is running as a service
  7. Create a test event in Defender to validate that the alerts appear as indexed, parsed events in ElasticSearch SIEM

Configuration Step by Step

Configure the service start and Python integration files
  1. Login to the Eyeglass VM via SSH as the admin user
    ssh admin@<your-vm-ip>

    # Become root
    sudo -s

    mkdir -p /opt/superna/cgi-bin

    chown -R sca:users /opt/superna/cgi-bin
    chmod -R u+rwX,g+rwX /opt/superna/cgi-bin

    # become SCA user

    sudo -u sca -s

    cd /opt/superna/cgi-bin

    python3 -m venv venv-elasticsearch

    source venv-elasticsearch/bin/activate

    pip install flask requests

    deactivate


    # Create required files
    touch elasticsearch.py
    touch elasticsearch.sh

    # Make scripts executable
    chmod +x elasticsearch.py
    chmod +x elasticsearch.sh

    # Create the elasticsearch.sh launch script
    nano /opt/superna/cgi-bin/elasticsearch.sh

    # paste the contents below into the file 

    #!/bin/bash
    export PATH="/opt/.pyenv/bin:$PATH"
    source /opt/superna/cgi-bin/venv-elasticsearch/bin/activate
    exec python /opt/superna/cgi-bin/elasticsearch.py

    # Make the launch script executable
    chmod +x /opt/superna/cgi-bin/elasticsearch.sh

    ## exit from being SCA user

    exit

    whoami 

    ### make sure you are the root user again for these steps


    # Create the systemd service unit file
    nano /etc/systemd/system/elasticsearch.service
     

    #Paste the contents below into the file

    [Unit]
    Description=Webhook listener for Zero Trust API translations and integrations
    After=network.target

    [Service]
    Type=simple
    User=sca
    Group=users
    WorkingDirectory=/opt/superna/cgi-bin
    ExecStart=/bin/bash /opt/superna/cgi-bin/elasticsearch.sh
    Restart=always
    RestartSec=5

    [Install]
    WantedBy=multi-user.target


    # Reload systemd to recognize the new service
    systemctl daemon-reload

    # Enable the service to start on boot (do NOT start it yet)
    systemctl enable elasticsearch


Configure the Python packages and customize the ElasticSearch SIEM integration Python code
  1. Customize the application code by downloading the Python code from this link
    1. Open the Python template file in a text editor. NOTE: make sure to only replace the values and do not delete any of the commas
    2. Locate this section in the file and replace the highlighted values to match your ElasticSearch SIEM endpoint URL. The endpoint URL is unique and can be copied from the console into the variable below.
      1. # ElasticSearch SIEM HTTP Collector Endpoint
        elastic_URL = "https://x.x.x.x:8080/"
  2. nano /opt/superna/cgi-bin/elasticsearch.py
  3. Open the downloaded template locally in a text editor (e.g., Windows Notepad) and select all the text (Ctrl+A), then copy it
  4. Paste the clipboard into the SSH terminal session with the file open in the nano editor
  5. Save the file
    1. Press Ctrl+X
    2. Answer yes to save and exit the nano editor 
  6. Start the service and verify it is running
    1. systemctl start elasticsearch
    2. systemctl status -l elasticsearch 
    3. Verify the service started successfully and reports "active (running)".
  7. If the service does not start, do not proceed; double-check that the steps above are completed. 
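The downloaded template is the authoritative code. For orientation only, below is a minimal sketch of the relay's likely shape, assuming a Flask listener on port 5000 that gunzips the webhook body and forwards the parsed alert with requests; `decode_webhook` is an illustrative helper name.

```python
# Minimal sketch of the relay, for orientation; the downloaded
# elasticsearch.py template is authoritative.
import gzip
import json

import requests
from flask import Flask, request

app = Flask(__name__)

# ElasticSearch SIEM HTTP Collector Endpoint (replace with your agent URL)
elastic_URL = "https://x.x.x.x:8080/"

def decode_webhook(raw, content_encoding=None):
    """Decompress a gzip-encoded webhook body and parse the JSON alert."""
    if content_encoding == "gzip":
        raw = gzip.decompress(raw)
    return json.loads(raw)

@app.route("/webhook", methods=["POST"])
def webhook():
    alert = decode_webhook(request.get_data(),
                           request.headers.get("Content-Encoding"))
    # Forward the parsed alert to the Elastic Agent HTTP endpoint.
    resp = requests.post(elastic_URL, json=alert, timeout=10)
    return {"forwarded": resp.status_code}, 200

if __name__ == "__main__":
    # The Zero Trust webhook configuration targets localhost:5000/webhook.
    app.run(host="127.0.0.1", port=5000)
```

The gzip handling matters because the webhook configuration in the next section sets a content-encoding header of gzip, so the listener must decompress before parsing.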

Configure Defender Zero Trust Webhooks

  1. The next step creates a Zero Trust Webhook URL.
    1. Configure the Zero Trust endpoint in the Ransomware Defender Zero Trust tab.
      1. Recommended configuration: only Critical and Major events, and only the webhooks that set lockout or delayed lockout. Customers can customize based on specific requirements. The goal is to send findings rather than a list of alarms that do not pinpoint a security incident.
      2. The endpoint URL will use localhost and will send webhooks to the application service listening on port 5000. URL to use in the configuration:
        1. http://localhost:5000/webhook 
        2. Add the Content-Type header with value application/json and the content-encoding header with value gzip as shown above to complete the webhook configuration.
        3. Click save to commit the configuration.
        4. Click save on the main Webhook configuration page.
  2. Test that the configuration is working by following the next section.

How to test the Integration with ElasticSearch SIEM 

  1. To test the integration follow these steps
  2. Prerequisites in ElasticSearch SIEM and Eyeglass:
    1. Get the IP address of the Eyeglass VM
    2. Download this curl command template, open it with a text editor, locate the Eyeglass IP address at the very end of the text, and replace it with the IP address of your Eyeglass VM.
    3. Copy all the text in the text editor
    4. SSH to the Eyeglass VM as the admin user
    5. Paste the entire CLI command text into the SSH prompt to send sample data to the running Zero Trust application. This sends test data directly to the application to be processed and sent to the ElasticSearch SIEM integration service running on the Eyeglass VM.
    6. The output of a successfully processed webhook test will return this text in the SSH terminal
      1. done sending event to ElasticSearch SIEM and check for http 200 and success count in response
    7. How to review the process logs from the web application
      1. sudo -s 
      2. journalctl -f -u elasticsearch 
      3. This allows you to view the logs generated by the application.
      4. To log to a file and review with nano, showing only the most recent 250 lines:
      5. journalctl -n 250 -u elasticsearch > /tmp/elasticsearch.log
      6. nano /tmp/elasticsearch.log
      7. In the log, the response code from the ElasticSearch SIEM API call should show an HTTP 200 status code and successCount 1 to indicate the Finding was successfully created.
      8. You can also review the log file for specific error messages.
  3. Done
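If the downloaded curl template is not at hand, the same gzip-encoded test POST can be approximated with standard-library Python run on the Eyeglass VM. The `sample_alert` fields below are hypothetical and trimmed for illustration; the real template's payload should be preferred for testing.

```python
import gzip
import json
import urllib.request

# Hypothetical sample alert for illustration; the downloaded curl
# template's payload is authoritative.
sample_alert = {
    "device_vendor": "Superna",
    "device_product": "Data Security Edition",
    "severity": "CRITICAL",
    "state": "LOCKED_OUT",
    "user": "demouser@adcto1.test",
}

def build_request(alert, url="http://localhost:5000/webhook"):
    """Gzip-compress the JSON body and set the headers the webhook
    configuration requires (Content-Type and content-encoding)."""
    body = gzip.compress(json.dumps(alert).encode("utf-8"))
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json",
                 "Content-Encoding": "gzip"},
    )

# To send on the Eyeglass VM:
#   urllib.request.urlopen(build_request(sample_alert))
```

Because the Zero Trust application only listens on localhost:5000, this must be run on the Eyeglass VM itself, matching the curl-over-SSH procedure above.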


ElasticSearch SIEM SecOps Administrator Integration Experience 

Example case created by the integration.

© Superna Inc