Wazuh ruleset as code (RaC)


Wazuh ruleset as code (RaC) introduces a DevOps-driven approach to consistently manage Wazuh threat detection and security monitoring rulesets. It allows security teams to use version control systems and CI/CD pipelines to automatically deploy Wazuh rules and decoders.

This approach leverages the principles of infrastructure as code (IaC) to enable collaboration, change tracking, and rollback of rulesets using tools like Git. It supports the continuous deployment of security detection logic without direct access to the Wazuh manager.

In this blog post, we demonstrate how to implement Wazuh RaC. We use the Detection-Engineering as Code (DaC) repository to automate the lifecycle of custom Wazuh rulesets from creation to deployment.

How it works 

We provide an overview of how Wazuh RaC works, from ruleset creation to deployment on the Wazuh server.

  1. Local development and testing: Security engineers begin by creating and editing rulesets locally using integrated development environment (IDE) tools like Visual Studio Code. They work within the custom Wazuh ruleset directories:
  • /var/ossec/etc/decoders
  • /var/ossec/etc/rules

These directories are version-controlled using a local Git repository.

Note

 We use a .gitignore file to exclude unrelated files or directories under the /var/ossec/etc/ directory, ensuring that only relevant rulesets are tracked.

  2. Push to dev branch: Once local changes are tested, rulesets are committed and pushed to a dev branch of a remote repository (GitHub). This branch acts as a collaborative development space where multiple security engineers can contribute (see the command sketch after this list).
  3. Pull request and review: Changes in the dev branch are subject to peer review. A pull request (PR) is created to merge updates from the dev branch into the main branch. This review process ensures quality control, facilitates collaboration, and supports auditing and tracking of changes.
  4. Merge to main and trigger CI/CD: Upon PR approval and merge into the main branch, a GitHub Actions workflow automatically triggers the CI/CD pipeline. The CI/CD pipeline executes the following steps in order:
  • Navigates to the /var/ossec/etc/ directory, where the rulesets are stored.
  • Pulls the ruleset changes from the remote GitHub repository to the local Git repository using the git pull command.
  • Changes the file ownership and permissions by executing the chown wazuh:wazuh and chmod 660 commands.
  • Restarts the Wazuh manager service, echoing a success message if the restart succeeds and a failure message if it does not.
  • As a debugging step, runs the systemctl status wazuh-manager command to show the status of the Wazuh manager after all tasks complete.
  5. Automated deployment to Wazuh: The workflow synchronizes the updated ruleset files to the Wazuh server, ensuring that the latest reviewed rulesets are applied when the CI/CD workflow completes successfully. This step eliminates manual intervention, reduces errors, and guarantees that production Wazuh environments always run validated configurations.
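
The following is a minimal sketch of steps 1 and 2 from the engineer's point of view. It assumes the local repository under /var/ossec/etc/ is already set up as described later in this post, that a dev branch exists, and that the new custom rule file is named rules/demo_rule.xml (an example name):

# cd /var/ossec/etc
# git checkout dev
# git add rules/demo_rule.xml
# git commit -m "Add demo detection rule"
# git push origin dev

From here, a pull request from dev to main takes the change through review and, once merged, through the automated deployment described above.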

Requirements

We use the following infrastructure requirements to demonstrate the Wazuh RaC:

  • A cloud-hosted Ubuntu 24 endpoint with a public IP address. The endpoint will host the following components:
    • Wazuh 4.12.0 central components (Wazuh server, Wazuh indexer, Wazuh dashboard) installed using the Quickstart guide.
    • Git version 2.34.1 installed.
  • Any endpoint with VSCode IDE installed for creating Wazuh rulesets. In our case, we use Windows 11.
  • GitHub account.

Note

You can also use a GitHub Actions self-hosted runner in place of the default GitHub Actions runners if your Wazuh deployment is on a local network.

Configuration

Ubuntu endpoint

Perform the following steps on the Ubuntu endpoint hosting the Wazuh server.

  1. Assign a public IP address or use network address translation (NAT) if the endpoint is behind a firewall. This will allow the endpoint to be reachable on the internet by GitHub Actions runners.
  2. Ensure that port 22 or your custom SSH port is open on the assigned public IP address.
  3. Create a private/public key pair to enable SSH login to the Ubuntu endpoint using the assigned SSH port and IP address. Copy and store the private key file (usually with a .pem extension) safely; it is used later as a GitHub Actions secret.
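
If your cloud provider does not generate the key pair for you, the following is a minimal sketch of creating one on the Ubuntu endpoint and authorizing it for the login user (the key file name is an example):

# ssh-keygen -t ed25519 -f ~/.ssh/wazuh_rac -N ""
# cat ~/.ssh/wazuh_rac.pub >> ~/.ssh/authorized_keys

The private key file ~/.ssh/wazuh_rac is the value you later store in the SSH_KEY GitHub Actions secret.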

Creating a Git repository

We create local and remote Git repositories to synchronize local changes with remote changes.

To onboard the Wazuh custom ruleset directories /var/ossec/etc/decoders and /var/ossec/etc/rules into Git, we create a local Git repository. We initialize the /var/ossec/etc/ directory as a Git repository and use a .gitignore file to exclude the other files and directories it contains.

We also set up a remote repository on GitHub as our single source of truth (SSOT). Security engineers use this repository to manage the creation, modification, and deployment of rulesets to the Wazuh server: they collaboratively create and modify rulesets in the repository and review the changes before deploying them to the Wazuh server.

In this section, we provide steps to set up the local and remote repositories and synchronize them.

Remote repository

The DaC repository contains the necessary workflow files to automate the integration of rulesets into your Wazuh server. It also has a script for checking conflicting rule IDs to avoid errors on the Wazuh server.

Create a fork of the DaC repository, or create a new repository on GitHub and import the DaC repository into it.

Note

After creating a fork of the repository, navigate to Actions, and click I understand my workflows, go ahead and enable them.

The repository contains the following key files:

  • .github/workflows/integrate_rulesets.yml: This file contains the Update Rulesets on SIEM workflow, which automates the integration of new or modified custom decoders and rules with the Wazuh server.
  • .github/workflows/check_rule_ids.yml: This file contains the Check Rule ID Conflicts workflow, which automates the running of the check_rule_ids.py script.
  • check_rule_ids.py: This Python script checks for rule ID conflicts by comparing the rule IDs of new or modified rules in the dev branch with existing rule IDs in the main branch.

The integrate_rulesets.yml workflow file is shown below:
name: Update Rulesets on SIEM
on:
  push:
    branches: [ "main" ]
    paths: ["**.xml"]
  workflow_dispatch:

jobs:

  DaaC:
    runs-on: ubuntu-latest
    steps:
      - name: Apply modified or new decoders and rules to SIEM
        uses: appleboy/ssh-action@v1.0.0
        with:
          host: ${{ secrets.HOST }}
          username: ${{ secrets.USERNAME }}
          key: ${{ secrets.SSH_KEY }}
          port: ${{ secrets.PORT }}
          script: |
            sudo bash -c '
            cd /var/ossec/etc/
            git pull origin main
            chown wazuh:wazuh /var/ossec/etc/decoders/* && chmod 660 /var/ossec/etc/decoders/*
            chown wazuh:wazuh /var/ossec/etc/rules/* && chmod 660 /var/ossec/etc/rules/*
            sudo systemctl restart wazuh-manager \
            && echo "Ruleset apply SUCCESS!!! - Wazuh manager restarted successfully." \
            || echo "Ruleset apply FAILURE!!! - Wazuh manager failed to restart, check ruleset for error..."
            sudo systemctl status wazuh-manager -l --no-pager
            '

The check_rule_ids.yml workflow file is shown below:

name: Check Rule ID Conflicts

on:
  pull_request:
    branches: [ "main" ]
    # paths: ["**.xml"]

jobs:
  check-rule-ids:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout PR branch
        uses: actions/checkout@v3
        with:
          fetch-depth: 0  # Required for git diff and history

      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.10'

      - name: Fetch main branch
        run: git fetch origin main

      - name: Run rule ID conflict checker
        run: python check_rule_ids.py

The check_rule_ids.py script is shown below:
import subprocess
import xml.etree.ElementTree as ET
from pathlib import Path
import sys
from collections import defaultdict, Counter

def run_git_command(args):
    result = subprocess.run(args, capture_output=True, text=True, check=True)
    return result.stdout

def get_changed_rule_files():
    try:
        output = run_git_command(["git", "diff", "--name-status", "origin/main...HEAD"])
        changed_files = []
        for line in output.strip().splitlines():
            parts = line.strip().split(maxsplit=1)
            if len(parts) != 2:
                continue
            status, file_path = parts
            if file_path.startswith("rules/") and file_path.endswith(".xml"):
                changed_files.append((status, Path(file_path)))
        return changed_files
    except subprocess.CalledProcessError as e:
        print("❌ Failed to get changed files:", e)
        sys.exit(1)

def extract_rule_ids_from_xml(content):
    ids = []
    try:
        # Wrap multiple root elements in a fake <root> tag to avoid parse errors
        wrapped = f"<root>{content}</root>"
        root = ET.fromstring(wrapped)
        for rule in root.findall(".//rule"):
            rule_id = rule.get("id")
            if rule_id and rule_id.isdigit():
                ids.append(int(rule_id))
    except ET.ParseError as e:
        print(f"⚠️ XML Parse Error: {e}")
    return ids


def get_rule_ids_per_file_in_main():
    run_git_command(["git", "fetch", "origin", "main"])
    files_output = run_git_command(["git", "ls-tree", "-r", "origin/main", "--name-only"])
    xml_files = [f for f in files_output.splitlines() if f.startswith("rules/") and f.endswith(".xml")]

    rule_id_to_files = defaultdict(set)
    for file in xml_files:
        try:
            content = run_git_command(["git", "show", f"origin/main:{file}"])
            rule_ids = extract_rule_ids_from_xml(content)
            for rule_id in rule_ids:
                rule_id_to_files[rule_id].add(file)
        except subprocess.CalledProcessError:
            continue
    return rule_id_to_files

def get_rule_ids_from_main_version(file_path: Path):
    try:
        content = run_git_command(["git", "show", f"origin/main:{file_path.as_posix()}"])
        return extract_rule_ids_from_xml(content)
    except subprocess.CalledProcessError:
        return []

def detect_duplicates(rule_ids):
    counter = Counter(rule_ids)
    return [rule_id for rule_id, count in counter.items() if count > 1]

def print_conflicts(conflicting_ids, rule_id_to_files):
    print("❌ Conflicts detected:")
    for rule_id in sorted(conflicting_ids):
        files = rule_id_to_files.get(rule_id, [])
        print(f"  - Rule ID {rule_id} found in:")
        for f in files:
            print(f"    • {f}")

def main():
    changed_files = get_changed_rule_files()
    if not changed_files:
        print("✅ No rule files were changed in this PR.")
        return

    rule_id_to_files_main = get_rule_ids_per_file_in_main()

    print(f"🔍 Checking rule ID conflicts for files: {[f.name for _, f in changed_files]}")

    for status, path in changed_files:
        print(f"\n🔎 Checking file: {path.name}")

        try:
            dev_content = path.read_text()
            dev_ids = extract_rule_ids_from_xml(dev_content)
        except Exception as e:
            print(f"⚠️ Could not read {path.name}: {e}")
            continue

        # Check for internal duplicates
        duplicates = detect_duplicates(dev_ids)
        if duplicates:
            print(f"❌ Duplicate rule IDs detected in {path.name}: {sorted(duplicates)}")
            sys.exit(1)

        if status == "A":
            # New file
            conflicting_ids = set(dev_ids) & set(rule_id_to_files_main.keys())
            if conflicting_ids:
                print_conflicts(conflicting_ids, rule_id_to_files_main)
                sys.exit(1)
            else:
                print(f"✅ No conflict in new file {path.name}")

        elif status == "M":
            # Modified file
            main_ids = get_rule_ids_from_main_version(path)
            if set(dev_ids) == set(main_ids):
                print(f"ℹ️ {path.name} modified but rule IDs unchanged.")
                continue

            new_or_changed_ids = set(dev_ids) - set(main_ids)
            conflicting_ids = new_or_changed_ids & set(rule_id_to_files_main.keys())

            if conflicting_ids:
                print_conflicts(conflicting_ids, rule_id_to_files_main)
                sys.exit(1)
            else:
                print(f"✅ Modified file {path.name} has no conflicting rule IDs.")

    print("\n✅ All rule file changes passed conflict checks.")

if __name__ == "__main__":
    main()

Local repository

Perform the following steps on the Ubuntu endpoint hosting the Wazuh server to set up your local Git repository.

  1. Navigate to the /var/ossec/etc directory as the working directory:
# cd /var/ossec/etc
  2. Create a .gitignore file in the working directory to prevent files and directories other than decoders/ and rules/ from being added to Git:
# touch .gitignore
  3. Add the following files and directories, and any other files or directories to be ignored by Git, to the .gitignore file:
# Ignore the following files

client.keys
internal_options.conf
local_internal_options.conf
ossec.conf
sslmanager.cert
localtime
sslmanager.key

# Ignore the following directories

lists/
rootcheck/
shared/
  4. Mark the working directory as safe for Git to operate in:
# git config --global --add safe.directory /var/ossec/etc
  5. Initialize the working directory as a Git repository. This will create a .git directory in the working directory.
# git init
  6. Add your remote Wazuh RaC repository to your local Git configuration as origin so you can push local changes and pull remote changes:
# git remote add origin https://<PERSONAL_ACCESS_TOKEN>@github.com/<USERNAME>/<REPO_NAME>

Replace <PERSONAL_ACCESS_TOKEN> with your GitHub personal access token, <USERNAME> with your GitHub username, and <REPO_NAME> with the name of your GitHub repository.

  7. Configure your Git user identity. This identity is used to attribute your local commits.
# git config --global user.name <YOUR_NAME>
# git config --global user.email <YOUR_EMAIL_ADDRESS>

Replace <YOUR_NAME> with your GitHub username and <YOUR_EMAIL_ADDRESS> with your GitHub email address.

  8. Create a new branch, main, and switch to the new branch:
# git checkout -b main
  9. Stage the rule and decoder files in the decoders/ and rules/ directories for commit and make an initial commit to the local Git repository:
# git add .
# git commit -m "Initial commit"
  10. Push the changes on the local Git repository to the main branch of your remote GitHub repository:
# git pull --rebase origin main
# git push -u origin main

Note

We first synchronize the local repository with the GitHub repository and resolve any merge conflicts using the git pull --rebase origin main command. Because the two repositories start with different histories, rebasing the local commits on top of the remote ones updates the local Git repository and prevents the push to GitHub from being rejected.

Creating a dev branch

To protect the main branch, which holds the stable rulesets deployed to the Wazuh server, it is necessary to create a dev branch. New rulesets are developed in the dev branch so they can be properly reviewed and tested before being merged into the main branch, which deploys to production.

Using the GitHub documentation, create a new branch named dev from your main branch on the remote repository (GitHub).
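
If you prefer the command line, the following sketch creates the dev branch from main in the local repository and pushes it to GitHub:

# cd /var/ossec/etc
# git checkout main
# git checkout -b dev
# git push -u origin dev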

Creating GitHub Actions secrets

Perform the following step on the remote GitHub repository to create secrets for use during the execution of the automation workflow.

Navigate to Settings > Secrets and variables > Actions > Secrets to create the following GitHub Actions secrets:

  • USERNAME – Use the username of the Ubuntu endpoint.
  • HOST – Use the public IP address or DNS address of the Ubuntu endpoint.
  • SSH_KEY – Use the private key generated for SSH login to the Ubuntu endpoint. It is usually a .pem file.
  • PORT – Use the SSH port assigned to the Ubuntu endpoint. This is port 22 by default, unless you assigned a custom port.

These secrets are used by GitHub Actions in the workflow file .github/workflows/integrate_rulesets.yml to automate the deployment of the rulesets to the Wazuh server.
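
As an alternative to the web interface, the same secrets can be created with the GitHub CLI, assuming gh is installed and authenticated against your repository. The values below are placeholders:

gh secret set USERNAME --body "ubuntu"
gh secret set HOST --body "203.0.113.10"
gh secret set PORT --body "22"
gh secret set SSH_KEY < wazuh_rac.pem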

Setting up VSCode IDE for creating rulesets

Perform the following steps on the VSCode application:

  1. Navigate to Manage > Extensions. Search for Remote Repositories and install the extension.
  2. Navigate to Remote Explorer on the left menu bar and select Open Remote Repository.
  3. Enter the remote URL of your Wazuh RaC repository and authorize the action to open the repository.

Using Wazuh RaC

In this section, we demonstrate how to use Wazuh RaC from local development to deployment on the Wazuh server. We also introduce an error-checking step to resolve rule ID conflicts.

Deploying rulesets to the Wazuh server

We demonstrate how Wazuh RaC works from when an engineer writes a new custom ruleset in an IDE to when the custom ruleset is integrated into the Wazuh server. 

The ruleset is integrated with the Wazuh server once changes in the dev branch are merged into the main branch. This triggers the GitHub Actions workflow integrate_rulesets.yml, named Update Rulesets on SIEM, to update the Wazuh server with the recent changes. These changes cover decoder and rule creations, modifications, and deletions. The workflow also assigns the necessary file permissions and ownership for the new decoders and rules to the wazuh user and group, and restarts the Wazuh manager.

The GIF image below provides a walk-through of the process, as follows:

  1. We create new decoder and rule files, demo_decoder.xml and demo_rule.xml, in VSCode (a sketch of what these files might look like is shown after this list).
  2. We push the new decoder and rule files to the remote repository (GitHub).
  3. We create a PR to merge changes from dev to main.
  4. We check the Update Rulesets on SIEM workflow on GitHub Actions for its completion status.
  5. We confirm that the rulesets are updated on the Wazuh dashboard.
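
As an illustration only, the two files from step 1 might contain something like the following minimal decoder and rule. The log prefix and rule ID are hypothetical; note that the rule ID falls within the 100000 to 120000 range reserved for custom rules:

<!-- demo_decoder.xml: matches log lines that begin with "demo-app:" -->
<decoder name="demo-decoder">
  <prematch>^demo-app: </prematch>
</decoder>

<!-- demo_rule.xml: fires on any event decoded by demo-decoder -->
<group name="demo,">
  <rule id="100100" level="5">
    <decoded_as>demo-decoder</decoded_as>
    <description>Demo application event detected.</description>
  </rule>
</group>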

Resolving rule ID conflicts

To validate our rulesets at PR time, we add a PR check that resolves conflicts arising from reused rule IDs during rule creation or modification. The check uses automation to detect conflicting rule IDs when a PR is created from dev to main.

We use the Python script check_rule_ids.py to extract the rule IDs from newly created or modified rule files in the dev branch. The extracted rule IDs are then compared with the rule IDs already present in the main branch. For this to be effective, it is important to use rule IDs within the 100000 to 120000 range reserved for custom rules.
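
You can also run the checker locally before opening a PR. The sketch below assumes it is executed from the root of the local repository (/var/ossec/etc), with the check_rule_ids.py script present and the origin remote configured:

# cd /var/ossec/etc
# git fetch origin main
# python3 check_rule_ids.py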

The process of running the Python script is automated using the GitHub Actions workflow file .github/workflows/check_rule_ids.yml. This workflow is added as a check that must pass during the PR for a change to be eligible for merging into the main branch. To enforce the check on your repository, do the following:

  1. Download the protect_main.json file containing the GitHub rule to protect the main branch.
  2. Navigate to Settings > Rules > Rulesets > New ruleset > Import a ruleset on the GitHub repository.
  3. Select the protect_main.json file downloaded earlier.
  4. Review the rule and select Create to enforce the rule on your repository.

The check_rule_ids.py script and the .github/workflows/check_rule_ids.yml workflow file are present when the DaC repository is forked or imported.

Testing conflict resolution

We test this by creating a new rule file demo_rule2.xml, which is a copy of the demo_rule.xml file created earlier. The new rule has the same rule IDs as demo_rule.xml, but with a different file name. The new rule is pushed to the remote repository, and a PR is created to merge the change from the dev to the main branch. 

This triggers the GitHub Actions workflow .github/workflows/check_rule_ids.yml, named Check Rule ID Conflicts, to check for conflicts in the rule IDs. Upon detecting a conflict, the workflow status is set to Failure, and the PR cannot be merged.

The image below demonstrates a failure in the PR when there is a conflicting rule ID.

Finding conflicting rule IDs

To find the exact rule IDs causing the conflict, check the Check Rule ID Conflicts workflow runs in GitHub Actions. To check directly from the PR, follow the instructions in the GIF image below.

Troubleshooting errors

In this section, we show how to check for errors with the CI/CD pipeline. Examples of such errors include:

  • An incorrect Wazuh server IP address.
  • An incorrect or missing SSH key pair.
  • An incorrect SSH port.
  • Syntax errors in rulesets.

The Update Rulesets on SIEM workflow pipeline will fail if any of the above-listed errors occur. Errors related to deploying rulesets to the Wazuh server can be found in the GitHub Actions Update Rulesets on SIEM workflow runs. The image below shows an error with a failed CI/CD pipeline.
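
When the pipeline reports a failed restart, it can also help to inspect the ruleset and service directly on the Wazuh server. A minimal sketch, assuming a default installation path:

# /var/ossec/bin/wazuh-analysisd -t
# tail -n 50 /var/ossec/logs/ossec.log
# systemctl status wazuh-manager -l --no-pager

The first command runs a configuration test that reports syntax errors in rules and decoders, the second shows recent Wazuh manager log entries, and the third mirrors the status check already included in the workflow.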

Note

You can add custom checks to your repository to further prevent errors on the production Wazuh server.
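
As an example of such a custom check (not part of the DaC repository), the sketch below adds a PR workflow that verifies every ruleset file is well-formed XML. It wraps each file in a dummy root element, mirroring the approach used in check_rule_ids.py, so that files with multiple top-level elements still pass:

name: XML Syntax Check

on:
  pull_request:
    branches: [ "main" ]
    paths: ["**.xml"]

jobs:
  xml-lint:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout PR branch
        uses: actions/checkout@v3

      - name: Install xmllint
        run: sudo apt-get update && sudo apt-get install -y libxml2-utils

      - name: Check XML well-formedness
        run: |
          for f in $(git ls-files '*.xml'); do
            # Wrap in a dummy root so multi-root ruleset files parse cleanly
            (echo "<root>"; cat "$f"; echo "</root>") | xmllint --noout - \
              || { echo "Malformed XML in $f"; exit 1; }
          done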

Conclusion

Wazuh ruleset as code (RaC) showcases a DevOps approach to handling security operations and detection engineering. By treating rulesets as code, security teams can manage detection logic with the same agility, scalability, and discipline used in software development. This results in faster iterations, fewer production issues, and more consistent threat detection rules.

Wazuh is a free and open source SIEM and XDR solution that can be deployed and managed on-premises or in the Wazuh cloud. You can ask questions about this blog post and other topics related to Wazuh in any of our community channels.
