
JSON Policy Builder

The JSON Policy Builder is a wizard that helps users easily map JSON values to the LogRhythm schema and export the policy file to use on the System Monitor Agent.

At any time during your use of the wizard, you can click Reset Wizard in the lower left-hand corner to completely clear the wizard and start from the beginning again.

Step 1: Introduction

On the first page of the wizard, choose whether you want to Create New Policy or Update Existing Policy.

Create New Policy

To create a new JSON policy:

  1. Select Create New Policy.

  2. Enter a unique Policy Name.

  3. Scroll down and click Start Building Policy.

Update Existing Policy

To update an existing JSON policy:

  1. Select Update Existing Policy.

  2. Click Choose policy file.
    The File Explorer opens.

  3. Browse and select your JSON policy file.

  4. Click Open.

  5. Verify the Policy Information that populates, or click Clear Policy to start over.

  6. Scroll down and click Start Building Policy.

Step 2: Sample Data Input

Choose one of the following methods to provide representative JSON data for the wizard to analyze and use to identify patterns and parsing rules:

  1. Manual Input: Manually enter a single JSON object or array to represent your log data.
    Optionally, click the Format JSON button to automatically clean up and format your pasted data.

  2. File Upload: Click Choose JSON file and upload a .json file containing sample data.

  3. Multiple Logs: Paste multiple JSON objects, one per line.
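For illustration, a hypothetical sample for the Multiple Logs method might look like the following, with one complete JSON object per line (the field names and values here are invented for the example and are not tied to any particular log source):

```json
{"timestamp": "2024-03-01T13:45:30Z", "event": "login", "user": "jdoe", "src_ip": "10.0.0.5"}
{"timestamp": "2024-03-01T13:45:31Z", "event": "access", "user": "jdoe", "resource": "/reports"}
```
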

Once your sample data has been selected and entered, verify the Data Preview & Analysis section is displaying records, unique fields, nesting levels, and arrays.

The Detected Insights section provides helpful hints that may assist you with building the JSON policy.

For example, it will display the type(s) of logs in the submitted file, as well as whether any fields could benefit from fanout processing (see the next section).

When finished, click Continue to Schema Rules.

Step 3: Schema Rule Configuration (Optional.)

This page offers two optimizations to help clean-up and prepare your JSON for parsing:

  1. Convert to JSON: If your imported/entered data is not in JSON format, click this option to automatically format the data correctly.

  2. Array Fanout Processing: Select any JSON array elements that should be processed individually.

If neither of these options is required, they will not be selectable, and you can skip this step.
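To illustrate what fanout means here: conceptually, fanout emits one record per element of the selected array, so each element can be parsed individually. The sketch below is an illustrative model of that behavior, not the Agent's actual implementation, and the field names are invented:

```python
def fanout(record, array_field):
    """Yield one copy of the record per element of the named array (conceptual sketch)."""
    for element in record.get(array_field, []):
        # Keep every non-array field, and replace the array with a single element.
        out = {k: v for k, v in record.items() if k != array_field}
        out[array_field] = element
        yield out

# A record whose "events" array holds two elements fans out into two records.
log = {"host": "web01", "events": [{"type": "login"}, {"type": "logout"}]}
records = list(fanout(log, "events"))
```
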

When finished, click Continue to Filter Rules.

Step 4: Filter Rule Configuration (Optional.)

This step allows you to define filtering conditions to sort and select relevant data, and to determine when parsing rules should apply.

To add filter rules:

  1. Click Add Condition.
    A new condition expression is added to the rule builder.

  2. Open the Field drop-list and select a field that should be filtered.
    The fields are automatically pulled from your sample data.

  3. Open the Operator drop-list and select one of the following options:

  • Equals (==): The selected Field must be identical to the selected Value to be filtered into the results.

  • Not Equals (!=): The selected Field must not be identical to the selected Value to be filtered into the results.

  • Contains: The selected Field must contain the selected Value to be filtered into the results. Optionally, check Case Insensitive to ignore whether the value is upper case or lower case.

  • Has Attribute (Exists): The selected Field is included in the results if it exists, regardless of its value.

  4. Open the Value drop-list and select one of the available values to be filtered.

The Has Attribute (Exists) operator does not require a value to be selected.

  5. (Optional.) Click Add Condition to add an additional condition to your filter.
    Select And or Or to determine whether both conditions must be met, or only one condition needs to be met.

  6. Review the Generated Filter Expression, and optionally click Copy Expression to copy the expression for later use.

  7. When finished, click Continue to Field Mapping.
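The operator semantics described above can be sketched in code. The following is an illustrative model of how such a filter might evaluate, not the Agent's implementation; the field names and values are invented:

```python
def matches(record, field, operator, value=None, case_insensitive=False):
    """Evaluate one filter condition against a parsed JSON record (conceptual sketch)."""
    if operator == "exists":               # Has Attribute (Exists): no value needed
        return field in record
    actual = record.get(field)
    if operator == "contains":
        haystack, needle = str(actual), str(value)
        if case_insensitive:
            haystack, needle = haystack.lower(), needle.lower()
        return needle in haystack
    if operator == "==":
        return actual == value
    if operator == "!=":
        return actual != value
    raise ValueError(f"unknown operator: {operator}")

log = {"event": "Login", "severity": 3}
# Conditions joined with And must all match; with Or, any one match suffices.
both = matches(log, "event", "contains", "login", case_insensitive=True) and \
       matches(log, "severity", "==", 3)
```
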

Step 5: Field Mapping

Use this step to map fields from your JSON to LogRhythm schema fields. To map fields:

  1. Click on any field in the JSON tree to create or edit a mapping.
    The Edit Field Mapping pop-up appears.

Optionally, use the Search fields… textbox to search for a specific field.

  2. Configure any of the following fields.
    Mandatory fields are marked with an asterisk (*).
    When finished configuring these fields, click Save Mapping.

  • Source Field*: The field from your JSON being mapped. This field is auto-populated with the field selected in step 1.

  • Add Operations: Click this button to open the Add Operations pop-up, allowing you to configure actions to be taken when parsing this field. The Add Operations pop-up is described in greater detail in the Add Operations section below.

  • LogRhythm Schema Field*: Open the drop-list and select the LogRhythm field that data in the chosen field should be mapped to during parsing.

  • Data Type*: Open the drop-list and select String, DateTime, Number, Decimal, or Boolean to determine the type of data that will be in this field. Selecting DateTime opens a DateTime Format Configuration section, where you select the date format to be used when parsing dates and times. You can create a custom pattern (such as yyyy-MM-dd HH:mm:ss.SSS), or you can select one of the frequently used presets.

  • Format: Optionally, enter a format for the selected field to help with parsing.

  • Default Value: Optionally, enter a value to automatically populate if the selected field is empty during parsing.

  • Alternative Fields: Optionally, select a fallback field to parse in its place if the selected field is empty.

  • Fanout Parent Element: Optionally, if you configured fanouts in step 3, select one of the fanout arrays to associate with this field.

  3. Repeat steps 1 and 2 for each additional field to be mapped.

As you map fields, they will appear in the Configured Mappings column on the right.

Optionally, use the Filter mappings… textbox to search for a specific configured mapping.

  4. When all desired fields are mapped, click Continue to Sub Transform.
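As an aside on the DateTime pattern mentioned above: yyyy-MM-dd HH:mm:ss.SSS follows the common Java-style pattern convention (four-digit year, month, day, 24-hour time, milliseconds). For readers more familiar with Python's strptime notation, roughly the same timestamp can be parsed as follows (the sample value is invented):

```python
from datetime import datetime

# yyyy-MM-dd HH:mm:ss.SSS roughly corresponds to "%Y-%m-%d %H:%M:%S.%f" in
# Python (note: %f accepts one to six digits, so ".125" parses as 125 ms).
ts = datetime.strptime("2024-03-01 13:45:30.125", "%Y-%m-%d %H:%M:%S.%f")
```
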

Add Operations

The Add Operations pop-up can be used to configure actions to take place each time a field is parsed using this JSON policy. These actions are described in the table below. As you select an action, the Configuration section on the right may populate with additional fields to further configure the operation.

When finished configuring operations, click Apply Operation at the bottom-right.

You can use the Search operations… textbox to search for a specific operation. Additionally, you can use the tabs to filter actions that are available for specific data types, such as strings or numbers.

  • None: Take no additional action with the selected field; map it exactly as it comes in.

  • REGEX: Extract the data in the field using pattern matching with a regular expression. Upon selecting this option, fill in the Regex Pattern and Capture Group fields on the right with the regex to utilize and the group to be captured, respectively, or use the Common Patterns (Quick Insert) drop-list to select a frequently used pattern.

  • IsIP: Check whether the selected field contains an IP address.

  • SPLIT: Splits data in the selected field using a configured delimiter. Upon selecting this option, enter a Delimiter and an Index value in the fields on the right, or select one of the frequently used delimiters under the Quick Select heading.

  • Concat: Concatenates (combines) two or more string values within a field. Upon selecting this option, fill in the Value fields on the right to determine the values to be concatenated.

  • ToString: Converts the value in a field into a string.

  • ConcatArray: Concatenates (combines) two or more string values using a delimiter. Upon selecting this option, enter a Delimiter in the field on the right to determine the delimiter that will separate the joined values, or select one of the frequently used delimiters under the Quick Select heading.

  • EpochSectoDateTime: Converts a UNIX timestamp measured in seconds into a human-readable date and time format.

  • EpochMilliSectoDateTime: Converts a UNIX timestamp measured in milliseconds into a human-readable date and time format.

  • LocalDateTime: Adds the current date and time to the data when the field is parsed.

  • Add: Adds a number to the value in the field. Upon selecting this option, enter the value to be added in the Value to add field on the right, or select a value under the Quick Select heading.

  • Subtract: Subtracts a number from the value in the field. Upon selecting this option, enter the value to be subtracted in the Value to subtract field on the right, or select a value under the Quick Select heading.

  • Multiply: Multiplies the value in the field by a number. Upon selecting this option, enter the desired multiplier in the Value to multiply by field on the right, or select a value under the Quick Select heading.

  • Divide: Divides the value in the field by a number. Upon selecting this option, enter the desired divisor in the Value to divide by field on the right, or select a value under the Quick Select heading.
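To make the operation semantics concrete, here is an illustrative Python model of a few of them (SPLIT, Add, and EpochSectoDateTime). This is a sketch of the described behavior, not the Agent's code, and the sample values are invented:

```python
from datetime import datetime, timezone

def split_op(value, delimiter, index):
    """SPLIT: split the value on a delimiter and keep the element at the given index."""
    return value.split(delimiter)[index]

def add_op(value, amount):
    """Add: add a number to the field's value."""
    return value + amount

def epoch_sec_to_datetime(value):
    """EpochSectoDateTime: seconds-based UNIX timestamp to a readable UTC datetime."""
    return datetime.fromtimestamp(int(value), tz=timezone.utc)

part = split_op("alice@example.com", "@", 1)   # "example.com"
total = add_op(40, 2)                          # 42
when = epoch_sec_to_datetime(0)                # 1970-01-01 00:00:00 UTC
```
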

Step 6: SubTransform Configuration (Optional.)

This step allows you to define conditional field mappings that apply based on certain data conditions. As an example case, “login” events may require different fields, mappings, and operations than “access” events, and these differences can be configured here.

This step is intended for advanced use cases only. For most users, we recommend exploring the option of creating a new policy file instead of configuring SubTransforms.

To skip this step and only use the base mappings configured in the previous step, check Skip SubTransforms and then click Continue to Export.

To add a SubTransform:

  1. Click Add First SubTransform.

  2. Click Add/Edit Condition.
    The Edit Condition pop-up appears.

  3. Click Add Condition.
    This condition will determine when the SubTransform will apply.

  4. Open the Field drop-list and select a field that should be filtered.
    The fields are automatically pulled from your sample data.

  5. Open the Operator drop-list and select one of the following options:

  • Equals (==): The selected Field must be identical to the selected Value to be filtered into the results.

  • Not Equals (!=): The selected Field must not be identical to the selected Value to be filtered into the results.

  • Contains: The selected Field must contain the selected Value to be filtered into the results. Optionally, check Case Insensitive to ignore whether the value is upper case or lower case.

  • Has Attribute (Exists): The selected Field is included in the results if it exists, regardless of its value.

  6. Open the Value drop-list and select one of the available values to be filtered.

The Has Attribute (Exists) operator does not require a value to be selected.

  7. (Optional.) Click Add Condition to add an additional condition to your filter.
    Select And or Or to determine whether both conditions must be met, or only one condition needs to be met.

  8. Review the Generated Filter Expression.

  9. Click Save Condition.

At any time, you can use the Move Up, Move Down, Delete, or Edit options on the right to adjust your work.

  10. Click Add Mapping.
    The Add Field Mapping pop-up appears.

  11. Configure a field mapping using the same steps outlined in the Step 5: Field Mapping section above.

  12. (Optional.) Click the Add Nested SubTransform button to add a subtransform within the current subtransform, allowing you to configure multiple levels if necessary. Repeat steps 2 through 11 to create the nested subtransform.

  13. When the subtransform(s) are fully configured as desired, click Continue to Export.

Step 7: Review & Export Policy

If all steps are completed successfully, the Policy Generated Successfully message displays.

You can review the Policy Summary and the Policy Preview on this page. Optionally, the Policy Preview can be expanded, collapsed, or copied to the clipboard.

In the Export Options section, choose one or more of the following:

  1. Download Policy JSON: Downloads your configured .json file directly to your machine.

  2. Copy to Clipboard: The new policy is copied directly to your clipboard so you can paste it elsewhere as needed.

  3. Edit Policy JSON: Opens a window that allows you to manually edit the .json file as needed before downloading or copying it. To reformat the changes into clean JSON formatting, click the Format JSON option on the right. When finished, click Save Changes.

When finished, click Complete Wizard to close the wizard, or Start New Policy to start over from the beginning.

Export JSON Policy to System Monitor Agent

To export the policy to the System Monitor Agent:

  1. Locate the downloaded JSON file, or copy the data into an existing JSON file.

  2. (Optional) Rename the file if desired.

  3. Place the file on the System Monitor Agent in the custompolicies folder:
    C:\Program Files\LogRhythm\LogRhythm System Monitor\policies\custompolicies

To enable your new JSON policy, you must restart the System Monitor Service. For more information, see Custom Policy Folder.
