
Populating lookup table using incoming events

reduk Posts: 1

As the title says, I'm looking to create and update a lookup table based on incoming events, somewhat similar to the outputlookup command in Splunk. Is there any way to create or update a table based on fields from events passing through a pipeline?

Best Answer

  • Jon Rust Posts: 458 mod

    You can use the Redis function to update Redis-based data from within a pipeline using live data.

    This isn't currently possible with CSV-based lookups.
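
    A minimal sketch of the round trip from the shell (the Redis URL and key naming here are assumptions for illustration, not anything the Redis function requires): the function in the pipeline would issue a SET like the one below, built from event fields, and redis-cli lets you confirm what landed.

    # Hypothetical: mimic the SET the pipeline's Redis function would perform,
    # then read the value back. Assumes Redis at localhost:6379 and keys
    # shaped like lookup:<host>.
    redis-cli -u redis://localhost:6379 SET "lookup:web01" "linux"
    redis-cli -u redis://localhost:6379 GET "lookup:web01"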

Answers

  • concanonmike Posts: 1

    We've created a product with custom functions that read and write tables in data stores (MongoDB, Oracle, MySQL, MSSQL, PostgreSQL, DB2, and others), making the data available across workers. Ping me if this aligns with what you're looking for.

  • pacman Posts: 1

    You can optionally configure a script in "Sources" to run, collect the data you need from the source, and write the results to /opt/cribl/data/lookups. This is handy when you have a file server hosting reference data, or an API endpoint with the file or data ready to be consumed.
    Example script for hitting an endpoint and writing to the lookup directory:

    wget URL-to-Data -O /opt/cribl/data/lookups/filename.csv
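
    A slightly more defensive variant (a sketch; the URL and filename are placeholders): download to a temp file in the same directory and only move it into place on success, so Cribl never reads a half-written lookup.

    #!/bin/sh
    # Hypothetical fetch script; URL and DEST are placeholders for your environment.
    URL="https://example.com/reference-data.csv"
    DEST="/opt/cribl/data/lookups/filename.csv"
    # Stage the download next to the destination so the final mv is a
    # same-filesystem rename rather than a slow cross-device copy.
    TMP="$(mktemp "${DEST}.XXXXXX")"
    if wget -q "$URL" -O "$TMP"; then
        mv "$TMP" "$DEST"
    else
        rm -f "$TMP"
        echo "download failed; keeping the previous lookup file" >&2
        exit 1
    fi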