So, I need to recreate 30 pipelines for a different business unit whose data goes to a different Splunk index... what's the fastest way to do that? Or is it just a tedious export/rename/import sequence I have to follow? :see_no_evil:
Can you say a few more words about your requirement? Maybe there's a better solution?
I have 30 pipelines that process about 30 different event types from a Corelight network sensor. This is for a specific business unit, so inside these pipelines, I add the index and sourcetype fields for Splunk using an eval.
Now another business unit wants to do the same, but their data is going to another index (on the same production cluster). The events are the same, so the way I process them is the same... except for the index name. So I feel like I'm stuck recreating the pipelines, just so I can change the index value in the eval.
Is there a better way of doing this?
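(For context, each of my pipelines currently ends with something roughly like the Eval below. This is a hand-written sketch of a Cribl pipeline function config, with hypothetical index/sourcetype values, not an export from my environment:)

```yaml
# Sketch of the Eval function at the end of each pipeline.
# Values are JS expressions, hence the nested quotes.
- id: eval
  filter: "true"
  conf:
    add:
      - name: index
        value: "'bu1_corelight'"    # the only thing that differs per business unit
      - name: sourcetype
        value: "'corelight:conn'"   # varies per event type, same across BUs
```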
Would Route Cloning work for that use case?
Then I don't have to define these fields in the pipelines?
We also have a Clone function if you want to do the same thing in a pipeline
And I can just add the new source under the filter for each route as an "OR"
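(Something like this for the Route filter, I assume; the input IDs here are made up:)

```
__inputId=='corelight_bu1' || __inputId=='corelight_bu2'
```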
I think I got it
I'll try this and report back to you guys
Or a clone in a pipeline
There are a number of ways to do this; those are just the first two that came to mind, but you could also use lookup tables.
Example is in this pack: https://github.com/criblpacks/cribl-knowledge-pack/releases/download/v0.5.0/cribl-knowledge-pack-e8d2b7c9-v0.5.0.crbl
I think I like the first approach. Keep the pipeline as generic as possible for "manipulating data" but leave the destination handling (kind of) with the route
However, I can't seem to get that clone functionality in the route, like you have on your screenshots. Am I missing something?
Unselect the Final flag
Yes, just saw that now
The example from the Knowledge Pack (Regex_Lookup_Unroll_Router) can also set field values from a lookup table, and then an Output Router can direct traffic (or do whatever you like) based on the values that were set.
Oh wait, I'm not just cloning the index fields. The index field for each event will be dependent on the inputId
Even though it's being processed by the same pipeline
Does that mean I need the Knowledge pack?
No, that is just an example. Don't want to confuse you with too many options. You can set up a unique Route per InputID or use `||` in the same Route filter.
Okay, it's amazing how many options I have (go Cribl!) but I ended up using a Lookup function, matching the `__inputId` in a CSV file to retrieve the index value. I did end up doing this directly in the pipeline, but it works there.
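(For anyone landing here later, the logic of that Lookup function boils down to this. This is a Python sketch of the behavior, not Cribl internals; the input IDs and index names are made up:)

```python
import csv
import io

# Hypothetical CSV uploaded as a Cribl lookup file: maps __inputId -> index
LOOKUP_CSV = """__inputId,index
corelight_bu1,bu1_corelight
corelight_bu2,bu2_corelight
"""

def load_lookup(text):
    """Build a dict from the two-column CSV (key column -> output column)."""
    return {row["__inputId"]: row["index"] for row in csv.DictReader(io.StringIO(text))}

def enrich(event, table):
    """Mimic the Lookup function: set event['index'] from the event's __inputId."""
    match = table.get(event.get("__inputId"))
    if match is not None:
        event["index"] = match
    return event

table = load_lookup(LOOKUP_CSV)
print(enrich({"__inputId": "corelight_bu2", "_raw": "..."}, table)["index"])
# prints: bu2_corelight
```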
Thanks for the help!
BTW, alternative way... Add a post-processing pipeline to the destination, and put a lookup in that pipeline that translates index names
Oooh, then I don't have to do it for each pipeline... I only have to do it once for the destination...
Exactly. We've done this a few times if a certain destination needs special handling
Would the `__inputId` field still be available by the time I'm in a post-processing pipeline?
Yep, should be
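(If I go that way, I imagine the destination's translation lookup would just be a two-column CSV like this, mapping the generic index set in the shared pipelines to the destination-specific one; the names here are hypothetical:)

```
index_in,index_out
corelight,bu2_corelight
corelight_dns,bu2_corelight
```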