Way to recreate 30 pipelines for a different business unit
So, I need to recreate 30 pipelines for a different business unit whose data goes to a different Splunk index... what's the fastest way to do that? Or is it just a tedious export/rename/import sequence I have to follow? :see_no_evil:
Answers
-
Can you say a few more words about your requirement? Maybe there's a better solution?
-
I have 30 pipelines that process about 30 different event types from a Corelight network sensor. This is for a specific business unit, so inside these pipelines I add the index and sourcetype fields for Splunk using an eval. Now another business unit wants to do the same, but their data is going to another index (on the same production cluster). The events are the same, so the way I process them is the same... except for the index name. So I feel like I'm stuck recreating the pipelines just so I can change the value of index in the eval. Is there a better way of doing this?
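To make it concrete, each pipeline ends with an Eval along these lines (the values here are illustrative, not the real ones):

```
// Eval function: add the Splunk metadata for this business unit
index = 'bu1_corelight'
sourcetype = 'corelight:conn'
```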
-
Would Route Cloning work for that use case?
-
Then I don't have to define these fields in the pipelines?
-
Correct
-
We also have a Clone function if you wanted to do the same in a pipeline
-
And I can just add the new source under the filter for each route as an "OR"
-
Yes
-
I think I got it
-
I'll try this and report back to you guys
-
Or a clone in a pipeline
-
There are a number of ways to do this; those are just the first two that came to mind. You could also use lookup tables.
-
I think I like the first approach. Keep the pipeline as generic as possible for "manipulating data" but leave the destination handling (kind of) with the route
-
However, I can't seem to get that clone functionality in the route like you have in your screenshots. Am I missing something?
-
Unselect the Final flag
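With Final unselected on the first route, a matching event isn't consumed there; it continues down the routing table, so a second route with the same filter can send a clone to the other destination. Roughly (route names, filter, and outputs invented):

```
// Route bu1-corelight  Filter: __inputId.startsWith('corelight')  Output: splunk_bu1  Final: No
// Route bu2-corelight  Filter: __inputId.startsWith('corelight')  Output: splunk_bu2  Final: Yes
```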
-
Yes, just saw that now
-
The Knowledge Pack (Regex_Lookup_Unroll_Router) shows another approach: set field values from a lookup table, and then use an output router to direct traffic however you like based on those values.
-
Oh wait, I'm not just cloning the index fields. The index field for each event will be dependent on the inputId
-
Even though it's being processed by the same pipeline
-
Does that mean I need the Knowledge pack?
-
No, that is just an example. Don't want to confuse you with too many options. You can set up a unique Route per InputID or use `||` in the same Route Filter.
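For example (input IDs invented):

```
// Option 1: a separate Route per input
__inputId == 'corelight_bu1'
__inputId == 'corelight_bu2'

// Option 2: one Route Filter catching both
__inputId == 'corelight_bu1' || __inputId == 'corelight_bu2'
```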
-
Okay, it's amazing how many options I have (go Cribl!), but I ended up using a Lookup function, matching the `__inputId` against a CSV file to retrieve the index value. I did end up doing this directly in the pipeline, but it works there.
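For anyone finding this later: the lookup file is just a two-column CSV, something like this (values invented), with the Lookup function matching the event's `__inputId` against the first column and writing the `index` column onto the event:

```
__inputId,index
corelight_bu1,bu1_corelight
corelight_bu2,bu2_corelight
```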
-
Thanks for the help!
-
BTW, alternative way... add a post-processing pipeline to the destination, and put a lookup in that pipeline that translates index names
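e.g. with a CSV like this (index names invented), where the Lookup matches the event's current `index` against `index_in` and writes `index_out` back into `index`:

```
index_in,index_out
bu1_corelight,bu2_corelight
```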
-
Oooh, then I don't have to do it for each pipeline... I only have to do it once for the destination...
-
Exactly. We've done this a few times when a certain destination needs special handling
-
Would the `__inputId` field still be available by the time I'm in a post-processing pipeline?
-
Yep, should be