-
Why am I getting this error? I have this issue with most functions in pipelines.
-
S3/Blob storage folder selection as source
Hi Team, I am a Cribl Certified Engineer and have been working on a use case which is described below, but I haven't achieved the expected output. The use case involves an S3/Blob Storage where we have one bucket containing 2-3 folders. The source should be a specific folder, and from that folder, I need to ingest data to…
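For context on the storage side (not Cribl-specific configuration): a "folder" in S3/Blob storage is really just a key prefix, so restricting a source to one folder comes down to restricting the key prefix. A minimal sketch using the AWS SDK for JavaScript v3, where the bucket and folder names are purely illustrative:

```javascript
// Sketch only: list objects under one "folder" (key prefix) of a bucket.
// Bucket, region, and prefix below are illustrative assumptions.
import { S3Client, ListObjectsV2Command } from '@aws-sdk/client-s3';

const s3 = new S3Client({ region: 'us-east-1' });
const resp = await s3.send(new ListObjectsV2Command({
  Bucket: 'my-bucket',   // the single bucket from the use case
  Prefix: 'folder-a/',   // only keys under this "folder" are returned
}));
console.log((resp.Contents ?? []).map(o => o.Key));
```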
-
Export pipelines. Format.
The default format for exporting pipelines is JSON. Since YML is what the config files use, IMO exporting to that format should also be possible.
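As a stopgap while export remains JSON-only, the exported file can be converted to YAML outside Cribl. A minimal sketch assuming Node.js and the js-yaml package; file names are illustrative:

```javascript
// Convert an exported pipeline from JSON to YAML (workaround sketch).
import fs from 'node:fs';
import yaml from 'js-yaml';

const pipeline = JSON.parse(fs.readFileSync('my_pipeline.json', 'utf8'));
fs.writeFileSync('my_pipeline.yml', yaml.dump(pipeline));
```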
-
Cribl Stream logging
IMO, Cribl logging needs improvement. While errors are visible in the logs, it is hard to find which pipeline or function is causing them. My suggestion is to add information to each ERROR line about the pipeline and function related to the error.
-
What's the coolest thing you've done with a single function?
You know how as you get better at Cribl Pipelines, you refactor as you learn better ways to do stuff? What's the coolest thing you've figured out how to do? Include a screenshot if you can.
-
Is there any way to force a pipeline to ONLY run on a single worker?
I think I know the answer to this, but I need to ask anyway. I have an aggregation that MUST run over the entire event stream. Is there any way to force a pipeline to ONLY run on a single worker?
-
Access Splunk UF meta data
Splunk UF internal logs are picked up by a passthru pipeline in Cribl, based on index.startsWith('_') as the route filter. That works fine. The problem: I lose all meta information about the Splunk UFs, like version and OS. Can this be prevented somehow? I just see all the Cribl workers and some machines (HF) that are…
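For reference, the route filter mentioned above can be written defensively so events that lack an index field simply don't match; this is only a sketch of the filter expression, not a fix for the lost UF metadata:

```javascript
// Route filter expression (JavaScript), guarded against a missing index field.
typeof index === 'string' && index.startsWith('_')
```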
-
What is the expected behavior of Cribl if the Redis cluster is unavailable for some time?
Double checking on this: what is the expected behavior of Cribl if the Redis cluster is unavailable for some time? My understanding is that events skip the Redis function (after the default 30 seconds) and process through the rest of the pipeline. Is that how it works?
-
Every firewall event is being cloned
I'm trying to figure out why every event from my pfSense firewall is being cloned as it passes through the pipeline. When I look at Preview Full from the Pipeline, I see the first event in my sample data file being processed as desired through the pipeline, and then the second event is an unprocessed clone of the first. The…
-
Import this field list and throw away all fields in the data that are not contained in the list
Hi everyone, I have a use case at the moment where I have a lookup that contains a column of "allowed fields" for various sourcetypes; that is, the value for a given sourcetype is a comma-separated list of field names. At the end of my pipeline, I basically want to import this field list and throw away all fields in the data that…
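A rough sketch of the field-dropping step, written as plain JavaScript rather than as any specific Cribl function; the allowlist object, the example field names, and the choice to always keep _time, _raw, and sourcetype are assumptions for illustration:

```javascript
// Sketch: keep only allowlisted fields for an event's sourcetype.
// `allowlist` stands in for the lookup, e.g. { "cisco:asa": "src,dest,action" }
// (sourcetype and field names are hypothetical).
function keepAllowedFields(event, allowlist) {
  const allowed = (allowlist[event.sourcetype] || '')
    .split(',')
    .map(f => f.trim())
    .filter(Boolean);
  // Core fields kept unconditionally here as an assumption.
  const keep = new Set([...allowed, '_time', '_raw', 'sourcetype']);
  for (const field of Object.keys(event)) {
    // Leave double-underscore (internal) fields untouched as well.
    if (!keep.has(field) && !field.startsWith('__')) {
      delete event[field];
    }
  }
  return event;
}
```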