How can I get the default `@timestamp` on elastic output?
Answers
-
```
{
  "create": {
    "_index": ".ds-interfaces-sensors-ptx-2023.03.22-000001",
    "_id": "5bltTMx3nTh6pWRi",
    "status": 400,
    "error": {
      "type": "mapper_parsing_exception",
      "reason": "failed to parse",
      "caused_by": {
        "type": "illegal_argument_exception",
        "reason": "_id must be unset or set to [0QfN01UBNYDT2wsqAAABhwrKl7A] but was [5bltTMx3nTh6pWRi] because [.ds-interfaces-sensors-ptx-2023.03.22-000001] is in time_series mode"
      }
    }
  }
},
```
0 -
I'm not sure why the worker didn't get those error messages
0 -
That was from a capture on the wire
0 -
Based on the docs:
> For TSDS documents, the document `_id` is a hash of the document's dimensions and `@timestamp`. A TSDS doesn't support custom document `_id` values.

The `_id` currently being set is `random(16)`. However, it is possible to override it by adding `__id` to the event. The logic looks like this: ```const _id = event.__id ?? random(16);```
0 -
It might need to be unset though. Is that possible?
0 -
Unfortunately this happens in formatting of data before it hits the wire.
0 -
`_id` should not be set; it is automatically generated by Elasticsearch on ingest.
0 -
It would only be useful to "update" documents, and it is NEVER used for metrics.
0 -
<@U0410L186KS> Can you try stripping `_id` out using an Elasticsearch ingest pipeline in the interim?
0 -
I could do that.
0 -
`_id` is required in other situations, which is why it's provided by default. Looks like we need an option to omit `_id`. I'll open a ticket.
0 -
When writing to a data stream, disable it.
0 -
You will run into issues with read-only indices, etc...
0 -
It should be a user option to create `_id` in the pipeline.
0 -
From Logstash documentation:
0 -
Removed the `_id` using an ingest pipeline. That works.
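For reference, one way this might be done (the pipeline name `remove_id` matches the query parameter shown later in the thread; the `remove` processor and its options are an assumption, not confirmed from this thread):

```
PUT _ingest/pipeline/remove_id
{
  "description": "Drop the client-supplied _id so Elasticsearch generates its own",
  "processors": [
    { "remove": { "field": "_id", "ignore_missing": true } }
  ]
}
```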
0 -
Cool, glad there is a workaround! Created ticket: `CRIBL-16300` to address this in an upcoming release.
0 -
Argg.. now I got data in the future!
0 -
Only 1 document made it. Something isn't working.
0 -
That's the document I set manually. Not working :slightly_smiling_face:
0 -
There may be a documentation problem regarding "ingest pipeline". I used the "Elastic pipeline" field, but that didn't work; I had to use the "Extra parameters".
0 -
Not working:
0 -
Working:
0 -
I had problems with the pipeline on an older version of Cribl: it wasn't being put in the correct place in the URL that was constructed for the API call.
0 -
After adding the extra parameter, I see this: `POST /_bulk?pipeline=remove_id HTTP/1.1`
0 -
What do you see when it is specified in the Elastic pipeline field?
0 -
Just `_bulk`
0 -
I feel a bug report coming on....
0 -
Was just looking at this with somebody else. The problem is that it needs to be "quoted".
0