A More Dynamic Approach
Microsoft Sentinel and Log Analytics require working with tables that have predefined column names and data types. Writing data into the desired table through the Log Ingestion API also requires a Data Collection Rule (DCR) and a Data Collection Endpoint (DCE).
Imagine a scenario where data is already being sent to Sentinel, but a field needs to be added or a data type changed from ‘integer’ to ‘string’. Making those changes in Cribl Stream is trivial, but Sentinel also needs to be updated to reflect them. That isn’t as straightforward: it requires updating the table schema and then using Cloud Shell to update the existing DCR (or creating an entirely new DCR).
Alternatively, using the dynamic data type in Sentinel provides far more flexibility and resilience as data changes over time. Using it is simple: shape your data into a JSON object in Cribl Stream, then send that object to a table with a dynamic column in Microsoft Sentinel. Here are some examples.
The Setup
First, let's look at a custom table schema where the column is named Event and its type is set to dynamic:
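Below is a minimal sketch of what that schema might look like as a request body for the Azure Monitor Tables API. The table name Cribl_Syslog_CL is hypothetical (custom table names must end in _CL), and only the two columns discussed here are shown:

```json
{
  "properties": {
    "schema": {
      "name": "Cribl_Syslog_CL",
      "columns": [
        { "name": "TimeGenerated", "type": "datetime" },
        { "name": "Event", "type": "dynamic" }
      ]
    }
  }
}
```

TimeGenerated is required in every Log Analytics table, so it appears alongside the single dynamic column.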
Next, here's an example of a DCR. This setup is simple, with only the dynamic type column Event and the required column TimeGenerated:
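A rough sketch of the relevant portions of such a DCR is shown below. The resource IDs, stream name, and destination name are placeholders rather than values from the original setup; the key point is that the stream declaration mirrors the two-column table and the data flow passes records through unchanged:

```json
{
  "properties": {
    "dataCollectionEndpointId": "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Insights/dataCollectionEndpoints/<dce-name>",
    "streamDeclarations": {
      "Custom-Cribl_Syslog_CL": {
        "columns": [
          { "name": "TimeGenerated", "type": "datetime" },
          { "name": "Event", "type": "dynamic" }
        ]
      }
    },
    "destinations": {
      "logAnalytics": [
        {
          "workspaceResourceId": "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.OperationalInsights/workspaces/<workspace-name>",
          "name": "sentinelWorkspace"
        }
      ]
    },
    "dataFlows": [
      {
        "streams": [ "Custom-Cribl_Syslog_CL" ],
        "destinations": [ "sentinelWorkspace" ],
        "transformKql": "source",
        "outputStream": "Custom-Cribl_Syslog_CL"
      }
    ]
  }
}
```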
Here's an example Pipeline for a syslog message that is parsed and serialized to a JSON object called Event. The TimeGenerated field is created from _time and formatted to ISO8601, as required by Sentinel:
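The exact functions depend on the source, but a rough sketch of the Pipeline's final steps might look like the following, assuming the syslog fields have already been parsed into top-level event fields. This is an abbreviated illustration, not an exact Cribl export, and the serialized field list is hypothetical:

```json
{
  "functions": [
    {
      "id": "eval",
      "conf": {
        "add": [
          { "name": "TimeGenerated", "value": "new Date(_time * 1000).toISOString()" }
        ]
      }
    },
    {
      "id": "serialize",
      "conf": {
        "type": "json",
        "dstField": "Event",
        "fields": ["severity", "facility", "host", "appname", "message"]
      }
    }
  ]
}
```

The Eval function converts the epoch-based _time field to an ISO8601 string, and the Serialize function packs the parsed fields into a single JSON object named Event.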
Viewing Logs in Sentinel
Once data reaches Sentinel, this is how the logs appear when using the dynamic data type:
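As a purely illustrative example (the field names inside Event depend entirely on how the message was parsed upstream), a single record might render like this:

```json
{
  "TimeGenerated": "2023-04-18T16:20:02.000Z",
  "Event": {
    "severity": "info",
    "facility": "auth",
    "host": "fw01.example.com",
    "appname": "sshd",
    "message": "Accepted publickey for admin from 10.0.0.5 port 52144 ssh2"
  }
}
```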
The JSON object can be easily referenced with KQL (Kusto Query Language):
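For example, assuming the hypothetical table and field names used above, nested values inside the dynamic Event column can be addressed with dot notation and cast with tostring():

```kusto
Cribl_Syslog_CL
| where TimeGenerated > ago(1h)
| where tostring(Event.appname) == "sshd"
| project TimeGenerated, Host = tostring(Event.host), Message = tostring(Event.message)
| order by TimeGenerated desc
```

If a new field is added upstream later, it is queryable immediately (for example, Event.newField) without touching the table schema or the DCR.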
Summary
With this setup, adapting to data changes is seamless. We can update our queries to reference new or changed fields quickly. The dynamic data type approach can help reduce the complexity of managing schema changes and can speed up data processing and updates.
Check out the full blog article here.