Answer by warren for Splunk – Use streamstats with transaction command to find gaps in event logs

Without knowing what your data looks like, we can only guess, but something like the following should work:

index=ndx sourcetype=srctp message=* publicationID=*
| fields - _raw
| fields publicationID message _time
| sort 0 publicationID _time
| eval message=if(match(message,"Request.+publication"),message+tostring(_time),message)
| rex field=message "Request.+publication(?<begin>.+)"
| filldown begin
| stats min(_time) as start max(_time) as end by begin publicationID
| eval gap=end-start
| where gap>3600
| eval start=strftime(start,"%c"), end=strftime(end,"%c")

What this does:

  • a little housekeeping (keep only the fields we need, chuck the rest)
  • [re]sort so results are grouped by publicationID and then ordered by _time within each publication, instead of only by _time (the 0 in sort 0 removes sort’s default 10,000-result limit)
  • add a unique identifier to message if it matches the pattern for being a beginning value (appending _time makes each run’s beginning distinct)
  • extract a custom field (begin)
  • if message doesn’t match the pattern, begin will be null
  • filldown the value of begin (works just like Excel’s filldown functionality)
  • stats-out the start and end times of every individual publicationID event set (a la transaction, but faster and more flexible)
  • filter to keep only sets where there is more than a 1-hour (3600-second) "gap" (perhaps this would be better worded "duration"?)
  • format the start and end times for readability
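To see the filldown-then-stats technique in isolation, here is a self-contained sketch built on makeresults (the format=csv option requires Splunk 9.0 or later); the publicationID, messages, and epoch timestamps are invented purely for illustration:

```
| makeresults format=csv data="time,publicationID,message
1673500000,pub42,Request for publication pub42
1673500600,pub42,processing page 1
1673505700,pub42,processing page 2
1673509000,pub42,Request for publication pub42"
| eval _time=tonumber(time)
| eval message=if(match(message,"Request.+publication"),message+tostring(_time),message)
| rex field=message "Request.+publication(?<begin>.+)"
| filldown begin
| stats min(_time) as start max(_time) as end by begin publicationID
| eval gap=end-start
| where gap>3600
```

The first "Request" event and the two "processing" events inherit the same begin value via filldown, so stats collapses them into one row spanning 5700 seconds, which survives the where filter; the second "Request" starts a new begin value whose single-event set has a gap of 0 and is dropped.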

from User warren – Stack Overflow https://stackoverflow.com/questions/75110504/splunk-use-streamstats-with-transaction-command-to-find-gaps-in-event-logs/75111487#75111487