Without knowing what your data looks like, we can only guess, but something like the following should work:
index=ndx sourcetype=srctp message=* publicationID=*
| fields - _raw
| fields publicationID _time message
| sort 0 publicationID _time
| eval message=if(match(message,"Request.+publication"),message+tostring(_time),message)
| rex field=message "Request.+publication(?<begin>.+)"
| filldown begin
| stats min(_time) as start max(_time) as end by begin publicationID
| eval gap=end-start
| where gap>3600
| eval start=strftime(start,"%c"), end=strftime(end,"%c")
What this does:
- a little housekeeping (keep only the fields we need, chuck the rest)
- [re]sort to ensure we're grouped by publicationID, then by _time, instead of only by _time
- add a unique identifier to message if it matches the pattern for being a beginning value (appending _time makes every begin marker distinct, so back-to-back runs for the same publicationID can't collapse into one group later)
- extract a custom field (begin) - if message doesn't contain a value, begin will be null
- filldown the value of begin (works just like Excel's filldown functionality)
- stats-out the start and end times of every individual publicationID event set (a la transaction, but faster and more flexible; see the comparison sketch after this list)
- filter every line to ensure there is more than a 1 hour (3600 second) "gap" (perhaps this would be better worded "duration"?)
- format the start and end times for readability
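
For comparison, a transaction-based version (an untested sketch, reusing the same made-up index and sourcetype names) would look something like this:

index=ndx sourcetype=srctp message=* publicationID=*
| transaction publicationID startswith=eval(match(message,"Request.+publication"))
| where duration>3600
| eval start=strftime(_time,"%c"), end=strftime(_time+duration,"%c")

transaction emits a duration field and stamps each group with the _time of its first event, but it is subject to memory limits and tends to be much slower than the stats approach on large datasets - which is why stats is the suggested route.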
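
If you want to sanity-check the stats pipeline without touching a real index, you can fabricate a couple of event sets with makeresults (pub-1 and the message text here are invented placeholders; adjust the values and regex to match your actual logs):

| makeresults count=6
| streamstats count as n
| eval _time = _time - (7 - n) * 2000
| eval publicationID="pub-1"
| eval message=if(n==1 OR n==4, "Request for publication pub-1", "processing step")
| fields publicationID _time message
| sort 0 publicationID _time
| eval message=if(match(message,"Request.+publication"),message+tostring(_time),message)
| rex field=message "Request.+publication(?<begin>.+)"
| filldown begin
| stats min(_time) as start max(_time) as end by begin publicationID
| eval gap=end-start
| where gap>3600
| eval start=strftime(start,"%c"), end=strftime(end,"%c")

Each synthetic set spans 4000 seconds, so both result rows should clear the 3600-second filter.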
from User warren – Stack Overflow https://stackoverflow.com/questions/75110504/splunk-use-streamstats-with-transaction-command-to-find-gaps-in-event-logs/75111487#75111487