What are you trying to achieve? If you’re sending "unique" events to the HEC, or you’re running UFs on "unique" logs, you’ll never get duplicate "records" when indexing. It sounds like you (perhaps routinely?) resend the same data to your aggregation platform – which is not a problem with the aggregator, but with your sending …
Continue reading Answer by warren for Splunk : Record deduplication using an unique field
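If you do need to suppress repeats at search time rather than fixing the resend, a dedup on the unique field is the usual route. A minimal sketch, assuming placeholder index, sourcetype, and a hypothetical event_id field (none of these names are from the original question):

index=ndx sourcetype=srctp
| dedup event_id
| table _time event_id message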
Answer by warren for How to make a dynamic span for a timechart?
@RichG‘s answer is correct – but doesn’t address your core issue, which is wanting to set a specific span= for any given selected timeframe. If you’re doing this on a Splunk dashboard, you can control a lot about how your search works by using tokens. Create a custom time selector as a dropdown that you …
Continue reading Answer by warren for How to make a dynamic span for a timechart?
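A minimal sketch of that token approach in Simple XML; the span_tok token name, the choices, and the index/sourcetype are assumptions for illustration, not from the original answer:

<input type="dropdown" token="span_tok" searchWhenChanged="true">
  <label>Span</label>
  <choice value="5m">5 minutes</choice>
  <choice value="1h">1 hour</choice>
  <choice value="1d">1 day</choice>
  <default>1h</default>
</input>
...
<query>index=ndx sourcetype=srctp | timechart span=$span_tok$ count</query>

The dropdown sets $span_tok$, and the panel search simply interpolates it into the timechart.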
Answer by warren for Is it possible to call a URL in a Splunk dashboard and get the response as a string?
If I understand you correctly, you want to have the drilldown target be an external link. If that is correct, just put the external URL in the drilldown: If you want some value from an external link to be pulled into Splunk, then you’ll need to set up another mechanism. For example, you might have a scripted …
Continue reading Answer by warren for Is it possible to call a URL in a Splunk dashboard and get the response as a string?
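For the first case, a drilldown to an external URL in Simple XML can look like this minimal sketch (the example.com URL is a placeholder; $click.value$ passes the clicked value along):

<drilldown>
  <link target="_blank">https://example.com/lookup?q=$click.value$</link>
</drilldown>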
Answer by warren for Use sub-second precision on “earliest” in Splunk query
Yes, earliest‘s precision is limited to "standard" Unix epoch time (ie the number of elapsed seconds since the dawn of Unix, arbitrarily set to 01 Jan 1970 00:00:00 UTC (which displays as 31 Dec 1969 in timezones west of UTC)) because the _time field holds whole-number seconds. Splunk knows how to convert timestamps seen with more precision than mere seconds, but …
Continue reading Answer by warren for Use sub-second precision on “earliest” in Splunk query
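One workable pattern, sketched here with made-up epoch values and placeholder names: bound the search with whole-second earliest/latest, then extract the precise timestamp from the raw event and trim with where. The raw-timestamp format and field names are assumptions, not from the original answer:

index=ndx sourcetype=srctp earliest=1600000000 latest=1600000060
| rex field=_raw "^(?<precise_ts>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d+)"
| eval precise_epoch=strptime(precise_ts,"%Y-%m-%dT%H:%M:%S.%N")
| where precise_epoch >= 1600000000.250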
Answer by warren for Avoid using Transaction in splunk queries
Typically, stats is going to be your friend here. However, without seeing sample data or what actual SPL you have tried so far, any answer is mostly going to be speculation 🙂 I’ll happily update this answer if/when you provide such, but here’s a possible start: (index=ndxA sourcetype=srctpA "search log 1" r=*) OR (index=ndxB …
Continue reading Answer by warren for Avoid using Transaction in splunk queries
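To show roughly where a stats-based replacement for transaction can go, here is a hedged continuation using the same placeholder names, treating r as the shared correlation key. This is illustrative only, not the author’s actual completion (the second index clause is assumed):

(index=ndxA sourcetype=srctpA "search log 1" r=*) OR (index=ndxB sourcetype=srctpB "search log 2" r=*)
| stats min(_time) as start max(_time) as end values(sourcetype) as sources count by r
| eval duration=end-start
| eval start=strftime(start,"%c"), end=strftime(end,"%c")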
Answer by warren for Splunk: How to use multiple regular expressions in one query?
I generally try to avoid putting multiple field extracts in a single rex. Instead, I go for sequential ones like this: <search> | rex field=_raw "(?<ip>\d+\.\d+\.\d+\.\d+):" | rex field=_raw "\d+:(?<port>\d+)" | rex field=_raw "\d+:\d+\s+(?<msg>.+)" <more stuff here> In this example, I’m pulling an IP, port, and some message afterwards into three new fields: ip, port, …
Continue reading Answer by warren for Splunk: How to use multiple regular expressions in one query?
Answer by warren for Splunk: Group by certain entry in log file
Add a by clause to your timechart: source="/log/ABCD/cABCDXYZ/xyz.log" doSomeTasks | timechart partial=f span=1h count as "#XYZ doSomeTasks" by taskType | fillnull from User warren – Stack Overflow https://stackoverflow.com/questions/64803118/splunk-group-by-certain-entry-in-log-file/64806445#64806445 via IFTTT
Answer by warren for Using splunk to plot table of key counts extracted from json string field
It appears your JSON has multivalue fields. Try using mvexpand first: index=ndx sourcetype=srctp | mvexpand queryParams | stats count by queryParams | rename queryParams as "Query Param" from User warren – Stack Overflow https://stackoverflow.com/questions/64763178/using-splunk-to-plot-table-of-key-counts-extracted-from-json-string-field/64775415#64775415 via IFTTT
Answer by warren for Extract Values from a field
If you don’t have the ability to modify your props.conf to extract the field correctly, this rex will pull it (presuming it’s at the end of the event): index=ndx sourcetype=srctp | rex field=_raw "HelloSample\=(?<HelloSample>.+)" If your text is somewhere else in the event, we’ll need to know what kind of delimiters exist to refine the …
Continue reading Answer by warren for Extract Values from a field
Answer by warren for SPLUNK : Duplicated json fields on searchHead
In all probability what you’re describing is a multivalue field inside the JSON blob, and not a "duplicated" field. Please share some sample data to verify. Are you seeing something like this in a | table?

FieldA | FieldB
-------+-------
BarA   | FooB
       | FooM
-------+-------
BarB   | FooA
       | FooG

If so, then in …
Continue reading Answer by warren for SPLUNK : Duplicated json fields on searchHead
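If it does turn out to be a multivalue field, a minimal follow-on sketch (FieldA/FieldB are the illustrative names from above; mvexpand splits each FieldB value onto its own row):

index=ndx sourcetype=srctp
| spath
| mvexpand FieldB
| table FieldA FieldB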
Answer by warren for using splunk forwarder and filebeat on the same file
There’s no reason it shouldn’t work, unless the logfile happens to rotate out from underneath you in the middle from User warren – Stack Overflow https://stackoverflow.com/questions/64358248/using-splunk-forwarder-and-filebeat-on-the-same-file/64359836#64359836 via IFTTT
Answer by warren for Splunk: How to get N-most-recent values for each group?
While @RichG’s dedup option may work, here’s one that uses stats and mvindex: index=ndx sourcetype=srctp clientType=* key=* | eval comb=_time+" | "+key | stats values(comb) as comb by clientType | eval mostrecents=mvindex(comb,-N,-1) | fields - comb | mvexpand mostrecents | rex field=mostrecents "(?<timemost>\d+)\s\|\s(?<keymost>.+)" | table clientType timemost keymost | eval timemost=strftime(timemost,"%c") from User warren – …
Continue reading Answer by warren for Splunk: How to get N-most-recent values for each group?
Answer by warren for Splunk: How to get top 2 most recent values for each group
While @RichG’s dedup option may work, here’s one that uses stats: index=ndx sourcetype=srctp clientType=* key=* | eval comb=_time+" | "+key | stats values(comb) as comb by clientType | eval mostrecent=mvindex(comb,-1), secondrecent=mvindex(comb,-2) | rex field=mostrecent "(?<timemost>\d+)\s\|\s(?<keymost>.+)" | rex field=secondrecent "(?<timesecond>\d+)\s\|\s(?<keysecond>.+)" | table clientType timemost keymost timesecond keysecond | eval timemost=strftime(timemost,"%c"), timesecond=strftime(timesecond,"%c") from User warren – Stack …
Continue reading Answer by warren for Splunk: How to get top 2 most recent values for each group
Answer by warren for Epoch time conversion to time in Splunk
Typically, you’d convert from the timestamp (ie epoch time) to something human-readable in your search. Like this: index=ndx sourcetype=srctp earliest=-4h | stats max(_time) as rtime min(_time) as etime by fieldA | sort 0 - rtime + fieldA | eval rtime=strftime(rtime,"%c"), etime=strftime(etime,"%c") | rename rtime as "Most Recent" etime as "Earliest" Splunk strftime docs: https://docs.splunk.com/Documentation/Splunk/8.0.6/SearchReference/DateandTimeFunctions#strftime.28X.2CY.29 Further …
Continue reading Answer by warren for Epoch time conversion to time in Splunk
Answer by warren for SPLUNK enterprise i am trying to calculate results where if > 4% of failure is anomaly?
The logic appears correct, but why multiply by 100? Save yourself a step: | inputlookup sample.csv | eval isananomaly = if((Failcount / Totalcount) > .04 , 1 , 0) from User warren – Stack Overflow https://stackoverflow.com/questions/64249711/splunk-enterprise-i-am-trying-to-calculate-results-where-if-4-of-failure-is-a/64262356#64262356 via IFTTT
Answer by warren for How to do compound query with where clause in Splunk?
Something like this should do it (presuming the fields are properly broken-out already): index=ndx sourcetype=srctp Temperature>80 | eval sorttime=strptime(Time,"%a %B %d %Y %H:%M:%S %Z") | stats values(Time) as Time by sorttime Location Temperature Type | fields - sorttime strptime will convert from your timestamp into epoch time … which allows for better sorting. That step …
Continue reading Answer by warren for How to do compound query with where clause in Splunk?
Answer by warren for Sending out multiple reports as one email in splunk
There sure is – put both reports into a Dashboard, and schedule the Dashboard to be delivered every morning at 8a (or whenever you like). Adding a Report to a Dashboard as a Panel is pretty straightforward: Add as many panels as you desire to the dashboard, then schedule it: from User warren – Stack …
Continue reading Answer by warren for Sending out multiple reports as one email in splunk
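A minimal Simple XML sketch of a dashboard that embeds two existing reports by name (the report names here are placeholders); once saved, the dashboard itself can be scheduled for PDF delivery from the dashboard’s Export / Schedule menu:

<dashboard>
  <label>Morning Reports</label>
  <row>
    <panel>
      <title>First Report</title>
      <table>
        <search ref="First Report"/>
      </table>
    </panel>
    <panel>
      <title>Second Report</title>
      <table>
        <search ref="Second Report"/>
      </table>
    </panel>
  </row>
</dashboard>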
Answer by warren for extract filename out of raw data using regex
Here are two options: First: If you want what’s between the GET and HTTP, this will do it: | rex field=_raw "GET\s+(?<fname>\S+)\s+HTTP" Start at the string literal GET, skip one (or more) whitespace characters, then put everything that’s not a whitespace character (up until a whitespace sequence that ends in the string literal HTTP) into the …
Continue reading Answer by warren for extract filename out of raw data using regex
Answer by warren for Counting by table with splunk – consolidate like fields
Try this: | stats values(COMMAND) as COMMAND by HOST USER from User warren – Stack Overflow https://stackoverflow.com/questions/63945800/counting-by-table-with-splunk-consolidate-like-fields/63958341#63958341 via IFTTT
Answer by warren for What is the recommended way to write to Splunk using Log4J
As @Honky Donkey said, you can set up direct logging to the HTTP Event Collector. However, that’s probably not the best idea – unless you’re also logging to local disk. Why is it not a good idea? Because if you’re only logging to the HEC, you must have Splunk up and running and configured to receive …
Continue reading Answer by warren for What is the recommended way to write to Splunk using Log4J
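A hedged sketch of the disk-plus-forwarder route: a plain Log4j2 rolling file appender on the application side, and a universal forwarder inputs.conf monitor stanza picking that file up. Paths, index, and sourcetype names are placeholders, not anything from the original answer:

<!-- log4j2.xml (application side) -->
<Configuration>
  <Appenders>
    <RollingFile name="AppLog" fileName="/var/log/myapp/app.log"
                 filePattern="/var/log/myapp/app-%d{yyyy-MM-dd}.log.gz">
      <PatternLayout pattern="%d{ISO8601} %-5p [%t] %c - %m%n"/>
      <Policies>
        <TimeBasedTriggeringPolicy/>
      </Policies>
    </RollingFile>
  </Appenders>
  <Loggers>
    <Root level="info">
      <AppenderRef ref="AppLog"/>
    </Root>
  </Loggers>
</Configuration>

# inputs.conf (universal forwarder side)
[monitor:///var/log/myapp/app.log]
index = ndx
sourcetype = srctp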
Answer by warren for Splunk query to retrieve value from json log event and get it in a table
What have you tried already? I suspect this (or similar) will work, presuming Splunk’s identified this data as being in JSON format already: index=ndx sourcetype=srctp properties{}.host=* | rename properties{}.host as hostname | stats count by hostname from User warren – Stack Overflow https://stackoverflow.com/questions/63826857/splunk-query-to-retrieve-value-from-json-log-event-and-get-it-in-a-table/63834378#63834378 via IFTTT
Answer by warren for How do we get/extract log data from splunk
You need to investigate the following: index retention, and (for Smart Store) storage availability. If you have an index set for 500G or 1 year, but you store 50G per day, you’ll rotate at 10 days. If you have an index set for 500G or 1 year, but only have 400G available storage, it will …
Continue reading Answer by warren for How do we get/extract log data from splunk
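For context, the two indexes.conf settings that drive that rotation are the retention window and the total size cap; whichever limit is hit first rolls data to frozen. A sketch with an assumed index name:

# indexes.conf
[ndx]
homePath   = $SPLUNK_DB/ndx/db
coldPath   = $SPLUNK_DB/ndx/colddb
thawedPath = $SPLUNK_DB/ndx/thaweddb
# ~1 year retention
frozenTimePeriodInSecs = 31536000
# ~500GB total size cap
maxTotalDataSizeMB = 512000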
Answer by warren for Splunk: Execute the same query on multiple datasources
The only way Splunk has to connect to a database "itself" is via DB Connect (docs) From Splunk’s perspective, there is no way to connect to 100 databases without having unique connections to each. So far as I know, there is no tool that will connect to more than one database without unique connections – …
Continue reading Answer by warren for Splunk: Execute the same query on multiple datasources
Answer by warren for Regex separate IP:Port from a log
Here’s a regex that will pull all of the ip:port values from a field: | rex field=_raw max_match=0 "(?<ip_port>\d+\.\d+\.\d+\.\d+:\d+)" Now expand the ip_port field: | mvexpand ip_port And then extract from ip_port into ip & port: | rex field=ip_port "(?<ip>\d+\.\d+\.\d+\.\d+):(?<port>\d+)" from User warren – Stack Overflow https://stackoverflow.com/questions/63536430/regex-separate-ipport-from-a-log/63562264#63562264 via IFTTT
Answer by warren for Splunk how to exclude a certain vale from the list if exist
The JSON payload is being treated as a multivalue field, so you need to mvexpand it before filtering out what you want to ignore. Try something like this: index=ndx sourcetype=srctp Stats{}.type=* | rename Stats{}.type as type | mvexpand type | search NOT type="Unknown" | … from User warren – Stack Overflow https://stackoverflow.com/questions/63292320/splunk-how-to-exclude-a-certain-vale-from-the-list-if-exist/63342203#63342203 via IFTTT