fluentd TimeParser Error - Invalid Time Format

WilnarMasonik
New here


I'm trying to get Cisco Meraki MX firewall logs shipped to our Kubernetes cluster using fluentd pods. I'm using the syslog source plugin and the logs are coming in, but I keep getting this error:


2022-06-30 16:30:39 -0700 [error]: #0 invalid input data="<134>1 1656631840.701989724 838071_MT_DFRT urls src=10.202.11.05:39802 dst=138.128.172.11:443 mac=90:YE:F6:23:EB:T0 request: UNKNOWN https://f3wlpabvmdfgjhufgm1xfd6l2rdxr.b3-4-eu-w01.u5ftrg.com/..." error_class=Fluent::TimeParser::TimeParseError error="invalid time format: value = 1 1656631840.701989724 838071_ME_98766, error_class = ArgumentError, error = string doesn't match"

Everything else seems to be fine, but it looks like the Meraki is sending its logs with an epoch timestamp, and fluentd's syslog plugin doesn't like it.

I have a vanilla config:


<source>
  @type syslog
  port 5140
  tag meraki
</source>

Is there a way to transform the time strings into something fluentd will accept? Or what am I missing here?
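One possible approach (a sketch only, not tested against real Meraki output): since the timestamp in the error is epoch seconds with a nanosecond fraction (1656631840.701989724), you can try overriding the RFC 5424 time format in a <parse> block inside the source. The rfc5424_time_format parameter and the %s.%N format string are assumptions that your fluentd version's syslog parser accepts them here:

```
<source>
  @type syslog
  port 5140
  tag meraki
  <parse>
    @type syslog
    message_format rfc5424
    # Assumption: Meraki sends epoch seconds plus a nanosecond
    # fraction, so parse the time as %s.%N instead of the default
    # RFC 5424 timestamp format.
    rfc5424_time_format %s.%N
  </parse>
</source>
```

If that doesn't match, another fallback is to let the record through without time parsing (e.g. keep_time_key) and fix the timestamp downstream with a filter.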

2 Replies
WilnarMasonik
New here

Found this answer:

I work for Blue Medora; BindPlane is our product. This should fix your issue; if not, please let us know and we can help get it configured properly.

Try time_format %Y-%m-%d %H:%M:%S.%L %Z

  1. The lowercase z represents "Time zone as an hour offset from UTC" (e.g. +0400).
  2. The capital Z represents "Time Zone Name", which looks like what you have in your log files.
  3. It also looks like there is a space between the milliseconds and the timezone, so the space should be included in the format as well.

Here's a link to the documentation on the strptime() options that shows the difference between %z and %Z.
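To make the %z vs %Z distinction concrete, here is a small illustration using Python's datetime.strptime (illustrative only: fluentd's time_format follows Ruby-style strptime, where the fractional-seconds directive is %L rather than Python's %f, but the %z/%Z semantics are the same idea). The timestamps below are made-up examples:

```python
from datetime import datetime

# %z matches a numeric UTC offset such as "+0400".
with_offset = datetime.strptime("2022-06-30 16:30:39.123 +0400",
                                "%Y-%m-%d %H:%M:%S.%f %z")
print(with_offset.utcoffset())  # 4:00:00

# %Z matches a time-zone *name* such as "UTC" or "GMT".
with_name = datetime.strptime("2022-06-30 16:30:39.123 UTC",
                              "%Y-%m-%d %H:%M:%S.%f %Z")
print(with_name.hour, with_name.minute)  # 16 30
```

If your logs carry a name like "UTC", %Z is the directive you want; if they carry a numeric offset, use %z.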

Santosh1
New here

Hey WilnarMasonik, 


I'm also getting the same error, and I've added time_format %Y-%m-%d %H:%M:%S.%L %Z to the parse section. Where did you add the time_format, and did you need to do anything else?
