Introduction
This article walks through how to integrate FME Server with Datadog. The example assumes the virtual machine is running on Azure; however, the steps are very similar if you are running FME Server on a VM on another cloud platform or on-premises.
Datadog has several log file integration options:
- Custom log forwarder via TCP or HTTP: forward logs to Datadog over HTTP or TCP.
- Application log collection: configure FME Server at the code level to generate logs and send them directly to Datadog.
- Tail files: use the Datadog Agent to tail FME Server logs on the virtual machine and forward them to Datadog.
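For context on the first option, a minimal custom HTTP forwarder can be sketched in Python. This is only a sketch: it assumes Datadog's v2 HTTP log intake endpoint for the US1 site, and the API key, service, and source values are placeholders you would replace with your own.

```python
import json
import urllib.request

# Datadog v2 log intake endpoint (US1 site; other sites use a different host).
DD_INTAKE = "https://http-intake.logs.datadoghq.com/api/v2/logs"

def build_payload(lines, service="fmeserverlog", source="fmeserver"):
    """Shape raw log lines into the JSON array Datadog's HTTP intake expects."""
    return [{"message": line, "service": service, "ddsource": source} for line in lines]

def send_logs(lines, api_key):
    # POST the payload with the DD-API-KEY header; raises on a non-2xx response.
    req = urllib.request.Request(
        DD_INTAKE,
        data=json.dumps(build_payload(lines)).encode("utf-8"),
        headers={"Content-Type": "application/json", "DD-API-KEY": api_key},
    )
    return urllib.request.urlopen(req)
```

In practice the tail-files approach below is simpler, since the Agent handles batching, retries, and file rotation for you.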
This article will walk you through how to tail FME Server logs and get them into Datadog. There isn't a native integration between FME Server and Datadog, but as you'll see, tailing files works extremely well.
Tailing Logs
- Install the Datadog Agent on your Windows virtual machine.
- Configure the Agent for log collection.
- With the Agent set up and ready to send logs to Datadog, the next step is to configure the FME Server log files you wish to send. The Datadog log collection documentation walks you through this. I also found it useful to visit the Logs section in the Datadog UI: once there, select Server, then Custom Files under Select a Log Source.
This page then walks you through what you need to do. To tail the Core log (fmeserver.log), I used the following YAML:
logs:
  - type: file
    path: "C:\\ProgramData\\Safe Software\\FME Server\\resources\\logs\\core\\current\\fmeserver.log"
    service: "fmeserverlog"
    source: "fmeserver"
Note that Datadog supports wildcards in both file and folder names, so you can pull in the job logs too:
logs:
  - type: file
    path: "C:\\ProgramData\\Safe Software\\FME Server\\resources\\logs\\engine\\current\\jobs\\*\\*.log"
    service: "fmeserverlog"
    source: "fmeserver_jobs"
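To sanity-check which files a wildcard path will pick up, you can expand the pattern locally before handing it to the Agent. This sketch uses Python's glob module, whose `*` (like the pattern above intends) matches within a single path segment; the helper name and default pattern are just illustrations.

```python
import glob

# The same wildcard pattern used in the Agent config above.
JOB_LOG_PATTERN = (
    r"C:\ProgramData\Safe Software\FME Server"
    r"\resources\logs\engine\current\jobs\*\*.log"
)

def matching_job_logs(pattern=JOB_LOG_PATTERN):
    """Return the job log files the wildcard pattern currently matches, sorted."""
    return sorted(glob.glob(pattern))
```

If the list comes back empty, check the install root: a non-default FME Server install location changes the whole prefix.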
- To get the Datadog Agent to pick up the changes, restart the Agent from the command line.
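The restart command itself isn't shown above; on Windows the Agent ships an `agent.exe` with a `restart-service` subcommand. This sketch assumes the default install path, so adjust it if your Agent is installed elsewhere, and run it from an elevated prompt.

```shell
REM Restart the Datadog Agent service so it picks up the new logs config.
"%ProgramFiles%\Datadog\Datadog Agent\bin\agent.exe" restart-service
```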
Next, you need to create a pipeline and add a parser to extract the values in the log lines into attributes. Datadog's pipeline documentation does a great job of walking you through it. For the most basic pipeline, I used two processors: a Grok parser to parse the logs into attributes, and remappers to map the extracted attributes to official Datadog attributes.
Defining the parser rules can be a bit tricky. Here are the rules I came up with for parsing the FME Server Core log and the FME Job logs.
FME Server Core Parser:
parseFMEServerLog %{date("EEE-dd-MMM-yyyy hh:mm:ss.SSS a"):date}\s+%{notSpace:status}\s+%{data:message}
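Outside Datadog, the same extraction can be sketched as a plain regular expression, which is handy for testing the rule against real lines before pasting it into the pipeline. This is a rough Python equivalent of the Grok rule above; the sample line in the usage note is an invented illustration of the Core log format, not output from a real server.

```python
import re

# Rough Python equivalent of the parseFMEServerLog Grok rule:
# a timestamp like "Tue-05-Apr-2022 10:15:30.123 AM", then a status token,
# then the rest of the line as the message.
CORE_LOG = re.compile(
    r"^(?P<date>\w{3}-\d{2}-\w{3}-\d{4} \d{1,2}:\d{2}:\d{2}\.\d{3} [AP]M)\s+"
    r"(?P<status>\S+)\s+"
    r"(?P<message>.*)$"
)

def parse_core_line(line):
    """Return the extracted attributes as a dict, or None if the line doesn't match."""
    m = CORE_LOG.match(line)
    return m.groupdict() if m else None
```

For example, `parse_core_line("Tue-05-Apr-2022 10:15:30.123 AM INFORM FME Server Core started")` yields a dict with `status` of `INFORM` and the trailing text as `message`.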
FME Jobs Parser:
noSpaceAfterStatus %{date("yyyy-MM-dd HH:mm:ss"):date}\|\s+%{number:cpu_time}\|\s+%{number:system_time}\|%{word:status}\|%{data:message}
spaceAfterStatus %{date("yyyy-MM-dd HH:mm:ss"):date}\|\s+%{number:cpu_time}\|\s+%{number:system_time}\|%{word:status}\s*\|%{data:message}
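Equivalently, the two job-log rules can be collapsed into one regular expression by making the whitespace after the status optional. A hedged Python sketch for local testing (the sample line in the test is an invented illustration of the pipe-delimited job log format):

```python
import re

# Covers both the noSpaceAfterStatus and spaceAfterStatus cases via \s*
# after the status token.
JOB_LOG = re.compile(
    r"^(?P<date>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})\|\s*"
    r"(?P<cpu_time>[\d.]+)\|\s*"
    r"(?P<system_time>[\d.]+)\|"
    r"(?P<status>\w+)\s*\|"
    r"(?P<message>.*)$"
)

def parse_job_line(line):
    """Return the extracted attributes as a dict, or None if the line doesn't match."""
    m = JOB_LOG.match(line)
    return m.groupdict() if m else None
```

Running a handful of real job log lines through this before building the Datadog pipeline makes it much easier to spot formatting surprises.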
That's it! You should now see the log files streaming into Datadog, and you can begin analyzing them and setting up alerts.