FME Flow and Azure Monitor Log Integration

Liz Sanderson

Introduction

This article walks through how to integrate FME Flow (formerly FME Server) with Azure Monitor. There are three ways to get custom logs into Azure Monitor:

  1. Application Code - Configure FME Flow at the code level to generate logs and send them directly to Azure Monitor.
  2. Via a REST API - Forward log data to Azure Monitor via HTTP.
  3. Tail log files - Using the Azure Monitor agent, you can tail the FME Flow log files on the virtual machine and forward them to Azure Monitor.

This article walks you through both tailing log files and using the Azure Monitor API in combination with FME Flow System Events to push events into Azure.

 

Tailing Log Files

In this section, we walk through creating an alert against an FME Flow log file and then sending an SMS to a user when one of the FME Flow services is not reporting correctly. The same logic can be used to create alerts against any of the FME Flow log files.
 

Create VM

When creating a VM, you do not need to do anything special. The instructions below are for a Windows virtual machine.
 

Create Log Analytics Workspace

Installing the Log Analytics VM extension allows Azure Monitor to collect data from your Azure VMs. The extension can be installed through the Azure Portal.

  1. Create Workspace
  2. Enable Log Analytics VM Extension


Your VM with FME Flow installed will now be connected to the Log Analytics Workspace you just created. In addition to the Azure portal, you can check directly on the VM which Log Analytics Workspace it is connected to.

On the VM > Control Panel > Microsoft Monitoring Agent > Azure Log Analytics tab.

 


 

Add Custom Logs Data Source

Next, the FME Flow log file(s) that you would like to aggregate need to be configured as custom log data sources. All FME Flow log files can be watched except the FME job logs; these cannot be captured because the folder names inside the Jobs folder are dynamic and wildcards are not currently supported on folders.

To add custom log files, follow the steps here. When defining a custom log there are a few decisions to make; here is an example of setting up a watch on the Process Monitor log file.

  1. Upload log file: In the FME Flow Web UI, find fmeprocessmonitorcore.log (Resources > Logs > core > current) and download it to upload as the sample log.

  2. Set delimiter: Use New Line as the delimiter; the timestamps in the FME logs are not in a format Azure Monitor recognizes, so they cannot be used as the delimiter.

  3. Log collection paths: This will vary based on where you installed FME Flow, e.g. C:\ProgramData\Safe Software\FME Flow\resources\logs\core\current\fmeprocessmonitor*.log (a quick way to check the wildcard is shown below).
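
Before saving the data source, it can be worth confirming that the wildcard actually matches files on the VM. Here is a minimal sketch in Python, run on the VM itself and assuming the default install path shown above; adjust the pattern to your environment.

import glob

# Path assumes a default FME Flow install - adjust to your environment.
pattern = r"C:\ProgramData\Safe Software\FME Flow\resources\logs\core\current\fmeprocessmonitor*.log"

matches = glob.glob(pattern)
if matches:
    for path in matches:
        print("Will be collected:", path)
else:
    print("No files matched - check the install location before saving the data source.")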

 

View Logs in Log Analytics

The Log Analytics interface lets you query the FME Flow logs. First, you need to check if everything is set up correctly.

  1. Access the Log Analytics page for the workspace you have set up.

  2. Under Custom Logs, find the process monitor log and select Preview data.


  3. You should now be able to view data from the log file and it should appear in near real-time.

 

Extracting Data into Custom Fields

When the log data comes into Azure, each log record is stored in a field called RawData. To prepare the logs so custom alerts can be created, the contents of the log file need to be broken out into custom fields. The Azure documentation walks you through creating custom fields.
 

For the fmeprocessmonitorcore.log I created three custom fields:

  1. Timestamp - The timestamp for the log row.

  2. Status - The severity of the log message (INFORM, WARN, ERROR, FATAL)

  3. Message - A description of the log.
     



All the FME Flow log files follow a similar pattern so you should be able to use the same custom fields for all core, engine, and service logs.
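
If you want to experiment with a parsing pattern outside the portal first, the sketch below shows how the three fields line up against a raw record. The sample line and regular expression are only an illustration and are not taken from a real log; check an actual line from your own fmeprocessmonitorcore.log and adjust the pattern accordingly.

import re

# Hypothetical example of a RawData record - the real layout may differ.
sample_raw_data = "Tue-24-Mar-2020 10:05:31 AM   INFORM   FME Engine started"

# Anchor on the severity keyword: everything before it becomes Timestamp,
# everything after it becomes Message.
pattern = re.compile(r"^(?P<Timestamp>.*?)\s+(?P<Status>INFORM|WARN|ERROR|FATAL)\s+(?P<Message>.*)$")

match = pattern.match(sample_raw_data)
if match:
    print(match.group("Timestamp"), match.group("Status"), match.group("Message"), sep=" | ")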
 

Creating Queries

Now that the log data is being extracted into custom fields, we can begin to write queries against those fields that will be used to trigger alerts. Here is an example query that returns messages with an ERROR or FATAL status.
 

fmeserver_CL
| where Status_CF contains "ERROR" or Status_CF contains "FATAL"
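
The same query can also be run from a script rather than the portal. Here is a minimal sketch using the azure-monitor-query and azure-identity Python packages; the workspace ID is a placeholder, and it assumes your identity has read access to the workspace.

from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

WORKSPACE_ID = "<your Log Analytics workspace ID>"  # placeholder

# Same query as above: return ERROR and FATAL rows from the custom log table.
query = 'fmeserver_CL | where Status_CF contains "ERROR" or Status_CF contains "FATAL"'

client = LogsQueryClient(DefaultAzureCredential())
response = client.query_workspace(WORKSPACE_ID, query, timespan=timedelta(hours=1))

for table in response.tables:
    for row in table.rows:
        print(row)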

 

Create Alerts

Once you have written the query that you wish to act on, you can use it to set an alert. The documentation walks you through how to set up an alert.
 

I set up an alert to send me an SMS when there are more than three ERROR or FATAL messages in a five-minute period.

 

Using System Events

System events, in conjunction with FME Flow Automations, can be used to forward messages to Azure Monitor. There are many system events, but only the following are relevant to system health:

  • Warning Logs - A warning was recorded in the fmeserver.log file.
  • Error Logs - An error was recorded in the fmeserver.log file.

The Azure documentation shows you how to use the HTTP Data Collector API to send log data to Azure Monitor from a REST API client. However, the request requires an authorization signature that is computed using the SHA256 algorithm and then Base64 encoded, which makes things a bit tricky.
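
If you would rather see what that involves, here is a minimal sketch of the signature and POST in Python, based on the documented HTTP Data Collector API. The Customer ID, Shared Key, Log-Type, and the fields in the sample record are placeholders rather than values required by FME Flow.

import base64
import datetime
import hashlib
import hmac
import json

import requests

CUSTOMER_ID = "<workspace ID>"   # placeholder - your Log Analytics workspace ID
SHARED_KEY = "<workspace key>"   # placeholder - your workspace key
LOG_TYPE = "fmeserver"           # assumed name; records land in a <LOG_TYPE>_CL table

def build_signature(date, content_length):
    # Authorization signature: HMAC-SHA256 over the request details, Base64 encoded.
    string_to_hash = f"POST\n{content_length}\napplication/json\nx-ms-date:{date}\n/api/logs"
    decoded_key = base64.b64decode(SHARED_KEY)
    hashed = hmac.new(decoded_key, string_to_hash.encode("utf-8"), hashlib.sha256).digest()
    return f"SharedKey {CUSTOMER_ID}:{base64.b64encode(hashed).decode('utf-8')}"

def post_log(records):
    body = json.dumps(records)
    rfc1123_date = datetime.datetime.utcnow().strftime("%a, %d %b %Y %H:%M:%S GMT")
    headers = {
        "Content-Type": "application/json",
        "Authorization": build_signature(rfc1123_date, len(body)),
        "Log-Type": LOG_TYPE,
        "x-ms-date": rfc1123_date,
    }
    url = f"https://{CUSTOMER_ID}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01"
    requests.post(url, data=body, headers=headers).raise_for_status()

# Hypothetical example record forwarded from an FME Flow system event.
post_log([{"Status": "ERROR", "Message": "Error was recorded in fmeserver.log"}])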

A project containing a prebuilt Automation and workspace has therefore been created and published to FME Hub so that you can get up and running in minutes.

Setting up the Automation

  1. Download the automation from FME Hub. Log in to the FME Flow instance you wish to deploy it on and import the Project. Navigate to the Automation; it is called Populate Azure Log Analytics workspace.

  2. By default, the automation is set to respond to the Warning Logs and Error Logs system events. To edit this, click on the System Event received trigger and configure the Events.

  3. Next, we need to set the Azure Workspace properties. Click on the azure_monitor_populate workspace action and set the Customer ID and Shared Key. The Customer ID is the Log Analytics Workspace ID and the Shared Key is the workspace key.

  4. The Automation can now be enabled. This will push data into the Log Analytics Workspace, and you should be able to query it and set alerts as described in the Tailing Log Files section above.

 
