Monitoring Maximo with FME

Liz Sanderson

FME Version

  • FME 2019.x


Maximo can be configured to send notifications when objects are added, deleted or modified. These notifications can be read by an FME workspace and used to update other datasets, or trigger other imports or exports.


Configuring Events in Maximo SaaS

The steps needed to configure outbound events in Maximo SaaS are documented in the IBM document Integration for Maximo Public SaaS. We will run through the process, with illustrations for clarity.


Creating Event Stream

The first step is to set up an IBM Event Stream in your IBM Cloud account. Log in to IBM Cloud, then create a Resource from the Dashboard.


Type 'event streams' into the search bar to find the Event Streams Service.



Select the Event Streams service, then select the desired plan. Lite is sufficient for testing. Click the Create button to create the service.



Once the Service is created, we need to create a Credential to access it, and a Topic to send our messages to. The Lite Plan only allows a single Topic to be created, so it is not sufficient for a working integration.

First we will create the Topic. From the side menu, choose Manage, then click on Topics. Click the Create Topic button to create a new Topic.



Set the Topic Name to MaximoAssetChange, then click the Next button.



Set the desired number of partitions (Lite plan allows only 1).



Set the message retention. Since we will be polling and clearing the messages in a workspace, a one-day retention should be sufficient. Click the Create topic button to finish.



Next, create a Credential to allow access to the Event Stream.

Select Service Credentials from the side menu, then click the New Credential button.



In the Add new credential dialog, leave the default settings and click the Add button.



Once the credential is created, you can view it by clicking the View credentials button. We will need the password and the kafka_brokers_sasl values.



Copy the highlighted portion of the brokers to a text file, then remove the spaces and carriage returns to make it a single line.
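This cleanup can also be scripted. The sketch below is a minimal Python illustration (not part of the FME workflow); the broker hostnames are placeholder values standing in for whatever appears in your own credential.

```python
import json

# Placeholder kafka_brokers_sasl value, formatted the way it appears in the
# Event Streams credential: a JSON array spread over several lines.
credential_snippet = """
[
  "broker-0-example.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093",
  "broker-1-example.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093"
]
"""

# Parse the JSON array, then join the entries into one comma-separated line
# with no spaces or line breaks, as Maximo's Servers field expects.
brokers = json.loads(credential_snippet)
single_line = ",".join(brokers)
print(single_line)
```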



Configuring Maximo to use Event Stream

In Maximo, go to Administration, then choose Integration from the side menu. The Configure IBM Event Streams for IBM Cloud dialog should open automatically. If it does not, click Outbound Events, then click the '>' button and choose Event Streams.



The Configure IBM Event Streams for IBM Cloud dialog requires three pieces of information:



Set the User ID to 'token'. For the Password, use the password from the Event Stream credentials. In Servers, paste the modified server string from your text editor.



Click Save to add the event stream to Maximo.

Now let's create an event to track modifications to an Asset. Click the Add Event button to bring up the Add Event dialog.


Fill in the Event Name and Description with something suitable, e.g. 'Add Asset'.

Click the Magnifying Glass icon next to Topic, then select the MaximoAssetChange topic you created in the IBM Event Streams setup.

Click the Magnifying Glass icon next to Object Structure. This will bring up a dialog listing all the available Maximo API structures. Type asset into the search field, then hit the enter key to search for the Asset APIs.



From this list, choose MXASSET. This will cause all the Asset's properties to be included in the event message.



Click the Add button to save the Event.



Click on the new Event to bring up its properties.



Click on the Activate Event toggle, then toggle data update off and data added on. Click Save to return to Outbound Events.

Now anytime you add an Asset to Maximo, an event will be sent to the IBM Event Streams topic.


Monitoring the Events in FME

To connect to the IBM Event Stream, we need to use the KafkaConnector transformer in a workspace. This transformer is part of a downloadable package, available within Workbench from the Transformer Gallery or Quick Add if you are connected to the internet.



Configure the KafkaConnector with the credentials from the IBM Event Stream using the following settings:



Set Credential Source to Embedded, then set the User Name to token and the Password to the password from the Event Stream credentials.

Set the Security Protocol to SASL SSL, with the SASL mechanism set to Plain and the Certificate blank.

The kafka_brokers_sasl values from the Event Stream credentials need to be broken up into URLs and ports, and added to the Brokers parameter of the transformer.
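As a rough illustration of that split (outside FME), each kafka_brokers_sasl entry is a "host:port" string that separates cleanly on the final colon. The hostnames below are placeholders.

```python
# Placeholder broker entries as they appear in the Event Streams credential.
brokers = [
    "broker-0-example.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093",
    "broker-1-example.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093",
]

# Split each entry into a (host, port) pair, one row per broker, matching
# the shape of the KafkaConnector's Brokers parameter table.
broker_rows = []
for entry in brokers:
    host, _, port = entry.rpartition(":")
    broker_rows.append((host, int(port)))

for host, port in broker_rows:
    print(host, port)
```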

Click the ... button next to Topic to bring up the Select Topics dialog. Select the MaximoAssetChange topic from the list and click OK to accept.



The default Receive Mode of Stream works best in a real-time workspace that never finishes. For testing, it is more useful to set the Receive Mode to Batch, with a Batch Size of 100. In this mode, the transformer will poll the Event Stream for up to 100 events whenever a feature is sent to it.

Set the Consumer Group ID to test. For more information on Kafka Consumer Groups, please see the Apache Kafka documentation.

Leave all the other settings as the defaults.
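For readers familiar with Kafka client libraries, the settings above correspond roughly to the following kafka-python consumer configuration. This is a sketch for orientation, not something the FME workspace needs; the broker hosts and password are placeholders.

```python
# Approximate kafka-python equivalent of the KafkaConnector settings above.
# Broker hostnames and the password are placeholder values.
consumer_config = {
    "bootstrap_servers": [
        "broker-0-example.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093",
        "broker-1-example.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093",
    ],
    "security_protocol": "SASL_SSL",        # Security Protocol: SASL SSL
    "sasl_mechanism": "PLAIN",              # SASL Mechanism: Plain
    "sasl_plain_username": "token",         # User Name: token
    "sasl_plain_password": "<password from the Event Stream credential>",
    "group_id": "test",                     # Consumer Group ID: test
}

# Actually connecting requires the kafka-python package and a reachable broker:
# from kafka import KafkaConsumer
# consumer = KafkaConsumer("MaximoAssetChange", **consumer_config)
# batch = consumer.poll(timeout_ms=5000, max_records=100)  # ~ Batch mode, size 100
```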

The KafkaConnector will output an individual feature for each event it reads, with the following attributes added to them:

_timestamp: The timestamp of the event, formatted as seconds since Jan 1, 1970 (Unix Epoch).

_topic: The topic of the event. This is useful when reading from multiple topics.

_value: The event message. In this case, this is a binary value that needs to be converted to JSON.



The BinaryEncoder transformer converts the event message from binary to Hex, then the TextDecoder decodes the JSON string from Hex. Finally, we use a JSONFormatter to make the JSON more readable.
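The same binary-to-hex-to-JSON chain can be sketched in Python. The sample payload below is invented for illustration; the real message body comes from Maximo, and its attribute names may differ.

```python
import json

# Invented sample payload standing in for the KafkaConnector's _value attribute.
raw_value = b'{"ASSETNUM": "STOVE-1", "DESCRIPTION": "Stove", "STATUS": "OPERATING"}'

# BinaryEncoder step: binary -> hex string.
hex_string = raw_value.hex()

# TextDecoder step: hex string -> text.
decoded_text = bytes.fromhex(hex_string).decode("utf-8")

# JSONFormatter step: pretty-print the JSON for readability.
pretty = json.dumps(json.loads(decoded_text), indent=2)
print(pretty)
```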

The DateTimeConverter is used to convert the timestamp to an ISO datetime.
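The timestamp conversion itself is straightforward. As a sketch with an example epoch value:

```python
from datetime import datetime, timezone

# Example _timestamp value: seconds since Jan 1, 1970 (Unix Epoch).
epoch_seconds = 1577836800

# DateTimeConverter step: epoch seconds -> ISO 8601 datetime (UTC).
iso = datetime.fromtimestamp(epoch_seconds, tz=timezone.utc).isoformat()
print(iso)  # 2020-01-01T00:00:00+00:00
```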

Example of the message produced by adding the Stove asset with the MaximoObjectAdder:

