Creating a POST Endpoint in a Data Virtualization API

Sanae Mendoza


Introduction

Data Virtualization APIs are not limited to delivering data; they can also receive it. POST endpoints allow external users or applications to send data to a service, enabling processes like updates, notifications, or synchronization.

Unlike GET requests, which typically use URL parameters to retrieve data, POST requests include a request body that carries structured data, commonly in JSON or XML format. The body allows the transmission of complex or sensitive information, such as nested records or large datasets, that is not feasible through URL parameters. This data can then be used for downstream processing, such as inserting records into databases or triggering workflows.

Additionally, by separating the data payload from the URL, POST requests improve security and offer greater flexibility in handling diverse data inputs, making them well suited to scenarios that involve creating or modifying data.
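
For example, a client script might call the two request styles like this (a minimal sketch using Python's requests library; the base URL and payload values are illustrative, not part of the EnvironData API):

import requests

BASE_URL = "https://example.com/api"  # hypothetical service

# GET: parameters travel in the URL's query string
requests.get(f"{BASE_URL}/wildfires", params={"severity": "high"})

# POST: structured data travels in the request body as JSON
payload = {
    "reporter": "Jane Doe",
    "contact": "1-555-000-0000",
    "lat": 50.7,
    "long": -120.3,
}
response = requests.post(f"{BASE_URL}/wildfires", json=payload)
print(response.status_code)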

Data Virtualization is currently in technical preview within the 2025.1 beta and should not be used for production. Note that documentation may change rapidly and not reflect the current build. This article was written with FME 2025.1 b25562.

Learning Objectives

After completing this lesson, you’ll be able to:

  • Add an endpoint to a workspace
  • Use templates with POST requests
  • Process request bodies in a workspace
  • Enable caching for API responses

 

Step-by-Step Instructions

In this exercise, the EnvironData Coordination Office has created a community reporting app for submitting wildfire reports. To feed those reports into their systems, they will create a POST /wildfires endpoint in their EnvironData API. Their backend app will use the POST request to collect the data and store it in their internal databases. The process involves:

  • Creating a new POST endpoint in FME Flow
  • Defining a request body schema for validation
  • Authoring a workspace that reads and processes the request body
  • Inserting the validated data into a SQLite database

By the end of this lesson, the system will be able to accept structured wildfire data submissions, validate them, and store them in a consistent and reliable format.

 

1. Create a Schema

Navigate to the Data Virtualization Environmental Impact and Response API in FME Flow. Open the Schemas tab.

Select Create to define a new schema. 

Name the new schema “WildfireRequest”. For the description, enter “Wildfires request body.”

Click “Create” next to Properties. Enter the following values; after each property, click Create again to add the next property.

Name      Type    Required
cause     String  No
severity  String  No
reporter  String  Yes
contact   String  Yes
lat       Number  Yes
long      Number  Yes

 

The “Required” flag carries more weight in a request body schema: when a property is flagged as Required, the client knows that value must be supplied for a POST request to succeed.

 

The WildfireRequest schema is different from the WildfireResponse schema because each serves a different purpose. Request schemas include only the data sent by the client, while response schemas add server-generated info like IDs, timestamps, or status. These extra fields aren't needed in the request.

Also, the request uses "lat" and "long" instead of the response's "location" field. This lets the reporting app collect a simple point, which the workspace can then resolve to a location itself rather than relying on users to describe it correctly.

Because of these differences, it's best to use separate schemas for requests and responses to keep client input and server output clear and organized.
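
In OpenAPI terms, the schema built above amounts to an object with typed properties and a required list. Here is a rough sketch of that structure as a Python dictionary (the exact serialization FME Flow uses internally is an assumption here):

# Approximate OpenAPI-style representation of the WildfireRequest schema
wildfire_request_schema = {
    "type": "object",
    "properties": {
        "cause": {"type": "string"},
        "severity": {"type": "string"},
        "reporter": {"type": "string"},
        "contact": {"type": "string"},
        "lat": {"type": "number"},
        "long": {"type": "number"},
    },
    "required": ["reporter", "contact", "lat", "long"],
}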

When complete, save. 

 

2. Create an Endpoint

Navigate to the Endpoints tab. Select Create to begin defining the POST endpoint.

In the Endpoint Details, enter the following parameters: 

Path         wildfires
Operation    POST
Summary      For collecting wildfire reports
Description  Creates a new wildfire event
Tags         events
Security     Inherit

 

3. Configure Request Body

Since this endpoint is configured to use the POST method, a new tab is available: Request Body. Navigate to the Request Body tab.

A request body in a POST operation contains the data the client sends to the server, typically in JSON or XML format. This data is used by the server to create, update, or process a resource.

Make sure the "Required" option is enabled. While some POST requests may not require a body (for example, if they simply trigger an action, use query parameters instead, return a static response, or the server generates all required data), those scenarios are rarer. In this case, the client must provide wildfire data for the request to be valid and processed correctly.

Finally, configure the remaining request body settings, and select the WildfireRequest schema as the input template.

Required      Yes
Description   Wildfire event data
Content-Type  application/json
Input Type    Use Schema
Schema        WildfireRequest

 

4. Configure the Response

Navigate to the Response tab. 

For Response Type, select Workspace. Leave Inherit API Job Queue and Inherit API Job Expiry Time enabled.

 

To define a new status code, click Add Status Code.

We'll first define a "201 Created" status code, which will be used when a request to the POST /wildfires endpoint successfully creates a new record in the SQLite fires table.

HTTP Status Code  201 - Created
Description       Successful
Content Type      application/json
Input Type        Create Properties

 

Create a new property with the Name “Message” and Type “String”. 

 

Add another status code, this time "400 - Bad Request", which will be used when a request to the POST /wildfires endpoint is missing required properties.

HTTP Status Code  400 - Bad Request
Description       Missing parameters
Content Type      application/json
Input Type        Create Properties

 

Create a new property with the Name “ErrorMessage” and Type “String”. 

 

Leave Asynchronous Processing enabled.

Once configured, select Create to save the endpoint. 

 

5. Assign a Workspace

Navigate to the Workspaces tab. The new POST endpoint is listed, with the Workspace Status as “Unassigned”.

 

Select the POST /wildfires endpoint and expand the Endpoint Actions menu. Select Generate Workspace. Name the workspace “NewWildfires.fmw”.

 

Once assigned, the POST /wildfires Workspace Status will be “Needs Updating”. 

 

6. Download Workspace

In FME Form, open the Data Virtualization menu and select the POST /wildfires workspace.

 

The workspace will open with a new Data Virtualization Reader Type, POST /wildfires.

A POST request introduces additional attributes to the Data Virtualization Reader: 

  • request.body_file_path: The full path to any files uploaded by the client, available for use during the request translation.
  • request.body: The raw data sent from the client to the server in the request body.
  • request.content_type: Specifies the format of the request body (e.g., application/json), allowing FME to correctly parse and handle the incoming data.
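
To make the roles of these attributes concrete, here is a rough Python sketch of how a handler might read a request body from them (illustrative logic only, not FME's internal implementation):

import json

def read_body(request: dict):
    """Choose how to read the payload based on the reader attributes."""
    if request.get("request.content_type") == "application/json":
        # request.body holds the raw text the client sent
        return json.loads(request["request.body"])
    # For uploaded files, the payload is on disk instead
    path = request.get("request.body_file_path")
    if path:
        with open(path, "rb") as f:
            return f.read()
    return None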

 

7. Use a POST Template

On the POST /wildfires Reader, select Run Just This. 

In the Translation Parameter Values window, update the template with sample data: 

{
  "path": "/wildfires",
  "method": "POST",
  "body": [{
    "name": "requestBody",
    "contentType": "application/json",
    "content": {
      "severity": "moderate",
      "contact": "1-555-618-5924",
      "cause": "Dry conditions",
      "reporter": "Michael Johnson",
      "lat": 50.73033,
      "long": -120.33565
    }
  }]
}

 

Inspect the results to review the sample data.

 

The request.body attribute contains our sample values.

 

8. Parse the Body Contents

To turn the sample data into attributes, the request.body must be parsed. 

Add a JSONFlattener to the workspace and connect it to the POST /wildfires feature type. Open the JSONFlattener to configure the parameters. 

In the Source parameters, set the JSON Document value to the “request.body” attribute. Then use the ellipsis button to open the Attributes to Expose window and add all the WildfireRequest schema properties: cause, contact, lat, long, reporter, severity.

OK to close. 

 

9. Check for Required Parameters 

In FME Flow, a few properties in the WildfireRequest schema were marked as required: reporter, contact, lat, and long. The workflow should verify the client has sent values for these parameters before proceeding.

Add an AttributeValidator to the canvas and connect it to the JSONFlattener. 

Open the AttributeValidator to configure its parameters. In Attributes to Validate, select the required attributes: contact, lat, long, and reporter. Set the Validation Rule to “Has a Value”.

OK to close. 
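
In plain Python, the check the AttributeValidator performs here looks roughly like this (a sketch; the function name is hypothetical):

REQUIRED = ["contact", "lat", "long", "reporter"]

def has_required_values(record: dict) -> bool:
    """True only when every required attribute is present and non-empty."""
    return all(record.get(name) not in (None, "") for name in REQUIRED)

# A report missing contact, lat, and long would be routed to the Failed port
print(has_required_values({"reporter": "Michael Johnson"}))  # False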

 

10. Add a Datetime Attribute

Some data attributes, such as unique IDs or timestamps, are often generated automatically by the target system or another downstream component in the data pipeline. In this example, the fires SQLite table includes a submission_date column, which records when a wildfire event is created, as well as identifier fields like fire_id and hazard_id.

To keep the workflow flexible and avoid conflicts with system-assigned values, we will leave the fire_id and hazard_id fields unpopulated, assuming they will be generated later by another system. We’ll also leave the confirmed column, assuming that is a manual process.

However, for the submission_date column, we will use FME to generate a timestamp during the workflow execution. This ensures that each submitted wildfire event is tagged with a consistent and accurate creation time.

 

Add a DateTimeStamper to the workspace and connect it to the Passed port of the AttributeValidator. 

Open the DateTimeStamper. In the parameters, set Include Fractional Seconds to “No” and rename the Result attribute to “submission_date”.

OK to close. 
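
Conceptually, the DateTimeStamper does the equivalent of the following (a sketch; the compact FME datetime format shown is an assumption):

from datetime import datetime

# Whole-second timestamp, mirroring Include Fractional Seconds = "No"
submission_date = datetime.now().strftime("%Y%m%d%H%M%S")
print(submission_date)  # e.g. 20250611093045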

 

11. Geocode the Coordinates (Optional) 

We can turn the collected coordinates into a location using the Geocoder. Add a Geocoder to the workspace and connect it to the DateTimeStamper. 

Use a Geocoder Service of your choice, or use the free OpenStreetMap service for this example (review limitations before using any geocoding service). 

Update the Mode to “Reverse” and set the Latitude and Longitude parameters to the “lat” and “long” attributes, respectively. 

 

OK to close. 

The Geocoder creates a new attribute, _address, that can be used to populate the SQLite fires table’s location column.
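
Outside FME, the same reverse lookup can be sketched against OpenStreetMap's public Nominatim service (a minimal example; review Nominatim's usage policy before automating requests):

import requests

def reverse_geocode(lat: float, lon: float):
    """Reverse-geocode a point with Nominatim and return a display address."""
    response = requests.get(
        "https://nominatim.openstreetmap.org/reverse",
        params={"lat": lat, "lon": lon, "format": "json"},
        headers={"User-Agent": "wildfire-report-demo"},  # required by the usage policy
        timeout=10,
    )
    response.raise_for_status()
    return response.json().get("display_name")

print(reverse_geocode(50.73033, -120.33565))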

 

12. Filter the Attributes

At this point, our sample wildfire data submission can be prepared for database updates and the final API response. 

Add an AttributeManager to the canvas and connect it to the Geocoder. Open the AttributeManager and configure the following Actions:  

  • Remove all Data Virtualization Reader attributes: request.path, request.body_file_path, request.body, request.content_type. 
  • Remove attributes incompatible with the database: _latitude, _longitude, lat, long. 
  • Rename the _address attribute to “location”.
  • Create a new attribute, “confirmed”, and set its value to “false”.

OK to close. 

 

13. Write to the Database

Finally, the client data and generated attributes can be inserted into the database. 

Add a FeatureWriter to the canvas and connect it to the AttributeManager. Configure the FeatureWriter to connect to the SQLite ‘events’ database.

In the Parameters, update the Table Name to ‘fires’. In Table Handling, select “Use Existing”. 

OK to close. 
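
The insert the FeatureWriter performs is equivalent to a plain SQL statement. Here is a minimal sketch with Python's sqlite3 module, assuming the fires table columns described in this lesson:

import sqlite3

record = {
    "cause": "Dry conditions",
    "severity": "moderate",
    "reporter": "Michael Johnson",
    "contact": "1-555-618-5924",
    "location": "Kamloops, British Columbia",  # from the Geocoder
    "confirmed": "false",
    "submission_date": "20250611093045",
}

with sqlite3.connect("events.sqlite") as conn:
    # The with-block commits the transaction on success
    conn.execute(
        "INSERT INTO fires (cause, severity, reporter, contact, location, confirmed, submission_date) "
        "VALUES (:cause, :severity, :reporter, :contact, :location, :confirmed, :submission_date)",
        record,
    )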

 

14. Test for Successful Insert 

Before returning a response, the workflow should verify that the event has been successfully written to the database. 

Add a Tester to the canvas and connect it to the FeatureWriter’s Summary port. Open the Tester to configure its test clause.

Left Value               Operator  Right Value
_total_features_written  =         1

OK to close. 

 

15. Format Successful Response Attributes

Add an AttributeCreator to the canvas and connect it to the Tester’s Passed port. Create the required response attributes for the ‘201 - Created’ status code. 

Output Attribute            Value
response.status_code        201
response.body.content       {"Message": "Report submitted"}
response.body.content_type  application/json

 

OK to close. 

To complete the successful response workflow, connect the AttributeCreator to the Data Virtualization Writer, http_response.

The successful (201) workflow is now complete. 

 

16. Format Failed Response Attributes

Recall that we created a second status code, 400 - Bad Request, for request bodies that are missing required parameters.

Add another AttributeCreator to the canvas. This time, connect it to the Failed port of the AttributeValidator. Open it up to configure the parameters.

Create the required response attributes for the ‘400 - Bad Request’ status code.

Output Attribute            Value
response.status_code        400
response.body.content       {"ErrorMessage": "Missing required parameters"}
response.body.content_type  application/json

 

OK to close. 

Connect the AttributeCreator_2 to the Data Virtualization Writer, http_response. 

The unsuccessful (400) workflow is now complete.

 

17. Test the Workspace

POST requests can also be tested locally with templates. Run the workspace to completion. 

Review the response in the dv_output.json file. A simple 201 response indicates the SQLite database has been updated. 

 

To confirm the update, open the events.sqlite database in the FME Data Inspector (if already open, refresh the data). The new wildfire report is listed.

Save the workspace.

 

18. Publish to FME Flow

Publish the workspace to FME Flow, uploading the Geocoder package if it was used.

 

Confirm that both endpoints’ Workspace Status is now “Assigned”.

 

19. Test the Request from a Client

Find the POST /wildfires endpoint in the EnvironData API Swagger documentation (or another API client). Select Try it Out to test the request. 

Replace the Wildfire event data with the following sample: 

{
  "severity": "high",
  "contact": "+1-555-378-9821",
  "cause": "Campfire left unattended",
  "reporter": "Emma Davis",
  "lat": 50.68035,
  "long": -120.19706
}

 

Execute. 

 

A 201 - Created response confirms that the POST /wildfires endpoint successfully processed the request.
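
The same test can be scripted outside Swagger. Here is a minimal sketch with Python's requests library (the FME Flow host is a placeholder, and any authentication required by the API is omitted):

import requests

report = {
    "severity": "high",
    "contact": "+1-555-378-9821",
    "cause": "Campfire left unattended",
    "reporter": "Emma Davis",
    "lat": 50.68035,
    "long": -120.19706,
}

response = requests.post(
    "https://<your-fme-flow-host>/wildfires",  # placeholder endpoint URL
    json=report,
    timeout=30,
)
print(response.status_code)  # expect 201
print(response.json())       # expect {"Message": "Report submitted"}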
