Defining Data Transformation for 3D Augmented Reality (AR) Models

Evie Lapalme

Introduction

In this section, we will bring all the necessary data into the workflow, define the transformation and add the AR writer. At this point, we won’t generate the actual AR data to use in the field, we just define the rules of how the data will be created when we request it. 

Here is the main idea behind our connected FME Realize app: we generate AR data only when it is requested, and only for the location from which the request was made. This gives field users the most up-to-date, most relevant information on the ground. 

To achieve this, we will add some transformers that do little to the tutorial dataset; however, they are necessary for your real-world data.

This tutorial demonstrates the general approach to creating AR models. However, your own data might look significantly different in terms of structure, source formats and the output requirements. As you follow along, be prepared to adapt the transformation steps to accommodate your specific data characteristics, such as different attribute structures or spatial formats, to ensure your AR models align accurately with your specifications.

 

Requirements

  • FME Form 2025.1+ 
  • FME Flow 2025.1+
  • An iOS mobile device with FME Realize app installed 

 

Step-by-step Instructions

Part 1: Reading Data 

1. Create a Point

Open FME Workbench and start a new workspace. 

Add a Creator transformer to the canvas. In the parameters, set Geometry Source to Geometry Object. Next, click the ellipsis next to the Geometry Object parameter to open the Geometry Creator dialog. 

In the Geometry Creator dialog, create a point with X = 1.5, and Y = 0. Click OK twice. 

This transformer will initiate your transformation, and the point it creates will define your location within the AR model.

2. Add a Bufferer

Next, connect a Bufferer. In the parameters, specify a Buffer Distance of any value above 15 meters. Set the Buffer Distance Units to Ground Units (None) for this dataset. For your own dataset, adjust the units as needed. 

Bufferer.png

 

Keep in mind that the FMEAR format uses meters only, and if your data uses other units, you may need to reproject it before transforming it into 3D.

The buffer will be used as an initiator once we add a FeatureReader. It dictates how much data near your current location is included in the workflow. If you prefer a rectangular Area of Interest (AOI), you can add a BoundingBoxReplacer after the Bufferer.
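To make the idea concrete, here is a minimal pure-Python sketch of what the Creator point plus the Bufferer set up: a user location and a circular Area of Interest (AOI) around it. The coordinates, distance, and function name are illustrative, not part of the actual workspace.

```python
import math

location = (1.5, 0.0)     # the Creator's point (model coordinates, meters)
buffer_distance = 20.0    # any value above 15 m works for this tutorial

def inside_aoi(x, y, centre=location, radius=buffer_distance):
    """True when a position falls within the circular AOI around the user."""
    return math.hypot(x - centre[0], y - centre[1]) <= radius

print(inside_aoi(10.0, 5.0))    # a nearby hydrant -> True
print(inside_aoi(300.0, 40.0))  # a pipe across town -> False
```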

Boundingbox.png

 

3. Add a FeatureReader

The FeatureReader is critical for accessing large datasets, such as city-wide infrastructure databases. Rather than loading the entire dataset, it reads only the subset defined by the AOI created in the Bufferer. This ensures that the AR workflow remains efficient and manageable. By dynamically filtering and clipping data to the AOI, the FeatureReader allows users to explore different areas on demand, eliminating the need to pre-generate AR scenes for every possible location.

 

Add the FeatureReader transformer and connect it to the Bufferer (or BoundingBoxReplacer). In the FeatureReader parameters, set the following:

  • Format: SpatiaLite
  • Dataset: UndergroundInfrastructure.sqlite

Next, click Parameters. In the parameters, expand Schema Attributes, click the ellipsis next to Additional Attributes to Expose, then select fme_feature_type. Click OK to return to the FeatureReader parameters.  

Now we need to select all of the tables by clicking the ellipsis next to Feature Types to Read, then clicking Select All. Click OK. Finally, enable Clip to Initiator Envelope so that the transformer only reads the data inside the AOI.
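Conceptually, Clip to Initiator Envelope means only features whose bounding box overlaps the AOI's envelope enter the workflow. This hedged sketch (envelope tuples and feature names are made up for illustration) shows the idea:

```python
# Envelopes are (xmin, ymin, xmax, ymax) tuples in ground units (meters).
def envelopes_overlap(a, b):
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    return ax1 <= bx2 and bx1 <= ax2 and ay1 <= by2 and by1 <= ay2

aoi_envelope = (-18.5, -20.0, 21.5, 20.0)   # e.g. a 20 m buffer around (1.5, 0)

features = {
    "WaterMain_042":   (-5.0, -1.0, 30.0, 1.0),
    "FireHydrant_007": (10.0,  4.0, 10.6, 4.6),
    "SewerMain_150":   (500.0, 80.0, 620.0, 82.0),   # far outside the AOI
}

read = [name for name, env in features.items()
        if envelopes_overlap(env, aoi_envelope)]
print(read)   # ['WaterMain_042', 'FireHydrant_007']
```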

 

Part 2: Transforming Points 

There are two main methods of transforming points to 3D assets: modifying geometries and replacing them with pre-created 3D models. Let’s try modifying geometry first.

 

Changing Point Geometry

Changing the geometry works well for simple assets, like poles or manhole covers, or when a realistic representation of an asset is not required. For instance, the UtilityPole point feature type in our dataset has attributes called Diameter and Height. We can use them to create a cylindrical representation of a pole with the 2DEllipseReplacer and Extruder.

 

Utility Poles

1. Add a 2DEllipseReplacer

Add a 2DEllipseReplacer to your workspace and connect it to the UtilityPole output port of the FeatureReader. In the transformer parameters, set both Primary Radius and Secondary Radius to:

@Value(Diameter)/2


Note that the FME Realize app needs all the data in meters, so if your data uses other units, you need to reproject or scale your data first.

2DElipse.png

 

2. Add an Extruder

Next, connect an Extruder transformer and set Distance to the Height attribute.

Extruder.png

 

3. Add an AppearanceSetter

Add AppearanceSetter and connect the Geometry input port to the Extruder. In the parameters, expand the Color section and set the following:

  • Appearance Name: Electrical
  • Diffuse Color: 1,0.984314,0 (yellow)
  • Ambient Color: 1,0.576471,0 (orange)
  • Specular Color: 0.921569,0.921569,0.921569 (grey)
  • Shininess: 0.8

If you use lighter tones for the diffuse color, set darker tones for the ambient color, and vice versa. The specular color and shininess control whether the 3D geometry looks matte or polished and metallic. Feel free to experiment with these settings. Keep in mind that FME Data Inspector cannot display shininess, but FME Realize will reflect your settings. See the Styling Objects section below for more details about styling.

4. Run Workspace

Run the workspace with Feature Caching enabled, and then view the output in Visual Preview. Note that you may need to use the Orbit tool to see the utility poles.

 

5. (Optional) Add a Bookmark

Select the 2DEllipseReplacer, Extruder, and AppearanceSetter and create a bookmark around them as a visual aid; there will be many more transformers by the end of the exercise. If you use bookmarks, create one for each object that follows.

 

Water Valve Cover

6. Create the Water Valve Cover

Similarly, create the Water Valve Cover: find the WaterValveCover output port on the FeatureReader and repeat steps 1-3, or duplicate the 2DEllipseReplacer, Extruder, and AppearanceSetter. In the Extruder_2, instead of the Height attribute, which water valve covers do not have, use a small number, for example, 0.05.

A 3DForcer transformer could be included in the workflow to ensure the cover is positioned above the pipe or manhole it protects; try to figure out what elevation you should use for a consistent visualization. Without this step, the pipe may visually pass through the cover, which looks less polished.

 

7. Set Water Valve Cover Appearance

Instead of setting colors as we did with the utility poles, we will use a manhole cover texture. Add a reader to the workspace and set the following:

  • Format: JPEG 
  • Dataset: manholecover.jpg 

Connect the JPEG reader feature type to the AppearanceSetter_2 Appearance input port. 

 

In the AppearanceSetter_2 parameters, expand the Color section and set the following: 

  • Appearance Name: WaterValveCover

Expand Texture Coordinate Generation:

  • Texture Mapping Type: From Top View


Here is the cover after the steps above:

 

Catch Basin

Next, instead of replacing a point with an ellipse, we can also replace it with a rectangle. Let’s create a catch basin using this method. 

 

8. Add a CoordinateExtractor

Connect a CoordinateExtractor to the StormCatchBasin output port on the FeatureReader. In the parameters, set the Mode to Specify Coordinate. This will add the coordinates _x and _y to the passing features. It will also add _z, but that can be ignored. 

 

9. Add a 2DBoxReplacer

Add 2DBoxReplacer and connect it to the CoordinateExtractor. In the parameters, set the following: 

  • Minimum X Value: -0.3
  • Minimum Y Value: -0.2
  • Maximum X Value: 0.3
  • Maximum Y Value: 0.2

 

Replacing Point Geometries with 3D Models

Replacing a point with a 3D model is ideal when you need to represent assets with detailed shapes, such as fire hydrants or water valves. 

In this tutorial, we use the OBJ format for 3D models. While it is an older format, OBJ is reliable, widely supported across many platforms, and processes quickly, making it a solid choice for workflows like this. However, you are not limited to OBJ. Depending on what you already have or what better fits your needs, you can use formats such as Collada, FBX, SketchUp or other supported formats. Each format has its own strengths, so feel free to choose the one that best suits your project requirements and integrates seamlessly into your workflow.

Asset models should be prepared in a separate workflow to optimize performance. While it is possible to integrate this step into the main workspace, doing so would add unnecessary processing time every time the AR (or any) workspace runs. Instead, create a dedicated workspace where you:

  • Use the Scaler transformer to scale the model to meters, accommodating variations in units like millimeters or inches.
  • Use the Rotator transformer to set a default orientation for the model, making it easier to apply additional rotations in the main workflow for alignment with other features, such as adjusting from the default to match a pipe at, for example, 25º.
  • Center the model at the origin (0,0) with the BoundsExtractor and Offsetter, or the custom ModelCenterer transformer. Make sure that critical points (e.g., the vertical axis through the center of a fire hydrant) are aligned with the origin.
  • Apply the VertexNormalGenerator transformer in "Averaged" mode for a smooth appearance.
  • Save the asset models in a format like OBJ for reliability, speed, and broad compatibility, ensuring they are ready for seamless integration into the main workspace.
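The scale-and-center part of that preparation can be sketched in a few lines of plain Python. The vertex list and the millimeter source units are hypothetical; a real model would come from a 3D reader rather than a literal list.

```python
def prepare_model(vertices, unit_to_m=0.001):
    """Scale a list of (x, y, z) vertices to meters and center it on the origin."""
    # Scaler: convert source units (here, mm) to meters, as FMEAR requires.
    scaled = [(x * unit_to_m, y * unit_to_m, z * unit_to_m) for x, y, z in vertices]
    # BoundsExtractor + Offsetter: center on (0, 0) in X/Y, keeping Z so the
    # model still sits on the ground (z = 0 at its base).
    xs = [v[0] for v in scaled]
    ys = [v[1] for v in scaled]
    cx, cy = (min(xs) + max(xs)) / 2, (min(ys) + max(ys)) / 2
    return [(x - cx, y - cy, z) for x, y, z in scaled]

# A toy "hydrant" footprint, 400 x 400 mm, 900 mm tall at one corner.
hydrant = [(0, 0, 0), (400, 0, 0), (400, 400, 0), (0, 400, 900)]
centred = prepare_model(hydrant)
print(centred[0])   # (-0.2, -0.2, 0.0)
```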

 

Fire Hydrant

10. Add a CoordinateExtractor

Add another CoordinateExtractor to the workspace and connect it to the FireHydrant output port on the FeatureReader. In the parameters, set the Mode to Specify Coordinate and then click OK.  These attributes define an insertion point for the fire hydrant model.

 

11. Add a FeatureReader

Add another FeatureReader to the canvas and connect it to the CoordinateExtractor_2. In the parameters, set the following:

  • Format: OBJ
  • Dataset: FireHydrant0.zip
  • Expand Attribute and Geometry Handling:
    • Accumulation Mode: Merge Initiator and Result

The Merge Initiator and Result parameter will pass the insertion point to the model. 

 

12. Add an Offsetter

Add an Offsetter transformer and connect it to the Hydrant0 output port on the FeatureReader_2. Set the X Offset to _x and the Y Offset to _y. This will move the model of the fire hydrant to the insertion point.

 

Water Valve

13. Duplicate the CoordinateExtractor_2

Duplicate or copy and paste the CoordinateExtractor_2. Connect the new CoordinateExtractor_3 to the WaterValve output port on the first FeatureReader. 

 

14. Duplicate the FeatureReader_2

Duplicate or copy and paste the FeatureReader_2. Connect the new FeatureReader_3 to the CoordinateExtractor_3. In the parameters, change the Dataset to WaterValve.zip

 

15. Duplicate the Offsetter_2

Duplicate or copy and paste the Offsetter_2. Connect the new Offsetter_3 to the WaterValve output port on the FeatureReader_3. 

At this point, we have processed all point geometries and can switch to linear features.

For a challenge, create a line at an arbitrary angle going through the water valve point geometry. Try placing the water valve at the point location so that it is oriented along the line. Use NeighborFinder to find the rotation angle and Rotator to actually rotate the water valve. Make sure you rotate the valve around its center.

 

Part 3: Transforming Linear Objects

We will use three methods to transform lines into 3D geometries:

  1. Replacing lines with pipes
  2. Replacing lines with boxes that have rectangular cross-sections
  3. Replacing lines with surfaces (street labels), covered in Part 4

Water Main

FME supports pipe geometry and has a simple way of turning a line into a pipe.

 

1. Add a PipeReplacer

Connect a PipeReplacer to the WaterMain output port on the first FeatureReader. In the parameters, set the Outer Radius to:

@Value(Diameter)/2

(Optional) You can try setting the Inner Radius parameter to make a more realistic-looking pipe. Keep in mind that the inner radius should be smaller than the outer radius. This means it makes sense to calculate the inner radius based on the Diameter attribute:

(@Value(Diameter)/2)*0.9

Adding an inner surface also makes your model larger, so it is not recommended to use the Inner Radius parameter unless showing the interiors of the pipes is necessary.

 

2. Add a VertexNormalGenerator

Now add a VertexNormalGenerator and connect it to the PipeReplacer. Set Mode to Averaged. This will create a smoother look for the pipe.

 

3. Set Appearance

Add another AppearanceSetter to the canvas and connect the Geometry input port to the VertexNormalGenerator. Style the pipe by setting the appearance name, diffuse, ambient, specular colors, and shininess. Feel free to express your creativity and pick colors you like. For this example we used:

  • Appearance Name: Water
  • Diffuse Color: 0,0.588235,1 (blue)
  • Ambient Color: 0,0.992157,1 (light blue)
  • Specular Color: 0.921569,0.921569,0.921569 (grey)
  • Shininess: 0.8 


 

Hydrant Feed and Water Service Connection

4. Duplicate PipeReplacer, VertexNormalGenerator and AppearanceSetter_4

Duplicate PipeReplacer, VertexNormalGenerator and AppearanceSetter_4 and connect the PipeReplacer_2 to the HydrantFeed output port and the WaterServiceConnection output port on the first FeatureReader.

 

Sewer Main and Sewer Service Connection

5. Duplicate Water Main Transformers

Duplicate the PipeReplacer, VertexNormalGenerator and AppearanceSetter_4 again and connect the PipeReplacer_3 to the SewerMain output port and the SewerServiceConnection output port on the first FeatureReader. 

If desired, use a separate AppearanceSetter with unique styling for each feature type. 

 

Storm Main 

6. Duplicate Water Main Transformers Again

Duplicate the PipeReplacer, VertexNormalGenerator and AppearanceSetter_4 for a third time and connect the PipeReplacer_5 to the StormMain output port on the first FeatureReader. If you are using bookmarks, group this workflow into the same bookmark as the catch basin. 

 

Replacing Lines with Box Geometries

Another way to transform linear features into 3D assets is by buffering them. 

 

Electric Cables

7. Add a Bufferer 

Add the Bufferer transformer and connect it to the ElectricCables output port on the first FeatureReader.

In the parameters, set the Buffer Distance to half of the Width attribute: 

@Value(Width)/2

Then set End Cap Style to Square. 

 

8. Add an Extruder

Add an Extruder to the canvas. Connect the Extruder_4 to the Bufferer_2. In the parameters, set the Distance to the Height attribute.

 

9. Add a VertexNormalGenerator

Add a VertexNormalGenerator and connect it to the Extruder_4. In the parameters, set the Mode to Facet. This allows you to see the box's flat faces.

 

10. Set Appearance

Connect the VertexNormalGenerator_4 to the Geometry input port on the first AppearanceSetter to style it the same as the utility poles. 


For a square cross-section, it is possible to use PipeReplacer followed by PipeEvaluator, in which the Number of Interpolated Edges should be set to 4.

 

Part 4: Adding Labels

Labels are extremely useful for annotating real-world objects in Augmented Reality, such as streets, poles, manholes, etc. In this tutorial, we will generate labels for streets using the RoadCenterline feature type.

 

1. Add a Snipper

First, we will keep just a short segment of the original line. Add a Snipper and connect it to the RoadCenterline output port on the first FeatureReader. 

In the parameters, set Starting Location to 9 and Ending Location to 11. This will keep a short 2-meter-long line in the center of our scene.
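As a hedged illustration of what the Snipper does here, assume a straight 20-meter centerline: snipping between locations 9 and 11 keeps the 2-meter piece around its midpoint. The straight-line interpolation below is only a sketch; the Snipper handles arbitrary polylines.

```python
def snip(start, end, length, snip_from, snip_to):
    """Keep the sub-segment between two distances along a straight two-point line."""
    def point_at(d):
        t = d / length   # fraction of the way along the line
        return (start[0] + t * (end[0] - start[0]),
                start[1] + t * (end[1] - start[1]))
    return point_at(snip_from), point_at(snip_to)

# A straight 20 m centerline from (0, 0) to (20, 0); keep metres 9 through 11.
seg_start, seg_end = snip((0.0, 0.0), (20.0, 0.0), 20.0, 9.0, 11.0)
print(seg_start, seg_end)   # roughly (9.0, 0.0) and (11.0, 0.0): a 2 m piece
```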

When you deal with a real street or pipe network, you may need to use the Densifier and Chopper transformers, and maybe LengthCalculator and Tester to avoid labeling short segments.

We actually need two objects facing different directions for a road label. To achieve this, we will use an Orientor, which reverses the direction of the line segment; hence, a surface made from it will face the opposite direction from the original segment. Both the output from the Snipper and the output from the Orientor will go to a 3DForcer transformer. We will raise the lines to 2 meters above the ground and then extrude them (Extruder) by 0.5 meters. As a result, we will have two face geometries, 2 by 0.5 meters, hanging in the air and facing opposite directions.

 

2. Add a 3DForcer

Add an Orientor and connect it to the Snipper. Then connect a 3DForcer to both the Snipper and the Orientor. In the parameters, set Elevation to 2. This will raise the lines 2 meters above the ground.

 

3. Add Another Extruder

Add another Extruder and connect it to the 3DForcer. In the Extruder_5 parameters, set Distance to 0.5. This will create a simple vertical face geometry—2 meters long and 0.5 meters high.

 

4. Create a Texture

Let’s create a texture with our road name. Add another 2DBoxReplacer and connect it to the Snipper. In the 2DBoxReplacer_2 parameters, set the following:

  • Minimum X Value: 0
  • Minimum Y Value: 0
  • Maximum X Value: 2
  • Maximum Y Value: 0.5

 

5. Add a MapnikRasterizer

After the 2DBoxReplacer_2, connect a MapnikRasterizer. In the parameters, create three Input Ports, all named Label, and rename the existing Box port to Label as well. We will use three different symbolizers, but from the outside we will still see only one input port.

Set the first Symbolizer to Polygon. Click the Edit button, set the following:

  • Color: Dark Green

This will create a dark green background for the label. 

Set the second Symbolizer to Line. Click the Edit button, set the following:

  • Color: White
  • Width (Pixels): 4
  • Expand Positioning:
    • Offset (Pixels): -6 

This will create a white frame inside the polygon.

Set the third Symbolizer to Text. Click the Edit button, set the following: 

  • Text: FullName attribute
  • Font: Tahoma
  • Font Size: 68
  • Color: White
  • Expand Positioning:
    • Placement Type: Simple

In the main MapnikRasterizer parameters section, set:

  • Resolution Specification: Cell Spacing
  • Cell Spacing: 0.005
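Assuming Cell Spacing means ground units covered per pixel (an interpretation, not a quote from the MapnikRasterizer documentation), the resulting label image size is simple arithmetic:

```python
# Our label box is 2 x 0.5 ground units (meters); each cell is 0.005 m.
box_width_m, box_height_m, cell_spacing = 2.0, 0.5, 0.005

cols = round(box_width_m / cell_spacing)
rows = round(box_height_m / cell_spacing)
print(cols, rows)   # 400 100 -> a comfortably sized texture for the text
```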

6. Add an AppearanceSetter

Now we will combine our face geometry and the output of the MapnikRasterizer transformer. Add another AppearanceSetter, and connect the MapnikRasterizer Raster output port to the Appearance port and the face geometry from Extruder_5 to the Geometry port.

 

7. View Output

In FME Data Inspector, the label will not be visible until you rotate the scene because we will be looking at it from above.

Keep in mind that MapnikRasterizer is a very powerful and sophisticated transformer; you can create quite complex labels with it. However, it may require some experience to achieve really great results. 

Optional: Add labels to the pipe objects. Use the ID attribute or a combination of several attributes to get a better understanding of how labelling works.

 

Advanced Labelling

For complex AR scenes requiring dozens or even hundreds of labels, an advanced labelling workflow can provide flexibility and consistency. One approach involves using CSS snippets stored in an external styling file, organized by themes. Each theme defines the label's appearance, including font, size, color, and layout. HTML content for all labels is dynamically generated and then rasterized into images using the wkhtmltoimage utility (https://wkhtmltopdf.org/). This cross-platform command-line utility allows running the image generation on desktop computers and in the FME Flow environment.

To streamline the process, a single image is created per theme and clipped into individual labels using pre-generated vectorized frames that match the dimensions of each object's label. This method ensures precise styling and alignment, although it involves careful planning and preparation. While not explained in detail here, this workflow represents a scalable solution for managing large-scale high-quality labelling for AR solutions.

 

Part 5: Summary Annotations

Summary Annotations in the FME Realize app are informative overlays that appear at the bottom-center of the screen while the center of the screen is pointed at an AR asset. They provide key information about specific assets, such as attributes, text descriptions, or even clickable links. Users configure these annotations to include the data they deem most relevant, allowing for a customizable and efficient way to convey critical information without cluttering the AR view. This makes Summary Annotations an essential tool for enhancing the usability and clarity of AR models. 

 

1. Create Summary Attribute

Add an AttributeCreator transformer and connect it to the AppearanceSetter Output port that styles Water Main (AppearanceSetter_4).

In the AttributeCreator, create a new attribute called fmear_feature_summary. Set its value to the following expression:

**@ReplaceRegularExpression(@Value(fme_feature_type),"(.)([A-Z]{1})","\1 \2",caseSensitive=TRUE)**

**ID :** @Value(ID) | **Status :** @Value(Status)


This expression tells FME Realize to show, on the first line, the bolded value of fme_feature_type with spaces inserted in front of non-leading capital letters; on the second line, the bolded word “ID” followed by the value of ID, then a “|” symbol and the bolded word “Status” followed by the value of the Status attribute. 

Water Main
ID : WTND00959 | Status : Operating
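The regular expression above can be reproduced in Python to see exactly what it does to a feature type name (the function name here is illustrative):

```python
import re

def feature_type_to_title(value):
    # Insert a space before every non-leading capital letter,
    # mirroring FME's @ReplaceRegularExpression call above.
    return re.sub(r"(.)([A-Z]{1})", r"\1 \2", value)

print(feature_type_to_title("WaterMain"))               # Water Main
print(feature_type_to_title("WaterServiceConnection"))  # Water Service Connection
```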

 

2. Add a Link to the Summary Attribute

Now let’s add a simple link to this annotation—a URL to www.safe.com. Once a user taps a link in Summary Annotations, a browser opens within the app. Adding links to FME Flow workspaces and Flow apps will allow you to build very powerful integrations where you control functionality and the look of the pages displayed in the webview.
The text of a URL visible to a user goes into square brackets. The URL itself goes into round brackets, that is, [URL name](https://yoururl.com).


Here is our updated summary annotation:

**@ReplaceRegularExpression(@Value(fme_feature_type),"(.)([A-Z]{1})","\1 \2",caseSensitive=TRUE)**

**ID :** @Value(ID) | **Status :** @Value(Status)

[Safe Software](https://www.safe.com)

Water Main
ID : WTND00959 | Status : Operating

Safe Software


3. Create Other Summary Attributes 

Now duplicate the AttributeCreator and add its copies at the end of all feature processing branches. As you can see, you can customize summary annotations for each feature type with different attributes and links. For this example, we will just use the same summary attributes. 

Connect a duplicated AttributeCreator to the following workspace branches:

  • FireHydrant: Offsetter_2
  • CatchBasin: AppearanceSetter_3
  • UtilityPoles: AppearanceSetter
  • RoadLabel: AppearanceSetter_8
  • SewerMain: AppearanceSetter_6
  • All of the Water branches into the first WaterMain AttributeCreator

Other Summary Annotation Examples:
For building AR experiences, you may want to create FME Flow apps or webhooks that accept feature IDs and feature types as parameters. In this case, you will be able to pass a particular feature to other apps you created. For example, you may want to display all attributes of a selected object. The ID will allow you to run a workspace with a FeatureReader using the ID as a constraint for fast data retrieval.

Here is a possible format of such a URL:

[Attributes](https://your_fme_flow_instance.com/fmedatastreaming/Samples/UndergroundInfrastructureAttributes.fmw?ID=@Value(ID)&DATASET=@Value(fme_feature_type)&token=xxxxxxxxxxxxxxxxxxxx)

Object ID: @Value(ObjectId)

@Value(Family)

[Attributes](https://myARinstance/fmedatastreaming/Samples/Bringing_Revit_to_AR_Attributes.fmw?OBJECTID=@Value(ObjectId))

 

**Asset Type:** @ReplaceString(@Value(fme_feature_type),_, ,caseSensitive=TRUE) **ID:** @Value(GIS_ID)

**Status:** @TitleCase(@Value(STATUS))

[Attributes](https://myARinstance/fmedatastreaming/Samples/InfrastructureAttributes.fmw?GISID=@Value(GIS_ID)&FEATURETYPE=Water_Mains) | [Toolbox](https://myARinstance/fmedatastreaming/Samples/InfrastructureToolbox.fmw?ID=@Value(GIS_ID)&DATASET=@Value(fme_feature_type)&BUTTONS=@Value(_buttons)&MAPNAME=@Value(_map_name))

@Value(fmear_feature_summary_fragment)

The summary annotation created in this tutorial will look as follows on an iPad:

 

Part 6: Anchor and Location Parameter

In the FME Realize app, anchors are essential for ensuring AR models are correctly georeferenced and aligned with real-world locations. An anchor combines the geographic coordinates of the user’s location, stored as the attributes fmear_anchor_latitude and fmear_anchor_longitude, with a point geometry in the model’s coordinate system. A point is recognized as an anchor when its fmear_location_feature attribute is set to "anchor". 

Only one anchor per model is allowed. The first point with the fmear_location_feature attribute to enter the writer is used as the anchor, while any subsequent points with the same attribute are discarded. The anchor can be sent to any writer feature type. However, it will not be written to that feature type; instead, it is stored in the FMEAR file’s metadata. 
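The "first anchor wins" rule described above can be illustrated with a small sketch (the feature dictionaries are made up; FME features are not Python dicts):

```python
features = [
    {"id": 1, "fmear_location_feature": None},       # ordinary geometry
    {"id": 2, "fmear_location_feature": "anchor"},   # first anchor: used
    {"id": 3, "fmear_location_feature": "anchor"},   # later anchor: discarded
]

# The writer keeps only the first feature flagged as an anchor.
anchor = next(f for f in features if f.get("fmear_location_feature") == "anchor")
print(anchor["id"])   # 2
```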

On the mobile device, the FME Realize app can populate the geographic location parameter automatically. In order to use it, we need to create a location parameter in our workspace. 

 

1. Create fmear_location_feature Attribute

Connect a new AttributeCreator transformer to the Creator. 

Create an attribute called fmear_location_feature and set the Value to anchor.

Now we have a feature with a point geometry and the format attribute fmear_location_feature.

For placing the model in the real world, it needs to have two coordinate attributes—fmear_anchor_latitude and fmear_anchor_longitude.

 

2. Create Geometry Parameter

Right-click on User Parameters in the Navigator window and select Manage User Parameters. In the Parameter Manager, click the green plus sign in the top-left corner and select Geometry. For the Geometry parameter, set the following:

  • Parameter Identifier: LOCATION
  • Prompt: Select Location of Model
  • Geometry Encoding: GeoJSON
  • Geometry Types: Point (click the ellipsis and deselect everything except Point)
  • Default:
    {
      "coordinates": [-122.82, 49.3],
      "type": "Point"
    }

You can also add a default location for scenarios where you won’t use GPS, but still would like to have a value to run your workspace (for testing purposes, for example). 
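Because the parameter value is GeoJSON text, any client (or a PythonCaller, if you were to add one) can parse it to recover the longitude and latitude. This is only a sketch of reading the default value shown above:

```python
import json

default_location = '{"coordinates": [-122.82, 49.3], "type": "Point"}'

geometry = json.loads(default_location)
lon, lat = geometry["coordinates"]   # GeoJSON order is [longitude, latitude]
print(geometry["type"], lon, lat)    # Point -122.82 49.3
```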

See the Location Parameter in FME Flow article to see how to use this parameter for FME Flow AR apps.

It is possible to skip the anchor creation step entirely, as the FMEAR writer can automatically generate an anchor at the center of the model extents. This method is equally accurate, making it ideal when speed and simplicity are priorities. However, defining an anchor explicitly provides greater control, ensuring it aligns with a specific, meaningful point in the model. This approach is particularly useful when working with models that need precise placement relative to real-world features, such as aligning a hydrant or manhole with its exact geospatial location. Additionally, manual anchor creation offers an opportunity to better understand how anchors work and how they impact AR model placement.

LocationParam.png

 

Part 7: Saving FMEAR files

The transformed data goes to the FMEAR writer. It stores the model and the metadata information such as location, summary annotations, scale, anchor information, etc. in a file with the *.fmear extension. 

 

1. Add a Writer

Add a new writer and select FME Augmented Reality (FME AR) format.

Specify the output location for your file.

In the parameters, set Initial Model Scaling to Full Scale. This allows loading the model at 1:1 scale; that is, the data will be displayed at its real size. Click OK. 

In the Feature Type Parameter dialog, set FME AR Asset Name to FireHydrant, then click OK. 

Connect the new feature type to the AttributeCreator of the Fire Hydrant branch.

 

2. Duplicate Feature Type and Modify Asset Name

Duplicate the feature type several times and attach copies to your other branches—water, sewage, drainage, electrical, and road label. Open the parameters of each and rename the FME AR Asset Name.

 

3. View Output in FME Data Inspector 

In the Writers menu on the top toolbar, redirect your output to FME Data Inspector and run the workspace. You should see a scene similar to the following:

Note the tiny dot between the fire hydrant and the right catch basin. This is our anchor, our spawning point, to use computer gaming terminology. If everything is set up correctly, once the model is visualized around you, you will know your relative position inside the model: somewhere close to the fire hydrant.

 

4. Save the Workspace

Save the workspace as UndergroundUtilities.fmw before publishing it to FME Flow. 

 

5. Publish the Workspace

The publishing instructions and how to create an FME Flow App can be found in the next tutorial: Creating an FME Flow AR App

 

Adapting the Workflow for Large Datasets

While this tutorial uses a small, simulated dataset that easily fits within our clipping boundary, the FME workspace provided with this tutorial has the flexibility to handle larger, city-wide datasets for real-world AR experiences. If users plug in their own large datasets, the workflow can dynamically clip data based on their current GPS location within the AOI. Check the section depicted below in the workspace that comes with this tutorial.

 

Styling Objects

Why Styling is Important

Styling enhances the visual clarity and usability of AR models by allowing you to distinguish between the functions of similar assets. For example, while water and sewage pipes may share similar physical characteristics, you can use different colors or textures to represent their unique roles—blue for water and purple for sewage. Additionally, styling enables customization based on attributes such as age, status, condition, or ownership. For instance, you could use color coding to indicate conditions—green for good, yellow for moderate, and red for poor—making it easy for users to interpret critical information. Proper styling transforms functional models into intuitive and visually engaging AR experiences.

 

Setting Appearance Name

It is important to set appearance names (see the Color section in the AppearanceSetter parameters). You can type a descriptive name or take the value from an attribute, for example, fme_feature_type. Missing appearance names can lead to mixed-up appearances on the features in the AR scene.

 

Using Colors, Shininess, and Alpha

Colors, shininess, and alpha are key parameters for controlling the appearance of 3D objects in augmented reality. Here's how they work:

  • Diffuse Color: Represents the base color of an object under direct lighting, defining its material's overall appearance.
    • A red diffuse color gives the object a vibrant red look in natural light.
    • A darker diffuse color tones down the vibrancy, making the object appear subtler.
  • Ambient Color: Simulates how an object appears under indirect lighting, adding shading to shadowed areas.
    • For bright diffuse colors, use a slightly darker ambient color to soften shadows naturally.
    • For darker diffuse colors, a lighter ambient color can help reduce excessive shading.
  • Specular Color: Controls the color and intensity of reflective highlights.
    • A white specular color creates bright, polished highlights.
    • A colored specular highlight (e.g., blue) can simulate metallic or tinted surfaces.
    • A darker specular color reduces highlight intensity, giving the surface a matte appearance.
  • Shininess: Adjusts the size and sharpness of highlights.
    • A high shininess value produces sharp, small highlights, ideal for reflective surfaces like metal.
    • A low shininess value results in broad, diffused highlights, suitable for materials like plastic or satin.
  • Alpha (Transparency): Controls the opacity of the object, determining how much light passes through it.
    • An alpha value of 1.0 (fully opaque) makes the object block all light, showing no background visibility.
    • A lower alpha value (e.g., 0.5) makes the object partially transparent, allowing some light to pass through.
    • An alpha value close to 0.0 makes the object nearly invisible, useful for representing subtle overlays like glass or water surfaces.

By experimenting with these settings in the AppearanceSetter transformer, you can create realistic materials and fine-tune the appearance of your 3D objects.

 

Using Textures

Textures are images applied to the surface of a 3D object to give it a realistic appearance. They can represent various material properties, such as color, roughness, or reflectivity. Unlike simple color settings, textures add detail and complexity, allowing objects to look like wood, metal, concrete, or other materials. Textures are especially useful for creating visual interest in AR models, making them appear more lifelike and immersive. 

The model of the fire hydrant we added to our workspace earlier already has textures, and that makes the hydrant look realistic.

 

Using External Sources for Styling Features

Attributes controlling the appearance of features can be stored in external sources, such as spreadsheets, text files, or database tables. FME enables you to:

  1. Read Styling Data: Use appropriate readers to import styling attributes.
  2. Merge with Features: Combine styling data with features using transformers like FeatureMerger or FeatureJoiner.
  3. Apply Appearance: Use these imported attributes in the AppearanceSetter transformer to set colors, shininess, and other visual properties.

This approach centralizes styling information and makes it easier to maintain consistent appearances across different datasets or projects.
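The merge step can be sketched as a simple attribute join keyed on fme_feature_type. The styling values below are hypothetical placeholders for rows read from a spreadsheet or database table:

```python
# Styling table, e.g. read from a spreadsheet (values are illustrative).
styles = {
    "WaterMain": {"diffuse": "0,0.588235,1", "shininess": 0.8},
    "SewerMain": {"diffuse": "0.5,0,0.5",    "shininess": 0.6},
}

features = [{"fme_feature_type": "WaterMain", "ID": "WTND00959"}]

# FeatureMerger/FeatureJoiner idea: copy styling attributes onto each feature
# whose fme_feature_type matches a row in the styling table.
for f in features:
    f.update(styles.get(f["fme_feature_type"], {}))

print(features[0]["diffuse"])   # 0,0.588235,1
```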
