
Working with data

Once you have data from external sources in your Flow, the next step is to do something with it.

Pre-process: Transform & Filter

Data seldom arrives in a format that can be used directly, or you may receive different formats from different sources and need to align them before using the data. There may also be irrelevant data, or data arriving at a higher frequency than your use case needs. A large number of modules are available to help with this. Some examples:

  • Property Mapper - This is the Swiss Army knife for changing the structure and naming in your messages. Remove unwanted parts, add new properties, or move any property to a new hierarchy level or a new name.
  • Array Split - Iterate over the elements in an array by splitting it into individual messages. The array can be restored later with the Array Join module.
  • ReportByException - Remove duplicate values, so that only changes are passed on.
  • JSON/XML/CSV - Convert structured data to the internal format used in Flows.
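
These modules are configured in Flow Studio rather than written as code, but the underlying transformations are easy to picture. As a minimal Python sketch of the ReportByException idea (the function, property name and data are invented for illustration, not Crosser APIs), a message is only forwarded when its value differs from the previous one:

```python
def report_by_exception(messages, key="value"):
    """Yield only messages whose value differs from the previous one."""
    last = object()  # sentinel: nothing seen yet
    for msg in messages:
        if msg[key] != last:
            last = msg[key]
            yield msg

readings = [{"value": 20}, {"value": 20}, {"value": 21}, {"value": 21}, {"value": 20}]
changes = list(report_by_exception(readings))
# Only the changes are forwarded: 20, 21, 20
```

This kind of filtering is what keeps downstream systems from being flooded with repeated identical readings.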

You will find many more modules in the Flow Studio Module Library/Transformation category.

Read more about these modules in the Documentation/Analytics Modules Overview →


Validate

The data you get may not be what you expect. Adding validation before taking actions based on the data is often a good idea.

As a first step, check that your input modules succeeded. Each module adds a `success` property to its output message. By adding a message filter on the next module in your Flow that checks that this property is true, you avoid propagating data from an external request that failed. You can also combine this with other checks by simply adding more conditions.
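
As an illustration of the pattern (plain Python, not Crosser's actual filter syntax), a message filter on `success` can be thought of as a conditional pass-through, with extra conditions combined the same way:

```python
messages = [
    {"success": True, "data": {"temp": 21.5}},
    {"success": False, "error": "timeout"},   # failed external request
    {"success": True, "data": {"temp": 22.1}},
]

# Keep only messages from requests that succeeded, combined with a
# second condition (a sanity check that the payload is present).
valid = [m for m in messages if m.get("success") and "data" in m]
```

The failed request is dropped here instead of propagating bad data further into the Flow.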

For more advanced validation you can use the JSON Schema Validation module to verify both the structure and the values of your messages.
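
To get a feel for what schema validation buys you, here is a hand-rolled Python sketch. The real module uses JSON Schema; this simplified checker only handles required keys and expected types, and the schema and property names are invented for illustration:

```python
def validate(message, schema):
    """Minimal stand-in for JSON Schema validation: check that
    required keys exist and that values have the expected type."""
    errors = []
    for key, expected_type in schema.items():
        if key not in message:
            errors.append(f"missing property: {key}")
        elif not isinstance(message[key], expected_type):
            errors.append(f"{key}: expected {expected_type.__name__}")
    return errors

schema = {"machineId": str, "temperature": float}
ok = validate({"machineId": "M1", "temperature": 21.5}, schema)   # no errors
bad = validate({"machineId": "M1", "temperature": "hot"}, schema)  # type error
```

A full JSON Schema can also constrain value ranges, string patterns and nested structures, which is where the real module earns its keep.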

You will find these modules in the Flow Studio Module Library/Process & Analyze / Conditions & Filtering category.

Read more about these modules in the Documentation/Analytics Modules Overview →

Process & Analyze

For basic use cases it may be enough to transform and validate your data and then send the relevant data to your destinations. The real value of Crosser comes into play when you analyze the data and take actions directly.

You can make basic calculations with the Math module, get statistical properties of your time-series data with the Statistics module, or remove noise with the Smooth module. Check the uptime of a machine with the Time Counter module, or the yield of your discrete production line with the Message Counter module.
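
To make the smoothing idea concrete, here is a plain-Python sketch of one common approach, a sliding-window moving average. The actual Smooth module's algorithm and settings may differ; the data here is invented:

```python
from statistics import mean

def moving_average(values, window=3):
    """Smooth a time series by averaging each point with the
    points in the preceding window."""
    return [mean(values[max(0, i - window + 1) : i + 1])
            for i in range(len(values))]

noisy = [20.0, 22.0, 21.0, 35.0, 21.5, 20.5]
smoothed = moving_average(noisy)
# The spike at 35.0 is damped in the smoothed series
```

Averaging trades a little responsiveness for noise rejection; the window size controls that trade-off.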

For more advanced processing you also have the option to implement any algorithm using the Code modules (C#, Python and JavaScript). With the Python Bridge module you can use any third-party library, for example an ML library, so that you can use a trained model to analyze your data.
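
The general shape of such custom code is: the Flow hands your code a message, your code returns an enriched result that continues through the Flow. A sketch of that pattern, where a hand-written scoring function stands in for a trained model (the function names, threshold and message layout are invented for illustration, not the actual Python Bridge interface):

```python
def predict_anomaly(features):
    """Stand-in for a trained model's predict():
    flag readings that deviate far from a baseline."""
    baseline = 21.0
    return abs(features["temperature"] - baseline) > 5.0

def process(msg):
    """Called once per message; returns the message enriched
    with the model's verdict."""
    msg["anomaly"] = predict_anomaly(msg)
    return msg

result = process({"temperature": 30.0})  # flagged as anomalous
```

With a real ML library, `predict_anomaly` would load a trained model once and call its prediction method per message.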

You will find these modules in the Flow Studio Module Library/Process & Analyze / Calculations & Statistics and Process & Analyze / Custom Code & ML categories.

Learn more about these modules in the Documentation/Analytics Modules Overview →