
Deliver results

The last step in our Flow designs is to deliver the results to the destination systems, whether that means data, actions, or both. You have essentially the same set of options as for your input sources; we cover some common ones below.

APIs

Most modern systems offer APIs that can be used to send both data and trigger actions. The most common are HTTP-based REST APIs. Here we can use the HTTP Request module to make POST/PUT requests where the body comes from our Flow messages. To simplify the reuse of APIs, we can also build a Universal Connector to deliver results to REST APIs.
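As a rough illustration of what such an HTTP delivery amounts to, here is a minimal Python sketch that wraps a Flow message as a JSON POST request. The endpoint URL and the message shape are placeholders, and this is not the HTTP Request module's actual implementation:

```python
import json
import urllib.request

def build_delivery_request(message: dict, url: str) -> urllib.request.Request:
    """Wrap a Flow message as a JSON POST request to a REST endpoint."""
    body = json.dumps(message).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Sending is then a one-liner:
# with urllib.request.urlopen(build_delivery_request(msg, url)) as resp:
#     print(resp.status)
```

The same pattern applies for PUT requests; only the `method` argument changes.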

Databases/Data Warehouses

When delivering data, a database is a common destination, and the Insert modules in the database category are the simplest way to store data in any of the common databases. They work in a similar way whether the destination is a SQL database or some other type of database.
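Conceptually, an insert of this kind maps message properties to table columns. A minimal sketch using Python's built-in sqlite3 (the table and property names are made up, and the real Insert modules handle this mapping for you):

```python
import sqlite3

def insert_message(conn: sqlite3.Connection, table: str, message: dict) -> None:
    """Insert one Flow message, using its property names as column names."""
    cols = ", ".join(message.keys())
    placeholders = ", ".join("?" for _ in message)
    conn.execute(
        f"INSERT INTO {table} ({cols}) VALUES ({placeholders})",
        tuple(message.values()),
    )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (sensor TEXT, value REAL)")
insert_message(conn, "readings", {"sensor": "temp1", "value": 21.5})
```

Parameterized placeholders keep the values out of the SQL string, which is what you want for untrusted message content as well.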

When working with data warehouses like Snowflake and Databricks, you can use the Insert modules here as well, to write data directly to the destination tables. This might not be the most efficient approach though, in terms of both performance and cost. A more common approach is to batch data, upload it to a staging area accessible from the warehouse, and then use the corresponding Publisher modules to transfer the batches into the destination tables.
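The batch-and-stage pattern can be sketched as follows. The flush threshold, file naming, and newline-delimited JSON format are illustrative assumptions, not the actual Publisher module behavior:

```python
import json

class Batcher:
    """Collect messages and flush them as newline-delimited JSON files
    that a warehouse load step can pick up from a staging directory."""

    def __init__(self, batch_size: int, staging_dir: str):
        self.batch_size = batch_size
        self.staging_dir = staging_dir
        self.buffer = []
        self.files_written = 0

    def add(self, message: dict) -> None:
        self.buffer.append(message)
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self) -> None:
        """Write the buffered messages to one staging file and clear the buffer."""
        if not self.buffer:
            return
        path = f"{self.staging_dir}/batch_{self.files_written:05d}.ndjson"
        with open(path, "w") as f:
            for msg in self.buffer:
                f.write(json.dumps(msg) + "\n")
        self.files_written += 1
        self.buffer.clear()
```

Writing a few large files instead of many single-row inserts is what makes this cheaper: warehouse bulk-load commands are priced and optimized for exactly this shape of input.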

You find these modules in the Flow Studio Module Library, in the Destinations/Databases category.

Read more about these modules in the Documentation/Analytics Modules Overview →

Notifications

When you have analyzed the data in your Flow and found something relevant, like an anomaly, you may want to send a notification to someone who needs to take action. There are modules in the library to, for example, send Slack or SMS messages, or emails.
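For example, posting to a Slack incoming webhook boils down to an HTTP POST with a small JSON payload. A sketch, where the webhook URL and the anomaly message fields are placeholders and the notification modules handle this for you in practice:

```python
import json
import urllib.request

def format_alert(anomaly: dict) -> str:
    """Turn an anomaly message into human-readable notification text."""
    return (f"Anomaly on {anomaly['machine']}: "
            f"{anomaly['metric']} = {anomaly['value']} "
            f"(threshold {anomaly['threshold']})")

def notify_slack(webhook_url: str, text: str) -> None:
    """POST the text to a Slack incoming webhook."""
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps({"text": text}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req)
```

Keeping the formatting separate from the sending makes it easy to reuse the same alert text for SMS or email channels.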

You find these modules in the Flow Studio Module Library, in the Destinations/Notifications category.

Read more about these modules in the Documentation/Analytics Modules Overview →

Files

Files are still widely used to transfer information between systems. For generic text files you can use the File TextWriter module to write the whole file at once, or the File StreamWriter to append data and start on a new file at a specified frequency. If you want the file to contain structured data, you can use the JSON or XML module to convert your data before writing the file. For CSV data there is a dedicated CSV StreamWriter module that appends message objects as rows in a CSV file and uses property names as column names. There is also a Parquet Writer to create Parquet files.
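The CSV append behavior described above, property names as column names and one message per row, looks roughly like this with Python's csv module (a simplified sketch, not the module's implementation):

```python
import csv
import os

def append_csv(path: str, message: dict) -> None:
    """Append one message as a CSV row, writing a header row for new files."""
    new_file = not os.path.exists(path)
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(message.keys()))
        if new_file:
            writer.writeheader()  # property names become column names
        writer.writerow(message)
```

Note that this simple version assumes every message has the same properties; rows with extra or missing keys would need explicit handling.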

Note that all these modules use local storage, i.e. the destination must be accessible from the Node you run the Flow on. If you installed your Node as a Docker container, the only external directory mapped into the container is the data directory. If you want to use other destination directories, they must be mapped into the container file system. See Node Installation for more information.
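Mapping an extra directory is done with a volume flag when starting the container. A sketch of the idea; the image name and paths below are placeholders, so check the Node Installation documentation for the actual values:

```shell
# Map an extra host directory into the Node container so a Flow
# can write files outside the default data directory.
docker run -d \
  -v /opt/node/data:/data \
  -v /mnt/exports:/exports \
  vendor/flow-node:latest
```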

You can also upload files to an SFTP server using the SFTP Upload module.

These modules are found in the Flow Studio Module Library, in the Destinations/Files category.

Read more about these modules in the Documentation/Analytics Modules Overview →

Machines

Sometimes the result needs to be sent back to the machines, either from northbound systems or for closed loop use cases. An example of the former is order data from an ERP system that is needed to print labels on the delivery packaging. An example of the latter is optimization use cases where machine data is collected and sent through a machine learning model that generates optimized settings that are sent back to the machine, all in a single Flow.

There are corresponding Writer modules for all the common machine interfaces that can be used to write values back to the machines.
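A closed-loop Flow of the kind described above reduces to read, optimize, write back. A toy Python sketch where both the "model" and the machine interface are stand-ins, not a real Writer module or ML model:

```python
def optimize_setting(reading: float) -> float:
    """Stand-in for an ML model: nudge the setpoint toward a target value."""
    target = 100.0
    return reading + 0.5 * (target - reading)

class MockMachine:
    """Stand-in for a Writer module's machine interface (e.g. an OPC UA tag)."""

    def __init__(self, value: float):
        self.value = value
        self.setpoint = None

    def read(self) -> float:
        return self.value

    def write_setpoint(self, setpoint: float) -> None:
        self.setpoint = setpoint

def closed_loop_step(machine: MockMachine) -> None:
    """One Flow iteration: collect data, run the model, write the result back."""
    machine.write_setpoint(optimize_setting(machine.read()))
```

In a real Flow the read and write ends would be the machine connector and Writer modules, with the model in between, all within the same Flow as the text describes.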

You find these modules in the Flow Studio Module Library, in the Destinations/Industrial category.

Read more about these modules in the Documentation/Analytics Modules Overview →