
Article

Solving the IT–OT Integration Puzzle: Lessons Learned from Real Industrial Deployments

IT–OT integration has been discussed for years, but in many enterprise environments it still proves difficult to implement in practice. What we typically see is not a lack of data or technology, but a combination of organizational, technical, and architectural challenges that make progress slower than expected.

In the following sections, we share practical observations and real-world examples that illustrate how IT and OT systems can be connected in a scalable, secure, and pragmatic way. The focus is on approaches that work in complex industrial environments.

Common Challenges in IT–OT Integration

Before looking at solutions, it is useful to step back and consider the recurring issues that teams encounter in industrial environments:

  • Skill gaps between IT and OT teams
    Most organizations still have teams that are either IT-focused or OT-focused. Bringing these disciplines together often creates bottlenecks due to missing overlap in skills and experience.
  • Siloed data across systems
    Relevant data exists on both the OT and IT levels, but it is often scattered across factories, legacy machines, flat files, databases, and enterprise systems. Accessing, connecting, and contextualizing this data is a major challenge.
  • Brownfield complexity
    Many environments include legacy equipment, industrial PCs, multiple protocols, and complex network topologies. In many cases, not everything can be connected from a single location. 
  • Scalability across sites
    What works for site A does not necessarily work for site B or C. Differences in age, equipment, and network design require flexibility without changing the overall integration approach.

Because of these challenges, many organizations are moving away from large, monolithic projects toward incremental, agile initiatives. In practice, this usually means starting small, proving value quickly, and then building on what already exists rather than replacing everything at once.

Crosser’s Role in Solving the IT–OT Puzzle

With these challenges in mind, the question becomes how to approach IT–OT integration without adding even more complexity. Crosser is designed to support exactly this incremental approach. At its core, it is a hybrid integration platform that connects systems across OT and IT environments.

On the OT side, Crosser can:

  • Connect directly to PLCs and industrial equipment
  • Read files generated by legacy systems
  • Integrate with MES, SCADA, and historian systems

On the IT side, Crosser connects to:

  • Cloud platforms and data lakes
  • ERP, CRM, and warehouse management systems
  • Other enterprise applications 

The idea is to act as the intelligent layer between these systems, using a low-code approach that allows people with different skill sets to collaborate on the same integration workflows.

More Than Data Transport: Transformation and Logic

In most real-world projects, integration is not just about sending data from A to B. In fact, in the vast majority of cases, data transformation and normalization are required.

OT protocols often operate on a very low level, dealing with bits and bytes. To make this data usable across the organization, it needs to be:

  • Normalized into a common data model
  • Enriched with additional context
  • Aligned with IT-side data structures
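As a concrete illustration of this normalization step, the sketch below decodes a raw register value as it might arrive from a PLC and wraps it in a common data model with context. The byte layout, field names, and `machineId` identifier are assumptions for illustration, not a specific protocol or Crosser data model.

```python
import struct
from datetime import datetime, timezone

def normalize_plc_reading(raw: bytes, machine_id: str) -> dict:
    """Decode a raw 4-byte big-endian IEEE 754 float from a PLC register
    and wrap it in a common data model enriched with context."""
    value = struct.unpack(">f", raw)[0]  # e.g. a temperature reading
    return {
        "machineId": machine_id,          # context: which asset produced it
        "signal": "temperature",
        "value": round(value, 2),
        "unit": "degC",                   # aligned with IT-side conventions
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

# Example: bytes as they might arrive from a register read
reading = normalize_plc_reading(struct.pack(">f", 72.5), "press-01")
```

The point is less the decoding itself than the output shape: once every source emits the same model, IT-side systems no longer need to know anything about bits and bytes.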

Beyond transformation, many use cases also require logic and actions to be applied directly within the data flow:

  • Triggering work orders based on events
  • Sending notifications via tools like Microsoft Teams
  • Feeding dashboards or analytics systems
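The kind of in-flow logic described above can be sketched as a small decision function: it inspects a normalized message and decides which downstream actions to emit. The field names, threshold, and action types are hypothetical placeholders, not a real API.

```python
def evaluate_event(message: dict) -> list:
    """Apply simple business logic to a normalized OT message and
    decide which downstream actions to trigger."""
    actions = []
    if message.get("signal") == "vibration" and message["value"] > 7.0:
        # threshold breach: raise a maintenance work order in the ERP
        actions.append({"action": "create_work_order",
                        "asset": message["machineId"],
                        "priority": "high"})
        # and notify the responsible team channel
        actions.append({"action": "notify_teams",
                        "text": f"High vibration on {message['machineId']}"})
    return actions

actions = evaluate_event({"signal": "vibration", "value": 9.3,
                          "machineId": "pump-07"})
```

In a real flow, each emitted action would map to a connector (ERP, Microsoft Teams, a dashboard feed) rather than a plain dictionary.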

Crosser supports integration in both directions, from OT systems to IT environments and from IT systems back down to the OT layer. This enables closed-loop scenarios where insights generated in IT systems can directly influence operations on the factory floor.

Platform Architecture: Control Center and Node

The Crosser platform consists of two main components:

Crosser Control Center

The Control Center is the management and orchestration layer:

  • Centralized SaaS application hosted in Microsoft Azure
  • Used to design, deploy, and manage integration flows
  • Provides visibility into system health and flow status

This is where users spend most of their time building and managing integrations.

Crosser Node

The Crosser Node is the runtime environment where all data processing happens:

  • Can run as a Docker container or Windows service
  • Platform-agnostic: supports edge gateways, industrial PCs, virtual machines, Kubernetes, and cloud container services
  • Typically deployed close to the OT environment, often behind the firewall

A key aspect is that all payload data is processed locally in the Node. No operational data is sent to the Control Center. Communication is initiated by the Node via outbound HTTPS, which simplifies security and firewall requirements.

Each Node can run multiple flows in isolated processes. Health, performance, and status information are reported back to the Control Center, making it easier to monitor distributed deployments.
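To make the outbound-only pattern concrete, the sketch below assembles the kind of health report a node might push upstream. Only metadata about flows leaves the node; payload data stays local. The payload fields and the endpoint shown in the comment are illustrative assumptions, not the actual Crosser wire format.

```python
import json
import time

def build_heartbeat(node_id: str, flows: dict) -> bytes:
    """Assemble a health/status report for the management layer.
    Contains flow metadata only, never operational payload data."""
    report = {
        "nodeId": node_id,
        "timestamp": int(time.time()),
        "flows": [{"name": name, "status": status}
                  for name, status in flows.items()],
    }
    return json.dumps(report).encode("utf-8")

payload = build_heartbeat("edge-node-1",
                          {"plc-to-hub": "running", "file-import": "stopped"})
# A real node would push this over outbound HTTPS, e.g. a POST to a
# hypothetical endpoint such as https://controlcenter.example/api/heartbeat,
# so no inbound firewall rules are required.
```

Because the node always initiates the connection, the OT network can stay closed to inbound traffic entirely.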

Designing Integration Flows with Low-Code

Integration flows are designed using a low-code interface called Flow Studio, which is part of the Control Center.

Key concepts include:

  • A library with prebuilt connectors, providing connectivity to over 800 systems
  • Drag-and-drop design for source and destination systems
  • Built-in modules for transformation, filtering, aggregation, and triggering

For advanced requirements, Crosser can be extended in several ways:

  • Custom code in Python, C#, or JavaScript
  • Python-based integration of analytics or AI logic
  • Deployment of trained AI models directly into Flows for edge inference
  • Custom modules built using the publicly available SDK

This flexibility allows teams to start simple and gradually extend functionality as use cases mature.
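As an illustration of the custom-code option, here is a minimal sketch of a stateful module that computes a rolling average over incoming values. The `process(msg)` interface is an assumption for illustration only, not the actual Crosser SDK signature.

```python
class RollingAverageModule:
    """Hypothetical custom flow module: maintains a rolling average
    over the last `window` values of an incoming signal."""

    def __init__(self, window: int = 5):
        self.window = window
        self.values = []

    def process(self, msg: dict) -> dict:
        # keep only the most recent `window` values
        self.values.append(msg["value"])
        self.values = self.values[-self.window:]
        msg["rollingAvg"] = sum(self.values) / len(self.values)
        return msg

mod = RollingAverageModule(window=3)
for v in (10.0, 20.0, 30.0, 40.0):
    out = mod.process({"value": v})
# after four messages the window holds 20, 30, 40
```

The same shape (initialize once, process each message) applies whether the custom logic is a simple aggregation like this or a wrapped AI model.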

Real-World Use Cases

Plant Data Hub in a Global Manufacturing Environment

One customer project in Germany focused on addressing highly siloed OT environments with strict network segmentation. The solution was implemented as a Plant Data Hub:

  • Multiple Crosser Nodes deployed at different network levels
  • Local Flows connecting to data silos and normalizing data
  • Aggregation of normalized data into a central OT-level hub

In a second phase, data was forwarded to Snowflake for analytics. Within eight weeks, seven use cases were implemented. Today, the same concept is used across plants worldwide, with local adjustments where needed.

Key benefits:

  • Fast time to value
  • Easy replication across sites
  • Short learning curve, leading to customer self-sufficiency

Data-driven Maintenance Work Orders Based on OT Events

A global manufacturing customer in the US focused on automating maintenance processes. By combining OT signals with conditions from IT systems, Crosser was used to:

  • Detect specific events on the factory floor
  • Apply business logic within the integration Flow
  • Automatically create work orders in the ERP system

This went far beyond simple data transport, delivering actionable insights and real process automation.

Edge AI in the Chemical Industry

In a more advanced use case from the chemical industry, large volumes of OT data needed to be analyzed by an AI model. Sending all raw data to the cloud would have been prohibitively expensive.

Instead:

  • The AI model was trained in the cloud
  • The trained model was deployed directly into a Crosser Flow
  • Data was normalized and windowed at the edge
  • Inference was performed locally in the Crosser Node

Results were then either stored locally or sent to cloud systems only when specific conditions were detected. This approach significantly reduced cloud costs while enabling real-time insights close to the process.
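The filter-at-the-edge pattern can be sketched as follows. A simple z-score heuristic stands in for the trained model, and only windows flagged as anomalous would be forwarded to the cloud; the threshold and window contents are illustrative assumptions.

```python
from statistics import mean, pstdev

def should_forward(window: list, threshold: float = 2.5) -> bool:
    """Toy stand-in for edge inference: flag a window of sensor values
    as anomalous when the last value deviates strongly from the window
    mean. In the real deployment, a trained AI model running in the
    Node would replace this z-score heuristic."""
    mu, sigma = mean(window), pstdev(window)
    if sigma == 0:
        return False  # flat signal: nothing to report
    return abs(window[-1] - mu) / sigma > threshold

stream = [1.0] * 9 + [9.0]        # spike at the end of the window
forward = should_forward(stream)  # only anomalous windows leave the edge
```

Even this crude filter shows why the economics work: the bulk of raw, uneventful data never leaves the edge, while interesting events still reach the cloud in near real time.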

Final Thoughts

IT–OT integration is not a one-time project or a single tool; it is a continuous process. Success comes from taking small, incremental steps, involving the right skills at the right time, and adapting to local constraints without losing sight of a common architecture.

By combining low-code integration, flexible deployment, and the ability to embed logic and analytics directly into data flows, organizations can steadily move toward meaningful, scalable IT–OT convergence.

Related resources
On-demand technical session: Architecting IT–OT integrations and real-world use cases (available as a video).

About the author

David Nienhaus | Senior Solution Engineer

David is a Senior Solution Engineer at Crosser. He has over 15 years of experience working on software integration and digitization projects for critical infrastructure.
His engineering background gives him the understanding and focus needed to solve customer use cases efficiently and successfully.
