
Article

How Unified Namespace Simplifies Your Data Management Strategy

Unified namespace has become something of an overloaded buzzword: it is used a lot, it shows up in plenty of news and newsletters, and it means slightly different things to different people. So this is our take on what we at Crosser see as contained within the concept of a unified namespace.

Key Characteristics of Unified Namespace:

  1. Standardized Data Model
    A unified namespace means that you have a standardized data model, defined and applied across a wide variety of systems and locations. This model harmonizes data from different systems into a well-defined structure (a sketch of what such a payload could look like follows this list).
  2. Presentation Layer
    The presentation layer is a centralized data hub where data producers can publish data and consumers can access it based on the standardized data model.
  3. Inclusion of Historical Data
    There is debate about whether the presentation layer should present only the current state of a system or if it should also incorporate historical data. This impacts the type of presentation layers that can be used.
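
To make the first characteristic concrete, here is a minimal sketch of what a harmonized payload could look like. The field names (value, unit, timestamp, source) and the Python representation are illustrative assumptions, not a prescribed schema.

```python
# A minimal sketch of a harmonized payload for one data point.
# Field names (value, unit, timestamp, source) are illustrative assumptions,
# not a prescribed schema.
import json
from datetime import datetime, timezone

def to_uns_payload(value, unit, source_system):
    """Wrap a raw reading in the standardized structure, so consumers
    never need to know the source-specific format."""
    return {
        "value": value,
        "unit": unit,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "source": source_system,  # kept as metadata only
    }

# Two different source systems end up producing the same structure:
print(json.dumps(to_uns_payload(72.4, "degC", "plc-line-1")))
print(json.dumps(to_uns_payload(68.9, "degC", "scada-line-2")))
```

The point is that both sources end up producing the same structure, so a consumer only ever needs to understand one format.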

 

Crosser | UNS Overview

Benefits of Using Unified Namespace

  1. Simplifying Application Implementation
    A standardized data model allows applications to use data without needing to know anything about the individual data sources. Applications work with the data based on the standardized model, and don't need to care about where the data originated or what format it had at the original source.
  2. Easier Introduction of New Data Sources
    New data sources can be integrated as soon as the data they produce has been adapted to the model. Once adapted, the data can be published into the presentation layer and made available to any application that needs it. Existing applications can immediately use data from new producers, without changes, since everything follows the standardized data model.
  3. Architectural Decoupling
    Producers and consumers of data are decoupled, so new data sources and applications can be added without affecting existing ones. This simplifies working with data over time, since you will typically keep adding both applications and data sources. This kind of decoupling is a very useful design pattern in many cases.
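
The decoupling can be illustrated with a small, self-contained sketch. The hub class, topic names and payloads below are illustrative assumptions standing in for a real presentation layer such as an MQTT broker.

```python
# A toy, in-process stand-in for a presentation layer (e.g. an MQTT broker),
# only meant to illustrate the decoupling. Topics and payloads are
# illustrative assumptions.
from collections import defaultdict

class UnsHub:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, payload):
        for callback in self._subscribers[topic]:
            callback(topic, payload)

hub = UnsHub()

# A consumer is added without knowing which system will produce the data.
hub.subscribe("site1/area2/line3/temperature",
              lambda topic, payload: print(f"dashboard got {payload} on {topic}"))

# A new producer can be added later without touching the consumer,
# as long as it follows the agreed data model.
hub.publish("site1/area2/line3/temperature", {"value": 71.8, "unit": "degC"})
```

Because the consumer only depends on the topic and the payload model, new producers and additional consumers can be added later without touching existing code.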

Challenges in Defining a Data Model

Defining a data model across a large number of systems is challenging, and it is the main hurdle to getting a unified namespace in place; once the model is defined, the rest is fairly straightforward. Expect an iterative process: start with a few systems and use cases, use that input for the initial design, and then let the model evolve over time.

Components of Data Modeling

  • Naming Conventions
    Consistent naming conventions are necessary to harmonize data from different sources. For example, temperature is always called temperature, regardless of which source it comes from.
  • Contextual Information
    Adding contextual information makes data more useful for applications. In addition to the actual values, you might add metadata such as the unit and the range of values expected from a sensor.
  • Hierarchical Structure
    A hierarchical structure, often based on the ISA95 hierarchy, organizes data points and their relations. You can also express relations horizontally across different hierarchies, such as finding data from similar devices or energy measurements from all machines.
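
The sketch below ties the three components together: a naming convention mapping source-specific tags to one harmonized name, contextual metadata (unit and expected range), and an ISA95-style hierarchy expressed as a topic path. All tags, names and ranges are illustrative assumptions.

```python
# A minimal sketch combining the three modeling components: a naming convention
# (source tag -> harmonized name), contextual metadata (unit, expected range)
# and an ISA95-style hierarchy expressed as a topic path.
# All tags, names and ranges are illustrative assumptions.
TAG_MAP = {
    # raw source tag          harmonized name, unit, expected range
    "PLC1.TT_101":            ("temperature", "degC", (0, 120)),
    "scada.line2.tempSensA":  ("temperature", "degC", (0, 120)),
}

def model_datapoint(raw_tag, raw_value, enterprise, site, area, line):
    name, unit, (low, high) = TAG_MAP[raw_tag]
    topic = f"{enterprise}/{site}/{area}/{line}/{name}"
    payload = {
        "value": raw_value,
        "unit": unit,
        "range": {"min": low, "max": high},
        "source_tag": raw_tag,  # kept for traceability
    }
    return topic, payload

print(model_datapoint("PLC1.TT_101", 72.4, "acme", "site1", "packaging", "line3"))
```

Both raw tags resolve to the same harmonized name, while the hierarchy is carried by the topic path rather than by the source system.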

Options for Presentation Layers

  • Message Broker (e.g., MQTT; a publishing sketch follows this list)
    • Supports live state data and event-driven setup.
    • Scalable but lacks historical data and multiple hierarchies.
    • Topic granularity must be balanced against the frequency of updates that applications can deal with.
    • Fixed structure through topics, limiting alternative data views.
  • Database
    • Supports live and historical data, flexible queries.
    • Adds latency and complexity in defining data models.
    • Allows different views of data through queries.
    • Data model and relations are not as explicit as in message brokers.
  • OPC UA
    • Comprehensive features for live and historical data, but complex and not scalable.
    • Supports both subscription and pulling of data.
    • Complexity in accessing data and setting up servers.
    • Horizontal scaling is challenging.
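
As a concrete example of the message broker option above, the sketch below publishes a current-state value as a retained MQTT message, so late-joining consumers immediately receive the latest value per topic. It assumes the paho-mqtt client library (version 2.x) and a broker on localhost; neither is mentioned in the article, and the topic and payload are illustrative.

```python
# A minimal sketch of using an MQTT broker as the UNS presentation layer.
# Assumes paho-mqtt >= 2.0 and a broker running on localhost:1883.
import json
import paho.mqtt.client as mqtt

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.connect("localhost", 1883)
client.loop_start()

# retain=True makes the broker keep the latest value on the topic, so the
# namespace always exposes the current state (but no history).
info = client.publish(
    "acme/site1/packaging/line3/temperature",
    json.dumps({"value": 72.4, "unit": "degC"}),
    qos=1,
    retain=True,
)
info.wait_for_publish()

client.loop_stop()
client.disconnect()
```

The retained message gives you the live-state behaviour listed above; anything historical still needs a separate store, which is one reason brokers are often combined with a database.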

Where Crosser Can Help

Crosser can help on both sides of a unified namespace and supports any presentation layer chosen for the implementation. On the producer side, Crosser can prepare data from any type of data source according to the predefined data model and publish it into the chosen presentation layer. On the consumer side, applications can use data from the presentation layer and, in some cases, publish derived data back to it. Examples include calculating OEE values, storing historical data in databases, and running machine learning models and sending the results back for use by other applications connected to the unified namespace.
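
As a rough illustration of the consume, derive and publish-back pattern described above (a generic sketch, not Crosser's own flow engine), the example below subscribes to production data, computes a simplified OEE value and publishes it back into the namespace. Topic names, payload fields and the simplified formula are assumptions, and it again assumes paho-mqtt 2.x with a local broker.

```python
# A generic sketch of consuming from the UNS, deriving a value and publishing
# it back. Not Crosser's flow engine; topics, fields and the simplified OEE
# formula are illustrative assumptions. Assumes paho-mqtt >= 2.0, local broker.
import json
import paho.mqtt.client as mqtt

SOURCE_TOPIC = "acme/site1/packaging/line3/production"
DERIVED_TOPIC = "acme/site1/packaging/line3/oee"

def on_message(client, userdata, msg):
    data = json.loads(msg.payload)
    # Simplified OEE: availability x performance x quality.
    oee = data["availability"] * data["performance"] * data["quality"]
    client.publish(DERIVED_TOPIC, json.dumps({"value": round(oee, 3)}), retain=True)

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.on_message = on_message
client.connect("localhost", 1883)
client.subscribe(SOURCE_TOPIC, qos=1)
client.loop_forever()
```

The derived value lands on its own topic, where any other application connected to the unified namespace can pick it up without knowing how it was produced.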

 

Crosser | UNS Overview with Crosser Flows

Interested in learning more about UNS? Watch the webinar video: How to Simplify your Data Management Strategy with Unified Namespace

 

Curious about the Crosser Platform? Sign up for a Free Trial here or Schedule a private demo with one of our Experts.

About the author

Göran Appelquist (Ph.D) | CTO

Göran has 20 years’ experience leading technology teams. He’s the lead architect of our end-to-end solution and is extremely focused on securing the lowest possible Total Cost of Ownership for our customers.

"Hidden Lifecycle (employee) cost can account for 5-10 times the purchase price of software. Our goal is to offer a solution that automates and removes most of the tasks that is costly over the lifecycle.

My career started in the academic world, where I got a PhD in physics researching large-scale data acquisition systems for physics experiments, such as the LHC at CERN. After leaving academia I have been working in several tech startups in different management positions over the last 20 years.

In most of these positions I have stood with one foot in the R&D team and the other in the product/business teams. My passion is learning new technologies, using them to develop innovative products, and explaining the solutions to end users, technical or non-technical."
