Model, transform,
and distribute information.

Govern how you manage and execute the process of turning data into information, across your entire enterprise.

SEE A DEMO
Learn how Flow works

Information Management

Made Simple

Flow was built from the ground up to make it easy for you to control your data and engineering governance, bring your varied data sources together, and ensure that the information you need is reliably transformed and distributed.

No / Low Code

Avoiding custom scripting and code means your solution can scale.

Platform Agnostic

Avoid vendor lock-in and ensure your data can flow freely.

OT/IT Convergence

Establish central data and engineering governance.

"We unified our data sources, our data treatments, and our data experts. Then we sent context-rich information to our data warehouse."

Start Your Flow Trial

Download the Whitepaper

One Simple Solution, Five Key Steps

Flow's Process For Turning Data Into Information At Scale

Model

Consolidated modeling to abstract and unify multiple underlying namespaces

Connect

Connections to multiple data sources, including OT, IoT, IT, and manually entered data

Transform

Calculation services to clean, transform, contextualize, and combine time-series and transactional data

Visualize

Decision support via browser-based visualization, reporting, dashboarding, and notification

Bridge

Data collection and bridging via industry-standard protocols, including MQTT and REST

For scalability, Flow provides a modeling and configuration environment with an open architecture and templating. Leverage the work you have done in your existing systems while using Flow's self-service no-code/low-code approach.

Model

Define How You Create, Manage

And Distribute Information

An information model defines the key data an organization focuses on and establishes governance for how that data is created, managed, and used.

The model contains the business rules for how data is transformed into useful information and then distributed.

Flexible Structure

Follow standards like ISA-95 or use your own structure. The model is yours to define.

Measures & Events

Identify process and calendar events, and determine how KPI measures are aggregated.

Templatized

Using a template-first approach ensures you can easily scale and manage your solution.
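To make the "Measures & Events" idea concrete, here is a minimal sketch of the kind of aggregation rule an information model might define: raw readings rolled up into hourly KPI buckets. The sample data, function name, and hourly bucket choice are all hypothetical and for illustration only, not Flow's API.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical raw readings: (timestamp, measured value)
readings = [
    (datetime(2024, 1, 1, 8, 15), 10.0),
    (datetime(2024, 1, 1, 8, 45), 14.0),
    (datetime(2024, 1, 1, 9, 10), 12.0),
]

def hourly_average(samples):
    """Group timestamped values by calendar hour and average each bucket."""
    buckets = defaultdict(list)
    for ts, value in samples:
        buckets[ts.replace(minute=0, second=0, microsecond=0)].append(value)
    return {hour: sum(vals) / len(vals) for hour, vals in buckets.items()}

kpis = hourly_average(readings)
# Two buckets: the 08:00 hour averages 12.0, the 09:00 hour averages 12.0
```

In a real model the bucket boundaries would come from calendar or process events (a shift, a batch) rather than a fixed clock hour, but the principle is the same: the model defines the slicing rule once, and every deployed instance inherits it.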

Connect

Unify Your Systems Regardless

of Platform or Technology

Make it possible for your engineers and managers to work with data from across many systems in one centralized location. Every Flow system includes data connectors so you can easily connect to historians, SQL databases, OPC servers, MQTT brokers, cloud solutions, and more.

Enterprise

Cloud applications and databases can connect too!

Site & Legacy

Unite your manufacturing systems in one view.

Open Source

Connect to the latest and greatest open source databases.

Transform

Transform Your Raw Data Into A

Richly Contextualized Narrative

Flow Software includes a suite of powerful engines designed to execute the business rules established in the Information Model, ensuring that data processing is consistent, efficient, and aligned with organizational goals. This integration allows for precise data management and effective decision-making.

Data Engine

Purpose built to address the unique challenges of manufacturing.

Recalc As Needed

When underlying data changes, all calculations and their dependents automatically rerun.

Version History

No values are ever discarded, giving you a full version history of every record.
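The "recalc as needed" behavior can be pictured as a small dependency graph: when an input value is revised, every measure that depends on it, directly or transitively, is recomputed. The sketch below is purely illustrative; the measure names, formulas, and data structures are hypothetical and say nothing about Flow's internal design.

```python
# Hypothetical raw inputs and derived measures.
values = {"raw_flow": 100.0, "raw_density": 0.8}

# Each derived measure lists its inputs and a formula.
derived = {
    "mass_flow": (["raw_flow", "raw_density"], lambda f, d: f * d),
    "mass_total": (["mass_flow"], lambda m: m * 24),  # e.g. a per-day total
}

def recalc(changed):
    """Rerun every derived measure that depends, directly or
    transitively, on a changed input."""
    dirty = set(changed)
    progressed = True
    while progressed:
        progressed = False
        for name, (inputs, fn) in derived.items():
            if name not in dirty and dirty.intersection(inputs):
                values[name] = fn(*(values[i] for i in inputs))
                dirty.add(name)
                progressed = True
    return values

values["raw_density"] = 0.9   # an underlying value is revised
recalc({"raw_density"})
# mass_flow becomes 90.0, and its dependent mass_total becomes 2160.0
```

The key property is that nothing outside the dirty set is touched: only the affected chain of calculations reruns, which is what makes automatic recalculation tractable across thousands of KPIs.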

Visualize

Dashboards That Build Trust Are

Key To Confident Decision Making

Choose to visualize your data using Flow's included web-based dashboards or connect your existing dashboard application to Flow's data feed or API. The choice is yours.

Either way, your decision makers will have information readily available that they can trust when it matters most.

Comments

Add a comment to any value and it stays annotated to the original data point.

No User Limits

There is never a licensing restriction on the number of users or dashboards.

Interactive

Investigate any data point, even the expressions and data behind the values.

Bridge

Feed Data Lakes, Apps And

Other Systems Automatically

What really sets Flow apart is the ability, and willingness, to freely publish every piece of information we create out to other systems and data warehouses. Nothing you do in Flow is held captive, and everything is designed to be securely shared.

On Demand

A single endpoint to query normalized data from many systems.

On Schedule

Push datasets on a calendar interval, such as hourly, daily, or weekly.

On Trigger

Use specific events to trigger the publication of information or notify your team.
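The "On Trigger" pattern above is essentially publish/subscribe: a specific event fires, and every registered consumer is notified. Here is a minimal, self-contained sketch of that idea; the event name, payload shape, and callback style are hypothetical, not Flow's API.

```python
# Hypothetical event registry: event name -> list of callbacks.
subscribers = {}

def on_event(name, callback):
    """Register a callback for a named event."""
    subscribers.setdefault(name, []).append(callback)

def fire(name, payload):
    """Deliver the payload to everyone registered for this event."""
    for callback in subscribers.get(name, []):
        callback(payload)

notifications = []
on_event("batch_complete", lambda p: notifications.append(f"Batch {p['id']} done"))
fire("batch_complete", {"id": "B-1042"})
# notifications now holds ["Batch B-1042 done"]
```

In practice the callback would push a dataset to a warehouse or post a message to a team channel, but the trigger-then-publish flow is the same.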

Tiered Flow Servers

REST API

AWS

Snowflake

Azure Event Hub

MySQL

PostgreSQL

MSSQL

MQTT

MQTT Sparkplug

Kafka

Canary Historian

Email

Microsoft Teams

Slack

Plus many more

Need a Historian? Use Ours at No Additional Cost.

Learn More

"Flow has become the standard tool within ABInBev Africa, with all our breweries using Flow for real time information and reporting."
Rowan Ray, Tech Supply Specialist, ABInBev

Data Into Information

With your data sources unified, Flow is ready to turn raw data into contextualized information, putting it in the hands of the people who can drive change in your organization.

Create a Model

Build and manage an information model of the key metrics and events that matter to your organization. Govern both the data you care about and the engineering and expressions used to turn that data into actionable information.

Calculate and Transform

Deploy your model and watch as Flow's data engines go to work, continuously processing thousands of KPIs and events. All the results are stored in the Flow database and will automatically recalculate if any of the underlying data changes.

Distribute the Results

Stream this new information in real time: land it in your data lake (now rich with context!), put it in front of your people, or send it to other applications and databases. Flow integrates with your other platforms, ensuring your data is available.

Define How Data Is Processed Into Information

These three steps are necessities for turning manufacturing data into shared information. Each is included with every Flow license and is the secret to building analytics architectures that actually scale.

Sound familiar? They should. Flow is the first commercially available Unified Analytics Framework and follows the best practices outlined in a UAF.

Learn more about the UAF here.

Step One: Create an Information Model

Flow is ideal for building an information model of metrics to improve production efficiency, increase quality, drive maintenance decisions, measure utility or material consumption, understand downtime, and monitor adherence to production plans. Templatized Flow models are centrally managed, honoring enterprise-established business rules and providing governance, while remaining flexible. Operations deploy template instances and add their own site-specific context. Since Flow information models are not hard coded to specific data sources, each deployed instance is adaptable to the site’s environment, regardless of the system architecture.

Step Two: Deploy Intelligent Execution Engines

Data Engine
Flow’s data engine excels at KPI calculations. It connects operational databases and servers to join data points from different systems, cleanse data, and slice it into context-rich KPIs. The engine handles the rerunning of calculations, versioning results, and allows for KPI interrogation, letting users drill down to examine raw data within the original source. Trust is essential, and with Flow, you can be confident that the information driving your operations is solid.
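The "join and slice" step described above can be illustrated with a tiny example: time-series samples from a historian are sliced by batch windows from a transactional database, producing a context-rich per-batch KPI. All data, names, and the HH:MM timestamp convention here are hypothetical, purely to show the shape of the operation.

```python
# Hypothetical time-series samples: (timestamp, units produced).
samples = [
    ("08:10", 50), ("08:40", 45), ("09:05", 60),
]
# Hypothetical batch records: (batch id, start, end).
batches = [
    ("B-1", "08:00", "09:00"),
    ("B-2", "09:00", "10:00"),
]

def units_per_batch(samples, batches):
    """Slice the time-series by batch window and total each slice.
    HH:MM strings compare correctly in lexicographic order."""
    return {
        bid: sum(v for ts, v in samples if start <= ts < end)
        for bid, start, end in batches
    }

totals = units_per_batch(samples, batches)
# totals == {"B-1": 95, "B-2": 60}
```

The result is a KPI that neither source system could produce alone: the historian has the counts, the transactional database has the batch context, and the join puts them together.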

Message Engine
The message engine seamlessly integrates data into the organization's existing notification and communication tools, such as email, SMS, Microsoft Teams, and Slack. By delivering real-time notifications and updates through these familiar platforms, it ensures that stakeholders receive critical information in a timely and efficient manner, enhancing communication and responsiveness across the organization.

Integration Engine
The integration engine automates data streaming to various databases, data lakes, and BI tools, either on a triggered or scheduled basis. It matches the schema of target systems, facilitating seamless data integration, ensuring that all enterprise systems are synchronized with the latest information, and providing a unified, accurate data flow across the organization.
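Matching the schema of a target system, as the integration engine does before pushing data, amounts to renaming source fields to target columns and dropping anything the target doesn't want. A minimal sketch, with entirely hypothetical field names and mapping:

```python
# Hypothetical mapping: source field -> target column.
mapping = {
    "ts": "event_time",
    "line": "production_line",
    "oee": "oee_percent",
}

def to_target_schema(record, mapping):
    """Rename source fields to target columns, dropping unmapped fields."""
    return {
        target: record[source]
        for source, target in mapping.items()
        if source in record
    }

row = to_target_schema(
    {"ts": "2024-01-01T08:00", "line": "L1", "oee": 87.5, "internal_id": 9},
    mapping,
)
# row == {"event_time": "2024-01-01T08:00",
#         "production_line": "L1", "oee_percent": 87.5}
```

Because the mapping is declared once per target, the same record can be fanned out to a data lake, a BI tool, and a SQL table, each in its own schema, without hand-written glue code per destination.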

Step Three: Provide Universal Information Access

With new calculations and KPIs created, Flow becomes your hub to share contextually rich information with other applications and people at your plants and within the entire enterprise. A corporate Flow instance connects your site deployments, and all underlying data sources, to your data teams and advanced applications. Flow unifies a myriad of operational database formats that can be queried without requiring knowledge of their structure, and returns results in a single standard schema. As more and more subject matter experts use Flow, their expertise further enriches the information before additional analysis or data warehousing is completed.

Learn Exactly How Flow Works

A UNS (Unified Namespace) Or A UAF (Unified Analytics Framework)?

Flow believes in a Unified Analytics Framework, but you might have read about a Manufacturing Data Hub as well. What is it? It's an architecture designed to take your Unified Namespace (UNS) and expand the collection and sharing of real-time data to include calculated KPIs and access to historical databases.

Imagine what you could do if you had a scalable platform built specifically to transform OT and IoT data streams into analytics-ready information. A way to connect all of your data producers and consumers, already plugged into your UNS, to all of the raw historical data living in other databases. With Flow in your architecture, this is possible today.

What is the UAF?

Information Management Is Hard.

Which Ways Have You Tried?

The need to manage your information is not new. In fact, you've probably been trying to do it a number of ways for a long time, possibly without knowing you were creating a process or strategy.

Duplicate Everything In The Cloud

Ever heard this one? Is the data in the lake structured and modeled? Is it accessible to all your people? Are your analysts spending half their time searching for and massaging the data? What about bandwidth costs? Is the lake actually being used, or has it turned into an unusable swamp?

Build It On The Historian

This one happens a lot! You get your historian vendor to build a new chart type, one you've always wanted, great! Now you want to use this shiny new chart to show some data from a different database. Can you get that data into your historian? Does it make sense to duplicate your transactional data into a time-series format?

Custom Build Our Own Solution

This is a common one. But, do you know how long it will take? Six months, a year, maybe two? Will you need a dedicated development team? Will they keep up with new requests? What if they resign?

It's All In Excel

How about this one? How often is the spreadsheet incomplete in your morning meeting? How often do the spreadsheet files become corrupted? How do you know a key value being reported hasn't been changed without you knowing? In the end, how many versions of the "truth" do you have?

Find A Better Way

We have a clear Five Step Plan that is designed to help you develop and launch your Information Management strategy using Flow. Learn about it in our free, no email address required white paper.
DOWNLOAD