
Deploying the Time-in-State Metric

Abstract

The Time-in-State Metric (TISM) provides a structured approach to evaluating process stability and performance by measuring the duration a system remains in a particular operational state. By implementing TISM with Flow Software, organizations gain structured, high-confidence insights into process efficiency, variability, and improvement opportunities. This whitepaper explores how Flow enables manufacturers to effectively deploy TISM through its aggregation, event processing, and visualization capabilities.

Introduction

In continuous manufacturing and process industries, making real-time adjustments presents challenges due to complex interactions and response delays. The Time-in-State Metric offers a framework for analyzing operational efficiency by tracking the duration processes remain in defined states. Flow facilitates this process by aggregating operational data into meaningful time slices, calculating and storing the percentage of time spent in each state, and visualizing trends that empower operators and managers to make informed decisions.

The MESA White Paper 50 on Time-in-State Metrics (TISM) describes a real-time methodology to enhance process performance by tracking how long a process remains in an optimal state versus suboptimal states. The goal is to improve decision-making, visualization, and control over continuous manufacturing processes.
Download the MESA White Paper
Download the Flow Software Datasheet

How Flow Enables Time-in-State Metric Implementation

Data Collection and Contextualization

Flow connects to a variety of operational data sources, including SCADA, historians, ERP, MES, IoT devices, and edge sensors. Once the data is collected, it undergoes contextualization, allowing operational states to be categorized based on predefined conditions. This ensures that manufacturers derive meaningful insights rather than analyze isolated data points.
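As a minimal sketch of this contextualization step, the function below maps raw process readings to named operational states using predefined conditions. The state names, variables, and thresholds are illustrative assumptions, not Flow's actual configuration model.

```python
# Hypothetical classification of raw historian samples into operational
# states. Thresholds and state names are invented for illustration.

def classify_state(temperature_c: float, pressure_kpa: float) -> str:
    """Map one sample of process variables to a named operational state."""
    if 180.0 <= temperature_c <= 200.0 and 95.0 <= pressure_kpa <= 105.0:
        return "Stable"
    if 160.0 <= temperature_c < 180.0 or 200.0 < temperature_c <= 220.0:
        return "Transitioning"
    return "Out-of-Optimum"

samples = [(190.0, 100.0), (175.0, 98.0), (230.0, 110.0)]
states = [classify_state(t, p) for t, p in samples]
# states == ["Stable", "Transitioning", "Out-of-Optimum"]
```

In a real deployment these conditions would come from the platform's configuration rather than hard-coded literals, but the principle is the same: each raw sample is tagged with a state before any aggregation happens.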

Aggregation and Calculation

Flow operates on an aggregated data model rather than a continuously refreshing real-time display. Data is processed in structured time slices that begin at one-minute intervals and extend to hourly, shift-based, and daily aggregations. This approach supports long-term analysis that reveals patterns and opportunities for process optimization. By calculating the percentage of time spent in each state, Flow provides a consistent and reliable method for tracking performance.
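The core calculation described above can be sketched as follows: given the state samples that fall within one time slice, compute the percentage of that slice spent in each state. This is an assumed simplification (evenly spaced samples, one slice at a time), not Flow's internal implementation.

```python
from collections import Counter

def time_in_state_percent(slice_samples: list[str]) -> dict[str, float]:
    """Given evenly spaced state samples within one time slice,
    return the percentage of time spent in each state."""
    counts = Counter(slice_samples)
    total = len(slice_samples)
    return {state: 100.0 * n / total for state, n in counts.items()}

# One hourly slice built from 60 one-minute samples:
# 45 Stable, 10 Transitioning, 5 Out-of-Optimum.
hour = ["Stable"] * 45 + ["Transitioning"] * 10 + ["Out-of-Optimum"] * 5
pct = time_in_state_percent(hour)
# pct["Stable"] == 75.0, with the remainder split between the other states
```

Rolling these slice-level percentages up to shifts and days is then a matter of weighting each slice by its duration, which is why a consistent slicing scheme matters.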

Event Processing and Action Insights

Flow’s event engine operates by detecting defined triggers, such as a machine state change or a shift beyond a desired threshold in value or time. While not primarily used for real-time decision-making, the event engine is instrumental in post-analysis, allowing organizations to assess historical process behavior. A key strength of Flow lies in its ability to facilitate event-to-event comparisons, providing insights into process variability and efficiency. By grouping events, users can analyze average durations, variances, and the time between occurrences, enabling a deeper understanding of operational patterns and performance trends.
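To make the event-to-event comparison concrete, here is a minimal sketch over hypothetical event records: each event is a (start, end) pair, and we derive the average duration, the variance of durations, and the mean time between occurrences. The timestamps and the flat-tuple representation are assumptions for illustration.

```python
from statistics import mean, pvariance

# Hypothetical 'Transitioning' events as (start, end) times in minutes.
events = [(0, 12), (45, 53), (120, 135), (200, 210)]

# Per-event durations and the gaps between consecutive events.
durations = [end - start for start, end in events]
gaps = [events[i + 1][0] - events[i][1] for i in range(len(events) - 1)]

avg_duration = mean(durations)       # average event length: 11.25 min
duration_var = pvariance(durations)  # spread of event lengths
avg_gap = mean(gaps)                 # mean time between occurrences: 55.0 min
```

Grouping events by cause or by shift before computing these statistics is what turns a raw event log into the comparative view the text describes.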

Visualization and Reporting

Flow provides structured insights tailored to different roles within an organization. Operators gain visibility into process state durations, assisting them in making well-informed adjustments. Managers rely on historical reports to assess compliance with optimal state conditions, while enterprise-wide analytics ensure that executives can compare TISM metrics across multiple facilities. By integrating with business intelligence tools, Flow enables a holistic approach to data-driven decision-making at every level.

Use Case: Applying TISM in a Manufacturing Plant

A chemical processing plant sought to optimize reactor stability by minimizing unnecessary fluctuations. Through Flow’s implementation of the Time-in-State Metric, data from the reactor’s historian was collected and categorized into operational states such as ‘Stable,’ ‘Transitioning,’ and ‘Out-of-Optimum.’ Flow calculated the percentage of time spent in each state at hourly and shift-based intervals, allowing for deeper analysis of process performance. When ‘Transitioning’ exceeded an acceptable threshold, an event-based trigger notified operators to take corrective action. Over a period of several weeks, TISM reports demonstrated that optimized adjustments led to a 15% increase in stable operation time, resulting in reduced waste and improved efficiency.
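The event-based trigger in this use case can be sketched as a simple check over the hourly aggregates: flag any slice where the 'Transitioning' percentage exceeds an acceptable limit. The 20% threshold and the dictionary layout are assumptions for illustration, not values from the plant's actual configuration.

```python
# Hypothetical hourly time-in-state percentages for the reactor.
hourly_pct = [
    {"Stable": 85.0, "Transitioning": 10.0, "Out-of-Optimum": 5.0},
    {"Stable": 60.0, "Transitioning": 30.0, "Out-of-Optimum": 10.0},
    {"Stable": 78.0, "Transitioning": 18.0, "Out-of-Optimum": 4.0},
]

THRESHOLD_PCT = 20.0  # assumed acceptable limit for 'Transitioning'

# Indices of the hours that should raise an operator notification.
alerts = [i for i, slice_pct in enumerate(hourly_pct)
          if slice_pct["Transitioning"] > THRESHOLD_PCT]
# alerts == [1]: only the second hour breaches the threshold
```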

Conclusion and Next Steps

Implementing TISM with Flow Software provides manufacturers with structured insights into process performance. By focusing on aggregated data rather than real-time fluctuations, Flow enables data-driven decision-making, reduces process variability, and enhances long-term stability.

Organizations interested in leveraging Flow for TISM implementation should begin by identifying the key operational states that require monitoring. Configuring Flow to aggregate and analyze time-in-state metrics allows teams to establish a clear baseline for performance tracking. Once dashboards and reports are deployed, continuous monitoring and iterative process improvements will enhance process stability over time. To explore how Flow Software can support your TISM strategy, contact us today.
