Most industrial data models stall at connectivity. They get devices streaming data to a dashboard and declare victory. The dashboard is a stepping stone. The intelligence layer is the product. And the progression from connected devices to autonomous operations follows a predictable three-phase arc.

I have led data platform initiatives across three companies, each at a different phase of this progression. At a global construction technology company, my team shipped an AI platform with an on-board neural network making 8,000+ autonomous decisions per workflow on a field device with no cloud dependency. At Zonar Systems, an enterprise data and telematics subsidiary of Continental AG ($44B annual revenue), I led the development of a product-led growth platform connecting 50,000+ commercial fleet devices with zero-touch provisioning and a three-tier telemetry architecture. At Rehrig Pacific Company, I designed edge computing systems for a Fortune 500 retailer’s first fully automated “dark” dairy facility, processing RFID data locally with Swisslog warehouse automation integration.

The pattern is consistent regardless of the industry: construction technology, fleet telematics, supply chain logistics. Connectivity is table stakes. Contextualization is where the value inflects. Autonomous action at the edge is where the business model transforms.

What does Phase 1 look like?

Phase 1 is Connect. Get devices streaming data reliably. Sensors, telemetry, GPS, RFID, temperature, vibration, pressure. The technology for this phase is mature. MQTT for lightweight device-to-cloud messaging (the protocol of choice for constrained industrial environments because of its small packet size and publish-subscribe model). Store-and-forward for intermittent connectivity, ensuring no data loss when cellular or WiFi drops in the field. Cellular backhaul for mobile assets. For fixed industrial environments, protocols like OPC-UA bridge machine-level data (PLCs, SCADA) to higher-level analytics platforms.
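
To make the connect-phase pattern concrete, here is a minimal store-and-forward publisher sketch using the paho-mqtt client. The broker address, topic, and payload fields are hypothetical, and a production device would persist the buffer to flash rather than memory:

```python
# Minimal store-and-forward MQTT publisher (illustrative sketch).
import json
import time
from collections import deque

import paho.mqtt.client as mqtt

BROKER = "broker.example.com"  # hypothetical endpoint
TOPIC = "fleet/vehicle-1042/telemetry"

buffer = deque(maxlen=10_000)  # bounded local queue: oldest readings drop first

client = mqtt.Client()  # paho-mqtt 1.x style; 2.x requires a CallbackAPIVersion argument
client.connect_async(BROKER, 1883)
client.loop_start()  # background network thread handles reconnects

def publish_or_buffer(reading: dict) -> None:
    """Send immediately when connected; otherwise queue locally."""
    payload = json.dumps(reading)
    if client.is_connected():
        # Flush anything queued during the outage first, in order.
        while buffer:
            client.publish(TOPIC, buffer.popleft(), qos=1)
        client.publish(TOPIC, payload, qos=1)
    else:
        buffer.append(payload)

while True:
    publish_or_buffer({"ts": time.time(), "asset": "vehicle-1042", "temp_c": 47.3})
    time.sleep(5)
```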

The hard problems in Phase 1 are operational: provisioning devices at scale, managing firmware across heterogeneous fleets, and keeping cost-to-serve low enough that the unit economics work.

At Zonar, I experienced this directly. The legacy fleet telematics model required professional installation for every device: a technician dispatched to the customer site, vehicle downtime during the install, and manual device configuration via a laptop connection. Every new customer carried a fixed operational cost before generating a dollar of subscription revenue. I replaced this with a plug-and-play self-install device supporting zero-touch provisioning. The device powers on, connects to the cellular network, identifies itself, pulls its configuration from the cloud, and begins transmitting. No technician dispatch. No vehicle downtime. No manual configuration. Cost-to-serve dropped 82%.
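
A minimal sketch of that boot sequence, with a hypothetical provisioning endpoint standing in for the real service (this is the pattern, not Zonar's actual implementation):

```python
# Zero-touch provisioning boot sequence (illustrative sketch).
import requests

PROVISIONING_URL = "https://provision.example.com/api/v1/devices"  # hypothetical

def read_device_identity() -> dict:
    # On real hardware this comes from a factory-burned serial number and
    # a per-device certificate, not a literal.
    return {"serial": "ZT-00421337", "hw_rev": "B3"}

def boot() -> dict:
    identity = read_device_identity()
    # The device announces itself; the cloud looks up the customer account
    # the serial was pre-assigned to and returns its configuration.
    resp = requests.post(PROVISIONING_URL, json=identity, timeout=30)
    resp.raise_for_status()
    return resp.json()  # e.g. {"account": ..., "report_interval_s": 30, ...}

config = boot()
# From here the device begins transmitting: no technician involved.
```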

That 82% figure comes from a five-component cost model I built with the finance team: technician labor, vehicle downtime, scheduling overhead, configuration support, and post-install troubleshooting. Each component was benchmarked at the per-unit level before and after the new provisioning model. The device hardware was not the innovation; the operational model around it was.
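
The structure of that model, sketched with placeholder numbers chosen only to reproduce the shape of the result (these are not the actual Zonar benchmarks):

```python
# Five-component per-unit cost-to-serve comparison (hypothetical figures).
OLD = {  # per-unit cost, professional install
    "technician_labor": 120.0,
    "vehicle_downtime": 85.0,
    "scheduling_overhead": 30.0,
    "configuration_support": 25.0,
    "post_install_troubleshooting": 40.0,
}
NEW = {  # per-unit cost, zero-touch self-install
    "technician_labor": 0.0,
    "vehicle_downtime": 0.0,
    "scheduling_overhead": 5.0,
    "configuration_support": 10.0,
    "post_install_troubleshooting": 39.0,
}

old_total, new_total = sum(OLD.values()), sum(NEW.values())
reduction = 1 - new_total / old_total
print(f"cost-to-serve: ${old_total:.0f} -> ${new_total:.0f} ({reduction:.0%} reduction)")
```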

What changes in Phase 2?

Phase 2 is Contextualize. Raw sensor data has no business meaning. A temperature reading of 47.3 degrees means nothing without context: what asset, what location, what historical baseline, what threshold triggers action, what downstream system needs to know. Phase 2 transforms raw telemetry into structured, AI-ready intelligence.

This is where most organizations underinvest. They build the pipeline from device to dashboard and skip the intelligence layer. The result is a visualization product: the field operator still has to look at the screen, interpret the data, and decide what to do. An intelligence product makes the decision for them.

The contextualization layer requires three capabilities:

A data model that maps physical assets to digital representations. At Rehrig, when I designed the edge architecture for Walmart’s dark dairy facility, the asset data model covered three entity types (pallets, large dairy trays, small dairy trays) with individual asset check-ins and parent-child group check-ins that mapped to Swisslog’s “Profile Checks” system. Without this model, RFID reader data was just a stream of tag IDs with timestamps. With it, the system tracked the physical flow of every asset through the automated facility.

A state machine that tracks asset conditions over time. This is where historical baselines live. A single vibration reading is noise. A vibration reading that exceeds the 90th percentile of the last 10,000 readings for that specific asset type, at that specific location, under those specific operating conditions, is an actionable signal (sketched in code just after this list).

A rules engine or ML model that identifies patterns worth acting on. The complexity here scales with the environment. Simple threshold rules work for temperature monitoring. Predictive models work for maintenance scheduling. Neural networks work for pattern recognition in complex, multivariate sensor data where the relationships between inputs are not linear or obvious.
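
Here is a minimal sketch of the first two capabilities together: an asset data model plus a rolling-baseline check. The asset fields, window sizes, and thresholds are illustrative:

```python
# Contextualization sketch: turning a raw reading into an actionable signal.
from collections import defaultdict, deque
from dataclasses import dataclass

@dataclass
class Reading:
    asset_id: str     # maps the reading to a digital asset representation
    asset_type: str
    location: str
    vibration: float

# Rolling per-(asset_type, location) history: the historical baseline.
HISTORY: dict = defaultdict(lambda: deque(maxlen=10_000))

def percentile(values, p: float) -> float:
    ordered = sorted(values)
    idx = min(int(p * len(ordered)), len(ordered) - 1)
    return ordered[idx]

def contextualize(r: Reading) -> bool:
    """True if this reading is a signal worth acting on, not noise."""
    key = (r.asset_type, r.location)
    baseline = HISTORY[key]
    is_signal = (
        len(baseline) >= 1_000                        # enough history to trust
        and r.vibration > percentile(baseline, 0.90)  # exceeds 90th percentile
    )
    baseline.append(r.vibration)
    return is_signal
```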

At the construction technology company, I designed a three-tier telemetry ingestion architecture for the digital platform: real-time data for safety-critical alerts (sub-second latency for underground utility detection), near-real-time data for operational monitoring (bore path tracking, equipment status), and batch data for analytics and reporting (job completion records, compliance documentation). Each tier had different latency requirements, different storage costs, and different downstream consumers. The architecture question that mattered: which data matters when, and to whom. The Azure-based cloud platform my team delivered from scratch served all three tiers across 100+ countries under export controls, with GDPR and CCPA compliance.
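
The routing decision at the heart of that architecture reduces to a small lookup. A sketch, with illustrative message types and latency budgets:

```python
# Three-tier telemetry routing sketch (tier names and message types are
# illustrative; the production system was built on Azure ingestion services).
from enum import Enum

class Tier(Enum):
    REAL_TIME = "real_time"            # safety-critical alerts, sub-second
    NEAR_REAL_TIME = "near_real_time"  # operational monitoring, seconds
    BATCH = "batch"                    # analytics and compliance, minutes+

ROUTES = {
    "utility_strike_alert": Tier.REAL_TIME,
    "bore_path_position": Tier.NEAR_REAL_TIME,
    "equipment_status": Tier.NEAR_REAL_TIME,
    "job_completion_record": Tier.BATCH,
    "compliance_document": Tier.BATCH,
}

def route(message_type: str) -> Tier:
    # Unknown message types default to batch: cheap, never blocks the hot path.
    return ROUTES.get(message_type, Tier.BATCH)

assert route("utility_strike_alert") is Tier.REAL_TIME
```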

What does Phase 3 require?

Phase 3 is Act. Autonomous actions at the edge driven by contextualized data. The system processes sensor data, applies the intelligence model, and makes decisions without waiting for a human to interpret a dashboard. The field operator’s device tells them what to do. Or it does it for them.

At the construction technology company, I drove the product strategy for an edge-first AI platform. The architecture includes an on-board neural network that processes 8,000+ sensor data points per workflow into autonomous frequency-scanning decisions at the edge. No cloud dependency for core operations. The device updates its intelligence model periodically from the cloud, but it acts independently in the field.
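
The decision loop looks roughly like this. A sketch with stand-ins for the sensor read and the on-board model; the refresh cadence is an assumption, not the shipped design:

```python
# Edge-first decision loop (illustrative sketch).
import random
import time

MODEL_REFRESH_S = 24 * 3600  # refresh the model daily, when connectivity exists

def read_sensor_frame() -> list[float]:
    # Stand-in for the device's sensor bus: thousands of points per workflow.
    return [random.random() for _ in range(8_000)]

class LocalModel:
    """Stand-in for the quantized neural network stored on device flash."""
    def predict(self, frame: list[float]) -> float:
        return sum(frame) / len(frame)  # placeholder for real inference

def try_fetch_updated_model(current: LocalModel) -> LocalModel:
    # Best-effort: return a newer model if the cloud is reachable, else current.
    return current

model = LocalModel()
last_refresh = time.monotonic()

while True:
    decision = model.predict(read_sensor_frame())  # runs on the embedded processor
    # ...apply the decision, e.g. retune the scanning frequency...
    if time.monotonic() - last_refresh > MODEL_REFRESH_S:
        model = try_fetch_updated_model(model)     # cloud optional, never required
        last_refresh = time.monotonic()
    time.sleep(0.1)
```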

This architectural choice was non-negotiable. The company’s products operate in underground construction environments. GPS-denied. Intermittent or zero connectivity. Extreme physical conditions. A cloud-dependent AI product would fail in the environments where the AI creates the most value. The neural network had to run on the device’s embedded processor within the power and thermal constraints of a handheld field instrument.

I proposed and led the company’s first customer field trials, testing the AI with real operators in real ground conditions before committing to production. That was a first for a company that had shipped products for over 30 years. The trial data validated the projected 40% support-call deflection: the AI automates the frequency interpretation that previously required a human expert with years of experience, directly addressing the skilled labor shortage in the industry.

Phase 3 is where the business model changes. A connectivity product is sold per-device. An intelligence product is sold per-outcome. At Zonar, I designed a $0 customer activation and hardware-as-a-service pricing model. The customer pays nothing upfront for the device; revenue comes from the subscription. The pricing model follows the value architecture: when the platform delivers intelligence rather than raw data, the product captures a share of that value. The results: a 15% higher close rate, 38% more units per account, and $1M in new client revenue.
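
The unit economics of hardware-as-a-service reduce to a payback calculation. A sketch with hypothetical numbers (not Zonar's actual economics):

```python
# HaaS payback sketch: the vendor absorbs the device cost and recovers it
# from subscription margin. All figures are hypothetical.
device_cost = 150.0          # upfront hardware cost the vendor absorbs
monthly_subscription = 25.0  # what the customer pays instead of buying hardware
monthly_serve_cost = 4.0     # cloud + support cost per device per month

margin_per_month = monthly_subscription - monthly_serve_cost
payback_months = device_cost / margin_per_month
print(f"device cost recovered in {payback_months:.1f} months")  # ~7.1 months
```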

How does this apply to manufacturing?

Manufacturing is the clearest case study for this progression. Phase 1: connect machines and track production data. Machine data ports have existed for decades, and protocols like OPC-UA and MTConnect provide standardized access to CNC, PLC, and sensor data.
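
Reading machine data over one of these protocols is only a few lines. A sketch using the python-opcua package; the endpoint and node ID are hypothetical, since real node IDs come from the machine vendor's OPC-UA address space:

```python
# Reading a machine value over OPC-UA (illustrative sketch).
from opcua import Client

client = Client("opc.tcp://192.168.1.50:4840")  # hypothetical CNC endpoint
client.connect()
try:
    # Node ID below is a placeholder for a vendor-defined address.
    spindle_speed = client.get_node("ns=2;s=Spindle.ActualSpeed")
    print("spindle rpm:", spindle_speed.get_value())
finally:
    client.disconnect()
```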

Phase 2: contextualize that data with drawing revisions, work orders, and quality records so the intelligence layer knows what the machine is supposed to be making and whether it matches spec. This is the gap I have seen firsthand. I owned and operated a precision machine shop (5-axis machining, tolerances of ±0.001 to ±0.005 inches, Mitutoyo CMM inspection). I maintained three layers of drawing controls, and a Fusion 360 CAM toolpath still failed to follow a drawing revision change. The machinist caught it mid-run, but we scrapped parts. The drawing, the work order, the material cert, and the machine program all existed. They just existed in different systems, different formats, and different revision states.

Phase 3: the system catches a revision mismatch before the operator runs the wrong part. The intelligence layer that connects drawing management to machine operation does not exist for most small and mid-size manufacturers. For the approximately 80,000 defense contractors that need CMMC Level 2 third-party certification, this gap is a compliance risk. ITAR controls who can see defense-related technical data. CMMC controls how that data is protected digitally. A drawing revision served to the wrong machine, or accessed by an unauthorized person, is a compliance failure. The product that solves this problem does not add complexity to the shop floor. It removes it. The operator gets the right drawing without thinking about revision control. That is simplicity at the architecture level.
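
The guard itself is simple once the systems share a data model. A sketch with illustrative record shapes and revision fields:

```python
# Revision-mismatch guard: the Phase 3 check that was missing in my shop.
from dataclasses import dataclass

@dataclass
class WorkOrder:
    part_number: str
    drawing_rev: str  # revision the customer ordered

@dataclass
class CamProgram:
    part_number: str
    drawing_rev: str  # revision the toolpath was generated against

def clear_to_run(order: WorkOrder, program: CamProgram) -> bool:
    """Block the machine before a single chip is cut, not mid-run."""
    if order.part_number != program.part_number:
        raise ValueError("program loaded for the wrong part")
    return order.drawing_rev == program.drawing_rev

order = WorkOrder("PN-7741", drawing_rev="C")
program = CamProgram("PN-7741", drawing_rev="B")  # toolpath never regenerated
assert not clear_to_run(order, program)           # caught before the run starts
```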

What should product leaders take from this?

Three principles:

Build the intelligence layer before you need it. If you wait until customers ask for AI, you are two years behind. The contextualization layer requires data model decisions that are expensive to change later. Design for Phase 2 even if you are still shipping Phase 1. At the construction technology company, I seeded the AI architecture decisions 18 months before the first field trial. The data model and telemetry tiers had to be right before the neural network had anything meaningful to process.

Edge-first architecture is non-negotiable for industrial environments. Cloud dependency is a luxury that field operators, factory floors, and remote worksites cannot afford. The intelligence must live where the action happens. At Rehrig, the Walmart dark dairy facility required the edge system to operate independently of cloud connectivity, with Swisslog receiving clean, pre-processed records in near-real-time. The cloud handled analytics and cross-facility pattern recognition. The edge handled decisions. This is the correct architecture for any environment where connectivity is intermittent or unreliable.

Pricing follows the value curve. Connectivity is commodity. Intelligence is differentiated. Autonomous action is transformational. Price accordingly. The per-device subscription model works for Phase 1. Phase 2 and Phase 3 justify consumption-based and outcome-based pricing that scales with the value delivered. I have designed pricing models across all three phases: per-device at Zonar (Phase 1), per-outcome at Rehrig through the pay-per-use model (Phase 2), and subscription-plus-value at the construction technology company through the digital platform architecture (Phase 3).

Connect. Contextualize. Act. That is the progression. The companies that stall at connectivity will be displaced by the ones that reach autonomy.

Technologies and standards referenced

  • MQTT (lightweight IoT messaging protocol)
  • OPC-UA (industrial machine-to-cloud interoperability)
  • MTConnect (manufacturing equipment data standard)
  • RFID (radio-frequency identification)
  • GDPR and CCPA (data privacy regulations)
  • ITAR and CMMC (defense export controls and cybersecurity)
  • Microsoft Azure (cloud platform)
  • Swisslog (warehouse automation systems)
  • Store-and-forward architecture (intermittent connectivity)

About the author

Product executive. 15+ years building industrial AI platforms, B2B SaaS products, and connected smart device ecosystems in regulated industries across 100+ countries. Three portfolio turnarounds. Three org builds. Three times the methodology transferred; only the industries changed.

Nick builds at the hardware-software-data intersection. Industrial AI. Edge-to-cloud platforms. Workflow automation systems making 8,000+ decisions per workflow with zero cloud dependency. The career pattern: enter complex regulated environments, find the kill decisions others avoid, and redirect capital from legacy programs to products that ship and outlast him. The acquiring company kept his product. Threw away their own.

Most recently Head of Product at Digital Control Incorporated. Global product portfolio. Turnaround-to-growth. Previously at Zonar Systems, a subsidiary of Continental AG ($44B annual revenue), leading a $70M connected device platform across three continents, and at Rehrig Pacific Company, building an innovation function from scratch.

He leads global products and global teams as a Chief Product Officer, Head of Product, and VP of Product for B2B and B2B2C companies pursuing digital transformation and product growth.

More about Nick →