What is Latency?
Latency (also known as delay) is the time that elapses between sending a request and receiving a response. It is usually expressed in milliseconds (ms).
In network or system communication, latency answers the question: 🕒 how long does it take for a data packet to travel from point A to point B and be acted upon?
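In practice, latency is measured by timing the gap between request and response. Below is a minimal, self-contained sketch in Python that spins up a local TCP echo server and measures one round trip; against a real endpoint you would replace the local server with the actual host and port.

```python
import socket
import threading
import time

# Local echo server so the example is self-contained;
# in practice you would measure against a real endpoint.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))  # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

def echo_once():
    conn, _ = server.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(data)  # echo the request straight back

threading.Thread(target=echo_once, daemon=True).start()

# Client side: latency = time between sending the request
# and receiving the response.
client = socket.create_connection(("127.0.0.1", port))
start = time.perf_counter()
client.sendall(b"ping")
reply = client.recv(1024)
latency_ms = (time.perf_counter() - start) * 1000
client.close()
server.close()

print(f"round-trip latency: {latency_ms:.3f} ms")
```

Because both ends run on the same machine, the number printed here reflects mostly processing latency; over a real network, distance, switches, and congestion add to it.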
🔍 Types of latency
| Type of latency | Description |
|---|---|
| Network latency | Delay in transmitting data across networks (e.g. Ethernet, Wi-Fi) |
| Processing latency | Time a device or controller needs to process data |
| Device latency | Delay caused by sensors, PLCs, or SCADA systems |
🧠 Why does latency matter?
- In IT: low latency = faster websites, better video call quality, responsive apps
- In OT: low latency is critical for real-time control, e.g. of machines, actuators or alarms
🏭 Latency in industrial environments
Industrial networks frequently require very low latency (sometimes < 10 ms), for example for:
- Driving robots or motors via EtherCAT or PROFINET
- Monitoring critical processes in energy, chemicals or water treatment
- Real-time communication between PLCs, HMIs and remote I/O
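A requirement such as < 10 ms is often treated as a latency budget: the individual delays along the path must sum to less than the deadline. A small sketch, using purely illustrative (not measured) component values:

```python
# Hypothetical latency budget for one control-loop round trip.
# All values below are illustrative assumptions, not measurements.
deadline_ms = 10.0    # real-time requirement from the process

network_ms = 2.0      # switch hops and cabling
processing_ms = 1.5   # PLC scan / application logic
device_ms = 0.5       # sensor sampling delay

total_ms = network_ms + processing_ms + device_ms
headroom_ms = deadline_ms - total_ms

print(f"end-to-end latency: {total_ms:.1f} ms, "
      f"headroom: {headroom_ms:.1f} ms")
```

If the sum exceeds the deadline, one of the components has to be reduced, e.g. by using a faster fieldbus or moving logic closer to the process.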
⏱️ Excessive latency can lead to unsafe situations, production problems or delays in emergency shutdowns.
⚠️ Factors influencing latency
- Distance and network delay (e.g. between the field and the cloud)
- Quality and configuration of switches/routers
- Network congestion (too much simultaneous traffic)
- Type of protocol (e.g. Modbus RTU vs Modbus TCP)
- System load on servers or controllers
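Several of these factors (congestion, system load) make latency vary over time rather than simply grow; this variation is called jitter. The sketch below, using hypothetical sample values, shows why the worst-case sample, not the average, decides whether a real-time deadline holds:

```python
import statistics

# Hypothetical round-trip samples in ms (illustrative, not measured).
samples_ms = [1.2, 1.4, 1.3, 9.8, 1.2, 1.5, 1.3, 1.4]

mean_ms = statistics.mean(samples_ms)
worst_ms = max(samples_ms)
jitter_ms = statistics.pstdev(samples_ms)  # spread of the samples

print(f"mean {mean_ms:.2f} ms, worst {worst_ms:.2f} ms, "
      f"jitter {jitter_ms:.2f} ms")

# A mean of roughly 2.4 ms hides a worst case near 10 ms: for
# real-time control, the worst case is what must fit the deadline.
```

This is why industrial networks are assessed on worst-case or percentile latency rather than on averages alone.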
📌 In summary
Latency is the delay between action and response within a network or system. In industrial (OT) applications, latency must be as low as possible to control processes safely and accurately.
