Why Are You Testing Cellular Signals and Speeds?
Most people run a speed test and look at the download figure first.
Then they either feel good or they feel bad, usually depending on whether the number is higher or lower than their broadband at home.
That is not what signal testing is for.
A download speed figure tells you one thing about one moment on one network. It says almost nothing about whether your router or IoT device will perform reliably for the application you are deploying. It does not tell you about upload capacity, which is often what the application actually depends on. It does not tell you about SINR, which is what predicts consistency. It does not tell you about latency, about RF stability under load, or about how the connection will behave six months from now when the site adds more devices and someone installs a CCTV NVR that nobody mentioned in the specification.
Signal testing exists because none of that appears in a single headline figure.
This guide is about what to actually look for, application by application, including real-world throughput ranges, realistic data use estimates, the upload problem that catches installations out constantly, and the configuration pitfalls that quietly destroy perfectly good deployments.
Why the Download Number Is the Wrong Starting Point
Mobile networks are built around consumer behaviour. Consumers stream. They scroll. They watch. They download apps and updates and video. All of that is download traffic. A consumer on a train wants a fast download. They do not particularly care about upload.
So that is what operators optimise for. LTE network architecture allocates far more spectrum and resource blocks to downlink than uplink. 5G NR does the same. The result is a systematic asymmetry baked into the technology itself, and it matters enormously for anyone deploying routers or IoT devices at the edge.
The applications that actually make up the majority of UK IoT and edge deployments are uplink-led. They are sending data, not receiving it. CCTV streams travel from the site to the server. Telemetry travels from the sensor to the platform. SCADA values travel from the RTU to the control room. Payment transactions travel from the terminal to the processor. Alarm events travel from the panel to the monitoring station.
When you test signal and measure only download, you are measuring the half of the connection that most of these applications use least. You can have 80 Mbit/s download and 8 Mbit/s upload, which sounds like a strong connection, and it is fine for consumer use. But if you have four cameras trying to stream 720p at 2 Mbit/s each, you have just used all 8 Mbit/s of your upload capacity, and you have not tested whether it holds under load, at different times of day, with other devices on the LAN competing for the connection.
This is why upload testing matters. Always test upload separately. Do it under load. Do it at more than one point in the day. The variation will often surprise you.
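The camera arithmetic above can be captured as a quick planning check. A minimal sketch in Python; the 1.5x planning margin is an assumption for illustration, not a standard:

```python
# Hypothetical uplink budget check. Figures mirror the example above:
# four 720p streams at 2 Mbit/s each against 8 Mbit/s of measured upload.
def uplink_headroom(measured_up_mbps, stream_mbps, stream_count, margin=1.5):
    """Return (required_mbps, ok): required includes the planning margin."""
    required = stream_mbps * stream_count * margin
    return required, measured_up_mbps >= required

required, ok = uplink_headroom(measured_up_mbps=8.0, stream_mbps=2.0, stream_count=4)
# 4 streams x 2 Mbit/s x 1.5 margin = 12 Mbit/s required, so 8 Mbit/s fails
```

The margin matters because the measured figure is a single moment; the cell will not always give you that much uplink back.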
SINR, RSRP and RSRQ: What You Are Actually Testing
Speed tests give you throughput at a moment in time. Signal metrics give you the underlying conditions that determine whether that throughput is reliable or fragile. For anyone making decisions about an installation, the signal quality figures are more valuable than the headline speed.
There are three you need to understand.
RSRP (Reference Signal Received Power)
This is signal strength. It is measured in dBm and will always be a negative number. The closer to zero, the stronger the signal. RSRP above -80 dBm is generally strong. Between -80 and -100 is acceptable. Below -100 starts to become marginal. Below -110 is problematic, and below -120 you are operating at the edge of what the network will maintain a session for.
RSRP alone is not enough to judge an installation. A strong signal in an area with heavy cell congestion or significant interference can still produce poor performance. That is where SINR comes in.
RSRQ (Reference Signal Received Quality)
RSRQ is a combined measure that factors in both signal level and the noise and interference floor. It is also measured in negative dB. Above -10 dB is good. Between -10 and -15 is fair. Below -15 starts to cause problems. RSRQ gives you a view of the channel quality rather than just the signal power.
SINR (Signal to Interference and Noise Ratio)
SINR is arguably the most useful single figure for predicting real-world performance. It measures how much stronger your signal is compared to everything else the device can hear, expressed in dB. Higher is better. Above 20 dB is excellent. Between 13 and 20 is good. Between 0 and 13 is usable. Below 0 means interference and noise are winning, and you will see poor throughput, high retransmission, and unreliable sessions.
Poor SINR is often the hidden reason an installation misbehaves. The device has signal. The speed test is not terrible. But the connection drops, sessions time out, VPN tunnels flap, and nobody can pin down why. It is usually SINR. It is usually an antenna, a cable, a poorly chosen installation point, or cell contention at busy times.
Good signal testing looks at all three figures together, not just RSRP in isolation. You can run a full signal test here to see what your installation is actually working with.
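The threshold bands quoted above can be turned into a quick triage helper for survey readings. A sketch using this guide's rules of thumb; the band edges are guidance figures, not a formal standard:

```python
# Classify raw survey readings using the bands described above.
def rate_rsrp(dbm):
    if dbm > -80: return "strong"
    if dbm >= -100: return "acceptable"
    if dbm >= -110: return "marginal"
    return "problematic"

def rate_rsrq(db):
    if db > -10: return "good"
    if db >= -15: return "fair"
    return "poor"

def rate_sinr(db):
    if db > 20: return "excellent"
    if db >= 13: return "good"
    if db >= 0: return "usable"
    return "poor"

# A reading of RSRP -105 / RSRQ -12 / SINR 5 rates as
# marginal / fair / usable: workable, but with little headroom.
```

Judging all three together, as the paragraph above argues, means flagging an installation when any one of them falls into the bottom band.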
The 10 Most Common UK IoT and Edge Applications: What They Actually Need
The table below is a planning reference. These ranges reflect real UK deployments, not lab conditions. Actual usage varies with polling interval, codec, VPN overhead, firmware behaviour, user habits, and poor RF conditions that force retransmission. Always add margin.
| Application | Technology fit | Real speed need | Typical monthly data | What actually drives usage |
|---|---|---|---|---|
| PLC / SCADA / RTU telemetry | NB-IoT, LTE-M, Cat 1, 4G router | 0.05 to 1 Mbit/s | 50 MB to 2 GB | Polling rate, VPN overhead, remote access |
| Smart metering / utility sensors | NB-IoT, LTE-M, Cat 1 bis | 0.01 to 0.2 Mbit/s | 5 MB to 500 MB | Message frequency, firmware activity |
| CCTV health and alarm signalling only | Cat 1, 4G router | 0.1 to 1 Mbit/s | 100 MB to 2 GB | Keepalives, snapshots, remote access |
| Live CCTV streaming | 4G Cat 4 upward, 5G | 2 to 20+ Mbit/s uplink | 50 GB to 500+ GB | Codec, resolution, frame rate, camera count |
| EV charger backhaul | Cat 1, Cat 4 | 0.1 to 2 Mbit/s | 200 MB to 5 GB | OCPP, firmware, payment integration |
| Card payment / kiosk / POS | Cat 1, Cat 1 bis, 4G router | 0.1 to 1 Mbit/s | 50 MB to 1 GB | Transaction reliability, VPN, security overhead |
| Vending / unattended retail | Cat 1 bis, Cat 1, 4G router | 0.05 to 1 Mbit/s | 50 MB to 2 GB | Telemetry cadence, card transactions, images |
| Digital signage and displays | 4G router, 5G for rich media | 1 to 10 Mbit/s | 2 GB to 100+ GB | Local cache vs streamed, update frequency |
| Building management / HVAC / plant | Cat 1, 4G router | 0.1 to 2 Mbit/s | 200 MB to 5 GB | Polling rate, remote engineering sessions |
| Temporary office / welfare / site internet | 4G router, 5G router | 10 to 200+ Mbit/s | 50 GB to 1000+ GB | Users, video calls, updates, guest Wi-Fi |
Application Detail: What the Numbers Actually Mean
1. PLC, RTU and SCADA Telemetry
The raw data payload for industrial telemetry is often tiny. An RTU polling a handful of Modbus registers every 60 seconds and pushing values to a SCADA platform is sending packets, not files. In pure telemetry terms, you are looking at kilobytes per session, not megabytes.
The problem is everything else that rides the same connection. A VPN tunnel adds overhead. Engineering access sessions add gigabytes if they go on long enough, or if the engineer forgets to disconnect. Historian sync can be chatty. Firmware updates arrive when nobody is expecting them. And then there is the moment when an alarm triggers and five people simultaneously try to remote in to see what is happening.
Signal quality matters more than speed here. A marginal SINR means session instability, which means the VPN flaps, which means the SCADA platform raises alarms, which means someone dispatches an engineer to a site that does not actually have a hardware fault. Getting the antenna right from the start is almost always cheaper than that call-out.
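A back-of-envelope estimate for the polling scenario above can be sketched as follows; the 200-byte payload and the 2x overhead factor are illustrative assumptions, not measurements:

```python
# Rough monthly data for one polled telemetry device. The overhead
# factor stands in for TCP/IP headers, VPN encapsulation, and keepalives.
def telemetry_monthly_mb(payload_bytes, interval_s, overhead_factor=2.0):
    polls_per_month = 30 * 24 * 3600 / interval_s
    return payload_bytes * polls_per_month * overhead_factor / 1e6

# ~200-byte Modbus read every 60 s: roughly 17 MB/month with overhead
print(round(telemetry_monthly_mb(200, 60), 1))
```

That figure is why pure telemetry fits comfortably in the 50 MB to 2 GB band in the table: the megabytes come from everything else on the connection, not the polling itself.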
You can check signal metrics for telemetry installations using the signal tester before committing the installation.
2. Smart Metering and Utility Sensors
This is the natural home of NB-IoT and LTE-M in the UK market. Vodafone and O2 both offer IoT-specific LPWA products here, and the use case fits the technology well: small payloads, infrequent transmission, deep coverage requirements, battery life constraints.
The data use is genuinely tiny when the design is right. Exception-based reporting and scheduled reads can land well below 100 MB per device per year. The trap is poor device configuration or a platform that polls more aggressively than the design intended, which can push consumption up rapidly and drain batteries faster than the specification assumed.
Signal testing for LPWA deployments is different to testing for full routers. The sensitivity floor is much lower, and coverage-enhancement modes mean the device may maintain a session in conditions that would look awful on an LTE signal test. That said, consistent reachability matters more here than anywhere. A meter that stops reporting is a billing problem, not just an IT problem.
3. CCTV Health Monitoring
There is a useful distinction between a CCTV system that uses cellular for management, alarm signalling, and occasional remote checks, and a CCTV system that is actually streaming live video over cellular. They are entirely different animals from a data perspective.
The first case – keepalives, motion alerts, occasional snapshot, periodic health check – can live comfortably on a relatively modest plan. A few hundred megabytes to a couple of gigabytes a month is a realistic range if the system is well configured.
The problem is that these systems often drift toward the second case without anyone formally making that decision. A user discovers they can view live footage from their phone. They do it regularly. Then their colleague does. Then someone sets up a cloud sync that nobody knows about. None of this is visible in the original design, but it shows up on the SIM bill.
4. Live CCTV Streaming
This is where deployments get into serious trouble, and it is the application where upload speed, not download, is the critical figure.
Every camera stream travels from the recorder or camera to the server or viewing client. That is upload. A single camera running 1080p H.264 at moderate bitrate might use 2 to 3 Mbit/s. Two cameras: 4 to 6 Mbit/s. Four cameras: 8 to 12 Mbit/s. That is before you have done anything else on the connection. Before VPN. Before remote access. Before a firmware update lands.
At 2 Mbit/s continuous upstream, you are consuming roughly 650 GB per month. At 4 Mbit/s, you are over a terabyte. That is why cellular CCTV installations have to be designed around event-led streaming, local recording with selective pull, or aggressive codec optimisation. The alternative is a very expensive SIM bill and a connection that crawls for everything else because the cameras are saturating the uplink.
Before commissioning a CCTV installation over cellular, always measure upload capacity under realistic load using a proper signal and speed test, not just a single idle-state measurement.
5. EV Charger Backhaul
OCPP over WebSocket is not a heavy protocol in principle. Status updates, heartbeats, meter values, and transaction records are small. But a modern charge point is also talking to payment processors, potentially to energy management systems, and occasionally receiving firmware. A managed charger hub with multiple charge points behind a single router adds all of this up multiplied by the number of units.
The connectivity requirement here is less about raw speed and more about reliability and latency consistency. A failed charge authorisation at a public charger is a customer service problem and, increasingly, a regulatory one. 4G Cat 1 is often genuinely enough for a single charger or small hub. The case for Cat 4 or above grows with charger count and richer telemetry.
6. Card Machines, Kiosks and POS Terminals
Transaction data is small. A card payment is kilobytes, not megabytes. The network requirement looks modest on paper.
The real requirement is not bandwidth. It is reliability, latency predictability, and security. A payment terminal that drops sessions, cannot complete transactions, or runs card data over a poorly secured connection is a liability, not just an annoyance. Cat 1 and Cat 1 bis are well positioned here: they are 4G-era, reliable, and appropriately specified for the traffic pattern. The trap is treating “small data” as synonymous with a “low-risk” installation. It is not.
Signal quality is critical. A terminal sitting at -105 dBm RSRP with a poor SINR is at risk, even if the speed test looks fine in isolation. The sessions that fail are almost always under intermittent coverage, not during a clean signal test.
7. Vending Machines and Unattended Retail
Mixed profile, moderate data, high operational sensitivity. Vending machines typically combine stock telemetry, payment connectivity, temperature and door alerts, and remote management. None of it is individually heavy. Together, with card payments, periodic image uploads, and remote support, you land somewhere between 100 MB and 2 GB a month depending on how the system behaves.
The coverage issue is real here. Vending machines live in shopping centre basements, transport hubs, car parks, and service corridors. These are exactly the environments where RSRP looks acceptable on a map but SINR is poor because of metalwork, concrete, and interference. Testing on-site before installing hardware is the difference between a machine that works reliably and one that generates support calls from day one.
8. Digital Signage and Public Information Displays
This is a tale of two designs. A media player that downloads content packs overnight and plays them locally needs relatively little ongoing bandwidth. Maybe 2 to 10 GB a month depending on content volume and update frequency. A screen that streams content continuously, updates in real time, or serves dynamic data can push well beyond that.
4G is adequate for most cached signage. 5G starts to make sense for multiple-screen sites, real-time content delivery, or locations where 4G is congested. The smart design is to understand which model you are running before you choose the router and the tariff.
9. Building Management Systems, HVAC and Plant
BMS installations over cellular are one of the most common sources of poorly diagnosed connectivity problems, and that is largely because of what ends up on the LAN behind the router.
Raw polling traffic for BACnet, Modbus or KNX is not large. But BMS controllers, especially older ones retrofitted with IP gateways, can be chatty in unexpected ways. They broadcast. They retry. They send SNMP traps. They log aggressively. Some cloud-integrated platforms add their own polling loop on top. A system that looks like it should use 500 MB a month can behave very differently once you watch actual traffic.
The other issue is that engineers remote in. A building controls contractor who wants to check trend data or adjust a setpoint may do so via VPN, sometimes at length, sometimes repeatedly. If the router is shared with other site services, that remote access traffic competes with everything else.
This is one of the applications where understanding your signal metrics is genuinely useful for diagnosing intermittent faults rather than just commissioning the installation.
10. Temporary Offices, Welfare Units and Site Internet
This is the application that genuinely looks like broadband, because it is. People using a site office or welfare unit over a 4G or 5G router have broadband habits, broadband applications, and broadband-level expectations. Teams calls, cloud file access, software updates, video streaming, WhatsApp backups, browser sessions, and the inevitable guest connecting to a Wi-Fi password they found on the wall.
Human usage patterns are unpredictable, spiky, and relentless. A welfare unit with four workers can consume 100 GB in a month with no deliberate media streaming, just normal work activity.
Here, download speed does matter, alongside upload for calls and file uploads, and SINR matters because video calls degrade badly under variable signal quality even when raw throughput looks acceptable. 5G earns its place on sites where contention and demand are real. A well-installed 4G router with a proper external antenna can do a lot of work, but if the site draws consistent heavy users, 5G headroom is worth having.
The Upload Problem Nobody Talks About Enough
It is worth staying on this point because the asymmetry between download and upload creates a specific and repeatable failure mode in IoT and edge deployments.
A standard LTE downlink might provide 50 to 150 Mbit/s under good conditions. The uplink on the same connection might be 10 to 30 Mbit/s. That sounds like plenty. But at a busy cell during peak hours, real uplink throughput can drop considerably. In congested urban environments or at events, upload capacity can collapse while download remains relatively stable, because operators protect the download experience for the majority consumer market.
For a site with cameras streaming, telemetry posting, and VPN tunnels maintaining sessions, that uplink collapse is the failure. The cameras stall. Sessions drop. The SCADA platform raises alerts. Remote access becomes unusable.
Testing upload matters, especially at peak times, and especially for any application where the site sends more than it receives. Always include an upload test as part of a formal site assessment. The signal tester provides both upload and download figures alongside signal quality metrics so you can see the full picture before committing to an installation.
Misconfigured Devices: Where Data Goes to Die
Even when the signal is good and the tariff is reasonable, poorly configured devices on the LAN behind a router can destroy a deployment. This is one of the most common and least discussed problems in cellular edge installations.
CCTV NVRs with cloud sync enabled
A DVR or NVR configured with a cloud recording or cloud backup feature will start uploading footage, often continuously, often without any visible indication that it is doing so. Some systems do this by default. An installer who focuses on getting the cameras working and does not audit the NVR network settings can leave behind a device that is consuming several gigabytes a day over cellular without anyone knowing.
The fix is simple: audit the NVR. Disable cloud sync over cellular. Use local recording as the primary store and cellular only for alarm clips or scheduled pull.
BMS controllers with aggressive polling
BACnet and other building control protocols can broadcast discovery packets. A controller that has not been properly scoped to the local subnet can broadcast continuously across the network. A cloud-connected platform that polls every 30 seconds rather than every 5 minutes will use ten times the data. These are configuration choices, not hardware limitations. But they are rarely checked during installation because the installer is focused on device function, not cellular data consumption.
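The "ten times the data" point is just the ratio of the polling intervals. A quick sketch, with a hypothetical 500-byte poll response:

```python
# Monthly data for a cloud platform polling a BMS controller.
# The 500-byte response size is an illustrative assumption.
def monthly_poll_mb(payload_bytes, interval_s):
    return payload_bytes * (30 * 24 * 3600 / interval_s) / 1e6

fast = monthly_poll_mb(500, 30)    # 30-second polling
slow = monthly_poll_mb(500, 300)   # 5-minute polling
print(round(fast), round(slow))    # ~43 MB vs ~4 MB per month
```

The absolute numbers stay small either way; the point is that the multiplier applies to everything riding the polling loop, including retries and broadcasts, which is where the surprise bills come from.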
Automatic update services
Windows Update, firmware auto-update, antivirus signature updates, and software licence checks are all silent background processes that can generate hundreds of megabytes to gigabytes of traffic at unpredictable intervals. A router that connects a small Windows-based kiosk or HMI panel is also connecting that machine to the internet, including all of its background services.
On a business broadband connection, this is invisible. On a cellular tariff with a data cap, it is a bill spike or a throttled connection at the worst possible time.
Routers with no traffic monitoring
A router that does not report data usage by connected device makes all of the above invisible until the bill arrives or the data cap triggers. Good router configuration includes traffic monitoring, per-device visibility, and ideally data limit alerts. For cellular deployments, this is not optional. It is basic operational hygiene.
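One way to implement that visibility, assuming per-device usage figures have already been pulled from the router's monitoring interface; the device names, the cap, and the 80% warning threshold are all illustrative:

```python
# Flag devices approaching the SIM's data cap, and the total if it is over.
def usage_alerts(usage_mb_by_device, cap_mb, warn_ratio=0.8):
    alerts = []
    for device, mb in usage_mb_by_device.items():
        if mb > cap_mb * warn_ratio:
            alerts.append(f"{device}: {mb} MB against a {cap_mb} MB cap")
    total = sum(usage_mb_by_device.values())
    if total > cap_mb:
        alerts.append(f"TOTAL over cap: {total} MB")
    return alerts

# An NVR quietly consuming most of the allowance shows up immediately:
print(usage_alerts({"nvr": 900, "plc": 20, "hmi": 45}, cap_mb=1000))
```

However the figures are collected, the design point is the same: the alert fires before the cap does, not after.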
Teltonika routers, commonly used in UK industrial IoT deployments, offer RMS (Remote Management System), which includes data usage monitoring. If you are deploying cellular connectivity without per-device visibility, you are guessing.
Security: The Part Most Installations Get Wrong
Cellular connectivity at the edge creates a network perimeter that most enterprise IT teams have not designed for. A 4G router in a kiosk, a remote telemetry unit in a plant room, or a charger hub in a car park are all network endpoints connected to the public internet over a cellular link, possibly with multiple devices behind them, often unmanned, sometimes in physically accessible locations.
Default credentials
The most common vulnerability is also the most avoidable. Default router credentials left unchanged mean anyone who can reach the router’s management interface can change its configuration, capture traffic, or use it as an entry point into connected devices. Change default passwords on every device. Every time. Without exception.
Open management interfaces
A router with its web management interface exposed on a public IP address is a liability. Management access should be locked down: either via a VPN, via operator-level private APN, or by restricting management access to specific IP ranges. If the cellular link is on a public IP, the management interface should not be reachable from that public address.
LAN devices with no firewall separation
The router’s firewall is the boundary. Devices on the LAN behind the router are often trusted implicitly, which can mean that a compromised LAN device has full access to everything else on that network segment. In a BMS installation, that can mean an HVAC controller, a PLC, and a payment terminal sharing the same LAN with no segmentation.
VLAN separation, firewall rules between LAN segments, and the principle of least privilege apply to cellular edge networks exactly as they do to enterprise networks. The fact that it is a small installation does not reduce the risk.
Unencrypted traffic
Older industrial protocols – Modbus TCP, BACnet over IP, some older CCTV management traffic – are not inherently encrypted. On a cellular link, that traffic is exposed unless it is protected at the VPN layer. If the application carries sensitive operational data or controls physical infrastructure, assume that cleartext traffic is unacceptable and design accordingly.
Physical access to the device
A router in an unlocked kiosk or accessible enclosure can be rebooted, factory reset, or physically replaced. Good installation practice includes physical security: locked enclosures, tamper alerts, and remote monitoring that detects device reboots or WAN changes.
Planning a Cellular Installation in 2026: What Has Changed
The UK network backdrop has shifted meaningfully, and installations designed even three or four years ago may be making assumptions that are no longer valid.
3G is gone
EE and Vodafone completed 3G switch-off in 2024. Three completed in late 2025. O2 finished in early 2026. Any device or router with a 3G fallback dependency needs to be assessed. Equipment specified when 3G was the backup network has no backup now unless it supports 4G and above.
2G is still present but time-limited
Operators have committed to retire 2G by 2033 at the latest. Legacy M2M devices on 2G need migration planning. That is not urgent, but it is not indefinite either.
LPWA coverage exists but varies
NB-IoT and LTE-M are real in the UK market, but not uniformly available from all operators in all regions. Before designing a product around a specific LPWA technology, verify coverage at the target location with the specific operator, not just a general coverage map.
5G is real but not universal
5G coverage has grown significantly and continues to grow. It is not yet universal, particularly in rural or semi-rural areas. 5G makes sense where capacity, upload speed, and lower latency are genuinely needed. It does not make sense as a default choice for low-data IoT applications where 4G is adequate and simpler.
Signal conditions change over time
A signal test at commissioning is a starting point, not a guarantee. Network operators change antenna configurations, add cells, change spectrum allocations, and adjust power levels. Nearby development can affect propagation. A site that tested well in 2023 may be in a worse position now, or a better one. For critical applications, periodic re-testing as part of normal operational practice is worth building in.
How to Approach a Signal Test Before Installation
Doing a proper pre-installation signal assessment is not complicated, but it requires more than a phone in your pocket and a speed test app.
The goal is to understand what the connection will look like for the specific device, in the specific location, at the specific position it will occupy, under realistic operating conditions.
Start with a site survey at the intended installation point. Do not test in the car park and assume the meter cabinet will be similar. Concrete walls, metalwork, and even moisture can change conditions significantly.
Use the signal tester to capture RSRP, RSRQ and SINR alongside upload and download speed. Take readings at multiple points in the enclosure or installation area if there is variation.
Test at more than one time of day. Cellular network congestion is real. A test at 9am on a Tuesday in an industrial estate will give different results to a test at 5pm on the same site.
Test with the antenna you are actually going to use, not a test device. An external antenna, a dipole on the router itself, and a colinear on the roof perform differently. The signal test should reflect the final installation configuration as closely as possible.
If you are deploying uplink-heavy traffic, run an upload test under load. Connect a test device. Push traffic. Measure what the connection actually does when it is working, not just when it is idle.
Record the results. A written or logged record of pre-installation signal metrics gives you a baseline for troubleshooting later. When a BMS contractor calls six months down the line to say the connection has got worse, you want to know what “better” looked like.
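A minimal way to keep that record is to append each survey reading to a CSV file. A sketch; the file name and field layout are arbitrary choices, and real surveys may capture more (operator, band, antenna configuration):

```python
# Append one timestamped survey reading to a baseline CSV.
import csv
import datetime

def log_reading(path, site, rsrp, rsrq, sinr, up_mbps, down_mbps):
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([
            datetime.datetime.now().isoformat(timespec="seconds"),
            site, rsrp, rsrq, sinr, up_mbps, down_mbps,
        ])

log_reading("signal_baseline.csv", "meter-cabinet-A", -96, -11, 14, 11.2, 54.0)
```

Even a file this simple answers the question that matters six months later: what did "working" look like at commissioning?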
The Data Estimate Trap
The most reliable thing about data estimates is that they underestimate real usage. Not because the estimator is incompetent, but because estimates are built around the normal state, and deployments live in the normal state only some of the time.
The abnormal state is where data actually goes. Firmware updates land without warning. An engineer opens a VPN session and leaves it running overnight. A camera NVR with cloud sync enabled silently uploads three weeks of footage during a firmware update cycle. A poorly configured BMS controller starts broadcasting every 15 seconds after a settings change. A user finds the site Wi-Fi password and starts streaming on their lunch break.
None of these are failures in the normal sense. They are real operational events. They are the reason that every data estimate should include margin, and the reason that per-device traffic visibility is not optional on a cellular router deployment.
A practical rule: if your calculation says 500 MB, plan for 2 GB. If it says 5 GB, plan for 15 GB. If video is anywhere in the picture, assume your first estimate is wrong by a factor you would rather not discover on the bill.
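That rule of thumb can be written down directly; the factors below are simply the examples above restated (roughly 4x at the low end, 3x higher up), not a calibrated model:

```python
# Inflate a raw data estimate to a planning figure, per the rule above.
def plan_data_gb(estimate_gb):
    factor = 4 if estimate_gb < 1 else 3  # small estimates miss by more
    return estimate_gb * factor

print(plan_data_gb(0.5))  # 500 MB estimate -> plan for 2.0 GB
print(plan_data_gb(5))    # 5 GB estimate -> plan for 15 GB
```

If video is in the picture, no fixed factor is safe; measure, then add margin to the measurement.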
What This Means for Choosing Technology
The right technology choice starts with the application, not the datasheet.
NB-IoT or LTE-M make sense when the device is genuinely low-power, payloads are small, battery life is a constraint, and deep indoor coverage matters more than responsiveness. These are point device technologies, not router platforms.
Cat 1 and Cat 1 bis are underused and often the right answer for IoT applications that are too active for NB-IoT but do not justify a full router. Payment terminals, charger backhaul, vending, and low-density telemetry all fit here.
4G Cat 4 and above router class is appropriate when you have Ethernet devices, VPN requirements, multiple LAN clients, mixed traffic, or any application that resembles connectivity rather than just data transport. This is where most UK industrial edge deployments actually live.
5G earns its place where upload capacity, congestion resistance, or multi-user broadband-like behaviour is genuinely needed. It is not automatically the right choice for low-data IoT, and it is not universally available at every installation site.
Final Guidance
Signal testing is not about getting a big download number and feeling confident. It is about understanding what the connection can reliably deliver for the specific application, in the specific environment, under realistic operating conditions.
Test upload as well as download. Look at SINR, not just RSRP. Test at the actual installation point. Test under load. Test at different times. Record the results. Build in margin.
Then configure the router and devices properly. Audit what is on the LAN. Disable what should not be running over cellular. Monitor traffic. Set alerts. Apply security hygiene from day one.
And then review it again six months later. Because the network changes, the applications change, and the people using the site have a way of finding creative new uses for a router that was specified for something much simpler.
You can test your signal conditions here and see full metrics including SINR, RSRP, RSRQ, upload and download speed in a single test.
All speed and data figures in this guide are indicative planning ranges. Actual performance depends on operator, radio conditions, antenna configuration, router specification, firmware behaviour, VPN overhead, codec settings, LAN device configuration, and application design. Always validate with a site assessment and a realistic traffic model before committing to production deployment.