
How to benchmark a 5G network: The 5 steps and tools that the world’s largest project used

Gavin Hayhurst
Jul. 19 2023

A merger between CSPs can be driven by a variety of reasons – from saving CAPEX by consolidating infrastructure to driving operational efficiency, from extending customer reach and coverage to unlocking new markets. One thing, however, is invariable: market consolidation will attract the attention of the national regulatory authority.

The scale, complexity and cost of building out new 5G networks are bringing fresh impetus to CSP mergers. Take North America as an example. Rolling out and benchmarking nationwide 5G coverage, from the biggest cities down to the most rural of counties, takes time and investment.

Following a merger between two North American mobile operators, the consolidated entity was required by the FCC to prove it met the 5G network build-out and performance commitments to which it had signed up as part of the terms of the regulatory go-ahead. It also had to provide evidence that network performance – the subscriber experience of the network, and even of OTT apps running over it – was consistent across the combined network.

And the scale of the challenge? To satisfy the regulator, the operator’s network engineers would have to drive over 1 million miles and conduct over 5 million tests.  

Turning a network benchmarking overhead into a competitive opportunity 

But what if you could turn this compliance-driven benchmarking from a regulatory cost into a competitive opportunity? 

The customer decided that, since it was already expending so much effort on testing, it made sense to include additional benchmarking tests against the other operators. For minimal additional cost and effort, the operator could have a true picture of how the performance of the new consolidated 5G network compared to the experience being delivered by its competitors.

What is competitor network benchmarking?

Competitor benchmarking enables you to compare your network quality to that of other mobile operators in your home market. This can require extensive drive testing to build a baseline view of how network assets are performing relative to competitors across different areas and at different times.

Measuring subscriber experience across multiple operators, handsets and services requires a drive test solution that is optimized for mobile network benchmarking projects. This means a multi-device configuration, supporting both iOS and Android handsets, along with in-depth service-quality testing covering the popular OTT applications, so you can perform a deep market comparison.

Reducing benchmarking costs means minimizing field effort, and that requires automation. With centralized orchestration and sufficient automation, each benchmarking vehicle can be crewed by a single driver, saving significant costs.

 

Test requirements: 

The benchmark testing required by the regulator was straightforward yet onerous. It required: 

  • Stationary and mobile testing, with reports on test location 
  • Serving cell information 
  • Measured throughput and scanner data 
  • Support for automated testing via the Ookla SpeedTest app (a minimal sketch follows below) 

Crucially, the benchmarking had to cover 99.5% of the population (98% in rural areas) in a nationwide patchwork of 500 x 500 m grid squares. 
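As a rough illustration of what a single automated stationary test could capture, the sketch below uses the open-source speedtest-cli Python library as a stand-in for the Ookla SpeedTest app automation. The location, serving-cell value and record format are hypothetical, not the regulator’s reporting format.

```python
# pip install speedtest-cli
import datetime
import speedtest

def run_stationary_test(lat, lon, serving_cell):
    """Run one throughput test and tag it with location and serving-cell info.
    The record layout is hypothetical, for illustration only."""
    st = speedtest.Speedtest()
    st.get_best_server()
    st.download()
    st.upload()
    r = st.results.dict()
    return {
        "timestamp": datetime.datetime.utcnow().isoformat(),
        "lat": lat,
        "lon": lon,
        "serving_cell": serving_cell,            # e.g. PCI / NR-ARFCN reported by the test device
        "download_mbps": r["download"] / 1e6,    # speedtest-cli reports bits per second
        "upload_mbps": r["upload"] / 1e6,
        "latency_ms": r["ping"],
    }

print(run_stationary_test(35.2271, -80.8431, "PCI 301 / NR-ARFCN 520110"))
```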

However, using this requirement as the impetus, the customer understood that cloud-based fleet management would let it centrally manage the whole testing process, streamlining work orders, real-time KPI reporting and automatic error detection. 

Using TEMS™ Cloud for the centralized fleet management enabled the operator’s experienced RF engineers to remain at HQ and focus on managing nationwide testing projects and analyzing results, not flying and driving around the country collecting test data.  

Network test scripts could now be created centrally and shared between test teams, ensuring consistency across the entire nationwide benchmarking project. 

To enable the drive testing to be conducted by non-technical contractors and staff, the solution shipped to the field needed to be self-contained. It had to be: 

  • Easy to ship to local teams across the country,  
  • Easy to install in rental cars,  
  • Easy and intuitive for the driver to navigate the prescribed route.  

Using TEMS™ Paragon for the multi-device benchmarking enabled the operator to capture benchmarking data from its network and all competitors in a single drive test.  

5 steps to carry out automated network benchmarking 

Cloud-based orchestration, automation and analytics enabled the operator to streamline the project workflow from fleet management and remote monitoring of kit in the field to post-processing, analytics and data storage. 

Step 1: Creation of work orders 

The entire consolidated nationwide network was divided into 500 x 500 m grid squares, as mandated by the regulator, and imported into TEMS Cloud. Stationary test locations were used to create drive routes, with test scripts and work orders centrally defined and uploaded. To ensure consistency and accuracy in the drive tests, definitions of ‘start’, ‘done’ and ‘failure’ were provided. 
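To make the grid requirement concrete, here is a minimal sketch of how a territory could be tiled into roughly 500 x 500 m squares. It uses a simple metres-per-degree approximation and an invented output format; it is not how TEMS Cloud defines or imports grids.

```python
import math

def generate_grid(lat_min, lat_max, lon_min, lon_max, cell_m=500):
    """Tile a bounding box into ~500 x 500 m squares (approximate spherical-earth maths)."""
    lat_step = cell_m / 111_320.0                      # metres per degree of latitude (approx.)
    cells = []
    lat = lat_min
    while lat < lat_max:
        lon_step = cell_m / (111_320.0 * math.cos(math.radians(lat)))
        lon = lon_min
        while lon < lon_max:
            cells.append({"lat": round(lat, 6), "lon": round(lon, 6),
                          "lat_span_deg": lat_step, "lon_span_deg": lon_step})
            lon += lon_step
        lat += lat_step
    return cells

# Example: grid squares covering a small urban bounding box
grid = generate_grid(40.70, 40.80, -74.05, -73.95)
print(f"{len(grid)} grid squares generated")
```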

Step 2: Selection of daily work orders  

Out in the field, the user was able to start the equipment – phones, scanners, laptop, navigation tablet – and see all available work orders, confident that the work orders were synced and that only available routes were listed in the ‘Assign work’ view. The user then selected the work orders they would be working on for the day, with other drivers in turn prevented from selecting them. 
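The exclusive assignment described above could be modelled along the lines of the sketch below. This is a hypothetical data model, not the TEMS Cloud API; claiming a work order simply removes it from every other driver’s ‘Assign work’ view.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class WorkOrder:
    order_id: str
    route: list                       # ordered list of stationary test locations
    assigned_to: Optional[str] = None

class WorkOrderPool:
    """Hypothetical central pool: each work order can be claimed by exactly one driver."""
    def __init__(self, orders):
        self.orders = {o.order_id: o for o in orders}

    def available(self):
        """What an 'Assign work' view would list: only unclaimed work orders."""
        return [o for o in self.orders.values() if o.assigned_to is None]

    def claim(self, order_id, driver_id):
        order = self.orders[order_id]
        if order.assigned_to is not None:
            raise ValueError(f"{order_id} is already claimed by {order.assigned_to}")
        order.assigned_to = driver_id
        return order

pool = WorkOrderPool([WorkOrder("WO-1042", ["site A", "site B"]),
                      WorkOrder("WO-1043", ["site C"])])
pool.claim("WO-1042", driver_id="driver-07")
print([o.order_id for o in pool.available()])   # WO-1042 is no longer offered to other drivers
```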

Step 3: Route generation, navigation and test execution 

Upon selection of a work order, TEMS Paragon generated a route preview, with turn-by-turn directions produced to guide the drive tester to the location of the first stationary test.  

Once on location, the tests were executed. Devices (phones and scanner) were automatically detected by TEMS Paragon, meaning all the user needed to see was a simple, purpose-built view displaying service execution status, completion percentage and any failures. If a failure was detected (for example, a device stopped reporting data), the driver was alerted and automatic recovery was attempted, helping to prevent unnecessary repeat drive tests. A sketch of this flow follows the accept/reject step below. 

When a work order finished or was stopped, a summary of its execution appeared, with the user able to ‘accept’ or ‘reject’ the results.  

  • If the results were ‘accepted’, an upload of the collected data was initiated.  
  • If the results were ‘rejected’, the work order data was discarded and the user was able to repeat the test while still on-site. 
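A simplified sketch of that execution flow, with simulated devices, failure alerts and automatic recovery, might look like the following. The device names, failure probability and test steps are invented for illustration and do not reflect TEMS Paragon internals.

```python
import random
from dataclasses import dataclass, field

@dataclass
class Device:
    name: str
    reporting: bool = True

    def recover(self):
        """Simulated automatic recovery, e.g. restarting the measurement client."""
        self.reporting = True
        return self.reporting

@dataclass
class WorkOrderRun:
    devices: list
    steps_completed: int = 0
    failures: list = field(default_factory=list)

    def execute_step(self, step_name):
        # Simulate a device occasionally going silent mid-test
        for d in self.devices:
            if random.random() < 0.05:
                d.reporting = False
        for d in [d for d in self.devices if not d.reporting]:
            print(f"ALERT: {d.name} stopped reporting during '{step_name}', attempting recovery")
            if not d.recover():
                self.failures.append((step_name, d.name))
        self.steps_completed += 1

    def summary(self):
        return {"steps_completed": self.steps_completed, "failures": self.failures}

run = WorkOrderRun([Device("iPhone-1"), Device("Android-2"), Device("Scanner")])
for step in ["download", "upload", "voice", "OTT video"]:
    run.execute_step(step)
print(run.summary())
# The driver then accepts the run (data is uploaded) or rejects it and repeats while still on-site.
```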

Step 4: Data upload 

When a user accepted and submitted the results of a test, the data was uploaded to the central server, with uploads only initiated when no testing was in progress. TEMS Cloud analyzed the captured data in near real-time and created the relevant reports and dashboards, storing all logfiles in a central repository so they could be leveraged by other engineers for further insights. 
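One way to gate uploads so they never run while a test is in progress is sketched below; deferring uploads until the tester is idle also keeps upload traffic from competing with the measurements themselves. The queue-and-event design is an assumption for illustration, not how TEMS Paragon actually schedules uploads.

```python
import queue
import threading

upload_queue = queue.Queue()
idle = threading.Event()            # set while no work order is executing
idle.set()

def submit_results(logfile_path):
    """Queue an accepted logfile for upload to the central server."""
    upload_queue.put(logfile_path)

def uploader_loop(upload_fn):
    """Upload queued logfiles, but only while no testing is in progress."""
    while True:
        logfile = upload_queue.get()
        idle.wait()                 # block until the tester is idle
        upload_fn(logfile)

def run_work_order(execute_fn):
    idle.clear()                    # pause uploads while tests run
    try:
        execute_fn()
    finally:
        idle.set()                  # resume uploads once testing stops
```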

Step 5: Centralized monitoring and reporting 

Monitoring of work order progress and execution success, and reporting on network performance KPIs, were handled centrally through TEMS Cloud. By monitoring testing progress in real-time, engineers at HQ were able to address any testing issues while field teams were still on-site. Standardized ‘definition of done’ criteria ensured testing was aligned across teams and testers knew when their testing was complete, further reducing the need for time-consuming and costly repeat visits. 
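A ‘definition of done’ can be expressed as a small, centrally defined check. The thresholds and result format below are illustrative only; the actual criteria were set by the project.

```python
def work_order_done(grid_results, required_tests=5, min_success_rate=0.95):
    """Hypothetical 'definition of done' check across the grid squares a work order covers."""
    for square_id, tests in grid_results.items():
        if len(tests) < required_tests:
            return False, f"{square_id}: only {len(tests)} tests logged"
        success = sum(1 for t in tests if t["status"] == "ok") / len(tests)
        if success < min_success_rate:
            return False, f"{square_id}: success rate {success:.0%} below threshold"
    return True, "all grid squares meet the completion criteria"

results = {
    "grid_0421": [{"status": "ok"}] * 6,
    "grid_0422": [{"status": "ok"}] * 4 + [{"status": "failed"}],
}
print(work_order_done(results))   # (False, 'grid_0422: success rate 80% below threshold')
```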

The benefits of network benchmarking with TEMS Cloud and TEMS Paragon 

Working with Infovista, the customer turned a regulator-mandated overhead into a competitive advantage that enabled it to ensure that its post-merger consolidated network delivered a truly compelling subscriber experience. And all while successfully undertaking the largest benchmarking project ever, with over one million drive test miles travelled. 

Using TEMS Paragon managed centrally by TEMS Cloud, the proven benefits of this benchmarking approach for the North American mobile operator included: 

  • Uniform script distribution across drive test teams 
  • Native Python scripts to enable testing of regulator-specified OTT apps (an illustrative example follows this list) 
  • Cloud-based device orchestration from a central office for efficiency 
  • Seamless synchronization between devices and TEMS Cloud 
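As an example of what a simple Python-scripted OTT-style test might look like, the sketch below measures response time and transfer size for a service endpoint. The URL and metrics are placeholders, not the regulator-specified OTT test plan or the operator’s actual scripts.

```python
import time
import urllib.request

def ott_service_test(url, timeout=10):
    """Illustrative OTT-style test: time to response headers, total transfer time and bytes.
    The endpoint and metrics are hypothetical stand-ins for a real OTT app test."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        headers_received = time.monotonic() - start
        status = resp.status
        payload = resp.read()
    total = time.monotonic() - start
    return {
        "url": url,
        "http_status": status,
        "time_to_headers_s": round(headers_received, 3),
        "total_time_s": round(total, 3),
        "bytes_received": len(payload),
    }

print(ott_service_test("https://www.example.com/"))
```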

For more on how network testing is combining with orchestration and analytics to automate next-generation network testing processes such as competitor benchmarking, see our latest use case eBook: Evolving network testing – an eBook of 12 use cases.  
