Creating Custom Test Setups for Signals Research Group to Evaluate Audio and Video Performance

Location: United States

Signals Research Group

Signals Research Group (SRG) offers industry-leading field research and consulting services covering the wireless telecommunications industry. The company provides in-depth analysis of emerging wireless technologies, with a particular focus on performance-based benchmarking, timely commentary on recent industry events, and customized studies and commissioned papers for individual clients, including mobile operators, equipment suppliers, and the financial community.

Evaluating Audio and Video Performance

Signals Research Group needed to evaluate the audio and video performance of four conferencing applications (Cisco Webex, Zoom, Google Meet, and Microsoft Teams) across different operating systems and devices and under different network conditions. With our custom-built audio and video quality testing laboratories and extensive experience in the communications industry, we had the knowledge and tools to help the client obtain relevant audio and video performance metrics within a short period of time.

Introducing TestDevLab—Creating Custom Test Setups

When we first started working with SRG, our main goal was to evaluate the audio and video performance of the four conferencing platforms and gather a set of metrics. For each platform, we needed to measure video quality (PSNR, impairments, and VMAF) as well as audio quality, specifically POLQA. We also had to measure the frame rate (FPS), network bandwidth, A/V latency, and A/V sync.
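To illustrate the simplest of these metrics, here is a minimal, generic sketch of a per-frame PSNR computation in Python with NumPy. This is an illustration rather than our production pipeline: real measurements also require frame alignment between source and capture, VMAF is computed with Netflix's open-source library, and POLQA requires a licensed implementation.

```python
import numpy as np

def psnr(reference: np.ndarray, degraded: np.ndarray, peak: float = 255.0) -> float:
    """Per-frame peak signal-to-noise ratio in dB (higher is better)."""
    mse = np.mean((reference.astype(np.float64) - degraded.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # frames are identical
    return 10.0 * np.log10(peak ** 2 / mse)
```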

This required us to perform testing using three different Internet Service Providers (ISPs), depending on the number of simultaneous participants in a meeting. To do this, our network administrator set up virtual networks from different ISPs for our devices under test, and we used 14 real devices for testing. Additionally, we needed to test three different bandwidth limitations by performing three 3-minute tests for each combination of application, platform, and limitation. For this, we developed a network limitation script; a simplified stand-in is sketched below.
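A script of this kind can be built on Linux traffic control. The sketch below applies and removes an egress bandwidth cap with `tc` and a token bucket filter; the interface name and rates are placeholders, and this is a simplified illustration of the approach, not the actual script's contents.

```python
import subprocess

IFACE = "eth0"  # placeholder interface name; set per test machine

def limit_bandwidth(rate: str) -> None:
    """Cap egress bandwidth with a token bucket filter (requires root)."""
    subprocess.run(
        ["tc", "qdisc", "replace", "dev", IFACE, "root",
         "tbf", "rate", rate, "burst", "32kbit", "latency", "400ms"],
        check=True,
    )

def clear_limit() -> None:
    """Remove the cap, restoring the default queueing discipline."""
    subprocess.run(["tc", "qdisc", "del", "dev", IFACE, "root"], check=True)

# Example: run one 3-minute test pass under a 1 Mbit/s cap
# limit_bandwidth("1mbit"); ...run the test...; clear_limit()
```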

To perform tests effectively and gather reliable results, we needed to follow specific test setups based on the client's requirements. This was one of our greatest challenges. Apart from the usual 1v1 setup (1 source, 1 capture device), we also had to use two additional setups: 1v5 and 6v8. The pressure on us to deliver and meet the client's expectations and standards was therefore high, and proper test planning and custom test setups were essential to succeed. Our manual testing team developed a comprehensive plan for executing the test setups and capturing all 14 physical devices effectively. At the same time, our video processing engineers found the best way to separate and process six different source videos in a call grid with 30 users; a simplified version of that tile extraction is sketched below. Creating these custom test setups was definitely a challenge, but one we were eager to overcome.
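For context, separating participant tiles from a recorded call grid can be done by cropping fixed regions out of the capture. The sketch below drives ffmpeg's crop filter from Python; the 3x2 layout, resolution, and file names are assumptions for illustration, since real call grids vary by application and participant count.

```python
import subprocess

def extract_tile(recording: str, out: str, x: int, y: int, w: int, h: int) -> None:
    """Crop one participant tile out of a full-grid screen capture."""
    subprocess.run(
        ["ffmpeg", "-y", "-i", recording,
         "-filter:v", f"crop={w}:{h}:{x}:{y}", "-an", out],
        check=True,
    )

# Assumed layout: a 1920x1080 capture arranged as a 3x2 grid of 640x540 tiles
for row in range(2):
    for col in range(3):
        extract_tile("grid_capture.mp4", f"tile_r{row}c{col}.mp4",
                     x=col * 640, y=row * 540, w=640, h=540)
```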

In addition, we carried out noise suppression tests with a 1v1 setup using a Windows laptop as the sender. We covered three key noise suppression scenarios: no background noise (default settings), background noise without suppression, and background noise with suppression. To carry out these tests successfully, we had to add background noise to our test audio sample and explore what noise suppression options were available in each application.

Adding the noise to the audio file was another challenge we had to overcome. We added pink noise to the audio sample, but we needed to find the correct loudness: if the noise was too loud, there would be no measurable difference between the scenarios with and without suppression, as both would produce a POLQA score close to the minimum. A sketch of this preparation step follows below.
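The sketch below shows one common way to do this: generate pink noise by spectrally shaping white noise, then mix it into the speech sample at a chosen speech-to-noise ratio. The target level (`snr_db`) is a placeholder; finding the right value was precisely the tuning problem described above.

```python
import numpy as np

def pink_noise(n: int, seed: int = 0) -> np.ndarray:
    """Pink (1/f) noise: shape white noise so power falls off as 1/f."""
    rng = np.random.default_rng(seed)
    spectrum = np.fft.rfft(rng.standard_normal(n))
    freqs = np.fft.rfftfreq(n)
    freqs[0] = freqs[1]         # avoid dividing by zero at DC
    spectrum /= np.sqrt(freqs)  # 1/f power means 1/sqrt(f) amplitude
    noise = np.fft.irfft(spectrum, n)
    return noise / np.max(np.abs(noise))

def mix_at_snr(speech: np.ndarray, noise: np.ndarray, snr_db: float) -> np.ndarray:
    """Scale the noise so the speech-to-noise ratio equals snr_db, then mix."""
    speech_rms = np.sqrt(np.mean(speech ** 2))
    noise_rms = np.sqrt(np.mean(noise ** 2))
    gain = speech_rms / (noise_rms * 10 ** (snr_db / 20))
    return speech + gain * noise[: len(speech)]
```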

To make things even more interesting for us, we received an additional request from the client. They wanted us to test a platform that was new to us: Chromebooks. This introduced a number of unique issues when creating the test setups, including device performance, connectivity, application support, and script support, but again, we succeeded.

After testing was complete, we provided the raw metrics and testing data to the client, along with our insight into the overall comparability of the conferencing applications we tested. We recorded 360 tests, producing a total of 2,304 video files to process. From this data, SRG was able to draw various conclusions, such as which conferencing applications perform better on which devices, which perform better on limited network connections, which are better suited to larger-scale calls, and which could be offered to schools, to name a few.

With the support of TestDevLab, SRG was able to obtain reliable data and metrics that would reinforce its reputation as a trustworthy source of in-depth research and analysis. We consistently communicated our findings to SRG throughout the testing process so that they could further evaluate and compile reports for their end clients.

Unlike other QA providers, we offer the full package when it comes to audio and video testing: audio and video performance metrics, different network conditions and limitations, various device combinations and user counts, and adjustable manual and automated setups. Here is what Michael Thelander, President of Signals Research Group, had to say about working with TestDevLab:

We engaged TestDevLab to conduct a very complex and time-consuming analysis of various web conferencing applications. Their team went above and beyond to ensure we had all the data and results that we needed. Throughout the 3-month engagement, they fulfilled every additional request we had, took time to answer the questions we raised, and provided us with all the information we needed in a timely and professional manner.
Michael Thelander
President at Signals Research Group

Why Partner with TestDevLab?

At TestDevLab, we excel in testing communication solutions to ensure compatibility, load tolerance, network adaptability, and stability. With expertise in VoIP, WebRTC, WebSocket, and more, we cover messaging apps, chatbots, conferencing, streaming, and video calling. Our QA engineers offer flexible, tailored support onsite or remotely to help you deliver reliable communication products.

500+ ISTQB-certified engineers

30+ programming languages and technologies mastered

10+ years in business