
4G Wireless Performance: Will You Support Me?

Every major wireless carrier is racing to deploy 4G-branded mobile data services. Although mobile data users have been trained to lower their performance expectations, the wireless carriers are working hard to change that perception. Can 4G wireless services support real-time applications like voice and video for the mobile office of one? Can they replace more costly fixed-line business services like T1, MPLS and broadband? How would you know?

Being a data geek with access to the unique PathView Cloud network performance management solution, I decided to find out. I ran some tests on Sprint’s WiMAX 4G network from San Francisco’s business district using a preview version of our virtual-machine-friendly monitoring application, which puts the patented continuous performance analysis capabilities of our microAppliance into a lightweight software client compatible with major desktop and mobile device operating systems.

The 4G Performance Test – The Set-up
My trusty HTC EVO 4G served up multi-megabit network speeds using its mobile hotspot option, and I connected to it with a MacBook Pro running OS X. PathView Cloud measured performance between my laptop and one of our hosted appliance targets on the West Coast at 60-second intervals. PathView is non-invasive and uses a variety of protocols for testing, including UDP, ICMP and TCP. Because PathView Cloud doesn’t depend on dumb file transfers like some older tools, the only network parameter NOT tested was my data plan’s download limit. In addition to network speeds, I set out to test the overall performance of the connection, including latency (one-way and round-trip), jitter, packet loss and route stability. I also added a path between my laptop and one of our hosted appliance targets in Vancouver.
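
For readers who like to see the mechanics, here is a minimal sketch of the sampling idea: probe a target on a fixed 60-second schedule and record the round-trip time. It uses a plain TCP connect as the probe and a placeholder hostname; it is not PathView’s patented analysis, just a stand-in for “measure the path on a schedule.”

```python
# Minimal sketch: sample round-trip latency to a target every 60 seconds
# using TCP connect time. Host, port, and counts are placeholders; this is
# not PathView's method, just a simple scheduled probe for illustration.
import socket
import time

TARGET_HOST = "example.com"   # placeholder target, not an AppNeta appliance
TARGET_PORT = 443             # any reachable TCP port works for this probe
INTERVAL_S = 60               # sampling interval used in the test
SAMPLES = 60                  # roughly one hour of data

def tcp_connect_rtt_ms(host, port, timeout=5.0):
    """Return the TCP connect time in milliseconds, or None on failure."""
    start = time.perf_counter()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            pass
    except OSError:
        return None
    return (time.perf_counter() - start) * 1000.0

if __name__ == "__main__":
    for i in range(SAMPLES):
        rtt = tcp_connect_rtt_ms(TARGET_HOST, TARGET_PORT)
        print(f"sample {i + 1:02d}: " + (f"{rtt:.1f} ms" if rtt is not None else "unreachable"))
        time.sleep(INTERVAL_S)
```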

Factors of 4G Wireless Performance

Capacity
How fast is my network? Some call this bandwidth, but PathView Cloud accurately measures the Total Achievable Bandwidth Capacity available to your applications.
Sprint’s 4G WiMAX network is fast in the downtown San Francisco area tested.  Over the hour and 60+ samples taken, the EVO’s hotspot averaged over 4Mbps download and just under 3Mbps upload. Maximum download capacity was 5Mbps and upload was nearly 3.7Mbps.  Sprint’s WiMAX definitely delivers broadband-replacement network capacities in our downtown coverage area.  I’d give it a solid ‘A’ for consistent upload and download capacities exceeding many typical cable modem and DSL service tiers.
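
As an illustration of how those averages and maximums come together, here is a small sketch that summarizes per-sample capacity readings. The sample values are invented for the example; they are not the actual PathView Cloud data from this test.

```python
# Summarize per-sample capacity readings (Mbps) into the average and maximum
# for each direction. The values below are invented for illustration, not the
# measurements from this test.
from statistics import mean

samples = [
    # (download_mbps, upload_mbps), one tuple per 60-second sample
    (4.2, 2.9), (4.6, 3.1), (3.9, 2.7), (5.0, 3.7), (4.1, 2.8),
]

downloads = [d for d, _ in samples]
uploads = [u for _, u in samples]

print(f"download: avg {mean(downloads):.1f} Mbps, max {max(downloads):.1f} Mbps")
print(f"upload:   avg {mean(uploads):.1f} Mbps, max {max(uploads):.1f} Mbps")
```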

Quality
4G is fast. But how healthy is it for common applications? The big three performance indicators are latency, loss and jitter. Latency usually corresponds to the length of the network medium and the geographic distance the signal covers between two hosts. PathView measured 35 ms from my laptop through the EVO hotspot and on to Vancouver. On a fixed-line network, 35 ms is nearly enough to cover San Francisco to New York, so it’s clear the WiMAX access network is adding latency of its own. Still, the latency was well within the thresholds for most common real-time applications like voice over IP and video conferencing.
Latency gets a solid ‘A’ on this test.
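
A quick back-of-envelope calculation shows why 35 ms is in the same ballpark as a cross-country fixed-line path. The distances and fiber speed below are rough figures for illustration only.

```python
# Propagation delay only: distance divided by signal speed in fiber
# (roughly 200,000 km/s, about two-thirds the speed of light).
SPEED_IN_FIBER_KM_PER_MS = 200.0  # ~200,000 km/s expressed per millisecond

def one_way_propagation_ms(route_km):
    """Queuing and equipment delay on real paths add to this floor."""
    return route_km / SPEED_IN_FIBER_KM_PER_MS

# Great-circle San Francisco to New York is roughly 4,100 km; real fiber
# routes are longer, so a ~5,500 km path is a more realistic guess.
print(f"great circle (~4,100 km): ~{one_way_propagation_ms(4100):.0f} ms one way")
print(f"fiber route (~5,500 km):  ~{one_way_propagation_ms(5500):.0f} ms one way")
```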

Packet loss was observed at relatively low levels in both the upload and download directions during our tests. It showed up primarily in the UDP measurements and peaked at 6% on the upload leg to Vancouver. 6% loss can be a big deal for real-time applications such as VoIP and video, especially if the loss affects consecutive packets. Typical browser-based cloud service applications use TCP and are less affected by the packet loss we measured; the symptom is a brief delay while the lost packets are resent to the client browser. PathView’s hop-by-hop diagnostics indicated the loss occurred within the Sprint network. AppView Voice enables further testing with actual application packets and can analyze loss, reordering and packet discards by placing a real RTP load on the network.
Sprint ranks a ‘B-’ on our packet loss test and warrants a second level of testing with PathView and AppView.
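
To make the loss numbers concrete, here is a small sketch of how UDP-style loss can be summarized from packet sequence numbers, including whether the loss was bursty (consecutive packets), which is what hurts voice and video most. It illustrates the idea only and is not PathView’s hop-by-hop diagnostic.

```python
# Summarize loss from probe sequence numbers: overall percentage and the
# longest run of consecutive lost packets (burst loss), which is what
# degrades voice and video most.
def loss_summary(sent, received_seqs):
    received = set(received_seqs)
    lost = sorted(seq for seq in range(sent) if seq not in received)

    longest_burst = burst = 0
    prev = None
    for seq in lost:
        burst = burst + 1 if prev is not None and seq == prev + 1 else 1
        longest_burst = max(longest_burst, burst)
        prev = seq

    return {
        "loss_pct": 100.0 * len(lost) / sent if sent else 0.0,
        "longest_burst": longest_burst,
    }

# Example: 100 probes sent, sequence numbers 40-45 never arrived.
received = [s for s in range(100) if s not in range(40, 46)]
print(loss_summary(100, received))   # {'loss_pct': 6.0, 'longest_burst': 6}
```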

Variation in latency between network packets is known as jitter, and too much of it can make real-time communications annoying, with dropouts in sound and other artifacts. PathView showed jitter well under 20 ms on the upload leg; while slightly higher on the download leg, it never exceeded 25 ms. Jitter values never exceeded accepted thresholds for good voice and video quality.
Sprint gets an ‘A’ for jitter on our test.
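
For the curious, here is a minimal sketch of a jitter estimate in the spirit of RFC 3550: a running average of the absolute difference between consecutive packet transit times. The latency samples are invented for illustration, not the values PathView reported.

```python
# Smoothed interarrival jitter in the spirit of RFC 3550:
# J += (|D| - J) / 16 for each new pair of transit times.
def rfc3550_jitter(transit_ms):
    jitter = 0.0
    for prev, curr in zip(transit_ms, transit_ms[1:]):
        jitter += (abs(curr - prev) - jitter) / 16.0
    return jitter

# Hypothetical one-way transit times in milliseconds.
samples = [35.0, 37.5, 34.0, 40.0, 36.0, 38.5, 35.5]
print(f"estimated jitter: {rfc3550_jitter(samples):.2f} ms")
```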

Route Stability
While capacity, latency, loss and jitter are the key indicators of network performance, knowing the topology of a network and how often it changes can be critical when troubleshooting service issues. Have you ever had a short-term connectivity outage somewhere on a WAN circuit and had to wonder where and when? PathView automatically analyzes the route taken by a monitored path and stores it for comparison. We saw a high number of route changes in the Sprint WAN serving our WiMAX area. Having multiple routes between points A and B across a carrier’s service is one of the things that makes the Internet resilient and scalable, but many route changes in a short time can indicate configuration issues or other faults. I’m a bit of a data geek and like to know how the network is changing over time, and PathView Cloud lets you configure alerts when route changes exceed a certain number in a defined period.
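
Here is a rough sketch of the route-change idea: compare consecutive hop lists the way a traceroute-style probe would record them, and raise an alert when the number of changes in a window crosses a threshold. The hop addresses and threshold are made up for the example; this is not how PathView Cloud implements its alerting.

```python
# Count how often the hop list changed within a recent window of samples and
# alert past a threshold. Hop addresses (from the documentation ranges) and
# the threshold are made up for illustration.
ALERT_THRESHOLD = 3   # route changes ...
WINDOW = 10           # ... within this many samples

def count_recent_changes(routes, window):
    """Number of times the hop list differs from the previous sample."""
    recent = routes[-window:]
    return sum(1 for a, b in zip(recent, recent[1:]) if a != b)

# One hypothetical hop list per sampling interval.
routes = [
    ["192.0.2.1", "198.51.100.7", "203.0.113.9"],
    ["192.0.2.1", "198.51.100.7", "203.0.113.9"],
    ["192.0.2.1", "198.51.100.8", "203.0.113.9"],   # intermediate hop changed
    ["192.0.2.1", "198.51.100.7", "203.0.113.9"],
    ["192.0.2.1", "198.51.100.9", "203.0.113.9"],
]

changes = count_recent_changes(routes, WINDOW)
message = f"{changes} route changes in the last {min(WINDOW, len(routes))} samples"
print("ALERT: " + message if changes >= ALERT_THRESHOLD else message)
```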

Summary
Sprint’s WiMAX service performed consistently well during testing and would make a suitable broadband replacement for most business applications where similar coverage is available. Downtown San Francisco performance closely matches what I’ve seen in many Boston-area tests I’ve run with PathView. Network performance often changes over time, especially on paths that include wireless segments, so it’s always best to monitor your path to the cloud from wherever you are.
Thanks to Sprint for the solid data performance. I’ll be watching with PathView Cloud.

Team AppNeta