EE Times-Asia > Networks

Timing synchronization for 3G wireless

Posted: 16 Dec 2004

Keywords: 3G, base station, UMTS, CDMA, GPS

Wireless 3G technologies like UMTS and cdma2000 require strict synchronization of data streams to ensure reliable signal handoff between base stations. In the past, synchronization in wireless networks was implemented only when something was not working right or when crossing operator network boundaries. With the high-bandwidth requirements and real-time nature of many wireless services, however, dropped connections are becoming increasingly intolerable, particularly as systems move to 3G. The solution comes in the form of effective management and distribution of a reliable reference clock throughout the entire network. GPS is the foundation of this clock distribution.

Wireless networks comprise several key subsystems. Ideally, synchronization would be derived from the PSTN and distributed throughout the rest of the network using T1/E1 links. All nodes could be locked, and any jitter and wander added in transit could be filtered at each transit point. Unfortunately, such a scheme is unrealistic for many reasons: the quality of the PSTN cannot be guaranteed, excessive clock noise is introduced in transit, encapsulation of network traffic adds excessive wander and connections between subsystems may be managed by multiple vendors, among others.

In practice, wireless networks generally operate using a reference clock backed by a holdover clock. All other network clocks must be traceable to the reference clock. The challenge is in distributing a reliable reference clock throughout the entire network. GPS provides an independent reference for any application, including wireless.

What time is it?

Knowing the exact time can be tricky. It takes time to propagate a reference clock throughout a system, and each transit stage adds some level of error. After you set a subsystem clock it will start to drift, so you need to recalibrate it to the reference clock on a regular basis. Even though two subsystems may be linked to the same reference clock, they will begin to deviate at different rates from the reference clock. Given a large enough deviation, connections will start to drop.
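The relationship between a clock's frequency offset and its accumulated time error can be sketched in a few lines. The function names and the example numbers below (a 1-part-in-10^9 offset, a 10 μs error budget) are illustrative assumptions, not figures from any standard; the sketch simply assumes a constant fractional frequency offset.

```python
def time_error(freq_offset, elapsed_s):
    """Accumulated time error (seconds) of a clock running with a
    constant fractional frequency offset for elapsed_s seconds."""
    return freq_offset * elapsed_s

def max_recalibration_interval(freq_offset, error_budget_s):
    """Longest interval between recalibrations that keeps the
    accumulated error within budget, assuming a constant offset."""
    return error_budget_s / freq_offset

# A clock off by 1 part in 10^9 gains roughly 86 us per day.
print(time_error(1e-9, 86400))
# To stay within a 10 us budget it must be recalibrated
# about every 10,000 s (roughly 2.8 hours).
print(max_recalibration_interval(1e-9, 10e-6))
```

Two subsystems with different offsets diverge from each other at the sum of their individual drift rates, which is why both must be recalibrated against the same reference.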

CDMAone and cdma2000 demand the toughest requirements. The deviation is specified not to exceed 1 part in 10^10, or 7.5 μs over a 24-hour period. Comparatively, UMTS wideband code-division multiple access (W-CDMA) and GSM networks require an accuracy of 5 parts in 10^8, or 4.3 ms over the same period. To achieve those levels of reliable synchronization, subsystems must have accurate access to a reliable reference clock and excellent holdover abilities for when the reference clock is not available.
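A quick arithmetic check ties these figures together (assuming the CDMA limit is 1 part in 10^10 and the UMTS/GSM limit is 5 parts in 10^8, as read above): fractional frequency stability times elapsed seconds gives worst-case time drift.

```python
DAY_S = 24 * 3600  # seconds in 24 hours: 86,400

def drift_over(freq_stability, seconds):
    """Worst-case time drift (seconds) accumulated at a given
    fractional frequency stability over the elapsed time."""
    return freq_stability * seconds

# CDMA: 1 part in 10^10 accumulates about 8.6 us per day,
# the same order as the 7.5 us budget (meeting 7.5 us exactly
# corresponds to roughly 8.7e-11).
print(drift_over(1e-10, DAY_S) * 1e6, "us/day")
print(7.5e-6 / DAY_S, "fractional stability for 7.5 us/day")

# UMTS/GSM: 5 parts in 10^8 accumulates about 4.3 ms per day.
print(drift_over(5e-8, DAY_S) * 1e3, "ms/day")
```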

Traditionally, reference clocks have been sent over the network itself. Network timing, however, has become less reliable as networks have become more complex and the latency of timing packets between each node becomes more difficult to calculate. For example, SONET and SDH pointer adjustments can add ambiguity, and thus error, to timing packets as they travel through a network.

The timing packet itself is not corrupted in transit. Problems start to arise when SONET payloads are brought from the optical domain back to the electrical domain. At that point, the payload position can shift by a byte or so, based on adjustments that have been made to the pointers. Each such shift introduces a phase error into latency calculations equal to the length of the adjustment. Even for the UMTS standard, this error could be enough to cause an improper handoff or a dropped call. Network timing may be adequate most of the time, but it cannot guarantee adherence to the minimum requirements for high-speed data transfers across highly constrained wireless networks under all operating conditions.
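The size of a pointer-adjustment phase step can be estimated from the line rate. The sketch below assumes a one-byte payload shift (eight unit intervals) and uses the standard STS-1 and STS-3 line rates; the function name and the one-byte default are illustrative assumptions.

```python
def pointer_adjustment_phase_error(line_rate_bps, bytes_moved=1):
    """Phase step (seconds) caused by a SONET pointer adjustment
    that shifts the payload by bytes_moved bytes at the given
    line rate. One byte is eight unit intervals (bit periods)."""
    unit_interval = 1.0 / line_rate_bps
    return bytes_moved * 8 * unit_interval

STS1 = 51.84e6    # STS-1 line rate, bit/s
STS3 = 155.52e6   # STS-3/OC-3 line rate, bit/s

# One byte at STS-1 is a phase step of roughly 154 ns;
# at STS-3 it is roughly 51 ns.
print(pointer_adjustment_phase_error(STS1) * 1e9, "ns")
print(pointer_adjustment_phase_error(STS3) * 1e9, "ns")
```

Even a handful of such uncorrelated steps accumulated along a path can consume a meaningful fraction of a microsecond-scale synchronization budget.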

GPS is an excellent candidate for transporting a reference clock because each subsystem can have a direct connection to the same reference clock instead of an indirect connection over an unreliable network link. Since GPS is universally available, it avoids many potential interoperability issues that can arise between equipment from different vendors or networks run by different operators. There are situations where GPS is not an appropriate synchronization source, such as for fast-moving objects, but given that base stations tend to be stationary, these are not applicable to wireless networks. Hence, it is not surprising that use of GPS has been mandated by CDMA requirements.

For some cost-sensitive applications, it is not feasible to provide a GPS receiver at every node. A building or campus, for example, may support a number of picocells that link to a single base station. In such a case, the cost of a picocell can be minimized by leveraging a single central, reliable timing source and propagating a reference clock throughout the picocell network. The base station could house the GPS receiver and redistribute the time, i.e., the reference clock, to the picocells.

The prevalence of CAT5 cable or other network infrastructure in buildings offers a readily accessible and low-cost means for reliably distributing clock signals. By fixing the length of the wire, the picocell or central clock source can compensate for the known latency of the wire. By avoiding having to distribute the reference clock wirelessly, you increase the reliability and availability of the signal.
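Compensating for a fixed cable run is a matter of adding the known propagation delay back to the distributed timestamp. The velocity factor of 0.64 used below is an assumed nominal figure for CAT5 twisted pair, and the function names are illustrative.

```python
C = 299_792_458.0        # speed of light in vacuum, m/s
VELOCITY_FACTOR = 0.64   # assumed nominal value for CAT5 twisted pair

def cable_delay(length_m, vf=VELOCITY_FACTOR):
    """One-way propagation delay (seconds) of a cable of known
    length, given its velocity factor."""
    return length_m / (C * vf)

def compensated_timestamp(source_time_s, length_m):
    """Time as seen at the picocell: the distributed timestamp
    advanced by the fixed, known cable delay."""
    return source_time_s + cable_delay(length_m)

# 100 m of CAT5 adds roughly half a microsecond of delay,
# which the picocell can subtract out exactly because the
# cable length is fixed.
print(cable_delay(100.0) * 1e9, "ns")
```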

My watch has stopped

Locking on a GPS signal provides a reference clock reliable enough for you to run your network from. However, GPS is not 100 percent reliable. Perhaps the subsystem cannot find enough satellites or environmental conditions make the signal difficult to lock onto.

The U.S. Navy, which uses GPS for navigation, requires navigation personnel to practice dead reckoning for those times when GPS is not available. Since a little cloud cover should not shut down a cellular network, base stations must likewise practice their own form of dead reckoning, known as clock holdover. Normally, system clocks recalibrate to the reference clock each time it is made available. Holdover occurs when a clock holds the last frequency at which it was clocking when the reference clock signal failed to arrive. For a CDMA network, the clock must hold the reference with drift of less than 7.5 μs over a 24-hour period during which the GPS/reference clock is unavailable.
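A holdover budget check can be sketched directly from the numbers above. The simplification here, stated as an assumption, is that the oscillator's frequency offset stays constant for the whole outage; real holdover clocks also drift in frequency through aging and temperature.

```python
DAY_S = 86400.0
CDMA_BUDGET_S = 7.5e-6   # max drift over a 24-hour GPS outage

def holdover_ok(freq_stability, outage_s, budget_s=CDMA_BUDGET_S):
    """True if a clock with the given fractional frequency
    stability stays within the drift budget for the whole outage,
    assuming the offset is constant (a simplification)."""
    return freq_stability * outage_s <= budget_s

# 1 part in 10^10 drifts about 8.6 us/day: just over budget.
print(holdover_ok(1e-10, DAY_S))   # False
# 5 parts in 10^11 drifts about 4.3 us/day: within budget.
print(holdover_ok(5e-11, DAY_S))   # True
```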

Two standard types of clock used in base stations today are up to the task of holding a reference with such accuracy: highly precise ovenized quartz oscillators and rubidium atomic clocks. Quartz oscillators are the less-expensive option but require additional components; quartz performance changes over temperature, for example, so this has to be managed. Rubidium atomic clocks, on the other hand, provide stability that is an order of magnitude better than CDMA networks require.

The plot shows seven compact rubidium units achieving lock within a 5-minute window with an absolute accuracy better than 1 part per billion. In contrast, quartz resonators depend on the bulk acoustic properties of a crystal and can take hours to achieve wireless levels of accuracy, even with a GPS reference.

The primary difference between the two clock types shows up after a loss of power: how long the clock takes to regain stability once the reference source has been lost. Quartz clocks can take four to 24 hours to stabilize in frequency; as the quartz warms, its stability improves. Rubidium atomic clocks reach 98 percent of their stable frequency within minutes.

Alternatives to GPS

When an absolute time code is not required, such as for UMTS, you can free-run a local clock as an alternative and use it as the reference source so long as the long-term stability is 50 parts per billion over a 10-year period. This stability requirement pushes quartz technology to its edge when there is no external reference source such as GPS, but a rubidium atomic clock will achieve this level of precision. Rubidium resonators can run for longer than 10 years and exceed the UMTS requirements.
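The 10-year free-run requirement can be sanity-checked against oscillator aging. The aging rates below (roughly 5 parts in 10^10 per year for rubidium, 1 part in 10^8 per year for quartz) are assumed, typical-order figures for illustration, not specified values, and the sketch assumes linear aging.

```python
UMTS_FREE_RUN = 50e-9   # 50 parts per billion long-term stability

def meets_free_run(aging_per_year, years=10, limit=UMTS_FREE_RUN):
    """True if the total fractional frequency change from aging
    stays within the free-run limit over the period, assuming
    linear aging (a simplification)."""
    return aging_per_year * years <= limit

# A rubidium unit aging ~5e-10/year (assumed figure) accumulates
# only 5e-9 in 10 years, well inside 50 ppb.
print(meets_free_run(5e-10))   # True
# A quartz oscillator aging ~1e-8/year (assumed figure) would
# accumulate 1e-7 and exceed the limit.
print(meets_free_run(1e-8))    # False
```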

Wireless spectrum is the most limited resource in a wireless network. While there is a theoretical maximum number of channels you can squeeze into a band, there is a practical limit based on how accurately signals are transmitted, how well the system is cooled and other noise considerations.

Today, there is a GPS receiver in just about every CDMA base station worldwide per the IS-95 standard for CDMAone.

GPS provides the most accurate and universal reference clock for network synchronization. Since base stations can directly access the signal, clock distribution is reliable as well. With the right system clock, the reference clock can be accurately maintained even during long stretches without access to the GPS signal, as well as after recovery from power loss. Using fixed network infrastructure, GPS can act as a central reference clock source for networks that cannot absorb the added cost of a GPS receiver and system clock at every node. And with precise access to the time, networks can maximize efficiency and channel capacity, proving that indeed, time is money.

- Phil Mann

VP of OEM Business

Symmetricom Inc.
