When it comes to 5G, no band is perfect. In low bands, 5G coverage is maximized and cell sizes can be quite large, but the performance of 5G New Radio is limited by the smaller available channel bandwidths. Meanwhile, high-band, millimeter-wave 5G deployments have a number of known RF challenges: Performance can be excellent because of the capacity and bandwidth available, but coverage is a huge and expensive challenge because of the frequencies’ short range and poor penetration.
But there are RF issues to deal with even in the “Goldilocks” midband range of new spectrum that is coming online. Here are three.
In CBRS, power levels are a sticking point
For mobile operators, the power level limitations of CBRS pose a significant conundrum in reconciling use of the band as part of deployment. This has been reflected in comments in the FCC record, with multiple parties, from AT&T and Dish Network to the Competitive Carriers Association, urging the FCC to allow higher-power operations in the band. Dish, in a 2021 filing prior to the 3.45 GHz auction, said that the disparate power levels between CBRS and other cellular bands “leave a relatively encumbered CBRS band sandwiched between the 3.45 GHz Band and the 3.7 GHz Band, both of which … have services rules optimized for large-scale, wide-channel 5G service offerings. This is akin to connecting two cities with a new 8-lane high-speed roadway, but constructing a stretch of single lane road in the middle. The current U.S. approach undermines the overall usefulness of the entire 3 GHz band, and places the United States at odds with our major global competitors, especially China.” So far, however, the FCC has shown no inclination to change the existing power level limits or create a new class of CBRS devices that could operate at higher power.
In addition to the power level issues, there are also concerns about interference between 3.45 GHz operations and the ESC receivers that detect incumbent activity in the 3.55-3.65 GHz portion of the CBRS band. As noted in a WInnForum document on co-existence between CBRS and ESC sensors, those operations are immediately adjacent to each other. “Because there is no guard band between the two, the ability of ESC sensors to adequately perform their sensing functions could be impacted by strong signals from systems operating in the 3.45 GHz Service,” the document says, with the “dominant predicted impact” being blocking interference from the 3.45 GHz Service rather than out-of-band impacts from CBRS to 3.45 GHz. That interference from 3.45 GHz “could render an ESC sensor inoperable, causing a false activation of incumbent radar activity,” which would prompt the SAS to move CBRS users out of the band and disrupt operations unnecessarily. Again, this plays out in part as a receiver issue: As WInnForum notes, its specs for CBRS ESC filters focused on spectrum activity above 3.65 GHz, because at the time the spec was developed, there were not even plans for a 3.45 GHz service that the receivers would have to filter. (ESC sensors, it should also be noted, live in so-called “whisper zones” for CBRS in-band transmission and are treated by the SAS as incumbents in the band, so the SAS limits CBRS channel assignments around ESCs, which also impacts the ability to deploy CBRS near ESCs along the coasts.)
TDD vs. FDD operations
Whether an operator ran an FDD or TDD network used to be a fairly significant difference—but now, post-CBRS and post-C-Band auctions (both of which are TDD spectrum), all the national U.S. operators are doing both.
“It’s not really talked too much about from an operator standpoint, but when you start frequency coordination, the impact of timing and using GPS timing is very critical,” says Tim Sill, VP of technology and business development at Alpha Wireless. He spent 16 years at Sprint, which put years of effort and cost into clearing and then using TDD spectrum. TDD, he says, “has nuances that you have to get your arms around. … You have to understand those nuances and some of the uniquenesses about it.” Loss of GPS signal for an extended period of time is one potential risk to such systems’ operational capabilities, he explains, because eventually the performance of the site would degrade to the point where most calls drop. As an example, he mentions tropospheric ducting, in which temperature inversions in the atmosphere can trap TDD signals and channel them over long distances, such that base stations hundreds of miles away can be affected.
“With the evolution of technology and also regulations, it opened the door for higher frequencies. When we go to higher frequencies, we also [have] the possibility of using TDD in a more optimized way,” says Fabiano Chaves, senior spectrum standardization specialist at Nokia. “When you go up in frequency and you have wider blocks, it’s really quite more optimal to have TDD because we also have more flexibility.” The smaller channel sizes and dedicated uplink/downlink spectrum in lower bands mean that there is little opportunity to optimize uplink vs. downlink based on traffic needs, he explains. “You have the flexibility with TDD to make adjustments to the uplink/downlink proportion,” Chaves says. “It really depends on how the demand is. With TDD, you can adjust this with the network tools that you have for that.”
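Chaves’ point about adjusting the uplink/downlink proportion can be illustrated with a little arithmetic. The slot patterns below are hypothetical examples, not any operator’s actual configuration; a minimal sketch:

```python
# Illustrative sketch: how a TDD slot pattern sets the downlink/uplink
# capacity split. The patterns are hypothetical examples only.

def dl_ul_split(pattern: str) -> tuple[float, float]:
    """Return (downlink, uplink) slot fractions for a TDD pattern.

    'D' = downlink slot, 'U' = uplink slot, 'S' = special/guard slot
    (counted toward neither direction here, for simplicity).
    """
    n = len(pattern)
    return pattern.count("D") / n, pattern.count("U") / n

# A downlink-heavy pattern vs. a more uplink-friendly one:
print(dl_ul_split("DDDSU"))  # (0.6, 0.2)
print(dl_ul_split("DDSUU"))  # (0.4, 0.4)
```

With dedicated FDD carriers the split is fixed by the band plan; with TDD, changing the pattern reallocates the same spectrum between directions as demand shifts.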
The caveat here is that not only would all of an operator’s neighboring TDD base stations have to be synchronized, but neighboring TDD operators would have to coordinate with one another to make sure they’re not stepping on each other’s transmit/receive patterns and causing what is known as cross-link interference. Timing and synchronization are particularly important in TDD systems. In an early 2022 LinkedIn post on tracking down interference in a CBRS system, Adam Wohld, an RF deployment engineer and solutions architect, related an instance of an operator struggling with high noise in CBRS channels that couldn’t be located on a spectrum analyzer, but which required a PCI scanner to find the TDD-LTE interference. He discovered it was coming from multiple sites, 10 and 17 miles away, that were transmitting when the CBRS site was receiving. “CBRS spectrum allocation is about more than frequency, it’s also about timing. Being able to survey and inspect neighboring site configurations over the air is critical to maintaining LTE and 5G performance on CBRS,” Wohld wrote.
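Wohld’s 10- and 17-mile figures line up with simple propagation arithmetic: a distant TDD site’s downlink arrives after a propagation delay, and if that delay exceeds the downlink-to-uplink guard period, the late signal lands in a nearby site’s receive window. A back-of-the-envelope sketch, using an assumed (not spec-derived) guard period:

```python
# Back-of-the-envelope sketch: why TDD sites miles away can interfere.
# The guard period below is an illustrative assumption, not a spec value.

C = 299_792_458.0          # speed of light, m/s
METERS_PER_MILE = 1609.344

def one_way_delay_us(miles: float) -> float:
    """One-way free-space propagation delay in microseconds."""
    return miles * METERS_PER_MILE / C * 1e6

GUARD_PERIOD_US = 20.0  # assumed guard period, for illustration only

for miles in (10, 17):
    delay = one_way_delay_us(miles)
    verdict = "exceeds" if delay > GUARD_PERIOD_US else "within"
    print(f"{miles} mi -> {delay:.1f} us delay; {verdict} "
          f"{GUARD_PERIOD_US:.0f} us guard period")
```

At 17 miles the one-way delay is roughly 91 microseconds, so a downlink burst from that far away can easily outlast a short guard period and bleed into a receiving site’s uplink slots.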
Intermod interference in 5G
Passive intermod, or PIM, is a well-known issue in LTE, particularly in FDD, where systems transmit and receive at the same time. In LTE, PIM can be caused by loose or corroded connectors, or by nearby metallic or rusted objects that reflect RF energy. It can be spotted on a spectrum analyzer by a characteristic “shark fin”-shaped arc, notes Angus Robinson, senior marketing manager at Anritsu.
There is indeed PIM in 5G systems, Robinson adds, and Anritsu’s customers are typically encountering it when they are deploying midband spectrum at the same site as existing LTE systems in lower bands. “It’s not unique to the United States, but I think it’s more prevalent in the United States because in the United States there are more frequencies being used for LTE. So at one cell site, before you’ve got 5G, you’ve got 700/800 LTE, you’ve got 1800 MHz LTE, you’ve got 2.3, 2.4 GHz LTE. And it’s almost inevitable that the intermod products going from those base stations are going to fall in the 3.5 GHz band,” he says. That wasn’t obvious going into 5G. Robinson says Anritsu, which has established its equipment and solutions as helping to address PIM in previous generations of cellular systems, was skeptical that PIM would be an issue in 5G for a number of reasons: There is often beam-steering involved, so the RF energy is more directed and less likely to strike a PIM source, and the systems are mostly TDD, which is less prone to PIM. “But actually, it’s not the inbound PIM that’s the problem,” he clarifies. “It’s the PIM resulting from PCS and AWS meeting with 600 or 700 or 800 MHz LTE and generating PIM products in the 3.5 GHz band.” The intermod products are second- and third-order products, so they are 40 or more megahertz wide, and they tend to be low-power and fall in both the uplink and the downlink. The downlink power can usually overpower the intermod, he says; it’s in the uplink where the presence of the interference really matters, because it impacts the weaker signal coming from a handset rather than the transmission from the base station. Thus, for a device at the cell edge, he notes, even low-power intermod in the uplink can impact performance.
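Robinson’s description of PCS and AWS mixing with low-band LTE can be sketched numerically. The carrier center frequencies below are illustrative placeholders, not a real site’s channel plan:

```python
# Hedged sketch: enumerating third-order intermod products from legacy
# LTE downlink carriers and flagging those that land in midband 5G
# spectrum. Carrier frequencies are illustrative placeholders.
from itertools import product

# Hypothetical LTE downlink centers, MHz: 600-band, 700-band, PCS, AWS
CARRIERS_MHZ = [634.0, 739.0, 1960.0, 2132.0]
MIDBAND_MHZ = (3300.0, 3800.0)  # roughly CBRS plus C-band

def third_order_products(carriers):
    """All f_i + f_j - f_k combinations (classic third-order intermod)."""
    hits = set()
    for fi, fj, fk in product(carriers, repeat=3):
        f = fi + fj - fk
        if f > 0:
            hits.add(round(f, 1))
    return sorted(hits)

in_band = [f for f in third_order_products(CARRIERS_MHZ)
           if MIDBAND_MHZ[0] <= f <= MIDBAND_MHZ[1]]
print(in_band)
```

With these assumed carriers, combinations such as PCS + AWS − 700 MHz LTE (1960 + 2132 − 739 = 3353 MHz) land squarely in midband 5G spectrum, which is exactly the mixing mechanism Robinson describes.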
While some might quibble about whether to consider this “intermod” rather than “passive intermod,” the point is that there is cellular interference being generated by a cellular system—only instead of an LTE base station interfering with its own operations, the intermod here is the result of legacy LTE transmitters generating intermod products that are interfering with midband 5G operations. Whatever you call it, Robinson says, “I absolutely promise you, it’s a really big and significant issue that operators have only really started to realize and test for in the last six to 12 months.”
The post Three RF issues in midband spectrum appeared first on RCR Wireless News.