Much of the discussion around a new generation of mobile technology traditionally focusses on its increased throughput, and 5G seems no different. Yet everyone from Zoom video conference users to PlayerUnknown’s Battlegrounds players regularly experiences the other key characteristic of digital communications systems: latency. Reducing latency is a key focus of the 5G standards, as low latency is seen as essential to enabling new use cases such as mobile gaming and industrial automation. The battle to deliver on those goals is just beginning.
The 5G era requires low latency to deliver on its promise
Higher latency, or “ping time”, means that in Fortnite you’ll find yourself “killed” before you’ve even seen your opponent; in a game of FIFA, you’ve conceded a goal before you’ve seen the ball kicked. High latency renders games unplayable, causes people to talk over each other in video conferences, and risks killing the market for consumer applications such as virtual and augmented reality (VR and AR) that many hope will be major 5G revenue opportunities, but which are highly latency-sensitive.
Many innovations have been included in the 5G standards with the target of reducing latency by 10 or 20 times compared to 4G. The degree to which these become a reality will in large part depend on how well the industry implements them.
Channel coding as a latency challenge
As mobile devices of all shapes, sizes and form factors communicate with fixed infrastructure, wireless transmissions are susceptible to noise and interference. If uncorrected, this leads to the received message differing from the transmitted message, forcing the receiver to request a retransmission. The impact is increased latency.
To counter this, before sending a message, transmitters use a channel encoder to protect the communication from these transmission errors by adding additional redundant bits, which enable the receiver decoding the transmitted signal to reconstruct the message correctly – a process called forward error correction, or channel coding. Encoding and especially decoding the message requires complex algorithms which, if not implemented efficiently, can introduce their own delays and hence increase latency.
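To make the idea concrete, here is a minimal sketch of forward error correction in Python. It uses the classic Hamming(7,4) code rather than the far larger LDPC and Polar codes that 5G actually specifies: the encoder adds three parity bits to four data bits, and the decoder uses them to locate and flip a single corrupted bit, avoiding a retransmission entirely.

# Toy forward error correction with the classic Hamming(7,4) code, not a 5G
# code: three parity bits protect four data bits, so any single bit error
# can be corrected at the receiver without a retransmission.

def hamming74_encode(d):
    """Encode 4 data bits as a 7-bit codeword (p1 p2 d1 p3 d2 d3 d4)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4            # parity over codeword positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4            # parity over codeword positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4            # parity over codeword positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct up to one flipped bit, then return the 4 data bits."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3    # 0 = no error, else the error position
    if syndrome:
        c[syndrome - 1] ^= 1           # flip the corrupted bit back
    return [c[2], c[4], c[5], c[6]]

codeword = hamming74_encode([1, 0, 1, 1])
codeword[4] ^= 1                       # simulate one channel error
assert hamming74_decode(codeword) == [1, 0, 1, 1]

Real 5G decoders work on soft channel information and thousands of bits at a time, but the principle is the same: redundancy is traded for resilience, and the decoding work becomes the latency bottleneck if implemented naively.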
Different types of channel codes exist, each coming with its own benefits and challenges. For 5G, Low Density Parity Check (LDPC) coding has been chosen for the data channel, where the user’s information is transmitted; Polar coding has been chosen for the control channel, where the mobile and the base station exchange control information.
Delivering channel coding performance
While the coding schemes themselves have been standardised, the implementation techniques and algorithms for these channel codes, which lie outside the standardisation process, can greatly reduce latency and improve quality of service (QoS).
Overcoming the latency challenge and unlocking the full power of 5G is the focus of a growing band of channel coding experts, combining in-depth knowledge with unique insights into how the industry should implement LDPC and Polar coding.
This expertise is critical because, whilst LDPC codes have been used in Wi-Fi and satellite communications in the past, 5G has adopted the most flexible and most capable LDPC code ever standardised. This flexibility is key to allowing New Radio to meet the requirements of the diverse range of 5G use cases but presents its own challenges.
4G has fixed radio timing, whereas 5G supports up to five timing options (referred to as numerologies). 5G also supports a much wider range of frequencies, including those above 30 GHz (referred to as mmWave), and throughputs up to 20 times those of LTE. The channel coders and decoders must deliver the required performance for all these different options.
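To see what the numerologies mean in practice, the short Python snippet below computes the subcarrier spacing and slot duration for each of the five options defined in 3GPP TS 38.211. Shortening the slot is one of the main levers NR uses to cut air-interface latency relative to LTE’s fixed 15 kHz, 1 ms timing.

# 5G NR numerology: subcarrier spacing scales as 15 kHz * 2^mu, and the slot
# duration shrinks by the same factor.
for mu in range(5):                   # the five numerologies, mu = 0..4
    scs_khz = 15 * 2 ** mu            # subcarrier spacing in kHz
    slot_ms = 1.0 / 2 ** mu           # slot duration in milliseconds
    print(f"mu={mu}: {scs_khz:>3} kHz subcarrier spacing, {slot_ms:.4f} ms slot")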
To move forward and unlock 5G’s full potential, the industry must overcome the implementation challenges that LDPC and Polar codes present. Implementing flexible, high-performance decoders is possible – but it requires a fresh approach.
For example, LDPC decoders need many processing elements working in parallel to meet the performance requirements, and there are two established architectures. One can efficiently achieve the throughput requirements, but at relatively high latency, whereas the other achieves lower latency but is efficient at only some throughputs. At AccelerComm we have developed a unique approach that delivers efficiently on both requirements.
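To illustrate where that parallelism comes from, here is a toy hard-decision bit-flipping decoder in Python. It is far simpler than the soft-decision decoders used in real 5G silicon, and its parity-check matrix is a small textbook example rather than one of the quasi-cyclic matrices of 3GPP TS 38.212, but it shows the key property: every parity check can be evaluated independently, and therefore in parallel.

import numpy as np

# Toy hard-decision bit-flipping decoder. H is a small textbook parity-check
# matrix, not a 5G one; the point is that the syndrome (all parity checks)
# can be computed in parallel, which is what LDPC hardware exploits.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def bit_flip_decode(received, max_iters=10):
    c = received.copy()
    for _ in range(max_iters):
        syndrome = (H @ c) % 2        # which parity checks fail (all at once)
        if not syndrome.any():
            break                     # every check satisfied: done
        votes = syndrome @ H          # failed-check count per bit
        c[np.argmax(votes)] ^= 1      # flip the bit failing the most checks
    return c

codeword = np.array([1, 0, 1, 1, 0, 1, 0])   # satisfies (H @ codeword) % 2 == 0
received = codeword.copy()
received[0] ^= 1                             # one channel error
assert (bit_flip_decode(received) == codeword).all()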
Compared with LDPC codes, Polar codes are far less mature, and a conventional approach to their implementation has not yet emerged. The encoder and decoder of a Polar code have regular but intricate structures. AccelerComm’s patented approach delivers a highly efficient, complete, 3GPP-compliant implementation that simplifies integration.
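That regular structure is easiest to see in the encoder, which is the n-th Kronecker power of a 2x2 kernel and can be computed with an FFT-like butterfly network, as the Python sketch below shows. Frozen-bit selection and the successive-cancellation decoder, where most of the implementation difficulty actually lies, are deliberately omitted.

# Polar encoding: the n-th Kronecker power of the kernel [[1,0],[1,1]],
# computed as a butterfly network. Encoder only; no frozen bits, no decoder.

def polar_encode(u):
    """Encode a bit list whose length is a power of two."""
    x = list(u)
    n = len(x)
    stride = 1
    while stride < n:
        for i in range(0, n, 2 * stride):
            for j in range(i, i + stride):
                x[j] ^= x[j + stride]   # butterfly: upper branch absorbs lower
        stride *= 2
    return x

print(polar_encode([1, 0, 1, 1]))       # -> [1, 1, 0, 1]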
Integrating the channel coding building blocks
You can’t build a “one size fits all” solution when trading off latency, area, throughput and power for multi-market products – be they handset modems, small cells or vRAN infrastructure. Channel coding has to be addressed with configurability at the design stage to enable optimal, cost-effective design choices across not only FPGA and ASIC implementations, but also software-only virtualised network infrastructure.
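As a purely hypothetical illustration (the names below are invented for this sketch and are not AccelerComm’s actual API), design-time configurability might look like this: the same decoder core instantiated with different parallelism and iteration budgets, trading silicon area and power for latency in each target market.

# Hypothetical design-time configuration sketch; names are illustrative only.
from dataclasses import dataclass

@dataclass
class LdpcDecoderConfig:
    parallel_check_nodes: int    # more parallelism: lower latency, more area/power
    max_iterations: int          # fewer iterations: lower latency, higher error rate
    early_termination: bool      # stop as soon as all parity checks pass

handset_modem = LdpcDecoderConfig(parallel_check_nodes=8,  max_iterations=10, early_termination=True)
vran_server   = LdpcDecoderConfig(parallel_check_nodes=64, max_iterations=20, early_termination=True)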
Innovation in this area includes patented encode and decode IP for both Polar and LDPC coding. For example, AccelerComm has collaborated with Intel to develop a new, highly optimised LDPC decoder in software for its FlexRAN Reference Software. This reference architecture can be implemented in software-based base stations, which can sit in any part of the wireless network, from edge to core.
Integrating AccelerComm’s LDPC decoder within a FlexRAN-based software base station can increase throughput by up to 3x whilst increasing network power efficiency and reducing latency – ultimately delivering a better user experience and potentially reducing system costs. The same designs also deliver market-leading performance when implemented in FPGA and ASIC.
Taking a new look at channel coding
In a research note published in December 2019, Gartner forecast that by 2021, investments in 5G NR network infrastructure will account for 19% of the total wireless infrastructure revenue of communications service providers (CSPs), up from 6% in 2019. While early 5G applications will focus on fixed wireless access (FWA) and enhanced mobile broadband (eMBB), the analyst firm expects the first ultra-reliable low-latency communications (uRLLC) private networks, and then massive machine-type communications (mMTC), to be deployed between 2021 and 2025. These later use cases will really stretch the flexibility of 5G.
As more 5G networks are deployed, devices launched and services commercialised, the pressure to deliver on the 5G subscriber promise will become more acute. Low latency is not a ‘nice to have’ or something that is the sole preserve of niche ultra-low latency use cases. Low latency is a requirement for both today’s mass market applications and tomorrow’s future use cases, be they autonomous vehicles, remote surgery or smart manufacturing. With a relentless downward pressure on revenue and margins, performance must be delivered efficiently.
To design low latency performance fit for the 5G era, mobile operators and equipment manufacturers should address latency at the design stage and integrate the latest channel coding technologies and innovations into both physical and software-defined network elements. By doing so, the industry can realise the full potential of 5G.