• State-of-the-art (e.g., LTE / LTE-A) Turbo decoder implementations are only a slight evolution of a 20-year-old design
  • They suffer from low throughput, high latency, limited error-correction capability and limited scalability, because they support only limited degrees of parallel processing
  • Over 5 years of development, we have drawn upon 50 person-years of Turbo coding experience to redesign not only the hardware implementation, but also the Turbo decoding algorithm itself

The current state-of-the-art solution

  • Because the parallelism within each Turbo decoder is limited, multiple parallel Turbo decoders would be required to achieve the desired throughput (the diameter of the hosepipe in the analogy)
  • This approach becomes inefficient with scale, and it improves neither the latency (the length of the hose in the analogy) nor the error-correction capability
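The trade-off in these bullets can be sketched with simple arithmetic: replicating N identical decoders widens the hosepipe (N times the aggregate throughput) but does not shorten it (each block still takes the same time to decode). A minimal illustration, using hypothetical figures rather than numbers from these slides:

```python
def parallel_decoders(n, throughput_gbps, latency_us):
    """Aggregate figures for n identical decoders running side by side.

    Illustrative model only: assumes ideal load balancing and ignores
    the interconnect/scheduling overhead that makes real replication
    inefficient at scale.
    """
    return {
        "aggregate_throughput_gbps": n * throughput_gbps,  # hosepipe gets wider
        "latency_us": latency_us,                          # hose stays just as long
    }

# Hypothetical single decoder: 0.25 Gbps, 10 us per block
print(parallel_decoders(1, 0.25, 10))
# Eight replicas: 8x the throughput, but the per-block latency is unchanged
print(parallel_decoders(8, 0.25, 10))
```

This is why replication alone cannot meet a latency target: only parallelism *inside* the decoder shortens the time each block spends in flight.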

We make 5G happen through our low-latency Turbo decoder solution

In AccelerComm’s Turbo solution, however, we have unlimited parallelism…

  • We have invented the only solution in the industry that supports an unlimited degree of parallelism within the Turbo decoder, while remaining compatible with any Turbo encoder
  • Only AccelerComm’s Turbo decoding solution can meet the stringent 3GPP New Radio specification, even when compared with state-of-the-art Turbo decoders…


State of the art versus AccelerComm Turbo

Next-generation 5G New Radio (NR) requirement                 | State of the art (limited parallelism)                                               | AccelerComm (unlimited parallelism)
Significantly higher throughput: 2 Gbps – 20 Gbps             | Cannot be achieved; 1 Gbps at best without severe HW inefficiency                    | 2 Gbps in FPGA; 20 Gbps in ASIC
Significantly lower latency: 1 µs – 3 µs                      | Cannot be achieved; at least 1 µs                                                    | 3 µs in FPGA; 1 µs in ASIC
Improved error correction                                     | Needs significant change in encoder design, not conducive to decoder implementation | Significantly improved error correction
Scalable for different use cases, from IoT to cloud computing | Limited parallelism                                                                  | Fully scalable parallelism


Organisations we work with