This vendor-written piece has been edited by Executive Networks Media to eliminate product promotion, but readers should note it will likely favour the submitter's approach.
Channel coding, also known as error control coding, is a foundational building block of almost all modern communication systems. Over the decades there has been a long list of champions and pretenders for the crown of supreme code du jour, or perhaps more accurately, code de la génération. As we approach our fifth generation of wireless, is there anything left for the information theory gang to do? Have we pushed this frontier to its limits?
I would suggest not. The requirements of 5G are already driving a small renaissance in channel coding. But first, a look at how we got here.
Channel coding history
Channel coding is one of the main reasons our wireless networks work the way we like them to: fast and error-free. The general idea is simple. First, pad the information (the packet of bits) at the source node with some redundant bits before transmitting over the communication medium. Then, at the receiving end, exploit that redundancy to overcome the impairments of the channel, e.g. randomness, noise and interference.
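The pad-then-exploit idea can be sketched with the simplest possible channel code: a rate-1/3 repetition code, decoded by majority vote. This is an illustrative toy of my own, not any scheme discussed in the article:

```python
def encode(bits, r=3):
    """Pad the source bits with redundancy: r-1 extra copies of each bit."""
    return [b for b in bits for _ in range(r)]

def decode(received, r=3):
    """Exploit the redundancy at the receiver: majority vote per group of r."""
    return [1 if sum(received[i:i + r]) > r // 2 else 0
            for i in range(0, len(received), r)]

message = [1, 0, 1, 1, 0]
codeword = encode(message)          # 5 information bits become 15 coded bits
codeword[4] ^= 1                    # the channel flips one bit
assert decode(codeword) == message  # the isolated error is voted away
```

Real codes achieve far better trade-offs than repetition, but the division of labour is the same: structured redundancy added at the transmitter, exploited at the receiver.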
This is a simplification, but the central challenge of decades of channel coding research has been to develop methods that create and exploit such redundancy as close to perfectly as possible. That benchmark of perfection was defined by Claude Shannon in 1948, whose classic work told us just how many error-free bits we could ever hope to send through a noisy, bandlimited channel.
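Shannon's limit for a bandlimited channel with Gaussian noise is the Shannon-Hartley capacity, C = B log2(1 + S/N). A quick illustrative calculation; the bandwidth and SNR figures below are my own assumptions, not numbers from the article:

```python
import math

def capacity_bps(bandwidth_hz, snr_linear):
    """Shannon-Hartley limit: max error-free bits/s through a bandlimited AWGN channel."""
    return bandwidth_hz * math.log2(1 + snr_linear)

snr = 10 ** (20 / 10)            # 20 dB SNR -> linear factor of 100
c = capacity_bps(20e6, snr)      # a hypothetical 20 MHz channel
print(f"{c / 1e6:.1f} Mbit/s")   # no code, however clever, can exceed this rate
```

Every code discussed below is ultimately judged by how close it gets to this number at practical complexity.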
One of the very first breakthroughs in channel coding, the so-called Golay codes, was introduced in 1949; their practical deployment aboard NASA's Voyager 1 enabled hundreds of color pictures of Jupiter and Saturn to be sent to Earth. The following decade saw a quantum leap in the performance of wireless communications, driven primarily by the introduction of convolutional codes by Elias in 1955. The key trick was a continuous encoding mechanism at the transmitter paired with trellis-based decoding at the receiver, e.g. via the well-known Viterbi algorithm.
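To make the shift-register encoding and trellis search concrete, here is a minimal sketch of hard-decision Viterbi decoding for the textbook rate-1/2, constraint-length-3 convolutional code (generator polynomials 7 and 5 in octal). The specific code and parameters are my illustrative choice, not taken from the article:

```python
G = [0b111, 0b101]  # generator polynomials, 7 and 5 in octal

def conv_encode(bits):
    """Continuous encoding: a 2-bit shift register emits 2 coded bits per input bit."""
    state, out = 0, []
    for b in bits + [0, 0]:                      # two tail bits flush the register
        reg = (b << 2) | state
        out += [bin(reg & g).count("1") & 1 for g in G]
        state = reg >> 1
    return out

def viterbi_decode(received):
    """Trellis search: keep only the minimum-Hamming-distance path into each state."""
    INF = float("inf")
    metrics = [0, INF, INF, INF]                 # start in the all-zero state
    paths = [[], None, None, None]
    for i in range(0, len(received), 2):
        pair = received[i:i + 2]
        new_metrics, new_paths = [INF] * 4, [None] * 4
        for s in range(4):
            if metrics[s] == INF:
                continue
            for b in (0, 1):                     # extend each survivor by one input bit
                reg = (b << 2) | s
                branch = [bin(reg & g).count("1") & 1 for g in G]
                dist = metrics[s] + sum(x != y for x, y in zip(branch, pair))
                ns = reg >> 1
                if dist < new_metrics[ns]:
                    new_metrics[ns] = dist
                    new_paths[ns] = paths[s] + [b]
        metrics, paths = new_metrics, new_paths
    best = min(range(4), key=lambda s: metrics[s])
    return paths[best][:-2]                      # strip the tail bits

message = [1, 0, 1, 1, 0]
coded = conv_encode(message)
coded[3] ^= 1                                    # inject one channel error
assert viterbi_decode(coded) == message
```

This toy code has free distance 5, so any pattern of up to two channel errors is corrected, which is exactly the kind of gain-for-complexity trade the next paragraph describes.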
This radical shift offered substantial performance gains, albeit with increased processing complexity and power consumption. Supported over time by the ever-increasing computational power provided by Moore's law, along with more power-efficient circuitry, convolutional codes ascended to become the de facto codes for 2G mobile communications, digital video and satellite communications.
Then came Turbo codes. The introduction of Turbo codes by Berrou in 1993 sent shockwaves through the telecommunications community, because for the very first time we had a channel code that performed close to Shannon's limit. Their relatively low complexity for the performance they offer put Turbo codes at the core of the digital and mobile revolution (3G/4G) that started in the early 2000s.