AI for Wireless Communication Systems with MATLAB

Chapter 2

AI Workflow for 5G Channel Estimation


Channel estimation is a fundamental challenge that every modern wireless system must solve. The receiver must determine how the channel is altering the signals sent by the transmitter and estimate the channel response at each point in time and frequency. When channel estimation is done well, throughput goes up and error rates go down.

Traditional channel estimation is based on mathematical fitting methods, such as linear fitting or third-degree polynomial fitting. But channel variability has increased with growing numbers of antennas, a wider range of frequencies, and varying environments.

Using AI, you can train a model to observe channel behavior and make accurate estimates despite the large number of parameters. An AI-based model can perform signal detection and classification in a few milliseconds, which is faster than traditional methods. Because the operations inside the AI-based model are simple, it can also reduce power consumption and computational requirements.

This chapter will walk you through the process, from data preparation to modeling, simulation, and deployment, of using deep learning to build a convolutional neural network (CNN) that performs 5G channel estimation. When complete, the AI model will make it possible for you to improve overall wireless system performance without changing any other part of the system.

Left-to-right blocks show the iterative steps for developing AI-based 5G channel estimation, starting with synthesis of 5G standard-compliant waveforms, moving to AI algorithm design, and ending with deployment of FPGA HDL code.

MATLAB can help you create 5G-compliant waveforms, use that data to train an AI-based channel estimation model, simulate, test, and optimize the model, and deploy it.


Data Preparation

The first step in the process of creating an AI-based model for channel estimation is to generate 5G-compliant waveforms to use to train your model. The training data must be robust: not only standard-compliant, but also comprehensive and representative of realistic channel impairments and scenarios.

MATLAB makes it easy to generate standard-compliant waveforms and robust data sets. To create a data set to train an AI-based channel estimation model:

  • Use the Wireless Waveform Generator app to generate 5G-standard waveforms.
  • Augment those signals in Wireless Waveform Generator to make the data set more representative of reality by adding distortions that the signals will face in the real world. With a simple drop-down menu, you can add Gaussian noise, phase noise, or frequency offset (a command-line equivalent is sketched after this list).
  • Use the Signal Labeler app to apply domain expertise to your data set. Labeled data helps with signal characterization during training and builds human intelligence into the model.
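
If you prefer to script this step, the same data can be synthesized programmatically. The sketch below is a minimal example, assuming 5G Toolbox and Communications Toolbox are available; the SNR value is illustrative.

  % Generate a standard-compliant 5G NR downlink waveform (default configuration)
  cfg = nrDLCarrierConfig;
  [txWaveform, waveformInfo] = nrWaveformGenerator(cfg);

  % Add additive white Gaussian noise, similar to the impairment options in the app
  snrdB = 10;                                        % illustrative SNR
  rxWaveform = awgn(txWaveform, snrdB, 'measured');
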
A screenshot of the Wireless Waveform Generator app shows the synthesis of 5G waveforms.

Wireless Waveform Generator provides options for selecting waveform standards and frequency ranges and adding impairments before generating waveform data.

Once your data is collected and labeled, you will need to process it into a form that can be used as input to train an AI model. For example, you can plot time on the y-axis and frequency on the x-axis and capture signal strength at each time and frequency coordinate as a color to create a heat map. This creates a series of images that lend themselves to being fed into deep learning networks designed for image processing.
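
As a rough sketch of that step, the following converts a received resource grid into a normalized heat-map image. It assumes rxGrid (subcarriers by OFDM symbols) was produced by an earlier demodulation step such as nrOFDMDemodulate.

  rxGridMag = abs(rxGrid);            % signal strength at each resource element
  rxGridImg = rescale(rxGridMag);     % normalize to [0,1] for use as an image
  imagesc(rxGridImg.'); axis xy;      % time on the y-axis, frequency on the x-axis
  xlabel('Subcarrier (frequency)'); ylabel('OFDM symbol (time)');
  colorbar;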

You will also want to split your data into training data and validation data so that you have a data set to use to validate and tune your model once it is trained.
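
A minimal split might look like the following, assuming the grid images and their targets are stored as 4-D arrays; the 80/20 ratio is illustrative.

  numExamples = size(trainData, 4);            % trainData: H-by-W-by-C-by-N images
  idx    = randperm(numExamples);              % shuffle before splitting
  numVal = round(0.2*numExamples);             % hold out 20% for validation

  valData   = trainData(:,:,:,idx(1:numVal));
  valLabels = trainLabels(:,:,:,idx(1:numVal));
  trnData   = trainData(:,:,:,idx(numVal+1:end));
  trnLabels = trainLabels(:,:,:,idx(numVal+1:end));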

How you collect, manage, and label data will depend on your specific project. In some projects, you might be able to capture real-world data that sufficiently enables you to train a model.

A color-coded plot of signal strengths from a 5G data set, with time on the y-axis and frequency on the x-axis.

The Wireless Waveform Generator helps you create robust synthetic waveform data sets for a range of standards.

When that is not possible, you can consider using synthesized data to represent what a real system will see. It can be tricky to recreate the conditions that are seen in the field with synthesized data. MATLAB can help you recreate real-world conditions with its extensive library of typical channel impairments.
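
For example, the synthesized waveform can be passed through a standard-defined fading channel model to mimic real-world propagation. This is a minimal sketch assuming 5G Toolbox; the delay profile, delay spread, and Doppler values are illustrative, and the carrier configuration should match the generated waveform.

  carrier  = nrCarrierConfig;                  % numerology matching the waveform
  ofdmInfo = nrOFDMInfo(carrier);

  channel = nrTDLChannel;                      % tapped-delay-line fading channel
  channel.DelayProfile        = 'TDL-C';
  channel.DelaySpread         = 300e-9;        % seconds
  channel.MaximumDopplerShift = 50;            % Hz
  channel.SampleRate          = ofdmInfo.SampleRate;

  rxFaded    = channel(txWaveform);            % txWaveform from the earlier sketch
  rxWaveform = awgn(rxFaded, snrdB, 'measured');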


AI Modeling

Within MATLAB, you have direct access to common AI algorithms used for classification and prediction, including regression, deep networks, and clustering. Your first step in building an AI model is to choose an approach, such as building a CNN to perform channel estimation.

A CNN is a great choice for this AI model because CNNs excel at image processing. They also support transfer learning, so your model can build upon pre-existing trained image processing networks, such as GoogLeNet or AlexNet.

A diagram showing a convolutional neural network for channel estimation that takes as input a received signal with pilot symbols and outputs channel estimation.

The channel estimation CNN will take in labeled images that represent 5G waveforms with pilot symbols and return an estimation of channel distortion.
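
In code, such a network can be expressed as a simple layer stack before training. The sketch below is illustrative only: the 612-by-14 input (one resource grid image) and the layer and filter sizes are assumptions, not a prescribed architecture, and channel estimation is framed here as image regression.

  layers = [
      imageInputLayer([612 14 1], 'Normalization', 'none')    % one resource grid image
      convolution2dLayer(9, 64, 'Padding', 'same')
      reluLayer
      convolution2dLayer(5, 32, 'Padding', 'same')
      reluLayer
      convolution2dLayer(5, 1, 'Padding', 'same')              % estimated channel grid
      regressionLayer
      ];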

To build the CNN, use the Deep Network Designer app to design and train the neural network. You can:

  • Import the data you generated and visualize the training process.
  • Accelerate training without any specialized programming using Parallel Computing Toolbox, as sketched below.
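
The equivalent command-line training step might look like this, assuming the layers and the trnData/valData split from the earlier sketches; all hyperparameter values are illustrative.

  options = trainingOptions('adam', ...
      'MaxEpochs', 5, ...
      'MiniBatchSize', 32, ...
      'InitialLearnRate', 1e-3, ...
      'ValidationData', {valData, valLabels}, ...
      'ExecutionEnvironment', 'parallel', ...   % uses Parallel Computing Toolbox; 'gpu' and 'auto' also work
      'Plots', 'training-progress');

  channelEstNet = trainNetwork(trnData, trnLabels, layers, options);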

You can also import AI models developed using open-source frameworks such as PyTorch® and TensorFlow™.
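
For example, assuming the corresponding interface support packages are installed and the file names below are placeholders:

  netFromPyTorch = importNetworkFromPyTorch("channelEstModel.pt");     % traced PyTorch model
  netFromTF      = importTensorFlowNetwork("channelEstModelFolder");   % TensorFlow SavedModel folder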

You can then use the Experiment Manager app to tune the model and find optimal training options. Use grid search, random search, and Bayesian optimization–based search to sweep through the hyperparameters.

By running experiments in parallel, you can test different training configurations at the same time. Confusion matrices and custom metric functions will help you evaluate your trained network.

With MATLAB, you can create a "golden reference," or perfect channel estimation model, that your AI model can be compared against. You can also compare your model to a traditional method, such as a linear interpolation algorithm, for the same channel model in the same environment.
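
A minimal comparison might look like the following, assuming hPerfect, hLinear, and hNeural are channel-estimate grids of identical size produced earlier (for example, from nrPerfectChannelEstimate, a linear interpolator, and the trained CNN).

  mseLinear = mean(abs(hLinear(:) - hPerfect(:)).^2);   % MSE vs. the golden reference
  mseNeural = mean(abs(hNeural(:) - hPerfect(:)).^2);
  fprintf('Linear interpolation MSE: %.3e\n', mseLinear);
  fprintf('Neural network MSE:      %.3e\n', mseNeural);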

Four side-by-side outputs of channel estimation. Three represent data from different models including linear interpolation, a practical estimator, and the neural network. The fourth is the actual channel.

The results of different approaches to channel estimation are compared to the actual channel using mean-squared error (MSE) and show that the neural network is the most accurate.


Simulate and Test

Once you have validated your AI-based channel estimation model in isolation, you will want to validate it in the context of the larger system. You will also want to test and fine-tune your model with over-the-air 5G signals.

With MATLAB, you can plug your AI model into an existing system simulation the same way you would drop in any other block.


For testing, you can:

  • Create a lab setup with test and measurement equipment. The hardware can be connected to the MATLAB environment using Instrument Control Toolbox to stream data between MATLAB and the hardware for over-the-air (OTA) testing.
  • Use software-defined radios (SDRs) to transmit the data over the air and receive it with real-world channel effects, as sketched below.
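
A minimal over-the-air loop with one supported SDR might look like the following. It assumes the Communications Toolbox Support Package for ADALM-PLUTO Radio; the center frequency, gain, and sample rate are illustrative, and txWaveform comes from the earlier sketches.

  fs = 15.36e6;                                % sample rate of the test waveform
  tx = sdrtx('Pluto', 'CenterFrequency', 2.4e9, ...
             'BasebandSampleRate', fs, 'Gain', -10);
  rx = sdrrx('Pluto', 'CenterFrequency', 2.4e9, ...
             'BasebandSampleRate', fs, ...
             'SamplesPerFrame', numel(txWaveform), ...
             'OutputDataType', 'double');

  transmitRepeat(tx, txWaveform./max(abs(txWaveform)));   % scale to avoid clipping
  rxWaveform = rx();                           % capture a frame with real channel effects
  release(tx); release(rx);
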
A block diagram shows how RF signal generation and capture instruments and supported SDR transmitters and receivers create flows of signals for analysis.

Acquire live data directly into MATLAB using SDR and signal transmit and receive instruments.

What should you expect from your wireless system once you have integrated your AI-based channel estimation CNN? Key metrics to examine for improvements include:

  • Throughput — The amount of data transmitted successfully per second should rise
  • Errors — Block error rate, bit error rate, and packet error rate should drop (a quick bit error rate check is sketched below)
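
For example, assuming txBits and rxBits are the transmitted and decoded bit streams from your link simulation, the bit error rate can be checked with Communications Toolbox:

  [numErrors, ber] = biterr(txBits, rxBits);
  fprintf('Bit error rate: %.4g (%d bit errors)\n', ber, numErrors);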

Deployment

MATLAB has a unique code generation framework that allows models to be deployed anywhere without having to rewrite the code. You can:

  • Iteratively improve and test prototype AI models on hardware during the design phase
  • Deploy your AI model onto production hardware for system validation or rollout

For example, you might want to deploy the AI-based channel estimation model on an FPGA. Use Deep Learning HDL Toolbox™ to convert the model and create an HDL workflow. Then compile, deploy, and predict to determine inference speed and accuracy on different FPGA platforms.
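
A minimal sketch of that workflow, assuming Deep Learning HDL Toolbox, a supported Xilinx board reachable over Ethernet, and the trained network from the earlier sketches (the bitstream name and input image are illustrative):

  hTarget = dlhdl.Target('Xilinx', 'Interface', 'Ethernet');
  hW = dlhdl.Workflow( ...
      'Network',   channelEstNet, ...          % trained CNN from the modeling step
      'Bitstream', 'zcu102_single', ...        % prebuilt bitstream for a ZCU102 board
      'Target',    hTarget);

  hW.compile;                                  % compile the network for the FPGA
  hW.deploy;                                   % program the board
  [prediction, speed] = hW.predict(inputImage, 'Profile', 'on');   % measure inference speed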

Other deployment targets include:

  • Lightweight, lower-power embedded devices (such as those used in a car)
  • Low-cost rapid prototyping boards, such as Raspberry Pi
  • Edge-based IoT applications such as a sensor and controller on a machine in a factory
  • Embedded platforms running C/C++, HDL, PLC, or CUDA code

MATLAB can also deploy to desktop or server environments, which allows you to scale from desktop executables to cloud-based enterprise systems on AWS® or Azure® (such as a financial analytics platform).

A hierarchy of deployment options shows that models can be deployed on embedded hardware or on enterprise systems.

MATLAB code generation enables deployment on a range of hardware platforms.