
5G LDPC Linear Transformer for Channel Decoding

Implementation of the linear transformer-based decoder experiments from "5G LDPC Linear Transformer for Channel Decoding" using the Sionna link-level simulator.

Abstract

This work introduces a novel, fully differentiable transformer decoder with linear-time complexity, together with a regular transformer decoder, for correcting 5G New Radio (NR) LDPC codes. We propose a scalable approach to decoding linear block codes with O(n) complexity rather than the O(n^2) of regular transformers. Both architectures are compared against Belief Propagation (BP), the production-level decoding algorithm used for 5G NR LDPC codes. We achieve bit error rate performance that matches the regular transformer decoder and surpasses one-iteration BP, while remaining competitive with BP in decoding time, even for larger block codes. We use Sionna, NVIDIA's 5G & 6G physical-layer research software, for reproducible results.
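
The O(n) claim hinges on replacing softmax attention, whose cost grows quadratically with sequence length, with a linearized attention. The sketch below shows one common way to do this, kernelized linear attention with an elu-based feature map; the feature map, shapes, and NumPy implementation are illustrative assumptions for exposition, not necessarily the exact formulation in decoder.py.

```python
import numpy as np

def elu_feature_map(x):
    # Positive feature map phi(x) = elu(x) + 1, a common choice in the
    # linear-attention literature (an assumption here, not taken from the repo).
    return np.where(x > 0, x + 1.0, np.exp(x))

def linear_attention(Q, K, V):
    """O(n) attention: phi(Q) (phi(K)^T V) instead of softmax(Q K^T) V.

    Q, K, V: arrays of shape (n, d). The (d, d) summary phi(K)^T V is built
    once, so cost grows linearly in the sequence length n instead of n^2.
    """
    Qf, Kf = elu_feature_map(Q), elu_feature_map(K)      # (n, d)
    kv = Kf.T @ V                                        # (d, d) summary
    normalizer = Qf @ Kf.sum(axis=0, keepdims=True).T    # (n, 1)
    return (Qf @ kv) / (normalizer + 1e-9)               # (n, d)

# Toy check on a sequence length comparable to a longer LDPC codeword.
n, d = 1024, 32
rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(3, n, d))
out = linear_attention(Q, K, V)
print(out.shape)  # (1024, 32)
```

Because the summary phi(K)^T V has a fixed d x d size, compute and memory scale with the codeword length n rather than n^2, which is what makes attention affordable at larger 5G LDPC block lengths.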

Structure of this repository

  • LTD_model_5G_LDPC.ipynb: notebook that runs the decoder on 5G LDPC codes.

In the src/ folder:

  • decoder.py: linear and regular transformer decoder models.
  • decoder5G.py: decoder model for 5G LDPC codes.
  • e2e_model.py: end-to-end channel model connecting the encoder and decoder (see the link-level sketch after this list).
  • args.py: class holding the arguments for the models and training.
  • utils.py: save, load, and train/test functions for the transformer models.
  • utils5G.py: 5G parity-check matrix (PCM) pruning function.
  • time_comparison.py: comparison and plotting functions for evaluating the models' decoding speed.
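
As context for e2e_model.py, the sketch below shows a minimal Sionna link-level pipeline of the kind the end-to-end model presumably builds on: random information bits, the 5G LDPC encoder, QPSK mapping, an AWGN channel, and Sionna's BP decoder as the baseline that a transformer decoder would replace. It assumes the pre-1.0 sionna module layout, and the block length, rate, and Eb/N0 values are illustrative rather than the repository's defaults.

```python
import tensorflow as tf
from sionna.utils import BinarySource, ebnodb2no
from sionna.mapping import Mapper, Demapper
from sionna.channel import AWGN
from sionna.fec.ldpc.encoding import LDPC5GEncoder
from sionna.fec.ldpc.decoding import LDPC5GDecoder

# Code and modulation parameters (illustrative values, not the repo's defaults).
k, n = 64, 128              # information bits / codeword length
num_bits_per_symbol = 2     # QPSK
batch_size, ebno_db = 256, 4.0

source = BinarySource()
encoder = LDPC5GEncoder(k, n)
mapper = Mapper("qam", num_bits_per_symbol)
channel = AWGN()
demapper = Demapper("app", "qam", num_bits_per_symbol)
bp_decoder = LDPC5GDecoder(encoder, num_iter=20, hard_out=True)  # BP baseline

no = ebnodb2no(ebno_db, num_bits_per_symbol, coderate=k / n)
b = source([batch_size, k])           # information bits
c = encoder(b)                        # 5G LDPC codewords
x = mapper(c)                         # QPSK symbols
y = channel([x, no])                  # AWGN channel
llr = demapper([y, no])               # LLRs fed to the decoder under test
b_hat = bp_decoder(llr)               # swap in the transformer decoder here

ber = tf.reduce_mean(tf.cast(tf.not_equal(b, b_hat), tf.float32))
print("BER:", float(ber))
```

Feeding the same LLRs to the model in decoder5G.py instead of the BP baseline gives the kind of comparison the time_comparison.py utilities are meant to evaluate.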

References

[A] M. Hernandez and F. Pinero, "5G LDPC Linear Transformer for Channel Decoding", 2025

[B] S. Cammerer, J. Hoydis, F. Aït Aoudia, and A. Keller, "Graph Neural Networks for Channel Decoding", 2022

[C] Y. Choukroun and L. Wolf, "Error Correction Code Transformer", 2022

License

© 2025 Mario Hernandez Torres. This project is licensed under the Apache License 2.0.
Distributed on an "AS IS" basis, without warranties or conditions of any kind.
