Here are 15 public repositories matching the linear-attention topic.
RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), combining the best of RNNs and transformers: great performance, fast inference, low VRAM use, fast training, "infinite" ctx_len, and free sentence embedding. (Python; updated Nov 19, 2024)
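Most of the repositories on this page implement variants of one shared idea: replace softmax attention with a kernel feature map φ so that causal attention becomes a left-to-right recurrence over a constant-size state, giving O(T) time instead of O(T²). The sketch below is generic, not any particular repository's method; the φ = ELU+1 feature map and the normalization constant are common but illustrative choices.

```python
import numpy as np

def phi(x):
    # ELU(x) + 1: a common positive feature map for linear attention
    return np.where(x > 0, x + 1.0, np.exp(x))

def causal_linear_attention(Q, K, V):
    """Autoregressive linear attention via a running state.

    Maintains S = sum_s phi(k_s) v_s^T and z = sum_s phi(k_s),
    so each step costs O(d * d_v) regardless of sequence length.
    """
    T, d = Q.shape
    S = np.zeros((d, V.shape[1]))  # running sum of outer(phi(k), v)
    z = np.zeros(d)                # running sum of phi(k), for normalization
    out = np.empty_like(V)
    for t in range(T):
        q, k, v = phi(Q[t]), phi(K[t]), V[t]
        S += np.outer(k, v)
        z += k
        out[t] = (q @ S) / (q @ z + 1e-6)
    return out
```

Because the state (S, z) is fixed-size, decoding one more token touches only the state, which is the sense in which several entries below advertise O(1)-per-token inference.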
[NeurIPS 2024] Official code of "LION: Linear Group RNN for 3D Object Detection in Point Clouds" (Python; updated Oct 8, 2024)
Explorations into the recently proposed Taylor Series Linear Attention (Python; updated Aug 18, 2024)
Implementation of Agent Attention in Pytorch (Python; updated Jul 10, 2024)
Semantic segmentation of remote sensing images (Python; updated Jul 29, 2022)
Semantic segmentation of remote sensing images (Python; updated Jul 29, 2022)
CUDA implementation of autoregressive linear attention, with all the latest research findings (Python; updated May 23, 2023)
Reference implementation of "Softmax Attention with Constant Cost per Token" (Heinsen, 2024) (Python; updated Jun 6, 2024)
(No description provided.) (Python; updated Jan 8, 2023)
Code for the paper "Cottention: Linear Transformers With Cosine Attention" (CUDA; updated Oct 19, 2024)
[ICML 2024] Official implementation of "LeaPformer: Enabling Linear Transformers for Autoregressive and Simultaneous Tasks via Learned Proportions" (Python; updated Nov 12, 2024)
RWKV Wiki website (archived; please visit the official wiki) (Shell; updated Mar 26, 2023)
Official implementation of SEA: Sparse Linear Attention with Estimated Attention Mask (ICLR 2024) (Python; updated Oct 24, 2024)
LEAP: Linear Explainable Attention in Parallel for causal language modeling, with O(1) path length and O(1) inference (Jupyter Notebook; updated Jun 18, 2023)
Taming Transformers for High-Resolution Image Synthesis (Jupyter Notebook; updated May 2, 2022)