Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)
Updated Jun 10, 2025 - Rust
AICI: Prompts as (Wasm) Programs
Rust-tokenizer offers high-performance tokenizers for modern language models, including WordPiece, Byte-Pair Encoding (BPE) and Unigram (SentencePiece) models
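At the heart of BPE tokenization is a simple loop: count adjacent token pairs, merge the most frequent one, repeat. The sketch below illustrates a single merge step in plain Rust; the function names (`most_frequent_pair`, `merge_pair`) are hypothetical and are not the rust-tokenizer API.

```rust
use std::collections::HashMap;

// Find the most frequent adjacent token pair (ties broken arbitrarily).
fn most_frequent_pair(tokens: &[String]) -> Option<(String, String)> {
    let mut counts: HashMap<(String, String), usize> = HashMap::new();
    for w in tokens.windows(2) {
        *counts.entry((w[0].clone(), w[1].clone())).or_insert(0) += 1;
    }
    counts.into_iter().max_by_key(|(_, c)| *c).map(|(pair, _)| pair)
}

// Replace every occurrence of `pair` with its concatenation.
fn merge_pair(tokens: &[String], pair: &(String, String)) -> Vec<String> {
    let mut out = Vec::new();
    let mut i = 0;
    while i < tokens.len() {
        if i + 1 < tokens.len() && tokens[i] == pair.0 && tokens[i + 1] == pair.1 {
            out.push(format!("{}{}", pair.0, pair.1));
            i += 2;
        } else {
            out.push(tokens[i].clone());
            i += 1;
        }
    }
    out
}

fn main() {
    // Start from the characters of a word and apply one merge step.
    let mut toks: Vec<String> = "lowlow".chars().map(|c| c.to_string()).collect();
    if let Some(pair) = most_frequent_pair(&toks) {
        toks = merge_pair(&toks, &pair);
    }
    println!("{:?}", toks); // 6 characters collapse to 4 tokens after one merge
}
```

A real BPE trainer repeats this until a target vocabulary size is reached, recording the merge order so the same merges can be replayed at encoding time.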
Implementation of the GPT architecture in Rust 🦀 + Burn 🔥
A GPT Implementation in Rust on top of tch-rs 🔥 🦀
AthenaOS is a next-generation AI-native operating system managed by swarms of AI agents
Succeeded by SyntaxDot: https://github.com/tensordot/syntaxdot
Succeeded by syntaxdot-transformers: https://github.com/tensordot/syntaxdot/tree/main/syntaxdot-transformers
Further developed as SyntaxDot: https://github.com/tensordot/syntaxdot
Parsing and JSON transformation library for the OpenFGA authorization DSL
Rust implementation of the paper "Attention Is All You Need" (https://arxiv.org/abs/1706.03762), ported from http://nlp.seas.harvard.edu/2018/04/03/attention.html
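The core operation of that paper is scaled dot-product attention: Attention(Q, K, V) = softmax(QKᵀ / √d_k)V. The sketch below computes it for a single query over a handful of keys using plain `Vec`s; it is an illustrative toy, not code from the linked port, which builds on proper tensor types.

```rust
// Numerically stable softmax over a score vector.
fn softmax(xs: &[f64]) -> Vec<f64> {
    let m = xs.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
    let exps: Vec<f64> = xs.iter().map(|x| (x - m).exp()).collect();
    let sum: f64 = exps.iter().sum();
    exps.iter().map(|e| e / sum).collect()
}

fn dot(a: &[f64], b: &[f64]) -> f64 {
    a.iter().zip(b).map(|(x, y)| x * y).sum()
}

/// One query attending over `keys`/`values`: scores are scaled by sqrt(d_k),
/// softmaxed into weights, and used to average the value vectors.
fn attention(query: &[f64], keys: &[Vec<f64>], values: &[Vec<f64>]) -> Vec<f64> {
    let d_k = query.len() as f64;
    let scores: Vec<f64> = keys.iter().map(|k| dot(query, k) / d_k.sqrt()).collect();
    let weights = softmax(&scores);
    let dim = values[0].len();
    let mut out = vec![0.0; dim];
    for (w, v) in weights.iter().zip(values) {
        for i in 0..dim {
            out[i] += w * v[i];
        }
    }
    out
}

fn main() {
    // The query matches the first key best, so the output leans toward the first value.
    let q = vec![1.0, 0.0];
    let keys = vec![vec![1.0, 0.0], vec![0.0, 1.0]];
    let values = vec![vec![10.0, 0.0], vec![0.0, 10.0]];
    println!("{:?}", attention(&q, &keys, &values));
}
```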
A lazy, high-throughput, and blazing-fast structured text generation backend.
A Rust project for offline loading of small LLM models, focused on RAG (Retrieval-Augmented Generation) on mobile devices.
Reproducibility of "Attention is All You Need"
CLI tool for converting JSON to RDF
XML2RDF Converter
Extremely fast HTML-to-React transformer
Library for converting CSV files into NTriple RDF
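The CSV-to-RDF idea is mechanical: each row becomes a subject IRI, each column a predicate, and each cell a literal, emitted one triple per line in N-Triples syntax. The sketch below shows that mapping in plain Rust; the function name and the `http://example.org/` base IRI are illustrative assumptions, not the linked library's API.

```rust
// Map one CSV row (already split into fields) to N-Triples lines.
// Subject: <base>row/<row_id>; predicate: <base>col/<column name>; object: plain literal.
fn row_to_ntriples(base: &str, header: &[&str], row_id: usize, row: &[&str]) -> Vec<String> {
    let subject = format!("<{}row/{}>", base, row_id);
    header
        .iter()
        .zip(row)
        .map(|(col, val)| format!("{} <{}col/{}> \"{}\" .", subject, base, col, val))
        .collect()
}

fn main() {
    let header = ["name", "lang"];
    let row = ["rust-bert", "Rust"];
    for triple in row_to_ntriples("http://example.org/", &header, 1, &row) {
        println!("{}", triple);
    }
}
```

A production converter would additionally escape quotes and control characters in literals and percent-encode column names into valid IRIs, which this sketch omits.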
Inference and training of the Baby Dragon Hatchling model with Burn