Git repositories, category: transformers
tf-transformers
Imagine auto-regressive generation being 90x faster. tf-transformers (TensorFlow Transformers) is designed to harness the full power of TensorFlow 2 and is built specifically for Transformer-based architectures.
#tensorflow #transformers
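If the speedup comes from running generation as compiled TensorFlow graphs (an assumption, not stated above), the pattern looks roughly like the toy sketch below; this is not the tf-transformers API, only an illustration of wrapping a per-token decode step in tf.function.

```python
import tensorflow as tf

# Toy illustration only -- NOT the tf-transformers API. A stand-in "model"
# (embedding + dense head) predicts the next token from the previous one; the
# decode step is compiled with tf.function so each generation step runs as a
# TensorFlow graph rather than eager Python. A real transformer decoder would
# also carry a key/value cache.
VOCAB_SIZE = 100
embedding = tf.keras.layers.Embedding(VOCAB_SIZE, 32)
head = tf.keras.layers.Dense(VOCAB_SIZE)

@tf.function
def decode_step(last_token):
    logits = head(embedding(last_token))  # shape [1, VOCAB_SIZE]
    return tf.argmax(logits, axis=-1, output_type=tf.int32)

def greedy_generate(start_token, steps=10):
    tokens = [start_token]
    current = tf.constant([start_token], dtype=tf.int32)
    for _ in range(steps):
        current = decode_step(current)
        tokens.append(int(current[0]))
    return tokens

print(greedy_generate(start_token=1))
```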
transformer-tf
Attention Is All You Need implementation (TensorFlow 2.x) 🚀 This repository contains a TensorFlow implementation of the paper, which can be adapted to any sequence-to-sequence task with minimal code changes.
#tensorflow #transformers
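As a pointer to what the paper's core building block computes, here is a minimal sketch of scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V, in TensorFlow 2; the repository implements the full encoder-decoder Transformer around it.

```python
import tensorflow as tf

# Scaled dot-product attention from "Attention Is All You Need"
# (Vaswani et al., 2017): softmax(QK^T / sqrt(d_k)) V.
def scaled_dot_product_attention(q, k, v, mask=None):
    d_k = tf.cast(tf.shape(k)[-1], tf.float32)
    scores = tf.matmul(q, k, transpose_b=True) / tf.math.sqrt(d_k)
    if mask is not None:
        scores += mask * -1e9  # push masked positions toward zero weight
    weights = tf.nn.softmax(scores, axis=-1)
    return tf.matmul(weights, v), weights

# Toy shapes: batch=2, sequence length=4, model dimension=8.
q = tf.random.normal((2, 4, 8))
k = tf.random.normal((2, 4, 8))
v = tf.random.normal((2, 4, 8))
output, attn = scaled_dot_product_attention(q, k, v)
print(output.shape, attn.shape)  # (2, 4, 8) (2, 4, 4)
```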
gpt-mini
Yet another minimalistic TensorFlow (re-)re-implementation of Karpathy's PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer).
#transformers
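Illustrative sketch (not code from gpt-mini): the defining detail of a decoder-only GPT is the causal mask, which stops position i from attending to later positions; with the attention function above, masked entries receive a large negative score before the softmax.

```python
import tensorflow as tf

# Causal (look-ahead) mask for a decoder-only GPT: entry [i, j] is 1.0 when
# position j lies in the future of position i and must be masked out.
def causal_mask(seq_len):
    lower_triangle = tf.linalg.band_part(tf.ones((seq_len, seq_len)), -1, 0)
    return 1.0 - lower_triangle

print(causal_mask(4).numpy())
# [[0. 1. 1. 1.]
#  [0. 0. 1. 1.]
#  [0. 0. 0. 1.]
#  [0. 0. 0. 0.]]
```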
mgpt
We introduce mGPT, a multilingual variant of GPT-3, pretrained on 61 languages from 25 linguistically diverse language families using Wikipedia and the C4 corpus.
#llm #text processing #transformers
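A minimal sketch of running the released model through the Hugging Face transformers library; the checkpoint name ai-forever/mGPT is an assumption, so check the repository for the exact model ID and recommended generation settings.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed checkpoint name -- verify against the repository's model card.
model_id = "ai-forever/mGPT"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Multilingual prompt (German); the model is pretrained on 61 languages.
inputs = tokenizer("Der Eiffelturm steht in", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=20, do_sample=False)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```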
openai-finetuning-example
This repository provides an example of fine-tuning OpenAI's GPT-4o-mini model to classify customer service support tickets. Fine-tuning increases classification accuracy from 69% to 94%.
#llm #nlp #text processing
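A hedged sketch of the workflow such a fine-tuning example follows with the OpenAI Python SDK; the file name, label set, and exact gpt-4o-mini snapshot are illustrative assumptions, so see the repository for its actual data preparation and evaluation.

```python
from openai import OpenAI

client = OpenAI()

# tickets.jsonl (assumed name) holds chat-formatted examples, one per line, e.g.:
# {"messages": [{"role": "system", "content": "Classify the support ticket."},
#               {"role": "user", "content": "My invoice is wrong."},
#               {"role": "assistant", "content": "billing"}]}
training_file = client.files.create(
    file=open("tickets.jsonl", "rb"),
    purpose="fine-tune",
)

# Launch the fine-tuning job; the snapshot name is an assumption.
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-4o-mini-2024-07-18",
)
print(job.id, job.status)
```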
miniGPT
Minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer), covering both training and inference.
#llm #nlp #text processing
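Illustrative sketch (not code from the repository): the core training step a minimal GPT performs is to shift the token sequence by one, apply a causally masked block, and minimise cross-entropy between predicted and actual next tokens.

```python
import torch
import torch.nn.functional as F

# Stand-in model: embedding + one causally masked Transformer layer + linear
# head plays the role of a decoder-only GPT block for this illustration.
vocab_size, d_model, seq_len, batch = 100, 32, 16, 4
embed = torch.nn.Embedding(vocab_size, d_model)
block = torch.nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
head = torch.nn.Linear(d_model, vocab_size)
params = list(embed.parameters()) + list(block.parameters()) + list(head.parameters())
optimizer = torch.optim.AdamW(params, lr=3e-4)

# Random token batch; a real run would stream tokenised text instead.
tokens = torch.randint(0, vocab_size, (batch, seq_len))
causal = torch.nn.Transformer.generate_square_subsequent_mask(seq_len - 1)

# Predict token t+1 from tokens <= t and minimise next-token cross-entropy.
logits = head(block(embed(tokens[:, :-1]), src_mask=causal))
loss = F.cross_entropy(logits.reshape(-1, vocab_size), tokens[:, 1:].reshape(-1))
loss.backward()
optimizer.step()
print(float(loss))
```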