Topic: "generative-pre-trained-transformer"
VinAIResearch/PhoGPT
PhoGPT: Generative Pre-training for Vietnamese
Size: 22.5 KB - Last synced at: over 1 year ago - Pushed at: over 1 year ago - Stars: 370 - Forks: 27

angeluriot/Language_model
An autoregressive language model like ChatGPT.
Language: Python - Size: 4.67 MB - Last synced at: 12 months ago - Pushed at: 12 months ago - Stars: 37 - Forks: 2

drakyanerlanggarizkiwardhana/Auto-GPT
An experimental open-source attempt to make GPT-4 fully autonomous.
Language: Python - Size: 521 KB - Last synced at: about 2 years ago - Pushed at: about 2 years ago - Stars: 16 - Forks: 14

charlesxu90/helm-gpt
HELM-GPT: de novo macrocyclic peptide design using generative pre-trained transformer
Language: Jupyter Notebook - Size: 44.1 MB - Last synced at: 10 months ago - Pushed at: 10 months ago - Stars: 8 - Forks: 1

mytechnotalent/kgpt
A custom GPT based on [Zero To Hero](https://karpathy.ai/zero-to-hero.html) utilizing tiktoken, intended to support Transformer-model education and the reverse engineering of GPT models from scratch.
Language: Python - Size: 1.17 MB - Last synced at: almost 2 years ago - Pushed at: almost 2 years ago - Stars: 2 - Forks: 0

kopasxnkliang/IPMN-07
An Industrial Project about NLP in Finance Application
Language: Jupyter Notebook - Size: 1.95 MB - Last synced at: over 1 year ago - Pushed at: over 1 year ago - Stars: 1 - Forks: 0

ewdlop/ChatGPTNote
Size: 28.3 KB - Last synced at: 2 months ago - Pushed at: 4 months ago - Stars: 0 - Forks: 0

mytechnotalent/RE-GPT
Drawing inspiration from Andrej Karpathy's lecture "Let's Build GPT: From Scratch, in Code, Spelled Out", this project takes you on an immersive journey into the inner workings of GPT. Step by step, it constructs a GPT model from the ground up, demystifying its architecture and bringing its mechanics to life through hands-on coding.
Language: Jupyter Notebook - Size: 1.59 MB - Last synced at: 5 months ago - Pushed at: 5 months ago - Stars: 0 - Forks: 0
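The heart of any such from-scratch GPT build is the causal self-attention block. As a hedged illustration (the function and weight names below are hypothetical, not taken from this repository), a single attention head can be sketched in a few lines of NumPy:

```python
import numpy as np

def causal_self_attention(x, Wq, Wk, Wv):
    """Single-head causal self-attention.

    x:  (T, d) sequence of token embeddings
    W*: (d, d) query/key/value projection matrices
    """
    T, d = x.shape
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(d)                # (T, T) scaled similarities
    mask = np.triu(np.ones((T, T), bool), k=1)   # True above the diagonal
    scores[mask] = -np.inf                       # block attention to future tokens
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ v                           # (T, d) contextualized outputs

rng = np.random.default_rng(0)
T, d = 5, 8
x = rng.normal(size=(T, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = causal_self_attention(x, Wq, Wk, Wv)
```

Because of the causal mask, the first token can attend only to itself, so the first output row is exactly its own value projection.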

mytechnotalent/ToyGPT
ToyGPT, inspired by Andrej Karpathy's GPT from scratch, builds a toy generative pre-trained transformer at its most basic level, using a simple bigram language model with attention to teach the fundamentals of creating an LLM from scratch.
Language: Jupyter Notebook - Size: 1010 KB - Last synced at: 5 months ago - Pushed at: 5 months ago - Stars: 0 - Forks: 0
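A bigram language model is the simplest possible starting point for such a toy GPT: the next character is predicted from the current character alone. As a minimal sketch (count-based, without the attention layer this project adds on top; all names here are illustrative):

```python
import random
from collections import Counter, defaultdict

def train_bigram(text):
    """For each character, count how often each next character follows it."""
    counts = defaultdict(Counter)
    for a, b in zip(text, text[1:]):
        counts[a][b] += 1
    return counts

def generate(counts, start, length, seed=0):
    """Sample a continuation by repeatedly drawing from bigram counts."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        followers = counts.get(out[-1])
        if not followers:          # dead end: no observed successor
            break
        chars, weights = zip(*followers.items())
        out.append(rng.choices(chars, weights=weights)[0])
    return "".join(out)

corpus = "to be or not to be that is the question"
model = train_bigram(corpus)
sample = generate(model, "t", 20)
```

A neural version replaces the count table with a learned embedding matrix trained by cross-entropy, which is the step this repository walks through.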

LaurenceLungo/GPT-from-Scratch
PyTorch implementation of GPT from scratch
Language: Jupyter Notebook - Size: 11.2 MB - Last synced at: 11 months ago - Pushed at: 11 months ago - Stars: 0 - Forks: 0

nguyenhongson1902/gpt-from-scratch
A GPT model built from scratch to generate text.
Language: Jupyter Notebook - Size: 435 KB - Last synced at: 12 months ago - Pushed at: 12 months ago - Stars: 0 - Forks: 0

YashrajBaila7/GPT2LM
An implementation of a GPT-2 variant.
Language: Python - Size: 91.8 KB - Last synced at: about 1 year ago - Pushed at: about 1 year ago - Stars: 0 - Forks: 0

iliyaML/nlp
Repository for all things Natural Language Processing
Language: Jupyter Notebook - Size: 3.28 MB - Last synced at: almost 2 years ago - Pushed at: almost 2 years ago - Stars: 0 - Forks: 0

iliyaML/experiments
Repository for personal experiments
Language: Jupyter Notebook - Size: 2.51 MB - Last synced at: about 1 year ago - Pushed at: almost 2 years ago - Stars: 0 - Forks: 1

kohlivrinda/shakespeare-gpt
A generative pre-trained transformer that generates Shakespeare-esque quotes.
Language: Jupyter Notebook - Size: 37.9 MB - Last synced at: almost 2 years ago - Pushed at: about 2 years ago - Stars: 0 - Forks: 0
