An open API service providing repository metadata for many open source software ecosystems.

Topic: "generative-pretrained-transformers"

kanavgoyal898/echoGPT

echoGPT is a minimal GPT implementation for character-level language modeling with 25.4M parameters. Built with PyTorch, it includes multi-head self-attention, feed-forward layers, and position embeddings, and is trained on text such as tiny_shakespeare.txt to predict the next character (a self-attention head in this style is sketched after this entry).

Language: Python - Size: 7.36 MB - Last synced at: about 2 months ago - Pushed at: 4 months ago - Stars: 0 - Forks: 0
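
To illustrate the multi-head self-attention component the description mentions, here is a minimal PyTorch sketch of one causal self-attention head. This is not echoGPT's actual code; the class name, dimensions, and block size are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelfAttentionHead(nn.Module):
    """One head of causal (masked) self-attention over a character sequence.

    Illustrative sketch, not echoGPT's implementation.
    """

    def __init__(self, embed_dim, head_dim, block_size):
        super().__init__()
        self.key = nn.Linear(embed_dim, head_dim, bias=False)
        self.query = nn.Linear(embed_dim, head_dim, bias=False)
        self.value = nn.Linear(embed_dim, head_dim, bias=False)
        # Lower-triangular mask so each position attends only to earlier positions.
        self.register_buffer("tril", torch.tril(torch.ones(block_size, block_size)))

    def forward(self, x):
        B, T, C = x.shape
        k, q, v = self.key(x), self.query(x), self.value(x)
        # Scaled dot-product attention scores, masked to be causal.
        scores = q @ k.transpose(-2, -1) * k.shape[-1] ** -0.5
        scores = scores.masked_fill(self.tril[:T, :T] == 0, float("-inf"))
        weights = F.softmax(scores, dim=-1)
        return weights @ v  # (B, T, head_dim)

# Example: a batch of 4 sequences, 8 characters each, 32-dim embeddings.
x = torch.randn(4, 8, 32)
head = SelfAttentionHead(embed_dim=32, head_dim=16, block_size=8)
print(head(x).shape)  # torch.Size([4, 8, 16])
```

Multi-head attention runs several such heads in parallel and concatenates their outputs before the feed-forward layer; position embeddings are added to the token embeddings before the first attention block.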

mytechnotalent/ToyGPT

ToyGPT, inspired by Andrej Karpathy’s GPT from scratch, builds a toy generative pre-trained transformer at its most basic level: a simple bigram language model with attention, meant to teach the basics of creating an LLM from scratch (a minimal bigram model is sketched after this entry).

Language: Jupyter Notebook - Size: 1010 KB - Last synced at: 5 months ago - Pushed at: 5 months ago - Stars: 0 - Forks: 0
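
Since ToyGPT centers on a bigram language model, the sketch below shows the standard bigram formulation from Karpathy-style tutorials: each token's embedding row directly holds the logits for the next token. This is an illustrative sketch, not ToyGPT's code; the vocabulary size of 65 (the tiny Shakespeare character set) is an assumption.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BigramLanguageModel(nn.Module):
    """Each token directly looks up the logits for the next token.

    Illustrative sketch of the bigram baseline, not ToyGPT's code.
    """

    def __init__(self, vocab_size):
        super().__init__()
        # Row i of this table holds the next-token logits after token i.
        self.token_embedding = nn.Embedding(vocab_size, vocab_size)

    def forward(self, idx, targets=None):
        logits = self.token_embedding(idx)  # (B, T, vocab_size)
        loss = None
        if targets is not None:
            B, T, C = logits.shape
            loss = F.cross_entropy(logits.view(B * T, C), targets.view(B * T))
        return logits, loss

    @torch.no_grad()
    def generate(self, idx, max_new_tokens):
        for _ in range(max_new_tokens):
            logits, _ = self(idx)
            probs = F.softmax(logits[:, -1, :], dim=-1)  # last time step only
            idx_next = torch.multinomial(probs, num_samples=1)
            idx = torch.cat((idx, idx_next), dim=1)
        return idx

# Example: assumed vocabulary of 65 characters, generate 20 tokens from token 0.
model = BigramLanguageModel(vocab_size=65)
context = torch.zeros((1, 1), dtype=torch.long)
print(model.generate(context, max_new_tokens=20).shape)  # torch.Size([1, 21])
```

The bigram table is the simplest possible language model; tutorials in this lineage then replace the lookup with attention blocks to arrive at a full GPT.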