An open API service providing repository metadata for many open source software ecosystems.

GitHub topics: transformer-model

sebastian-davidson/Japanese-to-Hiragana

Using just machine learning, can we convert the kanji in Japanese sentences to hiragana?

Language: Python - Size: 20.7 MB - Last synced at: 7 days ago - Pushed at: 9 days ago - Stars: 0 - Forks: 0
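One way the kanji-to-hiragana question above could be framed is as a text-to-text (sequence-to-sequence) learning problem. The sketch below illustrates that framing with the Hugging Face transformers library; the checkpoint, the single training pair, and the hyperparameters are illustrative assumptions, not taken from this repository.

```python
# Hypothetical sketch: treat kanji -> hiragana conversion as seq2seq generation.
# The checkpoint and the single toy training pair are assumptions for illustration.
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "google/mt5-small"  # assumed multilingual seq2seq starting point
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# One toy (sentence, reading) pair; a real dataset would contain many of these.
src = "猫が好きです。"
tgt = "ねこがすきです。"

inputs = tokenizer(src, return_tensors="pt")
labels = tokenizer(tgt, return_tensors="pt").input_ids

# A single gradient step of standard seq2seq fine-tuning.
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
loss = model(**inputs, labels=labels).loss
loss.backward()
optimizer.step()

# After sufficient training, generation produces the hiragana reading.
pred = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(pred[0], skip_special_tokens=True))
```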

bniladridas/cpp_terminal_app

A terminal inference app that leverages Llama 3.2 for accelerated computing.

Language: Makefile - Size: 888 KB - Last synced at: about 1 month ago - Pushed at: about 2 months ago - Stars: 0 - Forks: 0

nicolay-r/bulk-ner

A tiny, no-strings framework for quickly binding third-party models to extract entities from the cells of long tabular data.

Language: Python - Size: 128 KB - Last synced at: 27 days ago - Pushed at: 2 months ago - Stars: 4 - Forks: 0
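To illustrate the general idea behind the entry above (running a third-party NER model over every cell of a long text column), here is a minimal sketch using pandas and a pretrained Hugging Face pipeline; it is not this framework's own API, and the model name and sample rows are assumptions.

```python
# Illustrative sketch of the general idea, not bulk-ner's own interface:
# run a third-party NER model over each cell of a tabular text column.
import pandas as pd
from transformers import pipeline

ner = pipeline("ner", model="dslim/bert-base-NER", aggregation_strategy="simple")

df = pd.DataFrame({"text": ["Barack Obama visited Berlin.",
                            "Apple opened an office in Tokyo."]})

# Apply the model cell by cell and keep (entity text, entity label) pairs.
df["entities"] = df["text"].apply(
    lambda cell: [(e["word"], e["entity_group"]) for e in ner(cell)]
)
print(df)
```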

rubenpjove/tabularT-OS-fingerprinting

This repository applies two different Tabular Transformer architectures to operating-system fingerprinting across three different datasets.

Language: Jupyter Notebook - Size: 3.98 MB - Last synced at: 3 months ago - Pushed at: 3 months ago - Stars: 0 - Forks: 0
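As a rough illustration of what a tabular transformer classifier looks like (not the two specific architectures used in the repository above), the sketch below embeds each column as a token, runs a standard PyTorch TransformerEncoder, and classifies from a CLS-like token; the feature count, dimensions, and number of OS classes are made up.

```python
# Minimal sketch of the general tabular-transformer idea; all sizes are assumptions.
import torch
import torch.nn as nn

class TabularTransformer(nn.Module):
    def __init__(self, n_features=16, d_model=64, n_classes=5):
        super().__init__()
        # One learned embedding per column, scaled by the numeric feature value.
        self.col_embed = nn.Parameter(torch.randn(n_features, d_model))
        self.cls = nn.Parameter(torch.randn(1, 1, d_model))
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):                             # x: (batch, n_features)
        tokens = x.unsqueeze(-1) * self.col_embed     # (batch, n_features, d_model)
        tokens = torch.cat([self.cls.expand(x.size(0), -1, -1), tokens], dim=1)
        encoded = self.encoder(tokens)
        return self.head(encoded[:, 0])               # classify from the CLS token

model = TabularTransformer()
logits = model(torch.randn(8, 16))                    # 8 synthetic fingerprint rows
print(logits.shape)                                   # torch.Size([8, 5])
```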

SankethSingh/Text-Translation_BERT

This repository contains a machine-translation model for the French and English languages.

Language: Jupyter Notebook - Size: 55.7 KB - Last synced at: 3 months ago - Pushed at: 4 months ago - Stars: 0 - Forks: 0
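For context on the French-English translation task above, here is a minimal sketch using an off-the-shelf pretrained translation pipeline; the repository's notebook builds its own BERT-based model, so the checkpoint below is an assumption chosen only for illustration.

```python
# Illustrative only: uses a pretrained French -> English checkpoint,
# not the BERT-based model built in the repository's notebook.
from transformers import pipeline

translator = pipeline("translation", model="Helsinki-NLP/opus-mt-fr-en")
print(translator("Le chat dort sur le canapé.")[0]["translation_text"])
# Expected output along the lines of: "The cat sleeps on the couch."
```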

mytechnotalent/RE-GPT

Drawing inspiration from Andrej Karpathy’s iconic lecture, "Let’s Build GPT: From Scratch, in Code, Spelled Out", this project takes you on an immersive journey into the inner workings of GPT. Step-by-step, we’ll construct a GPT model from the ground up, demystifying its architecture and bringing its mechanics to life through hands-on coding.

Language: Jupyter Notebook - Size: 1.59 MB - Last synced at: 6 months ago - Pushed at: 6 months ago - Stars: 0 - Forks: 0
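The core building block that a from-scratch GPT walkthrough like the one above typically implements first is a single causal self-attention head. The compact PyTorch sketch below shows that idea; the dimensions are illustrative and not taken from the repository's notebooks.

```python
# A compact sketch of one causal self-attention head; sizes are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalSelfAttentionHead(nn.Module):
    def __init__(self, n_embd=32, head_size=16, block_size=64):
        super().__init__()
        self.key = nn.Linear(n_embd, head_size, bias=False)
        self.query = nn.Linear(n_embd, head_size, bias=False)
        self.value = nn.Linear(n_embd, head_size, bias=False)
        # Lower-triangular mask so each position attends only to the past.
        self.register_buffer("mask", torch.tril(torch.ones(block_size, block_size)))

    def forward(self, x):                        # x: (batch, time, n_embd)
        B, T, _ = x.shape
        k, q, v = self.key(x), self.query(x), self.value(x)
        att = q @ k.transpose(-2, -1) * k.size(-1) ** -0.5   # scaled dot-product
        att = att.masked_fill(self.mask[:T, :T] == 0, float("-inf"))
        att = F.softmax(att, dim=-1)
        return att @ v                           # (batch, time, head_size)

head = CausalSelfAttentionHead()
print(head(torch.randn(2, 10, 32)).shape)        # torch.Size([2, 10, 16])
```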

Xza85hrf/flux_pipeline

FluxPipeline is an experimental prototype that provides a framework for working with the FLUX.1-schnell image-generation model. It is intended for educational and experimental purposes only.

Language: Python - Size: 33.9 MB - Last synced at: about 1 month ago - Pushed at: 6 months ago - Stars: 0 - Forks: 0
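For reference, the FLUX.1-schnell model mentioned above can also be loaded directly through the diffusers library, as in the sketch below; this project wraps the model in its own framework, which may differ, and the prompt and output filename are assumptions.

```python
# Illustrative sketch of loading FLUX.1-schnell via diffusers,
# not necessarily how this project's framework drives the model.
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-schnell", torch_dtype=torch.bfloat16
)
pipe.enable_model_cpu_offload()   # reduce GPU memory use on smaller cards

image = pipe(
    "a watercolor painting of a lighthouse at dusk",
    num_inference_steps=4,        # schnell is distilled for very few steps
    guidance_scale=0.0,
).images[0]
image.save("lighthouse.png")
```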

SreeEswaran/Text-Summarization-using-BART

This project demonstrates text summarization using the BART (Bidirectional and Auto-Regressive Transformers) model. BART is a transformer model trained as a denoising autoencoder and is effective for text generation tasks such as summarization.

Language: Python - Size: 11.7 KB - Last synced at: about 2 months ago - Pushed at: 8 months ago - Stars: 2 - Forks: 0
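A minimal sketch of BART-based summarization as described above, using the Hugging Face summarization pipeline with a pretrained checkpoint; the repository's own scripts may be organised differently, and the sample article is an assumption.

```python
# Illustrative BART summarization via the Hugging Face pipeline API.
from transformers import pipeline

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

article = (
    "The transformer architecture replaced recurrence with attention, "
    "allowing models to be trained in parallel over long sequences and "
    "becoming the foundation of modern language models."
)
print(summarizer(article, max_length=40, min_length=10, do_sample=False)[0]["summary_text"])
```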

taxborn/betsi

A lightweight implementation of the 2017 Google paper "Attention Is All You Need".

Language: Jupyter Notebook - Size: 2.35 MB - Last synced at: about 1 month ago - Pushed at: over 1 year ago - Stars: 0 - Forks: 0
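The central operation of that paper is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. The short PyTorch sketch below implements that formula directly; the tensor shapes are illustrative and not taken from this repository.

```python
# Scaled dot-product attention from "Attention Is All You Need" in plain PyTorch.
import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v):
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)   # similarity of queries to keys
    weights = F.softmax(scores, dim=-1)                  # attention weights per position
    return weights @ v                                   # weighted sum of values

q = k = v = torch.randn(2, 5, 8)   # (batch, sequence, d_k), synthetic data
print(scaled_dot_product_attention(q, k, v).shape)       # torch.Size([2, 5, 8])
```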