An open API service providing repository metadata for many open source software ecosystems.

GitHub topics: kolmogorov-arnold-representation

AdityaNG/kan-gpt

A PyTorch implementation of Generative Pre-trained Transformers (GPTs) that uses Kolmogorov-Arnold Networks (KANs) for language modeling

Language: Python - Size: 3.05 MB - Last synced at: about 17 hours ago - Pushed at: 5 months ago - Stars: 714 - Forks: 55
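The KAN idea behind repositories like kan-gpt replaces fixed activations with learnable univariate functions on the network's edges, following the Kolmogorov-Arnold representation f(x) = Σ_q Φ_q(Σ_p φ_{q,p}(x_p)). A minimal pure-Python sketch of that two-level structure (the function choices below are illustrative only, not taken from kan-gpt):

```python
import math

def kan_forward(x, inner, outer):
    """Evaluate a Kolmogorov-Arnold style two-level composition:
    f(x) = sum_q outer[q]( sum_p inner[q][p](x[p]) ).
    `inner` is a grid of univariate functions (one per (q, p) edge);
    `outer` is a list of univariate functions (one per inner sum).
    In a trainable KAN these univariate functions are parameterized
    (e.g. as splines) and learned, rather than fixed as here."""
    total = 0.0
    for q, phi_q in enumerate(outer):
        s = sum(inner[q][p](xp) for p, xp in enumerate(x))
        total += phi_q(s)
    return total

# Toy instance with a single outer term (q = 0), representing
# f(x1, x2) = exp(sin(x1) + x2**2) exactly.
inner = [[math.sin, lambda t: t * t]]
outer = [math.exp]

val = kan_forward([0.5, 1.0], inner, outer)
# equals math.exp(math.sin(0.5) + 1.0)
```

A real KAN layer learns the edge functions from data; this sketch only shows the compositional structure the representation theorem prescribes.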

kabachuha/nanoGPKANT

Experiments with KAN-based GPT models for text generation

Language: Jupyter Notebook - Size: 583 KB - Last synced at: 22 days ago - Pushed at: 12 months ago - Stars: 16 - Forks: 1

andrewKLF/kan

This repository was created by Flatlogic Platform: https://flatlogic.com/generator | Application page: https://kan.flatlogic.app

Size: 1.95 KB - Last synced at: about 1 month ago - Pushed at: about 1 month ago - Stars: 0 - Forks: 0

Aqasch/KANQAS_code

Code for Kolmogorov-Arnold Network for Quantum Architecture Search (KANQAS)

Language: Python - Size: 574 KB - Last synced at: 4 months ago - Pushed at: 4 months ago - Stars: 12 - Forks: 0

pranavgupta2603/KAN-Distillation

An implementation of the KAN architecture with learnable activation functions, used for knowledge distillation on the MNIST handwritten-digits dataset. The project distills a three-layer teacher KAN into a more compact two-layer student model and compares the performance of distilled and non-distilled students.

Language: Python - Size: 2.35 MB - Last synced at: 12 months ago - Pushed at: 12 months ago - Stars: 0 - Forks: 0
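Teacher-student distillation of the kind this repository describes typically trains the student on the teacher's temperature-softened output distribution. A minimal pure-Python sketch of that soft-target loss (the function names and the temperature value are illustrative assumptions, not taken from KAN-Distillation):

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; higher T produces a softer,
    more informative distribution over classes."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)                      # subtract max for stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=4.0):
    """KL(teacher || student) on temperature-softened outputs: the
    standard soft-target term used to train a smaller student (here,
    a two-layer KAN) to mimic a larger teacher."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    # The T^2 factor keeps gradient magnitudes comparable across T.
    return temperature ** 2 * sum(
        pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0
    )

# Identical logits give zero loss; diverging logits give positive loss.
same = distillation_loss([2.0, 0.5, -1.0], [2.0, 0.5, -1.0])
diff = distillation_loss([2.0, 0.5, -1.0], [0.0, 1.0, 0.5])
```

In practice this soft-target term is usually mixed with the ordinary cross-entropy loss on the true MNIST labels.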

Simon-Bertrand/KAN-PyTorch

Kolmogorov–Arnold Networks (KAN) in PyTorch

Language: Python - Size: 3.35 MB - Last synced at: 12 months ago - Pushed at: 12 months ago - Stars: 1 - Forks: 0

JaroslawHryszko/NeuroBender

NeuroBender, where the mystical powers of mathematical complexity meet the robust stability of autoencoders!

Language: Python - Size: 21.5 KB - Last synced at: 12 months ago - Pushed at: 12 months ago - Stars: 1 - Forks: 0

HenkvdPol/ReLU-Network-for-KA-approximation

Given a β-Hölder continuous function f: [0,1]^d -> R, this deep ReLU network approximates f up to an approximation rate of 2^(-Kβ) using 2^(Kd) parameters, where K is a fixed positive integer and d is the dimension.

Language: Python - Size: 12.7 KB - Last synced at: over 1 year ago - Pushed at: almost 4 years ago - Stars: 0 - Forks: 0
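The rate in the description above has the usual Hölder-class form; stated as math (my paraphrase of the repository's claim, with C an unspecified constant):

```latex
% For f : [0,1]^d \to \mathbb{R} that is \beta-H\"older continuous and a
% fixed K \in \mathbb{N}, there is a deep ReLU network \hat{f} with
% O(2^{Kd}) parameters such that
\| f - \hat{f} \|_{\infty} \le C \, 2^{-K\beta}.
```

Note the trade-off the two exponents encode: halving the error (increasing K) multiplies the parameter count by 2^d, i.e., the curse of dimensionality.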