Topic: "exploitation-vs-exploration"
WinDerek/awesome-multi-armed-bandit
A curated list of resources about the multi-armed bandit (MAB) problem.
Language: Jupyter Notebook - Size: 26.5 MB - Last synced at: about 23 hours ago - Pushed at: about 5 years ago - Stars: 5 - Forks: 0
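
The multi-armed bandit setting is the canonical exploitation-vs-exploration trade-off: each pull must either exploit the arm with the best estimated reward or explore a possibly better one. The sketch below is a generic epsilon-greedy illustration of that trade-off, not code taken from this repository; the Bernoulli reward model and all parameter values are assumptions for the example.

```python
# Generic epsilon-greedy bandit sketch (illustrative only, not from the repo).
import random

def epsilon_greedy_bandit(true_probs, steps=1000, epsilon=0.1, seed=0):
    """Run epsilon-greedy on a Bernoulli bandit; return estimated arm values."""
    rng = random.Random(seed)
    n_arms = len(true_probs)
    counts = [0] * n_arms      # pulls per arm
    values = [0.0] * n_arms    # running mean reward per arm
    for _ in range(steps):
        if rng.random() < epsilon:                       # explore: random arm
            arm = rng.randrange(n_arms)
        else:                                            # exploit: best estimate
            arm = max(range(n_arms), key=lambda a: values[a])
        reward = 1.0 if rng.random() < true_probs[arm] else 0.0
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]  # incremental mean
    return values

print(epsilon_greedy_bandit([0.2, 0.5, 0.8]))  # estimates concentrate on arm 2
```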

Francesco-Sovrano/Combining--experience-replay--with--exploration-by-random-network-distillation-
Combining Experience Replay with Exploration by Random Network Distillation
Language: Python - Size: 761 KB - Last synced at: about 1 year ago - Pushed at: over 5 years ago - Stars: 5 - Forks: 1
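
Random network distillation (RND) supplies an exploration bonus: a predictor network is trained to match the output of a fixed, randomly initialized target network, and the prediction error is used as an intrinsic reward that is high for novel observations and decays for familiar ones. The snippet below is a minimal NumPy sketch of that idea under assumed dimensions and learning rate; it is not the repository's implementation, which additionally combines the bonus with experience replay.

```python
# Minimal random network distillation (RND) intrinsic-reward sketch
# (illustrative only, not this repository's implementation).
import numpy as np

rng = np.random.default_rng(0)
obs_dim, feat_dim, lr = 8, 16, 1e-2

W_target = rng.normal(size=(obs_dim, feat_dim))  # fixed random target, never trained
W_pred = rng.normal(size=(obs_dim, feat_dim))    # predictor, trained toward the target

def intrinsic_reward_and_update(obs):
    """Return the prediction error (exploration bonus) and take one SGD step."""
    global W_pred
    target = obs @ W_target                  # features of the fixed target network
    err = obs @ W_pred - target              # predictor error on this observation
    bonus = float(np.mean(err ** 2))         # high error => novel observation
    W_pred -= lr * np.outer(obs, err)        # gradient step on 0.5 * ||err||^2
    return bonus

x = rng.normal(size=obs_dim)
print([round(intrinsic_reward_and_update(x), 3) for _ in range(5)])  # bonus shrinks
```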
