GitHub / naimkatiman / RAG-using-Llama-3.1-WebUi-on-Streamlit
Run Llama 3.1 on serverless infrastructure with multimodal RAG, with a focus on reading files (within a limited token budget, of course).
Stars: 0
Forks: 0
Open issues: 0
License: None
Language: Jupyter Notebook
Size: 5.19 MB
Created at: 10 months ago
Updated at: 10 months ago
Pushed at: 10 months ago
Topics: lightningai, llamaindex, python, rag, streamlit
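The topics above (`rag`, `llamaindex`, `streamlit`) suggest a retrieve-then-generate pipeline. As a rough illustration of that pattern only — the repository itself presumably uses LlamaIndex for retrieval and a hosted Llama 3.1 endpoint for generation, neither of which is shown here — the sketch below swaps in keyword-overlap retrieval and a canned answer template so the flow runs with the standard library alone:

```python
def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Rank documents by word overlap with the query and return the top k.

    A real RAG app would use embeddings (e.g. via LlamaIndex); this
    keyword-overlap scorer is a stand-in for illustration only.
    """
    q_words = set(query.lower().split())
    scored = sorted(
        docs,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]


def generate(query: str, context: list[str]) -> str:
    """Stand-in for an LLM call: stitch retrieved context into a response."""
    return f"Q: {query}\nContext: {' | '.join(context)}"


docs = [
    "Streamlit builds interactive web UIs from Python scripts.",
    "Llama 3.1 is an open-weight large language model.",
    "RAG augments generation with retrieved documents.",
]
print(generate("What does RAG do?", retrieve("What does RAG do?", docs)))
```

In the actual app, `generate` would be replaced by a call to the Llama 3.1 model and `retrieve` by an index over the uploaded files, with Streamlit providing the chat UI.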