Emma-X: An Embodied Multimodal Action Model with Grounded Chain of Thought and Look-ahead Spatial Reasoning
JSON API: http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/declare-lab%2FEmma-X
PURL: pkg:github/declare-lab/Emma-X
Stars: 64
Forks: 5
Open issues: 4
License: None
Language: Python
Size: 32.7 MB
Dependencies parsed at: Pending
Created at: 8 months ago
Updated at: 3 months ago
Pushed at: 3 months ago
Last synced at: 3 months ago
Topics: agents, chain-of-thought, embodied-agent, embodied-ai, embodied-artificial-intelligence, embodied-intelligence, llm, robotics, vla, vlm
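The JSON API URL above follows the ecosyste.ms pattern `hosts/<host>/repositories/<owner%2Fname>`, where the repository's `owner/name` pair is percent-encoded as a single path segment. A minimal sketch of building that URL, assuming only this pattern (the helper name `build_repo_api_url` is hypothetical, not part of ecosyste.ms):

```python
from urllib.parse import quote

def build_repo_api_url(host: str, full_name: str) -> str:
    """Build an ecosyste.ms repository JSON API URL.

    The owner/name pair is encoded as one path segment, so the '/'
    in 'declare-lab/Emma-X' becomes '%2F', matching the URL listed above.
    """
    return (
        "http://repos.ecosyste.ms/api/v1/hosts/"
        f"{quote(host, safe='')}/repositories/{quote(full_name, safe='')}"
    )

# The URL for this repository:
url = build_repo_api_url("GitHub", "declare-lab/Emma-X")
print(url)
# → http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/declare-lab%2FEmma-X

# Fetching the metadata itself requires network access, e.g.:
# import json, urllib.request
# data = json.load(urllib.request.urlopen(url))
```

The fetch is left as a comment since the exact response fields are not shown in this listing.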