Embodied AI and Robotics Lab (AIR)

The Embodied AI and Robotics (AIR) Lab at NYU Abu Dhabi develops multimodal foundation models, world modeling techniques, and intelligent robotic systems to enable autonomous perception, reasoning, and interaction in dynamic real-world environments.

The lab advances Embodied AI by developing multimodal foundation models and language-augmented AI systems that enhance robotic perception, interaction, and reasoning. By integrating vision, language, and action, it enables robots to achieve context-aware understanding, real-time adaptation, and intuitive, human-like interaction. These advances allow AI systems to interpret complex sensory inputs, understand natural language commands, and generate intelligent behaviors that support seamless interaction with humans and their environments.

To further strengthen the intelligence and adaptability of robotic systems, the lab also advances world modeling, in which AI systems learn structured representations of dynamic, heterogeneous environments. This research enables robots to capture state transitions, infer causal relationships, and model evolving world dynamics, supporting more effective interaction in open-ended, real-world scenarios. By improving an AI system's ability to reason about its surroundings, anticipate changes, and make informed decisions, this work strengthens long-term planning, policy learning, and decision-making under uncertainty. A key focus is object-centric representations, which let robots understand, manipulate, and interact with their environment in a structured and efficient manner, improving generalization and robustness in complex settings.
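To make the idea of world modeling concrete, the toy sketch below (a hypothetical illustration, not the lab's actual method) shows the simplest possible form of the paradigm: an agent records observed state transitions in a tabular model of a one-dimensional grid world, then uses that learned model to plan a multi-step path toward a goal. The class and environment here are invented for illustration; real world models are learned neural representations over high-dimensional, multimodal observations.

```python
class TabularWorldModel:
    """Toy world model: memorizes (state, action) -> next-state transitions
    observed from experience, then plans by rolling the model forward."""

    def __init__(self):
        self.transitions = {}  # (state, action) -> observed next state

    def observe(self, state, action, next_state):
        # Record one experienced transition.
        self.transitions[(state, action)] = next_state

    def predict(self, state, action):
        # Predict the next state; unseen pairs default to "no change".
        return self.transitions.get((state, action), state)

    def plan(self, state, goal, actions, horizon=10):
        """Greedy rollout: at each step, pick the action whose predicted
        next state is closest to the goal, for up to `horizon` steps."""
        path = []
        for _ in range(horizon):
            if state == goal:
                break
            action = min(actions, key=lambda a: abs(self.predict(state, a) - goal))
            state = self.predict(state, action)
            path.append(action)
        return path, state


# Collect experience in a 1-D world of cells 0..9, where action +1 moves
# right and -1 moves left (clamped at the boundaries).
model = TabularWorldModel()
for s in range(10):
    for a in (-1, 1):
        model.observe(s, a, max(0, min(9, s + a)))

path, final = model.plan(state=2, goal=6, actions=(-1, 1))
print(path, final)  # four +1 steps reach the goal: [1, 1, 1, 1] 6
```

The key property the paragraph above describes survives even in this tiny setting: once the transition structure is learned, the agent can anticipate the consequences of actions and plan several steps ahead without further interaction with the environment.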

The lab applies its AI advancements across various robotic platforms, including humanoid robots, quadrupeds, unmanned ground vehicles (UGVs), unmanned aerial vehicles (UAVs), and AI-powered assistive devices such as wearable smart glasses. These robotic systems serve as testbeds for embodied AI research, helping validate multimodal perception, control policies, and real-world autonomy. By bridging theoretical research and real-world deployment, the lab ensures AI-driven systems operate autonomously, efficiently, and adaptively across a wide range of applications, from autonomous robotics and human-AI collaboration to assistive and healthcare technologies.

By integrating Embodied AI, world modeling, and robotics, the AIR Lab develops next-generation intelligent systems that advance AI-driven autonomy, perception, and interaction. Through this interdisciplinary research, the lab enables robots to navigate, understand, and engage with complex, real-world environments, driving both foundational AI research and transformative technological innovation.

Contact

Yi Fang
Assistant Professor of Electrical and Computer Engineering
Email: yf23@nyu.edu