# Robotics/Embodied Intelligence

## Navigation:

{% hint style="success" %}
* [Humanoid Robots](/by-industry-cases/robotics-embodied-intelligence/humanoid-robots.md)
{% endhint %}

***

## <mark style="color:purple;">News and Updates:</mark>

### <mark style="color:orange;">Gemini Robotics 1.5</mark>

Gemini Robotics 1.5 from Google DeepMind brings agentic capabilities to robots, allowing them to plan and complete complex, multi-step tasks.

{% embed url="https://youtu.be/UObzWjPb6XM?si=in6_KmbfL0WkL_zb" %}

### <mark style="color:orange;">Reachy Mini</mark>

Hugging Face, through its recent acquisition of Pollen Robotics, has unveiled Reachy Mini: a compact, affordable, open-source desktop humanoid kit (starting at $299) designed for expressive human-robot interaction, creative coding, and AI experimentation. It features motorized head and body rotation, animated antennas, audio-visual sensors, pre-loaded behaviors, and compatibility with Hugging Face's AI model ecosystem, with shipping beginning fall 2025.

{% embed url="https://youtu.be/JvdBJZ-qR18" %}
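Pollen has not published SDK details on this page, so as a flavor of the expressive-behavior scripting a kit like this targets, here is a purely hypothetical sketch. Every name in it (`HeadPose`, `express`, the pose values) is an illustration, not the real Reachy Mini API:

```python
from dataclasses import dataclass

# Hypothetical pose representation -- the real Reachy Mini SDK
# will define its own types and ranges.
@dataclass
class HeadPose:
    yaw_deg: float      # motorized head/body rotation
    pitch_deg: float    # head tilt
    antenna_deg: float  # antenna angle, used for expressive gestures

# Map a high-level "mood" to a pose, the kind of pre-loaded expressive
# behavior the kit advertises. Values are illustrative only.
MOODS = {
    "curious": HeadPose(yaw_deg=20.0, pitch_deg=-10.0, antenna_deg=45.0),
    "sleepy":  HeadPose(yaw_deg=0.0,  pitch_deg=25.0,  antenna_deg=-30.0),
    "alert":   HeadPose(yaw_deg=0.0,  pitch_deg=-20.0, antenna_deg=80.0),
}

def express(mood: str) -> HeadPose:
    """Return the pose for a mood, falling back to a neutral pose."""
    return MOODS.get(mood, HeadPose(0.0, 0.0, 0.0))
```

In a real setup, the returned pose would be sent to the robot's motor controllers; here it simply shows how high-level behaviors can decompose into low-level joint targets.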

### <mark style="color:orange;">Figure Helix</mark>

{% embed url="https://x.com/Figure_robot/status/1931391490967928936" %}

### <mark style="color:orange;">Boston Dynamics</mark>

Boston Dynamics partners with the RAI Institute, combining reinforcement learning and motion capture to give Atlas smoother, more human-like movements.

{% embed url="https://youtu.be/I44_zbEwz_w?si=XeMI1ZAHH8g0D82H" %}

### <mark style="color:orange;">Nvidia GR00T N1</mark>

{% embed url="https://nvidianews.nvidia.com/news/nvidia-isaac-gr00t-n1-open-humanoid-robot-foundation-model-simulation-frameworks" %}

Nvidia introduces GR00T N1, described as the world's first open, fully customizable foundation model for generalized humanoid robot reasoning and skills.

{% embed url="https://s3.amazonaws.com/cms.ipressroom.com/219/files/20252/nvidia-gr00t-n1.jpg" %}

### <mark style="color:orange;">Figure Helix</mark>

{% embed url="https://youtu.be/Z3yQHYNXPws?si=S2DvlnauHy6wq2dV" %}

### <mark style="color:orange;">Protoclone: Bipedal Musculoskeletal Android V1</mark>

{% embed url="https://youtu.be/H7dhwFcuUn0?si=X3WdvZIFhIt5m-m7" %}

### <mark style="color:orange;">Unitree Upgrade</mark>

{% embed url="https://x.com/UnitreeRobotics/status/1879864345615814923" %}

***

Boston Dynamics and Toyota Research Institute (TRI) have announced a groundbreaking partnership that combines their expertise in robotics and AI to accelerate the development of general-purpose humanoid robots, leveraging TRI's Large Behavior Models and Boston Dynamics' Atlas robot platform.

{% embed url="https://pressroom.toyota.com/boston-dynamics-and-toyota-research-institute-announce-partnership-to-advance-robotics-research/" %}

### Tesla Optimus:

{% embed url="https://x.com/Tesla_Optimus/status/1846797392521167223" %}

### Nvidia Generative Physical AI

> NVIDIA today announced that the world’s leaders in robot development are adopting the [NVIDIA Isaac™ robotics platform](https://developer.nvidia.com/isaac) for the research, development and production of the next generation of AI-enabled autonomous machines and robots.

{% embed url="https://youtu.be/AYSfcgVv9-U?si=2sEkhp3LEC7hn3fe" %}

**Isaac:**

<https://developer.nvidia.com/isaac>

**More info:**

<https://nvidianews.nvidia.com/news/robotics-industry-development-ai-autonomous-machines>

***

<mark style="color:red;">**NVIDIA**</mark> introduced <mark style="color:red;">**GR00T**</mark> at the GPU Technology Conference (GTC) 2024. The model enables robots to understand multimodal instructions such as language, video, and motion.

{% embed url="https://x.com/tsarnick/status/1769956431078261015?s=20" %}

***

## Resources:

### Awesome Weekly Robotics

<https://github.com/msadowski/awesome-weekly-robotics?tab=readme-ov-file>

