FrontierNews.ai

Eric Schmidt's $105 Million Bet on Robotics: Why Physical AI Is the Next Frontier

Eric Schmidt, the former CEO of Google, has emerged as a key investor in one of robotics' most ambitious ventures, backing French startup Genesis AI with $105 million in seed funding. The investment signals a major shift in how tech leaders view artificial intelligence: no longer confined to software and language models, but extending into physical machines capable of performing complex, real-world tasks. Genesis AI's new robotics foundation model, called GENE-26.5, is a direct attempt to solve one of AI's hardest problems: teaching machines to manipulate objects with human-like precision and adaptability.

What Makes Physical AI Different From Software AI?

For decades, industrial robots have excelled at repetitive, highly structured tasks. A factory robot can weld the same car component thousands of times without error; a warehouse robot can move identical packages with mechanical precision. But ask either robot to handle an egg, adjust to a slightly different object shape, or work in an unpredictable environment, and it often fails. This limitation has become the central bottleneck preventing robots from moving beyond factories into homes, hospitals, and service industries.

Software-based AI systems like ChatGPT or Google's Gemini train on massive datasets of text and images, allowing them to generalize across countless tasks. Robotics companies are now attempting to apply the same principle to physical machines. But robots face a fundamentally different challenge: they must interpret space, force, movement, object interaction, and physical consequences in real time. A slight variation in texture, lighting, weight, or positioning can cause a robotic system to fail entirely.

Genesis AI's GENE-26.5 is designed to address this problem through large-scale multimodal training, creating what the company describes as a "robotic brain" capable of operating across multiple tasks and hardware systems. In demonstrations released by the company, robots powered by GENE-26.5 cooked meals, cracked eggs one-handed, solved a Rubik's Cube, and played piano pieces at human-like speed.

Why Are Robotic Hands Becoming Strategically Important?

The centerpiece of Genesis AI's announcement was not just the AI model itself, but the robotic hand accompanying it. While humanoid robots often attract headlines because of their human-like appearance, many robotics researchers consider dexterous manipulation to be one of the field's most difficult technical challenges. Human hands combine more than 20 degrees of freedom with highly sensitive touch feedback, fine motor coordination, and continuous adaptation. Replicating that mechanically while simultaneously interpreting environmental feedback through AI systems remains extremely difficult.

Genesis AI's robotic hand was designed to mirror human anatomy more closely than traditional industrial grippers. This design serves a practical purpose beyond appearance: one of the largest bottlenecks in robotics is collecting useful training data. Text-based AI systems can train on internet-scale datasets, but robots require physical interaction data linked to movement, force, and spatial reasoning. Because the hand's structure matches that of a human hand, the company says, human demonstration data transfers more effectively into the robotic system.
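The idea of transferring human motion into an anatomically similar robot can be illustrated with a simple retargeting step. The sketch below is hypothetical (the joint names, limits, and one-to-one mapping are assumptions for illustration, not Genesis AI's actual pipeline): joint angles recorded from a sensor glove are mapped onto corresponding robot joints and clamped to the robot's mechanical limits, while human joints the robot lacks are simply dropped.

```python
# Hypothetical retargeting sketch: map glove joint angles (radians) onto
# a robot hand with matching anatomy. Joint names, limits, and the 1:1
# mapping are illustrative assumptions, not a real product's spec.

HUMAN_TO_ROBOT = {  # human joint -> (robot joint, min_rad, max_rad)
    "index_mcp": ("r_index_j0", 0.0, 1.6),
    "index_pip": ("r_index_j1", 0.0, 1.9),
    "thumb_cmc": ("r_thumb_j0", -0.5, 1.0),
}

def retarget(glove_angles: dict) -> dict:
    """Convert one frame of glove data into robot joint targets."""
    targets = {}
    for human_joint, angle in glove_angles.items():
        if human_joint not in HUMAN_TO_ROBOT:
            continue  # the robot has fewer degrees of freedom than a human hand
        robot_joint, lo, hi = HUMAN_TO_ROBOT[human_joint]
        targets[robot_joint] = max(lo, min(hi, angle))  # respect joint limits
    return targets

frame = {"index_mcp": 1.8, "index_pip": 0.4, "wrist_yaw": 0.1}
print(retarget(frame))  # index_mcp is clamped to 1.6; wrist_yaw is dropped
```

The closer the robot's joint layout is to the human hand, the less information this mapping loses, which is the intuition behind building anatomically faithful hardware.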

Genesis AI is not alone in pursuing dexterous robotic manipulation. Several companies are now competing in this space:

  • Shadow Robot Company: A UK-based firm that has spent years developing highly dexterous robotic hands used by research institutions including NASA and OpenAI
  • Sanctuary AI: A Canadian startup focused heavily on tactile sensing and fine manipulation for industrial humanoid robots
  • Figure AI and Physical Intelligence: US startups building broader robotics foundation models intended to generalize across tasks and environments
  • Tesla's Optimus: The electric vehicle maker has invested heavily in robotic hand development, with recent versions reportedly featuring 22 degrees of freedom designed for delicate object handling
  • Linkerbot: A Chinese robotics startup that has emerged as one of China's leading specialized robotic hand companies

Genesis AI's positioning differs slightly from many of these competitors because the company is attempting to build the full stack simultaneously: the robotic hand, motion-capture systems, simulation environment, and foundation model itself. That vertically integrated approach increasingly mirrors strategies pursued by companies such as Tesla and Figure AI, where hardware, sensing, and AI training are treated as tightly interconnected systems rather than interchangeable components.

How Is Genesis AI Addressing the Simulation Problem?

Simulation-based robotics training has become more sophisticated in recent years, but major technical limitations remain. Robots that perform well inside simulation environments often struggle in real-world deployment, a long-standing problem known within robotics as the "sim-to-real gap." Material textures, object variation, lighting conditions, friction, and unpredictable movement all introduce complications that simulations cannot perfectly replicate.
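One widely used technique for narrowing the sim-to-real gap is domain randomization: the physical parameters a simulator cannot model exactly are varied randomly from episode to episode, so the trained policy cannot overfit to any single setting. The sketch below shows the general idea under assumed parameter names and ranges; it is not a description of Genesis AI's actual training setup.

```python
import random

# Domain randomization sketch: each training episode samples its own
# physics and rendering parameters, so a policy learned in simulation
# cannot rely on one fixed friction value or lighting condition.
# Parameter names and ranges are illustrative assumptions.

def sample_episode_params(rng: random.Random) -> dict:
    return {
        "friction": rng.uniform(0.4, 1.2),        # surface friction coefficient
        "object_mass_kg": rng.uniform(0.05, 0.5), # vary object weight
        "light_intensity": rng.uniform(0.3, 1.0), # vary scene lighting
        "camera_jitter_px": rng.gauss(0.0, 2.0),  # vary sensor noise
    }

rng = random.Random(0)
for episode in range(3):
    params = sample_episode_params(rng)
    # simulator.reset(**params)  # each rollout would see a different world
    print(episode, params)
```

Combining this kind of randomized simulation with real human movement data, as the article describes, gives the model both breadth (cheap synthetic variation) and grounding (genuine physical interaction).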

Genesis AI argues that combining large-scale simulation with real-world human movement data helps reduce that gap. The company collects data from humans wearing sensor-equipped gloves that track finger, wrist, and hand movements. The startup is also using internet video, simulation environments, and head-mounted cameras to generate training data. The company's demonstrations suggest substantial progress in dexterity compared with earlier generations of robotic systems.

Yet the distinction between controlled demonstrations and scalable commercial deployment remains important. Many robotics companies have historically produced impressive demonstrations without achieving broad industrial adoption. Physical systems operate under significantly greater reliability constraints than software systems because they interact continuously with unpredictable real-world environments. Genesis AI itself acknowledged that some delicate manipulation tasks currently succeed at lower rates than others.

Understanding the Robotics Foundation Model Approach

  • Foundation Model Concept: Large AI systems trained on broad datasets that can generalize across different tasks, similar to how GPT models work in software but adapted for physical machines
  • Multimodal Training: Genesis AI's GENE-26.5 learns from multiple types of data including video, motion capture, simulation, and real-world interaction to build a more complete understanding of physical tasks
  • Hardware Integration: Unlike software models that run on any computer, robotics foundation models must be tightly integrated with specific hardware, sensors, and mechanical systems to function effectively
  • Data Collection at Scale: The company uses human movement data, internet video, and simulation to generate training examples that would be impossible to collect through robot-only learning
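The data-collection point above can be sketched as a mixing step: a multimodal training loop draws each batch from several heterogeneous sources, such as human motion capture, internet video, and simulation rollouts, according to fixed mixture weights. The source names and weights below are illustrative assumptions about the general approach, not the company's actual pipeline.

```python
import random

# Sketch of mixing heterogeneous data sources into one training stream.
# Source names and mixture weights are illustrative assumptions.
SOURCES = {
    "motion_capture": 0.40,  # glove / sensor recordings of human hands
    "internet_video": 0.35,  # weakly labeled video of manipulation
    "simulation": 0.25,      # synthetic rollouts with exact state labels
}

def sample_batch(rng: random.Random, batch_size: int = 8) -> list:
    """Pick a data source for each example in a batch, weighted by mixture."""
    names = list(SOURCES)
    weights = [SOURCES[n] for n in names]
    return rng.choices(names, weights=weights, k=batch_size)

rng = random.Random(0)
print(sample_batch(rng))  # e.g. a mix dominated by motion_capture and video
```

Tuning these mixture weights is one way such systems trade off the scale of cheap data (video, simulation) against the fidelity of expensive data (instrumented human demonstrations).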

Genesis AI emerged from stealth in 2025 with its $105 million seed funding round, backed by investors including Schmidt, French entrepreneur Xavier Niel, and France's public investment bank Bpifrance. The company was founded by Zhou Xian, who holds a robotics PhD from Carnegie Mellon University, and Théophile Gervet, formerly a research scientist at Mistral AI.

The investment from Schmidt and other prominent backers reflects growing confidence that robotics foundation models represent the next major frontier in AI development. While software-based AI has dominated headlines and venture capital for the past two years, the ability to extend those principles into physical machines could unlock entirely new categories of automation, from manufacturing to healthcare to domestic service. Genesis AI's GENE-26.5 and its accompanying robotic hand represent one of the first serious attempts to build a general-purpose robotics system capable of operating across different hardware platforms and real-world environments.