Bipedal robot achieves Guinness World Record in 100 metres


29 September 2022




Cassie the robot sets 100-metre record. Photo by Kegan Sims.

By Steve Lundeberg

Cassie the robot, invented at the Oregon State University College of Engineering and produced by OSU spinout company Agility Robotics, has established a Guinness World Record for the fastest 100 metres by a bipedal robot.

Cassie clocked the historic time of 24.73 seconds at OSU’s Whyte Track and Field Center, starting from a standing position and returning to that position after the sprint, with no falls.

The 100-metre record builds on earlier achievements by the robot, including traversing five kilometres in 2021 in just over 53 minutes. Cassie, the first bipedal robot to use machine learning to control a running gait on outdoor terrain, completed the 5K on Oregon State’s campus untethered and on a single battery charge.

Cassie was developed under the direction of Oregon State robotics professor Jonathan Hurst. The robot has knees that bend like an ostrich’s and operates with no cameras or external sensors, essentially as if blind.

Since Cassie’s introduction in 2017, OSU students, working with artificial intelligence professor Alan Fern, have been exploring machine learning options in Oregon State’s Dynamic Robotics and AI Lab.

“We have been building the understanding to achieve this world record over the past several years, running a 5K and also going up and down stairs,” said graduate student Devin Crowley, who led the Guinness effort. “Machine learning approaches have long been used for pattern recognition, such as image recognition, but generating control behaviors for robots is new and different.”

The Dynamic Robotics and AI Lab melds physics with AI approaches more commonly used with data and simulation to generate novel results in robot control, Fern said. Students and researchers come from a range of backgrounds including mechanical engineering, robotics and computer science.

“Cassie has been a platform for pioneering research in robot learning for locomotion,” Crowley said. “Completing a 5K was about reliability and endurance, which left open the question of, how fast can Cassie run? That led the research team to shift its focus to speed.”

Cassie was trained for the equivalent of a full year in a simulation environment, compressed to a week through a computing technique known as parallelization – multiple processes and calculations happening at the same time, allowing Cassie to go through a range of training experiences simultaneously.
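The article does not share the lab’s training code, but the idea behind parallelization can be illustrated with a minimal sketch: many independent copies of a simulated environment are rolled out at the same time, so the policy gathers a year’s worth of experience in far less wall-clock time. Everything below, the toy dynamics, reward, worker counts, and the `rollout` helper, is a hypothetical stand-in for illustration, not the Dynamic Robotics and AI Lab’s actual setup.

```python
# Minimal sketch of parallelized simulation rollouts (illustrative only).
# Each worker steps its own copy of a toy "environment" simultaneously,
# and the pooled results would feed a policy update in a real trainer.
import multiprocessing as mp
import random


def rollout(worker_seed, episode_length=200):
    """Run one simulated episode with a placeholder policy; return its total reward."""
    rng = random.Random(worker_seed)
    total_reward = 0.0
    forward_velocity = 0.0
    for _ in range(episode_length):
        action = rng.uniform(-1.0, 1.0)   # placeholder for a learned policy's output
        forward_velocity += 0.1 * action  # toy dynamics, not a real gait model
        total_reward += forward_velocity  # toy reward: favour moving forward
    return total_reward


if __name__ == "__main__":
    num_workers = 8   # in practice, many more simulations can run at once
    num_updates = 5
    with mp.Pool(processes=num_workers) as pool:
        for update in range(num_updates):
            seeds = [update * num_workers + i for i in range(num_workers)]
            returns = pool.map(rollout, seeds)  # all episodes simulated in parallel
            # A real trainer would update the policy from these rollouts here.
            print(f"update {update}: mean return {sum(returns) / len(returns):.2f}")
```

The design point is simply that rollouts are independent, so adding workers scales the rate at which training experience accumulates; the learning algorithm itself is unchanged.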

“Cassie can perform a spectrum of different gaits but as we specialized it for speed we began to wonder, which gaits are most efficient at each speed?” Crowley said. “This led to Cassie’s first optimized running gait and resulted in behavior that was strikingly similar to human biomechanics.”

The remaining challenge, a “deceptively difficult” one, was to get Cassie to reliably start from a free-standing position, run, and then return to the free-standing position without falling.

“Starting and stopping in a standing position are more difficult than the running part, similar to how taking off and landing are harder than actually flying a plane,” Fern said. “This 100-metre result was achieved by a deep collaboration between mechanical hardware design and advanced artificial intelligence for the control of that hardware.”

Hurst, chief technology officer at Agility Robotics and a robotics professor at Oregon State, said: “This may be the first bipedal robot to learn to run, but it won’t be the last. I believe control approaches like this are going to be a huge part of the future of robotics. The exciting part of this race is the potential. Using learned policies for robot control is a very new field, and this 100-metre dash is showing better performance than other control methods. I think progress is going to accelerate from here.”
