AIhub.org
 

Bipedal robot developed at Oregon State learns to run


11 August 2021




Cassie the robot. Image courtesy of Jonathan Hurst, Oregon State University.

By Steve Lundeberg

Cassie the robot, invented at Oregon State University and produced by OSU spinout company Agility Robotics, has made history by traversing 5 kilometres outdoors in just over 53 minutes. The robot was developed under the direction of robotics professor Jonathan Hurst with a 16-month, $1 million grant from the Advanced Research Projects Agency of the U.S. Department of Defense.

Since Cassie’s introduction in 2017, OSU students funded by the National Science Foundation have been exploring machine learning options for the robot.

“The Dynamic Robotics Laboratory students in the OSU College of Engineering combined expertise from biomechanics and existing robot control approaches with new machine learning tools,” said Hurst, who co-founded Agility in 2017. “This type of holistic approach will enable animal-like levels of performance. It’s incredibly exciting.”

Cassie, with knees that bend like an ostrich’s, taught itself to run using a deep reinforcement learning algorithm. Running requires dynamic balancing – the ability to maintain balance while switching positions or otherwise being in motion – and Cassie has learned to make countless subtle adjustments to stay upright while moving.

“Cassie is a very efficient robot because of how it has been designed and built, and we were really able to reach the limits of the hardware and show what it can do,” said Jeremy Dao, a Ph.D. student in the Dynamic Robotics Laboratory.

“Deep reinforcement learning is a powerful method in AI that opens up skills like running, skipping and walking up and down stairs,” added Yesh Godse, an undergraduate in the lab.
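The core idea behind reinforcement learning can be illustrated with a toy sketch: a policy proposes corrective actions, a reward signal counts how long the system stays balanced, and the policy's parameters are nudged toward higher reward. Everything below – the one-dimensional "lean angle" dynamics, the linear policy, and the simple hill-climbing update standing in for deep RL – is a hypothetical illustration, not Cassie's actual controller or training setup.

```python
import random

def rollout(w, steps=200):
    """Run one episode of a toy balancing task.

    State is a single 'lean angle'; the action is a corrective push
    from a linear policy with gain w. Reward counts how many steps
    the agent stays near upright. Purely illustrative dynamics.
    """
    angle, velocity, reward = 0.1, 0.0, 0
    for _ in range(steps):
        action = -w * angle                        # linear feedback policy
        velocity += 0.05 * angle + 0.05 * action   # toy physics: lean pulls over, push corrects
        angle += velocity
        if abs(angle) > 1.0:                       # fell over
            break
        reward += 1
    return reward

def hill_climb(episodes=50, seed=0):
    """Crude stand-in for deep RL training: randomly perturb the policy
    parameter and keep any change that does not reduce episode reward."""
    rng = random.Random(seed)
    w, best = 0.0, rollout(0.0)
    for _ in range(episodes):
        cand = w + rng.gauss(0, 0.5)
        score = rollout(cand)
        if score >= best:
            w, best = cand, score
    return w, best

w, best = hill_climb()
print(f"learned gain {w:.2f}, balanced for {best} steps")
```

With the zero policy the toy system tips over within a few steps, while the learned gain keeps it upright far longer – the same trial-and-improvement loop that, scaled up to deep neural network policies and full-body dynamics, underlies skills like running and stair climbing.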

Hurst said walking robots will one day be a common sight – much like the automobile, and with a similar impact. The limiting factor has been the science and understanding of legged locomotion, but research at Oregon State has enabled multiple breakthroughs.

ATRIAS, developed in the Dynamic Robotics Laboratory, was the first robot to reproduce human walking gait dynamics. Following ATRIAS was Cassie, then came Agility’s humanoid robot Digit.

“In the not very distant future, everyone will see and interact with robots in many places in their everyday lives, robots that work alongside us and improve our quality of life,” Hurst said.

During the 5K, Cassie’s total time of 53 minutes included about 6.5 minutes of resets following two falls: one because of an overheated computer, the other because the robot was asked to execute a turn at too high a speed.

In a related project, Cassie has become adept at walking up and down stairs.



tags: Oregon State University




©2025.05 - Association for the Understanding of Artificial Intelligence