
Collaboration the key to realising the potential of AI


28 November 2022




SPOT is a quadruped robot “dog” from the Boston Dynamics company. Photo credit: Thor Balkhed.

By Anders Törneholm

It can be difficult for rescue personnel to reach an injured person in inaccessible terrain in time to provide necessary aid. It is probable that autonomous drones and quadruped robot “dogs” will become our ‘friends in need’ in the future. But we are currently far from achieving full autonomy for these robotic systems. Consequently, well-functioning collaboration between human and machine is crucial.

With a low thud, a red medical package lands in the damp grass, and the mechanical buzzing of a quadrotor system fades into the distance. A moment later, a yellow quadruped robot makes its way across the well-manicured lawns of Gränsö Manor. An injured person lies a bit further away. The quadruped robot walks over to the red first-aid kit dropped by the quadrotor system, picks it up and then walks over to the previously identified casualty, and places the medical package within reach. Most of these activities take place autonomously.

“We are facing a future in which crises become more frequent as a consequence of climate change, terrorism, natural and man-made disasters and war. There are numerous situations where it would be too risky to send in people to rescue or assist individuals, and in such cases autonomous robots may be part of a solution,” says Patrick Doherty, professor in the Department of Computer and Information Science at Linköping University.

Symbiosis

One of the areas in which researchers in the Artificial Intelligence and Integrated Computer Systems Division at Linköping University, led by Doherty, are working is the development of systems that can deal autonomously with much of what is difficult or impossible for humans to do. At the same time, there is much that humans can do that is beyond the capability of current state-of-the-art robots. The current research goal is therefore to develop autonomous systems in which the combined abilities of people and robots are greater than the sum of their parts, leveraging the capabilities of each team member, whether robot or human.

Patrick Doherty, professor in the Department of Computer and Information Science at Linköping University. Photo credit: Thor Balkhed.

“Robots are becoming ever more sophisticated, and they can now even collaborate with humans to some extent. But this also means that you can’t build an autonomous system without a context. It’s important to consider what the systems interact with and how they achieve specific goals,” says Patrick Doherty.

He believes that it is important to find a symbiosis between robots and humans, which means focusing on a spectrum of autonomy. In the ideal case, the robot system itself would choose the level of autonomy suitable for the task at hand, but it is equally important that a human can regulate the degree of autonomy a robot system has at any one time. At one end of the scale there is no autonomy at all, and we are well familiar with such systems; at the other end lies full autonomy, which still lies some way off in the future.
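
As a rough illustration of what such a sliding autonomy scale might look like in software, the sketch below models discrete autonomy levels that either a robot's own planner or a human operator can adjust, with the operator's setting always taking precedence. The level names and the override rule are illustrative assumptions, not part of the Linköping system.

```python
from enum import IntEnum
from typing import Optional

class AutonomyLevel(IntEnum):
    """Illustrative points on a spectrum from teleoperation to full autonomy."""
    TELEOPERATED = 0   # human controls every action
    ASSISTED = 1       # robot executes low-level actions, human directs
    SUPERVISED = 2     # robot plans and acts, human approves key decisions
    FULL = 3           # robot plans and acts without human input

class AutonomyManager:
    """Tracks the effective autonomy level for one robot in a mixed team."""

    def __init__(self, default: AutonomyLevel = AutonomyLevel.SUPERVISED):
        self._robot_choice = default
        self._human_override: Optional[AutonomyLevel] = None

    def robot_requests(self, level: AutonomyLevel) -> None:
        """The robot proposes a level it considers suitable for the task at hand."""
        self._robot_choice = level

    def human_sets(self, level: Optional[AutonomyLevel]) -> None:
        """A human operator can impose a level at any time (None clears it)."""
        self._human_override = level

    @property
    def effective_level(self) -> AutonomyLevel:
        """The human's setting, if any, always takes precedence."""
        return self._human_override if self._human_override is not None else self._robot_choice

manager = AutonomyManager()
manager.robot_requests(AutonomyLevel.FULL)    # robot judges the task to be routine
manager.human_sets(AutonomyLevel.SUPERVISED)  # operator dials the autonomy back down
print(manager.effective_level.name)           # -> SUPERVISED
```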

Historically, a major goal of AI research has been to develop fully autonomous systems exhibiting high degrees of intelligent behaviour. In practice, however, AI systems will almost always interact and collaborate with humans in one way or another. There are also scenarios in which full autonomy is desirable, such as space applications on the Moon or Mars, where long communication paths limit interaction and the risk of encountering a human is low.

Teamwork

But what about rescue operations, then? Wouldn’t it be an advantage if the systems were fully autonomous in risk-filled situations?

“No, currently in the context of rescue operations, that’s not a good idea. Now, and in the near future, it’s better to focus on collaboration between humans and robots and ensure that they form a cohesive team. People are still better at doing certain activities than robots in many ways. The only time it may be desirable to use fully autonomous systems is in extremely dangerous situations, such as a nuclear or chemical spill where radioactivity or spillage is high and there is no choice but to use robot systems exclusively,” says Patrick Doherty.

Collaboration between several agents, whether they are robots or humans, is key to realising the potential of AI in its current state. Photo credit: Thor Balkhed.

In the emergency rescue demonstration at Gränsö Manor, humans and robots collaborate in three separate stages. In the first, a human indicates on a computer screen a larger region in which injured persons are assumed to be located. A drone then autonomously scans this region and attempts to identify objects of interest using machine-learned detection and classification systems. When a person in need of help has been detected and identified, an operator must then decide what that person needs. This may be a radio, emergency rations or – as in this case – a first-aid kit. A drone that is equipped with the necessary drop mechanisms then flies to an appropriate drop point and releases the kit as close to the casualty as possible.
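
The three stages can be thought of as a simple pipeline in which human decisions gate the autonomous steps. The sketch below is a schematic reconstruction of that flow under assumed names (scan_region, choose_supply and fly_and_drop are hypothetical stand-ins), not the researchers' actual mission software.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """An object of interest found during the autonomous scan."""
    label: str        # e.g. "person", "boat", "fire"
    lat: float
    lon: float
    confidence: float

def scan_region(region: tuple) -> list[Detection]:
    """Stage 1 (drone, autonomous): scan the operator-defined region and run a
    learned detection/classification model on the imagery. Stubbed here."""
    return [Detection("person", 57.901, 16.672, 0.93)]

def choose_supply(detection: Detection) -> str:
    """Stage 2 (human): an operator decides what the detected person needs,
    e.g. a radio, emergency rations or a first-aid kit."""
    return input(f"Supply for {detection.label} at ({detection.lat}, {detection.lon}): ")

def fly_and_drop(detection: Detection, supply: str) -> None:
    """Stage 3 (drone, autonomous): fly to a suitable drop point near the
    casualty and release the chosen package. Stubbed here."""
    print(f"Dropping {supply} near ({detection.lat}, {detection.lon})")

def mission(region: tuple) -> None:
    for det in scan_region(region):
        if det.label == "person" and det.confidence > 0.8:
            supply = choose_supply(det)   # human-in-the-loop decision
            fly_and_drop(det, supply)

mission(region=((57.90, 16.66), (57.91, 16.68)))  # corner coordinates of the search area
```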

However, in some cases the drone cannot release its load sufficiently close to the casualty. A robot that can operate on the ground in changing terrain is then required. This has led the researchers to introduce SPOT, a quadruped robot “dog” from the Boston Dynamics company, which delivers the medical kit autonomously over the final distance by picking it up with its robot arm and navigating to the injured person.

Machine learning

The research behind the autonomous delivery is advanced, and many parameters must be considered to make everything work. The task of identifying objects is itself difficult for artificial intelligence, and the quality of the result often depends on the dataset used to train the system.

“During the rescue, the system must discover important items and be able to classify them – a person, an inverted boat, a small fire, etc. Machine learning and deep learning are used to enable the system to do this. It’s difficult. And the weather, for example, plays a role here. A training dataset that was collected on sunny days will not work well when it’s cloudy,” says Patrick Doherty.
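
One way to expose the kind of weather sensitivity Doherty describes is to evaluate a trained detector separately on each capture condition rather than reporting a single aggregate score. The sketch below assumes a generic classifier with a predict method and a labelled test set tagged by condition; it is an illustration, not the project's evaluation code.

```python
from collections import defaultdict

def accuracy_by_condition(model, images, labels, conditions):
    """Group test examples by capture condition (e.g. 'sunny', 'cloudy') and
    report accuracy per group, revealing gaps that an aggregate score hides."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for image, label, condition in zip(images, labels, conditions):
        prediction = model.predict(image)   # assumed single-sample predict method
        total[condition] += 1
        correct[condition] += int(prediction == label)
    return {c: correct[c] / total[c] for c in total}

# Hypothetical outcome for a detector trained only on sunny-weather imagery:
# {'sunny': 0.91, 'cloudy': 0.62}  -- a cue to collect more varied training data.
```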

Several other general problems must be solved before the systems become robust and reliable.

“Of course, we would like to have full autonomy that is just as effective as a human, and that is the long-term goal. But we won’t achieve this in the near future. So we first need to see how AI, in its current state, can help us in the here and now,” says Patrick Doherty.

The yellow quadruped robot walking. Photo credit: Thor Balkhed.




Linköping University



