Research Topics

Abhishek Kashyap

Learning Object Affordances for Full-body Mobile Manipulation

A mobile robot manipulator – or, in short, a mobile manipulator – comprises one or more robot arms mounted on a movable base. The mobile base greatly expands the reachable workspace, allowing the manipulator to perform tasks that would otherwise be impossible due to reachability constraints or insufficient degrees of freedom. To accomplish tasks, the mobile manipulator needs to form a representation of the scene or environment it is operating in. Scene representations such as Neural Radiance Fields (NeRFs) and Gaussian Splatting encode scene information in a way that can also render photorealistic images from novel viewpoints. This project will explore scene representations suitable for manipulation and navigation in the context of mobile manipulation.

Furthermore, the tasks to be performed by the mobile manipulator may impose motion constraints. For example, navigating through a door requires identifying the door handle, then planning and executing arm and base trajectories that respect the physical constraints of the door, handle, and hinges. Similarly, grabbing an object off a table without stopping the base requires synchronized whole-body control. Devising general-purpose approaches to identifying such motion constraints can make a mobile manipulator more efficient at its tasks.
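The benefit of the mobile base for reachability can be illustrated with a toy planar sketch. The function below checks whether a 2-link arm mounted on a base can reach a point, ignoring joint limits; the link lengths, base positions, and target are all illustrative values, not taken from any real platform:

```python
import numpy as np

def arm_reachable(target, base_x, link_lengths=(0.4, 0.3)):
    """Check whether a planar 2-link arm mounted at (base_x, 0)
    can reach target = (x, y), ignoring joint limits.

    The arm's workspace is an annulus around the base with
    inner radius |l1 - l2| and outer radius l1 + l2.
    """
    dx, dy = target[0] - base_x, target[1]
    d = np.hypot(dx, dy)  # distance from arm mount to target
    l1, l2 = link_lengths
    return abs(l1 - l2) <= d <= l1 + l2

target = (1.5, 0.2)
# With the base fixed at the origin, the target lies outside the
# arm's 0.7 m outer reach.
print(arm_reachable(target, base_x=0.0))   # False
# Driving the base closer puts the target inside the workspace.
print(arm_reachable(target, base_x=1.2))   # True
```

In a real system the same idea appears as a joint base-arm reachability or inverse-kinematics query, with the base pose treated as additional degrees of freedom.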

Esranur Ertürk

“Get a grip” – Accurate robot-human handovers

The goal is to develop a robotic system that can accurately hand over objects to humans, ensuring precise placement without re-gripping. The project addresses three main challenges: achieving effective scene understanding and situation awareness, accurately recognizing user intentions and high-level reasoning, and teaching and executing collaborative tasks. By focusing on these areas, the project seeks to improve intuitive robot instruction and responsive collaboration in both medical and industrial contexts, ultimately contributing to more efficient workflows and addressing labor shortages in sectors like healthcare and manufacturing.

The project plan focuses on developing methods to determine which objects to hand over, how to place them in the user’s hand, and how to execute these tasks effectively. Initially, the project will use tools like a scalpel and a screwdriver. Hand tracking enables the robot to understand and respond to human gestures, ensuring precise object handovers. This involves the robot interpreting the surgeon’s or worker’s gestures and hand poses to accurately place the tool in their hand without the need for re-gripping. Tool tracking, on the other hand, allows the robot to accurately identify and manipulate these specific tools during collaborative tasks, ensuring that each tool is handed over in the correct orientation and position for immediate use.
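Placing a tool in a tracked hand can be phrased as composing rigid-body transforms: the tracked hand pose in the robot base frame, times a fixed tool-specific grasp offset, gives the target tool pose. The sketch below uses homogeneous 4x4 matrices; the specific poses and the 5 cm offset are hypothetical numbers for illustration only:

```python
import numpy as np

def pose(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical tracked hand pose in the robot base frame:
# identity orientation, 0.6 m in front of and 0.9 m above the base.
T_base_hand = pose(np.eye(3), [0.6, 0.0, 0.9])

# Fixed, tool-specific offset describing where the handle should sit
# relative to the palm so the tool is usable without re-gripping.
T_hand_tool = pose(np.eye(3), [0.0, 0.0, 0.05])

# Target tool pose for the robot: compose the two transforms.
T_base_tool = T_base_hand @ T_hand_tool
print(T_base_tool[:3, 3])  # translation: x=0.6, y=0.0, z=0.95
```

As the hand moves, re-evaluating this composition with the latest tracked pose keeps the handover target up to date.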

By integrating high-level knowledge and explicit commands, the system can predict what tool to hand over and when, based on the context of the task and the workflow. This involves using symbolic domain knowledge and sensor information to understand the user’s intentions and make informed decisions. The system aims to balance personalization with general solutions, learning individual user preferences for gestures, speed, and force parameters, while also maintaining a level of generality that allows it to adapt to different users and scenarios.

Sebastiano Fregnan

Whole-Body Interactive Mobile Manipulation

The research covers collision-free trajectory generation, constrained control for composite platform-manipulator motion, and adaptive whole-body compliant control for interaction with obstacles. We are planning to use off-the-shelf visual perception to capture information about obstacles, and force/torque sensors, tactile skin, and proprioception for whole-body compliance.
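One standard building block for compliant interaction is an admittance law, which turns a measured contact force into a velocity command. A minimal 1-DoF sketch is shown below; the virtual mass, damping, and the applied force are illustrative values, not parameters of the actual system:

```python
def admittance_step(v, f_ext, M=2.0, D=8.0, dt=0.01):
    """One Euler step of the 1-DoF admittance law  M*dv/dt + D*v = f_ext:
    an external contact force f_ext is mapped to a compliant velocity."""
    dv = (f_ext - D * v) / M
    return v + dt * dv

v = 0.0
# Push with a constant 4 N force: the commanded velocity converges
# to the steady-state value f_ext / D = 0.5 m/s, so the robot
# yields smoothly instead of resisting the contact.
for _ in range(2000):
    v = admittance_step(v, f_ext=4.0)
print(round(v, 3))  # → 0.5
```

In a whole-body setting the same law runs in task space with a matrix-valued mass and damping, and the resulting velocity is distributed over base and arm joints by a whole-body controller.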

Adam Miksits

Communication-Control Co-Design for Robot Localization

The team I work in at Ericsson Research studies communication-control co-design for connected mobile robots. As part of that work, my research investigates the implications of offloading the robot's localization to the edge, which causes network congestion when scaled up to many robots. Sending fewer sensor measurements from the robot to the edge reduces the load on the network, but at the cost of higher localization uncertainty. This can also affect safety if the localization estimate is used to avoid obstacles in the robot's map. The controller on the robot can sometimes handle the uncertainty by moving slower or staying further away from obstacles, but then the robot takes much longer to reach its goal. By considering the communication and control aspects together, we can find a good trade-off between network load and control performance while keeping the robot safe.
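The effect of sending fewer measurements can be seen in a minimal scalar Kalman-filter sketch: uncertainty grows at every prediction step and only shrinks when a measurement actually reaches the edge. The noise values q and r below are illustrative, not from the actual system:

```python
def propagate(P, q=0.01):
    """Prediction step: process noise q makes the variance grow."""
    return P + q

def update(P, r=0.05):
    """Correction step with a measurement of noise r: variance shrinks."""
    K = P / (P + r)        # Kalman gain
    return (1 - K) * P

def steady_variance(send_every, steps=500):
    """Variance after many steps when a measurement is sent to the
    edge only every `send_every`-th step."""
    P = 1.0
    for t in range(steps):
        P = propagate(P)
        if t % send_every == 0:
            P = update(P)
    return P

# Sending fewer measurements (a longer period) leaves more residual
# uncertainty, which the controller must compensate for with larger
# safety margins or lower speed.
print(steady_variance(1) < steady_variance(5) < steady_variance(20))  # True
```

This monotone relation between measurement rate and residual variance is exactly the quantity traded off against network load in the co-design.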

Publications

Miksits, A., Barbosa, F. S., Araújo, J., & Johansson, K. H. (2024, November). Communication and Control Co-Design for Risk-Aware Safety of Mobile Robots with Offloaded Localization. Submitted to ECC 2025.

Styrud, J., Iovino, M., Norrlöf, M., Björkman, M., Smith, C. (2024, September). Automatic Behavior Tree Expansion with LLMs for Robotic Manipulation. arXiv preprint arXiv:2409.13356 

Iovino, M., Förster, J., Falco, P., Chung, J.J., Siegwart, R., & Smith, C. (2024, August). Comparison between Behavior Trees and Finite State Machines. arXiv preprint arXiv:2405.16137 

Styrud, J., Mayr, M., Hellsten, E., Krueger, V., & Smith, C. BeBOP–Combining Reactive Planning and Bayesian Optimization to Solve Robotic Manipulation Tasks. 2024 IEEE International Conference on Robotics and Automation (ICRA), Yokohama, Japan, 2024, pp. 16459-16466, doi: 10.1109/ICRA57147.2024.10611468.

Vahs, M., & Tumova, J. Risk-aware control for robots with non-gaussian belief spaces. 2024 IEEE International Conference on Robotics and Automation (ICRA), Yokohama, Japan, 2024, pp. 11661-11667, doi: 10.1109/ICRA57147.2024.10611412.

Miksits, A., Barbosa, F. S., Lindhé, M., Araújo, J., & Johansson, K. H. (2023, December). Safe Navigation of Networked Robots Under Localization Uncertainty Using Robust Control Barrier Functions. In 2023 62nd IEEE Conference on Decision and Control (CDC) (pp. 6064-6071). IEEE. 

Iovino, M., Styrud, J., Falco, P., & Smith, C. (2023, August). A Framework for Learning Behavior Trees in Collaborative Robotic Applications. In 2023 IEEE 19th International Conference on Automation Science and Engineering (CASE) (pp. 1-8). IEEE. 

Pek, C., Schuppe, G. F., Esposito, F., Tumova, J., & Kragic, D. (2023). SpaTiaL: monitoring and planning of robotic tasks using spatio-temporal logic specifications. Autonomous Robots, 47(8), 1439-1462.