The project management is responsible for the content of the information provided.
This project, funded by Gebert Rüf Stiftung, is supported by the following project partners: Sevensense Robotics AG; Wyss Zurich, NCCR Robotics; ESA Business Incubation Center; Venture Kick
Project no: GRS-031/18
Amount of funding: CHF 296'000
Duration: 02.2019 - 08.2020
Area of activity:
Pilot projects, 1998 - 2018
Mobile robots have the potential to bring disruptive changes to our everyday lives. However, to move autonomously not only behind the closed doors of factories, they need to be able to interact with other moving systems and, more importantly, with people. They need to react appropriately and in a socially compliant manner, so that they neither interfere with nor scare the people around them.
In this project we will develop a compliance-driven navigation framework that will allow robots to become compliant in dynamic environments. To this end, we will look at the problem holistically, from the sensors that perceive the robot’s surroundings to the control strategy that will guide it through public spaces.
What is special about the project?
Mobile robots have brought significant increases in efficiency and effectiveness to several industries. This has also had an impact on our everyday lives, even though it has mostly happened behind the closed doors of warehouses and logistics centers.
It is largely due to the efficiency that robots have brought to warehouses that we can conveniently order most products online and expect them at our doorstep the following day.
Within this project we will enable such mobile robotic systems to work safely in areas that are not closed off to humans. In fact, many industries and applications could benefit from the capabilities of mobile robots. Imagine robotic cleaning machines or delivery robots that work seamlessly in crowded areas such as airports. However, to enable such machines to work safely and efficiently among humans, they need to become compliant with human interaction: they need to be able to move through crowded areas without disturbing the people around them.
Furthermore, they need to be able to navigate in complex environments and require an efficient and robust way to measure their own position in space. In this project, visual Simultaneous Localization and Mapping (SLAM) technologies will be developed that give these robotic systems the flexibility they need.
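The core idea behind SLAM can be illustrated with a toy example (this is a hypothetical sketch for intuition only, not the project's actual system): a robot integrating its own odometry accumulates drift, but it can correct its pose estimate whenever it re-observes a landmark whose position is already in its map. All function names and the correction gain below are illustrative assumptions.

```python
import math

def integrate_odometry(pose, v, omega, dt):
    """Dead-reckoning update of a 2D pose (x, y, heading) from wheel odometry.
    On its own, this drifts: small errors in v and omega accumulate over time."""
    x, y, th = pose
    return (x + v * math.cos(th) * dt,
            y + v * math.sin(th) * dt,
            th + omega * dt)

def correct_with_landmark(pose, landmark, measured_range, measured_bearing, gain=0.5):
    """Nudge the pose estimate toward the position implied by observing a
    landmark at a known map location (a crude stand-in for the probabilistic
    update a real SLAM system performs)."""
    x, y, th = pose
    lx, ly = landmark
    # Position the robot would have to be at for the measurement to be exact:
    implied_x = lx - measured_range * math.cos(th + measured_bearing)
    implied_y = ly - measured_range * math.sin(th + measured_bearing)
    # Blend the drifted estimate with the measurement-implied position.
    return (x + gain * (implied_x - x),
            y + gain * (implied_y - y),
            th)
```

For example, if the robot's drifted estimate is (1.2, 0.1) but a landmark known to be at (3, 0) is measured at range 2 and bearing 0, the correction pulls the estimate back toward the true position (1, 0). Real visual SLAM replaces the hand-placed landmark with features extracted from camera images and fuses measurements probabilistically, but the loop of "predict from motion, correct from observation" is the same.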
The team has successfully completed the project, having selected and evaluated a suite of sensor systems. A prototype was built so that it could be used as a testing platform during the project. With this prototype the team was able to collect invaluable data sets, which formed a fundamental component of the project’s development work.
Furthermore, the team has built a holistic approach to obstacle avoidance, which allows machines to interact more naturally when dynamic obstacles are present. The result is a software framework that reads sensor data from multiple sources and steers a wheeled machine so that it naturally avoids the obstacles surrounding it.
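The project's framework itself is not public; as a minimal sketch of the general idea, a common baseline for reactive avoidance is a potential field, in which obstacle positions (from any sensor source) repel the robot while the goal attracts it. The function and gain names below are illustrative assumptions, not the project's API.

```python
import math

def avoidance_velocity(robot, goal, obstacles,
                       attract_gain=1.0, repel_gain=0.5, influence=2.0):
    """Return a 2D velocity command (vx, vy): attraction toward the goal
    plus repulsion from every obstacle within the influence radius."""
    rx, ry = robot
    # Attractive component pulls the robot straight toward the goal.
    vx = attract_gain * (goal[0] - rx)
    vy = attract_gain * (goal[1] - ry)
    for ox, oy in obstacles:
        dx, dy = rx - ox, ry - oy
        d = math.hypot(dx, dy)
        if 1e-6 < d < influence:
            # Repulsion grows sharply as the obstacle gets closer and
            # vanishes at the edge of the influence radius.
            push = repel_gain * (1.0 / d - 1.0 / influence) / (d * d)
            vx += push * dx
            vy += push * dy
    return vx, vy
```

With the goal straight ahead and a single obstacle slightly above the robot's path, the command keeps a forward component while deflecting downward, away from the obstacle. A production system layers much more on top of this, such as tracking of moving obstacles and socially compliant trajectory prediction, but the sketch shows how multiple detections can be fused into one steering command.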
The method has been tested on several different machines and will be refined in the future by the project’s leading partner, Sevensense Robotics.
G. Cesari, G. Schildbach, A. Carvalho, and F. Borrelli, "Scenario model predictive control for lane change assistance and autonomous driving on highways," Intelligent Transportation Systems Magazine, vol. 9, no. 3, pp. 23–35, 2017.
T. Eppenberger, G. Cesari, M. Dymczyk, R. Siegwart, and R. Dubé, "Leveraging Stereo-Camera Data for Real-Time Dynamic Obstacle Detection and Tracking," 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 2020, pp. 10528–10535, doi: 10.1109/IROS45743.2020.9340699.
Persons involved in the project
Last update to this project presentation: 12.04.2021