PORTFOLIO

Presentation of the projects on the website

Every project supported by Gebert Rüf Stiftung is presented on the foundation's website, including in particular the project's basic data. Through this publication, the foundation reports on the results of the funding granted and contributes to scientific communication within society.


Bio-inspired Vision System for Fast Localization and Mapping

Editorial

The project management is responsible for the content of the information provided.

Cooperation

This project, funded by Gebert Rüf Stiftung, is also supported by the following project partners:

Project data

  • Project number: GRS-048/14 
  • Grant amount: CHF 200'000 
  • Approval: 06.11.2014 
  • Duration: 01.2015 - 06.2016 
  • Field of activity: Pilotprojekte, 1998 - 2018

Project management

Project description

Reaction time plays a critical role when robots move through the real world because it determines whether a robot crashes or not. Many robots today can only move slowly in unknown environments because it takes time to process all visual information. This is because the way conventional cameras observe visual scenes is very inefficient. To move faster, robots have to become more efficient in performing “simultaneous localization and mapping” (SLAM) which is necessary for navigation.

We have developed a camera inspired by the processing in the human eye. It reacts much faster to changes and consumes less power than a conventional camera. In this project we have developed the world's first real-time SLAM algorithms for this camera.
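To make the difference to a conventional camera concrete, here is a minimal sketch of the event-based output such a sensor produces. The representation (pixel coordinates, a microsecond timestamp, and a polarity bit) follows the common event-camera convention; the frame-diffing emulation and all names are illustrative, not the project's actual implementation.

```python
from collections import namedtuple

# An event camera outputs a sparse, asynchronous stream of "events" instead
# of full frames: pixel coordinates, a microsecond timestamp, and a polarity
# bit indicating whether brightness went up (+1) or down (-1).
Event = namedtuple("Event", ["x", "y", "t_us", "polarity"])

def events_from_frames(prev, curr, t_us, threshold=10):
    """Toy emulation: derive events by diffing two grayscale frames.

    A real sensor detects per-pixel brightness changes asynchronously in
    hardware; this diff merely illustrates how sparse the output is.
    """
    events = []
    for y, (row_p, row_c) in enumerate(zip(prev, curr)):
        for x, (p, c) in enumerate(zip(row_p, row_c)):
            if abs(c - p) >= threshold:
                events.append(Event(x, y, t_us, +1 if c > p else -1))
    return events

prev = [[100, 100, 100],
        [100, 100, 100]]
curr = [[100, 150, 100],
        [100, 100,  60]]
evts = events_from_frames(prev, curr, t_us=3)
# Only the two changed pixels produce events; static pixels produce no data,
# which is why event-based SLAM has far less data to process per unit time.
```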

What is special about this project?

The Silicon Eye technology fuses the advantages of conventional machine vision with bio-inspired event-based processing. This type of sensor has great potential in applications such as robotics or smart glasses, and with this project we delivered proof that existing algorithms can be adapted to the output of our sensor.

Status/interim results

The dynamic and active pixel vision sensor (DAVIS), which is the basis of the Silicon Eye technology, has been successfully designed, produced and tested. It has the following advantages over a conventional camera:
1. Low reaction time of a few microseconds (vs. several milliseconds)
2. High dynamic range of 130dB (vs. 60dB)
3. No motion blur (vs. a lot of motion blur during fast movements)
4. Efficient data representation leading to a low system level power consumption
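The dynamic-range figures above can be translated into intensity ratios. Assuming the usual image-sensor convention of dB = 20·log10(brightest/darkest measurable signal), the sketch below computes what 130 dB versus 60 dB means in practice:

```python
import math

def dynamic_range_ratio(db):
    # Image-sensor dynamic range in dB corresponds to a ratio of
    # 10^(dB / 20) between the brightest and darkest resolvable signal.
    return 10 ** (db / 20)

davis_ratio = dynamic_range_ratio(130)         # ~3.16 million : 1
conventional_ratio = dynamic_range_ratio(60)   # 1000 : 1

# The DAVIS can thus resolve scenes whose brightness spans over three
# orders of magnitude more than a conventional 60 dB camera, e.g. a robot
# flying from a dark room into direct sunlight without saturating.
```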

A market analysis at the Consumer Electronics Show (CES 2015) and in the Silicon Valley including direct feedback from some of the most relevant players in the field of mobile robotics and smart glasses revealed a huge demand for the targeted products.

With the successful outcome of this project, Insightness AG can now market these novel event-based visual positioning systems worldwide. A granted CTI project will allow further development of the chips, and potential partnerships with interested Fortune 100 companies will enable further development and integration of the technology.

Our project received support from ETH Zurich, University of Zurich, NCCR Robotics, CTI, Venture Kick and Venture Leaders.

Publications

C. Brandli, R. Berner, M. Yang, S.-C. Liu and T. Delbruck, «A 240 × 180 130 dB 3 µs Latency Global Shutter Spatiotemporal Vision Sensor», IEEE Journal of Solid-State Circuits, 2014;
E. Mueggler, B. Huber, D. Scaramuzza, «Event-based, 6-DOF Pose Tracking for High-Speed Maneuvers», 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS);
A. Censi, D. Scaramuzza, «Low-Latency Event-Based Visual Odometry», 2014 IEEE International Conference on Robotics and Automation (ICRA).

Press review

Tages-Anzeiger, print & online, D. Osswald, Wenn Roboter wie Menschen sehen
Bilanz, print & online, Die Kamera-Männer von Insightness
Tages-Anzeiger, print & online, S. Schmid, Spin-Off-Machine ETH (featuring portrait of Insightness)
IEEE Spectrum, print & online, E. Ackerman, Dynamic Vision Sensors Enable High-Speed Maneuvers With Robots

Links

People involved in the project

Last update of this project presentation: 13.11.2020