

Journal article:

E. Piatkowska, A. Belbachir, M. Gelautz:
"Cooperative and asynchronous stereo vision for dynamic vision sensors";
Measurement Science & Technology, 25 (2014), 5; pp. 1-8.



Abstract (English):
Dynamic vision sensors (DVSs) encode visual input as a stream of events generated upon relative light-intensity changes in the scene. These sensors simultaneously offer high temporal resolution (better than 10 μs) and wide dynamic range (>120 dB) with a sparse data representation, which is not possible with clocked vision sensors. In this paper, we focus on the task of stereo reconstruction. The spatiotemporal and asynchronous nature of the data provided by the sensor calls for a stereo reconstruction approach different from the one applied to synchronous frame-based cameras. We propose to model event-driven stereo matching with a cooperative network (Marr and Poggio 1976 Science 194 283-7). The history of recent activity in the scene is stored in the network, which serves as the spatiotemporal context used in the disparity calculation for each incoming event. The network evolves continuously as events are generated. In our work, not only is the spatiotemporal aspect of the data preserved, but the matching is also performed asynchronously. The experimental results show that the proposed approach is well suited to DVS data and can be successfully used for disparity calculation.
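
The sketch below illustrates, in Python, how such event-driven cooperative matching could look in principle. It is not the authors' implementation: the sensor resolution, disparity range, time constants, neighbourhood size and excitation/inhibition weights are assumptions chosen only to show the idea of keeping a decaying activity volume C(x, y, d) that supplies spatiotemporal context for the disparity decision made at each incoming event.

# Illustrative sketch of event-driven cooperative stereo matching (assumed
# parameters and update rule; not the implementation described in the paper).
import numpy as np

W, H, D_MAX = 128, 128, 32        # hypothetical sensor resolution and disparity range
TIME_WINDOW = 5e-3                # temporal coincidence window in seconds (assumed)
DECAY = 20e-3                     # decay constant of the network activity (assumed)

# Cooperative network: one activity value per (y, x, disparity) cell, plus the
# time of its last update so the activity can be decayed continuously.
C = np.zeros((H, W, D_MAX))
C_t = np.zeros((H, W, D_MAX))

# Time stamp of the most recent event at each pixel of the right sensor
# (polarity is ignored for brevity; rectified sensors are assumed).
last_right = np.full((H, W), -np.inf)

def on_right_event(x, y, t):
    """Store the event so later left events can use it as a match candidate."""
    last_right[y, x] = t

def on_left_event(x, y, t):
    """Asynchronously estimate the disparity of a single left-sensor event."""
    scores = np.zeros(D_MAX)
    for d in range(min(D_MAX, x + 1)):
        dt = t - last_right[y, x - d]             # same epipolar line, shifted by d
        if dt <= TIME_WINDOW:                     # temporal coincidence check
            decayed = C[y, x, d] * np.exp(-(t - C_t[y, x, d]) / DECAY)
            scores[d] = np.exp(-dt / TIME_WINDOW) + decayed
    if not scores.any():
        return None                               # no plausible match yet
    d_best = int(np.argmax(scores))

    # Cooperative update (Marr-Poggio style): excite neighbouring cells that
    # share the winning disparity, inhibit competing disparities at this pixel.
    y0, y1 = max(0, y - 1), min(H, y + 2)
    x0, x1 = max(0, x - 1), min(W, x + 2)
    sl = (slice(y0, y1), slice(x0, x1))
    C[sl] *= np.exp(-(t - C_t[sl]) / DECAY)       # bring the touched cells up to time t
    C_t[sl] = t
    C[y0:y1, x0:x1, d_best] += 1.0                # excitation
    C[y, x, :] = np.maximum(C[y, x, :] - 0.5, 0.0)  # inhibition
    return d_best

# Example: two nearly coincident events that should match at disparity 4.
on_right_event(56, 40, 0.0005)
print(on_left_event(60, 40, 0.0010))              # -> 4

The per-event update keeps the asynchronous character of the data: no frames are accumulated, and the decaying network state is the only memory consulted when a new event arrives.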

Keywords:
dynamic vision sensors, asynchronous stereo, event-based processing


"Offizielle" elektronische Version der Publikation (entsprechend ihrem Digital Object Identifier - DOI)
http://dx.doi.org/10.1088/0957-0233/25/5/055108

Electronic version of the publication:
http://iopscience.iop.org/0957-0233/25/5/055108


Generated from the publication database of Technische Universität Wien.