Conference paper, Year: 2020

The OmniScape Dataset

Abstract

Despite the utility and benefits of omnidirectional images in robotics and automotive applications, no datasets of omnidirectional images are available with semantic segmentation, depth maps, and dynamic properties. This is due to the time cost and human effort required to annotate ground truth images. This paper presents a framework for generating omnidirectional images from images acquired in a virtual environment. For this purpose, we demonstrate the relevance of the proposed framework on two well-known simulators: the CARLA simulator, an open-source simulator for autonomous driving research, and Grand Theft Auto V (GTA V), a video game with very high-quality graphics. We describe in detail the generated OmniScape dataset, which includes stereo fisheye and catadioptric images acquired from the two front sides of a motorcycle, together with semantic segmentation, depth maps, the intrinsic parameters of the cameras and the dynamic parameters of the motorcycle. It is worth noting that two-wheeled vehicles are more challenging than cars due to their specific dynamics.
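The core idea of the framework, as summarised above, is to synthesise an omnidirectional image from conventional views rendered in the simulator. The sketch below is only an illustration of that idea, not the authors' implementation: it assumes an equidistant fisheye model and a cubemap of six simulator views with a hypothetical face naming and axis convention ('front', 'right', 'up', ...), and maps every fisheye pixel to a viewing ray that is then sampled from the corresponding cube face.

import numpy as np

def fisheye_from_cubemap(faces, out_size=1024, fov_deg=180.0):
    # `faces`: dict of six face_size x face_size x 3 uint8 arrays keyed by
    # 'front', 'back', 'left', 'right', 'up', 'down' (hypothetical layout).
    h = w = out_size
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    u = (xs + 0.5) / w * 2.0 - 1.0           # normalised image x in [-1, 1]
    v = (ys + 0.5) / h * 2.0 - 1.0           # normalised image y in [-1, 1]
    r = np.sqrt(u ** 2 + v ** 2)
    valid = r <= 1.0                          # pixels inside the fisheye circle
    theta = r * np.radians(fov_deg) / 2.0     # equidistant model: angle proportional to radius
    phi = np.arctan2(v, u)
    # Ray direction for every pixel, camera looking along +z
    dirs = np.stack([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)], axis=-1)
    major = np.argmax(np.abs(dirs), axis=-1)  # dominant axis: 0=x, 1=y, 2=z
    face_size = faces["front"].shape[0]
    out = np.zeros((h, w, 3), dtype=np.uint8)
    # (axis, sign, face name, in-face u axis, in-face v axis, u sign, v sign);
    # the sign conventions are illustrative and must match the simulator output.
    layout = [(2, +1, "front", 0, 1, +1, +1), (2, -1, "back",  0, 1, -1, +1),
              (0, +1, "right", 2, 1, -1, +1), (0, -1, "left",  2, 1, +1, +1),
              (1, +1, "down",  0, 2, +1, -1), (1, -1, "up",    0, 2, +1, +1)]
    for axis, sign, name, ai, bi, asgn, bsgn in layout:
        mask = valid & (major == axis) & (np.sign(dirs[..., axis]) == sign)
        d = dirs[mask]
        a = asgn * d[:, ai] / np.abs(d[:, axis])   # in-face coords in [-1, 1]
        b = bsgn * d[:, bi] / np.abs(d[:, axis])
        px = ((a + 1.0) * 0.5 * (face_size - 1)).round().astype(int)
        py = ((b + 1.0) * 0.5 * (face_size - 1)).round().astype(int)
        out[mask] = faces[name][py, px]
    return out

Rendering one such image for each of the two front-side cameras would give a stereo pair, and applying the same per-pixel ray lookup to the depth and semantic-segmentation renders keeps the ground truth aligned with the RGB image; a catadioptric image would use a different ray model in place of the equidistant one.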
Main file
20.icra.pdf (1.21 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03088300, version 1 (26-12-2020)


Cite as

Ahmed Rida Sekkat, Yohan Dupuis, Pascal Vasseur, Paul Honeine. The OmniScape Dataset. 2020 IEEE International Conference on Robotics and Automation (ICRA), May 2020, Paris, France. pp.1603-1608, ⟨10.1109/ICRA40945.2020.9197144⟩. ⟨hal-03088300⟩
