Perception Sensor Dataset For Bioinspired Landing Trajectories Of An Ornithopter Robot
DOI: 10.5281/zenodo.3930442
Zenodo: 3930442
MaRDI QID: Q6699330
Dataset published in the Zenodo repository.
Anibal Ollero, Augusto Gómez Eguíluz, Juan Pablo Rodríguez Gómez, J. R. Martínez-de Dios
Publication date: 4 July 2020
Copyright license: Creative Commons Attribution 4.0 International
The dataset contains the measurements captured by several onboard sensors during the landing maneuvers of an ornithopter robot. Each dataset contains a ROS bag file with the sensor measurements, a file with the bioinspired trajectory, a file with the events generated by the simulated event-based sensor, and a README file with instructions for using the dataset. The bioinspired landing trajectories are computed using Tau Theory.

Each landing trajectory test was performed in a simulated scenario. There are two testing scenes: (i) a warehouse and (ii) a refinery. The object models for each scene can be found in its /model/meshes folder, and the file object_pose.csv gives the position and orientation of each object in the scene. The sensor measurements were saved in a rosbag file that contains one topic per sensor measurement. The dataset includes information from the following simulated sensors:
- Velodyne HDL-32E
- Sonar sensor with a range of 20 m
- IMU
- Frame-based monocular camera
- Event camera
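As a minimal sketch of how the per-scene object_pose.csv might be consumed, the snippet below parses object names, positions, and orientations from such a file. The column names (object, x, y, z, roll, pitch, yaw) and the object names are assumptions for illustration; the actual header and conventions are described in each dataset's README.

```python
import csv
import io

# Hypothetical example contents of object_pose.csv; the real column
# layout is documented in the dataset's README.
sample = """object,x,y,z,roll,pitch,yaw
crate_01,1.50,-2.00,0.00,0.0,0.0,1.57
tank_01,10.00,4.25,0.00,0.0,0.0,0.00
"""

def load_object_poses(fh):
    """Return {object_name: (position_xyz, orientation_rpy)} from a pose CSV."""
    poses = {}
    for row in csv.DictReader(fh):
        position = tuple(float(row[k]) for k in ("x", "y", "z"))
        orientation = tuple(float(row[k]) for k in ("roll", "pitch", "yaw"))
        poses[row["object"]] = (position, orientation)
    return poses

poses = load_object_poses(io.StringIO(sample))
print(poses["crate_01"])  # position and orientation of one scene object
```

In practice the file handle would come from `open(".../object_pose.csv")` inside the chosen scene folder, and the rosbag topics would be read separately with the standard ROS bag tooling.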