LUCOOP: Leibniz University Cooperative Perception and Urban Navigation Dataset

Publication: Contribution to book/report/conference proceedings › Conference paper › Research › Peer-reviewed

Authors


Details

Original language: English
Title of host publication: IV 2023 - IEEE Intelligent Vehicles Symposium, Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (electronic): 9798350346916
ISBN (print): 979-8-3503-4692-3
Publication status: Published - 27 July 2023
Event: 34th IEEE Intelligent Vehicles Symposium, IV 2023 - Anchorage, United States
Duration: 4 June 2023 - 7 June 2023

Publication series

Name: IEEE Intelligent Vehicles Symposium, Proceedings
Volume: 2023-June

Abstract

Recently published datasets have been increasingly comprehensive with respect to their variety of simultaneously used sensors, traffic scenarios, environmental conditions, and provided annotations. However, these datasets typically only consider data collected by one independent vehicle. Hence, there is currently a lack of comprehensive, real-world, multi-vehicle datasets fostering research on cooperative applications such as object detection, urban navigation, or multi-agent SLAM. In this paper, we aim to fill this gap by introducing the novel LUCOOP dataset, which provides time-synchronized multi-modal data collected by three interacting measurement vehicles. The driving scenario corresponds to a follow-up setup of multiple rounds in an inner city triangular trajectory. Each vehicle was equipped with a broad sensor suite including at least one LiDAR sensor, one GNSS antenna, and up to three IMUs. Additionally, Ultra-Wide-Band (UWB) sensors were mounted on each vehicle, as well as statically placed along the trajectory enabling both V2V and V2X range measurements. Furthermore, a part of the trajectory was monitored by a total station resulting in a highly accurate reference trajectory. The LUCOOP dataset also includes a precise, dense 3D map point cloud, acquired simultaneously by a mobile mapping system, as well as an LOD2 city model of the measurement area. We provide sensor measurements in a multi-vehicle setup for a trajectory of more than 4 km and a time interval of more than 26 minutes, respectively. Overall, our dataset includes more than 54,000 LiDAR frames, approximately 700,000 IMU measurements, and more than 2.5 hours of 10 Hz GNSS raw measurements along with 1 Hz data from a reference station. Furthermore, we provide more than 6,000 total station measurements over a trajectory of more than 1 km and 1,874 V2V and 267 V2X UWB measurements. Additionally, we offer 3D bounding box annotations for evaluating object detection approaches, as well as highly accurate ground truth poses for each vehicle throughout the measurement campaign.
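
The abstract describes time-synchronized, multi-modal measurements (LiDAR, IMU, GNSS, UWB ranges, and ground-truth poses) from three interacting vehicles. As a rough illustration only, the Python sketch below shows one possible in-memory layout for such synchronized multi-vehicle samples; every class and field name here is hypothetical and does not reflect the actual LUCOOP file formats, which are not documented in this record.

from dataclasses import dataclass, field
from typing import Dict, List, Optional

import numpy as np

# Illustrative sketch only: a possible in-memory layout for one
# time-synchronized, multi-vehicle sample as described in the abstract.
# All class and field names are hypothetical.

@dataclass
class VehicleFrame:
    vehicle_id: str                       # e.g. "vehicle_1" (hypothetical label)
    timestamp: float                      # common time base, seconds
    lidar_points: np.ndarray              # (N, 4) array: x, y, z, intensity
    imu_samples: np.ndarray               # (M, 6) array: 3-axis accel + 3-axis gyro
    gnss_raw: Optional[dict] = None       # 10 Hz raw GNSS observations
    uwb_ranges: Dict[str, float] = field(default_factory=dict)  # V2V/V2X ranges in metres
    gt_pose: Optional[np.ndarray] = None  # 4x4 ground-truth pose of the vehicle

@dataclass
class CooperativeFrame:
    timestamp: float
    vehicles: List[VehicleFrame]          # one entry per measurement vehicle (three here)

def nearest_frame(frames: List[CooperativeFrame], t: float) -> CooperativeFrame:
    """Return the synchronized multi-vehicle sample closest to time t."""
    return min(frames, key=lambda f: abs(f.timestamp - t))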


Cite this

LUCOOP: Leibniz University Cooperative Perception and Urban Navigation Dataset. / Axmann, Jeldrik; Moftizadeh, Rozhin; Su, Jingyao et al.
IV 2023 - IEEE Intelligent Vehicles Symposium, Proceedings. Institute of Electrical and Electronics Engineers Inc., 2023. (IEEE Intelligent Vehicles Symposium, Proceedings; Vol. 2023-June).


Axmann, J, Moftizadeh, R, Su, J, Tennstedt, B, Zou, Q, Yuan, Y, Ernst, D, Alkhatib, H, Brenner, C & Schön, S 2023, LUCOOP: Leibniz University Cooperative Perception and Urban Navigation Dataset. in IV 2023 - IEEE Intelligent Vehicles Symposium, Proceedings. IEEE Intelligent Vehicles Symposium, Proceedings, vol. 2023-June, Institute of Electrical and Electronics Engineers Inc., 34th IEEE Intelligent Vehicles Symposium, IV 2023, Anchorage, United States, 4 June 2023. https://doi.org/10.1109/IV55152.2023.10186693
Axmann, J., Moftizadeh, R., Su, J., Tennstedt, B., Zou, Q., Yuan, Y., Ernst, D., Alkhatib, H., Brenner, C., & Schön, S. (2023). LUCOOP: Leibniz University Cooperative Perception and Urban Navigation Dataset. In IV 2023 - IEEE Intelligent Vehicles Symposium, Proceedings (IEEE Intelligent Vehicles Symposium, Proceedings; Vol. 2023-June). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/IV55152.2023.10186693
Axmann J, Moftizadeh R, Su J, Tennstedt B, Zou Q, Yuan Y et al. LUCOOP: Leibniz University Cooperative Perception and Urban Navigation Dataset. in IV 2023 - IEEE Intelligent Vehicles Symposium, Proceedings. Institute of Electrical and Electronics Engineers Inc. 2023. (IEEE Intelligent Vehicles Symposium, Proceedings). doi: 10.1109/IV55152.2023.10186693
Axmann, Jeldrik ; Moftizadeh, Rozhin ; Su, Jingyao et al. / LUCOOP : Leibniz University Cooperative Perception and Urban Navigation Dataset. IV 2023 - IEEE Intelligent Vehicles Symposium, Proceedings. Institute of Electrical and Electronics Engineers Inc., 2023. (IEEE Intelligent Vehicles Symposium, Proceedings).
BibTeX
@inproceedings{ff32052f02ef42c3a1b58ffb4fe709ae,
title = "LUCOOP: Leibniz University Cooperative Perception and Urban Navigation Dataset",
abstract = "Recently published datasets have been increasingly comprehensive with respect to their variety of simultaneously used sensors, traffic scenarios, environmental conditions, and provided annotations. However, these datasets typically only consider data collected by one independent vehicle. Hence, there is currently a lack of comprehensive, real-world, multi-vehicle datasets fostering research on cooperative applications such as object detection, urban navigation, or multi-agent SLAM. In this paper, we aim to fill this gap by introducing the novel LUCOOP dataset, which provides time-synchronized multi-modal data collected by three interacting measurement vehicles. The driving scenario corresponds to a follow-up setup of multiple rounds in an inner city triangular trajectory. Each vehicle was equipped with a broad sensor suite including at least one LiDAR sensor, one GNSS antenna, and up to three IMUs. Additionally, Ultra-Wide-Band (UWB) sensors were mounted on each vehicle, as well as statically placed along the trajectory enabling both V2V and V2X range measurements. Furthermore, a part of the trajectory was monitored by a total station resulting in a highly accurate reference trajectory. The LUCOOP dataset also includes a precise, dense 3D map point cloud, acquired simultaneously by a mobile mapping system, as well as an LOD2 city model of the measurement area. We provide sensor measurements in a multi-vehicle setup for a trajectory of more than 4 km and a time interval of more than 26 minutes, respectively. Overall, our dataset includes more than 54,000 LiDAR frames, approximately 700,000 IMU measurements, and more than 2.5 hours of 10 Hz GNSS raw measurements along with 1 Hz data from a reference station. Furthermore, we provide more than 6,000 total station measurements over a trajectory of more than 1 km and 1,874 V2V and 267 V2X UWB measurements. Additionally, we offer 3D bounding box annotations for evaluating object detection approaches, as well as highly accurate ground truth poses for each vehicle throughout the measurement campaign.",
keywords = "cooperative positioning, Dataset, georeferencing, GNSS, IMU, LiDAR, localization, multi-agent, object detection, SLAM, urban navigation, UWB",
author = "Jeldrik Axmann and Rozhin Moftizadeh and Jingyao Su and Benjamin Tennstedt and Qianqian Zou and Yunshuang Yuan and Dominik Ernst and Hamza Alkhatib and Claus Brenner and Steffen Sch{\"o}n",
note = "Funding Information: This measurement campaign could not have been carried out without the help of many contributors. At this point, we thank Yuehan Jiang (Institute for Autonomous Cyber-Physical Systems, Hamburg), Franziska Altemeier, Ingo Neumann, S{\"o}ren Vogel, Frederic Hake (all Geodetic Institute, Hannover), Colin Fischer (Institute of Cartography and Geoinformatics, Hannover), Thomas Maschke, Tobias Kersten, Nina Fletling (all Institut f{\"u}r Erdmessung, Hannover), J{\"o}rg Blankenbach (Geodetic Institute, Aachen), Florian Alpen (Hydromapper GmbH), Allison Kealy (Victorian Department of Environment, Land, Water and Planning, Melbourne), G{\"u}nther Retscher, Jelena Gabela (both Department of Geodesy and Geoinformation, Wien), Wenchao Li (Solinnov Pty Ltd), Adrian Bingham (Applied Artificial Intelligence Institute, Burwood), and the student assistants Manuel Kramer, Khaled Ahmed, Leonard G{\"o}ttert, Dennis Mu{\ss}gnug, Chengqi Zhou, and Weicheng Zhang. Thanks to the Landesamt f{\"u}r Geoinformation und Landesvermessung Niedersachsen (LGLN)/Zentrale Stelle SAPOS{\textregistered} for providing the virtual reference station data, infrastructure and reliable high quality. This project is supported by the German Research Foundation (DFG), as part of the Research Training Group i.c.sens, GRK 2159, ”Integrity and Collaboration in Dynamic Sensor Networks”.; 34th IEEE Intelligent Vehicles Symposium, IV 2023 ; Conference date: 04-06-2023 Through 07-06-2023",
year = "2023",
month = jul,
day = "27",
doi = "10.1109/IV55152.2023.10186693",
language = "English",
isbn = "979-8-3503-4692-3",
series = "IEEE Intelligent Vehicles Symposium, Proceedings",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
booktitle = "IV 2023 - IEEE Intelligent Vehicles Symposium, Proceedings",
address = "United States",

}

RIS

TY - GEN

T1 - LUCOOP: Leibniz University Cooperative Perception and Urban Navigation Dataset

T2 - 34th IEEE Intelligent Vehicles Symposium, IV 2023

AU - Axmann, Jeldrik

AU - Moftizadeh, Rozhin

AU - Su, Jingyao

AU - Tennstedt, Benjamin

AU - Zou, Qianqian

AU - Yuan, Yunshuang

AU - Ernst, Dominik

AU - Alkhatib, Hamza

AU - Brenner, Claus

AU - Schön, Steffen

N1 - Funding Information: This measurement campaign could not have been carried out without the help of many contributors. At this point, we thank Yuehan Jiang (Institute for Autonomous Cyber-Physical Systems, Hamburg), Franziska Altemeier, Ingo Neumann, Sören Vogel, Frederic Hake (all Geodetic Institute, Hannover), Colin Fischer (Institute of Cartography and Geoinformatics, Hannover), Thomas Maschke, Tobias Kersten, Nina Fletling (all Institut für Erdmessung, Hannover), Jörg Blankenbach (Geodetic Institute, Aachen), Florian Alpen (Hydromapper GmbH), Allison Kealy (Victorian Department of Environment, Land, Water and Planning, Melbourne), Günther Retscher, Jelena Gabela (both Department of Geodesy and Geoinformation, Wien), Wenchao Li (Solinnov Pty Ltd), Adrian Bingham (Applied Artificial Intelligence Institute, Burwood), and the student assistants Manuel Kramer, Khaled Ahmed, Leonard Göttert, Dennis Mußgnug, Chengqi Zhou, and Weicheng Zhang. Thanks to the Landesamt für Geoinformation und Landesvermessung Niedersachsen (LGLN)/Zentrale Stelle SAPOS® for providing the virtual reference station data, infrastructure and reliable high quality. This project is supported by the German Research Foundation (DFG), as part of the Research Training Group i.c.sens, GRK 2159, ”Integrity and Collaboration in Dynamic Sensor Networks”.

PY - 2023/7/27

Y1 - 2023/7/27

N2 - Recently published datasets have been increasingly comprehensive with respect to their variety of simultaneously used sensors, traffic scenarios, environmental conditions, and provided annotations. However, these datasets typically only consider data collected by one independent vehicle. Hence, there is currently a lack of comprehensive, real-world, multi-vehicle datasets fostering research on cooperative applications such as object detection, urban navigation, or multi-agent SLAM. In this paper, we aim to fill this gap by introducing the novel LUCOOP dataset, which provides time-synchronized multi-modal data collected by three interacting measurement vehicles. The driving scenario corresponds to a follow-up setup of multiple rounds in an inner city triangular trajectory. Each vehicle was equipped with a broad sensor suite including at least one LiDAR sensor, one GNSS antenna, and up to three IMUs. Additionally, Ultra-Wide-Band (UWB) sensors were mounted on each vehicle, as well as statically placed along the trajectory enabling both V2V and V2X range measurements. Furthermore, a part of the trajectory was monitored by a total station resulting in a highly accurate reference trajectory. The LUCOOP dataset also includes a precise, dense 3D map point cloud, acquired simultaneously by a mobile mapping system, as well as an LOD2 city model of the measurement area. We provide sensor measurements in a multi-vehicle setup for a trajectory of more than 4 km and a time interval of more than 26 minutes, respectively. Overall, our dataset includes more than 54,000 LiDAR frames, approximately 700,000 IMU measurements, and more than 2.5 hours of 10 Hz GNSS raw measurements along with 1 Hz data from a reference station. Furthermore, we provide more than 6,000 total station measurements over a trajectory of more than 1 km and 1,874 V2V and 267 V2X UWB measurements. Additionally, we offer 3D bounding box annotations for evaluating object detection approaches, as well as highly accurate ground truth poses for each vehicle throughout the measurement campaign.

AB - Recently published datasets have been increasingly comprehensive with respect to their variety of simultaneously used sensors, traffic scenarios, environmental conditions, and provided annotations. However, these datasets typically only consider data collected by one independent vehicle. Hence, there is currently a lack of comprehensive, real-world, multi-vehicle datasets fostering research on cooperative applications such as object detection, urban navigation, or multi-agent SLAM. In this paper, we aim to fill this gap by introducing the novel LUCOOP dataset, which provides time-synchronized multi-modal data collected by three interacting measurement vehicles. The driving scenario corresponds to a follow-up setup of multiple rounds in an inner city triangular trajectory. Each vehicle was equipped with a broad sensor suite including at least one LiDAR sensor, one GNSS antenna, and up to three IMUs. Additionally, Ultra-Wide-Band (UWB) sensors were mounted on each vehicle, as well as statically placed along the trajectory enabling both V2V and V2X range measurements. Furthermore, a part of the trajectory was monitored by a total station resulting in a highly accurate reference trajectory. The LUCOOP dataset also includes a precise, dense 3D map point cloud, acquired simultaneously by a mobile mapping system, as well as an LOD2 city model of the measurement area. We provide sensor measurements in a multi-vehicle setup for a trajectory of more than 4 km and a time interval of more than 26 minutes, respectively. Overall, our dataset includes more than 54,000 LiDAR frames, approximately 700,000 IMU measurements, and more than 2.5 hours of 10 Hz GNSS raw measurements along with 1 Hz data from a reference station. Furthermore, we provide more than 6,000 total station measurements over a trajectory of more than 1 km and 1,874 V2V and 267 V2X UWB measurements. Additionally, we offer 3D bounding box annotations for evaluating object detection approaches, as well as highly accurate ground truth poses for each vehicle throughout the measurement campaign.

KW - cooperative positioning

KW - Dataset

KW - georeferencing

KW - GNSS

KW - IMU

KW - LiDAR

KW - localization

KW - multi-agent

KW - object detection

KW - SLAM

KW - urban navigation

KW - UWB

UR - http://www.scopus.com/inward/record.url?scp=85168004032&partnerID=8YFLogxK

U2 - 10.1109/IV55152.2023.10186693

DO - 10.1109/IV55152.2023.10186693

M3 - Conference contribution

AN - SCOPUS:85168004032

SN - 979-8-3503-4692-3

T3 - IEEE Intelligent Vehicles Symposium, Proceedings

BT - IV 2023 - IEEE Intelligent Vehicles Symposium, Proceedings

PB - Institute of Electrical and Electronics Engineers Inc.

Y2 - 4 June 2023 through 7 June 2023

ER -
