Leveraging Dynamic Objects for Relative Localization Correction in a Connected Autonomous Vehicle Network

Publication: Journal contribution › Conference article in journal › Research › Peer-reviewed

Authorship


Details

Original language: English
Pages (from-to): 101-109
Number of pages: 9
Journal: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Volume: 5
Issue number: 1
Early online date: 17 May 2022
Publication status: Published - 2022
Event: 2022 24th ISPRS Congress on Imaging Today, Foreseeing Tomorrow, Commission I - Nice, France
Duration: 6 June 2022 - 11 June 2022

Abstract

Highly accurate localization is crucial for the safety and reliability of autonomous driving, especially for the information fusion of collective perception, which aims to further improve road safety by sharing information in a communication network of Connected Autonomous Vehicles (CAVs). In this scenario, small localization errors can impose additional difficulty on fusing the information from different CAVs. In this paper, we propose a RANSAC-based (RANdom SAmple Consensus) method to correct the relative localization errors between two CAVs in order to ease the information fusion among the CAVs. Unlike previous LiDAR-based localization algorithms that take only static environmental information into consideration, this method also leverages dynamic objects for localization, thanks to the real-time data sharing between CAVs. Specifically, in addition to static objects such as poles, fences, and facades, the object centers of the detected dynamic vehicles are also used as keypoints for the matching of the two point sets. The experiments on the synthetic dataset COMAP show that the proposed method can greatly decrease the relative localization error between two CAVs to less than 20 cm, as long as enough vehicles and poles are correctly detected by both CAVs. Besides, our proposed method is also highly efficient in runtime and can be used in real-time scenarios of autonomous driving.
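The abstract describes matching two keypoint sets (pole/fence/facade points plus dynamic vehicle centers) with RANSAC to estimate the relative pose correction between two CAVs. The paper's own pipeline is not reproduced here; as a minimal illustrative sketch, the following assumes 2D keypoints with putative one-to-one correspondences already established, and fits a rigid transform (rotation plus translation) with a RANSAC loop over two-point samples. All function names and parameters are hypothetical.

```python
import numpy as np

def estimate_rigid_2d(src, dst):
    """Least-squares 2D rigid transform (rotation + translation) via the Kabsch/SVD method."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

def ransac_align(src, dst, n_iters=200, inlier_tol=0.3, rng=None):
    """RANSAC over putative correspondences src[i] <-> dst[i].

    Mismatched dynamic objects act as outliers and are rejected;
    the final transform is refit on the largest inlier set.
    """
    rng = np.random.default_rng(rng)
    best_inliers = np.zeros(len(src), dtype=bool)
    for _ in range(n_iters):
        sample = rng.choice(len(src), size=2, replace=False)  # 2 points fix a 2D rigid transform
        R, t = estimate_rigid_2d(src[sample], dst[sample])
        resid = np.linalg.norm(src @ R.T + t - dst, axis=1)
        inliers = resid < inlier_tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    R, t = estimate_rigid_2d(src[best_inliers], dst[best_inliers])
    return R, t, best_inliers
```

Two correspondences suffice for a minimal sample because a 2D rigid transform has three degrees of freedom; the 0.3 m inlier tolerance is an arbitrary illustrative choice, not a value from the paper.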

ASJC Scopus subject areas

Cite

Leveraging Dynamic Objects for Relative Localization Correction in a Connected Autonomous Vehicle Network. / Yuan, Y.; Sester, M.
In: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol. 5, No. 1, 2022, pp. 101-109.


Yuan Y, Sester M. Leveraging Dynamic Objects for Relative Localization Correction in a Connected Autonomous Vehicle Network. ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences. 2022;5(1):101-109. Epub 2022 May 17. doi: 10.48550/arXiv.2205.09418, 10.5194/isprs-annals-V-1-2022-101-2022
Yuan, Y.; Sester, M. / Leveraging Dynamic Objects for Relative Localization Correction in a Connected Autonomous Vehicle Network. In: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences. 2022; Vol. 5, No. 1, pp. 101-109.
@article{ba782c4c85b74c68ab06d824fef693c8,
title = "Leveraging Dynamic Objects for Relative Localization Correction in a Connected Autonomous Vehicle Network",
abstract = "Highly accurate localization is crucial for the safety and reliability of autonomous driving, especially for the information fusion of collective perception, which aims to further improve road safety by sharing information in a communication network of Connected Autonomous Vehicles (CAVs). In this scenario, small localization errors can impose additional difficulty on fusing the information from different CAVs. In this paper, we propose a RANSAC-based (RANdom SAmple Consensus) method to correct the relative localization errors between two CAVs in order to ease the information fusion among the CAVs. Unlike previous LiDAR-based localization algorithms that take only static environmental information into consideration, this method also leverages dynamic objects for localization, thanks to the real-time data sharing between CAVs. Specifically, in addition to static objects such as poles, fences, and facades, the object centers of the detected dynamic vehicles are also used as keypoints for the matching of the two point sets. The experiments on the synthetic dataset COMAP show that the proposed method can greatly decrease the relative localization error between two CAVs to less than 20 cm, as long as enough vehicles and poles are correctly detected by both CAVs. Besides, our proposed method is also highly efficient in runtime and can be used in real-time scenarios of autonomous driving.",
keywords = "Collective Perception, Localization, Point Cloud, Registration, Sensor Fusion, Sensor Network",
author = "Y. Yuan and M. Sester",
year = "2022",
doi = "10.48550/arXiv.2205.09418",
language = "English",
journal = "ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences",
volume = "5",
pages = "101--109",
number = "1",
issn = "2194-9042",
note = "2022 24th ISPRS Congress on Imaging Today, Foreseeing Tomorrow, Commission I ; Conference date: 06-06-2022 Through 11-06-2022",

}


TY - JOUR

T1 - Leveraging Dynamic Objects for Relative Localization Correction in a Connected Autonomous Vehicle Network

AU - Yuan, Y.

AU - Sester, M.

PY - 2022

Y1 - 2022

N2 - Highly accurate localization is crucial for the safety and reliability of autonomous driving, especially for the information fusion of collective perception, which aims to further improve road safety by sharing information in a communication network of Connected Autonomous Vehicles (CAVs). In this scenario, small localization errors can impose additional difficulty on fusing the information from different CAVs. In this paper, we propose a RANSAC-based (RANdom SAmple Consensus) method to correct the relative localization errors between two CAVs in order to ease the information fusion among the CAVs. Unlike previous LiDAR-based localization algorithms that take only static environmental information into consideration, this method also leverages dynamic objects for localization, thanks to the real-time data sharing between CAVs. Specifically, in addition to static objects such as poles, fences, and facades, the object centers of the detected dynamic vehicles are also used as keypoints for the matching of the two point sets. The experiments on the synthetic dataset COMAP show that the proposed method can greatly decrease the relative localization error between two CAVs to less than 20 cm, as long as enough vehicles and poles are correctly detected by both CAVs. Besides, our proposed method is also highly efficient in runtime and can be used in real-time scenarios of autonomous driving.

AB - Highly accurate localization is crucial for the safety and reliability of autonomous driving, especially for the information fusion of collective perception, which aims to further improve road safety by sharing information in a communication network of Connected Autonomous Vehicles (CAVs). In this scenario, small localization errors can impose additional difficulty on fusing the information from different CAVs. In this paper, we propose a RANSAC-based (RANdom SAmple Consensus) method to correct the relative localization errors between two CAVs in order to ease the information fusion among the CAVs. Unlike previous LiDAR-based localization algorithms that take only static environmental information into consideration, this method also leverages dynamic objects for localization, thanks to the real-time data sharing between CAVs. Specifically, in addition to static objects such as poles, fences, and facades, the object centers of the detected dynamic vehicles are also used as keypoints for the matching of the two point sets. The experiments on the synthetic dataset COMAP show that the proposed method can greatly decrease the relative localization error between two CAVs to less than 20 cm, as long as enough vehicles and poles are correctly detected by both CAVs. Besides, our proposed method is also highly efficient in runtime and can be used in real-time scenarios of autonomous driving.

KW - Collective Perception

KW - Localization

KW - Point Cloud

KW - Registration

KW - Sensor Fusion

KW - Sensor Network

UR - http://www.scopus.com/inward/record.url?scp=85132825218&partnerID=8YFLogxK

U2 - 10.48550/arXiv.2205.09418

DO - 10.48550/arXiv.2205.09418

M3 - Conference article

VL - 5

SP - 101

EP - 109

JO - ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences

JF - ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences

SN - 2194-9042

IS - 1

T2 - 2022 24th ISPRS Congress on Imaging Today, Foreseeing Tomorrow, Commission I

Y2 - 6 June 2022 through 11 June 2022

ER -
