Details
Original language | English |
---|---|
Pages (from - to) | 219-226 |
Number of pages | 8 |
Journal | ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences |
Volume | 5 |
Issue number | 4 |
Publication status | Published - 18 May 2022 |
Event | 2022 24th ISPRS Congress on Imaging Today, Foreseeing Tomorrow, Commission IV - Nice, France; Duration: 6 June 2022 → 11 June 2022 |
Abstract
Collisions and safety are important concepts when dealing with urban designs such as shared spaces. As pedestrians (especially the elderly and people with disabilities) are more vulnerable to accidents, realising an intelligent mobility aid that avoids collisions is a research direction that could improve safety using a wearable device. Moreover, with improvements in visualisation technologies and their capability to render 3D virtual content, AR devices could be used to realise virtual infrastructure and virtual traffic systems. Such devices (e.g., the HoloLens) scan the environment using stereo and ToF (Time-of-Flight) sensors, which in principle can be used to detect surrounding objects, including dynamic agents such as pedestrians; this can serve as a basis for predicting collisions. To envision an AR device as a safety aid and demonstrate its 3D object detection capability (in particular, pedestrian detection), we propose an improvement to the 3D object detection framework Frustum PointNet that incorporates human pose, and apply it to data from an AR device. Using data from such a device in an indoor setting, we conducted a comparative study to investigate how the high-level 2D human pose features in our approach help to improve the detection performance for oriented 3D pedestrian instances over Frustum PointNet.
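The following is a minimal, hypothetical Python sketch of the general idea described in the abstract: a 2D person box and its pose keypoints are used to cut a frustum out of the device's depth point cloud, and a simple pose-derived feature is attached to each point before a PointNet-style network would consume it. The pinhole intrinsics, the COCO-style keypoint layout, the synthetic data and the nearest-keypoint distance feature are assumptions made for illustration only; this is not the fusion architecture evaluated in the paper.

```python
# Illustrative sketch (not the authors' code): lift a 2D person detection and its
# pose keypoints into a frustum of AR-device depth points, then attach a simple
# pose-derived feature to each 3D point as extra input channels.
import numpy as np

def project_points(points_xyz, K):
    """Project Nx3 camera-frame points to Nx2 pixel coordinates (pinhole model)."""
    uv = (K @ points_xyz.T).T          # Nx3 homogeneous image coordinates
    return uv[:, :2] / uv[:, 2:3]      # divide by depth

def frustum_mask(uv, box):
    """Keep points whose projection falls inside the 2D person box (u1, v1, u2, v2)."""
    u1, v1, u2, v2 = box
    return (uv[:, 0] >= u1) & (uv[:, 0] <= u2) & (uv[:, 1] >= v1) & (uv[:, 1] <= v2)

def pose_features(uv, keypoints_2d):
    """Per-point feature: pixel distance to the nearest 2D pose keypoint,
    normalised by the keypoint spread (one possible way to encode pose context)."""
    d = np.linalg.norm(uv[:, None, :] - keypoints_2d[None, :, :], axis=-1)  # N x K
    nearest = d.min(axis=1, keepdims=True)
    scale = np.linalg.norm(keypoints_2d.max(0) - keypoints_2d.min(0)) + 1e-6
    return nearest / scale

if __name__ == "__main__":
    K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])         # assumed intrinsics
    points = np.random.uniform([-2, -1, 1], [2, 1, 5], size=(2048, 3))  # synthetic depth points
    box = (250, 150, 400, 430)                                          # 2D person detection
    keypoints = np.random.uniform([260, 160], [390, 420], size=(17, 2)) # COCO-style pose

    uv = project_points(points, K)
    mask = frustum_mask(uv, box)
    feats = pose_features(uv[mask], keypoints)
    # Nx4 input (xyz + pose feature) that a Frustum-PointNet-style network could take.
    frustum_input = np.hstack([points[mask], feats])
    print(frustum_input.shape)
```

The point of the sketch is only the data flow: 2D pose information enters the pipeline as additional per-point channels, which is one straightforward way such high-level cues could be combined with a frustum point cloud.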
ASJC Scopus subject areas
- Physics and Astronomy (all)
- Instrumentation
- Environmental Science (all)
- Environmental Science (miscellaneous)
- Earth and Planetary Sciences (all)
- Earth and Planetary Sciences (miscellaneous)
Cite
Kamalasanan, V., Feng, Y. & Sester, M.: Improving 3d pedestrian detection for wearable sensor data with 2d human pose. In: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol. 5, No. 4, 18.05.2022, pp. 219-226.
Publication: Contribution to journal › Conference article in journal › Research › Peer review
TY - JOUR
T1 - Improving 3d pedestrian detection for wearable sensor data with 2d human pose
AU - Kamalasanan, V.
AU - Feng, Y.
AU - Sester, M.
PY - 2022/5/18
Y1 - 2022/5/18
N2 - Collisions and safety are important concepts when dealing with urban designs such as shared spaces. As pedestrians (especially the elderly and people with disabilities) are more vulnerable to accidents, realising an intelligent mobility aid that avoids collisions is a research direction that could improve safety using a wearable device. Moreover, with improvements in visualisation technologies and their capability to render 3D virtual content, AR devices could be used to realise virtual infrastructure and virtual traffic systems. Such devices (e.g., the HoloLens) scan the environment using stereo and ToF (Time-of-Flight) sensors, which in principle can be used to detect surrounding objects, including dynamic agents such as pedestrians; this can serve as a basis for predicting collisions. To envision an AR device as a safety aid and demonstrate its 3D object detection capability (in particular, pedestrian detection), we propose an improvement to the 3D object detection framework Frustum PointNet that incorporates human pose, and apply it to data from an AR device. Using data from such a device in an indoor setting, we conducted a comparative study to investigate how the high-level 2D human pose features in our approach help to improve the detection performance for oriented 3D pedestrian instances over Frustum PointNet.
AB - Collisions and safety are important concepts when dealing with urban designs such as shared spaces. As pedestrians (especially the elderly and people with disabilities) are more vulnerable to accidents, realising an intelligent mobility aid that avoids collisions is a research direction that could improve safety using a wearable device. Moreover, with improvements in visualisation technologies and their capability to render 3D virtual content, AR devices could be used to realise virtual infrastructure and virtual traffic systems. Such devices (e.g., the HoloLens) scan the environment using stereo and ToF (Time-of-Flight) sensors, which in principle can be used to detect surrounding objects, including dynamic agents such as pedestrians; this can serve as a basis for predicting collisions. To envision an AR device as a safety aid and demonstrate its 3D object detection capability (in particular, pedestrian detection), we propose an improvement to the 3D object detection framework Frustum PointNet that incorporates human pose, and apply it to data from an AR device. Using data from such a device in an indoor setting, we conducted a comparative study to investigate how the high-level 2D human pose features in our approach help to improve the detection performance for oriented 3D pedestrian instances over Frustum PointNet.
KW - 3D pedestrian detection
KW - augmented reality
KW - human pose estimation
KW - shared space
KW - wearable sensor
UR - http://www.scopus.com/inward/record.url?scp=85132013934&partnerID=8YFLogxK
U2 - 10.5194/isprs-Annals-V-4-2022-219-2022
DO - 10.5194/isprs-Annals-V-4-2022-219-2022
M3 - Conference article
AN - SCOPUS:85132013934
VL - 5
SP - 219
EP - 226
JO - ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
JF - ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
SN - 2194-9042
IS - 4
T2 - 2022 24th ISPRS Congress on Imaging Today, Foreseeing Tomorrow, Commission IV
Y2 - 6 June 2022 through 11 June 2022
ER -