Full metadata record

DC Field Value Language
dc.contributor.author Bae, Hyunjin -
dc.contributor.author Lee, Gu -
dc.contributor.author Yang, Jaeseung -
dc.contributor.author Shin, Gwanjun -
dc.contributor.author Choi, Gyeungho -
dc.contributor.author Lim, Yongseob -
dc.date.accessioned 2021-10-07T03:00:21Z -
dc.date.available 2021-10-07T03:00:21Z -
dc.date.created 2021-05-14 -
dc.date.issued 2021-05 -
dc.identifier.issn 1424-8220 -
dc.identifier.uri http://hdl.handle.net/20.500.11750/15422 -
dc.description.abstract In autonomous driving, recognizing preceding vehicles at middle and long distances with a variety of sensors helps improve driving performance and enables additional functions. However, if only LiDAR or only a camera is used in the recognition stage, the limitations of each sensor make it difficult to obtain the necessary data. In this paper, we propose a method for converting vision-tracked data into bird’s eye-view (BEV) coordinates using the equation that projects LiDAR points onto the image (sketched after this record), together with a method for fusing the LiDAR and vision-tracked data. The proposed method proved effective in detecting the closest in-path vehicle (CIPV) in various situations. Moreover, in experiments following the Euro NCAP autonomous emergency braking (AEB) test protocol, the fused result improved perception and thereby AEB performance compared with using LiDAR alone. The performance of the proposed method was verified through real-vehicle tests in various scenarios. Consequently, the results show that the proposed sensor fusion method significantly improves the adaptive cruise control (ACC) function in autonomous maneuvering, and we expect this gain in perception performance to improve the overall stability of ACC. © 2021 by the authors. Licensee MDPI, Basel, Switzerland. -
dc.language English -
dc.publisher MDPI -
dc.title Estimation of the closest in-path vehicle by low-channel LiDAR and camera sensor fusion for autonomous vehicles -
dc.type Article -
dc.identifier.doi 10.3390/s21093124 -
dc.identifier.scopusid 2-s2.0-85104947549 -
dc.identifier.bibliographicCitation Sensors, v.21, no.9 -
dc.description.isOpenAccess TRUE -
dc.subject.keywordAuthor Autonomous emergency braking (AEB) test -
dc.subject.keywordAuthor Bird’s eye-view (BEV) -
dc.subject.keywordAuthor Closest in-path vehicle (CIPV) -
dc.subject.keywordAuthor Sensor fusion -
dc.subject.keywordAuthor Alignment of point clouds to images -
dc.subject.keywordPlus Braking performance -
dc.subject.keywordPlus Autonomous driving -
dc.subject.keywordPlus Cameras -
dc.subject.keywordPlus Maneuverability -
dc.subject.keywordPlus Optical radar -
dc.subject.keywordPlus Camera sensor -
dc.subject.keywordPlus Cognitive performance -
dc.subject.keywordPlus Driving performance -
dc.subject.keywordPlus Overall stabilities -
dc.subject.keywordPlus Test protocols -
dc.subject.keywordPlus Speed control -
dc.subject.keywordPlus Adaptive cruise control (ACC) -
dc.subject.keywordPlus Various functions -
dc.subject.keywordPlus Autonomous vehicles -
dc.subject.keywordPlus Adaptive cruise control -
dc.subject.keywordPlus Automobile drivers -
dc.subject.keywordPlus Automobile testing -
dc.citation.number 9 -
dc.citation.title Sensors -
dc.citation.volume 21 -
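
The abstract refers to an equation that projects LiDAR points onto the camera image, which is also used to map vision-tracked data into BEV coordinates. The paper's exact formulation is not reproduced in this record; the sketch below assumes the standard pinhole model p ~ K(RX + t), and every name in it (project_lidar_to_image, K, R, t) is illustrative rather than taken from the paper.

    import numpy as np

    def project_lidar_to_image(points_lidar, K, R, t):
        """Project Nx3 LiDAR-frame points to pixel coordinates.

        Assumes the standard pinhole model p ~ K (R X + t), where
        R (3x3) and t (3,) are the LiDAR-to-camera extrinsics and
        K is the 3x3 camera intrinsic matrix. Hypothetical sketch;
        the paper's calibration and notation may differ.
        """
        # Transform LiDAR-frame points into the camera frame.
        pts_cam = points_lidar @ R.T + t            # (N, 3)

        # Keep only points in front of the camera (positive depth).
        pts_cam = pts_cam[pts_cam[:, 2] > 0]

        # Apply intrinsics, then perspective division by depth.
        pix_h = pts_cam @ K.T                       # (N, 3) homogeneous
        pixels = pix_h[:, :2] / pix_h[:, 2:3]       # (N, 2) u, v
        return pixels, pts_cam[:, 2]                # pixels and depths

Inverting this projection for a point constrained to a known ground plane yields a BEV position for an image-tracked object, which is presumably how the vision tracks are brought into the LiDAR's BEV frame before fusion.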
