Full metadata record

DC Field Value Language
dc.contributor.author Kumar, Gurumadaiah Ajay -
dc.contributor.author Lee, Jin Hee -
dc.contributor.author Hwang, Jongrak -
dc.contributor.author Park, Jaehyeong -
dc.contributor.author Yoon, Sung Hoon -
dc.contributor.author Kwon, Soon -
dc.date.accessioned 2020-04-13T06:03:32Z -
dc.date.available 2020-04-13T06:03:32Z -
dc.date.created 2020-03-20 -
dc.date.issued 2020-02 -
dc.identifier.citation Symmetry, v.12, no.2, pp.324 -
dc.identifier.issn 2073-8994 -
dc.identifier.uri http://hdl.handle.net/20.500.11750/11663 -
dc.description.abstract The real-time fusion of light detection and ranging (LiDAR) and camera data is a crucial process in many applications, such as autonomous driving, industrial automation, and robotics. Especially in the case of autonomous vehicles, the efficient fusion of data from these two types of sensors is important for estimating the depth of objects as well as for detecting objects at short and long distances. As the two sensors capture different attributes of the environment simultaneously, integrating those attributes with an efficient fusion approach greatly benefits reliable and consistent perception of the environment. This paper presents a method to estimate the distance (depth) between a self-driving car and other vehicles, objects, and signboards on its path using an accurate fusion approach. Based on geometrical transformation and projection, low-level sensor fusion was performed between the camera and LiDAR using a 3D marker. The fusion information is then used to estimate the distance of objects detected by the RefineDet detector. Finally, the accuracy and performance of the sensor fusion and distance estimation approach were evaluated through quantitative and qualitative analysis in real-road and simulated environment scenarios. The proposed low-level sensor fusion, based on computational geometric transformation and projection for object distance estimation, thus proves to be a promising solution for enabling reliable and consistent environment perception for autonomous vehicles. © 2020 by the authors. -
dc.language English -
dc.publisher MDPI AG -
dc.title LiDAR and Camera Fusion Approach for Object Distance Estimation in Self-Driving Vehicles -
dc.type Article -
dc.identifier.doi 10.3390/sym12020324 -
dc.identifier.wosid 000521147600008 -
dc.identifier.scopusid 2-s2.0-85080856361 -
dc.type.local Article(Overseas) -
dc.type.rims ART -
dc.description.journalClass 1 -
dc.citation.publicationname Symmetry -
dc.contributor.nonIdAuthor Kumar, Gurumadaiah Ajay -
dc.contributor.nonIdAuthor Hwang, Jongrak -
dc.contributor.nonIdAuthor Yoon, Sung Hoon -
dc.identifier.citationVolume 12 -
dc.identifier.citationNumber 2 -
dc.identifier.citationStartPage 324 -
dc.identifier.citationTitle Symmetry -
dc.type.journalArticle Article -
dc.description.isOpenAccess Y -
dc.subject.keywordAuthor computational geometry transformation -
dc.subject.keywordAuthor projection -
dc.subject.keywordAuthor sensor fusion -
dc.subject.keywordAuthor self-driving vehicle -
dc.subject.keywordAuthor sensor calibration -
dc.subject.keywordAuthor depth sensing -
dc.subject.keywordAuthor point cloud to image mapping -
dc.subject.keywordAuthor autonomous vehicle -
dc.subject.keywordPlus EXTRINSIC CALIBRATION -
dc.subject.keywordPlus 3D LIDAR -
dc.subject.keywordPlus REGISTRATION -
dc.subject.keywordPlus SYSTEM -
dc.contributor.affiliatedAuthor Kumar, Gurumadaiah Ajay -
dc.contributor.affiliatedAuthor Lee, Jin Hee -
dc.contributor.affiliatedAuthor Hwang, Jongrak -
dc.contributor.affiliatedAuthor Park, Jaehyeong -
dc.contributor.affiliatedAuthor Yoon, Sung Hoon -
dc.contributor.affiliatedAuthor Kwon, Soon -
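
For illustration, below is a minimal sketch of the kind of point-cloud-to-image mapping the abstract describes: LiDAR points are transformed into the camera frame with a calibrated extrinsic transform, projected through a pinhole intrinsic matrix, and an object's distance is taken from the projected points that fall inside its 2D detection box. The function names (project_lidar_to_image, estimate_box_distance), the calibration values K, R, t, and the median-depth aggregation are assumptions made for this sketch; they are not taken from the paper's actual pipeline or its RefineDet-based detector.

```python
import numpy as np

# Hypothetical calibration: K (3x3 camera intrinsics), R (3x3 rotation) and
# t (3-vector translation) from the LiDAR frame to the camera frame. A real
# pipeline would obtain these from an extrinsic calibration procedure, e.g.
# using a 3D marker as described in the abstract.

def project_lidar_to_image(points_xyz, K, R, t):
    """Project Nx3 LiDAR points (LiDAR frame) onto the image plane.

    Returns pixel coordinates (Mx2) and per-point depth along the camera
    z-axis (M,), keeping only points in front of the camera.
    """
    # Transform points from the LiDAR frame into the camera frame.
    cam = (R @ points_xyz.T + np.asarray(t).reshape(3, 1)).T   # N x 3
    in_front = cam[:, 2] > 0.1                                 # drop points behind the camera
    cam = cam[in_front]
    # Pinhole projection: u = fx*x/z + cx, v = fy*y/z + cy.
    uv_h = (K @ cam.T).T                                       # M x 3 (homogeneous)
    uv = uv_h[:, :2] / uv_h[:, 2:3]
    return uv, cam[:, 2]

def estimate_box_distance(uv, depth, box):
    """Median depth of projected LiDAR points inside a 2D detection box.

    `box` is (x_min, y_min, x_max, y_max) from any 2D object detector.
    """
    x_min, y_min, x_max, y_max = box
    inside = (
        (uv[:, 0] >= x_min) & (uv[:, 0] <= x_max)
        & (uv[:, 1] >= y_min) & (uv[:, 1] <= y_max)
    )
    if not np.any(inside):
        return None                                            # no LiDAR return on this object
    return float(np.median(depth[inside]))
```

As a usage example, one would call project_lidar_to_image once per LiDAR sweep and then estimate_box_distance once per detection box; using the median rather than the mean is a simple way to stay robust to stray points from the background, though the paper may aggregate depth differently.
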
Files in This Item:
2-s2.0-85080856361.pdf
Other data / 14.76 MB / Adobe PDF
Appears in Collections:
Division of Automotive Technology 1. Journal Articles

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.
