
A Novel Human Detection Scheme and Occlusion Reasoning using LIDAR-RADAR Sensor Fusion

Translated Title (Korean)
A Novel Human Detection Scheme and Occlusion Reasoning using LIDAR-RADAR Sensor Fusion
Author
Kwon, Seong Kyung
DGIST Authors
Kwon, Seong Kyung; Son, Sang Hyuk; Lee, Jong Hun; Lee, Jin Hee
Degree Date
2017. 2
Keywords
Pedestrian detection; LIDAR-RADAR sensor fusion; Human Characteristic Function; Occluded object detection; Occluded depth generation
Human detection technologies are widely used in smart homes and autonomous vehicles, and object detection is critical to the safety of pedestrians and drivers in autonomous driving. To detect humans, however, autonomous-vehicle researchers have relied on high-resolution LIDAR, while smart-home researchers have applied cameras with a narrow detection range. Despite advances in sensors and sensor fusion aimed at improving detection accuracy, detecting occluded pedestrians remains a challenging problem. Conventional occluded-pedestrian detection has relied on cameras, which extract characteristics of objects such as their color and contour; cameras, however, are highly sensitive to environmental changes and require complex image processing. LIDAR-RADAR fusion, by contrast, has mainly been used to recognize moving vehicles, since it can estimate their velocities via the Doppler effect, and it is robust to environmental changes and weather conditions. To the best of our knowledge, occluded pedestrian detection using LIDAR-RADAR fusion has not yet been reported: existing studies employ camera-based methods, with the environmental sensitivity and heavy image processing those entail. To solve these problems, we propose a new occlusion-reasoning method based on occluded depth generation, utilizing LIDAR-RADAR sensor fusion. To classify humans, we also propose a novel method using a low-cost, low-resolution LIDAR that detects humans quickly and precisely without complex learning algorithms or additional devices. In other words, humans are distinguished from other objects by a new human characteristic function, extracted empirically from measured characteristics of the human body.
The proposed method consists of object detection, occluded depth generation, and occluded pedestrian detection. Occluded depth generation is an effective means of locating areas hidden behind obstacles. Objects within the occluded depth are detected by RADAR, and an occluded object is estimated to be a pedestrian from the distinctive human Doppler distribution measured by RADAR. In addition, the proposed method requires less computation than conventional learning methods because it generates a precise fusion ROI (Region of Interest) by combining the ROIs of each sensor. An occluded pedestrian can therefore be estimated by applying the RADAR Doppler pattern and the LIDAR human characteristic curve within the fusion ROI. We verified the effectiveness of the proposed algorithm through a number of experiments. ⓒ 2017 DGIST
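Two of the steps above can be sketched in a few lines: fusing per-sensor ROIs by intersection, and testing whether a RADAR return falls inside the occluded depth (the shadow region behind a LIDAR-detected obstacle, as seen from the sensor). This is a minimal sketch under assumed representations (axis-aligned ROIs, an obstacle given as range plus a bearing interval); the function names and geometry are illustrative, not the thesis's implementation.

```python
# Illustrative sketch (not the thesis's implementation) of two steps:
# (1) fuse LIDAR and RADAR ROIs by intersection, (2) test whether a
# RADAR return lies in the occluded depth behind a LIDAR obstacle.

def fuse_roi(lidar_roi, radar_roi):
    """ROIs as (x_min, x_max, y_min, y_max); returns their
    intersection, or None when they do not overlap."""
    x0 = max(lidar_roi[0], radar_roi[0])
    x1 = min(lidar_roi[1], radar_roi[1])
    y0 = max(lidar_roi[2], radar_roi[2])
    y1 = min(lidar_roi[3], radar_roi[3])
    if x0 >= x1 or y0 >= y1:
        return None
    return (x0, x1, y0, y1)

def in_occluded_depth(target_range, target_bearing, obstacle):
    """Occluded depth modeled as the angular sector behind an obstacle.
    obstacle = (range_m, bearing_min_deg, bearing_max_deg)."""
    obs_range, b_min, b_max = obstacle
    return target_range > obs_range and b_min <= target_bearing <= b_max

roi = fuse_roi((0.0, 5.0, 0.0, 3.0), (2.0, 7.0, 1.0, 4.0))
print(roi)                                          # (2.0, 5.0, 1.0, 3.0)
print(in_occluded_depth(12.0, 10.0, (8.0, 5.0, 15.0)))  # True: behind obstacle
```

Intersecting the two ROIs before classification is what keeps the computation low: the Doppler-pattern and human-characteristic tests only run inside the small fused region rather than over the full field of view.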
Table Of Contents
I. Introduction
II. Related Work
III. LIDAR-based Human Detection
  1. Conventional LIDAR-based human detection methods
  2. Human Characteristic Function
  3. Experiments
  4. Conclusion
IV. Occluded Pedestrian Detection
  1. Conventional Occluded Object Detection Systems
  2. Occluded Depth based Occlusion Reasoning Scheme
  3. Experiments
  4. Conclusion
V. Summary and Future work
References
Abstract (in Korean)
Department
Information and Communication Engineering
Type
Theses (Master)

