DETECTION AND LOCALIZATION OF TRAFFIC LIGHTS USING YOLOV3 AND STEREO VISION
Keywords: traffic lights, detection, localization, convolutional neural network, stereo vision
Abstract. This paper focuses on traffic light distance measurement using a stereo camera, an important and challenging task in image processing with applications in systems such as Driving Safety Support Systems (DSSS), autonomous driving and traffic mobility. We propose an integrated traffic light distance measurement system for self-driving based on stereo image processing. Since detections are only useful if the detected traffic lights can be spatially located, the proposed system integrates detection, colour classification and 3D localization of traffic lights. Detection and colour classification are performed simultaneously by YOLOv3 on RGB images. 3D localization is achieved by estimating the distance from the vehicle to the traffic light using the detector's 2D bounding boxes and the disparity map generated by the stereo camera. Moreover, the Gaussian YOLOv3 weights trained on the KITTI and Berkeley datasets have been replaced with weights trained on the COCO dataset. Because autonomous driving applications require a detector that can cope with mislocalizations, this paper proposes an integrated method that improves detection accuracy and traffic light colour classification while supporting real-time operation by modelling the bounding box (bbox) of YOLOv3. The obtained results are fair within 20 metres of the sensor, while misdetections and misclassifications appear at larger distances.
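For clarity, the sketch below illustrates the kind of depth estimation described in the abstract: reading the disparity values inside a detector bounding box and converting them to a metric distance with the standard stereo relation Z = f·B/d. The focal length, baseline and bounding-box values are illustrative assumptions, not the calibration or detector output used in the paper.

```python
import numpy as np

def traffic_light_distance(disparity_map, bbox, focal_px, baseline_m):
    """Estimate the metric distance to a detected traffic light.

    disparity_map : HxW array of disparities in pixels (from a stereo matcher)
    bbox          : (x_min, y_min, x_max, y_max) detector box in pixel coords
    focal_px      : focal length of the rectified stereo pair, in pixels
    baseline_m    : stereo baseline, in metres
    """
    x_min, y_min, x_max, y_max = bbox
    roi = disparity_map[y_min:y_max, x_min:x_max]

    # Keep only valid (positive) disparities and use the median to be
    # robust against background pixels inside the bounding box.
    valid = roi[roi > 0]
    if valid.size == 0:
        return None
    d = np.median(valid)

    # Standard pinhole-stereo relation: Z = f * B / d
    return focal_px * baseline_m / d


if __name__ == "__main__":
    # Hypothetical example: a synthetic disparity patch around one detection.
    disparity = np.zeros((480, 640))
    disparity[100:140, 300:330] = 24.0           # traffic light region
    bbox = (300, 100, 330, 140)                  # from the 2D detector
    dist = traffic_light_distance(disparity, bbox, focal_px=720.0, baseline_m=0.54)
    print(f"Estimated distance: {dist:.1f} m")   # 720 * 0.54 / 24 = 16.2 m
```

Using the median disparity inside the box, rather than a single pixel, reduces the influence of background regions that fall within the detector's rectangle, which becomes increasingly important at the larger distances where the abstract reports degraded accuracy.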