Ultra-Fast Visible Lane Distance Estimation Using a Single Camera

Accurate distance estimation is crucial for the safety and efficiency of intelligent and future autonomous mobility. In this paper, we propose a method for real-time estimation of the visible lane distance based solely on image data from a single camera and an existing road lane marking estimation system. By leveraging the camera intrinsics and the camera's height above the road, we exploit lane parallelism and the geometric properties of the camera frames to compute the distance to the furthest visible lane point. The horizon level is recalculated for every frame, accommodating the variations encountered when driving uphill or downhill. Moreover, the proposed solution is theoretically adaptable to gauging the distance to any object detected in the camera frame, not only road lanes. We evaluate the effectiveness of our method on a dataset created for this purpose, comparing the estimates against ground truth obtained from high-precision GPS measurements. The results indicate estimation performance comparable to other works. Coupled with the capability for real-time implementation in vehicles, this underscores the potential of our approach for advanced driver assistance systems (ADAS) in areas such as safe driving speeds, obstacle avoidance, and collision prevention.
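The geometry the abstract alludes to can be sketched with the standard flat-ground pinhole relation, where the distance to a point on the road grows as its image row approaches the horizon. This is an illustrative sketch only: the symbols (vertical focal length `f_y`, camera height `h`, horizon row `y_horizon`) and the function itself are assumptions, not the paper's exact formulation.

```python
def ground_distance(y_pixel, f_y, cam_height, y_horizon):
    """Distance along a flat road to a ground-plane point imaged at row y_pixel.

    Standard flat-ground pinhole model: d = f_y * h / (y - y_horizon).
    Names and formulation are illustrative, not the paper's notation.
    """
    dy = y_pixel - y_horizon
    if dy <= 0:
        raise ValueError("Point lies at or above the horizon; distance undefined.")
    return f_y * cam_height / dy

# Example: 1000 px focal length, camera mounted 1.5 m above the road,
# horizon at row 540, furthest visible lane point detected at row 740.
print(ground_distance(740, 1000.0, 1.5, 540))  # -> 7.5 (metres)
```

Note how the distance diverges as the observed row approaches the horizon row, which is why a per-frame horizon estimate matters for far points.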

This research contributes to our ongoing in-lab efforts to develop robust and practical solutions for autonomous mobility applications. Our study addresses the challenge of estimating the visible driving lane distance accurately and efficiently (in real time) using only a single camera. The proposed algorithm was validated on a dataset created specifically for this paper, with high-precision GPS data from the vehicles involved in the experiment serving as the ground truth.

Compared to recent works, the proposed algorithm stands out for its versatility: it can be used effectively in a range of scenarios, in particular because it accommodates the variations in horizon level encountered when driving uphill or downhill.
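One common way to recompute the horizon per frame, consistent with the lane-parallelism idea mentioned above, is to intersect two detected lane lines in the image: physically parallel lanes meet at a vanishing point whose row gives the current horizon level. The sketch below uses homogeneous coordinates and cross products; the function name, inputs, and example coordinates are all illustrative assumptions, and the paper's actual recalculation may differ.

```python
import numpy as np

def horizon_row(lane_a, lane_b):
    """Estimate the horizon row as the y-coordinate of the vanishing point
    where two physically parallel lane lines intersect in the image.

    Each lane is given as two image points ((x1, y1), (x2, y2)).
    Illustrative sketch; not the paper's exact per-frame procedure.
    """
    to_h = lambda p: np.array([p[0], p[1], 1.0])    # homogeneous point
    line = lambda p, q: np.cross(to_h(p), to_h(q))  # line through two points
    vp = np.cross(line(*lane_a), line(*lane_b))     # lines meet at vanishing point
    return vp[1] / vp[2]                            # dehomogenize the y-coordinate

# Two lane markings converging as the road rises toward the horizon:
left = ((300, 720), (600, 400))
right = ((1000, 720), (700, 400))
print(round(horizon_row(left, right), 2))  # -> 346.67
```

Re-estimating this row every frame is what lets a flat-ground distance model stay usable on uphill and downhill stretches, where a fixed horizon assumption would bias the estimates.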

The proposed algorithm achieves both precise distance estimation (2.78 m average RMSE) and minimal computation time (1897 frames per second on an NVIDIA Jetson Orin Nano).

The research was published as an open-access article in the journal IEEE Access and can be found here: https://ieeexplore.ieee.org/document/10547209