A lane departure warning system relying exclusively on a camera has several shortcomings and is sensitive to conditions such as bad weather and abrupt manoeuvres. To handle these situations, the system proposed in this project uses a dynamic model of the vehicle and integrates relative motion sensors to estimate the vehicle's position on the road. The relative motion is measured using vision, inertial, and vehicle sensors. All of these sensor types are affected by errors such as offset, drift, and quantization.
However, the different sensors are sensitive to different types of errors; for example, the camera system is rather poor at detecting rapid lateral movements, a situation that an inertial sensor detects reliably. These complementary properties make sensor fusion attractive. The approach of this project is to combine an existing lane departure warning system, used as the vision sensor, with an inertial measurement unit to produce a robust system that issues reliable warnings when an unintentional lane departure is about to occur.
To combine the sensor data, different sensor fusion models have been proposed and evaluated on experimental data. The models are based on a nonlinear vehicle model that is linearized so that a Kalman filter can be applied. Experiments show that the proposed solutions succeed in handling situations where a system relying solely on a camera would have problems: in testing, the suggested system outperforms the original single-camera lane departure warning system.
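The fusion scheme described above can be illustrated with a minimal sketch. This is not the thesis model; it is a hypothetical, simplified linear example where the state is lateral offset and lateral velocity, IMU lateral acceleration drives the Kalman filter's prediction step, and camera lane measurements correct the offset. All matrices and noise values are illustrative assumptions.

```python
import numpy as np

# Hypothetical two-state fusion sketch (not the thesis model):
# state x = [lateral offset (m), lateral velocity (m/s)].
dt = 0.05  # sample period (assumed)

F = np.array([[1.0, dt],
              [0.0, 1.0]])        # constant-velocity motion model
B = np.array([[0.5 * dt**2],
              [dt]])              # maps IMU lateral acceleration into the state
H = np.array([[1.0, 0.0]])        # camera observes the lateral offset only

Q = 1e-3 * np.eye(2)              # process noise covariance (tuning assumption)
R = np.array([[0.05]])            # camera measurement noise (assumption)

def predict(x, P, a_imu):
    """Propagate the state using the IMU lateral acceleration as input."""
    x = F @ x + B * a_imu
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, y_cam):
    """Correct the estimate with a camera lateral-offset measurement."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x = x + K @ (np.array([[y_cam]]) - H @ x)
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Simulate a steady drift toward the lane edge at 0.2 m/s.
x_est = np.zeros((2, 1))
P = np.eye(2)
true_offset = 0.0
for _ in range(100):
    true_offset += 0.2 * dt
    x_est, P = predict(x_est, P, a_imu=0.0)
    x_est, P = update(x_est, P, y_cam=true_offset)
# The estimated offset converges toward the true drift of 1.0 m.
```

A warning logic would then trigger when the estimated lateral offset approaches the lane boundary; because the prediction step runs on inertial data, the filter keeps tracking through the brief periods where the camera fails.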
Source: Linköping University
Author: Almgren, Erik