Integration of Sensor Array Techniques for Efficient Tracking of Approaching Individuals and Objects: A Technological Perspective
Abstract
Conventional object-tracking systems struggle with accuracy, noise filtering, and object categorization. This work addresses these issues by integrating several sensor modalities, namely LiDAR, RADAR, Infrared (IR), and Ultrasonic sensors, within a single tracking framework. The research aims to improve tracking accuracy via weighted sensor fusion, apply an optimized Kalman filter for noise suppression, and use DBSCAN clustering for improved object classification. Performance is measured with Mean Absolute Error (MAE), Root Mean Square Error (RMSE), and Signal-to-Noise Ratio (SNR) metrics. An experimental research design is followed, combining real-world sensor data with computational modeling. Each sensor's contribution is weighted according to its capabilities, Kalman filtering refines the trajectory estimates, and DBSCAN clustering separates distinct objects. Data are collected through controlled experiments and benchmark comparisons. Results show a 30% improvement in detection accuracy over single-sensor methods: the Kalman filter reduces trajectory error by 25%, and DBSCAN achieves 85% accuracy in human-object classification. Performance metrics confirm low error (MAE: 2.221 m, RMSE: 2.758 m) and effective noise rejection (SNR: 4.297 dB). This work advances multi-sensor tracking by improving sensor fusion, motion estimation, and classification efficiency. The system's adaptability to varying conditions makes it useful in security, autonomous navigation, and industrial automation. Integrating deep learning-based sensor fusion is a possible direction for future work.
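The fusion-and-filtering pipeline the abstract describes can be illustrated with a minimal sketch. Everything below is hypothetical: the 1-D trajectory, the per-modality noise levels, and the Kalman tunings `Q` and `R` are assumptions for demonstration, not values from the study. The sketch uses inverse-variance weighting (quieter sensors count more) followed by a scalar Kalman filter over the fused stream; the paper's DBSCAN classification stage is omitted here.

```python
import random

random.seed(0)

# Assumed 1-D trajectory of an approaching object (metres); illustrative only.
true_pos = [100.0 - 1.5 * k for k in range(50)]

# Hypothetical per-modality measurement noise (standard deviation, metres).
noise = {"lidar": 0.5, "radar": 1.0, "ir": 2.0, "ultrasonic": 3.0}
readings = {name: [p + random.gauss(0, s) for p in true_pos]
            for name, s in noise.items()}

# Weighted fusion: inverse-variance weights, normalized to sum to 1.
raw_w = {name: 1.0 / s ** 2 for name, s in noise.items()}
total = sum(raw_w.values())
weights = {name: w / total for name, w in raw_w.items()}
fused = [sum(weights[name] * readings[name][k] for name in noise)
         for k in range(len(true_pos))]

# Scalar Kalman filter over the fused stream (Q, R are assumed tunings).
x, P = fused[0], 1.0      # state estimate and its variance
Q, R = 0.1, 0.5           # process / measurement noise covariances
smoothed = []
for z in fused:
    P += Q                # predict: variance grows by process noise
    K = P / (P + R)       # Kalman gain
    x += K * (z - x)      # update toward the fused measurement
    P *= 1.0 - K
    smoothed.append(x)

mae = sum(abs(a - b) for a, b in zip(smoothed, true_pos)) / len(true_pos)
print(f"illustrative MAE after fusion + filtering: {mae:.3f} m")
```

Note the design choice: inverse-variance weighting is one common way to assign "weighted contributions according to capabilities", but the paper's actual weighting scheme may differ.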