NavFusionZ: Privacy-Enhanced Multisensor Navigation

NavFusionZ merges machine-learning sensor fusion with zero-knowledge proofs for secure navigation. It integrates LiDAR, ultrasonic, and infrared sensors to compute precise, privacy-preserving navigation paths.

  • 12,148 Raised
  • 411 Views
  • 9 Judges

Categories

  • Giza Track


Description

Team Name/Project Name: Nav Fusion Z

Team Members:

  • Kristian Jay Raganas

Project Summary: Enhancing Distance Measurement with Sensor Fusion and Machine Learning

Rationale   

The project aims to resolve the key issue of distance measurement, especially in domains such as autonomous vehicles, robotics, and industrial automation. Traditional single-sensor approaches suffer from measurable inaccuracies because environmental conditions, together with characteristics inherent in the measured objects, strongly affect these instruments. The combined use of multiple sensors (LiDAR, ultrasonic, VCSEL, and infrared) with machine learning algorithms is a promising solution because it improves measurement accuracy by leveraging the individual strengths of each sensor.


Related Works   

The literature review highlights the advancements in sensor technologies and the pivotal role of sensor fusion and machine learning in addressing the accuracy limitations of individual sensors. Key references include Zhang (2010) and Singh and Nagla (2019), who cover sensor classifications and provide a comprehensive review of sensor technologies, respectively. This section establishes the foundation for the research by discussing the significance of integrating multiple sensors and applying machine learning for data-driven enhancements.


Value Proposition   

This project demonstrates that sensor fusion and machine learning can substantially improve distance-measurement precision. The result is enhanced reliability and functionality for systems that depend on precise distance information, enabling safer and more efficient operations across various industries.


Methodology   

The methodology section outlines the experimental design for data collection, utilizing sensors under various conditions and employing machine learning models like Ridge Regression, Random Forest Regression, and KNN Regression for data analysis. The approach is meticulously designed to test the hypothesis that sensor fusion, coupled with machine learning, can significantly improve measurement accuracy.
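As a sketch of this modeling step, the snippet below fits the three named regressors with scikit-learn. The synthetic data and feature layout (four noisy sensor readings per target distance) are assumptions for illustration, not the project's actual dataset:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)
true_dist = rng.uniform(10, 200, size=500)  # ground-truth distances (cm)

# Each row: noisy readings of the same target from four sensors,
# with assumed (hypothetical) noise levels per sensor.
X = np.column_stack([true_dist + rng.normal(0, s, 500) for s in (1, 4, 2, 3)])
X_tr, X_te, y_tr, y_te = train_test_split(X, true_dist, random_state=0)

models = {
    "ridge": Ridge(alpha=1.0),
    "random_forest": RandomForestRegressor(n_estimators=100, random_state=0),
    "knn": KNeighborsRegressor(n_neighbors=5),
}
# R^2 on held-out data; fused multi-sensor input lets each model
# outperform what the noisiest single sensor could deliver alone.
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te) for name, m in models.items()}
```

The same train/evaluate loop applies unchanged once the synthetic arrays are replaced by the collected sensor data.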

Distance Measurement Technologies   

  • LiDAR: Offers high resolution and long range but struggles with environmental conditions such as fog.
  • Ultrasonic: Cost-effective and works in low light, yet limited in angular resolution and range.
  • VCSEL: Provides high-precision measurements but is challenged by external light sources competing with its emitted photons.
  • Infrared: Suited to short-range measurements, yet hindered by non-linearity and sensitivity to environmental conditions.
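Since these sensors have complementary strengths, one common baseline for combining their readings is inverse-variance weighting. The sketch below is illustrative only; the readings and noise variances are assumed values, not measurements from the project:

```python
def fuse_inverse_variance(readings):
    """Fuse (measurement, variance) pairs via inverse-variance weighting.

    Sensors with lower noise variance receive proportionally more weight,
    and the fused variance is never worse than the best single sensor's.
    """
    weights = [1.0 / var for _, var in readings]
    total = sum(weights)
    fused = sum(w * m for (m, _), w in zip(readings, weights)) / total
    return fused, 1.0 / total

# Hypothetical readings (cm) with assumed noise variances:
readings = [
    (101.2, 0.5),  # LiDAR: most accurate here
    (99.0, 4.0),   # ultrasonic: noisier
    (100.5, 1.0),  # infrared: short-range
]
distance, variance = fuse_inverse_variance(readings)
```

The machine-learning models described above can be viewed as learning a richer, condition-dependent version of these fixed weights.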

Zero-Knowledge Proof Applications   

  • Incorporating ZKP can safeguard the privacy and integrity of the data processed by the machine learning models in sensor fusion. This approach is particularly valuable in scenarios where sensitive information is involved, ensuring that the system can prove the accuracy of its measurements without revealing the underlying sensor data.
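A full zero-knowledge proof requires a proving system (in this project's context, Giza's toolchain), but the underlying commit-then-verify idea can be illustrated with a simple hash commitment. This is a sketch only: a hash commitment hides and binds the data, but unlike a real ZKP it still requires revealing the readings at verification time:

```python
import hashlib
import secrets

def commit(readings):
    """Commit to sensor readings without revealing them.

    Returns (commitment, nonce); the random nonce blinds the data so
    the commitment leaks nothing, yet binds the committer to the values.
    """
    nonce = secrets.token_hex(16)
    payload = (nonce + repr(readings)).encode()
    return hashlib.sha256(payload).hexdigest(), nonce

def verify(commitment, nonce, readings):
    """Check that revealed readings match the earlier commitment."""
    payload = (nonce + repr(readings)).encode()
    return hashlib.sha256(payload).hexdigest() == commitment

readings = [101.2, 99.0, 100.5]
c, nonce = commit(readings)
honest = verify(c, nonce, readings)       # honest reveal passes
tampered = verify(c, nonce, [150.0])      # altered data fails
```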

Results and Discussion   

Findings reveal the efficacy of machine learning models in minimizing measurement errors and the superior performance of sensor fusion in enhancing distance measurement accuracy. The discussion also touches on the limitations encountered and the potential of ZKP technology in safeguarding data privacy and integrity within the system.

Recommendations for Future Work   

Concluding that sensor fusion and machine learning markedly improve distance measurement accuracy, the research suggests further exploration into additional sensors and refined machine learning models. The potential expansion of ZKP technology applications promises to bolster data security and privacy in sensitive implementations.

References:

  1. P. Zhang, "Advance Industrial Control Technology," 2010, pp. 73-116, doi:10.1016/B978-1-4377-7807-6.10003-8.
  2. R. Singh and K. S. Nagla, "A modified sensor fusion framework for quantifying and removing the effect of harsh environmental condition for reliable mobile robot mapping," Sensor Review, vol. 39, no. 4, pp. 456-472, 2019, doi:10.1108/SR-10-2018-0272.
  3. A. Carullo and M. Parvis, "An ultrasonic sensor for distance measurement in automotive applications," IEEE Sensors Journal, vol. 1, no. 2, p. 143, 2001, doi:10.1109/JSEN.2001.936931.
  4. V. De Silva, J. Roche, and A. Kondoz, "Robust Fusion of LiDAR and Wide-Angle Camera Data for Autonomous Mobile Robots," Sensors, 2018, doi:10.3390/s18082730.
  5. M. Bijelic, T. Gruber, and W. Ritter, "A Benchmark for LiDAR Sensors in Fog: Is Detection Breaking Down?" in 2018 IEEE Intelligent Vehicles Symposium (IV), pp. 760-767, doi:10.1109/IVS.2018.8500543.


Schematic Diagram: 

- Ultrasonic HC-SR04 connection

- VCSEL connection

- IR connection
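As one concrete example of these connections, the HC-SR04 reports distance as the width of its echo pulse; converting that pulse to centimeters is a small calculation. The board-specific trigger/echo pin wiring is omitted here, so this sketch covers only the conversion step:

```python
SPEED_OF_SOUND_CM_S = 34300  # ~343 m/s in air at roughly 20 °C

def pulse_to_distance_cm(pulse_seconds):
    """Convert an HC-SR04 echo pulse width to distance in cm.

    The echo pulse covers the round trip to the object and back,
    so the one-way distance is half the travelled path.
    """
    return pulse_seconds * SPEED_OF_SOUND_CM_S / 2

# A 2 ms echo pulse corresponds to about 34.3 cm.
d = pulse_to_distance_cm(0.002)
```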
Data Gathering Setup:

This project aims to investigate the impact of several common factors on the accuracy of distance-measuring sensors, specifically examining:

  • Light Exposure and Temperature: Analyzing the sensors' performance in outdoor day and night conditions.
  • Object Size: Assessing accuracy based on the size of the object being detected (small vs. big).
  • Color: Evaluating how object color (black vs. white) affects measurement accuracy.
  • Barrier Transparency: Determining the influence of transparent versus non-transparent barrier materials on sensor readings, using a transparent acrylic sheet as a test case.

Data have been collected across all possible combinations of these conditions for each sensor type under investigation.
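The full factorial design above can be enumerated directly with a Cartesian product; the condition labels below mirror the list, though the exact strings are this sketch's assumption:

```python
from itertools import product

conditions = {
    "light": ["day", "night"],
    "object_size": ["small", "big"],
    "color": ["black", "white"],
    "barrier": ["transparent", "opaque"],
}

# Cartesian product: every combination of the four factors,
# i.e. 2 * 2 * 2 * 2 = 16 measurement settings per sensor.
combinations = [
    dict(zip(conditions, values)) for values in product(*conditions.values())
]
```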

Data Gathering Documentation:

Verifiable Model Conversion using Giza AI Actions (Process and Documentation):

- AI Action Run (Model Training)

- AI Action Run (Prediction Generation)

- Transpilation

- Deployment

- Verification

Github Link: https://github.com/Chanetics/Nav-Fusion-Z
