System-Level Analysis of Module Uncertainty Quantification in the Autonomy Pipeline
Abstract
Modern autonomous systems with machine learning components often use uncertainty quantification to help produce assurances about system operation. However, there is a lack of consensus in the community on what uncertainty is and how to perform uncertainty quantification. In this work, we propose that uncertainty measures should be understood within the context of overall system design and operation. To this end, we present two novel analysis techniques. First, we derive a probabilistic specification on a module's uncertainty measure given a system specification. Second, we propose a method to measure a system's input-output robustness in order to compare system designs and quantify the impact of making a system uncertainty-aware. In addition to this theoretical work, we apply these analyses to two real-world autonomous systems: an autonomous driving system and an aircraft runway incursion detection system. We show that our analyses can determine desired relationships between module uncertainty and error, provide visualizations of how well an uncertainty measure is being used by a system, produce principled comparisons between different uncertainty measures and decision-making algorithm designs, and provide insights into system vulnerabilities and tradeoffs.
Topics & Keywords
Authors (8)
Sampada Deglurkar
Haotian Shen
Anish Muthali
Marco Pavone
Dragos Margineantu
Peter Karkus
Boris Ivanovic
Claire J. Tomlin
Quick Access
- Year Published: 2024
- Language: en
- Source Database: arXiv
- Access: Open Access ✓