International Journal on Science and Technology
E-ISSN: 2229-7677
Impact Factor: 9.88
A Widely Indexed Open Access Peer Reviewed Multidisciplinary Bi-monthly Scholarly International Journal
Calibration and Uncertainty Quantification for Safety-Critical Systems
| Author(s) | Mr. Vibhor Pundhir, Mr. Anurag Jethi, Mr. Aayush Kumar, Mr. Adarsh Singh Chauhan, Ms. Akshita Sahu, Dr. Kavitha Vijay |
|---|---|
| Country | India |
| Abstract | Deep neural networks have transformed numerous disciplines, achieving unprecedented accuracy on demanding tasks such as medical image analysis and autonomous-vehicle perception. Yet despite their strong classification accuracy, modern networks are often poorly calibrated: their predictions are overconfident and fail to reflect the true underlying uncertainty, which seriously undermines system reliability and trustworthiness. This study offers a thorough examination of state-of-the-art calibration and uncertainty quantification techniques aimed at guaranteeing dependable confidence estimates in safety-critical systems. We systematically survey the fundamental concepts of model calibration, examine the root causes of miscalibration in modern deep neural networks, and discuss calibration methods spanning post-hoc techniques such as temperature scaling, Platt scaling, and isotonic regression; regularisation approaches including label smoothing, mixup, and focal loss; and uncertainty estimation frameworks including Bayesian neural networks, Monte Carlo dropout, deep ensembles, and conformal prediction. We provide an in-depth review of the metrics used to assess calibration quality, such as expected calibration error (ECE), maximum calibration error (MCE), the Brier score, and log loss. We also investigate practical applications in medical imaging and autonomous driving, and identify open challenges and directions for future research. The results show that several calibration techniques are effective and complementary: temperature scaling improves calibration at negligible computational cost, ensemble approaches perform strongly, and conformal prediction provides theoretical coverage guarantees. |
| Keywords | Deep Learning, Model Calibration, Uncertainty Quantification, Safety-Critical Systems, Neural Network Reliability, Medical Imaging, Autonomous Driving, Confidence Estimation |
| Field | Computer > Artificial Intelligence / Simulation / Virtual Reality |
| Published In | Volume 16, Issue 4, October-December 2025 |
| Published On | 2025-11-13 |
| DOI | https://doi.org/10.71097/IJSAT.v16.i4.9291 |
| Short DOI | https://doi.org/hbbmz6 |
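The abstract highlights temperature scaling as a low-cost post-hoc calibration method and ECE as a standard evaluation metric. As an illustrative sketch (not taken from the paper itself), the two can be implemented in a few lines of NumPy: a single scalar temperature `T` is fitted on held-out logits by minimising negative log-likelihood, and ECE is computed by binning predictions by confidence. The helper names (`fit_temperature`, `expected_calibration_error`) and the grid-search fitting strategy are assumptions for this sketch, not the authors' code.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the class axis.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def nll(logits, labels, T):
    # Negative log-likelihood of temperature-scaled predictions.
    p = softmax(logits / T)
    return -np.log(p[np.arange(len(labels)), labels] + 1e-12).mean()

def fit_temperature(logits, labels, grid=np.linspace(0.5, 5.0, 91)):
    # Hypothetical helper: grid search for the scalar T that minimises
    # validation NLL (in practice one would use e.g. L-BFGS).
    return min(grid, key=lambda T: nll(logits, labels, T))

def expected_calibration_error(probs, labels, n_bins=15):
    # ECE: bin samples by confidence, then average the gap between
    # per-bin accuracy and per-bin mean confidence, weighted by bin size.
    conf = probs.max(axis=1)
    acc = (probs.argmax(axis=1) == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            ece += mask.mean() * abs(acc[mask].mean() - conf[mask].mean())
    return ece
```

On synthetic logits that have been artificially sharpened (multiplied by a constant), fitting recovers a temperature above 1 and dividing the logits by it reduces the measured ECE, mirroring the abstract's claim that temperature scaling improves calibration with negligible compute.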