Title: Improving Post-Training Quantization via Probabilistic Programming
Authors: Liu, Kui; Goossens, Bart; De Schepper, Tom; Philips, Wilfried
Type: Journal article
Date issued: 2025-12 (accessioned/available: 2026-01-08)
ISSN: 1051-8215
DOI: 10.1109/TCSVT.2025.3588737
Web of Science: WOS:001632387500015
Handle: https://imec-publications.be/handle/20.500.12860/58624
IEEE Xplore: https://ieeexplore.ieee.org/document/11080033/

Abstract: Post-training quantization (PTQ) is an effective solution for deploying deep neural networks on edge devices with limited resources. PTQ is especially attractive because it does not require access to the original training dataset, relying instead on a much smaller calibration dataset. However, many existing PTQ methods still require a fairly large calibration dataset (e.g., more than 1000 images) to achieve satisfactory model accuracy. In this paper, we present a novel post-training quantization method that estimates quantization parameters using a Bayesian Maximum A Posteriori (MAP) estimator. By modeling the uncertainty of quantization operations, we formulate neural network quantization as a Bayesian inference problem. In our method, we first employ probabilistic programming techniques to optimize quantization parameters by maximizing the posterior of the quantization step sizes. In addition, we introduce a Minimum Description Length (MDL) prior that favors low quantization bit widths, together with a validation procedure that enhances PTQ performance when learning from small calibration datasets. Comprehensive evaluations demonstrate that the proposed method improves PTQ performance with a minimal calibration dataset of just 64 images and achieves near state-of-the-art PTQ performance. Furthermore, the proposed method shows strong generalization ability when calibrated on one data source and tested across diverse datasets.

Keywords: neural-network quantization; quantization (signal); training; probabilistic logic; calibration; neural networks; optimization; Bayes methods; programming; degradation; adaptation models; post-training quantization; Bayesian optimization; probabilistic programming
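To make the MAP idea in the abstract concrete, the following is a minimal, illustrative sketch of estimating a uniform quantization step size by minimizing a negative log-posterior: a Gaussian model of the quantization error plus an MDL-style penalty that grows with the effective bit width. The Gaussian error model, the `sigma` and `lam` hyperparameters, and the grid search are assumptions made for this toy example; they are not the paper's actual probabilistic-programming implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.1, size=4096)  # stand-in for one layer's weights

def neg_log_posterior(step, w, sigma=0.05, lam=1e-3):
    # Likelihood term: model the quantization error as Gaussian noise.
    q = np.clip(np.round(w / step), -128, 127) * step  # int8-style uniform quantizer
    nll = np.sum((w - q) ** 2) / (2.0 * sigma ** 2)
    # MDL-style prior: description length grows with the bit width
    # needed to cover the weight range at this step size, so larger
    # steps (fewer bits) are favored when the error allows it.
    bits = np.log2(np.ptp(w) / step + 1.0)
    return nll + lam * w.size * bits

# Toy optimization: grid search over candidate step sizes.
steps = np.geomspace(1e-4, 1e-1, 200)
best = min(steps, key=lambda s: neg_log_posterior(s, w))
print(f"MAP step size: {best:.5f}")
```

In a probabilistic-programming setting one would instead declare the step size as a latent variable with the MDL prior and let the framework's inference machinery maximize the posterior; the grid search above is only a stand-in to show how the two terms trade off.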