Journal article in Information Fusion, 2025. LIRMM, Laboratoire d’Informatique, de Robotique et de Microélectronique de Montpellier.

Integrating imprecise data in generative models using interval-valued Variational Autoencoders

Olivier Strauss

Abstract

Variational Autoencoders (VAEs) enable the integration of diverse data sources into a unified latent representation, facilitating the fusion of information from various inputs and the creation of disentangled representations that separate different factors of variation in the data. Traditional VAEs, however, are limited by assuming a single prior distribution for latent variables, which restricts their ability to handle epistemic uncertainty from imprecise measurements and incomplete data. This paper introduces the Interval-Valued Variational Autoencoder (iVAE), which employs a family of prior distributions and incorporates specialized neurons and redefined objective functions for handling interval-valued data. This architecture maintains computational efficiency while extending the model’s applicability to scenarios with pronounced epistemic uncertainty. The iVAE’s efficacy is demonstrated in managing two types of data: intrinsically interval-valued and noisy data preprocessed into interval formats. The first category is exemplified by a graphical analysis of questionnaires, while the second involves case studies focused on estimating the remaining useful life of aviation engines, where the iVAE outperforms traditional methods, thereby providing more accurate diagnostics and robust predictions.
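The second data category mentioned above, noisy measurements preprocessed into interval format, can be illustrated with a minimal sketch. The function name, the k-sigma widening rule, and the bound-stacking layout below are illustrative assumptions for this page, not the preprocessing actually used in the paper:

```python
import numpy as np

def to_intervals(x, noise_std, k=2.0):
    """Turn noisy point measurements into interval-valued data.

    Each observation x_i becomes the interval [x_i - k*sigma, x_i + k*sigma],
    one common way to encode epistemic uncertainty about an imprecise
    measurement. The width factor k is a modelling choice (hypothetical here).
    """
    x = np.asarray(x, dtype=float)
    lo = x - k * noise_std
    hi = x + k * noise_std
    # Stack lower/upper bounds on the last axis: shape (..., 2). An
    # interval-valued encoder would consume both bounds instead of a point.
    return np.stack([lo, hi], axis=-1)

# Example: sensor readings with a known noise level
readings = np.array([1.0, 2.5, 3.2])
intervals = to_intervals(readings, noise_std=0.1)
```

An interval-valued model such as the iVAE would then take these `(lower, upper)` pairs as input, rather than collapsing them back to a single point estimate.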

Dates and versions

lirmm-04798310, version 1 (22-11-2024)

Identifiers

Cite

Luciano Sánchez, Nahuel Costa, Inés Couso, Olivier Strauss. Integrating imprecise data in generative models using interval-valued Variational Autoencoders. Information Fusion, 2025, 114, pp.102659. ⟨10.1016/j.inffus.2024.102659⟩. ⟨lirmm-04798310⟩