CDRM: Causal disentangled representation learning for missing data
M Chen, H Wang, R Wang, Y Peng, H Zhang - Knowledge-Based Systems, 2024 - Elsevier
Missing data pose significant challenges during representation learning of observational
data. The incompleteness of data can result in a deterioration of generative performance in …
Cf-vae: Causal disentangled representation learning with vae and causal flows
D Fan, Y Hou, C Gao - arXiv preprint arXiv:2304.09010, 2023 - arxiv.org
Learning disentangled representations is important in representation learning, aiming to
learn a low dimensional representation of data where each dimension corresponds to one …
Self-Distilled Disentangled Learning for Counterfactual Prediction
X Li, M Gong, L Yao - arXiv preprint arXiv:2406.05855, 2024 - arxiv.org
The advancements in disentangled representation learning significantly enhance the
accuracy of counterfactual predictions by granting precise control over instrumental …
Causal representation learning via counterfactual intervention
X Li, S Sun, R Feng - Proceedings of the AAAI Conference on Artificial …, 2024 - ojs.aaai.org
Existing causal representation learning methods are based on the causal graph they build.
However, due to the omission of bias within the causal graph, they essentially encourage …
Learning causally disentangled representations via the principle of independent causal mechanisms
Learning disentangled causal representations is a challenging problem that has gained
significant attention recently due to its implications for extracting meaningful information for …
ProtoVAE: Prototypical Networks for Unsupervised Disentanglement
V Patil, M Evanusa, J JaJa - arXiv preprint arXiv:2305.09092, 2023 - arxiv.org
Generative modeling and self-supervised learning have in recent years made great strides
towards learning from data in a completely unsupervised way. There is still however an …
Disdiff: Unsupervised disentanglement of diffusion probabilistic models
Targeting to understand the underlying explainable factors behind observations and
modeling the conditional generation process on these factors, we connect disentangled …
On causally disentangled representations
AG Reddy, VN Balasubramanian - … of the AAAI Conference on Artificial …, 2022 - ojs.aaai.org
Abstract Representation learners that disentangle factors of variation have already proven to
be important in addressing various real world concerns such as fairness and interpretability …
Initializing Then Refining: A Simple Graph Attribute Imputation Network.
Abstract Representation learning on the attribute-missing graphs, whose connection
information is complete while the attribute information of some nodes is missing, is an …
FragmGAN: Generative adversarial nets for fragmentary data imputation and prediction
F Fang, S Bao - Statistical Theory and Related Fields, 2024 - Taylor & Francis
Modern scientific research and applications very often encounter 'fragmentary data' which
brings big challenges to imputation and prediction. By leveraging the structure of response …