Canonical correlation analysis on data with censoring and error information

Jianyong Sun, Simeon Keates

Research output: Contribution to journal › Article

12 Citations (Scopus)

Abstract

We developed a probabilistic model for canonical correlation analysis for the case in which the associated datasets are incomplete. This case arises when data entries either contain measurement errors or are censored (i.e., nonignorably missing) owing to uncertainties in instrument calibration and the physical limitations of devices and experimental conditions. The aim of our model is to estimate the true correlation coefficients by eliminating the effects of measurement errors and extracting useful information from the censored data. As exact inference is not possible for the proposed model, a modified variational Expectation-Maximization (EM) algorithm was developed, in which the posteriors of the latent variables are approximated as normal distributions. In the experiments, the accuracy of the modified E-step approximation is first demonstrated empirically by comparison with hybrid Monte Carlo (HMC) sampling. Further experiments were carried out on synthetic datasets with different numbers of censored entries and different correlation coefficient settings to compare the proposed algorithm with a maximum a posteriori (MAP) solution and a Markov chain EM solution. The results showed that the variational EM solution compares favorably with the MAP solution and approaches the accuracy of the Markov chain EM while remaining computationally simple. Finally, we applied the proposed algorithm to identify the galaxy group properties that are most strongly correlated with X-ray luminosity.
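To give a concrete picture of the data setting the abstract describes, the short Python sketch below (written for this summary, not taken from the paper) simulates a probabilistic CCA model whose observations are corrupted by Gaussian measurement error and left-censored at a detection limit, and shows how a naive canonical correlation estimate computed directly on such data is distorted relative to the clean data. All dimensions, noise levels, and the censoring threshold are hypothetical choices, and the paper's variational EM inference is not reproduced here.

# Illustrative sketch only: a toy version of the data setting described in the
# abstract (shared latent variable, Gaussian measurement error, left-censoring).
# All sizes, noise levels, and the detection limit are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n, d_latent, d_x, d_y = 2000, 1, 3, 3

# A shared latent variable z induces correlation between the two views.
z = rng.normal(size=(n, d_latent))
Wx = rng.normal(size=(d_latent, d_x))
Wy = rng.normal(size=(d_latent, d_y))
X_true = z @ Wx + 0.3 * rng.normal(size=(n, d_x))
Y_true = z @ Wy + 0.3 * rng.normal(size=(n, d_y))

# Observations carry measurement error and are left-censored at a detection
# limit: entries below the limit are only known to lie below it.
sigma_err = 0.5
X_obs = X_true + sigma_err * rng.normal(size=X_true.shape)
Y_obs = Y_true + sigma_err * rng.normal(size=Y_true.shape)
limit = -1.0
X_cens = np.maximum(X_obs, limit)
Y_cens = np.maximum(Y_obs, limit)

def cca_first_correlation(X, Y, eps=1e-8):
    """First canonical correlation via SVD of the whitened cross-covariance."""
    Xc = X - X.mean(0)
    Yc = Y - Y.mean(0)
    Cxx = Xc.T @ Xc / len(X) + eps * np.eye(X.shape[1])
    Cyy = Yc.T @ Yc / len(Y) + eps * np.eye(Y.shape[1])
    Cxy = Xc.T @ Yc / len(X)
    Kx = np.linalg.inv(np.linalg.cholesky(Cxx))
    Ky = np.linalg.inv(np.linalg.cholesky(Cyy))
    s = np.linalg.svd(Kx @ Cxy @ Ky.T, compute_uv=False)
    return s[0]

print("canonical corr. on clean data     :", cca_first_correlation(X_true, Y_true))
print("canonical corr. on noisy/censored :", cca_first_correlation(X_cens, Y_cens))

Running the sketch shows the correlation estimated from the noisy, censored observations falling below the value obtained from the clean data, which is the attenuation that the paper's model is designed to correct.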
Original language: English
Pages (from-to): 1909-1919
Number of pages: 11
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 24
Issue number: 12
DOIs
Publication status: Published - Dec 2013
