Commit: Add code used as the compared methods
Showing 5 changed files with 968 additions and 0 deletions.
@@ -0,0 +1,26 @@
These MATLAB codes implement the variational Bayesian low-rank
matrix completion and robust PCA methods described in the paper:

S. D. Babacan, M. Luessi, R. Molina, and A. K. Katsaggelos,
"Sparse Bayesian Methods for Low-Rank Matrix Estimation,"
IEEE Transactions on Signal Processing, 2012.

VBMC: Variational Bayesian low-rank matrix completion main code
VBRPCA: Variational Bayesian robust PCA main code

Example runs of these codes are shown in the scripts RunVBMC and RunVBRPCA. These scripts explain the general use of the codes; additional details can be found in the main codes.
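For quick orientation, the calling conventions, as used in the example scripts further below, are in sketch form:

% Low-rank matrix completion: P is the binary sampling mask, Y = P.*X holds the observed entries
[X_hat, A_hat, B_hat] = VBMC(P, Y, options);

% Robust PCA: Y = X + E is the observed matrix (low-rank plus sparse corruption)
[X_hat, A_hat, B_hat, E_hat] = VBRPCA(Y, options);

All fields of options are optional; see the scripts and the main codes for the available settings.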
Please contact S. Derin Babacan at dbabacan [at] gmail.com for bug reports, comments, suggestions, and questions.
Copyright (c): S. Derin Babacan, 2011
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License <http://www.gnu.org/licenses/> for more details.
@@ -0,0 +1,74 @@
%% An example run of Variational Bayesian Low-Rank Matrix Completion
clear
clc

%% Create a low-rank matrix
% Dimensions
m = 300;
n = 300;
r = 10; % rank

A = randn(m,r);
B = randn(r,n);
X_true = A*B; % low rank

% random sparse sampling
df = r*(m+n-r); % Degrees of freedom of a rank-r factorization
p = 0.2; % fraction of entries observed
pmn = p*m*n;
osdf = pmn / df; % Oversampling ratio (observations per degree of freedom)
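% For these settings, df = 10*(300+300-10) = 5900 free parameters in the
% rank-10 factorization and pmn = 0.2*300*300 = 18000 observed entries,
% so osdf is roughly 3.05 observations per degree of freedom.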
Omega = randperm(m*n); Omega = Omega(1:pmn);
P = zeros(m,n); P(Omega) = 1;

% Observations
Y = P.*X_true;

% Add noise?
sigma = 0; % no noise
sigma = 1e-3; % some noise
% sigma = sqrt(1e-3); % more noise

Y = Y + sigma*randn(size(Y));

SNR = 10*log10( sum(sum((P.*X_true).^2)) / sum(sum((Y-P.*X_true).^2)));
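% Empirical SNR in dB: power of the observed entries of X_true over the power
% of the added noise on those entries.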

%% Run VBMC
% All options are *optional*; everything will be set automatically.
% You can modify these options to get better performance.
options.verbose = 1;
options.MAXITER = 100;
options.DIMRED = 1; % Reduce dimensionality during iterations?
% You can also set the threshold used to reduce dimensions
% options.DIMRED_THR = 1e3;
% Estimate noise variance? (beta is the inverse noise variance)
options.UPDATE_BETA = 1;
% options.UPDATE_BETA_START = 1; % iteration number to start estimating the noise variance

% If the noise inv. variance is not to be estimated, set
% options.UPDATE_BETA = 0; % and set beta using
% options.beta = 1e3;
% Manually tuning this parameter can give significantly better results.
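% (Since beta is the inverse noise variance, beta = 1/sigma^2; e.g. for the
% sigma = 1e-3 used above, the matching value would be beta = 1e6.)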

% options.initial_rank = 'auto'; % This sets it to the maximum possible rank
options.initial_rank = 50; % or we can set a value.

% Provide the original matrix for simulations
options.X_true = X_true;

fprintf('Running VB matrix completion...\n');
tic
[X_hat, A_hat, B_hat] = VBMC(P, Y, options);
t_total = toc;

% Show results
err = norm( X_hat - X_true, 'fro' ) / norm( X_true, 'fro' );
r_hat = rank(X_hat);
fprintf('\n\n-------------------------------------------------------------------\n')
fprintf('Dimensions (%d, %d), true rank = %d, measurements = %g %%, SNR = %g\n', m, n, r, p*100, SNR);
fprintf('Rel. error = %g, estimated rank = %d, time = %g\n', err, r_hat, t_total);
fprintf('-------------------------------------------------------------------\n')
@@ -0,0 +1,81 @@
%% An example run of VBRPCA
clear
clc

%% Create a low-rank + sparse matrix
% Dimensions
m = 500;
n = 500;
r = 25; % rank of the low-rank component

A = randn(m,r);
B = randn(r,n);
X_true = A*B; % low rank component

SparseRatio = 0.05;
E_true = zeros(size(X_true));
Eomega = randsample(m*n, round(m*n*SparseRatio));
E_true(Eomega) = -10+20*rand(length(Eomega),1); % sparse component

% Observation
Y = X_true + E_true;

% Add noise?
sigma = 0; % no noise
% sigma = 1e-3; % some noise
% sigma = sqrt(1e-3); % more noise

Y = Y + sigma*randn(size(Y));

SNR = 10*log10( sum(sum((X_true+E_true).^2)) / sum(sum((Y-X_true-E_true).^2)));

%% Run VBRPCA
% All options are *optional*; everything will be set automatically.
% You can modify these options to get better performance.
options.verbose = 1;
options.initial_rank = 'auto'; % This sets it to the maximum possible rank
% options.initial_rank = 300; % or we can use a value.
options.X_true = X_true;
options.E_true = E_true;
options.inf_flag = 2; % inference flag for the sparse component
% 1 for standard VB, 2 for MacKay. MacKay generally converges faster.

options.MAXITER = 200;
% Estimate noise variance? (beta is the inverse noise variance)
options.UPDATE_BETA = 1;
% If the noise inv. variance is not to be estimated, set
% options.UPDATE_BETA = 0; % and set beta using
% options.beta = 1e3;

% Select the optimization mode:
% 'VB': fully Bayesian inference (default)
% 'VB_app': fully Bayesian with covariance approximation
% 'MAP': maximum a posteriori (covariance is set to 0)
options.mode = 'VB';
% For large-scale problems, set to 'VB_app'.
% options.mode = 'VB_app';

fprintf('Running VBRPCA...\n')
tic
[X_hat, A_hat, B_hat, E_hat] = VBRPCA(Y, options);
t_total = toc;

% Results
Xerr = norm( X_hat - X_true, 'fro' ) / norm( X_true, 'fro' );
Eerr = norm( E_hat - E_true, 'fro' ) / norm( E_true, 'fro' );

r_hat = rank(X_hat);
% The model does not allow for "exact" sparsity, but only "soft" sparsity.
% That is, most coefficients are almost zero (very small values), but not
% exactly zero. If the sparse coefficients are needed, one can use
% thresholding such as
nonzero_E = length(find(abs(E_hat)>1e-7));
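% A hard-thresholded sparse estimate can be formed with the same test
% (E_thr is an illustrative variable name, not used by the original scripts):
E_thr = E_hat .* (abs(E_hat) > 1e-7);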

% Show results
fprintf('\n\n-------------------------------------------------------------------\n')
fprintf('Dimensions (%d, %d), true rank = %d, nonzero E elements = %d, SNR = %g\n', m, n, r, length(Eomega), SNR);
fprintf('low rank comp. error = %g, sparse comp. error = %g, est. rank = %d,\nnonzero E elements = %d, running time = %g\n', Xerr, Eerr, r_hat, nonzero_E, t_total);
fprintf('-------------------------------------------------------------------\n')