
runLRLeave1Out

PURPOSE ^

Calculate leave-one-subject-out accuracy for L1-penalized logistic regression

SYNOPSIS ^

function [percCorrect, w, predictedVsTrue] = runLRLeave1Out( featureVect, classLabels, numSamplesPerSubj, lambdaScalar, expLabels )

DESCRIPTION ^

 Calculate leave-one-subject-out accuracy for L1-penalized logistic regression
 
 syntax: [percCorrect, w, predictedVsTrue] = runLRLeave1Out( featureVect, classLabels, numSamplesPerSubj, lambdaScalar, expLabels )
 
 Inputs:
   featureVect: (dim x numSamples) matrix of data
   classLabels: (1 x numSamples) vector of class labels, valued 0, 1, or 2:
       class 1 for learned, 0 for did not learn, 2 to remove a sample
       because its label was unclear
   numSamplesPerSubj: number of samples per subject, used to partition the
      data for leave-one-subject-out cross-validation
   lambdaScalar: optional regularization parameter for the L1-penalized
       logistic regression (default: 1)
   expLabels: optional struct array, usually returned by the
      'getLeave1OutLabels' function, specifying how to partition the data
      for leave-one-subject-out cross-validation
 
 Outputs:
   percCorrect: fraction correctly classified under leave-one-subject-out CV
   w: weight vector from retraining on all well-labeled data (class 2
       samples removed), with the bias weight stripped
   predictedVsTrue: cell array containing the [predicted; true] labels of
       the test data for each fold of the cross-validation
 
 Note: class 1 for learned, 0 for did not learn, 2 for removed because unclear
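 
 Example (a minimal usage sketch; the data below are random placeholders,
 and 'getLeave1OutLabels' plus the L1General solver are assumed to be on
 the MATLAB path):
 
   featureVect = randn(50, 120);        % 50 features x 120 samples
   classLabels = randi([0 2], 1, 120);  % labels in {0,1,2}; 2 = discard
   numSamplesPerSubj = 12;              % e.g. 10 subjects, 12 samples each
   [percCorrect, w, predictedVsTrue] = ...
       runLRLeave1Out( featureVect, classLabels, numSamplesPerSubj, 0.5);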

CROSS-REFERENCE INFORMATION ^

This function calls:
This function is called by:

SOURCE CODE ^


% Calculate leave-one-subject-out accuracy for L1-penalized logistic regression
%
% syntax: [percCorrect, w, predictedVsTrue] = runLRLeave1Out( featureVect, classLabels, numSamplesPerSubj, lambdaScalar, expLabels )
%
% Inputs:
%   featureVect: (dim x numSamples) matrix of data
%   classLabels: (1 x numSamples) vector of class labels, valued 0, 1, or 2:
%       class 1 for learned, 0 for did not learn, 2 to remove a sample
%       because its label was unclear
%   numSamplesPerSubj: number of samples per subject, used to partition
%      the data for leave-one-subject-out cross-validation
%   lambdaScalar: optional regularization parameter for the L1-penalized
%       logistic regression (default: 1)
%   expLabels: optional struct array, usually returned by the
%      'getLeave1OutLabels' function, specifying how to partition the data
%      for leave-one-subject-out cross-validation
%
% Outputs:
%   percCorrect: fraction correctly classified under leave-one-subject-out CV
%   w: weight vector from retraining on all well-labeled data (class 2
%       samples removed), with the bias weight stripped
%   predictedVsTrue: cell array containing the [predicted; true] labels of
%       the test data for each fold of the cross-validation
%
% Note: class 1 for learned, 0 for did not learn, 2 for removed because unclear

function [percCorrect, w, predictedVsTrue] = runLRLeave1Out( featureVect, classLabels, numSamplesPerSubj, lambdaScalar, expLabels )

if nargin < 4 || isempty(lambdaScalar)
    lambdaScalar = 1;
end
0032 
0033 % leave one subject out cross validation
0034 [ dim numSamples] = size( featureVect);
0035 
0036 % center and scale variables to unit variance
0037 featureVect = featureVect - repmat( mean(featureVect,2), [1,numSamples] );
0038 featStdev = std( featureVect, 0, 2);
0039 featureVect( featStdev ~= 0,:) = featureVect( featStdev ~= 0,:)./repmat(featStdev(featStdev ~= 0), [1,numSamples]);
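% (features whose standard deviation is zero stay centered but unscaled,
%  so the normalization never divides by zero)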
% featureVect = featureVect./repmat( std( featureVect, 0, 2)+.001, [1,numSamples]);

featureVect = [ones(numSamples,1) featureVect']'; % add bias element to features (at top)
classLabels( classLabels == 0) = -1; % convert y to {-1,1} representation

if nargin < 5 || isempty(expLabels)
    expLabels = getLeave1OutLabels( numSamples, numSamplesPerSubj);
%     expLabels = balanceClasses( expLabels, classLabels );  % even out the number of samples in each class
end
numTrials = length(expLabels);

% init weights and lambda
w_init = zeros(dim+1,1);
lambda = lambdaScalar*ones(dim+1,1);
lambda(1) = 0; % do not penalize the bias variable
options = struct('verbose',0);
totalNumTest = 0;
numCorrect = 0;

predictedVsTrue = cell( numTrials,1 );

for i1 = 1:numTrials

    % training split, with unclear (class 2) samples removed
    trainLabels = classLabels(:,expLabels(i1).train);
    trainFeatures = featureVect(:,expLabels(i1).train);
    trainFeatures( :, trainLabels==2) = [];
    trainLabels( :, trainLabels==2) = [];

    % test split, with unclear (class 2) samples removed
    testLabels = classLabels(:,expLabels(i1).test);
    testFeatures = featureVect(:,expLabels(i1).test);
    testFeatures( :, testLabels==2) = [];
    testLabels( :, testLabels==2) = [];

    if ~isempty( testLabels)
        % train L1-regularized logistic regression on this fold
        funObj = @(w)LogisticLoss(w,trainFeatures',trainLabels');
        w = L1GeneralProjection(funObj,w_init,lambda, options );
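        % (L1GeneralProjection minimizes funObj(w) + sum(lambda.*abs(w)),
        %  i.e. the logistic loss plus a per-coordinate L1 penalty)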

        % testing
        predictedLabels = sign(testFeatures'*w);
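        % sign(w'*x) thresholds the logistic model at P(y=1|x) = 0.5,
        % since P(y=1|x) = 1/(1+exp(-w'*x)) > 0.5 exactly when w'*x > 0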
        tempNumCorrect = sum(testLabels' == predictedLabels);
        numCorrect = numCorrect + tempNumCorrect;
        numTest = length(testLabels);
        totalNumTest = totalNumTest+numTest;
        % report per-fold results
        fprintf( 'Accuracy = %.2f, (%d/%d), %d Guessed 1; Train # class 1: %d, class 0: %d. \n', ...
            tempNumCorrect/numTest, tempNumCorrect,numTest, length(find(predictedLabels==1)), ...
            length(find(trainLabels==1)), length(find(trainLabels==-1)));

        % map labels back to the original {0,1} representation
        predictedLabels( predictedLabels == -1) = 0;
        testLabels( testLabels == -1) = 0;
        predictedVsTrue{i1} = [ predictedLabels'; testLabels];

    end
end
percCorrect = numCorrect/totalNumTest;

% normal vector of the optimal hyperplane, retrained on ALL well-labeled data
featureVect( :, classLabels==2) = [];
classLabels( :, classLabels==2) = [];

funObj = @(w)LogisticLoss(w,featureVect',classLabels');
w = L1GeneralProjection(funObj,w_init,lambda, options);
w(1) = []; % drop the bias weight so w has one entry per feature

Generated on Wed 20-Jan-2016 11:50:43 by m2html © 2005