
hbcv

PURPOSE

Biased Cross-Validation estimate of smoothing parameter.

SYNOPSIS

[h,hvec,score]=hbcv(A,kernel,hvec)

DESCRIPTION

 HBCV  Biased Cross-Validation estimate of smoothing parameter. 
  
  CALL: [hs,hvec,score] = hbcv(data,kernel,hvec);  
   
    hs     = smoothing parameter 
    hvec   = vector defining possible values of hs 
             (default linspace(0.25*h0,h0,100), h0=hos(data,kernel))
    score  = score vector 
    data   = data vector 
    kernel = 'epanechnikov'  - Epanechnikov kernel. 
             'biweight'      - Bi-weight kernel. 
             'triweight'     - Tri-weight kernel.   
             'triangular'    - Triangular kernel. 
             'gaussian'      - Gaussian kernel. 
             'rectangular'   - Rectangular kernel. 
             'laplace'       - Laplace kernel. 
             'logistic'      - Logistic kernel. 
  
   Note that only the first 4 letters of the kernel name are needed. 
    
   HBCV is a hybrid of cross-validation and direct plug-in estimates. 
   The main attraction of HBCV is that it is more stable than HLSCV in 
   the sense that its asymptotic variance is considerably lower. However, 
   this reduction in variance comes at the expense of an increase in 
   bias, with HBCV tending to be larger than the HNS estimate. 
   Asymptotically, HBCV has a relatively slow convergence rate. 
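
   In the notation of the source code below (R = int(K(x)^2)dx,
   mu2 = int(x^2*K(x))dx, and \hat{R}(f'') an estimate of the integrated
   squared second derivative of the density), the score being minimized
   is the standard BCV objective:

     $$ \mathrm{BCV}(h) = \frac{R(K)}{n h} + \frac{\mu_2(K)^2\, h^4}{4}\, \widehat{R}(f'') $$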
  
   Example: data = wnormrnd(0,1,20,1);
           [hs,hvec,score] = hbcv(data,'epan');
           plot(hvec,score) 
  See also  hste, hboot, hns, hos, hldpi, hlscv, hscv, hstt, kde, kdefun
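
   As a quick illustration of the bias/variance trade-off noted above, the
   three estimates can be compared on simulated data. A minimal sketch,
   assuming WAFO (wnormrnd, hbcv, hlscv, hns) is on the path:

     % Compare bandwidth estimates on one simulated sample; HBCV will
     % typically come out larger than HNS, per the note above.
     data    = wnormrnd(0,1,200,1);   % 200 standard normal variates
     hs_bcv  = hbcv(data,'gauss');    % biased cross-validation
     hs_lscv = hlscv(data,'gauss');   % least-squares cross-validation
     hs_ns   = hns(data,'gauss');     % normal-scale reference rule
     [hs_bcv, hs_lscv, hs_ns]         % display the three estimates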

CROSS-REFERENCE INFORMATION

This function calls:
This function is called by:

SOURCE CODE

function [h,hvec,score]=hbcv(A,kernel,hvec)
%HBCV  Biased Cross-Validation estimate of smoothing parameter.
%
% CALL: [hs,hvec,score] = hbcv(data,kernel,hvec);
%
%   hs     = smoothing parameter
%   hvec   = vector defining possible values of hs
%            (default linspace(0.25*h0,h0,100), h0=hos(data,kernel))
%   score  = score vector
%   data   = data vector
%   kernel = 'epanechnikov'  - Epanechnikov kernel.
%            'biweight'      - Bi-weight kernel.
%            'triweight'     - Tri-weight kernel.
%            'triangular'    - Triangular kernel.
%            'gaussian'      - Gaussian kernel.
%            'rectangular'   - Rectangular kernel.
%            'laplace'       - Laplace kernel.
%            'logistic'      - Logistic kernel.
%
%  Note that only the first 4 letters of the kernel name are needed.
%
%  HBCV is a hybrid of cross-validation and direct plug-in estimates.
%  The main attraction of HBCV is that it is more stable than HLSCV in
%  the sense that its asymptotic variance is considerably lower. However,
%  this reduction in variance comes at the expense of an increase in
%  bias, with HBCV tending to be larger than the HNS estimate.
%  Asymptotically, HBCV has a relatively slow convergence rate.
%
%  Example: data = wnormrnd(0,1,20,1);
%          [hs,hvec,score] = hbcv(data,'epan');
%          plot(hvec,score)
% See also  hste, hboot, hns, hos, hldpi, hlscv, hscv, hstt, kde, kdefun


% tested on : matlab 5.2
% history:
% revised pab aug2005
% -bug fix for kernels other than Gaussian
% revised pab dec2003
% revised pab 20.10.1999
%   updated to matlab 5.2
% changed input arguments
% taken from kdetools     Christian C. Beardah 1995

A = A(:);
n = length(A);

if nargin<3 || isempty(hvec)
  H = hos(A,kernel);
  hvec = linspace(0.25*H,H,100);
else
  hvec = abs(hvec);
end
steps = length(hvec);

M  = A*ones(1,n);  % n-by-n matrix: each column is a copy of the data
Y1 = M-M';         % all pairwise differences A(i)-A(j)

% R   = int(mkernel(x)^2)
% mu2 = int(x^2*mkernel(x))
[mu2,R] = kernelstats(kernel);

score = zeros(1,steps);  % preallocate the score vector
for i=1:steps

  %sig = sqrt(2)*hvec(i);
  sig = hvec(i);

  Y = Y1/sig;

  % 4th derivative of the standard Gaussian density at the scaled
  % pairwise differences: phi''''(y) = (y^4-6*y^2+3)*phi(y).
  % (The Gaussian form is used here regardless of the chosen kernel.)
  term2 = (Y.^4-6*Y.^2+3).*exp(-0.5*Y.^2)/sqrt(2*pi);

  % Estimate of R(f'') = int(f''(x)^2), with the diagonal terms removed
  Rf = sum(sum(term2-diag(diag(term2))))/(n^2*sig^5);

  % BCV score: asymptotic MISE with R(f'') replaced by its estimate
  score(i) = R/(n*hvec(i)) + mu2^2*hvec(i)^4*Rf/4;

end

[L,I] = min(score);

h = hvec(I);
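
The closed form used for term2 above is the fourth derivative of the
standard Gaussian density, phi''''(y) = (y^4 - 6*y^2 + 3)*phi(y), which is
what makes the double sum an estimate of R(f''). A quick numeric sanity
check of that identity, using only base MATLAB (the 5-point central
difference leaves a truncation error of order dy^2):

  % Verify (y.^4-6*y.^2+3).*phi(y) against a finite-difference 4th
  % derivative of the standard normal pdf.
  phi = @(y) exp(-0.5*y.^2)/sqrt(2*pi);
  y   = linspace(-3,3,13);
  dy  = 1e-2;
  d4  = (phi(y+2*dy) - 4*phi(y+dy) + 6*phi(y) ...
         - 4*phi(y-dy) + phi(y-2*dy))/dy^4;
  closed = (y.^4 - 6*y.^2 + 3).*phi(y);
  max(abs(d4 - closed))   % approx 1e-4: finite-difference error only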

Mathematical Statistics
Centre for Mathematical Sciences
Lund University with Lund Institute of Technology
