
Nonparametric Curve Estimation by Kernel Smoothers: Efficiency of Unbiased Risk Estimate and GCV Selectors

This Demonstration considers one of the simplest nonparametric-regression problems: let f be a smooth real-valued function over the interval [0, 1]; recover (or estimate) f when one only knows approximate values y_1, …, y_n for f at n equispaced points t_i that satisfy the model y_i = f(t_i) + σ ε_i, where σ > 0 is the noise level and the ε_i are independent, standard normal random variables. Sometimes one assumes that the noise level σ is known. Such a curve estimation problem is also called a signal denoising problem.
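As a minimal illustrative sketch in the Wolfram Language (the test function f, the noise level, the sample size, and the seed below are arbitrary choices made for this sketch, not necessarily those used in the Demonstration), such data can be simulated as follows:

    n = 1024;       (* number of observations *)
    sigma = 0.2;    (* noise level *)
    f[t_] := Sin[4 Pi t] Exp[-2 t];    (* an arbitrary smooth "true curve" on [0, 1] *)
    ts = N[Range[n]/n];                (* equispaced design points *)
    SeedRandom[1];
    ys = (f /@ ts) + sigma RandomVariate[NormalDistribution[0, 1], n];    (* y_i = f(t_i) + σ ε_i *)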
Perhaps the simplest "solution" to this problem is given by the classical and widely used kernel-smoothing method, which is a particular case of the Loess method (locally weighted scatterplot smoothing); see the Details below.
In the kernel-smoothing method (as in any similar nonparametric method, e.g. the smoothing-spline method), one or several smoothing parameters have to be appropriately chosen to obtain a "good" estimate of the true curve f. For the kernel method, it is known that choosing a good value for the famous bandwidth parameter h (see Details) is crucial, much more so than the choice of the class of kernels, which is fixed here.
The curve estimate corresponding to a given value h of the bandwidth parameter is then denoted f̂_h. Notice that f̂_h depends on f, σ, and the ε_i's only through the data y_1, …, y_n.
Three very popular methods are available for choosing h: cross-validation (also called the "leave-one-out" principle; see the PRESS statistic), generalized cross-validation (GCV), and Mallows' C_L (also sometimes denoted C_p, or called UBR for unbiased risk estimate). In fact, cross-validation and GCV coincide in our context, where periodic end conditions are assumed. See [1] for a review, and see [2] for the definition and an analysis of GCV. Notice that GCV (in contrast to the C_L method) does not require that σ be known.
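In this periodic, equispaced setting, both criteria can be written in terms of the mean residual sum of squares and the trace of the smoother ("hat") matrix A_h that maps the data to f̂_h. The following is a minimal sketch of the two criteria (the per-observation scaling and the function names are choices made for this sketch):

    (* fhat: the smoothed values A_h.y ;  trA: the trace of the smoother matrix A_h *)
    gcv[ys_, fhat_, trA_] := Mean[(ys - fhat)^2]/(1 - trA/Length[ys])^2

    (* Mallows' C_L (UBR) criterion; unlike GCV, it requires the noise level sigma *)
    cL[ys_, fhat_, trA_, sigma_] := Mean[(ys - fhat)^2] + 2 sigma^2 trA/Length[ys] - sigma^2

For the periodic kernel smoother considered here, A_h is a circulant matrix, so tr(A_h) is simply n times the central (lag-zero) kernel weight; the selected bandwidth is then obtained by minimizing gcv or cL over a grid of candidate bandwidths.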
This Demonstration provides interactive assessments of the statistical efficiency of the GCV and C_L smoothing-parameter selectors; this can be done for rather large sample sizes n, with reasonably fast interactivity on a current personal computer.
Here six examples for f (the "true curve") can be tried; f is plotted (in blue) in the third of the three possible views, selected using the tabs:
1. The data and the curve estimate f̂_h, where you choose h with the trial-bandwidth slider, show that the choice of the bandwidth is crucial.
2. The two curve estimates given by GCV and C_L are very often so similar that they cannot be distinguished.
3. The third tab displays quantities related to the "truth": the ASE-optimal choice of h yields the green curve; this is the h-value that minimizes a global discrepancy between f̂_h and f, defined by ASE(h) = (1/n) Σ_{i=1}^{n} ( f̂_h(t_i) − f(t_i) )², where ASE stands for the average of the squared errors (a minimal sketch of this computation is given just after this list). (Note that this ASE-optimal choice is a target that seems to be unattainable in practice, since we do not know the true curve f.)
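As a sketch of this target quantity and of a grid search for the ASE-optimal bandwidth (kernelSmooth is the illustrative smoother sketched in the Details below, and hGrid an illustrative grid of candidate bandwidths):

    (* average of the squared errors between a curve estimate and the true curve at the t_i *)
    ase[fhat_, ftrue_] := Mean[(fhat - ftrue)^2]

    (* ASE-optimal bandwidth, by brute-force minimization over the candidate grid hGrid *)
    aseOptimalH[ys_, ftrue_, hGrid_] :=
      First[MinimalBy[hGrid, ase[kernelSmooth[ys, #], ftrue] &]]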
In the third view, the curve estimate associated with the automatic C_L choice is again plotted (purple curve): we can then assess the typical efficiency of C_L (or of the quite close GCV choice) by comparing this curve estimate with the targeted ASE-optimal curve.
This third view also displays the two associated ASE values, in addition to the one associated with the bandwidth chosen by eye. This shows that it is difficult to choose by eye a trial bandwidth in the first view (thus without the help of the true-curve plot) that turns out to be at least as good as the GCV or C_L choice, where "better" still means a lower ASE value.
Staying in the third view and only increasing the size of the data set, one typically observes that the C_L curve estimate and the ASE-optimal curve generally become very close, and that the two associated ASE distances often become relatively similar. This agrees with the known asymptotic theory; see [3] and the references therein.
The rate at which the relative difference between the C_L (or GCV) bandwidth and the ASE-optimal bandwidth converges to zero can be assessed in the additional panel at the top right of the third view, which shows the two kernels associated with the two bandwidths (precisely, only the right-hand half of each kernel, since they are even functions): the difference between the two bandwidths can thus be scrutinized with much better precision than by looking only at the two associated curve estimates.
By varying the seed that generates the data, you can also observe that, in certain cases, non-negligible differences remain between the C_L (or GCV) choice and the ASE-optimal bandwidth.
  • Contributed by: Didier A. Girard
  • (CNRS-LJK and University Joseph Fourier, Grenoble)

SNAPSHOTS

  • [Snapshot]
  • [Snapshot]
  • [Snapshot]

DETAILS

The three snapshots above are respectively the three views of the results obtained when analyzing a fixed data set of size 1024.
The kernel estimation method for nonparametric regression is a particular case of the Loess method mentioned in Ian McLeod's Demonstration, How Loess Works.
The case of a local constant fit with a spatially homogeneous amount of smoothing is considered here.
The kernels used here are from the class of biweight kernels; as the bandwidth varies, this class resembles the class of densities of the normal (Gaussian) distribution with varying standard deviation, except that a biweight kernel has compact support, the length of the support being the kernel bandwidth, also denoted h. Each candidate among the curve estimates f̂_h needs to be evaluated only at the equispaced points t_i, i = 1, …, n. The computation of such a candidate is simply and quickly provided by the built-in Mathematica function ListConvolve.
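For concreteness, here is a minimal sketch of such a biweight-kernel smoother with periodic end conditions (the function names, the discretization of the kernel, and the normalization are choices made for this sketch, not the Demonstration's actual code):

    (* biweight kernel weights for bandwidth h (support of length h on the t-scale),
       discretized at the grid spacing 1/n and normalized to sum to 1 *)
    biweightWeights[h_, n_] := Module[{m = Floor[h n/2], us, w},
      us = Range[-m, m]/(h n/2.);      (* scaled kernel arguments in [-1, 1] *)
      w = (15/16) (1 - us^2)^2;        (* biweight kernel values *)
      w/Total[w]]

    (* local constant kernel smoother with periodic end conditions:
       a cyclic convolution of the data with the centered weight vector *)
    kernelSmooth[ys_, h_] := With[{w = biweightWeights[h, Length[ys]]},
      ListConvolve[w, ys, (Length[w] + 1)/2]]

For instance, kernelSmooth[ys, 0.05] evaluates such a candidate at all the equispaced points at once.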
We restrict ourselves to sizes n that are powers of 2: all the invoked discretized kernel functions can then be simply and quickly obtained by first precomputing the required kernel values for the case of n equal to its largest possible value and for bandwidths in a fixed fine grid of size 401 (in the initialization step), and then by appropriately subsampling this precomputed table.
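A minimal sketch of this precompute-then-subsample idea follows (the largest size nMax, the bandwidth grid hGrid, and the helper names are illustrative assumptions; the kernel discretization reuses the biweightWeights function sketched above):

    (* illustrative largest size (a power of 2) and fixed grid of 401 candidate bandwidths *)
    nMax = 2^12;
    hGrid = Subdivide[0.01, 0.5, 400];

    (* kernel weights discretized at spacing 1/nMax, precomputed once for every
       bandwidth in hGrid (in the initialization step) *)
    fineTable = Table[biweightWeights[h, nMax], {h, hGrid}];

    (* for a smaller size n = nMax/2^k, the discretized kernel for the j-th bandwidth is
       obtained by keeping the central weight and every (nMax/n)-th precomputed value around it *)
    subsampledKernel[j_, n_] := Module[{w = fineTable[[j]], step = nMax/n, m},
      m = (Length[w] - 1)/2;
      #/Total[#] &[w[[m + 1 + step Range[-Floor[m/step], Floor[m/step]]]]]]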
References
[1] G. Wahba, "(Smoothing) Splines in Nonparametric Regression," Technical Report No. 1024, Department of Statistics, University of Wisconsin, Madison, WI, 2000 (prepared for the Encyclopedia of Environmetrics). http://www.stat.wisc.edu/wahba/ftp1/environ.ps
[2] G. Golub, M. Heath, and G. Wahba, "Generalized Cross Validation as a Method for Choosing a Good Ridge Parameter," Technometrics, 21(2), 1979 pp. 215–224. http://www.jstor.org/stable/1268518.
[3] D. A. Girard, "Asymptotic Comparison of (Partial) Cross-Validation, GCV and Randomized GCV in Nonparametric Regression," The Annals of Statistics, 26(1), 1998 pp. 315–334. doi:10.1214/aos/1030563988. http://projecteuclid.org/euclid.aos/1030563988