
Introduction

 

As has been stated before, least-squares estimators assume that the noise corrupting the data has zero mean, which yields an unbiased parameter estimate. If the noise variance is known, a minimum-variance parameter estimate can be obtained by choosing appropriate weights on the data. Furthermore, least-squares estimators implicitly assume that the entire set of data can be interpreted by only one parameter vector of a given model. Numerous studies clearly show that least-squares estimators are vulnerable to violations of these assumptions: even a single bad datum may completely perturb the least-squares estimate. During the last three decades, many robust techniques have been proposed which are not very sensitive to departures from the assumptions on which they depend.
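The sensitivity to a single bad datum can be illustrated with a minimal sketch (assuming NumPy; the data and the gross-error value are invented for illustration): an ordinary least-squares line fit is recovered almost exactly from clean data, but replacing one observation with a gross error noticeably shifts the estimate.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 20)
# True model y = 2x + 1, corrupted by zero-mean Gaussian noise.
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.1, x.size)

# Least-squares fit on clean data: close to (slope, intercept) = (2, 1).
a_clean, b_clean = np.polyfit(x, y, 1)

# Replace a single observation with a gross error ("bad datum").
y_bad = y.copy()
y_bad[-1] = 100.0
a_bad, b_bad = np.polyfit(x, y_bad, 1)

print(a_clean, b_clean)  # near the true parameters
print(a_bad, b_bad)      # strongly perturbed by the one outlier
```

Since the squared residual of the outlier dominates the cost function, the fitted line is pulled far from the remaining nineteen consistent points.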

Hampel [6] gives some justifications for the use of robust procedures (quoted in [18]):

What are the reasons for using robust procedures? There are mainly two observations which combined give an answer. Often in statistics one is using a parametric model implying a very limited set of probability distributions thought possible, such as the common model of normally distributed errors, or that of exponentially distributed observations. Classical (parametric) statistics derives results under the assumption that these models were strictly true. However, apart from some simple discrete models perhaps, such models are never exactly true. We may try to distinguish three main reasons for the deviations: (i) rounding and grouping and other "local inaccuracies"; (ii) the occurrence of "gross errors" such as blunders in measuring, wrong decimal points, errors in copying, inadvertent measurement of a member of a different population, or just "something went wrong"; (iii) the model may have been conceived only as an approximation anyway, e.g. by virtue of the central limit theorem.

If we have some a priori knowledge about the parameters to be estimated, techniques based on a test of the Mahalanobis distance, e.g. the Kalman filtering technique, can be used to yield a robust estimate [25].
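A minimal sketch of such a test (assuming NumPy; the function name, covariance, and threshold are illustrative, not from the source): a measurement is accepted only if the squared Mahalanobis distance of its innovation falls below a chi-square bound.

```python
import numpy as np

def mahalanobis_gate(z, z_pred, S, threshold=5.99):
    """Accept measurement z if it is consistent with the prediction z_pred.

    S is the innovation covariance; 5.99 is the 95% chi-square bound for
    2 degrees of freedom (an illustrative choice of gate).
    """
    nu = z - z_pred                    # innovation (measurement residual)
    d2 = nu @ np.linalg.solve(S, nu)   # squared Mahalanobis distance
    return d2 <= threshold

S = 0.25 * np.eye(2)
z_pred = np.array([1.0, 2.0])
ok = mahalanobis_gate(np.array([1.1, 2.1]), z_pred, S)   # consistent datum
bad = mahalanobis_gate(np.array([4.0, 2.0]), z_pred, S)  # gross error
print(ok, bad)
```

Measurements rejected by the gate are simply not used in the update, which is what makes the resulting estimate insensitive to occasional gross errors.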

In the following, we describe four major approaches to robust estimation.



Zhengyou Zhang
Thu Feb 8 11:42:20 MET 1996