Estimation theory

2017-07-30T22:12:36+03:00[Europe/Moscow] en true Estimation theory, Confidence interval, Fisher information, Bayes estimator, Interval estimation, Likelihood function, Mean squared error, Helmert–Wolf blocking, Richardson–Lucy deconvolution, Ordinary least squares, Confidence region flashcards Estimation theory
  • Estimation theory
    Estimation theory is a branch of statistics that deals with estimating the values of parameters based on measured empirical data that has a random component.
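A minimal sketch of the estimation problem in Python: a fixed but unknown quantity is observed through noisy measurements, and the sample mean serves as an estimator. The true value, noise level, and sample size below are illustrative assumptions, not part of the definition.

```python
import random
import statistics

random.seed(0)

# Assumed for illustration: a true parameter observed with Gaussian noise.
true_value = 10.0
measurements = [true_value + random.gauss(0, 0.5) for _ in range(100)]

# The sample mean is one estimator of the unknown parameter.
estimate = statistics.mean(measurements)
print(estimate)  # close to 10.0, but not exact: the data are random
```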
  • Confidence interval
    In statistics, a confidence interval (CI) is a type of interval estimate of a population parameter.
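A quick sketch of an interval estimate for a population mean, using the large-sample normal critical value 1.96 for a 95% interval. The sample data are made up for illustration; for a sample this small a t critical value would be more appropriate.

```python
import math
import statistics

# Illustrative sample data (an assumption, not from the text).
sample = [4.8, 5.1, 5.0, 4.9, 5.3, 5.2, 4.7, 5.0]

n = len(sample)
mean = statistics.mean(sample)
sem = statistics.stdev(sample) / math.sqrt(n)  # standard error of the mean

# 95% CI via the normal approximation (z = 1.96).
z = 1.96
ci = (mean - z * sem, mean + z * sem)
print(f"mean = {mean:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```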
  • Fisher information
    In mathematical statistics, the Fisher information (sometimes simply called information) is a way of measuring the amount of information that an observable random variable X carries about an unknown parameter θ of a distribution that models X.
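For a concrete case, the Fisher information of a Bernoulli(p) model has the closed form I(p) = 1 / (p(1 − p)). The sketch below checks this against the defining expectation E[(d/dp log f(X; p))²], which for X in {0, 1} is a two-term sum; the choice p = 0.3 is arbitrary.

```python
def score(x, p):
    # d/dp log f(x; p) for the Bernoulli pmf f(x; p) = p^x (1-p)^(1-x)
    return x / p - (1 - x) / (1 - p)

def fisher_info(p):
    # Expectation of the squared score over X ~ Bernoulli(p)
    return (1 - p) * score(0, p) ** 2 + p * score(1, p) ** 2

p = 0.3
print(fisher_info(p), 1 / (p * (1 - p)))  # the two values agree
```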
  • Bayes estimator
    In estimation theory and decision theory, a Bayes estimator or a Bayes action is an estimator or decision rule that minimizes the posterior expected value of a loss function (i.e., the posterior expected loss).
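Under squared-error loss, the Bayes estimator is the posterior mean. A minimal sketch for a Bernoulli success probability with a Beta(a, b) prior, whose posterior after s successes in n trials is Beta(a + s, b + n − s); the prior hyperparameters and data below are illustrative assumptions.

```python
# Beta prior hyperparameters and observed data (illustrative values).
a, b = 2.0, 2.0
n, s = 10, 7

# Conjugate update: posterior is Beta(a + s, b + (n - s)).
post_a, post_b = a + s, b + (n - s)

bayes_estimate = post_a / (post_a + post_b)  # posterior mean
mle = s / n                                  # maximum likelihood, for contrast
print(bayes_estimate, mle)  # the Bayes estimate is shrunk toward the prior mean 0.5
```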
  • Interval estimation
    In statistics, interval estimation is the use of sample data to calculate an interval of plausible values for an unknown population parameter, in contrast to point estimation, which yields a single number.
  • Likelihood function
    In statistics, a likelihood function (often simply the likelihood) is a function of the parameters of a statistical model given data.
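To make the "function of the parameters given data" idea concrete: below, a fixed sequence of coin flips is held constant while the Bernoulli parameter p varies, and the maximizer (the MLE) is found by a grid search. The data are made up; analytically the MLE for 7 heads in 10 flips is 0.7.

```python
import math

# Fixed observed data: 7 heads in 10 flips (illustrative).
flips = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]

def log_likelihood(p, data):
    # Log-likelihood of p given the data; the data are fixed, p varies.
    return sum(math.log(p if x else 1 - p) for x in data)

# Grid search over (0, 1) for the maximum likelihood estimate.
grid = [i / 1000 for i in range(1, 1000)]
p_hat = max(grid, key=lambda p: log_likelihood(p, flips))
print(p_hat)
```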
  • Mean squared error
    In statistics, the mean squared error (MSE) or mean squared deviation (MSD) of an estimator (or of a procedure for estimating an unobserved quantity) measures the average of the squared errors, that is, the average squared difference between the estimated values and the quantity being estimated.
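The definition translates directly into code: average the squared differences between estimates and the corresponding true values. The two lists below are illustrative.

```python
def mse(estimates, truths):
    # Mean of squared differences between estimated and true values.
    return sum((e - t) ** 2 for e, t in zip(estimates, truths)) / len(truths)

# Illustrative estimates and the values they target.
estimates = [2.1, 1.9, 3.2, 4.0]
truths = [2.0, 2.0, 3.0, 4.5]
print(mse(estimates, truths))
```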
  • Helmert–Wolf blocking
    The Helmert–Wolf blocking (HWB) is a least squares solution method for a sparse canonical block-angular (CBA) system of linear equations.
  • Richardson–Lucy deconvolution
    The Richardson–Lucy algorithm, also known as Lucy–Richardson deconvolution, is an iterative procedure for recovering a latent image that has been blurred by a known point spread function.
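A minimal 1-D sketch of the Richardson–Lucy iteration, assuming a known, normalized 3-tap point spread function and circular boundary conditions; the signal, kernel, and iteration count are illustrative. Each update multiplies the current estimate by a correction obtained by correlating the data/re-blur ratio with the flipped PSF, which keeps the estimate nonnegative.

```python
def conv(signal, kernel):
    # Circular convolution with an odd-length, centred kernel.
    n, k = len(signal), len(kernel) // 2
    return [sum(kernel[j + k] * signal[(i - j) % n] for j in range(-k, k + 1))
            for i in range(n)]

psf = [0.2, 0.6, 0.2]            # known blur kernel, sums to 1 (illustrative)
truth = [0, 0, 4, 0, 0, 0]       # latent signal, used only to generate data
blurred = conv(truth, psf)       # observed, blurred data

estimate = [1.0] * len(blurred)  # flat, positive initial guess
for _ in range(200):
    reblurred = conv(estimate, psf)
    # Ratio of observed data to the current re-blurred estimate.
    ratio = [d / max(r, 1e-12) for d, r in zip(blurred, reblurred)]
    # Correlate with the flipped PSF and apply the multiplicative update.
    correction = conv(ratio, psf[::-1])
    estimate = [e * c for e, c in zip(estimate, correction)]

print([round(e, 2) for e in estimate])  # concentrates back onto the spike
```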
  • Ordinary least squares
    In statistics, ordinary least squares (OLS) or linear least squares is a method for estimating the unknown parameters in a linear regression model. It minimizes the sum of the squared differences between the observed responses in the dataset and those predicted by a linear function of the explanatory variables; visually, this is the sum of the squared vertical distances between each data point and the corresponding point on the regression line, and the smaller these differences, the better the model fits the data.
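For one explanatory variable, the OLS minimizer has a closed form: the slope is Cov(x, y)/Var(x) and the intercept makes the fitted line pass through the point of means. A sketch with made-up data:

```python
# Illustrative data, roughly following y = 2x with small deviations.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(xs)
mx = sum(xs) / n
my = sum(ys) / n

# Closed-form OLS for y = b0 + b1 * x.
b1 = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
      / sum((x - mx) ** 2 for x in xs))
b0 = my - b1 * mx

# The quantity OLS minimizes: the residual sum of squares.
residual_ss = sum((y - (b0 + b1 * x)) ** 2 for x, y in zip(xs, ys))
print(b0, b1, residual_ss)
```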
  • Confidence region
    In statistics, a confidence region is a multi-dimensional generalization of a confidence interval.