Regression analysis

  • Logistic regression
    In statistics, logistic regression (also logit regression or the logit model) is a regression model in which the dependent variable (DV) is categorical.
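    A minimal sketch, assuming scikit-learn as the tool (the card names no library): fit a logistic regression to a toy binary outcome and read off predicted class probabilities.
    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    X = np.array([[0.5], [1.0], [1.5], [2.0], [2.5], [3.0]])  # one predictor
    y = np.array([0, 0, 0, 1, 1, 1])                          # binary DV

    model = LogisticRegression().fit(X, y)
    print(model.predict_proba([[1.75]]))  # [P(y=0), P(y=1)] at x = 1.75
    ```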
  • Receiver operating characteristic
    In statistics, a receiver operating characteristic (ROC), or ROC curve, is a graphical plot that illustrates the performance of a binary classifier system as its discrimination threshold is varied.
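    A hedged sketch, again assuming scikit-learn: roc_curve sweeps the decision threshold over the classifier's scores and returns the points that trace out the ROC curve.
    ```python
    from sklearn.metrics import auc, roc_curve

    y_true = [0, 0, 1, 1]
    y_scores = [0.1, 0.4, 0.35, 0.8]  # classifier scores, not hard labels

    fpr, tpr, thresholds = roc_curve(y_true, y_scores)
    print(fpr, tpr, auc(fpr, tpr))    # curve points and area under the curve
    ```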
  • Vector generalized linear model
    In statistics, the class of vector generalized linear models (VGLMs) was proposed to enlarge the scope of models catered for by generalized linear models (GLMs).
  • Regression validation
    In statistics, regression validation is the process of deciding whether the numerical results quantifying hypothesized relationships between variables, obtained from regression analysis, are acceptable as descriptions of the data.
  • Linear least squares (mathematics)
    In statistics and mathematics, linear least squares is an approach to fitting a mathematical or statistical model to data in cases where the idealized value provided by the model for any data point is expressed linearly in terms of the unknown parameters of the model.
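    Illustrative only: NumPy's lstsq solves the problem the card describes, minimizing $\lVert Ax - b \rVert^2$ for a model that is linear in its parameters.
    ```python
    import numpy as np

    A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])  # design matrix [1, t]
    b = np.array([0.9, 2.1, 2.9])                       # noisy observations

    x, residuals, rank, sv = np.linalg.lstsq(A, b, rcond=None)
    print(x)  # fitted intercept and slope
    ```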
  • Standardized coefficient
    In statistics, standardized coefficients or beta coefficients are the estimates resulting from a regression analysis that have been standardized so that the variances of dependent and independent variables are 1.
  • Structural equation modeling
    Structural equation modeling (SEM) refers to a diverse set of mathematical models, computer algorithms, and statistical methods that fit networks of constructs to data.
  • General linear model
    The general linear model is a statistical linear model that writes several multiple linear regression models, one per dependent variable, in a single compact matrix form.
  • Partition of sums of squares
    The partition of sums of squares is a concept that permeates much of inferential statistics and descriptive statistics.
  • Smoothing spline
    The smoothing spline is a method of fitting a smooth curve to a set of noisy observations using a spline function.
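    As a worked formula (the standard penalized least-squares criterion, not quoted from the card): the smoothing spline $\hat{f}$ minimizes

    $\sum_{i=1}^{n} (y_i - f(x_i))^2 + \lambda \int f''(x)^2 \, dx$,

    where $\lambda \ge 0$ trades fidelity to the data against roughness of the curve.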
  • Prediction interval
    In statistical inference, specifically predictive inference, a prediction interval is an estimate of an interval in which future observations will fall, with a certain probability, given what has already been observed.
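    As a worked formula for one standard case (i.i.d. normal data with unknown mean and variance, an assumption not in the card): a $100(1-\alpha)\%$ prediction interval for the next observation $X_{n+1}$ is

    $\bar{X}_n \pm t_{n-1,\,1-\alpha/2}\, s_n \sqrt{1 + \tfrac{1}{n}}$,

    which is wider than the corresponding confidence interval for the mean because it must also absorb the variability of a single new draw.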
  • Multinomial logistic regression
    In statistics, multinomial logistic regression is a classification method that generalizes logistic regression to multiclass problems, i.e., to dependent variables with more than two possible discrete outcomes.
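    In its usual softmax form (standard notation, not from the card), the model for $K$ classes is

    $P(y = k \mid x) = \dfrac{\exp(\beta_k^\top x)}{\sum_{j=1}^{K} \exp(\beta_j^\top x)}$,

    with one class's coefficient vector typically fixed at zero for identifiability; $K = 2$ recovers ordinary logistic regression.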
  • Generalized estimating equation
    In statistics, a generalized estimating equation (GEE) is used to estimate the parameters of a generalized linear model with a possible unknown correlation between outcomes.
  • Coefficient of determination
    In statistics, the coefficient of determination, denoted R2 or r2 and pronounced "R squared", is a number that indicates the proportion of the variance in the dependent variable that is predictable from the independent variable.
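    As a worked formula (the standard definition):

    $R^2 = 1 - \dfrac{SS_{\text{res}}}{SS_{\text{tot}}} = 1 - \dfrac{\sum_i (y_i - \hat{y}_i)^2}{\sum_i (y_i - \bar{y})^2}$,

    so a model whose residuals are small relative to the total variation around the mean $\bar{y}$ has $R^2$ close to 1.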
  • Likelihood function
    In statistics, a likelihood function (often simply the likelihood) is a function of the parameters of a statistical model given data.
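    A worked example (Bernoulli data, chosen here for illustration): given $n$ independent trials with $k$ successes, the likelihood of the success probability $p$ is

    $\mathcal{L}(p \mid \text{data}) = p^k (1 - p)^{n-k}$,

    read as a function of $p$ with the data held fixed; maximizing it gives $\hat{p} = k/n$.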
  • Mixed model
    A mixed model is a statistical model containing both fixed effects and random effects.
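    In matrix form (standard notation, not quoted from the card), a linear mixed model is

    $y = X\beta + Zu + \varepsilon$,

    where $\beta$ are the fixed effects, $u$ the random effects, and $\varepsilon$ the residual errors, with $u$ and $\varepsilon$ usually taken as independent zero-mean normals.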
  • Errors and residuals
    In statistics and optimization, errors and residuals are two closely related and easily confused measures of the deviation of an observed value of an element of a statistical sample from its "theoretical value".
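    Concretely, with $\mathrm{E}[y_i]$ the unobservable population value and $\hat{y}_i$ the fitted value (standard notation, not from the card):

    $e_i = y_i - \mathrm{E}[y_i]$ (error), $\quad r_i = y_i - \hat{y}_i$ (residual),

    so errors refer to the true model and residuals to its estimate.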
  • Optimal design
    In the design of experiments, optimal designs (or optimum designs) are a class of experimental designs that are optimal with respect to some statistical criterion.
  • Component analysis (statistics)
    Component analysis is the analysis of two or more independent variables which comprise a treatment modality.
  • Ordinary least squares
    In statistics, ordinary least squares (OLS) or linear least squares is a method for estimating the unknown parameters in a linear regression model. It minimizes the sum of the squared differences between the observed responses in the dataset and those predicted by a linear function of the explanatory variables; visually, this is the sum of the squared vertical distances between each data point and the corresponding point on the regression line, and the smaller these differences, the better the model fits the data.
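    In matrix notation (standard result; assumes the design matrix $X$ has full column rank), the OLS estimate has the closed form

    $\hat{\beta} = \arg\min_{\beta} \lVert y - X\beta \rVert_2^2 = (X^\top X)^{-1} X^\top y$.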
  • Linear model
    In statistics, the term linear model is used in different ways according to the context.
  • Multivariate adaptive regression splines
    In statistics, multivariate adaptive regression splines (MARS) is a form of regression analysis introduced by Jerome H. Friedman in 1991.
  • Partial least squares regression
    Partial least squares regression (PLS regression) is a statistical method that bears some relation to principal components regression; instead of finding hyperplanes of maximum variance between the response and independent variables, it finds a linear regression model by projecting the predicted variables and the observable variables to a new space.
  • Gauss–Markov theorem
    In statistics, the Gauss–Markov theorem, named after Carl Friedrich Gauss and Andrey Markov, states that in a linear regression model in which the errors have expectation zero and are uncorrelated and have equal variances, the best linear unbiased estimator (BLUE) of the coefficients is given by the ordinary least squares (OLS) estimator.
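    Stated compactly (standard notation): under the model

    $y = X\beta + \varepsilon, \quad \mathrm{E}[\varepsilon] = 0, \quad \operatorname{Var}(\varepsilon) = \sigma^2 I$,

    the OLS estimator $\hat{\beta} = (X^\top X)^{-1} X^\top y$ has the smallest variance among all linear unbiased estimators of $\beta$; normality of the errors is not required.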
  • Curve fitting
    Curve fitting is the process of constructing a curve, or mathematical function, that has the best fit to a series of data points, possibly subject to constraints.
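    A minimal sketch, assuming SciPy as the tool (the card names none): fit an exponential decay $y = a e^{-bt}$ to noisy points by nonlinear least squares.
    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def decay(t, a, b):
        return a * np.exp(-b * t)  # the curve family being fitted

    t = np.linspace(0.0, 4.0, 20)
    y = decay(t, 2.5, 1.3) + 0.05 * np.random.default_rng(0).normal(size=t.size)

    params, cov = curve_fit(decay, t, y, p0=(1.0, 1.0))  # initial guess for a, b
    print(params)  # estimates of a and b
    ```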