E-book: Mohammad Arashi, "Theory of Ridge Regression Estimation with Applications"

Theory of Ridge Regression Estimation with Applications

A guide to the systematic analytical results for ridge, LASSO, preliminary test, and Stein-type estimators, with applications.

Theory of Ridge Regression Estimation with Applications offers a comprehensive guide to the theory and methods of estimation. Ridge regression and LASSO are at the center of all penalty estimators in a range of standard models used in many applied statistical analyses. Written by noted experts in the field, the book contains a thorough introduction to penalty and shrinkage estimation and explores the role that ridge, LASSO, and logistic regression play in the computer-intensive areas of neural networks and big data analysis. Designed to be accessible, the book presents detailed coverage of the basic terminology related to various models, such as the location and simple linear models, as well as normal and rank theory-based ridge, LASSO, preliminary test, and Stein-type estimators. The authors also include problem sets to enhance learning. This book is a volume in the Wiley Series in Probability and Statistics and provides essential and invaluable reading for all statisticians.

This important resource:

  • Offers theoretical coverage and computer-intensive applications of the procedures presented
  • Contains solutions and alternate methods for prediction accuracy and selecting model procedures
  • Presents the first book to focus on ridge regression and unifies past research with current methodology
  • Uses R throughout the text and includes a companion website containing convenient data sets

Written for graduate students, practitioners, and researchers in various fields of science, Theory of Ridge Regression Estimation with Applications is an authoritative guide to the theory and methodology of statistical estimation.
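For orientation, the two penalty estimators the blurb centers on have standard textbook forms; the notation below is generic and is not taken from the book itself (k and λ denote the ridge and LASSO tuning parameters, X the design matrix, and y the response vector):

  \hat{\beta}_{\mathrm{ridge}}(k) = (X^{\top}X + k I_p)^{-1} X^{\top} y, \qquad k \ge 0

  \hat{\beta}_{\mathrm{LASSO}}(\lambda) = \arg\min_{\beta}\, \bigl\{ \lVert y - X\beta \rVert_2^2 + \lambda \lVert \beta \rVert_1 \bigr\}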

Publisher: John Wiley & Sons Limited

ISBN: 9781118644522

E-book

Buy for 12,175.39 RUB and download on Litres

Other books by the author:

Book: Statistical Inference for Models with Multivariate t-Distributed Errors
Description: This book summarizes the results of various models under normal theory with a brief review of the literature. Statistical Inference for Models with Multivariate t-Distributed Errors: Includes a wide…
Publisher: John Wiley & Sons Limited
Price: 8,802.98 RUB
Type: E-book
