https://stats.stackexchange.com/questions/658887/h…
regression - How to calculate the slope of a line of best fit that ...
This kind of regression seems to be much more difficult. I've read several sources, but the calculus for general quantile regression is going over my head. My question is this: How can I calculate the slope of the line of best fit that minimizes L1 error? Some constraints on the answer I am looking for:
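A minimal sketch of one way to approach this (not necessarily what the answers in that thread propose): treat the intercept and slope as free parameters and numerically minimize the sum of absolute residuals, e.g. with scipy. The data and starting values below are purely illustrative.

```python
import numpy as np
from scipy.optimize import minimize

# Toy data with heavy-tailed noise (illustrative only)
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 2.0 * x + 1.0 + rng.standard_t(df=2, size=100)

def l1_loss(params, x, y):
    """Sum of absolute residuals for a line y = a + b*x."""
    a, b = params
    return np.abs(y - (a + b * x)).sum()

# Start from the OLS solution and minimize the L1 objective numerically.
slope0, intercept0 = np.polyfit(x, y, 1)
res = minimize(l1_loss, x0=[intercept0, slope0], args=(x, y), method="Nelder-Mead")
intercept, slope = res.x
print(slope, intercept)
```

The same L1 fit is what median (0.5-quantile) regression produces, so a quantile regression routine would give an equivalent answer.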
https://stats.stackexchange.com/questions/74622/co…
regression - Converting standardized betas back to original variables ...
I have a problem where I need to standardize the variables, run the ridge regression to calculate the ridge estimates of the betas, and then convert these back to the original variables' scale.
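For reference, a small numpy sketch of the usual back-transformation, assuming both y and the predictors were standardized before fitting (the convention in the question may differ); the function name is just illustrative.

```python
import numpy as np

def destandardize(beta_std, X, y):
    """Map coefficients estimated on standardized y and X back to the
    original scale: b_j = beta_std_j * sd(y) / sd(x_j), with the intercept
    recovered from the variable means. Assumes both y and X were
    standardized before fitting (adjust if only X was scaled)."""
    x_sd = X.std(axis=0, ddof=1)
    y_sd = y.std(ddof=1)
    beta = beta_std * y_sd / x_sd
    intercept = y.mean() - X.mean(axis=0) @ beta
    return intercept, beta
```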
https://stats.stackexchange.com/questions/447455/m…
Multivariable vs multivariate regression - Cross Validated
Multivariable regression is any regression model where there is more than one explanatory variable. For this reason it is often simply known as "multiple regression". In the simple case of just one explanatory variable, this is sometimes called univariable regression. Unfortunately multivariable regression is often mistakenly called multivariate regression, or vice versa. Multivariate regression, by contrast, refers to models with more than one response (dependent) variable.
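A quick illustration of the distinction, using sklearn only for convenience (the data and coefficient values are made up):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))                 # three explanatory variables

# Multivariable (a.k.a. multiple) regression: several predictors, ONE response.
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(size=200)
multivariable = LinearRegression().fit(X, y)  # coef_ has shape (3,)

# Multivariate regression: several responses modelled jointly.
Y = np.column_stack([y, 2 * y + rng.normal(size=200)])
multivariate = LinearRegression().fit(X, Y)   # coef_ has shape (2, 3)
```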
https://stats.stackexchange.com/questions/633091/s…
Support Vector Regression vs. Linear Regression - Cross Validated
Linear regression can use the same kernels used in SVR, and SVR can also use the linear kernel. Given only the coefficients from such models, it would be impossible to distinguish between them in the general case (with SVR, you might get sparse coefficients depending on the penalization, due to $\epsilon$-insensitive loss)
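A hedged sketch of that comparison with sklearn, using a linear kernel so both models expose comparable coefficients; the data and hyperparameters are illustrative, not taken from the thread.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 5))
y = X @ np.array([3.0, 0.0, 0.0, -1.0, 0.0]) + rng.normal(scale=0.5, size=300)

ols = LinearRegression().fit(X, y)
svr = SVR(kernel="linear", C=1.0, epsilon=0.5).fit(X, y)

print(ols.coef_)           # least-squares coefficients
print(svr.coef_.ravel())   # linear-kernel SVR weights, learned under the
                           # epsilon-insensitive loss plus an L2 penalty;
                           # usually close to, but not identical to, OLS
```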
https://stats.stackexchange.com/questions/175/how-…
How should outliers be dealt with in linear regression analysis ...
What statistical tests or rules of thumb can be used as a basis for excluding outliers in linear regression analysis? Are there any special considerations for multilinear regression?
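As a rough illustration (these are heuristics, not that thread's definitive answer), studentized residuals and Cook's distance from statsmodels are common starting points for flagging candidate outliers:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
x = rng.normal(size=100)
y = 1.5 * x + rng.normal(size=100)
y[0] += 8                                    # plant one gross outlier

fit = sm.OLS(y, sm.add_constant(x)).fit()
infl = fit.get_influence()

# Common rules of thumb (heuristics, not formal tests):
stud = infl.resid_studentized_external       # flag |t_i| > ~2 or 3
cooks = infl.cooks_distance[0]               # flag D_i > 4/n (or > 1)
flagged = np.where((np.abs(stud) > 3) | (cooks > 4 / len(y)))[0]
print(flagged)
```

Flagged points are usually inspected rather than dropped automatically; robust regression is another option when exclusion feels arbitrary.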
https://stats.stackexchange.com/questions/12900/wh…
regression - When is R squared negative? - Cross Validated
Also, for OLS regression, R^2 is the squared correlation between the predicted and the observed values. Hence, it must be non-negative. For simple OLS regression with one predictor, this is equivalent to the squared correlation between the predictor and the dependent variable -- again, this must be non-negative.
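A small numeric check of both points, assuming an ordinary OLS fit with an intercept; the last lines show how R^2 defined as 1 - SS_res/SS_tot can still go negative for predictions worse than the mean (simulated data throughout):

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(size=200)
y = 2 * x + rng.normal(size=200)

# OLS with an intercept: R^2 equals corr(y, y_hat)^2, so it cannot be negative.
slope, intercept = np.polyfit(x, y, 1)
y_hat = intercept + slope * x
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(np.isclose(r2, np.corrcoef(y, y_hat)[0, 1] ** 2))   # True

# Predictions worse than the mean (here: the wrong sign) give R^2 < 0.
y_bad = -y_hat
print(1 - np.sum((y - y_bad) ** 2) / ss_tot)               # negative
```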
https://stats.stackexchange.com/questions/65287/di…
regression - Difference between forecast and prediction ... - Cross ...
I was wondering what the difference and relation are between forecasting and prediction, especially in time series and regression. For example, am I correct that, in time series, forecasting seems to mea...
https://stats.stackexchange.com/questions/76226/in…
regression - Interpreting the residuals vs. fitted values plot for ...
Consider the following figure from Faraway's Linear Models with R (2005, p. 59). The first plot seems to indicate that the residuals and the fitted values are uncorrelated, as they should be in a correctly specified linear model.
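For context, a minimal way to produce such a diagnostic plot yourself with statsmodels and matplotlib (simulated data, not Faraway's):

```python
import numpy as np
import matplotlib.pyplot as plt
import statsmodels.api as sm

rng = np.random.default_rng(5)
x = rng.uniform(0, 10, size=200)
y = 1 + 0.5 * x + rng.normal(scale=1.0, size=200)

fit = sm.OLS(y, sm.add_constant(x)).fit()

# Residuals vs. fitted values: a structureless horizontal band suggests the
# linearity and constant-variance assumptions are plausible; curvature or a
# funnel shape suggests otherwise.
plt.scatter(fit.fittedvalues, fit.resid, s=10)
plt.axhline(0, color="grey", linewidth=1)
plt.xlabel("Fitted values")
plt.ylabel("Residuals")
plt.show()
```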
https://stats.stackexchange.com/questions/22718/wh…
correlation - What is the difference between linear regression on y ...
The Pearson correlation coefficient of x and y is the same, whether you compute pearson(x, y) or pearson(y, x). This suggests that doing a linear regression of y given x or x given y should be the same.
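A short numeric illustration of why the two regressions nevertheless differ: the slope of y on x is r*s_y/s_x while the slope of x on y is r*s_x/s_y, and these are not reciprocals unless |r| = 1 (data below are simulated):

```python
import numpy as np

rng = np.random.default_rng(6)
x = rng.normal(size=500)
y = 0.6 * x + rng.normal(size=500)

r = np.corrcoef(x, y)[0, 1]
slope_y_on_x = r * y.std(ddof=1) / x.std(ddof=1)   # regression of y on x
slope_x_on_y = r * x.std(ddof=1) / y.std(ddof=1)   # regression of x on y

# The correlation is symmetric, but the product of the two slopes is r**2,
# not 1, so the fitted lines differ unless the correlation is perfect.
print(slope_y_on_x, slope_x_on_y, slope_y_on_x * slope_x_on_y, r ** 2)
```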
https://stats.stackexchange.com/questions/29781/wh…
When conducting multiple regression, when should you center your ...
In some literature, I have read that a regression with multiple explanatory variables, if in different units, needed to be standardized. (Standardizing consists in subtracting the mean and dividing by the standard deviation.)
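A sketch of what standardizing looks like in practice, assuming sklearn conventions; the variable names, units, and coefficient values are invented for illustration:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(7)
X = np.column_stack([
    rng.normal(50, 10, 300),        # e.g. age in years
    rng.normal(70000, 15000, 300),  # e.g. income in dollars
])
y = 0.3 * X[:, 0] + 0.0001 * X[:, 1] + rng.normal(size=300)

# Standardizing (center, then divide by the SD) puts predictors on one scale,
# so coefficients are comparable; predictions are unchanged for plain OLS.
model = make_pipeline(StandardScaler(), LinearRegression()).fit(X, y)
print(model.named_steps["linearregression"].coef_)
```

For ordinary least squares, centering or standardizing mainly changes interpretation (and helps with interaction terms or collinear polynomial terms); for penalized methods such as ridge or lasso, the scale of the predictors directly affects the estimates.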