RegressionResultsIfc
A useful resource for regression diagnostics is the Penn State STAT 501 course: https://online.stat.psu.edu/stat501/lesson/5/5.3
Inheritors
Properties
The adjusted R-squared: 1 - (1 - R-squared)*(n - 1)/(n - p), where n is the number of observations and p is the number of parameters estimated (including the intercept).
The Cook's distance measures for diagnostic plotting
An estimate of the variance of the (residual) errors. This is MSE = SSE/(n-p)
The overall F statistic for the regression. This is MSR/MSE
True if the regression model includes an intercept term.
The diagonal entries from the hat matrix
The hat matrix is defined in terms of the design matrix X by $X(X^{T}X)^{-1}X^{T}$
A synonym for the estimated error variance (MSE)
This is MSR = SSR/(p-1)
The total number of observations n, i.e., the number of response values (y_1, y_2, ..., y_n).
Number of parameters in the model (including the intercept, if estimated)
An array containing the estimated parameters of the regression: b_0, b_1, ..., b_k, where b_0 is the intercept term and k is the number of predictor coefficients, so p = k + 1 is the total number of parameters (including the intercept term).
The standard error estimate for each regression coefficient.
Estimates for the variance of the regression parameters. The variance-covariance matrix of the regression parameters
The test statistics for testing if parameter j is significant. This is parameters[j] divided by parametersStdError[j].
The predicted values yHat_i for each observation.
The names of the predictor variables
An estimate of the variance of Y, i.e., the sample variance of the dependent variable.
The degrees of freedom for the regression (numParameters - 1)
The standard deviation of the errors in the regression model, sometimes called the standard error of the estimate; this is the square root of MSE. It measures the average distance that the observed values fall from the regression line, in the units of the response variable. Smaller values are better because they indicate that the observations fall closer to the fitted line.
The regression sum of squares (SSR). Since SST = SSR + SSE, SSR = SST - SSE.
The array of residual errors, e_i = (y_i - yHat_i)
This is SSE (sum of squared residual error).
The response values, the regressand values, the Y's
The name of the response variable
The array of standardized residuals
The studentized residuals for diagnostic plotting
The SST total sum of squares. Sum of squared deviations of Y from its mean.
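Many of the quantities above can be illustrated numerically. The following pure-Python sketch computes them for simple linear regression; the data values, and the closed-form leverage formula h_ii = 1/n + (x_i - x̄)²/Sxx (valid only for a single predictor), are assumptions for illustration and are not taken from the library.

```python
import math

# Hypothetical example data (illustration only): y is roughly 2*x
x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1, 11.9]
n = len(x)
p = 2  # parameters estimated: intercept b_0 and slope b_1

xbar = sum(x) / n
ybar = sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))

b1 = sxy / sxx           # slope estimate
b0 = ybar - b1 * xbar    # intercept estimate

y_hat = [b0 + b1 * xi for xi in x]         # predicted values yHat_i
e = [yi - yh for yi, yh in zip(y, y_hat)]  # residuals e_i = y_i - yHat_i

sst = sum((yi - ybar) ** 2 for yi in y)    # total sum of squares
sse = sum(ei ** 2 for ei in e)             # error sum of squares
ssr = sst - sse                            # regression sum of squares

mse = sse / (n - p)      # estimated error variance, MSE = SSE/(n-p)
msr = ssr / (p - 1)      # mean square of the regression, MSR = SSR/(p-1)
f_stat = msr / mse       # overall F statistic
r2 = ssr / sst
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p)

# Diagonal of the hat matrix (leverages); closed form for one predictor
h = [1 / n + (xi - xbar) ** 2 / sxx for xi in x]

# Standardized residuals and Cook's distances
std_res = [ei / math.sqrt(mse * (1 - hi)) for ei, hi in zip(e, h)]
cook = [ei ** 2 * hi / (p * mse * (1 - hi) ** 2) for ei, hi in zip(e, h)]
```

Two useful sanity checks on these definitions: the leverages sum to p (the trace of the hat matrix), and the residuals sum to zero whenever an intercept is estimated.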
Functions
ANOVA results for regression as a string
The regression results in the form of an HTML string
This assumes that the errors are normally distributed with mean zero and constant variance. The level must be a valid probability; the default is 0.95.
A data frame holding the parameter results for the regression.
The data associated with the named predictor. The name must exist as a predictor name.
All the residual data in a data frame (responseName, "Predicted", "Residuals", "StandardizedResiduals", "StudentizedResiduals", "h_ii", "CookDistances")
A plot of the residuals based on observation order.
A scatter plot of the residuals (on y-axis) and predicted (on x-axis).
Shows the diagnostic plots within a browser window.
A fit distribution plot of the standardized residuals for checking normality.
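Under the normality assumption noted above, parameter confidence intervals take the usual t-based form b_j ± t(1-α/2, n-p) · se(b_j). A minimal sketch for the slope in simple linear regression follows; the data and the hard-coded t quantile t(0.975, 4) ≈ 2.776 are illustrative assumptions, not library output.

```python
import math

# Hypothetical data for illustration only
x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1, 11.9]
n, p = len(x), 2

xbar = sum(x) / n
ybar = sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
b0 = ybar - b1 * xbar

sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
mse = sse / (n - p)                 # MSE = SSE/(n-p)
se_b1 = math.sqrt(mse / sxx)       # standard error of the slope

t_crit = 2.776  # t quantile for 95% confidence with n - p = 4 df (hard-coded assumption)
ci = (b1 - t_crit * se_b1, b1 + t_crit * se_b1)  # 95% CI for the slope
```

A smaller standard error, or more degrees of freedom, tightens the interval around the point estimate.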