lm.summaries {base}    R Documentation

Accessing Linear Model Fits

Description

All these functions are methods for objects of class "lm" or "summary.lm".

Usage

summary(object, correlation = FALSE)
coefficients(object, ...) ; coef(object, ...)
df.residual(object, ...)
family(object, ...)
formula(x, ...)
fitted.values(object, ...)
residuals(object,
          type=c("working","response", "deviance","pearson", "partial"), ...)
weights(object, ...)

print(summary.lm.obj, digits = max(3, getOption("digits") - 3),
      symbolic.cor = p > 4,
      signif.stars= getOption("show.signif.stars"), ...)

Arguments

object, x

an object of class "lm", usually the result of a call to lm.

Details

print.summary.lm tries to be smart about formatting the coefficients, standard errors, etc. and additionally gives “significance stars” if signif.stars is TRUE.

The generic accessor functions coefficients, effects, fitted.values and residuals can be used to extract various useful features of the value returned by lm.
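For example, a minimal sketch of these accessors on a fit (the toy data x, y and the object fm below are illustrative only, not part of this page):

## Sketch: accessor generics on a toy fit (illustrative data only)
x  <- 1:10
y  <- 2 + 3 * x + rnorm(10)
fm <- lm(y ~ x)
coef(fm)          # estimated coefficients
fitted(fm)        # fitted values
residuals(fm)     # residuals
df.residual(fm)   # residual degrees of freedom: 10 - 2 = 8
formula(fm)       # the model formula, y ~ x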

Value

The function summary.lm computes and returns a list of summary statistics of the fitted linear model given in lm.obj, using the components (list elements) "call" and "terms" from its argument, plus

residuals

the weighted residuals, the usual residuals rescaled by the square root of the weights specified in the call to lm.
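A sketch checking exactly this relationship on an artificial weighted fit (x, y and the weights w are made up for illustration):

## Sketch: summary()'s residuals are the residuals times sqrt(weights)
x  <- 1:10
y  <- 2 + 3 * x + rnorm(10)
w  <- runif(10, 0.5, 2)            # arbitrary positive weights
fw <- lm(y ~ x, weights = w)
all.equal(unname(summary(fw)$residuals),
          unname(residuals(fw) * sqrt(weights(fw))))   # expected TRUE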

coefficients

a p \times 4 matrix with columns for the estimated coefficient, its standard error, t-statistic and corresponding (two-sided) p-value.
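This matrix can be extracted with coef() on the summary object (see the Examples). A sketch on toy data; the column labels shown are the conventional ones, so verify them in your R version:

## Sketch: the coefficient table of a toy fit
x  <- 1:10
y  <- 2 + 3 * x + rnorm(10)
cf <- coef(summary(lm(y ~ x)))
colnames(cf)            # "Estimate" "Std. Error" "t value" "Pr(>|t|)"
cf["x", "Estimate"]     # the estimated slope
cf["x", "Pr(>|t|)"]     # its two-sided p-value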

sigma

the square root of the estimated variance of the random error

\hat\sigma^2 = \frac{1}{n-p}\sum_i{R_i^2},

where R_i is the i-th residual, residuals[i].
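A sketch recomputing sigma from the residuals and the residual degrees of freedom (toy data, not from this page):

## Sketch: sigma = sqrt(RSS / (n - p))
x  <- 1:10
y  <- 2 + 3 * x + rnorm(10)
fm <- lm(y ~ x)
s  <- summary(fm)
all.equal(s$sigma,
          sqrt(sum(s$residuals^2) / df.residual(fm)))   # expected TRUE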

df

degrees of freedom, a 3-vector (p, n-p, p*).

fstatistic

a 3-vector with the value of the F-statistic with its numerator and denominator degrees of freedom.
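The overall p-value shown by print.summary.lm is not stored in the returned list, but it can be recovered from this 3-vector; a sketch on toy data:

## Sketch: p-value of the overall F test
x <- 1:10
y <- 2 + 3 * x + rnorm(10)
f <- summary(lm(y ~ x))$fstatistic
f                           # F value, numerator df, denominator df
1 - pf(f[1], f[2], f[3])    # the corresponding p-value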

r.squared

R^2, the “fraction of variance explained by the model”,

R^2 = 1 - \frac{\sum_i{R_i^2}}{\sum_i (y_i - y^*)^2},

where y^* is the mean of y_i if there is an intercept and zero otherwise.

adj.r.squared

the above R^2 statistic “adjusted”, penalizing for higher p.
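A sketch recomputing both quantities for a toy model with an intercept (n = 10 observations, p = 2 coefficients; data made up for illustration):

## Sketch: R^2 and adjusted R^2 from their definitions
x  <- 1:10
y  <- 2 + 3 * x + rnorm(10)
s  <- summary(lm(y ~ x))
r2 <- 1 - sum(s$residuals^2) / sum((y - mean(y))^2)
all.equal(s$r.squared, r2)                                      # expected TRUE
all.equal(s$adj.r.squared, 1 - (1 - r2) * (10 - 1) / (10 - 2))  # expected TRUE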

cov.unscaled

a p \times p matrix of (unscaled) covariances of the \hat\beta_j, j=1, \dots, p.

correlation

the correlation matrix corresponding to the above cov.unscaled, if correlation = TRUE is specified.
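A sketch relating cov.unscaled to the reported standard errors and to the correlation component (toy data; the "Std. Error" column label is assumed):

## Sketch: scaling cov.unscaled by sigma^2 gives the coefficient covariances
x  <- 1:10
y  <- 2 + 3 * x + rnorm(10)
s  <- summary(lm(y ~ x), correlation = TRUE)
V  <- s$sigma^2 * s$cov.unscaled              # estimated cov. of the coefficients
all.equal(sqrt(diag(V)), s$coefficients[, "Std. Error"])   # expected TRUE
se <- sqrt(diag(s$cov.unscaled))
all.equal(s$correlation, s$cov.unscaled / outer(se, se))   # expected TRUE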

See Also

The model fitting function lm, anova.lm.

coefficients, deviance, effects, fitted.values, glm for generalized linear models, lm.influence for regression diagnostics, weighted.residuals, residuals, residuals.glm, summary.

Examples


##-- Continuing the lm(.) example:
coef(lm.D90)  # the bare coefficients
sld90 <- summary(lm.D90 <- lm(weight ~ group - 1))  # omitting intercept
sld90
coef(sld90)  # much more

## The 2 basic regression diagnostic plots [plot.lm(.) is preferred]
plot(fitted(lm.D90), resid(lm.D90))  # Tukey-Anscombe plot: residuals vs. fitted
abline(h = 0, lty = 2, col = "gray")

qqnorm(residuals(lm.D90))

[Package base version 1.3.1]