Chapter 6: Regression

Simple linear regression is mathematically related to correlation, but it is conceptually different. We use simple regression when we want to predict a continuous dependent (or outcome) variable from a single continuous independent (or predictor) variable. We will also introduce multiple regression, which we use when we want to predict a continuous outcome from two or more continuous or categorical predictor variables.

One of the implications of using the regression technique is that we construct a linear statistical model that we can use to make predictions about the outcome for future, as yet unseen data.

A regression analysis fits a linear model, essentially a straight line, to the data. The model can be described through two components (a short code sketch follows the list below):

  1. The intercept: the value at which the regression line crosses the y-axis, i.e. the predicted value of the outcome when the predictor is zero, and
  2. The slope (or beta value): the slope coefficient, i.e. the change in the dependent variable for each one-unit increase in the predictor variable.
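
For illustration, here is a minimal sketch of fitting a simple regression line and reading off these two components. The data (hours studied predicting exam score), the variable names, and the use of NumPy's polyfit are assumptions made for this example, not part of the text:

```python
import numpy as np

# Hypothetical example data: hours studied (predictor) and exam score (outcome)
hours = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
score = np.array([52, 55, 61, 64, 70, 72, 78, 83], dtype=float)

# Fit a straight line score = b0 + b1 * hours by least squares;
# np.polyfit returns coefficients from highest to lowest degree
b1, b0 = np.polyfit(hours, score, deg=1)
print(f"intercept (b0): {b0:.2f}")  # predicted score when hours = 0
print(f"slope (b1):     {b1:.2f}")  # change in score per extra hour studied

# Use the fitted model to predict the outcome for new, unseen predictor values
new_hours = np.array([9.0, 10.0])
predicted = b0 + b1 * new_hours
print(predicted)
```

Dedicated libraries such as statsmodels or scikit-learn report the same intercept and slope together with standard errors and fit statistics; polyfit is used here only to keep the sketch self-contained.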

We can formalize the general idea of regression as:

$$Y_i = b_0 + b_1 X_i + \varepsilon_i$$

where $b_0$ is the intercept, $b_1$ is the slope coefficient, and $\varepsilon_i$ is the error (residual) for observation $i$.
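
As a side note, the intercept and slope are typically estimated by ordinary least squares, i.e. by choosing the values that minimise the sum of squared errors. A standard form of the resulting estimators, included here only as a sketch and not stated in the text above, is:

$$
\hat{b}_1 = \frac{\sum_{i=1}^{n} (X_i - \bar{X})(Y_i - \bar{Y})}{\sum_{i=1}^{n} (X_i - \bar{X})^2},
\qquad
\hat{b}_0 = \bar{Y} - \hat{b}_1\,\bar{X}
$$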

Assumptions for regression
