MATH 261A: Regression Theory and Methods
San Jose State University, Fall 2020
Course information [Syllabus]
Math 261A is a graduate course in regression theory and methods. It provides an overview of the most commonly used regression techniques, including simple and multiple linear regression, the use of categorical variables in regression, model diagnostics, variable transformations, and nonlinear regression. Other topics include variable selection, regression trees, logistic regression, and statistical inference for regression models. The theoretical aspects of these models are discussed, and practical applications include data analysis with the statistical software package R.
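To give a flavor of what that looks like in practice, here is a minimal, hypothetical R sketch of the kinds of model-fitting calls these topics correspond to. The data frame `df` and its variables are simulated purely for illustration; they are not course data.

```r
# Simulated toy data, for illustration only
set.seed(1)
df <- data.frame(
  x1    = rnorm(100),
  x2    = rnorm(100),
  group = factor(sample(c("A", "B", "C"), 100, replace = TRUE))
)
df$y       <- 1 + 2 * df$x1 - 0.5 * df$x2 + rnorm(100)
df$success <- rbinom(100, 1, plogis(df$x1))

fit_slr <- lm(y ~ x1, data = df)                # simple linear regression
fit_mlr <- lm(y ~ x1 + x2 + group, data = df)   # multiple regression with a categorical predictor
fit_log <- glm(success ~ x1, family = binomial, data = df)  # logistic regression

summary(fit_mlr)   # coefficient estimates, t-tests, R-squared
plot(fit_mlr)      # residual and leverage diagnostic plots
```

Each of these calls maps onto one of the topic blocks in the schedule below (simple and multiple linear regression, indicator variables, model diagnostics, and generalized linear models).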
Prerequisites: Math 39 and Math 161A (each with a grade of B or better), Math 163* and Math 167R* (*may be taken concurrently)
Textbook: Montgomery, Peck & Vining, Introduction to Linear Regression Analysis. 5th edition, 2012. ISBN: 978-0-470-54281-1. This text is available as an e-book (for free) through the SJSU library.
Technology requirements:
- Zoom: Class meetings will be on Zoom [Click here to register].
- Canvas: Zoom recordings, assignments and grades will be posted in Canvas (accessible via http://one.sjsu.edu/).
- Proctorio: Tests will be delivered via Proctorio.
- Piazza: This course will use Piazza as the bulletin board. Please post all course-related questions there.
- Computing:
  - A scientific calculator is required for homework assignments and exams. Calculators that can compute Normal, t, F, and chi-squared distribution probabilities and quantiles (e.g., TI-84) are preferred.
  - Access to a computer that runs R (free statistical software available for Windows, Mac, and Linux) is required for homework assignments.
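For reference, the distribution probabilities and quantiles a calculator provides can also be computed with R's built-in distribution functions; the numeric arguments below are arbitrary examples.

```r
pnorm(1.96)                  # P(Z <= 1.96) for a standard Normal
qnorm(0.975)                 # upper 2.5% Normal quantile
qt(0.975, df = 20)           # t quantile with 20 degrees of freedom
pf(3.1, df1 = 2, df2 = 25)   # F-distribution probability
qchisq(0.95, df = 4)         # chi-squared quantile with 4 degrees of freedom
```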
Lecture slides
The slides are updated continuously; download a fresh copy right before each class (remember to refresh your browser).
| # | Topics (and lecture material) | Textbook sections | Assigned problems |
|---|---|---|---|
| 0 | Introduction | Chapter 1 | None (reading only) |
| 1 | Simple linear regression [slides] | Chapter 2 (2.1-2.6) | HW1: 2.6, 2.12, 2.25, 2.27, 2.32 |
| 2 | Multiple linear regression [slides] | Chapter 3 (3.1-3.5, 3.8-3.10) | HW2 (see Canvas) |
| 3 | Model adequacy checking [slides] | Chapter 4 (4.1-4.3, 4.5) | HW3 (see Canvas) |
| 4 | Transformations and weighting [slides] | Chapter 5 (5.1-5.5) | HW4 (see Canvas) |
| 5 | Diagnostics for leverage and influence [slides] | Chapter 6 (6.1-6.7) | HW5 (see Canvas) |
| 6 | Polynomial regression [slides] | Chapter 7 (7.1-7.4) | HW6 (see Canvas) |
| 7 | Indicator variables [slides] | Chapter 8 (8.1-8.2) | HW7 (see Canvas) |
| 8 | Multicollinearity (and model validation) [slides] | Section 3.10; Chapter 9 (9.1-9.5); Chapter 11 (11.1-11.2) | HW8 (to be posted) |
| 9 | Variable selection and model building [slides] | Chapter 10 (10.1-10.3) | |
| 10 | Generalized linear models [slides] | Chapter 13 (13.1-13.3) | |