**Course Announcement**

**Time:** Monday, Wednesday, Friday, 11:00 a.m. - 11:50 a.m.

**Location:** Taft Hall 216

**Instructor:** Jie Yang

**Office:** SEO 513

**Phone:** (312) 413-3748

**E-Mail:** jyang06 AT uic DOT edu

**Office Hours:** Monday, Wednesday, Friday, 12:00 p.m. - 1:00 p.m. (or by appointment)

**Textbook:** Geof H. Givens and Jennifer A. Hoeting, *Computational Statistics*, John Wiley & Sons, Inc., 2nd edition, 2013.

Preview table of contents and preface.

**Course Contents:** EM Optimization Methods, Simulation and Monte Carlo Integration, Markov Chain Monte Carlo, Bootstrapping, Nonparametric Density Estimation, Bivariate Smoothing

**Prerequisite:** STAT 411 or consent of the instructor.

**Homework:** Due every Friday before class. Half of the grade is for completeness; half is for the correctness of one selected problem.

**Exam:** Wednesday, April 17th, 11:00 a.m. - 11:50 a.m.

**Project:** Students are required to work in groups on course projects and submit their final reports by Friday, May 3rd, 11:00 a.m. Projects may come from the optional problems assigned by the instructor or may be proposed by the students themselves, subject to the instructor's approval.

**Grading:** Homework 20%, Exam 40%, Project 40%

**Grading Scale:** 90% A, 80% B, 70% C, 60% D

**Format of Exam:** The exam is based mainly on the homework and the examples discussed in class. The last class session before the exam is a review session; please prepare any questions you may have. *No makeup exam will be given without a valid excuse.*

**Course Syllabus**

| Week | Sections | Brief Description |
| --- | --- | --- |
| 01/14 - 01/18 | Introduction; 6.1; 6.2 | Introduction to the Monte Carlo method; Exact simulation |
| 01/21 - 01/25 | Holiday; 6.2; 6.3 | Exact simulation; Approximate simulation |
| 01/28 - 02/01 | 6.3; 6.4; 6.4 | Approximate simulation; Variance reduction techniques |
| 02/04 - 02/08 | 1.7; 7.1; 7.1 | Markov chains; Metropolis-Hastings algorithm |
| 02/11 - 02/15 | 7.2; 7.2; 7.3 | Gibbs sampling; Implementation |
| 02/18 - 02/22 | 9.1; 9.2; 9.2 | The bootstrap principle; Basic methods |
| 02/25 - 03/01 | 9.2; 9.3; 9.3 | Basic methods; Bootstrap inference |
| 03/04 - 03/08 | 9.8; 4.1; 4.1 | Permutation tests; Missing data, marginalization, and notation |
| 03/11 - 03/15 | 4.2; 4.2; 4.2 | The EM algorithm |
| 03/18 - 03/22 | 4.3; 4.3; 10.1 | EM variants; Measures of performance |
| 04/01 - 04/05 | 10.2; 10.2; 10.3 | Kernel density estimation; Nonkernel methods |
| 04/08 - 04/12 | Review; Exam; 11.1 | Predictor-response data |
| 04/15 - 04/19 | 11.1; 11.2; 11.2 | Predictor-response data; Linear smoothers |
| 04/22 - 04/26 | 11.3; 11.3; 11.4 | Comparison of linear smoothers; Nonlinear smoothers |
| 04/29 - 05/03 | 11.4; 11.5; 11.5 | Nonlinear smoothers; Confidence bands |

**Homework**

- Homework #1, due 01/25/2019

- Homework #2, due 02/08/2019

- Homework #3, data for Problem 6.4, due 02/22/2019

- Homework #4, due 03/01/2019

- Homework #5, due 03/13/2019

- Homework #6, due 03/22/2019

- Homework #7, data for Problem 10.1, due 04/12/2019

**Using R**

- Download **R** for free -- the most popular software used by statisticians

- Download **R Studio** for free -- a convenient R programming environment

- Learn R in 15 Minutes

- Use R to Compute Numerical Integrals

- Downloadable Books on R:
  - *An Introduction to R*, by William N. Venables, David M. Smith and the R Development Core Team
  - *Using R for Data Analysis and Graphics - Introduction, Code and Commentary*, by John H. Maindonald

**More R Books in Different Languages ...**
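As a small taste of what these resources cover, the sketch below estimates a definite integral by simple Monte Carlo in base R. The integrand, seed, and sample size are illustrative choices, not course material:

```r
# Illustrative sketch (not course code): estimate the integral of
# exp(-x^2) over [0, 1] by simple Monte Carlo.
set.seed(1)              # for reproducibility
n   <- 1e5               # number of uniform draws (arbitrary choice)
u   <- runif(n)          # U(0, 1) samples
g   <- exp(-u^2)         # integrand evaluated at the samples
est <- mean(g)           # Monte Carlo estimate of the integral
se  <- sd(g) / sqrt(n)   # standard error of the estimate
```

The estimate can be checked against R's deterministic quadrature, `integrate(function(x) exp(-x^2), 0, 1)`.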

**R Code for the Course**
- R code for §6.1 & §6.2 (R tips, simple Monte Carlo estimator)

- R code for §6.2 (Inverse cumulative distribution, rejection sampling)

- R code for §6.2.3 (Squeezed rejection sampling)

- R code for §6.3.1 (Sampling importance resampling)

- R code for §6.4.1 (Importance sampling)

- R code for §6.4.2, §6.4.3 (Antithetic sampling, control variates)

- R code for §7.1 (Metropolis-Hastings algorithm); data for Examples 7.2, 7.3

- R code for §7.2 (Gibbs sampler)

- R code for §9.1, §9.2, §9.3 (Bootstrap); data for Examples 9.3 ~ 9.8

- R code for §4.2 (EM algorithm)

- R code for §4.2.3 (SEM algorithm)

- R code for §10.2 (Kernel density estimation); data for Examples 10.1, 10.5; data for Examples 10.2 ~ 10.4

- R code for §10.2.2 (Choice of kernel)

- R code for §10.3 (Logspline)

- R code for §11.2.1 (Constant-span running mean); data for Examples 11.1 ~ 11.7

- R code for §11.2.2 ~ §11.2.5 (Running polynomials, local regression smoothing, kernel smoother, spline smoothing)

- R code for §11.4 (Loess smoother, supersmoother); data for Example 11.8

- R code for §11.5 (Confidence bands); data for Examples 11.9 ~ 11.11

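To give a flavor of the bootstrap material listed above, here is a minimal base-R sketch (my own illustration, not the posted course code) of the nonparametric bootstrap standard error of a sample median; the data and resample count are arbitrary assumptions:

```r
# Illustrative sketch (not the course's posted code): nonparametric
# bootstrap standard error of the sample median, as in Chapter 9.
set.seed(1)
x <- rexp(50, rate = 2)  # toy data, purely for illustration
B <- 2000                # number of bootstrap resamples (arbitrary)
meds <- replicate(B, median(sample(x, replace = TRUE)))
se_boot <- sd(meds)      # bootstrap estimate of SE(median)
```

Each resample draws 50 observations with replacement from the original data, and `se_boot` is the standard deviation of the B resampled medians.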

**Relevant Course Materials**

- Textbook Website -- Datasets, Errata, R Code

- James E. Gentle, *Computational Statistics*, Springer, 2009 -- a good reference book

