
Statistical Inference for Data Scientists


Course Description

Mathematical and computational approaches to estimation and inference from frequentist and Bayesian perspectives. Sampling distributions; maximum likelihood estimation; computational maximization of likelihoods, including grid search and Newton-Raphson methods; likelihood ratio tests. Simulations of power and error rates. Introduction to Bayesian inference; prior and posterior distributions; model building; sampling from the posterior distribution; MCMC algorithms.
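
As a concrete illustration of the computational maximization of likelihoods named above, the following is a minimal sketch, not part of the official course materials, of a Newton-Raphson search for the maximum likelihood estimate of a Poisson rate. The simulated data, starting value, and tolerance are illustrative assumptions.

```python
# A minimal sketch of Newton-Raphson maximum likelihood estimation for the
# rate of a Poisson model; data, starting value, and tolerance are
# illustrative assumptions, not course-provided code.
import numpy as np

rng = np.random.default_rng(0)
x = rng.poisson(lam=3.0, size=200)       # simulated iid Poisson sample
n = len(x)

def score(lam):                          # first derivative of the log-likelihood
    return x.sum() / lam - n

def hessian(lam):                        # second derivative of the log-likelihood
    return -x.sum() / lam**2

lam = 1.0                                # arbitrary starting value
for _ in range(50):
    step = score(lam) / hessian(lam)     # Newton-Raphson update
    lam -= step
    if abs(step) < 1e-10:                # stop once the update is negligible
        break

print(lam, x.mean())                     # Poisson MLE has closed form: the sample mean
```

Because the Poisson MLE has the closed form of the sample mean, the final line serves as a check that the iteration converged to the correct value.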

Additional Requirements for Graduate Students:
Graduate students will be required to complete additional and/or alternative, more challenging problems on homework and exams.


Athena Title

Stat Inference Data Scientists


Undergraduate Prerequisite

CSCI 3360 and STAT 4510/6510


Graduate Prerequisite

STAT 4510/6510


Semester Course Offered

Offered spring


Grading System

A - F (Traditional)


Course Objectives

This course is targeted at students in data science and teaches the principles of statistical estimation and inference in the two major inferential paradigms: maximum likelihood and Bayesian inference. After taking this course, students will be able to:
- mathematically derive sampling distributions of sample statistics
- simulate the sampling distribution of sample statistics
- define likelihood
- calculate likelihoods for iid samples
- maximize likelihoods mathematically
- derive maximum likelihood estimators for parameters of important probability distributions
- use computational methods for maximizing likelihoods, including grid search and Newton-Raphson methods
- use likelihood ratios as a flexible tool for hypothesis testing
- develop simulations of power for hypothesis testing
- develop simulations of error rates for hypothesis testing
- contrast the philosophy of Bayesian inference with that of frequentist inference
- compute posterior distributions in Bayesian inference
- choose an appropriate prior distribution
- describe MCMC algorithms, including Metropolis-Hastings and Gibbs samplers
- implement MCMC algorithms, including Metropolis-Hastings and Gibbs samplers (see the sketch after this list)
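
To illustrate the last objective, the following is a minimal sketch, not part of the official course materials, of a random-walk Metropolis-Hastings sampler for the posterior of a binomial proportion under a flat Beta(1, 1) prior. The data, proposal scale, chain length, and burn-in are illustrative assumptions.

```python
# A minimal sketch of a random-walk Metropolis-Hastings sampler for the
# posterior of a binomial proportion with a flat Beta(1, 1) prior; the data
# and tuning constants are illustrative assumptions, not course-provided code.
import numpy as np

rng = np.random.default_rng(1)
n, k = 50, 18                              # observed: 18 successes in 50 trials

def log_posterior(p):
    if p <= 0.0 or p >= 1.0:               # density is zero outside (0, 1)
        return -np.inf
    return k * np.log(p) + (n - k) * np.log(1.0 - p)   # flat prior adds a constant

samples = []
p = 0.5                                    # starting value
for _ in range(10_000):
    proposal = p + rng.normal(scale=0.1)   # symmetric random-walk proposal
    log_ratio = log_posterior(proposal) - log_posterior(p)
    if np.log(rng.uniform()) < log_ratio:  # accept with probability min(1, ratio)
        p = proposal
    samples.append(p)

posterior = np.array(samples[2_000:])      # discard burn-in draws
print(posterior.mean())                    # should be near (k + 1) / (n + 2)
```

Because this model is conjugate, the exact posterior is Beta(k + 1, n - k + 1), so the sampler's output can be checked against the known posterior mean.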


Topical Outline

Sampling distributions, maximum likelihood, computational methods for maximum likelihood, likelihood ratio tests, introduction to Bayesian inference, philosophy of Bayesian inference, calculating the posterior distribution, choosing the prior, computational methods for sampling from the posterior.

