
Mathematical Statistics II


Course Description

Exploration of concepts in statistical inference, including sampling distributions, point estimation, hypothesis testing, and regression analysis. Topics include properties and evaluation of estimators, confidence intervals, the method of moments, maximum likelihood estimation, Neyman-Pearson tests, uniformly most powerful tests, likelihood ratio tests, and least squares estimation.


Athena Title

Math Statistics II


Prerequisite

STAT 4510


Semester Course Offered

Offered spring


Grading System

A - F (Traditional)


Student Learning Outcomes

  • Students will explain the principles of sampling distributions and the Central Limit Theorem and their implications for statistical inference.
  • Students will compute and evaluate point estimators, assessing their bias, mean squared error, and efficiency (a brief simulation sketch follows this list).
  • Students will apply the method of moments and maximum likelihood estimation to derive estimators for population parameters.
  • Students will construct and interpret confidence intervals for small- and large-sample scenarios.
  • Students will formulate and test statistical hypotheses using Neyman-Pearson theory, likelihood ratio tests, and uniformly most powerful tests.
  • Students will analyze and justify the properties of least squares estimators in linear regression models.
  • Students will develop and assess regression models, making predictions and drawing statistical inferences from estimated parameters.
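
As a minimal illustration of the estimator evaluation described in the outcomes above, the following Python sketch compares the bias and mean squared error of two variance estimators by Monte Carlo simulation: the maximum likelihood estimator (divisor n) and the unbiased sample variance (divisor n − 1). The normal population, parameter values, sample size, and replication count are arbitrary assumptions made for this example; they are not part of the bulletin entry.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (assumed, not specified by the course outline):
# samples of size n from a Normal(mu, sigma^2) population, many replications.
mu, sigma, n, reps = 5.0, 2.0, 20, 100_000
true_var = sigma**2

samples = rng.normal(mu, sigma, size=(reps, n))

# Two point estimators of the population variance:
#   - the maximum likelihood estimator, which divides by n (biased)
#   - the usual sample variance, which divides by n - 1 (unbiased)
var_mle = samples.var(axis=1, ddof=0)
var_unbiased = samples.var(axis=1, ddof=1)

for name, est in [("MLE (divide by n)", var_mle),
                  ("Unbiased (divide by n-1)", var_unbiased)]:
    bias = est.mean() - true_var              # estimate of E[estimator] - true value
    mse = np.mean((est - true_var) ** 2)      # estimate of E[(estimator - true value)^2]
    print(f"{name:26s} bias = {bias:+.4f}   MSE = {mse:.4f}")
```

The simulation makes the trade-off discussed in the course concrete: the maximum likelihood estimator is biased downward (by roughly sigma²/n) yet can have smaller mean squared error than the unbiased estimator.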

Topical Outline

  • Review of Functions of Random Variables
      • Review of multivariable transformations using Jacobians
  • Sampling Distributions and Central Limit Theorems
      • Sampling distributions related to the normal distribution
      • The Central Limit Theorem
      • Normal approximation to the binomial distribution
  • Point Estimation
      • The bias and mean squared error of point estimators
      • Unbiased estimators and evaluation of estimators
      • Small- and large-sample confidence intervals for population parameters
  • Properties of Point Estimators and Methods of Estimation
      • Consistency and relative efficiency
      • Principle of sufficiency and minimum variance unbiased estimation
      • Method of moments and maximum likelihood estimation
  • Hypothesis Testing (a brief power-calculation sketch follows this outline)
      • Elements of hypothesis testing
      • Neyman-Pearson tests and calculation of Type II error and power of tests
      • Uniformly most powerful tests, likelihood ratio tests, and large-sample tests
      • Testing in one-sample and two-sample problems
      • Relationship between hypothesis testing and interval estimation
  • Linear Models and Estimation by Least Squares (a simple linear regression sketch follows this outline)
      • Method of least squares for simple linear regression
      • Properties of least squares estimators
      • Point and interval estimation of regression parameters
      • Prediction intervals
      • Statistical inference for regression parameters
      • Extensions to multiple regression
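
For the hypothesis-testing topics, the sketch below works out the rejection region, Type II error, and power of a Neyman-Pearson one-sided test of a normal mean with known standard deviation. All numerical values (hypothesized means, standard deviation, sample size, significance level) are illustrative assumptions, not part of the course description.

```python
import numpy as np
from scipy import stats

# Assumed one-sided test of a normal mean with known sigma (numbers are arbitrary):
#   H0: mu = 50   vs   H1: mu = 52,   sigma = 6,  n = 25,  level alpha = 0.05
mu0, mu1, sigma, n, alpha = 50.0, 52.0, 6.0, 25, 0.05
se = sigma / np.sqrt(n)

# The Neyman-Pearson (most powerful) test rejects for large sample means;
# the cutoff is chosen so that P(reject | H0) = alpha.
cutoff = mu0 + stats.norm.ppf(1 - alpha) * se

# Type II error and power when the true mean is mu1
beta = stats.norm.cdf((cutoff - mu1) / se)   # P(fail to reject | mu1)
power = 1 - beta

print(f"rejection region: sample mean > {cutoff:.3f}")
print(f"Type II error beta = {beta:.4f}, power = {power:.4f}")
```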
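For the least squares topics, a minimal simple linear regression sketch: it estimates the intercept and slope from the closed-form formulas, then forms a confidence interval for the slope and a prediction interval at a new point. The simulated data-generating values and the 95% level are assumptions made only for this example.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Assumed data-generating model (values are arbitrary): y = beta0 + beta1*x + error
n, beta0, beta1, sigma = 30, 1.0, 2.5, 1.5
x = rng.uniform(0, 10, n)
y = beta0 + beta1 * x + rng.normal(0, sigma, n)

# Least squares estimates from the closed-form formulas
xbar, ybar = x.mean(), y.mean()
Sxx = np.sum((x - xbar) ** 2)
Sxy = np.sum((x - xbar) * (y - ybar))
b1 = Sxy / Sxx
b0 = ybar - b1 * xbar

# Error variance estimate and standard error of the slope
resid = y - (b0 + b1 * x)
s2 = np.sum(resid ** 2) / (n - 2)
se_b1 = np.sqrt(s2 / Sxx)

# 95% confidence interval for the slope
tcrit = stats.t.ppf(0.975, df=n - 2)
ci_b1 = (b1 - tcrit * se_b1, b1 + tcrit * se_b1)

# 95% prediction interval for a new observation at x0
x0 = 5.0
yhat0 = b0 + b1 * x0
se_pred = np.sqrt(s2 * (1 + 1 / n + (x0 - xbar) ** 2 / Sxx))
pi = (yhat0 - tcrit * se_pred, yhat0 + tcrit * se_pred)

print(f"slope estimate {b1:.3f}, 95% CI ({ci_b1[0]:.3f}, {ci_b1[1]:.3f})")
print(f"prediction at x0 = {x0}: {yhat0:.3f}, 95% PI ({pi[0]:.3f}, {pi[1]:.3f})")
```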