## Course Description

Welcome to 6.041/6.431, a subject on the modeling and analysis of random phenomena and processes, including the basics of statistical inference. Nowadays, there is broad consensus that the ability to think probabilistically is a fundamental component of scientific literacy. For example:

- The concept of statistical significance (to be touched upon at the end of this course) is considered by the Financial Times to be one of “The Ten Things Everyone Should Know About Science”.
- A recent Scientific American article argues that statistical literacy is crucial in making health-related decisions.
- Finally, an article in the New York Times identifies statistical data analysis as an upcoming profession, valuable everywhere, from Google and Netflix to the Office of Management and Budget.

The aim of this class is to introduce the relevant models, skills, and tools, by combining mathematics with conceptual understanding and intuition.

## General Information

Welcome to 6.041/6.431! This fundamental subject is concerned with the nature, formulation, and analysis of probabilistic situations. No previous experience with probability is assumed. This course is fun, but also demanding.

Students intending to take the undergraduate version of the course need to sign up for 6.041, while those intending to take the graduate version should sign up for 6.431, which includes full participation in 6.041, together with some additional homework problems, additional topics, and possibly different quiz and exam questions.

6.041/6.431 has three types of class sessions: lectures, recitations, and tutorials. The lectures and recitations each meet twice a week. In addition, there is a tutorial once a week, which is not mandatory but is highly recommended.

Lectures serve to introduce new concepts. They have an overview character, but also include some derivations and motivating applications. In recitation, your instructor elaborates on the theory, works through new examples with your participation, and answers your questions about them. In tutorial, you discuss and solve new examples with a little help from your classmates and your instructor. Tutorials are active sessions to help you develop confidence in thinking about probabilistic situations in real time. Tutorials are highly recommended; past students have found them to be very helpful.

## Prerequisites

The prerequisite for 6.041/6.431 is 18.02, or a year of college-level calculus for those with undergraduate degrees from other universities.

## Text

The text for this course is:

Bertsekas, Dimitri, and John Tsitsiklis. *Introduction to Probability*. 2nd ed. Athena Scientific, 2008. ISBN: 9781886529236.

Solutions to the end-of-chapter problems are available (PDF – 1.5 MB).

A few of these problems will be covered in recitation and tutorial. The remaining ones can be used for self-study (for best results, always try to solve a problem on your own, before reading the solution).

Additionally, the following books may be useful as references. They cover many of the topics in this course, although in a different style. You may wish to consult them to get a different perspective on particular topics.

Drake, Alvin. *Fundamentals of Applied Probability Theory*. McGraw-Hill, 1967. ISBN: 9780070178151.

Ross, Sheldon. *A First Course in Probability*. 8th ed. Prentice Hall, 2009. ISBN: 9780136033134.

**Instructor:** Prof. John Tsitsiklis

**MIT Course Number:** 6.041 / 6.431

**Recorded:** Fall 2010

**Level:** Undergraduate / Graduate

---

## Readings

Bertsekas, Dimitri, and John Tsitsiklis. *Introduction to Probability*. 2nd ed. Athena Scientific, 2008. ISBN: 9781886529236.

| LEC # | TOPICS | READINGS |
| --- | --- | --- |
| 1 | Probability models and axioms | Sections 1.1–1.2 |
| 2 | Conditioning and Bayes’ rule | Sections 1.3–1.4 |
| 3 | Independence | Section 1.5 |
| 4 | Counting | Section 1.6 |
| 5 | Discrete random variables; probability mass functions; expectations | Sections 2.1–2.4 |
| 6 | Discrete random variable examples; joint PMFs | Sections 2.4–2.5 |
| 7 | Multiple discrete random variables: expectations, conditioning, independence | Sections 2.6–2.7 |
| 8 | Continuous random variables | Sections 3.1–3.3 |
| 9 | Multiple continuous random variables | Sections 3.4–3.5 |
| 10 | Continuous Bayes’ rule; derived distributions | Sections 3.6; 4.1 |
| 11 | Derived distributions; convolution; covariance and correlation | Sections 4.1–4.2 |
| 12 | Iterated expectations; sum of a random number of random variables | Sections 4.3; 4.5 |
| 13 | Bernoulli process | Section 6.1 |
| 14 | Poisson process – I | Section 6.2 |
| 15 | Poisson process – II | Section 6.2 |
| 16 | Markov chains – I | Sections 7.1–7.2 |
| 17 | Markov chains – II | Section 7.3 |
| 18 | Markov chains – III | Section 7.3 |
| 19 | Weak law of large numbers | Sections 5.1–5.3 |
| 20 | Central limit theorem | Section 5.4 |
| 21 | Bayesian statistical inference – I | Sections 8.1–8.2 |
| 22 | Bayesian statistical inference – II | Sections 8.3–8.4 |
| 23 | Classical statistical inference – I | Section 9.1 |
| 24 | Classical inference – II | Sections 9.1–9.4 |
| 25 | Classical inference – III; course overview | Sections 9.1–9.4 |


(*Source: MIT OpenCourseWare & YouTube | MIT*)
