EE 276: Information Theory (STATS 376A)
(Formerly EE 376A.) Project-based course about how to measure, represent, and communicate information effectively. Why bits have become the universal currency for information exchange. How information theory bears on the design and operation of modern-day systems such as smartphones and the Internet. The role of entropy and mutual information in data compression, communication, and inference. Practical compressors and error-correcting codes. The information theoretic way of thinking. Relations and applications to probability, statistics, machine learning, biological and artificial neural networks, genomics, quantum information, and blockchains. Prerequisite: a first undergraduate course in probability. A short entropy sketch follows this listing.
Terms: Win | Units: 3
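The description above highlights entropy as the fundamental limit on lossless data compression. A minimal Python sketch (not course material; the four-symbol source below is a hypothetical example) of Shannon entropy as the minimum average number of bits per symbol:

import math

def entropy(p):
    # H(p) = -sum_i p_i * log2(p_i): the minimum average bits per symbol
    # achievable by any lossless code for an i.i.d. source with symbol
    # probabilities p.
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Hypothetical biased four-symbol source: an ideal compressor needs about
# 1.75 bits/symbol, versus 2 bits for a naive fixed-length code.
p = [0.5, 0.25, 0.125, 0.125]
print(entropy(p))  # 1.75

Because every probability here is a power of two, a Huffman code attains the 1.75 bits/symbol entropy exactly.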
EE 377: Information Theory and Statistics (STATS 311)
Information theoretic techniques in probability and statistics. Fano, Assouad, and Le Cam methods for optimality guarantees in estimation. Large deviations and concentration inequalities (Sanov's theorem, hypothesis testing, the entropy method, concentration of measure). Approximation of (Bayes) optimal procedures, surrogate risks, f-divergences. Penalized estimators and minimum description length. Online game playing, gambling, no-regret learning. A sketch of the Fano method follows this listing. Prerequisites: EE 276 (or equivalent) or STATS 300A.
Terms: Aut | Units: 3
Instructors: Duchi, J. (PI); Asi, H. (TA)
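The Fano method named above turns Fano's inequality into lower bounds on estimation error. A minimal Python sketch (not course material; the numbers are hypothetical): for guessing a parameter uniform over M hypotheses from data carrying I(X;Y) nats of information, any estimator's error probability satisfies P(error) >= 1 - (I(X;Y) + log 2) / log M.

import math

def fano_lower_bound(mutual_info_nats, num_hypotheses):
    # Fano's inequality: P(error) >= 1 - (I(X;Y) + log 2) / log M
    # for X uniform on M hypotheses, with information measured in nats.
    return max(0.0, 1.0 - (mutual_info_nats + math.log(2)) / math.log(num_hypotheses))

# Hypothetical setting: 16 hypotheses and data carrying 1 nat of mutual
# information -- no estimator can have error probability below about 0.39.
print(fano_lower_bound(1.0, 16))  # ~0.39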
STATS 311: Information Theory and Statistics (EE 377)
Information theoretic techniques in probability and statistics. Fano, Assouad, and Le Cam methods for optimality guarantees in estimation. Large deviations and concentration inequalities (Sanov's theorem, hypothesis testing, the entropy method, concentration of measure). Approximation of (Bayes) optimal procedures, surrogate risks, f-divergences. Penalized estimators and minimum description length. Online game playing, gambling, no-regret learning. Prerequisites: EE 276 (or equivalent) or STATS 300A.
Terms: Aut | Units: 3
Instructors: Duchi, J. (PI); Asi, H. (TA)