The following graduate courses were offered during Fall 2022 by the University of Illinois at Chicago, Northwestern University, the Toyota Technological Institute at Chicago, the University of Chicago, and the Illinois Institute of Technology:

University of Illinois at Chicago (UIC):

Causal Inference and Learning – Elena Zheleva (in person)

Causal reasoning, structural causal models, interventions and counterfactuals, identification, mediation, attribution, and methods for dealing with confounding, selection, and interference bias.

Introduction to Digital Speech Processing – Ahmet Enis Cetin (hybrid)

This course covers both basic speech coding and speech recognition. Waveform coders: PCM, DPCM, and wavelet- and DCT-based encoders. Vocoders: linear predictive coding (LPC-10), code-excited linear prediction (CELP), and MELP. Cepstrum and mel-cepstrum; speech recognition using hidden Markov models and deep neural networks.
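To illustrate the differential coding idea behind DPCM, here is a minimal sketch (my illustration, not course material) of a first-order DPCM codec with the quantizer omitted, so the round trip is lossless; a real DPCM coder would quantize the residual before transmission:

```python
import numpy as np

def dpcm_encode(x):
    """First-order DPCM without quantization: send differences from the previous sample."""
    x = np.asarray(x, dtype=np.int64)
    return np.diff(x, prepend=0)   # predictor = previous sample (0 before the first)

def dpcm_decode(residual):
    """Invert the encoder by accumulating the transmitted differences."""
    return np.cumsum(residual)

signal = np.array([10, 12, 13, 13, 11, 8])
code = dpcm_encode(signal)       # residuals have smaller magnitude than the signal
decoded = dpcm_decode(code)      # identical to the input when nothing is quantized
```

The point of the scheme is that the residuals are typically much smaller than the raw samples, so they can be coded with fewer bits.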

Convex Optimization – Shuo Han

This graduate-level course covers three main aspects of convex optimization: theory, applications (e.g., machine learning, signal/image processing, controls), and algorithms. After taking the course, students should be able to recognize convexity and use convex optimization to model and solve problems that arise in engineering applications. Students will also gain a basic understanding of how convex optimization problems are solved algorithmically so as to determine whether a given problem can be solved using off-the-shelf solvers.
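As a toy illustration (not course material) of why convexity matters algorithmically: for a convex objective such as least squares, plain gradient descent converges to the global minimum, matching an off-the-shelf solver. The problem instance below is random and purely for demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 3))
b = rng.standard_normal(20)

# f(x) = ||Ax - b||^2 is convex, so gradient descent finds the global minimum.
L = 2 * np.linalg.norm(A.T @ A, 2)   # Lipschitz constant of the gradient
step = 1.0 / L
x = np.zeros(3)
for _ in range(1000):
    x -= step * 2 * A.T @ (A @ x - b)   # gradient of f

x_star = np.linalg.lstsq(A, b, rcond=None)[0]   # off-the-shelf least-squares solver
```

For a nonconvex objective, no such guarantee holds: gradient descent can stall in a local minimum that depends on the starting point.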

Mathematical Foundations of Data Science – Lev Reyzin (hybrid)

This course covers the mathematical foundations of modern data science from a theoretical computer science perspective. Topics will include random graphs, small world phenomena, random walks, Markov chains, streaming algorithms, clustering, graphical models, singular value decomposition, and random projections.
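One of the listed topics, random projections, can be previewed in a few lines: a random Gaussian map from 1,000 dimensions down to 300 approximately preserves pairwise distances (the Johnson–Lindenstrauss phenomenon). This is an illustrative sketch, not course material; the dimensions and seed are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, k = 50, 1000, 300                        # n points in d dims, projected to k dims
X = rng.standard_normal((n, d))
P = rng.standard_normal((d, k)) / np.sqrt(k)   # random projection, scaled to preserve norms

Y = X @ P

# distances between projected points approximate the original distances
orig = np.linalg.norm(X[0] - X[1])
proj = np.linalg.norm(Y[0] - Y[1])
ratio = proj / orig
```

The typical distortion shrinks like 1/sqrt(k), which is why a modest target dimension already gives ratios close to 1.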

Big Data Analysis – Yichao Wu (in person)

High-dimensional and big data analysis is one of the most active research areas in statistics today, given the unprecedented size and complexity of high-throughput data. We will study cutting-edge developments in the methods and theory of statistical inference, with applications to data from genetics, microarrays, proteomics, fMRI, cancer clinical trials, and high-frequency finance.

 

Northwestern University (NU):

 

Data Economics – Jason Hartline (online)

This advanced topics seminar will consider theoretical topics in the space of data economics. As data science transforms science and society, it is important to develop the economics of data: collecting data is costly, possessing data confers market power, sharing data has risks and benefits, and conclusions drawn from data depend on its quantity and quality. The readings of the course will be drawn from the recent and classic literature on data economics. Topics include: valuing data, eliciting data, incentivizing data collection and sharing, adaptive data analysis, and game theory with data.

Algorithms with Predictions – Aditya Bhaskara (hybrid)

In applications like routing, job scheduling, caching, etc., requests arrive sequentially, and the goal of the system is to handle requests as they arrive, while optimizing an appropriate overall objective. The field of online algorithms focuses on developing and analyzing such systems, and ensuring that they are “competitive” against the best solution in hindsight. While online algorithms have been developed for a wide variety of problems, the standard worst-case guarantees turn out to be insufficient in many applications. The question then is whether we can use the “structure” in real-world data to obtain better guarantees. Structure is typically a problem-dependent notion, but one general way to model it is to assume that there is a predictive model (possibly based on machine learning on past data) that can provide partial information (or “advice”) about the entire problem instance, e.g., about future requests. In this course, we will see how to design algorithms that can exploit such advice. We will consider a variety of online problems ranging from data structures and caching to online learning and bandits.
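A toy illustration of this tradeoff (my sketch, not from the course) is the classic ski-rental problem with a predicted season length: a rule that trusts the prediction is optimal when the prediction is accurate, but can be arbitrarily bad when it is wrong, which is exactly the tension that prediction-robust algorithms are designed to resolve:

```python
# Ski rental: renting costs 1 per day, buying costs B once.
# A naive prediction-based rule: fully trust a forecast y_hat of the season length.
B = 10

def alg_cost(y_true, y_hat, B):
    """Buy on day 1 if the prediction says the season is long, else rent forever."""
    if y_hat >= B:
        return B          # buy immediately
    return y_true         # rent every day, never buy

def opt_cost(y_true, B):
    """Offline optimum: rent if the season is short, buy if it is long."""
    return min(y_true, B)

good = alg_cost(15, 15, B) / opt_cost(15, B)    # accurate forecast: ratio 1.0
bad  = alg_cost(100, 5, B) / opt_cost(100, B)   # bad forecast: ratio 10.0
```

Robust variants hedge by also buying after roughly B days of renting even when the prediction says "short season", which caps the worst-case ratio at a constant while staying near-optimal under good predictions.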

Modern Discrete Probability – Julia Gaudio (in person)

This is a graduate-level course focused on techniques and models in modern discrete probability. Topics include: the first and second moment methods, Chernoff bounds and large deviations, martingales, concentration inequalities, branching processes, percolation, and Markov chains. Examples will be drawn from applications to random structures and algorithms. The goal of the course is to equip students to carry out their own research using the toolkit of discrete probability.
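As a quick preview of one listed topic, here is a Monte Carlo sanity check (my illustration, not course material) of a Chernoff-type (Hoeffding) tail bound for fair coin flips: the probability that 100 flips yield at least 60 heads is bounded by exp(-2·(0.1)²·100):

```python
import numpy as np

rng = np.random.default_rng(0)
n, delta, trials = 100, 0.1, 20000

# empirical tail probability P(X >= (1/2 + delta) * n) for X ~ Binomial(n, 1/2)
samples = rng.binomial(n, 0.5, size=trials)
empirical = np.mean(samples >= (0.5 + delta) * n)

bound = np.exp(-2 * delta**2 * n)   # Hoeffding/Chernoff tail bound
```

The bound is not tight (the true tail probability here is far below exp(-2) ≈ 0.135), but it decays exponentially in n, which is what makes it useful.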

 

Toyota Technological Institute at Chicago (TTIC):

Introduction to Machine Learning – Greg Shakhnarovich (in person)

A systematic introduction to machine learning, covering theoretical as well as practical aspects of the use of statistical methods. Topics include linear models for classification and regression, support vector machines, regularization and model selection, and an introduction to structured prediction and deep learning. Application examples are taken from areas such as information retrieval, natural language processing, and computer vision.

Information and Coding Theory – Madhur Tulsiani (in person)

This course is meant to serve as an introduction to some basic concepts in information theory and error-correcting codes, and some of their applications in computer science and statistics. We plan to cover the following topics: Introduction to entropy and source coding. Some applications of entropy to counting problems. Mutual information and KL-divergence. Method of types and hypothesis testing. Minimax rate bounds. I-projections, maximum entropy, exponential families and applications. Introduction to error-correcting codes. Unique and list decoding of Reed-Solomon and Reed-Muller codes. Applications of information theory to problems in theoretical computer science.
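Two of the most basic quantities on this syllabus, entropy and KL divergence, can be computed directly from a discrete distribution. The following is an illustrative sketch (not course material) using the conventions 0·log 0 = 0 and base-2 logarithms, so results are in bits:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits; terms with p = 0 contribute nothing."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -np.sum(nz * np.log2(nz))

def kl_divergence(p, q):
    """KL(p || q) in bits; assumes q > 0 wherever p > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

uniform = [0.25, 0.25, 0.25, 0.25]   # maximum-entropy distribution on 4 symbols
skewed  = [0.7, 0.1, 0.1, 0.1]
```

The uniform distribution attains the maximum entropy of log2(4) = 2 bits, and KL divergence is zero exactly when the two distributions coincide, which is the starting point for the hypothesis-testing and maximum-entropy material above.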

 

University of Chicago (UC):

Mathematical Foundations of Machine Learning – Rebecca Willett and Eric Jonas (in person)

This course is an introduction to the mathematical foundations of machine learning that focuses on matrix methods and features real-world applications ranging from classification and clustering to denoising and data analysis. Mathematical topics covered include linear equations, regression, regularization, the singular value decomposition, and iterative algorithms. Machine learning topics include the lasso, support vector machines, kernel methods, clustering, dictionary learning, neural networks, and deep learning.

Mathematical Computation I: Matrix Computation – Lek-Heng Lim


This course will present a global overview of a number of topics, from classical to modern to state-of-the-art. The fundamental principles and techniques will be covered in depth but towards the end of the course we will also discuss some exciting recent developments. Numerical linear algebra is quite different from linear algebra. We will be much less interested in algebraic results that follow from the axiomatic definitions of fields and vector spaces but much more interested in analytic results that hold only over the real and complex fields. The main objects of interest are real- or complex-valued matrices, which may come from differential operators, integral transforms, bilinear and quadratic forms, boundary and coboundary maps, Markov chains, graphs, metrics, correlations, hyperlink structures, cell phone signals, DNA microarray measurements, movie ratings by viewers, friendship relations in social networks, etc.

Machine Learning – Cong Ma (in person)

This course provides hands-on experience with a range of contemporary machine learning algorithms, as well as an introduction to the theoretical aspects of the subject. Topics covered include: the PAC framework, Bayesian learning, graphical models, clustering, dimensionality reduction, kernel methods including SVMs, matrix completion, neural networks, and an introduction to statistical learning theory.

 

Illinois Institute of Technology (IIT):

Trustworthy Machine Learning – Binghui Wang (hybrid)

This graduate-level course studies machine learning under adversarial settings. The first half covers security and privacy attacks on machine learning; the second half covers robust and privacy-preserving machine learning that defends against such attacks.
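A standard example of a security attack in this setting is the fast gradient sign method (FGSM), which perturbs an input in the direction that increases the model's loss within a small L-infinity ball. Below is a minimal numpy sketch (my illustration, not course material) against a fixed logistic-regression model with made-up weights:

```python
import numpy as np

# Fixed, hypothetical logistic-regression model: p(y=1 | x) = sigmoid(w.x + b)
w = np.array([1.0, -2.0, 0.5])
b = 0.1

def loss(x, y):
    """Logistic loss for label y in {0, 1}."""
    p = 1.0 / (1.0 + np.exp(-(w @ x + b)))
    return -(y * np.log(p) + (1 - y) * np.log(1 - p))

def fgsm(x, y, eps):
    """Perturb x by eps in the sign of the loss gradient (FGSM attack)."""
    p = 1.0 / (1.0 + np.exp(-(w @ x + b)))
    grad_x = (p - y) * w          # d(loss)/dx for logistic regression
    return x + eps * np.sign(grad_x)

x = np.array([0.5, -0.5, 1.0])
y = 1
x_adv = fgsm(x, y, eps=0.3)       # adversarial example within an L-inf ball of radius 0.3
```

Even this tiny perturbation strictly increases the model's loss on the attacked point, which is the basic phenomenon that robust training methods are designed to counter.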

Introduction to Data Science and Stochastic Dynamics – Jeffrey Duan and Ming Zhong (hybrid)

This introductory graduate-level course covers how to use machine learning to learn and improve models driven by stochastic dynamics. Topics include scientific machine learning using physics-informed neural networks (PINNs) and physics-informed Gaussian processes (PIGPs), learning model equations from observations, uncertainty quantification, and inverse problems.
