The C&O department has 36 faculty members and 60 graduate students. We are intensely research oriented and hold a strong international reputation in each of our six major areas:
- Algebraic combinatorics
- Combinatorial optimization
- Continuous optimization
- Cryptography
- Graph theory
- Quantum computing
Read more about the department's research to learn of our contributions to the world of mathematics!

News
Three C&O faculty win Outstanding Performance Awards
The awards are given each year to faculty members across the University of Waterloo who demonstrate excellence in teaching and research.
Prof. Alfred Menezes is named Fellow of the International Association for Cryptologic Research
The Fellows program, which was established in 2004, is awarded to no more than 0.25% of the IACR’s 3000 members each year and recognizes “outstanding IACR members for technical and professional contributions to cryptologic research.”
C&O student Ava Pun receives Jessie W. H. Zou Memorial Award
She received the award in recognition of her research on simulating virtual training environments for autonomous vehicles, which she conducted at the start-up Waabi.
Events
Algebraic Graph Theory - Meri Zaimi
Title: Finite bivariate Tratnik functions
Speaker: Meri Zaimi
Affiliation: Perimeter Institute for Theoretical Physics
Location: Please contact Sabrina Lato for Zoom link.
Abstract: In the context of algebraic combinatorics, P- and Q-polynomial association schemes are important objects and are closely related to distance-regular graphs. The polynomials appearing in these structures are classified by Leonard's theorem, and they belong to the discrete part of the (q-)Askey scheme. Relatively recently, the notions of P- and Q-polynomial association schemes, as well as of distance-regular graphs, have been generalized to the multivariate case. There is, however, no multivariate analog of Leonard's theorem. With the purpose of progressing in that direction, I will discuss ongoing work concerning certain finite families of bivariate functions, said to be of Tratnik type, which are expressed as an intricate product of univariate polynomials of the (q-)Askey scheme. The goal is to classify such functions which satisfy some generalized bispectral properties, that is, two recurrence relations and two (q-)difference equations of certain types.
Algebraic and enumerative combinatorics seminar - Harper Niergarth and Kartik Singh
Title: The quasisymmetric Macdonald polynomials are quasi-Schur positive at t = 0
Speaker: Harper Niergarth and Kartik Singh
Affiliation: University of Waterloo
Location: MC 5479
Abstract: The quasisymmetric Macdonald polynomials G_\gamma (X; q, t) are a quasisymmetric refinement of the symmetric Macdonald polynomials that specialize to the quasisymmetric Schur functions QS_\alpha (X). We study the t = 0 specialization G_\gamma (X; q,0), which can be described as a sum over weighted multiline queues. We show that G_\gamma (X; q, 0) expands positively in the quasisymmetric Schur basis and give a charge formula for the quasisymmetric Kostka-Foulkes polynomials K_{\gamma,\alpha}(q) in the expansion G_\gamma (X; q, 0) = \sum K_{\gamma,\alpha}(q) QS_\alpha(X). The proof relies heavily on crystal operators, and if you do not know what that means, come find out! This is joint work with Olya Mandelshtam.
There will be a pre-seminar presenting relevant background at the beginning graduate level, starting at 1 p.m.
Tutte colloquium - Aukosh Jagannath
Title: The training dynamics and local geometry of high-dimensional learning
Speaker: Aukosh Jagannath
Affiliation: University of Waterloo
Location: MC 5501
Abstract: Many modern data science tasks can be expressed as optimizing complex, random functions in high dimensions. The go-to methods for such problems are variations of stochastic gradient descent (SGD), which perform remarkably well (cf. the success of modern neural networks). However, the rigorous analysis of SGD on natural, high-dimensional statistical models is in its infancy. In this talk, we study a general model that captures a broad range of learning tasks, from matrix and tensor PCA to training two-layer neural networks to classify mixture models. We show that the evolution of natural summary statistics along training converges, in the high-dimensional limit, to a closed, finite-dimensional dynamical system called their effective dynamics. We then turn to understanding the landscape of training from the point of view of the algorithm. We show that in this limit, the spectra of the Hessian and information matrices admit an effective spectral theory: the limiting empirical spectral measure and outliers have explicit characterizations that depend only on these summary statistics. I will then illustrate how these techniques can be used to give rigorous demonstrations of phenomena observed in the machine learning literature, such as the lottery ticket hypothesis and the "spectral alignment" phenomenon. This talk surveys a series of joint works with G. Ben Arous (NYU), R. Gheissari (Northwestern), and J. Huang (U Penn).