2024 Canadian Workshop on Information Theory


Keynote 1
Frank Kschischang, University of Toronto

Coding for Fiber-Optic Communications

Abstract: Error-control coding schemes for fiber-optic communication systems are challenged by the high data rates that must be achieved, but can benefit from the use of codes with very long block lengths. We describe some of our recent and not-so-recent work on the design of codes that exhibit beneficial tradeoffs between code performance and decoding complexity. Joint work with Masoud Barakatain, Mohannad Shehadeh, and Alvin Sukmadji.

Biography: A graduate of the University of British Columbia, Frank R. Kschischang is a Professor of Electrical and Computer Engineering at the University of Toronto, where he has been a faculty member since 1991. His research interests are centered on applications of coding theory for reliable information transmission over various types of noisy channels. He has received a number of awards for his teaching, service, and research, including the 2019 Sustained Excellence in Teaching Award from the Faculty of Applied Science and Engineering at the University of Toronto, the 2016 IEEE Aaron D. Wyner Distinguished Service Award of the IEEE Information Theory Society, the 2010 IEEE Communications Society and Information Theory Society Joint Paper Award, the 2018 IEEE Information Theory Society Paper Award, and the 2023 IEEE Richard W. Hamming Medal.

Keynote 2
Tara Javidi, University of California, San Diego

Zeroth-Order Optimization: An Information Theoretic Perspective

Abstract: In this talk, I will review the problem of maximizing a black-box function in both the centralized and decentralized settings, motivated by a wide variety of engineering design applications, from the heuristic optimization of wireless networks to hardware acceleration to neural network architecture search. First, I discuss the centralized problem when the underlying function belongs to a reproducing kernel Hilbert space. In the second part of the talk, I connect our findings to the value of gradient information. Relying on noisy gradient estimates, we then consider the problem of optimizing a strongly convex, decomposable function in a decentralized manner.

Biography: Tara Javidi received her BS in electrical engineering from Sharif University of Technology, Tehran, Iran. She received her MS degrees in electrical engineering and in applied mathematics from the University of Michigan, Ann Arbor, as well as her Ph.D. in electrical engineering and computer science. In 2005, she joined the University of California, San Diego, where she holds faculty positions in the ECE Department and the Halicioglu Data Science Institute. Tara Javidi has received a number of awards, including the 2021 IEEE Communications Society & Information Theory Society Joint Paper Award.

Keynote 3 (Ian F. Blake Lecture)
Greg Wornell, Massachusetts Institute of Technology

On Getting Higher Resolution with Fewer Bits: Information Theoretic Perspectives on ADC Architecture

Abstract: Information theoretic analysis has proven to be extraordinarily useful in rethinking system architecture in diverse applications over many decades. As a recent example, this talk will describe perspectives on analog-to-digital converter (ADC) design that arise from such analysis. In particular, it suggests that both greater resolution and greater flexibility are possible by using a universal analog front-end and leveraging digital processing in new and interesting ways.

Biography: Greg received his BASc from UBC, and his PhD from MIT. Since 1991, he has been on the faculty at MIT, where he is the Sumitomo Professor of Engineering in the Department of Electrical Engineering and Computer Science. At MIT he leads the Signals, Information, and Algorithms Laboratory within the Research Laboratory of Electronics (RLE), and is also affiliated with the Computer Science and Artificial Intelligence Laboratory (CSAIL). He has won a number of awards for both his research and teaching, including the 2019 IEEE Leon K. Kirchmayer Graduate Teaching Award.

Keynote 4
Mary Wootters, Stanford University

Low-bandwidth computation on top of error correction

Abstract: Error correction is a fundamental tool for protecting data from noise. In this talk, I'll focus on distributed settings, and ask: What happens when we want to compute on data that is already protected with error correction? It turns out that, in some cases, we can take advantage of error correction in order to speed up that computation or reduce its communication costs. I'll mention a few places where this comes up, including distributed storage, distributed computation, and homomorphic secret sharing, and I'll discuss some recent theoretical results.

Biography: Mary Wootters is an associate professor of Computer Science and Electrical Engineering at Stanford University. She received a PhD in mathematics from the University of Michigan in 2014, and a BA in math and computer science from Swarthmore College in 2008; she was an NSF postdoctoral fellow at Carnegie Mellon University from 2014 to 2016. She works in theoretical computer science, applied math, and information theory; her research interests include error correcting codes and randomized algorithms for dealing with high-dimensional data. Her Ph.D. thesis received the Sumner B. Myers Memorial Prize from the University of Michigan Department of Mathematics and the EATCS Distinguished Dissertation Award. She is the recipient of an NSF CAREER award, was named a Sloan Research Fellow in 2019 and a Google Research Scholar in 2021, was awarded the IEEE Information Theory Society James L. Massey Award in 2022, and was named the IEEE Information Theory Society Goldsmith Lecturer for 2024.