Assistant Professor, Department of Electrical and Computer Engineering and Department of Statistical Science, Duke University. Email: galen.reeves@duke.edu. Phone: (919) 668-4042.

My research interests lie at the intersection of signal processing, statistics, and information theory, with applications in high-dimensional statistical inference, compressed sensing, and machine learning.
02/2018  I received the NSF Faculty Early Career Development Program (CAREER) Award.
01/2018  This Spring I'm teaching ECE 280: Signals and Systems.
10/2017  I've just uploaded to arXiv my paper Additivity of Information in Multilayer Networks via Additive Gaussian Noise Transforms, which was presented at the 2017 Allerton Conference on Communication, Control, and Computing. This paper provides a new method for analyzing the fundamental limits of statistical inference with generative models defined by multilayer networks. This work has a number of interesting connections with free probability theory and random matrix theory, the replica method from statistical physics, and approximate inference algorithms based on approximate message passing.
10/2017  I presented the following talk at the 2017 Allerton Conference on Communication, Control, and Computing:
G. Reeves, "Additivity of Information in Multilayer Networks via Additive Gaussian Noise Transforms," Allerton, 2017
G. Reeves, "Two-Moment Inequalities for Rényi Entropy and Mutual Information," ISIT 2017 [slides]
A. Kipnis, G. Reeves, Y. C. Eldar, A. Goldsmith, "Compressed Sensing under Optimal Quantization," ISIT 2017 [slides]
G. Reeves, "Conditional Central Limit Theorems for Gaussian Projections," ISIT 2017 [slides]
03/2017  My work with Efe Aras on modeling traffic with self-driving cars has been featured on the Pratt School of Engineering website.
03/2017  I've uploaded slides for the talk I presented at the Workshop on Statistical Physics, Learning, Inference and Networks, held at the École de Physique in Les Houches in March 2017.
02/2017  I've just uploaded to arXiv my paper Two-Moment Inequalities for Rényi Entropy and Mutual Information. This paper generalizes some of the techniques used in my recent work on conditional central limit theorems for Gaussian projections. In particular, it shows how mutual information can be upper bounded in terms of weighted integrals involving the variance of the conditional density.
12/2016  I've just uploaded to arXiv my paper Conditional Central Limit Theorems for Gaussian Projections. This paper deals with the surprising phenomenon that most projections of a high-dimensional vector are close to Gaussian. Many of the ideas were motivated by my work with Henry Pfister on the replica-symmetric prediction for compressed sensing.
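The Gaussian-projection phenomenon is easy to see numerically. The following is a minimal sketch (my own illustration, not code from the paper): it draws a vector with i.i.d. Rademacher (±1) entries, which are as non-Gaussian as coordinates get, projects onto a fixed random unit direction, and checks that the empirical distribution of the projection is close to a standard normal.

```python
# Minimal numerical illustration (hypothetical example, not from the paper):
# projections of a high-dimensional non-Gaussian vector look nearly Gaussian.
from math import erf

import numpy as np

rng = np.random.default_rng(0)
n = 2000          # ambient dimension
samples = 5000    # number of independent draws of X

# Fixed random unit direction theta.
theta = rng.standard_normal(n)
theta /= np.linalg.norm(theta)

# X has iid Rademacher (+/-1) entries -- decidedly non-Gaussian coordinates.
X = rng.choice([-1.0, 1.0], size=(samples, n))
proj = X @ theta  # scalar projections; each has mean 0 and variance 1

def phi(t):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(t / np.sqrt(2.0)))

# Maximum gap between the empirical CDF of the projections and N(0,1).
grid = np.linspace(-3.0, 3.0, 61)
emp = np.array([(proj <= t).mean() for t in grid])
ks = np.abs(emp - np.array([phi(t) for t in grid])).max()
print(f"max CDF gap vs N(0,1): {ks:.3f}")  # a small gap means nearly Gaussian
```

With n = 2000 the gap is dominated by the sampling noise of the empirical CDF rather than by any non-Gaussianity of the projection, which is the point of the phenomenon.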
V. Mayya, B. Mainsah, and G. Reeves. "Information-Theoretic Analysis of Refractory Effects in the P300 Speller," Asilomar 2016.
V. Mayya, B. Mainsah, and G. Reeves. "Modeling the P300-Based Brain-Computer Interface as a Channel with Memory," Allerton 2016.
08/2016  I'm teaching ECE 587 / STA 563: Information Theory in Fall 2016.
07/2016  Henry Pfister and I have just uploaded to arXiv our paper The Replica-Symmetric Prediction for Compressed Sensing with Gaussian Matrices is Exact. The paper resolves a long-standing open problem concerning predictions made using the powerful but heuristic replica method from statistical physics. The main ideas in the proof are outlined in this video, which I presented at the IHP Nexus of Information and Computation Theories workshop in March 2016.
04/2016  Here is a video of an invited talk outlining a rigorous proof of the replica-symmetric prediction for compressed sensing, which I presented at the IHP Nexus of Information and Computation Theories workshop in March 2016.
02/2016  I'm helping co-organize the 2016 North American School of Information Theory at Duke University.
01/2016  I'm teaching STA 741 / ECE 741: Compressed Sensing and Related Topics in Spring 2016.
Galen Reeves joined the faculty at Duke University in Fall 2013, and is currently an Assistant Professor with a joint appointment in the Department of Electrical & Computer Engineering and the Department of Statistical Science. He completed his PhD in Electrical Engineering and Computer Sciences at the University of California, Berkeley in 2011. From 2011 to 2013 he was a postdoctoral associate in the Department of Statistics at Stanford University, where he was supported by an NSF VIGRE fellowship. In the summer of 2011, he was a postdoctoral researcher in the School of Computer and Communication Sciences at EPFL, Switzerland; in the spring of 2009, he was a visiting scholar at the Delft University of Technology, The Netherlands; and in the summer of 2008, he was a research intern in the Networked Embedded Computing Group at Microsoft Research, Redmond. He received his MS in Electrical Engineering from UC Berkeley in 2007, and his BS in Electrical and Computer Engineering from Cornell University in 2005.