Electrical and Computer Engineering
Ziv Goldfeld joined the School of Electrical and Computer Engineering at Cornell University as an assistant professor in July 2019. He is a graduate field member in Computer Science and in the Center for Applied Mathematics. During the 2017-2019 academic years, he was a postdoctoral research fellow in the Laboratory for Information and Decision Systems (LIDS) of the Electrical Engineering and Computer Science Department at MIT. Before that, Goldfeld graduated with a B.Sc., M.Sc. and Ph.D. (all summa cum laude) in Electrical and Computer Engineering from Ben Gurion University, Israel, in 2012, 2012 and 2017, respectively.
Goldfeld’s research interests include statistical machine learning, optimal transport theory, information theory, high-dimensional statistics, applied probability and interacting particle systems. He seeks to understand and design engineering systems by formulating and solving mathematical models. A main focus is working towards a comprehensive statistical learning theory that yields a better understanding of, and strong performance guarantees for, modern ML methods that operate on real-world high-dimensional data.
In a recent project, Goldfeld aims to create a global, high-dimensional inference framework that supports a scalable (in dimension) generalization and sample-complexity theory for ML. To that end, he is developing a new class of discrepancy measures between probability distributions adapted to high-dimensional spaces. Termed smooth statistical distances (SSDs), these distances level out irregularities in the considered distributions (via convolution with a chosen smoothing kernel) in a way that preserves inference capability but alleviates the curse of dimensionality when estimating them from data. Measuring or optimizing distances between distributions is central to basic inference setups and advanced ML tasks. Research questions therefore include: (i) SSD fundamentals, encompassing geometric, topological, and functional properties; (ii) high-dimensional statistical questions, such as empirical approximation, limit distributions, testing, and goodness-of-fit; and (iii) learning-theoretic applications, including generalization theory for generative models, efficient barycenter computation and estimation, and anomaly detection.
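The smoothing idea can be conveyed with a toy one-dimensional sketch: convolving an empirical measure with a Gaussian kernel N(0, σ²) is the same as adding independent Gaussian noise to its samples before computing the distance. The function names and the crude Monte Carlo averaging below are illustrative choices, not part of the SSD framework itself.

```python
import numpy as np

def empirical_w1_1d(x, y):
    """Wasserstein-1 distance between two equal-size 1-D empirical
    measures: the mean absolute difference of the sorted samples."""
    return float(np.mean(np.abs(np.sort(x) - np.sort(y))))

def smoothed_w1_1d(x, y, sigma, rng, n_draws=32):
    """Toy sketch of a Gaussian-smoothed W1: convolving each empirical
    measure with N(0, sigma^2) equals adding i.i.d. Gaussian noise to
    its samples; we average the distance over several noise draws."""
    vals = [empirical_w1_1d(x + sigma * rng.normal(size=x.shape),
                            y + sigma * rng.normal(size=y.shape))
            for _ in range(n_draws)]
    return float(np.mean(vals))
```

For two unit-variance Gaussian samples whose means differ by 1, both the plain and the smoothed estimates recover a distance near 1; the statistical payoff of smoothing appears in high dimensions, where (per the project description above) it mitigates the curse of dimensionality in empirical estimation.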
Another focus is developing tools, rooted in information theory, for measuring the flow of information through deep neural networks (DNNs). The goal here is to explain the process by which DNNs progressively build representations of data (from crude, over-redundant representations in shallow layers to highly clustered, interpretable ones in deeper layers) and to give the designer more control over that process. To that end, the project develops efficient estimators of information measures over the network (possibly via built-in dimensionality reduction techniques). Such estimators also lead to new visualization, optimization, and pruning methods for DNNs. New instance-dependent generalization bounds based on information measures are also of interest.
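The flavor of such estimators can be illustrated with a minimal histogram-based sketch: quantize a scalar summary of a hidden layer's activations and compute a plug-in mutual information estimate against the labels. This binning stand-in is purely illustrative; the actual research uses far more careful estimators, and the function names below are hypothetical.

```python
import numpy as np

def plugin_mi(a, b):
    """Plug-in mutual information (in nats) between two discrete
    sequences, e.g. class labels and quantized layer activations."""
    a_vals, a_idx = np.unique(a, return_inverse=True)
    b_vals, b_idx = np.unique(b, return_inverse=True)
    joint = np.zeros((len(a_vals), len(b_vals)))
    np.add.at(joint, (a_idx, b_idx), 1.0)   # joint histogram of (a, b)
    p = joint / joint.sum()
    pa = p.sum(axis=1, keepdims=True)       # marginal of a
    pb = p.sum(axis=0, keepdims=True)       # marginal of b
    nz = p > 0
    return float(np.sum(p[nz] * np.log(p[nz] / (pa @ pb)[nz])))

def quantize(activations, n_bins=10):
    """Uniformly bin a 1-D activation summary so plugin_mi applies."""
    edges = np.linspace(activations.min(), activations.max(), n_bins + 1)
    return np.digitize(activations, edges[1:-1])
```

A perfectly informative layer summary recovers the label entropy (log 2 nats for balanced binary labels), while an independent one yields a value near zero; tracking such quantities layer by layer is the kind of "information flow" picture the project formalizes.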
Additional research trajectories include causal machine learning and its relation to the directed information functional, information-theoretic security, high-dimensional nonparametric estimation, and interacting particle systems.
Research Areas
- Information Theory and Communications
- Statistics and Machine Learning
- Systems and Networking
- Complex Systems, Network Science and Computation
Teaching
ECE 6970 – Statistical Distances for Modern Machine Learning (Fall 2019)
ECE 5630 – Information Theory for Data Transmission, Security and Machine Learning (Spring 2020)
ECE 4110 – Random Signals in Communications and Signal Processing
Selected Publications
- Z. Goldfeld, K. Greenewald and K. Kato, “Asymptotic guarantees for generative modeling based on the smooth Wasserstein distance.”
- Z. Aharoni, D. Tzur, Z. Goldfeld and H. H. Permuter, “Directed information: neural estimation and applications to learning from sequential data.”
- Z. Goldfeld and K. Greenewald, “Gaussian-smoothed optimal transport: metric structure and statistical efficiency.”
- Z. Goldfeld, K. Greenewald, Y. Polyanskiy and J. Weed, “Convergence of smoothed empirical measures with applications to entropy estimation.”
- Z. Goldfeld, E. van den Berg, K. Greenewald, I. Melnyk, N. Nguyen, B. Kingsbury and Y. Polyanskiy, “Estimating information flow in deep neural networks.”
Selected Awards and Honors
- NSF CAREER Award, 2021
- NSF CRII Award, 2020
- IBM University Award, 2020
- The Rothschild Postdoctoral Fellowship, 2017
- The Ben Gurion University Postdoctoral Fellowship, 2017
- Feder Award in the National Contest for Outstanding Student Research, 2016
- Best Tutor Award (Electrical and Computer Engineering Department, Ben Gurion University), 2016
- Best Student Paper Award at the International Conference on the Science of Electrical Engineering (ICSEE), 2014
Education
- B.Sc. Electrical and Computer Engineering, Ben Gurion University of the Negev, Israel, 2012
- M.Sc. Electrical and Computer Engineering, Ben Gurion University of the Negev, Israel, 2012
- Ph.D. Electrical and Computer Engineering, Ben Gurion University of the Negev, Israel, 2018