Eric Balkanski

ASSISTANT PROFESSOR

Research Interests

Combinatorial Optimization, Data-driven Algorithm Design, Machine Learning, Mechanism Design, Game Theory

Eric Balkanski’s research lies at the intersection of algorithms and machine learning. He is interested in developing novel optimization frameworks motivated by machine-learning applications.

In particular, his research focuses on data-driven algorithm design, combinatorial optimization, and mechanism design. He develops novel models and algorithmic machinery to address modern decision-making challenges. For example, he has been working on exponentially faster algorithms for submodular optimization, with applications in data summarization, recommendation systems, and network analysis.
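For context, the sequential baseline in this line of work is the classical greedy algorithm for monotone submodular maximization under a cardinality constraint, whose long chain of adaptive rounds is what the faster parallel algorithms reduce. The sketch below is that textbook greedy on a toy coverage function (coverage functions are submodular); the function and all names are illustrative, not code from the publications listed here.

```python
def greedy_max(f, ground_set, k):
    """Classical greedy: k rounds, each adding the element with the
    largest marginal gain f(S + e) - f(S). Each round depends on the
    previous one, so adaptivity (parallel depth) is k."""
    S = set()
    for _ in range(k):
        best = max((e for e in ground_set if e not in S),
                   key=lambda e: f(S | {e}) - f(S))
        S.add(best)
    return S

# Toy submodular objective: a set of "articles" is worth the number
# of distinct topics it covers.
topics = {"a": {1, 2}, "b": {2, 3}, "c": {4}, "d": {1, 2, 3}}
f = lambda S: len(set().union(*(topics[e] for e in S))) if S else 0

chosen = greedy_max(f, set(topics), 2)  # picks "d" (gain 3), then "c" (gain 1)
```

This greedy achieves the optimal 1 - 1/e approximation guarantee but is inherently sequential; the adaptive-complexity results cited below show how to get comparable approximations in exponentially fewer parallel rounds.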

Balkanski received his BS in Mathematical Sciences and Computer Science from Carnegie Mellon University and his PhD in Computer Science from Harvard University. He is the recipient of an ACM SIGecom Doctoral Dissertation Honorable Mention Award, a Google PhD Fellowship, a Smith Family Graduate Science and Engineering Fellowship, and a Best Paper Award at the 18th International Conference on Implementation and Application of Automata (CIAA 2013), and is an Andrew Carnegie Society Scholar. He co-founded Robust Intelligence, an AI security startup.

EDUCATION

  • PhD in Computer Science, Harvard University, 2014-2019
  • BS in Mathematical Sciences and Computer Science, Carnegie Mellon University

PROFESSIONAL EXPERIENCE

  • Assistant Professor of Industrial Engineering and Operations Research, Columbia University, 2020-
  • Co-founder, Robust Intelligence, 2019-

HONORS & AWARDS

  • ACM SIGecom Doctoral Dissertation Honorable Mention Award, 2020
  • Google PhD Fellowship, 2017
  • Smith Family Graduate Science and Engineering Fellowship, 2015
  • Best Paper Award, 18th International Conference on Implementation and Application of Automata (CIAA), 2013

SELECTED PUBLICATIONS

  • The Adaptive Complexity of Maximizing a Submodular Function, Eric Balkanski and Yaron Singer, 50th ACM Symposium on Theory of Computing (STOC 2018).
  • The Limitations of Optimization from Samples, Eric Balkanski, Aviad Rubinstein, and Yaron Singer, 49th ACM Symposium on Theory of Computing (STOC 2017).
  • An Exponential Speedup in Parallel Running Time for Submodular Maximization without Loss in Approximation, Eric Balkanski, Aviad Rubinstein, and Yaron Singer, 30th ACM-SIAM Symposium on Discrete Algorithms (SODA 2019).
  • An Optimal Approximation for Submodular Maximization under a Matroid Constraint in the Adaptive Complexity Model, Eric Balkanski, Aviad Rubinstein, and Yaron Singer, 51st ACM Symposium on Theory of Computing (STOC 2019).
  • On the Construction of Substitutes, Eric Balkanski and Renato Paes Leme, 19th ACM Conference on Economics and Computation (EC 2018). Journal version in Mathematics of Operations Research (MOR).
  • Statistical Cost Sharing, Eric Balkanski, Umar Syed, and Sergei Vassilvitskii, 31st Conference on Neural Information Processing Systems (NIPS 2017).
  • The Importance of Communities for Learning to Influence, Eric Balkanski, Nicole Immorlica, and Yaron Singer, 31st Conference on Neural Information Processing Systems (NIPS 2017).
  • A Lower Bound for Parallel Submodular Minimization, Eric Balkanski and Yaron Singer, 52nd ACM Symposium on Theory of Computing (STOC 2020).
  • Secretary Ranking with Minimal Inversions, Sepehr Assadi, Eric Balkanski, and Renato Paes Leme, 33rd Conference on Neural Information Processing Systems (NeurIPS 2019).
  • Learning to Optimize Combinatorial Functions, Nir Rosenfeld, Eric Balkanski, Amir Globerson, and Yaron Singer, 35th International Conference on Machine Learning (ICML 2018).