Rachel Cummings - Georgia Tech College of Engineering. Atlanta, GA, US

Rachel Cummings

Assistant Professor, Industrial and Systems Engineering | Georgia Tech College of Engineering


Rachel Cummings is an expert in data privacy, algorithmic economics, optimization, statistics, and information theory.







Videos (5)

Symposium on Data Privacy: Rachel Cummings

Rachel Cummings: The Strange Case of Privacy in Equilibrium Models

Differentially Private Change-Point Detection

Radical Mechanisms: Market Design for Personal Data, Rachel Cummings

ORiginals Season 1, Episode 3 - Rachel Cummings




Dr. Rachel Cummings is an Assistant Professor in the H. Milton Stewart School of Industrial and Systems Engineering at Georgia Tech. Her research interests lie primarily in data privacy, with connections to machine learning, algorithmic economics, optimization, statistics, and information theory. Her work has focused on problems such as strategic aspects of data generation, incentivizing truthful reporting of data, privacy-preserving algorithm design, impacts of privacy policy, and human decision-making.

Dr. Cummings received her Ph.D. in Computing and Mathematical Sciences from the California Institute of Technology, her M.S. in Computer Science from Northwestern University, and her B.A. in Mathematics and Economics from the University of Southern California.

She is the recipient of a Google Research Fellowship, a Simons-Berkeley Research Fellowship in Data Privacy, the ACM SIGecom Doctoral Dissertation Honorable Mention, the Amori Doctoral Prize in Computing and Mathematical Sciences, a Caltech Leadership Award, a Simons Award for Graduate Students in Theoretical Computer Science, and the Best Paper Award at the 2014 International Symposium on Distributed Computing. Dr. Cummings also serves on the ACM U.S. Public Policy Council's Privacy Committee.

Areas of Expertise (7)

Data Generation

Machine Learning

Data Privacy

Algorithmic Economics

Optimization

Statistics

Information Theory

Selected Accomplishments (3)

Google Research Fellowship (Spring 2019)

Simons-Berkeley Research Fellowship in Data Privacy (Spring 2019)

ACM SIGecom Doctoral Dissertation Honorable Mention (2018)

Education (3)

California Institute of Technology: Ph.D., Computing and Mathematical Sciences 2017

Northwestern University: M.S., Computer Science 2013

University of Southern California: B.A., Mathematics, Economics 2011

Selected Articles (3)

Differentially Private Online Submodular Minimization

The 22nd International Conference on Artificial Intelligence and Statistics

2019 In this paper we develop the first algorithms for online submodular minimization that preserve differential privacy under full-information feedback and bandit feedback. Our first result is in the full-information setting, where the algorithm can observe the entire function after making its decision at each time step. We give an algorithm in this setting that is $\varepsilon$-differentially private and achieves expected regret $\tilde{O}\left(\frac{n\sqrt{T}}{\varepsilon}\right)$ over $T$ rounds for a collection of $n$ elements. Our second result is in the bandit setting, where the algorithm can only observe the cost incurred by its chosen set and does not have access to the entire function.

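The $\varepsilon$-differential privacy guarantee referenced in the abstract is typically achieved by adding noise calibrated to a query's sensitivity. As a textbook illustration (not code from the paper; the function and parameter names here are my own), the Laplace mechanism below privatizes a simple counting query:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample from a zero-mean Laplace distribution via inverse CDF."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(data, predicate, epsilon: float) -> float:
    """epsilon-differentially private count of records satisfying predicate.

    A counting query has sensitivity 1 (adding or removing one person
    changes the count by at most 1), so Laplace noise with scale
    1/epsilon suffices for epsilon-differential privacy.
    """
    true_count = sum(1 for x in data if predicate(x))
    return true_count + laplace_noise(1.0 / epsilon)

random.seed(0)
ages = [23, 35, 41, 29, 62, 57, 18, 44]
noisy = private_count(ages, lambda a: a >= 40, epsilon=1.0)
```

Smaller $\varepsilon$ means stronger privacy but noisier answers; the $1/\varepsilon$ factor in the regret bound above reflects the same trade-off.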

On the Compatibility of Privacy and Fairness

Georgia Institute of Technology

2019 In this work, we investigate whether privacy and fairness can be simultaneously achieved by a single classifier in several different models. Some of the earliest work on fairness in algorithm design defined fairness as a guarantee of similar outputs for “similar” input data, a notion with tight technical connections to differential privacy. We study whether tensions exist between differential privacy and statistical notions of fairness, namely Equality of False Positives and Equality of False Negatives (EFP/EFN). We show that even under full distributional access, there are cases where the constraint of differential privacy precludes exact EFP/EFN. We then turn to ask whether one can learn a differentially private classifier which approximately satisfies EFP/EFN, and show the existence of a PAC learner which is private and approximately fair with high probability. We conclude by giving an efficient algorithm for classification that maintains utility and satisfies both privacy and approximate fairness with high probability.

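The statistical fairness notions in the abstract, Equality of False Positives and Equality of False Negatives (EFP/EFN), compare a classifier's error rates across groups. A minimal sketch of measuring an EFP gap (my own illustration, not the paper's algorithm):

```python
def false_positive_rate(predictions, labels):
    """Fraction of true negatives that the classifier labels positive."""
    negatives = [(p, y) for p, y in zip(predictions, labels) if y == 0]
    if not negatives:
        return 0.0
    return sum(1 for p, _ in negatives if p == 1) / len(negatives)

def efp_gap(predictions, labels, groups):
    """Absolute difference in false-positive rates between two groups.

    Exact EFP requires this gap to be zero; the paper studies when
    differential privacy forces the guarantee to be approximate instead.
    """
    rates = []
    for g in (0, 1):
        idx = [i for i, grp in enumerate(groups) if grp == g]
        rates.append(false_positive_rate([predictions[i] for i in idx],
                                         [labels[i] for i in idx]))
    return abs(rates[0] - rates[1])

preds  = [1, 0, 1, 0, 1, 1, 0, 0]
labels = [0, 0, 1, 0, 1, 0, 1, 0]
group  = [0, 0, 0, 0, 1, 1, 1, 1]
gap = efp_gap(preds, labels, group)  # group 0 FPR = 1/3, group 1 FPR = 1/2
```

An approximately fair classifier in the paper's sense would keep this gap below a small tolerance with high probability, rather than forcing it to zero.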

The Implications of Privacy-Aware Choice

California Institute of Technology

2017 Privacy concerns are becoming a major obstacle to using data in the way that we want. It's often unclear how current regulations should translate into technology, and the changing legal landscape surrounding privacy can cause valuable data to go unused. In addition, when people know that their current choices may have future consequences, they might modify their behavior to ensure that their data reveal less---or perhaps, more favorable---information about themselves. Given these concerns, how can we continue to make use of potentially sensitive data, while providing satisfactory privacy guarantees to the people whose data we are using? Answering this question requires an understanding of how people reason about their privacy and how privacy concerns affect behavior.
