Lorrie Faith Cranor is a Professor of Computer Science and of Engineering and Public Policy at Carnegie Mellon University, where she is director of the CyLab Usable Privacy and Security Laboratory (CUPS) and co-director of the MSIT-Privacy Engineering master's program. In 2016 she served as Chief Technologist at the US Federal Trade Commission, working in the office of Chairwoman Ramirez. She is also a co-founder of Wombat Security Technologies, Inc., a security awareness training company. She has authored over 150 research papers on online privacy, usable security, and other topics.

She has played a key role in building the usable privacy and security research community, having co-edited the seminal book Security and Usability (O'Reilly 2005) and founded the Symposium On Usable Privacy and Security (SOUPS). She also chaired the Platform for Privacy Preferences Project (P3P) Specification Working Group at the W3C and authored the book Web Privacy with P3P (O'Reilly 2002). She has served on a number of boards, including the Electronic Frontier Foundation Board of Directors, and on the editorial boards of several journals.

Early in her career she was honored as one of the top 100 innovators aged 35 or younger by Technology Review magazine. More recently she was named an ACM Fellow for her contributions to usable privacy and security research and education, and an IEEE Fellow for her contributions to privacy engineering.

She was previously a researcher at AT&T Labs-Research and taught in the Stern School of Business at New York University. She holds a doctorate in Engineering and Policy from Washington University in St. Louis. In 2012-13 she spent her sabbatical as a fellow in the Frank-Ratchye STUDIO for Creative Inquiry at Carnegie Mellon University, where she worked on fiber arts projects that combined her interests in privacy and security, quilting, computers, and technology. She practices yoga, plays soccer, and runs after her three children.
Areas of Expertise (5)
Engineering and Policy
Cybersecurity and Privacy
Media Appearances (5)
Annoying Password Rules Actually Make Us Less Secure
The Wall Street Journal online
Does your company network or a frequently visited website force you to come up with a new password because it has declared your old one is past its expiration date?
How to tell if a gadget is secure? Look for this new government seal.
The Washington Post online
Professor Lorrie Cranor of Carnegie Mellon University, whose research includes ways to make better security and privacy disclosures to users, said she hopes the final standard doesn’t gloss over privacy.
Mandatory password updates are passé
The Washington Post online
“Most people, if they know they're going to have to change their password on a regular basis, they will pick a relatively weaker password and use a pattern for how they change it,” Lorrie Cranor, director of CyLab Security and Privacy Institute at Carnegie Mellon University, told me. And weaker passwords that are easy to predict are catnip for malicious hackers.
Google Settings Still Confusing After $85 Million Lawsuit Over How Confusing They Were
“There’s a lot of fine print when you pause location history. Most people aren’t going to read it, and even if you do, it is confusing,” says Lorrie Cranor, a professor at Carnegie Mellon University whose research includes privacy settings and interfaces. “I’m a privacy expert and I still find it difficult to understand exactly what is getting turned off.”
Personalities of Pittsburgh: Lorrie Cranor is securing privacy in the digital age
Pittsburgh Business Times online
Lorrie Cranor has dedicated her career to cybersecurity and protecting personal information.
Industry Expertise (4)
Writing and Editing
Accomplishments (5)
Distinguished Professor of Engineering Award (professional)
2022 Carnegie Mellon University College of Engineering
Alumni Achievement Award (professional)
2019 McKelvey School of Engineering, Washington University in St. Louis
AAAS Fellow (professional)
Allen Newell Award for Research Excellence (professional)
2019 Carnegie Mellon University School of Computer Science
Andrew Carnegie Fellow (professional)
Education (4)
Washington University in St. Louis: B.S., Engineering and Public Policy 1992
Washington University in St. Louis: M.S., Technology and Human Affairs 1993
Washington University in St. Louis: D.Sc., Engineering and Policy 1996
Washington University in St. Louis: M.S., Computer Science 1996
Affiliations (3)
- The Future of Privacy Forum : Advisory Board
- Deep Lab : Founding member
- Wombat Security Technologies : Co-founder
Patents (1)
User-controllable learning of policies
Various embodiments are directed to a computer-implemented method for updating a policy that is enforced by a computer program. In one embodiment, a computer communicates to a user data regarding one or more decisions made by the program over a period of time according to a policy. Each decision is based on the particular policy in force at the time the decision is made. Policy data for the policy is stored in a machine-readable format.
Articles (5)
Less is Not More: Improving Findability and Actionability of Privacy Controls for Online Behavioral Advertising
CHI '23: Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems
2023 Tech companies that rely on ads for business argue that users have control over their data via ad privacy settings. However, these ad settings are often hidden. This work aims to inform the design of findable ad controls and study their impact on users’ behavior and sentiment. We iteratively designed ad control interfaces that varied in the setting’s (1) entry point (within ads, at the feed’s top) and (2) level of actionability, with high actionability directly surfacing links to specific advertisement settings, and low actionability pointing to general settings pages (which is reminiscent of companies’ current approach to ad controls).
Understanding iOS Privacy Nutrition Labels: An Exploratory Large-Scale Analysis of App Store Data
CHI EA '22: Extended Abstracts of the 2022 CHI Conference on Human Factors in Computing Systems
2022 Since December 2020, the Apple App Store has required all developers to create a privacy label when submitting new apps or app updates. However, there has not been a comprehensive study on how developers responded to this requirement. We present the first measurement study of Apple privacy nutrition labels to understand how apps on the U.S. App Store create and update privacy labels.
“Okay, whatever”: An Evaluation of Cookie Consent Interfaces
CHI '22: Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems
2022 Many websites have added cookie consent interfaces to meet regulatory consent requirements. While prior work has demonstrated that they often use dark patterns — design techniques that lead users to less privacy-protective options — other usability aspects of these interfaces have been less explored. This study contributes a comprehensive, two-stage usability assessment of cookie consent interfaces. We first inspected 191 consent interfaces against five dark pattern heuristics and identified design choices that may impact usability. We then conducted a 1,109-participant online between-subjects experiment exploring the usability impact of seven design parameters.
Identifying User Needs for Advertising Controls on Facebook
Proceedings of the ACM on Human-Computer Interaction
2022 We conducted an online survey and remote usability study to explore user needs related to advertising controls on Facebook and determine how well existing controls align with these needs. Our survey results highlight a range of user objectives related to controlling Facebook ads, including being able to select what ad topics are shown or what personal information is used in ad targeting.
Understanding Challenges for Developers to Create Accurate Privacy Nutrition Labels
CHI '22: Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems
2022 Apple announced the introduction of app privacy details to their App Store in December 2020, marking the first ever real-world, large-scale deployment of the privacy nutrition label concept, which had been introduced by researchers over a decade earlier. The Apple labels are created by app developers, who self-report their app’s data practices. In this paper, we present the first study examining the usability and understandability of Apple’s privacy nutrition label creation process from the developer’s perspective.