Michael Reiter is the Lawrence M. Slifkin Distinguished Professor in the Department of Computer Science in UNC's College of Arts and Sciences. He received a B.S. degree in mathematical sciences from UNC in 1989, and M.S. and Ph.D. degrees in computer science from Cornell University in 1991 and 1993, respectively.
Dr. Reiter's research interests include all areas of computer and communications security and distributed computing. He regularly publishes and serves on conference organizing committees in these fields, and he has served as program chair for the flagship computer security conferences of the IEEE, the ACM, and the Internet Society.
He was named an ACM Fellow in 2008 and an IEEE Fellow in 2014.
Excellence in Teaching Award (professional)
Awarded by the Computer Science Student Association of the Department of Computer Science at the University of North Carolina at Chapel Hill.
Cornell University: Ph.D., Computer Science 1993
Cornell University: M.S., Computer Science 1991
The University of North Carolina: B.S., Mathematical Sciences 1989
- ACM: Fellow
- IEEE: Fellow
Media Appearances (1)
Google-Led Denials Leave Room for U.S. Web Surveillance
Bloomberg Business online
Mining data associated with people’s communications is hardly new for the government, said Michael Reiter, a professor of computer science at the University of North Carolina at Chapel Hill. The Patriot Act, which was passed in response to the terrorist acts of Sept. 11, 2001, authorized secret U.S. surveillance of phone calls and e-mails.
Still, a government hack of corporate servers to obtain that type of information is unlikely, Reiter said.
“It’s certainly more difficult to do that and far riskier to do that than it is to just go get the court order,” he said. “It doesn’t make sense to me that the government would try to do it.”...
Event Appearances (5)
College of Information and Computer Sciences, University of Massachusetts Amherst, Amherst, MA, USA
Department of Computer and Information Science and Engineering, University of Florida, Gainesville, FL, USA
Computer Science and Engineering Department, University of California – Riverside, Riverside, CA, USA
9th International Conference on Network and System Security, New York, NY, USA
7th ACM Cloud Computing Security Workshop, Denver, CO, USA
A power grid is a complex system connecting electric power generators to consumers through power transmission and distribution networks across a large geographical area. System monitoring is necessary to ensure the reliable operation of power grids, and state estimation is used in system monitoring to best estimate the power grid state through analysis of meter measurements and power system models. Various techniques have been developed to detect and identify bad measurements, including interacting bad measurements introduced by arbitrary, nonrandom causes. At first glance, it seems that these techniques can also defeat malicious measurements injected by attackers.
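The detection idea behind this line of work can be illustrated with a toy sketch (not the paper's power-system model; the measurement matrix, noise scale, and thresholds below are all hypothetical). In a linearized setting, measurements are z = Hx + e, the state is estimated by least squares, and bad data is flagged when the measurement residual grows. A random bad measurement raises the residual, but an injection crafted as a = Hc shifts the estimate without changing the residual at all:

```python
import numpy as np

# Toy linear state estimation: z = H x + noise, estimate x by least
# squares, and flag bad data when the residual norm ||z - H x_hat|| grows.
rng = np.random.default_rng(0)
H = rng.normal(size=(8, 3))          # hypothetical measurement matrix
x_true = np.array([1.0, -0.5, 0.3])
z = H @ x_true + rng.normal(scale=0.01, size=8)

def residual_norm(z, H):
    x_hat, *_ = np.linalg.lstsq(H, z, rcond=None)
    return np.linalg.norm(z - H @ x_hat)

r_clean = residual_norm(z, H)

# A random bad measurement sticks out: the residual jumps.
z_bad = z.copy()
z_bad[0] += 5.0
r_bad = residual_norm(z_bad, H)       # much larger than r_clean

# A crafted injection a = H c moves the estimate by c while leaving the
# residual (and hence a residual-based detector) unchanged.
c = np.array([0.2, 0.1, -0.3])
z_attack = z + H @ c
r_attack = residual_norm(z_attack, H)  # equals r_clean up to float error
```

This is why the "at first glance" intuition fails: residual tests catch arbitrary errors but not injections confined to the column space of H.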
We present Flicker, an infrastructure for executing security-sensitive code in complete isolation while trusting as few as 250 lines of additional code. Flicker can also provide meaningful, fine-grained attestation of the code executed (as well as its inputs and outputs) to a remote party. Flicker guarantees these properties even if the BIOS, OS and DMA-enabled devices are all malicious. Flicker leverages new commodity processors from AMD and Intel and does not require a new OS or VMM. We demonstrate a full implementation of Flicker on an AMD platform and describe our development environment for simplifying the construction of Flicker-enabled code.
In this paper we propose and evaluate new graphical password schemes that exploit features of graphical input displays to achieve better security than text-based passwords. Graphical input devices enable the user to decouple the position of inputs from the temporal order in which those inputs occur, and we show that this decoupling can be used to generate password schemes with substantially larger (memorable) password spaces. In order to evaluate the security of one of our schemes, we devise a novel way to capture a subset of the "memorable" passwords that, we believe, is itself a contribution. In this work we are primarily motivated by devices such as personal digital assistants (PDAs) that offer graphical input capabilities via a stylus, and we describe our prototype implementation of one of our password schemes on such a PDA, namely the Palm Pilot™.
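The effect of exploiting input order can be sketched with simple counting (an illustrative calculation, not the paper's exact analysis; the grid size and tap count below are hypothetical). With g grid cells and k taps, a scheme in which only the set of tapped cells matters has C(g, k) passwords, while one that also distinguishes the order of taps has P(g, k), a factor of k! more:

```python
from math import comb, perm

# Hypothetical parameters: a 5x5 grid (g = 25 cells) and k = 8 taps.
g, k = 25, 8

unordered = comb(g, k)   # only the set of cells matters
ordered = perm(g, k)     # the order of distinct taps also matters

# Making order significant multiplies the space by exactly k! = 40320.
ratio = ordered // unordered
```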
In this paper we introduce a system called Crowds for protecting users' anonymity on the world-wide-web. Crowds, named for the notion of “blending into a crowd,” operates by grouping users into a large and geographically diverse group (crowd) that collectively issues requests on behalf of its members. Web servers are unable to learn the true source of a request because it is equally likely to have originated from any member of the crowd, and even collaborating crowd members cannot distinguish the originator of a request from a member who is merely forwarding the request on behalf of another. We describe the design, implementation, security, performance, and scalability of our system. Our security analysis introduces degrees of anonymity as an important tool for describing and proving anonymity properties.
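The core forwarding mechanism can be sketched in a few lines (a toy simulation, not the Crowds implementation; the forwarding probability and crowd size are hypothetical). Each member that receives a request flips a biased coin: with probability p_f it relays the request to another randomly chosen member, otherwise it submits the request to the web server, so path lengths are geometrically distributed with mean 1/(1 - p_f):

```python
import random

def crowds_path(members, p_f, rng):
    """Simulate one request's path: the initiator picks a random member,
    and each member forwards to another random member with probability
    p_f, otherwise submits the request to the server."""
    path = [rng.choice(members)]
    while rng.random() < p_f:
        path.append(rng.choice(members))
    return path  # the last member on the path contacts the server

rng = random.Random(42)
members = list(range(100))           # a hypothetical 100-member crowd
lengths = [len(crowds_path(members, 0.75, rng)) for _ in range(10_000)]
avg = sum(lengths) / len(lengths)    # near 1/(1 - 0.75) = 4 hops
```

From the server's view the request could have originated at any member on (or off) the path, which is the intuition behind the "blending into a crowd" anonymity argument.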
Quorum systems are well-known tools for ensuring the consistency and availability of replicated data despite the benign failure of data repositories. In this paper we consider the arbitrary (Byzantine) failure of data repositories and present the first study of quorum system requirements and constructions that ensure data availability and consistency despite these failures. We also consider the load associated with our quorum systems, i.e., the minimal access probability of the busiest server. For services subject to arbitrary failures, we demonstrate quorum systems over n servers with a load of O(1/√n), thus meeting the lower bound on load for benignly fault-tolerant quorum systems. We explore several variations of our quorum systems and extend our constructions to cope with arbitrary client failures.
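The load intuition can be seen in the classic grid construction for the benign-failure case (a sketch of the standard textbook construction, not the paper's Byzantine masking quorums, which require larger intersections): arrange n = k² servers in a k×k grid and let a quorum be one full row plus one full column. Any two quorums intersect, and each quorum touches only 2k − 1 ≈ 2√n servers, giving O(1/√n) load:

```python
def grid_quorum(k, row, col):
    """A quorum over a k x k grid of servers: all of one row plus all of
    one column, identified by (row, col) coordinate pairs."""
    return {(row, j) for j in range(k)} | {(i, col) for i in range(k)}

k = 5                          # 25 servers in a 5x5 grid
q1 = grid_quorum(k, 0, 0)
q2 = grid_quorum(k, 3, 4)

overlap = q1 & q2              # any row+column pair of quorums intersects
size = len(q1)                 # 2k - 1 = 9 of the 25 servers
```

Byzantine-tolerant variants strengthen the intersection requirement so that correct servers outvote faulty ones inside every pairwise quorum overlap.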