Digital Contracting Is Broken. A Little "Friction" Could Go a Long Way in Fixing It

Oct 25, 2024

4 min

Brett Frischmann, JD

In mid-October, the Federal Trade Commission (FTC) announced a final “click-to-cancel” rule, which, after its provisions go into effect, will make it easier for consumers to cancel recurring memberships and subscriptions.


The rule is an undoubted victory for consumers who have run into roadblocks while trying to protect their wallets amid today's flurry of oversubscription, but it also raises an important question: Why is oversubscription occurring in the first place?


“One important reason for that problem is that getting into contracts is frictionless, it’s too easy,” said Brett Frischmann, JD, the Charles Widger Endowed Professor in Law, Business and Economics in Villanova University’s Charles Widger School of Law. “The FTC is addressing a real concern in making it easier for people to exit agreements of this sort. But while making it as easy to unsubscribe as to subscribe sounds great – we all like even playing fields and symmetry – it might be better to also make subscribing a little more burdensome, so people understand what they are getting themselves into in the first place.”


This idea is the focus of Frischmann’s recent paper, “Better Digital Contracts with Prosocial Friction-in-Design,” the publication of which coincides with public dissatisfaction over digital contracting processes. In August, Disney attempted to block a wrongful death lawsuit, citing fine print in the terms and conditions of a one-month Disney+ free trial the plaintiff had signed up for in 2019. Since then, other companies have succeeded in blocking similar lawsuits.


In the research, Frischmann and his co-author, Rice University computer scientist Moshe Vardi, describe these contracts as “dehumanizing,” arguing that they “undermine human autonomy and sociality, by design”: they elicit behavior in a predetermined manner (such as clicking on cue) and often include side agreements with other entities, unbeknownst to users.


“One-click” contracts rely on legal fictions, such as presuming that clicking an “I have read the terms and conditions” button means the user actually has. They are structured this way intentionally.


“The idea behind digital contracting is ‘Let’s make the contract as quick as possible before people leave or change their mind,’” Frischmann said. “They only want to do the minimum that the law requires, and all the law requires is notice of terms and action that says, ‘I agree.’”


For these reasons, he argues, modern digital contracting contradicts the very purpose of contract law: enabling people to reach genuine agreements and cooperate.


“It’s antithetical to the underlying values of a contract,” Frischmann said. “Autonomy is undermined because people are not able to exercise autonomy in a meaningful way when they are not actually capable of deliberating about the terms to which they are agreeing. As for being cooperative, there is no relationship. Digital contracts are completely one-sided.”


So what can be done to combat this?


“Speedbumps,” Frischmann says, referring to measures that add friction to the contracting process to better protect the user. Physical road speedbumps are a useful analogy: while they make things slightly more inconvenient for drivers, they are deployed strategically where other values are at stake, like the safety of children playing outside.


“People tolerate speedbumps,” Frischmann says, “because they serve a social purpose. Friction in digital contracts is similar.”


When it comes to improving digital contracting, multiple measures inherently carry such friction, but not all of them are appropriate in every context.


Completely Automated Public Turing tests to tell Computers and Humans Apart (CAPTCHAs), for example, are a type of friction-in-design that serves a useful social purpose (security) and has become normalized and tolerated, but some CAPTCHAs are ableist and others may generate proprietary data.


Where he sees the most beneficial friction is in comprehension, which in software could take the form of completing a task or passing a short test to prove that an individual understands the agreement.
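To make the idea concrete, here is a minimal sketch of what such a comprehension check might look like in software: the "I agree" action stays blocked until the user answers a short quiz about the terms correctly. The quiz questions, function names and return strings are illustrative assumptions, not drawn from the paper.

```python
# Hypothetical comprehension "speedbump": a short quiz gates the agreement.
# Each entry pairs a question about the terms with its correct answer.
QUIZ = [
    ("Does the free trial's arbitration clause waive your right to sue in court? (y/n)", "y"),
    ("Can you cancel the subscription at any time? (y/n)", "y"),
]

def comprehension_check(answers):
    """Return True only if every quiz question was answered correctly."""
    if len(answers) != len(QUIZ):
        return False
    return all(ans.strip().lower() == correct
               for (_, correct), ans in zip(QUIZ, answers))

def try_to_agree(answers):
    """Record agreement only after comprehension has been demonstrated."""
    if comprehension_check(answers):
        return "agreement recorded"
    return "agreement blocked: review the terms and try again"
```

The point of the sketch is the ordering: the friction (the quiz) comes before the binding click, not after, so the contract cannot be formed faster than it can be understood.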


Comprehension is the basis for one of Frischmann’s proposed changes to contract law. Currently, the law relies on the oft-criticized concept of informed consent. He argues it should be replaced with demonstrably informed consent, which in essence requires entities to show that people truly comprehend what they are agreeing to.


“Right now, individuals assent to contracts, going along with terms someone else insisted upon,” he said. “But assenting to terms is very different than being informed and consenting. To demand demonstrably informed consent shifts the burden to the provider to generate evidence showing that a person in fact understood and agreed.”
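One way to picture that shifted burden is an auditable record the provider must produce: not just a stored click, but evidence tying the user's comprehension answers to the exact terms they were shown, with a timestamp. This is a hypothetical sketch; the field names and record structure are assumptions for illustration only.

```python
# Hypothetical evidence record for "demonstrably informed consent".
# The provider stores which terms were shown (by content hash), which
# comprehension questions were answered, and when -- evidence that a
# person understood, not merely that a button was clicked.
import hashlib
import time

def consent_record(user_id, terms_text, quiz_results, now=None):
    """Build an auditable record linking quiz results to the exact terms shown."""
    return {
        "user_id": user_id,
        # Hash of the terms proves which version of the text was presented.
        "terms_sha256": hashlib.sha256(terms_text.encode("utf-8")).hexdigest(),
        "quiz_results": quiz_results,  # e.g. [{"question": "...", "correct": True}]
        "timestamp": now if now is not None else time.time(),
    }

record = consent_record(
    "user-42",
    "By starting the free trial you agree to resolve disputes by arbitration.",
    [{"question": "Does this clause waive your right to sue in court?", "correct": True}],
)
# Only enable "I agree" once every comprehension question was answered correctly.
comprehension_shown = all(r["correct"] for r in record["quiz_results"])
```

Because the record hashes the terms themselves, a provider could later demonstrate which wording the user saw and comprehended, which is exactly the kind of evidence the proposed standard would demand.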


In the recent Disney case, for example, demonstrably informed consent would have required not just a click when signing up for Disney+, but that Disney somehow explain to the individual that signing up for a free trial means they cannot take the company to court, and that Disney generate reliable evidence that the individual understood this. If that were the case, perhaps the individual would not have signed up.


“Or, they may not have ever gone to the Disney park if they had [signed up],” Frischmann said.


This proposed change in contract law, along with the various potential methods of engineering friction into digital contracts, circles back to the same goal: slowing down contracting where it affects people in ways they do not understand.


“You can’t have digital contracting built like a highway, where it’s all as fast as possible all the time,” Frischmann said. “For our digitally networked environment, it needs to be built like a neighborhood.”


