Why Japan Issued a "Megaquake" Advisory Following Last Week's Tremor

Aug 14, 2024


The magnitude 7.1 earthquake that struck Japan's southern islands on August 8 left some residents of the country in a panic. The alarm came not from the tremor itself, which caused only a handful of minor injuries and tsunami alerts that were quickly lifted, but from the unprecedented advisory issued by the Japan Meteorological Agency warning of an elevated risk of a "megaquake" in the region over the coming weeks.


A "megaquake," short for megathrust earthquake, is a type of temblor that occurs at a subduction zone, where one tectonic plate slips beneath another. A sudden release of the strain that builds along the thrust fault where the two plates meet can trigger some of the strongest earthquakes on the planet, reaching magnitude 9.0 or higher on the moment magnitude scale, and can produce large tsunamis.
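To put the magnitude scale in perspective: because it is logarithmic, the difference between last week's 7.1 and a potential 9.0 is enormous. A rough illustration, using the standard Gutenberg-Richter energy relation (log10 E = 1.5M + 4.8, with E in joules), shows a magnitude 9.0 event releasing on the order of 700 times the energy of a 7.1:

```python
def seismic_energy_joules(magnitude: float) -> float:
    """Approximate radiated seismic energy (joules) from magnitude,
    via the Gutenberg-Richter energy relation: log10(E) = 1.5*M + 4.8."""
    return 10.0 ** (1.5 * magnitude + 4.8)

# Energy ratio between a magnitude 9.0 and a magnitude 7.1 event
ratio = seismic_energy_joules(9.0) / seismic_energy_joules(7.1)
# ratio is roughly 700: about 700 times more energy released
```

This is an order-of-magnitude illustration only; the relation is an empirical approximation, not a precise physical law.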


It may sound a bit alarmist, but Isabel Hong, PhD, assistant professor in Villanova University's Department of Geography and the Environment, assures that "even though it is not possible to predict earthquakes, the advisory comes from a place of prior knowledge."


"We can't say for certain [when these earthquakes will happen]," she reiterated. "But probability suggests it could be more likely, in part because this smaller earthquake event occurred."


The acute event—last week's earthquake—is indeed the root of the alert, which was issued within hours. The quake's epicenter was located near the edge of the Nankai Trough, a subduction zone off the coast of Japan where the Philippine Sea Plate slips beneath the Eurasian Plate. The Nankai Trough has historically produced strong earthquakes, most recently a magnitude 8.0 tremor in 1946.


"Oftentimes, a large earthquake event can then trigger subsequent earthquakes," Dr. Hong said. "It can transfer stress to other faults that can make it more conducive for other earthquakes to then rupture, and that's the general belief of what's happening with the Nankai Trough right now."


Compounding last week's event, Japanese government officials had previously warned of a 70-80 percent likelihood of a magnitude 8-9 Nankai Trough earthquake within the next 30 years. That warning was the product of extensive research into the region's seismic history.


"All of the data that goes into [an advisory like that] is pulled from the work of dedicated scientists looking at past earthquake and tsunami deposits," said Dr. Hong, who herself studies prehistoric geohazards by analyzing their geologic trails along coasts. "This allows us to refine our understanding of the frequency of such events in a region. In this case, scientists can say, 'These happen about every 100 years, and it's already been over 70. Therefore, there's a higher probability another will occur in the next 30.'"
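The recurrence-interval reasoning Dr. Hong describes can be sketched numerically. The following is a minimal illustration with made-up parameters, not the JMA's actual forecasting model: assuming rupture intervals are roughly normally distributed with an illustrative 100-year mean and 30-year standard deviation, the probability of a rupture in the next 30 years, conditioned on 70 quiet years having already passed, is far higher than for an arbitrary 30-year window early in the cycle.

```python
import math

def normal_cdf(x: float, mu: float, sigma: float) -> float:
    """CDF of a normal distribution, computed via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def conditional_quake_prob(elapsed: float, window: float,
                           mean_interval: float, sd: float) -> float:
    """P(next rupture within `window` years | `elapsed` years have already
    passed without one), for normally distributed recurrence intervals."""
    p_survived = 1.0 - normal_cdf(elapsed, mean_interval, sd)
    p_in_window = (normal_cdf(elapsed + window, mean_interval, sd)
                   - normal_cdf(elapsed, mean_interval, sd))
    return p_in_window / p_survived

# Illustrative parameters only: ~100-year mean recurrence, 30-year spread.
p = conditional_quake_prob(elapsed=70, window=30, mean_interval=100, sd=30)
```

With these hypothetical numbers the conditional probability comes out near 40 percent, versus under 1 percent for a 30-year window at the start of the cycle; the actual 70-80 percent figure cited by officials comes from the JMA's own data and a more sophisticated model.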


If it does, officials fear that a strong earthquake could trigger a massive tsunami that would reach the coast of Japan within minutes due to its proximity, threatening the lives of hundreds of thousands of individuals.


"Tsunamis occur along active subduction zones like the Nankai Trough," Dr. Hong said. "They do have to be generated by a strong earthquake, yes, but more important in their impact to coastal communities is the shape of the coastline offshore. If they go from deep to shallow water very fast, the tsunami builds tall."


So, whether it appears alarmist or not, having the ability to study these seismic events in a way that can warn individuals of heightened risks should not be taken for granted, says Dr. Hong. Early warning signs and advisories for potential geohazards can save lives.


"One of the reasons we dig into the geologic past is to help inform people what could happen in the future."

