
State-sponsored computational propaganda is a potential threat during the 2022 Winter Olympics
Sagar Samtani, assistant professor and Grant Thornton Scholar, whose research centers on AI for cybersecurity and cyber threat intelligence, is watching two major cybersecurity issues in particular during the 2022 Winter Olympics in Beijing Feb. 4-20. “The Olympics are an international, global event. As such, there are often political undertones and agendas that may drive how countries present themselves. Disinformation, misinformation, and computational propaganda that are state-sponsored or provided by individual threat actors could pose a significant threat,” Samtani said. Samtani noted that this will be the biggest Olympics yet for streaming services. For example, NBC Universal will present a Winter Olympics record of more than 2,800 hours of coverage. But this move away from reliance on broadcast channels could present a tantalizing target for hackers. “The Olympics are a widely covered, highly publicized TV event. In recent years, streaming services have grown in popularity, while conventional satellite and cable services have declined. As such, the concerns around denial-of-service attacks against prevailing streaming services as it pertains to viewing the Olympics is a very real concern,” he said. Samtani can be reached at ssamtani@iu.edu.

Will Biden’s Plan to Resettle Afghans Transform the U.S. Refugee Program?
Among the high-profile anti-immigration policies that characterized the four years of Donald Trump’s presidency was a dramatic contraction in refugee resettlement in the United States. President Biden has expressed support for restoring U.S. leadership, and increased commitment is needed to help support the more than 80 million people worldwide displaced by political violence, persecution, and climate change, says UConn expert Kathryn Libal. As Libal writes, with co-author and fellow UConn professor Scott Harding, in a recent article for the Georgetown Journal of International Affairs, the rapid evacuation of more than 60,000 Afghans pushed the Biden administration to innovate by expanding community-based refugee resettlement and creating a private sponsorship program. But more resources are needed to support programs that were severely undermined in previous years and to support community-based programs that help refugees through the resettlement process: Community sponsorship also encourages local residents to “invest” in welcoming refugees. Under existing community sponsorship efforts, volunteers often have deep ties to their local communities—critical for helping refugees secure housing, and gain access to employment, education, and health care. As these programs expand, efforts to connect refugees to community institutions and stakeholders, which are crucial to help facilitate their social integration, may be enhanced. As Chris George, Executive Director of Integrated Refugee and Immigrant Services in New Haven, Connecticut, has observed, “It’s better for the refugee family to have a community group working with them that knows the schools and knows where to shop and knows where the jobs are.” As more local communities take responsibility for sponsoring refugee families, the potential for a more durable resettlement program may be enhanced. 
In the face of heightened polarization of refugee and immigration policies, community sponsorship programs can also foster broad-based involvement in refugee resettlement. In turn, greater levels of community engagement can help challenge opposition toward and misinformation about refugees and create greater public support for the idea of refugee resettlement. Yet these efforts are also fraught with significant challenges. Sponsor circle members may have limited capacity or skills to navigate the social welfare system, access health care services, or secure affordable housing for refugees. If group members lack familiarity with the intricacies of US immigration law, helping Afghans designated as “humanitarian parolees” attain asylum status may prove daunting. Without adequate training and ongoing support from resettlement agencies and caseworkers, community volunteers may experience “burn out” from these various responsibilities. Finally, “successful” private and community sponsorship efforts risk providing justification to the arguments of those in support of the privatization of the USRAP and who claim that the government’s role in resettlement should be limited. Opponents of refugee resettlement could argue that community groups are more effective than the existing public–private resettlement model and seek to cut federal funding and involvement in resettlement. Such action could ultimately limit the overall number of refugees the United States admits in the future. December 11 - Georgetown Journal of International Affairs. If you are a journalist looking to know more about this topic – then let us help with your coverage and questions. An associate professor of social work and human rights, Kathryn Libal is the director of UConn's Human Rights Institute and is an expert on human rights, refugee resettlement, and social welfare. She is available to speak with media – click on her icon now to arrange an interview.

Misinformation expert on YouTube's ban on false vaccine information
Lisa Fazio, associate professor of psychology and human development, is available for commentary on YouTube's ban on vaccine misinformation. Fazio is an expert in misinformation and false news, studying how people learn, interpret and remember information. She can speak to YouTube's history as a vector for misinformation and other points related to the topic, including: the importance of finding trusted sources, and why it is even harder than normal to tell what is reliable, especially with so many personal accounts circulating; misinformation and mixed messages sent from politicians and government officials about vaccines; and the dangers of so many "experts" talking publicly about vaccines, particularly around the periphery of their expertise, and the damage that has been done during the pandemic.

As America begins to stare down a fourth wave of COVID-19, vaccine awareness and the debate over whether or not to get immunized remain hot topics. And unfortunately, the level of misinformation being spread on social media is rampant. Recently, MSU's Anjana Susarla was featured in The Conversation in a piece titled 'Big tech has a vaccine misinformation problem – here's what a social media expert recommends'. It's a very compelling read and must-have information for anyone looking at the threat of fake news and how quickly it can spread. The article also highlights tactics for blocking sites and mitigating the spread of misinformation. And if you are a journalist looking to know more about how big tech needs to keep up the fight against fake news – then let us help. Anjana Susarla is the Omura Saxena Professor of Responsible AI at Michigan State University. She's available to speak with media; simply click on her icon now to arrange an interview today.

First Commercial-Scale Wind Farm in the U.S. Would Generate Electricity to Power 400,000 Homes
The Vineyard Wind project, located off the coast of Massachusetts, is the first major offshore wind farm in the United States. It is part of a larger push to tackle climate change, with other offshore wind projects along the East Coast under federal review. The U.S. Department of the Interior has estimated that, by the end of the decade, 2,000 turbines could be along the coast, stretching from Massachusetts to North Carolina. "While the case for offshore wind power appears to be growing due to real concerns about global warming, there are still people who fight renewable energy projects based on speculation, misinformation, climate denial and 'not in my backyard' attitudes," says Karl F. Schmidt, a professor of practice in Villanova University's College of Engineering and director of the Resilient Innovation through Sustainable Engineering (RISE) Forum. "There is overwhelming scientific evidence that use of fossil fuels for power generation is driving unprecedented levels of CO2 into our atmosphere and oceans. This causes sea level rise, increasing ocean temperature and increasing ocean acidity, all of which have numerous secondary environmental, economic and social impacts." Schmidt notes that what's often missing for large capital projects like the Vineyard Wind project is a life cycle assessment (LCA), which looks at environmental impacts throughout the entire life cycle of the project, i.e., from raw material extraction, manufacturing and construction through operation and maintenance and end of life. These impacts, in terms of tons of CO2 equivalent, can then be compared with the baseline—in this case, natural gas/coal power plants. "With this comprehensive look, I suspect the LCA for an offshore wind farm would be significantly less than for a fossil fuel power plant," says Prof. Schmidt.
Complementing the LCA should be a thorough, holistic view encompassing the pertinent social, technological, environmental, economic and political (STEEP) aspects of the project, notes Prof. Schmidt. "This would include all views of affected stakeholders, such as residents, fishermen, local officials and labor markets. Quantifying these interdependent aspects can lead to a more informed and balanced decision-making process based on facts and data." "At Villanova's Sustainable Engineering Department, we've successfully used both the LCA and STEEP processes... for many of our RISE Forum member companies' projects," notes Prof. Schmidt.

The Facebook Oversight Board’s ruling temporarily upholding the social media giant’s ban on former President Donald J. Trump, which the board instructed the company to reassess within six months, noted that the parameters for an indefinite suspension are not defined in Facebook's policies. The non-decision in this high-profile case illustrates the difficulties stemming from the lack of clear frameworks for regulating social media. For starters, says web science pioneer James Hendler, social media companies need a better definition of the misinformation they seek to curb. Absent a set of societally agreed-upon rules, like those that define slander and libel, companies currently create and enforce their own policies, and the results have been mixed at best. “If Trump wants to sue to get his Facebook or Twitter account back, there’s no obvious legal framework. There’s nothing to say of the platform, ‘If it does X, Y, or Z, then it is violating the law,’” said Hendler, director of the Institute for Data Exploration and Applications at Rensselaer Polytechnic Institute. “If there were, Trump would have to prove in court that it doesn’t do X, Y, or Z, or Twitter would have to prove that it does, and we would have a way to adjudicate it.” As exemplified in disputes over the 2020 presidential election results, political polarization is inflamed by a proliferation of online misinformation. A co-author of the seminal 2006 Science article that established the concept of web science, Hendler said that “as society wrestles with the social, ethical, and legal questions surrounding misinformation and social media regulation, it needs technologists to help inform this debate.” “People are claiming artificial intelligence will handle this, but computers and AI are very bad at ‘I’ll know it when I see it,’” said Hendler, whose most recent book is titled Social Machines: The Coming Collision of Artificial Intelligence, Social Networking, and Humanity.
“What we need is a framework that makes it much clearer: What are we looking for? What happens when we find it? And who’s responsible?” The legal restrictions on social media companies are largely dictated by a single sentence in the Communications Decency Act of 1996, known as Section 230, which establishes that internet providers and services will not be treated as traditional publishers, and thus are not legally responsible for much of the content they link to. According to Hendler, this clause no longer adequately addresses the scale and scope of the power these companies currently wield. “Social media companies provide a podium with an international reach of hundreds of millions of people. Just because social media companies are legally considered content providers rather than publishers, it doesn’t mean they’re not responsible for anything on their site,” Hendler said. “What counts as damaging misinformation? With individuals and publishers, we answer that question all the time with libel and slander laws. But what we don’t have is a corresponding set of principles to adjudicate harm through social media.” Hendler has extensive experience in policy and advisory positions dealing with artificial intelligence, cybersecurity, and internet and web technologies as they bear on issues such as the regulation of social media and of powerful technologies like facial recognition. He is available to speak to diverse aspects of policies related to social media, information technologies, and AI.

Evaluating the Impact of Facebook's Ban on Vaccine Misinformation
A new Facebook policy has banned misinformation about all vaccines on its platform. Villanova University professor Jie Xu, PhD, who specializes in science and health communication, examined this decision. "On one hand, there clearly is a lot of mis/disinformation on social media regarding vaccines; some of it is simply uninformed and, of course, harmful to public health," said Dr. Xu. "On the other hand, many details relating to the COVID-19 vaccine, in my view, are still open to scientific debate." So, what determines what is labeled misinformation? Dr. Xu believes this is a complicated determination. "Science itself is evolving with falsification and revision to previous claims when new evidence comes in," Dr. Xu noted. "Who is to say that some claims deemed true at this moment won't be overturned in the future? What are the standards to be used in defining what is true information or misinformation? And perhaps more importantly, who are the 'fact-checkers' that are considered trustworthy to the majority of Americans?" However, there are some benefits to Facebook's decision. "On a more positive note, there is some preliminary evidence indicating that labeling misinformation on social media may help to alleviate the negative influence of vaccine misinformation claims," Dr. Xu said. "The challenge is that the people who are most susceptible to misinformation, and those that health professionals really want to reach out to, are the ones who have the least trust in this type of intervention. In some corners, this will likely be viewed as a violation of free speech and perhaps backfire." How does Facebook's banning align with free speech? "My understanding of free speech is that it's not that we don't pay a price for it—unless it's inciting violence, most information has been allowed to flow relatively freely—but it's that the alternative could be much worse," said Dr. Xu.
"At the end of the day, we need to create an environment in which honest, open and critical conversations are welcomed, and we do need each other to find the truth."

Network Science Offers Key Insights into Polarization, Disinformation, and Minority Power
People tend to think of the arena of politics as being driven by human decisions and emotions, and therefore unpredictable. But network scientists like Boleslaw Szymanski, a computer science professor at Rensselaer Polytechnic Institute, have found that the country’s political activity – from American society’s ever-growing partisan divide to its grappling with the spread of misinformation online – can be explained by abstract and elegant models. These models provide insights — and even answers — to a number of pressing questions: Is increasing access to information driving us apart? Can an entrenched minority ultimately prevail? Could structural changes be made that insulate us from misinformation and reduce the polarization that divides us? Szymanski studies the technical underpinnings of our choices, how we influence one another, and the impact of the algorithms we rely upon to navigate a growing ocean of information. His work has yielded fascinating insights, including research on how a committed minority can overcome less determined opposition and the development of a parameter that identifies what drives polarization in Congress. Through his research on the influence of minority opinions, Szymanski found that when just 10 percent of the population holds an unshakable belief, that belief will ultimately be adopted by the majority of the society. “When the number of committed opinion holders is below 10 percent, there is no visible progress in the spread of ideas. It would literally take the amount of time comparable to the age of the universe for this size group to reach the majority,” said Szymanski.
“Once that number grows above 10 percent, the idea spreads like flame.” In his present work, Szymanski is researching tools for measuring the level of polarization on specific news sites, search engines, and social media services, and developing remedies, like algorithms that offer better data provenance, detect misinformation, and support internal-, background-, and intra-element-consistency reasoning. “Informed citizens are the foundation of democracy, but the driving interest of big companies that supply information is to sell us a product,” Szymanski said. “The way they do that on the internet is to repeat what we showed interest in. They’re not interested in a reader’s growth — they’re interested in the reader’s continued attention.” With the political environment becoming increasingly bitter and dubious information becoming ever more prevalent, Szymanski is available to discuss his research on polarization, disinformation, and the power of a committed minority.
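The 10 percent tipping point described above comes from agent-based simulations of committed minorities in opinion-dynamics models of the naming-game family. The sketch below is a minimal, illustrative reimplementation of that class of model, not Szymanski's actual code; the function name and parameters are invented for this example. Each agent holds a set of competing opinions, a committed fraction never abandons opinion 'A', and random pairwise conversations either spread a word or collapse both parties onto a shared one.

```python
import random

def committed_minority_sim(n=500, committed_frac=0.10, steps=200_000, seed=1):
    """Binary naming game: each agent holds a subset of {'A', 'B'}.
    The first n_committed agents are zealots who always hold {'A'}.
    Returns the final fraction of agents holding only 'A'."""
    rng = random.Random(seed)
    n_committed = int(n * committed_frac)
    opinions = [{'A'} if i < n_committed else {'B'} for i in range(n)]
    for _ in range(steps):
        speaker, listener = rng.sample(range(n), 2)
        word = rng.choice(sorted(opinions[speaker]))  # speaker utters one of its words
        if word in opinions[listener]:
            # Agreement: both uncommitted parties collapse to the shared word
            if speaker >= n_committed:
                opinions[speaker] = {word}
            if listener >= n_committed:
                opinions[listener] = {word}
        elif listener >= n_committed:
            # Disagreement: the uncommitted listener adds the new word
            opinions[listener].add(word)
    return sum(o == {'A'} for o in opinions) / n
```

Run with a committed fraction well above the tipping point and the whole population converges to 'A'; run with a small committed fraction and the minority opinion barely spreads beyond the zealots themselves, matching the qualitative behavior Szymanski describes.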

Paper ballots, risk-limiting audits can help defend elections and democracy, IU study finds
BLOOMINGTON, Ind. -- With just over two months before the 2020 election, three professors at the Indiana University Kelley School of Business offer a comprehensive review of how other nations are seeking to protect their democratic institutions and present the case that a multifaceted, targeted approach is needed to achieve that goal in the U.S., where intelligence officials have warned that Russia and other rivals are again attempting to undermine our democracy. But these concerns over election security are not isolated to the United States, and they extend far beyond safeguarding insecure voting machines and questions about voting by mail. Based on an analysis of election reforms in Australia and European Union nations, the researchers outline steps to address election infrastructure security -- such as requiring paper ballots and risk-limiting audits -- as well as deeper structural interventions to limit the spread of misinformation and combat digital repression. "In the United States, despite post-2016 funding, still more than two-thirds of U.S. counties report insufficient funding to replace outdated, vulnerable paperless voting machines; further help is needed," said Scott Shackelford, associate professor of business law and ethics in the Kelley School, executive director of the Ostrom Workshop and chair of IU's Cybersecurity Program. "No nation, however powerful, or tech firm, regardless of its ambitions, is able to safeguard democracies against the full range of threats they face in 2020 and beyond. Only a multifaceted, polycentric approach that makes necessary changes up and down the stack will be up to the task." For example, Australia -- which has faced threats from China -- has taken a distinct approach to protect its democratic institutions, including reclassifying its political parties as "critical infrastructure." This is a step that the U.S. government has yet to take despite repeated breaches at both the Democratic and Republican national committees.
The article, "Defending Democracy: Taking Stock of the Global Fight Against Digital Repression, Disinformation and Election Insecurity," has been accepted by Washington and Lee Law Review. Other authors are Anjanette "Angie" Raymond, associate professor of business law and ethics, and Abbey Stemler, assistant professor of business law and ethics, both at Kelley; and Cyanne Loyle, associate professor of political science at Pennsylvania State University and a global fellow at the Peace Research Institute Oslo.
Aside from appropriating sufficient funds to replace outdated voting machines and tabulation systems, the researchers said that Congress should encourage states to refuse to fund voting machines with paperless ballots. The researchers also suggest requiring risk-limiting audits, which use statistical samples of paper ballots to verify official election results. Other suggested steps include: having Congress require the National Institute of Standards and Technology to update its voting machine standards, which state and county election officials rely on when deciding which machines to purchase (Australia undertook such a measure); creating a National Cybersecurity Safety Board to investigate cyberattacks on U.S. election infrastructure and issue post-election reports to ensure that vulnerabilities are addressed; and working with universities to develop training for election officials nationwide to prepare them for an array of possible scenarios, along with a cybersecurity guidebook for newly elected and appointed election officials. "With regards to disinformation in particular, the U.S. government could work with the EU to globalize the self-regulatory Code of Practice on Disinformation for social media firms, thus avoiding thorny First Amendment concerns," Raymond said. "It could also work to create new forums for international information sharing and more effective rapid alert and joint sanctions regimes. "The international community has the tools to act and hold accountable those actors that would threaten democratic institutions," added Stemler, who also is a faculty associate at Harvard University's Berkman Klein Center for Internet and Society.
"Failing the political will to act, pressure from consumer groups and civil society will continue to mount on tech firms, in particular Facebook, which may be sufficient for them to voluntarily expand their efforts in the EU globally, the same way that more firms are beginning to comply with its General Data Protection Regulation globally, as opposed to designing new information systems for each jurisdiction."
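The risk-limiting audits the researchers recommend are sequential statistical tests on randomly drawn paper ballots. The sketch below illustrates the idea with a BRAVO-style ballot-polling audit for a two-candidate race; the function name, parameters, and ballot encoding are invented for this example and are not from the IU study. Each sampled ballot for the reported winner multiplies a likelihood-ratio statistic up, each ballot for the loser multiplies it down, and the audit stops once the statistic clears the threshold implied by the risk limit.

```python
import random

def ballot_polling_audit(reported_winner_share, paper_ballots,
                         risk_limit=0.05, max_draws=5000, seed=0):
    """BRAVO-style sketch: draw paper ballots at random and update a
    sequential likelihood-ratio statistic until the reported outcome is
    confirmed or the sample budget is exhausted (then escalate, e.g. to
    a full hand count). paper_ballots holds 'W' (reported winner) or
    'L' (loser) marks as read from the paper trail."""
    rng = random.Random(seed)
    s = reported_winner_share       # winner's share per the reported tally
    T = 1.0                         # test statistic (null: true share <= 1/2)
    for draws in range(1, max_draws + 1):
        ballot = rng.choice(paper_ballots)
        T *= 2 * s if ballot == 'W' else 2 * (1 - s)
        if T >= 1 / risk_limit:     # strong evidence the outcome is correct
            return True, draws
    return False, max_draws         # not confirmed within budget: escalate
```

When the paper trail matches a comfortable reported margin, the audit typically confirms the outcome after inspecting only a small sample; when the ballots contradict the reported result, the statistic drifts downward and the audit escalates rather than confirming, which is what makes the procedure "risk-limiting."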

Ask an Expert: How can you recognize false information about the virus?
Lisa K. Fazio, assistant professor of psychology at Vanderbilt Peabody College of education and human development, gives tips to social media consumers on how to recognize misleading information about COVID-19. For more information, read an essay about ways to identify misinformation written by Fazio for the Peabody Reflector. Vanderbilt University faculty are sharing their expertise on a range of topics related to COVID-19. Subscribe to Vanderbilt’s “Ask an Expert” series on YouTube to get the latest updates.