
Finding Truth among the Tweets. Our expert weighs in on the role social media plays during war.
With the Israel-Hamas war raging on, social media provides a vital source of information for many individuals trying to stay up to date. Across platforms there are reliable sources, but there are also those with an agenda to spread falsehoods and blatant lies, and to sow doubt with doses of mis- and disinformation. It's a topic Goizueta Business School professor David Schweidel is watching closely.

"We are seeing once again the need for the regulation of social media platforms," says Schweidel. "Platforms have a financial incentive to serve up the most provocative and arousing content, and content moderation is often at odds with financial goals."

Social media is being flooded with content, much of it misinformation, and social platforms are unwilling or unable to effectively moderate what's being posted.

"Beyond the likely reduction in revenue, implementing content moderation at scale is expensive and difficult. If viewed from a short-term financial perspective, allowing for a free-for-all is less costly and will result in more user engagement, which drives revenue," Schweidel adds.

And it is not as if legislators and lawmakers are unaware. As of today, social media platforms aren't liable for the content posted on them (under Section 230 of the Communications Decency Act). Two recent lawsuits sought to challenge Section 230, arguing that platforms actively promote content through their algorithms and thereby go beyond simply being intermediaries providing access to content posted online by others, but the Supreme Court declined to take such action. Some, such as the ACLU, view Section 230 as safeguarding free speech online.

There's a lot more to know, such as:

The challenges in identifying real vs. fake content
Which platforms are being effective in moderating content
How U.S. and EU laws vary in terms of regulating misinformation on social media platforms

And that's where we can help. David A. Schweidel is Professor of Marketing at Emory University's Goizueta Business School. He's a renowned marketing analytics expert focused on the opportunities at the intersection of marketing and technology. David is available to speak with media regarding this important topic; simply click on his icon now to arrange an interview today.

The Role of Artificial Intelligence in Customer Experience
Gaurav Jain, assistant professor of marketing at the Rensselaer Lally School of Management, examines how individuals make judgments, estimates, and decisions in the absence of complete information. Previously, Jain served as the chief marketing advisor at multiple firms. Below are his thoughts on the impact of artificial intelligence (AI) on customer experience.

Voice of the Customer

In today's hyper-connected world, the voice of the customer (VoC) is louder and clearer than ever. But how do we sift through this cacophony to understand what our customers are really saying? Enter AI. It's revolutionizing the way customer experience teams handle VoC programs, and as a marketing leader, I find this incredibly exciting.

Take direct customer feedback, for example. We're no longer just collecting survey responses and storing them in a database for quarterly review. AI algorithms, particularly those using natural language processing, are helping us instantly categorize and prioritize this feedback. Imagine an e-commerce platform that can immediately flag a customer's mention of "late delivery" in a post-purchase survey. That's not just efficient; it's customer-centric.

But what about the things customers are saying when they're not directly talking to us? That's where AI-driven sentiment analysis comes in. These tools can scan social media, forums, and review sites to gauge the sentiment behind a customer's words. I've seen hotel chains use this technology to monitor travel forums and review sites. If a guest mentions "noisy rooms," even without lodging a direct complaint, the brand can proactively look into soundproofing solutions.

Then there's inferred feedback, the kind you get by reading between the lines. AI can analyze customer behavior, like frequent page visits without conversion or cart abandonment, to suggest what might be going wrong. For instance, an online fashion retailer could use AI to figure out why a particular dress gets a lot of views but few purchases. Maybe it's the sizing, maybe it's the price, but the point is, you get to know without having to ask.

And it doesn't stop at gathering feedback. AI is helping us turn this raw data into actionable insights. We can predict future behavior, like churn rates, based on past feedback. This allows us to be proactive rather than reactive, which is a game-changer in customer experience management.

Finally, let's talk about what happens after we've gathered all this feedback. AI is ensuring that every customer who takes the time to share their thoughts receives an immediate and appropriate response. Chatbots can handle common queries or concerns, making the customer feel heard and valued right away.

So, from the perspective of a marketing leader, it's not just about the efficiency that AI brings to VoC programs. It's about the opportunity to deepen our connection with customers. By truly understanding their words, their sentiments, and even their behaviors, we can craft experiences that resonate on a human level. And in a world that's increasingly digital, that human touch is what sets a brand apart.
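To make the categorization-and-sentiment idea concrete, here is a minimal sketch in Python using the off-the-shelf sentiment pipeline from the Hugging Face transformers library. The feedback strings, keyword rules and routing labels are illustrative assumptions for this sketch, not part of any production VoC system Jain describes.

```python
# Minimal sketch: tag free-text customer feedback with sentiment and
# flag operational keywords (e.g., "late delivery") for prioritization.
# Requires: pip install transformers torch
from transformers import pipeline

# Off-the-shelf sentiment model; a real VoC pipeline would likely use
# a model fine-tuned on the company's own feedback data.
sentiment = pipeline("sentiment-analysis")

# Hypothetical keyword rules mapping phrases to the team that should act.
PRIORITY_KEYWORDS = {
    "late delivery": "logistics",
    "noisy room": "facilities",
    "wrong size": "merchandising",
}

def triage(feedback: str) -> dict:
    """Return sentiment plus routing flags for one piece of feedback."""
    result = sentiment(feedback)[0]  # e.g. {'label': 'NEGATIVE', 'score': 0.98}
    flags = [team for phrase, team in PRIORITY_KEYWORDS.items()
             if phrase in feedback.lower()]
    return {"text": feedback, "sentiment": result["label"],
            "confidence": round(result["score"], 2), "route_to": flags}

if __name__ == "__main__":
    for note in ["Great dress, but the late delivery was frustrating.",
                 "Checkout was smooth and fast!"]:
        print(triage(note))
```

The keyword matching is deliberately simple here; the point is that sentiment scoring and routing can run the moment feedback arrives, rather than waiting for a quarterly review.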
Customer Service

It's truly intriguing to observe how AI is weaving its way into the customer experience. Online, chatbots are making waves. Chatbots are not just digital tools; they're our first point of contact, bridging the gap between brands and consumers. However, there was always a question of accuracy versus efficiency in managing these chatbots; AI has answered that question.

AI chatbots provide real-time yet accurate assistance, making the digital shopping journey feel more interactive. Companies can reduce customer dropout while avoiding the expense of managing a large human customer service team.

AI is revolutionizing phone-based customer service as well. Voice recognition allows natural language processing for easier navigation, while predictive analysis anticipates caller needs based on their history. Enhanced personalization means customers no longer repetitively provide account details, and emotion detection aids in gauging caller mood. The result? Reduced wait times, more efficient interactions, and a significantly improved telephonic customer experience.

In essence, AI is bridging the gap between technology and human touch in the retail world, making our interactions with brands more meaningful and personalized. Again, companies can do this in a cost-effective manner.

Jain is available to speak with media; simply click on his icon now to arrange an interview today.

Hospital at Home: Understanding How It Works
As the ChristianaCare Hospital Care at Home program grows, we see the benefits of improving the way we deliver care to our patients. With each patient we care for, we are reminded that a big part of recuperating and getting better is not just physical but mental and emotional. Being home allows patients to visit with loved ones, cuddle with pets and sleep in their own beds. It also helps our health care providers better understand a patient's living environment, making it possible for us to provide the individual services they need.

Q. What is hospital care at home?

A. I think of a virtual hospital as three components: a command center, technology and in-home care. The command center is a 24/7, 365-day-a-year center staffed by physicians, nurses, advanced practice clinicians and patient digital ambassadors. This team of health care providers is tethered to patients in the home by way of our technology. We give our patients a tablet that lists their daily schedule so they know who to expect in their home and the time our health care providers will arrive. It also allows them to contact the command center at any time by pressing a button. When they do that, a nurse appears on the screen right away.

Edwin Bryson Sr. said ChristianaCare made it easy to treat his diabetes complications from the privacy and comfort of his own home. With hospital care at home, he said, "all I do is hit the button and a nurse comes on to assist me with anything I need. It was 24-hour service here, just like I was in the hospital."

Technology also allows us to monitor patients' vital signs at home as we would in the traditional hospital setting. We use Bluetooth technology to upload that information into the electronic medical record.

In-home care is made up of a team that goes into the home to deliver the services that a particular patient needs. This includes radiology (X-rays and ultrasound), blood tests, intravenous medications, physical therapy, occupational therapy and more. A licensed professional, such as a nurse, also visits the patient at home at least twice a day.

Q. Who is eligible for hospital care at home?

A. There are requirements for participation. Patients need to live within 25 miles of our Delaware hospital campuses, which are in Newark and Wilmington. We also are looking for patients who meet our acute, inpatient level of care. So if they're in observation status, for example, they wouldn't be a good candidate. We also need patients who don't require continuous monitoring: if a patient has telemetry monitoring or if they're in the intensive care unit or a step-down unit, they would not be a good candidate. Our team works every day with caregivers at both Delaware hospitals to identify patients who would benefit from hospital care at home.

Q. What are common sicknesses that can be treated at home?

A. The first 20 patients we admitted into this program had 20 different diagnoses. But after treating more than 500 patients, the most common diagnoses that we see are cellulitis, sepsis, pneumonia, chronic obstructive pulmonary disease (COPD) exacerbation and congestive heart failure.

Hospital at home may not be the solution for all patients, but in many cases it can help patients get better quicker and in a place where they feel most comfortable. As ChristianaCare strives for greater access to care, home may be where the health is.

Expert Q&A: Amid the Wildfire Haze, NJIT's Alexei Khalizov Explains What's in the Air
The soot that permeated the air in New Jersey and New York this summer — courtesy of massive wildfires in Canada — is exactly what a New Jersey Institute of Technology professor is studying to determine its impact on climate change. Alexei Khalizov, an associate professor of chemistry and environmental science, is partnering with Associate Professor Gennady Gor on the three-year project, which began last year and is supported by a $620,000 grant from the National Science Foundation. Specifically, they're examining the soot created by wildfires and the burning of fossil fuels in hopes of better predicting its impact on climate. Khalizov, who's been at NJIT since 2013, took time out from his research to explain what millions of residents of N.J. and N.Y. are experiencing as a result of the wildfires hundreds of miles to the north.

Q: What's in the smoke?

Small particles and some gas chemicals. These particles and chemicals were released by wildfires, and they were picked up by the air mass and carried all the way to New Jersey from Canada. Those particles are extremely small: you could stack maybe a hundred of them across the thickness of a single human hair.

Q: Is breathing it the equivalent of smoking a pack of cigarettes?

That would be a reasonable comparison. A cigarette is made of plant material. When it smolders and burns, it releases particles that are very much like those particles from wildfires. Maybe the only difference is that the wildfires have no nicotine. But they have lots of other chemicals.

Q: What factors contribute to the density of the smoke?

Well, it's a major wildfire. It covers a huge territory in Canada. And the meteorology is such that this smoke is carried all the way from Canada to the U.S. without significant dilution. And due to that, the concentration of those particles is very high.

Q: When did we last experience something of this magnitude?

We had some Canadian and Alaskan wildfires a few years ago. And air mass transport brought the smoke all the way to New Jersey, but it wasn't as bad as what we are observing today.

Q: What about in terms of EPA standards?

The Environmental Protection Agency has a list of criteria pollutants. One of those pollutants is particles smaller than 2.5 microns. And typically, if the concentration of those particles exceeds 35 micrograms per cubic meter, the air is considered unhealthy. When I looked at the map of pollution today (June 7, 2023), it showed that throughout the majority of New Jersey, the concentration is around 90 micrograms, which is two to three times higher than this unhealthy threshold. And actually, there is a location, I believe it's around Paterson, where the concentration is 140 micrograms, which is four to five times above the threshold.
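The exceedance figures are simple ratios against the EPA's 24-hour fine-particle (PM2.5) standard of 35 micrograms per cubic meter that Khalizov cites; here is a quick sketch in Python using the readings quoted in the interview:

```python
# Check the PM2.5 exceedance ratios cited in the interview (June 7, 2023).
EPA_PM25_THRESHOLD = 35.0  # micrograms per cubic meter (24-hour standard)

readings = {
    "most of New Jersey": 90.0,   # micrograms per cubic meter
    "Paterson area": 140.0,
}

for location, concentration in readings.items():
    ratio = concentration / EPA_PM25_THRESHOLD
    print(f"{location}: {concentration:.0f} ug/m3 is "
          f"{ratio:.1f}x the unhealthy threshold")
# most of New Jersey: 90 ug/m3 is 2.6x the unhealthy threshold
# Paterson area: 140 ug/m3 is 4.0x the unhealthy threshold
```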
Q: Can the wildfires in South Jersey be contributing to this?

It's possible, but probably it's not a major contribution. Also, if you look at the wind pattern, it's probably not a major factor at all.

Q: Why is wind unable to disperse the smoke?

For the wind to disperse the smoke, one needs to mix clean air with all this contaminated air, and the amount of contaminated air is so high that there's no clean air around to actually produce any dilution.

Q: Why is there so much haze?

It's because of the continuous inflow of air, which is contaminated by emissions from the wildfires. The haze itself has a relatively short lifespan.

Q: How does temperature change affect the smoke?

If the temperature increases, that may accelerate the rate of some chemical reactions, which are also accelerated by the sunlight. And that's one reason why the smoke that was released in Canada is not exactly the same smoke that we experience in New Jersey. As this haze is traveling over three to six hundred miles, it undergoes a number of chemical reactions and even the smell changes. You know how freshly released wood smoke smells — it's actually pretty pleasant. What we're smelling now is not pleasant at all. That's the result of those chemical reactions, which makes this even more unhealthy.

Q: Will rain immediately clear the smoke?

Yes, it will. If we could have rain, then the rain would remove the majority of these particles. And in fact, I believe we've been experiencing the smoke for several days, almost a week now. It would go up and decrease. And we've had several rains, and those rains did really clear out some smoke.

Q: What can we do individually and collectively to protect ourselves?

We can help ourselves by staying indoors and wearing masks if we have to go outside. Certainly, exercising outside is not a good idea, even while wearing a mask. Also, if you have a central air conditioning system, you can turn on the fan to run the air through the filter, which will remove some of these particles. It depends on what kind of filter you have — high efficiency or regular.

Q: What kind of mask?

Make sure that it's an N95 mask, not a surgical mask. A surgical mask is not going to help you at all.

Q: How does what we're experiencing relate to your research?

My collaborator and I received a major grant from the National Science Foundation to study the particles released by combustion. As they travel through the air, they change both in shape and in composition. And these changes affect their toxicity, and they affect their impact on climate. These particles actually are one of the warming agents. So, we hope that within about three years of working on this project, we'll be able to explain better what happens, and then modelers will be able to predict the impacts of such events with better accuracy.

Looking to know more? We can help. Alexei Khalizov is available to discuss this important topic with media; simply click on his icon now to arrange an interview.

Almost 30% of adults in the U.S. lack basic numeracy skills, meaning they don't understand simple processes like counting, arithmetic and calculating percentages. Two professors from Georgia Southern University's College of Education (COE) are part of a collaborative effort, funded by a $3 million National Science Foundation (NSF) grant, aimed at improving that number for upcoming generations.

Sam Rhodes, Ph.D., assistant professor of elementary mathematics education, and Antonio Gutierrez de Blume, Ph.D., professor in curriculum, foundations and reading, were awarded almost $400,000 of the overall sum. In collaboration with researchers from the University of Minnesota and the University of Pennsylvania, as well as CueThink, an online application focused on improving math problem-solving and collaboration skills, they will help middle school students better understand numerical and mathematical concepts.

"The grant is important to me because I am passionate about helping students improve their abilities to engage in mathematical problem solving," said Rhodes. "Collaborating with peers to tackle challenging problems is one of the most exciting parts of learning and doing mathematics. I want to work to bring these experiences to all students in ways that are engaging and that effectively support their learning of mathematics."

The four-year grant is part of NSF's Discovery Research preK-12 program.

Georgia Southern's COE offers students multiple program opportunities, including undergraduate and more than 30 graduate program options that span campus locations in Savannah, Statesboro and Hinesville, and online. Programs offered by the COE prepare future teachers, school psychologists, counselors, school library media specialists, instructional technologists, researchers and leaders through intensive field experiences, cutting-edge technology and research-based instruction.

Interested in knowing more? To arrange an interview with Sam Rhodes or Antonio Gutierrez de Blume, simply connect with Georgia Southern's Director of Communications Jennifer Wise at jwise@georgiasouthern.edu today.

The National Science Foundation announced a $2.5 million award supporting Georgia Southern University researchers in addressing high-demand workforce needs in information technology and computer science fields. The funded project, "Enabling Lifelong Success in an Information Technology Workforce," adapts and evaluates evidence-based student support activities within the IT Department, one of the units in the Allen E. Paulson College of Engineering and Computing. The goal of the project is to identify a group of highly qualified students and to award 161 scholarships over a six-year period in an effort to increase student retention and graduation rates.

"This is great news for the IT program at Georgia Southern, and it will provide a positive impact to the surrounding area as businesses' needs for IT professionals increase," said interim Vice President of Research and Economic Development Chris Curtis, Ph.D.

Georgia Southern Professor and Department of Information Technology Chair Yiming Ji, Ph.D., is taking the lead on the grant, which, he noted, has the potential to have a profound impact on students.

"This project will train a pool of talented students, especially those with financial needs, and prepare them for successful careers in IT," said Ji. "With scholarships from the grant, students will have time to focus on studying, instead of having to work to make ends meet. These students will also receive dedicated support, including academic advising, research opportunities, internship and career services and much more. The result is that these students will become confident and have a greater future in IT careers."

The project involves four researchers, including Lei Chen, Ph.D. (co-PI), professor of IT; Hayden Wimmer, Ph.D. (co-PI), associate professor of IT; Elise Cain, Ph.D. (co-PI), assistant professor of leadership; and Kania Greer, Ed.D. (external evaluator), program coordinator of the Center for STEM Education. The project also received support from the Allen E. Paulson College of Engineering and Computing (CEC) and the Georgia Southern Office of Research.

The national and regional demand for computer and IT professionals remains high. "This project will directly benefit our local, regional and national economies," said CEC Dean Craig Harvey, Ph.D. "High-tech industries are already in and being attracted to the Savannah area, and the locations of Georgia Southern University's campuses provide unique opportunities to train high-quality computing and IT professionals who are in high demand."

The Department of Information Technology offers a wide range of undergraduate and graduate computer and IT programs at Georgia Southern, in addition to a new Ph.D. program in applied computing. This grant is the first of its kind to be received by the IT department. The department hopes that through the use of this grant, it will build stronger partnerships with businesses and federal or state government organizations, among others.

Interested in knowing more? To arrange an interview with Yiming Ji or Chris Curtis, simply connect with Georgia Southern's Director of Communications Jennifer Wise at jwise@georgiasouthern.edu today.

Fort Stewart, Georgia Southern University sign agreement to offer graduate courses on base
Representatives from Georgia Southern University and Fort Stewart signed an official memorandum of understanding Monday morning that clears the way for Georgia Southern to offer a slate of in-demand graduate courses this fall on base. Col. Manuel F. Ramirez, garrison commander at Fort Stewart and Hunter Army Airfield, and Kyle Marrero, president of Georgia Southern University, signed the agreement in front of officials from both organizations and the University's live bald eagle mascot, Freedom, at Fort Stewart's SFC Paul R. Smith Army Education Center.

With the agreement, Georgia Southern will offer the following degree programs at the Fort Stewart education center:

Master of Business Administration
Master of Health Administration
Master of Science in Information Technology
Master of Arts in Professional Communication and Leadership
Professional Communication and Leadership – Graduate Certificate
Cybercrime – Graduate Certificate

"This is truly a collaboration that will provide incredible opportunities and possibilities for our soldiers and our family members here on the installation," Ramirez said. "Here at Fort Stewart, we've always believed in investing in our most precious asset, which is our people. And today, this partnership is a shining testament to that belief. By adding Georgia Southern to our stable of schools here at the Education Center, we're opening doors to advanced education, professional development, and then a brighter future for all of our soldiers and their family members and all those people who call Fort Stewart home."

This partnership allows soldiers and their families to advance their career prospects by equipping them with knowledge and skills and preparing them for increased responsibility in the Army and afterward, he said.

Marrero said the courses that will be offered are being configured to allow soldiers and their families to pursue higher education without disrupting their duties or relocating, allowing them to strike a healthier work-life balance while investing in their personal growth. Marrero thanked the teams behind the agreement and noted that the courses will be flexible and compressed to meet the unique needs of military-connected students.

Marrero said this partnership between Fort Stewart and Georgia Southern University creates valuable educational opportunities for soldiers and their families by offering accessible, high-quality master's degree programs on-site. It's also the latest example of the University's commitment to the military, which has led to Georgia Southern being named a "Military-Friendly" school for six years in a row and a "Gold School" for 2023-2024 by Viqtory Media, publisher of G.I. Jobs, STEM Jobs and Military Spouse magazines.

"We have had a rich history of partnerships here," Marrero said. "For us, this is a beginning and a continuation of the belief in the transformational power of education. We are proud and excited to be your partner. Thank you so much for this opportunity."

To learn more about the courses being offered at Fort Stewart and Hunter Army Airfield, visit the link below:

If you are interested in knowing more about this partnership or would like to speak with Kyle Marrero, president of Georgia Southern University, simply contact Georgia Southern's Director of Communications Jennifer Wise at jwise@georgiasouthern.edu to arrange an interview today.

ChristianaCare Chief Information Security Officer Anahi Santiago Receives Prestigious Routhy Award
ChristianaCare Chief Information Security Officer Anahi Santiago, EMBA, CISM, has been recognized with the Routhy Award, which honors one cybersecurity professional each year who delivers a profound impact within health care and the information security profession. The Routhy is awarded by Health-ISAC, a global, member-driven nonprofit that offers health care stakeholders a trusted community and forum for coordinating, collaborating and sharing vital physical and cyber threat intelligence and best practices with each other.

"Anahi is a one-of-a-kind CISO," said Randy Gaboriault, MS, MBA, chief digital and information officer and senior vice president at ChristianaCare. "Deeply committed to information security, Anahi shares her wide breadth of experience and knowledge with peers throughout the country as a gifted speaker, mentor and content expert. She is an expert in the field of cybersecurity.

"Through her deep commitment to patient safety and information security she has led the implementation of some of the most progressive cybersecurity safeguards within ChristianaCare. This is a well-deserved recognition, and we are thrilled to see Anahi receive this prestigious award."

Santiago is recognized as one of the nation's foremost cybersecurity experts. With over 20 years in information technology, Santiago has extensive experience in the areas of cybersecurity, privacy, regulatory compliance, program management and infrastructure services.

"We are so pleased to have the opportunity to recognize Anahi for her years of leadership and selfless contributions to the community of Health-ISAC, affiliated associations, and the health care sector," said Denise Anderson, Health-ISAC president and CEO. "She has been instrumental in sharing with her peers, participating in initiatives, and mentoring others in the sector. She is absolutely deserving of the Routhy Award. Congratulations, Anahi, and thank you!"

Santiago has overall responsibility for the organization's information security program and strategic direction. She leads a team of high-performing information security professionals in supporting ChristianaCare's strategic initiatives. She does this by partnering with business leaders to manage risks, implement policies and controls, and generate overall awareness.

"For years, I've watched people that I admire receive this award," Santiago said. "I want to credit this amazing Health-ISAC organization for advancing cybersecurity in health care, protecting patients and bringing cybersecurity professionals together to share, innovate and protect."

Bioenergy experts welcome commitment to sustainability in UK’s new Biomass Strategy
New strategy outlines role of biomass in UK's transition to net zero, with sustainability as major theme
Supergen Bioenergy Hub experts worked with government departments to provide scientific evidence and insight
They welcome the holistic view of sustainability in the Biomass Strategy and call for action to deliver its ambitions

A group of bioenergy experts have welcomed the Government's new UK Biomass Strategy, but say urgent action is now vital to shape its ambitions into deliverable policies. Researchers at the Supergen Bioenergy Hub, led by Aston University, worked closely with government departments to provide scientific evidence to inform the strategy, which outlines the role biomass will play in supporting the UK's transition to net zero and how this will be achieved.

Professor Patricia Thornley, who leads the Hub, says: "This is a comprehensive and considered biomass strategy that, rightly, places sustainability at the heart of UK bioenergy development. The challenge is now to produce actions that can deliver the sustainable system of biomass required to achieve net zero."

Sustainability is a major theme within the new strategy. It includes a review of how existing sustainability policies could be improved, as well as a commitment to developing a cross-sectoral sustainability framework (subject to consultation) to ensure sustainability across the many different applications of biomass. This follows previous work led by Dr Mirjam Röder, Systems Topic Group Lead in the Supergen Bioenergy Hub, calling for harmonised sustainability standards across different biomass applications, which is referenced in the strategy.

Dr Röder says: "We need rigorous approaches to sustainability governance that go beyond emissions. Considering wider environmental, social and economic trade-offs is essential for true sustainability and building trust in bioenergy projects."

The strategy considers the amount of biomass resource that might be available to the UK in the future, highlighting the importance of both imported and domestically produced biomass resources.

Professor Thornley comments: "It is important that the strategy recognises the potential of imported as well as indigenous biomass in achieving global greenhouse gas reductions. Sustainable systems should grow, convert and use biomass in the locations where they can deliver most impact, ensuring we take account of all supply chain emissions. We shouldn't shy away from imports where the source is sustainable and the overall system makes environmental, economic and social sense."

The strategy also considers how biomass should be prioritised across a variety of applications to best support the transition to net zero. Biomass applications ranging from transport fuels and hydrogen to domestic and industrial heating are recognised as important, but in the medium to long term the focus is on integration of bioenergy with carbon capture and storage (BECCS). BECCS is an emerging technology in which the CO2 that may be released during the production and use of electricity, fuels or products derived from biomass is captured and stored, potentially resulting in negative emissions.

Professor Thornley comments: "The priority use framework outlined in the Biomass Strategy makes eminent sense. The UK (and the global energy system) needs carbon dioxide removals to deliver net zero. BECCS has an absolutely key role to play, as reflected in the strategy.
"Again, while this is encouraging to see, we must not underestimate the challenges of moving towards such a radically different system at scale."

"Relying on future BECCS deployment alone to counterbalance the current excess of greenhouse gas emissions would not enable the full potential and benefits of BECCS. BECCS should be deployed alongside measures to transition away from the use of fossil fuels, not instead of them," adds Dr Joanna Sparks, Biomass Policy Fellow at the Supergen Bioenergy Hub, who engaged closely with government departments as they developed the strategy.

Dr Sparks led an extensive policy engagement and knowledge transfer process to ensure that those developing the strategy had full access to the breadth and depth of UK scientific and engineering academic expertise, ensuring a robust, independent scientific base.

Professor Thornley believes continued engagement between policymakers, academics and the wider sector is vital in achieving the next steps in the delivery of the Government's strategy. She says: "The key to successful long-term results is a close partnership between academia, industry and policy stakeholders so that we can anticipate problems and plan the pathways to success."

Optical research illuminates a possible future for computing technology
Nathaniel Kinsey, Ph.D., Engineering Foundation Professor in the Department of Electrical and Computer Engineering (ECE), is leading a group to bring new relevance to a decades-old computing concept called a perceptron. Emulating the function of biological neurons, the messenger cells of the body's central nervous system, perceptrons are an algorithmic model for binary classification. When combined within a neural network, perceptrons become a powerful component for machine learning. However, instead of using traditional digital processing, Kinsey seeks to create this system using light, with funding from the Air Force Office of Scientific Research. This "nonlinear optical perceptron" is an ambitious undertaking that blends advanced optics, machine learning and nanotechnology.

"If you put a black sheet outside on a sunny day, it heats up, causing properties such as its refractive index to change," Kinsey said. "That's because the object is absorbing various wavelengths of light. Now, if you design a material that is orders of magnitude more complex than a sheet of black plastic, we can use this change in refractive index to modify the reflection or transmission of individual colors – controlling the flow of light with light."

Refractive index is an expression of a material's ability to bend light. Researchers can harness those refractive qualities to create a switch similar to the binary 1-0 base of digital silicon chip computing.

Kinsey and collaborators from the U.S. National Institute of Standards and Technology, including his former VCU Ph.D. student Dhruv Fomra, are currently working to design a new kind of optically sensitive material. Their goal is to engineer and produce a device combining a unique nonlinear material, called epsilon-near-zero, and a nanostructured surface to offer improved control over transmission and reflection of light.

Kinsey's prior research has demonstrated that epsilon-near-zero materials combine unique features that allow their refractive index to be modified quite radically – from 0.3 to 1.3 under optical illumination – which is roughly equivalent to the difference between a reflective metal and transparent water. While this makes an effective binary switch, the large change in index requires a lot of energy (~1 millijoule per square centimeter). By combining epsilon-near-zero with a specifically designed nanostructure exhibiting surface lattice resonance, Kinsey hopes to reduce the energy required to activate the response.

The unique response of a nanostructure exhibiting surface lattice resonance allows light to effectively be bent 90 degrees, arriving perpendicular to the surface while being split into two waves that travel along the surface. When a large area of the nanostructure is illuminated, the waves traveling along the surface mix, interfering constructively or destructively with each other. This interference can produce strong modification to reflection and transmission that is very sensitive to the geometry of the nanostructure, the wavelength of the incident light and the refractive index of the surrounding materials. The mixing of optical signals along the surface can also selectively switch regions of the epsilon-near-zero material, thereby performing processing operations.
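As background for readers unfamiliar with the concept, a classical perceptron boils down to a weighted sum of inputs followed by a hard threshold, and it can be sketched in a few lines of Python. The AND-gate training data and learning rate below are illustrative choices for this sketch, not details of Kinsey's optical design.

```python
# A minimal classical perceptron: a weighted sum of inputs followed by
# a hard threshold, trained with the classic perceptron learning rule.

def predict(weights, bias, x):
    """Binary output: 1 if the weighted sum crosses the threshold."""
    total = bias + sum(w * xi for w, xi in zip(weights, x))
    return 1 if total > 0 else 0

def train(samples, labels, lr=0.1, epochs=20):
    """Nudge weights toward correct answers on a small labeled set."""
    weights, bias = [0.0] * len(samples[0]), 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            error = y - predict(weights, bias, x)
            weights = [w + lr * error * xi for w, xi in zip(weights, x)]
            bias += lr * error
    return weights, bias

# Illustrative training data: learn a logical AND gate.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 0, 0, 1]
w, b = train(X, y)
print([predict(w, b, x) for x in X])  # expected: [0, 0, 0, 1]
```

Loosely speaking, in the optical analogue the interference of lightwaves on the nanostructured surface plays the part of the weighted summation, while the nonlinear epsilon-near-zero response acts like the threshold.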
A key aspect of Kinsey's work is to build nonlinear components, like diodes and transistors, that use optical signals instead of electrical ones. Transistors and other traditional electronic components are nonlinear by default because electrical charges strongly interact with each other (for example, two electrons will tend to repel each other). Creating optical nonlinear components is challenging because photons do not strongly interact; they just pass through each other. To correct for this, Kinsey employs materials whose properties change in response to incident light, but the interaction is weak and thus requires large energies to utilize. Kinsey's device aims to reduce that energy requirement while simultaneously shaping light to perform useful operations through the use of the nanostructured surface and lightwave interference.

The United States Department of Defense sees optical computing as the next step in military imaging. Kinsey's work, while challenging, has the potential to yield an enormous payoff.

"Let's say you want to find a tank within an image," Kinsey said. "Using a camera to capture the scene, translate that image into an electrical signal and run it through a traditional, silicon-circuit-based computer processor takes a lot of processing power. Especially when you try to detect, transfer and process higher pixel resolutions. With the nonlinear optical perceptron, we're trying to discover if we can perform the same kinds of operations purely in the optical domain without having to translate anything into electrical signals."

Linear optical systems, like metasurfaces and photonic integrated circuits, can already process information using only a fraction of the power of traditional tools. Building nonlinear optical systems would expand the functionality of these existing linear systems, making them ideal for remote sensing platforms on drones and satellites. Initially, the resolution would not be as sharp as traditional cameras', but optical processing built into the device would translate an image into a notification of tanks or troops on the move, for example. Kinsey suggests optical-computing surveillance would make an ideal early warning system to supplement traditional technology.

"Elimination or minimization of electronics has been a kind of engineering holy grail for a number of years," Kinsey said. "For situations where information naturally exists in the form of light, why not have an optical-in and optical-out system without electronics in the middle?"

Linear optical computing uses minimal power but is not capable of complex image processing. Kinsey's research seeks to answer whether the additional power requirement of nonlinear optical computing is worthwhile given its ability to handle more complex processing tasks.

Nonlinear optical computing could also be applied to a number of non-military applications. In driverless cars, optical computing could make better light detection and ranging equipment (better known as LIDAR). Dark-field microscopy already uses related optical processing techniques for 'edge detection' that allow researchers to directly view details without the electronic processing of an image. Telecommunications could also benefit from optical processing, using optical neural networks to read address labels and send data packets without having to do an optical-to-electrical conversion.

The concept of optical computing is not new, but interest (and funding) in theory and development waned in the 1980s and 1990s, when silicon chip processing proved to be more cost-effective.
Recent years have seen many advancements in computing, but the more recent slowdown in the scaling of silicon-based technologies has opened the door to new data processing technologies.

"Optical computing could be the next big thing in computing technology," Kinsey said. "But there are plenty of other contenders — such as quantum computing — for the next new presence in the computational ecosystem. Whatever comes up, I think that photonics and optics are going to be more and more prevalent in these new ways of computation, even if it doesn't look like a processor that does optical computing."

Kinsey and other researchers working in the field are in the early stages of scientific exploration into these optical computing devices. Consumer applications are still decades away, but with silicon-based systems reaching the limit of their potential, the future for this light-based technology is bright.







