Biography
Eakta Jain is an associate professor in the Department of Computer & Information Science & Engineering at the Herbert Wertheim College of Engineering. Eakta is interested in the safety, privacy, and security of data gathered for user modeling, particularly eye-tracking data. Her areas of work include graphics and virtual reality, generation of avatars, human factors in the future of work, and transportation.
Areas of Expertise (4)
Human-Robot Interaction
Virtual Reality (VR)
Computer Graphics
Eye Tracking
Media Appearances (2)
How a horse whisperer can help engineers build better robots
UF News online
2023-04-24
Humans and horses have enjoyed a strong working relationship for nearly 10,000 years — a partnership that transformed how food was produced, people were transported and even how wars were fought and won. Today, we look to horses for companionship, recreation and as teammates in competitive activities like racing, dressage and showing. Can these age-old interactions between people and their horses teach us something about building robots designed to improve our lives?
Eakta Jain, Ph.D.
SIGGRAPH
2023-03-10
SIGGRAPH’s success as a dynamic professional community is, in large part, due to the work of the dedicated volunteers who bring their experience and passion to the organization. Eakta Jain, recently elected to the SIGGRAPH Executive Committee (EC), sees her new role as an opportunity to give back. “It reflects that my professional community trusts me to look out for SIGGRAPH and help steer it such that it continues to be vibrant in the years to come.”
Articles (3)
Privacy-preserving datasets of eye-tracking samples with applications in XR
IEEE Transactions on Visualization and Computer Graphics
Brendan David-John, et al.
2023-05-01
Virtual and mixed-reality (XR) technology has advanced significantly in the last few years and will enable the future of work, education, socialization, and entertainment. Eye-tracking data is required for supporting novel modes of interaction, animating virtual avatars, and implementing rendering or streaming optimizations. While eye tracking enables many beneficial applications in XR, it also introduces a risk to privacy by enabling re-identification of users.
Introducing Explicit Gaze Constraints to Face Swapping
ETRA '23: Proceedings of the 2023 Symposium on Eye Tracking Research and Applications
Ethan Wilson, et al.
2023-05-01
Face swapping combines one face's identity with another face's non-appearance attributes (expression, head pose, lighting) to generate a synthetic face. This technology is rapidly improving, but it falls short when reconstructing certain attributes, particularly gaze. Image-based loss metrics that consider the full face do not effectively capture the perceptually important, yet spatially small, eye regions.
Horse as Teacher: How human-horse interaction informs human-robot interaction
CHI '23: Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems
Eakta Jain, Christina Gardner-McCune
2023-04-19
Robots are entering our lives and workplaces as companions and teammates. Though much research has been done on how to interact with robots, teach robots and improve task performance, an open frontier for HCI/HRI research is how to establish a working relationship with a robot in the first place. Studies that explore the early stages of human-robot interaction are an emerging area of research.