Professor Vladlena Benson is an industry-recognised expert in cybersecurity risk management. She is Head of the Information Systems (IS) Group at Aston Business School and previously led the Cybersecurity and Criminology Centre at the University of West London. Vladlena's research spans computer security, compliance, risk, reliability and information systems. She is a board member of the Information Systems Audit and Control Association (ISACA) Central UK and holds several cybersecurity industry awards.
Areas of Expertise
Cybersecurity and Risk Management

Accomplishments
Women in IT Awards Editor's Choice (professional)

Education
University of Texas at Dallas: PhD, Computer Science, 2001
Media Appearances (1)
Study shows current university students are safer and more responsible social media users
Vladlena Benson, associate professor in accounting, finance and informatics at the Kingston Business School, who led the research, said that the study's findings challenged current thinking about social media. "Students' use of social media has now matured - and this group is keen to access networking services to support their learning experiences," she said. "Higher education providers must not miss the opportunity to exploit the tech-savviness of learners because of a mistaken belief that the online environment is too risky or that students won't be interested in using social media for learning."
Anne-Marie Mohammed, Vladlena Benson, George Saridakis
This article seeks to gain a better understanding of how to address some of the challenges of the digital world. To do this, the authors present some of the emerging issues in the psychology of human behavior and the ever-changing nature of cyber threats. They review both theories of crime (i.e., self-control and rational choice theories) and empirical studies that have examined user behavior on social networking sites leading to victimization. Importantly, they highlight the role of social engineering as the entry point for many of these sophisticated attacks. They go on to examine the relevance of the human element as the starting point for implementing cybersecurity programs in organizations, as well as for securing individual online behavior. Furthermore, issues associated with emerging trends in human behavior research, as well as ethics, are also discussed. The authors acknowledge that much more academic attention is needed in this area to prevent the exponential growth of future information breaches.
Anne-Marie Mohammed, Bochra Idris, George Saridakis, Vladlena Benson
This chapter discusses the risk and compliance challenges arising from the growing use of information and communication technologies by firms, in particular small and medium-sized enterprises. It argues that firms use technological advancements to make business transactions quicker and more efficient, and to enable globalization by relying on the Internet as a strategic tool. It further demonstrates that doing so, in turn, exposes firms to cyber security threats, which may lead to financial losses and reputational damage.
Nataliya Shevchuk, Harri Oinas-Kukkonen, Vladlena Benson
With the growing spread of smart home technologies, reaching and influencing people's behaviour is easier than ever before. The introduction of innovative information systems (IS) into everyday life should not ignore users' security and privacy concerns. We believe this notion also applies to Green IS that trigger sustainable behaviour change. To better understand users' perceptions of sustainable persuasive smart home technologies, we examine the case of a persuasive smart metering system. Specifically, we look at how persuasive systems design influences the intention to continue using a smart metering system, and how risk and self-disclosure affect the impact of the persuasive systems design. We developed a research model and formed hypotheses by drawing on the Persuasive Systems Design model and Adaptation Level Theory. We used a smart metering system enhanced with persuasive features as an illustration of a sustainable persuasive smart home technology. The results of our study provide insights relevant to further research on security issues in sustainable persuasive smart home technologies, as well as for practitioners who introduce similar technologies to users.
Vladlena Benson, Tom Buchanan
In the wake of fresh allegations that personality data from Facebook users were illegally used to influence the outcomes of the US presidential election and the Brexit vote, the debate over manipulation of social big data is gaining further momentum. This chapter addresses the social data privacy and data integrity vulnerabilities threatening the future of applications based on anticipatory computing paradigms. We investigate the organic reach phenomenon on social networks, known to be responsible for the propagation of 'fake' social content that undermines social media data integrity. We describe experimental work demonstrating that the trustworthiness of a message originator and low levels of the personality trait Agreeableness in the message receiver may increase the organic reach of 'fake' content on social networks. These effects may have implications for policy and practice, particularly with regard to the threat of social data manipulation for anticipatory computing applications.
John McAlaney, Vladlena Benson
Humans are social creatures. Our behaviour is influenced by our perceptions of those around us, often to a much greater degree than we realize. However, we tend to make mistakes in our understanding of those around us and of the situations we encounter. We do so because our cognitive resources are limited, so we have developed ways of reaching quick conclusions from limited information; these processes are known as heuristics. This is not a flaw; rather, it is an adaptive strategy that allows us to navigate and survive in our social worlds. Nevertheless, these tendencies may lead people to engage with cybersecurity in risky ways, whether as the instigators of attacks, the targets of attacks, or the cybersecurity professionals who seek to prevent and mitigate attacks. Examples include group dynamics in which individuals overestimate the abilities of their own group while underestimating those of competing groups, or fail to recognize cybersecurity risks that are difficult to visualize. In ways similar to those used in marketing and advertising campaigns, social engineers aim to exploit these quirks of social influence and human decision making. A greater understanding of these processes will enable us to develop better-informed prevention and mitigation strategies to address the increasing challenges we face within cybersecurity.