Older adults’ vulnerability to deception in new digital technologies

About the Project

Background. Older adults (60+ years) often have lower digital skills (Bhattacharjee et al., 2020). They are less accurate than younger adults at discriminating manipulated images from real ones (Nightingale et al., 2022), and they are more vulnerable to fake news (Moore & Hancock, 2022) and online scams (Fraud on the Elderly, 2013). Several social and cognitive factors affect older adults’ vulnerability, including (I) being too trusting, (II) social isolation, (III) psychological vulnerability and (IV) risk taking (Shao et al., 2019). Interventions have been developed to help older adults detect fake news and manipulated images (Moore & Hancock, 2022; Nightingale et al., 2022). Recent technological developments, such as AI, deepfakes, and immersive virtual and augmented realities, may leave older adults more vulnerable than younger adults to deception and manipulation carried out using these technologies.

Methodology. The PhD will consist of multiple studies, each looking at a specific technological context.

The first study will focus on disinformation and AI-generated images. It will investigate the importance of cognitive factors (including risk taking, sensory deficits and experience with the technology) and social factors (e.g. ethnicity, social networks, social isolation, trust) in the acceptance of disinformation and AI-generated images. It will also examine the efficacy of interventions (Nightingale et al., 2022) intended to help people detect disinformation and image manipulation.

The second study will focus on the detection of manipulated video (both AI-generated “deepfakes” and simpler manipulations, such as false subtitles). The study will investigate participants’ familiarity with these manipulations and the factors that predict their ability to detect them (including those examined in the first study), and will test the efficacy of Nightingale et al.’s (2022) intervention in helping participants detect them.

The third study will focus on participants’ attitudes towards immersive technologies and manipulated content. Do older people perceive immersive technology as ‘real’, and what would they consider ‘fake’ or ‘manipulated’ in such virtual or augmented spaces?

Supervisors

From Psychology

Dr Lara Warmelink (Director of Studies)

Dr Sophie Nightingale

Prof Trevor Crawford

From Health Research

Dr Faraz Ahmed

Candidate requirements

Candidates should have

a) a good undergraduate degree in Psychology, Computer Science, Communication or a cognate discipline

b) experience with data collection and/or data analysis.

Desirable:

Experience of working with older adults

For more information, please contact Dr Lara Warmelink, Department of Psychology, Lancaster University.

Apply by 31st May via the Lancaster University admissions portal and here: https://lancasteruni.eu.qualtrics.com/jfe/form/SV_cCQ1NGkF8yobCCO
