Trial registered on ANZCTR


Registration number
ACTRN12621000865819
Ethics application status
Approved
Date submitted
10/05/2021
Date registered
5/07/2021
Date last updated
5/07/2021
Date data sharing statement initially provided
5/07/2021
Date results information initially provided
5/07/2021
Type of registration
Retrospectively registered

Titles & IDs
Public title
The effect of emotional expression in a digital human and gender on psychological and physiological outcomes in healthy adults
Scientific title
The effect of emotional expression in a digital human and gender on psychological and physiological outcomes in healthy adults
Secondary ID [1] 303976 0
None
Universal Trial Number (UTN)
U1111-1266-9704
Trial acronym
Linked study record

Health condition
Health condition(s) or problem(s) studied:
Loneliness 321590 0
Stress 321591 0
Condition category
Condition code
Mental Health 319331 0
Other mental health disorders

Intervention/exposure
Study type
Interventional
Description of intervention(s) / exposure
Participants were block-randomised by gender to interact with one of six versions of a digital human that varied in terms of their face type (no face/ neutral face/ emotional face) and voice type (neutral voice/ emotional voice):
1. No face, neutral voice
2. No face, emotional voice
3. Neutral face, neutral voice
4. Neutral face, emotional voice
5. Emotional face, neutral voice
6. Emotional face, emotional voice
A digital human is a type of embodied conversational agent with a humanlike embodiment and animation (based on a real human) that includes artificial intelligence for emotional intelligence (e.g., classifiers of emotional expression in a user's face). The digital human in this study was a mixed-race, young adult female based on a real person. It used a finite-state conversation engine (i.e., pre-programmed language) and responded through speech using pre-recorded voice clips from the human model.

The interaction involved completing the Relationship Closeness Induction Task (RCIT; Sedikides et al., 1999) with the digital human. The RCIT is a structured conversation task that involves reciprocal self-disclosure in response to 28 questions which gradually increase in intimacy (e.g., from "what is your name?" to "describe the last time you felt lonely"). The participant took turns at asking and answering personal questions from the RCIT with a digital human. The RCIT has been shown to reliably induce a moderate sense of closeness between human strangers in experimental psychology research, and it has been associated with improvements in wound healing (Robinson et al., 2013).

The intervention took place on a laptop computer in a private clinic room, with a researcher (a PhD student) available in another room if help was needed. Participants completed one 15-minute digital human interaction as part of a 60-minute appointment at the University of Auckland Clinical Research Centre. Audiovisual data were recorded while participants interacted with the digital human; these recordings were transcribed and analysed, and indicated that all participants completed the interaction. The digital human was designed and built specifically for this study and is not readily available to the public.
Intervention code [1] 320285 0
Behaviour
Intervention code [2] 320927 0
Treatment: Devices
Comparator / control treatment
For the purposes of this form, the comparator condition is:
1. No face, neutral voice

This involves interacting with a voice-only black screen that uses a neutral intonation (i.e., no emotional expression in the voice).
Control group
Active

Outcomes
Primary outcome [1] 327200 0
Loneliness (100mm visual analogue scale)
Timepoint [1] 327200 0
T1: Baseline
T2: Immediately post-intervention completion (primary endpoint).
Secondary outcome [1] 394167 0
Closeness (RCIT closeness scale; Sedikides et al., 1999)
Timepoint [1] 394167 0
T2: Immediately post-intervention completion.
Secondary outcome [2] 394168 0
Stress (100mm visual analogue scale)
Timepoint [2] 394168 0
T1: Baseline
T2: Immediately post-intervention completion.
Secondary outcome [3] 394169 0
Social support (100mm visual analogue scale)
Timepoint [3] 394169 0
T1: Baseline
T2: Immediately post-intervention completion.
Secondary outcome [4] 394170 0
Caring perceptions scale (Brave, Nass, & Hutchinson, 2005)
Timepoint [4] 394170 0
T2: Immediately post-intervention completion.
Secondary outcome [5] 394172 0
Emotional content in participant language during the interaction (Linguistic Inquiry and Word Count Software [LIWC], Pennebaker et al., 2015)
Timepoint [5] 394172 0
During conversation
Secondary outcome [6] 394173 0
Heart rate (average; Empatica E4 device)
Timepoint [6] 394173 0
During interaction
Secondary outcome [7] 394174 0
Electrodermal activity (average; Empatica E4 device)
Timepoint [7] 394174 0
During interaction
Secondary outcome [8] 394176 0
Skin temperature (average; Empatica E4 device)
Timepoint [8] 394176 0
During interaction
Secondary outcome [9] 394179 0
Reasons for feeling or not feeling closeness towards the digital human (open-ended, written qualitative question)
Timepoint [9] 394179 0
T2: Immediately post-intervention completion.
Secondary outcome [10] 394181 0
Reasons for feeling or not feeling willing to seek emotional support from the digital human in future (open-ended, written qualitative question)
Timepoint [10] 394181 0
T2: Immediately post-intervention completion.

Eligibility
Key inclusion criteria
Adults aged 18 years or older with English fluency.
Minimum age
18 Years
Maximum age
No limit
Sex
Both males and females
Can healthy volunteers participate?
Yes
Key exclusion criteria
None

Study design
Purpose of the study
Treatment
Allocation to intervention
Randomised controlled trial
Procedure for enrolling a subject and allocating the treatment (allocation concealment procedures)
Block-randomisation of participants by gender was conducted by a member of the research team who was not involved in data collection. Allocations were concealed in opaque envelopes from the researcher involved in data collection. The researcher remained blinded to the participant's allocation until opening the envelope immediately prior to starting the appropriate computer program for the participant. Although the participant was unblinded to their own condition upon starting their interaction, they remained unaware of what the digital humans in the other experimental conditions were like.
Methods used to generate the sequence in which subjects will be randomised (sequence generation)
A randomisation table was generated using Research Randomizer software that block-randomised participants by gender. This randomisation was performed by a member of the research team who did not interface with participants.
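The registry entry does not state the block size or the exact table layout used in Research Randomizer. As an illustration only, a gender-stratified permuted-block sequence of the six study conditions could be generated along these lines (block size of six and the seed handling are assumptions, not details from the record):

```python
import random

# The six study conditions described in the intervention section.
CONDITIONS = [
    "No face, neutral voice",
    "No face, emotional voice",
    "Neutral face, neutral voice",
    "Neutral face, emotional voice",
    "Emotional face, neutral voice",
    "Emotional face, emotional voice",
]

def blocked_sequence(n_participants, block_size=6, seed=None):
    """Permuted-block allocation: each block is a shuffled copy of the
    six conditions, so group sizes stay balanced within every block."""
    assert block_size % len(CONDITIONS) == 0
    rng = random.Random(seed)
    seq = []
    while len(seq) < n_participants:
        block = CONDITIONS * (block_size // len(CONDITIONS))
        rng.shuffle(block)
        seq.extend(block)
    return seq[:n_participants]

# Stratification by gender: one independent sequence per stratum.
tables = {
    gender: blocked_sequence(99, seed=i)
    for i, gender in enumerate(["female", "male"])
}
```

In practice the resulting tables would be prepared in advance by the team member not involved in data collection, matching the allocation-concealment procedure described above.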
Masking / blinding
Blinded (masking used)
Who is / are masked / blinded?
The people receiving the treatment/s
The people administering the treatment/s

Intervention assignment
Parallel
Other design features
Phase
Not Applicable
Type of endpoint/s
Efficacy
Statistical methods / analysis

Recruitment
Recruitment status
Completed
Date of first participant enrolment
Anticipated
Actual
Date of last participant enrolment
Anticipated
Actual
Date of last data collection
Anticipated
Actual
Sample size
Target
Accrual to date
Final
Recruitment outside Australia
Country [1] 23683 0
New Zealand
State/province [1] 23683 0
Auckland

Funding & Sponsors
Funding source category [1] 308357 0
University
Name [1] 308357 0
The University of Auckland
Country [1] 308357 0
New Zealand
Funding source category [2] 308567 0
Commercial sector/Industry
Name [2] 308567 0
Soul Machines Ltd
Country [2] 308567 0
New Zealand
Primary sponsor type
University
Name
The University of Auckland
Address
Department of Psychological Medicine
The University of Auckland School of Medicine
Private Bag 92019
Auckland 1142
Country
New Zealand
Secondary sponsor category [1] 309177 0
Commercial sector/Industry
Name [1] 309177 0
Soul Machines Ltd
Address [1] 309177 0
Soul Machines Ltd
106 Customs Street West
Auckland CBD
Auckland 1142
Country [1] 309177 0
New Zealand

Ethics approval
Ethics application status
Approved
Ethics committee name [1] 308328 0
The University of Auckland Human Participants Ethics Committee (UAHPEC)
Ethics committee address [1] 308328 0
The University of Auckland Human Participants Ethics Committee
The University of Auckland Research Office
Private Bag 92019, Auckland 1142
Ethics committee country [1] 308328 0
New Zealand
Date submitted for ethics approval [1] 308328 0
18/10/2018
Approval date [1] 308328 0
01/11/2018
Ethics approval number [1] 308328 0
022191

Summary
Brief summary
This study investigates whether emotional expression in a digital human during a mutual self-disclosure conversation influences psychological and physiological outcomes in healthy adults. Participants were 198 adults aged 18 years or older with English fluency. Participants were block-randomised by gender to one of six conditions in which the digital human's design varied in terms of emotional expression and whether a face was present (i.e., neutral/emotional voice; no/neutral/emotional face). Participants engaged in a 15-minute mutual self-disclosure conversation with the digital human (the Relationship Closeness Induction Task; Sedikides et al., 1999) as part of a single one-hour appointment at the University of Auckland Clinical Research Centre. As part of the appointment, participants also completed baseline and follow-up questionnaires on demographic and psychological variables. Participants wore an Empatica E4 sensor watch that collected heart rate, electrodermal activity, and skin temperature data during their interaction with the digital human. Participants were provided with a $20 shopping voucher as compensation for their time. It is anticipated that an emotionally expressive digital human will be associated with greater closeness and improved psychological and physiological outcomes in females, and that males will report greater closeness and improved outcomes with a neutral-face, neutral-voice digital human.
Trial website
Trial related presentations / publications
Public notes

Contacts
Principal investigator
Name 110322 0
Prof Elizabeth Broadbent
Address 110322 0
Department of Psychological Medicine
The University of Auckland School of Medicine
Faculty of Medical and Health Sciences
The University of Auckland
Private Bag 92019, Auckland, 1142
Country 110322 0
New Zealand
Phone 110322 0
+64 93737599
Fax 110322 0
Email 110322 0
Contact person for public queries
Name 110323 0
Elizabeth Broadbent
Address 110323 0
Department of Psychological Medicine
The University of Auckland School of Medicine
Faculty of Medical and Health Sciences
The University of Auckland
Private Bag 92019, Auckland, 1142
Country 110323 0
New Zealand
Phone 110323 0
+64 93737599
Fax 110323 0
Email 110323 0
Contact person for scientific queries
Name 110324 0
Elizabeth Broadbent
Address 110324 0
Department of Psychological Medicine
The University of Auckland School of Medicine
Faculty of Medical and Health Sciences
The University of Auckland
Private Bag 92019, Auckland, 1142
Country 110324 0
New Zealand
Phone 110324 0
+64 93737599
Fax 110324 0
Email 110324 0

Data sharing statement
Will individual participant data (IPD) for this trial be available (including data dictionaries)?
No
No/undecided IPD sharing reason/comment
Neither ethics board approval nor informed consent from participants was obtained to share participant data publicly.


What supporting documents are/will be available?

No Supporting Document Provided



Results publications and other study-related documents

Documents added manually
No documents have been uploaded by study researchers.

Documents added automatically
No additional documents have been identified.