Canterbury University Makes VR Training for Child Educators

Canterbury University explores VR use in early childhood teacher preparation

Recent research conducted by the University of Canterbury (UC) is creating vital training opportunities for student teachers specialising in early childhood education, who are not permitted to work with children from birth to six months of age while undergoing their training.

Although many aspiring teachers have a strong desire to work with young children (0–6 months), it can be challenging to gain practical experience with newborns as part of their certification. For this reason, the UC team developed specialised virtual reality training environments.

The project was started by Professor Jayne White, who saw a problem for student teachers who did not have easy access to real-life babies as part of their certification. She contacted HIT Lab NZ to see whether they were interested in working together.

Heide Lukosch, Associate Professor, who also leads the HIT Lab NZ’s Applied Immersive Game Initiative (AIGI), was immediately drawn to the proposal. The initiative seeks to improve personal, social, educational, and health-related outcomes by accelerating research and public use of immersive gaming applications.

Next, under Professor White’s direction, the two teams conducted initial testing of a VR prototype with support from the University’s Child Well-being Research Institute.

Virtual reality (VR) has proven to be a successful method for teaching practical skills in a variety of academic fields, but its educational applications have not yet reached their full potential, according to Professor White. The team was thrilled about the tool’s potential and the opportunities it presents for further development.

Professor White proposed using virtual reality (VR) in early childhood education because, while babies communicate verbally and nonverbally from birth, it can be challenging for non-familial adults to interpret their cues, added Associate Professor Lukosch.

By giving student teachers the support they need to understand the cues and signals babies provide, the team hopes to help them identify and practise a “grammar” of interaction that meets infants’ needs.

Given the many everyday situations that call for more than verbal communication, a VR tool of this kind would be immensely beneficial.

Professor White and Associate Professor Lukosch now co-lead the research alongside Dr. Ngaroma Williams, a UC Senior Lecturer, and the work is grounded in the Mātauranga Māori concept of whanaungatanga. The aim of the project is to identify the crucial elements needed to promote relationship skills in adults caring for babies and to incorporate them into creative training contexts.

According to Associate Professor Lukosch, what interests the team is the potential of virtual worlds to assist individuals in circumstances that would otherwise be difficult to access or potentially harmful.

In this instance, the goal is to provide an immersive virtual world where users can feel genuinely present and responsible for their choices in specific situations, along with opportunities for reflection.

Haptic-equipped gloves will be a crucial component of these virtual worlds and of interactions with the virtual children. The gloves include a mechanical system that replicates the resistance a person would actually feel while gently caring for a baby, whether feeding, changing, soothing, or entertaining it.

One of the first VR training prototypes tasked users with building a positive relationship, reading nonverbal cues, and responding to the wishes or preferences of the virtual child.

Preferences or personas can be programmed into the virtual baby’s avatar. The research will allow the team to create an intelligent system that determines how the virtual baby reacts as the user chooses what actions to take in their interactions with it.

For instance, the team might encode a preference for red so that the infant would smile when the user picked up a red toy, but would start crying if they chose a yellow toy instead, said Associate Professor Lukosch.
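The colour-preference behaviour described above could be implemented as a simple deterministic rule system. The sketch below is purely illustrative; the class and method names are hypothetical and not taken from the UC project’s actual software:

```python
from dataclasses import dataclass

@dataclass
class VirtualBaby:
    """A virtual-baby persona with a programmable colour preference.

    Reactions are deterministic so that researchers retain full
    control over the system's behaviour during user studies.
    """
    preferred_colour: str = "red"
    mood: str = "neutral"

    def react_to_toy(self, colour: str) -> str:
        """Return the baby's reaction to a toy of the given colour."""
        if colour == self.preferred_colour:
            self.mood = "happy"
            return "smiles"
        self.mood = "upset"
        return "cries"

baby = VirtualBaby(preferred_colour="red")
print(baby.react_to_toy("red"))     # smiles
print(baby.react_to_toy("yellow"))  # cries
```

A rule table like this keeps every reaction predictable and repeatable, which matters when the same scenario must be presented identically to every study participant.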

The system is not artificial intelligence, since the research component requires the team to retain control over its behaviour for user studies, but there may be future uses for the developing technology.

These training scenarios make clear that artificial intelligence could be integrated in the future to create highly adaptable learning environments.

The UC team is also collaborating closely with Dr. Niki Newman, head of the University of Otago Simulation Centre in Christchurch, and Professor Tony Walls, who are advising on the study’s healthcare components.

The team expects that the basic interaction design principles established by the research can be applied to other validated training environments as the work moves towards commercialisation over the next three years.
