How TILE can help you design immersive simulation activities

Immersive simulation – what and why?

In recent years, Staffordshire University has used funding from Health Education England (HEE) to develop innovative, simulation-based teaching approaches.

Simulation enables students to develop contextual and practical skills in a safe learning environment. This is particularly valuable for our healthcare students, who are learning how to work with real patients, but the applications of immersive simulation go far beyond the healthcare setting. For example, students can develop interview skills by talking to an AI-generated virtual human, experience tourist attractions across the globe through a VR headset, or navigate a virtual “crime scene” through 360-degree footage. Immersive technology can also help give online learners a richer learning experience, keeping them engaged with the course.

The HEE funding has enabled the University to invest in specialised software such as Virti and ThingLink. TILE’s Instructional Designer, Simran Cheema, has been working with academic staff across the different schools to use these tools to embed immersive activities into their courses.

The following example explains the process of designing and creating an immersive simulation activity. If you can see the benefits of developing something like this for your own course, details on how to make a start can be found at the end of this post.

Designing an immersive activity – a case study

Simran reached out to academic staff in the Institute of Policing, offering to create an immersive activity to develop incident response skills. In this activity, the “incident” would be a simulation of a vehicle collision, and students would take on the role of a police officer attending the scene.

Identifying needs

Before starting design work, Simran spoke with course staff to identify what they needed from the activity and which skills and knowledge it should teach and test. This was then broken down into the constituent steps that would form the overall activity, including interactions with different virtual humans and 360-degree interactive images.

Simran then identified the technology that would be required to build the activity:

  • Virti for the virtual human interactions
  • ThingLink to host the whole activity, with 360-degree interactive scenes, embedded Virti characters, and decision-making points with branches to show the consequences of those decisions.

ThingLink also allows for scoring and feedback to keep students engaged and on track throughout the simulation, and its analytics make it possible to track each student’s success.

Students would be able to access the simulation freely through a web link, and staff would be able to see analytics data on students’ engagement with the simulation.

Planning phase

Simran used a different tech tool – Miro – to plan out the activity in detail. Miro’s flowcharting function helped connect the different parts of the scenario and identify where branching and feedback points would be needed.

Students follow a linear path through the activity until they encounter a particular character in the simulation, at which point they must choose between several options for how to deal with them. They do not get immediate feedback on whether their choice was right or wrong; instead, they follow one of the branching scenarios to see the consequences of their decision play out first.
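For anyone curious about the underlying structure, here is a minimal, purely illustrative sketch (in Python) of how a branching step with deferred feedback could be modelled. The scene names, choices and wording are hypothetical, and this is not ThingLink’s or Miro’s own format; it simply mirrors the kind of flow mapped out at the planning stage: choice first, consequence next, feedback last.

# A minimal, illustrative model of one branching decision with deferred feedback.
# Scene names, choices and wording are hypothetical examples, not ThingLink data.

SCENES = {
    "arrive_at_collision": {
        "prompt": "You arrive at the collision and a bystander approaches you. What do you do?",
        "choices": {
            "secure_the_scene": "scene_secured",
            "question_the_bystander_first": "scene_left_unsecured",
        },
    },
    # Consequence scenes play out BEFORE any right/wrong feedback is shown.
    "scene_secured": {
        "consequence": "Traffic is diverted and the scene is preserved.",
        "feedback": "Good decision: securing the scene protects the public and the evidence.",
    },
    "scene_left_unsecured": {
        "consequence": "A passing car disturbs debris from the collision.",
        "feedback": "Not ideal: the scene should be secured before questioning witnesses.",
    },
}

def run_branch(choice: str) -> None:
    """Follow one branch: show the consequence first, then the feedback."""
    start = SCENES["arrive_at_collision"]
    outcome = SCENES[start["choices"][choice]]
    print(start["prompt"])
    print("Consequence:", outcome["consequence"])
    print("Feedback:", outcome["feedback"])

if __name__ == "__main__":
    run_branch("question_the_bystander_first")

Keeping the consequence separate from the feedback in the structure is what lets the simulation withhold the right/wrong judgement until students have seen their decision play out.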

Design phase

With the flowchart complete in Miro, Simran then started to build the activity using ThingLink’s Scenario Builder. “Simulation” is one of the standard ThingLink templates, but there are also templates for other types of activity such as escape rooms.

Each part of the simulation required some media to be designed and embedded – such as the Virti virtual humans, and the crime scene images.

Simran designed the virtual humans so that students could interact with them in whichever way they preferred: they could speak out loud and hear the on-screen virtual human respond audibly, or they could read and select from text options on screen.

The crime scene images – both 2D and 3D – were mainly generated by AI, using the built-in Skybox feature included in the Staffordshire University ThingLink license. However, the University also has in-house simulation settings and recording technology that could be used to capture bespoke media and make the scenario even more immersive for students. For example, video clips of our own simulated court room, custody suite or hospital corridors could be used for transitions within the activity instead of stock or generated media.

As well as the “action” scenes, Simran also built in space for important conversation points, with occasional feedback to keep students on track.

Throughout the design process, Simran worked closely with the course team, testing and refining the activity to make sure it worked as intended.

Could your students benefit from immersive simulation?

If you have some ideas you’d like to explore around immersive simulation, please get in touch with TILE by emailing TILEHub@staffs.ac.uk, or contact Instructional Designer Simran Cheema directly (Simran.Cheema@staffs.ac.uk). We will start with an initial consultation to identify your needs and decide on the best approach and software, then support you through the design process as outlined in the case study above.

Immersive teaching for the Institute of Policing

TILE Hub’s George Hill (Online Distance Learning Designer) and Robin Ray (Digital Curriculum Designer) were recently invited by Institute of Policing (IoP) lecturer Karen Pirrie to talk about interactive teaching and learning software.

Karen and her IoP colleagues are in the process of updating their curriculum to adapt to the needs of their learners. These are police recruits across our partnered forces (Staffordshire, Warwickshire and West Mercia Police) who may begin their studies with Staffordshire University at different points throughout the year. They want to engage these recruits – who are keen to get started on the practical aspects of policing right away – with authentic and immersive digital content, to make sure they are actively learning the essential knowledge and skills for the job.

Here’s an overview of some of the tools that TILE recommended, with examples of how they could be used for delivery on the Policing Education Qualifications Framework (PEQF) programmes.

ThingLink

ThingLink allows you to combine images, video, audio and text to turn your teaching content into a much more immersive experience. It’s quick and intuitive to use: just upload a photo or video (including 360-degree images) and add hotspot tags that the user can click or tap to open text, audio or video relating to that part of the image. There’s also a “scenario builder” that lets you branch media into different learning pathways so the user can choose their own route.

Karen has already developed a virtual ThingLink tour to give paramedic students an immersive experience of Staffordshire University’s Crime Scene House. Using images taken inside and outside the building, she added clickable hotspot tags at every point of interest. Learners can effectively walk around the house as if they were there, and read more detail about each hotspot.

This sort of tour could be used for formative assessment as well as content delivery – by asking learners to add their own tags to a ‘crime scene’ and share their thoughts about each hotspot in text or audio.

Decision making is a vital skill for policing students, and ThingLink’s scenario builder can help with actively developing that skill. For example, video or audio incident role plays could be recorded, with learners having to choose what to do next at key points, as if they were responding to a real incident. Depending on what they choose, another video or audio recording would open to show how the consequences of that decision might play out.

ThingLink encourages active rather than passive learning, and you can monitor how students are engaging with the content by looking at the engagement statistics on everything you share. You can drill down into the data to see how many people have clicked on each hotspot, or how long they spent watching a video clip, so you can check how your content is being used and make changes if needed.

Wooclap

Where ThingLink helps engagement with asynchronous learning content, Wooclap is designed for live interaction during synchronous lectures, whether your students are sitting in front of you in a lecture theatre or watching a live stream from home.

You can upload your lecture slides to Wooclap and embed questions and polls throughout the content. There’s a huge variety of question types to choose from, from basic multiple choice and “word cloud” questions to “fill in the blanks” and “label the image”. When you present your question-enhanced slides through Wooclap, viewers can scan a QR code or follow a URL on their own devices and answer the questions as they are presented. Because the event is accessed through an open web link, there should be no issues with institutional access: anyone with internet access can join in, and it can comfortably be shared with up to 500 people at a time. You’ll be able to see how many people are actively engaging, and see and display their answers for immediate assessment for learning. If you want to make sure people are paying attention, you could add a “Spin the Wheel” slide to choose someone in your audience at random to respond to a question.

A potentially useful question type for policing is “Find on Image” – where a photo of, say, a potential crime scene would appear on the viewers’ devices, and they could be asked to click or tap on the important part of the image. This makes the learning immediately relevant and authentic, rather than simply testing their comprehension of the content on the slides.

As with ThingLink, you can use Wooclap’s built-in analytics to monitor engagement, with the aim of improving it. It could be an especially powerful tool in hybrid sessions, giving all learners the same experience whether they’re with you in person or online. Everyone responds to the same questions in the same way, which can foster a better sense of belonging for remote learners and allows the lecturer to check everyone’s understanding and deal with any misconceptions as they arise.

Virti

Virti is a powerful simulation tool, designed specifically for vocational training. It takes a bit more setting up than either ThingLink or Wooclap, but it can be used to immerse learners safely in real-world scenarios before they have to encounter them in their work. Staffordshire’s School of Health, Science and Wellbeing already uses Virti for healthcare simulations, and there is obvious potential for the IoP to use the software too.

TILE demonstrated two different features of Virti – programmable “virtual humans”, and recorded video simulations.

The animated virtual humans can be used for interactive role play, by programming them to react in certain ways (either by giving them a script of different responses, or letting generative AI do the work – see below). Learners could navigate conversations with a virtual crime suspect, witness or victim by choosing from a range of different responses.

Similarly to ThingLink, Virti can also be used with uploaded videos, to present learners with recordings of real humans acting out different scenarios, with decisions to be made at key points. For example, a day of filming (perhaps in conjunction with drama and production students in Staffordshire’s School of Digital, Technologies and Arts) could lead to an immersive virtual experience for policing students, delivered safely through their screen (or even a virtual reality headset).

Like ThingLink and Wooclap, Virti has engagement analytics so you can see who is accessing the simulations, and how they’re being used.

Using Generative AI for Assessment

Although there are many concerns around generative artificial intelligence (AI) being used inappropriately in assessment, there is also huge potential for using it deliberately, consciously and ethically to assess learners’ understanding and skills.

Text-based generative AI models work by predicting a likely text response to a given prompt, drawing on the very large dataset they were trained on. This means that a tool such as ChatGPT can be given a ‘persona’ (e.g. a witness to a crime) and asked to respond as if it were writing from that perspective. The resulting script could form the basis of a role play, or a piece of text for a policing recruit to analyse and discuss.
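As a concrete illustration, the short Python sketch below uses the OpenAI library to set up a witness persona through a system prompt. The model name, persona wording and incident details are assumptions made for the example (they are not part of any University setup), and you would need your own API key.

# Minimal sketch: giving a generative AI model a 'persona' via a system prompt.
# The model name, persona and incident details are illustrative assumptions only.
# Requires the openai package and an OPENAI_API_KEY environment variable.

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # any available chat model would work
    messages=[
        {
            "role": "system",
            "content": (
                "You are a witness to a late-night shop burglary. "
                "Answer the interviewing officer's questions in the first person, "
                "staying in character and describing only what you saw."
            ),
        },
        {
            "role": "user",
            "content": "Can you describe the person you saw leaving the shop?",
        },
    ],
)

print(response.choices[0].message.content)

The same pattern, with a different prompt, could just as easily generate a longer interview transcript for learners to analyse, or the kind of press release discussed below.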

Designing assessment tasks around critiquing the outputs of generative AI is another way of encouraging students to consider both its benefits in the workplace and its limitations. For example, ChatGPT could be asked to write a press release for a fictional incident – and learners could be asked to discuss the strengths and weaknesses of the generated text, to demonstrate their understanding.

Try it for yourself

If you’re feeling inspired to use any of these tools for your own teaching at Staffordshire University, just send a quick email to the TILE Hub (TILEhub@staffs.ac.uk) letting us know what you’re interested in, and we’ll get you set up with an account.