Computing Research Competition

This summer the School of Computing Science and Digital Media is running a research competition with the aim of involving students in academic research. The competition is open to all students; entries are now open and close on Friday 23rd June. Entries should be submitted through the following link:

PLEASE ENTER THE RESEARCH COMPETITION THROUGH THIS LINK

Competition winners will be expected to work with academics over the summer period and then to give a short presentation of their work at the end of summer, before teaching begins in Semester 1. Winners will also receive a £500 prize for taking part in this work. This is a great opportunity to get involved in research that is happening in the school and to improve your CV. The research projects available for the competition are described below.

Simulating overcrowding in historic environments

Current research within the school is studying the emergency evacuation of historic buildings. Specifically, the research is looking at how people move and react during an emergency in a confined or inaccessible space.

The research done to date looks at how people evacuate the structures, but not at how they feel before and during this process.

This project will look at using existing games-based visualisation in conjunction with VR technology to simulate overcrowded historic buildings in a safe environment. The aim of the project is to determine the points at which people feel uncomfortable and, ultimately, whether this may affect their ability to evacuate the building.

Supervised by Dr John Isaacs

Augmented Accessible Reality

With the continued prevalence of augmented reality interaction comes an increased need to ensure that the products being created are fully accessible. Methods to analyse this for standard computer applications are now commonplace, and it is becoming relatively simple to ensure that applications are accessible (Fernandes et al. 2011; Hanson & Richards 2013). However, AR brings a new set of challenges because of the interaction that must take place between the physical and virtual worlds (Ip & Cooperstock 2011). Very little is known about the accessibility challenges that arise when working with these two very different modalities, or about the extent to which virtual interactions can compensate for physical issues. Likewise, the possibility of physical interactions compensating for virtual interaction issues remains relatively unexplored (Rashid et al. 2015).

The purpose of this research project is to begin an examination of this area and to develop a framework that can be used to categorise augmented reality interactions based on their reliance on physical and virtual world objects, and the granularity of interaction needed for each.

Supervised by Dr Michael Crabb

An Accessible Model for Digital Gaming

A growing trend in analogue gaming has been to investigate the opportunities presented by multi-modal game interactions. Digital apps are becoming more prominent as support tools, or even an integral part of the gaming experience. While there is work on the accessibility of analogue gaming, and work on the accessibility of digital gaming, there is no work to date on the intersections between these.

This project would focus on extending the work of the Meeple Centred Design (http://meeplelikeus.co.uk) project to encompass the digital space of modern designer board games. The end goal would be to develop a framework that can meaningfully assess the accessibility implications of multi-modal gaming experiences.

The students involved in this project will be engaged in preparatory work for a future grant funding submission on this topic, and will likely be co-authors on an academic paper generated from this work.

Supervised by Dr Michael Heron

Smart 3D Camera for Automatic 3D Scene Exploration

For this Summer Research Competition, we propose to develop a standalone 3D prototype for exploring a variety of complex 3D scenes using automatic 3D camera navigation control. The project includes the implementation of basic 3D tools for camera control and scene loading, as well as new artificial intelligence algorithms to navigate a 3D environment automatically while taking into account the complexity and nature of the scene.

Supervised by Dr Yann Savoye