Reducing spatial variation in the quality of reproduced sound across a defined space, particularly where loudspeaker placements vary, is a significant challenge for sound engineers. Dr Bruce Wiggins has conducted research into encoding, decoding and processing algorithms using Ambisonics, a system based on full-sphere sound reproduction. The outcomes of the research have been made accessible to the wider community through a suite of software plug-ins (WigWare), a production workflow, and associated teaching materials that enable commercial audio workstations to benefit from Ambisonics. Numerous instances of successful use have been recorded.
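For readers unfamiliar with the underlying mathematics, the sketch below shows the standard first-order (B-format) Ambisonic panning equations in Python. This is textbook material rather than WigWare's actual implementation, and the naive decoder shown is only one of several possible designs.

```python
import numpy as np

def encode_first_order(signal, azimuth, elevation):
    """Encode a mono signal into first-order Ambisonic B-format (W, X, Y, Z).

    Classic convention: W carries omnidirectional pressure (scaled by
    1/sqrt(2)); X, Y, Z carry figure-of-eight components on the three axes.
    """
    w = signal / np.sqrt(2.0)
    x = signal * np.cos(azimuth) * np.cos(elevation)
    y = signal * np.sin(azimuth) * np.cos(elevation)
    z = signal * np.sin(elevation)
    return np.stack([w, x, y, z])

def decode_basic(bformat, speaker_azimuths):
    """Naive 'sampling' decode to a horizontal loudspeaker ring.

    Each feed re-projects the B-format onto a speaker direction;
    production decoders optimise the decode matrix for the actual
    array geometry instead.
    """
    w, x, y, _z = bformat
    n = len(speaker_azimuths)
    return np.stack([
        (np.sqrt(2.0) * w + x * np.cos(az) + y * np.sin(az)) / n
        for az in speaker_azimuths
    ])

# Example: a 1 kHz tone arriving from 45 degrees left, decoded to a square array.
fs = 48000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 1000 * t)
b = encode_first_order(tone, azimuth=np.pi / 4, elevation=0.0)
feeds = decode_basic(b, speaker_azimuths=[np.pi/4, 3*np.pi/4, 5*np.pi/4, 7*np.pi/4])
```

Because the encoding captures a source's direction in four loudspeaker-independent channels, the decode stage can be tailored to whatever array is available, which is what makes Ambisonics attractive for spaces with varied speaker placements.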
The project creates impact by connecting people with technology through an interactive art project. Portable equipment ensures wide participation: people respond to, and interact with, virtual living creatures in an entertaining but instructive context. By bringing together human participants, who are able to intervene in the environment, with virtual bugs, which respond to stimuli in their environment, the project challenges people to consider cause and effect in the physical environment as well as their own inter-social relations. The impact, which is cultural, imaginative and pedagogic, is achieved through touch rather than via the usual emphasis on the communicated word.
Music teachers, physiotherapists, museum curators and other practitioners have used the results of our research to improve their practice, with consequent benefits to individuals. For example, a violin teacher used our MusicJacket haptic guidance system to achieve a lasting improvement in a pupil's bowing technique. A neuroscience team used our Haptic Bracelet system in a novel form of gait rehabilitation with a patient recovering from a hemiparetic stroke, who reported improved posture and movement. Through public participation in events featuring our Haptic Lotus, such as theatre performances for blind and sighted people, and through our engagement in schools and at festivals, we have stimulated public interest in technologically mediated approaches to issues of health, the arts and accessibility. This has led to informed public discourse through reports in national newspapers, in magazines and on the BBC.
Augmented reality (AR) and physiological computing (PC) are computing paradigms for wearable technology. The two can be combined to deliver Adaptive AR (A2R), in which changes in psychophysiology are used to adapt digital artifacts in real time. A number of art exhibits demonstrating A2R were created and presented to the public as part of the Turning FACT Inside Out show in Liverpool. The impact of this research is evidenced by: (a) engaging the public with emerging technology, (b) influencing the strategy of an arts organisation, and (c) informing the practice of artists.
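To make the adaptation idea concrete, here is a minimal, entirely hypothetical sketch of an A2R control loop: a simulated heart-rate reading drives a visual parameter of a digital artifact. The sensor stand-in, the mapping and the parameter names are illustrative assumptions, not the exhibits' actual design.

```python
import random
import time

def read_heart_rate():
    """Simulated physiological input; a real A2R system would stream
    readings from a wearable sensor (this stand-in is hypothetical)."""
    return random.gauss(72, 8)  # beats per minute

def adapt_opacity(opacity, heart_rate, baseline=72.0, gain=0.01):
    """Map deviation from a resting baseline to the opacity of a
    digital artifact, smoothing so the change feels continuous."""
    target = min(1.0, max(0.0, 0.5 + gain * (heart_rate - baseline)))
    return opacity + 0.1 * (target - opacity)

opacity = 0.5
for _ in range(10):                 # ten ticks of the real-time loop
    opacity = adapt_opacity(opacity, read_heart_rate())
    print(f"opacity={opacity:.2f}")
    time.sleep(0.1)
```

The essential point is the closed loop: the viewer's physiological state continuously feeds back into the artwork, so the artifact and the viewer adapt to one another in real time.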
Professor Joseph Hyde's research explores the role of music and sound in a broader performing/digital arts context, through installation and performance works using interactive technologies. Impact is generated through audience members' active participation, which embodies the research. This work often engages a broader audience than purely music/sound work, reaching the wider arts, creative industries, education and science/engineering communities. Two recent projects illustrate this. me and my shadow was commissioned by MADE, a European Commission-funded initiative exploring mobility for digital arts. It ran simultaneously in London, Paris, Brussels and Istanbul, and formed the basis for a European Commission White Paper. danceroom Spectroscopy was a collaborative arts/science crossover project, which attracted attention in both arts and science communities. Both projects attracted substantial funding (c. €400,000 and £165,000 respectively), reached large audiences (5,000 and 20,000 physical attendees respectively) and received wide press coverage.
Glyndŵr researchers designed and developed ambient user interface devices and middleware (known as the 'E-servant'), and evaluated the completed system, on an FP6 project developing near-market-ready prototypes of advanced kitchen appliances. Functionality included sensors in refrigerators that communicated whether the door had accidentally been left open; RFID chips in washing machines that identified laundry and automatically selected the correct programme; and other appliances and sensors (e.g. smoke alarms) that communicated their status via the E-servant to personalised user interfaces. Users could control and monitor the appliances and receive timely notifications. Impact relates to benefits to industrial partners and public engagement with research.
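The architecture described, appliances publishing status through middleware to personalised interfaces, is essentially publish/subscribe. The toy sketch below illustrates that pattern; all class, method and appliance names are illustrative assumptions, not the project's actual API.

```python
from dataclasses import dataclass

@dataclass
class ApplianceEvent:
    """A status message an appliance might send to the middleware."""
    appliance: str
    status: str
    urgent: bool = False

class EServant:
    """Toy middleware: routes appliance events to per-user interfaces."""
    def __init__(self):
        self.subscribers = []   # callables that render notifications

    def subscribe(self, ui_callback):
        self.subscribers.append(ui_callback)

    def publish(self, event: ApplianceEvent):
        for ui in self.subscribers:
            ui(event)

def phone_ui(event):
    """One possible personalised interface: a phone notification."""
    prefix = "ALERT" if event.urgent else "info"
    print(f"[{prefix}] {event.appliance}: {event.status}")

servant = EServant()
servant.subscribe(phone_ui)
servant.publish(ApplianceEvent("fridge", "door left open", urgent=True))
servant.publish(ApplianceEvent("washing machine", "programme selected from RFID-tagged laundry"))
```

Decoupling appliances from interfaces in this way is what lets new sensors or new user devices be added without changing the appliances themselves.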
This case study describes the impact of James Ohene-Djan's research on personalisation, assistive technologies for the deaf, and web-based video. The research led to two spin-out companies:
(i) Viewtalk was started by Ohene-Djan in 2008 in partnership with Deafax, a charity dedicated to access for people with impaired hearing. The company developed video messaging specifically tailored to the needs of the UK's nine million deaf and hard-of-hearing people.
(ii) WinkBall was developed in partnership with a privately owned UK news organisation (Correspondent Corporation). WinkBall provided a system that enabled users to post video content for specific audiences and purposes. At its peak it employed 300 reporters to supply dedicated content and generated a user community of 150,000 active content generators and three million video viewers.