Virtual VJ: Enabling Engagement Through Audio-Visual Performance

Submitting Institution

Northumbria University Newcastle

Unit of Assessment

Art and Design: History, Practice and Theory

Summary Impact Type

Cultural

Research Subject Area(s)

Information and Computing Sciences: Artificial Intelligence and Image Processing, Information Systems
Studies In Creative Arts and Writing: Film, Television and Digital Media


Summary of the impact

Gibson's research-led practice has directly influenced the development of motion-tracking technology used to generate new audio-visual performances. Performances using this innovative art form have taken place in China, Canada and Singapore. They offer audiences a new and enhanced form of interactive participation and help to increase awareness of audio-visual interaction technology and its possibilities. The performances of Gibson's most recent motion-based project, Virtual VJ, have introduced artists, academics and technologists to innovative interactive art forms and have led to technological developments with associated commercial benefits.

Underpinning research

The project researchers are Dr Stephen Gibson (September 2010 to present), Reader in Interactive Media Design at Northumbria University, and Dr Stefan Müller-Arisona, Principal Investigator at the Future Cities Laboratory, Singapore. The Laboratory was established by ETH Zürich (the Swiss Federal Institute of Technology). Dr Gibson has been at Northumbria University throughout this research project, which began in January 2011.

The research objective has been to create a multi-user installation in which the body motions of participants control the behaviour of an audio and video environment. The key goal of the practice-led research is to encourage physical interaction with a technologically-based environment. The key research question explored is how best to enable both cooperation and a sense of personal space in virtual systems. In Virtual VJ, this is achieved by programming the environment so that dramatic events happen when the two trackers are either close together or far apart. For example, the trackers apply distortion to the audio when they are close to each other and reverb when they are far apart.
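To make this proximity rule concrete, the short Python sketch below maps the distance between the two tracker positions to distortion and reverb levels. GAMS itself is proprietary, so the function name, room size and effect scaling here are illustrative assumptions rather than the project's actual mapping.

import math

ROOM_DIAGONAL = 10.0  # assumed maximum separation between the two trackers, in metres

def proximity_effects(tracker_a, tracker_b):
    """Map the distance between two 3D tracker positions to effect levels.

    Distortion rises as the trackers approach each other; reverb rises as
    they move apart, mirroring the proximity rule described above.
    """
    distance = math.dist(tracker_a, tracker_b)
    d = min(distance / ROOM_DIAGONAL, 1.0)  # normalise to 0..1 of the assumed room size
    return {"distortion": 1.0 - d, "reverb": d}

# Example: trackers 2 m apart -> mostly distortion, a little reverb.
print(proximity_effects((0.0, 1.5, 0.0), (2.0, 1.5, 0.0)))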

The project is made technologically possible by 3D motion-tracking hardware and software: the Gesture and Media System (GAMS), produced by APR Inc. and Limbic Media Corporation. The latter company is co-owned by Gibson. Gibson has worked with APR Inc. for the past fifteen years as the lead beta-tester of GAMS, including a period (1998-2005) in which the product was named the Martin Lighting Director and was licensed to Martin Lighting (http://www.martin.com/product/product.asp?product=lightingdirector). GAMS allows artists and other users to 'map' an interactive space with sound, light and images, with user movement controlling these elements via hand-held trackers. There is increasing interest across the field of Human-Computer Interaction (HCI) in the responsiveness of technology to dynamic embodied human use and vice versa.

Gibson began with the creation of audio materials, which he programmed for interaction in 3D space and tested with user movements. The system was developed initially with one motion tracker; a second tracker was then added to manipulate the data triggered by the first. Once user movement was mapped to audio control, Gibson and Müller-Arisona built a related series of controls for video manipulation, which were then mapped to 3D movements for simultaneous video control. Gibson then established a series of nine virtual rooms of increasing complexity, with users progressing from room to room by reaching hotspots. At this stage, Gibson organised a number of users to beta-test the system in order to determine how their body interactions related to it and to other simultaneous users.
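The room-to-room progression can be sketched in the same spirit: nine virtual rooms, each with a hotspot that, once reached by a tracker, advances the participant to the next room. The hotspot coordinates, trigger radius and class below are invented for illustration and are not the GAMS implementation.

import math

HOTSPOT_RADIUS = 0.5  # assumed trigger distance in metres

# One hypothetical hotspot position per room (rooms 0-8, increasing complexity).
HOTSPOTS = [(i % 3 * 3.0, 1.5, i // 3 * 3.0) for i in range(9)]

class RoomSequence:
    """Tracks which of the nine virtual rooms a participant is in."""

    def __init__(self):
        self.current = 0  # start in the first, simplest room

    def update(self, tracker_pos):
        """Advance to the next room when the tracker reaches the current hotspot."""
        last_room = len(HOTSPOTS) - 1
        if self.current < last_room and \
                math.dist(tracker_pos, HOTSPOTS[self.current]) < HOTSPOT_RADIUS:
            self.current += 1
        return self.current

rooms = RoomSequence()
print(rooms.update((0.2, 1.5, 0.1)))  # near hotspot 0 -> advances to room 1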

The research progressed so that audience members interacted with the system in whatever manner they chose, with more noticeable results produced as participants inhabited similar spaces, encouraging them to cooperate with each other in order to produce dramatic results. Further results of the research are that Gibson and Müller-Arisona have built a fully functional, easily transportable installation for two participants at Culture Lab, Newcastle University, and have devised an unencumbered detection system for controlling an audio-visual environment.

The research has been exhibited at a number of high-profile international events, listed below.

References to the research

The project has produced two conference papers:

1. Gibson, S., 'Subjective User-Interaction Models in 3D Spatial Environments: Virtual DJ and Virtual VJ' (2011), 'User in Flux' workshop, CHI Conference, Vancouver, Canada, May 2011. Available from Northumbria University on request.

2. Gibson, S., 'Simulating Synaesthesia in Real-time Performance' (2011), International Symposium on Electronic Art (ISEA 17th Annual Conference 2011), Istanbul, Turkey, September 2011. Available at: http://isea2011.sabanciuniv.edu/paper/simulating-synaesthesia-real-time-performance-using-subjective-user-interaction-models-3d-spat

A full paper of the latter is to be published in a special volume of the Leonardo Electronic Almanac (LEA): Gibson, S., 'Simulating Synesthesia in Spatially-Based Real-time Audio-Visual Performance' (2013, in publication), in Aceti, L., Gibson, S., and Müller-Arisona, S. (co-editors), Live Visuals for Performance, Gaming, Installation and Electronic Environments, Leonardo Electronic Almanac, MIT Press.
http://www.leoalmanac.org/live-visuals-lea-call-for-papers/. (Acceptance letter is available.)

Between May 2011 and June 2012 there were three major exhibitions of Virtual VJ at academic venues:

1. Culture Lab CHI party, at Intersections Digital Studio, Emily Carr University of Art + Design, Vancouver, Canada. Refereed exhibition, May 2011. See:
http://www.youtube.com/watch?v=ahv_grq0bPc

2. The Interactive Experience, HCI 2011, at Culture Lab, Newcastle University, UK. Refereed exhibition, July 2011. See:
http://homepages.cs.ncl.ac.uk/anja.thieme/papers/interactive_experience.pdf

3. 'Designing Musical Interactions for Mobile Systems' at the Designing Interactive Systems (DIS) event, Culture Lab, Newcastle University, UK. Refereed exhibition, June 2012. Available at: http://sopi.media.taik.fi/mobilemusic/

In addition to the above exhibitions, there were two major international public performances of Virtual VJ between August 2011 and May 2013:

1. 2nd Lantian Summer Arts Festival, Jade Valley Winery (a highly regarded site for Chinese and visiting artistic exhibitions), Xi'an, China. Invited exhibition, August 2011. See
http://www.jadevalley.com.cn/news/?type=detail&id=225

2. Digital Art Weeks 2013, Future Cities Laboratory, Singapore. Invited Exhibition, May 2013. See http://www.digitalartweeks.ethz.ch/web/DAW13/Front

Details of the impact

The technological developments of the GAMS system that underpin Virtual VJ have resulted in commercial benefits, as well as increased knowledge of, and access to, innovative interactive art forms for artists, academics and technologists at international institutions. Virtual VJ creates opportunities for a high degree of audience participation, enhancing the cultural experience of public and specialist audiences.

Virtual VJ has been exhibited at internationally significant venues which showcase innovative research to Human-Computer Interaction specialists and a wider audience. These have proved successful dissemination vehicles for Gibson's research but, more importantly, have allowed international artists, academics and students to use the technology and to explore its opportunities in their own practice.

During 2011, Gibson exhibited Virtual VJ at CHI at Emily Carr University of Art + Design in Vancouver, Canada, and at Jade Valley in Xi'an, China. The Vancouver audience comprised 200 professional designers, artists and academics (from interactive art, human-computer interaction and interactive media design). The Xi'an audience of approximately 500 included local political figures and high-profile business people, as well as artists from Xi'an Academy of Fine Arts. At these venues, audiences benefited from an opportunity to interact directly with the piece in real time and to influence the creation of a unique live interactive environment. The organiser in Vancouver stated: "Virtual VJ ... directly impacted the perception of interactive art by the general public." Audience members were invited to perform using the motion-trackers and gained a direct sense of how motion-tracking can be used to control simultaneous media with their bodies. This exposed them to innovative ideas about art and interactive experiences, especially participative art forms.

The next version of the project, exhibited at Jade Valley, was the culmination of a week-long workshop which Gibson held at Xi'an Academy of Fine Arts (XAFA), where there are no courses in interactive design or new media. This was the students' first exposure to the motion-tracking technology and their first opportunity to use it to change their artistic practice and to create their own interactive projects. The XAFA students gained awareness, understanding and practical experience from developing these interactive design projects.

Gibson's research has had an international impact on artists, influencing the design of interactive, immersive environments. An artist, commenting on a piece of work he produced in 2011, described how he was influenced by Gibson's research: "I was inspired by the concept of Virtual VJ that a participator's body gestures can be the interface to control audio and video materials, and to explore the spatial and corresponding relationship between humans, space and diverse media. GAMS was the key technology used in Light & Shadow, which is not only a simple interactive installation, but also an experiment of re-interpreting traditional culture via digital technology."

The Virtual VJ project was later performed in May 2013 at Digital Art Weeks in Singapore in an event sponsored by ETH Zurich. The primary collaborator for this project describes the impact on the Value Lab Asia: "Working with a motion-tracking system for physical and gestural live visuals control has helped to add a new scale of dynamics to live animated content, making content itself an immersive gesture, similar to a lighting system and disconnected from projection screen. In addition, the first exhibitions coincided with the early design phase of the Value Lab Asia ... the experience collected during these exhibitions considerably influenced several of the principal design features of the Value Lab Asia."

The statement above corroborates the beneficial changes made to this high-value facility as a result of Gibson's research-practice. Value Lab Asia (VLA) is a digitally augmented, media-enriched collaborative environment, equipped with several touch surfaces and a very high-resolution video wall, located at the Singapore National Research Foundation. It has a wide range of applications, such as participatory planning and design, information visualisation and discovery, and remote conferencing: for example, it is used as a transmedia workshop space, where an audio-visual organiser supports participants, using voice, tweets and audio-visualisation, to engage with each other and with live audio-enabled visuals on the wall of multiple touch-responsive screens.

The commercial impact of Gibson's work with the GAMS system has helped Acoustic Positioning Research (APR Inc.), a company based in Edmonton, Canada, to improve the quality of the GAMS system and to sell more units. These benefits are described by the President of APR and inventor of the Gesture And Media System (GAMS): "Dr. Gibson's work has helped APR significantly ... it has improved the quality of the GAMS ... due to his early-adopter bug finding and suggestions for both incremental improvements and significantly different and innovative functionalities; it has helped sell systems by providing documented examples of engaging, performative and public-interactive use; it has inspired our engineers, confirming and exemplifying the original integrated-media vision of GAMS technology; ... and applicable in matters reaching far beyond the world of media entertainment."

Sources to corroborate the impact

The Principal Investigator at Future Cities Laboratory, Singapore-ETH Centre, confirms impact on creative practice in the live visual and interactive environment field and also Gibson's impact on the technological development of Value Lab Asia. An email statement is available. Link to the workshop: http://www.digitalartweeks.ethz.ch/web/DAW13/VLAWorkshop

The President of Acoustic Positioning Research (APR Inc.) is an artist and engineer working with "integrated" (as opposed to "multi") media. He is the inventor of the Gesture And Media System (GAMS). A statement from the President is available and corroborates the impact of Gibson's research on developments of the GAMS technology and quantifies this impact with jobs and company revenue data.

Audience figures and beneficiaries for the CHI exhibit can be confirmed by European Research Council (ERC) Professor at Goldsmith's Digital Studios, University of London. An email from the ERC Professor detailing this impact is available.

The main organiser of the Jade Valley events can confirm audience figures and beneficiaries for the Jade Valley exhibit. An e-mail statement detailing this impact, as well as the impact on his own work as an artist, is available.