Forthcoming Meetings

Making sense of the beat: How humans use information across the senses to coordinate movements to a beat.

Date: 20 Aug 2014
Time: 18:00

Location: Room 203, Birmingham City University
Millennium Point
Birmingham

Lecture by Dr. Mark Elliott, School of Psychology, University of Birmingham.
Room 203, Millennium Point, Birmingham City University, Curzon St, Birmingham, B4 7XG.
 

People will nod or tap along to the beat of a song, often without even thinking about it. This demonstrates the strong links between auditory rhythms and movement in humans. However, the brain often uses multiple senses, including vision and touch rather than sound alone, to define events in time. While synchronising movements to the beat appears a very simple thing to do, the brain continuously has to deal with conflicting information from these senses and correct for errors so that we continue to move in time with the beat. In this talk I will present the research we have carried out in the Sensory Motor Neuroscience (SyMoN) Lab at the University of Birmingham to understand how we keep in time with the beat. In particular, I will discuss the models that describe how we combine information across the senses and correct the mistakes we make. I will also talk about how we have used these models to understand how a string quartet keeps in time together, how a DJ convinces us they have performed a seamless mix, and how excited crowds end up bouncing around in synchrony.
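
For readers unfamiliar with this kind of modelling, a common textbook formulation in the multisensory literature weights each sensory cue by its reliability (the inverse of its variance). The minimal Python sketch below is offered purely as an illustration of that general idea and is not necessarily one of the specific models discussed in the talk.

# Illustrative reliability-weighted (maximum-likelihood) cue combination.
# A standard textbook formulation, not a description of the SyMoN Lab's models.
def combine_cues(estimates, variances):
    """Fuse timing estimates from several senses, weighting each by 1/variance."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    fused_estimate = sum(w * e for w, e in zip(weights, estimates)) / total
    fused_variance = 1.0 / total  # the fused estimate is more reliable than any single sense
    return fused_estimate, fused_variance

# Example: an auditory beat heard at 500 ms (low variance) and a visual flash seen
# at 520 ms (high variance); the fused estimate lies closer to the auditory cue.
print(combine_cues([500.0, 520.0], [25.0, 100.0]))  # (504.0, 20.0)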

 


Augmenting the piano keyboard: From the lab to the stage

Date: 9 Sep 2014
Time: 18:30

Location: Arts One, Queen Mary University of London (Mile End campus)
ArtsOne
London

Lecture by Andrew McPherson, Queen Mary University of London

This talk presents two augmented musical instruments which extend the capabilities of the familiar keyboard. The magnetic resonator piano (MRP) is an electronically-transformed acoustic grand piano. Electromagnets induce vibrations in the strings, creating infinite sustain, crescendos from silence, harmonics, pitch bends and new timbres, all controlled intuitively from the piano keyboard.

The TouchKeys add multi-touch sensing to the surface of any electronic keyboard. Capacitive touch sensors measure the position and contact area of the fingers on each key, allowing the performer to add vibrato, pitch bends and timbre changes to each note independently just by moving the fingers on the keys during performance. The mappings between touch data and sound have been designed to avoid interference with traditional keyboard technique, so the TouchKeys can build on the expertise of trained pianists with minimal relearning.
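
As a purely hypothetical illustration of what one such mapping might look like in software, the sketch below maps finger movement along a key to a MIDI pitch-bend value; the function name, scaling and relative-to-onset behaviour are assumptions for illustration, not the actual TouchKeys implementation.

# Hypothetical touch-to-sound mapping: sliding a finger along a key bends the pitch.
# Names and scaling are illustrative assumptions, not the TouchKeys software itself.
def touch_to_pitch_bend(touch_y, onset_y, travel_semitones=1.0, bend_range=2.0):
    """Map finger movement along the key (0.0 = front edge, 1.0 = back edge),
    measured relative to where the note was first touched, to a 14-bit MIDI
    pitch-bend value. Full key travel corresponds to travel_semitones; the
    synth's pitch-bend range is assumed to be +/- bend_range semitones."""
    offset = touch_y - onset_y                      # movement since note onset
    bend_semitones = offset * travel_semitones      # scale movement to a musical interval
    value = int(8192 + (bend_semitones / bend_range) * 8191)
    return max(0, min(16383, value))                # clamp to the valid pitch-bend range

print(touch_to_pitch_bend(touch_y=0.75, onset_y=0.5))  # finger slid toward the back of the key

Measuring the bend relative to where the finger first lands, rather than to an absolute key position, is one way a mapping like this could avoid interfering with ordinary keyboard technique, in the spirit of the design goal described above.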

Both instruments have established a continuing musical presence outside the research lab. In addition to presenting the instrument designs, this talk will discuss recent collaborations with the London Chamber Orchestra and the band These New Puritans using the magnetic resonator piano, and a Kickstarter crowd-funding campaign for the TouchKeys which raised support for producing and distributing TouchKeys instruments to musicians in 20 countries.


A Comprehensive Overview of Game Audio

Date: 3 Nov 2014
Time: 15:00

Location: Room 203, Birmingham City University
Millennium Point
Birmingham

Room 203, Millennium Point, Birmingham City University, Curzon St, Birmingham, B4 7XG.

Overview:
AES-Midlands are hosting an afternoon of presentations introducing the field of video game audio. From simple casual games all the way to AAA blockbusters, audio plays a major role in the gamer’s experience. The interactive nature of games and the technical limitations of the platforms they run on contrast sharply with linear media such as TV and film. Delivered by industry professionals from both creative and technical backgrounds, these presentations are ideal for anyone interested in the industry.

The event is free of charge and open to everyone (members and non-members). Free tickets are available here.

Schedule:
[3.00 - 3.45] The Sonic Journey, Andy Grier (FreeStyleGames/Activision)
[3.45 - 4.15] Pew-Pew! Boom-Boom! Kapow!, Andy Grier (FreeStyleGames/Activision)
[4.15 - 5.00] If a tree falls in the forest and no one is around to hear it, how many channels does it use?, Jethro Dunn (Codemasters)
[5.00 - 5.30] Break
[5.30 - 6.15] Mixing for the Unknown, Edward Walker (Sounding Sweet)
[6.15 - 7.00] Sound Bytes, Aristotel Digenis (FreeStyleGames/Activision)
[7.00 - 7.45] Creating a Virtual World in Real-Time, Jon Holmes (Rare/Microsoft)
[7.45 - 8.00] Coffee
[8.00 - 8.30] To ∞ and beyond…, Aristotel Digenis (FreeStyleGames/Activision)

Abstracts:

3.00 – 3.45: The Sonic Journey
Andy Grier – Lead Audio Designer – FreeStyleGames/Activision
A brief historical audio tour from the first bleeps and bloops of Pong, through the various synthesis methods of the console generations that you or your parents played on, right up to the groundbreaking technology found in current-generation video games. Andy will highlight key moments in game audio history that helped shape the industry, whilst engaging the audience in a comparison of where we’ve come from versus where we currently stand.

3.45 – 4.15: Pew-Pew! Boom-Boom! Kapow!
Andy Grier – Lead Audio Designer – FreeStyleGames/Activision
So what does sound design for video games entail? Andy will discuss the task of producing the raw source material and content that goes into a video game, covering audio topics such as traditional sound design, Foley art, field recording, synthesis, voice-over production and music composition, to name a few. It is the harmonious sum of all these components, brought together, that creates the audio vision for the game.

4.15 – 5.00: If a tree falls in the forest and no one is around to hear it, how many channels does it use?
Jethro Dunn – Senior Audio Designer – Codemasters
TBA

5.00 – 5.30: Break
Sandwiches will be served in Room 405

5.30 – 6.15: Mixing for the Unknown
Edward Walker – Game Audio & Post Production Sound Engineer/Director – Sounding Sweet
TBA

6.15 – 7.00: Sound Bytes
Aristotel Digenis – Lead Audio Programmer – FreeStyleGames/Activision
While new game console generations offer great gains in computational power, audio programmers still need to be creative in how best to implement audio algorithms in games. Computational complexity aside, algorithms that may be suitable for offline processing often need to be reviewed and adjusted for the interactive, real-time nature of games. Aristotel will cover engaging topics including spatial audio, environmental audio, codec selection, and toolsets for authoring interactive audio.

7.00 – 7.45: Creating a Virtual World in Real-Time
Jon Holmes – Audio Engineer – Rare/Microsoft
It’s a really exciting time for audio in games. Technology has rapidly advanced to the point where we have the fewest restrictions ever on what we are capable of doing. The last generation of games consoles in particular has allowed game audio programmers to really flex their technical creativity. Jon will talk about how much audio really goes into a modern AAA game and how the available technology is used to make it sound dynamic and fully immersive.

7.45 – 8.00: Coffee
Coffee will be served in Room 405

8.00 – 8.30: To ∞ and beyond…
Aristotel Digenis – Lead Audio Programmer – FreeStyleGames/Activision
What next…? That is the question. This closing talk will go over just some of the areas being researched by both game audio companies and academic institutions, and suggest how game audio may benefit from them.