Having done my bachelor's on games, it is time to move on to something more specific: how head-related transfer functions (HRTFs) can be used in real time. Or, more precisely, we have found research on efficient algorithms that simulate HRTFs without directly relying on HRTF databases.
Anyway, as a 7th-semester Medialogy master's student at Aalborg University in Copenhagen, I am taking part in creating an audiovisual augmented reality (AR) installation that uses visual tracking and simulates HRTFs. My interest lies in how 3D audio can be made more vivid without consuming too many system resources, and how much this matters (i.e. to what extent people notice off-location spatial audio).
My associate Anders Fredslund and I will create an external for Max/MSP that handles HRTF processing, and hopefully it can be used for more than just testing our theses.
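To give a flavour of the kind of efficient, database-free approximation mentioned above, here is a minimal sketch of spatialization using only interaural time difference (ITD, via Woodworth's formula) and a crude interaural level difference (ILD). This is not our actual Max/MSP external; the head radius, the 0.3 ILD factor, and the function names are illustrative assumptions:

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
HEAD_RADIUS = 0.0875     # m, an assumed average head radius

def itd_woodworth(azimuth_rad):
    """Woodworth's formula for interaural time difference (seconds)."""
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (azimuth_rad + np.sin(azimuth_rad))

def spatialize(mono, sr, azimuth_deg):
    """Crude ITD/ILD spatialization of a mono signal.

    Positive azimuth = source to the right. Returns (left, right) channels.
    """
    az = np.deg2rad(azimuth_deg)
    delay_samples = int(round(abs(itd_woodworth(az)) * sr))
    # Simple ILD: attenuate the far ear (rough sine-law shape, factor assumed)
    near_gain, far_gain = 1.0, 1.0 - 0.3 * abs(np.sin(az))
    # Delay the far-ear signal by the ITD, keeping the original length
    delayed = np.concatenate([np.zeros(delay_samples), mono])[: len(mono)]
    if azimuth_deg >= 0:          # source on the right: left ear is far
        left, right = far_gain * delayed, near_gain * mono
    else:
        left, right = near_gain * mono, far_gain * delayed
    return left, right
```

A real HRTF captures far more (pinna filtering, elevation cues, head shadow as a frequency-dependent filter), but even this cheap model conveys why such approximations are attractive in real time: a delay line and a gain per ear instead of a long convolution per source.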