In the early days of computer games, sound was created using simple synthesis techniques, but with the development of faster processors and larger storage media, the field moved towards wave-table synthesis, which has become the most widely used technique in current computer games. Since the introduction of wave-table synthesis, the development of audio creation and playback in games has stagnated.
One of the newest fields within sound synthesis is physical modelling, which holds great potential within games and interactive environments because of its more dynamic nature. Very little research has been done on measuring the impact of physically modelled sound in computer game environments. This has led to the following problem statement: To what degree does physically modelled sound enhance physical immersion in first-person computer games?
This project has analysed theories proposed by several authors within the fields of immersion, narrative and gameplay in computer games, and audio in computer games. These fields and their different theories have formed an ontology for the project, upon which an application has been built. The application consists of a Half-Life 2 modification that makes use of the Nintendo Wii controllers together with modal sound synthesis.
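The core idea of modal synthesis is to represent a sounding object as a bank of resonant modes, each an exponentially decaying sinusoid excited by an impact. A minimal sketch of that idea (the mode frequencies, decay rates, and amplitudes below are invented placeholder values, not the parameters used in the project):

```python
import math

def modal_impact(modes, sample_rate=44100, duration=1.0):
    """Synthesize an impact sound as a sum of exponentially
    decaying sinusoids, one per resonant mode.

    modes: list of (frequency_hz, decay_rate, amplitude) tuples.
    Returns a list of samples.
    """
    n = int(sample_rate * duration)
    samples = []
    for i in range(n):
        t = i / sample_rate
        s = sum(a * math.exp(-d * t) * math.sin(2 * math.pi * f * t)
                for f, d, a in modes)
        samples.append(s)
    return samples

# Hypothetical modes for a small struck object (placeholder values).
modes = [(440.0, 6.0, 0.6), (1130.0, 9.0, 0.3), (2790.0, 14.0, 0.1)]
sound = modal_impact(modes, duration=0.5)
```

Because the mode parameters can be derived from an object's material and geometry and varied per impact, this kind of synthesis reacts dynamically to gameplay in a way a fixed wave-table sample cannot.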
There is no such thing as human-computer interaction (HCI), as there is a human, in the role of a designer, behind all computer systems. The concept of HCI is therefore a way to describe interesting ways for humans to interact through computer systems. Hereby I propose that there is no such thing as HCI, because computers do not provide information in any form. Computers are tools that mediate information between humans. What is being discussed here is the interaction between a human and a machine concerning the information that is being exchanged. One might argue that interaction between a human and a machine is possible; however, this is without meaning, because an interaction is an exchange of information, and only living things can provide information.
As human beings, we depend on our ability to navigate by 3D audio, since it provides us with many cues about how to navigate and behave in our surroundings. Having two ears, one on each side of the head, enables us to perceive the azimuth of a given sound; in fact, we are able to localize a sound source to within 2 degrees of azimuth. The design of the pinna, or outer ear, together with our torso, provides us with the ability to perceive the elevation of a given sound.
During the past decade there has been an increase of interest in 3D sound, or spatial audio, within entertainment, industry, and research; in this period several methods and systems have been developed to reproduce spatial audio. One of these methods uses head-related transfer functions (HRTFs), which encode several audio cues in order to provide the listener with a broad spatial soundscape.
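In practice, an HRTF-based system filters a mono source with a pair of measured head-related impulse responses (HRIRs), one per ear, so that headphone playback carries the interaural time and level differences the brain uses for localization. A minimal sketch of that filtering step (the short HRIRs here are invented toy values, not measured data):

```python
def convolve(signal, impulse_response):
    """Direct-form FIR convolution. Fine for short filters;
    real-time systems typically use FFT-based convolution instead."""
    out = [0.0] * (len(signal) + len(impulse_response) - 1)
    for i, s in enumerate(signal):
        for j, h in enumerate(impulse_response):
            out[i + j] += s * h
    return out

def binaural_render(mono, hrir_left, hrir_right):
    """Return a (left, right) pair of channels for headphone playback."""
    return convolve(mono, hrir_left), convolve(mono, hrir_right)

# Toy HRIRs: the right ear receives a delayed, attenuated copy,
# mimicking the interaural cues for a source to the listener's left.
hrir_left = [1.0, 0.2]
hrir_right = [0.0, 0.0, 0.0, 0.5, 0.1]
left, right = binaural_render([1.0, 0.0, 0.0, 0.0], hrir_left, hrir_right)
```

The cost of this per-source convolution is exactly why efficient algorithms that approximate HRTFs, rather than convolving with full measured databases, are attractive for real-time use.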
Having done my bachelor's project on games, it is time to move on to something more specific, namely how head-related transfer functions can be used in real time. More precisely, we have found research on efficient algorithms that simulate HRTFs without directly using HRTF databases.
Anyway, as a 7th-semester Medialogy master's student at Aalborg University in Copenhagen, I take part in creating an audiovisual augmented reality (AR) installation that uses visual tracking and simulates HRTFs. My interest lies in how 3D audio can become more vivid without using too many system resources, and in how important it is (i.e. to what extent people notice differences in off-location spatial audio).
My associate Anders Fredslund and I will create an external for Max/MSP that handles HRTFs, and hopefully it can be used for more than testing our theses.
Bachelor thesis at Medialogy: Bringing Direct Social Interaction to Computer Games.
In this project we did a lot of research into the social factors in play and gaming. We established a framework for describing the immersive factors in a game and tested the framework with a computer-augmented card game that displayed the players' stats on a pseudo-holographic display, which enabled face-to-face communication while following the displayed stats.
How can we enhance the strengths of a board game with the technologies provided by a computer?
When gaming becomes an asocial pastime, it is a cry to developers to change course.
The ever-growing and impressive features of computer games have long overshadowed the appeal of conventional games. The still rapid development of technology allows for ever more extraordinary graphics engines. But what is happening to the good old tabletop games?
We have delved into this aspect and investigated the relation between immersion and socialization as a way to bridge the idea of board games and the power of computers.
A bachelor thesis about direct social interaction in multiplayer games. The Petri³ is pronounced “petri cube”.
The Petri³ was a tool for my bachelor thesis at Medialogy, with which we tested the degree of direct social interaction in an immersive augmented reality multiplayer card game. Besides playing around with cool next-gen tech, we wanted to see how people responded to digital augmentation of a card game. We ended up with a contraption that tracked a card game with image recognition and displayed the game's status on a holographic display.
There were several elements in the project: the card game, which was an extremely simplified flavor of the mechanics from the Munchkin games; a game-tracking engine that read the cards as they were played and kept track of the players' scores; and lastly the pseudo-holographic display, which we made out of a truncated plexiglass pyramid with a rear projection onto the top.
My first DADIU production was called Hængerøv and was made in the Source SDK using Lua, C++ and Visual Studio 2005. The production took place in May 2007.
The game was a 3D platformer in which you play a young boy who accidentally broke his sister's new cell phone while secretly tampering with it in his tree house one night. The objective is to collect the missing buttons while evading the furious sister, who threatens to embarrass you by showing a picture of your bare bottom to the entire school. But that is not all.
The dark garden is a treacherous place to sneak around. Garden gnomes have come to life, and they yodel awfully loudly when tripped over; that is exactly what your sister is waiting for, so she can find you and take your picture.
Your parents aren't much help, as they are busy having a garden party. Empty wine bottles can also give away your position, and so can the (not-so-scary, almost pathetic) ghosts haunting the garden.
Fortunately you are armed with your Bug Vacuum Gun, which can suck up the ghosts. The downside is that it attracts your sister's attention as well.
Your only chance of avoiding total embarrassment is to return the repaired phone to your sister before her boyfriend calls her.