MONSTERS
This is an audio game/simulation in which you must defend yourself against the monsters in a dark cave. With a torch as your only tool, you must shine your light in the direction you hear the monsters coming from so that they run away. We used four speakers to create a surrounding cave environment in which mysterious monsters approach and surround the player from different directions.
For this project we used Max/MSP, a visual programming language, to connect with the compass embedded in an iPhone. The iPhone acted as the player's torch: when it was angled in the right direction, the audio would skip to the appropriate sound to signal the creature either attacking or fighting.
The Max/MSP patch also included reverb (to simulate distance within the virtual cave) and 3D sound panning. Below are the sound effects used, not including the ambient cave sounds, which played seamlessly in the background through each speaker in the installation.
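The combination of panning and distance reverb described above can be sketched outside of Max/MSP. The following is a minimal, hypothetical Python model of equal-power panning across four speakers plus a reverb send that grows with distance; the speaker layout, function name and the 10-metre scaling are assumptions for illustration, not how the actual patch was built.

```python
import math

def quad_pan(bearing_deg, distance):
    """Equal-power gains for four speakers at 0/90/180/270 degrees,
    plus a reverb-send level that grows with distance.
    Hypothetical helper -- the installation did this inside Max/MSP."""
    speakers = [0.0, 90.0, 180.0, 270.0]
    gains = []
    for s in speakers:
        # angular distance between the source bearing and this speaker
        diff = abs((bearing_deg - s + 180.0) % 360.0 - 180.0)
        # only the two nearest speakers contribute (diff < 90 degrees)
        if diff < 90.0:
            gains.append(math.cos(math.radians(diff)))  # equal-power law
        else:
            gains.append(0.0)
    # farther monsters get more reverb ("wet") and less dry signal;
    # the 10-metre cave radius is an invented constant
    wet = min(1.0, distance / 10.0)
    dry = 1.0 - wet
    return gains, dry, wet

# a monster halfway between the front and right speakers, 5 m away
gains, dry, wet = quad_pan(45.0, 5.0)
```

With equal-power panning the squared gains of the two active speakers sum to one, so the monster's perceived loudness stays constant as it circles the player.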
REFLECTIVE STATEMENT
Our aim was to create an interactive environment in which the user themselves is the main object, rather than a physical gaming device such as a controller. The simulated environment therefore had to engage the senses through sound, and we considered many methods of exploring this. Our conceptual ideas also explored how the body would react when the user is sensing their way through a dark space.
Games by their nature have been judged on their visuals, their sprites and their user interfaces, in the past and even today. What we as a group wanted to explore was different, and unique among the other groups. We wanted to get the user out of the usual mindset of expecting something to look at as soon as they are told they are being introduced to a game simulation. Thus we took sound as the key ingredient in our simulation.
Sound is more important than most people realise. Try playing your favourite game with all the sounds turned off: it doesn't play right, does it? In a study done by Lucasfilm while testing the THX standards, it became apparent that decent sound can actually fool the brain into thinking the picture is better. A group of people who were shown a movie with average sound, then the same movie with better sound, commented that the picture seemed sharper too. Now we have a new tool to play with: 3D spatialised sound. Not that spatialised sound hasn't been used before - lots of games split mono sounds over two channels and use the 3D distance from the camera to determine each channel's volume. But this has always been more of a gimmick than a genuinely helpful tool for the player. That's not to say it isn't helpful at all, just that for you, as a games player, to really be able to use it, you need just the right set of circumstances. Things are coming along with the new 5.1 speaker setups being developed for the home PC.
To give an overview of the ideas we discussed:
We started off in the direction of using a lot of sound to create this space.
Lots of dark creatures, panthers, flying bats and monster sounds. Our original idea was that when you move your body to face the direction of a sound, the monster would die; stronger enemies might take longer, or require more damage, before dying. We also considered using knock sensors, so that you have to stomp your feet on the floor to scare the monsters away, as you would to scare away mice or roaches - using ground vibrations, since all animals have an instinct about natural vibrations and the weather. To give it a more realistic feel we thought about adding some gravel to the floor, but then again you can't really dirty up university space. For the creatures moving towards you we wanted to use LED lights, having them approach like eyes in the dark; if too many lights reached you at once, the player would die instantly. This didn't serve the purpose of a sound-only game, since it included visuals, but we developed the idea nevertheless to see where we could subtract the sense of sight later in the process. With the original idea in mind, we decided to make a black dome to stand under, with LEDs scattered around like stars to mark the approaching enemies. These lights would give the player feedback that they were being attacked. We wondered whether we could add some kind of sound alert to notify the player that they had killed an enemy, as every game has this feature. Another early idea was to have a head torch of some sort, with an accelerometer to sense which way the player is facing in the dark.
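The face-the-sound mechanic above can be sketched in a few lines. This is a toy Python model, not the actual implementation: the tolerance angle, hit-point values and function names are all invented for illustration.

```python
def angle_diff(a, b):
    """Smallest absolute difference between two compass angles, in degrees."""
    return abs((a - b + 180.0) % 360.0 - 180.0)

def tick(monster_hp, player_heading, monster_bearing, tolerance=30.0):
    """One game step: facing the monster within `tolerance` degrees
    drains one hit point; tougher monsters start with more HP.
    All numbers here are invented for the sketch."""
    if angle_diff(player_heading, monster_bearing) <= tolerance:
        monster_hp -= 1
    return monster_hp

hp = 3                       # a 'stronger' enemy needs three successful ticks
hp = tick(hp, 88.0, 90.0)    # facing it: one hit lands
hp = tick(hp, 200.0, 90.0)   # facing away: no damage
```

A weaker enemy would simply start with `hp = 1`, dying the first time the player turns towards it.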
Original Interface –
Player
Embedded in the helmet is an accelerometer (either a Wiimote, an Android phone or an iPhone).
The accelerometer reads the direction the player is facing, and this is how the player aims their attacks - by looking. This data is transmitted via OSC to Max/MSP.
At this stage we were trying to use the data to drive the sound side of the game.
Sound
Enemies come at the player via sound, approaching from different directions (dealing with 3D spatial sound). For this we sought out the speakers in the interactive Co-Lab. The key program we were to use was Max/MSP.
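The OSC routing implied by this interface - separate sensor streams arriving at separate addresses - can be sketched without any OSC library. The following is a toy Python stand-in for what c74/Max/MSP would do; the addresses `/compass` and `/shake` and the state dictionary are illustrative assumptions.

```python
def make_dispatcher():
    """Toy stand-in for OSC routing inside Max/MSP: each address maps
    to its own handler, so compass and shake messages can be processed
    independently. The address names are invented for this sketch."""
    state = {"heading": None, "slashes": 0}

    def on_compass(heading):
        # normalise the heading into the 0-360 degree range
        state["heading"] = heading % 360.0

    def on_shake():
        # each shake message counts as one sword slash
        state["slashes"] += 1

    routes = {"/compass": on_compass, "/shake": on_shake}

    def dispatch(address, *args):
        handler = routes.get(address)
        if handler:
            handler(*args)

    return dispatch, state

dispatch, state = make_dispatcher()
dispatch("/compass", 370.0)   # phone reports a heading
dispatch("/shake")            # phone reports a slash gesture
```

Because each address has its own handler, a compass update never blocks a shake event, which is the behaviour we were ultimately after.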
Coming to what we actually managed to do, given time constraints and technical difficulties: we changed the whole concept from a game to a simulation. We dropped the idea of the LEDs, as playing with sound alone gave a deeper meaning to our concept, which was to completely remove any visual aid. The simulation of being in a cave only works if the player is in total darkness. One's sense of sound is heightened when the eyes receive no feedback whatsoever, something we have all experienced in day-to-day life. So we shifted our direction slightly to give the player a feeling of isolation, which in turn made the piece more effective. We mastered our soundtrack in GarageBand on the Mac, taking lots of sound clips from the internet and merging them into one scary soundtrack. Our main problem was the programming in Max/MSP. As Melody and I had no knowledge of C++, we stuck to a program we were both a little comfortable with.
We used 3D sound panning to give the feeling of the monster travelling around you through the speakers. What we found really interesting was that this could also give the feeling of catching the monster through the speakers. It seems impossible to catch a creature through speakers, but we decided to have Max/MSP detect when the player was facing the same direction as the speaker, and then diminish or change the sound. Even some kind of movement, like a swipe of the hand (a sword slash), should be enough to give the player some interaction beyond just walking around the room following the sound. This is when we came up with the idea of using the accelerometer in the iPhone. We chose the iPhone over the Wii remote because the iPhone has become very popular with this generation, and we wanted to take something people use on a day-to-day basis (a phone) and underplay its whole application concept. At this stage we also felt that turning our simulation into an application and putting it on the App Store would be a fun thing to do. We created the app on the iPhone using c74, which lets you link the iPhone directly to Max/MSP, and also lets us send the accelerometer, compass, shake, GPS and microphone data to Max/MSP for processing. This was very good progress for us, as we had spent days and days trying to figure out how to do this. Most of our time was wasted trying new things with the Wii remote; then, a week before the due date, we came across a method for creating the iPhone app. We were working on our code at the same time. Getting everything to work together was the biggest pain of all: getting reverb working on the soundtrack, getting the compass feeding the sound-panning patch so that it changes automatically according to the direction the phone is facing, and getting shake to work as the sword slash.
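The "shake as sword slash" trigger boils down to watching the accelerometer magnitude for a spike. Here is a minimal Python sketch of that idea; the 2 g threshold, the sample values and the function name are assumptions for illustration, not the values c74 or our patch used.

```python
import math

def detect_shake(samples, threshold=2.0):
    """Flag a 'sword slash' at each accelerometer sample whose magnitude
    exceeds `threshold` (in g). Threshold and data are invented here."""
    events = []
    for i, (x, y, z) in enumerate(samples):
        mag = math.sqrt(x * x + y * y + z * z)  # overall acceleration
        if mag > threshold:
            events.append(i)                     # record when the slash happened
    return events

still = [(0.0, 0.0, 1.0)] * 3        # phone at rest reads roughly 1 g
slash = still + [(2.5, 1.0, 1.0)] + still  # one sharp flick of the wrist
```

In a real patch the threshold would need tuning, and a short refractory period would stop one flick from registering as several slashes.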
Another problem was getting shake and compass to work together, as our application would only pick up one at a time. This really wasn't working for us, but with a little more time we think we could have had both functioning simultaneously.
All in all, we really enjoyed this paper and this project per se. We worked well together to reach our targets, each making full use of our individual strengths. We are actually quite satisfied with the outcome: until five minutes before the deadline nothing seemed to work, and then suddenly everything fixed itself with one missing line of code. We put in considerable hours and all-nighters to get things working, because you know how programming is - loads of references and research, libraries and examples. To conclude, the lectures you gave us personally were useful, as they gave us perspective and direction in thinking a different way.