Final Project Proposal: Ayo
Viu - It's Alive!
Project Vision
The goal of this project is to create a playable experience using audio, video animation and the human body as its mediums. To this end, participants will engage with the project by walking, jumping and running around an interactive room, exploring the secrets it offers and the subtle ways it comes alive.
Similar Previous Work
The inspiration for this project is the idea of giving a form or voice to the mundane and largely invisible. To that end, here is a sampling of projects that I believe realise this idea very well:
Electricity Comes from Other Planets
Creators: Fred Deakin, Nat Hunter, Marek Bereza, James Bulley for the Joue Le Jeu (Play Along) exhibition.
Propinquity
Creators: Lynn Hughes, Bart Simon, Jane Tingley, Anouk Wipprecht, Marius Kintel, Severin Smith
Mogees - Play the world
Creator: Bruno Zamborlin of the Embodied AudioVisual Interaction (EAVI) group at Goldsmiths.
Implementation Details
Anticipated Hardware
- Microsoft Kinect or other available depth camera
- Short-throw Projector
- Low-cost micro-controller
- Wireless RF modules: XBee or Bluetooth
- 3-axis accelerometer
- Laptop computer
- RGB LEDs
My approach to the gameplay mechanics will be to use simple actions and behaviours that scale in order to produce emergent complex behaviour. This should create the illusion that the environment is alive and responding to participant input.
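To make the scaling idea concrete, here is a minimal sketch (plain Python, with no Kinect or LED library assumed) in which every element in the room follows a single local rule, reacting only to the nearest tracked participant; more participants and more elements produce richer aggregate behaviour without any extra rules. The room dimensions, element grid and activation radius are illustrative assumptions.

```python
import math
import random

def nearest_distance(element_pos, participants):
    """Distance from one element to the closest tracked participant."""
    return min(math.dist(element_pos, p) for p in participants)

def update_elements(elements, participants, radius=1.5):
    """Single local rule: an element activates when someone is within `radius` metres."""
    return [nearest_distance(pos, participants) < radius for pos in elements]

if __name__ == "__main__":
    # Hypothetical 6 m x 4 m room tiled with responsive elements (e.g. LED cubes).
    elements = [(x + 0.5, y + 0.5) for x in range(6) for y in range(4)]
    participants = [(random.uniform(0, 6), random.uniform(0, 4)) for _ in range(3)]
    states = update_elements(elements, participants)
    print(f"{sum(states)} of {len(states)} elements are currently active")
```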
Ninja: The goal is to avoid scan lines (a radar-like pattern) projected toward participants. The game encourages walking, running and jumping to evade the scan lines as they approach.
Single-Player Gameplay: The computer projects the scan lines, which the lone participant must evade.
Multi-Player Gameplay: Participants generate the scan lines themselves and use them to tag other participants, who can either evade or generate their own scan lines to cancel the approaching ones.
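As a rough illustration of the Ninja mechanic, the sketch below assumes that participant positions arrive from the depth camera as (x, y) floor coordinates and that the scan line is a straight front sweeping across the room at a constant speed; the constants and function names (ROOM_LENGTH, tag_hits, and so on) are purely hypothetical.

```python
ROOM_LENGTH = 6.0   # metres, along the axis the scan line sweeps (assumed)
SCAN_SPEED = 0.8    # metres per second (assumed)
LINE_WIDTH = 0.2    # projected thickness of the scan line, in metres (assumed)

def scan_line_position(elapsed_seconds):
    """Current position of the sweeping line, wrapping back to the start wall."""
    return (elapsed_seconds * SCAN_SPEED) % ROOM_LENGTH

def tag_hits(participants, elapsed_seconds):
    """Return the tracked participants the scan line is currently passing over."""
    line_x = scan_line_position(elapsed_seconds)
    return [p for p in participants if abs(p[0] - line_x) < LINE_WIDTH / 2]

if __name__ == "__main__":
    # Hypothetical (x, y) floor positions reported by the depth camera.
    participants = [(1.0, 2.0), (3.2, 0.5), (4.8, 3.1)]
    for t in (0.0, 4.0, 6.0):
        print(f"t = {t:.0f} s -> tagged: {tag_hits(participants, t)}")
```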
Human Synth: The goal is to create music by moving around the interactive room. This is achieved by mapping the 2D space of the room to sound, with frequency on one axis and pitch on the other.
Single-Player Gameplay: The lone participant creates music by walking, running and jumping to different locations within the room.
Multi-Player Gameplay: Each new participant adds an additional instrument and effect at an unknown location in the room. Music generation is thus not only crowd-sourced, but also involves moving to various locations to discover what sound each one produces.
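The position-to-sound mapping for Human Synth could look roughly like the sketch below. It quantises one floor axis to a pitch and, as an illustrative stand-in for the second mapped parameter, treats the other axis as loudness; the scale, room dimensions and function names are assumptions, not part of the actual implementation.

```python
ROOM_WIDTH, ROOM_DEPTH = 6.0, 4.0          # metres (assumed)
PENTATONIC = [60, 62, 64, 67, 69, 72, 74]  # MIDI notes, C major pentatonic (assumed scale)

def position_to_sound(x, y):
    """Map a tracked (x, y) floor position to (midi_note, loudness 0..1)."""
    column = min(int(x / ROOM_WIDTH * len(PENTATONIC)), len(PENTATONIC) - 1)
    loudness = max(0.0, min(1.0, y / ROOM_DEPTH))
    return PENTATONIC[column], loudness

if __name__ == "__main__":
    for pos in [(0.5, 0.5), (3.0, 2.0), (5.9, 3.9)]:
        note, vol = position_to_sound(*pos)
        print(f"position {pos} -> MIDI note {note}, loudness {vol:.2f}")
```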
Project Scope & Rough Timeline
I will try to adhere strictly to the suggested timeline for final projects. The first two weeks will be devoted to brainstorming and research. Development will begin in the third week, leading into debugging and testing from week 5. The last week (week 8) will be devoted solely to documentation (video and otherwise).
Deliverables
- LEDs enclosed within laser-cut cubes that can be activated remotely.
- Battle-tested gameplay mechanics
- Fully functioning software that enables the Kinect to sense participants and project the appropriate game information.
- Schematics, Documentation
- System Demo