Monday, January 6, 2014

Ding: Final Video "EnSing"

EnSing
EnSing comes from the idea of building a context-sensitive, PC-independent music player for people's home environment. EnSing is made up of two parts: a central hub, the BODY, and detachable magnetic blocks, the CAPs. Each CAP can sense, or be assigned to, a parameter of the context or the user, for example light intensity or the user's mood. A CAP has to be attached to the BODY to work. When the user taps the head of the BODY, or after a song finishes playing, the BODY updates the values of the CAPs directly or implicitly.





Key Technology:

1. Web music search and playback via openFrameworks
2. Music playback and control on the Raspberry Pi
3. Arduino control of sensors
4. Communication between the Raspberry Pi and Arduino (sketched below)
5. Detachable physical structure design
6. Laser cutting and 3D printing
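
As an illustration of item 4, here is a minimal sketch of how the Raspberry Pi's openFrameworks app might poll the Arduino over USB serial; the device path, baud rate and one-byte protocol are assumptions for illustration, not the project's actual code.

```cpp
#include "ofMain.h"

// Hedged sketch of the Pi <-> Arduino link: the openFrameworks app requests a
// reading with a single byte and the Arduino replies with one sensor byte.
// Device path, baud rate and protocol are illustrative assumptions.
class ofApp : public ofBaseApp {
public:
    ofSerial arduino;
    int lightLevel = 0;   // e.g. the value reported by a light-intensity CAP

    void setup() {
        arduino.setup("/dev/ttyACM0", 9600);
    }

    void update() {
        arduino.writeByte('r');               // ask the Arduino for a reading
        while (arduino.available() > 0) {
            lightLevel = arduino.readByte();  // 0-255 value scaled on the Arduino
        }
    }
};

int main() {
    ofSetupOpenGL(200, 200, OF_WINDOW);
    ofRunApp(new ofApp());
}
```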


Acknowledgements

Thanks to Dale Clifford, Zack Jacobson-Weaver, Madeline Gannon, Ali Momeni and all CoDeLabbers for their advice and help.

Monday, December 16, 2013

Whisker V4 to V1: Amy

Whisker 1:

 materials:
from a lamp
-copper wire
-spring
-screws
-weight
-plastic
The main property of the first version of my whisker is its magnetic attraction, as a response to its surroundings. In essence, the whisker acts as an antenna that sends out electromagnetic signals, allowing an animal to respond to a predator that is nearby.
 

Whisker 2:


 materials:
-magnets
-acrylic
-metal I-hook
-wood rod

V2 takes the electromagnetic response further, allowing the head to move out and in based on magnetic attraction. This begins to incorporate a defense mode: the claw would be moved manually to attack the predator and try to catch their hand as they approached the stationary whisker. The change from V1 to V2 is that the object is scaled up and the magnetic attraction is retained, but the whisker still lacks a greater overall interaction for users. There is also no clear definition of the whisker's overall goals and how it wants to react.

Whisker 3:



 materials:
-cardboard
-acrylic
-servo motor
-magnets
-wood rod
-Arduino
-elastic band

Whisker V3 integrates a servo motor into the claw part of the whisker. When a magnetic force nears the whisker, it tries to push itself away through opposing magnetic forces. If someone gets too close, the claw spins based on simple servo code (no sensors used), and the claw can be shut and opened by manually pulling and pushing a rubber band. This modification continues to focus on self-defense as you approach the whisker. It begins to resemble a toy and allows the user to interact with it further.
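
A minimal sketch of what that sensorless servo code might look like; the pin number and sweep speed are assumptions, not taken from the project.

```cpp
// Hypothetical reconstruction of the "simple servo coding" for the V3 claw:
// the servo simply sweeps back and forth continuously, with no sensor input.
#include <Servo.h>

Servo claw;              // servo that spins the claw
const int servoPin = 9;  // assumed wiring

void setup() {
  claw.attach(servoPin);
}

void loop() {
  // sweep from 0 to 180 degrees and back
  for (int angle = 0; angle <= 180; angle += 5) {
    claw.write(angle);
    delay(20);
  }
  for (int angle = 180; angle >= 0; angle -= 5) {
    claw.write(angle);
    delay(20);
  }
}
```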

Whisker 4: 


Whisker V.4 from Amy Friedman on Vimeo.








 materials:
-Fishing Wire
-Metal Wire
-Wood Rod
-Acrylic
-Servo Motor
-LED Light
-DC Motor
-Buzzer 
-PIR Sensor
-Arduino

This final whisker prototype adds a greater interactive element, as V4 reacts to motion. All of its electronic features give the whisker a personality that was lacking before. I learned that in order for something to be truly interactive, there needs to be a strong sense of curiosity: one may know what it is made of, but the user still wants to explore what it can do over time. When V4 is activated, it first lights up its LED, signifying that it is awake. Then he turns his head around, surveying his surroundings and figuring out what woke him up. After finishing his surveillance, he lets out a high-pitched alarm, his scream of terror, and tries to get away but inevitably can't move. There is something different in the way we interact with a bow and arrow or a clamp versus a "cute" toy, as some may declare this to be, and it was a great lesson to learn about both through my prototyping.
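
A rough sketch of how that wake-up sequence could be wired up on the Arduino; pin assignments and timings are my assumptions, since the original code was not posted.

```cpp
// Hypothetical wiring of the V4 wake-up sequence: PIR motion -> LED on,
// servo head turn, buzzer alarm, brief DC motor pulse. Pins are assumed.
#include <Servo.h>

const int pirPin = 2;     // PIR motion sensor
const int ledPin = 13;    // "awake" LED
const int buzzerPin = 8;  // piezo buzzer
const int motorPin = 5;   // DC motor via transistor (PWM)
Servo head;

void setup() {
  pinMode(pirPin, INPUT);
  pinMode(ledPin, OUTPUT);
  pinMode(buzzerPin, OUTPUT);
  pinMode(motorPin, OUTPUT);
  head.attach(9);
}

void loop() {
  if (digitalRead(pirPin) == HIGH) {        // motion detected
    digitalWrite(ledPin, HIGH);             // wake up
    head.write(170); delay(600);            // look around
    head.write(10);  delay(600);
    head.write(90);
    tone(buzzerPin, 2000, 500);             // scream of terror
    analogWrite(motorPin, 200); delay(400); // try to get away
    analogWrite(motorPin, 0);
    digitalWrite(ledPin, LOW);
  }
}
```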



Sunday, December 15, 2013

Final project description: "Heart Beat"--- Meng and Yang

Statement:
In many cultures the heart beat carries a meaning of emotion and feeling. In the "Heart Beat" project, we try to build a device that can visualize people's heart beats and create a magical feeling. In developing this project, we made three prototypes. The first prototype is a heart-beat box that can sense and visualize our heart beat when we put our fingers on the sensor. Red LEDs within the heart-shaped box flash according to the beat it senses. In the second prototype we projected ripples on the wall; the ripples spread out according to our heart beats. In the final prototype we built a mobile projector system that receives signals from a wireless heart-beat-sensor box. A holder carries the projector around and can pass the sensor to other people. The projector projects an animation of the heart beat it senses onto the floor. We hope the device can augment communication between people and engage more people in conversation by visualizing their own heart beats.

Video:

HeartBeat from Benny Zhang on Vimeo.


Tech description:
A pair consisting of an IR LED and an IR receiver is used for the heart-beat sensor. The LED is connected to 5 V power and emits light at a stable intensity. The IR light is reflected by the finger, so the light reaching the receiver changes with the optical density of the finger. Every time the heart beats, blood flushes into the finger and changes that density, so with a simple detection algorithm the heart beat can be picked out. The signal is transmitted over a wireless module to an Arduino connected to a laptop. A Processing program receives the signal over serial and draws the animations.
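
A minimal sketch of the kind of threshold-based beat detection described above; the pin, threshold and timing values are illustrative assumptions rather than the project's actual code.

```cpp
// Illustrative beat detection on the IR receiver's analog signal.
// A beat is reported when the signal rises above a threshold after having
// dropped below it (simple peak detection with a refractory period).
const int sensorPin = A0;    // IR receiver (assumed wiring)
const int threshold = 550;   // tune for the sensor and finger
bool aboveThreshold = false;
unsigned long lastBeat = 0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int value = analogRead(sensorPin);
  unsigned long now = millis();
  if (!aboveThreshold && value > threshold && now - lastBeat > 300) {
    aboveThreshold = true;
    lastBeat = now;
    Serial.println("BEAT");   // forwarded over serial (or a wireless module)
  } else if (aboveThreshold && value < threshold) {
    aboveThreshold = false;
  }
  delay(10);
}
```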

Acknowledgements
Dale Clifford
Zack Jacobson-Weaver
Ali Momeni
Golan Levin
Aisling Kelliher
P. Zach Ali
Thank you, and thanks to all our classmates. We learned a lot from this course because of your enlightening ideas.

Saturday, December 14, 2013

Final Project Description


Statement:
NuPhase is an experimental interface that aims to provoke an understanding of the relationship between slight temperature variations and the phenomenon of phase change. Through play and visualization, it hopes to bring to light more creative uses of this material.

Short tech description:
Processing + Kinect (video function and edge detection)

Acknowledgements:
Dale Clifford
Zack Jacobson-Weaver
Aisling Kelliher
Madeline Gannon
CMU codelab

Thursday, December 12, 2013

Row House Cinema Final Prototype




The final prototype delivered for this class used two of the custom LED driver boards with 30 small white LEDs. The boards were connected to a Raspberry Pi running a simple openFrameworks application. The application uses an early version of a library I'm working on for the PCA9685 PWM chip, which is at the center of the LED driver boards.
The video above shows the prototype running through a generative program based on 2D Perlin noise. I'll be using this prototype to dial in the correct noise/video ratio for the final installation in the marquee.
The openFrameworks code, as well as the in-progress PCA9685 C++/Raspberry Pi library, can be found HERE.
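
As a rough illustration of the generative half, 2D Perlin noise in openFrameworks can be sampled per channel each frame and written out as PWM duty cycles. The driver class below is a stand-in for whatever the in-progress PCA9685 library actually exposes, not its real API.

```cpp
#include "ofMain.h"

// Stand-in for the in-progress PCA9685 library; only the call shape is
// illustrated, the real API may differ.
class Pca9685Stub {
public:
    void setDutyCycle(int channel, float duty) {
        // would write a 12-bit PWM value over i2c; logged here for illustration
        ofLogVerbose() << "ch " << channel << " -> " << duty;
    }
};

class ofApp : public ofBaseApp {
public:
    Pca9685Stub pca;

    void update() {
        float t = ofGetElapsedTimef();
        for (int i = 0; i < 30; i++) {
            // sample 2D Perlin noise: LED index on one axis, time on the other
            float brightness = ofNoise(i * 0.15f, t * 0.5f);  // 0.0 - 1.0
            pca.setDutyCycle(i, brightness);
        }
    }
};

int main() {
    ofSetupOpenGL(200, 200, OF_WINDOW);
    ofRunApp(new ofApp());
}
```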

Major thanks to Dale Clifford, Zack Jacobson-Weaver, Eric Brockmeyer, Ali Momeni and Tom (the person who posted the RaspberryPi i2c code I needed on stackoverflow).



Monday, December 9, 2013

Memory Shells - Final Description


Short Description

Memory Shells is an exploration of how we distort our memories of previous events and of the value of this distortion in creating a narrative. The audience listens to audio of memorable and nostalgic events taken from pop culture, news and speeches. The audio is slowly modified by their facial expressions to create a narrative that attempts to approximate collective memory. These expressions model the personal biases that are introduced when sharing a memory.

Tech Description 

The system consists of a central box and a ‘seashell’. A Raspberry Pi connected to a webcam tracks the user’s face and sends frames to a local server (MJPG-streamer). The server determines the user’s facial expression with openFrameworks and computes a set of probability values to select the next track. To include the user’s ‘emotional bias’ in the track, the track’s volume is changed and noise is added using SoX. The server uses WebSockets to send a command back to the rPi to play the next segment with the added modifications.
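
A small sketch of the "probability values to select the next track" step, assuming the server keeps one weight per track; the track names and weights are illustrative, not from the project.

```cpp
// Weighted random selection of the next audio segment.
// Track names and weights are illustrative placeholders.
#include <iostream>
#include <random>
#include <string>
#include <vector>

int main() {
    std::vector<std::string> tracks = {"moon_landing.wav", "berlin_wall.wav", "royal_wedding.wav"};
    // probability weights, e.g. derived from the detected facial expression
    std::vector<double> weights = {0.5, 0.3, 0.2};

    std::random_device rd;
    std::mt19937 gen(rd());
    std::discrete_distribution<> pick(weights.begin(), weights.end());

    std::string next = tracks[pick(gen)];
    std::cout << "next segment: " << next << "\n";
    return 0;
}
```

The volume change itself could then be applied with a SoX call such as `sox in.wav out.wav vol 0.8` before the segment is handed back to the Pi, though the exact commands used in the project are not documented here.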

Acknowledgements

I would like to thank Dale Clifford and Zack Jacobson-Weaver for their feedback and guidance, and Marc Farra, Liang He and Meng Shi for their support and technical help.

"Ba.S.I.S." Final Project: Amy



Ba.S.I.S. - Volleyball Skill Identification System




 Statement:
Ba.S.I.S. is a volleyball skill identification system. The project itself is a glove worn on the right hand that allows anyone to play volleyball and learn how to accurately execute each skill. Through the analog output of three sensors, the system indicates to the user whether they BUMP, SET or SPIKE (a rough sketch of this classification follows the tech description below).



Short tech description:
FSR Sensor
Flex Sensor
Softpot Position Sensor
Arduino (analog output)
Processing (visualization of output)
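
A hedged Arduino sketch of how the three sensors might be read and turned into BUMP/SET/SPIKE labels; the pin wiring and thresholds are assumptions, not the project's calibrated values.

```cpp
// Illustrative classification of BUMP / SET / SPIKE from the three glove sensors.
const int fsrPin = A0;      // force on the palm (spike impact)
const int flexPin = A1;     // finger bend (set shape)
const int softpotPin = A2;  // contact position along the forearm strip (bump)

void setup() {
  Serial.begin(9600);
}

void loop() {
  int force = analogRead(fsrPin);
  int bend = analogRead(flexPin);
  int position = analogRead(softpotPin);

  if (force > 700) {
    Serial.println("SPIKE");   // hard palm impact
  } else if (bend > 600) {
    Serial.println("SET");     // fingers flexed around the ball
  } else if (position > 100) {
    Serial.println("BUMP");    // contact along the forearm strip
  }
  delay(100);  // a Processing sketch reads these labels over serial to visualize them
}
```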

 Acknowledgements:
Zack Jacobson-Weaver
Dale Clifford
Mauricio Contreras

Friday, December 6, 2013

Final Project: Note Cubes-----Wanfang Diao


Note Cubes from Wanfang Diao on Vimeo.

Name of the piece: Note Cubes
50 word statement:
Note Cubes is a set of tangible cubes designed for non-musicians to explore sound, notes and rhythm. By putting them in a line or stacking them (just like playing with toy bricks), you can let cubes trigger their neighbor cubes (left, right, up and down) to play notes.
Short tech description:
Each cube contains a microcontroller (Trinket), photosensors, a speaker and LEDs. Under the control of the microcontroller, each speaker plays a note when the cube's photosensor is triggered by the LED light of a neighboring cube.
The cubes' shells are made of laser-cut hardboard.
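
A sketch of the chain-trigger idea, written as a generic Arduino sketch: when the photosensor sees the neighbor's LED flash, play a note and flash this cube's own LED so the next cube fires in turn. Pins, threshold and note are assumptions, and the actual Trinket build may need a different tone routine.

```cpp
// Chain-trigger sketch: neighbor's LED -> photosensor -> play note -> flash own LED.
const int lightSensorPin = A1;  // photosensor facing the neighboring cube
const int speakerPin = 0;       // piezo speaker
const int ledPin = 1;           // LED facing the next cube
const int threshold = 600;      // brightness that counts as "triggered"

void setup() {
  pinMode(speakerPin, OUTPUT);
  pinMode(ledPin, OUTPUT);
}

void loop() {
  if (analogRead(lightSensorPin) > threshold) {
    digitalWrite(ledPin, HIGH);   // pass the trigger on to the neighbor
    tone(speakerPin, 440, 300);   // play this cube's note (A4)
    delay(300);
    digitalWrite(ledPin, LOW);
    delay(500);                   // avoid immediately re-triggering
  }
}
```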
Acknowledgements
Thanks to Dale Clifford, Zack Jacobson-Weaver, Ali Momeni, Madeline Gannon and my other friends in CoDeLab for their help!

Monday, December 2, 2013

Final project description: "Building towards Robotic Sculpting" - Mauricio Contreras

Building towards Robotic Sculpting

I'm exploring gestural control with haptic feedback for driving an industrial robotic arm, to enable human-driven sculpting. In the final showing I'll demo the different techniques I've been using and applying to the robotic arm, in a smaller format, controlling a desktop drilling tool.

Tech description:

I'm using an Android smartphone as the controller; its orientation drives the z-axis motion of a small drill press (a Dremel on a stand, rather). The phone is also a haptic feedback device in the sense that, when the head of the tool draws near to the material being drilled, the smartphone vibrates. The smartphone is programmed in Java and writes over a TCP/IP socket (wirelessly) to a PC, which in turn talks serial to an Arduino driving the servo motor that moves the tool along the vertical axis. The distance sensor reports to another microcontroller and then to the same software running on the PC, which writes the vibration signal back to the socket, closing the feedback loop.
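
A minimal sketch of the Arduino end of that chain, assuming the PC forwards the phone's tilt as a single 0–180 value per line over serial; the actual protocol and pins are not described, so these are illustrative assumptions.

```cpp
// Arduino end of the chain: read a tilt value forwarded by the PC over serial
// and move the servo that drives the tool along the vertical axis.
#include <Servo.h>

Servo zAxis;

void setup() {
  Serial.begin(115200);
  zAxis.attach(9);   // assumed servo pin
}

void loop() {
  if (Serial.available() > 0) {
    int angle = Serial.parseInt();     // next integer sent by the PC (0-180)
    if (angle >= 0 && angle <= 180) {
      zAxis.write(angle);              // move the tool head
    }
    while (Serial.available() && Serial.read() != '\n') {}  // discard rest of line
  }
}
```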

Acknowledgements

Dale Clifford
Zack Jacobson-Weaver
Madeline Gannon
Ali Momeni
Mike Jeffers
Golan Levin
Garth Zeglin
and CMU’s Manipulation Lab

Without your valuable support and honest critique, this project would be nothing.

Video (v1)



"Project Ganache" Final Installation: Gaby

Project Ganache 

Ganache is a fun way to decorate cakes using spiraling motions. It combines mechanics, electricity, and math, allowing for fun experimentation with icing as well as a taste of the world of a wedding cake maker.

Technology:

1. Arduino Uno
2. Potentiometer
3. DC motors
4. Laser cutting
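
A small sketch of how the potentiometer might set the turntable motor's speed; the pins, and the assumption that the DC motor is driven by PWM through a transistor or motor driver, are mine for illustration.

```cpp
// Map the potentiometer reading to the DC motor's speed (PWM output).
const int potPin = A0;     // rotation-speed knob
const int motorPin = 9;    // PWM output to the motor driver

void setup() {
  pinMode(motorPin, OUTPUT);
}

void loop() {
  int knob = analogRead(potPin);           // 0-1023
  int speed = map(knob, 0, 1023, 0, 255);  // 0-255 PWM duty
  analogWrite(motorPin, speed);
  delay(20);
}
```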

Acknowledgements

Thank you to everybody who helped in the brainstorming and definition of this project: Dale Clifford, Zack Jacobson-Weaver, Chin Wei, Nissa Nishiyama, Ayobami Olubeko, Amy Friedman, Austin McCasland, and Andre Le. Special thanks to Jake Marsico, Mauricio Contreras and again Dale and Zack for patient guidance and help with the electronics and mechanics. Thanks in addition to the Mischer'Traxler "'till you stop" automatic cake decorator project for the inspiration.

Flitter Bloom



Inspired by Disney’s Botanicus Interacticus project, our project aims to explore how plants might react to human presence and touch, and also how people would react to plants that respond to them.


Technology



Input:
Arduino Uno with capacitance sensing circuit 




Output:
2 x 5 volt micro servos attached to strings controlling the plant
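
A minimal sketch of how the capacitance reading could drive the servos, using the common CapacitiveSensor Arduino library; the pin wiring, threshold and servo angles are assumptions rather than the project's values.

```cpp
// Capacitive touch on the plant drives two micro servos that pull the strings.
#include <CapacitiveSensor.h>
#include <Servo.h>

CapacitiveSensor plantSensor(4, 2);  // send pin 4, receive pin 2 (wired to the plant)
Servo leftString, rightString;

void setup() {
  leftString.attach(9);
  rightString.attach(10);
}

void loop() {
  long reading = plantSensor.capacitiveSensor(30);  // 30 samples per reading
  if (reading > 1000) {        // touch or close presence detected
    leftString.write(120);     // pull the strings: the plant recoils
    rightString.write(60);
  } else {
    leftString.write(90);      // relaxed position
    rightString.write(90);
  }
  delay(50);
}
```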


Acknowledgements

  • Dale Clifford
  • Zack Jacobson-Weaver
  • Andre Le
  • Austin McCasland
  • Ayo Olubeko
  • Gaby Moreno
  • Ali Momeni


Final Installation "Inkative": Liang

Inkative 

Inkative aims to explore fish's reactions to human behaviour through the medium of ink. It is a tangible platform that records and converts the interactive moments between fish and people into ink strokes on paper. The physical ink strokes interpret how people's behaviour affects the fish's activity.

Key Technology:

1. Computer Vision (OpenCV) in openFrameworks
2. Proximity sensor
3. Stepper motor + servo
4. Arduino
5. Laser cutting and 3D printing
6. CNC-like mechanism
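
As a rough sketch of the computer-vision piece, ofxOpenCv's background subtraction and contour finder are a common way to track a fish against a fixed background; the camera size, threshold and blob areas below are assumptions, not the project's values.

```cpp
#include "ofMain.h"
#include "ofxOpenCv.h"

// Sketch of tracking the fish with background subtraction + contour finding.
class ofApp : public ofBaseApp {
public:
    ofVideoGrabber cam;
    ofxCvColorImage color;
    ofxCvGrayscaleImage gray, background, diff;
    ofxCvContourFinder contours;

    void setup() {
        cam.setup(320, 240);
        color.allocate(320, 240);
        gray.allocate(320, 240);
        background.allocate(320, 240);
        diff.allocate(320, 240);
    }

    void update() {
        cam.update();
        if (!cam.isFrameNew()) return;
        color.setFromPixels(cam.getPixels());
        gray = color;
        diff.absDiff(background, gray);      // difference from the empty-tank frame
        diff.threshold(40);
        contours.findContours(diff, 50, 5000, 1, false);  // largest moving blob
        if (contours.nBlobs > 0) {
            ofPoint fishPos = contours.blobs[0].centroid;
            // fishPos would be forwarded to the Arduino/CNC mechanism here
        }
    }

    void keyPressed(int key) {
        if (key == ' ') background = gray;   // capture background while the tank is still
    }
};

int main() {
    ofSetupOpenGL(320, 240, OF_WINDOW);
    ofRunApp(new ofApp());
}
```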

Acknowledgements

Thanks to Dale Clifford and Madeline Gannon for their help and suggestions. Special thanks to Zack Jacobson-Weaver for his brilliant advice on CNC construction and design. Thanks to Huaishu for the work "Piccolo". In addition, thanks to everyone in CoDe Lab for working together and playing together.

Wednesday, November 20, 2013

Looking Out

Resinance is inspirational because it creates a system of self-responsive entities. The objects interact with the environment as well as among themselves.


Resinance from materiability on Vimeo.

Sunday, November 17, 2013

Looking Out

Through visual illusions, the "Box" demo creates an intensely magical experience.
Project by GMUNK.


Box from Bot & Dolly on Vimeo.