Monday, December 16, 2013

Whisker V4 to V1: Amy

Whisker 1:

 materials:
from a lamp
-copper wire
-spring
-screws
-weight
-plastic
The main property of the first version of my whisker is its magnetic attraction in response to its surroundings. In essence, the whisker acts as an antenna, sending out electromagnetic signals that let an animal respond to a nearby predator.
 

Whisker 2:


 materials:
-magnets
-acrylic
-metal I-hook
-wood rod

V2 takes the electromagnetic response further, allowing the head to move in and out based on magnetic attraction. This begins to incorporate a defense mode: the whisker could be moved manually to attack a predator and try to catch their hand as they approached the stationary whisker. The change from V1 to V2 is that the object is scaled up and the magnetic attraction is retained, but the whisker still lacks a greater overall interaction for users. There is also a lack of definition of the whisker's overall goals and how it should react.

Whisker 3:



 materials:
-cardboard
-acrylic
-servo motor
-magnets
-wood rod
-Arduino
-elastic band

Whisker V3 has modifications that integrate a servo motor into the claw part of the whisker. When a magnetic force nears the whisker, it tries to push itself away through opposing magnetic forces. If someone gets too close, the claw spins based on simple servo code (no sensors used), and the claw can be opened and closed by manually pulling and pushing a rubber band. This modification continues to focus on self-defense as you approach the whisker. It begins to resemble a toy and lets the user interact with it further.

Whisker 4: 


Whisker V.4 from Amy Friedman on Vimeo.








 materials:
-Fishing Wire
-Metal Wire
-Wood Rod
-Acrylic
-Servo Motor
-LED Light
-DC Motor
-Buzzer
-PIR Sensor
-Arduino

This final whisker prototype adds a greater interactive element, as V4 reacts to motion. With all of its electronic features, the whisker gains a personality it lacked before. I learned that in order for something to be truly interactive, it needs to inspire a strong curiosity: one may know what it is made of, but the user still wants to explore what it can do over time. When V4 is activated, it first lights up its LED, signifying that it is awake. Then he turns his head around, surveying his surroundings to understand what woke him up. After finishing his surveillance, he lets out a high-pitched alarm, his scream of terror, and tries to get away but inevitably can't move. There is something different in the way we interact with a bow & arrow or clamp vs. a "cute" toy, as some may declare this to be, and it was a great lesson to learn about both through my prototyping.
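V4's wake-up routine can be sketched as a short, ordered sequence of actions. This is a hypothetical illustration, not the actual Arduino sketch: the function and action names are made up for clarity.

```cpp
#include <string>
#include <vector>

// Hypothetical model of V4's wake-up routine as an ordered action list.
// In the real prototype the PIR sensor would start this sequence.
std::vector<std::string> whiskerWakeSequence(bool motionDetected) {
    std::vector<std::string> actions;
    if (!motionDetected) return actions;     // no motion: stay asleep
    actions.push_back("LED_ON");             // signal "awake"
    actions.push_back("SERVO_SCAN");         // turn head, survey surroundings
    actions.push_back("BUZZER_ALARM");       // high-pitched scream of terror
    actions.push_back("DC_MOTOR_STRUGGLE");  // try (and fail) to get away
    return actions;
}
```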



Sunday, December 15, 2013

Final project description: "Heart Beat"--- Meng and Yang

Statement:
In many cultures the heartbeat carries a meaning of emotions and feelings. In the "Heart Beat" project, we try to build a device that can visualize people's heartbeats and create a magical feeling. While developing this project, we made three prototypes. The first prototype is a heart-beat box that can sense and visualize a heartbeat when we put our fingers on the sensor. Red LEDs within the heart-shaped box flash according to the beat it senses. In the second prototype we projected ripples on the wall; the ripples spread out according to our heartbeats. In the final prototype we built a mobile projector system that receives a signal from a wireless heart-beat-sensor box. A holder carries the projector around and can pass the sensor to other people, and the projector projects an animation of the heartbeat it senses on the floor. We hope the device can augment communication between people and engage more people in conversation by visualizing their own heartbeats.

Video:

HeartBeat from Benny Zhang on Vimeo.


Tech description:
A pair consisting of an IR LED and an IR receiver is used as the heart-beat sensor. The LED is connected to 5 V power and emits light at a stable intensity. The IR light is reflected by the finger, so the amount of light the receiver picks up changes with the optical density of the finger. Every time the heart beats, blood flushes into the finger and changes that density; with a simple algorithm, the heartbeat can be detected from this repeating signal. The signal is then transmitted through a wireless module to an Arduino connected to a laptop. A Processing program reads the signal from the serial port and draws the animations.
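The detection step can be sketched as a simple threshold-crossing counter on the sampled reflectance signal. This is an illustrative stand-in; the project's actual algorithm and threshold values may differ.

```cpp
#include <vector>

// Count heartbeats in a sampled IR-reflectance signal by counting
// upward crossings of a threshold (illustrative stand-in for the
// project's detection algorithm).
int countBeats(const std::vector<double>& signal, double threshold) {
    int beats = 0;
    bool above = false;
    for (double s : signal) {
        if (!above && s > threshold) { beats++; above = true; }
        else if (s < threshold)      { above = false; }
    }
    return beats;
}
```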

Acknowledgements
Dale Clifford
Zack Jacobson-Weaver
Ali Momeni
Golan Levin
Aisling Kelliher
P. Zach Ali
Thank you and all our classmates. We learn a lot from this course because of your enlightening ideas.

Saturday, December 14, 2013

Final Project Description


Statement:
NuPhase is an experimental interface that aims to provoke an understanding of the relationship between slight temperature variations and phase-change phenomena. Through play and visualization, it hopes to bring to light more creative uses of this material.

Short tech description:
Processing + Kinect (video function and edge detection)

Acknowledgements:
Dale Clifford
Zack Jacobson-Weaver
Aisling Kelliher
Madeline Gannon
CMU codelab

Thursday, December 12, 2013

Row House Cinema Final Prototype




The final prototype delivered for this class used two of the custom LED driver boards with 30 small, white LEDs. The boards were connected to the RaspberryPi running a simple openFrameworks application. The application uses an early version of a library I'm working on for the PCA9685 PWM chip, which is at the center of the LED driver boards.
The video above shows the prototype running through a generative program based on 2d Perlin Noise. I'll be using this prototype to dial in the correct noise/video ratio for the final installation in the marquee.
The openFrameworks code, as well as the in-progress PCA9685 C++/raspberryPi library can be found HERE.
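The idea of driving the LEDs from 2D noise can be sketched as follows. This uses a toy value-noise function as a stand-in for the Perlin noise in the actual openFrameworks application, mapped to the PCA9685's 12-bit PWM range (0 to 4095).

```cpp
#include <cmath>

// Toy 2-D value noise (integer hash + bilinear interpolation), a
// stand-in for Perlin noise. Returns a value in [0, 1].
static double hash2(int x, int y) {
    unsigned int n = (unsigned int)x * 374761393u + (unsigned int)y * 668265263u;
    n = (n ^ (n >> 13)) * 1274126177u;
    return (n & 0x7fffffffu) / double(0x7fffffffu);
}

double valueNoise2D(double x, double y) {
    int xi = (int)std::floor(x), yi = (int)std::floor(y);
    double tx = x - xi, ty = y - yi;
    double a = hash2(xi, yi),     b = hash2(xi + 1, yi);
    double c = hash2(xi, yi + 1), d = hash2(xi + 1, yi + 1);
    double top = a + (b - a) * tx, bot = c + (d - c) * tx;
    return top + (bot - top) * ty;
}

// Map a noise sample in [0, 1] to a PCA9685 duty value (12-bit).
int noiseToPwm(double n) { return (int)(n * 4095); }
```

Sampling the noise at (channel index, time) per LED and writing the result to the chip's PWM registers gives the slow generative flicker seen in the video.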

Major thanks to Dale Clifford, Zack Jacobson-Weaver, Eric Brockmeyer, Ali Momeni and Tom (the person who posted the RaspberryPi i2c code I needed on stackoverflow).



Monday, December 9, 2013

Memory Shells - Final Description


Short Description

Memory Shells is an exploration of how we distort our memory of previous events and the value of this distortion in creating a narrative. The audience listens to audio of memorable and nostalgic events taken from pop culture, news, and speeches. The audio is slowly modified by their facial expressions to create a narrative that attempts to approximate collective memory. These expressions model the personal biases that are introduced when sharing a memory.

Tech Description 

The system consists of a central box and a ‘seashell’. A Raspberry Pi connected to a webcam tracks the user’s face and sends frames to a local server (MJPGStreamer). The server determines the facial expression of the user with openFrameworks and derives a set of probability values to select the next track. To include the ‘emotional bias’ of the user in the track, the volume of the track changes and noise is added using SoX. The server uses Websockets to send a command back to the rPi to play the next segment with the added modifications.

Acknowledgements

I would like to thank Dale Clifford and Zack Jacobson-Weaver for their feedback and guidance, and Marc Farra, Liang He and Meng Shi for their support and technical help.

"Ba.S.I.S." Final Project: Amy



Ba.S.I.S. - Volleyball Skill Identification System




 Statement:
Ba.S.I.S. is a volleyball skill identification system. The project itself is a glove worn on the right hand that allows anyone to play volleyball and learn how to accurately execute each skill. Through the analog output of 3 sensors, the information indicates to the user whether they BUMP, SET, or SPIKE.



Short tech description:
FSR Sensor
Flex Sensor
Softpot Position Sensor
Arduino (analog output)
Processing (visualization of output)
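One plausible way the three analog channels could map to the three skills is a simple threshold classifier, sketched below. The thresholds and decision order are hypothetical, not the glove's calibrated values.

```cpp
#include <string>

// Hypothetical threshold classifier over the glove's three sensors:
// fsr (impact force), flex (finger bend), pos (softpot contact position).
// Thresholds are illustrative, not calibrated values from the project.
std::string classifySkill(int fsr, int flex, int pos) {
    if (fsr > 800 && flex < 200) return "SPIKE"; // hard, open-hand hit
    if (flex > 600)              return "SET";   // fingers curled on the ball
    if (pos > 0)                 return "BUMP";  // contact along the position strip
    return "NONE";
}
```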

 Acknowledgements:
Zack Jacobson-Weaver
Dale Clifford
Mauricio Contreras

Friday, December 6, 2013

Final Project: Note Cubes-----Wanfang Diao


Note Cubes from Wanfang Diao on Vimeo.

Name of the piece: Note Cubes
50 word statement:
The Note Cubes are a set of tangible cubes designed for non-musicians to explore sound, notes, and rhythm. By putting them in a line or stacking them (just like playing with toy bricks), you can let cubes trigger their "neighbor cubes" (left, right, up, or down) to play notes.
Short tech description:
There is a microcontroller (Trinket), photosensors, a speaker, and LEDs in each cube. Each cube plays a note when its photosensors are triggered by the LED light of another cube, under the control of the microcontroller.
The cubes' shells are laser-cut from hardboard.
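The trigger chain can be simulated as light propagating from cube to cube, one cube per "tick". This toy model only propagates in one direction for simplicity; the real cubes can trigger neighbors left, right, up, and down.

```cpp
#include <vector>

// Toy simulation of the trigger chain: cube i lights its LED, the
// photosensor of cube i+1 sees it one tick later, and so on.
// Returns the tick at which each cube plays its note (-1 = never).
std::vector<int> triggerTimes(int numCubes, int startCube) {
    std::vector<int> ticks(numCubes, -1);
    if (startCube < 0 || startCube >= numCubes) return ticks;
    for (int i = startCube; i < numCubes; ++i)
        ticks[i] = i - startCube;  // light propagates rightward
    return ticks;
}
```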
Acknowledgements
Thanks for the help of Dale Clifford, Zack Jacobson-Weaver, Ali Momeni, Madeline Gannon and my other friends in CoDelab!

Monday, December 2, 2013

Final project description: "Building towards Robotic Sculpting" - Mauricio Contreras

Building towards Robotic Sculpting

I'm exploring gestural control with haptic feedback for driving an industrial robotic arm, to enable human-driven sculpting. In the final showing I'll demo the different techniques I've been using and applying to the robotic arm, in a smaller format, controlling a desktop drilling tool.

Tech description:

I'm using an Android smartphone as the controller; its orientation drives the z-axis motion of a small drill press (a Dremel on a stand, rather). It is also a haptic feedback device in the sense that, when the head of the tool draws near the material being drilled, the smartphone vibrates. The smartphone is programmed in Java, writing to a TCP/IP socket (wirelessly) to a PC, which in turn talks over serial to an Arduino driving the servo motor that moves the tool along the vertical axis. The distance sensor communicates with another microcontroller and then with the same software running on the PC, which writes the vibration signal to the socket, closing the feedback loop.
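The two mappings in this loop, orientation to servo position and proximity to vibration, can be sketched as below. The ranges and the 5 mm threshold are assumptions for illustration, not the project's actual values.

```cpp
#include <algorithm>

// Map phone pitch (-90..90 degrees) to a servo position (0..180)
// driving the tool's vertical axis. Ranges are assumed, not the
// project's calibrated values.
int pitchToServo(double pitchDeg) {
    double clamped = std::max(-90.0, std::min(90.0, pitchDeg));
    return (int)(clamped + 90.0);  // -90 -> 0, 0 -> 90, 90 -> 180
}

// Haptic rule: vibrate once the tool head is within a few millimetres
// of the material (threshold is an assumption).
bool shouldVibrate(double distanceMm, double thresholdMm = 5.0) {
    return distanceMm <= thresholdMm;
}
```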

Acknowledgements

Dale Clifford
Zack Jacobson-Weaver
Madeline Gannon
Ali Momeni
Mike Jeffers
Golan Levin
Garth Zeglin
and CMU’s Manipulation Lab

without your valuable support and honest critique, this project would be nothing.

Video (v1)



"Project Ganache" Final Installation: Gaby

Project Ganache 

Ganache is a fun way to decorate cakes using spiraling motions. It combines mechanics, electricity, and math, allowing for fun experimentation with icing as well as a taste of the world of a wedding cake maker.

Technology:

1. Arduino Uno
2. Potentiometer
3. DC motors
4. Laser cutting

Acknowledgements

Thank you to everybody who helped in the brainstorming and definition of this project: Dale Clifford, Zack Jacobson-Weaver, Chin Wei, Nissa Nishiyama, Ayobami Olubeko, Amy Friedman, Austin McCasland, and Andre Le. Special thanks to Jake Marsico, Mauricio Contreras, and again Dale and Zack for patient guidance and help with the electronics and mechanics. Thanks in addition to the Mischer Traxler "'till you stop" automatic cake decorator project for the inspiration.

Flitter Bloom



Inspired by Disney’s Botanicus Interacticus project, our project aims to explore how plants might react to human presence and touch, and also how people would react to plants that respond to them.


Technology



Input:
Arduino Uno with capacitance sensing circuit 




Output:
2 x 5 volt micro servos attached to strings controlling the plant


Acknowledgements

  • Dale Clifford
  • Zach Jacobson-Weaver
  • Andre Le
  • Austin McCaslan
  • Ayo Olubeko
  • Gaby Moreno
  • Ali Momeni


Final Installation "Inkative": Liang

Inkative 

It aims to explore fish's reactions to human behaviour through the medium of ink. Inkative is a tangible platform that can record and convert the interactive moments between fish and people into ink strokes on paper. The physical ink strokes interpret how people's behaviour affects the fish's activity.

Key Technology:

1. Computer Vision (OpenCV) in openFrameworks
2. Proximity sensor
3. Stepper motor + servo
4. Arduino
5. Laser cutting and 3D printing
6. CNC-like mechanism

Acknowledgements

Thanks to Dale Clifford and Madeline Gannon for their help and suggestions. Special thanks to Zack Jacobson-Weaver for his brilliant advice on CNC construction and design. Thanks to Huaishu for his work "Piccolo". In addition, thanks to everyone in CoDe Lab for working together and playing together.

Wednesday, November 20, 2013

Looking Out

Resinance is inspirational as it creates a system of self-responding entities. The objects interact with the environment as well as among themselves.


Resinance from materiability on Vimeo.

Sunday, November 17, 2013

Looking Out

By creating visual illusions, the "Box" demo delivers an intensely magical experience.
Project by GMUNK.


Box from Bot & Dolly on Vimeo.

Saturday, November 2, 2013

LookingOut 9: Liang

MusicInk (2013)
by Riccardo Vendramin & Gilda Negrini

MusicInk is a project aimed at music teaching. It creates a new way to understand music and interact with musical instruments. It is a toy that provides an innovative approach to music, allowing kids not only to draw musical properties but also to compose a real symphony. The major technology covers conductive ink by Bare Conductive, an MPR121 controller, Arduino, a LiPo shield, a Bluetooth shield by Seeedstudio, Android, Pure Data for Android, and a Pure Data patch. Children draw with the electrically conductive paint and then play their drawings as music.


Monday, October 28, 2013

Looking Out 6: Amy

Nike Fuel Band
http://www.nike.com/us/en_us/c/nikeplus-fuelband

http://insider.nike.com/us/launch/nike-fuelband/

The Nike Fuel Band is an interesting product, as it was developed to track the everyday activity of anyone. The project was geared toward the average person, providing a device that helps prioritize physical activity in everyday life. The FuelBand further allows users to reach out to a network to gain motivation from others. The band can be connected to any smartphone via Bluetooth after downloading the free Nike Fuel app. The fuel settings are customized to your age, weight, and height, and you set your own fuel goal based on what you want to achieve. There are four settings shown on the band itself: Fuel, Calories, Steps, and Time.

This technology makes people aware of how much or how little activity they get on a daily basis, and the device helps build awareness of what they should and can achieve.

Technology:
2 Lithium Polymer Batteries(3.7V)
20 color red/green LED
100 white LED
Stainless Steel
Magnesium
Thermoplastic Elastomers
Polypropylene
Bluetooth Connectivity
Pedometer

http://en.wikipedia.org/wiki/Nike%2B_FuelBand#Specifications

Final Project Proposal: Amy

Sensored Sports Ball for Coordination Development

Ideal Project:


http://www.ducksters.com/sports/football/throwing_a_football.php
http://www.thecompletepitcher.com/pitching_grips.htm

 problem
A major problem when playing sports or learning a new sport is the constant jamming of fingers: simple injuries that could be prevented with the correct hand-eye coordination development.

solution
This idea could be applied to any sports ball, whether football, volleyball, baseball, etc. The ball would be lined with an infrared sensor layer that could create a virtual imprint of an athlete's hand placement on the ball. The information would be sent wirelessly to a computer to let athletes easily preview their "imprint". Another aspect of the sensor layer is that it would include a force sensor and calibrator to calculate the force with which the ball hits each finger, or the force with which the finger hits the ball. This information would be sent to the computer, processed by a program, and turned into directions that help athletes improve their ball control. Programs would need to be developed with standards for each sport; for the program to be successful, I would need to gather enough information to create a control against which all variable data can be calculated. I would also want the ball to be able to sense wind patterns and temperature, and use these variables to help an athlete adjust to the surroundings.

http://www.imaging1.com/thermal/thermoscope-FLIR.html

application
The output would allow athletes to correct their sources of error in their given sport. It would also reduce injuries and give personalized feedback when a coach may be focusing on another play. This is aimed at someone who wants to improve their skillset and learn the right way to play a sport, rather than someone who is playing for fun.

Reality:
I will create a volleyball that records data about what is happening to it. There are 3 variables that I will track to understand the data I receive: set, spike, and serve. I will need to learn ZigBee, and I will need to teach myself to pipeline the data I receive from the Arduino into Processing to visualize it.

Things I will need:
-Volleyball
- Zigbee
- Pressure Sensors
-Accelerometer
-MPR121 Capacitive Touch Sensor

Areas to Research:
-sports medicine
-interaction design
-athlete sensor data
-tracking athletic movement

Saturday, October 26, 2013

Looking Out: Andre

The V Motion Project from Assembly on Vimeo.


This interactive performance piece, The V Motion Project, was a collaboration among people from various fields at the request of Frucor (makers of the V energy drink) and their ad agency BBDO. The end result was a stunning projection-mapped live musical sequencer performed outdoors on the side of a building. More information about the tech can be found here: http://www.custom-logic.com/blog/v-motion-project-the-instrument/

Thursday, October 24, 2013

LookingOut 8: Liang

Project #1 Bubble Cellar
BY Joelle Aeschlimann, Pauline Saglio, and Mathieu Rivier (2013)

This is an interactive installation composed of a bubble blower, a projector, and wind sensors. It invites users to come near and blow into the blower to create bubbles, just as we do in daily life. Instead of creating real bubbles, the installation projects virtual soap bubbles on the wall, which rise with lightness and fragility into the air, leaving us anticipating their future explosion. Since the bubbles are projected shadows, the piece leaves much more room for imagination and wonder; when the bubbles explode, they turn out to be little animations.







Project #2 MoleBot
BY Design Media Laboratory (2012)

It is a mole robot built under a transformable surface that allows people to enjoy playful interactions with everyday objects and props, and experience what it feels like to have a mole living under your table. It can perform many tasks, like picking fruit or moving objects, and users are able to interact with it via gestures.

Monday, October 21, 2013

Looking Out: Gaby

Chopsticking


Chopsticking is a two player board game designed to test and improve one's chopsticking skills. While the sushi bowl spins, each player has 30 seconds to pick up as much sushi as possible while accurately holding the chopsticks.

You can find out more about the project at these links:

Campfire - Final Project Proposal - Austin, Andre

Themes

  • Campfires are places where people gather
  • The outside world melts away
  • Encourages engagement
    • Interaction
      • Engage with materials/food
      • Engage with one another
    • Observing
      • Random pops and light patterns


Concept
  • We want to create a modern day campfire
    • Light
    • Sound
    • Interaction from 360-degrees
    • Engage with other people over the top
    • Suspense


Cube or pyramid with light-up buttons. People gather around the cube and press buttons to experiment with what happens. Different light and sound patterns will occur depending on the order and combination of button presses. People will naturally be drawn in and want to experiment. There will be a stasis mode, a gentle pulse of light and sound from the object; a state to which it will revert without stimuli.
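The reversion to stasis can be sketched as a timeout rule: the object stays in pattern mode while button presses keep arriving and falls back to the idle pulse once they stop. The timeout value below is an assumption.

```cpp
#include <string>

// Hypothetical state rule for the campfire object: play a pattern
// while stimuli are recent, revert to the "stasis" pulse after a
// timeout (10 s here is an assumed value).
std::string campfireState(long msSinceLastPress, long timeoutMs = 10000) {
    return (msSinceLastPress < timeoutMs) ? "PATTERN" : "STASIS";
}
```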




Background
-Monome
-Led Light Cubes




Tech

  • Arduino
  • Max/Msp
  • Chronome


Materials

Prototype Specific
  • 4x4 PCB LED Boards (x2) - $20
  • 4x4 LED Button Boards (x2) - $20

Final Project
  • 7x 74HC595- $5-$10
  • 7x 74HC165 - $5-$10
  • Possibly using adafruit 24-channel LED Drivers -$15 each
  • RGB LEDs x80 or x45 - $10
  • Arduino
  • Acrylic Sheet - $50
  • Buttons (either 45 or 80 depending on prototype grid size) - $5-$10



Friday, October 18, 2013

Final Project Proposal: Liang

Project: Inkative


INTRODUCTION

Project Intent
What does the interaction between humans and living creatures look like? I hope Inkative can be a platform that records and depicts this dynamic interactive process in a tangible way. It is able to convert little creatures' (goldfish, bees, flies) reactions to surrounding stimuli into something that can be recognised and described. To some extent, it will provide a novel way for people to observe and learn how to communicate with these little lives.


PROJECT BACKGROUND

The original idea of this project came from an episode of the film "Sherlock Holmes" (here). Out of curiosity about how people's behaviours or environmental changes affect little creatures' behaviour, I started to read and review the literature and works of other researchers and artists. Here are some of them:

David Bowen's fly works

fly revolver (2013) [link]: A revolver is controlled by the activities of a collection of houseflies, which are monitored via video in an acrylic sphere. The revolver aims in real time according to the flies' relative positions on a target. If a fly is detected in the center of the target, the trigger of the revolver is pulled.


fly drawing device (2011) [link]: Based on the subtle movements of several houseflies, an artificial device produces a drawing on a wall. The drawing is composed of random, discrete strokes. There is a big sphere with a chamber on top; when flies enter the chamber, the device is triggered to begin drawing.

Tangible Design's Picidae Chorus [link]
It is a kinetic light-and-sound installation featuring 7 geometric birds, installed in the woods. It lights up as the birds peck a percussion box and slowly fades as they come to a stop. It provides a playful and engaging environment for visitors in the park.

Social Firefly by Jason McDermott, Liam Ryon, and Frank Maguire (2011) [link]
It is an innovative light artwork and also a community of friendly, intelligent lights that influence one another. The fireflies are programmed to respond to light from their neighbours. In this community, popular fireflies become highly influential, whilst isolated ones must work harder to reach their friends. People can speak the same language and influence the interaction between community members by shining lights on the fireflies.

In conclusion, these works can be placed into two categories: bio-controllers and interactive installations. The former focuses only on what kinds of outputs animals' behaviours can produce, lacking the ability to observe how the environment affects those behaviours. It is therefore more like a one-way system that addresses the variation of outputs but ignores the interactive input from ambient stimuli. Conversely, the latter category focuses on simulating creatures with modern technology and fabrication methods. It presents an interesting way to explore how animals' behaviours and habits affect people's social activities, so in most cases installation art is the best form for this kind of system, inviting the audience to participate. I hope to create a device based on the ideas of the works above: enabling users to interact with animals, recording the dynamic interactive process, producing something that can be perceived and studied by humans, and finally helping people better understand communication with our fellow living creatures on the planet.



INITIAL DESIGN & IMPLEMENTATION

Prototype
The initial prototype is pretty simple. It consists of a large transparent sphere, 3 cameras, and a swarm of houseflies. The cameras capture the movements of the flies and map them to digital outputs. The sketch is shown here. However, some earlier experiments failed to find any interaction between the flies and people; the likely reason is that the sphere is too large for the flies to register the changes around them. So this design was abandoned.

The project currently aims to observe the interactions between people and small animals; the current application uses goldfish. The system looks like a stage on which the fish tank is placed. Two cameras track the position of the goldfish in the tank and deliver data to the computer. Beneath the stage there is a chamber that works like a CNC machine, controlling a brush that moves in 3D space to draw on ink paper. Unlike a typical CNC, it has a circular structure: a stepper motor at the center spins the upper circular panel, and a steel rod acts as a bracket supporting another stepper motor that drives the brush along the radius. With these two motors and this structure, the brush can theoretically reach any point on a circular plane. The brush is mounted on a servo-driven device, so it can be lifted up and down to indicate the animal's vertical position. The form of drawing is Chinese traditional painting, because it can reflect the animal's 3D position through the pressure and wetness of the brush. In addition, to address interaction, proximity sensors detect whether people are standing around the platform.

This is an initial sketch for this prototype:

The diagram of the system is:


Materials

Required: masonite wood, steel rod, ink paper, Chinese calligraph brush, acrylic board, felt
Maybe: PLA

Hardware
Required: Webcam, proximity sensor, stepper motor, servo, Arduino, computer, resistor, LED, jumper wire
Maybe: Raspberry Pi

Software
Required: Arduino, openFrameworks, openCV

Fabrication Processes
Laser-cut all the plates and parts the stage needs, including the brackets, the base, the circular plane, the gears, and any enclosures. For parts that laser cutting cannot produce, 3D printing may help.

Budget
Name                        Price                    Number
Webcam                      $5.09                    2
Masonite wood               $39.95 (12''x12'')       1
Proximity sensor            $13.95                   4+
Arduino Uno                 $29.95                   1
Brush                       $4.38                    1
Ink paper                   $2.00                    n
Stepper motor               $16.95                   2
Servo                       $8.95                    1
Acrylic                     $15.00                   1
Felt                        $14.30                   1

Timeline
10/14 - 10/20: schematic design, prototype scope, sketches, technology and material exploration, diagrams.
10/21 - 10/27: purchase components, test openCV in openFrameworks, finish object tracking and localization in the 3D environment.
10/28 - 11/03: design the physical structure and fabricate all parts, mount the circuit, test the stepper motors.
11/04 - 11/10: mount all parts and test the mechanism, test the servo.
11/11 - 11/17: complete the circuit logic, including the combination of the stepper motors and the servo; merge the CV data and circuit program, attempt 1.
11/18 - 11/24: merge the CV data and circuit program, attempt 2, and test the entire system.
11/25 - 12/01: refine the design, development, and mechanism; test and prepare for the final presentation (shoot the video demo and final documentation).

Documentations go through the entire design and development process.

Challenges
1. Track fish's position with openCV and openFrameworks.
2. Data processing. Convert the location (x, y, z) into the angles applied to the stepper motors and the servo.
3. CNC-like mechanism design and fabrication.  
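Challenge 2 can be sketched as a Cartesian-to-polar conversion: the tracked (x, y) becomes a rotation angle for the circular panel and a radial travel distance for the second stepper, and the depth z becomes a servo lift angle. The ranges and the depth mapping below are assumptions for illustration.

```cpp
#include <cmath>

// Illustrative conversion from the fish's tracked position to the
// drawing mechanism's targets: panel rotation, radial travel, and
// brush lift. Ranges and the depth->lift mapping are assumptions.
struct BrushTarget { double panelAngleDeg; double radiusMm; double liftDeg; };

BrushTarget toBrushTarget(double x, double y, double z, double maxDepth) {
    const double PI = std::acos(-1.0);
    BrushTarget t;
    t.panelAngleDeg = std::atan2(y, x) * 180.0 / PI; // spin the circular panel
    t.radiusMm      = std::sqrt(x * x + y * y);      // drive brush along the rod
    t.liftDeg       = 90.0 * (1.0 - z / maxDepth);   // deeper fish -> lighter stroke
    return t;
}
```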

CONCLUSION
Inkative is a platform that records the dynamic interaction between people and fish and converts it into Chinese traditional painting. People are able to study the fish's reaction to ambient stimuli from these paintings. In the future, it can also be applied to other animals, such as bees and flies. It has great potential value in helping people understand how animals' behaviour changes as the environment alters.