Digital Content / Development pt. 2

This entry is mostly about the part where I try out the script that is included in the library.

This is the Processing part, where I try to understand the language that has been used in the synthdraw project. I've found the visualization impressive, and it would be even better if I could synchronize it with my synthesizer and replace the cursor input with my range sensor's value. But first, I need to understand the script and get Processing to communicate with the Arduino IDE.
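
To get a feel for how that could work, here is a rough sketch of the Processing side, assuming the Arduino simply prints one range reading (0-1023) per line over serial at 9600 baud; the port index, the baud rate, and the little ellipse are placeholders of mine, not the actual synthdraw code.

import processing.serial.*;

Serial arduino;        // serial link to the Arduino board
float sensorX = 0;     // latest range-sensor reading, used in place of mouseX

void setup() {
  size(640, 480);
  // Assumption: the Arduino shows up as the first port in the list.
  arduino = new Serial(this, Serial.list()[0], 9600);
  arduino.bufferUntil('\n');   // call serialEvent() once per line
}

void serialEvent(Serial port) {
  String line = port.readStringUntil('\n');
  if (line != null) {
    // Assumption: the Arduino prints a raw 0-1023 analog value per line.
    sensorX = map(float(trim(line)), 0, 1023, 0, width);
  }
}

void draw() {
  background(0);
  // Placeholder drawing: wherever synthdraw reads mouseX, read sensorX instead.
  ellipse(sensorX, height / 2, 40, 40);
}

The point is simply that anywhere the original sketch reads mouseX, it can read sensorX instead.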



This is a video of the Processing visualization using a webcam as the sensory device.
You can see the difference when the tone of the visualization bars' color changes; it reacts that way because motion has been detected by the webcam.
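
That kind of motion-reactive color can be done with simple frame differencing in Processing's video library. This is only my own guess at the idea, not the script used in the video, and the motion threshold is a value I tuned by hand.

import processing.video.*;

Capture cam;
PImage prevFrame;

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
  prevFrame = createImage(width, height, RGB);
}

void draw() {
  if (cam.available()) {
    prevFrame.copy(cam, 0, 0, cam.width, cam.height, 0, 0, width, height);
    cam.read();                       // prevFrame now holds the older frame
  }
  cam.loadPixels();
  prevFrame.loadPixels();
  float motion = 0;
  for (int i = 0; i < cam.pixels.length; i += 50) {   // sample every 50th pixel
    motion += abs(brightness(cam.pixels[i]) - brightness(prevFrame.pixels[i]));
  }
  // More detected motion shifts the bar towards a warmer tone.
  float tone = constrain(map(motion, 0, 50000, 0, 255), 0, 255);
  background(0);
  fill(tone, 255 - tone, 128);
  rect(0, height - 100, width, 100);
}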



And this is a plug-in for Processing that can be used to control the visualization's background color. If it can be synced and made to communicate with my Arduino board, with the knob as the controller, I can control the visualization and the output sound together and keep them smoothly in sync. Hopefully this is not impossible.
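
As a first step towards that, the Processing end could read the knob like this. I'm assuming the Arduino side just does Serial.println(analogRead(A0)); in its loop(), and the color mapping is only an example, not the plug-in's own code.

import processing.serial.*;

Serial arduino;
float knob = 0;   // 0-1023 value from the potentiometer

void setup() {
  size(640, 480);
  // Assumption: the board is the first serial port; adjust the index if not.
  arduino = new Serial(this, Serial.list()[0], 9600);
  arduino.bufferUntil('\n');
}

void serialEvent(Serial port) {
  String line = port.readStringUntil('\n');
  if (line != null) {
    knob = float(trim(line));
  }
}

void draw() {
  // The knob value drives the background hue; the same value could be passed
  // on to the synth so the sound and the visuals stay in step.
  colorMode(HSB, 1023);
  background(knob, 800, 900);
}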



OK, now for the SuperCollider part. I altered the script a bit and made the mouse the variable that controls the output value, so it produces a MIDI sound based on the mouse's x movement. It's quite nice for a first-timer; I will try to improve myself for the prototyping stage.
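
The altered SuperCollider script isn't shown here, but the same mouse-to-pitch idea, sketched in Processing with its Sound library rather than in SuperCollider, looks roughly like this; the 110-880 Hz range is my own choice, not from the original script.

import processing.sound.*;

SinOsc osc;

void setup() {
  size(640, 200);
  osc = new SinOsc(this);
  osc.amp(0.3);
  osc.play();
}

void draw() {
  background(20);
  // Map the mouse's x position to a pitch, the way the altered patch
  // maps the mouse to the output value.
  float freq = map(mouseX, 0, width, 110, 880);   // assumed range, A2 to A5
  osc.freq(freq);
  stroke(255);
  line(mouseX, 0, mouseX, height);
}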



The last part: this is the Flash file that I found very interesting and cool. It was made for a touch-screen project, but I was thinking that if I can alter the script and make it communicate with the Arduino IDE, it would be great for my project. I just need a synthesizer script that can produce a standalone MIDI sound and a plug-in to make it communicate with the SWF. Going to do more research on this.
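
One possible route (my own assumption, not something that comes with the Flash file): Flash can open a socket to localhost, so a small Processing sketch could sit in the middle and forward the Arduino's serial lines to a TCP port that the SWF listens on. The port number 5331 and the line format are placeholders.

import processing.serial.*;
import processing.net.*;

Serial arduino;
Server bridge;

void setup() {
  // Assumptions: Arduino on the first serial port, SWF connecting to port 5331.
  arduino = new Serial(this, Serial.list()[0], 9600);
  arduino.bufferUntil('\n');
  bridge = new Server(this, 5331);
}

void serialEvent(Serial port) {
  String line = port.readStringUntil('\n');
  if (line != null) {
    // Relay each sensor line to every connected socket client (the SWF).
    bridge.write(line);
  }
}

void draw() {
  // Nothing to draw; this sketch only relays serial data to the socket.
}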
