Today we have to present our conceptual prototype. We divided the task into four parts.
Part 1 – Building the mechanics of the sliders/surface. (Russ)
Part 2 – Displaying a waveform given one or more points on the wave. (Dan)
Part 3 – Capturing the Top Codes and sending the positions as input to Part 2. (Will)
Part 4 – Using the wave produced in Part 2 to create a sound. (Russ)
I used the QTJava API to access the camera and capture stills, then pass each still to the Scanner class to decode the TopCodes.
It was difficult getting QTJava to work, but reading posts on developer.apple.com gave me a better understanding of how to set it up. I ended up with the following code, GrabWavePoints.java, which uses the TopCode classes by Mike Horn from Dan’s blog.
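The capture loop can be sketched roughly as below. This is a minimal sketch, not the actual GrabWavePoints.java: the QTJava capture and the TopCodes scan are stubbed out in comments (QTJava is Apple-specific and cannot be shown portably here), and `formatPoint` is a hypothetical helper name, not a method from the real code.

```java
import java.util.Locale;

// Rough sketch of the decode loop in GrabWavePoints.java (names are hypothetical).
public class GrabWavePointsSketch {

    // Format one detected TopCode as "id centerX centerY" for standard output.
    static String formatPoint(int id, double x, double y) {
        return String.format(Locale.US, "%d %.2f %.2f", id, x, y);
    }

    public static void main(String[] args) throws InterruptedException {
        // Capture interval in milliseconds, passed on the command line (e.g. 2000).
        long intervalMs = args.length > 0 ? Long.parseLong(args[0]) : 2000;
        while (true) {
            // 1. Grab a still from the camera via QTJava (omitted here).
            // 2. Hand the image to the TopCodes Scanner, roughly:
            //      List<TopCode> codes = new Scanner().scan(image);
            // 3. Print each code's position for the Ruby visualizer, e.g.:
            //      System.out.println(formatPoint(c.getCode(), c.getCenterX(), c.getCenterY()));
            Thread.sleep(intervalMs); // wait before grabbing the next still
        }
    }
}
```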
To test the code I run the following command, which captures a still every 2000 milliseconds:
java -jar wavedecode.jar 2000 | ruby tui_visualization.rb
The Java part reads the TopCodes every 2 seconds and writes the coordinates to standard output. This is then ‘piped’ to the Ruby code.
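On the receiving end, the Ruby side presumably parses those piped lines back into points. A minimal sketch of that step, assuming each line has the form "id centerX centerY" (`parse_point` is a hypothetical name, and the real tui_visualization.rb also draws the waveform):

```ruby
# Sketch of the parsing step in tui_visualization.rb (hypothetical helper).
# Assumes the Java side prints one TopCode per line as "id centerX centerY".
def parse_point(line)
  id, x, y = line.split
  { id: Integer(id), x: Float(x), y: Float(y) }
end

# The visualizer would then consume the Java program's output line by line:
#   $stdin.each_line { |line| redraw(parse_point(line)) }
```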
java -jar wavedecode.jar 2000 -debug | ruby tui_visualization.rb
With the -debug flag, the same output that is piped to the Ruby part is also echoed to the terminal on standard error.
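The debug echo works because standard output and standard error are separate streams: the pipe only captures standard output, so a copy written to standard error still reaches the terminal. A sketch of that idea (the `emit` helper and `DebugEcho` class are hypothetical names, not necessarily how wavedecode.jar does it):

```java
import java.io.PrintStream;

// Sketch of the -debug echo: each line goes to standard output (and hence
// down the pipe to the Ruby visualizer) and, when debugging, is additionally
// copied to standard error so it stays visible in the terminal.
public class DebugEcho {
    static void emit(PrintStream out, PrintStream err, boolean debug, String line) {
        out.println(line);            // consumed by the Ruby side of the pipe
        if (debug) err.println(line); // shown in the terminal when -debug is set
    }

    public static void main(String[] args) {
        boolean debug = args.length > 0 && args[0].equals("-debug");
        emit(System.out, System.err, debug, "31 120.50 80.00");
    }
}
```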
Here is a video of what we have working so far.