I am working out, on a smaller scale, a version of an installation I am building for a nighttime outdoor venue. I am taking one image and slicing it into multiple thin vertical pieces. The pieces are then recombined into the original picture within Processing. In front of each "slice" will be a sonar range finder measuring the distance to the audience. Once someone is within range of a sonar, the slice corresponding to that specific range finder will scroll vertically on the screen. As they get closer, the piece will slow down, since the distance variable will decrease.
I have an Arduino Mega running a simple analogRead program, currently tethered to a laptop which in turn is connected to a projector. I might look into using PWM or digitalRead so I can greatly increase the number of slices. With the code below I am using only 6 sensors.

1. How should I run the sonars? There should be plenty of space for each one to stay out of the others' paths, but should I still worry about one sonar detecting another's ping? Recommendations?
2. When passing the data to Processing, should I pack it into a string and then split it up in Processing, or read each sonar in sequence? I have code I can show on the Processing end.

How are we looking? This will be expanded a great deal once this is running. I know I need to map the input of the sonar data so I am not scrolling these images too fast.
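On question 2, a common approach is to have the Arduino print all six readings as one comma-separated line ending in a linefeed, then split that line in Processing. Here is a minimal sketch of that round trip in plain Java, just to show the protocol; the method names are mine, and on the actual hardware the packing side would be a series of `Serial.print()` calls in `loop()` while the unpacking side would live in Processing's `serialEvent()` after `readStringUntil(linefeed)`:

```java
import java.util.Arrays;

public class SonarProtocol {
    // What the Arduino side would emit once per update cycle,
    // e.g. Serial.print(value); Serial.print(','); ... Serial.print('\n');
    static String packReadings(int[] readings) {
        StringBuilder line = new StringBuilder();
        for (int i = 0; i < readings.length; i++) {
            if (i > 0) line.append(',');
            line.append(readings[i]);
        }
        return line.append('\n').toString();
    }

    // What the Processing side would do with the received line:
    // trim the linefeed and split on commas (Processing's split()
    // and int() behave like String.split() and Integer.parseInt()).
    static int[] unpackReadings(String line) {
        String[] parts = line.trim().split(",");
        int[] values = new int[parts.length];
        for (int i = 0; i < parts.length; i++) {
            values[i] = Integer.parseInt(parts[i]);
        }
        return values;
    }

    public static void main(String[] args) {
        int[] readings = {312, 45, 780, 12, 660, 97}; // fake analogRead values
        String line = packReadings(readings);
        System.out.println(line.trim());                               // 312,45,780,12,660,97
        System.out.println(Arrays.equals(unpackReadings(line), readings)); // true
    }
}
```

One line per full set of readings keeps the six values synchronized, which matters once you map each value to its own slice.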
import processing.serial.*;

Serial myPort;      // serial connection to the Arduino
int numSensors = 6; // one reading per sonar/slice
int linefeed = 10;  // ASCII linefeed, marks the end of each line of readings
int[] sensors;      // latest reading from each sonar (was a single int; an array is needed for 6 values)
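On the mapping concern: Processing's `map()` is a linear remap, and it helps to clamp the raw reading first (Processing's `constrain()`) so out-of-range values don't produce wild speeds. A standalone Java version of that logic, with placeholder range numbers that would need calibrating against your actual sonars:

```java
public class ScrollSpeed {
    // Linear remap, the same formula Processing's map() uses.
    static float map(float value, float inLo, float inHi, float outLo, float outHi) {
        return outLo + (outHi - outLo) * (value - inLo) / (inHi - inLo);
    }

    // Clamp to a range, like Processing's constrain().
    static float constrain(float value, float lo, float hi) {
        return Math.max(lo, Math.min(hi, value));
    }

    // Closer audience -> smaller reading -> slower scroll, per the
    // installation's behavior. Ranges below are placeholders.
    static float scrollSpeed(int reading) {
        float r = constrain(reading, 50, 700); // assumed usable sensor range
        return map(r, 50, 700, 0.5f, 8.0f);    // assumed pixels-per-frame range
    }

    public static void main(String[] args) {
        System.out.println(scrollSpeed(50));  // nearest: slowest, 0.5
        System.out.println(scrollSpeed(700)); // farthest: fastest, 8.0
    }
}
```

In the sketch itself you would call `scrollSpeed(sensors[i])` once per slice each frame, so every slice scrolls at a rate set by its own range finder.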