Sync data to video - eyetracking project
May 21st, 2010, 7:38pm
 
TL;DR - Syncing a video and a data stream: what's the best way to display a long (900+ frame) sequence of images at a high frame rate (30fps)?

I am working with an eyetracker that outputs an x and y coordinate based on the user's gaze point.

I want to be able to show the user a video clip, record their eye movement data, and then render a video with a point, drawn from that data, overlaid on the original video.
E.g. http://uoregon.edu/~josh/eyetracker/test.mov (I am using mouse data during testing).

Initially I tried playing the video and outputting the data to a text file, then playing the video again, reading the text file, drawing the gaze point, and writing each complete frame out to a new video.  This works for short clips (which is how the example video was made), but over time the data and video drift out of sync.  Processing has no way of telling which 'frame' a movie is at, only a 'time', which isn't accurate enough.
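A minimal sketch of the recording pass of that first approach might look something like this (mouse coordinates stand in for the eyetracker data, and the output file name gaze.txt is just a placeholder):

Code:
import processing.video.*;

Movie myMovie;
PrintWriter output;

void setup() {
  size(640, 480);
  myMovie = new Movie(this, "264.mov");
  output = createWriter("gaze.txt");   // placeholder log file
  myMovie.play();
}

void draw() {
  if (myMovie.available()) {
    myMovie.read();
  }
  image(myMovie, 0, 0);
  // logged once per draw() call, not once per movie frame
  output.println(myMovie.time() + "\t" + mouseX + "\t" + mouseY);
}

void keyPressed() {
  output.flush();
  output.close();
  exit();
}

Because the samples are taken once per draw() frame rather than once per movie frame, this is presumably where the drift between data and video creeps in.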

So then I broke the video up into individual images (30 per second of video) and displayed these frames in sequence, recording one piece of eye movement data per frame.  This worked great and allows for perfect syncing when rendering.
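A rough sketch of that frame-sequence version, assuming the frames were exported as frames/frame-0001.png, frame-0002.png, and so on (those names and the 900-frame count are placeholders):

Code:
PrintWriter output;
int currentFrame = 1;
int totalFrames = 900;   // placeholder frame count

void setup() {
  size(640, 480);
  frameRate(30);
  output = createWriter("gaze.txt");   // placeholder log file
}

void draw() {
  // load and display one still per draw() call
  PImage frame = loadImage("frames/frame-" + nf(currentFrame, 4) + ".png");
  image(frame, 0, 0);
  // exactly one gaze sample (here: mouse position) per displayed frame
  output.println(currentFrame + "\t" + mouseX + "\t" + mouseY);
  currentFrame++;
  if (currentFrame > totalFrames) {
    output.flush();
    output.close();
    exit();
  }
}

Loading each image from disk inside draw() is also what holds the sketch to roughly 20fps, as described below.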

However, this limits me to a frame rate within Processing of around 20fps when recording the data.  Given the frame sequence is generated at 30fps, the resulting 'animation' plays back at 2/3 of its actual speed.

Loading the images into an array during setup gives me a playback fps of 30, but memory issues keep occurring around the 400 frame mark.
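The preloading variant would look roughly like this; it trades the per-frame disk reads for memory, which is where the ~400-frame limit shows up (frame names and count are again placeholders):

Code:
PImage[] frames;
int totalFrames = 900;   // placeholder frame count
int current = 0;

void setup() {
  size(640, 480);
  frameRate(30);
  frames = new PImage[totalFrames];
  // load every still into memory up front - this is what runs into the memory limit
  for (int i = 0; i < totalFrames; i++) {
    frames[i] = loadImage("frames/frame-" + nf(i + 1, 4) + ".png");
  }
}

void draw() {
  image(frames[current], 0, 0);
  // record the gaze sample for this frame here
  current++;
  if (current >= totalFrames) {
    exit();
  }
}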

Does anyone have any suggestions?  Is there a better way entirely to do this?  I looked at Jitter, as it can step through a movie one frame at a time, but I would prefer to get this working in Processing.

Sorry for the lengthy post.  Thought it best to go through my process.  Let me know if you need any clarification.

Thanks,
TB
Re: Sync data to video - eyetracking project
Reply #1 - May 22nd, 2010, 1:53am
 
Hmm, maybe you can record and play/overlay it by using available()
http://processing.org/reference/libraries/video/Movie_available_.html instead of doing it in draw(). It should stay synced, since everything happens on a frame-by-frame basis of the original movie.
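A minimal sketch of that suggestion, recording a sample only when a new movie frame has actually arrived (mouse coordinates stand in for the eyetracker data; gaze.txt is a placeholder):

Code:
import processing.video.*;

Movie myMovie;
PrintWriter output;

void setup() {
  size(640, 480);
  myMovie = new Movie(this, "264.mov");
  output = createWriter("gaze.txt");   // placeholder log file
  myMovie.play();
}

void draw() {
  if (myMovie.available()) {
    myMovie.read();
    // sample taken only when the movie actually advances a frame
    output.println(myMovie.time() + "\t" + mouseX + "\t" + mouseY);
  }
  image(myMovie, 0, 0);
}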
Re: Sync data to video - eyetracking project
Reply #2 - May 22nd, 2010, 10:25am
 
Thanks.  I've looked at available() and read(), but they don't correlate to whether the movie is actually advancing through frames or not.

In the following example, "264.mov" is a 20-second video clip at 30fps = 600 frames. The movie plays exactly as it should, but the frame counter prints out around 250-280 after each run.

Code:
import processing.video.*;

Movie myMovie;
int frame = 0;   // counts frames actually read from the movie

void setup() {
  size(640, 480, P2D);
  //frameRate(30);
  myMovie = new Movie(this, "264.mov");
  myMovie.play();
}

void draw() {
  if (myMovie.available()) {
    myMovie.read();
    frame++;   // only incremented when a new movie frame is available
  }
  if (myMovie.time() == myMovie.duration()) {
    println(frame);   // prints ~250-280 instead of the expected 600
    exit();
  }
  image(myMovie, 0, 0);
}