Conceptual mental block: implementing variables with a blob detection algorithm


Hey all,

So I am creating a koi fish pond that uses a set of images to trigger illustrated animation frames. The installation space will be a room with four IR lasers (one on each corner), a projector overhead, and an IR camera to make the algorithm more reliable (and because this will be in complete darkness!).

I am having a block, though, in thinking about how to conceptualize the logic for certain instances. These instances are as follows:

The user will essentially be a circle (due to the overhead camera and its distance from the floor) walking within the space. I currently have the fish swimming in place (the animation triggers a fish movement over a few frames). When the user enters, the fish begins to follow the user.
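In case a sketch helps, here is roughly how the "swim in place until someone enters, then follow" toggle could look in Processing. All the names here (blobDetected, blobX, blobY, fishX, fishY, followSpeed) are placeholders, not anything from the actual sketch; the blob values would come from whatever blob detection library is in use.

    // Minimal idle/follow toggle, assuming blob detection already reports
    // whether a blob exists and where its centroid is (names are placeholders).
    boolean blobDetected = false;
    float blobX, blobY;        // blob (user) centroid in pixels
    float fishX, fishY;        // current fish position
    float followSpeed = 1.5;   // pixels per frame, tune to taste

    void updateFish() {
      if (!blobDetected) {
        // nobody in the room: keep playing the idle swim-in-place frames
        return;
      }
      // step the fish a little toward the blob centroid each frame
      float dx = blobX - fishX;
      float dy = blobY - fishY;
      float d  = dist(fishX, fishY, blobX, blobY);
      if (d > 1) {             // avoid dividing by ~zero once the fish reaches the blob
        fishX += followSpeed * dx / d;
        fishY += followSpeed * dy / d;
      }
    }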

So what I need to do is measure the angle between the fish and the user (blob). If the user is, say, 45 degrees off from the fish (to the side of the fish, out of its "sight" at the front of its face), it triggers the series of "turn" animations. This isn't a question about triggering animations, but about the logic to measure the angle from the fish to the user.
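Here is a minimal sketch of that angle math, assuming the fish has a position and a heading in degrees and that blob detection gives the user's centroid (again, fishX, fishY, fishHeading, blobX, blobY are placeholder names). atan2() gives the absolute angle of the fish-to-blob vector, and the signed difference from the fish's heading, wrapped into (-180, 180], tells you whether the user is in front of the fish or off to a side.

    // Placeholder state: the fish's position, the direction it is facing
    // (in degrees), and the blob centroid from blob detection.
    float fishX, fishY;
    float fishHeading;
    float blobX, blobY;

    // Signed angle (degrees) from the fish's heading to the blob:
    // near 0 = blob is straight ahead, larger magnitudes = off to a side.
    float angleToBlob() {
      // absolute angle of the fish -> blob vector
      float absolute = degrees(atan2(blobY - fishY, blobX - fishX));
      // how far that is from the direction the fish is facing
      float diff = absolute - fishHeading;
      // wrap into (-180, 180] so e.g. 350 becomes -10
      while (diff > 180)   diff -= 360;
      while (diff <= -180) diff += 360;
      return diff;
    }

    void chooseAnimation() {
      float a = angleToBlob();
      if (abs(a) < 45) {
        // blob is within the fish's forward "sight": keep the swim-forward frames
      } else if (a > 0) {
        // blob is off to one side: trigger one set of turn frames
      } else {
        // blob is off to the other side: trigger the mirrored turn frames
      }
    }

One caveat: screen coordinates have y pointing down, so which sign counts as "left" versus "right" depends on how the fish's heading is defined, and the two turn cases may need to be swapped.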

I've illustrated what I'm talking about as best I can, along with a screenshot of the program.

[Screenshots: illustration of the angle between the fish and the user, and a screenshot of the program]

Answers

  • Answer ✓

    I have given you a solution: just put an angle condition alongside all those conditions mentioned in the if statements, that's all!
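    Purely illustrative, since the actual if statements aren't shown here: the angle just becomes one more clause next to the existing distance/state checks. angleToBlob() is the helper sketched above, and triggerTurnFrames() and the 300-pixel distance are made-up placeholders.

        // Illustrative only: fold the angle test into the existing conditions.
        void checkTurn() {
          float a = angleToBlob();                                    // helper from the sketch above
          boolean userClose = dist(fishX, fishY, blobX, blobY) < 300; // 300 px is an arbitrary example
          if (blobDetected && userClose && abs(a) > 45) {
            // user is nearby but outside the fish's forward "sight": play the turn frames
            triggerTurnFrames();   // placeholder for however the animation is actually started
          }
        }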
