I've been playing with Daniel Shiffman's "Flocking" code example and want to make it more efficient, so I can run 5,000 boids on a 1024x768 canvas at 25fps.
I believe the bottleneck is the flocking itself: every boid checks every other boid for proximity, which is O(n²) per frame. Currently, anything over 800 boids drops the framerate to 20fps or less.
Would I need something like a quadtree to reduce the number of proximity checks a single boid has to do?
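A quadtree would work, but for roughly uniform boid densities a fixed uniform grid (Shiffman calls this "bin-lattice spatial subdivision" in The Nature of Code) is simpler to implement and usually just as effective: make the cell size equal to the neighbourhood radius, re-bin the boids each frame, and each boid only scans its own cell plus the 8 surrounding ones. A minimal sketch in plain Java (the `BoidGrid` class and its method names are my own, not from Shiffman's example):

```java
import java.util.ArrayList;
import java.util.List;

// Uniform grid ("bin-lattice") for cheap neighbour lookups.
// Cell size equals the flocking neighbourhood radius, so a boid
// only needs to scan its own cell and the 8 adjacent cells.
public class BoidGrid {
    final int cols, rows;
    final float cellSize;
    final List<List<Integer>> cells;   // boid indices, one list per cell

    BoidGrid(float width, float height, float neighborRadius) {
        cellSize = neighborRadius;
        cols = (int) Math.ceil(width / cellSize);
        rows = (int) Math.ceil(height / cellSize);
        cells = new ArrayList<>(cols * rows);
        for (int i = 0; i < cols * rows; i++) cells.add(new ArrayList<>());
    }

    // Call once per frame before re-inserting all boids.
    void clear() {
        for (List<Integer> c : cells) c.clear();
    }

    void insert(int boidIndex, float x, float y) {
        cells.get(cellIndex(x, y)).add(boidIndex);
    }

    int cellIndex(float x, float y) {
        int cx = Math.min(cols - 1, Math.max(0, (int) (x / cellSize)));
        int cy = Math.min(rows - 1, Math.max(0, (int) (y / cellSize)));
        return cy * cols + cx;
    }

    // Candidate neighbours: everything in this cell and the 8 around it.
    // The caller still does the exact distance check on these candidates.
    List<Integer> neighbors(float x, float y) {
        List<Integer> result = new ArrayList<>();
        int cx = (int) (x / cellSize), cy = (int) (y / cellSize);
        for (int dy = -1; dy <= 1; dy++) {
            for (int dx = -1; dx <= 1; dx++) {
                int nx = cx + dx, ny = cy + dy;
                if (nx < 0 || ny < 0 || nx >= cols || ny >= rows) continue;
                result.addAll(cells.get(ny * cols + nx));
            }
        }
        return result;
    }
}
```

With 5,000 boids spread over 1024x768 and a 50px radius, each boid would scan on the order of tens of candidates instead of 4,999, which is the difference between O(n²) and roughly O(n).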
I've been reading a lot about collision detection on these forums, including threads on Box2D, Fisica, Geomerative and others, but every thread I find deals with collision detection against an unchanging vector shape.
My ultimate goal is to be able to test whether a point in 2D space is within the bounds of an irregular polygon whose size, shape and location are potentially changing 25 times a second.
Unrealistic? An approximation would do: maybe less precision in the "bounding box", or updates at less than 25 frames a second.
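For what it's worth, this part may be less daunting than it sounds. The standard even-odd (ray-casting) rule handles arbitrary, even concave, polygons: cast a horizontal ray from the point and count how many edges it crosses. It's O(n) in the vertex count per query, with no preprocessing, so rebuilding nothing and re-testing against a fresh outline 25 times a second is realistic. A sketch, assuming the blob outline is available as parallel vertex arrays:

```java
// Even-odd (ray-casting) point-in-polygon test. Works for irregular,
// concave polygons; needs no precomputation, so the vertex arrays can
// change every frame as the tracked blob changes.
public class PointInPolygon {
    static boolean contains(float[] vx, float[] vy, float px, float py) {
        boolean inside = false;
        int n = vx.length;
        for (int i = 0, j = n - 1; i < n; j = i++) {
            // Does edge (j -> i) cross a horizontal ray cast from (px, py)?
            if ((vy[i] > py) != (vy[j] > py)
                    && px < (vx[j] - vx[i]) * (py - vy[i]) / (vy[j] - vy[i]) + vx[i]) {
                inside = !inside;   // each crossing flips inside/outside
            }
        }
        return inside;
    }
}
```

If the exact outline turns out to be too many vertices, a coarser polygon (fewer edge pixels sampled) degrades gracefully: the test gets cheaper and only the boundary precision suffers.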
The situation: I'm using the computer vision library Jmyron to do blob tracking, which is working very well. I can extract the blobs as co-ordinates that form quads, as co-ordinates that form vector shapes, or as a list of "edge pixels".
The other half of my project is a really basic random particle system: particles whizzing about with random directions and velocities. The idea is that these particles can't intersect with the blobs that Jmyron detects, so if a hand-shaped blob is detected, the particles can go anywhere around the "silhouette" of the hand, including between the fingers, but won't go inside the outline of the hand.
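One cheap way to get that behaviour is move-then-reject: advance each particle, and if the new position would land inside the blob polygon, discard the move and reverse the velocity instead. A sketch under those assumptions; the `Particle` class and field names are illustrative, not from any library, and the crude full reversal could be replaced by reflecting off the nearest edge:

```java
// Sketch: reject any particle move that would land inside the blob
// polygon, reversing the velocity instead (a crude bounce).
public class Particle {
    float x, y, vx, vy;

    Particle(float x, float y, float vx, float vy) {
        this.x = x; this.y = y; this.vx = vx; this.vy = vy;
    }

    // polyX/polyY: the blob outline for this frame, as parallel arrays.
    void update(float[] polyX, float[] polyY) {
        float nx = x + vx, ny = y + vy;
        if (contains(polyX, polyY, nx, ny)) {
            vx = -vx;              // move would enter the blob: bounce
            vy = -vy;
        } else {
            x = nx;                // free space: accept the move
            y = ny;
        }
    }

    // Standard even-odd point-in-polygon test.
    static boolean contains(float[] px, float[] py, float x, float y) {
        boolean inside = false;
        for (int i = 0, j = px.length - 1; i < px.length; j = i++) {
            if ((py[i] > y) != (py[j] > y)
                    && x < (px[j] - px[i]) * (y - py[i]) / (py[j] - py[i]) + px[i]) {
                inside = !inside;
            }
        }
        return inside;
    }
}
```

Since the polygon is re-read every frame, a hand moving between frames "pushes" particles naturally: positions that were free last frame get rejected this frame.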
Crazy?
If anyone could point me in the right direction, I would be very grateful.