Experiencing a weird frame rate issue
in Programming Questions • 3 months ago
I have written a small Processing program that does a simple plasma animation. I wrote two versions of it while searching for a way to optimize it for better JavaScript and Android performance. For the purposes of this question, however, I have been running both versions in the standard Java environment for Processing.
Although the code isn't terribly long, I will just post links to it here so that it doesn't clutter up the forum post. The links go to the code files on my Dropbox.
Now, I expected (or at least hoped) the 2nd version of the code to have a significantly higher frame rate than the first, because the 2nd version uses look-up tables to a greater degree. In actuality, the 2nd version does get a marginally better frame rate initially, but after about 1350 frames (and it is consistently around that point) the frame rate drops in epic proportions.
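For context, the look-up-table idea is roughly the following (a minimal sketch of the technique, not my actual code — that's in the Dropbox links; all names and table sizes here are illustrative):

```java
// Sketch of a sine look-up table as used in plasma demos.
// Replaces repeated Math.sin() calls with precomputed array reads.
public class PlasmaLUT {
    static final int TABLE_SIZE = 1024; // power of two so we can wrap with a bit mask
    static final float[] SIN = new float[TABLE_SIZE];

    static {
        // Precompute one full sine period once, up front
        for (int i = 0; i < TABLE_SIZE; i++) {
            SIN[i] = (float) Math.sin(2 * Math.PI * i / TABLE_SIZE);
        }
    }

    // Fast sine: the phase index wraps around via the mask
    static float fastSin(int phase) {
        return SIN[phase & (TABLE_SIZE - 1)];
    }

    // One plasma sample: several table reads combined per pixel
    static float plasma(int x, int y, int t) {
        return fastSin(x * 4 + t) + fastSin(y * 3 - t) + fastSin((x + y + t) * 2);
    }

    public static void main(String[] args) {
        System.out.println(plasma(10, 20, 0));
    }
}
```

The point of the 2nd version is that per-pixel work becomes array indexing instead of trig calls, which is why I expected it to be consistently faster.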
Here are a couple of figures showing what I am seeing. In the figures, the red line is version 1 of the code, and the blue line is version 2 (the version that uses more look-up tables and which I expected to have a higher frame rate).
Here is a 2nd figure where I zoom in on the problem area:
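The figures essentially plot a frame-rate reading per frame. Inside Processing I can just record the built-in frameRate variable each draw() call; outside Processing the same kind of data could be collected with plain-Java timing along these lines (the workload method is a stand-in, not my plasma code):

```java
// Plain-Java sketch of per-frame FPS logging, the kind of data behind
// the figures above. simulateFrame() is a dummy stand-in workload.
public class FpsLogger {
    // Records an instantaneous frames-per-second value for each frame
    static double[] logFps(int frames) {
        double[] fps = new double[frames];
        for (int f = 0; f < frames; f++) {
            long start = System.nanoTime();
            simulateFrame(); // stand-in for the plasma draw() work
            long elapsed = Math.max(System.nanoTime() - start, 1);
            fps[f] = 1e9 / elapsed; // nanoseconds per second / nanoseconds per frame
        }
        return fps;
    }

    // Dummy per-frame workload so the timer has something to measure
    static void simulateFrame() {
        double acc = 0;
        for (int i = 0; i < 10_000; i++) acc += Math.sin(i);
        if (acc == Double.MAX_VALUE) System.out.println(acc); // defeat dead-code elimination
    }

    public static void main(String[] args) {
        double[] fps = logFps(100);
        System.out.printf("frame 0: %.1f fps%n", fps[0]);
    }
}
```

One caveat when comparing numbers: Processing's frameRate variable is a running average, so it reacts to a sudden slowdown a few frames late, whereas raw per-frame timing like the above shows the drop immediately.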
Does anybody have any idea what could be causing this, based on the code and the figures?
My machine has an AMD Athlon X2 dual-core processor (1.5 GHz, 64-bit) with 3 GB of RAM, running Windows 8.
Two possibilities I have come up with:
(1) For some reason the look-up tables in version 2 of the code don't entirely fit in cache, so the processor has to continually swap look-up tables in and out of cache, and this causes the frame rate to drop. If this were true, however, I would expect it to happen immediately, not 1300 frames into execution of the program.
(2) The operating system is possibly dedicating more processing time to my app near the beginning of its execution, then lowering its priority as execution goes on, causing the frame rate to drop. But if this were true, I would expect it to show up in version 1 of the code as well.
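Regarding possibility (1), a rough back-of-envelope supports my doubt. I'm using placeholder table sizes below (a 1024-entry sine table and a 256-entry palette, which are common in plasma demos), not the exact sizes in my code:

```java
// Back-of-envelope check for hypothesis (1): how much cache would
// typical plasma look-up tables actually occupy?
public class CacheFootprint {
    // A float entry and a packed-ARGB int entry are 4 bytes each
    static long totalTableBytes(int sinEntries, int paletteEntries) {
        return sinEntries * 4L + paletteEntries * 4L;
    }

    public static void main(String[] args) {
        // Illustrative table sizes, not the actual ones in my sketches
        long total = totalTableBytes(1024, 256);
        System.out.println("total table bytes: " + total); // 5120 bytes, about 5 KB
        // An Athlon X2 typically has 512 KB to 1 MB of L2 cache per core,
        // so tables anywhere near this size fit with room to spare -- and a
        // pure cache effect would show up from frame 1, not frame ~1350.
    }
}
```

So unless my tables are orders of magnitude larger than these guesses, the cache explanation seems unlikely on size grounds alone, in addition to the timing argument.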
What opinions or suggestions do you guys have?