Utilizing the maximum amount of memory without going out of memory
in Programming Questions • 1 year ago
I need to use all of the available memory to manipulate large (image) data sets, but the way memory management is handled in Java makes this a challenge. I am on a 64-bit system with 14 GB of memory set in the Processing preferences. Solutions I have tried are:
- Try-catch. This prevents the OOM, but the heap ends up so close to the maximum that the program runs into OOM later anyway (a rough sketch of this attempt follows this list).
- A manual safety barrier. This only works by trial and error, so it is less than ideal (see the example after the test code).
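This is roughly what the try-catch attempt looks like (a minimal sketch, not my exact code; outputBuffer and tempImage are the same variables as in the test code below):

  // Catch the allocation failure instead of letting it crash the sketch.
  // The problem: after the catch the heap is nearly full, so later work
  // (saving, displaying, further processing) still risks an OOM.
  try {
    outputBuffer.add(tempImage.get());
  } catch (OutOfMemoryError e) {
    println("OOM caught at bufferSize " + outputBuffer.size());
    // stop adding frames here and work with what is already buffered
  }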
Test Code
ArrayList<PImage> outputBuffer = new ArrayList<PImage>();
PImage tempImage = createImage(1920, 1080, RGB);
int maxMemory, totalMemory, freeMemory, percentage;
maxMemory = int(Runtime.getRuntime().maxMemory()/1000000);
for (int i=0; i<10000000; i++) {
  outputBuffer.add(tempImage.get());
  totalMemory = int(Runtime.getRuntime().totalMemory()/1000000);
  freeMemory = int(Runtime.getRuntime().freeMemory()/1000000);
  percentage = int((float)totalMemory/maxMemory*100);
  println("bufferSize: " + outputBuffer.size() + " | max: " + maxMemory + " | total: " + totalMemory + " (" + percentage + "%)" + " | free: " + freeMemory);
  // if something-something break;
}
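The commented-out break is where the manual safety barrier from above would go. Roughly like this, using the percentage computed in the loop (the 90 is an arbitrary trial-and-error threshold, which is exactly the problem):

  // Manual safety barrier: stop buffering once the committed heap gets
  // close to the maximum. The threshold is only found by trial and error.
  if (percentage > 90) {
    println("Stopping at bufferSize " + outputBuffer.size() + " to stay below the limit");
    break;
  }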
When I run the above code, it crashes at:
bufferSize: 1356 | max: 13049 | total: 11264 (86%) | free: 7
The bigger the image, the lower the percentage at which it crashes. So it seems you lose a few GB either way.
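For what it's worth, the numbers roughly add up: a 1920×1080 PImage keeps its pixels in an int[], so each copy is about 1920 × 1080 × 4 bytes ≈ 8.3 MB, and 1356 copies come to roughly 11.2 GB, which matches the reported total of 11264 MB. The ~1.8 GB gap between that total and the 13049 MB maximum is what appears to get lost.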
What are possible ways to handle this correctly?