I've spent quite a bit of time digging around, trying to find a way to export very large images (in this case, about 1200 x 500,000,000 px). Ultimately, this will be a printed image.
PGraphics seems the best option for offscreen drawing, and I've gotten my sketch to save when running smaller images. However, it runs out of memory at about 1200 x 20,000 px - far too small for this particular project.
I looked at TileSaver, but it seems intended only for high-resolution exports of small OpenGL scenes.
Any suggestions for offscreen rendering, or for a way to spit out lots of smaller .tif or .png files and stitch them together? (The stitching need not happen in Processing, but it would have to be automated rather than done by hand in Photoshop.)
PS: As a side note for those wondering why anyone would want to create a 500-million pixel long image, the code visualizes every unique combination of the 12 notes from the Nokia Tune ringtone. This is the most ubiquitous piece of music in existence, heard 20k times per second around the world.
I am curious how your image will be printed. A very long banner? Your question has been asked a few times already, and so far I haven't seen a really satisfying answer. You can increase Processing's memory settings, but you will hit limits. I haven't looked at your sketch (yet), but can it be designed to output several smaller PNG files, as you mention? Or is it a single whole that's hard to break apart?
Thanks for the (amazingly) quick response, phi.lho. While digging through the forums for other projects, you've been a huge help in the past - thanks! My feeling was the same: this question has been asked before, but no good answers have emerged.
Likely it will be printed in several sections (I have access to an inkjet printer that takes roll paper up to 100' long, though I've never printed nearly that large). Not sure yet quite how big I'll end up. This project is also going to result in an mp3 of every possible ringtone; my calculations put those files at about 10TB.
I should have mentioned, I upped my Processing memory settings to their max.
It should be really easy to export .png files programmatically - the program is really just a bunch of for loops iterating through arrays and writing them out. The question at that point is putting the files together, either in Processing or otherwise.
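For illustration, the strip-export idea could look something like this in plain Java (the test pattern, strip count, and file names are placeholders, not the actual sketch; in Processing you would draw into a `createGraphics()` buffer and call `save()` instead). Only one strip lives in memory at a time, so the footprint stays small no matter how tall the full image is:

```java
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

public class StripExport {
    static final int WIDTH = 1200;    // full image width
    static final int STRIP_H = 2000;  // rows per strip; sized to fit comfortably in memory
    static final int STRIPS = 3;      // demo value; the real job would need many thousands

    public static void main(String[] args) throws Exception {
        for (int s = 0; s < STRIPS; s++) {
            BufferedImage strip = new BufferedImage(WIDTH, STRIP_H, BufferedImage.TYPE_INT_RGB);
            int[] px = new int[WIDTH * STRIP_H];
            long rowOffset = (long) s * STRIP_H; // absolute row of this strip's top edge
            for (int y = 0; y < STRIP_H; y++) {
                for (int x = 0; x < WIDTH; x++) {
                    // placeholder pattern; the real sketch would draw one permutation per row
                    px[y * WIDTH + x] = rgb((int) ((rowOffset + y) % 256), x % 256, 128);
                }
            }
            strip.setRGB(0, 0, WIDTH, STRIP_H, px, 0, WIDTH);
            ImageIO.write(strip, "png", new File(String.format("strip_%05d.png", s)));
        }
    }

    static int rgb(int r, int g, int b) { return (r << 16) | (g << 8) | b; }
}
```

Each 1200 x 2000 strip costs about 10 MB of heap while it is being built, regardless of how many strips are exported in total.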
Ah, so you don't need to make an image of the final size, only each part to be printed. And if you can easily generate each part, there's no need for one huge file (which might even hit limits of the file format!). I see two ways then: print each part separately, finding a way to tell the printer to resume printing at the end of the previous part, without a form feed or the like. Not obvious... Or indeed load the images into Photoshop (the real thing) to stitch them together. To my knowledge, it has special algorithms for handling images larger than memory, and you can automate the process with Photoshop's scripting capability.
I always recall 1 inch = 2.54 centimeters, and I used 2.54 in my computation instead of 25.4 (millimeters), so the error is mine, sorry. Anyway, that's the order of magnitude that matters here, so you are right on all points.
Just another thought: instead of making it a printed artwork, think about an animation. If you make the video full HD (1920 px wide), you can create 260,416 single frames; if you scroll through the large image, even many more. This could probably run for days.
I definitely know this will be a (stupidly) large image, an issue for me to deal with for sure. Since the image is very long but not very wide - it could be as few as 12px wide - splitting it into rows is possible. If the image ends up 42.3km at one row, 100 rows brings it down to about 1,400 feet... still stupidly large, but starting to be do-able. Animation is a thought, and one I've had, but I like the scale of the print. At this point I'm in favor of trying to get the print images exported, then deciding whether this is do-able or not.
My real question is what my limits are for a Processing export - am I just bumping into a memory limit? If I've set my Processing memory limit to ~2GB and I can only export a 20,000px file, I end up with a 66" print at 300dpi. That's large, but not too large by today's printing standards, and that's with a very narrow image (1,200px).
+ Am I correct in thinking that my method of export would hit the same memory issues as a second program combining image files using PImage? I was wondering if a program that only tiles smaller images would help, since it wouldn't also be doing the ArrayList work to build the permutations.
+ Photoshop actions and batch processing are something I've used a bit, so that's a possibility. Most Pixels Ever is great, but on a quick look it seems quite complex to get running. I guess the idea of doing everything in Processing - push a button and let it run for a few days - was the most appealing.
Again, I don't believe you can create and print one image that large. We are talking about a 600,000-megapixel image; at 4 bytes per pixel that's roughly 2.4 TB in memory, if I'm not wrong. So you can set your memory limit as high as you want; it won't work.
Well, there are techniques to use the hard disk as memory (virtual memory, memory mapping...), but I am not too sure how to use them in Java. And that's still a sizable chunk! The printer driver will have trouble too, etc. Classical image formats (PNG, for example) probably need the whole image in memory to compress it.
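For what it's worth, Java does expose memory mapping through `java.nio`. A minimal sketch, assuming raw 3-byte RGB pixels written to a scratch file (`pixels.raw` is a made-up name, and the demo height is deliberately small): the pixel data lives on disk, backed by the OS page cache, rather than on the Java heap. Note that each `MappedByteBuffer` window is limited to 2 GB, so a full-size job would need several windows:

```java
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

public class MappedPixels {
    public static void main(String[] args) throws Exception {
        int width = 1200, height = 5000;        // demo size; scale up as needed
        long bytes = (long) width * height * 3; // 3 bytes per RGB pixel
        try (RandomAccessFile raf = new RandomAccessFile("pixels.raw", "rw");
             FileChannel ch = raf.getChannel()) {
            // One mapping covers up to 2 GB; beyond that, map several windows.
            MappedByteBuffer buf = ch.map(FileChannel.MapMode.READ_WRITE, 0, bytes);
            for (int y = 0; y < height; y++) {
                for (int x = 0; x < width; x++) {
                    int i = (y * width + x) * 3; // would need long math for huge images
                    buf.put(i,     (byte) (y % 256)); // R (placeholder pattern)
                    buf.put(i + 1, (byte) (x % 256)); // G
                    buf.put(i + 2, (byte) 128);       // B
                }
            }
        }
    }
}
```

The raw file could then be converted into printable PNG strips one chunk at a time, which dodges the "whole image in memory to compress it" problem.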
I do realize that this will need to be broken apart, and whatever solution will result in a ton of data... which is really the point of the project anyway. The mp3 part of this project would take several years to generate with my laptop, which is why I'm going to try the project using the supercomputer on campus. Hooray for artists at universities!
The question really is: are there any strategies for generating large images with Processing? My particular case is beyond the norm, but since the question has been asked before, an answer might help others tackling more manageable projects.
Are any of the following strategies of possible help in reducing a program's memory use while running (allowing larger images)?
1. The program I have written is fairly basic (mostly writing to and reading from ArrayLists). However, does this variable storage take a lot of RAM while the sketch is running? Could saving lots of png files, then loading them with a second program that doesn't do all the array work, be of any help?
2. I know that when working with video in Jitter, moving image processing to OpenGL and the GPU makes things run much more quickly. Would a similar move - loading smaller png files into an OpenGL context and then exporting - be useful? I could then use TileSaver to export.
3. Would writing directly to the pixels array be computationally cheaper than using the rect() function? And would defining colors once as variables, instead of computing them with equations inside a for loop, be better?
4. Any other thoughts on bringing memory use down? Perhaps a topic for another forum post.
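On point 1 above: the big cost of `ArrayList` in Java is boxing - each entry is a separate heap object plus a reference, versus 4 bytes per entry in a plain `int[]`. A small sketch of the swap (the sizes in the comments are rough estimates, not measurements):

```java
import java.util.ArrayList;
import java.util.List;

public class PrimitiveStorage {
    public static void main(String[] args) {
        int n = 1_000_000;

        // Boxed storage: each Integer is its own heap object (~16 bytes)
        // plus a 4-8 byte reference in the list -- roughly 20-24 MB here.
        List<Integer> boxed = new ArrayList<>(n);
        for (int i = 0; i < n; i++) boxed.add(i % 12);

        // Primitive storage: 4 bytes per entry -- about 4 MB for the same data.
        int[] primitive = new int[n];
        for (int i = 0; i < n; i++) primitive[i] = i % 12;

        System.out.println(boxed.get(999) + " " + primitive[999]); // prints "3 3"
    }
}
```

For note indices that only ever hold 0-11, a `byte[]` would shrink this by another factor of four.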
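On point 3 above: writing straight into a packed pixel buffer skips the per-call overhead of a drawing API, and for axis-aligned blocks of solid color it is about as cheap as it gets. A plain-Java illustration (in Processing the equivalent is `loadPixels()`, writing to `pixels[]`, then `updatePixels()`; the rectangle and color here are arbitrary):

```java
public class PixelRect {
    // Fill an axis-aligned rectangle directly in a packed ARGB pixel buffer.
    static void fillRect(int[] px, int imgW, int x0, int y0, int w, int h, int argb) {
        for (int y = y0; y < y0 + h; y++) {
            int row = y * imgW; // hoist the row offset out of the inner loop
            for (int x = x0; x < x0 + w; x++) px[row + x] = argb;
        }
    }

    public static void main(String[] args) {
        int w = 100, h = 100;
        int[] px = new int[w * h];
        fillRect(px, w, 10, 10, 30, 20, 0xFFFF0000); // opaque red block
        System.out.println(Integer.toHexString(px[15 * w + 15])); // prints "ffff0000"
    }
}
```

On the second half of the question: yes, computing a color once and reusing the variable beats re-evaluating the same expression on every loop iteration, though the pixel-write itself usually dominates.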
About the sound part that takes so long to generate as mp3s: you know, you are using Processing, so why not make use of it? You could easily build a tool that generates and plays the sound on demand whenever you open it, saves its position, and resumes from there next time. It might be only a few KB in size and do exactly the same thing.
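That on-demand idea only needs a way to map an index to a ringtone, so nothing has to be stored at all. If "every unique combination of the 12 notes" means the 12! = 479,001,600 orderings (which matches the roughly 500 million rows mentioned earlier), the k-th ordering can be computed directly via the factorial number system. A sketch, with note values and playback left out (`unrank` is a name chosen here, not an existing API):

```java
import java.util.ArrayList;
import java.util.List;

public class PermutationUnrank {
    // Return the k-th permutation (0-indexed, lexicographic order) of n items,
    // using the factorial number system. For n = 12 there are 12! = 479,001,600
    // permutations, so any ringtone (or image row) is reachable from its index.
    static int[] unrank(int n, long k) {
        List<Integer> pool = new ArrayList<>();
        for (int i = 0; i < n; i++) pool.add(i);
        long[] fact = new long[n];
        fact[0] = 1;
        for (int i = 1; i < n; i++) fact[i] = fact[i - 1] * i;
        int[] out = new int[n];
        for (int i = 0; i < n; i++) {
            long f = fact[n - 1 - i];
            out[i] = pool.remove((int) (k / f)); // pick the (k/f)-th remaining item
            k %= f;
        }
        return out;
    }

    public static void main(String[] args) {
        // First and last permutations of the 12 note indices:
        System.out.println(java.util.Arrays.toString(unrank(12, 0)));
        // prints [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11]
        System.out.println(java.util.Arrays.toString(unrank(12, 479001600L - 1)));
        // prints [11, 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]
    }
}
```

The same trick would let the image strips be regenerated from their row indices, so neither the 10TB of mp3s nor the terabytes of pixels ever need to exist at once.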