I am working on a huge data set and I want to visualize all of it, or at least be able to produce a PDF or JPG output.
How can that be done in p5.js?
Answers
That would be 10,000,000,000 pixels, and if each pixel requires 4 bytes (ARGB) then that is ~40 GB. This is too big for a single file such as a PDF or JPG, and it is too large to see on a visual display at 1:1 scale.
A solution would be an interface that lets you pan over and zoom into the data set programmatically, similar in idea to a Mandelbrot viewer.
Alternatively, create hundreds of smaller images that fit together to make up the whole.
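The ~40 GB arithmetic above, as a quick sanity check in plain JavaScript (the function name rawBytes is my own, not anything from p5.js):

```javascript
// Uncompressed size of a square image at 4 bytes per pixel (ARGB).
function rawBytes(side, bytesPerPixel = 4) {
  return side * side * bytesPerPixel;
}

const bytes = rawBytes(100000);   // a 100,000 x 100,000 pixel image
const gigabytes = bytes / 1e9;
console.log(gigabytes);           // 40 -> ~40 GB, far too big for one file
```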
you can create a 100,000-square pdf without creating an associated window (which itself might use 40GB of memory, and which you couldn't see more than a tiny fraction of anyway). pdfs are vector based, so a file with 100 dots in it will be roughly 1 percent the size of a file with 10,000 dots in it
the problem now is if your pdf viewer can handle it...
Mine sure can't! #BSODFTW
mine (mac) does. but if you replace the above code with something that draws 100,000 points then you can't find them no matter how you zoom and scroll. there's something there because the background isn't white, but...
thank you for your answer @koogs. do you have a javascript solution? or is it impossible because of browser limitations? what about a jpg (in the end I will print that image)?
@quark any ideas on how to do what you propose there?
The key question here is what interface you are producing your image for. For example, map interfaces like Google Maps have progressive tile loading and zooming mechanisms -- this is a common way of interacting with an extremely large image (like the surface of the planet). In general, the way you populate an interface like that is by writing tile images, so you would set up your visualizer accordingly: walk through the space in a grid and saveFrame() each tile.
http://unfoldingmaps.org/
https://forum.processing.org/two/discussion/20910/slippy-maps-on-p5-js
https://github.com/cvalenzuela/Mappa
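The grid walk described above, sketched as plain JavaScript (the generator name tileWalk and the tile-row-col.png naming scheme are my assumptions; in p5.js each tile would be drawn with the given translation applied and then saved with saveCanvas()):

```javascript
// Walk a large scene in tile-sized steps. Drawing the scene translated
// by (offsetX, offsetY) into a tile-sized canvas renders just that tile;
// in p5.js that would be translate(offsetX, offsetY), the scene's draw
// code, then saveCanvas(filename).
function* tileWalk(sceneW, sceneH, tileSize) {
  for (let row = 0; row * tileSize < sceneH; row++) {
    for (let col = 0; col * tileSize < sceneW; col++) {
      yield {
        offsetX: -col * tileSize,
        offsetY: -row * tileSize,
        filename: `tile-${row}-${col}.png`,
      };
    }
  }
}

// A 30,000 x 20,000 scene in 10,000-pixel tiles -> 3 x 2 = 6 tiles.
for (const tile of tileWalk(30000, 20000, 10000)) {
  console.log(tile.filename, tile.offsetX, tile.offsetY);
}
```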
For very large image file formats, a great classic is the pyramid TIFF / BigTIFF format. Like a zoomable tile map, it saves different resolutions in a pyramid of sub-images -- one image on top, many zoomed-in images below. If you are planning on printing a poster the size of a building this might make sense.
https://www.loc.gov/preservation/digital/formats/fdd/fdd000237.shtml
There are also other scientific image formats designed for astronomy / medicine... the question is, what do you want to do with your big picture?
You can try to save this data in p5.js and see if it crashes or if your browser can handle it. Or you can google it and save yourself the trouble of doing this. I am sure people have tried this before.
You need to clarify what you want to do with your data. A single file storing that many pixels? Will you try to print it? If so, then you do not need to store that many pixels. If you go with multiple images (tiles), then you save multiple images and load only the few tiles associated with your current view. Imagine Google Maps when you zoom into a city: you can pan left and right, up and down, and the browser will load the proper tiles for your current location, handling the zooming, rotation and translation. However, it is not clear from your post exactly what you want to do.
Kf
@jeremydouglass and @kfrajer Thank you for the inputs
I want to print this image at 2 meters x 1.5 meters @300dpi (or more if possible).
Just to be clear, I am not suggesting that the hundreds of pieces could be fitted together to create a single entity, e.g. a PNG or PDF. (Note that JPG uses a lossy compression algorithm, which means you are likely to lose pixel-level resolution.)
Rather, small rectangular areas of the domain are rendered and saved as individual images. Then create a webpage that loads and displays them as and when needed, e.g. as you pan or zoom.
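Which tiles such a page needs for the current view can be computed from the viewport rectangle. A minimal sketch, assuming a fixed tile size and scene coordinates (the function and field names are my own):

```javascript
// Given a viewport in scene coordinates and a fixed tile size,
// return the (col, row) indices of every tile the viewport touches.
function visibleTiles(view, tileSize) {
  const tiles = [];
  const colStart = Math.floor(view.x / tileSize);
  const colEnd = Math.floor((view.x + view.w - 1) / tileSize);
  const rowStart = Math.floor(view.y / tileSize);
  const rowEnd = Math.floor((view.y + view.h - 1) / tileSize);
  for (let row = rowStart; row <= rowEnd; row++) {
    for (let col = colStart; col <= colEnd; col++) {
      tiles.push({ col, row });
    }
  }
  return tiles;
}

// A 1500 x 1000 window at (9500, 400) over 10,000-pixel tiles straddles
// two columns, so only two tiles need to be loaded and drawn.
console.log(visibleTiles({ x: 9500, y: 400, w: 1500, h: 1000 }, 10000));
```

Panning or zooming just changes the viewport rectangle; tiles that drop out of the result can be unloaded to keep memory flat.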
if it's destined for print, why are you using javascript (which is traditionally more web-centric)?
if it has to be javascript does it have to be p5.js? there are probably node libraries for this. (pdfmake, pdfkit, google tells me)
(btw, my example was 10 times bigger than it should've been due to a typo... fixed now)
linux is struggling with the file. slow...
@nippotam
With requirements like "or more if possible"...
...I strongly recommend that you work backwards, not forwards. Have you printed at that size before? Who is your target print provider / what is your target printing device, and what file formats do they / does it accept? You need to know that before deciding what your sketch needs to produce and how.
Note that 2 meters * 39.3701 inches/meter * 300dpi = 23622 pixels wide. So a 2x1 meter image is 23622 x 11811, or ~279 million pixels (278,999,442) -- ~532MB of raw pixel data if you use a 16-bit color depth, or ~1GB at 32-bit color. That can be hard on desktop software, but is not unthinkably large -- you just need to be prepared for things to fail, and you ideally want to work with printers who have done this before -- or else be prepared to embark on a learning experience (which can sometimes be expensive if you have 1-meter test runs failing). If you are doing your own testing on a banner printer, I suggest printing ~2in test strips first.
P.S. for napkin math on what to expect when assembling very large image files, you can use a file-size calculator, e.g. http://jan.ucc.nau.edu/lrm22/pixels2bytes/calculator.htm -- although your mileage may vary significantly on compressed formats depending on the nature of your image data.
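That napkin math can also be done with a small helper (printPixels is my own name; 39.3701 inches per meter and 16-bit pixels as in the estimate above):

```javascript
const INCHES_PER_METER = 39.3701;

// Pixel dimensions and uncompressed size for a print of the given
// physical size (meters) and resolution (dpi).
function printPixels(widthM, heightM, dpi, bytesPerPixel = 2) {
  const w = Math.round(widthM * INCHES_PER_METER * dpi);
  const h = Math.round(heightM * INCHES_PER_METER * dpi);
  return { w, h, pixels: w * h, bytes: w * h * bytesPerPixel };
}

const p = printPixels(2, 1, 300);  // 2m x 1m at 300dpi, 16-bit color
console.log(p.w, p.h, p.pixels);   // 23622 11811 278999442
console.log((p.bytes / 2 ** 20).toFixed(0) + ' MiB'); // 532 MiB
```

Dropping to 150dpi with the same helper gives 11811 x 5906, roughly a quarter of the pixels.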
there is a tutorial on producing output for print here:
https://processing.org/tutorials/print/
jeremy raises a good point about the actual number of pixels you'll need, and those numbers are a lot more reasonable than the ones in the subject line
(and i think you can get away with less than 300 dpi for these things)
This is true.
@ 150dpi:
2m x 1m = 11811 * 5906 = 69,755,766 pixels
So this is really getting back into the realm of "just a large file."
For reference, if you have an industrial-strength image editing machine with lots of RAM, I believe Adobe Photoshop's max canvas size is 300,000 x 300,000, and Photoshop Elements' is 30,000 x 30,000. By comparison, I believe GIMP has a default max size of 262,144 x 262,144.
So at either 300dpi or 150dpi you are under any of these sizes by quite a lot, even if using Photoshop Elements. RAM is probably your main concern.
Of course, if you are thinking about the expected RAM limits for Processing or a browser (e.g. Chrome), this is a whole different issue. Again, tiles might be an extremely important approach -- a default-configured Java or JavaScript runtime could easily run out of RAM while trying to manipulate uncompressed pixel buffers of that size in a single operation.
(he said 1.5m)
Ah! Multiply all pixel counts and megabyte counts above by 1.5 for updated estimates. Longest edge still under 30,000.
Thank you all! I asked the printer I have in mind. He told me the same thing: 150dpi is OK, at least a 200MB file. I will write two different programs, one for print and one for screen interaction. Thank you for your quick answers.