big files
big files
Mar 23rd, 2006, 8:33am
 
I need to load in very large images, and it seems the only way to do this will be as a byte[] array.

Problem is: the data will be in its compressed format.

Any ideas? Obviously a PImage container can't hold it!
Re: big files
Reply #1 - Mar 23rd, 2006, 12:58pm
 
I'm rephrasing this to fit my current thoughts:

Based on an image's dimensions, say 5000 x 5000, is it possible to load only a section and display it on the screen?

I'm fairly certain that's impossible with a compressed image format, so what would be required is a converter to a pixel[] stream, which could then be accessed through a file I/O stream or something.

This would obviously involve parsers depending on the original compression type (.gif, .jpeg, etc.) before writing the pixel[] data as a raw file.

I'm guessing that the whole file will need to be held as a byte[] at some stage.
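Roughly the conversion step I have in mind (plain Java; "huge.jpg" and "huge.raw" are placeholder names, and note that ImageIO.read() still decodes the whole image into memory once, which is that byte[] stage):

Code:
import javax.imageio.ImageIO;
import java.awt.image.BufferedImage;
import java.io.*;

// one-off converter: decode a compressed image and dump raw ARGB ints,
// so a viewer can later seek to any pixel without decompressing anything
public class ToRaw {
  public static void main(String[] args) throws IOException {
    BufferedImage img = ImageIO.read(new File("huge.jpg"));
    DataOutputStream out = new DataOutputStream(
        new BufferedOutputStream(new FileOutputStream("huge.raw")));
    out.writeInt(img.getWidth());    // 8-byte header: width, height
    out.writeInt(img.getHeight());
    for (int y = 0; y < img.getHeight(); y++) {
      for (int x = 0; x < img.getWidth(); x++) {
        out.writeInt(img.getRGB(x, y));  // one ARGB int per pixel
      }
    }
    out.close();
  }
}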

Any pointers would be appreciated.

Re: big files
Reply #2 - Mar 23rd, 2006, 2:43pm
 
sorta depends on what you're trying to do. if you just want to grab a huge jpeg and render it to the screen and are using the default JAVA2D library, then you might try using Java's Applet.getImage() and then grabbing the Graphics2D object from the renderer and drawing it yourself using a java2d call instead of using p5 (getting the g2d is covered in the faq).
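something along these lines, roughly (untested sketch; assumes the default JAVA2D renderer, whose PGraphicsJava2D object exposes its Graphics2D as g2; Toolkit.getImage() stands in for Applet.getImage() so it also runs as an application; "data/huge.jpg" and the 5000x5000 size are placeholders):

Code:
java.awt.Image big;

void setup() {
  size(800, 600);
  // Toolkit decodes lazily and keeps the image out of pixels[]
  big = java.awt.Toolkit.getDefaultToolkit().getImage("data/huge.jpg");
}

void draw() {
  background(0);
  // the g2d trick from the faq: reach into the JAVA2D renderer
  java.awt.Graphics2D g2 = ((PGraphicsJava2D) g).g2;
  // pan with the mouse; 'this' is the ImageObserver
  int px = mouseX * (5000 - width) / width;
  int py = mouseY * (5000 - height) / height;
  g2.drawImage(big, -px, -py, this);
}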

if you're using a different renderer, it gets a little more complicated. it also depends on whether you're trying to quickly scroll around to different sections, and whether you have control over the source files. for instance, if you have control of how the files are generated, then you could do something like storing a series of files that contain the image broken up (or at multiple zoom levels).. think google maps as far as that sort of implementation works.
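if you do control the source files, the splitting itself can be a one-off sketch, something like this (tile size and file names are made up; it needs enough heap to hold the full image once):

Code:
// chop source.png into 500x500 tiles named tile_<col>_<row>.png
PImage src = loadImage("source.png");
int TILE = 500;
for (int r = 0; r * TILE < src.height; r++) {
  for (int c = 0; c * TILE < src.width; c++) {
    // get() crops a region; save() writes it into the sketch folder
    PImage t = src.get(c * TILE, r * TILE, TILE, TILE);
    t.save("tile_" + c + "_" + r + ".png");
  }
}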

generally if you're trying to move through a lot of data, there's a tradeoff that happens between the amount of time it takes to read uncompressed data from the file versus the amount of time to grab a small chunk and decompress it. in general, compression means you can't randomly jump through the file.. you could use a RandomAccessFile to allow yourself to read the raw bytes from different locations in the file as needed, but the image would need to be uncompressed (something simple like tga, for instance).
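for example, against a raw format like the one sketched earlier in the thread (8-byte width/height header, then one ARGB int per pixel, row by row), reading a window could look like this (untested; the reads are unbuffered, so it's simple but slow):

Code:
import java.io.RandomAccessFile;
import java.io.IOException;

// pull a w x h window at (px, py) out of the raw file
// without touching the rest of it
PImage readWindow(String path, int px, int py, int w, int h) throws IOException {
  RandomAccessFile raf = new RandomAccessFile(path, "r");
  int imgW = raf.readInt();  // header: image width...
  int imgH = raf.readInt();  // ...and height (unused here)
  PImage win = createImage(w, h, ARGB);
  win.loadPixels();
  for (int y = 0; y < h; y++) {
    // jump straight to the first pixel of this row:
    // skip the header plus 4 bytes per preceding pixel
    raf.seek(8L + 4L * ((long) (py + y) * imgW + px));
    for (int x = 0; x < w; x++) {
      win.pixels[y * w + x] = raf.readInt();
    }
  }
  win.updatePixels();
  raf.close();
  return win;
}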

soo.. there are lots of options, just depends on what you're trying to do, and what your constraints are.
Re: big files
Reply #3 - Mar 23rd, 2006, 5:08pm
 
Thanks Ben!

Found that a Gimp script would suffice for splitting my image up, as I didn't require any additional processing beyond that.

Once they're in this format I can start loading them into Processing without any problems.
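For example, a minimal viewer over the tiles can lazy-load them something like this (assuming a 10x10 grid of 500x500 tiles named tile_<col>_<row>.png, matching the splitter sketched above; a real version would also unload tiles that scroll off screen, since this cache only ever grows):

Code:
int TILE = 500;
int COLS = 10, ROWS = 10;
PImage[][] cache = new PImage[COLS][ROWS];

void setup() {
  size(800, 600);
}

void draw() {
  background(0);
  // pan with the mouse
  int px = mouseX * (COLS * TILE - width) / width;
  int py = mouseY * (ROWS * TILE - height) / height;
  // only touch the tiles the window actually overlaps
  for (int c = px / TILE; c <= (px + width - 1) / TILE && c < COLS; c++) {
    for (int r = py / TILE; r <= (py + height - 1) / TILE && r < ROWS; r++) {
      if (cache[c][r] == null) {
        cache[c][r] = loadImage("tile_" + c + "_" + r + ".png");
      }
      image(cache[c][r], c * TILE - px, r * TILE - py);
    }
  }
}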