
Christian Henning wrote:
Also, how would a C++ wrapper help you with large images?
My point is that your gil interface doesn't help me with large images in limited RAM, and your current design is "all or nothing". Decomposing it into wrappers around the libraries that are independently useful would mean that I could use them directly and get the benefit of e.g. the error handling stuff, with my own row-at-a-time code on top.
Imagine for now libjpeg doesn't allow for partial image decoding. How would you solve that problem?
I'm not sure what you mean by "that problem" in that question. Do you just mean "processing large images in limited RAM"? If you do, the answer is that I would just process the image sequentially, e.g. row-at-a-time. For operations like scaling and tiling this is straightforward.
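To make that concrete, here is a rough, untested sketch of row-at-a-time decoding using libjpeg's ordinary API; the commented-out process_row() call is a hypothetical placeholder for whatever per-row work (scaling, tiling, output) the application does, and proper setjmp-based error handling is omitted for brevity:

    #include <stdio.h>
    #include <jpeglib.h>

    void decode_rows(FILE* infile)
    {
        jpeg_decompress_struct cinfo;
        jpeg_error_mgr jerr;
        cinfo.err = jpeg_std_error(&jerr);   // default error handler
        jpeg_create_decompress(&cinfo);
        jpeg_stdio_src(&cinfo, infile);
        jpeg_read_header(&cinfo, TRUE);
        jpeg_start_decompress(&cinfo);

        // One scanline buffer, allocated from libjpeg's per-image pool.
        int row_stride = cinfo.output_width * cinfo.output_components;
        JSAMPARRAY buffer = (*cinfo.mem->alloc_sarray)
            ((j_common_ptr)&cinfo, JPOOL_IMAGE, row_stride, 1);

        while (cinfo.output_scanline < cinfo.output_height) {
            jpeg_read_scanlines(&cinfo, buffer, 1);
            // process_row(buffer[0], row_stride);  // hypothetical per-row hook
        }

        jpeg_finish_decompress(&cinfo);
        jpeg_destroy_decompress(&cinfo);
    }

Only one scanline ever lives in memory at a time, which is all a scale-down or tile-split pass needs.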
My solution might be to create a virtual image container which holds only part of the image in memory and keeps the rest on the hard drive. While reading a large image, this container would handle the limited-memory situation on the fly.
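Roughly what I have in mind (just a sketch, not a finished design; the class and member names are made up for illustration) is a container that keeps a window of recently read rows in RAM and spills older rows to a temporary file:

    #include <cstdio>
    #include <cstddef>
    #include <vector>

    // Hypothetical sketch: at most 'window_rows' rows stay in memory;
    // evicted rows are appended to a spill file (read-back omitted).
    class paged_image
    {
    public:
        paged_image(std::size_t row_bytes, std::size_t window_rows)
            : row_bytes_(row_bytes), window_rows_(window_rows),
              spill_(std::tmpfile()) {}

        ~paged_image() { if (spill_) std::fclose(spill_); }

        void append_row(const unsigned char* row)
        {
            window_.insert(window_.end(), row, row + row_bytes_);
            if (window_.size() > window_rows_ * row_bytes_) {
                // Evict the oldest row from RAM onto disk.
                std::fwrite(&window_[0], 1, row_bytes_, spill_);
                window_.erase(window_.begin(), window_.begin() + row_bytes_);
            }
        }

    private:
        std::size_t row_bytes_;
        std::size_t window_rows_;
        std::FILE*  spill_;
        std::vector<unsigned char> window_;
    };

A deque or a fixed ring buffer would avoid the cost of erasing from the front of the vector; this is kept deliberately simple.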
Well my digital picture frame doesn't have a hard drive. You seem to be over-thinking this a bit!
I have been glancing over the libjpeg documentation. One of the advanced features is called "Buffered-image mode". It goes like this:
"In buffered-image mode, the library stores the partially decoded image in a coefficient buffer, from which it can be read out as many times as desired. This mode is typically used for incremental display of progressive JPEG files, but it can be used with any JPEG file. Each scan of a progressive JPEG file adds more data (more detail) to the buffered image. The application can display in lockstep with the source file (one display pass per input scan), or it can allow input processing to outrun display processing. By making input and display processing run independently, it is possible for the application to adapt progressive display to a wide range of data transmission rates."
Do you think this feature can be used to read out sub-images?
Perhaps. Phil.