
Phil Endecott wrote:
Christian Henning wrote:
Also, how would a C++ wrapper help you with large images?
My point is that your GIL interface doesn't help me with large images in limited RAM, and your current design is "all or nothing". Decomposing it into wrappers around the libraries that are independently useful would mean that I could use them directly and get the benefit of e.g. the error handling stuff, with my own row-at-a-time code on top.
Imagine for now libjpeg doesn't allow for partial image decoding. How would you solve that problem?
I'm not sure what you mean by "that problem" in that question. Do you just mean "processing large images in limited RAM"? If you do, the answer is that I would just process the image sequentially, e.g. row-at-a-time. For operations like scaling and tiling this is straightforward.
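For concreteness, the row-at-a-time approach might be sketched like this. This is an illustrative example, not GIL code: `read_row` and `write_row` are placeholder callbacks standing in for whatever the decoder and sink actually provide (e.g. one scanline per call from libjpeg's `jpeg_read_scanlines`). It vertically downscales a grayscale image by an integer factor while holding only one input row plus one accumulator row in memory:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Downscale vertically by an integer factor, one scanline at a time.
// Memory use is O(width), independent of image height.
template <typename ReadRow, typename WriteRow>
void downscale_rows(std::size_t width, std::size_t height,
                    std::size_t factor, ReadRow read_row, WriteRow write_row)
{
    std::vector<unsigned> acc(width, 0);       // running per-column sums
    std::vector<unsigned char> row(width);
    std::size_t rows_in_acc = 0;
    for (std::size_t y = 0; y < height; ++y) {
        read_row(row.data());                  // fetch exactly one scanline
        for (std::size_t x = 0; x < width; ++x)
            acc[x] += row[x];
        if (++rows_in_acc == factor) {         // emit one averaged output row
            for (std::size_t x = 0; x < width; ++x)
                row[x] = static_cast<unsigned char>(acc[x] / factor);
            write_row(row.data());
            std::fill(acc.begin(), acc.end(), 0u);
            rows_in_acc = 0;
        }
    }
}
```

Horizontal scaling can be folded into the same loop; the point is simply that nothing here ever needs the whole image in RAM.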
That seems aligned with what I was trying to suggest, too.
My solution might be to create a virtual image container which keeps only part of the image in memory and the rest on the hard drive. While reading a large image, this container would then handle the limited-memory situation on the fly.
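A minimal sketch of such a container could look like the following. Everything here is hypothetical (the class name, the backing file path, the byte-per-pixel layout are all illustrative assumptions, not anything in GIL): rows live in a backing file and only one row at a time is materialised in RAM.

```cpp
#include <cstddef>
#include <fstream>
#include <string>
#include <vector>

// Hypothetical disk-backed row store: the image body stays in a file,
// and read_row/write_row move one row at a time through memory.
class disk_backed_rows {
public:
    disk_backed_rows(const std::string& path, std::size_t width)
        : width_(width),
          file_(path, std::ios::in | std::ios::out |
                      std::ios::binary | std::ios::trunc) {}

    void write_row(std::size_t y, const unsigned char* row) {
        file_.seekp(static_cast<std::streamoff>(y * width_));
        file_.write(reinterpret_cast<const char*>(row),
                    static_cast<std::streamsize>(width_));
    }

    std::vector<unsigned char> read_row(std::size_t y) {
        std::vector<unsigned char> row(width_);
        file_.seekg(static_cast<std::streamoff>(y * width_));
        file_.read(reinterpret_cast<char*>(row.data()),
                   static_cast<std::streamsize>(width_));
        return row;
    }

private:
    std::size_t width_;   // row size in bytes (1 byte/pixel assumed here)
    std::fstream file_;
};
```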
Well my digital picture frame doesn't have a hard drive. You seem to be over-thinking this a bit!
Perhaps. Generally, I mean it's not a bad idea to manage a cache that could be used by the underlying I/O machinery. This cache would store the most recently used blocks, so they are not re-read from disk when re-accessed. It feels to me like an application-specific feature, but on the other hand it could be implemented better if kept as close as possible to the low-level I/O.

Best regards,
--
Mateusz Loskot
http://mateusz.loskot.net
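The block cache described above amounts to a small LRU cache keyed by block id. Here is one possible sketch, with the `loader` callback standing in for the low-level I/O layer (the names and structure are illustrative assumptions, not an existing API):

```cpp
#include <cstddef>
#include <functional>
#include <list>
#include <unordered_map>
#include <vector>

// LRU cache of decoded blocks: a re-accessed block is served from
// memory instead of being re-read from disk.
class block_cache {
public:
    using block  = std::vector<unsigned char>;
    using loader = std::function<block(std::size_t)>;

    block_cache(std::size_t capacity, loader load)
        : capacity_(capacity), load_(std::move(load)) {}

    const block& get(std::size_t id) {
        auto it = index_.find(id);
        if (it != index_.end()) {                // hit: move to front
            lru_.splice(lru_.begin(), lru_, it->second);
            return it->second->second;
        }
        if (lru_.size() == capacity_) {          // evict least recently used
            index_.erase(lru_.back().first);
            lru_.pop_back();
        }
        lru_.emplace_front(id, load_(id));       // miss: go to low-level I/O
        index_[id] = lru_.begin();
        return lru_.front().second;
    }

private:
    std::size_t capacity_;
    loader load_;
    std::list<std::pair<std::size_t, block>> lru_;           // front = newest
    std::unordered_map<std::size_t,
        std::list<std::pair<std::size_t, block>>::iterator> index_;
};
```

Sitting right above the low-level reader, such a cache would be transparent to the application, which addresses the "application-specific feature" concern.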