
So you are going to do JPEG. Wow, that's a tough spec. What's the goal: complete coverage, or just the important bits? For JPEGs I use IPP, which unfortunately is not open source. There are lots of issues to consider. Are you going to do JPEG 2000? On Thu, Mar 25, 2010 at 4:11 PM, Phil Endecott <spam_from_boost_dev@chezphil.org> wrote:
Tom Brinkman wrote:
On Linux, where I develop, we use PNG. As far as I know, the following library is pretty much the standard way to interface with PNGs.
http://www.karlings.com/~danne/pnglite/
It was written by the original PNG developers, but has a simpler interface. It should be adequate for your purposes. Just grab what you can.
"Should be adequate for your purposes" - I'm not sure what you mean Tom. We're discussing how to process large JPEGs in limited RAM, and how to deal with the nasty error-reporting mechanism that libjpeg has. I don't see how pnglite is going to help with that.
Anyway, having looked at it, it seems to decode the whole image in one call into a contiguous memory region. So if I were trying to decode PNGs, it would require enough RAM to hold the whole decoded image, which is exactly what I don't have. It also doesn't handle indexed images. It does have sane error reporting, though.
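For comparison, the pnglite usage being described is roughly the following. This is a sketch from memory, so the exact function and field names (png_init, png_open_file, png_get_data, png_close_file, the width/height/bpp fields, PNG_NO_ERROR) should be checked against pnglite.h, and the file name is made up:

#include <stdlib.h>
#include "pnglite.h"

int load_whole_png(const char *filename)
{
    png_t png;
    unsigned char *pixels;

    png_init(0, 0);                           /* default allocator */
    if (png_open_file(&png, filename) != PNG_NO_ERROR)
        return -1;                            /* plain error codes, no longjmp */

    /* png_get_data() decodes the entire image into one contiguous
     * buffer, so you must be able to afford width * height * bpp bytes
     * up front - exactly the constraint under discussion. */
    pixels = malloc(png.width * png.height * png.bpp);
    if (!pixels || png_get_data(&png, pixels) != PNG_NO_ERROR) {
        free(pixels);
        png_close_file(&png);
        return -1;
    }

    /* ... use pixels ... */
    free(pixels);
    png_close_file(&png);
    return 0;
}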
Regards, Phil.