
On 09/12/10 12:09, Domagoj Saric wrote:
> "Phil Endecott" <spam_from_boost_dev@chezphil.org> wrote in message
>> If you're interested in experimenting, I suggest making random input
>> tiles or just replicating them.
> Luckily I found this
> http://www.unearthedoutdoors.net/global_data/true_marble/download :)
> If you want, I can take the largest TIFF there and chop it up into
> 256x256 PNGs and measure the RAM and CPU time usage (or do some other
> test if you can define it clearly)...
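
For the chopping step, something along these lines with the GDAL Python
bindings should do the job. This is only a rough sketch: "input.tif" is
a placeholder for the real file, and I assume a Byte-typed source.

  from osgeo import gdal

  src = gdal.Open("input.tif")  # placeholder filename
  mem = gdal.GetDriverByName("MEM")
  png = gdal.GetDriverByName("PNG")
  tile = 256

  for y in range(0, src.RasterYSize, tile):
      for x in range(0, src.RasterXSize, tile):
          w = min(tile, src.RasterXSize - x)
          h = min(tile, src.RasterYSize - y)
          # The PNG driver is CreateCopy-only, so stage each window
          # in an in-memory dataset, then copy it out as a PNG tile.
          win = mem.Create("", w, h, src.RasterCount, gdal.GDT_Byte)
          win.WriteRaster(0, 0, w, h, src.ReadRaster(x, y, w, h))
          png.CreateCopy("tile_%d_%d.png" % (x, y), win)

Only one 256x256 window is staged in memory at a time, so the chopping
itself stays cheap.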
I'd be interested in making similar tests myself. I have access to
georeferenced raster datasets as large as 90K x 45K pixels. I also have
extensive experience with the GDAL (http://www.gdal.org) library, which
I have used to process such datasets with success.

However, I'm still missing a specification of the operations to be
performed. Can we describe Phil's use case in the form of reproducible
steps, so we can program it using various toolkits/libraries?

As the datasets I have are not public data, I could use the True Marble
imagery so the results are comparable.

What do you think?

Best regards,
--
Mateusz Loskot, http://mateusz.loskot.net
Charter Member of OSGeo, http://osgeo.org
Member of ACCU, http://accu.org
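
P.S. A rough harness like the one below could make the timing and
memory numbers comparable across toolkits. It is only a sketch:
measure() and step() are illustrative names, and ru_maxrss is reported
in kilobytes on Linux.

  import resource
  import time

  def measure(step, *args):
      # Run one processing step, then report wall time and peak RSS.
      t0 = time.time()
      step(*args)
      wall = time.time() - t0
      peak_kb = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
      print("wall time: %.2f s, peak RSS: %d kB" % (wall, peak_kb))

Measuring the whole process externally with GNU time
(/usr/bin/time -v) would work just as well.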