
I've been talking about the ct/rapidmind stuff, though my discussion has been about the general idea rather than the specifics of what they may have implemented. I don't know exactly what is in there, and I haven't tried it myself yet.
I am sure both the JIT approach and the static metaprogramming approach have their places. My issue with ct/rapidmind/ArBB is the unnaturalness it introduces for the programmer: it seems the programmer has to be aware of "retained" mode vs. immediate mode while coding his algorithms. For some of these problems, I believe static metaprogramming (a la NT2) can be more natural for a C++ programmer.
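To illustrate what I mean by "more natural", here is a minimal expression-template sketch in the spirit of NT2 (the types below are my own toy invention, not NT2's actual API). The user writes ordinary-looking arithmetic; the fused loop is generated entirely at compile time, with no immediate/retained distinction to keep in mind:

    #include <cstddef>
    #include <vector>

    // CRTP base: anything deriving from Expr<E> is a vector expression.
    template <class E>
    struct Expr {
        const E& self() const { return static_cast<const E&>(*this); }
    };

    // Node encoding "l + r" in its type; evaluated lazily, element by element.
    template <class L, class R>
    struct Add : Expr<Add<L, R>> {
        const L& l; const R& r;
        Add(const L& l_, const R& r_) : l(l_), r(r_) {}
        double operator[](std::size_t i) const { return l[i] + r[i]; }
    };

    // Node encoding "l * r" (element-wise).
    template <class L, class R>
    struct Mul : Expr<Mul<L, R>> {
        const L& l; const R& r;
        Mul(const L& l_, const R& r_) : l(l_), r(r_) {}
        double operator[](std::size_t i) const { return l[i] * r[i]; }
    };

    struct Vec : Expr<Vec> {
        std::vector<double> data;
        explicit Vec(std::size_t n) : data(n) {}
        double operator[](std::size_t i) const { return data[i]; }

        // Assignment walks the expression tree once: a single fused loop,
        // no temporaries. The "kernel" was generated at compile time.
        template <class E>
        Vec& operator=(const Expr<E>& e) {
            const E& x = e.self();
            for (std::size_t i = 0; i < data.size(); ++i) data[i] = x[i];
            return *this;
        }
    };

    template <class L, class R>
    Add<L, R> operator+(const Expr<L>& l, const Expr<R>& r) { return Add<L, R>(l.self(), r.self()); }

    template <class L, class R>
    Mul<L, R> operator*(const Expr<L>& l, const Expr<R>& r) { return Mul<L, R>(l.self(), r.self()); }

    int main() {
        Vec a(1000), b(1000), c(1000);
        a = b + c * b;   // reads like immediate mode; compiles to one fused loop
        return 0;
    }

The point is that the programmer never leaves ordinary C++: the expression's structure is captured in its type, and the code generation happens in the compiler, not at runtime.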
If the dynamically generated code makes use of new hardware features not available at the time the original code was compiled....
Can't beat JIT in the above scenario. But that scenario is arguably fictitious: HPC people will definitely recompile their code on newer platforms.
If the compile time is negligible compared to the runtime of the resulting code...
Vectorizing is no easy task. State-of-the-art algorithms are O(N^2) in the size of a basic block, so code generation has non-negligible overhead. Unless the programmers using JIT are aware of this (and use "retained" mode to pre-compile functions once and call them multiple times, as in the sketch below), they can easily wipe out the benefits.
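To make the retained-mode point concrete, here is a toy sketch (the names are mine, purely illustrative, not the actual ArBB/ct API). "Compilation" is faked with a closure, but the structure is the point: hoist the expensive code-generation step out of the hot loop so its cost is amortized over many calls.

    #include <cstddef>
    #include <functional>
    #include <vector>

    // Stand-in for a JIT-compiled kernel: cheap to call once built.
    typedef std::function<void(const double*, double*, std::size_t)> Kernel;

    // Imagine real code generation happening here (vectorization, register
    // allocation, the O(N^2) basic-block analysis). It should run once per
    // function, not once per call.
    Kernel compile_axpy(double a) {
        return [a](const double* x, double* y, std::size_t n) {
            for (std::size_t i = 0; i < n; ++i) y[i] += a * x[i];
        };
    }

    int main() {
        std::vector<double> x(1 << 20, 1.0), y(1 << 20, 0.0);

        Kernel axpy = compile_axpy(2.0);          // "retained" mode: compile once

        for (int iter = 0; iter < 100; ++iter)    // ...call many times, amortizing
            axpy(x.data(), y.data(), x.size());   // the code-generation cost

        // Anti-pattern: putting compile_axpy(2.0) inside the loop would pay
        // the compile cost on every iteration and can wipe out the benefit.
        return 0;
    }

Manjunath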