
On 02/21/2012 11:09 AM, Artyom Beilis wrote:
When you include a header like <boost/predef.h>
you get 103 files included... 103 stats, fopens, fcloses, reads, parses and so on.
More than that, consider build systems that have to stat 103 files for each .cpp file compiled.
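
To get a rough feel for the filesystem part of that cost, here is a minimal sketch (assuming a POSIX system; the path below is only a placeholder, substitute the real header list, e.g. the one printed by g++ -H):

    // Minimal sketch, assuming a POSIX system: time 103 stat() calls, roughly
    // what a build system pays per translation unit just to check dependencies.
    // The path is a placeholder; substitute the actual headers pulled in.
    #include <sys/stat.h>
    #include <chrono>
    #include <cstdio>
    #include <string>
    #include <vector>

    int main()
    {
        std::vector<std::string> headers(103, "/usr/include/boost/predef.h"); // placeholder
        struct stat sb;
        auto start = std::chrono::steady_clock::now();
        for (const std::string& h : headers)
            stat(h.c_str(), &sb);
        auto stop = std::chrono::steady_clock::now();
        long long us = std::chrono::duration_cast<std::chrono::microseconds>(stop - start).count();
        std::printf("%zu stat() calls took %lld us\n", headers.size(), us);
    }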
One option I can think of is to provide a single concatenated header that is generated from the current modular headers. This, of course, is a bit more work on the library side, and one big concatenated header is considerably harder for users to understand.
But easier for the compiler... :-)
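
For what it is worth, such an amalgamation would not have to be maintained by hand. A purely hypothetical sketch of a generator (not part of Boost.Predef; it assumes a headers.txt file listing the modular headers in dependency order, one path per line):

    // Hypothetical sketch only: paste the modular headers together in order.
    #include <fstream>
    #include <string>

    int main()
    {
        std::ifstream list("headers.txt");       // assumed input: one header path per line
        std::ofstream out("predef_single.h");
        out << "// Auto-generated amalgamation - do not edit\n";
        std::string path;
        while (std::getline(list, path))
        {
            std::ifstream in(path.c_str());
            std::string line;
            while (std::getline(in, line))
            {
                // Drop the library's own #include lines; their contents are inlined.
                if (line.find("#include <boost/predef") == std::string::npos)
                    out << line << '\n';
            }
        }
    }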
I am strongly opposed to this! First of all, it is the job of the compiler to compile. In theory, it doesn't matter how many files it needs to open to get the job done. However, it matters for us humans. Is this just overly pessimistic, premature optimization?
Hence this is something that I would seriously consider adjusting if the number of headers becomes a real measurable problem. Which also suggests that I should add some header parsing performance tests.
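
Such a test could be as simple as timing repeated syntax-only compiler runs on a TU that includes nothing but the header. A rough sketch, assuming g++ is on the PATH and just_predef.cpp contains only the #include of boost/predef.h:

    // Rough sketch of a header parsing performance test: average wall time of
    // repeated syntax-only compiles of a TU that only includes the header.
    #include <chrono>
    #include <cstdio>
    #include <cstdlib>

    int main()
    {
        const int runs = 20;
        auto start = std::chrono::steady_clock::now();
        for (int i = 0; i < runs; ++i)
            std::system("g++ -fsyntax-only just_predef.cpp");
        auto stop = std::chrono::steady_clock::now();
        double ms = std::chrono::duration<double, std::milli>(stop - start).count() / runs;
        std::printf("average: %.1f ms per compile\n", ms);
    }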
The problem is not just header parsing performance. Every build system checks dependencies, which means it stat()s all of those files, and that makes the process slower.
So even if parsing and stat()ing takes only 0.1 s per translation unit, it becomes critical for a project with many files: at, say, 2000 translation units that is more than three minutes of overhead per full build.
What you are suggesting effectively sounds like throwing away all the structure and file organization that makes sense for us mere humans, just to compile a little faster. Please back it up with some proof.

Additionally, I would like to add that a fine-grained header organization also leads to fewer includes: one does not always need everything defined in the library in one TU. I claim that it is the big, monolithic headers that slow the process down. So, how big is the impact? Did you do some real measurements?
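
To illustrate the fine-grained usage (assuming the modular layout keeps per-category headers such as boost/predef/os.h), a TU that only needs OS detection can include just that header instead of all of boost/predef.h:

    // Sketch: pull in only the OS-detection part of the library.
    #include <boost/predef/os.h>

    const char* host_os()
    {
    #if BOOST_OS_LINUX
        return "linux";
    #elif BOOST_OS_WINDOWS
        return "windows";
    #else
        return "other";
    #endif
    }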