
On 2/20/2012 2:06 AM, Artyom Beilis wrote:
This is my review:
Thank you for taking the time to do a review. Much appreciated!
- What is your evaluation of the design?
I personally do not like it for two reasons:
When you include a header like <boost/predef.h> you get 103 files included... 103 stats, fopens, fcloses, reads, parses, and so on.
More than that, consider build systems that would have to stat 103 files for each cpp file compiled.
That thought did cross my mind when deciding on the header structure. But given how many compilers support some form of precompiled inclusion, I decided it was OK to go with the modular arrangement, especially for the much easier maintainability of the library. One option I can think of is to provide a single concatenated header generated from the current modular headers. That is, of course, a bit more work on the library side, and one big concatenated header file is considerably harder for users to understand. Hence this is something I would seriously consider adjusting if the number of headers becomes a real, measurable problem. Which also suggests that I should add some header-parsing performance tests.
I know that this library was created as "header only", but Boost has enough header-only configurations and not a single normal compile-time configuration.
I do mention in the future tasks for the library that having some form of pre-compile external configuration as a way to add configuration for external libraries would be a good idea. But the scope, and hence goals, of the current set of definitions is limited to preprocessor defined symbols.
I assume that a better configuration can be created using compilation rather than explicit define tests; it would also be much more extendable for future directions.
For the definitions this library currently provides, I can't see how a compiled test could do any better, since the compile test would do exactly what the library currently does: the definitions the library is based on are equivalent to what you could discover outside of the preprocessor.
I think it would be great if a single file like <boost/conf_predef.h> included all definitions. Something that could be generated at build time and would include all needed defines, making the include much faster and easier to read.
I disagree that this would make it easier to read. If one has to generate multiple "conf_predef.h" files, because one is dealing with multiple configurations, then it becomes less readable, as one is trying to keep in one's head, and manage with the build system, multiple versions of the same file. I've been there with auto-conf libraries, and written build system management for this, and it's a management pain to juggle the multi-conf design (from a user point of view).
Also, I'd merge multiple files like boost/predef/os/* into a single one to improve compilation times.
I mentioned above how I would consider doing that. Of course the drawback is that then one ends up with a large single file that one has to search around in to find what you may be looking for. Hence making documentation paramount :-)
Also, generating a compile-time header would allow extending the macros to new definitions that can't be provided today, for example:
- BOOST_POINTER_SIZE 4 or 8 (or even 2)
- BOOST_WCHAR_T_SIZE 2 or 4
There is a fine line between the current goals of this library and the goals of the Boost Config library. Currently this library is not about defining "features", as that is what Boost Config does. I'm not objecting to having the above definitions; I just need to point out that distinction.
And so on.
The "so on" would be subject to the goals ;-)
Basically two things I'd like to see improved:
1. Do not create 103 files, but several, maybe bigger, files.
If I'm going to do that, I'd create just one file. After all, if file system performance is the reason for doing it, then it makes more sense to eliminate as much of it as possible.
2. Add compile time configuration generation for stuff that can't be generated using headers only.
Yes, as mentioned in the "todo" section.
- What is your evaluation of the implementation?
I think some stuff can be more fine grained.
--------------------------------------------
For example, the OS version for Windows... There are macros like _WIN32_WINNT (AFAIR) that give you quite a nice OS version.
It certainly would be nice to deal with all the complexity of the Windows version definitions. I'll add that to some future release.
Another thing is, for example, the libstdc++ version. I'd rather expect to have, for example, an SO version like 6.x, similar to libstdc++.so.6.x. It can be configured by mapping year -> version.
I shied away from doing that as it then becomes a continual maintenance task. It especially means that users would be put in a situation where the predef library lags behind in having mapped version definitions. Of course, ideally libstdc++ would supply such a mapped version number directly and I could use that. But I failed to find such a definition.
Or use the compiler version - the library is bound to a specific compiler.
That is not sufficient. You can have a compiler that can target multiple configurations, and hence this version may not have any relevance to the library one is actually compiling against. Unless you meant something else?
It is not really intuitive, because I do not remember in what year gcc-4.5 was released, but I do know when, for example, unique_ptr was introduced.
Actually, you'd have to look up more than just the year. You would need the exact date to be sure you have the correct condition, as any particular feature may have been added late in a year with multiple releases, as has been the case for all gcc releases <http://gcc.gnu.org/releases.html>.
In this case I think it can be derived from the gcc version and the fact that the libstdc++ library is used.
Other things, like for example LANGUAGE_STDCPP... We already have __cplusplus, which does almost exactly the same as this macro. Why do we need it, and what added value does it give us?
I think the same can be argued for all the definitions this library provides. It gives us the ability to check for the version, or just the presence, with a *single* *consistent* *readable* interface.
Architecture
------------
I noticed several problems:
1. ARM has several architectures; ARM and ARMEL should be distinguished.
Sorry if I don't know... But what exactly is ARMEL in relation to ARM? In particular, how does it relate to the various ARM chips produced <http://en.wikipedia.org/wiki/ARM_architecture>? And how would I detect ARMEL?
2. There are MIPS and MIPSEL - little-endian MIPS (Debian has both, for example).
Same question as above <http://en.wikipedia.org/wiki/MIPS_architecture>. Hm... Or is "EL" a tag specifying the target endianness? Because if it is, adding an orthogonal endian definition is something I'd like to do in the near future. Although, like the "word" size, it is not something as easily done as just enumerating architectures.
I'd suggest checking all architectures carefully.
That's always the case for all the predefs ;-)
Compiler:
---------
There is no "Cygwin" compiler or "MinGW" compiler; it is GCC. This should be fixed.
Right. My mistake there. Makes more sense to test for OS_CYGWIN and OS_WINDOWS.
OS:
---
- I'd rather like to see DARWIN than MACOS; MACOS is not good.
It should be MACOSCLASSIC and MACOS, or even better:
MACOS and DARWIN.
Mac OS 9 and Mac OS 10 share almost nothing; on the other hand, Darwin and Mac OS 10 are almost the same.
Well, given recent developments, it might make more sense to just use MACOS and OSX. But yes, I'll consider better naming for those. Although given that the current definitions only cover 9.0 vs 10.0, it might not yield much of a gain in clarity.
Testing...
---------
1. Unfortunately, I see none of this.
There is testing for at least the functionality of the version macro and the decomposition utility macros. So I object to the "none" characterization. Testing for this library was something I asked about previously on this list, and there were not any really attainable solutions. The suggestions boiled down to either simulating the target configuration or human verification. Unfortunately, the former is essentially a tautological test, as the test specifies both the definition and the check. Human verification has the benefit that it is distributed, and in this use case doesn't need to be repeated.
I understand that it is not simple, but
Indeed :-\
you may probably use some tools like `uname` and compare their output, and maybe use, for example, the bjam toolset and bjam os and compare them with the detected compiler.
No, I can't. Any external mechanism would almost certainly test the host configuration, not the target configuration. And in the commonplace cross-compilation world we now live in, this is not something that can be dismissed. Hence the only likely external test would involve running the compilers themselves, and that would require prior knowledge of what you are targeting. And, consequently, it would likely run into the tautological test problem.
From what I can see, all tests are manual. This is not good.
Not all tests.
2. I'd like to read how you tested this project... I don't see any of it in the documentation.
I could certainly add such documentation. Although it would be continually expanding.
Have you set up at least several VMs, for example using qemu, with different architectures? Have you tested this on several OSes?
The current testing only involved the automated testing I mention above for the version and decomposition macros. Plus testing in the platforms that I have immediate access to: Windows, OSX, Linux (OpenSUSE), and iOS.
- What is your evaluation of the documentation?
It needs to be improved. For example, what does the BOOST_LIBSTD_CXX version mean?
I do point to the library's web site <http://libcxx.llvm.org/>. Is that not enough to answer what it is? And the macro does say "If available, version number as major, minor, and patch". Although now that I reread the definition, it's actually only the version and patch that are available.
Is BOOST_OS_WINDOWS defined when Cygwin is used?
I'll add a note about that when I find out if gcc on cygwin defines the windows macro.
Only from looking at the source could I figure this out. I think each macro should document its exact meaning for users who are interested.
Hm, the exact meaning would boil down to including the source in the documentation. But it would be worth it to list which predefined macros each macro uses in its implementation. That should cover most cases without making the documentation just parrot the source.
1. Almost all tests do not even compile on Cygwin...
Could you elaborate? What cygwin version are you trying?
2. Under MinGW gcc-4.5.3 I see the following output from info_as_cpp.
For example:
** Detected ** .... BOOST_LIBSTD_GNU = 410400028 (41,4,28) | GNU ...
How should I treat it?
The docs say "Version number available as year (from 1970), month, and day." I.e., the above says libstdc++ was released on 2011/04/28.
- Do you think the library should be accepted as a Boost library?
Unfortunately, No.
Sorry to hear that..
But I strongly recommend improving the library and submitting it for further review.
Problems:
1. Too many macros are not as fine grained as they should be, or they generate unexpected or surprising definitions.
Do you have more examples than the ones you mentioned above?
2. Documentation must be improved because for now it is very unclear.
All documentation can always be improved ;-) Did my answers above address this for you?
3. Many problems with the definitions - e.g., there is no such thing as a "Cygwin" compiler - should be fixed.
4. 103 header files would slow down the build process.
5. No real test suite exists.
Is there more I can expand on to address those?
--
-- Grafik - Don't Assume Anything
-- Redshift Software, Inc. - http://redshift-software.com
-- rrivera/acm.org (msn) - grafik/redshift-software.com
-- 102708583/icq - grafikrobot/aim,yahoo,skype,efnet,gmail