
That's good news. I feared that you had been frightened off by the previous discussions.
Although it's good to have the code, and no doubt some people who can scan C++ faster than I can will really appreciate it, what I'd love to see is more in the way of rationale and concept documentation. For example:
- My recollection of the last part of the discussions the first time around is that they focused on the "nasty" way in which you made it possible to adapt a legacy struct to work with your library, and in particular how you added methods to the class by casting from a base class to a subclass. It would be great to see a write-up of the rationale for that compared with the alternatives. Perhaps this could just be distilled out of the previous discussions. My feeling is that it may come down to this: what you've done is the most pragmatic solution for your environment, but it isn't something that could ever make it into the C++ standard library (since it uses casts in a non-standards-compliant way). So, should Boost only accept libraries that could be acceptable for standard C++, or could Boost have a more liberal policy? Also, how much weight should be put on the "legacy" benefits of your approach? My feeling is that the standard library, and Boost, typically prefer to "do it right as if you could start all over again", rather than fitting in with legacy problems.
Even more so, as it is possible to write the algorithms in a way that makes them agnostic of the concrete point data type used (as long as adaptors are available that make this point type compatible with the expected point concept). Joel explicitly alluded to that in the first discussion, and I think there is no other way forward. Remember, Boost is a collection of libraries with a major emphasis on generic interfaces, which is clearly one reason for its acceptance by the community. And anything not conformant to the Standard shouldn't even be considered as a Boost library, IMHO.

Regards
Hartmut