On 7/26/2011 1:42 PM, Michael Powell wrote:
What you might be looking for here is a runtime unit system. I devised such a system around the boost quantity system that somewhat works. Unfortunately I can't share it.
This is basically where we're at. The claims of "zero overhead" are only partly true, as you and I are finding: to use the library in any useful context, plumbing has to be built up around it to reach the calculations and conversions.
The claim of zero overhead is only mostly true, but not because the library does runtime unit conversions. Two caveats:

1) There's a significant overhead at compile time. I believe that's a worthy sacrifice, but it's something that needs to be considered.

2) It depends on compiler optimizations that don't always happen. It *should* be zero overhead, because the function calls and such should be optimized away until the machine code is the same as if the type were a plain double, but I've noticed this doesn't always happen--and it obviously doesn't in debug builds.

Boost.Units never claims to be an answer for "userland" units.
Basically I have a unit type that does conversions and a quantity type that stores a value, and these types convert to/from the boost::units quantities of the same dimension before going into math functions. All I/O and user interaction happen in this wrapper, and all formulas are done in a single boost::units system. It wasn't too difficult (except for some unfortunate things I was forced into--such as treating 3 different kinds of flow as the same variable).
Also basically where we're at. I convinced my senior guy that we should defer conversions and get the calculations working first, on the assumption that we're all communicating in the same base units.
This should be natural. Build the models to work in some fundamental set of unit types that is standardized across the entire architecture. Then use unit conversions between the models and their views.