
-----Original Message----- From: boost-bounces@lists.boost.org [mailto:boost-bounces@lists.boost.org] On Behalf Of Chris Uzdavinis
Which is fairly obvious in nearly all cases. For example:
MACRO()
int f(...) { ... }
To humans, yes. To editors, well, that's a hard problem.
Not if editors can see through macro expansions, but I was referring to re-indenting by hand.
It is a pain to keep having to go back through the code and re-hand-indent certain lines after auto-indenting a whole file or a region of it.
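For a concrete sketch of what goes wrong (MACRO here is purely hypothetical; assume its expansion is already a complete declaration):

#define MACRO() template<class T> struct tag;  // hypothetical; expansion self-terminates

MACRO()
int f(int x) { return x; }  // an auto-indenter that treats MACRO() as an
                            // unterminated statement will indent this line
                            // as a continuation of it

Those are the lines that end up needing to be re-indented by hand afterwards.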
Why would you need to auto-indent a whole file?
A whole file "or region of it". I often find myself performing work on a range of lines. Sometimes I rename variables, they get too long, and I have to split the lines. This can be automated via editor macros. After a bunch of such changes, I re-indent the buffer to clean it up, strip out trailing whitespace, and so on.
I don't think that is an onerous task to split a line that is too long manually. Also, in addition to reading code, one of the main purposes of indentation is to keep track of where you are when writing code. If you format code as you go along, there is no need to re-indent the buffer. Likewise, removing trailing whitespace can be done without any formatting changes.
Sometimes you decide that a chunk of code needs to be moved/refactored into its own function, and the indentation level is no longer correct. So you re-indent the whole region with that new function.
This is what plain old block indenting is for. You don't need a code formatter to do this.
The only good reason that I see is if some sort of code generator that doesn't cater to user-readable output (e.g. the preprocessor) produces a mass of code that needs reformatting. In that case, it only needs to be temporarily reformatted, and it doesn't matter if it isn't quite perfect.
I don't worry too much if the whole file is generated, since I probably wouldn't be editing it anyway, but instead would be editing the spec file that was used to generate it. Or perhaps I'd edit the generator itself.
That's true, but you might be reading it to verify what the generator is doing.
I'm asserting that this is never the case. The "logically" part above is a testament to the prevailing viewpoint. It isn't logically (and definitely not physically) a statement.
By using the word "logically", I meant that "X becomes Y" can quite reasonably be described as being Y.
I understand that, and that's what I'm disagreeing with. There is a different degree of indirection that is important.
When you throw extra parentheses after it and do other things to change how the preprocessor expands macros, it blurs things. But for the typical, standard usages of macros, this indistinction is not a problem.
I totally disagree. In fact (if you're referring to something like full-scale preprocessor metaprogramming), I'd go so far as to say the opposite. In that case, any indistinction is far less of a problem than it is with "standard usages of macros". Preprocessor metaprogramming is so obviously different from normal code that it hardly matters. It is the normal, relatively simple uses that can cause the most damage.
Then again, there are extremely few people who write macros anywhere near as complicated or ambitious as you do.
And I'm not referring to those kinds of macros either.
The simple macros are easy to write correctly. I'm not talking about newbies who don't understand double evaluation, etc., either. I'm talking about competent programmers who don't use macros very often and who, when they do, apply the KISS principle.
Neither adding nor eliding trailing semicolons makes things simpler for the user to comprehend. I don't think that KISS applies. For the designer of a macro, in many cases making the macro "semicolon-able" makes the macro definition more complex.
No it doesn't. The semicolon means nothing to the preprocessor, and the macro can still affect the code that follows.
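To illustrate (with a hypothetical macro): leaving the semicolon to the user requires nothing extra in the definition, because the user's semicolon simply becomes part of whatever the expansion turns into.

#define DECLARE_COUNTER(name) static int name = 0  // hypothetical; no ';' in the definition

DECLARE_COUNTER(hits);  // the user's ';' completes it: static int hits = 0;

int main() { return hits; }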
Ok, here I think we have a miscommunication caused by the ambiguities of English. I apologise. I meant that the antecedent to which "this requires ..." referred was the author of the macro. The macro author must ensure that after the user-provided semicolon, the macro has no more effect on subsequent code. ("This requires that the macro ACTUALLY be done...") I certainly didn't mean that having the user provide a trailing semicolon somehow forces the preprocessor to be done expanding the macro.
Okay. In any case, for nearly all macros, the closing parenthesis of the invocation signals that it doesn't affect trailing code. You have to either be using complex metaprogramming macros or really bend over backwards to contrive a situation where it does. For a macro to affect trailing code, the trailing code has to be either a parenthetic expression--as in macro(...) (trailing-code)--or it has to have a mismatched right parenthesis, such as macro(...) trailing-code). Off the top of my head, there is no other way that the macro invocation can mess with trailing code.
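A contrived sketch of both shapes (all of these macros are hypothetical):

#define TWICE(x) ((x) * 2)

// (a) parenthetic expression after the invocation: APPLY(TWICE) expands to
// the macro name TWICE, so the trailing (21) becomes its argument list
// during rescanning.
#define APPLY(f) f
int a = APPLY(TWICE) (21);  // a == 42

// (b) mismatched right parenthesis: the expansion leaves a '(' open, and
// the ')' in the trailing code closes it.
#define HALF_OF(x) ((x) /
int b = HALF_OF(10) 2 );  // becomes ((10) / 2 ); b == 5

int main() { return (a == 42 && b == 5) ? 0 : 1; }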
A self-contained statement, BTW, would include the semicolon. Otherwise, I don't disagree with this part--mainly because putting the semicolon would be erroneous in nearly all cases.
Well, I'm glad to see we already are in half agreement. :)
:)
It seems clear to me how the semicolon's presence (or lack thereof) can confuse maintenance programmers,
It can confuse maintenance programmers only if they have the flawed viewpoint that I've been ranting about for years now. The worst-case scenario is that the maintenance programmer has to look up the documentation for the macro. Having to do that enough times can change the perspective enough that it isn't a problem anymore.
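For concreteness, this is the sort of call-site ambiguity in question (reusing the hypothetical DECLARE_COUNTER from above, plus a second hypothetical macro whose expansion supplies its own semicolon):

#define DECLARE_COUNTER(name) static int name = 0    // user writes the ';'
#define DECLARE_FLAG(name) static bool name = false; // ';' is built in

DECLARE_COUNTER(hits);  // correct
DECLARE_FLAG(verbose)   // also correct, though it looks unterminated

int main() { return hits + (verbose ? 1 : 0); }

Nothing at the point of use says which convention a given macro follows; that is where the documentation lookup (or the perspective shift) comes in.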
The typical problems: A) What documentation?
We can't protect people from laziness or wanton stupidity, nor should we really try.
B) Programmer apathy.
IMO, it isn't worth my time to try to keep people that don't care from screwing things up. In the face of apathy, it doesn't matter what we do.
C) "Aren't we supposed to avoid macros anyway? Why effort to study them in such detail if I'm supposed to feel guilty for using them anyway?"
My answers are "no" and "it isn't necessary to study macros in detail for most uses of macros." It merely requires a perspective shift when using macros.
Clearly, writing the boost preprocessor library requires a much more strict view of what's going on, and macro purity is a must.
Well, I'm not really referring to this kind of macro use.
But really, how many people in the world could even write the boost preprocessor library?
Probably quite a few now, but certainly less than 1%. I think Aleksey and Dave know enough about it that they could do it if they had sufficient motivation. Plus, there are a variety of people that have given ideas that show that they have the capability or could gain the capability quickly. OTOH, I'd say about one other person could write Chaos, and that's Vesa, though some others would eventually get there.
Heck, how many actually understand it?
In this case, quite a few. If you take away all the workarounds, the library is not that complex. Chaos (which doesn't contain any workarounds) *is* complex.
Compared to the number of C++ programmers, the number is probably less than 1%. For the majority of programmers, such strictness is unnecessary and adds very little, but it does make code look weird, especially after the editor has indented it incorrectly.
Despite all this, I haven't been referring to preprocessor metaprogramming at all. Complicated preprocessor metaprogramming requires knowledge and consistency across a whole slew of issues, and strictness to those principles is a must. But those principles are different from the one I'm referring to here. In preprocessor metaprogramming (as in authoring preprocessor metaprogramming constructs), this one doesn't even come up, because you're never really dealing with anything as concrete as specific underlying language syntax.
Regards,
Paul Mensonides