C++ announcements coming tomorrow

Herb Sutter, the Convenor of the ISO C++ Standards Committee, will be making some announcements tomorrow likely to be of interest to C++ developers in general and Boosters in particular. The venue is a talk titled "The Future of C++" at the Microsoft Build Conference.

While some of Herb's announcements will be specific to Microsoft, the key C++ initiatives involve Boost and the whole C++ community. Some of the ideas evolved from feedback Herb got at C++Now/BoostCon 2012 last May.

I'll post a summary of the announcements tomorrow, but you can also watch the live stream at http://channel9.msdn.com at 12:45 PM Pacific time. For other time zones, see http://www.timeanddate.com/worldclock/fixedtime.html?iso=20121102T1945

--Beman

PS: We don't usually post teasers like this on the Boost lists, but Herb asked that a notice be posted, and given the endless work he has put in on this over the summer, I couldn't say no.

On Thu, Nov 1, 2012 at 3:30 PM, Beman Dawes <bdawes@acm.org> wrote:
Herb Sutter, the Convenor of the ISO C++ Standards Committee, will be making some announcements tomorrow likely to be of interest to C++ developers in general and Boosters in particular. The venue is a talk titled "The Future of C++" at the Microsoft Build Conference.
Wasn't that the title that was decided for the 2012 BoostCon variant during the shadowy seance? I wish I could have gone... 2011 was a blast! Used up all my vaca days hiking, etc... Doh!
While some of Herb's announcements will be specific to Microsoft, the key C++ initiatives involve Boost and the whole C++ community. Some of the ideas evolved from feedback Herb got at C++Now/BoostCon 2012 last May.
I'll post a summary of the announcements tomorrow, but you can also watch the live stream at http://channel9.msdn.com at 12:45 PM Pacific time. For other time zones, see http://www.timeanddate.com/worldclock/fixedtime.html?iso=20121102T1945
Thanks Beman! Always good to know! These lists are absolutely my C++ news feed! Greg

On Thu, Nov 1, 2012 at 3:30 PM, Beman Dawes <bdawes@acm.org> wrote:
Herb Sutter, the Convenor of the ISO C++ Standards Committee, will be making some announcements tomorrow likely to be of interest to C++ developers in general and Boosters in particular. The venue is a talk titled "The Future of C++" at the Microsoft Build Conference.
While some of Herb's announcements will be specific to Microsoft, the key C++ initiatives involve Boost and the whole C++ community. Some of the ideas evolved from feedback Herb got at C++Now/BoostCon 2012 last May.
I'll post a summary of the announcements tomorrow, but you can also watch the live stream at http://channel9.msdn.com at 12:45 PM Pacific time.
Here is the summary, in reverse order from Herb's presentation:

* The isocpp.org web site has begun operations. This site is intended to act as "The home of Standard C++ on the web — news, status and discussion about the C++ standard on all compilers and platforms."

* The Standard C++ Foundation has been formed. See http://isocpp.org/about for more information. Note the wide industry support (backed by donations of money) and the Boost representation on the Board of Directors. The first project funded by the Foundation is the isocpp.org web site.

* The timeline for the C++ standards committee over the next 24 months is aiming for three Technical Specifications (including a Filesystem TS, based on Boost.Filesystem V3), and a minor revision of the C++ standard (tentatively C++14), with technical corrections and some minor new language and library features. Following that, there will be a constant stream of Technical Specifications and then a major revision of the standard (tentatively C++17). In other words, much more frequent releases than in the past.

* Microsoft has released an out-of-band community technical preview (CTP) for their compiler, adding explicit conversion operators, raw string literals, function template default arguments, delegating constructors, uniform initialization, and variadic templates. (See http://channel9.msdn.com/Series/C9-Lectures-Stephan-T-Lavavej-Core-C-/STLCCS... for instructions on how to use the CTP). Microsoft is promising more such feature releases during the first half of 2013. This is important for the whole C++ community, since it means that full C++11 support is becoming a reality across all widely used compilers.

Herb's presentation has a lot of interesting perspectives. You can watch it at http://channel9.msdn.com/Events/Build/2012/2-005

--Beman
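For anyone who has not tried the new compiler features yet, here is a small, hypothetical sketch of what the listed additions look like in code (illustrative only; none of this is code from the CTP or from the talk):

    #include <string>
    #include <vector>

    // raw string literal: backslashes need no escaping
    const std::string log_dir = R"(C:\temp\logs)";

    // function template default argument
    template<typename T = int>
    T zero() { return T(); }

    // variadic template + uniform initialization
    template<typename... Args>
    std::vector<int> make_ints(Args... args) { return { args... }; }

    struct widget
    {
        widget() : widget(0) {}                 // delegating constructor
        explicit widget(int id) : id_(id) {}
        explicit operator bool() const { return id_ != 0; }  // explicit conversion operator
    private:
        int id_;
    };

    int main()
    {
        auto v = make_ints(1, 2, 3);            // v holds {1, 2, 3}
        widget w(42);
        return (w && !v.empty() && !log_dir.empty() && zero<long>() == 0) ? 0 : 1;
    }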

* Microsoft has released an out-of-band community technical preview (CTP) for their compiler, adding explicit conversion operators, raw string literals, function template default arguments, delegating constructors, uniform initialization, and variadic templates. (See http://channel9.msdn.com/Series/C9-Lectures-Stephan-T-Lavavej-Core-C-/STLCCS... for instructions on how to use the CTP). Microsoft is promising more such feature releases during the first half of 2013. This is important for the whole C++ community, since it means that full C++11 support is becoming a reality across all widely used compilers.
Shortly after this talk, Herb held a Q&A session where people had the opportunity to ask him questions about these announcements (can't find a link at the moment). Someone asked whether Microsoft intends to implement full 100% support for standard C++(11), and Herb answered with a resounding 'yes'. Did anyone tell him about the problems with VC's preprocessor that come up on this list again and again and that prevent a powerful preprocessor metaprogramming library like Chaos from being usable on VC? Regards, Nate
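One of the better-known symptoms, for readers who have not hit it: VC++'s preprocessor has historically expanded __VA_ARGS__ forwarded to another macro as a single argument instead of re-splitting it at the commas, which breaks common argument-counting idioms. A minimal illustration (hypothetical macros, not taken from Chaos):

    #define NUM_ARGS_IMPL(_1, _2, _3, N, ...) N
    #define NUM_ARGS(...) NUM_ARGS_IMPL(__VA_ARGS__, 3, 2, 1)

    // A conforming preprocessor (GCC, Clang) expands NUM_ARGS(a, b) to 2.
    // Classic VC++ behaviour, as widely reported, yields 1 instead because
    // __VA_ARGS__ is passed along as one argument.
    int check[NUM_ARGS(a, b)];  // size 2 on a conforming compiler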

On 11/3/2012 10:26 AM, Nathan Ridge wrote:
* Microsoft has released an out-of-band community technical preview (CTP) for their compiler, adding explicit conversion operators, raw string literals, function template default arguments, delegating constructors, uniform initialization, and variadic templates. (See http://channel9.msdn.com/Series/C9-Lectures-Stephan-T-Lavavej-Core-C-/STLCCS... for instructions on how to use the CTP). Microsoft is promising more such feature releases during the first half of 2013. This is important for the whole C++ community, since it means that full C++11 support is becoming a reality across all widely used compilers.
Shortly after this talk, Herb held a Q&A session where people had the opportunity to ask him questions about these announcements (can't find a link at the moment).
Someone asked whether Microsoft intends to implement full 100% support for standard C++(11), and Herb answered with a resounding 'yes'.
<rant> IMO, yet more marketing b***s***. This has been said before, and Herb has long since lost my trust (and the trust of many others). He is no longer a free voice. The only person on C9 that doesn't come off as an MS shill is STL.
Did anyone tell him about the problems with VC's preprocessor that come up on this list again and again and that prevent a powerful preprocessor metaprogramming library like Chaos from being usable on VC?
He's been told repeatedly.

Not that I'm against a "foundation" or against adding more libraries to the standard library, but the only things that C++ programmers need to produce portable code are C++ compilers that implement the standard (and only the standard--not a bunch of vendor-specific extensions). As an example, paraphrasing, "We're proposing 'await' but if the committee doesn't want it we can always add it as an extension." It is particular compiler vendors and their compilers that are getting in the way of progress.

Further, we don't need C++/CX (or whatever it is called this iteration). The .Net Framework is a huge pile of typical MS bloatware, and, contrary to popular opinion, C# is actually *not* a good language. It actively interferes with abstraction and encourages bloatware production. I am so tired of hearing the "right tool for the job" fallacy WRT programming languages, especially WRT C# and Java.

Aside, I'm also sick of hearing the word "app" and constant attempts to justify turning productivity into novelty with things like big touchscreen monitors and statements having to do with the supposed lack of UI innovation--which has now led to anti-productivity UIs such as (vanilla) Gnome 3, Unity, and, worst of all, Metro.

</rant>

Regards, Paul Mensonides

On 11/3/2012 2:24 PM, Paul Mensonides wrote:
On 11/3/2012 10:26 AM, Nathan Ridge wrote:
* Microsoft has released an out-of-band community technical preview (CTP) for their compiler, adding explicit conversion operators, raw string literals, function template default arguments, delegating constructors, uniform initialization, and variadic templates. (See http://channel9.msdn.com/Series/C9-Lectures-Stephan-T-Lavavej-Core-C-/STLCCS...
for instructions on how to use the CTP). Microsoft is promising more such feature releases during the first half of 2013. This is important for the whole C++ community, since it means that full C++11 support is becoming a reality across all widely used compilers.
Shortly after this talk, Herb held a Q&A session where people had the opportunity to ask him questions about these announcements (can't find a link at the moment).
Someone asked whether Microsoft intends to implement full 100% support for standard C++(11), and Herb answered with a resounding 'yes'.
<rant>
IMO, yet more marketing b***s***. This has been said before, and Herb has long since lost my trust (and the trust of many others). He is no longer a free voice. The only person on C9 that doesn't come off as an MS shill is STL.
I'm afraid I thought the same thing when he said this. He has also stated publicly that Microsoft won't ever implement 2-phase lookup, which, last I checked, is required for 100% std-compliance. I'll give Herb the benefit of the doubt and assume he was answering the question "will MS implement all the new C++11 features?", to which I believe the answer is yes.
Did anyone tell him about the problems with VC's preprocessor that come up on this list again and again and that prevent a powerful preprocessor metaprogramming library like Chaos from being usable on VC?
He's been told repeatedly.
To be fair to Herb, it's not his call to make.
Not that I'm against a "foundation" or against adding more libraries to the standard library, but the only things that C++ programmers need to produce portable code are C++ compilers that implement the standard (and only the standard--not a bunch of vendor-specific extensions). As an example, paraphrasing, "We're proposing 'await' but if the committee doesn't want it we can always add it as an extension." It is particular compiler vendors and their compilers that are getting in the way of progress.
I disagree with the view that it's categorically wrong for compiler vendors to implement extensions, as long as there is a way to turn them off, and as long as the standard library works in a std-compliant way when they're off.
Further, we don't need C++/CX (or whatever it is called this iteration). The .Net Framework is a huge pile of typical MS bloatware, and, contrary to popular opinion, C# is actually *not* a good language.
C++/CX has nothing whatsoever to do with .NET. You're thinking of C++/CLI.
It actively interferes with abstraction and encourages bloatware production. I am so tired of hearing the "right tool for the job" fallacy WRT programming languages especially WRT to C# and Java.
One language to rule them all, then?
Aside, I'm also sick of hearing the word "app" and constant attempts to justify turning productivity into novelty with things like big touchscreen monitors and statements having to do with the supposed lack of UI innovation--which has now led to anti-productivity UIs such as (vanilla) Gnome 3, Unity, and, worst of all, Metro.
</rant>
Off-topic. -- Eric Niebler BoostPro Computing http://www.boostpro.com

On 03/11/12 23:15, Eric Niebler wrote:
I'm afraid I thought the same thing when he said this. He has also stated publicly that Microsoft won't ever implement 2-phase lookup, which, last I checked, is required for 100% std-compliance.
It's not strictly needed, but it is pretty much the only realistic way I can imagine to implement name lookup correctly. While name lookup is quite broken in MSVC, the only situations where you'd need the correct behaviour are, however, arguably bad style.
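A minimal example of where the behaviour observably differs--a non-dependent call that two-phase lookup must bind at the point of definition (illustrative only):

    void f(int) {}      // (1) the only f visible when the template is defined

    template<class T>
    void g()
    {
        f(3.14);        // non-dependent call: two-phase lookup binds it to (1) here
    }

    void f(double) {}   // (2) declared after the template

    int main()
    {
        g<int>();       // a conforming compiler calls f(int);
                        // a compiler that delays all lookup to instantiation
                        // (as MSVC historically has) calls f(double) instead
    }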

I'm afraid I thought the same thing when he said this. He has also stated publicly that Microsoft won't ever implement 2-phase lookup, which, last I checked, is required for 100% std-compliance. It's not strictly needed, but is pretty much the only realistic way I can imagine to implement name lookup correctly.
While name lookup is quite broken in MSVC, the only situations where you'd need the correct behaviour are however arguably bad style.
... well, some workarounds for msvc lookup bugs tend to result in code which I'd consider `bad style' ... this is basically the reason why [1] is not fixed.

tim

[1] https://svn.boost.org/trac/boost/ticket/7358

On 11/3/2012 3:15 PM, Eric Niebler wrote:
On 11/3/2012 2:24 PM, Paul Mensonides wrote:
He's been told repeatedly.
To be fair to Herb, it's not his call to make.
I don't dislike Herb, and my rant isn't really about Herb. I just believe that everything that he says WRT VC++ is carefully filtered through MS's marketing engine--which is what I have a problem with. Because of that, I do not trust anything that MS says, via Herb or others, with respect to VC++ or MS's so-called "commitment" to C++.

I believe that MS has made valuable contributions to C++, but also that they have hobbled it and tried to control it. In the cases where they have made contributions, such as putting money into this new foundation, I do not believe it has been for altruistic reasons but is instead yet another marketing attempt. Don't get me wrong, I have nothing against companies trying to make money. I do have a problem with companies attempting to disguise their agendas.

All of the above is opinion, but trust is earned, not assumed, and I have not seen anything out of MS WRT C++ that gives me cause to extend trust. In fact, I've seen the opposite time and again over a very long period of time.
Not that I'm against a "foundation" or against adding more libraries to the standard library, but the only things that C++ programmers need to produce portable code are C++ compilers that implement the standard (and only the standard--not a bunch of vendor-specific extensions). As an example, paraphrasing, "We're proposing 'await' but if the committee doesn't want it we can always add it as an extension." It is particular compiler vendors and their compilers that are getting in the way of progress.
I disagree with the view that it's categorically wrong for compiler vendors to implement extensions, as long as there is a way to turn them off, and as long as the standard library works in a std-compliant way when they're off.
To be clear, I understand the need for compiler-specific things like pragmas (and maybe even things like __stdcall and __cdecl). I do have a problem with feature extensions of any kind, however. The reason is not because I believe the language to be sacrosanct so much as that it breeds lack of portability. If a vendor wants a particular feature extension, the route is through the language standard, which, to some degree, allows such features to be discussed and designed in a much larger context and also inhibits the abuse of popular-compiler-vendor power. If that route fails, the feature is either not ready or not general purpose enough to belong in the language.

I hardly believe the standard to be perfect. For example, I believe initializer lists to be an abomination. Such a facility should have been implemented atop a less-castrated variadic template mechanism. However, initializer lists are part of C++ because the majority thought they were a good idea. Therefore, if I were to implement a compiler and call it C++, I had better implement that feature even if I don't like it. The same is true for two-phase lookup, etc., etc. The point being, *I*, as the compiler vendor, don't get to decide what is and is not C++ and what should or should not be implemented.

What C++ needs is portable libraries. In theory, if a library is not making platform-specific API calls and not using vendor extensions, portability should be achieved *accidentally*--not by drastic effort by a library implementer for each compiler/platform. The bottom line here is not more libraries added to the standard library per se nor a foundation. Those are solutions to different problems. The bottom line is lack of real prioritization to implement the language by certain compiler vendors. *They* are the problem. VC++ is at the top of that list, though it is by no means the only one on the list.

Each compiler/platform implements a union of a subset of C++ and a set of extensions. Writing portable C++ code requires using only the intersection of those sets from all compilers/platforms together with workarounds for all bugs in all scenarios. Almost nobody does that, and therefore we end up with lots of non-portable or conditionally-portable libraries. If compilers implemented 100% of the language (allowance being made of course for features recently added to the language) and did not provide extensions (which by their very existence encourage their use), then libraries can be developed to generalize platform-specific details (not compiler-specific details) (e.g. Boost.Filesystem, Asio, etc.) and a hierarchy of portable libraries *can* be developed.
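On the initializer-list point above, a rough sketch of one possible alternative shape--list-style construction going through an ordinary variadic function template rather than a dedicated std::initializer_list type (illustrative only; make_list is a hypothetical name, not a proposal):

    #include <vector>

    template<typename T, typename... Args>
    std::vector<T> make_list(Args&&... args)
    {
        std::vector<T> v;
        v.reserve(sizeof...(args));
        // C++11 pack-expansion trick: push each argument in order.
        int dummy[] = { 0, (v.push_back(static_cast<T>(args)), 0)... };
        (void)dummy;
        return v;
    }

    // usage:  auto xs = make_list<int>(1, 2, 3);
    // versus: std::vector<int> xs{1, 2, 3};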
Further, we don't need C++/CX (or whatever it is called this iteration). The .Net Framework is a huge pile of typical MS bloatware, and, contrary to popular opinion, C# is actually *not* a good language.
C++/CX has nothing whatsoever to do with .NET. You're thinking of C++/CLI.
I'm thinking both actually. I.e. extensions to the language designed to interoperate with other things. Those should be 100% library. Same thing with AMP.
It actively interferes with abstraction and encourages bloatware production. I am so tired of hearing the "right tool for the job" fallacy WRT programming languages especially WRT to C# and Java.
One language to rule them all, then?
In theory, yes. Or, at least, almost yes. There is a legitimate difference between scripting vs compiled software (e.g. Python vs. C++), but that difference has to do with codebase longevity, not application domain. Otherwise, programming is about abstraction and composition of those abstractions. A theoretical language which allows full abstraction and full composition of abstractions is the "one language to rule them all." I'm not saying that is C++, and C++ is unlikely to reach that even in the limit, though it is the closest of the current major languages. What abstraction and composition give you is the ability to decompose software at the library component level--whose domain reach extends far beyond the language itself and the standard libraries of the language. Given that such a language doesn't yet exist, there is room for a competition of ideas in how to reach or most closely approximate that goal. In that competition there are means of comparing languages objectively.

The only significant productivity boost that C# yields is the presence of the large monolithic .Net Framework. It isn't the language itself that is more productive. The language is, in fact, crippled WRT abstraction and composition. Just like Java, it is full of decisions intentionally designed to enforce a particular way of coding--which is the opposite of abstraction, the opposite of DSEL design, and so on. Furthermore, its generics mechanism is a joke, it lacks a typedef concept and therefore an associated-types concept, and the universal GC model is fraught with issues related to the release of non-memory resources (leading to the Dispose so-called "pattern" and tons of the equivalent of try/catch/rethrow cleverly hidden as using(), which is the absolute opposite of abstraction). It is simply not a good language from first principles because its first principles are objectively wrong. Related to that, the supposedly easy-to-use languages such as C# actually amount to unqualified people implementing (and re-implementing) duct tape solutions to problems. That is not a good thing; it's detrimental in the long run. Duct tape solutions, in turn, drastically deprioritize the development of good, reusable, long-term solutions.

WRT C#, I'm not speaking without experience, unfortunately. I implemented and maintain a C# code-base of approximately 60,000 lines (not particularly small for one person, though not large) for my employer (which is not a software foundry). I originally made the decision to use C# because I believed at the time that C++ would be beyond the capability of our IT department to understand and maintain. In retrospect, that was a mistake on my part where I fell into the right-tool-for-the-job trap. However, as time went on, the domain logic far exceeded the language complexity, and the codebase is now well beyond the capability of our IT department regardless (lots of math). In that development, I routinely ran into C#'s lack of abstraction capability.

Now I want to port it to C++, but to do that I have to replace usages of the .Net Framework with libraries. Boost provides some of those that I need, Qt (which is its own style of abomination) provides others, but others are more difficult. Not because such libraries don't exist, but because they are often full of unjustified platform/compiler-specific extensions, rely on other libraries with such reliance, or run afoul of broken or missing language features on said compilers.

Regards, Paul Mensonides

On Sun, Nov 4, 2012 at 2:43 AM, Paul Mensonides <pmenso57@comcast.net> wrote:
I disagree with the view that it's categorically wrong for compiler vendors to implement extensions, as long as there is a way to turn them off, and as long as the standard library works in a std-compliant way when they're off.
To be clear, I understand the need for compiler-specific things like pragmas (and maybe even things like __stdcall and __cdecl). I do have a problem with feature extensions of any kind, however. The reason is not because I believe the language to be sacrosanct so much as that it breeds lack of portability.
Does it? If the goal is portable code, you compile the code on multiple platforms and compilers, and unintended use of extensions is easily detected. Lack of full support for standard C++ or C++11 and bugs are a much bigger problem IMO as they sometimes require non-trivial workarounds that complicate the code.
If compilers implemented 100% of the language (allowance being made of course for features recently added to the language) and did not provide extensions (which by their very existence encourage their use), then libraries can be developed to generalize platform-specific details (not compiler-specific details) (e.g. Boost.Filesystem, Asio, etc.) and a hierarchy of portable libraries *can* be developed.
Some libs should be part of the standard to lower the barrier to start using them. Building/using a (C++) library on Windows is also far less than ideal, which IMO is a big problem for C++ on that platform. Boost appears to have the tools to partly solve this, maybe we should look into that too. Olaf

On 11/4/2012 5:57 AM, Olaf van der Spek wrote:
On Sun, Nov 4, 2012 at 2:43 AM, Paul Mensonides <pmenso57@comcast.net> wrote:
I disagree with the view that it's categorically wrong for compiler vendors to implement extensions, as long as there is a way to turn them off, and as long as the standard library works in a std-compliant way when they're off.
To be clear, I understand the need for compiler-specific things like pragmas (and maybe even things like __stdcall and __cdecl). I do have a problem with feature extensions of any kind, however. The reason is not because I believe the language to be sacrosanct so much as that it breeds lack of portability.
Does it? If the goal is portable code, you compile the code on multiple platforms and compilers, and unintended use of extensions is easily detected. Lack of full support for standard C++ or C++11 and bugs are a much bigger problem IMO as they sometimes require non-trivial workarounds that complicate the code.
Yes, it does. Taking aim at GCC instead, a huge amount of GNU code will not compile without --std=gnu++11 instead of --std=c++11. Even more of it won't compile without a POSIX environment--even ignoring the build system and autotools.

The point is that, in many cases, portability should be accidental. I.e. with relatively few platform-specific exceptions (which should be isolated and generalized), programming is done against the abstract machine specified by the C++ standard. That's the compiler user's part of the contract. The compiler developer's part of the contract is to implement a compiler that compiles the language specified by the C++ standard.

Sure, lack of full support for the language and the presence of bugs are huge problems. However, towards the end of 2012, I'm not particularly irritated by the current state of C++11 conformance. I am irritated by set-in-stone "won't fix" responses to bug reports and by broken, mis-implemented features.
If compilers implemented 100% of the language (allowance being made of course for features recently added to the language) and did not provide extensions (which by their very existence encourage their use), then libraries can be developed to generalize platform-specific details (not compiler-specific details) (e.g. Boost.Filesystem, Asio, etc.) and a hierarchy of portable libraries *can* be developed.
Some libs should be part of the standard to lower the barrier to start using them.
Of course. However, I consider that to be a means to guarantee that the presence of those libraries is included in the "price" of the compiler and the support of that compiler. There is no way that every useful library in every useful domain can be added to the standard library. For C++ software development to really flourish, libraries from outside the standard need to be portable and reusable.

Take file system and networking libraries, for example. I can use those libraries regardless of whether the standard contains equivalent libraries because Boost contains them. I think adding networking support and file system support to the standard library is a good thing, but I can do those things now and fairly easily because Boost developers tend to bend over backward to make things as portable as possible.

But it should not be so difficult to write portable code. We should not need to have a Boost codebase that is saturated with workarounds. Whose fault is that? The presence of those workarounds is a symptom, but what is the root cause? Compilers. And Boost is just one collection of libraries.

Yes, some libraries like Filesystem and Asio have to contain platform-specific code (i.e. alternate implementations for different platforms), but they encapsulate and generalize the platform differences. In some ways, that is their most important function.
Building/using a (C++) library on Windows is also far less than ideal, which IMO is a big problem for C++ on that platform.
Well, we all know Windows is fundamentally broken and getting worse. If you use VC++ on Windows, yes. I use one of the MinGW ports of GCC on Windows or a cross-compiler, and other than over-reliance on --std=gnu++XX and piss poor support for some of libstdc++ where its implementation is heavily dependent on POSIX (e.g. threading), it appears to work well.

I use Boost a lot and don't have trouble building it, except for altering the jam file every time to replace mingw with gcc. I also use Qt for UI-related stuff, and building that is not terribly difficult either. It gets more difficult with other libraries, because when libraries "support" Windows, that tends to imply VC++--which is a problem.

We need to get to a point where one doesn't target *compilers*. Instead, one targets C++ and some platform. From there, we can generalize the platform-specific parts with libraries such that everything else can target C++.

OTOH, Linux has its issues also, such as its massive assumption of a system-wide particular version of GCC with no C++ ABI encoding in the shared object versioning. The Linux package management models (i.e. apt-get, rpm, emerge) are also fundamentally broken because they don't scale (many-to-one-to-many vs many-to-many).

Regards, Paul Mensonides

Yes, it does. Taking aim at GCC instead, a huge amount of GNU code will not compile without --std=gnu++11 instead of --std=c++11. Even more of it won't compile without a POSIX environment--even ignoring the build system and autotools. The point is that, in many cases, portability should be accidental. I.e. with relatively few platform-specific exceptions (which should be isolated and generalized) programming is done against the abstract machine specified by the C++ standard. That's the compiler user's part of the contract. The compiler developer's part of the contract is to implement a compiler that compiles the language specified by the C++ standard.
I'd say you idealize too much. Surely, a lot of code can be written in a platform-agnostic way, relying on standard C++ only. But there is a significant range of tasks that benefit from compiler extensions, and that's not only OS-dependent things. Take manual vectorization for instance, this is a CPU-specific thing which by definition cannot be fixed in the standard. Generalized implementations are possible but IMHO they will never be as efficient. Or multi-module interfaces - the differences between OS implementations are too deep. Or consider how long the "long long int" type was an extension.

Look at extensions from another perspective. They are often a playground to test new features that may eventually, after usage experience is gathered, go into standard as a new feature. Only extensions are available here and now and not in a few years when the new std paper rolls out. This is pretty much like how it goes with libraries.

Don't get me wrong, I'm all for fully conforming compilers. But objectively extensions are useful in some domains and are ultimately a good thing. If you don't need them, you don't use them. But should you need them, it's good that they are there.
Sure, lack of full support for the language and the presence of bugs are huge problems. However, towards the end of 2012, I'm not particularly irritated by the current state of C++11 conformance. I am irritated by set-in-stone "won't fix" responses to bug reports and by broken, mis-implemented features.
I agree, this disappoints a lot.

On 11/4/2012 1:14 PM, Andrey Semashev wrote:
I'd say you idealize too much. Surely, a lot of code can be written in a platform-agnostic way, relying on the standard C++ only. But there is a significant range of tasks that benefit from compiler extensions, and that's not only OS-dependent things. Take manual vectorization for instance, this is a CPU-specific thing which by definition cannot be fixed in the standard. Generalized implementations are possible but IMHO they will never be as efficient. Or multi-module interfaces - the differences between OS implementations are too deep. Or for how long the "long long int" type was an extension?
I am not against all extensions. I am against *feature* extensions. There is a place for *necessary* extensions, and, in those cases, lack of the presence of those extensions should easily fall back to the actual language. Hardware vectorization is one example of that. If it exists, it should exist as a compiler-specific pragma which can be ignored. Further, no feature extension to the language is good if the feature could be implemented via library--i.e. it should not have language extensions solely for the purpose of syntactic convenience. E.g. MS's AMP extensions:

parallel_for_each(
    sum.extent,
    [=] (index<1> idx) restrict(amp) {
        sum[idx] = a[idx] + b[idx];
    }
);

This should be:

parallel_for_each(
    sum.extent,
    [=] (index<1> idx) _Pragma("restrict(amp)") {
        sum[idx] = a[idx] + b[idx];
    }
);

I.e. for necessary extensions, pragmas should be used so that 1) the language itself isn't modified, and 2) the code would work correctly (if not terribly efficiently) if the pragma was ignored.

The extensions for C++/CX and C++/CLI are even worse because they actually subvert the entire language by forcing the limited .Net runtime model on the language which is a massive regression. Sorry, but this is crap:

Foo^ foo = ref new Foo();
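For comparison, OpenMP already follows the pragma pattern described above: a compiler that does not support it simply ignores the pragma, and the loop still compiles and runs correctly--just serially. A minimal sketch (illustrative only):

    #include <cstddef>

    void add_all(const float* a, const float* b, float* sum, std::size_t n)
    {
        // Ignored entirely by a compiler without OpenMP support;
        // the loop remains valid, correct C++ and simply runs serially.
        #pragma omp parallel for
        for (long i = 0; i < static_cast<long>(n); ++i)
            sum[i] = a[i] + b[i];
    }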
Look at extensions from another perspective. They are often a playground to test new features that may eventually, after usage experience is gathered, go into standard as a new feature. Only extensions are available here and now and not in a few years when the new std paper rolls out. This is pretty much like how it goes with libraries.
And they are used in production code, not playground code, and then go on to interfere with standardization because of attempts to be backward compatible with things that never officially existed. That is not to say that particular feature extensions aren't useful. That is not to say that they can't be implemented and deployed faster than the C++ standard release cycle. However, the minor gains of feature extensions are drastically outweighed by the lack of portability that results. Regards, Paul Mensonides

On Sun, Nov 4, 2012 at 3:55 PM, Paul Mensonides <pmenso57@comcast.net>wrote:
On 11/4/2012 1:14 PM, Andrey Semashev wrote:
I'd say you idealize too much. Surely, a lot of code can be written in a
platform-agnostic way, relying on the standard C++ only. But there is a significant range of tasks that benefit from compiler extensions, and that's not only OS-dependent things. Take manual vectorization for instance, this is a CPU-specific thing which by definition cannot be fixed in the standard. Generalized implementations are possible but IMHO they will never be as efficient. Or multi-module interfaces - the differences between OS implementations are too deep. Or for how long the "long long int" type was an extension?
I am not against all extensions. I am against *feature* extensions. There is a place for *necessary* extensions, and, in those cases, lack of the presence of those extensions should easily fallback to the actual language. Hardware vectorization is one example of that. If it exists, it should exist as a compiler-specific pragma which can be ignored. Further, no feature extension to the language is good if the feature could be implemented via library--i.e. it should not have language extensions solely for the purpose of syntactic convenience. E.g. MS's AMP extensions:
parallel_for_each( sum.extent, [=] (index<1> idx) restrict(amp) { sum[idx] = a[idx] + b[idx]; } );
This should be:
parallel_for_each( sum.extent, [=] (index<1> idx) _Pragma("restrict(amp)") { sum[idx] = a[idx] + b[idx]; } );
I.e. for necessary extensions, pragmas should be used so that 1) the language itself isn't modified, and 2) the code would work correctly (if not terribly efficiently) if the pragma was ignored.
The extensions for C++/CX and C++/CLI are even worse because they actually subvert the entire language by forcing the limited .Net runtime model on the language which is a massive regression.
C++/CX is sugar for building and using Windows RT COM objects. It does not involve .NET _at all_. Not in your code and not hidden in the runtime background. Furthermore, both of them are intended for use as a bridge into the Windows platforms at the outer edges of otherwise portable, standards-compliant code. You're not supposed to use them as your primary language. Resume (slightly more informed) venting. -- Cory Nelson http://int64.org

On 11/4/2012 2:14 PM, Cory Nelson wrote:
On Sun, Nov 4, 2012 at 3:55 PM, Paul Mensonides <pmenso57@comcast.net>wrote:
The extensions for C++/CX and C++/CLI are even worse because they actually subvert the entire language by forcing the limited .Net runtime model on the language which is a massive regression.
C++/CX is sugar for building and using Windows RT COM objects. It does not involve .NET _at all_. Not in your code and not hidden in the runtime background. Furthermore, both of them are intended for use as a bridge into the Windows platforms at the outer edges of otherwise portable, standards-compliant code. You're not supposed to use them as your primary language.
Resume (slightly more informed) venting.
I'm not uninformed. I'm generalizing because C++/CX and C++/CLI share some extensions at the syntactic level, and I'm not referring to the details of either except in that they *both* create a new type system, object model, runtime model, etc.

Furthermore, my rant is not about .Net. My rant is about altering the language in absolutely unnecessary ways in order to market other, non-C++ technologies by attempting to leverage the C++ base and trying to disguise it. E.g. introducing "Windows Runtime types", ref classes, partial classes, properties, a crippled "runtime" generics mechanism. Sorry, no. This is (1) unnecessary and definitely not more important than implementing the complete actual language, and (2) not driven by trying to be "helpful to the community". This is 100% driven by marketing and MS's current attempt at creating a closed platform.

All of that said, I wouldn't be nearly so pissed at MS if they actually implemented the language aside from whatever else they do--extensions or libraries or whatever. But they don't. They continue to focus on proprietary stuff while simultaneously preaching about their "commitment" to C++. So, I call b.s., and I'm *far* from the only person that can put 2 and 2 together and derive such a perception.

Regards, Paul Mensonides

On 11/4/2012 5:41 PM, Paul Mensonides wrote:
On 11/4/2012 2:14 PM, Cory Nelson wrote:
On Sun, Nov 4, 2012 at 3:55 PM, Paul Mensonides <pmenso57@comcast.net>wrote:
The extensions for C++/CX and C++/CLI are even worse because they actually subvert the entire language by forcing the limited .Net runtime model on the language which is a massive regression.
C++/CX is sugar for building and using Windows RT COM objects. It does not involve .NET _at all_. Not in your code and not hidden in the runtime background. Furthermore, both of them are intended for use as a bridge into the Windows platforms at the outer edges of otherwise portable, standards-compliant code. You're not supposed to use them as your primary language.
Resume (slightly more informed) venting.
I'm not uninformed. I'm generalizing because C++/CX and C++/CLI share some extensions at the syntactic level, and I'm not referring to the details of either except in that they *both* create a new type system, object model, runtime model, etc..
C++/CLI does not pretend to be C++. It is a dialect of C++ for .Net programming but nobody of any experience views this as the official C++ language.

Your rant against things like C++/CLI and C++/CX is ill-founded IMO. Language vendors certainly have the right to create a new language from an existing language for their own use and their customers' use. I think you are being intolerant to think otherwise. If Microsoft has said anywhere that C++/CLI or C++/CX is standard C++ I would like to see it.

There are probably few serious C++ expert programmers who do not agree with you that language vendors should support the C++ standard. But that is very different from mandating that language vendors should only be allowed to support a language standard and not be allowed to create another similar language for their own purposes.

Actually, C++/CLI is a very good language for .Net programming but Microsoft's support for it has been abysmal, so that it is hopeless to use it (as opposed to C#) for any serious large-scale .Net programs or modules.

On 11/4/2012 6:19 PM, Edward Diener wrote:
C++/CLI does not pretend to be C++. It is a dialect of C++ for .Net programming but nobody of any experience views this as the official C++ language.
Your rant against things like C++/CLI and C++/CX is ill-founded IMO. Language vendors certainly have the right to create a new language from an existing language for their own use and their customer's use. I think you are being intolerant to think otherwise. If Microsoft has said anywhere that C++/CLI or C++/CX is standard C++ I would like to see it.
Sure they have the right to do it. That doesn't make it good design, that doesn't make the way they go about it ethical, and that doesn't make it good for C++ as a whole. Regardless, as I've said, I wouldn't have nearly the vitriol toward MS if they actually implemented the language. No marketing, no "subtle" maneuvering, just implement the language. They go on and on, especially in the last year or two, about the "C++ renaissance" and their commitment to C++, but then they develop yet another set of their own extensions instead of C++.
There are probably few serious C++ expert programmers who do not agree with you that language vendors should support the C++ standard. But that is very different from mandating that language vendors should only be allowed to support a language standard and not be allowed to create another similar language for their own purposes.
I'm not advocating mandating anything. Microsoft should either say they support C++ and then actually do it by implementing the language (completely--no "won't fix"), or they should say that they don't support C++ and step aside--developing whatever C-flat that they want.
Actually C++/CLI is a very good language for .Net programming but Microsoft's support for it has been abysmal, so that it is hopeless to use it ( as opposed to C# ) for any serious largescale .Net programs or modules.
This has nothing to do with C++, but there is no such thing as a good language for .Net programming. It isn't a good runtime model. Take away the .Net Framework (which is bloatware anyway--std::string x 1000) and what do you have? You have a fairly generic, ho-hum language that bubbles up implementation detail (in the form of any type of non-memory resource management). Now implement the .Net Framework, bloatware and all if you like, as a C++ API, and you have everything that C# actually offers sans reflection--which is frequently overused to paper over poor design.

The point is, of course, that C#'s strength is *not* its language, but rather its extensive library--however poorly implemented. I look at that and say, "Why isn't that easily available in C++?" The answer, IMO, is largely because libraries target *compilers* rather than C++.

Regardless of what gets added to the standard library, it will *never* be enough. What's needed is for compiler vendors to approximate the language to the greatest degree possible such that the standard acts as a contract between code authors and compiler vendors. That is priority one for C++ to move forward. More libraries in the standard library are good, but unnecessary. There is only one thing that is really necessary.

Regards, Paul Mensonides

On 2012-11-05 03:19, Edward Diener wrote:
On 11/4/2012 5:41 PM, Paul Mensonides wrote:
On 11/4/2012 2:14 PM, Cory Nelson wrote:
On Sun, Nov 4, 2012 at 3:55 PM, Paul Mensonides <pmenso57@comcast.net>wrote:
The extensions for C++/CX and C++/CLI are even worse because they actually subvert the entire language by forcing the limited .Net runtime model on the language which is a massive regression.
C++/CX is sugar for building and using Windows RT COM objects. It does not involve .NET _at all_. Not in your code and not hidden in the runtime background. Furthermore, both of them are intended for use as a bridge into the Windows platforms at the outer edges of otherwise portable, standards-compliant code. You're not supposed to use them as your primary language.
Resume (slightly more informed) venting.
I'm not uninformed. I'm generalizing because C++/CX and C++/CLI share some extensions at the syntactic level, and I'm not referring to the details of either except in that they *both* create a new type system, object model, runtime model, etc..
C++/CLI does not pretend to be C++. It is a dialect of C++ for .Net programming but nobody of any experience views this as the official C++ language.
Your rant against things like C++/CLI and C++/CX is ill-founded IMO. Language vendors certainly have the right to create a new language from an existing language for their own use and their customer's use. I think you are being intolerant to think otherwise. If Microsoft has said anywhere that C++/CLI or C++/CX is standard C++ I would like to see it.
The rant is really about them saying that they are "Fully Committed to C++" and pushing "A C++ Renaissance". Then it turns out they deliver C++/CLI and C++/CX, but cannot deliver new C++11 features because of "lack of resources" and "missing the deadline". Doesn't sound very committed to me!
There are probably few serious C++ expert programmers who do not agree with you that language vendors should support the C++ standard. But that is very different from mandating that language vendors should only be allowed to support a language standard and not be allowed to create another similar language for their own purposes.
They can do whatever they want, but it would be A LOT prettier if they were honest about it. Saying one thing and doing another isn't the best way to impress your customers. Neither is "Buy our new product now, we will deliver the contents later." Bo Persson

On 11/6/2012 1:13 PM, Bo Persson wrote:
On 2012-11-05 03:19, Edward Diener wrote:
Your rant against things like C++/CLI and C++/CX is ill-founded IMO. Language vendors certainly have the right to create a new language from an existing language for their own use and their customer's use. I think you are being intolerant to think otherwise. If Microsoft has said anywhere that C++/CLI or C++/CX is standard C++ I would like to see it.
The rant is really about them saying that they are "Fully Committed to C++" and pushing "A C++ Renaissance". Then it turns out they deliver C++/CLI and C++/CX, but cannot deliver new C++11 features because of "lack of resources" and "missing the deadline".
Doesn't sound very committed to me!
My rant is actually not specifically about MS and C++11 as much as MS and C++ generally (including C++11, but also including C++98 facilities which they have yet to implement or implement correctly). What MS is doing (and, less frequently, saying) is, "We will implement all of C++ except those things which we don't want to implement, we don't like, or we don't feel are important." The attitude is like the kid on the playground that doesn't get his way so he takes his ball and goes home. What it shows is a *lack* of commitment to the standardization process by flouting the standard whenever they disagree with it. I.e. standardization is important so long as "we" agree with whatever is standardized.

All I want from MS WRT C++ is what I said before. A commitment to implement all of standard C++ (not C++ sans XYZ) and then fulfilling that commitment in the limit. By that last, I don't mean that I am demanding that they have all of C++ implemented by tomorrow or by their next release, but that they are, in each release, improving by a margin that is reasonable given the length of the release cycle and indicative of their desire to fulfill the commitment. Specifically, I do *not* mean avoiding implementing XYZ by improving so lethargically that a new standard is published along with a new set of "interesting" features prior to having "time" to implement XYZ.

There was a huge amount of time between C++98 and C++11--enough time to implement XYZ (e.g. the correct phases of translation, macro replacement, two-phase lookup, etc., etc.), but they didn't do it. Instead, they focused on extra-linguistic things (such as C++/CLI and C++/CX).

Regards, Paul Mensonides

On November 5, 2012 1:55:42 AM Paul Mensonides <pmenso57@comcast.net> wrote:
On 11/4/2012 1:14 PM, Andrey Semashev wrote:
I am not against all extensions. I am against *feature* extensions. There is a place for *necessary* extensions, and, in those cases, lack of the presence of those extensions should easily fallback to the actual language. Hardware vectorization is one example of that. If it exists, it should exist as a compiler-specific pragma which can be ignored. Further, no feature extension to the language is good if the feature could be implemented via library--i.e. it should not have language extensions solely for the purpose of syntactic convenience.
IMHO, pragma-controlled compiler-generated vectorization is a utopia, except for the very trivial things. Real benefit is provided by extensions like __m128 and intrinsic functions. Also remember the mentioned typeof and long long. Some things cannot be implemented on the library level efficiently, and pragmas are not the silver bullet.
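For concreteness, this is roughly what intrinsic-based code looks like--a minimal sketch assuming SSE, where __m128 and the _mm_* functions come from <xmmintrin.h> (illustrative only):

    #include <cstddef>
    #include <xmmintrin.h>  // SSE: __m128, _mm_load_ps, _mm_add_ps, _mm_store_ps

    // Adds four floats per iteration; assumes a, b and out are 16-byte
    // aligned and n is a multiple of 4.
    void add_packed(const float* a, const float* b, float* out, std::size_t n)
    {
        for (std::size_t i = 0; i < n; i += 4) {
            __m128 va = _mm_load_ps(a + i);
            __m128 vb = _mm_load_ps(b + i);
            _mm_store_ps(out + i, _mm_add_ps(va, vb));
        }
    }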
E.g. MS's AMP extensions:
[snip] I agree, some things could have been done differently. I don't really understand why that CLI extension is needed, so I didn't ever use it. Guess what, I couldn't care less it's there. And since it's not in the core language and I haven't seen it in the public code yet, I'd say the industry largely rejected it.
Look at extensions from another perspective. They are often a playground to test new features that may eventually, after usage experience is gathered, go into standard as a new feature. Only extensions are available here and now and not in a few years when the new std paper rolls out. This is pretty much like how it goes with libraries.
And they are used in production code, not playground code, and then go on to interfere with standardization because of attempts to be backward compatible with things that never officially existed.
That's the way the field experience is gathered. Someone suggests an extension, the industry tests and refines it. If it proves to be useful, it goes to the standard. There are downsides but that's how the progress works.

On 11/4/2012 2:38 PM, Andrey Semashev wrote:
On November 5, 2012 1:55:42 AM Paul Mensonides <pmenso57@comcast.net> wrote:
On 11/4/2012 1:14 PM, Andrey Semashev wrote:
I am not against all extensions. I am against *feature* extensions. There is a place for *necessary* extensions, and, in those cases, lack of the presence of those extensions should easily fallback to the actual language. Hardware vectorization is one example of that. If it exists, it should exist as a compiler-specific pragma which can be ignored. Further, no feature extension to the language is good if the feature could be implemented via library--i.e. it should not have language extensions solely for the purpose of syntactic convenience.
IMHO, pragma-controlled compiler-generated vectorization is a utopia, except for the very trivial things. Real benefit is provided by extensions like __m128 and intrinsic functions. Also remember the mentioned typeof and long long. Some things cannot be implemented on the library level efficiently, and pragmas are not the silver bullet.
WRT typeof, something like decltype would have come around without GCC's typeof. WRT to long long, nothing in the standard says int == long. Similarly, lambdas came around without a vendor implementing an extension, thanks to the heroic efforts of library developers whose task was made drastically more difficult by lackluster compilers.

In the theoretical case where all C++ compilers implemented the standard and were bug-free, how does one write portable code in C++ without it being a mess of #ifdefs for every processor architecture and platform? If that is a pipe-dream, then C++ is dead. With something like hardware vectorization, there will be some cases where architecture/platform/compiler-specific code is necessary. However, the majority of code does not require it--even when performance matters. The majority of code does not require the compute potential of the LHC.
I agree, some things could have been done differently. I don't really understand why that CLI extension is needed, so I didn't ever use it. Guess what, I couldn't care less it's there. And since it's not in the core language and I haven't seen it in the public code yet, I'd say the industry largely rejected it.
Maybe so, but my primary concern is not about C++/CLI, C++/CX, etc., or even extensions. My concern is about MS saying one thing and then doing another in such a way that I consider it to be purely marketing driven, misleading and unethical if not outright deceptive, and actually attempting to cause client code to be MS-specific regardless of what MS representatives are saying. WRT to language extensions, it may not be avoidable in all cases, but it is avoidable in most cases. Every single extension branches the language and makes it more difficult to write portable code, more difficult to write tooling that has to interpret the code, and has the overall effect of stunting technological growth.
Look at extensions from another perspective. They are often a playground to test new features that may eventually, after usage experience is gathered, go into standard as a new feature. Only extensions are available here and now and not in a few years when the new std paper rolls out. This is pretty much like how it goes with libraries.
And they are used in production code, not playground code, and then go on to interfere with standardization because of attempts to be backward compatible with things that never officially existed.
That's the way the field experience is gathered. Someone suggests an extension, the industry tests and refines it. If it proves to be useful, it goes to the standard. There are downsides but that's how the progress works.
If an extension is targeted at standard C++ and the purpose is actually to gather field experience, that's one thing. It is another when it is simply a vendor abusing their power to get their way in spite of the committee. Regards, Paul Mensonides

On Mon, Nov 5, 2012 at 12:08 AM, Paul Mensonides <pmenso57@comcast.net> wrote:
WRT typeof, something like decltype would have come around without GCC's typeof. WRT to long long, nothing in the standard says int == long.
But for a given platform you can't easily change the size of a type. You'd essentially create a new platform. Adding a new type is far easier. -- Olaf

On 11/5/2012 4:26 AM, Olaf van der Spek wrote:
On Mon, Nov 5, 2012 at 12:08 AM, Paul Mensonides <pmenso57@comcast.net> wrote:
WRT typeof, something like decltype would have come around without GCC's typeof. WRT to long long, nothing in the standard says int == long.
But for a given platform you can't easily change the size of a type.
Why not? I mean, really, why not? Backward compatibility for code which assumes things that it shouldn't? Perhaps we should take the hit now for things like this rather than deal with it forever. What's next, long long long? In some ways, that's a little like the Y2K scenario. Replace assumptions with verifiable fact (especially when you have all of the std typedefs that exist now) even if you have to meta-compute it. Don't get me wrong, I recognize that that would be a huge short-term task, but, in the long run, the work required as a result of over-prioritizing backward compatibility dwarfs it. Regards, Paul Mensonides
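The standard headers already make the "verifiable fact" approach straightforward; a minimal sketch (illustrative only):

    #include <climits>
    #include <cstdint>

    // Make size assumptions explicit and checkable instead of baking them
    // into the choice of built-in type.
    static_assert(CHAR_BIT == 8, "this code assumes 8-bit bytes");

    std::int64_t  file_offset = 0;  // instead of assuming long is 64 bits
    std::uint32_t checksum    = 0;  // instead of assuming unsigned int is 32 bits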

On November 5, 2012 1:55:42 AM Paul Mensonides <pmenso57@comcast.net> wrote:
On 11/4/2012 1:14 PM, Andrey Semashev wrote:
I am not against all extensions. I am against *feature* extensions. There is a place for *necessary* extensions, and, in those cases, lack of the presence of those extensions should easily fallback to the actual language. Hardware vectorization is one example of that. If it exists, it should exist as a compiler-specific pragma which can be ignored. Further, no feature extension to the language is good if the feature could be implemented via library--i.e. it should not have language extensions solely for the purpose of syntactic convenience.
On 04/11/12 23:38, Andrey Semashev wrote:
IMHO, pragma-controlled compiler-generated vectorization is a utopia, except for the very trivial things. Real benefit is provided by extensions like __m128 and intrinsic functions. Also remember the mentioned typeof and long long. Some things cannot be implemented on the library level efficiently, and pragmas are not the silver bullet.
Compiler vendors could provide a portable implementation of these types and intrinsics via a library. Of course, they should use more portable names that avoid conflicts, following the well-known naming guidelines. Just my 2cts. Vicente

On November 5, 2012 4:16:42 AM "Vicente J. Botet Escriba" <vicente.botet@wanadoo.fr> wrote:
IMHO, pragma-controlled compiler-generated vectorization is a utopia, except for the very trivial things. Real benefit is provided by extensions like __m128 and intrinsic functions. Also remember the mentioned typeof and long long. Some things cannot be implemented on the library level efficiently, and pragmas are not the silver bullet.
Compiler vendors could provide a portable implementation of these types and intrinsics via a library. Of course, they should use more portable names that avoid conflicts, following the well-known naming guidelines.
That's what they do (at least in the case of gcc, intel and msvc). Only, those types and functions are, or are based on, compiler extensions.
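As a concrete illustration of the intrinsics being discussed: the __m128 type and _mm_* functions below are the vendors' SSE intrinsics from <xmmintrin.h>, which gcc, icc and msvc all ship; the wrapper functions themselves are only a sketch.

    #include <xmmintrin.h>  // SSE intrinsics: __m128, _mm_add_ps, ...

    // Adds four packed floats at once. The type and functions are compiler
    // extensions, but the three major x86 compilers expose the same header
    // and names, so in practice they act as a de facto portable library.
    void add4(const float* a, const float* b, float* out)
    {
        __m128 va = _mm_loadu_ps(a);            // load 4 unaligned floats
        __m128 vb = _mm_loadu_ps(b);
        _mm_storeu_ps(out, _mm_add_ps(va, vb)); // out[i] = a[i] + b[i]
    }

    // Scalar fallback for non-x86 targets; whether a pragma-driven
    // autovectorizer turns this into the same code is exactly the point
    // of contention above.
    void add4_scalar(const float* a, const float* b, float* out)
    {
        for (int i = 0; i < 4; ++i)
            out[i] = a[i] + b[i];
    }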

On Sun, Nov 4, 2012 at 9:21 PM, Paul Mensonides <pmenso57@comcast.net> wrote:
Does it? If the goal is portable code, you compile the code on multiple platforms and compilers, and unintended use of extensions is easily detected. Lack of full support for standard C++ or C++11 and bugs are a much bigger problem IMO as they sometimes require non-trivial workarounds that complicate the code.
Yes, it does. Taking aim at GCC instead, a huge amount of GNU code will not compile without --std=gnu++11 instead of --std=c++11. Even more of it won't
If gnu++11 is used, the goal of the authors isn't portable code, is it? Basically you'd like to 'force' them to use c++11 by taking away the extensions?
Some libs should be part of the standard to lower the barrier to start using them.
Of course. However, I consider that to be a means to guarantee that the presence of those libraries is included in the "price" of the compiler and the support of that compiler. There is no way that every useful library in every useful domain can be added to the standard library. For C++ software development to really flourish, libraries from outside the standard need to be portable and reusable.
I agree
Take file system and networking libraries, for example. I can use those libraries regardless of whether the standard contains equivalent libraries because Boost contains them. I think adding networking support and file
For an app it's easy to depend on another lib. But for a lib, depending on another lib that might not be easily available / installable can be problematic.
OTOH, Linux has its issues also such as its massive assumption of a system-wide particular version of GCC with no C++ ABI encoding in the shared object versioning. The Linux package management models (i.e. apt-get, rpm, emerge) are also fundamentally broken because they don't scale (many-to-one-to-many vs many-to-many).
They could be better but I don't think calling them fundamentally broken is fair. -- Olaf

On 11/5/2012 4:16 AM, Olaf van der Spek wrote:
On Sun, Nov 4, 2012 at 9:21 PM, Paul Mensonides <pmenso57@comcast.net> wrote:
Yes, it does. Taking aim at GCC instead, a huge amount of GNU code will not compile without --std=gnu++11 instead of --std=c++11. Even more of it won't
If gnu++11 is used, the goal of the authors isn't portable code, is it? Basically you'd like to 'force' them to use c++11 by taking away the extensions?
For any extension that is just syntactic sugar or not desperately required (as hardware vectorization may be), yes. Their existence is damaging in the long term. Architecture-specific code is the minority. Besides those comparatively rare cases, platform-specific code (and by "platform" I don't mean "compiler") should be the only C++ code that is non-portable. Everything else should be portable almost by accident. A particular author might be shortsighted and not care about portability, but portability in the general case is what has to happen for computer science (not just C++) to really move forward. Platforms, compilers, and, in most cases, architectures need to be drop-in, interchangeable components, not foundations. Even with hardware vectorization, I'm not sure how much it should be used at present other than in critical places. The reason I say this is that I don't think we as academia/industry really know how to do multiprocessing (including vectorization) yet, and I suspect that it will end up being such that the overall way-of-coding, structuring data, etc. is significantly different--if not radically different. As an aside, I actually think this one thing might be the one that kills off all current major (which implies imperative) languages--including C++.
For an app it's easy to depend on another lib. But for a lib, depending on another lib that might not be easily available / installable can be problematic.
In some ways, Windows deployment is easier because you can distribute in-directory DLLs for many libraries that don't require their own installation programs and largely avoid DLL hell. In many ways, the Linux model is better because it has better facilities for reuse, but dealing with C++ ABI issues and version availability issues can be a nightmare also. Granted, you can do the same thing as with Windows with rpath if you really want to, but then you throw away memory reuse and, usually less importantly, disk reuse (just as you get with Windows with in-directory DLLs).
OTOH, Linux has its issues also such as its massive assumption of a system-wide particular version of GCC with no C++ ABI encoding in the shared object versioning. The Linux package management models (i.e. apt-get, rpm, emerge) are also fundamentally broken because they don't scale (many-to-one-to-many vs many-to-many).
They could be better but I don't think calling them fundamentally broken is fair.
Sorry, I didn't mean the tools themselves. I'm referring to the single points of update and/or vetting of the content that those tools work with (at least, via official repositories). They are fundamentally broken because all updates are essentially serialized through a single point. That just doesn't scale despite herculean effort, and most Linux distros are way behind the most current releases of most software because of that. Pressure for throughput at that point far outweighs the available throughput--the outcome is inevitable.

Currently, deploying on Linux via any of the package management systems is a nightmare unless you only need old compilers and only rely on old versions of other libraries. Besides the boilerplate distro differences in how one specifies a package, you run smack into version availability issues (related to which versions have so far gone through the single point) and ABI issues. For the ABI-related stuff, the so versioning model could include an ABI stamp of some kind. So, for example, I could build and install Boost 1.52 with both --std=c++98 and --std=c++11 and have them coexist. For handling the creation and processing of the (e.g.) dependency graph without single-sourcing it, I don't have any particularly great ideas. Regards, Paul Mensonides

On Mon, Nov 5, 2012 at 3:46 PM, Paul Mensonides <pmenso57@comcast.net> wrote:
For an app it's easy to depend on another lib. But for a lib, depending on another lib that might not be easily available / installable can be problematic.
In some ways, Windows deployment is easier because you can distribute in-directory DLLs for many libraries that don't require their own installation programs and largely avoid DLL hell. In many ways, the Linux model is better because it has better facilities for reuse, but dealing with C++ ABI issues and version availability issues can be a nightmare also. Granted, you can do the same thing as with Windows with rpath if you really want to, but then you throw away memory reuse and, usually less importantly, disk reuse (just as you get with Windows with in-directory DLLs).
Actually, I meant build-time deployment. Getting includes and libs installed.
They could be better but I don't think calling them fundamentally broken is fair.
Sorry, I didn't mean the tools themselves. I'm referring to the single points of update and/or vetting of the content that those tools work with (at least, via official repositories). They are fundamentally broken because all updates are essentially serialized through a single point. That just doesn't scale despite herculean effort, and most Linux distros are way behind the most current releases of most software because of that. Pressure for throughput at that point far outweighs the available throughput--the outcome is inevitable. Currently, deploying on Linux via any of the package management systems is a nightmare unless you only need old compilers and only rely on old versions of other libraries. Besides the boilerplate distro differences in how one specifies a package, you run smack into version availability issues (related to which versions have so far gone through the single point) and ABI issues.
I guess my definition of broken is different than yours. Yes, the model can be improved (greatly), but calling it broken? -- Olaf

on Mon Nov 05 2012, Paul Mensonides <pmenso57-AT-comcast.net> wrote:
If gnu++11 is used, the goal of the authors isn't portable code, is it? Basically you'd like to 'force' them to use c++11 by taking away the extensions?
For any extension that is just syntactic sugar or not desperately required (as hardware vectorization may be), yes. Their existence is damaging in the long term.
Wow, that doesn't sound good to me. We already have trouble establishing "existing practice" for new features. What you're suggesting would be a serious problem for the process, unless you only want to see major changes to the standard, and no new not-desperately-required-but-nice-to-have "cleanups" or "syntactic sugar." IMO those incremental improvements might be just as important as the big things. -- Dave Abrahams BoostPro Computing Software Development Training http://www.boostpro.com Clang/LLVM/EDG Compilers C++ Boost

On 11/8/2012 6:23 AM, Dave Abrahams wrote:
on Mon Nov 05 2012, Paul Mensonides <pmenso57-AT-comcast.net> wrote:
For any extension that is just syntactic sugar or not desperately required (as hardware vectorization may be), yes. Their existence is damaging in the long term.
Wow, that doesn't sound good to me. We already have trouble establishing "existing practice" for new features. What you're suggesting would be a serious problem for the process, unless you only want to see major changes to the standard, and no new not-desperately-required-but-nice-to-have "cleanups" or "syntactic sugar." IMO those incremental improvements might be just as important as the big things.
That's fine if the process starts and ends with the committee. For example: floating whether an idea has merit, implementing it if the path to standardization appears viable, gathering use experience, and then presenting it for standardization. If it is rejected, remove it (i.e. allow code to break that uses the experimental extension). If it is accepted as is, great. If it is accepted with modifications, modify it to conform (i.e. allow code to break that uses the experimental extension). In many cases of current extensions, however, we have significant existing practice in other languages. For example, one doesn't need an implementation of "properties" as an extension to C++, one doesn't need an implementation of an "interface" keyword, one doesn't need a "finally" extension, one doesn't need an "event" extension, etc., etc., yet that hasn't stopped MS from implementing them (and a bunch of others). Other implementers do this kind of thing also, but, AFAICT, none of them are even in the ballpark of MS running roughshod over the language with whatever they want. It is one thing if an extension is implemented for possible future standardization (and tweaked or removed pending the results of that) with the intent to gain use experience. It is another when it is simply some vendor deciding they like some feature, adding it, and then promoting its use. That is an abuse of power that fractures the language and destroys portability. Regards, Paul Mensonides

On 11/3/2012 5:24 PM, Paul Mensonides wrote:
On 11/3/2012 10:26 AM, Nathan Ridge wrote:
* Microsoft has released an out-of-band community technical preview (CTP) for their compiler, adding explicit conversion operators, raw string literals, function template default arguments, delegating constructors, uniform initialization, and variadic templates. (See http://channel9.msdn.com/Series/C9-Lectures-Stephan-T-Lavavej-Core-C-/STLCCS...
for instructions on how to use the CTP). Microsoft is promising more such feature releases during the first half of 2013. This is important for the whole C++ community, since it means that full C++11 support is becoming a reality across all widely used compilers.
Shortly after this talk, Herb held a Q&A session where people had the opportunity to ask him questions about these announcements (can't find a link at the moment).
Someone asked whether Microsoft intends to implement full 100% support for standard C++(11), and Herb answered with a resounding 'yes'.
<rant>
IMO, yet more marketing b***s***. This has been said before, and Herb has long since lost my trust (and the trust of many others). He is no longer a free voice. The only person on C9 that doesn't come off as an MS shill is STL.
Who is "STL" ?
Did anyone tell him about the problems with VC's preprocessor that come up on this list again and again and that prevent a powerful preprocessor metaprogramming library like Chaos from being usable on VC?
He's been told repeatedly.
Perhaps Herb Sutter does not have the power at Microsoft to determine what Microsoft will do in regards to complying with the C++ standard, as opposed to being just one perhaps leading voice among many in determining such things.
--
Not that I'm against a "foundation" or against adding more libraries to the standard library, but the only things that C++ programmers need to produce portable code are C++ compilers that implement the standard (and only the standard--not a bunch of vendor-specific extensions). As an example, paraphrasing, "We're proposing 'await' but if the committee doesn't want it we can always add it as an extension." It is particular compiler vendors and their compilers that are getting in the way of progress.
I disagree in principle that compiler vendors should not provide extensions to a computer language. After all, gcc has done it for many years. Of course, I feel that compiler vendors should implement a computer language as it is defined by the standard for that language and only provide extensions in situations where the extensions can be turned off in a clearly defined manner.

On 11/4/2012 12:10 PM, Edward Diener wrote:
On 11/3/2012 5:24 PM, Paul Mensonides wrote:
IMO, yet more marketing b***s***. This has been said before, and Herb has long since lost my trust (and the trust of many others). He is no longer a free voice. The only person on C9 that doesn't come off as an MS shill is STL.
Who is "STL" ?
Stephan T. Lavavej. He's a standard library guy at MS. He's around here from time to time, but also makes C++-related videos for MS's Channel 9 PDM (propaganda distribution mechanism).
Did anyone tell him about the problems with VC's preprocessor that come up on this list again and again and that prevent a powerful preprocessor metaprogramming library like Chaos from being usable on VC?
He's been told repeatedly.
Perhaps Herb Sutter does not have the power at Microsoft to determine what Microsoft will do in regards to complying with the C++ standard, as opposed to being just one perhaps leading voice among many in determining such things.
Sure. It is MS in general, not Herb specifically. If Herb wasn't there, it would be somebody else. However, Herb is the face of C++ at Microsoft, and he repeatedly refers to MS's supposed commitment to C++. Actions speak louder than words. Forget all of the careful marketing, and just implement the d*** language. Doing so would engender a much larger amount of goodwill towards MS over time.
Not that I'm against a "foundation" or against adding more libraries to the standard library, but the only things that C++ programmers need to produce portable code are C++ compilers that implement the standard (and only the standard--not a bunch of vendor-specific extensions). As an example, paraphrasing, "We're proposing 'await' but if the committee doesn't want it we can always add it as an extension." It is particular compiler vendors and their compilers that are getting in the way of progress.
I disagree in principle that compiler vendors should not provide extensions to a computer language. After all, gcc has done it for many years. Of course, I feel that compiler vendors should implement a computer language as it is defined by the standard for that language and only provide extensions in situations where the extensions can be turned off in a clearly defined manner.
For the record, I believe GCC's rampant use of extensions is also detrimental. Obviously, if all or most compilers implemented the entire language, extensions could be disabled, and the standard library worked 100% without the extensions, we'd be far better off than we are now. However, the presence of language extensions causes people who target a particular compiler to use those extensions. The problem with this is that C++ development should not target a particular compiler. If C++ development did not *have to* target compilers (as it does currently), then feature extensions would become meaningless, and portability would ensue. Furthermore, compiler extensions tend to be far less thought out than actual language features, and, even when the basic idea is good, the presence of the existing extensions interferes with standardization. E.g. typeof vs decltype. Regards, Paul Mensonides
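To make the typeof-vs-decltype point concrete, here is a minimal sketch (the function is hypothetical; the behavioral notes reflect GCC's documented typeof extension and C++11 decltype):

    #include <vector>

    template <class Container>
    void grow(Container& c)
    {
        // GNU extension, available only in GNU dialect modes such as -std=gnu++11:
        //     typeof(c.size()) n = c.size();
        // Standard C++11, portable across conforming compilers:
        decltype(c.size()) n = c.size();
        c.resize(n + 1);
    }

    int main()
    {
        std::vector<int> v(3);
        grow(v);  // v.size() is now 4
    }

    // Note the two are not even drop-in equivalents: for an lvalue x of
    // type int, decltype((x)) is int&, while typeof((x)) is int, so the
    // committee could not simply bless the existing extension as-is.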

On 4 November 2012 14:42, Paul Mensonides <pmenso57@comcast.net> wrote:
Stephan T. Lavavej. He's a standard library guy at MS. He's around here from time to time, but also makes C++-related videos for MS's Channel 9 PDM (propaganda distribution mechanism).
So by propaganda you mean making high quality (both in material and production values) videos freely available to the C++ community? Kudos to Microsoft for that! Bring it on! As I see it, Herb is trying to bring together the C++ community, and unfortunately, you seem to be trying to drive a wedge into it. You've made a straw man argument about altruism, even though Herb specifically said on Friday that the reason companies are contributing to the Standard C++ Foundation is that it makes good business sense for them, so I certainly don't see where *anybody*, let alone Herb, is claiming companies are sponsoring this out of the goodness of their hearts (if you have a counter link, I'd appreciate seeing it). I see incredibly obvious, not hidden agendas. Full disclosure: I work for one of the founding members of the Standard C++ Foundation. Even with Beman as one of the directors of the C++ Standard Foundation, you still want to encourage us to fear and doubt the effort? -- Nevin ":-)" Liber <mailto:nevin@eviloverlord.com> (847) 691-1404

On 11/5/2012 12:26 AM, Nevin Liber wrote:
On 4 November 2012 14:42, Paul Mensonides <pmenso57@comcast.net> wrote:
Stephan T. Lavavej. He's a standard library guy at MS. He's around here from time to time, but also makes C++-related videos for MS's Channel 9 PDM (propaganda distribution mechanism).
So by propaganda you mean making high quality (both in material and production values) videos freely available to the C++ community? Kudos to Microsoft for that! Bring it on!
I'm sorry, but most of the content on C9 is *blatantly* propaganda. More so, the general vibe I get from it is that the entire intent of C9 is marketing driven, not community outreach driven. Now, I did say most, not all, content. STL's videos are good. Some conference talks are okay--such as Bjarne's recent one. The content of those does not come off as propaganda.
As I see it, Herb is trying to bring together the C++ community, and unfortunately, you seem to be trying to drive a wedge into it.
No, I'm not. The argument I'm making, perhaps poorly, is that there is one thing more than any other that is needed for C++: for C++ users to be able to target the standard rather than compilers. So the dependency graph is

compiler -> standard
user -> standard

rather than

compiler -> standard
user -> compiler

The reason the latter is bad and leads to a lack of portable libraries is that it isn't just one user with one compiler. Instead it is more like millions of users and a bunch of compilers. With just two of each, you currently get:

compiler(1) -> standard
compiler(2) -> standard
user(1) -> { compiler(1) compiler(2) }
user(2) -> { compiler(1) compiler(2) }

to create a stable portable library base. This grows untenably, especially when one takes into account extension copying, non-compiler C++ tools, etc. This gets way worse. In fact, it is usually more like:

...
user(1) -> { compiler(1) }
user(2) -> { compiler(2) compiler(3) }
user(3) -> { compiler(1) compiler(4) }
...

However, it doesn't *have* to be that way, because it could be:

{ compiler-or-tool(1...inf) user(1...inf) } -> standard

For that to occur, extensions to the language should be downplayed in the extreme and only used when absolutely necessary and encapsulated to the greatest degree possible rather than promoted and constantly advertised. Also for that to occur, compilers need to correctly implement C++ in the limit.
You've made a straw man argument about altruism, even though Herb specifically said on Friday that the reason companies are contributing to the Standard C++ Foundation is that it makes good business sense for them, so I certainly don't see where *anybody*, let alone Herb, is claiming companies are sponsoring this out of the goodness of their hearts (if you have a counter link, I'd appreciate seeing it).
I don't have any problem with this foundation at all. I don't have a problem with MS funding it or any of the other companies funding it. At the same time, I don't yet know exactly what it is supposed to do or how successful it will be at whatever that is. I.e. my argument has nothing to do with the foundation or isocpp.org. Instead, it is years and years of MS saying the same thing, but not doing it, and also years of flat-out saying they *won't* implement some part of the standard. Sure, they do it for the big marketable features. There was no doubt they would implement variadic templates, for example. What I and many others want to hear from MS is the following: "We are going to implement all of the C++11 features and implement/fix all of the C++98 features which haven't been removed by C++11 (export)." And then go about actually doing that. I don't care if that takes five years. I don't care if the preprocessor issues are last on that list.

WRT propaganda, we have several years of Herb mentioning how we've dropped the ball on UI innovation, suddenly followed by Metro coupled with a closed-platform app store. We also have several years of proselytizing that the future is distributed cloud-based computing, followed by Azure. Sorry, when someone who was supposedly going to be "holding Microsoft accountable" starts constantly bringing up these types of things right before a big new marketing campaign, the respect and trust that I had from back in the GotW days deteriorates--so much so that even when he makes offhand comments about big touch screens I consider it to be nothing more than planned marketing--similarly for the near constant "beautiful modern apps." Even if I'm wrong, that is my perception and the perception of *many*, and if the goal is actually to "bring the community together," *it* is doing the exact opposite--not me.

In that last announcement talk alone, together with the follow-up Q & A, there are numerous "subtle" bits of marketing. The entire general trend is one of attempting to turn a product (software) into a service. That is an anti-consumer nickel-and-dime model, just as anti-consumer as trying to trick people who can't afford something into making payments, and just as anti-consumer as many other related things occurring in the software industry nowadays--microtransactions being one of them.
I see incredibly obvious, not hidden agendas. Full disclosure: I work for one of the founding members of the Standard C++ Foundation.
Oh, don't worry. I also see incredibly obvious agendas. As I said, however, I have no problem with the foundation at present. The only thing about it I believe is questionable is Herb's presence on the board of directors--and that is because I do not trust Herb. Other than having met him a few times, I don't know Herb, so I can't go off of personal knowledge of his integrity, and if it was just Herb, I'd give him the benefit of the doubt. But I won't give MS the benefit of the doubt given their history. Why, at this point, would it be rational for *anyone* to take what MS says at face value? I don't know enough about the others (except Beman and Bjarne, who are people I do trust--even if I violently disagree with Bjarne about certain things) to have a say one way or another.
Even with Beman as one of the directors of the C++ Standard Foundation, you still want to encourage us to fear and doubt the effort?
I'm specifically referring to VC++, MS's apparent general marketing strategy and how that affects VC++, and how MS's resulting behavior WRT VC++ is actually harmful to C++ in general. I am not saying that *every* action that MS takes is harmful, I'm not saying that individual members of the VC++ team are untalented or unethical, and I am not saying that the foundation is harmful. For the latter, time will tell. There are politics and factions to some degree within the committee. I'll reserve judgement until after I've seen that it doesn't turn into a one-party outlet. Don't get me wrong, I'm not expecting that, but it could happen. Regards, Paul Mensonides

on Mon Nov 05 2012, Paul Mensonides <pmenso57-AT-comcast.net> wrote:
Oh, don't worry. I also see incredibly obvious agendas. As I said, however, I have no problem with the foundation at present. The only thing about it I believe is questionable is Herb's presence on the board of directors--and that is because I do not trust Herb.
I've had my issues with Herb in the past, but on this particular point it's only fair to point out that he drove the whole effort of creating this foundation. He's invested significant *personal* blood sweat and tears in getting it set up, and his bosses at Microsoft didn't order him to do it. It seems perfectly appropriate to me that this organization be led by those with the largest personal investment in it. -- Dave Abrahams BoostPro Computing Software Development Training http://www.boostpro.com Clang/LLVM/EDG Compilers C++ Boost

On 11/8/2012 6:12 AM, Dave Abrahams wrote:
on Mon Nov 05 2012, Paul Mensonides <pmenso57-AT-comcast.net> wrote:
Oh, don't worry. I also see incredibly obvious agendas. As I said, however, I have no problem with the foundation at present. The only thing about it I believe is questionable is Herb's presence on the board of directors--and that is because I do not trust Herb.
I've had my issues with Herb in the past, but on this particular point it's only fair to point out that he drove the whole effort of creating this foundation. He's invested significant *personal* blood sweat and tears in getting it set up, and his bosses at Microsoft didn't order him to do it. It seems perfectly appropriate to me that this organization be led by those with the largest personal investment in it.
Well, I don't really have a problem with it. Whether it turns out to be a good thing or a bad thing depends on how it is handled. With this foundation, I'm actually less concerned about Herb's presence and more concerned about it being a one-party party WRT committee "factions", but time will tell. Regards, Paul Mensonides

On 11/3/2012 1:26 PM, Nathan Ridge wrote:
* Microsoft has released an out-of-band community technical preview (CTP) for their compiler, adding explicit conversion operators, raw string literals, function template default arguments, delegating constructors, uniform initialization, and variadic templates. (See http://channel9.msdn.com/Series/C9-Lectures-Stephan-T-Lavavej-Core-C-/STLCCS... for instructions on how to use the CTP). Microsoft is promising more such feature releases during the first half of 2013. This is important for the whole C++ community, since it means that full C++11 support is becoming a reality across all widely used compilers.
Shortly after this talk, Herb held a Q&A session where people had the opportunity to ask him questions about these announcements (can't find a link at the moment).
Someone asked whether Microsoft intends to implement full 100% support for standard C++(11), and Herb answered with a resounding 'yes'.
Did anyone tell him about the problems with VC's preprocessor that come up on this list again and again and that prevent a powerful preprocessor metaprogramming library like Chaos from being usable on VC?
I doubt if Microsoft considers full support for C++(11) to also mean producing a compliant preprocessor. I am not defending them for that viewpoint, but it does seem as if they view the preprocessor as a separate part of the C++ language which they do not have to implement according to the latest C/C++ standards.

On Sun, Nov 4, 2012 at 3:01 PM, Edward Diener <eldiener@tropicsoft.com>wrote:
I doubt if Microsoft considers full support for C++(11) to also mean producing a compliant preprocessor. I am not defending them for that viewpoint, but it does seem as if they view the preprocessor as a separate part of the C++ language which they do not have to implement according to the latest C/C++ standards.
I think the problem is that so few people care about a compliant preprocessor, and those who do care are often library developers who consistently bend over backwards with workarounds to support VC++ anyway. That, coupled with the fact that fixing their preprocessor means potentially breaking people's code (even though it was noncompliant to begin with), means that it is an extremely low priority for them. IMO, they should just include something like Boost.Wave or Clang's preprocessor and have a compiler option to enable it (even if it defaults to not being enabled, falling back to their noncompliant preprocessor) and make sure that the Windows headers compile correctly with it on. I can't imagine that this would be too difficult for them to do, and at least then library developers would have some way to support VC++ without tons of workarounds -- just tell users of the library to pass in the appropriate compiler option. -- -Matt Calabrese

On 11/4/2012 12:33 PM, Matt Calabrese wrote:
On Sun, Nov 4, 2012 at 3:01 PM, Edward Diener <eldiener@tropicsoft.com>wrote:
I doubt if Microsoft considers full support for C++(11) to also mean producing a compliant preprocessor. I am not defending them for that viewpoint, but it does seem as if they view the preprocessor as a separate part of the C++ language which they do not have to implement according to the latest C/C++ standards.
I think the problem is that so few people care about a compliant preprocessor, and those who do care are often library developers who consistently bend over backwards with workarounds to support VC++ anyway. That, coupled with the fact that fixing their preprocessor means potentially breaking people's code (even though it was noncompliant to begin with), means that it is an extremely low priority for them. IMO, they should just include something like Boost.Wave or Clang's preprocessor and have a compiler option to enable it (even if it defaults to not being enabled, falling back to their noncompliant preprocessor) and make sure that the Windows headers compile correctly with it on. I can't imagine that this would be too difficult for them to do, and at least then library developers would have some way to support VC++ without tons of workarounds -- just tell users of the library to pass in the appropriate compiler option.
There are certainly features of the language which are more important than the preprocessor. However, the preprocessor has been broken for decades. More than that, in comparison to many other language facilities, implementing it correctly is easy. The problem is not that it is a low priority among language features. The problem is that other things such as extensions and integrated libraries are prioritized ahead of language features/bugs--and new instances of those keep occurring so the end of the priority list never gets reached. In the specific case of the preprocessor and some others, the problem is that it is continuously classed as "won't fix" and therefore has no priority at all.

I would be much happier with a priority list that was something like:

1. C++ feature A
2. C++ feature B
3. C++ bug 1
...
100. C/C++ preprocessor
101. language extensions for AMP
102. language extensions for C++/CLI/CX

I would be actually happy with a priority list that was something like:

1. C++ feature A
2. C++ feature B
3. C++ bug 1
...
100. C/C++ preprocessor

with language extensions for AMP, C++/CLI, and C++/CX not existing at all and instead implemented entirely as libraries. In that case, the compiler would not be dependent on the development of these things and therefore these things could be developed concurrently by separate teams and have their own priority lists. However, both of the above priority lists are *far* away from what actually appears to occur. Instead, it is more like:

1. C++ bindings to WinRT (marketing driven)
2. C++ bindings to C++/CLI (marketing driven)
3. C++ subsetting/extensions for AMP (marketing driven ultimately for cloud services)
4. new C++11 features (actually also marketing driven)
5. feature-breaking bugs

Almost everything else is classified as "won't fix." Worse than the above, (1), (2), and (3) get implemented and taken off the list only to be replaced by new marketing-driven, extra-linguistic concerns. The result is excessively slow adoption of C++11 features and almost no adoption of missing C++98 features or fixing of mis-implemented C++98 features. Virtually every action taken by MS WRT VC++ is contrary to what Herb repeatedly says in his various talks and interviews. MS is free to do whatever they want. However, stop lying to us and stop playing politics. Regards, Paul Mensonides
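For readers who haven't run into the preprocessor problems being referred to, here is a minimal sketch of the best-known symptom, as commonly reported against the traditional VC++ preprocessor of that era (the macro names are illustrative, not taken from Chaos or Boost.Preprocessor):

    #include <cstdio>

    // A conforming preprocessor substitutes __VA_ARGS__ and rescans, so
    // COUNT(10, 20) becomes COUNT_I(10, 20, 3, 2, 1, ~) and N is 2.
    // The traditional VC++ preprocessor passes __VA_ARGS__ on as a single
    // argument, so _1 receives "10, 20" and N comes out as 1 instead.
    #define COUNT_I(_1, _2, _3, N, ...) N
    #define COUNT(...) COUNT_I(__VA_ARGS__, 3, 2, 1, ~)

    // Commonly used workaround: force an extra scan of the expansion.
    #define EXPAND(x) x
    #define COUNT_W(...) EXPAND(COUNT_I(__VA_ARGS__, 3, 2, 1, ~))

    int main()
    {
        // Conforming preprocessors print "2 2"; the old VC++ preprocessor
        // was reported to print "1 2", which is the kind of divergence the
        // workarounds in preprocessor libraries exist to paper over.
        std::printf("%d %d\n", COUNT(10, 20), COUNT_W(10, 20));
    }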

On 03/11/12 15:52, Beman Dawes wrote:
* Microsoft has released an out-of-band community technical preview (CTP) for their compiler, adding explicit conversion operators, raw string literals, function template default arguments, delegating constructors, uniform initialization, and variadic templates. (See http://channel9.msdn.com/Series/C9-Lectures-Stephan-T-Lavavej-Core-C-/STLCCS... for instructions on how to use the CTP). Microsoft is promising more such feature releases during the first half of 2013. This is important for the whole C++ community, since it means that full C++11 support is becoming a reality across all widely used compilers.
I find the idea of a community technical preview surprising, since Microsoft has been repeatedly refusing to fix bugs reported by the community unless you have purchased commercial support. It seems they focus on adding new features rather than fixing bugs. While a good marketing strategy for a commercial product, it arguably doesn't work so well with a "community" approach.

On Sat, Nov 3, 2012 at 2:05 PM, Mathias Gaunard <mathias.gaunard@ens-lyon.org> wrote:
On 03/11/12 15:52, Beman Dawes wrote:
* Microsoft has released an out-of-band community technical preview (CTP) for their compiler, adding explicit conversion operators, raw string literals, function template default arguments, delegating constructors, uniform initialization, and variadic templates. (See
http://channel9.msdn.com/Series/C9-Lectures-Stephan-T-Lavavej-Core-C-/STLCCS... for instructions on how to use the CTP). Microsoft is promising more such feature releases during the first half of 2013. This is important for the whole C++ community, since it means that full C++11 support is becoming a reality across all widely used compilers.
I find the idea of a community technical preview surprising, since Microsoft has been repeatedly refusing to fix bugs reported by the community unless you have purchased commercial support.
Have you filed a bug report recently? I filed one this summer and had a response the next day.
It seems they focus on adding new features rather than fixing bugs. While a good marketing strategy for a commercial product, it arguably doesn't work so well with a "community" approach.
Here are some lists of bugs fixed: http://blogs.msdn.com/b/vcblog/archive/2012/06/15/10320846.aspx http://blogs.msdn.com/b/vcblog/archive/2012/08/10/10338661.aspx Note that those lists don't include bugs reported by those with commercial support contracts. --Beman

On Sat, Nov 3, 2012 at 7:47 PM, Beman Dawes <bdawes@acm.org> wrote:
I find the idea of a community technical preview surprising, since Microsoft has been repeatedly refusing to fix bugs reported by the community unless you have purchased commercial support.
Have you filed a bug report recently? I filed one this summer and had a response the next day.
It seems they focus on adding new features rather than fixing bugs. While a good marketing strategy for a commercial product, it arguably doesn't work so well with a "community" approach.
Here some lists of bugs fixed:
http://blogs.msdn.com/b/vcblog/archive/2012/06/15/10320846.aspx http://blogs.msdn.com/b/vcblog/archive/2012/08/10/10338661.aspx
Yes, I have: https://connect.microsoft.com/VisualStudio/feedback/details/752402/wrong-cod... https://connect.microsoft.com/VisualStudio/feedback/details/752386/std-atoll... https://connect.microsoft.com/VisualStudio/feedback/details/587544/copy-from... MS even stopped accepting feature requests via Connect. A modern UI style few people wanted was also considered more important than other IDE stuff. They do fix some things though, like https://connect.microsoft.com/VisualStudio/feedback/details/621653/including... -- Olaf

On 3 November 2012 18:58, Olaf van der Spek <ml@vdspek.org> wrote:
On Sat, Nov 3, 2012 at 7:47 PM, Beman Dawes <bdawes@acm.org> wrote:
I find the idea of a community technical preview surprising, since Microsoft has been repeatedly refusing to fix bugs reported by the community unless you have purchased commercial support.
Have you filed a bug report recently? I filed one this summer and had a response the next day.
It seems they focus on adding new features rather than fixing bugs. While a good marketing strategy for a commercial product, it arguably doesn't work so well with a "community" approach.
Here some lists of bugs fixed:
http://blogs.msdn.com/b/vcblog/archive/2012/06/15/10320846.aspx http://blogs.msdn.com/b/vcblog/archive/2012/08/10/10338661.aspx
Yes, I have: https://connect.microsoft.com/VisualStudio/feedback/details/752402/wrong-cod... https://connect.microsoft.com/VisualStudio/feedback/details/752386/std-atoll... https://connect.microsoft.com/VisualStudio/feedback/details/587544/copy-from...
MS even stopped accepting feature requests via Connect
See Doug Turnure's response to my tweet here: https://twitter.com/dougt/status/244191668193087489 Screenshot in case the tweet is inaccessible: http://www.flickr.com/photos/mloskot/8157327469/ Like it or not, that's due to the confusing way Microsoft works through the Connect reports. For example, a bug is reported against VS2010, then marked as "won't fix", but that does not mean it won't be fixed at all. It means it won't be fixed for VS2010. The bug is closed. Then VS2012 is released, and magically the bug reported against VS2010 appears to be fixed in VS2012.
They do fix some things though, like https://connect.microsoft.com/VisualStudio/feedback/details/621653/including...
Yes, e.g. this fix may be of interest to boost::property_tree users: "Compiling trivial boost::property_tree test gives fatal error C1001" https://connect.microsoft.com/VisualStudio/feedback/details/708011 Best regards, -- Mateusz Loskot, http://mateusz.loskot.net

On 03/11/12 19:47, Beman Dawes wrote:
Have you filed a bug report recently? I filed one this summer and had a response the next day.
I have filed a couple, and voted for several existing ones. I always got something like "it's too late in the release cycle, WONTFIX" as an answer. While I do stumble across VC++ bugs fairly often (that compiler seriously has way more bugs than any other major C++ compiler, both language bugs and internal errors), I simply do not report them anymore. What's the point if they're all going to be marked WONTFIX?
Here some lists of bugs fixed:
http://blogs.msdn.com/b/vcblog/archive/2012/06/15/10320846.aspx http://blogs.msdn.com/b/vcblog/archive/2012/08/10/10338661.aspx
Those are library bugs. The problematic bugs to get fixed are those in the compiler itself. Codegen bugs tend to get fixed eventually. Minor C++ bugs sometimes get fixed if you're lucky. Core C++ bugs? They're all closed, unless the bug is so major that the feature doesn't work at all.

On Sat, Nov 3, 2012 at 7:50 PM, Mathias Gaunard <mathias.gaunard@ens-lyon.org> wrote:
Here some lists of bugs fixed:
http://blogs.msdn.com/b/vcblog/archive/2012/06/15/10320846.aspx http://blogs.msdn.com/b/vcblog/archive/2012/08/10/10338661.aspx
Those are library bugs. The problematic bugs to get fixed are those in the compiler itself.
The first list was library bugs, the second was roughly 300 compiler bugs. --Beman

I find the idea of a community technical preview surprising, since Microsoft has been repeatedly refusing to fix bugs reported by the community unless you have purchased commercial support.
Have you filed a bug report recently? I filed one this summer and had a response the next day.
I'll help with this with pleasure: http://boost.2283326.n4.nabble.com/boost-msm-std-vector-lt-MyStateMachine-gt... The issue was closed as won't fix and is still that way. The links to related bugs are not working, so no idea about them. I won't even think of using VC again until this is fixed; it's simply too big to allow me to trust the compiler. Christophe

On Sun, Nov 4, 2012 at 12:27 PM, Christophe Henry < christophe.j.henry@googlemail.com> wrote
I'll help with this with pleasure: http://boost.2283326.n4.nabble.com/boost-msm-std-vector-lt-MyStateMachine-gt-generate-stack-overflow-td3532530.html
The issue was closed as won't fix and is still that way. The links to related bugs are not working so no idea about them. I won't even think of using VC again until this is fixed, it's simply too big to allow me trust the compiler.
Interesting! I totally forgot about this. I just retrieved the zip with the source code, and I tested it with:
- boost 1.51.0
- VS2012 + CTP1 that we are talking about (didn't try with the non-CTP version of the compiler) (there is also the October update)
I get no warning anymore and the executable seems to run correctly. Looks like there has been a fix; not sure if it's MSM or the compiler. Joel Lamotte

The issue was closed as won't fix and is still that way. The links to related bugs are not working so no idea about them. I won't even think of using VC again until this is fixed, it's simply too big to allow me trust the compiler.
Interesting! I totally forgot about this. I just retrieved the zip with the source code, and I tested it with - boost 1.51.0 - VS2012 + CTP1 that we are talking about (didn't try with the non-CTP version of the compiler) (there is also the October update)
I get no warning anymore and the executable seems to run correctly.
Looks like there have been a fix, not sure if it's MSM or the compiler.
Joel Lamotte
There was no MSM fix for this, so maybe a compiler fix. Who knows? Maybe MS fixes bugs once in a while. I have no VS2012 so I can't test it. Christophe

On Sun, Nov 4, 2012 at 3:27 PM, Christophe Henry <christophe.j.henry@googlemail.com> wrote:
I find the idea of a community technical preview surprising, since Microsoft has been repeatedly refusing to fix bugs reported by the community unless you have purchased commercial support.
Have you filed a bug report recently? I filed one this summer and had a response the next day.
I'll help with this with pleasure: http://boost.2283326.n4.nabble.com/boost-msm-std-vector-lt-MyStateMachine-gt...
The issue was closed as won't fix and is still that way. The links to related bugs are not working so no idea about them. I won't even think of using VC again until this is fixed, it's simply too big to allow me trust the compiler.
+1, I had reported a memory leak bug [1] to MS and had been told that the behavior is correct as it doesn't contradict the Standard. I can't trust the vendor that treats users this way and declares a memory leak as a valid behavior. I doubt I will ever bother reporting any other bugs in MSVC. [1] The leak appeared in STL streams, if initialized multiple times. Here's a code snippet: https://sourceforge.net/apps/trac/boost-log/ticket/2#comment:4 I didn't keep a reference to the MS bug tracker and I can't find it now.

[Andrey Semashev]
+1, I had reported a memory leak bug [1] to MS and had been told that the behavior is correct as it doesn't contradict the Standard. I can't trust the vendor that treats users this way and declares a memory leak as a valid behavior. I doubt I will ever bother reporting any other bugs in MSVC. [1] The leak appeared in STL streams, if initialized multiple times. Here's a code snippet: https://sourceforge.net/apps/trac/boost-log/ticket/2#comment:4 I didn't keep a reference to the MS bug tracker and I can't find it now.
This was Dev10#831920/Connect#518512. I can't load the Connect link anymore (it is supposed to be http://connect.microsoft.com/VisualStudio/feedback/details/518512/memory-lea... and I don't know why it's broken) but I can still see the comments through Team Foundation Server. This behavior was really, truly conformant to the Standard - I was surprised too, so I had to ask P.J. Plauger for an explanation. Here are the comments:

--

Posted by ildjarn on 12/7/2009 at 3:21 AM
Reproduced using VC++ 2010 beta 2. I traced the leak back to std::ios_base::_Init(), which effectively does the following (where _Ploc is a std::locale*):

    _Ploc = 0;
    // other code
    _Ploc = new locale;

I would think that before assigning NULL to _Ploc, it should be doing 'delete _Ploc;'.

Posted by Microsoft on 12/7/2009 at 6:39 PM
Thank you for your feedback, we are currently reviewing the issue you have submitted. If this issue is urgent, please contact support directly (http://support.microsoft.com)

Posted by Microsoft on 12/8/2009 at 10:36 PM
Thanks for your feedback. We are rerouting this issue to the appropriate group within the Visual Studio Product Team for triage and resolution. These specialized experts will follow-up with your issue. Thank you

Posted by Microsoft on 12/9/2009 at 3:11 PM
Hi, Thanks for reporting this issue. I've resolved it as By Design because section 27.4.4.1 [lib.basic.ios.cons] of the 2003 C++ Standard does not allow init() to be called on a constructed object. (Yes, this is strange.) If you have any further questions, feel free to E-mail me at stl@microsoft.com . Stephan T. Lavavej Visual C++ Libraries Developer

Posted by AndySem on 12/9/2009 at 9:42 PM
Sorry, but what particular statement in the Standard leads to this conclusion? I did not find such a restriction in 27.4.4.1 (and in particular, in point 3, where init effects are described).

Posted by Microsoft on 12/10/2009 at 4:47 PM
See 27.4.4.1/2: basic_ios() "Constructs an object of class basic_ios (27.4.2.7) leaving its member objects uninitialized. The object must be initialized by calling its init member function. If it is destroyed before it has been initialized the behavior is undefined." Because init() is called when data members are uninitialized, it must assume that any pointers are garbage, and can't delete them. That's why it leaks memory when called on a fully initialized object. Stephan T. Lavavej Visual C++ Libraries Developer

Posted by AndySem on 12/10/2009 at 8:24 PM
First, this does not mean that init cannot be called more than once. Technically speaking, it may track whether it was called or not in an internal data member that is not covered by the Standard. Second, the Standard does not describe data members at all in the first place. It does not say that basic_ios shall have raw pointers instead of, say, auto_ptrs. The constructor cannot be called without default initializing these members. So the whole statement about "leaving its member objects uninitialized" is a moot and should be understood as "the members are default initialized and in that state the basic_ios object may not be usable". That is why I still consider it as a bug. I think it should be fixed.

Posted by Microsoft on 12/11/2009 at 1:24 PM
The Standard allows an implementation to provide very weak guarantees on when init() may be called; therefore, VC's implementation conforms to the Standard. Although we could probably provide stronger guarantees, strictly conforming programs couldn't take advantage of them.
Our implementation (licensed from Dinkumware) has behaved this way for over 20 years, and given our finite resources we don't believe that changing our design here is worth the time necessary. (There are some places where we believe that exceeding the Standard's guarantees is worthwhile - e.g. we supported stateful allocators long before C++0x required it.) I know that this isn't what you want to hear, but this is why the Standard exists in the first place - to provide a contract between implementers and users, saying what implementers are required to provide and what users are allowed to depend on. If you would like the Standard to provide stronger guarantees, you can file a Library Issue with the Committee. Stephan T. Lavavej Visual C++ Libraries Developer -- STL
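To make the disputed scenario concrete, here is a minimal sketch of the pattern in question -- a derived stream class calling basic_ios::init() a second time after construction. The class is hypothetical, not the Boost.Log code from the ticket; whether the second call leaks is exactly the implementation-specific behavior debated above.

    #include <iostream>

    struct reinit_stream : std::basic_ios<char>
    {
        explicit reinit_stream(std::streambuf* buf)
        {
            init(buf);   // the one call the standard clearly intends
        }

        void reattach(std::streambuf* buf)
        {
            init(buf);   // second call on an already-initialized object:
                         // reported to leak the internally held locale on
                         // the implementation discussed above
        }
    };

    int main()
    {
        reinit_stream s(std::cout.rdbuf());
        s.reattach(std::cerr.rdbuf());
    }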

On November 5, 2012 6:06:42 AM "Stephan T. Lavavej" <stl@exchange.microsoft.com> wrote:
[Andrey Semashev]
+1, I had reported a memory leak bug [1] to MS and had been told that the behavior is correct as it doesn't contradict the Standard. I can't trust the vendor that treats users this way and declares a memory leak as a valid behavior. I doubt I will ever bother reporting any other bugs in MSVC. [1] The leak appeared in STL streams, if initialized multiple times. Here's a code snippet: https://sourceforge.net/apps/trac/boost-log/ticket/2#comment:4 I didn't keep a reference to the MS bug tracker and I can't find it now.
This was Dev10#831920/Connect#518512. I can't load the Connect link anymore (it is supposed to be http://connect.microsoft.com/VisualStudio/feedback/details/518512/memory-lea... and I don't know why it's broken) but I can still see the comments through Team Foundation Server. This behavior was really, truly conformant to the Standard - I was surprised too, so I had to ask P.J. Plauger for an explanation. Here are the comments:
[snip] Thank you Stephan for digging it out. But I still don't agree with your conclusions and consider it a serious bug.

On 5 November 2012 02:22, Andrey Semashev <andrey.semashev@gmail.com> wrote:
Thank you Stephan for digging it out. But I still don't agree with your conclusions and consider it a serious bug.
What wording in the standard makes it a bug (instead of just a QoI issue)?
--
Nevin ":-)" Liber <mailto:nevin@eviloverlord.com> (847) 691-1404

On Mon, Nov 5, 2012 at 12:31 PM, Nevin Liber <nevin@eviloverlord.com> wrote:
On 5 November 2012 02:22, Andrey Semashev <andrey.semashev@gmail.com> wrote:
Thank you Stephan for digging it out. But I still don't agree with your conclusions and consider it a serious bug.
What wording in the standard makes it a bug (instead of just a QoI issue)?
A method can be called more than once, if not specified otherwise in its documentation. I'm not sure it is spelled out explicitly in the standard but that's how I understand it. There is no statement prohibiting calling init() multiple times in the Standard so I assume it is allowed. If this doesn't look obvious then something must be wrong with my head.

Andrey Semashev wrote:
Thank you Stephan for digging it out. But I still don't agree with your conclusions and consider it a serious bug.
Your code just seems broken to me, sorry. It's absolutely clear that the intent of the standard as currently written is for init to be called exactly once, immediately after default construction.

AMDG On 11/05/2012 01:57 AM, Peter Dimov wrote:
Andrey Semashev wrote:
Thank you Stephan for digging it out. But I still don't agree with your conclusions and consider it a serious bug.
Your code just seems broken to me, sorry. It's absolutely clear that the intent of the standard as currently written is for init to be called exactly once, immediately after default construction.
+1. The requirement that init must be called before destruction wouldn't make any sense if this weren't the intent. In Christ, Steven Watanabe

On Mon, Nov 5, 2012 at 3:22 AM, Andrey Semashev <andrey.semashev@gmail.com> wrote:
On November 5, 2012 6:06:42 AM "Stephan T. Lavavej" <stl@exchange.microsoft.com> wrote:
[Andrey Semashev]
+1, I had reported a memory leak bug [1] to MS and had been told that the behavior is correct as it doesn't contradict the Standard. I can't trust the vendor that treats users this way and declares a memory leak as a valid behavior. I doubt I will ever bother reporting any other bugs in MSVC. [1] The leak appeared in STL streams, if initialized multiple times. Here's a code snippet: https://sourceforge.net/apps/trac/boost-log/ticket/2#comment:4 I didn't keep a reference to the MS bug tracker and I can't find it now.
This was Dev10#831920/Connect#518512. I can't load the Connect link anymore (it is supposed to be http://connect.microsoft.com/VisualStudio/feedback/details/518512/memory-lea... and I don't know why it's broken) but I can still see the comments through Team Foundation Server. This behavior was really, truly conformant to the Standard - I was surprised too, so I had to ask P.J. Plauger for an explanation. Here are the comments:
[snip]
Thank you Stephan for digging it out. But I still don't agree with your conclusions and consider it a serious bug.
If you read the standard one way and a library supplier reads it another way, you can file a Library Working Group issue. It may be that the standard needs clarification, or there is a mistake in the standard, or the standard is correct as written but the LWG feels the behavior should be changed, or whatever. Resolving this kind of issue is one of the functions of the standards committee. If some implementations behave one way, and some behave a different way, the LWG is likely to be particularly interested in pursuing the issue. --Beman

On Tue, Nov 6, 2012 at 5:24 AM, Beman Dawes <bdawes@acm.org> wrote:
On Mon, Nov 5, 2012 at 3:22 AM, Andrey Semashev <andrey.semashev@gmail.com> wrote:
On November 5, 2012 6:06:42 AM "Stephan T. Lavavej" <stl@exchange.microsoft.com> wrote:
[Andrey Semashev]
+1, I had reported a memory leak bug [1] to MS and had been told that the behavior is correct as it doesn't contradict the Standard. I can't trust the vendor that treats users this way and declares a memory leak as a valid behavior. I doubt I will ever bother reporting any other bugs in MSVC. [1] The leak appeared in STL streams, if initialized multiple times. Here's a code snippet: https://sourceforge.net/apps/trac/boost-log/ticket/2#comment:4 I didn't keep a reference to the MS bug tracker and I can't find it now.
This was Dev10#831920/Connect#518512. I can't load the Connect link anymore (it is supposed to be http://connect.microsoft.com/VisualStudio/feedback/details/518512/memory-lea... and I don't know why it's broken) but I can still see the comments through Team Foundation Server. This behavior was really, truly conformant to the Standard - I was surprised too, so I had to ask P.J. Plauger for an explanation. Here are the comments:
[snip]
Thank you Stephan for digging it out. But I still don't agree with your conclusions and consider it a serious bug.
If you read the standard one way and a library supplier reads it another way, you can file a Library Working Group issue. It may be that the standard needs clarification, or there is a mistake in the standard, or the standard is correct as written but the LWG feels the behavior should be changed, or whatever. Resolving this kind of issue is one of the functions of the standards committee.
If some implementations behave one way, and some behave a different way, the LWG is likely to be particularly interested in pursuing the issue.
I've tested this with STLPort when I found the problem and with GCC 4.7 now. Neither showed the leak, so MS STL stands out. Yes, I would like this issue to be resolved in the Standard, but I don't think I have the necessary resources to prepare and (most importantly) defend the proposal before the committee.

on Tue Nov 06 2012, Andrey Semashev <andrey.semashev-AT-gmail.com> wrote:
I've tested this with STLPort when I found the problem, and with GCC 4.7 now. Neither showed the leak, so the MS STL stands out. Yes, I would like this issue to be resolved in the Standard, but I don't think I have the necessary resources to prepare and (most importantly) defend the proposal before the committee.
This doesn't require a proposal; it merely requires filing a defect report. That's the sort of thing that goes on the LWG issues list, not a stand-alone paper of the kind that goes in the mailing. Surely you have time for that. -- Dave Abrahams BoostPro Computing Software Development Training http://www.boostpro.com Clang/LLVM/EDG Compilers C++ Boost

On Thu, Nov 8, 2012 at 6:18 PM, Dave Abrahams <dave@boostpro.com> wrote:
on Tue Nov 06 2012, Andrey Semashev <andrey.semashev-AT-gmail.com> wrote:
I've tested this with STLPort when I found the problem, and with GCC 4.7 now. Neither showed the leak, so the MS STL stands out. Yes, I would like this issue to be resolved in the Standard, but I don't think I have the necessary resources to prepare and (most importantly) defend the proposal before the committee.
This doesn't require a proposal; it merely requires filing a defect report. That's the sort of thing that goes on the LWG issues list, not a stand-alone paper of the kind that goes in the mailing. Surely you have time for that.
Is there a description of how to do that? E.g., what should the defect report look like, and where should I send it? I tried to find it on isocpp.org but didn't succeed.

on Thu Nov 08 2012, Andrey Semashev <andrey.semashev-AT-gmail.com> wrote:
On Thu, Nov 8, 2012 at 6:18 PM, Dave Abrahams <dave@boostpro.com> wrote:
on Tue Nov 06 2012, Andrey Semashev <andrey.semashev-AT-gmail.com> wrote:
I've tested this with STLPort when I found the problem, and with GCC 4.7 now. Neither showed the leak, so the MS STL stands out. Yes, I would like this issue to be resolved in the Standard, but I don't think I have the necessary resources to prepare and (most importantly) defend the proposal before the committee.
This doesn't require a proposal; it merely requires filing a defect report. That's the sort of thing that goes on the LWG issues list, not a stand-alone paper of the kind that goes in the mailing. Surely you have time for that.
Is there a description of how to do that? E.g., what should the defect report look like, and where should I send it? I tried to find it on isocpp.org but didn't succeed.
http://www.comeaucomputing.com/csc/faq.html#B15 There ought to be a better way, but for now, that's how. -- Dave Abrahams BoostPro Computing Software Development Training http://www.boostpro.com Clang/LLVM/EDG Compilers C++ Boost

On 8 November 2012 09:50, Dave Abrahams <dave@boostpro.com> wrote:
http://www.comeaucomputing.com/csc/faq.html#B15
There ought to be a better way, but for now, that's how.
More up to date: <http://cplusplus.github.com/LWG/lwg-active.html#submit_issue>.
--
Nevin ":-)" Liber <mailto:nevin@eviloverlord.com> (847) 691-1404

on Thu Nov 08 2012, Nevin Liber <nevin-AT-eviloverlord.com> wrote:
On 8 November 2012 09:50, Dave Abrahams <dave@boostpro.com> wrote:
http://www.comeaucomputing.com/csc/faq.html#B15
There ought to be a better way, but for now, that's how.
More up to date: <http://cplusplus.github.com/LWG/lwg-active.html#submit_issue>.
Right; what Nevin said -- Dave Abrahams BoostPro Computing Software Development Training http://www.boostpro.com Clang/LLVM/EDG Compilers C++ Boost

On Fri, Nov 9, 2012 at 1:44 AM, Dave Abrahams <dave@boostpro.com> wrote:
on Thu Nov 08 2012, Nevin Liber <nevin-AT-eviloverlord.com> wrote:
More up to date: <http://cplusplus.github.com/LWG/lwg-active.html#submit_issue>.
Right; what Nevin said
Thank you for the link; I've sent the DR. It would be nice if this link (or a full walkthrough of creating a DR) were available on isocpp.org, since it is advertised as the central resource for C++.

On Thu, Nov 8, 2012 at 9:18 AM, Dave Abrahams <dave@boostpro.com> wrote:
on Tue Nov 06 2012, Andrey Semashev <andrey.semashev-AT-gmail.com> wrote:
I've tested this with STLPort when I found the problem, and with GCC 4.7 now. Neither showed the leak, so the MS STL stands out. Yes, I would like this issue to be resolved in the Standard, but I don't think I have the necessary resources to prepare and (most importantly) defend the proposal before the committee.
This doesn't require a proposal; it merely requires filing a defect report. That's the sort of thing that goes on the LWG issues list, not a stand-alone paper of the kind that goes in the mailing. Surely you have time for that.
+1 See http://cplusplus.github.com/LWG/lwg-active.html#submit_issue for how to submit a Library issue. --Beman
participants (22)
- Andrey Semashev
- Beman Dawes
- Bo Persson
- Christophe Henry
- Cory Nelson
- Dave Abrahams
- Edward Diener
- Eric Niebler
- Greg Rubino
- Klaim - Joël Lamotte
- Mateusz Loskot
- Mathias Gaunard
- Matt Calabrese
- Nathan Ridge
- Nevin Liber
- Olaf van der Spek
- Paul Mensonides
- Peter Dimov
- Stephan T. Lavavej
- Steven Watanabe
- Tim Blechmann
- Vicente J. Botet Escriba