Sure, the effect of such guidelines, including the "Core Guidelines", is to further shatter the language into effectively incompatible dialects while not really solving anything.
One of the book's authors actually proposed work to get C++ to a place where C++ 20 could begin abolishing the things everybody agrees you shouldn't do: Vittorio's "Epochs" proposal. https://vittorioromeo.info/index/blog/fixing_cpp_with_epochs...
But the preference for C++ 20 was to add yet more stuff. I believe that, unwittingly, the C++ 20 process sealed the fate of the language by choosing to land features like Concepts rather than attempting Epochs. I'm not sure Epochs was actually achievable, but if it wasn't possible in 2020 it's not going to be any easier in 2026, and it's only going to be more necessary.
I reference the C++ Core Guidelines all the time. They're not dead at all; tools like clang-tidy cite them constantly as the rationale behind their recommendations.
There are only a few things I'm not a fan of, mostly around mutable out parameters (I prefer pointers, not refs, to make it clear at the call site that it's a potential mutation).
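A minimal sketch of the call-site difference, with made-up function names:

    #include <vector>

    // Pointer out parameter: the caller has to write &values, so the
    // potential mutation is visible at the call site.
    void scale_in_place(std::vector<int>* values, int factor) {
        for (int& v : *values) v *= factor;
    }

    // Reference out parameter: the call looks identical to a read-only call.
    void scale_in_place_ref(std::vector<int>& values, int factor) {
        for (int& v : values) v *= factor;
    }

    int main() {
        std::vector<int> values{1, 2, 3};
        scale_in_place(&values, 2);     // the & signals "this may be mutated"
        scale_in_place_ref(values, 2);  // nothing here hints at mutation
    }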
I imagine he/she means the newly introduced C++ Core Guidelines, with the _ptr<>() and _view() classes, and the adoption of Rust-like lifetime analysis for static analysers.
I would be so happy if that committee started removing features instead of adding them. Guidelines like this are just a way of dealing with all the complexity they have introduced into C++ over the years.
Of course, I didn't mean this specific document, just a guidelines document similar to this one, but for older versions of the language. I'm sorry for comparing to Python again, but it's a much younger language and it has had an official document equivalent to this one since 2001. Remember, the Core Guidelines project was announced in 2015, after C++14 was already in production systems all over the world.
C++11 has existed for more than a decade and has already been superseded by three newer versions of the language that add a lot of new features (some of them incompatible with each other), yet for most of its existence there was no official guideline from the creators explaining how it should (and shouldn't) be used. And as far as I know, nothing like it existed for C++98. It's too little, too late. They clearly have priorities regarding documentation that, as a user, I strongly disagree with.
Big organizations (EA, Unreal, Google, etc.) can live with this -- they can afford to have their own, very restrictive standards on how to use the language. Most small organizations (in my experience) can't, whether through lack of expertise, money, time, or will (or a combination of all of them), and their code is usually ugly, hacky, unstable, and hard to work on. This document only solves that problem insofar as it exists as a very strong recommendation to buy ReSharper and just do whatever it says.
Did you, by any chance, observe any of the folks involved officially saying that deprecating things was one of the goals? I thought keeping working code working has always been one of C++'s official goals.
Sure, but that's not what's being asked for. People are demanding that the language evolve over decades to become less complex while maintaining backwards compatibility. It's an "I still want my cake after I eat it" situation.
So instead, the committee focuses on adding features so that, if you migrate parts of your old code or add new code to an old project, the new code can be much simpler than it would have had to be before, and the old code that no one wants to touch still compiles right alongside the new stuff.
Want C++ to be simpler? Use the new features instead of the old ones. Use unique_ptr instead of raw pointers. Use constexpr if instead of SFINAE. Use range-based for loops. Yay, C++ is simpler!
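To make that concrete, here's a small illustrative sketch (the as_double helper is invented for the example): if constexpr replaces a pair of SFINAE-constrained overloads, unique_ptr replaces new/delete, and the range-based for replaces index bookkeeping.

    #include <memory>
    #include <type_traits>
    #include <vector>

    // One readable branch (C++17) instead of two SFINAE-constrained overloads.
    template <typename T>
    double as_double(T value) {
        if constexpr (std::is_arithmetic_v<T>) {
            return static_cast<double>(value);
        } else {
            return value.to_double();   // only instantiated for non-arithmetic T
        }
    }

    int main() {
        auto numbers = std::make_unique<std::vector<int>>(std::vector<int>{1, 2, 3});
        double total = 0;
        for (int n : *numbers) {        // range-based for: no index juggling
            total += as_double(n);
        }
        return total > 0 ? 0 : 1;       // no delete needed: unique_ptr cleans up
    }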
Fair. However, there continues to be an unresolved tension in the C++ community about whether they want C++ to focus on its considerable legacy by ensuring enduring compatibility ("No ABI breaks, ever!") or to continue growing and changing even if that means not everybody can follow ("Performance trumps compatibility, ship it!"). Although we can anticipate some sort of compromise, it just isn't possible to have a C++ 2x that delivers everything the modernizers want yet still runs people's technically conforming C++ 11 code unchanged with that binary DLL they've got no source code for. Some people will be unhappy, maybe everybody in the C++ community will end up unhappy.
In the "no breaking changes" case I agree that C++ objects that live on the free store are in the same place as Go: the programmer has a responsibility, which they may not fulfil, to manage the resource, and a linter can only help mitigate the problem.
But plenty of people, including Stroustrup, want to do lifetime management despite the potential breaking changes, and under lifetime management the compiler has visibility into your object lifetimes and can reject programs which inadvertently leak.
Now, that doesn't (and can't) make leaks impossible, but it means any leak is now in some sense "on purpose" and would happen for GC'd resources too; it isn't just an accident. For example, Rust's mem::forget will prevent the drop happening for the object you're forgetting, but it's not as though you type mem::forget() by mistake. You clearly wanted to achieve that (e.g. you stole the underlying Unix file descriptor from a File and sent it over a socket to a separate process, so now cleaning up that descriptor is the Wrong Thing™), and incorrect usage is not the same category of error as forgetting a with clause in Python.
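A rough C++ analogue of that "leak on purpose" idea (my own sketch, not something from the comment above): releasing ownership from a unique_ptr is an explicit, visible act, not a typo.

    #include <cstdio>
    #include <memory>

    int main() {
        // A unique_ptr that closes the FILE* when it goes out of scope.
        std::unique_ptr<std::FILE, int (*)(std::FILE*)> file(
            std::fopen("log.txt", "w"), &std::fclose);
        if (!file) return 1;

        // Deliberately give up ownership: after release() the unique_ptr will
        // not close the handle, so "leaking" it here is a conscious decision,
        // presumably because some other owner is now responsible for it.
        std::FILE* raw = file.release();
        (void)raw;
    }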
Fair point. My feeling has been that the committee is trying to make the language safe (as is now the fashion) by adding ever more features, with the intention of aggressively deprecating old workflows.
But the language complexity is now so significant that a new dev would take years to fully understand both "old" C++ and new C++, and how to make them work well together.
You don't usually improve the standard for legacy code. You want backward compatibility, sure, but the expectation that a new standard automatically makes old code not suck is just setting you up to be disappointed.
What C++ has allowed us to do is slowly (sometimes wholesale) add the new stuff to old code, or just write new code that doesn't suck as much.
Each C/C++ guideline defines a language subset, so in effect a new language, but with weak enforcement: in practice, most organisations that follow one still carve out their own specific exceptions to it. The C/C++ world (C less so) is a mess; it can't be fixed, and the agony will be very long-lasting.
"I've read once on the blog of a former member of the C++ technical steering committee, that at his best, when actively working with the committee and focused on this, he maybe knew 50% of the language spec."
This isn't relevant. The standard is for compiler writers and library writers who use advanced meta-programming.
"What are the odds of the common folk knowing the pitfalls and good practices of the language and that staying relevant in 5-10 years?"
That depends. Some pitfalls just disappear, like std::auto_ptr, and I doubt anyone complains. The big changes came with C++11 and C++20, that is, once every 10 years.
While C++11 added move semantics and new initialization syntax that are somewhat complicated and hard to ignore, C++20's additions only make you glad that your life has become easier.
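For anyone who hasn't seen them side by side, an illustrative sketch of what those additions look like in code:

    #include <algorithm>
    #include <string>
    #include <utility>
    #include <vector>

    int main() {
        std::vector<std::string> names{"Bjarne", "Ada"};  // C++11 brace initialization

        std::string s = "a fairly long string we no longer need here";
        names.push_back(std::move(s));   // C++11 move: hand over the buffer, no copy
        // s is now valid but unspecified; don't rely on its contents afterwards.

        std::ranges::sort(names);        // C++20 ranges: no begin()/end() boilerplate
    }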