Unfortunately, this antipattern has persisted way beyond the 90s. But, you know, if the only way the standard provides to achieve a certain goal is a shitty hack, people will use the shitty hack.
The compiler is blind to them, so they can't be reasoned about, carried forward anywhere, or accounted for in any meaningful way.
Moreover, because of `#define`s, the effect of including a file can differ from one translation unit to the next, so a compiler can never say "Oh, I already know what's in this include file, I don't need to process it again in other translation units".
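For example (a deliberately contrived sketch; the header and macro names are made up), the same `#include` can produce different code depending on what was `#define`d beforehand, so there is no single "contents of config.h" for the compiler to cache:

```cpp
// config.h -- hypothetical header whose meaning depends on a prior #define
#ifdef USE_FAST_PATH
inline int compute(int x) { return x << 1; }   // "fast" variant
#else
inline int compute(int x) { return x + x; }    // ordinary variant
#endif
```

```cpp
// a.cpp
#define USE_FAST_PATH
#include "config.h"   // this translation unit sees the "fast" variant

// b.cpp
#include "config.h"   // this one sees the ordinary variant -- same header, different contents
```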
Of course `#define`s have many uses, and at present they cannot be avoided entirely, but C++ has been making an effort to put other mechanisms in place so as to obviate the use of `#define`s.
In particular, look for CppCon talks about Modules.
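A minimal sketch of what that looks like in C++20 (the module and function names are just for illustration): the importing file no longer textually includes anything, so macros defined in one translation unit can't change what another one sees.

```cpp
// math_utils.cppm -- a named module interface (the file extension varies by compiler)
export module math_utils;

export int square(int x) { return x * x; }
```

```cpp
// main.cpp
import math_utils;   // no textual inclusion; macros in this file don't affect the module

int main() { return square(3); }
```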
Interesting. So traditionally the preprocessor has been used to work around incompatibilities across various compilers/platforms and to completely enable/disable features at compile time. What's the new thinking there? Can you build big projects across clang/g++/intel without preprocessor workarounds now? Has the number of viable compilers in use dropped, and has compatibility among those that live on increased? How about stuff like debugging/instrumentation? Or is the current thinking on all that to always build with everything in and enable/disable via runtime branch (along with an argument that configuration at build time was more trouble than it's worth on modern spacious/fast machines)?
In a nutshell: Using compile-time facilities which are within the language itself. Compilers will expose information about the platform, about themselves, etc.
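For instance (just a sketch), C++20's `std::endian` lets you query the platform's byte order inside the language and branch on it with `if constexpr`, where older code would have reached for a compiler- or platform-specific `#ifdef`:

```cpp
#include <bit>        // std::endian (C++20)
#include <cstdint>

// Byte-swap to little-endian only when the platform actually needs it.
constexpr std::uint32_t to_little_endian(std::uint32_t v) {
    if constexpr (std::endian::native == std::endian::little) {
        return v;  // already little-endian; the other branch is discarded at compile time
    } else {
        return (v >> 24) | ((v >> 8) & 0x0000FF00u)
             | ((v << 8) & 0x00FF0000u) | (v << 24);
    }
}
```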
It can be a trade-off, but it's becoming less and less of one. I'll admit to not having looked at the implementation, but when you sell something as "modern", then resorting to this for performance, for example, is not OK.
And almost every single use I've seen "for performance" actually has zero or negligible performance impact.
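To the runtime-branch question above: a toggle can often be an ordinary `constexpr bool` (the flag name below is made up) rather than a `#define`; the compiler sees a constant condition and removes the dead branch, so the "runtime" check costs nothing.

```cpp
#include <iostream>

// Hypothetical feature flag: a plain constant instead of a preprocessor symbol.
constexpr bool kEnableTracing = false;

void process(int x) {
    if (kEnableTracing) {        // constant-folded away; no branch left in the generated code
        std::cout << "processing " << x << '\n';
    }
    // ... the actual work ...
}
```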