I usually run my library tests for dozens of configurations with a script that is launched at lower-than-normal priority and with -jN, where N is 1.5 times the number of CPUs (I find compiling is often I/O-bound). At least with an SSD, I can still do my low-CPU tasks like e-mail and browsing without any problem. I'm using Windows; no idea if other OS/desktops have that problem.
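A minimal sketch of that setup in Python, assuming Windows (BELOW_NORMAL_PRIORITY_CLASS is Windows-only, Python 3.7+); the b2 invocation and test target are placeholders, not the actual script:

    import os
    import subprocess

    # 1.5 jobs per CPU: compilation is often I/O-bound, so oversubscribing
    # keeps cores busy while other jobs wait on the disk.
    jobs = int((os.cpu_count() or 4) * 1.5)

    # BELOW_NORMAL_PRIORITY_CLASS keeps the build from starving
    # interactive work like e-mail and browsing.
    subprocess.run(
        ["b2", f"-j{jobs}", "libs/mylib/test"],  # placeholder invocation
        creationflags=subprocess.BELOW_NORMAL_PRIORITY_CLASS,
        check=True,
    )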
nmake, jom or ninja -jN? jom, for example, doesn't keep the cores as busy as ninja in my experience. but it was just an artificial example; the point is that raw build performance is not the only aspect to consider ... also, some applications (Sublime Text and ConEmu, for example) seem to perform much worse under load than others
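a rough way to put a number on the "keeps the cores busy" claim — a sketch assuming the third-party psutil package is installed; the build invocations are placeholders, and the tree should be rebuilt from clean between runs so the two tools do comparable work:

    import subprocess
    import psutil  # third-party: pip install psutil

    def average_cpu_during(cmd):
        # Run a build and sample system-wide CPU utilization once per
        # second until it exits; return the average percentage.
        proc = subprocess.Popen(cmd)
        samples = []
        while proc.poll() is None:
            samples.append(psutil.cpu_percent(interval=1.0))
        return sum(samples) / len(samples) if samples else 0.0

    print("jom:  ", average_cpu_during(["jom", "-j", "8"]))
    print("ninja:", average_cpu_during(["ninja", "-j8"]))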
otoh, developers who like IDEs may be more productive due to debugger integration.
Sure. At least on Windows you can open an executable as a project with Visual Studio and debug it flawlessly. If there is a problem with a Boost.Build test, I can just drop the exe into the IDE and debug it without problems, as long as it has debug info (in my local tests I enable debug symbols in release builds too, just in case). On Linux, gdb -tui is what I use.
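that drop-the-exe workflow can also be driven from the command line — a sketch assuming devenv is on PATH on Windows, with a hypothetical executable path:

    import subprocess
    import sys

    exe = "bin/msvc-14.3/release/run_fail_test.exe"  # hypothetical path

    if sys.platform == "win32":
        # Visual Studio opens a bare executable for debugging via
        # /DebugExe; this works as long as the exe carries debug info
        # (worth enabling for release builds too, e.g. /Zi plus /DEBUG).
        subprocess.run(["devenv", "/DebugExe", exe])
    else:
        # gdb's text UI shows a source listing alongside the prompt.
        subprocess.run(["gdb", "-tui", exe])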
if you drop the exe into the ide, you can debug, but does it run the executable with specific command-line options, in a specific working directory, in a specific environment? and/or does it populate a project with the source files involved, and does it build a code model that allows source navigation as in native msvc project files? debugging may be possible, but the UX will probably differ (see the sketch below) ... the point is not necessarily that something is possible, but that some people prefer one workflow while others prefer another (at least i've learned this lesson from converting 2.5 companies with non-trivial buildsystems to cmake).
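to make that gap concrete, here is a sketch of what a test harness typically knows about a run and a bare "open the exe" debug session has to be told by hand; every name, path, and variable here is hypothetical:

    import os
    import subprocess

    # The build system knows the exact arguments, working directory,
    # and environment for each test run; a debugger session must
    # reproduce all three to see the same failure.
    env = dict(os.environ, MYLIB_TEST_DATA="testdata")

    subprocess.run(
        ["bin/test_parser", "--fixture=unicode", "--log-level=all"],
        cwd="build/libs/mylib/test",  # tests often assume a specific cwd
        env=env,
        check=True,
    )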