
On Tue, Feb 03, 2015 at 09:10:34PM -0800, Robert Ramey wrote:
Steve M. Robbins-2 wrote:
On Tue, Feb 03, 2015 at 04:22:00PM -0800, Robert Ramey wrote:
I can't imagine this to be a very common use case. After about 2-3 build failures due to missing libraries, your user is going to desperately want "all of boost" in one tar ball.
What about when boost hits 500 libraries? Is he still going to want the whole thing?
On my *development* machine: yes.
He'll always be able to get the "whole thing". That's not an issue. The whole discussion is predicated on the idea that one shouldn't have to download the whole thing. For other users - this discussion is not relevant.
Yes, I get that. I understand not wanting to download everything. For my use, package managers for linux distributions, Perl, Python, Ruby, etc, have basically figured it out. I gather that the "bpm" does the same for Boost. I guess I don't personally see a use-case for basing the dependencies on what my code today happens to include. It seems terribly fragile to me.
It's not how I want to work. When I want to use library X, I want X plus all its dependencies for any conceivable program:
Hmmm, and you want to distribute DLLs with your program which are many times larger than necessary? Will your customers be happy with that?
I see that as a different problem. I'm certainly willing to pick and choose what to distribute and at that point I'd be willing to run a script on my source code to generate a minimal set of distributables. Preferably, it would be automatic and integrated into my build script.
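For illustration, here is a minimal sketch of the kind of script I have in mind (hypothetical; the demo path and source file are made up for the example). It only catches direct `#include <boost/...>` lines in a source tree, so it is a first cut at the minimal set, not the full transitive dependency closure a real tool such as boostdep computes:

```shell
# Hypothetical sketch: list the Boost libraries a source tree includes
# directly, as a first cut at the minimal set to distribute. This only
# sees direct #include <boost/...> lines; it does NOT compute the
# transitive closure the way a real dependency tool would.
demo=/tmp/boost-dep-demo
mkdir -p "$demo"
cat > "$demo/main.cpp" <<'EOF'
#include <boost/filesystem.hpp>
#include <boost/regex.hpp>
#include <vector>
int main() { return 0; }
EOF

# Pull the first path component after "boost/" out of each include line.
grep -rhoE '#include <boost/[^>/.]+' "$demo" \
  | sed -E 's|#include <boost/||' \
  | sort -u
# prints "filesystem" and "regex", one per line
```

Something along these lines, run automatically as part of the build, would produce the minimal set of Boost modules to package with a release.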
"apt-get install X-dev" and go. I don't want to have to keep running a dependency tool on my source code.
OK - no problem, you're already good to go.
Yes. I respond only because I've never personally encountered your use case and was curious how widespread it really is. I'm sure you have more experience than I do, so I'd like to learn from it. Best, -Steve