
Dan K via talk wrote:
The official Boost site has a bit of disparaging dialog about repackaging efforts of various sorts. I think if your target system is x86, then apt-get is probably a fine choice. But if you have to cross-compile, or worry about updates, or want to enable non-specific improvements in the indefinite future of any calling app, then the hard way is the easy way!
You can apt-get stuff onto all manner of different architectures with no problem; Debian-style packages are built for quite a range. The philosophy there (and with most desktop/server-ish Linux distros) is that software is compiled on its native architecture, so the source code gets farmed out to a hodgepodge of different boxes to compile for ARM, MIPS, PowerPC, and whatever else they want to support.

Five years ago a bunch of us on this list visited Seneca/York for the launch of Fedora-for-ARM and saw the equipment used to compile that distro on that architecture: a few dozen PogoPlugs and similar, zip-tied to racks, with a few "normal" servers providing swap-over-NFS to work around the not-enough-RAM-on-those-bitty-computers problem. Multiply that by the number of distros and the number of architectures, because I'm sure the others are using similar lashups.

Note also that though the Linux distros came from x86, that architecture is getting long in the tooth nowadays, and the default flavour for many a year now has been AMD64/x86-64. Improved small hardware means something like a Raspberry Pi is more than powerful enough to compile its own code natively, although desktop software is starting to assume 64-bit with huge wodges of memory, and becomes problematic on 32-bit even with more RAM than most of us could have afforded 20 years ago.

Meanwhile, the traditional embedded assumption has been that the resources of the target are not sufficient for native compilation, so that world goes much more for cross-compilers hosted on a reasonably beefy workstation or server. You can even mix and match, with an upgraded target device for native compiles but a cross-compiler for developers who need a quick edit-recompile-test loop on one binary, or who need to be able to do that on a laptop in the field.
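As a rough sketch of the native-vs-cross distinction above (the package and triplet names are Debian's real ones for 32-bit hard-float ARM, but check what your release actually ships; this obviously assumes the toolchain is installable on your host, so it's illustrative rather than something to paste blindly):

```shell
# On a Debian-ish amd64 workstation, install a cross toolchain for ARM:
sudo apt-get install gcc-arm-linux-gnueabihf

# Native compile -- run this on the target device itself:
gcc -O2 -o hello hello.c

# Cross compile -- run on the workstation; same source, ARM binary out:
arm-linux-gnueabihf-gcc -O2 -o hello hello.c

# Sanity check which kind of binary you got:
file hello
```

The mix-and-match setup is then just a matter of which of those two compile lines a given developer runs.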
Another "gotcha" is that some software wasn't written to cross-compile and has build scripting that _will_ assume the build environment is the target; if you have any pieces like that, then a native compile is the easier route.

-- Anthony de Boer
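For packages whose build scripting does handle this properly (autoconf-based ones, typically), the build-environment-vs-target distinction is made explicit with configure's `--build` and `--host` flags; the triplets and compiler name below are illustrative, not specific to any package discussed here:

```shell
# Autoconf convention: --build is the machine you compile on, --host is
# the machine the binary will run on. When they differ, configure goes
# into cross-compile mode and (in well-behaved packages) stops trying
# to execute freshly built test programs on the build machine.
./configure --build=x86_64-pc-linux-gnu \
            --host=arm-linux-gnueabihf \
            CC=arm-linux-gnueabihf-gcc
make
```

The badly-behaved packages are exactly the ones that ignore this and run their own just-built binaries mid-build, which is why they force the native-compile route.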