On Sep 14, 2017, at 1:22 PM, Jon Steinhart <jon@fourwinds.com> wrote:
I don't have much love for the libtool/autoconf/automake/etc. system.
While it works, and is better than nothing, I have always felt that
it was the wrong approach. I am fortunate to know some of the folks who
worked on these tools; even so, the tools are part of the "too complex
for casual users" problem that I mentioned in my earlier post about open
source.
My days of wrestling with libtool/autoconf/automake/cmake are mostly
in the past. On FreeBSD/MacOS I use pkg/brew. For any new coding I
mainly use Go. Even cross-compiled binaries just work. It has a very
well-engineered ecosystem.
There are two big advantages to handling portability
this way. First, the
source code is easier to read; it's not full of #ifdef this and #ifndef that.
Second, once the portability library existed, it just worked and could be
reused. With the GNU tools methodology, every time someone needed to call
fopen() on a machine where it behaved differently, the alternate code had
to be written all over again. There was no debugged library where this
stuff only had to be figured out once.
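
To make that concrete, here is a minimal sketch of the portability-library
idea in that spirit (the name port_fopen and the Windows detail are my own
illustration, not anything from the thread): the #ifdef lives in exactly
one source file, and every caller includes a clean header.

    /* port.h -- the only header callers ever see; no #ifdefs here */
    #include <cstdio>
    std::FILE *port_fopen(const char *path, const char *mode);

    /* port.cpp -- the platform difference is debugged once, here */
    #include "port.h"
    std::FILE *port_fopen(const char *path, const char *mode)
    {
    #if defined(_WIN32)
        /* hypothetical example: use MSVC's checked fopen variant */
        std::FILE *fp = nullptr;
        if (fopen_s(&fp, path, mode) != 0)
            return nullptr;
        return fp;
    #else
        return std::fopen(path, mode);
    #endif
    }

Callers just call port_fopen() everywhere; once the wrapper is debugged on
each platform, nobody has to figure that out again.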
I agree with this. auto{conf,make}/configure is just the wrong approach.
At Real Networks our media server ran on 12 or so Unix platforms + Windows +
MacOS 9 (at the time). I managed to corral the machine-dependent code into
a couple of files for everything but MacOS 9. No #ifdefs in any other file.
C++ also helped to hide things like select(2) vs. poll(2) from the rest of
the code.
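
A rough sketch of that C++ trick, with invented names (FdWaiter and friends
are mine, not Real Networks code): the rest of the server codes against one
interface, and the build links in exactly one platform file.

    // waiter.h -- what the rest of the server sees; no platform types here
    class FdWaiter {
    public:
        virtual ~FdWaiter() {}
        // Block until fd is readable or ms milliseconds pass;
        // return true if the fd became ready.
        virtual bool waitReadable(int fd, int ms) = 0;
    };
    FdWaiter *makeFdWaiter();   // factory; defined in the platform file

    // waiter_poll.cpp -- one of the "couple of files" of machine-dependent
    // code; a waiter_select.cpp twin would look identical from outside
    #include <poll.h>
    #include "waiter.h"
    class PollWaiter : public FdWaiter {
    public:
        bool waitReadable(int fd, int ms) override {
            struct pollfd pfd;
            pfd.fd = fd;
            pfd.events = POLLIN;
            pfd.revents = 0;
            return poll(&pfd, 1, ms) > 0;
        }
    };
    FdWaiter *makeFdWaiter() { return new PollWaiter; }

The build (or that one file) decides select vs. poll; no #ifdefs leak into
any other file.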