OK, not gonna splice in all of the things to which I'm responding.
I don't have much love for the libtool/autoconf/automake/etc. system.
While it works, and is better than nothing, I have always felt that
it was the wrong approach. I am fortunate to know some of the folks
who worked on these tools, but the tools themselves are part of the
"too complex for casual users" problem that I mentioned in my earlier
post about open source.
I worked for a startup in 1985 that produced CAD tools that needed to
run on the various workstations of the era: Daisy, Apollo, Sun, etc.
When I started at the company, they would hire a new person to do a port
whenever we needed to support a new system. I changed that so that it
took someone at most a few hours to do a port.
The way that I did this was to have a portability library and header file
for each target system. We wrote our code for SunOS, with the minor
difference that each library or system call had a p_ prefix: p_fopen and so on.
The portability library and header mapped each of these into the target
system facilities. In most cases this was done by a macro in the header
file that just defined p_fopen to be fopen and so on. In the harder cases
we had to write actual code to do some translation. I have a faint memory
that Apollo had some different arguments to fopen, and at least one of the
systems either used carriage return as a line terminator or maybe used CR-LF.
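To make that concrete, here is a rough sketch in modern C of the two
cases (the p_fopen naming is from the real code; everything else here
is reconstructed from memory rather than the actual source):

    /* p_port.h for SunOS -- the easy case: nearly every p_ name
       is just a macro for the corresponding native call */
    #ifndef P_PORT_H
    #define P_PORT_H

    #include <stdio.h>

    #define p_fopen  fopen
    #define p_fclose fclose
    #define p_fgets  fgets

    #endif

    /* p_port.h for a target whose fopen differed -- here the macro
       gives way to a real function in that target's portability
       library, which translates the arguments (or line endings)
       before calling the native fopen */
    #ifndef P_PORT_H
    #define P_PORT_H

    #include <stdio.h>

    FILE *p_fopen(const char *path, const char *mode);
    #define p_fclose fclose
    #define p_fgets  fgets

    #endif

Application code then just says

    FILE *fp = p_fopen("netlist.dat", "r");

and compiles unchanged on every target.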
There are two big advantages to handling portability this way. First, the
source code is easier to read; it's not full of #ifdef this and #ifndef that.
Second, once the portability library existed it just worked and could be
reused. With the GNU tools methodology, every time someone needed to
call fopen on a machine where it behaved differently, the alternate code
had to be written anew. There was no debugged library where this stuff
only had to be figured out once.
Just my $.02.
Jon