On Thursday, June 20th, 2024 at 6:15 PM, Alexis <flexibeast(a)gmail.com> wrote:
Bakul Shah <bakul(a)iitbombay.org> writes:
But the overlap between two different programs or their assumptions will be only partial (except for some very basic things), which likely means the cache won't quite work. For example, you may find that program A and B depend on different versions of some library C.
The basic things are, in fact, a significant part of what autoconf
is being used to check for. "Does this platform provide this
function?"
...
Alexis.
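(For concreteness: a check like that typically boils down to configure compiling and linking a throwaway C program and seeing whether the toolchain objects. A simplified sketch from memory, with strlcpy as a stand-in for whatever function is being probed:

    /* conftest.c -- does this platform provide strlcpy()? */
    /* Declared with a deliberately vague prototype so the test
       depends only on the linker finding the symbol. */
    char strlcpy ();

    int
    main (void)
    {
        return strlcpy ();
    }

If that compiles and links, configure defines something like HAVE_STRLCPY; if not, it reports the function as missing.)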
This aspect of things I have found a bit perplexing. On one hand, sure, it's nice to
have some scripted system spit out a:
"Dependency xyz not found"
But where it falls apart in my head is asking what that tells me that, for instance, cpp's
error diagnostic about a missing include, or ld saying a symbol or library wasn't
found, doesn't. To my mind it's a minor convenience, but not one that justifies
all the machinery between oneself and make just to provide it. Granted, that's
not all autotools does, so it's a poor example in practice, but it gets at one of my
irks in theory: packaging a check for something you are already going to discover some other
way. That, and "does my target platform list support xyz" isn't necessarily a matter
I'd wait to settle until I've created a whole build package around my software...
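To illustrate, take a hypothetical libfoo (the library, header, and function names here are made up; the diagnostics are the usual gcc and GNU ld phrasings). The raw toolchain already names the missing piece at each stage, no configure required:

    /* tool.c -- program using a library that isn't installed */
    #include <libfoo.h>   /* cpp/cc: "fatal error: libfoo.h: No such
                             file or directory" */

    int
    main (void)
    {
        return foo_init ();  /* ld, if the header is found but the
                                library isn't: "undefined reference
                                to `foo_init'" */
    }

Same information as "Dependency libfoo not found", just straight from the horse's mouth.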
Just a small part of the puzzle, but one of the parts that gives me more headaches than
not. I no longer get to respond to a compiler or linker asking for something by simply
putting it where it was asked for; now I also have to do the extra work of ensuring
it's put somewhere, and in some way, that all this machinery between myself
and the compiler can also verify in its own magical way that component <xyz> is present.
I'd be willing to wager that half the time autotools, CMake, etc. have made me want to
put my head through a wall, it's not that some needed thing isn't there; it's
that it's not there according to whatever extra expectations or formulas come into
play just to satisfy the build machinery. These tools can be helpful in the face of
extreme complexity, but I feel silly when most of the work I put into compiling some
package of, like, 6 source files is making sure the environment on my system can
satisfy the expectations of the build tools.
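By way of example, suppose that hypothetical libfoo got installed under /opt/foo (path and package name made up). Answering the toolchain directly is a pair of flags; answering the machinery means learning its own discovery knobs:

    # answer the compiler and linker directly:
    cc -I/opt/foo/include -L/opt/foo/lib -o tool tool.c -lfoo

    # answer CMake instead, so its find_package() can bless the same files:
    cmake -DCMAKE_PREFIX_PATH=/opt/foo . && make

Both end in the same compile; only the second requires knowing how the middleman wants to be told.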
It has been said already that part of the problem with the uptake of these tools and
their growing ubiquity is that new folks who don't know any better think that's just
"how it is", and wind up spinning up an autotools or CMake build for a <1000
line tool written in ANSI C. I've done the same in less experienced times; one of my
first attempts at a game engine uses an autotools build. I quickly grew frustrated with
it, and everything since has used a flat makefile (of the sort sketched below) and has
been just fine. Granted, I'm not building a triple-A game, but that gets at the root of
one of my gripes: I think these sorts of frameworks are overused. They have their areas
where they shine, or they wouldn't have reached the critical mass they have, but as a
consequence folks will use them haphazardly regardless of the need.
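For what it's worth, the kind of flat makefile I mean (file names hypothetical, recipe lines begin with a tab; compilation of each .o rides on make's built-in .c-to-.o rule):

    CC      = cc
    CFLAGS  = -O2 -Wall
    LDLIBS  = -lm
    OBJS    = main.o engine.o render.o input.o audio.o util.o

    engine: $(OBJS)
    	$(CC) $(LDFLAGS) -o $@ $(OBJS) $(LDLIBS)

    clean:
    	rm -f engine $(OBJS)

Need a library in a weird place? Add -I/-L to CFLAGS/LDFLAGS and you're done; nothing else has to be convinced of anything.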
Long story short, maybe gcc needs a configure script, but does GNU ed? Maybe KDE Plasma
needs CMake files, but does libtiff? I make no claims regarding the complexity of these
actual codebases...but one does have to wonder...
- Matt G.