I think people expect the language to do too much these days, sort of
like: if it compiles without warnings, then likely as not it works. I
suppose that can be true with some languages if you follow certain
pattern cookbooks, but it's not true of C or of any program I ever wrote.
I think this issue is a combination of style and language mechanics. I
recently had to explain to a young manager why I couldn't look at a
program and tell him definitively whether it works or not. I did
explain, but I don't think he really understood me, and I'm sure he was
disappointed that I wasn't "good enough" to do that. Sort of the
negative proof thing - if I find something wrong, I can make a pretty
convincing argument why it's broken, but if I don't see anything wrong,
that doesn't mean it works, and depending on such reviews without
testing (which is also incomplete, but has the benefit that it may
happen upon something we don't think of) is incredibly dangerous.
I think general software engineering knowledge and experience cannot be
'obsoleted' or made less relevant by better languages. If they help,
great, but you have to do the other part too. As languages advance and
get better at catching (certain kinds of) mistakes, I worry that
engineers are not putting enough time into observation and understanding
of how their programs actually work (or do not).
On 01/31/2023 05:49 PM, Larry McVoy wrote:
Even though I participated in this thread, so I too have sinned, this
definitely wants to be discussed elsewhere. comp.lang.c maybe?
This discussion reminds me a lot of what my career has been like. Over
and over again, I got told "that won't work". I wasn't very interested
in those people, as long as they stayed out of my way. I was way more
interested in the people who said "hmm, maybe if you did it this way".
I've written a ton of working C code. I've managed and guided others
to do the same. I'm positive that I've written way more commercially
supported C code than was present in v6/v7 kernels. Not sure about
userspace, I think Unix has me beat there.
The point? It's very possible to write working C code. You need good
programmers, and those seem to be going out of style, so people are turning
to Rust because it doesn't compile crappy code.
I'm just happy to be retired, C worked for me for 40+ years, I feel
like I dodged a learning curve bullet by getting out before I had to
switch to Rust (or whatever).
On Wed, Feb 01, 2023 at 01:40:23AM +0100, Steffen Nurpmeso wrote:
> ron minnich wrote in
> <CAP6exY+Qz2Oe4gC4D1Fqy22JKKDaanTOYpc0gxugBv485JUknQ@mail.gmail.com>:
> |That example was a simplified bit of code from a widely used code base. All
> |I need to do is change the function g to a pointer to function, or have it
> |be provided by a .so, and all bets are off.
> |
> |In any event, the important thing here is not that y should be initialized,
> |or should not; it's that it is not possible to get a consistent answer on
> |the question, from people who have been writing in C for decades.
>
> I find the syntax just terrible. (I have not programmed with it.)
> I mean annotations are for the better, ok (but i luckily get away
> with only "const" and "volatile" (and "mutable"), and leave
> things like "restrict" alone). Johannes Lundberg i think it was
> who then left FreeBSD (?) while pointing to a Rust program
> he had written (i cannot find the reference); that was a few years
> ago (i looked before i composed this message; it might even
> have been as early as 2017!), and i was lucky i did not have to
> deal with it.
>
> Even nim-lang.org, which at least converted its code to plain C back
> in the day when it was still called Nimrod, is too heavy for me now,
> but claims to be hard realtime aware etc. It at least tries to
> give programmers some syntactic sugar that makes people happy when
> they look at the screen for 7 hours (underpaid) or 16 hours
> (over-ambitious). Like the only python thing i like, the
> syntax sugar. But as languages grow they get more and more
> "refined", and now i read things like
>
> type
>   BinaryTree*[T] = ref object # BinaryTree is a generic type with
>                               # generic param `T`
>     le, ri: BinaryTree[T]     # left and right subtrees; may be nil
>     data: T                   # the data stored in a node
>
> proc newNode*[T](data: T): BinaryTree[T] =
>   # constructor for a node
>   new(result)
>   result.data = data
>
> which puts me off a bit and makes me think that "hey, maybe
> Objective-C is not really the worst thing" (despite the syntax).
>
> I do not know. Everything beyond a small monolithic
> self-contained thing is a problem per se. You can then unit test
> atomically, and if thousands of modules do that, you plug them into
> a better whole. But that is also not a good answer: all those flakes
> that live their own life, many maybe even in remote locations out
> of any control.
>
> You could have a language which hard fixates all the call chain,
> or you could have tools which bring this to a small and simple
> language which does not offer it. A compiler can figure out which
> variables are assigned etc and could create in-object-file
> annotations which the linker automatically verifies.
> Of course there are many code paths through functions.
>
> Back in 2009, when i bought the one Apple i wanted to have, i tried
> out their development software, which i think required internet
> access. All the pop-ups and titles; i think it (later?) could
> compile on-the-fly and inject the new object into the running
> application under test, etc.
>
> And then there are semantic description languages like interface
> builders, where robots create the actual code. So then you could
> have Corba interface descriptions / DBUS and plug it all via it.
>
> Then Rob Pike says "make it a string" (more or less), and luckily
> i do not have dyslexia.
>
> Maybe Qt has a good answer by not only not banishing the
> C preprocessor, but introducing another one on top.
> So then the compiler can analyze the code and generate a correct
> variable-state-at-function-call-time description that a dynamic linker
> could then verify against the consumer or producer (whatever is
> needed) of linked modules, which link modules, which link modules,
> all of which are subject to replacement due to development
> iterations, bug patches etc, ie, new releases.
>
> As in, the library versioning that is used for ELF object files
> (today often even linker scripts, which further diversify this)
> is not enough; it does not do this. It only binds
> a name to a library version, or to multiple (as in
>
>   1966 FUNC GLOBAL DEFAULT 13 realpath@@GLIBC_2.3
>     33 FUNC GLOBAL DEFAULT 13 realpath@GLIBC_2.2.5)
>
> I do not know how Rust deals with this. Is it at all possible to
> fixate the whole call chain over all possibly involved dynamics?
> What do i gain from initializing an object with a default value if
> that default value is not one of the values that
> i expect after some external function is called?
>
> Sure, i not infrequently see patches fly by where one more memset(,0,)
> is used to fully initialize a struct / prevent memory address
> disclosures (i only track the BSDs).
> And there are still patches that fix bugs in old code that is
> sometimes twenty years old or more, say zlib or tzcode.
> And then there are compiler bugs that bring in bugs -- that is not
> avoidable, even if the documentation is clear and obvious like
>
> /* The application can compare zlibVersion and ZLIB_VERSION for consistency.
> If the first character differs, the library code actually used is not
> compatible with the zlib.h header file used by the application. This check
> is automatically made by deflateInit and inflateInit.
> */
>
> That is only graspable by a human programmer who reads it. Rapid
> application development that surely is not.
> (And actually i am not sure this is really true. But i personally
> would hope for it. It is more earthy, and has more blood, sweat
> and tears.)
>
> --steffen
> |
> |Der Kragenbaer, The moon bear,
> |der holt sich munter he cheerfully and one by one
> |einen nach dem anderen runter wa.ks himself off
> |(By Robert Gernhardt)