Moving over to COFF from TUHS. The following is from Larry McVoy:
> I don't consider myself to be that good of a programmer, I can point to
> dozens of people my age that can run circles around me and I'm sure there
> are many more. But apparently the bar is pretty low these days and I
> agree, that's sad.
>
It's hard not to feel like the bar is lower. I feel like since Steve
Grandi retired at NOIRLab, Josh Hoblitt and I are the only people left who
actually understand how IP networks work. And I'm not great, never was,
but I know a lot more than...everyone else.
And kids these days, well, I'm not very fluent in TypeScript and I really
don't understand why every damn thing needs to be asynchronous, especially
if you're just awaiting its completion anyway. But, hey, it ain't that
hard to do.
But again, there's a part of me that wonders how relevant the skills I miss
*are* anymore. I'm a software developer now, but I always thought of
myself as basically a sysadmin. It's just that we had automated away all
of what I started out doing (which was, what, 35-ish years ago?) by 20
years ago, and staying ahead of the automation has made me, of necessity, a
software developer now.
But I was also thinking of Larry saying he wouldn't last a week in today's
workplace, and I'm not sure that's true.
I mean, there's a lot of stuff that you once COULD say that would these
days get you a quick trip through HR and your crap in a box and a walk to
the curb...but I am a pretty foul-mouthed individual, and I have said nasty
things about people's code, and, indeed, the people who are repeat
offenders with respect to said code, and nevertheless I have had
surprisingly few issues with HR these last couple decades. So in some
sense it really DOES matter WHAT it is that's offensive that you're saying,
and I am living-and-still-employed proof.
If you generally treat people with respect until they prove they don't
deserve it, and you base your calumny on the bad technical decisions they
make and not their inherent characteristics, then it really ain't that hard
to get along in a woke workplace. And I say this as an abrasive coworker,
who happens to be a cis het white dude from a fairly-mainstream Christian
background and the usual set of academic credentials.
Let's face it: to do a good job as a software developer or generally an IT
person, you do not need a penis. You do not need to worship the way most
people at your workplace do. You do not need a college degree, let alone
in CS. You do not need to be sexually attracted to the opposite sex. You
do not need to have the same gender now that you were assigned at birth.
You do not need two (or given the current state of the art, ANY) working
eyes. Or hands. You do not need to be under 40. You do not need to be
able to walk. You do not need pale skin. And anyone who's saying shit
about someone else based on THAT sort of thing *should* be shown the curb,
and quickly. And the fact that many employers are willing to do this now
is, in my opinion, a really good thing.
On the other hand, if someone reliably makes terrible technical decisions,
well, yeah, you should spend a little time understanding whether there is a
structural incentive to steer them that way and try to help them if they're
trainable, but sometimes there isn't and they're not. And those people,
it's OK to say they've got bad taste and their implementations of their
poor taste are worse. And at least in my little corner of the world, which
is quasi-academic and scientific, there's a lot of that. Just because
you're really really good at astronomy doesn't mean you're good at writing
intelligible, testable, maintainable programs. Some very smart people have
written really awful code that solved their immediate problems, but that's
no way to start a library used by thousands of astronomers. But whether or
not they're competent software engineers ain't got shit to do with what
they have in their pants or what color their skin is.
And it's not always even obvious bigotry. I don't want to work with toxic
geniuses anymore. Even if the only awful things they do and say are to
people that they regard as intellectually inferior and are not based on
bullshit as above...look, I'd much rather work with someone who writes
just-OK code and is pleasant than someone who writes brilliant code and
who's always a quarter-second from going off on someone not quite as smart
as they are. Cleverness is vastly overrated. I'd rather have someone with
whom I don't dread interacting writing the stuff I have to interface with,
even if it means the code runs 25% slower. Machine cycles are dirt cheap
now. The number of places where you SHOULD have to put up with toxicity
because you get more efficient code and it actually matters has been pretty
tiny my entire adult lifetime, and has been shrinking over that lifetime as
well. And from a maintainability standpoint...if I encounter someone
else's just-OK code, well, I can probably figure out what it's doing and
why it's there way, way more easily than someone's code that used to be
blazing fast, is now broken, and it turns out that's because it encodes
assumptions about the runtime environment that were true five years ago and
are no longer correct.
That said, it's (again, in my not-necessarily-representative experience)
not usually the nonspecific toxic genius people who get in trouble with
HR. The ones who do, well, much, MUCH, too often, are the people
complaining about wokeness in the workplace who just want to be able to say
bad things about their coworkers based on their race or gender (or...)
rather than the quality of their work, and I'm totally happy to be in the
"That's not OK" camp, and I applaud it when HR repeats that and walks them
out the door.
Adam
At 2024-10-02T16:42:59-0400, Dan Cross wrote:
> On Wed, Oct 2, 2024 at 2:27 AM <arnold(a)skeeve.com> wrote:
> > Also true. In the late 80s I was a sysadmin at Emory U. We had a Vax
> > connected to BITNET with funky hardware and UREP, the Unix RSCS
> > Emulation Program, from the University of Pennsylvania. Every time I
> > had to dive into that code, I felt like I needed a shower
> > afterwards. :-)
>
> Uh oh, lest the UPenn alumni among us get angry (hi, Ron!) I feel I
> must point out that UREP wasn't from the University of Pennsylvania,
> but rather, from The Pennsylvania State University (yes, "The" is part
> of the name). UPenn (upenn.edu) is an Ivy in Philly; Penn State
> (psu.edu) is a state school in University Park, which is next to State
> College (really, that's the name of the town) with satellite campuses
> scattered around the state.
There's another method of distinguishing UPenn from Penn State. Permit
me to share my favorite joke on the subject, from ten years ago.
"STATE COLLEGE, Pa. -- Construction workers tore down Penn State's
iconic Joe Paterno statue on campus two years ago -- but this town might
not be without one for much longer.
Two alumni already have received the OK from the borough to install a
projected $300,000 life-sized bronze sculpture downtown, about two miles
from the original site." -- ESPN ([1])
"The key difference is that the new statue will look the other way."
-- Chris Lawrence
Regards,
Branden
[1] https://www.espn.com/college-football/story/_/id/10828351/joe-paterno-honor…
On Tue, Oct 1, 2024 at 9:13 AM <arnold(a)skeeve.com> wrote:
> This goes back to the evolution thing. At the time, C was a huge
> step up from FORTRAN and assembly.
>
Certainly it's a step up (and a BIG step up) from assembly. But I'd say C
is a step sideways from Fortran. An awful lot of HPTC (high-performance
technical computing) programming involves throwing multidimensional arrays
around, and C is not suitable for that.
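Concretely: Fortran lets a subprogram receive an array as REAL A(M,N) and
index it as A(I,J) for whatever M and N arrive at run time, while classic
(pre-C99) C forces the programmer to flatten the array and do the index
arithmetic by hand. A minimal sketch of the C workaround (hypothetical
function, purely illustrative):

    void scale(double *a, int m, int n, double s)
    {
        int i, j;
        /* a is an m-by-n array stored row-major; the bounds aren't
           known at compile time, so a[i][j] must be spelled out
           manually as pointer arithmetic */
        for (i = 0; i < m; i++)
            for (j = 0; j < n; j++)
                a[i * n + j] *= s;
    }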
-Paul W.
On Tue, Oct 1, 2024 at 10:07 AM <arnold(a)skeeve.com> wrote:
[regarding writing an Ada compiler as a class project]
> Did you do generics? That and the run time, which had some real-time
> bits to it (*IIRC*, it's been a long time), as well as the cross
> object code type checking, would have been real bears.
>
> Like many things, the first 90% is easy, the second 90% is hard. :-)
>
I was in DEC's compiler group when they were implementing Ada for VAX/VMS.
It gets very tricky when routine libraries are involved. Just figuring
out the compilation order can be a real bear (part of this is the cross
object code type checking you mention).
From my viewpoint Ada suffered two problems. First, it was such a large
language and very tricky to implement--even more so than PL/I. Second, it
had US Government cooties.
-Paul W.
[-->COFF]
On 2024-10-01 10:56, Dan Cross wrote (in part):
> I've found a grounding in mathematics useful for programming, but
> beyond some knowledge of the physical constraints that the universe
> places on us and a very healthy appreciation for the scientific
> method, I'm having a hard time understanding how the hard sciences
> would help out too much. Electrical engineering seems like it would be
> more useful than, say, chemistry or geology.
I see this as related to the old question about whether it is easier to
teach domain experts to program or teach programmers about the domain.
(I worked for a company that wrote/sold scientific libraries for
embedded systems.) We had a mixture but the former was often easier.
S.
>
> I talk to a lot of academics, and I think they see the situation
> differently than is presented here. In a nutshell, the way a lot of
> them look at it, the amount of computer science in the world increases
> constantly while the amount of time they have to teach that to
> undergraduates remains fixed. As a result, they have to pick and
> choose what they teach very, very carefully, balancing a number of
> criteria as they do so. What this translates to in the real world
> isn't that the bar is lowered, but that the bar is different.
>
> - Dan C.
Taken to COFF...
Hi Arnold,
> In main(), I *think* I'm assigning to the global clientSet so that
> I can use it later. But because of the 'err' and the :=, I've
> actually created a local variable that shadows the global one, and in
> otherfunc(), the global clientSet is still nil. Kaboom!
>
> The correct way to write the code is:
>
> var err error
> clientSet, err = cluster.MakeClient() // or whatever
I think this is a common problem when learning Go, like assigning
getchar()'s value to a char in C. It was back in ’14 anyway, when I saw
https://www.qureet.com/blog/golang-beartrap/, which has an ‘err’ at an
outer scope left unwritten because the ‘:=’ creates a new, shadowing
‘err’ whose assigned value then goes unchecked.
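The C version of that trap, for comparison (a minimal sketch):

    #include <stdio.h>

    int main(void)
    {
        char c;   /* the bug: getchar() returns int, not char, so
                     that EOF (-1) stays distinguishable from every
                     valid input character */

        /* If plain char is unsigned, c != EOF is always true and the
           loop never terminates; if it is signed, an input byte of
           0xFF compares equal to EOF and the loop stops early.
           Declaring c as int fixes both. */
        while ((c = getchar()) != EOF)
            putchar(c);
        return 0;
    }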
The author mentions ‘go vet’ highlights these cases with -shadow, which
is off by default.
https://pkg.go.dev/github.com/golangci/govet#hdr-Shadowed_variables
suggests that's still the case.
--
Cheers, Ralph.
[moving to COFF as this has drifted away from Unix]
On Sat, Sep 28, 2024 at 2:06 PM Larry McVoy <lm(a)mcvoy.com> wrote:
> I have a somewhat different view. I have a son who is learning to program
> and he asked me about C. I said "C is like driving a sports car on a
> twisty mountain road that has cliffs and no guard rails. If you want to
> check your phone while you are driving, it's not for you. It requires
> your full, focussed attention. So that sounds bad, right? Well, if
> you are someone who enjoys driving a sports car, and are good at it,
> perhaps C is for you."
>
If you really want a language with no guard rails, try programming in
BLISS.
Regarding C and C++ having dangerous language features--of course they do.
Every higher-level language I've ever seen has its set of toxic language
features that should be avoided if you want reliability and maintainability
for your programs. And a set of things to avoid if you want portability.
Regarding managed dynamic memory allocation schemes that use garbage
collection vs. malloc()/free(), there are some applications where they are
not suitable. I'm thinking about real-time programs. You can't have your
missile defense software pause to do garbage collection when you're trying
to shoot down an incoming ballistic missile.
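The usual substitute in hard-real-time code is to preallocate everything
up front and hand out fixed-size blocks in constant time, so there is
never a pause. A minimal sketch (hypothetical names):

    #include <stddef.h>

    /* A fixed-block pool: all storage is preallocated, and both
       pool_alloc() and pool_free() run in O(1) with no pauses. */
    #define BLOCK_SIZE  64
    #define NUM_BLOCKS  128

    static union block {
        union block *next;          /* when free: free-list link */
        char payload[BLOCK_SIZE];   /* when allocated: caller's data */
    } pool[NUM_BLOCKS], *free_list;

    void pool_init(void)
    {
        size_t i;
        for (i = 0; i < NUM_BLOCKS - 1; i++)
            pool[i].next = &pool[i + 1];
        pool[NUM_BLOCKS - 1].next = NULL;
        free_list = pool;
    }

    void *pool_alloc(void)
    {
        union block *b = free_list;
        if (b != NULL)
            free_list = b->next;
        return b;                   /* NULL when the pool is empty */
    }

    void pool_free(void *p)
    {
        union block *b = p;
        b->next = free_list;
        free_list = b;
    }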
-Paul W.
Poul-Henning suggests this link as well ...
Warren
----- Forwarded message from Poul-Henning Kamp -----
There is also 3B stuff in various other subdirectories on that site,
for instance: https://www.telecomarchive.com/six-digit.html
----- End forwarded message -----
Moving to COFF ...
From: "Rich Salz" <rich.salz(a)gmail.com>
To: "TUHS main list" <tuhs(a)tuhs.org>
Cc: "Douglas McIlroy" <douglas.mcilroy(a)dartmouth.edu>
Sent: Monday, September 30, 2024 4:03:15 PM
Subject: [TUHS] Re: Minimum Array Sizes in 16 bit C (was Maximum)
On Mon, Sep 30, 2024 at 3:12 PM Steffen Nurpmeso <steffen(a)sdaoden.eu> wrote:
> noone ever told them that even the eldest C can be used in a safe
> way;
Perhaps we have different meanings of the word safe.
    void foo(char *p) { /* interesting stuff here */ free(p); }

    void bar(void)
    {
        char *p = malloc(20);
        foo(p);                    /* frees p */
        printf("foo is %s\n", p);  /* use after free */
        foo(p);                    /* double free */
    }
Why should I have to think about this code when the language already knows what is wrong?
No one would make the claim that programming in machine "language" is safe.
No one would make the claim that programming in assembly "language" is safe.
I've always viewed C as a portable assembler. I think the real issue has nothing to do with the "safety" of C, but rather the "safety" of your-choice-of-C-libraries-and-methods.
My $.02
Jim
FWIW, I just saw this in code generated by bison:
(yyvsp[-4].string_val), (yyvsp[-2].string_val), (yyvsp[0].string_val)
(IIUC) referencing the addresses under the top of the stack when passing
$2, $4, $6 into a function from an action (skipping a couple of
tokens). So the sign just depends on which way the stack is growing.
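Concretely, for a hypothetical six-symbol rule, bison rewrites $k as
yyvsp[k - n], where n is the rule length, because yyvsp points at the
value of the rule's last symbol:

    triple : '(' NAME ',' NAME ',' NAME
               { emit($2, $4, $6); }
           ;
    /* n = 6, so $2, $4, $6 become yyvsp[-4], yyvsp[-2], yyvsp[0]:
       negative offsets under the top of the value stack.  With
       %type <string_val> NAME, $2 expands to yyvsp[-4].string_val,
       as in the generated code above.  The rule and emit() are
       hypothetical. */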
As for range checking of pointers into a malloc'd block of memory, the
pointer could hold just two things: the address of the start of the block
and the pointer itself, some moving address in the block; and then
before the start of the block malloc could stash the address of the end
of the block (where it could be referenced by all pointers into the
block). So instead of a triple word, the pointer is a double word, and
the malloc'd block has an extra word before it. This must have been
done before by someone, somewhere.
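A minimal sketch of that scheme (hypothetical names; alignment ignored):

    #include <stdlib.h>

    /* One checked pointer = two words: the block base and the moving
       address.  The word just before the block stores the end address,
       shared by every pointer derived from the block. */
    typedef struct {
        char *base;   /* start of the malloc'd block */
        char *cur;    /* current position within it */
    } cptr;

    cptr cmalloc(size_t n)
    {
        char *raw = malloc(sizeof(char *) + n);
        cptr p = { NULL, NULL };
        if (raw != NULL) {
            *(char **)raw = raw + sizeof(char *) + n;  /* stash end */
            p.base = p.cur = raw + sizeof(char *);
        }
        return p;
    }

    char cget(cptr p)   /* dereference with a range check */
    {
        char *end = *(char **)(p.base - sizeof(char *));
        if (p.cur < p.base || p.cur >= end)
            abort();    /* out of range */
        return *p.cur;
    }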
I don't think of pointers and arrays in C as the same thing, but rather
array references as an alternate syntax for pointer arithmetic (or vice
versa).
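That is, a[i] is defined as *(a + i), so each of these reads the same
element:

    #include <stdio.h>

    int main(void)
    {
        int a[] = { 10, 20, 30 };
        int *p = a;             /* the array name decays to a pointer */

        printf("%d %d %d %d\n", a[1], *(a + 1), p[1], *(p + 1));
        return 0;
    }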
- Aron