Dan - I think you hit the nail on the head. Some of us broke into the biz with large IBM systems. My own first systems-programming job was on TSS/360 (written in BAL, of course).
The reason this is so acute was actually predicted by the late Prof. Clay Christensen in his book "The Innovator's Dilemma." During the heady days of IBM mainframes in the late 1960s, Wall Street/business computing systems were all custom built. If you walked into Morgan Stanley, Chase, or Citi, much less the NYSE, there were rooms and rooms of COBOL programmers writing custom code, just for that firm. Nothing was shared; each firm used its own pool of programmers to survive and try to get an edge on the others.
More importantly, they were all using IBM ISAM databases behind all the COBOL 'business logic.' Hey, life was good; the business boomed. Guess what, government programmers pretty much did the same thing, although they were not competing; they were just trying to build systems to support the different programs they had (collecting taxes, paying benefits, etc.). What was common in all cases was that the institution (commercial or government) behind all that programming and system deployment had lots of $s. So IBM, and leeches like Ross Perot's firm (there were many like him; his was just one of the biggest), performed what Christensen calls 'sustaining technology.' They kept making the same things faster, bigger, better, etc., because that's what their customers wanted and were asking to buy.
But in the early 1970s four things happened: 1.) Edgar F. Codd, an Oxford-educated mathematician working at the IBM San Jose Research Lab, published a paper showing how information stored in large databases could be accessed without knowing how the information was structured or where it resided in the database -- that is, he invented the relational model for DBs; 2.) the super-minis, like the VAX and its kin, came on the scene; 3.) Oracle would eventually clone Codd's work and create a cheap DB that ran on #2; and 4.) companies like Oracle, Baan, and SAP built >>re-configurable<< applications that worked a lot like the custom ones the huge COBOL teams had built, but were good enough for most firms: a general ledger/accounting system, a small banking system, a car dealership or supermarket system. In fact, what arose was a legion of folks like the Big 8 that would take those systems and set them up for smaller firms -- customize some of the reports, but many customers used the same code (rinse and repeat...).
And something else happened... Oracle did not try to attack IBM head-on. Larry and team followed Christensen's idea of looking for new customers/a new market, where what they sold might not originally be quite as sexy as the custom things the big financial folks/government were building for themselves, but it was a load cheaper and 'good enough' for most smaller firms.
The rest, as they say, is history. Eventually the systems got bigger and bigger and gained more capability, and Oracle, SAP, et al. got better and better and could do more. Eventually Wall Street started to switch to the new, less customized world, and in many places the old custom applications for accounting, payroll, et al. slowly started to get replaced by the reconfigurable ones >>on the commercial side<<. The key point is that Oracle and the commercial applications/database side of the world disrupted the original mainframe world in the financial sector ---> except for one type of customer: government.
There were no new customers for these systems, so governments just kept using the custom stuff they had. They did start to put web-style front ends on them. But what elected official is going to try to get money to replace the current system? It works, and a redo/rewrite/modernization is not cost-effective because there are no economies of scale -- unless the Feds put together a program for all 50 states (which is what they did for the IRS, and that new system is still having issues). Remember, redoing something like that is not going to get you re-elected, and it sure is going to take money away from a special project that might.
That said, here is a question for you. I suspect you have used sqlite(3), Berkeley DB, or some simple relational DB at some point. But did you ever learn how to use a CODASYL/ISAM DB? I bet there really was no reason for you to learn how, and frankly it was not likely to be on your system. But take a look --> try front-ending ISAM-style queries from JavaScript. We have a whole set of routines and standards for calling SQL -- COBOL calling ISAM is much more ad hoc (IBM called these 'access methods'). The big UNIX idea of 'everything is a file' is the complete antithesis of the CODASYL model.
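To see why the two worlds are so different, here is a toy sketch (not real ISAM code -- a hypothetical model in Python, using the stdlib sqlite3 module and a sorted list standing in for an indexed file). The SQL query is declarative: you say *what* you want and never see the storage layout. The ISAM-style access is navigational: the program positions a cursor by key and reads records in sequence, much like a COBOL program doing START/READ NEXT against an indexed file.

```python
import sqlite3
import bisect

# --- Relational/SQL style: declarative; you state *what* you want.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE voters (town TEXT, name TEXT)")
conn.executemany("INSERT INTO voters VALUES (?, ?)",
                 [("Acton", "Baker"), ("Acton", "Adams"), ("Bedford", "Cole")])
acton = [name for (name,) in conn.execute(
    "SELECT name FROM voters WHERE town = ? ORDER BY name", ("Acton",))]

# --- ISAM style (toy model): navigational; the *program* knows the layout.
# Records live in key order; the caller positions a cursor with the key,
# then reads sequentially until the key changes.
records = sorted(["Acton|Adams", "Acton|Baker", "Bedford|Cole"])
keys = [r.split("|")[0] for r in records]

def isam_read(town):
    """Position by key (START), then READ NEXT while the key matches."""
    pos = bisect.bisect_left(keys, town)        # START: seek to first match
    out = []
    while pos < len(records) and keys[pos] == town:
        out.append(records[pos].split("|")[1])  # READ NEXT
        pos += 1
    return out

print(acton)               # ['Adams', 'Baker']
print(isam_read("Acton"))  # ['Adams', 'Baker']
```

The point of the sketch: in the navigational version, the field separator, the sort order, and the seek-then-scan loop are all baked into the application -- exactly the knowledge that dies when the last programmer who understood the file layout retires.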
Do you remember what happened during the Obama administration, when they rolled out the ACA with most states? The web sites were crashing because the databases couldn't keep up with the queries. The web front ends scaled, but those back-end databases were never designed to be accessed that way. The problem is really not COBOL so much as that the design of those systems was all custom and assumed a very structured back end with structured data.
Those systems grew up over time with small incremental changes. Most changes were forced when new legislation came in and new features had to be added. But they stayed as they were and were patched to keep going. So somewhere, somebody put a pretty front end on with the web, but in the back office it's still an old system. Think about it -- you live in MA. Have you tried to get a new driver's license with Real-ID from the DMV? Same issue: great web front end, but the DMV system behind it all is circa-late-1960s COBOL/ISAM, never replaced. It's a nightmare, and a lot of it is manual.

Same with the voter registration systems for the towns. A couple of years ago, my sister arranged to get a 9-track (EBCDIC) tape from some of the towns with all the voters on it, for a story she was writing for the Boston Globe, and she needed to do some statistical analysis. Fortunately, I still had a working 9-track drive, and I can deal with almost anything. It was clearly an ISAM database dump. Between dd(1) and a few sed/grep/awk scripts, she was in business. IIRC, I started to try to put it into Berkeley DB, but mostly just converting it to an ASCII tabular form and being able to run grep et al. was good enough. She showed her town clerk what she had, and the clerk wanted it. The clerk told my sister that she had been trying for years to get the state to give her some of that data in the form we produced; she had been told it could not be done, etc. The reason: they no longer had anyone who knew anything about the DB.
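For the curious, the tape conversion itself is not black magic. What dd(1)'s conv=ascii plus some awk massaging does can be sketched in a few lines of Python. The specifics here are assumptions, not what was actually on that tape: code page cp037 (the common US EBCDIC mapping; real tapes vary) and 80-byte fixed-length records, as was typical of card-image dumps.

```python
# Convert an EBCDIC fixed-width record dump to ASCII lines -- roughly what
# `dd conv=ascii` plus awk did for the 9-track tape described above.
# Hypothetical assumptions: code page cp037 and 80-byte records.

RECLEN = 80

def ebcdic_to_lines(raw: bytes, reclen: int = RECLEN):
    """Decode EBCDIC bytes and split them into fixed-width records."""
    text = raw.decode("cp037")              # EBCDIC -> Unicode text
    return [text[i:i + reclen].rstrip()     # strip trailing pad blanks
            for i in range(0, len(text), reclen)]

# Tiny round-trip demo: encode a card-image record, then recover it.
sample = "SMITH, JOHN      ACTON".ljust(RECLEN).encode("cp037")
print(ebcdic_to_lines(sample))   # ['SMITH, JOHN      ACTON']
```

The hard part in practice is never the byte translation -- it's recovering the record layout (field widths, packed-decimal columns, key structure) that only the original COBOL copybooks, or the people who wrote them, ever documented.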
So the problem is in middle and higher management. What Christensen points out is that when you can create a new market, you can change things quickly ('disrupt' the old market, in his terms). But without those market forces, the developers will just continue to build you better and better systems that are incremental over the old one. If there is no new market to do the disrupting, the old one stays.
BTW: in my world, HPC, this is why Fortran is still king, for the same reasons -- and there it is worse, because the math has not changed and we have 50-70 years of datasets that would have to be rechecked and revalidated if the programs changed. Check out: https://www.archer.ac.uk/status/codes/