Most of the RS232 spec seemed to be designed for Sync modems and their management.
Most machines of the mini generation seemed to use either Async or Sync interfaces. Stuff like the VT180 had a comm port built around an 8251 USART that could do either sync or async serial. I don't believe DEC had anything like that in the PDP-11 or early VAX days.
Anyone care to enlighten me?
I know some of the story having known some of the players and lived a little of it, but I do not claim to know all of it. So I might be able to fill in some holes, but there is still plenty missing from the complete story.
First, remember RS-232 is an EIA spec (with CCITT and ECMA equivalents in Europe) for connecting communication gear to data gear. It's not so much about sync/async as it is about the phone systems in the US and Europe and how to interface computer gear to them. The driver of the spec was building and connecting RJE-like systems for banks and financial institutions, airline terminals, etc. to a central (mainframe) computer. This is why it uses terms like "Data Communications Equipment" and "Data Terminal Equipment" - as opposed to modems, computer terminals, hosts and the like.
Different systems vendors had different ways of thinking about the computers they were selling and how people would interface with them. You can see those differences in the choices they made, such as which peripherals they interfaced at the time.
The AT&T Teletype Model 28 (https://en.wikipedia.org/wiki/Teletype_Model_28, circa 1953 intro, 5-bit BAUDOT code, current loop) was the standard terminal on DEC systems for many years. Gordon Bell invented (patented) the UART to talk to it for the PDP-1 (maybe it was the PDP-6) sometime in the early 1960s. What I do not know/understand is which of that work he did at DEC and which after he had left to be a CMU prof. I was >>under the impression<< the patent was granted during his CMU time; but I had thought DEC originally built them as FLIP-CHIPs for the PDP-6.
Then in 1963, 7-bit ASCII was introduced. IBM and AT&T were the two biggest firms behind it (remember that the IBM System/360 was supposed to be an ASCII system and had a lot of support for ASCII in its ISA; but because the OS SW was late, it stayed with their earlier BCD - creating EBCDIC - read Fred Brooks's "The Mythical Man-Month" for the details). However, AT&T, GE and DEC did switch to 7-bit ASCII pretty much as soon as they could. The AT&T/Teletype Model 33 (https://en.wikipedia.org/wiki/Teletype_Model_33, circa 1963 intro, 7-bit ASCII code/current loop, but lacking a shift key on the keyboard) was introduced soon thereafter, and that became the standard terminal (7-bit byte, plus parity and 3 bits worth of start/stop -- 11 serial bits per transmission). My memory is that AT&T did not sell an RS-232 version, but the aftermarket had a ton of converters between the current-loop interface and the RS-232 standard.
So at the time, you have IBM using primarily synchronous interfaces, while AT&T (Teletype) used asynchronous. IBM liked sync because it wasted no bits on start/stop framing. They liked half-duplex because their devices were primarily going one way at a time. In the 60s, IBM's big business had them connecting RJE stations; they would only much later do half-duplex synchronous terminals. DEC was more interactive much sooner, used Teletypes, and thus was async and full-duplex.
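The framing overhead that made IBM prefer sync is easy to quantify. A minimal sketch, assuming the Model 33's standard framing described above (start + 7 data + parity + 2 stop bits, at the Model 33's 110 baud):

```python
# Async framing on a Model 33-style link: 11 serial bits carry 7 data bits.
baud = 110                        # Model 33 line speed
bits_per_frame = 1 + 7 + 1 + 2    # start + data + parity + stop = 11
chars_per_sec = baud / bits_per_frame
payload_efficiency = 7 / bits_per_frame
print(chars_per_sec)                  # 10.0 characters/second
print(round(payload_efficiency, 2))  # 0.64 -- roughly a third of the line is framing
```

Synchronous links avoid that per-character cost by framing whole blocks, which is exactly the win IBM was after on expensive leased lines.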
I was also under the impression (i.e., was once told) that Western Digital obtained a license to make the UART as chips, but it was never completely clear to me who held the patent (CMU or DEC), i.e., from whom and how WD got the license. But after the chips appeared, DEC would buy them from WD for things like the DL-11/KL-11 and DH-11 interfaces, and I think they made something like the DL11 for the PDP-8. If you look at the schematics for the early serial ports for the PDP-11, they are all using the same WD chip.
Soon after the UART, WD also started making a USRT, which (the best I can tell) they seem to have been selling to IBM and the com vendors for IBM gear. I personally never programmed them, but they are in an old WD book I once had (and may still have). I remember seeing them in some Gandalf gear in the mid-70s besides the IBM gear, but I don't know/remember much more.
In the early 1970s, CMU used the same WD UART chips as DEC was using in the DL-11, but had designed its own serial board, which we called the ASLI (there were other differences, but mostly the SW could not tell them apart).
Nat Semi was a second source for WD at some point in the late 1960s, and by the early 1970s they started to design their own UART (as was pointed out, they eventually created the 8250 and its follow-ons). I'm not sure when Intel and Moto started to make them, but I think both the 8080 and 6800 families had UART chips. MOS Tech did not originally, although later, when Rockwell became their second source, a UART for that family appeared too.
At some point in the early 1970s, the first USARTs started to appear. I was under the impression that WD was the origin of them, but I do not know. By the time of the 16-bit micros, however, many of the better serial interface chips could be either synchronous or asynchronous under program control. With the 16-bit chips, a Zilog USART chip was fairly popular at one point, from Macs to UNIX boxes. As others pointed out, because of the PC/AT the Nat Semi 8250 stuck around, as it had ended up as part of the 'PC support chip family', even though it was a bit of a dog and notorious for dropping characters at high speeds.
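For anyone who never programmed the 8250 family: the part of its interface that outlived everything else is the baud-rate divisor. A hedged sketch (not from the original discussion, just the arithmetic per the Nat Semi datasheet): the chip divides its 1.8432 MHz input clock by 16 times a 16-bit divisor latch, so software computes divisor = 115200 // baud and writes it with the DLAB bit set in the line control register.

```python
# 8250-family baud-rate divisor arithmetic (datasheet convention).
MAX_BAUD = 115200   # 1.8432 MHz input clock / 16

def divisor_for(baud):
    """Divisor latch value, written with DLAB set in the LCR."""
    return MAX_BAUD // baud

for baud in (110, 300, 9600):
    print(baud, divisor_for(baud))   # 110 -> 1047, 300 -> 384, 9600 -> 12
```

The 110-baud divisor of 1047 only approximates 110 baud (115200/1047 is about 110.03), a reminder that the chip was built around the standard rates of the day rather than the Teletype era's exact timing.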
For completeness, the Unix folks at BTL used the Teletype Model 37 (https://en.wikipedia.org/wiki/Teletype_Model_37, circa 1968 intro, 7-bit ASCII code, full U/L case) as their native printing terminal. IIRC the ASR-37 had an RS-232C option as well as a current-loop one from Teletype, as the industry had pretty much dropped the current-loop standard by then. Interesting side note: the AT&T/BTL programmers often did not have 'hardwired' lines in their offices, but used modems (they were the phone company, of course).