On 2018-Sep-24 16:45:13 -0400, Arthur Krewat <krewat(a)kilonet.net> wrote:
> The 8086 was the first of the "x86" line, which was 16-bit, although
> its I/O was more 8080-ish if I recall correctly.
The 8086/8088 bus was designed to be similar to the 8085's so that existing
8080/8085 peripherals could be used in 8086 systems. This saved Intel the
effort of developing a new range of support chips and let vendors build 8086
systems sooner, since they didn't have to wait for new peripheral silicon.
Descendants of those 8080-era support chips - the 8253 timer, 8257 DMA
controller, and 8259 interrupt controller - survive embedded in PC
Southbridge chips.
> The 8088 had an 8-bit data bus, granted, but having done both 8080 and
> 8086+ assembler, you couldn't really tell the difference,
> programming-wise, between the 8086 and the 8088 - 16-bit registers and all.
This was deliberate - the 8080 was itself an upgraded 8008, and Intel made
the 8086 similar enough to the 8080 to allow automated assembler
translation. It seems highly likely that the undocumented 8085 opcodes were
left undocumented because they didn't translate readily to the 8086.
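As a toy illustration (this is not Intel's actual CONV86 converter, just a
sketch of the idea), the published 8080-to-8086 register correspondence -
A->AL, B->CH, C->CL, D->DH, E->DL, H->BH, L->BL, with M (memory via HL)
becoming [BX] - is mechanical enough that a few lines of Python can
translate simple MOV instructions:

```python
# Toy sketch only - NOT Intel's CONV86, just a demonstration of why
# source-level 8080 -> 8086 translation was feasible. Register mapping
# follows Intel's published scheme: A->AL, B->CH, C->CL, D->DH, E->DL,
# H->BH, L->BL, and M (memory addressed through HL) -> [BX].
REG_MAP = {
    "A": "AL", "B": "CH", "C": "CL", "D": "DH",
    "E": "DL", "H": "BH", "L": "BL", "M": "[BX]",
}

def translate_mov(line: str) -> str:
    """Translate a simple 8080 'MOV dst,src' into its 8086 equivalent."""
    op, operands = line.split(None, 1)
    if op.upper() != "MOV":
        raise ValueError(f"only MOV is handled in this sketch: {line!r}")
    dst, src = (REG_MAP[r.strip().upper()] for r in operands.split(","))
    return f"MOV {dst},{src}"

# 8080 'MOV A,M' (load A from the byte HL points at) becomes
# 'MOV AL,[BX]' on the 8086, since HL maps onto BX.
print(translate_mov("MOV A,B"))  # -> MOV AL,CH
print(translate_mov("MOV A,M"))  # -> MOV AL,[BX]
```

The real converter had to do far more (flag semantics, 16-bit pairs,
segment setup), but instructions that fit this pattern translate one-for-one.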
> Cutting costs, as always, IBM opted for the 8088, which allowed them to
> use an 8085-style I/O architecture.
An 8-bit memory bus also means half as many RAM chips and bus buffers. Keep
in mind that the IBM 5150 was intentionally crippled to ensure it didn't
compete with IBM's low-end minis.
--
Peter Jeremy