Hi Bakul,
> As far as processor design is concerned, I believe one of the
> problems is that there are fewer and fewer people who can do both
> h/w and s/w design competently.
The ARM chip came about because the UK's Acorn Computers couldn't find a
decent cost:performance 16-bit CPU to replace the 6502 in new models.
Furber and Wilson realised from a visit to Bill Mensch's WDC that chip
design could be done by a small outfit. The recent UCB reports on RISC
suggested a simple design could still have high performance, so they set
about skipping 16-bit CPUs and rolling their own 32-bit RISC CPU, the
Acorn RISC Machine.
The point of that bit of history is that they were not chip designers,
but knew electronics and programming. Wilson designed the ARM's
instruction set, and it was a delight to code: very orthogonal, and
every instruction carried a four-bit condition-flag test, e.g. Carry
Set, plus a bit to say whether that instruction should itself set the
condition flags. Thus several instructions in a row could test the
condition flags set by an instruction a few back and unaltered since;
this cut the need for quite a few branches.
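As a minimal sketch of that style (standard ARM conditional execution;
my example, not Acorn's code), here's Euclid's GCD of two positive
integers with the loop's branch as the only branch:

    gcd     CMP     R0, R1          ; set flags from R0 - R1
            SUBGT   R0, R0, R1      ; if R0 > R1 then R0 := R0 - R1
            SUBLT   R1, R1, R0      ; if R0 < R1 then R1 := R1 - R0
            BNE     gcd             ; loop until R0 = R1, the GCD

SUBGT and SUBLT lack the S suffix, so they leave the flags alone and
BNE a few instructions later still tests CMP's result.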
I think Wilson did such a good job because she had coded extensively in
assembler on several different architectures. Not just the odd device
driver or context switch, but the 16 KiB of 6502 instructions that made
up BBC BASIC, a structured BASIC with WHILE, PROC, integers, floats, etc.
(The BASIC ROM remains a good test of any 6502 emulator today because of
all the corner cases the hand-written assembly exercised.) This was
just for the BASIC interpreter; the OS, file system, etc., were all in
other 16 KiB ROMs of hand-written assembly.
The ability to add a co-processor to the BBC Microcomputer meant Z80
and other implementations of BBC BASIC followed. So with all that
experience it's not surprising that the instruction set, though RISC,
gave just what the assembly writer wanted, nothing more, whilst being
easy to learn and remember. I don't think someone who was purely a
chip designer could have done such a good job.
The later pressure to drop uniform 32-bit instructions for a mixture
of 16-bit and 32-bit ones, due to mobile's small flash capacity, begat
Thumb, still used today as Thumb-2. It dropped many of the nice
features which made hand-written assembly such a pleasure, but by then
compilers were better quality and more common.
> This is why I think more programmers should roll up their sleeves
> and design a processor and understand the issues involved,
> especially now that programming FPGAs is becoming common. Maybe
> start with an existing RISC-V core in some HDL, and push and pull
> it into (what you think is) an ideal minimalist design.
I agree with the sentiment, but it sounds quite a big leap for most
programmers. I knew assembly and logic but not how to fill the gap to
create a CPU. The popular book from a few years ago which plugged the
hole for me was ‘The Elements of Computing Systems’, which I see has now
had a second edition: ISBN 0-262-53980-2,
https://amzn.to/3xE0IZo
It's a book of two halves: the first builds a simple CPU for a
primitive computer from nothing but NAND gates; the second writes an
assembler, then a compiler for a language targeting a VM, and then
implements the VM on the CPU built in part one. It's the reader who
has to do all this: initially in the book's HDL, run against the
provided simulators and test cases, and then in a programming language
of their choice, since only text I/O is required.
So to start, the other two-input logic gates have to be built from
NANDs, then multiplexers, etc.; there's a sketch of the flavour below.
There's a target for how many gates to use, so an efficient design has
to be found. By chapter two we're implementing the ALU. After the
sequential logic of chapter three, the instruction set is introduced
in chapter four. It proceeds at pace, giving just the barest tuition
needed to understand each conceptual part. Any discussion of
subtleties or alternative approaches is eschewed, left for weightier
textbooks.
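To give a flavour of that first step, here's roughly what the book's
HDL looks like; I'm writing from memory, so treat the details as
approximate. Each chip declares its pins and is wired up from parts
already built, with Nand supplied as the primitive:

    // Not: a NAND with its two inputs tied together.
    CHIP Not {
        IN in;
        OUT out;
        PARTS:
        Nand(a=in, b=in, out=out);
    }

    // And: NAND followed by Not; nandOut is an internal wire.
    CHIP And {
        IN a, b;
        OUT out;
        PARTS:
        Nand(a=a, b=b, out=nandOut);
        Not(in=nandOut, out=out);
    }

Each chip is run against the book's supplied test scripts in its
simulator before being used as a part in the next one.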
I think it's a better introduction to CPU design than starting from an
existing core; from there one can read up on Verilog, etc., and start
experimenting with little CPU designs on FPGAs.
--
Cheers, Ralph.