https://en.wikipedia.org/wiki/DEC_J-11
which was used to build this
https://en.wikipedia.org/wiki/DEC_Professional
DEC had a 1-chip PDP-11, but a 1-chip PDP-11 wasn't competitive with the better chips coming out around 1982. The DEC Professional was not such a great machine, and the software support for it was worse: you couldn't run software from the minicomputer line, and even if you could, it wasn't suited to the needs of end users on a single-user system who were asking to open bigger spreadsheets and such.
I knew the PDP-11 pretty well; I never got my hands on a high-end CP/M machine.
The PDP-11 came out around 1970. The OS I always used on it was RSTS/E, which provided an interactive BASIC programming environment. You might have 15 terminals, and each user got their own 64k address space. It was a lot like using BASIC on an Apple ][ but a little better, especially because you got to keep files on a hard drive, and if you were in a programming class you could share files with the instructor and other students. You could also, like CP/M, run other binaries in that address space: you could edit with the TECO text editor, use a FORTRAN compiler, etc. Ordinarily you would use a VT-100 terminal with 80x25 text, which was bigger than most home computers' displays, which were more like 40x25. If you were lucky you had a color VK100
https://terminals-wiki.org/wiki/index.php/DEC_VK100
with better graphics capabilities than home computers except for animation.
High end CP/M machines used the
https://en.wikipedia.org/wiki/MP/M
OS for multitasking, or actually had one CPU board per user, which, like the PDP-11, gave every user a 64k address space.
When I was getting into this stuff as an 8 year old in 1980 (really!) there was a lot of talk about an 8-to-16-bit transition, so of course I imagined future micros would look something like the PDP-11. With micros we had just a handful of 8-bit registers, with mostly 8-bit operations but sometimes 16-bit operations, because you sure as hell have to be able to do pointer arithmetic. The PDP-11 had 8 16-bit registers and seemed pretty powerful in comparison, but...
it had a 64k problem of its own. When micros first came out it was prohibitively expensive to fill out the 64k address space, but by 1983 or so even cheap machines had the full 64k. The crisis of the industry was that ordinary user applications needed more RAM. The PDP-11 had a virtual memory system with 8 8k pages, not too different from virtual memory systems today but simple and small. It worked great for sharing the machine between users but not so great for applications; I guess they could have updated the OS and compilers to do better -- late-generation 8-bit machines like my TRS-80 Color Computer 3 had a similar memory management system -- but they didn't.
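For the curious, the mapping worked roughly like this (a simplified sketch from memory, not the exact register layout of the real PDP-11 MMU): a 16-bit virtual address splits into a 3-bit page number and a 13-bit offset, and a per-page base register relocates each 8k page anywhere in a larger physical memory.

```python
# Sketch of PDP-11-style paging: a 16-bit virtual address is split into
# a 3-bit page number (8 pages) and a 13-bit offset (8 KB per page).
# Each page's base register holds a physical base in 64-byte units,
# letting a 64 KB process live anywhere in a larger physical memory.
# Field sizes here are simplified for illustration.

PAGE_BITS = 13            # 8 KB pages
NUM_PAGES = 8             # 3-bit page field
CLICK = 64                # base-register granularity in bytes

def translate(vaddr, bases):
    """Map a 16-bit virtual address to a physical one via per-page bases."""
    page = vaddr >> PAGE_BITS                # which of the 8 pages
    offset = vaddr & ((1 << PAGE_BITS) - 1)  # position within the page
    return bases[page] * CLICK + offset

# Example: page 0 relocated to physical 0x20000, other pages unmapped at 0
bases = [0x20000 // CLICK] + [0] * (NUM_PAGES - 1)
print(hex(translate(0x0123, bases)))   # page 0, offset 0x123 -> 0x20123
```

Great for giving each of a dozen users their own 64k, but no help to a single program that wants a 65th kilobyte.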
People thought the 68k was the future, with its 16x32-bit register file like the IBM 360 mainframe's, but it wasn't... it turns out you can't really pipeline a computer that has indirect addressing [1], so the likes of Motorola and DEC were abandoning the 68k and VAX by the late 1980s. These had no future. Also, the 68k did not perform as well in real life as people thought it would.
The 8086, though, was pure "worse is better": the segmentation model seemed lame compared to the 32-bit 68k line, but it was actually easy and fun (in my opinion) to use in assembly language, and you couldn't afford, say, 8 MB of memory anyway, so you didn't need a system that could handle it. All of a sudden you could make bigger spreadsheets, and CP/M was headed for the dustbin at the high end.
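The segmentation model is really just arithmetic: the CPU shifts a 16-bit segment register left 4 bits and adds a 16-bit offset, so purely 16-bit pointers can reach a 20-bit (1 MB) physical space. A quick sketch:

```python
# 8086 real-mode address arithmetic: physical = (segment << 4) + offset,
# wrapped to 20 bits on the original 8086. Every individual register is
# still 16 bits, yet the machine addresses a full megabyte.

def phys(segment, offset):
    return ((segment << 4) + offset) & 0xFFFFF   # 20-bit wrap

print(hex(phys(0xB800, 0x0000)))   # 0xB8000, the familiar CGA text buffer
# Many segment:offset pairs alias the same byte:
assert phys(0x1234, 0x0005) == phys(0x1230, 0x0045)
```

The aliasing made the model feel messy compared to a flat 32-bit space, but within one 64k segment the assembly programming was perfectly pleasant.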
Funny though, CP/M did have a sort of revival in that it became common in low-end machines like the C-128, but it was too little too late, with the industry going in the PC direction. A last hurrah for me was that circa 1988 I wrote some software in BASIC for a teacher at my school who had a CP/M computer; I had a Z80 emulator that ran on my 286 machine and was 3x faster than any real Z80. That 286 was lame in a lot of ways, but it was crazy fast for the price.
[1] ... this was the one RISC/CISC CPU thing that really mattered!
Amstrads were good late-80s CP/M machines. We got both those and the C128 in New Zealand.
> the one RISC/CISC CPU thing that really mattered!
Not only indirect addressing, but also multiple memory operands in the same instruction, which means one instruction can touch more than one VM page; even a single unaligned operand crossing a page boundary is bad. Many machines trap on that case to this day and let software emulate it.
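To make the page-crossing problem concrete, here's a little sketch (the 4 KB page size and the addresses are arbitrary, just for illustration):

```python
# A single memory operand spans every page between its first and last byte.
# An instruction with several such operands can fault partway through,
# which is exactly what makes it hard to pipeline.

PAGE = 4096  # assume 4 KB pages for illustration

def pages_touched(addr, size):
    """Set of virtual page numbers one memory operand touches."""
    return set(range(addr // PAGE, (addr + size - 1) // PAGE + 1))

print(pages_touched(0x1FFE, 4))  # unaligned 32-bit access straddles pages 1 and 2
print(pages_touched(0x1000, 4))  # aligned access stays on page 1
```

Either of those pages can independently be unmapped, so the hardware may have to take a fault in the middle of one operand, let alone in the middle of an instruction with two or three of them.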
Not being able to easily tell how long an instruction is (and thus where the next one starts) is also bad, but that can be overcome at some cost in the front end, and the back end is unaffected. Unlike x86 and the VAX, the 68k does actually tell you everything you need in the first 16 bits, but yes, the complex addressing modes of the 020/030 were what killed it.