It's a chip designed for interfacing with the real world via digital and analog means rather than as a general purpose CPU.
Even though it uses an M8C core (I complained, but the design team said it was chosen due to the popularity of the core), it has many interrupts and the hardware to cater for critical real-time applications.
IOW, the device is designed for multitasking, as all CPU cores aimed at interfacing to the real world are, but it's not like what you find on general-purpose CPUs.
So there are few security concerns. The only one is flash-level security to separate supervisor functions from user code, plus the ability for a user to define, in 64-byte blocks, how each part of the flash is handled, e.g. write protected, field upgradable, no protection, factory protection. All of it is just protecting the flash from being accidentally overwritten.
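As a rough illustration of that per-block idea (the names and values below are invented for the sketch, not the actual PSoC tool syntax), checking whether a write is allowed boils down to a table lookup per 64-byte block:

```c
#include <stdint.h>

/* Hypothetical per-block flash protection flags; the real PSoC tools
 * set protection per 64-byte block at build time, but the names and
 * values here are made up for this sketch. */
enum flash_protect {
    FP_UNPROTECTED,    /* no restrictions                             */
    FP_FIELD_UPGRADE,  /* writable in the field, e.g. by a bootloader */
    FP_WRITE_PROTECT,  /* internal reads only, writes rejected        */
    FP_FACTORY         /* locked once programmed at the factory       */
};

#define FLASH_SIZE  16384u                 /* e.g. a 16 KB part */
#define BLOCK_SIZE  64u
#define NUM_BLOCKS  (FLASH_SIZE / BLOCK_SIZE)

/* One protection entry per 64-byte block. */
static enum flash_protect protection[NUM_BLOCKS];

/* Would a write to this flash address be allowed?
 * (addr is assumed to be below FLASH_SIZE) */
int write_allowed(uint16_t addr)
{
    enum flash_protect p = protection[addr / BLOCK_SIZE];
    return p == FP_UNPROTECTED || p == FP_FIELD_UPGRADE;
}
```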
The 256-byte memory model for RAM is basically inherited from the CPU core, which is your compatibility point.
The instruction set came from a time when it was expensive to have large RAM, or impossible to have it on a chip.
Remember that the single-chip stuff was amazing when it came out. I was designing interfacing boards where the CPU had no ROM or RAM and needed latches etc. to interface to the external memory.
When the single-chip parts came out they implemented what they could, and that meant limited RAM, limited ROM, and a simple 8-bit addressing mode for RAM.
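To put a number on it: an 8-bit address can only name 2^8 = 256 locations, so that is all the RAM a single instruction can reach without paging tricks. A trivial sketch:

```c
#include <stdint.h>

/* An 8-bit address can select at most 2^8 = 256 RAM locations. */
static uint8_t ram[256];

/* Every address from 0x00 to 0xFF is reachable; anything beyond that
 * simply has no encoding, and 0xFF + 1 wraps back to 0x00. */
uint8_t ram_read(uint8_t addr)
{
    return ram[addr];
}

void ram_write(uint8_t addr, uint8_t value)
{
    ram[addr] = value;
}
```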
They were meant to replace complex designs such as IR remote controls for TVs, which required little in the way of RAM. Usually just a few variables and a stack.
Another example would be dashboard instruments in cars.
You'd be surprised, but much of the industry uses 8085-, 8051- and 6800-based CPU cores because development tools exist for them that have proved reliable over the years/decades.
Where I differ is that when designing a new chip such as the PSoC (I was involved during the development stage, before the release of the parts), my choice would have been to use a new/newer core as well, one optimized for the hardware. They chose to go with an established core. It was a startup called Cypress Micro Systems, which was eventually taken over by the parent company. Going with an established core makes development easier, but not better. In the long run I think they regretted that decision, as it severely limited the device's acceptance in the community, IMHO.
It was the analog and digital blocks that kept it alive until they got new cores, but that happened much later. Again, IMHO, an opinion formed by discussing the PSoC with other engineers.
Regarding the 286, it was again a compatibility issue. People wanted compatibility with the 8086.
Which IMHO is a load of shit as a concept.
Their reasoning was that it could run 8086 code natively, and if they extended the segmentation model (the 8086 was already segmented) they could run multiple 8086 programs with minimal security issues and minimal development time. Turns out that didn't quite work out the way they wanted, and the 386 was born.
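For context, 8086 "segmentation" was just address arithmetic: physical = segment * 16 + offset, giving a 1 MB space where many segment:offset pairs alias the same byte; the 286's protected mode reinterpreted the segment value as a selector into a descriptor table. A quick sketch of the real-mode arithmetic:

```c
#include <stdint.h>
#include <stdio.h>

/* 8086 real mode: the 16-bit segment is shifted left by 4 bits and
 * added to the 16-bit offset, giving a 20-bit physical address (1 MB). */
static uint32_t real_mode_addr(uint16_t segment, uint16_t offset)
{
    return ((uint32_t)segment << 4) + offset;
}

int main(void)
{
    /* Many segment:offset pairs alias the same physical byte. */
    printf("%05lX\n", (unsigned long)real_mode_addr(0x1234, 0x0010)); /* 12350 */
    printf("%05lX\n", (unsigned long)real_mode_addr(0x1235, 0x0000)); /* 12350 */
    return 0;
}
```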
Naturally, it now had to be backwards compatible with the 286, which wasn't able to fulfill its design goal.
Basically a fuck-up built on a fuck-up, but that's the whole IBM PC, as it was developed to help reduce IBM's tax. IOW, what we all have now is a system that was designed as a tax write-off.
Any design engineer at the time just looked at the abortion that it was and rejected it as overpriced junk. It remained that way for years.
The business world, however, saw it came from IBM and embraced it.
Remember that the Amiga at the time was far, far cheaper (around $600 compared to about $6000), faster, had a linear memory model, had a graphics processor, and blew the PC away in performance by orders of magnitude, especially in graphics. While the PC was still on CGA and EGA graphics cards barely doing simple 2D work, the Amiga was doing hi-res 3D in real time. It also had real sound, not a stupid beep controlled by on/off in software.
So many other systems, so many better CPUs, but we got a segmented pile of shit for years. It took the PC about a decade to come close to the graphics and sound capability of the systems that came before it. Then came the abortion of an OS known as Windows. My laptop has Windows 8.1 and it can't even implement window resizing correctly. My code, which runs on my other PC (Vista), won't run on the new laptop (8.1) even though it was developed with Microsoft products. The computer industry is shit in a blender.
Now, many people go on about compatibility being a limiting factor. I call crap. You can design a CPU that executes non-native instructions almost as fast as the native CPU does.
Downloadable microcode is one example.
Most O/S now are written in a high-level language, making porting pretty easy. Nothing like a rewrite, even if the instruction set is different.
It was certainly possible back then, and within a year or so it would be far faster than the original, in many cases simply due to the speed of the newer processor.
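As a toy illustration of the interpreter end of that idea (a made-up machine with a handful of instructions, nothing like any real core), the fetch-decode-execute loop is tiny; a real product would translate or microcode the hot paths rather than dispatching in C like this:

```c
#include <stdint.h>
#include <stdio.h>

/* Toy "foreign" instruction set: opcode in the high nibble,
 * a register index or 4-bit immediate in the low nibble. */
enum { OP_LOAD_IMM = 0x1, OP_ADD = 0x2, OP_PRINT = 0x3, OP_HALT = 0xF };

static void run(const uint8_t *code)
{
    uint8_t acc = 0, reg[16] = {0};

    for (;;) {
        uint8_t insn = *code++;
        uint8_t op = insn >> 4, arg = insn & 0x0F;

        switch (op) {
        case OP_LOAD_IMM: acc = arg;           break; /* acc = imm       */
        case OP_ADD:      acc += reg[arg];     break; /* acc += reg[arg] */
        case OP_PRINT:    printf("%d\n", acc); break;
        case OP_HALT:     return;
        default:          return;              /* unknown opcode: stop */
        }
        reg[0] = acc;  /* mirror the accumulator into register 0 */
    }
}

int main(void)
{
    /* load 5; add reg0 (now 5); print 10; halt */
    const uint8_t program[] = { 0x15, 0x20, 0x30, 0xF0 };
    run(program);
    return 0;
}
```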
They tried it with the PowerPC, but MicroScum already had the O/S market and didn't want to port their O/S to the newer CPU, eventually bringing it to decline.
Go figure, a company that refuses to sell its product to expand sales.
IOW, business politics and behind the scenes handshake deals kill the advancement of technology.
Along with idiots that line up to get the newest O/S when it's released like it's some form of magic pill.
Ok, rant done. Feel much better now. Time to go back to finding out why this stupid thing crashes.