Sigma_Orionis wrote:
FZR1KG wrote: Ah young student, there is a difference between going from one bit width to another and changing the instruction set...
Tell that to the IBM System/360, introduced in 1964.
Direct from the horse's mouth:
Mainframe customers tend to have a very large financial investment in their applications and data. Some applications have been developed and refined over decades. Some applications were written many years ago, while others may have been written "yesterday." The ability of an application to work in the system or its ability to work with other devices or programs is called compatibility.
The need to support applications of varying ages imposes a strict compatibility demand on mainframe hardware and software, which have been upgraded many times since the first System/360™ mainframe computer was shipped in 1964. Applications must continue to work properly. Thus, much of the design work for new hardware and system software revolves around this compatibility requirement.
The overriding need for compatibility is also the primary reason why many aspects of the system work as they do, for example, the syntax restrictions of the job control language (JCL) that is used to control batch jobs. Any new design enhancements made to JCL must preserve compatibility with older jobs so that they can continue to run without modification. The desire and need for continuing compatibility is one of the defining characteristics of mainframe computing.
Absolute compatibility across decades of changes and enhancements is not possible, of course, but the designers of mainframe hardware and software make it a top priority. When an incompatibility is unavoidable, the designers typically warn users at least a year in advance that software changes might be needed.
Um, you realise that most of that series had downloadable microcode, basically what I was talking about, right?
Others had different instruction sets but provided emulators in hardware.
If you want compatibility you could have it. If you wanted speed you could have it.
You couldn't have both. If your program was written for a smaller CPU, the better CPU would still run it, but just as slowly or slower on a clock-by-clock basis. Getting the extra speed needed faster clocking or conversion to the new instruction set.
Most of the time the microcode was loaded by the bootloader, so changing to a different instruction set required a reset. Not practical, but possible if you needed it. We don't need to go that far anymore since technology has come a long way since then.
They also offered microcode for scientific applications: you could just download it and get scientific (floating-point) instructions instead of the BCD math that the business instruction set provided. The two were obviously not compatible.
That's the whole point of downloadable microcode, though back in those days it was simply microcode that was usually stored in core memory (magnetic cores). The really fast machines used hardwired logic rather than microcode and thus needed emulation hardware for backward compatibility.
It was when modern CPUs moved onto a single chip that a separate bus or downloadable microcode became harder to do, so designers settled on the cheaper alternative of ROM-based microcode to save silicon real estate. At that point the decision to stay with one CPU really locked you down at the machine-code level, but you could still emulate or simulate.
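Just to make the microcode idea concrete, here's a toy sketch in C. It's purely illustrative and assumes nothing about any real IBM or Motorola control store: the opcodes, the "business"/"scientific" names and the three-byte instruction format are all invented for the example. The point is that the execution engine stays fixed while a table of micro-operations decides which instruction set it speaks; swap the table and the same bytes of machine code mean something different.

[code]
/*
 * Toy illustration of loadable microcode (not any real machine's design).
 * A fixed engine fetches and decodes through a table of micro-operations;
 * which table is loaded decides which "instruction set" the machine runs.
 */
#include <stdio.h>
#include <stdint.h>

typedef struct {
    int    reg[4];   /* a toy register file           */
    size_t pc;       /* program counter into the code */
} Machine;

/* One "microcode" entry: how to execute one opcode of the guest ISA. */
typedef void (*MicroOp)(Machine *m, uint8_t a, uint8_t b);

static void uop_add (Machine *m, uint8_t a, uint8_t b)   { m->reg[a] += m->reg[b]; }
static void uop_sub (Machine *m, uint8_t a, uint8_t b)   { m->reg[a] -= m->reg[b]; }
static void uop_load(Machine *m, uint8_t a, uint8_t imm) { m->reg[a]  = imm; }

/* Two different "control stores": same engine, different instruction sets. */
static MicroOp isa_business[3]   = { uop_load, uop_add, uop_sub };  /* 0=LOAD 1=ADD 2=SUB */
static MicroOp isa_scientific[3] = { uop_load, uop_sub, uop_add };  /* 0=LOAD 1=SUB 2=ADD */

/* The fixed "hardware": fetch, decode through the loaded table, execute. */
static void run(Machine *m, MicroOp *ucode, const uint8_t *code, size_t len)
{
    for (m->pc = 0; m->pc + 2 < len; m->pc += 3)
        ucode[code[m->pc]](m, code[m->pc + 1], code[m->pc + 2]);
}

int main(void)
{
    /* Same binary program, interpreted under two different control stores. */
    const uint8_t program[] = { 0, 0, 10,   /* opcode 0: load r0, 10 */
                                0, 1, 3,    /* opcode 0: load r1, 3  */
                                1, 0, 1 };  /* opcode 1: r0 op= r1   */

    Machine m = {0};
    run(&m, isa_business, program, sizeof program);
    printf("business ISA:   r0 = %d\n", m.reg[0]);   /* opcode 1 is ADD -> 13 */

    Machine n = {0};
    run(&n, isa_scientific, program, sizeof program);
    printf("scientific ISA: r0 = %d\n", n.reg[0]);   /* opcode 1 is SUB -> 7  */
    return 0;
}
[/code]

Real control stores were of course far wider and much lower level than a C function pointer per opcode, but swap-the-table is the idea.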
But just to be clear, I'm not saying compatibility isn't a top priority.
What I'm saying is that it does not translate to CPU instruction-set compatibility.
The IBM 360 you linked to is a classic example.
So are the Crays, the Primes, the Honeywells, the Cybers, etc.
They ran different CPUs which were obviously not machine-code compatible, but software, microcode or extra hardware made them compatible.
I can't get software compatibility now even though the hardware is machine code compatible, but they got software compatibility without hardware compatibility back then. We've gone backwards, not forwards in that regard.
As a matter of interest, did you know that Motorola produced a 68000-360, a CPU with a scaled-down IBM 360 instruction set?
The microcode/nanocode was in ROM rather than downloadable, but that's what I was talking about:
CPUs of the modern era that could run code from as far back as the late '50s because they ran changeable microcode.
Now if they did that in a downloadable form, the PC would be hardware compatible with almost any CPU of the past, simply because its address and data bus widths are more than enough to execute any of those instruction sets.
If you want hardware compatibility, that's the way to do it.
Imagine if we'd gone down that path what things would be like today.
1) I could have any O/S I like on my laptop without compromising its new features.
2) With multiprocessing CPUs I could run different O/S's at the same time on the same hardware.
3) If I needed a specific instruction set, I could download it (see the sketch below). I could run IBM 360 programs or Cyber 18/20 programs faster than they were ever run.
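Purely as a hypothetical sketch of what the "downloadable" part could look like (again, the text format, names and opcodes are made up; this is not any real loader format): the opcode-to-operation mapping arrives as data, gets parsed into a dispatch table at run time, and only then does the fixed engine know which instruction set it's speaking.

[code]
/*
 * Hypothetical "downloadable instruction set": the mapping from opcodes to
 * micro-operations is data (here a string standing in for a downloaded
 * file), parsed into a dispatch table before anything runs.
 */
#include <stdio.h>
#include <string.h>
#include <stdint.h>

typedef struct { int reg[4]; } Machine;
typedef void (*MicroOp)(Machine *m, uint8_t a, uint8_t b);

static void uop_add (Machine *m, uint8_t a, uint8_t b) { m->reg[a] += m->reg[b]; }
static void uop_sub (Machine *m, uint8_t a, uint8_t b) { m->reg[a] -= m->reg[b]; }
static void uop_load(Machine *m, uint8_t a, uint8_t i) { m->reg[a]  = i; }

/* Library of micro-operations the engine knows how to perform. */
static const struct { const char *name; MicroOp op; } library[] = {
    { "load", uop_load }, { "add", uop_add }, { "sub", uop_sub },
};

/* Build a dispatch table from a textual description of "opcode name" lines. */
static int load_ucode(const char *desc, MicroOp table[16])
{
    char name[16];
    int opcode, used = 0;
    while (sscanf(desc, "%d %15s", &opcode, name) == 2) {
        for (size_t i = 0; i < sizeof library / sizeof library[0]; i++)
            if (strcmp(library[i].name, name) == 0 && opcode >= 0 && opcode < 16) {
                table[opcode] = library[i].op;
                used++;
            }
        desc = strchr(desc, '\n');   /* advance to the next line */
        if (!desc) break;
        desc++;
    }
    return used;
}

int main(void)
{
    /* Pretend this arrived over the network instead of sitting in a ROM. */
    const char *downloaded = "0 load\n1 add\n2 sub\n";

    MicroOp table[16] = {0};
    if (load_ucode(downloaded, table) == 0) return 1;

    const uint8_t program[] = { 0, 0, 40,  /* load r0, 40 */
                                0, 1, 2,   /* load r1, 2  */
                                1, 0, 1 }; /* add  r0, r1 */

    Machine m = {0};
    for (size_t pc = 0; pc + 2 < sizeof program; pc += 3)
        table[program[pc]](&m, program[pc + 1], program[pc + 2]);

    printf("r0 = %d\n", m.reg[0]);  /* 42 */
    return 0;
}
[/code]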
Basically the computer industry was slowed down. The trend changed from producing better, faster and more flexible CPUs to CPUs that had backward machine-code compatibility when there was no real need for it, as your example of the IBM 360 shows.
Sigma wrote: No worries dude, arguments like these are great, makes me research and find stuff that is new to me and we get to pass the time.
BTW MostlySucks has a free version of their Visual C++ what-have-you 2013 thing, it might help.
Right now I am seeing how the idiots at corporate pretend to take a Datacenter that has given them 4 years of almost continuous trouble-free operation and turn it into shit. So, yeah, I hear you.
Besides, you and I are way too old and have been too long in this business to bitch at each other over some geeky point that 90% of the world doesn't understand, much less give a damn about
Ain't that the truth. Not that I'm bitching at you. It's the industry I'm pissed off with.
Remember, I was a design engineer back in the days when CPU development was really starting to take off, and I knew how to design CPUs and their peripherals such as caches. So watching the industry go down the path it did was frustrating as hell. Back in those days I'd debate with other design engineers about the trends and we speculated where it was heading. So far our predictions back then were pretty much spot on. Not bad for predictions made close to three decades ago.
The only one that hasn't happened yet is one I predicted would happen but hasn't quite yet, though the trend is heading that way so it's just a matter of time. Maybe I should check again, it's been a while. It may have happened since I last checked. PM me if you want to know what that was as I don't want to post it publicly.
I tried the 2013 C++ but it's not compatible with the 2005 version either.
The 2008 version allows you to update your code but all it does is break it.
The later versions don't even bother trying.
It is fun going back down memory lane though.
Most people don't even know what microcode is, or if they have heard of it, have no idea what it does or how it works, yet their lives revolve around machines that use it. Not that I care, I just find it amusing at times. Especially when hearing people talk about computers and you know they don't really know much about them, but they think they do and are happy to share their wealth of knowledge with everyone. Little do they know that the person next to them, who looks like a blue collar laborer, designed such systems for a living before they were even born.
Even more amusing is when people know I'm an electronics engineer and ask my advice on which computer to get.
I don't know. Why the hell would I?
I don't get involved with the latest CPUs or benchmarks because I find it boring, and I'm kind of pissed off with the whole industry anyway.
People overclock CPUs to get them to run a bit faster. I know a guy who spent days tweaking a PC to get a few percent more performance.
Yay, he managed to tweak it to do benchmark software faster. Excellent. Good job. Here's a dog biscuit for you. lol
I'm also probably very biased. I'm a hardware guy. I love design. Digital and analog. I love to make new designs but that side of the industry is dying or so specialized it's not worth the effort. So I guess watching what happened was distressing to me.
Maybe I have 8086 PTSD?
Ever seen Charlton Heston at the end of "Planet of the Apes"?
That's me banging my head asking, "why this piece of shit processor? why? WHY!!! arrrgggghhhhh" Damn you to hell!!!