The Amiga used a CPU rated for 8 megahertz, but clocked at 7.14 megahertz. What was the reason for this number? I remember it was something to do with a multiple of the frequency of the video circuitry, but I forget the details.
I think you meant 7.15909 MHz. 7.15909 MHz is twice the NTSC color burst frequency (3.579545 MHz). The NTSC color burst frequency is 455/2 times the line rate, the line rate is 262.5 times the field rate, and the field rate is 60 * 1000 / 1001 (59.94 Hz). See also https://www.repairfaq.org/samnew/tvfaq/tvwinsswf.htm On PAL systems, which have different field and line rates, the CPU is clocked at 7.09379 MHz.
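The frequency chain above can be checked numerically. A minimal sketch (variable names are mine, not standard terminology):

```python
# NTSC frequency chain leading to the Amiga's CPU clock.
field_rate = 60 * 1000 / 1001      # 59.94 Hz (fields per second)
line_rate = 262.5 * field_rate     # ~15734.27 Hz (lines per second)
color_burst = 455 / 2 * line_rate  # ~3579545.45 Hz = 3.579545 MHz
cpu_clock = 2 * color_burst        # ~7159090.9 Hz = 7.15909 MHz

print(round(color_burst))  # 3579545
print(round(cpu_clock))    # 7159091
```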
When color video was introduced in the USA, the horizontal scan rate was set as precisely 15750 * (1000/1001) Hz, i.e. roughly 15734.27 Hz, and the color sub-carrier frequency (also called the chroma clock) was defined as 227.5 times the horizontal scan rate, i.e. roughly 3579545.4545 Hz. Many computers of that era use a multiple of the chroma clock as the pixel clock (the Amiga, for example, uses 4x chroma, or 14318181.818 Hz). On systems where video generation shares a memory bus with everything else, it's generally necessary that a fixed relationship exist between the video frequency and the CPU clock. In the case of the Amiga, the system clock is 1/2 the dot clock (2x chroma), which is a bit faster than some earlier machines, though the 68000 doesn't do as much in each clock cycle as some other processors. Some other typical machines:
The IBM PC had video memory that was separate from the main memory, and was designed to allow the video clock to be independent of the CPU clock. Nonetheless, the original PC with a CGA card used a dot clock of chroma*4, and a CPU clock of chroma*4/3.
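These clock relationships are easy to verify in a few lines. A sketch of the arithmetic (names are mine):

```python
# All the clocks mentioned above derive from the NTSC chroma
# (color sub-carrier) frequency: 227.5 times the line rate.
chroma = 15750 * (1000 / 1001) * 227.5  # ~3579545.45 Hz

amiga_dot_clock = 4 * chroma   # ~14318181.8 Hz pixel clock
amiga_cpu_clock = 2 * chroma   # ~7159090.9 Hz system clock
cga_dot_clock = 4 * chroma     # the PC's 14.31818 MHz crystal
pc_cpu_clock = 4 * chroma / 3  # ~4772727.3 Hz, the PC's famous 4.77 MHz

print(round(pc_cpu_clock))     # 4772727
```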
The architecture of most "color computers" of the 70s-80s was very tightly built around the NTSC color video standard. Almost all of them had a 14.31818 MHz crystal. Note that this is four times the 3.579545 MHz sub-carrier frequency of the NTSC color standard, which was called a "color clock". They divided that crystal frequency to derive their actual clock frequency. For instance, the Apple II divided it by 14 to run its 6502 at about 1.023 MHz, and the Atari 400/800 divided it by 8 to get about 1.79 MHz.
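A quick numerical sketch of that divide-down scheme (the specific divisors shown are well-known values from published hardware references, stated here as assumptions rather than taken from the text above):

```python
crystal = 14.31818e6         # the ubiquitous NTSC-derived crystal, in Hz

apple_ii_cpu = crystal / 14  # ~1.023 MHz 6502 (Apple II)
atari_cpu = crystal / 8      # ~1.79 MHz 6502 (Atari 400/800)
amiga_cpu = crystal / 2      # ~7.159 MHz 68000 (Amiga)

print(round(apple_ii_cpu))   # 1022727
```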
Why didn't these color computers just operate asynchronously and run the CPU at its maximum rated speed, while the video system ran on the color clock? Because memory was at a premium in those days, so they used memory-mapped video. This was dual-ported in the simplest possible scheme**, which required the memory clock to be in lockstep with the video system. The Apple II even used the video system to accomplish dynamic memory refresh.

So the Amiga, the last of the machines built with the "color computer" mindset (heh, speaking of another), deliberately chose a CPU speed again divided down from that same familiar crystal, and again lockstepped to the NTSC color clock. This allowed them to leverage all the color-computer design that had come before, rather than having to reinvent the wheel. It's difficult to imagine, in this day and age of 4K video, gigabytes of video RAM, and GPUs that can mine Bitcoin faster than the CPU, that we were so pious in our worship of the NTSC standard. Now that we no longer need the thing, we can admit it was kind of a lousy standard.

** To digress on pre-Amiga-age dual-porting: on the Apple II, the video system got every other RAM cycle. On the Atari, the far more sophisticated ANTIC chip interrupted the CPU to take the memory cycles that video needed, and it also took care of dynamic memory refresh, which allowed linear mapping of video memory instead of the goofy Apple mapping (that mapping provided the intended side effect of dynamic memory refresh, a genius double-use which was Wozniak's stock in trade). This meant available CPU power varied with the display mode and with whether you were in vertical blank. In a lot of Atari games, the CPU "rode the raster": it spent its very limited cycles cueing up the changes to colors or sprites that would happen on the next scan line, waited for horizontal blank, executed the changes, and cued up the next.
All the computational work of actually playing the game happened in low-attention areas like the bottom scoreboard and during the vertical blank period (a significant amount of time when the raster was off the visible screen). All of this was coded in assembler, of course, with a great deal of painstaking cycle-counting to make sure the operations could happen in the requisite time.