
Posted by TLC on 2023-06-04 18:06:32
Re: Emulation video option you never knew you wanted until now!

@MMS I just had to get around to doing it someday... :-)

The 1084S does indeed look better, but that's admittedly partly because the saturation setting is a bit off (low) on the TV. Maybe I should check and readjust it according to the service manual someday. (In past years I typically used the 1084S for retro, except when RF was strictly needed.)

The chroma crosstalk is due to a design flaw of the 264 series PCBs (all models). In Commodore's designs, signals coming from and going to the outside world were routinely loaded with ferrite bead filters, to get rid of possible radio-frequency interference noise. (That was done as much to prevent the machine from picking up external RF interference as to protect the outside world from RF noise generated by the machine. Ultimately, it was all done to get the design(s) through FCC certification, a prerequisite for mass production and sale.)

Now, on all 264 series PCBs, there's one extra ferrite bead filter (relative to either the C64 or the VIC-20 design) in series with the common ground point of the video socket, pin 2 of the 262° DIN-8. (See: FB7 on the Plus/4's schematic, FB13 in the C16 and the C116.) That was one bad idea, or maybe a simple mistake, I don't know. A ferrite bead has a very pronounced frequency response; that's exactly why it's used to get rid of high-frequency components in the first place :-). In this (flawed) setup, the common ground point has a non-zero output resistance, which also varies noticeably with the operating frequency. The result is crosstalk, i.e. there'll be remnants of each output signal on all the other output signals. Due to the frequency response of the common ground point, crosstalk from high-frequency signals into low(er)-frequency signals is the most pronounced, which primarily means chroma crosstalk into both luma and sound (but luma is probably the most noticeable).

What you can see from this (if you use a good, high-resolution CRT display, and either a luma-only (monochrome) or a separate luma-chroma connection) is a somewhat "grainy" surface look on all coloured areas of the screen. (Black, grey, and white will look completely smooth, since their corresponding colour subcarrier is a flat 0V, but all the coloured fields, where colour is nonzero, will look a bit grainy, or should I call it patterned, due to the chroma overlay added by the crosstalk.) If one replaces the FB component with a wire (or merely shorts its pins together with solder) on the PCB, the problem is gone. From that point on, all colour fields look perfectly smooth, no graininess.
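To put rough numbers on that mechanism, here's a toy model in Python (my own sketch; the bead curve and the 75 ohm source/load figures are plausible guesses, not values measured on a 264 board). The bead in the shared ground return acts as the bottom leg of a voltage divider, so the fraction of a signal that leaks onto the other outputs grows with frequency:

def bead_impedance(f_hz, z_at_100mhz=600.0):
    """Crude ferrite bead model: impedance grows roughly linearly with
    frequency below resonance (600 ohm at 100 MHz is a common part)."""
    return z_at_100mhz * (f_hz / 100e6)

def ground_crosstalk(f_hz, source_r=75.0, load_r=75.0):
    """Fraction of a signal at f_hz that drops across the shared ground
    bead instead of the load, and therefore rides along on every other
    output referenced to that same ground point."""
    z_gnd = bead_impedance(f_hz)
    return z_gnd / (source_r + load_r + z_gnd)

# PAL chroma subcarrier vs. a low-frequency luma/audio component:
for f_hz in (4.43e6, 50e3):
    print(f"{f_hz/1e6:6.3f} MHz: {ground_crosstalk(f_hz)*100:5.2f}% leaks via the common ground")

With these (made-up) numbers, the ~4.43 MHz chroma leaks roughly two orders of magnitude more than a 50 kHz component, which matches what you see on screen: the high-frequency chroma is what ends up overlaid on luma and sound.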

Re. non-standard luma and chroma levels, these typically don't affect traditional analog TVs or monitors in practice. There's a simple reason for that. Back then, when video signals and colour encoding systems were designed, people knew that neither RF transmission systems nor analog electronics (especially old electron-tube based circuits) were ever stable over time (or at all). (RF field strengths change, signal levels change, amplification curves change, resistances change, etc. etc.) That meant that, in practice, signals and processes had to be designed around references embedded in the signals, rather than arbitrarily selected absolute values, and electronics also had to be designed to cope with a lot of tolerance.

Now. The composite colour signal does indeed have a maximum value by the standard. At the same time, however, colour demodulation in practice is not based on absolute values, but on a signal part called the "colour burst", which is transmitted together with the content and is used as a reference for demodulation. The TV keeps identifying this section of the TV lines (somewhere at the start of each line), and uses the burst's phase and amplitude to adjust its own internal references, which in turn are used to demodulate the line's colours. That means a given colour signal will be demodulated to the same actual colours regardless of signal amplitude; 0.3 or 0.9V, that all doesn't really matter. What really matters is the relation between the colour burst and the content.

At the same time, due to design tolerances (...who said there were guarantees for levels that came from demodulated aerial signals?), TVs traditionally didn't limit signal levels to the predefined standard maximum; they usually allowed for considerably higher absolute signal levels before they'd start clipping and distorting (practical limits varied). That means that colour signal amplitudes even 50-100% larger than standard were still very likely to be demodulated without a glitch, as long as the proportions between the colour content and the burst were otherwise kept. Modern LCD TVs and other gadgets might or might not tolerate such high signal amplitudes anymore, depending on design. TL;DR: a high colour signal amplitude usually didn't pose a problem for classic (analog, CRT) displays; it was within their design tolerances.
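Here's a minimal sketch of that burst-relative principle in Python (my own illustration, not any real TV chipset's algorithm, and it only demonstrates the amplitude side; phase is handled analogously against the burst's phase). The decoder ratios the picture chroma against the measured burst amplitude, so scaling the whole signal up or down simply cancels out:

import numpy as np

FSC = 4.43361875e6            # PAL colour subcarrier frequency, Hz
FS  = 8 * FSC                 # sample rate for the simulation
t_burst = np.arange(0, 10 / FSC, 1 / FS)   # ~10 cycles of colour burst
t_line  = np.arange(0, 40 / FSC, 1 / FS)   # a stretch of picture content

def demodulated_saturation(scale):
    # Burst and picture chroma scale together with the overall signal
    # level ('scale' stands in for nonstandard output levels, cable
    # losses, etc.).
    burst  = scale * 0.3 * np.sin(2 * np.pi * FSC * t_burst)
    chroma = scale * 0.5 * np.sin(2 * np.pi * FSC * t_line + 0.8)
    # The decoder measures the burst's amplitude (the embedded
    # reference)...
    burst_amp  = np.sqrt(2) * burst.std()
    # ...and judges the chroma amplitude *relative* to it, so the
    # common factor 'scale' cancels out.
    chroma_amp = np.sqrt(2) * chroma.std()
    return chroma_amp / burst_amp

for scale in (0.5, 1.0, 2.0):    # absolute signal level: doesn't matter
    print(f"signal scaled by {scale}: saturation = {demodulated_saturation(scale):.3f}")

All three runs print the same saturation figure, which is exactly the point: as long as burst and content stay in proportion, the absolute amplitude doesn't change the demodulated colour.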

But I'd probably better not hijack MIK's thread with this... :-)


