Hacker News

cf100clunk · yesterday at 9:52 PM · 1 reply

Analogue interlaced-scan TV systems like PAL and SECAM were actually "higher" definition than NTSC by visible line count (576 vs 480 active lines), although their 25 Hz frame rate (50 Hz field rate) showed noticeably more flicker than NTSC's ~30 Hz frames (~60 Hz fields), which was much closer to the human eye's comfort level.

There was a prototype 819-line analogue "high definition" system (Electronovision) used to record The T.A.M.I. Show in 1964, with excellent results, but the recordings were transferred to film for distribution since there was no apparatus for broadcasting them:

https://en.wikipedia.org/wiki/T.A.M.I._Show

There were also experiments by NHK of Japan with analogue HD broadcasting (the MUSE/Hi-Vision system), but digital TV was so close on the horizon that the effort was rendered moot.

"High definition" has been a relative term in the professional TV world all along, but it became a consumer buzzword with the advent of digital TV in the early 2000s. Nowadays we understand it to mean 720, 1080, or more lines, usually in progressive scan.


Replies

account42 · today at 11:09 AM

> Analogue interlaced-scan TV systems like PAL and SECAM were actually ''higher'' definition in relation to NTSC by visual line count, although the former's 25Hz refresh rate was noticeable for flickering compared to NTSC's ~30Hz, which was much closer to the human eye's comfort level.

Yet motion pictures are still stuck at 24 FPS to this day, and some people even hold strong opinions that this is a good thing.

Also, just because NTSC was 29.97 Hz doesn't mean the video content actually was - almost everything shot on film was effectively 23.976 fps, telecined to 59.94 fields per second via 3:2 pulldown, which doesn't change the number of unique full frames.
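The arithmetic behind that can be sketched in a few lines of Python (the function and constant names here are illustrative, not from any real video library): in 3:2 pulldown, alternating film frames are held for 3 fields and 2 fields, so 24 film frames become 60 fields, and the NTSC-adjusted film rate of 24000/1001 fps lands exactly on 59.94 fields per second.

```python
# 3:2 pulldown arithmetic sketch: film frames alternately emit 3 and 2 fields,
# averaging 2.5 fields per film frame.
FILM_FPS = 24000 / 1001   # ~23.976 fps, the NTSC-adjusted film rate
PULLDOWN = [3, 2]         # fields emitted per film frame, alternating

def fields_per_second(film_fps):
    # average fields per film frame = (3 + 2) / 2 = 2.5
    avg_fields = sum(PULLDOWN) / len(PULLDOWN)
    return film_fps * avg_fields

fields = fields_per_second(FILM_FPS)   # ~59.94 fields/s
frames = fields / 2                    # ~29.97 interlaced frames/s
print(round(fields, 2), round(frames, 2))
```

Note that the count of *unique* images per second is still FILM_FPS; the pulldown only repeats fields to fit the broadcast cadence.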