Talk:NTSC
Length
This article is way too long. Historical aspects of the evolution of NTSC, including all explanations of why certain aspects were defined that way, should be in a separate 'History of NTSC' article, and comparisons to other systems in a separate article; framerate conversion info should be dropped and replaced by links to the Television standards conversion and Telecine articles. NewRisingSun (talk) 16:24, 21 November 2007 (UTC)
How to organize encyclopedia material for such a complex topic
I think we need to get into the head of the person reading this. Starting with what a lay person is likely to have experienced in his/her own everyday life should come first. In any case, I think an outline would be a good place to start. I noticed that the article starts with many ideas that are foreign to most people. So the article should start off gently. As with most informative text, simple to complex is the best approach, as long as the simple text is also true and correct. And avoiding tightly interrelated forward references as much as possible greatly serves the learning experience of someone reading the article. Coverage could go something like this? (I know, my wording is clumsy.) I include [comments] between brackets. --208.252.179.21 (talk) 16:23, 16 March 2009 (UTC)--Ohgddfp (talk) 16:29, 16 March 2009 (UTC)
The term 'NTSC' has several practical meanings depending on the context.--208.252.179.21 (talk) 16:23, 16 March 2009 (UTC)--Ohgddfp (talk) 16:29, 16 March 2009 (UTC)
For the public in general, the 'NTSC' term is imprecise industry jargon meant to convey a 'TV standard' or 'video standard'. The purpose of a standard is to ensure that TV broadcast signals, prerecorded media such as DVDs and VCR tapes, and equipment that conform to the same standard are compatible with one another, in order to prevent operational failures such as scrambled pictures, missing color, no sound and the like.--208.252.179.21 (talk) 16:23, 16 March 2009 (UTC)--Ohgddfp (talk) 16:29, 16 March 2009 (UTC)
For local analog over-the-air TV signals, a few countries, such as the U.S. and Japan, used the NTSC standard, while most others used other analog standards such as 'PAL' and 'SECAM'. Mixing standards, such as attempting to use an older PAL TV to receive an NTSC TV signal, results in partial or complete operational failures such as scrambled pictures, no color, or missing colors. This is why PAL/NTSC combination equipment is often found at airports around the world: travelers often end up with equipment of the wrong standard for the local media and TV signals.--208.252.179.21 (talk) 16:23, 16 March 2009 (UTC)--Ohgddfp (talk) 16:29, 16 March 2009 (UTC)
[Comment: The above usually satisfies what most people want to know in a practical sense, and for that reason something like it should appear first in the article. In the practical sense, it is not tightly interrelated with other facts, and so is not really a forward reference.]--208.252.179.21 (talk) 16:23, 16 March 2009 (UTC)--Ohgddfp (talk) 16:29, 16 March 2009 (UTC)
[Comment: In order to understand NTSC in a practical sense, some of the political reasons, such as those actually debated in the U.S. congress, should be reported.]--208.252.179.21 (talk) 16:23, 16 March 2009 (UTC)--Ohgddfp (talk) 16:29, 16 March 2009 (UTC)
'NTSC' stands for 'National Television System Committee', which was, on two separate occasions, an industry group set up by the U.S. government (FCC) to write federal regulations governing the engineering specifications of local over-the-air analog TV broadcast signals. The purpose of these regulations was to ensure interoperability between the transmitted TV signal and the TV sets receiving that signal, fulfilling a basic public interest of having any TV set work correctly with any TV broadcast signal. The work of the first NTSC became federal regulation in 1941, authorizing only monochrome (black and white) broadcasting. The work of the second NTSC, which we can call 'color NTSC' here, became federal regulation in 1953, authorizing a system of color broadcasting to replace another color TV standard known as the CBS color system. --208.252.179.21 (talk) 16:23, 16 March 2009 (UTC)--Ohgddfp (talk) 16:29, 16 March 2009 (UTC)
Nine years after the NTSC monochrome TV standard was approved, the CBS color system was approved by the federal government in 1950. Unfortunately, unlike the case today, the CBS color system was not backward-compatible with the millions of monochrome TV receivers already in the viewing audience. These older monochrome (black and white) sets failed to work with the CBS color telecasts, even in black and white. Interestingly, CBS color receivers themselves were dual-standard, making them backward compatible so as to receive monochrome NTSC telecasts for display in black and white.--208.252.179.21 (talk) 16:23, 16 March 2009 (UTC)--Ohgddfp (talk) 16:29, 16 March 2009 (UTC)
[ The above are the simple facts, all verifiable of course. ]--208.252.179.21 (talk) 16:23, 16 March 2009 (UTC)--Ohgddfp (talk) 16:29, 16 March 2009 (UTC)
[Comment: Note that talk about mechanical spinning 'color wheels' and electronic versus mechanical reproduction was all in the press; it was actually a red herring and political posturing, because in the lab at least, I'm sure that an all-electronic system could be adapted to receive the CBS telecasts. There is information I have read attesting to this going back 50 years. Perhaps reporting on both the political posturing and the articles refuting it is in order? If so, it's really a side show, and should only be included later in the article after a quick explanation of how color TV works, because, being a red herring, it would serve the same purpose it did 50 years ago in the press, which is to confuse. And the last thing an encyclopedia article should do is confuse. The real problem, also covered widely in the press, was backward compatibility.]--208.252.179.21 (talk) 16:23, 16 March 2009 (UTC)--Ohgddfp (talk) 16:29, 16 March 2009 (UTC)
While the public may have thought that going from black and white to color was as easy as replacing black and white film with color film for photography, the technical hurdles for TV technology were actually huge, requiring many tens of millions of dollars of industry investment to make it practical. [Comment: Is this conjecture?]--208.252.179.21 (talk) 16:23, 16 March 2009 (UTC)--Ohgddfp (talk) 16:29, 16 March 2009 (UTC)
NTSC is just a colour encoding system!
This seems to be lost on a lot of people, and this article doesn't do much to explain it. NTSC is purely related to how colour is encoded over luminance. NTSC-M is the North American broadcast standard, and it's the M rather than the NTSC that designates number of lines, frames per second, etc. The third paragraph of the PAL article makes this clear, but it is only briefly mentioned in this NTSC article under the 'Refresh rate' section: 'The NTSC format (or more correctly the M format; see broadcast television systems) consists of 29.97 interlaced frames'.
Given that it's the M format that determines this, and not NTSC, I propose that the section on refresh rate be moved to Broadcast television system and that only colour-related technical details be left in this article. I also propose that the third paragraph of the PAL article be adapted and included in this NTSC article. The opening paragraph of this article should also be modified, as it doesn't mention colour at all! Balfa 20:09, 21 June 2006 (UTC)
The NTSC standard does include line and field rates, but to use PAL as synonymous with the 625-line system is wrong - PAL is just a colour coding system, not a line standard. Not all PAL systems are 625-line, and not all 625-line systems PAL. The author should have known that they are not interchangeable terms.
1) NTSC stands for 'National Television System Committee', which was an industry group set up by the U.S. government to create an 'engineering standard' that would be incorporated into the federal rules and regulations of local over-the-air analog TV broadcasting stations. (The purpose of an 'engineering standard' is to ensure interoperability between the transmitted signal and television receivers, in order to prevent operational failures such as scrambled pictures.)--208.252.179.24 (talk) 21:31, 13 March 2009 (UTC) 208.252.179.26 (talk) 01:26, 16 March 2009 (UTC) me. Ohgddfp (talk) 01:48, 16 March 2009 (UTC)
2) The first NTSC established a monochrome-only engineering standard. This is where several of the engineering specifications originated: number of lines (525), number of active lines (approximately 483), video bandwidth (somewhere between approximately 3 MHz and 4.3 MHz), 2:1 interlace, horizontal blanking period (somewhere between approximately 8 and 12 microseconds), FM sound located 4.5 MHz higher than the visual carrier frequency, maximum FM deviation of + or - 25 kHz, preemphasis for the FM sound, positive sync with blanking at 75 percent of carrier, some kind of logarithmic transfer function for the video, and amplitude modulation using negative polarity for the composite video. Spectrum spacing of visual carriers was to be 6 MHz apart. This was all the law of the land in 1941. I'm not sure if the 'M' designation was even in use at that time.--208.252.179.24 (talk) 21:31, 13 March 2009 (UTC) 208.252.179.26 (talk) 01:26, 16 March 2009 (UTC) me Ohgddfp (talk) 01:48, 16 March 2009 (UTC)
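The 1941 monochrome parameters enumerated above can be collected into a small table for reference. A sketch in Python; the values and approximate ranges are exactly those quoted in the comment, not taken from the regulation text itself:

```python
# 1941 NTSC monochrome parameters as quoted in the comment above.
# Tuples are the approximate ranges stated there.
ntsc_1941 = {
    "total_lines": 525,
    "active_lines_approx": 483,
    "video_bandwidth_mhz": (3.0, 4.3),   # approximate range quoted
    "interlace": "2:1",
    "h_blanking_us": (8, 12),            # approximate range quoted
    "sound": "FM, 4.5 MHz above visual carrier, +/-25 kHz deviation, preemphasis",
    "video_modulation": "AM, negative polarity, blanking at 75% of carrier",
    "channel_spacing_mhz": 6,
}

for key, value in ntsc_1941.items():
    print(f"{key}: {value}")
```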
3) A few years later, the second NTSC, which we can call 'color NTSC', added color information to the transmitted signal in the form of a transparent watercolor-like overlay signal (called color difference signals) on top of the existing monochrome picture signal. This achieved backward compatibility with the previous monochrome-only service, which was the political reason for the invention of color NTSC. The engineering standards for the monochrome signal remained essentially the same as before. The two kinds of signals, monochrome and color-difference, remained effectively separate while being transmitted on the same channel of a color program. For backward compatibility, an older monochrome receiver would simply use only the monochrome part of the color broadcast signal to display the program in black and white. The color receiver would utilize both the monochrome portion and the color-difference portion together so that the image would appear in full color. The reason the color receiver needs the monochrome portion of the signal is to provide information on the lightness and darkness of the colors. This is the same as what a black and white photo accomplishes. When manually colorizing a black and white photo using transparent watercolor paints, as was popular many years ago, the transparency of the paints allowed the lightness (as with yellow) or darkness (as with violet) of the color to be determined by the black and white photo, while the other qualities of the color (saturation and hue) were determined by the watercolor paints. In the same way, an NTSC color receiver overlays color-difference attributes on top of the monochrome picture.
This technique has several benefits, one of which is backward compatibility: the color receiver can receive monochrome broadcasts for display in black and white simply because a monochrome broadcast has no color difference signal to produce a colorizing overlay.--208.252.179.24 (talk) 21:31, 13 March 2009 (UTC) 208.252.179.26 (talk) 01:26, 16 March 2009 (UTC) me Ohgddfp (talk) 01:48, 16 March 2009 (UTC)
4) Adding more information, like color-difference attributes, to a monochrome TV signal already occupying the full assigned bandwidth was once thought to force an increase in the spectrum bandwidth in order to avoid self-interference within the channel. But increasing the bandwidth causes wider spectrum spacing between carriers. This in turn would have forced some stations off the air to make room in the spectrum, reducing viewers' choice of programming. Even worse, the change of carrier frequencies would have caused existing monochrome receivers to malfunction, destroying backward compatibility. So the NTSC had no choice but to maintain the same bandwidth and add the color difference information anyway, causing intentional self-interference. This can be seen as occasional fine dot crawl, especially at the borders of strong contrasting colors, and false moving rainbow colors running through fine woven clothes patterns and other finely detailed patterns. This cross-color interference is one of the chief criticisms of NTSC picture quality, even though many people do not notice these defects.--208.252.179.24 (talk) 21:31, 13 March 2009 (UTC) 208.252.179.26 (talk) 01:26, 16 March 2009 (UTC) me Ohgddfp (talk) 01:48, 16 March 2009 (UTC)
The reason many people do not often notice these NTSC picture quality defects is that the NTSC borrowed a clever technique called 'frequency interleaving', and combined it with some other benefits provided by breaking up the picture information into luminance attributes (the monochrome image as explained above) and color-difference attributes (the watercolor-like overlay). The color-difference information is coded at the TV station into a 'chroma' signal which, because it occupies the same general frequency range as fine-detailed luminance, is not easy to separate, and consequently appears as a finely detailed monochrome interference dot-crawl pattern on the viewing screen of both color and monochrome receivers. [photo examples here]. Color receivers decode the chroma signal back into the color-difference attributes of the picture, and then overlay these attributes on top of the monochrome image to form a full-color picture. But unfortunately, the dot-crawl interference still appears in the monochrome image. The genius of the dot-crawl interference pattern is that it is finely detailed and seems to move, making it difficult for the untrained human eye to follow or see. Also, the pattern is strong only in those areas of the scene where the color is strong (highly saturated). A gray object in the scene has no color, therefore no color-difference attributes, therefore no chroma, and therefore no dot-crawl. In fact, careful close viewing of the dot-crawl on a black and white receiver reliably shows which objects in a scene are strongly colored. On a monochrome receiver, a trained observer can actually examine dot-crawl, or the lack of it, to determine whether a movie is transmitted in black and white or color. [photo examples here].--208.252.179.24 (talk) 21:31, 13 March 2009 (UTC) 208.252.179.26 (talk) 01:26, 16 March 2009 (UTC) me Ohgddfp (talk) 01:48, 16 March 2009 (UTC)
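The frequency-interleaving trick described above can be checked numerically: the subcarrier is an odd multiple (455) of half the line frequency, so chroma energy falls exactly between the clusters of luminance energy at whole multiples of the line frequency. A minimal sketch using Python's exact rational arithmetic (the 63/88 subcarrier definition is the exact modern value; the 1953 definition differs by a fraction of a hertz):

```python
from fractions import Fraction

f_sc = Fraction(5_000_000) * 63 / 88   # color subcarrier, Hz (exact)
f_h = f_sc * 2 / 455                   # horizontal line frequency, Hz

# Luminance energy clusters at whole multiples of f_h; chroma sits at
# 455 * (f_h / 2), and since 455 is odd the chroma clusters land exactly
# halfway between the luminance clusters -- that is the interleaving.
offset = (f_sc / f_h) % 1
assert offset == Fraction(1, 2)

print(float(f_sc), float(f_h))   # ~3579545.45 Hz, ~15734.27 Hz
```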
So the NTSC also had to make sure the dot-crawl interference pattern would be finely detailed, to reduce its visibility. This means that the chroma portion of the signal must have a narrow bandwidth, and therefore the color-difference information itself must carry only one-third to one-eighth of the information present in the luminance image. But this makes the colors bleed into each other horizontally. Back to colorizing a black and white photograph with transparent watercolor paints: it's amazing that a sloppy application of the paints, such that they bleed a little into each other, still results in what appears to the human eye as a perfectly colorized, detailed full-color image with no color bleeding. That's because the human eye cannot see detail in the color hue and color saturation variations provided by the watercolor, or by the color-difference attributes in the case of TV. For human vision, all the fine detail is only in the monochrome (black and white) image. So the federal rules and regulations written by the NTSC mandate certain colors (orange and cyan on the I axis) to have a bandwidth of at least 1.3 MHz, which is substantially narrower than the 4 MHz bandwidth of the luminance channel. Even though it's still on the books with the force of law today in 2009, TV stations generally ignore this federal regulation by reducing bandwidth too far. Other color difference combinations require at least a 0.6 MHz bandwidth. Amazingly, many TV stations ignore even this federal law as well by reducing bandwidth too far (see broadcast U-matic analog videotape format) (see also NTSC-to-digital TV signal converters used in digital TV broadcast stations). A lot of NTSC picture quality complaints come from these violations of federal regulations. (Is this opinion or fact? Need a source.)--208.252.179.24 (talk) 21:31, 13 March 2009 (UTC) 208.252.179.26 (talk) 01:26, 16 March 2009 (UTC) me Ohgddfp (talk) 01:48, 16 March 2009 (UTC)
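To put rough numbers on the bandwidth figures above: horizontal detail is limited to about two samples per cycle of the highest video frequency (the Nyquist limit), over the active portion of each scan line. A back-of-envelope sketch; the ~52.7 microsecond active line time is an assumed typical value, not from the regulation, so the results are approximate:

```python
ACTIVE_LINE_S = 52.7e-6   # assumed active (visible) picture time per line

def h_samples(bw_hz: float) -> float:
    """Nyquist-limited horizontal samples per line for a given bandwidth."""
    return 2 * bw_hz * ACTIVE_LINE_S

luma = h_samples(4.2e6)   # luminance: ~443 distinct samples per line
i_ax = h_samples(1.3e6)   # I axis (orange-cyan): ~137 samples
q_ax = h_samples(0.6e6)   # Q axis: ~63 samples

print(round(luma), round(i_ax), round(q_ax))
print(f"I carries ~{i_ax / luma:.0%} of luma detail, Q ~{q_ax / luma:.0%}")
```

The ratios (roughly 31% and 14%) line up with the "one-third to one-eighth" figure quoted in the comment.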
NTSC Standard
The following was formerly in NTSC standard which, being redundant, I have redirected to NTSC:
NTSC standard: Abbreviation for National Television Standards Committee standard. The North American standard (525-line interlaced raster-scanned video) for the generation, transmission, and reception of television signals.
All the early books call it 'National Television System Committee'. Ohgddfp (talk) 01:39, 14 March 2009 (UTC)
I was told, by some 'old timers' (the chief engineers that wanted nothing to do with the new Japanese video products I was hawking), that it stood for 'National Television Standing Committee'.
Of course the 'standing' joke was 'Never Twice the Same Color'. 72.155.80.11 19:14, 17 August 2007 (UTC)
Exact frame rate
I think it should be made more clear that the actual frame rate of NTSC is 30/1.001 Hz. 29.97 Hz is an inexact approximation. That 29.97 is the actual rate is a common misconception which I would prefer not to propagate. Dmaas 16:39, 7 January 2005
There are two exact frame rates. In 1953 it was exactly 7159090 / ( 455 * 525), which is approximately 29.970026164312 Ohgddfp (talk) 02:17, 14 March 2009 (UTC)
In about 1978 or so, the subcarrier was changed to exactly 5000000 * 63 / 88 Hz -- see Code of Federal Regulations Part 73.682, 'TV transmission standards'. Ohgddfp (talk) 02:17, 14 March 2009 (UTC) Ohgddfp (talk) 02:43, 14 March 2009 (UTC)
So after 1978 the frame rate is exactly 30000/1001, which is approximately 29.970029970029970029.. Ohgddfp (talk) 02:43, 14 March 2009 (UTC)
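Both exact rates quoted above can be verified with exact rational arithmetic. A quick sketch (frame rate = twice the subcarrier divided by 455, then divided by 525 lines, per the relationships stated elsewhere on this page):

```python
from fractions import Fraction

# 1953 definition: subcarrier 3579545 Hz, frame rate derived from it.
frame_1953 = Fraction(7_159_090, 455 * 525)   # = 2 * 3579545 / (455 * 525)

# Post-~1978 definition: subcarrier exactly 5 MHz * 63/88.
f_sc = Fraction(5_000_000) * 63 / 88
frame_now = f_sc * 2 / 455 / 525

assert frame_now == Fraction(30000, 1001)
print(float(frame_1953))   # 29.970026164...
print(float(frame_now))    # 29.970029970...
```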
Why is it that PAL at 50 fields per second produces 25 frames per second, while NTSC at 60 fields per second gives 29.97 frames?
NTSC is not 60 fields per second. It's about 59.94 fields per second. Ohgddfp (talk) 02:43, 14 March 2009 (UTC)
And if I got this wrong, then what is the reason that NTSC has 29.97 frames, which is not half of 60? This is not explained in this article or any related ones, like PAL, Interlace, Frame rate. Someone who knows, please clarify. --Preceding unsigned comment added by Tapir666 (talk - contribs) 15:51, 12 September 2007 (UTC)
The subcarrier is always 455 times half the horizontal scan frequency. With the subcarrier frequency at approximately 3579545 Hz, and considering 525 lines, this puts the field rate at the weird number of about 59.94 fields per second, and the frame rate, exactly half the field rate, at approximately 29.97 frames per second. The reason for the weird numbers is very surprising -- it's because of the sound. The subcarrier frequency was chosen so that not only would frequency interleaving take place with the video, but, surprisingly, so that minor intermodulation distortion in the receiver, beating against the 4.5 MHz difference between the visual and AUDIO carriers, would produce a roughly 920 kHz interference beat that is ALSO FREQUENCY INTERLEAVED, WHEN THERE IS NO AUDIO MODULATION. But with talking, this trick breaks down and the interference beat becomes more visible. So to reduce intermodulation distortion, the audio carrier is 40 dB weaker than the video carrier at strategic points within the receiver. So it's the SOUND that is the reason for the weird scan frequencies. The designers could have ignored the sound issue and used a slightly different subcarrier frequency to put the rates at exactly 30 and 60 Hz, but then the sound carriers going through the same amplifiers as the video carriers (to save on receiver costs) would have needed more careful design and more expensive TV receivers. Ohgddfp (talk) 02:43, 14 March 2009 (UTC)
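The arithmetic behind this can be sketched numerically. Assuming the usual derivation, where the color line rate was set to 4.5 MHz / 286 so that the sound spacing is a whole multiple of the line frequency, the sound/chroma beat lands on an odd multiple of half the line frequency and therefore frequency-interleaves just like chroma itself:

```python
from fractions import Fraction

f_h = Fraction(4_500_000, 286)     # line frequency, ~15734.27 Hz
f_sc = Fraction(455, 2) * f_h      # chroma subcarrier, ~3579545.45 Hz
beat = 4_500_000 - f_sc            # sound/chroma beat, ~920.45 kHz

field = f_h * 2 / 525              # ~59.94 fields/s (2 fields per frame)
frame = field / 2                  # ~29.97 frames/s

# beat = 58.5 * f_h = an odd multiple (117) of f_h/2 -> interleaved
assert (beat / f_h) % 1 == Fraction(1, 2)

print(float(field), float(frame), float(beat))
```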
Unsharp
Why is NTSC so unsharp and blurry compared to PAL? --Abdull 21:17, 17 Jan 2005 (UTC)
The question should really be 'why are 525-line systems blurry compared to 625-line ones?' 625-line systems use 20-50% more horizontal and 20% more vertical resolution than 525-line systems. This is a function of the line number and system bandwidth rather than the colour system used. 625-line NTSC, which I have seen, should be sharper than 625-line PAL, because the luminance and chrominance do not overlap as much, so they can be separated more efficiently.
I concur that 625-line NTSC will be sharper than 625-line PAL. That's because the better comb filters don't work as well with PAL's 8-field sequence. But all in all, PAL usually looks better because of an even bigger reason: European producers are more careful with their engineering. I've seen broadcast NTSC done right, which is very rare. And at normal viewing distances with a 27 inch screen, it has been mistaken for HDTV. The biggest mistakes are 1) SYSTEM modulation transfer function (a COMPREHENSIVE measure of detail and sharpness), measured from TV camera lens to viewing screen, is not maintained according to any standard, and 2) automatic equalization based on the already-mandated ghost canceling signal (in the vertical blanking interval) is not utilized by receiver manufacturers. The result of these two stupid American mistakes is frequently blurry pictures, even on cable TV systems! Ohgddfp (talk) 02:55, 14 March 2009 (UTC)
Map
Note that the map is cropped such that a good half of New Zealand is missing. Does anyone have the original image? Or is a completely new one necessary to fix this? --abhi 17:18, 2005 Apr 15 (UTC)
Consumer-grade equipment
Reading magazine reviews of both HDTV and SDTV sets, it seems to me that many TV manufacturers deliberately deviate from the NTSC specification; apparently it's normal to have some amount of 'wide angle color demodulation' and 'red boost' inherent in NTSC color decoders; this does not seem to be a common manufacturing practice with PAL devices. That of course contributes to the 'Never the same color' stigma as well; maybe someone with a more in-depth technical understanding of this could write about this in the article. NewRisingSun 16:10, 16 Jun 2005 (UTC)
I concur that American TV makers fool around with color to give images a high-contrast snap in the TV store. This is a stupid American practice that had the practical effect of putting us behind the Europeans. Here's the history. From 1954 to 1961, manufacturers used the correct NTSC phosphors that produce an extremely wide range of colors, including colors people had never seen before. But in 1962, better energy efficiency, and therefore smaller and cheaper power supplies, was effected by using higher-efficiency phosphors at the expense of poorer color reproduction. If the receiver manufacturers had only used a proper degamma / linear matrix / regamma chain for color space conversion, all normal colors would have reproduced just as well. But no. Only a linear matrix was used for color space conversion, which worked for many normal, less saturated colors, but screwed up the more saturated colors. Even in the broadcast studio, around 1967 or so, SMPTE-C phosphors became the broadcast monitor standard, contrary to federal regulations. Actually, federal regulations don't regulate monitors or receivers in this regard, but they do insist that video cameras and encoders be designed to work best on a receiver that uses NTSC phosphors. So camera designers did this, pretty much, and TVs used color space conversion circuitry for the new phosphors. And even these broadcast picture monitors had a 'matrix' switch to turn on the same linear compensation (color-space conversion) for good rendition of normal colors. (Yes, it does make the picture more red than without the color space conversion, but this is correct.) But as in the consumer receivers, strong colors got screwed up, especially all shades of red looking like the same shade of red-orange. And with a hue control (commonly called 'tint'), manufacturers didn't even try to make the decoding circuitry stable.
Sloppy operations at even the network level also contributed mightily to the green and purple people problem. So instead of fixing lousy decoding circuits and making the engineers at transmitter manufacturers do a good job (differential phase problems), manufacturers hit on the idea of making all colors close to flesh tone actually BE flesh tone. This happened in the 1970s, called automatic color adjustment under various cute names, quite a bit later than the phosphor changes of 1962. In the 1970s a new round of much brighter phosphors came about, further reducing maximum color saturation. Some TVs could not reproduce a decent red at all, making those colors red-orange. One RCA set had a red phosphor that was just plain orange. Only mild reds could be reproduced on that set. Combined with engaging the automatic color adjustment system, the 1970s represented the dark ages of accurate color rendition, with most of even the most expensive TVs (Magnavox and Capehart), with auto color engaged, never showing the color green or the color purple EVER. Everything was brown, orange and flesh, with the grass dark cyan and the blue sky light cyan. I repaired a lot of sets during this time. Unbelievable. In the '80s things started to improve. The Japanese imports used what the Europeans use now for phosphors, better than the American sets of the 1970s. But even some Japanese sets had problems with color accuracy. Anyway, now the solid-state studio equipment and the solid-state TV sets have good circuit stability, so a hue control is no longer needed, even though it appears in the menu from sheer inertia. Today, the best NTSC set ever made for accurate color rendition is the 21-inch round 1955 RCA Victor, because it uses full DC restoration with the proper NTSC phosphors and the proper NTSC decoder; the 1954 model used proper I/Q demodulation as well, but had a very small screen.
A very distant second for accurate color are the very expensive Runco home theatre systems. Granted, the old TVs were maintenance hogs, but when fixed correctly, they had the best (most accurate) color, even in a reasonably lit room. Ohgddfp (talk) 03:24, 14 March 2009 (UTC)
Current NTSC sets still have and need a hue control (usually called 'tint'). The instability of the colour displayed by NTSC sets is nothing to do with the design of the set or any changes to the circuitry within. The hue changes are a phenomenon of NTSC colour transmission that will never go away, because they are caused by differential phase distortion in the transmission path.
Indeed, on the multi-standard TV sets now almost universal in Europe, the controls, which are now in menus rather than knobs on the front panel, feature a 'tint' adjustment. It is usually unavailable ('greyed out') in all TV modes except NTSC. 20.133.0.13 (talk) 08:19, 21 September 2009 (UTC)
About 'Current NTSC sets still have and need a hue control (usually called 'tint')': there is an important point to clarify here. Differential phase is NOT an inherent quality of NTSC. What is true is that NTSC is VULNERABLE to EQUIPMENT that has differential phase defects. PAL and SECAM have VERY LOW vulnerability to defective differential phase. But equipment defects are much less common than they used to be. I suppose NTSC needs a tint control if we are talking about OUTSIDE the U.S., but I give a different reason for that. It's due to globalized TV receiver design, combined with poorly trained personnel and/or poorly maintained equipment in other NTSC countries. Here's why. In the old days there were three sources of tint error: 1) By far the biggest was, ironically, manual adjustment of tint, at both the smaller TV stations (burst phase) and in the receivers (tint). 2) Unstable circuits at both the studio and in the receiver, which required that adjustment of tint. 3) Differential phase problems in poorly maintained (direct color) video tape equipment and transmission equipment, which was by far the smallest cause of wrong tint in the U.S. Now TODAY, bad engineering practice in small TV studios, where personnel adjust tint (burst phase) unnecessarily, remains a problem with small companies. But most channels on analog cable TV are from larger companies. So differential phase problems in the main transmitter (or cable RF amp), historically the smallest of the three causes of wrong tint, is the only real problem that remains, and even that is smaller than in the past. So small that in practice, people only rarely fiddle with tint, and then only because someone else fiddled with it before. Since the very act of fiddling with tint (and burst phase) is today MORE LIKELY in the U.S. to cause tint errors than all other problems combined, including differential phase issues in the transmitter, it's better to leave that control out. So why has it not been removed?
The inertia of history: historically, NTSC had a tint control, while PAL and SECAM did not. So it's no surprise that this inertia is carried over today. Tint may actually be REQUIRED in NTSC areas outside the U.S. due to poorly trained operators at TV stations and/or poorly maintained studio and transmitter equipment in those countries. But I would have to research that myself to be sure. Globalized product design, recognizing the needs of other countries besides only U.S. customers, is giving U.S. customers a tint control that they are now better off without.
So what is the point of the foregoing? It's a contrary opinion. What reliable source has given the LOGIC behind the claim that differential phase TODAY is bad enough to require a tint control? First of all, a differential phase problem will indeed respond to a CAREFUL tint adjustment, but only PARTIALLY. Only skin color will be fixed, which makes other colors worse, although correct skin tones make a net improvement. But what if a study shows that, due to a combination of HUMAN NATURE and how small the differential phase problem really is, a tint control is more likely to make tint errors WORSE than no tint control at all? Without some definitive study that can be cited, it looks as if anything along these lines really cannot be put into this kind of article. Ghidoekjf (talk) 02:48, 3 July 2010 (UTC)
Variants, NTSC-J
Um, isn't the stereo audio system also different on the Japanese system? The North American version uses a similar approach to FM stereo (suppressed-carrier AM difference signal added to audio before medium-width FM modulation), but moves the frequencies down so that the stereo pilot tone is the same as the TV line frequency; the Japanese stereo implementation predates this and IIRC inserts a second FM subcarrier onto the audio before the main FM sound modulation step. --carlb 01:01, 22 August 2005 (UTC)
'Pixels' in NTSC
Many readers are familiar with the pixel dimensions of their computer monitors. If the above estimate of 338x483 is accurate, it would be educational to mention it in the article. (I'm insufficiently knowledgeable to confidently do this.)
The graph at the bottom currently claims boldly that VGA and NTSC are the same, 640 x 480. NTSC 'pixels' of course are about a billion times blurrier than any 640x480 VGA monitor. Tempshill 00:04, 30 July 2005 (UTC)
Remember that digital images seen on the Internet are often not separate RGB. Like NTSC/PAL/SECAM, they are also luminance with high detail, combined with lower-detailed color-difference information. Converting this within a computer for delivery to an RGB monitor does not improve matters. Inside an NTSC color receiver, there is also a conversion from luminance/color-difference information to R, G, and B, where red, green and blue video is applied separately on separate wires to the cathode ray tube. Ohgddfp (talk) 03:48, 14 March 2009 (UTC)
More about separate RGB: Some Internet images, not used very much, are called 'bitmapped', with file extension .bmp. Some versions of .bmp are indeed RGB, but are not popular because they use too much bandwidth over the Internet. Very popular is JPEG. This is the one that uses luminance information separated from color-difference information, with less color-difference resolution than luminance resolution. The MPEG-2 type of architecture, used on the Internet and also the basis for digital television, does this as well. Luminance combined with separated color-difference information has been adopted by all analog TV standards around the world, along with most digital Internet video and all broadcast digital television. This is the technique first used by the system that introduced it, NTSC. Ohgddfp (talk) 04:17, 16 March 2009 (UTC)
Here it is: A digital system that can produce all the detail that the over-the-air analog NTSC system can reproduce must have a minimum pixel count of 453 horizontal by 483 vertical. Ohgddfp (talk) 03:48, 14 March 2009 (UTC)
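The 453 x 483 figure above can be roughly reproduced from the broadcast parameters. A minimal sketch, assuming (my assumptions, not stated in the comment) a ~4.2 MHz luminance bandwidth, a ~53.9 microsecond active line time, and 483 active lines out of 525:

```python
# Rough "pixel" equivalent of over-the-air analog NTSC.
# Assumed parameters (not from the comment above):
bandwidth_hz = 4.2e6      # maximum broadcast luminance bandwidth
active_line_s = 53.9e-6   # approximate visible portion of the ~63.6 us line
v_lines = 483             # active lines (525 total minus vertical blanking)

# Nyquist: two samples are needed per cycle of the highest frequency.
h_samples = round(2 * bandwidth_hz * active_line_s)
print(h_samples, "x", v_lines)  # ~453 x 483
```

The horizontal figure is sensitive to the assumed active line time, which is why published estimates of NTSC's "pixel count" vary.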
List of Countries
Diego Garcia isn't in the Pacific. I have no idea which continent it is associated with. --MGS 13:37, 4 August 2005 (UTC)
Diego Garcia is in the Indian Ocean
Explanation of Standard
In the Standard section, voodoo is used to justify the number of lines in an NTSC frame. It should explain why the line frequency is 15750Hz.--66.66.33.245 23:59, 21 August 2005 (UTC)
The 15.75 kHz frequency was used with the original black-and-white NTSC standard. Colour NTSC uses a 15.734 kHz line frequency. 80.229.222.48 10:35, 24 June 2007 (UTC)
Vertical Interval Reference (VIR)
Would it be worth mentioning the 'patches' applied to attempt to fix/salvage NTSC by inserting extra data such as VIR (according to SMPTE Historical Note JUNE 1981, 'The vertical interval reference (VIR) system constrains the receiver settings of chroma and luminance to follow the values set at the studio')?
It would seem like, much in the same way that DNA has supplanted dental records to identify accident victims defaced beyond recognition, VIR, comb filters (and a largely-unused GCR [ghost canceling reference] signal intended to replace VIR) could allow viewers to identify images defaced beyond recognition by NTSC? --carlb 01:47, 22 August 2005 (UTC)
In my opinion, yes, you should discuss this. So be bold!
I worked in studios inserting VIR into the signal. Conceptually, a very good idea. In practice, ALWAYS DONE WRONG. If all TV sets after 1981 implemented VIR, studios would be forced to make it work properly in order to keep the viewing audience. This is one of those few instances where government regulation (mandating VIR in all receivers with hue controls removed) could have helped. Ohgddfp (talk) 03:58, 14 March 2009 (UTC)
Cleanup
The 'Standard' section really needs to be cleaned up. It makes reference to too many magic (arbitrary) numbers. The section should either give an overview of how the various important frequencies were chosen, or explain where the magic numbers come from: get rid of 'then one multiplies xyz MHz by 2 and then divides by yzx before adding zxy to obtain the frequency' by explaining where xyz, yzx and zxy come from. And explain WHY one multiplies and divides by these numbers; otherwise you are just doing magic math.
The section also claims that NIST operates a 5 MHz reference broadcast for this purpose. Is this really true? If it is, then 'not a coincidence' needs to be removed (and a reference cited, perhaps?). My gut says that it isn't true because there would be all sorts of problems with the phase (for various technical reasons) of the NIST broadcast at the NTSC transmitter vs. the NTSC receiver. --66.66.33.245 23:59, 21 August 2005 (UTC)
By the way, the 1954 model Westinghouse color TV used NO CRYSTALS. In their place, a manual 'color synchronization' knob was available for the set owner to turn. If misadjusted, the color would break up into nothing but rainbows running through the picture. So it was easy to find the right setting, which was when a normal color picture appeared. Just like adjusting horizontal sync. Once adjusted, the color sync would become perfect because it locked onto a sample (transmitted burst) of the subcarrier. In this way, if the studio locked onto NIST, then indirectly, the TV receiver (with or without crystals) would also be frequency-locked onto NIST. Ohgddfp (talk) 04:05, 14 March 2009 (UTC)
Note 1: In the NTSC standard, picture information is transmitted in vestigial-sideband AM and sound information is transmitted in FM.
Note 2: In addition to North America, the NTSC standard is used in Central America, a number of South American countries, and some Asian countries, including Japan. Contrast with PAL, PAL-M, SECAM.
Source: from Federal Standard 1037C
standard (moved here from article)
I removed the section labelled 'standard'. It is too technical for most people, not informative enough, the tone is too 'chatty', and is unnecessary. (Of course, all that is strictly IMHO.) Some of this may want to be moved back so I place it here intact.
525 is the number of lines in the NTSC television standard. This choice was not an accident. The reasons for this are apparent upon examination of the technical properties of analog television, as well as the prime factors of 525: 3, 5, 5, and 7. Using 1940s technology, it was not technically feasible to electronically multiply or divide the frequency of an oscillator by any arbitrary real number.
So if one started with a 60 hertz reference oscillator (such as the power line frequency in the U.S.) and sought to multiply that frequency up to a suitable line rate, which in the case of black-and-white transmission was set at 15750 hertz, then one would need a means of multiplying or dividing the frequency of an oscillator with a minimum of circuitry. In fact, for black-and-white transmission synchronized to the power line rate, the field rate has to be multiplied up to twice the line rate, a frequency of 31500 hertz.
One means of doing this is of course to use harmonic generators and tuned circuits, i.e. the direct frequency multiplication route. With the conversion of U.S. television to color beginning in the 1950s, the frequencies were changed slightly, so that a 5 MHz oscillator could be used as a reference. The National Institute of Standards and Technology (NIST) transmits a 5 MHz signal from its standard time and frequency station WWV, which may be useful for this purpose. The 5 MHz signal may be multiplied by a rational number to obtain the vertical frequency.
Interestingly enough, when one analyzes how people get the 59.94 vertical field rate, one realizes that it is just 60 hertz multiplied by 1000/1001. Now 1001 in turn has prime factors of 7, 11, and 13, so that when cascading simple flip flop based circuitry it is possible to take a 60 kilohertz reference source and divide it by 1001 exactly to obtain the vertical field rate. It is not a coincidence that NIST operates a radio station, WWVB, that broadcasts a time and frequency standard synchronized to an atomic clock on this frequency, that is, 60 kHz.
If 5 MHz is MULTIPLIED by a rational number as stated above you get a very high number of megahertz - not the low frequency vertical field rate. There is no rational number which 5 MHz can be divided by to give the vertical rate. It seems that as well as the world's worst television system, the USA has the world's worst mathematicians!
59.94005994005994005994005994005994.. = 5000000 / (250250 / 3) --Ohgddfp (talk) 00:27, 18 March 2009 (UTC)
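The arithmetic in the two comments above can be checked exactly with rational arithmetic: 60 Hz scaled by 1000/1001 is the same number as 5 MHz multiplied by the rational number 3/250250 (a rational multiplier can of course be less than one, which resolves the objection). A quick check:

```python
from fractions import Fraction

# 60 Hz field rate shifted down by the factor 1000/1001 ...
field_rate = Fraction(60) * Fraction(1000, 1001)

# ... equals 5 MHz times the rational number 3/250250,
# i.e. 5000000 / (250250 / 3) from the comment above.
assert field_rate == Fraction(5_000_000) * Fraction(3, 250250)

print(float(field_rate))  # 59.94005994...
```

Exact fractions avoid the rounding that a floating-point check would introduce.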
Australia used NTSC?
The list of (former) NTSC countries now includes Australia. AFAIK, Australia only experimented with NTSC before choosing a 625-line system (one possible factor was that the country uses 50-Hz electricity). If anyone can give more background on this, it would be helpful. ProhibitOnions 21:40, 12 January 2006 (UTC)
The UK also experimented with 625 NTSC (but never 405 NTSC as the article states.) Before the UK went colour, the BBC wanted to use 625 line NTSC and ITV 405 line PAL. I have seen pictures from this experiment in 625 line NTSC. One of the arguments was that for the same line standard NTSC gives sharper pictures because the luminance / chrominance interleaving is more perfect, giving less overlap so allowing better use of bandwidth.
Shift of field rate
Corrected explanation of shifted field rate. Reason is to minimize interference between
AND
Clarification on the 'beat' paragraph?
After reading this article, the first explanation for why there is now a 59.94059... field rate is still unclear. The text that I feel needs improvement reads:
'When NTSC is broadcast, a radio frequency carrier is amplitude modulated by the NTSC signal just described, while an audio signal is transmitted by frequency modulating a carrier 4.5 MHz higher. If the signal is affected by non-linear distortion, the 3.58 MHz color carrier may beat with the sound carrier to produce a dot pattern on the screen. The original 60 Hz field rate was adjusted down by the factor of 1000/1001, to 59.94059.. fields per second, so that the resulting pattern would be less noticeable.'
It's clear that the 4.5 MHz audio carrier and the 3.58 MHz color carrier will beat and produce a low frequency of 920 kHz. But why does this generate interference? Interference with what? How does adjusting the field rate down prevent this interference? I don't know the answers, but I'm sure somebody does.
The 920 kHz beat frequency that may be produced is well within the video frequency range, so if the beat frequency occurs at all, it is displayed on the picture as a pattern of bright and dark dots with roughly 920/15.734 ≈ 58.5 dots per line. If this dot pattern were not synchronized with the video frame, its constant motion would be very distracting. By choosing the frequencies as they did, the dot pattern, if it occurs, is locked to the picture, so it is far less noticeable.
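A sketch of that arithmetic: with the colour standard, both the sound offset (286 times the line rate) and the colour subcarrier (227.5 times the line rate) are tied to the line rate, so their beat falls at an odd multiple of half the line rate, and the dots interleave from frame to frame instead of crawling:

```python
# Beat between the 4.5 MHz sound offset and the ~3.58 MHz colour
# subcarrier, expressed in multiples of the line rate.
line_rate = 4.5e6 / 286        # colour line rate, ~15734.27 Hz
sound = 286 * line_rate        # 4.5 MHz sound-carrier offset
chroma = 227.5 * line_rate     # ~3.579545 MHz colour subcarrier
beat = sound - chroma          # ~920.45 kHz

# 286 - 227.5 = 58.5: a half-integer multiple of the line rate,
# so successive scans place the dots in alternating positions.
print(beat / line_rate)        # ~58.5 dots per line
```

The half-line offset (58.5 rather than a whole number) is what makes the dots alternate and visually average out.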