Progressive Vs. Interlaced Scanning

  • Karma
    Senior Member
    • Nov 2005
    • 801

    Progressive Vs. Interlaced Scanning

    Hi All,
    We all grew up with interlaced video. Even now, most TVs use interlaced scanning systems, and it is the system I am familiar with. A few years ago I started hearing about progressive-scan monitors, mostly related to computers. I understood that the advantage of progressive scanning was to reduce display flicker by using higher refresh rates.

    But now I hear about progressive scan being used in TV, specifically with HD in the 1080p format, which is said to bring a number of unspecified advantages. It is being presented as the video equivalent of the holy grail. Can you all explain the advantages of progressive scanning in television and how it differs from interlaced scanning? I must confess that I have never had a complaint with interlaced scans. So, what does progressive scanning buy? Does it result in major, minor, or no performance advantages at all?

    Thanks, Sparky
  • aud19
    Twin Moderator Emeritus
    • Aug 2003
    • 16706

    #2
    Here are a couple of helpful links for you to read through:

    The short answer is that more resolution is never a bad thing; however, it's only one link in the chain of good-quality video. In other words, a well-built, properly calibrated 720p/1080i display and source will look better than a poorly built/calibrated 1080p display/source. Also, if the media is 1080p but is output and received in 1080i and properly de-interlaced back to 1080p (e.g., the current-generation Sony SXRD sets), there should be no noticeable difference between it and the original 1080p signal.
    Jason

    Comment

    • NonSense
      Senior Member
      • Nov 2003
      • 138

      #3
      In short, progressive scanning was probably the way television was meant to be viewed. It just wasn't practical back in the day, so we carried around the (RS-170A) NTSC legacy until there was a point at which to make a clean break: the advent of DVD and digital broadcast television. Ah, free at last! Blame the masses who keep their TVs for 15 years, as there is something about backward compatibility that seems to impede progress.

      To answer your question: back in the day, when flyback circuits, chroma bursts, and equalization and serration pulses all made a difference, the good ol' cathode ray tube still had a few tricks to learn as well. One of the problems was the persistence of the phosphor lining the inside of the screen. With only a single electron gun performing the refresh, the best performance was had by refreshing half the video lines at twice the rate. (Not quite that simple, but good enough.) All the odd-numbered lines are refreshed on one pass (the odd field) and all the even lines (the even field) on the next pass. It also has the benefit of reduced bandwidth, as you only need to send half the amount of picture information. (60 fields per second, but only 30 frames of new information.) I think for most people, their eyes only require approximately 15 frames per second for minimally smooth motion video.
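
      To make the odd/even field idea concrete, here is a rough Python sketch (the array shape and names are just for illustration, not any broadcast spec) of how one progressive frame is split into the two fields an interlaced system would transmit:

        import numpy as np

        def split_into_fields(frame):
            """Split a progressive frame (rows x cols) into odd and even fields.

            Each field carries only half the lines, so sending 60 fields per
            second moves the same amount of data as 30 full frames per second.
            """
            odd_field = frame[0::2, :]   # picture lines 1, 3, 5, ...
            even_field = frame[1::2, :]  # picture lines 2, 4, 6, ...
            return odd_field, even_field

        # Toy 480-line frame
        frame = np.arange(480 * 640).reshape(480, 640)
        odd, even = split_into_fields(frame)
        print(odd.shape, even.shape)  # (240, 640) (240, 640)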

      With interlaced video, scene changes or high-action/sports shots may give you successive fields with dramatically different video information. With current technology, it is easy to refresh the entire screen; displays like TFT LCDs refresh all the pixels anyway. And with digital compression, bandwidth is less of a problem. (Over-compression on broadcasts is a different issue altogether. I won't even get started!)

      To compare loosely with computer LCD monitors, the resolution will offer the most improvement. When you move from 480 to 720 to 1080 lines of vertical resolution you are getting better resolution, and you will see a significant difference. As you move from 480i to 480p, 720i to 720p and 1080i to 1080p you will (theoretically) improve sharpness, as the entire picture is scanned continuously with no interlacing artifacts. The responsiveness should also improve, as you will be receiving twice as many full frames of video per second. I have not seen a 1080p demonstration, but for me the change from, for example, 480i to 480p on the DVD player is minor compared with the change from 480p to 1080i received from the digital cable receiver. With a proper source, source material and good display device, I'm sure the improvement can be seen. The people who usually gain the most are those with very large displays or projectors, where the pixels are significantly larger.
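
      To put rough numbers on that trade-off, here is a back-of-the-envelope comparison (a sketch only: it assumes nominal 60 Hz rates for every format and ignores blanking, colour encoding and compression):

        # Approximate pixel throughput for common formats.
        # Interlaced formats send only half the lines on each pass.
        formats = {
            "480i":  (720, 480, 60, True),
            "480p":  (720, 480, 60, False),
            "720p":  (1280, 720, 60, False),
            "1080i": (1920, 1080, 60, True),
            "1080p": (1920, 1080, 60, False),
        }

        for name, (w, h, rate, interlaced) in formats.items():
            lines_per_pass = h // 2 if interlaced else h
            mpix_per_sec = w * lines_per_pass * rate / 1e6
            print(f"{name}: {mpix_per_sec:.1f} Mpixels/s")

      Run it and you can see why 720p and 1080i end up in roughly the same bandwidth class, while true 1080p roughly doubles the data rate.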


      The more important issue is that you choose a set that will be capable of handling the next standard. Right now, 720p and 1080i are common for both cable and satellite HD broadcasts.

      Hope this helps
      Bruce

      Comment

      • Karma
        Senior Member
        • Nov 2005
        • 801

        #4
        Hi Bruce,
        I can understand the importance of more resolution. But, just as in digital photography, pixels per inch is an indication of resolution that does not equate directly to picture quality past a certain point. Other things, like the ability to crop and then enlarge, can become the driver for more resolution. But that's just digital photography, where I have more than a little experience.

        For example, I use two Nikon D2H digital SLRs for most of my work. These cameras use 4.2-megapixel sensors, which by today's standards is a low pixel count. Believe me, I can afford a D2X with its 12-plus-megapixel sensor, but for the type of photography I do, I don't need the advantages of the larger sensor. Under all but the most extraordinary circumstances, my small-pixel-count cameras produce stunning results indistinguishable from the D2X, and I have small files that are easily processed in Photoshop.

        So far, the two responses only deal with theoretical advantages of progressive scanning, having to do with issues where I have never noticed a problem. If these are the only advantages, I still don't see the obsessive move to progressive systems. It seems like you all are simply taking someone's word that it is markedly better.

        So I repeat: what, and WHY? Aren't lines of horizontal resolution (plus video bandwidth) still the primary indicator of resolution, independent of the scanning system used to display those lines?

        I must be missing something important because I still don't get it.

        Since this thread is not getting much action, I wonder if this indicates that many others are confused about this issue and that technically based answers from experienced observers are rare.

        Sparky

        Comment

        • aud19
          Twin Moderator Emeritus
          • Aug 2003
          • 16706

          #5
          I actually thought I summed it up pretty nicely there, Sparky... and did you read through those links? They answer a lot of questions.

          Anyhoo... the biggest advantage progressive has is with fast motion. It will render motion more clearly and without the possibility of the jagged edges caused by interlacing. In short, progressive will always be better/clearer than its interlaced counterpart as long as everything else in the display chain (source, cables, inputs/outputs, scaler, optics, etc.) is of equal and good quality.
          Jason

          Comment

          • Brandon B
            Super Senior Member
            • Jun 2001
            • 2193

            #6
            It's pretty simple, really. If you are talking about displays, it is not an "obsessive move". Basically every display technology is progressive by nature, except CRT. And since CRT is going away, probably completely (except for niche uses) within this decade, it makes sense to abandon interlaced formats except where bandwidth or backwards compatibility requires them.

            The problem is that decent de-interlacing is still a bit of an art and requires some smarts and $ to get right, so a lot of products don't manage it. As soon as someone throws a cheap and really good solution out there (it seems like one is coming soon from Gennum or HQV), it will be a complete non-issue.

            BB

            Comment

            • Karma
              Senior Member
              • Nov 2005
              • 801

              #7
              Hi aud19,
              Yes, I did read the links you provided. Thanks. However, the articles only state opinions and provide no technical backup. I want to know WHY progressive scanning is considered better from an engineering point of view. From what I could gather from the articles, I could not tell whether the authors had even seen a full-out progressive system. I could get the same information from a salesman. That's not what I am after.

              I don't see any intrinsic reason for progressive to be better than interlaced scanning except for motion, not resolution. But I have never noticed motion to be a problem, so that seems a theoretical benefit rather than a practical issue. Computers and TV are two very different cases, because computers are not dealing with channel bandwidths and FCC requirements. Basically, the computer manufacturers can do anything they want. Not so with television.

              Here is the issue as it seems to me. The adoption of interlaced scanning had nothing to do with CRT technology. CRTs and the associated scanning circuits can perform progressive scanning with no trouble. Rather, the problem that caused interlaced scanning to be adopted was the available video bandwidth, which was limited by the way channel bandwidths were assigned by the FCC. It was a trick to permit more channels to be made available in the very crowded commercial RF space.

              I also don't understand how resolution plays into it. For example, my Hitachi HD plasma's native mode is 1080i. I know that the HD standard allows broadcast in 720p or 1080i; I have a copy of the FCC technical regulations and it definitely states that both systems are allowed. It seems to me that 1080i has the greater resolution because there are more horizontal lines than in 720p. Granted, the 1080i framing is slower due to the alternating field scans. But does this reduce the resolution of 1080i?

              Thanks for your answers, but I want to know more. I am certain we are just playing around the edges of a question that has more profound answers.

              Sparky

              Comment

              • aud19
                Twin Moderator Emeritus
                • Aug 2003
                • 16706

                #8
                I think you're a bit confused. Resolution is resolution; interlacing and progressive scan do not change the resolution. And Brandon is actually right in that early CRT TVs could not do progressive scan, which is where the system came from; the lower bandwidth was just an added "benefit" to broadcasters. Now that all new display types are naturally progressive, there's no reason to continue interlacing. As for you not noticing a difference between interlaced signals and progressive ones, my guess is that, good or bad, that's due to the equipment you're using, not the picture format. Also, you may just not know what to look for.

                For example, your plasma probably has a native resolution of 1280x768 (or something similar). Everything being sent to the set is converted to that resolution, which can involve any amount of scaling up or down, de-interlacing, etc. So if your TV does an exceptional job of downscaling and de-interlacing 1080i to your set's native resolution, you might not notice a difference. Conversely, if it does a poor job with 720p, that may be why you don't notice a difference. Also, you have no way of actually testing 720p right now, because if you're feeding it from Shaw's Motorola 6412, that box defaults to sending 1080i only, regardless of what the station uses, and most of them use 1080i anyway. Not to mention that regardless of what your 6412 is sending, as I mentioned, your plasma has to re-scale it anyway. I won't even delve into DVD :lol:

                Unless you had a 1280x720 display being fed an actual 720p signal and an equally calibrated 1920x1080 display being fed a 1080i signal, you have no real way of comparing them directly.

                So while an interlaced format can look as good as a progressive one if it's properly de-interlaced and scaled (a BIG if), a progressive signal of equal resolution will always look as good if not better, particularly with motion.
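
                To see why that "BIG if" matters, here is a toy Python sketch of the two simplest de-interlacing strategies (just an illustration, not how any particular set does it): weaving two fields from the same instant rebuilds the frame perfectly, but weaving fields captured at different times produces the combing you see on motion, so a real de-interlacer has to detect motion and fall back to something like bob (or smarter interpolation) in those areas.

                  import numpy as np

                  def weave(odd_field, even_field):
                      """Interleave two fields into one full frame (ideal for static scenes)."""
                      frame = np.empty((odd_field.shape[0] * 2, odd_field.shape[1]),
                                       dtype=odd_field.dtype)
                      frame[0::2] = odd_field
                      frame[1::2] = even_field
                      return frame

                  def bob(field):
                      """Double a single field to full height by line repetition
                      (no combing on motion, but half the vertical detail)."""
                      return np.repeat(field, 2, axis=0)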

                Here are some extreme examples of interlaced vs. progressive video:

                Jason

                Comment

                • Brandon B
                  Super Senior Member
                  • Jun 2001
                  • 2193

                  #9
                  Originally posted by Karma
                  For example, my Hitachi HD plasma's native mode is 1080i.
                  No, it is not. Your Hitachi's front-end processing can only accept 1080i. It converts any and all video information to progressive at the native resolution of your display. Your plasma does not refresh odd rows of pixels and then even rows; they are all refreshed across the entire display for each frame.

                  This front-end processing was built to a price point. It is fairly certain that this price point precluded de-interlacing quality on par with, say, a $3000 external scaler/de-interlacer.

                  Now, if your plasma could be designed with the assumption that it will always be fed a progressive signal, it would only have to scale, not de-interlace as well. This means that for the same expenditure of resources, i.e. the same price point, you would very probably end up with a superior picture.
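
                  As a simplified picture of the two front ends (a sketch only; the 1280x768 panel size, the nearest-neighbour scaler and the weave de-interlacer are stand-ins for whatever the set really uses):

                    import numpy as np

                    def scale_nearest(frame, out_h, out_w):
                        """Crude nearest-neighbour scaler, standing in for the panel's real one."""
                        rows = np.arange(out_h) * frame.shape[0] // out_h
                        cols = np.arange(out_w) * frame.shape[1] // out_w
                        return frame[rows][:, cols]

                    def weave(odd_field, even_field):
                        """Stand-in de-interlacer: interleave the two fields into a frame."""
                        frame = np.empty((odd_field.shape[0] * 2, odd_field.shape[1]),
                                         dtype=odd_field.dtype)
                        frame[0::2], frame[1::2] = odd_field, even_field
                        return frame

                    # Interlaced input: the set must de-interlace *and* scale.
                    def interlaced_pipeline(odd, even, panel_h=768, panel_w=1280):
                        return scale_nearest(weave(odd, even), panel_h, panel_w)

                    # Progressive input: the expensive de-interlacing step disappears,
                    # so the same budget can buy a better scaler (or a cheaper set).
                    def progressive_pipeline(frame, panel_h=768, panel_w=1280):
                        return scale_nearest(frame, panel_h, panel_w)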

                  So if we end up in the near future with predominantly progressive sources, where interlaced is only a niche issue for legacy stuff, we will see higher quality images at similar price points, all other things being equal.

                  BB

                  Comment

                  • NonSense
                    Senior Member
                    • Nov 2003
                    • 138

                    #10
                    Originally posted by Karma
                    Computers and TV are two very different cases because computers are not dealing with channel bandwidths and FCC requirements. Basically, the computer manufacturers can do anything they want. Not so with television.
                    This is not really true. Television manufacturers and cable operators have a lot of flexibility within the rules of their license, but they have chosen to adopt a standard. From a manufacturing and marketing perspective, system compatibility and lower manufacturing costs are keys to success. The NTSC (National Television System Committee, whose color standard was adopted in 1953) was a consortium of groups from both manufacturing and the broadcast industry which decided a standard would be a good thing for everyone. Its mandate was to decide which technology would be adopted as the color television broadcast standard. And can you believe it? They chose a technology that allowed backward compatibility with the black-and-white interlaced televisions already in use, to keep all their customers happy while they introduced a new technology. (Their scheme allowed the separation/extraction of the color information without interfering with the signal content required to generate the black-and-white picture. The separated signals are known as luminance (B/W intensity) and chrominance (color content).) Since SAW filters and DSPs were not available back in the day, an ingenious analog waveform was devised to allow simple passive filters (and PLLs) to do the job of extraction.
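
                    To sketch that backward-compatibility trick numerically (a simplified illustration only; real NTSC also band-limits I and Q and adds sync and the color burst), the two color-difference components are quadrature-modulated onto the ~3.58 MHz subcarrier and simply added to the luminance, so a black-and-white set just sees Y plus a little high-frequency ripple it ignores:

                      import numpy as np

                      F_SC = 3.579545e6  # NTSC color subcarrier frequency, Hz

                      def composite_line(Y, I, Q, sample_rate=4 * F_SC):
                          """One simplified composite scanline: luminance plus
                          quadrature-modulated chrominance on the color subcarrier."""
                          t = np.arange(len(Y)) / sample_rate
                          chroma = I * np.cos(2 * np.pi * F_SC * t) + Q * np.sin(2 * np.pi * F_SC * t)
                          return Y + chroma

                      # A B/W receiver treats the whole signal as luminance; a color
                      # receiver locks to the burst and demodulates I and Q back out.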

                    The recommended standard channel line-up allows manufacturers to preset the tuners independent of the region in North America. However, in the days before cable-ready TVs, cable operators supplied set-top boxes to tune cable channels, which were assigned differently from the standard UHF/VHF broadcast channels. Many operators used a non-standard frequency allocation known as HRC (harmonically related carriers) to reduce a by-product of mixing known as composite triple beat. This has now fallen out of favor, as the cable-ready tuners found in most TVs are preset to the standard frequency allocation, driven mainly by manufacturing rather than by any regulatory body.

                    Much the same goes for the computer industry, which in fact has to meet its fair share of standards as well. If you buy a new computer monitor, it must work with the standard xVGA signal, which is well documented, as well as many other lower resolutions, to ensure backward and industry compatibility. The same is true of countless other standards such as ISA, PCI, AGP, DDR RAM, ATA, SATA, etc., which all have committees drawn from various areas of the industry. However, the computer industry has been much more successful at educating its consumers about the need to adopt new standards in order to improve their products.

                    Originally posted by Karma
                    Here is the issue as it seems to me. The adoption of interlaced scanning had nothing to do with CRT technology. CRTs and the associated scanning circuits can perform progressive scanning with no trouble. Rather, the problem that caused interlaced scanning to be adopted was the available video bandwidth, which was limited by the way channel bandwidths were assigned by the FCC. It was a trick to permit more channels to be made available in the very crowded commercial RF space.
                    Sparky
                    This is not true. In 1947, when the industry was taking off, interlaced scanning was absolutely based on technology limitations. When the EIA (Electronic Industries Association) created the RS-170 standard for black-and-white composite video, interlacing was used to minimize flicker at the available refresh rate. Also, matching the field rate to the frequency of the power system reduced the interference caused by poor power-supply filtering. (Note that PAL and SECAM systems, which operate in 240 V/50 Hz regions, use a 50-field, 25-frame interlacing scheme.)

                    Back in the day, there was absolutely no issue with crowded commercial RF space. If the one TV broadcaster in your area had needed 30 MHz of bandwidth for TV, I'm sure he would have gotten a license. Digital, on the other hand, has been driven by bandwidth limitations. Many cable systems have trunking plant limited to 450 MHz or less, and the capital cost of increasing the bandwidth on these systems for the hundreds of channels needed to compete with satellite broadcasters would be well beyond their means. With digital compression, you can fit several digital channels within a single analog broadcast channel, thereby increasing channel capacity with minimal capital cost by replacing analog channels with digital ones.
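
                    As a back-of-the-envelope illustration of that capacity gain (the bit rates are typical published figures and assumed averages, not anything specific to a given cable system):

                      # Digital programs per 6 MHz analog channel slot, roughly.
                      qam256_payload_mbps = 38.8   # approximate usable bit rate of a 256-QAM cable channel
                      sd_mpeg2_mbps = 3.5          # assumed average SD MPEG-2 program
                      hd_mpeg2_mbps = 15.0         # assumed average HD MPEG-2 program

                      print(int(qam256_payload_mbps // sd_mpeg2_mbps), "SD programs per slot")
                      print(int(qam256_payload_mbps // hd_mpeg2_mbps), "HD programs per slot")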



                    Some of the other posters have given great explanations of the technological reasons. Since most advanced displays (LCD, DLP, etc., with the exception of CRTs, which have a short remaining life) use progressive scanning now, the broadcast standards are adapting naturally.

                    With the new digital display technologies that have native progressive-scan capability (as has been used with PCs for a long time), the TV front end has to do more work to remove the interlacing than it would if the broadcast were sent in a progressive-scan mode. Therefore it is natural to move to progressive scanning, for reasons not only associated with picture quality, which may be only marginally better in certain panning or action shots.
                    Bruce
                    Bruce

                    Comment
