So, after answering this question a few dozen times, here's the real deal on video resolution. It gets a little technical, but don't worry, it should be comprehensible to all.
Like all digital-based technologies, video has been going through some dramatic transitions in recent years, and the pace does not seem to be slowing down. At home we have seen the transition from CRT-based televisions to flat LCD and plasma screens. We are also in the final days of the transition to digital TV (at least in North America), with analog TV broadcasting going off the air in early 2009 in the US.
The last couple of years have also seen the emergence of Blu-ray as the last man standing in the high-definition disc wars, as well as dramatic drops in the pricing of large-screen LCD TVs.
But before looking at video equipment we need to understand some of the blizzard of buzzwords and jargon that surrounds contemporary TV and consumer video. The first of these has to do with HD (High Definition) standards.
720p / 1080i / 1080p
HD is not just one standard; it's many. The numbers 720 and 1080 refer to the vertical resolution (the number of horizontal lines on the screen): either 720 lines or 1080 lines. A 720-line image comes with a width of 1280 pixels, while the 1080 standard provides 1920 pixels of horizontal resolution.
Put another way, the 720 standard creates images that have 720 x 1280 = 921,600 pixels of overall resolution, while 1080 x 1920 provides 2,073,600 pixels. Less than one million vs. over two million. Seems like a no-brainer. Well, not quite. There's more to the story, and that's the "p" and the "i" appended to the numbers.
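If you want to check that math yourself, here it is spelled out in a few lines of Python; the figures come straight from the frame dimensions above.

    # Pixels per frame for the two HD standards
    hd_720 = 1280 * 720      # 921,600 pixels per frame
    hd_1080 = 1920 * 1080    # 2,073,600 pixels per frame

    print(hd_720, hd_1080, round(hd_1080 / hd_720, 2))
    # 921600 2073600 2.25  -> a 1080 frame has 2.25x the pixels of a 720 frame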
Back in the days of analog TV and CRT displays (just yesterday, in fact) images were formed on the screen by having an electron gun sweep back and forth across the face of the picture tube. At least 50 images per second were needed on the CRT to avoid visible flicker. However, the bandwidth to transmit a full 50 fps or 60 fps image was just too much for the times. Things were therefore designed so that the gun would "paint" the image on the screen twice for each frame – first the odd-numbered lines, then the even-numbered lines. Each pass was called a "field" and two fields made a frame. This is called interlaced TV, and it is what the "i" in 1080i stands for. Interlacing is therefore an analog compression scheme that allows the motion of 60 fps to be transmitted in the same bandwidth as 30 fps. All analog TV broadcasting is interlaced.
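To make the field idea concrete, here is a toy Python sketch (an illustration only, not any particular broadcast implementation) that splits a frame, represented as a list of scan lines, into its two fields.

    def split_into_fields(frame_lines):
        """Split a full frame (list of scan lines, top to bottom) into two fields.

        Field 1 holds the odd-numbered lines (1st, 3rd, 5th, ...), field 2 the
        even-numbered lines; sent one after the other, the two fields together
        carry the same picture data as one progressive frame.
        """
        field_1 = frame_lines[0::2]   # lines 1, 3, 5, ...
        field_2 = frame_lines[1::2]   # lines 2, 4, 6, ...
        return field_1, field_2

    # A 1080-line frame yields two 540-line fields
    frame = ["line %d" % n for n in range(1, 1081)]
    f1, f2 = split_into_fields(frame)
    print(len(f1), len(f2))   # 540 540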
Because of something called "persistence of vision" (the human eye hangs onto what it sees for a short while) these two fields merge in our brains. Incidentally, it's this persistence of vision that allows us to see a 24-frame-per-second movie as continuous motion rather than a series of flickering still images.
But our computer screens, digital HDTVs, and all LCD and plasma screens don't use interlacing. This is old tech. Instead they use what is known as progressive scan, which is what the "p" in 720p and 1080p stands for. We get the full frame all at once, rather than 50% (a field) at a time.
For static shots, or ones with little rapid motion of camera or subject, there's no great advantage to progressive over interlaced, and since 1080 provides more than twice the pixel count of 720, most people should find it preferable.
But (and it's a big "but") when there's fast camera or subject motion an interlaced image tends to blur. The subject moves between fields, so even when the camera is shooting at a high shutter speed the combined frame shows unpleasant blurring.
This is the reason why there are two broadcast standards for HD TV in North America: 720p and 1080i. ABC, ESPN and FOX, which do a lot of sports coverage, have gone with 720p based on their testing, which shows that even on a large screen the motion clarity of 720p's progressive scan trumps the higher resolution of 1080i. NBC and CBS have gone with 1080i. There is no broadcast 1080p because it would simply use too much over-the-air bandwidth, and it is beyond what the MPEG-2 and US broadcast standards allow.
Which brings us to a controversial point relevant to consumer video shooting. Most people assume that 1080i must be better than 720p simply because it has more than double the resolution. Not!
In fact, it is almost impossible to see the difference between the two in terms of resolution, even on a 50" 1080p display at normal viewing distance. What can be seen, though, is the difference between interlaced and progressive, especially when there's fast motion in the frame.
Note that in terms of bandwidth, 1080i and 720p require almost the same amount of data. With 1080i it is allocated to spatial resolution, while with 720p it is allocated to temporal resolution.
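A rough pixel-rate comparison shows why the two end up in the same ballpark. This ignores compression entirely and assumes the 60 Hz field and frame rates used in North American broadcast.

    # 1080i at 60 fields/s: each field is 1920 wide x 540 lines
    pixels_per_sec_1080i = 1920 * 540 * 60    # 62,208,000

    # 720p at 60 frames/s: each full frame is 1280 x 720
    pixels_per_sec_720p = 1280 * 720 * 60     # 55,296,000

    print(pixels_per_sec_1080i, pixels_per_sec_720p)
    # 62208000 55296000  -> within roughly 12% of each other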
Note as well that when a TV, monitor, or software program deinterlaces an "i" format video, it reduces the vertical resolution by half.
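A minimal sketch of the simplest deinterlacing approach ("bob"-style line doubling) shows where that halving comes from; real deinterlacers are far more sophisticated, so treat this only as an illustration.

    def bob_deinterlace(field_lines):
        """Rebuild a full-height frame from one field by doubling each line.

        The frame regains its original line count, but every pair of output
        lines is a copy of one source line, so vertical detail is halved.
        """
        frame = []
        for line in field_lines:
            frame.append(line)
            frame.append(line)   # duplicate: adds no new vertical information
        return frame

    field = ["field line %d" % n for n in range(1, 541)]   # one 540-line field
    print(len(bob_deinterlace(field)))   # 1080 lines, but only 540 distinct ones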
Another thing to keep in mind is that unless you have at least a 30" display for video editing, you're going to have to down-res the 1080 image on-screen to be able to see it all. On the other hand, 720p fits nicely on even a 15" laptop screen, which is handy for 1:1 viewing when editing in the field.
The moral of our story is that while the equipment makers will try to sell you cameras that shoot 1920 x 1080, in reality you'll likely end up shooting 720p after doing your own testing. The exception to this is higher-end cameras that shoot 1080p, the best of both worlds and the current holy grail of video.
I hope this clarified some stuff about all the different TV resolutions out there.

For interlace to work, the image must be vertically filtered to avoid what is called "interline twitter", an annoying flickering that occurs on fine horizontal detail. This is done in camera by row-pair summation on the interlaced scan readout. This reduces vertical resolution to around 70% of the number of horizontal lines, and it increases sensitivity to light and reduces noise at the same time. This means that a 1080i image from a camera should have "about the same" measured vertical resolution as a 720p camera.
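Here is a toy numeric sketch of that row-pair summation idea, averaging adjacent sensor rows as a field is read out. It's a simplification of what real camera readouts do, just to show why the vertical detail gets softened.

    def read_out_field(sensor_rows, offset=0):
        """Read one interlaced field by averaging adjacent sensor row pairs.

        Pairing rows vertically filters the image, which suppresses interline
        twitter but also softens vertical detail; the other field would use
        offset=1 so the two fields sample different row pairings.
        """
        field = []
        for i in range(offset, len(sensor_rows) - 1, 2):
            pair_average = [(a + b) / 2 for a, b in zip(sensor_rows[i], sensor_rows[i + 1])]
            field.append(pair_average)
        return field

    # 1080 sensor rows of 4 pixels each -> one 540-line field per read-out
    sensor_rows = [[float(r)] * 4 for r in range(1080)]
    print(len(read_out_field(sensor_rows)))   # 540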
On fast-moving images, progressive is clearly superior. On slow or static shots, the advantages of 1080i over 720p are negligible, especially as many 1080i cameras only record 1440x1080 rather than 1920x1080. Then again, many 720p cameras only record 960x720, and most 1080i HD broadcasts only transmit 1440x1080 anyway.
The other piece of the puzzle is that progressive images compress more cleanly and with fewer artifacts than interlaced images, meaning that, given the same bandwidth, you'd probably get a cleaner image from 720p than from 1080i.
In the end, I tell my students to render out at 1280 x 720. This makes it a nice HD format that works easily in both Maya and Flash. Not only will it look nice on their DVD show reels, but it can also be shrunk down and compressed for video streaming on the web, or even re-rendered at a comfortable 640x360 resolution with roughly a 5000 kbps data rate and a 60 MB file size, using an H.264 codec, just as an example.
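As a rough check on those web-streaming numbers (using only the figures above, and ignoring container and audio overhead), that file size and data rate imply a clip of about a minute and a half, which is plausible for a short show reel.

    def estimated_duration_seconds(file_size_mb, video_kbps):
        """Approximate clip length implied by a file size and a video data rate."""
        total_kilobits = file_size_mb * 8 * 1000   # MB -> megabits -> kilobits
        return total_kilobits / video_kbps

    print(round(estimated_duration_seconds(60, 5000)))   # ~96 seconds of video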