I'd like to kick off a proper debate here on the forum about 1080p vs 720p (how big does a TV or projection screen need to be before there's any point to 1080p?).
I'm pasting in some info from a link I found on avsforum.com, and I hope some of the exceptionally knowledgeable people here can comment and share their thoughts on it.
1080P: TIME FOR A REALITY CHECK!
by Peter H. Putman, CTS
Thinking about buying a new 1080p rear-projection TV, front projector, or LCD TV? You might want to put your credit card back in your wallet after you read this.
It’s obvious that the buzzword in consumer TV technology this year is “1080p”. Several manufacturers are showing and shipping 1080p DLP and LCoS rear-projection TVs. We’ve seen RPTVs and front projectors with 1920x1080 polysilicon LCD panels at CES, NAB, and InfoComm. And the trickle of large LCD TVs and monitors with 1920x1080 resolution is turning into a flood.
To get your attention, marketers are referring to 1080p as “full spec” HD or “true” HD, a phrase also used by more than one HD veteran in the broadcast industry. We’re hearing about “1080p content” coming out of Hollywood, from broadcasters, from cable systems, and from direct broadcast satellite services.
The budding format war between Blu-ray and HD DVD for the next generation of high definition DVD players promises the same thing – 1080p content at high bit rates, finally realizing the full potential of HDTV.
Enough of this nonsense. It’s time to set the record straight and clear the air about what 1080p is and isn’t.
First off, there is no 1080p HDTV transmission format. There is a 1080p/24 production format in wide use for prime time TV shows and some feature films. But these programs must be converted to 1080i/30 (that’s interlaced, not progressive scan) before airing on any terrestrial, satellite, or cable TV network.
What’s that, you say? That 1080p/24 content could be broadcast as a digital signal? That’s true, except that none of the consumer HDTV sets out there would support the non-standard horizontal scan rate required. And you sure wouldn’t want to watch 24Hz video for any length of time; the flicker would drive you crazy after a few seconds.
No, you’d need to have your TV refresh images at either a 2x (48Hz) or 3x (72Hz) frame rate, neither of which is supported by most HDTVs. If the HDTV has a computer (PC) input, that might work. But if you are receiving the signals off-air or using a DVI HDCP or HDMI connection, you’ll be outta luck.
What about live HDTV? That is captured, edited, and broadcast as 1080i/30. No exceptions. At present, there are no off-the-shelf broadcast cameras that can handle 1080p/60, a true progressive format with fast picture refresh rates. It’s just too much digital data to handle and requires way too much bandwidth or severe MPEG compression. (Consider that uncompressed 1920x1080i requires about 1.3 gigabits per second to move around. 1080p/60 would double that data rate.)
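The data-rate figures quoted above are easy to sanity-check. The sketch below assumes 10-bit 4:2:2 sampling (20 bits per pixel, the rate used by HD-SDI links) and ignores blanking intervals, so the numbers are approximate, but they land right where the article says:

```python
# Rough uncompressed data-rate check for the figures quoted above.
# Assumes 10-bit 4:2:2 sampling (20 bits/pixel); exact link rates also
# carry blanking, so treat these as ballpark numbers.

def data_rate_gbps(width, height, frames_per_sec, bits_per_pixel=20):
    """Uncompressed video payload in gigabits per second."""
    return width * height * frames_per_sec * bits_per_pixel / 1e9

# 1080i/30: 30 full frames (60 interlaced fields) per second
rate_1080i = data_rate_gbps(1920, 1080, 30)
# 1080p/60: 60 full progressive frames per second
rate_1080p60 = data_rate_gbps(1920, 1080, 60)

print(f"1080i/30  ~ {rate_1080i:.2f} Gbps")    # ~1.24 Gbps, the ~1.3 figure
print(f"1080p/60  ~ {rate_1080p60:.2f} Gbps")  # ~2.49 Gbps, double 1080i
```

Doubling the frame rate at the same frame size doubles the payload, which is the whole bandwidth argument in one line.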
How about Blu-ray and HD-DVD? If either format is used to store and play back live HD content, it will have to be 1920x1080i (interlaced again) to be compatible with the bulk of consumer TVs. And any progressive-scan content will also have to be interlaced for viewing on the majority of HDTV sets.
Here’s why. To cut manufacturing costs, most HDTV sets run their horizontal scan at a constant 33.8 kHz, which is what’s needed for 1080i (or 540p). 1080p scans pictures twice as fast, at 67.6 kHz. But most of today’s HDTVs don’t even support external 720p signal sources, which require a higher 44.9 kHz scan rate.
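Those scan rates aren't magic numbers: horizontal scan frequency is just total lines per frame (active lines plus blanking, per the SMPTE/ATSC timings: 1125 total for 1080-line formats, 750 for 720p) times the frame rate:

```python
# Where the horizontal scan rates quoted above come from: total line
# count (active + blanking) times frames per second.

def h_scan_khz(total_lines, frames_per_sec):
    """Horizontal scan frequency in kHz."""
    return total_lines * frames_per_sec / 1000

print(h_scan_khz(1125, 30))  # 1080i/30 -> 33.75 kHz (the ~33.8 figure)
print(h_scan_khz(1125, 60))  # 1080p/60 -> 67.5 kHz, twice as fast
print(h_scan_khz(750, 60))   # 720p/60  -> 45.0 kHz (the ~44.9 figure)
```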
In the consumer TV business today, it’s all about cutting prices and moving as many sets as possible through big distribution channels. So, I ask you: Why would HDTV manufacturers want to add to the price of their sets by supporting 1080p/60, a format that no HDTV network uses?
Here’s something else to think about. The leading manufacturer of LCD TVs does not support the playback of 1080p content on its own 1920x1080 products, whether the signal is in the YPbPr component or RGB format. Only the industrial monitor version of this same LCD HDTV can accept a 1920x1080p RGB signal.
Now, don’t blame HDTV manufacturers for this oversight. They are only supporting the 1080 format in actual use, 1920x1080i, a legacy digital format that has its roots in the older Japanese MUSE analog HDTV format of the 1980s. That’s one big reason that 1080i has remained as a production and transmission format.
It gets worse. All kinds of compromises are made in the acquisition, production, and transmission of 1080i content, from cameras with less than full resolution in their sensors and reduced sampling of luminance and chrominance to excessive MPEG compression of the signal as it travels from antenna, dish, or cable to your TV.
But that’s not all. To show a 1080i signal, many consumer HDTVs do the conversion from interlaced to progressive scan using an economical, “quickie” approach that throws away half the vertical resolution in the 1080i image. The resulting 540p image is fine for CRT HDTV sets, which can’t show all that much detail to begin with. And 540p is not too difficult to scale up to 720p.
But a 540p signal played back on a 1080p display doesn’t cut the mustard. You will quickly see the loss in resolution, not to mention motion and interline picture artifacts. Add to that other garbage such as mosquito noise and macroblocking, and you’ve got a pretty sorry-looking signal on your new big screen 1080p TV.
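The "quickie" 540p conversion described above amounts to keeping one 540-line field and line-doubling it, rather than properly combining both fields. A toy sketch (frames modeled as plain lists of rows):

```python
# Sketch of the cheap "throw away a field" conversion described above:
# keep one 540-line field and duplicate each line, instead of true
# deinterlacing. Half the vertical detail is simply gone.

def cheap_540p(field):
    """Line-double a single 540-line field to a 1080-line frame."""
    out = []
    for row in field:
        out.append(row)
        out.append(row)  # duplicated line: no new vertical information
    return out

field = [[i] * 4 for i in range(540)]  # one 540-line field, toy content
frame = cheap_540p(field)
print(len(frame))  # 1080 lines, but only 540 distinct ones
```

On a 1080p panel, every second line carries no information, which is exactly the resolution loss the article complains about.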
Oops! Almost forgot: that same 1080p TV may not have full horizontal pixel resolution if it uses 1080p DLP technology. The digital micromirror devices used in these TVs have 960x1080 native resolution, using a technique known as “wobbulation” to refresh two sets of 960 horizontal pixels at high speed, providing the 1920x1080 image. It’s a “cost thing” again. (Let’s hope these sets don’t employ the 540p conversion trick as well!)
To summarize: There are no fast refresh (30Hz or 60Hz) 1080p production or transmission formats in use, nor are there any looming in the near future – even on the new HD-DVD and Blu-ray formats. The bandwidth is barely there for 1080i channels, and it’s probably just as well, because most TVs wouldn’t support 1080p/60 anyway – they’d just convert those signals to 1080i or 540p before you saw them.
The 1280x720 progressive-scan HDTV format, which can be captured at full resolution using existing broadcast cameras and survives MPEG-2 compression better than 1080i, doesn’t make it to most HDTV screens without first being altered to 1080i or 540p in a set-top box or in the HDTV set itself. So what chance would a 1080p signal have?
Still think you’ve just gotta have that new 1080p RPTV? Wait until you see what standard definition analog TV and digital cable look like on it…
Copyright ©2005 Peter Putman / Roam Consulting Inc.
Thread: 1080p vs 720p
10-22-2005, 21:52 #1
1080P vs 720p
10-22-2005, 23:07 #2
A Primer on DVD in a High-Def World; or How Much is that Scaler in the Window?
This is just a further elaboration of what PStraums posted, also clipped from AVSForum.
Direct link to the thread
PLEASE NOTE: The following is reposted by request to get it at the top of a thread. It was originally written as a reply to Texas_Longhorns who asked for a crash course on "HD DVD" and was really trying to get his head around the very confusing marketing hype currently out there regarding up-scaling DVD players, "high definition" DVD discs, digital connections, "new" HD DVD technologies and the like, and just what the heck people really meant when they bandied about the "high definition" catch phrase. He was planning on buying a DLP with 720 native resolution and was wondering just what would or would not work with that TV, and whether he had to discard his existing DVD library.
Your DLP (when you get it) has a fixed number of pixels that can be lit up to varying degrees of brightness to produce the color image on the screen. This is the "native" 720 resolution of your DLP. [Actually for technical reasons, the "native" resolution is more likely to be 768, but you can ignore that difference for now.]
[EDITED TO CORRECT (2/25/05): Oops! See post #146 below -- thanks to C1courtney. Although many fixed pixel technologies use actual "native" pixel counts that are influenced by what goes on in the computer world -- such as 768 or 788 line count matrices -- DLP displays do not. They use a true 720 line count matrix.]
No matter what you decide to watch, no matter what the resolution, the DLP needs to decide which pixels to light up. The circuit that does this is called a "scaler". The scaler works both ways. It can take a 480 signal or a 1080 signal and produce from it the 720 signal that it requires when lighting up the pixels. It does this using math -- more or less cleverly depending upon the quality of the electronics -- and tries to do it so well that the eye is fooled into thinking no conversion has happened. The scaler often operates even if the display is fed a 720 signal because of that 720 vs. 768 difference I mentioned [EDIT: for some display technologies].
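The "math" a scaler does can be illustrated with a toy example (not how any real scaler chip works, and only one scanline for simplicity): each output pixel is mapped back to a source coordinate and the two nearest source pixels are blended:

```python
# Toy illustration of a scaler "inventing" intermediate pixels by
# linear interpolation along one scanline. Real scalers use far more
# sophisticated filters, but the principle is the same.

def scale_row(row, out_width):
    """Linearly resample one scanline of pixel values to out_width."""
    in_width = len(row)
    out = []
    for x in range(out_width):
        src = x * (in_width - 1) / (out_width - 1) if out_width > 1 else 0
        i = int(src)
        frac = src - i
        right = row[min(i + 1, in_width - 1)]
        out.append(row[i] * (1 - frac) + right * frac)
    return out

print(scale_row([0, 100], 5))  # [0.0, 25.0, 50.0, 75.0, 100.0]
```

The three middle values never existed in the source; how cleverly a scaler invents them is exactly the quality difference the post is talking about.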
The scaler's job is complicated by two additional factors. First, your DLP is a wide screen TV and some of the content you will be watching is recorded in the shape of a conventional TV. The various stretch and zoom modes of your DLP come into play here.
Second, a conventional TV signal is only sent half a picture at a time. One half contains all the odd numbered lines of the image and the other half all the even numbered lines. This "interlacing" of the picture is due to historical limitations on the cost of the electronics and the available bandwidth for conventional TV broadcasts. Older TVs would paint the image on the screen one half on top of the other in two passes, doing it fast enough that for the most part the eye gets fooled into thinking the whole image is there at once. But conventional TV is not all that great and the eye frequently sees the gaps between the image lines and other artifacts resulting from this 480-interlaced or 480i signal unless the set is adjusted to blur the image to conceal them.
Some years back, manufacturers decided they could sell TVs for more money if they put in circuits to try to make an interlaced signal look better, i.e., if they could do a better job of hiding the damage to the image done by interlacing than just blurring the image. The first of these were "line-doublers" that did just that -- they filled in the gaps between the lines using the data sent in each half image. That was quickly followed by "de-interlacing" circuits that stored the first half image until the second half image arrived and then constructed a FULL image from those two halves and painted the whole thing at once on the screen. The result was a "progressive scan" image which was painted progressively and smoothly from top to bottom without having to go back and do a second interlaced pass, and the resulting signal was called a 480-progressive or 480p signal. Such TVs used digital processing circuits which were just becoming cheap enough for home TV and produced a noticeably crisper image because there was no need to blur the scan lines.
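The interlace/de-interlace round trip described in the last two paragraphs can be sketched in a few lines: split a frame into odd- and even-line fields, then store one field and "weave" the two back together. (For a static image this is lossless; the motion problems come next.)

```python
# Interlacing and the simplest de-interlacer (a "weave"), as described
# above. Frames are modeled as lists of rows.

def interlace(frame):
    """Split a full frame into its even-line and odd-line fields."""
    return frame[0::2], frame[1::2]

def weave(even_field, odd_field):
    """De-interlace by knitting the two stored fields back together."""
    frame = []
    for even_row, odd_row in zip(even_field, odd_field):
        frame.extend([even_row, odd_row])
    return frame

original = list(range(480))            # a 480-line frame, rows as numbers
top, bottom = interlace(original)
print(len(top), len(bottom))           # 240 240 -- each field is half
print(weave(top, bottom) == original)  # True, but only for a still image
```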
"De-interlace" processing turned out to be surprisingly hard to do well. The two image halves were recorded by the video camera slightly separated in time, and thus they weren't really parts of the same picture but rather a double exposure of a slightly motion-blurred picture. And again for cost and bandwidth reasons (as well as compatibility with old black and white TV), another corner was cut in that the white to black gray scale which makes up the fine resolution detail of the picture was recorded at a higher resolution than the color data, which made it hard to figure out which color to use when reconstructing each part of the image from the two halves that were recorded slightly separated in time.
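The luma/chroma asymmetry mentioned above can be put in numbers. As an illustration (4:2:0-style subsampling, the scheme DVDs actually use; the point generalizes), brightness is sampled at every pixel while each of the two color planes is sampled once per 2x2 block:

```python
# Illustrative sample counts for 4:2:0-style chroma subsampling:
# full-resolution brightness, quarter-resolution color (two planes).

def sample_counts(width, height):
    luma = width * height                        # every pixel
    chroma = (width // 2) * (height // 2) * 2    # two quarter-res planes
    return luma, chroma

y, c = sample_counts(720, 480)
print(y, c)  # 345600 luma samples vs 172800 chroma samples
```

So the de-interlacer has half as much color information as brightness information to work with, which is the guessing game the post describes.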
And the problem is that the math in a scaler depends upon the quality of the de-interlacing. They are both supposed to assume, and be optimized for, "natural moving images" and if the de-interlacing is faulty then the resulting artifacts in the image will mess up the result from the scaler as well.
A typical conventional TV today will also have a "native" resolution even though it doesn't have a fixed set of pixels such as your DLP. For technical reasons that will often be a 540-progressive or 540p signal. Your DLP likely has a native resolution of 768p which the marketing people will describe as being optimized for 720p because that is one of the "standard" resolutions for high definition television. [EDIT: As noted above, this last statement is not correct for DLPs although it is quite often the case for other fixed pixel display technologies.]
Now here comes the content!
Standard definition television, whether it comes from your roof antenna, your cable service box or a satellite, is a 480i signal. This is true even if you have "digital" cable, or subscribe to satellite TV services which are inherently "digital". In fact such digital services often "compress" the signal to fit more channels into the available bandwidth which damages the signal. But since normal off air broadcasts and analog cable are subject to all sorts of other damage -- noisy signals, interference, ghosting, etc. -- the compressed digital signal still looks better which is why people pay for it. But the ORIGINAL signal is still 480i. It contains images which are the shape of a conventional TV -- a "4:3 aspect ratio".
DVDs today are ALSO recorded at 480i. ALL DVDs. Even the extra-expensive ones sold under various marketing names such as "Superbit". DVDs that say they are recorded in high definition mean that better quality equipment was used in digitizing the original film content. But the actual data that gets put on the DVD is 480i nonetheless. In fact it is "compressed" 480i like the digital cable or satellite signals. Too much compression (because the marketing guys want to save space for putting "extras" on the DVD) damages the image. The "Superbit" DVDs and their ilk use less compression -- putting a movie on 2 DVDs instead of 1 for instance -- and thus yield a less damaged image. Additional care is taken in the digitizing process as well. But the data on the DVD is still only 480i. DVDs are also designed for 4:3 shaped images. Various tricks are used to put wide screen (16:9) or "wider than wide screen" cinematic movie content on the DVDs. The most important of these is "anamorphic" enhancement which deliberately distorts the wide image so that it fits in the squarer shape without wasting pixels on the top and bottom as unused black bars. DVD players automatically sense and remove this built-in distortion and produce the original wide screen image with better fidelity because there was no such wasted resolution.
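The anamorphic payoff described above is simple arithmetic. Letterboxing a 16:9 picture inside the 4:3-shaped 720x480 DVD frame leaves only three quarters of the lines carrying picture; anamorphic encoding uses all of them:

```python
# Why anamorphic enhancement matters: lines of actual picture inside
# the 720x480 DVD frame, letterboxed vs anamorphic.

DVD_H = 480

# Letterboxed 16:9 inside 4:3: picture height fraction = (4/3)/(16/9) = 0.75
letterbox_lines = round(DVD_H * (4 / 3) / (16 / 9))
# Anamorphic: all 480 lines carry picture, squeezed horizontally
anamorphic_lines = DVD_H

print(letterbox_lines, anamorphic_lines)  # 360 vs 480 lines of picture
print(f"vertical resolution gained: {anamorphic_lines / letterbox_lines:.2f}x")
```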
So here comes the 480i signal from your source -- complete with color data coming in at only half that resolution or less. Your DLP now wants to scale it up to 720 [EDIT: or 768p for other common fixed pixel display technologies]. First the signal goes through the de-interlacer and comes out as 480p. Then the scaler extrapolates the extra resolution it needs -- inventing the intermediate pixels by kind of averaging the real pixels around them. And then the pixels light up. If you paused that image it would not look all that great -- certainly not as good as a photograph. But here comes the next 480i frame and the next and your eye does the final bit of processing by smoothing it all together so that you see an attractive moving image. The brain is remarkably good at making a silk purse from a sow's ear here, which is a good thing since otherwise TVs would never have been cheap enough to take off in the marketplace.
The next bit of fun and games came when DVD player manufacturers figured out they could sell players for more money if they put a de-interlacing circuit in the players. They could then send a 480p signal to the TV. They could do this because their cronies (another division of the same company) producing the TVs set them up to take a 480p signal as input, thus by-passing the internal de-interlacing circuit. The logic behind all this was that the DVDs being recorded also include special information on how to do a better job of de-interlacing, which the player could see as part of its processing to produce an analog TV image from the digital data recorded on the DVD. Thus, and this is the important bit, THE DVD PLAYER COULD DO A BETTER JOB OF DE-INTERLACING!
So now you had "progressive" DVD players that would take 480i DVDs -- the only kind that exist -- and produce 480p analog TV signals. The TVs still had their own de-interlacers to handle regular TV signals (which they needed to de-interlace to make their "line doublers" work).
So people would now pay for TWO de-interlacing circuits, plus the extra profit built into bleeding edge technology. And bleeding-edge it was. The idea was fine but the execution was often dreadful. The net result was that the de-interlacing in the DVD players was often WORSE than what the TVs could do on their own. For a variety of technical reasons, de-interlacing the wide range of DVD content out there is a tough job, and on top of that the engineering was often shoddy. So folks would spend the money for a progressive scan DVD player and then turn off the progressive option and use it like a 480i DVD player because that produced a better picture. The picture was "better" because some data was effectively being filtered out -- discarded -- and thus the TV's de-interlacer produced less noticeable glitches. In fact, many DVD players were having trouble just decoding the digital data on the DVD properly. If you'd like to see the sorts of problems that can occur, check out the remarkably detailed information in the DVD benchmarks section of the Secrets of Home Theater web site.
Nevertheless the progressive scan DVD players had one big advantage -- they sold like hotcakes. Most buyers assumed they MUST be better and thus any problems they were seeing must be due to faulty DVD content and not the players.
They sold so well that everybody else wanted to get in on the act. So now you had cable and satellite boxes and even tape players putting out 480p progressive signals, which was kind of silly unless the de-interlacer in your TV was particularly brain-dead -- in which case your TV probably had a whole bunch of other problems as well. The "better" picture that most people saw with such boxes was almost always just a result of them switching from "channel 3/4" or "composite" video cabling to "S-video" or "component" cabling, both of which include a higher bandwidth signal -- most notably an improved color signal.
Then came "high definition" TV.
From the standpoint of broadcast TV this was a big thing -- the real deal. This was content RECORDED in higher resolution, digital from inception, and in a wide screen aspect ratio. The industry settled on 720p and 1080i as the new broadcast standard resolutions -- not quite up to the eye-candy of the 1080p they were using for studio masters, but still looking much much better than conventional TV and at a cost the market could be made to bear.
TVs were manufactured that could handle these new signals -- originally expecting the signals to be converted to analog signals over component cables and then including true digital inputs (HDMI or DVI) and internal HDTV tuners. Of course since there was limited HDTV content out there, such sets were often used primarily to watch old, boring standard TV. So to help sell the sets, the marketing people thought up the bright idea of saying the sets would convert standard TV to high definition. This was basically a new way of marketing the old line-doubling scalers (nothing really new here) plus some real advantage arising since the display element and electronics were engineered for higher bandwidth signals in case they were ever fed a real HDTV signal -- that is the TVs were built more to studio monitor standards.
As the "HDTV" buzzword grew, DVD content producers wanted to cash in. Of course their DVDs were still only 480i, but never let the facts get in the way of a good marketing campaign. They discovered that 1080p professional digitizing equipment was being used to digitize the film content -- which was then down-scaled to 480i to be put on the DVD. And that was all they needed to know to call their new crop of DVDs "high definition" DVDs. This while they were compressing the heck out of them (damaging the image) to save space to put CD-ROM games on the same DVD. [I'm being a bit cynical here. Major studios have come a long way towards improving the transfer quality on their DVDs and are only occasionally tempted to let the movie be damaged because they know folks are more likely to buy "special" versions that include extras.]
Still with me? OK, we've got two chapters to go.
The next big deal is the "up-scaling" or "up-converting" (a misused marketing term) DVD players. The idea is to get people to buy new DVD players for their new HDTV-ready TVs by doing the same trick they did with the progressive players. I.e., let's put the scaler in the player!
Now remember the TV still needs its own scaler for standard def TV. And fixed pixel displays need a scaler for HDTV as well because HDTV comes in different broadcast resolutions which need to be converted to the "native" resolution of the display.
But heck, if you are going to buy an HDTV-ready TV and "high definition" DVD discs, then you certainly don't want to screw up the vibe by playing them on an old "progressive" player. You want a high-tech, high definition, "up-scaling" player! Just in case you've lost track here, the content on the DVDs is *STILL* only 480i in all this.
The drooling from the hardware guys was so great that it took them a while to hear the screams from the content guys. The folks who make their money selling DVD discs don't want high-res content coming out of the players because folks will just make copies of movies and not buy their discs! The HDTV broadcast networks face the same dilemma but they are already resigned to a business model that makes money by selling commercial time and subscriptions. The DVD guys need to schlep discs.
So the boys in building "A" got together with the boys in building "B" and came up with a solution. We'll allow up-scaling DVD players but only if the high res output is limited to digital connections that we can control with a copy protection scheme. The bosses in building "C" got big grins.
Well it turns out there was a digital cabling standard already in place called DVI. It was used to connect computers to monitors and since HDTV-ready TVs now were built to the high bandwidth and sync-rates needed by computers, many already had DVI inputs so that folks could use them as computer monitors as well.
All that was needed was to clamp a copy protection boot on that DVI input. This rejoices in the name of "HDCP".
An HDCP-compliant source device will refuse to make a digital connection to a display or intervening device which is not also HDCP-compliant. Analog connections will work regardless -- but only at conventional, lower resolutions.
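That connection rule can be sketched schematically (this is NOT the real HDCP protocol, which involves key exchange and link encryption; it just encodes the behavior the post describes, with illustrative resolutions):

```python
# Schematic sketch of the HDCP connection rule stated above -- not the
# actual protocol. Digital output needs compliance on both ends; analog
# always works, but only at lower resolution.

def negotiate(source_hdcp, sink_hdcp, connection):
    """Resolution the source will output, per the rule above."""
    if connection == "digital":
        return "1080i" if (source_hdcp and sink_hdcp) else "no signal"
    # analog connections work regardless, capped at lower resolution
    return "480p"

print(negotiate(True, True, "digital"))   # 1080i
print(negotiate(True, False, "digital"))  # no signal
print(negotiate(True, False, "analog"))   # 480p
```

Which is exactly the bind owners of older, non-HDCP DVI sets found themselves in.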
So voila you now had TVs with digital inputs and DVD players with fancy new, up-scaling, high-definition digital outputs. Of course there were some older TVs out there with DVI that was NOT HDCP compliant, but the industry had an answer to that. Buy a new TV. Or use your fancy new up-scaling DVD player just as if it were a previous generation progressive player by connecting it via analog cables at 480p resolution. Since it said "up-scaling" on the box the image must be better, right?
DVI had other problems as well due to its computer-based heritage. It didn't carry audio, for example. So new HDMI cabling was invented to deal with that and to remove some other confusions inherent in DVI. HDMI is, more or less, DVI plus digital audio plus HDCP, with connection standards and protocols more or less attuned to the home theater market.
But all that techy, geeky stuff aside, the big news was that these players could put out glorious 720p or 1080i signals from a DVD disc via those HDMI or DVI connections! "Glorious" here being a marketing term of art. The important thing to remember, the thing I have to keep stressing because I see that buying frenzy gleam coming into your eye again, is that THE CONTENT ON THE DVDs IS ONLY 480i AND NO SCHEME CAN INVENT DATA THAT ISN'T THERE IN THE FIRST PLACE!
The 480i data decoded from the DVD first gets de-interlaced to 480p. Then it gets scaled up either to 720p or to 1080p. If the desired output signal is 720p then you are done. If the desired output signal is 1080i then the signal gets RE-interlaced to 1080i. The TV receives a digital 720p or 1080i signal from the player and SCALES IT AGAIN to match the native resolution of the display.
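That chain is convoluted enough to be worth spelling out step by step. A sketch, tracking the resolutions as labels (the hypothetical 720p-native panel stands in for "whatever the display's native resolution is"):

```python
# The signal chain just described: what an up-scaling player does to
# the 480i disc data, and what the TV then does to the result.

def player_output(target):
    """Up-scaling DVD player: 480i -> 480p -> scale -> (re-interlace)."""
    chain = ["480i (on disc)", "480p (de-interlaced)"]
    if target == "720p":
        chain.append("720p (scaled)")
    elif target == "1080i":
        chain.append("1080p (scaled)")
        chain.append("1080i (re-interlaced)")
    return chain

def tv_display(signal, native):
    """The TV scales whatever arrives to its native panel resolution."""
    return signal + [f"{native} (scaled again by TV)"]

for step in tv_display(player_output("1080i"), "720p"):
    print(step)
```

Count the stages: de-interlace, scale, re-interlace, de-interlace again (in the TV), scale again. That's the "why are we doing this?" the next line asks.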
Urrh why are we doing this?
As the character says in "Shakespeare in Love", strangely enough it all works out.
In fact some, by no means all but some, of these new generation "up-scaling" DVD players produce a significantly better image than the previous generation of "progressive scan" players. Why this is so is due to several factors.
First the scaler in the player may be better than the scaler in the TV. The closer the player can get the data to the native resolution of the TV the less work the TV's scaler has to do.
Second, engineering continues to advance. Other factors than scaling are likely to be better in a good "up-scaling" player than in the previous generation players.
But the most important reasons why folks get good results from some of these new players is that the data stays entirely in the digital domain.
A player connected by analog cabling, such as S-video or component cabling, has to convert the digital data present on the DVD into analog TV signals. It usually does this as the very last thing it does -- in the video output stage -- because it is so useful to keep the signal in digital form for any other processing it needs to do first. The TV set receiving that analog signal ALSO wants to do processing of various forms -- which are done more cheaply, and for the most part better, with a digital signal. So the FIRST thing the TV does is convert the analog signal BACK to digital form.
Now these dual conversions introduce their own problems, but on top of that the conversions usually involve filtering of one form or another so that the signals work well across the widest range of source content -- some of which can be pretty crappy.
But an up-scaling player sends a digital signal to the TV which just leaves it in digital form. Thus no conversion noise and no filtering.
Given all that, it would seem natural that the best arrangement would be to use a digital connection for a *480* signal, and just leave it to the TV to do whatever scaling is needed -- once. Curiously, that is not often the best way to hook things up. HDTV-ready TVs are optimized for 1080i broadcast signals because that's how they are often judged in stores. That, plus any advantage that comes from having a better scaler in the player, suggests that having the player upscale the DVD data and then feed that to the TV will give a better result even though a second scaling pass may be needed. There are additional advantages if you watch movies filmed in the older 4:3 shape, in that the player can put pillar boxes around the movie content without loss of movie resolution because the player is sending a higher resolution signal.
The bottom line is that despite the best efforts of the marketing guys to pull a fast one here, many of the better up-scaling players DO INDEED produce a significantly better image on many HDTV-ready TVs. Some of that is due to the digital connection, but some is also due to the combination of de-interlacing and scaling technologies working well to produce a signal the TV happens to be optimized to display. Combine that with other improvements naturally occurring with each product cycle and you get a better player.
But just as with the progressive players, there are some up-scaling players out there which are nothing but hype. Engineered by the school of shoddy, they are just not worth the money. And there are undoubtedly folks who will buy up-scaling players and find they end up preferring the signal they get hooking the thing up via S-video at 480 resolution, simply because their TV does a better job doing what they paid to have the DVD player do.
For the record, I use a Pioneer Elite 59avi DVD player connected HDMI to DVI and sending a 1080i signal to a Fujitsu P50 (30 series) plasma. And I just love it. It's an up-scaling combo that rocks. [For the techy geekies, yes I meant 1080i and not 720p. Go figure...]
And that's the world of today.
But if you are willing to hang in there for another year, or perhaps two years, the world as we know it will change.
That's because there are two competing technologies lining up to fight the battle to become the new -- TA DAH! -- high resolution DVD format. HD-DVD and Blu-ray differ from current DVD technology in that the data on the disc is actually encoded in high resolution. Such discs won't be compatible with current DVD players. You'll need to buy a new DVD player that truly deserves the moniker "high definition" -- applause from the boys in building "C".
Such a player will output a 720p or 1080i signal -- possibly even a 1080p signal -- but that signal will reflect much more what's actually on the disc and much less the art of the scaler engineer.
Those new players will undoubtedly play conventional DVD discs as well, although they may cut corners on doing that job (despite the high price of the players) since they will really be all about optimizing things so that you'll go and buy new discs.
Which you won't be able to do until there ARE new discs. It's no surprise that the major powers in the battle over which of these two formats will win are the guys who own the movies. Until a sufficient number of new discs hit the stores any such new players will be expensive curiosities.
And of course if you buy a player for the format that loses the war, then you are screwed.
But again, this is for the future. Up-scaling players for conventional DVD discs are the hot item right now. They are at price points from a few hundred bucks to a few thousand. The $1K price point is a real sweet spot right now, with big-name new-release players coming out closer to $2K, but you can get a fine player for a few hundred if you do your homework. You can also get some real crap, so DO please do your homework.
Hope this helps, and hope you enjoy your DLP when you get it. Just remember that 6 months from now today's electronics will be obsolete, so buy what works for you and enjoy it without angsting too much about what's around the corner.
Last edited by Bob Pariseau : 02-25-05 at 12:40 PM.
I'm a complete beginner in this area, and I'm going to invest in a serious home theater/stereo setup for the first time once we finally find a new house, hopefully this year.
I've trawled your forum and the AVS forum and more for info, so that I at least won't be completely clueless when I walk into the store. And no, even though I'm a beginner, I'm not going to stop by Elkjøp and buy some Eltax junk. :-P
My problem is that the more I read, the more confused I get.
I hope this thread can ease my confusion somewhat.
10-23-2005, 10:56 #3
Jeg leste igjennom den første meldinga du sakset inn (den andre ble for lang for meg).
Det er litt vill vest i HD markedet. 1000-lappene sitter løst hos husfar (og husmor) når det endelig skal kjøpes inn flatskjerm som selvfølgelig skal ha "HD". Men utstyret kommer før innholdet. Det finnes HD materiale, men du må være bevisst for å finne det (Canal+ parabol, D-VHS filmer fra USA, en håndfull wmv-hd filmer som spilles av fra PC).
Filmer lages i et progressivt 24fps filmformat. Oppløsninga er vanskelig å måle, men ideelt sett kunne man ønsket seg veldig høy oppløsning.
If you consider 1080@24p, which might be the dream format, it actually needs less bandwidth than 1080@60i, which I believe is the standard format. With an ideal interlace/deinterlace/telecine process you could imagine going 24p -> 60i -> 24p without loss. The display then needs to be able to show a multiple of 24 Hz (I believe serious flat screens can adjust their refresh rate; see also my question about this on avsforum: http://www.avsforum.com/avs-vb/showthread.php?t=593534 )
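The lossless 24p -> 60i -> 24p round trip mentioned above can be sketched with a toy model of 3:2 pulldown, where frames are just labels rather than real video data (the function names and the list-of-labels representation are my own illustration, not anything standardized):

```python
# Toy sketch of 3:2 pulldown telecine (24p -> 60i) and its inverse.
# Frames are plain labels; a "field" is (frame_label, field_parity).

def pulldown_3_2(frames):
    """Telecine: each pair of 24p frames becomes 5 interlaced fields (2+3)."""
    fields = []
    for i, frame in enumerate(frames):
        top, bottom = (frame, "top"), (frame, "bottom")
        if i % 2 == 0:
            fields += [top, bottom]        # even frames contribute 2 fields
        else:
            fields += [top, bottom, top]   # odd frames contribute 3 fields
    return fields                          # 24 frames -> 60 fields

def inverse_telecine(fields):
    """Recover the original 24p frames by stepping the 2:3 cadence."""
    frames, i, step = [], 0, 0
    while i < len(fields):
        frames.append(fields[i][0])        # first field of a group names its frame
        i += (2, 3)[step % 2]              # advance 2, then 3, then 2, ...
        step += 1
    return frames

one_second = [f"frame{n}" for n in range(24)]
assert inverse_telecine(pulldown_3_2(one_second)) == one_second  # lossless
```

Note that this only works losslessly because the cadence is known and regular; real deinterlacers have to detect the cadence from the fields themselves, which is where quality differences between players appear.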
I've been told that NTSC DVDs with film content contain a 480@30p video stream that gets interlaced in the DVD player. In the same vein, someone said that "60Hz" HD-DVD/BD discs would probably contain 30 fps progressive content that is then processed at the output to produce an interlaced picture at the right frame rate. In Europe with PAL it may be a bit different; it sounds like players there could be built to output 24/48/72 Hz progressive.
10-23-2005, 11:55 #4
A couple of things: DLP is getting a 1920x1080 chip, so eventually you'll be able to buy one with that. And the deinterlacing/inverse telecine of 1080i will no doubt get better and cheaper over the years. There is a range of 720p products out there, and 1080 products are starting to appear in Norway too. We'll see whether the 1080 products improve over time.
10-23-2005, 12:13 #5
On the PS3 you're supposed to be able to play games in 1080p over HDMI. Will the machine also be able to upscale DVD films to the same resolution? (I take it for granted that it can play DVDs, since Sony has stated that all BD players will be backwards compatible with DVD.)
10-23-2005, 17:02 #6
It's said that DLP's wobulation technique isn't "real" HD. But what matters is the result, not the path. If the eye perceives it as 1920 x 1080, I frankly couldn't care less whether it has been through interlacing, been shown on a DLP projector without "real colors" (since it time-multiplexes the color information from a single chip), an interlaced CRT, or whatever...
10-23-2005, 21:26 #7
I think the author of the article shoots himself in the foot a bit. Today's HDTV broadcasts are, as mentioned, 1080i. To show an interlaced signal on an LCD/plasma flat screen it has to be deinterlaced, and that is exactly where the ability to display full 1080p matters. If the screen can't show true 1080@60p, you lose half the picture information in scenes with little motion.
12-07-2005, 23:17 #8
Why did this thread die, anyway?
I've also thought a lot about the difference between (and the usefulness of) 720p, 1080i and 1080p.
I've read page after page about this, and (feel free to correct me if I'm wrong, but please use more than one sentence) the way it looks to me, the choice between 720p and 1080i doesn't matter much for quality (with fast-moving pictures, not stills) because of the progressive scan vs interlacing trade-off, whereas 1080p will give considerably higher picture quality.
I thought this was very well written on 720p and 1080i/p: http://www.hometoys.com/htinews/oct0...phtg/1080p.htm
But can anyone say something more concrete about screen size and viewing distance before there is actually any noticeable difference (for "ordinary" people, not crazy AV gurus)? How far from the TV do you have to sit to see a difference in picture quality between, say, a 37" 720p screen and a 37" 1080p one? What about 42", 50", and so on?
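One way to put rough numbers on this question is the standard angular-resolution estimate: assume the common 1-arcminute figure for 20/20 visual acuity and compute the distance beyond which a single pixel can no longer be resolved. The acuity constant, the 16:9 aspect ratio, and the function name are assumptions for illustration, and real-world visibility also depends on content and contrast:

```python
import math

ACUITY_RAD = math.radians(1 / 60)  # ~1 arcminute, a common 20/20 acuity figure

def max_useful_distance_m(diagonal_in, horizontal_pixels, aspect=16 / 9):
    """Distance (metres) beyond which individual pixels blur together."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # screen width
    pitch_in = width_in / horizontal_pixels                  # pixel pitch
    # Small-angle: pixel is unresolvable once it subtends < 1 arcminute.
    return pitch_in * 0.0254 / math.tan(ACUITY_RAD)

for size in (37, 42, 50):
    d720 = max_useful_distance_m(size, 1280)
    d1080 = max_useful_distance_m(size, 1920)
    print(f'{size}": 720p pixels vanish beyond ~{d720:.1f} m, '
          f'1080p beyond ~{d1080:.1f} m')
```

For a 37" screen this comes out to roughly 2.2 m for 720p and 1.5 m for 1080p, i.e. sitting farther than about 2.2 m from a 37" set, a 1-arcminute eye cannot resolve even 720p pixels, so 1080p would only pay off closer than that (or on larger screens).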
12-07-2005, 23:51 #9
heavy stuff, this
your link contradicts what's at the top of the page..
it claims that 1080p combines the best of both worlds, while the article at the top claims 1080p is mostly a gimmick..?
still, it's interesting to see that 1080i carries a higher pixel count than the 720p standard; I wasn't aware of that.
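The pixel-count comparison is just arithmetic, shown here as a quick sketch (the variable names are mine): one 1080i field holds slightly more pixels than an entire 720p frame, even though each field only covers half the vertical lines.

```python
# Raw pixel counts behind "1080i has more pixels than 720p".
field_1080i = 1920 * 1080 // 2   # one interlaced field: 1,036,800 pixels
frame_720p  = 1280 * 720         # one progressive frame:  921,600 pixels
frame_1080p = 1920 * 1080        # one 1080p frame:      2,073,600 pixels

print(field_1080i, frame_720p, frame_1080p)
# At 60 fields/s vs 60 frames/s, 1080i also wins on raw pixel rate:
rate_1080i = field_1080i * 60    # 62,208,000 pixels/s
rate_720p  = frame_720p * 60     # 55,296,000 pixels/s
```

The catch, of course, is that the two fields of a 1080i frame are captured 1/60 s apart, so with motion you never get the full 2,073,600-pixel frame that 1080p delivers.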
but purely theoretically, is it enough for a TV to have 1920x1080 resolution to display 1080p, or does it also place demands on the scaling components and so on..?
12-07-2005, 23:55 #10
tried 1080p from the Marantz DV9600 into Philips' 37" 9830. It couldn't handle 1080p.... a 1920x1080 panel doesn't help much then..
12-08-2005, 00:10 #11
Another thing to take into account is connecting a PC, PS3 and the like. 1080p60 will probably be the preferred signal type there, so I'm holding off on a new TV until I can get a ~40" LCD with 1920x1080 pixels that handles 1080p60, at least on the HDMI input(s). Throw in a contrast ratio of at least 5000:1 and a price under 10k while you're at it.
12-08-2005, 01:04 #12 Originally posted by jfinneru
12-08-2005, 05:49 #13
I haven't had the energy to read all of this, but I can offer some opinions anyway...
For a transition period (three years?) I think people will be perfectly happy with a TV that "only" handles 720p. Upscaling from 576 to 720 (768) is not as drastic an operation as upscaling to 1080, so I imagine the picture from a DVD player will look better on a 720 screen than on a 1080 screen. Many people will still mostly be watching DVDs in the time to come.
In addition, you'll still be able to play a lot of HD material, and in the cases where you actually get hold of 1080 material, the downscaling to 720 will probably be very good, because removing pixels is an easier job than inventing artificial ones based on what's there. Of course, in those cases 1080 material would look a bit better still on a 1080 screen, but still. If you're sitting three or four metres away (talking about TVs now, not projectors), maybe you won't notice those tiny details anyway?
Well, that's just some speculation on my part.
12-08-2005, 08:09 #14
1080p is probably fine for video once we get Blu-ray or HD DVD, but for TV the information has to be transmitted, and there is considerably more information to transmit in 1080p than in 720p.
I read a piece in Teknisk Ukeblad written by a senior engineer at NRK. He was very critical of the plans for building out the digital terrestrial network. In his view, the whole purpose was to carry more channels (= more money for the provider), not better quality. That probably means it will be a veeeery long time before we see 1080p TV broadcasts. Much the same will likely apply to satellite-delivered signals. I'd assume money rules there too.
12-08-2005, 10:04 #15
Money, money... In a perfect market the providers would offer exactly the product customers want, wouldn't they? We wouldn't have TV shopping and chat TV if it weren't for the fact that some people actually use those things.
In the same way, HDTV will probably arrive when the market is willing to pay the price. Because it costs roughly a factor of 4 in bandwidth.
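That factor-of-4 figure can be sanity-checked with raw (uncompressed) pixel rates. Which formats are being compared is my assumption, and real broadcast bandwidth depends heavily on the codec, so treat these ratios as only a rough guide:

```python
# Back-of-envelope check of the "factor of ~4 in bandwidth" claim,
# comparing raw pixel rates against PAL standard definition.
sd_pal   = 720 * 576 * 25    # PAL SD: 25 full frames/s of pixels
hd_720p  = 1280 * 720 * 50   # 720p50
hd_1080i = 1920 * 540 * 60   # 1080i60: 60 fields/s of 540 lines each

print(f"720p50  vs SD: {hd_720p / sd_pal:.1f}x")   # ~4.4x
print(f"1080i60 vs SD: {hd_1080i / sd_pal:.1f}x")  # 6.0x
```

So a factor of about 4 matches 720p50 against PAL SD; 1080i comes out closer to 6x in raw pixels, though compression efficiency narrows these gaps in practice.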
12-08-2005, 10:10 #16 Originally posted by John P.
My 1680x1050 PC monitor looks very good with 720x576 video upscaled.
It's an open question how large a part of the visual field the screen has to cover before 1080 beats 720. I lean on the fact that with my PC monitor (mentioned above) I can take in the whole screen at once while still benefiting from the pixel size. So I at least have enough "bandwidth" for high resolutions to pay off. Whether I have enough optical acuity for HD to pay off at a 5-metre distance on a 32" screen, I'll leave unsaid.
12-08-2005, 17:35 #17 Originally posted by knutinh
The only thing I'm really basing my speculation on is "upscaling" of images in e.g. Photoshop. If you take a tiny image and blow it up a lot, the result isn't pretty. Quite blurry.
12-08-2005, 19:27 #18 Originally posted by John P.
If you scale an image up to 720p or 1080p and both screens are physically the same size, the picture should in principle look the same.
The advantage of 1080p is that each pixel is smaller, so you can use a larger screen before the pixel structure becomes visible.
12-08-2005, 21:40 #19 Originally posted by Ingeniør'n
The crucial point for quality in the new terrestrial network will be how hard they choose to compress the signals. IMO, the whole point of HDTV disappears if you wreck the quality by compressing the signal to death.
12-08-2005, 22:18 #20 Originally posted by no12