PDA

View Full Version : Upconvert DVD Players


urluckyday
04-10-2007, 04:22 PM
So, with that recent poll "Blu-Ray vs. HD-DVD," I had a question. I've been seeing these in recent Best Buy ads. They're called "Upconvert DVD Players." They supposedly upscale your current DVDs to something like 1080i (or another specified resolution). I was wondering if these things really work well, or if they're crap. I'm seriously considering getting one to watch things like Star Wars on an HDTV without any loss of quality. Anyone know?

Astrotoy7
04-11-2007, 12:00 PM
Don't buy it!!! Upconvert is absolute rubbish!

DVDs are authored as .vob compilations. They are MPEG-2 video with PCM or AC3 (etc.) audio streams. Their max resolution is 720x576 (PAL; 720x480 for NTSC). A dual-layered DVD can fit a high-bitrate (higher picture quality) version of a film at that rez for two and a bit hours. The more you compress and lower the bitrate, the more you can fit on.
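To see why "two and a bit hours" is about right, here's a quick back-of-the-envelope in Python. The 8.5 GB disc capacity and ~8 Mbit/s stream are my own assumed round numbers, not figures from this thread:

```python
# Back-of-the-envelope: runtime of a film on a dual-layer DVD
# at a high MPEG-2 bitrate. Both constants are assumptions.
DL_DVD_BYTES = 8.5e9   # dual-layer DVD-9 capacity, decimal bytes
VIDEO_BITRATE = 8e6    # ~8 Mbit/s, a fairly high MPEG-2 rate

seconds = DL_DVD_BYTES * 8 / VIDEO_BITRATE
print(f"{seconds / 3600:.1f} hours")  # roughly 2.4 hours
```

Drop the bitrate and the runtime stretches accordingly, which is exactly the compression trade-off described above.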

1080p is 1920x1080. Far and beyond ****ty DVD rez. HD DVD and Blu-ray are far better than the standard DVD format, but simply due to its massive capacity, Blu-ray is a technology that is ahead of the market trend (not many of us have Ultra Hi-Def monitors). However, thinking about numbers and what the future holds, here are some great figures (courtesy wiki and elsewhere). The way things sit at the moment, the capacity of Blu-ray will be particularly advantageous for gaming devs (for the PS3, obviously) and the PC, eventually :p

Ultra Hi-Def Video (UHDV) currently has a broad definition, and encompasses anything over 1080p, all the way up to 4320p !!

*Eighteen minutes of uncompressed UHDV footage consumes 3.5 terabytes of data, and one minute of footage consumes 194 gigabytes (a 2-hour full-length movie would use roughly 23 terabytes of storage).

*A 12 cm Holographic Versatile Disc at 3 micrometer separation of different colored tracks (with a capacity of 3.9 TB) would be able to store roughly 11 hours of MPEG-2 or 22 hours of H264 or VC1 compressed UHDV, compared to the 18 and a half minutes of uncompressed UHDV.

*An eight-layer Blu-ray disc (with a capacity of 200 GB) would be able to store approximately 36 minutes of MPEG-2 compressed UHDV, or 72 minutes of H.264 or VC-1 compressed UHDV, compared to the one minute of uncompressed UHDV.

*A 50 TB protein-coated disc (PCD) would be able to hold over 94 hours of H.264/AVC/VC-1 compressed UHDV, but generally that would be unnecessary, for a 50 TB PCD would be able to hold four hours of uncompressed UHDV. - (PCDs are an experimental technology pioneered by Harvard University).
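Those figures are easy to sanity-check with a couple of lines of Python. The only input is the 194 GB/min uncompressed rate quoted above; everything else is arithmetic:

```python
# Sanity-check the uncompressed UHDV storage figures quoted above.
GB_PER_MIN = 194  # uncompressed UHDV, gigabytes per minute (figure from above)

def uncompressed_tb(minutes):
    """Terabytes of uncompressed UHDV for a given runtime."""
    return minutes * GB_PER_MIN / 1000

def runtime_min(capacity_gb):
    """Minutes of uncompressed UHDV that fit in a given capacity."""
    return capacity_gb / GB_PER_MIN

print(uncompressed_tb(18))   # ~3.5 TB for eighteen minutes
print(runtime_min(200))      # ~1 minute on an 8-layer Blu-ray
print(runtime_min(50_000))   # ~258 minutes, i.e. about 4 hours on a 50 TB PCD
```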

thus ends todays lesson :p

mtfbwya

Negative Sun
04-11-2007, 04:02 PM
It sounds like selling hot air in the desert to me...Like Astro explained.

urluckyday
04-11-2007, 05:40 PM
Well, that was a nice lesson and all Astro, but it didn't really answer my question too well...like...I'm asking if they really work or not...basically looking for firsthand experiences...but you obviously did your research ;)

Negative Sun
04-11-2007, 06:00 PM
That did answer your question, actually: there's no way you won't experience a loss of quality, since the DVD format is too compressed to fill a 1080i HDTV's resolution. Therefore, any piece of technology that claims it can turn your regular DVDs into HD DVDs is total rubbish ;)

urluckyday
04-11-2007, 08:13 PM
Well, isn't it just an HDMI interface? It's basically like the Nintendo Wii HDMI deal. Like it seems like all the upconvert player does is uncompress it, so that it looks better on an HDTV. (Oh and btw...this is just what I've read...I'm definitely no expert)

urluckyday
04-11-2007, 08:16 PM
Just found this...kinda goes both ways in the argument I guess...

http://www.tech-faq.com/hd-upconverter.shtml

Q
04-11-2007, 11:38 PM
This type of thing has been around for a few years now. The first up-converting DVD players that I can remember up-converted the signal to 720p, and that was at least 3-4 years ago. If I'm not mistaken, several if not most current-model HDTVs have this type of tech built in, so I doubt that buying a special DVD player would be necessary if your HDTV already has it.

Negative Sun
04-12-2007, 06:35 AM
Well, that makes sense, though IMO I wouldn't get an HDTV unless I could watch HD DVDs or other HD media on it. And if you ask me, I'm not really too bothered about the whole HD hype altogether; I still play my video games on my "****ty" old-school 15" TV, and it's fine with me. As long as the game's good, I'm not too bothered about whether it's in 480p or 480i.

A nice big screen is cool for the living room, but that's about it.

I'll probably switch to HD when it's less expensive and more mainstream (by then a new generation of TVs will already exist no doubt)

urluckyday
04-12-2007, 08:51 PM
Negative Sun...you'd be a good Nintendo representative...lol...and you're right too.

Astrotoy7
04-13-2007, 05:57 AM
...but you obviously did your research ;)

lolz...I build home theater PCs as a second job now, when I go to bed I count 1080 "i"s and "p"s to get myself to sleep :p

All upconvert does is have a GPU on board that adjusts the output resolution and interlace settings. It doesn't and CANNOT do anything to the datastream that has been BURNT INTO A DISC WITH A FREAKIN LASER !!! :p

As Qliveur mentioned, monitors and most HDTVs employ progressive scan now, which goes some way toward reducing video artifacts (as opposed to interlacing), but there's only so much you can do with a format stuck at 720x576.
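To put it another way: upscaling just resamples the pixels that are already on the disc. A toy nearest-neighbour sketch (hypothetical code, not what any particular player actually runs) makes the point that no new detail can appear:

```python
def upscale_nearest(frame, new_w, new_h):
    """Nearest-neighbour upscale: every output pixel is copied from
    an existing input pixel, so no new detail is created."""
    old_h, old_w = len(frame), len(frame[0])
    return [[frame[y * old_h // new_h][x * old_w // new_w]
             for x in range(new_w)]
            for y in range(new_h)]

# A tiny 2x2 "DVD frame" blown up to 4x4: only the original
# four pixel values ever appear in the output.
small = [[10, 20],
         [30, 40]]
big = upscale_nearest(small, 4, 4)
assert {v for row in big for v in row} == {10, 20, 30, 40}
```

Real upscalers use fancier filtering (bilinear, bicubic, edge-adaptive), but the principle is the same: they interpolate between existing samples, they don't recover detail that was never encoded.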

Please legowar! Do old uncle astro a favour and put this out of your mind. You're much better off getting a nice monitor or HDTV :) I was about to say LCD but noticed "Q" is around :p

summary:
upconvert = bollocks.

mtfbwya

urluckyday
04-13-2007, 02:53 PM
I have an HDTV...I just don't wanna spend like $500-$1000 on an HD player...and I also don't wanna have to pay 30 bucks a pop to take advantage of it by buying one of the twenty available movies out there...that's the reason I ask.

Negative Sun
04-13-2007, 04:50 PM
lolz...I build home theater PCs as a second job now, when I go to bed I count 1080 "i"s and "p"s to get myself to sleep :p
What's better btw, "i"s or "p"s?

urluckyday
04-13-2007, 04:56 PM
My friend told me this:
"the "i" stands for interlaced which means the horizontal lines of pixels are drawn on the screen every other frame, the "p" stands for progressive which is where the entire picture is displayed every frame"

I really had no idea what the difference was, but there you go.

Astrotoy7
04-16-2007, 08:48 AM
My friend told me this:
"the "i" stands for interlaced which means the horizontal lines of pixels are drawn on the screen every other frame, the "p" stands for progressive which is where the entire picture is displayed every frame"

I really had no idea what the difference was, but there you go.

Great description, legowar :) Basically, the interlace (horizontal) lines are there to reduce broadcast bandwidth. Post-processing can take the interlace lines out, but this doesn't make it full HD quality if the content isn't originally broadcast or encoded as 1080p.

astro

Q
04-17-2007, 01:24 AM
I was about to say LCD but noticed "Q" is around

Hey! I admitted defeat in our "CRT vs. LCD" debate.:xp: I'm now in search of a good deal on an AS-IPS monitor, if you must know.

Besides, what other choice do we have? If you want a new 1080p TV or monitor, you'll absolutely have to get an LCD, since CRTs are no longer produced, 1080p plasmas have yet to appear (and will be outrageously expensive when/if they do), and DLPs (as well as any projection set, for that matter) simply suck.

milo
04-17-2007, 09:36 AM
My friend told me this:
"the "i" stands for interlaced which means the horizontal lines of pixels are drawn on the screen every other frame, the "p" stands for progressive which is where the entire picture is displayed every frame"

I really had no idea what the difference was, but there you go.
I think progressive scanning gives you more frames per second, but I'm not positive. Anyone care to confirm or correct that?

Ray Jones
04-17-2007, 10:21 AM
Interlaced mode gives you only half of the lines per picture/frame, e.g. the odd lines in pictures 1, 3, 5 and so on, and the even lines in pictures 2, 4, 6 and so on.

What happens is basically like this:


interlace mode
         Frame 1  Frame 2  Frame 3  Frame 4  Frame 5  Frame 6  WHATUC
Line 1      1                 1                 1                 1
Line 2               2                 2                 2        2
Line 3      3                 3                 3                 3
Line 4               4                 4                 4        4
Line 5      5                 5                 5                 5
Line 6               6                 6                 6        6


non-interlace mode
         Frame 1  Frame 2  Frame 3  Frame 4  Frame 5  Frame 6  WHATUC
Line 1      1        1        1        1        1        1        1
Line 2      2        2        2        2        2        2        2
Line 3      3        3        3        3        3        3        3
Line 4      4        4        4        4        4        4        4
Line 5      5        5        5        5        5        5        5
Line 6      6        6        6        6        6        6        6

(WHATUC = what you see: over time, every line reaches the screen in both modes.)


Interlace mode produces a less stable picture because it only displays half-pictures of the real image at once. The aim is to reduce bandwidth, because only half the information (of the whole image) is needed per displayed frame. Progressive mode produces "whole" pictures per frame and is thus almost flicker-free and easier on the eyes in general. You can have a REAL non-interlaced mode, where every frame contains the full image, or a "FAKE" non-interlaced mode, which in simple terms just takes two interlaced frames and puts them together into one full image. A technology like that is used in 100Hz TVs, for instance.

Anyway, the number of *real* frames displayed per second does not depend on interlaced vs. non-interlaced... m'kay? :)
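The tables above can be simulated in a few lines of Python (a toy sketch): odd frames carry the odd lines, even frames the even lines, and weaving two consecutive fields rebuilds the full picture, which is exactly the "FAKE" non-interlaced trick described above.

```python
def field(full_frame, n):
    """Interlaced field n: odd-numbered frames carry the odd lines,
    even-numbered frames the even lines; missing lines are None."""
    keep_odd = (n % 2 == 1)
    return [line if (i % 2 == 0) == keep_odd else None
            for i, line in enumerate(full_frame)]

def weave(field_a, field_b):
    """'Fake' progressive mode: merge two consecutive fields
    back into one full picture."""
    return [a if a is not None else b for a, b in zip(field_a, field_b)]

picture = ["line1", "line2", "line3", "line4", "line5", "line6"]
f1, f2 = field(picture, 1), field(picture, 2)
assert f1 == ["line1", None, "line3", None, "line5", None]
assert weave(f1, f2) == picture
```

The catch with real interlaced video is that the two fields are captured at different instants, so weaving them produces combing artifacts on motion, which is why deinterlacing is harder than this sketch suggests.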