|
HD-DVD and Blu-ray will only work with HDMI or DVI!!!
|
|
diabolos
Suspended due to non-functional email address
|
15. June 2006 @ 03:34 |
Link to this message
|
Yes, but it should be worded a little better.
All HD-DVD and Blu-ray movies are controlled by the movie studios. If they decide to turn on the ICT (Image Constraint Token), then any HD-capable connection that doesn't support HDCP (i.e., Component Video and DVI-D connections without HDCP support) will be downscaled to 540p (960×540).
But for now it seems the movie studios are going to play it smart and keep the ICT turned off, so early adopters who bought TVs that don't have HDCP-compliant connections won't be disenfranchised.
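To spell the rule out, here's a rough sketch of the decision (just my own illustration in Python, not any player's actual firmware; the 960×540 figure is the constrained resolution mentioned above):
```python
# Rough illustration of the ICT rule -- not real player code.
# A player only constrains output when the disc sets the ICT flag
# AND the video is leaving over a non-HDCP path.

FULL_HD = (1920, 1080)
CONSTRAINED = (960, 540)   # "540p" constrained image

def output_resolution(ict_flag: bool, link_has_hdcp: bool) -> tuple:
    """Return the resolution the player is allowed to send."""
    if ict_flag and not link_has_hdcp:
        return CONSTRAINED       # downscaled for component / non-HDCP DVI
    return FULL_HD               # HDCP-protected HDMI/DVI gets full HD

# Examples:
print(output_resolution(ict_flag=False, link_has_hdcp=False))  # (1920, 1080) -- ICT off today
print(output_resolution(ict_flag=True,  link_has_hdcp=False))  # (960, 540)
print(output_resolution(ict_flag=True,  link_has_hdcp=True))   # (1920, 1080)
```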
Ced
|
Junior Member
1 product review
|
15. June 2006 @ 04:39 |
Link to this message
|
Ced,
I thought I was beginning to understand HDTV and HD-DVD until you said to remember that 1080p and 1080i have the same resolution. I thought 1080p was the best of the best? Is it what your TV is capable of that makes the difference?
My TV is the Sony KF-E50A10 and is capable of 480p, 1080i and 720p.
Also, someone stated regarding the new Toshiba HD-DVD player that the output resolution should be set at 1080i regardless of your display resolution, and here's why:
1.) For a 1080i display: if you set the player res. to 1080i, the following takes place:
1080p > 1080i to your 1080i display
2.) For a 720p display: if you set the player res. to 720p, the following takes place:
1080p > 540p > 720p to your 720p display
3.) For a 720p display: if you set the player res. to 1080i, the following takes place:
1080p > 1080i to your TV, and the TV converts the 1080i to display 720p
Conclusion: by setting the res. (of the player) to 1080i, you lose less data than upconverting from 540p.
So here's my question, assuming that the above is correct: which setting will give me the best PQ, #1 or #3? I would assume #1 because there's the least up- or down-converting.
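Here's my rough pixel-count math for those three chains (just counting pixels at each step, ignoring how good each scaler in the chain actually is):
```python
# Pixel count at each step of the three chains above (illustration only;
# real picture quality also depends on the quality of each scaler).

def px(w, h):
    return w * h

chains = {
    "#1: 1080p source > 1080i out > 1080i display":      [(1920, 1080), (1920, 1080)],
    "#2: 1080p source > 540p > 720p display":            [(1920, 1080), (960, 540), (1280, 720)],
    "#3: 1080p source > 1080i out > TV scales to 720p":  [(1920, 1080), (1920, 1080), (1280, 720)],
}

for name, steps in chains.items():
    print(name)
    print("   " + " -> ".join(f"{w}x{h} ({px(w, h):,} px)" for w, h in steps))

# Chain #2 bottoms out at 960x540 = 518,400 pixels before scaling back up,
# so roughly half the detail is thrown away at the intermediate step.
# Chains #1 and #3 keep full 1920x1080 material until the final conversion.
```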
Thank You
DVDDIVA
Sony HDTV KF-E50A10
PS3 ( 60gb)
Toshiba DVD Player SD-4980SC ( upconverting)
|
diabolos
Suspended due to non-functional email address
|
15. June 2006 @ 05:25 |
Link to this message
|
A couple of things...
Quote: My TV is the Sony KF-E50A10 and is capable of 480p, 1080i and 720p.
Your TV is only capable of 720p (1280x720 progressive scan). It can accept 480i/p, 720p, or 1080i, but it can only display 720p because its native resolution is 1280x720.
See the search link at the bottom for more info on fixed-pixel displays.
A reviewer from Sound & Vision magazine suggests keeping the Toshiba/RCA HD-DVD player set to 1080i output because the player doesn't scale 1080p to 720p very well. Meaning your TV will probably do a better job downconverting 1080i to 720p than the HD-DVD player would at downconverting 1080p to 720p.
As far as video processing goes, it depends on how good your video processors are. Cheap video processors do that little 540p trick because it uses fewer resources (it's cheaper to produce and implement). Good video processing doesn't involve converting to 540; it goes straight from 1080 to 720 (or whatever the native resolution is).
Quote: I thought 1080p was the best of the best?
It is, at 60fps. But at the movie frame rate of 24fps, 1080i (1920x1080 interlaced) can perform just as well, because all of the original progressive frames can be reconstructed from the interlaced fields by reversing the 3:2 pull-down that put them there (inverse telecine).
For more on that see this page...
http://forums.afterdawn.com/thread_view.cfm/344032
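If the 3:2 pull-down idea is hard to picture, here's a little toy example (simplified on purpose: it ignores field-order details and real-world cadence breaks, and just shows that no frames are lost):
```python
# Simplified 3:2 pulldown: 4 film frames (24 fps) become 10 video fields
# (60 fields/s, i.e. 1080i). Because every field is copied intact from some
# progressive frame, a good deinterlacer can reverse the pattern (inverse
# telecine) and get the original 24p frames back with nothing lost.

film_frames = ["A", "B", "C", "D"]          # one 1/6-second chunk of film

def pulldown_32(frames):
    """Return the interlaced field sequence (frame, field parity) for 3:2 pulldown."""
    fields = []
    for i, f in enumerate(frames):
        copies = 3 if i % 2 == 0 else 2      # alternate 3 fields, 2 fields
        for _ in range(copies):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((f, parity))
    return fields

def inverse_telecine(fields):
    """Recover the original progressive frames from the field sequence."""
    seen, frames = set(), []
    for f, _ in fields:
        if f not in seen:
            seen.add(f)
            frames.append(f)
    return frames

fields = pulldown_32(film_frames)
print(fields)                     # 10 fields: A,A,A,B,B,C,C,C,D,D
print(inverse_telecine(fields))   # ['A', 'B', 'C', 'D'] -- the full 24p frames
```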
Ced
|
Senior Member
|
15. June 2006 @ 06:49 |
Link to this message
|
I understand what they are trying to say, but I can tell you right now, it isn't correct. First off, they're limiting movies to "540p", which isn't a real HD resolution. Downscaling is possible, but not all players and TVs will be able to do it correctly, which means some people's equipment will be completely unable to play films with the ICT flag. It's a wonderful idea on Hollywood's part, but in reality it isn't going to work for most consumers. And seeing as they have announced similar trends with Blu-ray, like their intent to do away with component entirely (right, that'll happen), I doubt this will go much further.
Bottom line: if they want to use some kind of color correction, scrambling, extra signal, etc., they'll need to make it compatible with everyone's television, or no one will purchase their discs. And if a studio suddenly sees retailers stop ordering their discs, they'll be forced to make a change. It's inevitable.
Will discs really launch with these odd, limited standards? Possibly. Will they last? Probably not. Bottom line is that when I, or any Joe Blow, go down to Target to pick up a copy of Matrix 4 (oh please no, not another one), we only intend to pick up a copy of Matrix 4. We do not want to buy a new television or player specifically so it can support a method of actually DOWNGRADING the image, just so the guy in charge can sleep at night knowing I haven't copied his new HD-DVD. If it really comes down to that, I'd rather just pick up the regular DVD, because it's probably slightly cheaper anyway.
Besides, the pirates who copy these films and the people who share them aren't going to be connecting cables from their HD-DVD player to their VCR or computer; it's a pretty good guess they're more likely to insert the disc into their computer and bypass the protection directly. The downscaling method mentioned would serve more to hurt the average consumer than the rampant pirate we are trying to stop.
That's my two cents.
That's my two cents.
"Its not stupid, its advanced!" - The Almighty Tallest, Invader Zim
|
diabolos
Suspended due to non-functional email address
|
15. June 2006 @ 07:01 |
Link to this message
|
That's exactly how I feel, handsom.
I don't know what corporate guy decided this was a good idea for their business model, but I do know he has no technical skill.
It's just like using Macrovision protection on DVDs: why would that stop someone when they can just decrypt and rip the data right off the disc?
Ced
|
sahiljit
Member
|
20. June 2006 @ 10:28 |
Link to this message
|
I have DVI on the back of my TV. Am I safe?
|
diabolos
Suspended due to non-functional email address
|
20. June 2006 @ 15:50 |
Link to this message
|
As long as your DVI port is compliant with HDCP. Some older DVI ports aren't. If you have ever used an upconverting DVD player connected via your DVI port, then you should be safe.
Ced
|
Senior Member
|
21. June 2006 @ 08:00 |
Link to this message
|
I guess the thing bothering me the most is that if it finally gets to the point where most consumers don't buy whatever technology, whatever new kind of disc it is, industry execs aren't going to face it. They'll decide they picked a bad set of films for the first wave of the format, or that people just aren't interested in the format, all these things. It's like Hollywood execs have no concern about excluding the average home user, when in fact I suspect that with protection schemes like this, a vastly higher number of average users and families will be unable to use the format, while a very, very minimal number of hackers and pirates will be stopped. Inevitably, pirates will break through; I give it about three months before the first major breaks in the encryption/protection/whatever are made. Discs will continue to use the protection, and the only people affected at that point will be the Joe Blows who just spent hundreds on a player and will never buy more than a couple of discs... because they can't use them on their TVs.
And yet executives don't see these anti-piracy 'protections' as any kind of intrusion or barrier. If and when a format fails, it won't be because people couldn't just sit down to watch a movie. In their eyes it'll be because the market is in a slump, because box office dollars have been low, because Johnny Depp might not attract as much of a crowd as he used to; it will be everything BUT the issue they are directly causing.
And that's what bugs me the most here: it hinders the furthering of consumer technology because some doof exec thinks he's found the perfect way to protect a few more of his dollars.
"Its not stupid, its advanced!" - The Almighty Tallest, Invader Zim
|
diabolos
Suspended due to non-functional email address
|
21. June 2006 @ 09:04 |
Link to this message
|
Well, they are being a little considerate as far as the whole Image Constraint Token (ICT) issue goes. No movie studio has made plans to use it, and they have made public statements implying that they won't for a long time (like 5 or more years), so no early adopter has to suffer. It's totally a last resort, and it's decided by the movie studios per title, not by Sony or Toshiba.
And that works for most people, since HDMI 1.1 is the de facto standard HDTV connection. In 5 years TVs could have only HDMI and stereo audio jacks, and we should be well into HDMI 2.x by then too... lol
Ced
This message has been edited since posting. Last time this message was edited on 21. June 2006 @ 09:06
|
Senior Member
|
23. June 2006 @ 09:19 |
Link to this message
|
While most new HDTVs will have it, that definitely does not make it the 'de facto standard'. Heck, my Sony HDTV is just over a year old and doesn't have HDMI support. What's up with that?
Not to mention the usual delay in standards. When RCA jacks (red, white, yellow) started being put on televisions in the early '80s, they didn't become 'standard' on all televisions until the late '90s. People had to get adapters for coaxial connections, which downgraded the quality.
On top of that, newer technologies use different signal formats that can't simply be converted and downgraded through a straight cable connection. It would require electronic converters that interpret and change the signal. Not to mention that with all the different protections on movies these days, there would be numerous problems getting decent image playback through a converter.
As it stands, many more HDTVs have HD component and DVI than HDMI. HDMI is a nice format with definite improvements, but less than half of HD users have any kind of HDMI plug on their televisions.
In short, component video is the 'de facto standard' for HD, not HDMI. And while HDMI is a step above, it is by no means 'the standard'.
If a system is released with only HDMI output for HDTVs, it will fail. I'm not saying it might; it *will*. There is no question. It is much too soon to assume the mass market has HDMI. Even five years from now, many people still won't have televisions with that feature, because people don't want to replace an expensive television too quickly.
"Its not stupid, its advanced!" - The Almighty Tallest, Invader Zim
|
diabolos
Suspended due to non-functional email address
|
23. June 2006 @ 12:47 |
Link to this message
|
I feel you, but in my store everything that has HD attached to it has an HDMI port (except the cheapest HDTV tube we sell, the Insignia 30"). Some have two. The Hitachi 55" plasma has three! The fact is most HD equipment has either FireWire, DVI-D, or HDMI connections. Digital TVs need digital sources! To my knowledge, CRT is the only HDTV technology that launched without a digital connection of some kind; those sets used HD component video connections.
Now, with that said, I don't think any format would have only HDMI connections, and component video is the current de facto standard for HDTV. But the industry wants HDMI since it has the ability to replace them all!
As a technical note, composite video and RF coax (analog) have essentially the same picture quality, since the RF coax signal just carries composite video plus mono audio. Yellow, red, and white connections didn't gain acceptance for a while, IMO, because the idea of "home theater" didn't hit the mainstream until DVD. Also, video game consoles played a huge role in the acceptance of stereo (and surround) sound in the late '90s.
Ced
|
Senior Member
|
24. June 2006 @ 13:10 |
Link to this message
|
Ahhh, Now I see. You work in an electronics store. That explains your view a bit better for me. I worked in a video store for two years, and it skewed my view on movies a LOT. So, in a way, I think I better understand where you're coming from.
I don't think more than a couple of new HDTV models are produced without HDMI anymore; that is correct. But look at how many HDTVs have been sold in the last five years or so. People paid a LOT of money for these things. Do you really think many of them are ready to shell out for *another* new HDTV?
Realistically, I don't think a lot of people are ready to do that just so they can use Blu-ray. As for HD-DVD, I haven't seen anything indicating that it won't use component. Not to mention, it is a whole heck of a lot cheaper than Blu-ray players, even WITH the PS3/360 price/add-on issue. For once, I don't think this war is going to be won by the true highest-resolution, best-quality video. I think people aren't going to care as much about 1080p as one would think, and I believe the majority of users will favor the more cost-effective solution for new hi-res. From what I've been reading, the players AND the media are cheaper, with a very minimal difference in quality. So much so that I would guess MOST people won't even be able to tell.
Oh, and please don't think I look down on you or your opinion because you work in an electronics store, just please understand that when you work around this stuff all day, it does skew opinions a bit. When you're surrounded by, and constantly getting new information on a topic, you forget that it may not be everyone's standard, and everyone else isn't up to that speed yet. That's all. When I worked in a video store, we all had our favorite actors/directors, etc. and it always seemed strange to us that someone hadn't seen that 'particular movie'. It's a different market, but I think you understand what I'm trying to relate.
"Its not stupid, its advanced!" - The Almighty Tallest, Invader Zim
|
diabolos
Suspended due to non-functional email address
|
24. June 2006 @ 18:09 |
Link to this message
|
Oh yeah, it's cool. I do sometimes forget that the majority isn't ready for immediate change, but I quickly remember that some people aren't every time I sell an RF modulator to someone who doesn't have yellow, red, and white inputs on their TV :)
The best HDMI interview I have read so far...
http://www.electronichouse.com/info/specials/hdmi_basics.html
This HDMI insider says some things that will make cable manufacturers and their vendors unhappy, but it's all very true!
I value everyone's point of view; that's why I love this place,
Ced
This message has been edited since posting. Last time this message was edited on 24. June 2006 @ 20:12
|
Senior Member
|
8. July 2006 @ 10:20 |
Link to this message
|
So if you buy an HDMI 1.3 DVD player and your TV has HDMI 1.0, would you be getting 1.3 quality or 1.0?
|
dblbogey7
Suspended due to non-functional email address
|
8. July 2006 @ 11:13 |
Link to this message
|
1.0
|
diabolos
Suspended due to non-functional email address
|
8. July 2006 @ 18:22 |
Link to this message
|
There is no picture quality difference between HDMI 1.0 and HDMI 1.3, so technically it will be HDMI 1.0 quality in either case (as long as there is no copy-protection craziness).
What I mean is, HDTV is HDTV. Yes, 1.3 supports greater resolution and color depth, but HDTV itself doesn't go that high!
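Rough numbers, if anyone's curious (my own quick math, using the published maximum TMDS pixel clocks: about 165 MHz for HDMI 1.0 and 340 MHz for 1.3):
```python
# Back-of-the-envelope link math (published TMDS clock limits: HDMI 1.0
# tops out at a 165 MHz pixel clock, HDMI 1.3 at 340 MHz).

HDMI_1_0_MAX_MHZ = 165.0
HDMI_1_3_MAX_MHZ = 340.0

signals = {
    "720p60 / 1080i60 (8-bit)":      74.25,           # standard HDTV pixel clocks
    "1080p60 (8-bit)":               148.5,
    "1080p60 (12-bit deep color)":   148.5 * 12 / 8,  # 222.75 MHz
}

for name, clock in signals.items():
    fits_10 = "yes" if clock <= HDMI_1_0_MAX_MHZ else "NO"
    fits_13 = "yes" if clock <= HDMI_1_3_MAX_MHZ else "NO"
    print(f"{name:30s} needs {clock:7.2f} MHz | fits HDMI 1.0: {fits_10:3s} | 1.3: {fits_13}")

# Everything current HD DVD / Blu-ray players actually output (up to 1080p,
# 8-bit) fits inside HDMI 1.0. The 1.3 headroom only matters for deep color
# and beyond-1080p signals that no current disc or broadcast uses.
```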
Ced
|
Senior Member
|
8. July 2006 @ 23:06 |
Link to this message
|
If there is no picture quality difference, why did they come up with 1.3?
|
dblbogey7
Suspended due to non-functional email address
|
9. July 2006 @ 04:10 |
Link to this message
|
HDMI 1.3 will allow advanced audio codecs - Dolby TrueHD and DTS-HD.
|
diabolos
Suspended due to non-functional email address
|
9. July 2006 @ 09:24 |
Link to this message
|
This message has been edited since posting. Last time this message was edited on 9. July 2006 @ 19:51
|
Senior Member
|
15. July 2006 @ 18:05 |
Link to this message
|
Honestly, it seems irrelevant at this point. The majority of people aren't buying 80" monitors and thousand dollar receivers. The average user does not truthfully have a huge interest in these. Again, let's remember that most users aren't even HD equipped yet. The number is growing but it's easily less than one in five homes. If not even lower. These new "Higher" def units don't mean anything to most users, even most of the 'savvy' people out there won't be using this equipment for at least a couple years. Again, it's a matter of practicality in cost.
But eventually, this *will* all be standard. I just don't think that will truthfully happen until the industry reaches another stopping point. We're reaching an interesting point in this technology, because we're closing in on (and some would say we've passed) the limit of usefulness of this higher-end technology. After a certain point, no one, and I mean no one, will be able to tell the difference. Most people who own HDTVs can't truthfully tell you the difference between 720p and 1080p. Unless you're very close, or have a VERY big TV, you just can't tell in most cases. And the oddest thing here is that, in this way, a technology is nearing its useful limit. And after that, there's not much else to do with it. So where will the industry go next? Rumble-pack-enhanced DVDs? (I pray not. Not because it would necessarily be bad, but because it would be such a useless gimmick.) Really, they can add more and more speakers until you don't have the capacity for any more in your home, but can you really tell the difference in sound quality when you go from 96kbps surround to 128? No. Why? Because, again, the original source wasn't that high to begin with, and unless you go through and painstakingly retouch it (Star Wars style), you'll never surpass the original quality anyway. And one other newsflash: studios STILL aren't filming in anything comparable to 1080p, so I don't think there is anything like an imminent reason to upgrade any time in the near future, for MOST users.
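For what it's worth, you can put rough numbers on the "unless you're very close or have a VERY big TV" part. This is just my own rule-of-thumb estimate, assuming roughly 1 arc-minute of visual acuity for 20/20 vision and a hypothetical 50" 16:9 screen:
```python
import math

# Rough estimate of the farthest distance at which a viewer with ~20/20
# vision (about 1 arc-minute of acuity) can still resolve individual pixels.
# Beyond that distance, extra resolution stops being visible.

ARC_MINUTE = math.radians(1 / 60)   # ~0.000291 rad

def max_useful_distance_ft(diagonal_in, horizontal_pixels, aspect=16 / 9):
    """Distance (feet) beyond which single pixels can no longer be resolved."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)   # screen width
    pixel_in = width_in / horizontal_pixels                   # one pixel's width
    return (pixel_in / math.tan(ARC_MINUTE)) / 12

for pixels, label in [(1280, "720p"), (1920, "1080p")]:
    d = max_useful_distance_ft(50, pixels)
    print(f"50-inch set, {label}: pixel-level detail stops mattering beyond ~{d:.1f} ft")

# Typical result: ~6.5 ft for 1080p and ~9.8 ft for 720p on a 50" screen,
# so from an ordinary couch distance the two look essentially the same.
```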
"Its not stupid, its advanced!" - The Almighty Tallest, Invader Zim
|
|