Intel HD4000 & TrueHD

Having trouble playing all your different media types? Ask here!
barnabas1969

Posts: 5738
Joined: Tue Jun 21, 2011 7:23 pm
Location: Titusville, Florida, USA

HTPC Specs: Show details

#41

Post by barnabas1969 » Fri Jan 04, 2013 6:09 pm

ronvp wrote:I am just flabbergasted that you guys are trying to prove that it is OK to use low speed cables for High Speed applications.
Nobody is trying to prove that. I'm just explaining how wrong you are on most of the other stuff you posted. I'm also saying, from my own experience and that of others, that short standard-speed HDMI cables usually work fine, even at 1080p60 or frame-packed 1080p24 (3D).

Just because a cable has not been tested to the high-speed standard, does not mean that it would fail the test. If you are setting up a TV, and you already have a standard-speed HDMI cable laying around, it doesn't hurt anything to try it before you buy a new cable. If the old cable works at all the resolutions and frame rates that you plan to use, then you just saved yourself some money. If not, no big deal... just buy a new high-speed cable.

If you don't already have some cables laying around, then you should always buy a high-speed HDMI cable. The price difference between a standard and a high-speed cable is negligible until you get up to 30+ feet in length.
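
To put some rough numbers on the "will my old cable cope?" question, here is a back-of-the-envelope sketch in Python. The cable-category test clocks (74.25 MHz for Standard/Category 1, 340 MHz for High Speed/Category 2) and the per-mode pixel clocks are the commonly quoted nominal figures; the mode list and helper name are mine, for illustration only.

```python
# Rough comparison of common HDMI video modes against cable-category test clocks.
# Nominal figures only: Category 1 ("Standard") is certified at a 74.25 MHz TMDS
# clock, Category 2 ("High Speed") at 340 MHz.

CABLE_LIMIT_MHZ = {"Standard (Category 1)": 74.25, "High Speed (Category 2)": 340.0}

MODE_PIXEL_CLOCK_MHZ = {
    "720p60 / 1080i60": 74.25,
    "1080p24": 74.25,
    "1080p60": 148.5,
    "1080p24 frame-packed 3D": 148.5,
}

def tmds_gbps(pixel_clock_mhz):
    """Raw TMDS rate: 3 data channels x 10 bits per channel per pixel clock."""
    return pixel_clock_mhz * 3 * 10 / 1000.0

for mode, clock in MODE_PIXEL_CLOCK_MHZ.items():
    passes = [c for c, limit in CABLE_LIMIT_MHZ.items() if clock <= limit]
    print(f"{mode}: {clock} MHz ({tmds_gbps(clock):.2f} Gbps TMDS) - certified on {', '.join(passes)}")
```

In other words, 1080p60 and frame-packed 3D sit above what a Standard cable is actually certified for, which is exactly why a short, decent cable often still works in practice while a long one may not.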

foxwood

Posts: 1761
Joined: Fri Sep 07, 2012 3:43 pm
Location:

HTPC Specs: Show details

#42

Post by foxwood » Fri Jan 04, 2013 8:23 pm

ronvp wrote: for anybody that is intending to watch 1080P video with Uncompressed Audio (as in a regular BluRay with DTS-HD), the correct advice is to use a High Speed Cable or risk poor picture/sound quality.
The correct answer is that if your current cables work, they work, whether they're officially in spec or not. If you're starting from scratch anyway, you might as well buy the higher spec cables, but there's no need to replace older cables UNLESS you encounter a problem.

The point is that if there's a problem, it will be obvious, it won't be just a slight degradation in picture quality or sound quality, as could be the case with an analog signal.

(Oops - I didn't see barnabas' reply, as it was on the next page.

TLDR - Ditto! )

richard1980

Posts: 2623
Joined: Wed Jun 08, 2011 3:15 am
Location:

HTPC Specs: Show details

#43

Post by richard1980 » Fri Jan 04, 2013 11:57 pm

Except color space does not change the bit rate. 24-bit color depth is exactly that regardless of what color space is used. It's still exactly 24 bits per pixel. So to correct your math:

1920 x 1080 x 24 fps x 24 bpp = 1.11 Gbps

And with an 8-bit alpha channel, it becomes:

1920 x 1080 x 24 fps x 32 bpp = 1.48 Gbps
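
For anyone who wants to check the arithmetic, a quick sketch in Python (the 1.11 / 1.48 totals assume binary gigabits, i.e. dividing by 2^30; dividing by 10^9 gives roughly 1.19 and 1.59 Gbps instead):

```python
# Uncompressed video bit rate: width x height x frames/s x bits/pixel.
# The 1.11 / 1.48 totals above use binary gigabits (2**30 bits per Gb).
def uncompressed_gbps(width, height, fps, bits_per_pixel, binary=True):
    bits_per_second = width * height * fps * bits_per_pixel
    return bits_per_second / (2**30 if binary else 1e9)

print(round(uncompressed_gbps(1920, 1080, 24, 24), 2))  # 1.11
print(round(uncompressed_gbps(1920, 1080, 24, 32), 2))  # 1.48
```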

barnabas1969

Posts: 5738
Joined: Tue Jun 21, 2011 7:23 pm
Location: Titusville, Florida, USA

HTPC Specs: Show details

#44

Post by barnabas1969 » Sat Jan 05, 2013 12:42 am

Also, it's very rare to find a Bluray with more than 24-bit color depth, and you never get broadcast, cable, or DVD in more than 24-bit color depth. I don't know about content you can download online, but since the source is rarely (if ever) more than 24-bit, I would imagine the same is true there. This is why you should set your graphics adapter (and display) to YCbCr instead of RGB for HT purposes. You should also calibrate the adapter/display for 16-235 instead of 0-255, because the movies and TV shows you normally watch use that range too.

And, I agree with foxwood. If the cable doesn't work, it will be obvious. At a minimum, you'll get "sparkles" in the picture, and at worst you'll get nothing at all. If the cable works, it will be perfect at all frame rates and resolutions that your display is capable of reproducing.

The worst case scenario of using an old standard (or unknown) cable that you had laying around is if you intend to install it in-wall or in-ceiling. In that case, you should TEST TEST TEST the cable, even if it says it's high-speed rated, before you put it in the wall/ceiling. The labor involved in installing and re-installing a cable in-wall/in-ceiling more than justifies the time it takes to test the cable with your equipment first. And I DO mean... with YOUR equipment. Test the cable with your source device(s), your display device(s), your switch(es), and your AVR(s) in every combination that will be possible in your setup.

If you don't plan to embed it in a wall/ceiling, then it's really no big deal to replace the cable in most setups. If replacing the cable later would be easy/cheap/free, then there's no harm in trying a standard cable you already had laying around.
Last edited by barnabas1969 on Sat Jan 05, 2013 12:55 am, edited 1 time in total.

ronvp

Posts: 14
Joined: Sat Dec 29, 2012 5:41 pm
Location:

HTPC Specs: Show details

#45

Post by ronvp » Sat Jan 05, 2013 12:54 am

richard1980 wrote:Except color space does not change the bit rate. 24-bit color depth is exactly that regardless of what color space is used. It's still exactly 24 bits per pixel. So to correct your math:

1920 x 1080 x 24 fps x 24 bpp = 1.11 Gbps

And with an 8-bit alpha channel, it becomes:

1920 x 1080 x 24 fps x 32 bpp = 1.48 Gbps
That I believe is not correct... It has a lot to do with it, because it indicates the amount of info needed to describe the color. In fact, if you do have a movie that gives you a problem on your low speed cable, you can actually tell FFDShow to adjust/convert to another color space and reduce the bit rate going over your standard cable... and maybe fix it. Check the post I sent earlier and this Wiki: http://en.wikipedia.org/wiki/Image_compression

barnabas1969

Posts: 5738
Joined: Tue Jun 21, 2011 7:23 pm
Location: Titusville, Florida, USA

HTPC Specs: Show details

#46

Post by barnabas1969 » Sat Jan 05, 2013 1:04 am

ronvp wrote:
richard1980 wrote:Except color space does not change the bit rate. 24-bit color depth is exactly that regardless of what color space is used. It's still exactly 24 bits per pixel. So to correct your math:

1920 x 1080 x 24 fps x 24 bpp = 1.11 Gbps

And with an 8-bit alpha channel, it becomes:

1920 x 1080 x 24 fps x 32 bpp = 1.48 Gbps
That I believe is not correct... It has a lot to do with it, because it indicates the amount of info needed to describe the color. In fact, if you do have a movie that gives you a problem on your low speed cable, you can actually tell FFDShow to adjust/convert to another color space and reduce the bit rate going over your standard cable... and maybe fix it. Check the post I sent earlier and this Wiki: http://en.wikipedia.org/wiki/Image_compression
No, color "space" refers to the range of values between black and white. As I wrote in my (edited) post above, YCbCr is 16-235, where 16,16,16 (16 decimal is 00010000 in binary) is black and 235,235,235 (235 is 11101011 in binary) is white. With RGB color space, black is 0,0,0 and white is 255,255,255. In either case, it still requires 8 bits per primary color (the additive primary colors are red, green and blue as represented by the comma-separated groups of numbers in the previous sentence) to produce those colors. The difference, as pointed out by your Wikipedia link, is that a YCbCr color space is more easily compressed. So, a video recorded in YCbCr can be compressed to a smaller resulting file size than a video that was recorded in RGB.

For example, a single pixel of black in the YCbCr color space is represented as 000100000001000000010000 in binary (24 bits), and white is 111010111110101111101011. In the RGB color space, black is 000000000000000000000000 and white is 111111111111111111111111. Both color spaces still use 24 bits.

The smaller range (235-16+1 = 220 vs. 255-0+1 = 256) makes the YCbCr video easier to compress to a smaller size without losing detail. But, regardless of the color space and compression, the data going over the HDMI cable is NOT compressed. So, it consumes the full bit-depth (color depth) on its way between the source device and the display device.

The color "depth" (a.k.a. "bit depth") is the number of bits assigned to each of the primary colors. With 24-bit color depth (bit depth), there are 8 bits assigned to each of the primary colors. There are higher bit depths, but they are rarely used for Bluray movies, and never (AFAIK) used for DVD, broadcast, cable, or satellite TV.
Last edited by barnabas1969 on Sat Jan 05, 2013 1:15 am, edited 2 times in total.

ronvp

Posts: 14
Joined: Sat Dec 29, 2012 5:41 pm
Location:

HTPC Specs: Show details

#47

Post by ronvp » Sat Jan 05, 2013 1:11 am

I think there is confusion over color SPACE versus color MODEL... the color model (YUV 4:2:2 or YUV 4:4:4) is not the same (I believe) as the color space... and according to many websites and the compression/bit rate calculator link I sent earlier, the COLOR MODEL makes a huge difference.

barnabas1969

Posts: 5738
Joined: Tue Jun 21, 2011 7:23 pm
Location: Titusville, Florida, USA

HTPC Specs: Show details

#48

Post by barnabas1969 » Sat Jan 05, 2013 1:22 am

I'd have to read more about color "model", but I can tell you that TV and movies will almost always (with rare exceptions on Bluray) have a CMYK color model.

richard1980

Posts: 2623
Joined: Wed Jun 08, 2011 3:15 am
Location:

HTPC Specs: Show details

#49

Post by richard1980 » Sat Jan 05, 2013 1:52 am

I'd like to rephrase (and somewhat correct) the statement about RGB range calibration. You should calibrate your display appropriately for the type of content that will be viewed on the display. If you intend to view 0-255 content (such as the Windows desktop), then the display and adapter should be calibrated for 0-255. Otherwise, the desktop blacks and whites will be crushed. Then calibrate video playback to 16-255. (Yes, that's 255, not 235...this ensures you can view whiter-than-white. Of course, this may not be possible on all displays, as some displays simply can't display whiter-than-white, so you may not have any other option but to calibrate for 235.) If you aren't viewing PC content (or anything else that uses full range RGB), then I would recommend calibrating the display for 16-255 (or whatever white level doesn't clip...hopefully something higher than 235).
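
For reference, the usual limited-to-full range expansion a renderer applies looks roughly like the sketch below. It's simplified (real renderers work in higher precision and handle the headroom more carefully), but it shows why calibrating video to 16-255 rather than 16-235 matters: a naive 16-235 expansion clips anything above reference white.

```python
def expand_limited_to_full(level):
    """Map a limited-range (16-235) 8-bit level to full range (0-255).

    Simplified sketch: anything below 16 or above 235 (blacker-than-black /
    whiter-than-white headroom) gets clipped here, which is the detail the
    16-255 calibration advice above is trying to preserve.
    """
    scaled = round((level - 16) * 255 / 219)
    return max(0, min(255, scaled))

print(expand_limited_to_full(16))   # 0   - video black maps to PC black
print(expand_limited_to_full(235))  # 255 - reference white maps to PC white
print(expand_limited_to_full(240))  # 255 - whiter-than-white is clipped away
```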

WRT color model vs color space, the difference does not matter. Color space and color model just define what the bits mean. But 24 bits of information is still 24 bits of information, regardless of what it means. So when content has 24-bit color depth, that means it takes 24 bits to describe the color of each pixel. How you describe the color is irrelevant. It still only takes 24 bits to describe the color of each pixel.

richard1980

Posts: 2623
Joined: Wed Jun 08, 2011 3:15 am
Location:

HTPC Specs: Show details

#50

Post by richard1980 » Sat Jan 05, 2013 2:01 am

FWIW, here's a nice little write-up about color space vs color model: http://www.openphotographyforums.com/fo ... php?t=5654

ronvp

Posts: 14
Joined: Sat Dec 29, 2012 5:41 pm
Location:

HTPC Specs: Show details

#51

Post by ronvp » Sat Jan 05, 2013 5:20 pm

FYI. Intel posted a new version of the display drivers yesterday.. it may fix the sound issue in the original post..

Clearly much confusion exists over Color Model versus Color Space, and they may sometimes be mixed up, but they are not the same thing, and depending on the color model, the bit rate varies.

richard1980

Posts: 2623
Joined: Wed Jun 08, 2011 3:15 am
Location:

HTPC Specs: Show details

#52

Post by richard1980 » Sat Jan 05, 2013 6:19 pm

The bit rate of the compressed video will vary, but the bit rate of the uncompressed (i.e., what is actually being sent over HDMI) video will never vary. 24 bits is 24 bits. It is never 23 bits, 22 bits, or any other number of bits. Neither color model nor color space have any impact on this.

Your argument makes about as much sense as saying a pound of lead is heavier than a pound of feathers. It's a pound either way.

STC

Posts: 6808
Joined: Mon Jun 06, 2011 4:58 pm
Location:

HTPC Specs: Show details

#53

Post by STC » Sat Jan 05, 2013 6:26 pm

richard1980 wrote:...a pound of lead is heavier than a pound of feathers. It's a pound either way.
Now let's just break that down and think about that for a minute.....

What if the feathers and lead were CUPS.
AHHHH!!! Got you there eh? :mrgreen: ;) :lol:

[pointless post of the day v1.6]
By the Community, for the Community. 100% Commercial Free.

Want decent guide data back? Check out EPG123

richard1980

Posts: 2623
Joined: Wed Jun 08, 2011 3:15 am
Location:

HTPC Specs: Show details

#54

Post by richard1980 » Sat Jan 05, 2013 7:01 pm

Well a cup is a measure of volume, whereas a pound is a measure of weight.

Hah! :lol:

Maybe I should use a better analogy. How's this: 1 MB of French vs 1 MB of English.
Last edited by richard1980 on Sat Jan 05, 2013 7:02 pm, edited 1 time in total.

bigsid05

Posts: 54
Joined: Wed Nov 02, 2011 3:11 pm
Location:

HTPC Specs: Show details

#55

Post by bigsid05 » Sat Jan 05, 2013 7:02 pm

ronvp wrote:FYI. Intel posted a new version of the display drivers yesterday.. it may fix the sound issue in the original post..

Clearly much confusion exists over Color Model versus Color Space, and they may sometimes be mixed up, but they are not the same thing, and depending on the color model, the bit rate varies.
Thanks - worth a shot.

ronvp

Posts: 14
Joined: Sat Dec 29, 2012 5:41 pm
Location:

HTPC Specs: Show details

#56

Post by ronvp » Sat Jan 05, 2013 7:12 pm

richard1980 wrote:Well a cup is a measure of volume, whereas a pound is a measure of weight.

Hah! :lol:

Maybe I should use a better analogy. How's this: 1 MB of French vs 1 MB of English.
I just fell out of my seat laughing... Why you do not do your homework and stop arguing is beyond me. I checked out several bit rate calculators, and they all show that when you change the color model, the bit rate changes... maybe they are all wrong and you are the expert (or maybe not!).

STC

Posts: 6808
Joined: Mon Jun 06, 2011 4:58 pm
Location:

HTPC Specs: Show details

#57

Post by STC » Sat Jan 05, 2013 7:45 pm

richard1980 wrote:Well a cup is a measure of volume, whereas a pound is a measure of weight.
I was just foolin' :thumbup:
By the Community, for the Community. 100% Commercial Free.

Want decent guide data back? Check out EPG123

richard1980

Posts: 2623
Joined: Wed Jun 08, 2011 3:15 am
Location:

HTPC Specs: Show details

#58

Post by richard1980 » Sat Jan 05, 2013 9:18 pm

ronvp wrote:Why you do not do your homework and stop arguing is beyond me. I checked out several bit rate calculators, and they all show that when you change the color model, the bit rate changes... maybe they are all wrong and you are the expert (or maybe not!).
Yes, I have done my homework. Perhaps you should do the same. I guarantee you anybody that says color space or color model changes the uncompressed bitrate is flat out wrong. It absolutely does not happen. Clearly the people that wrote the stuff you are reading do not know WTF they are talking about. Either that or you don't understand what they are saying. Both are plausible explanations.

If you want the truth, I'll try to make it simple. To express a color, you need to know two things:
  • The color model, which defines how to express colors. Essentially, this is the "language" that must be agreed upon between the transmitting device and the receiving device. Once the language is agreed upon, the receiving device will know what each bit of color information actually means. If the two devices don't agree on the same language, the receiving device will misinterpret the data coming from the transmitting device, and thus a command to make a pixel a certain color will result in the pixel being some other color instead. Two generic examples of this would be an additive color model (where colors are added together to get other colors) and a subtractive color model (where colors are subtracted from each other to get other colors).
  • The color space, which defines what colors from the color spectrum can actually be expressed. Additionally, each color space also has its own distribution of colors.
The only thing the color model and color space do is translate bits of data (1's and 0's) into actual colors. In other words, they both define what the bits of data actually mean. But neither of those things defines how much data is required to express a particular color. That is where color depth comes in. The color depth is a measure of precision. For example, a 1-bit color depth means 2 colors can be expressed (color 1 and color 0), but not any other colors. Which two colors can be expressed depends on the color model and color space. Perhaps in color model A, 1 means red and 0 means black. But in color model B, 1 means green and 0 means purple. The particular color model and color space used do not change the fact that there is still only one bit of data used to express the colors.

To increase the number of colors that can be expressed, you would want to increase the number of bits used to express colors. While a 1-bit color depth is only able to produce 2 colors, a 2-bit color depth can express 4 colors (color 00, color 01, color 10, and color 11). Again, which 4 colors depends on the color model and color space, and the specific color model and color space used do not change the fact that 2 bits are still used to express colors. But the color model and color space define which specific colors are being expressed by those 2 bits.

You can continue increasing the color depth as high as you want. The more bits that are used, the more precise you can be in expressing a certain color. But in no case does changing the color model or color space translate into any change in how many bits are used to express each color. In all cases, 24-bit color depth (aka 24-bit precision) means that 24 bits are used to express colors.
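
The counting argument in a couple of lines of Python (nothing here depends on which color model or color space is chosen):

```python
# Number of distinct color codes at a given color depth: 2 ** bits.
# The color model/space decides WHICH colors those codes map to, not how many codes exist.
for bits in (1, 2, 8, 24, 30):
    print(f"{bits:>2}-bit color depth -> {2 ** bits:,} possible codes")
```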

So again, it is absolutely impossible for a 24-bit color depth to require any more or less than 24 bits to express a color, regardless of what color model or color space is being used, and anybody that says it does is completely wrong.

barnabas1969

Posts: 5738
Joined: Tue Jun 21, 2011 7:23 pm
Location: Titusville, Florida, USA

HTPC Specs: Show details

#59

Post by barnabas1969 » Mon Jan 07, 2013 1:16 am

I agree with Richard. Actually, I've never found him to be wrong. The guy obviously takes his time to research stuff before he posts an answer. Also, it's important to note that the human eye can't discern the difference between more than about 10 million colors. A bit depth of 24 gives us more than 16 million colors. There's not much point to "deep color".

ronvp

Posts: 14
Joined: Sat Dec 29, 2012 5:41 pm
Location:

HTPC Specs: Show details

#60

Post by ronvp » Mon Jan 07, 2013 11:40 pm

Spin it any way you like, but the issue was/is not about how many colors or about color depth, but rather whether the color model affects the net bit rate and, with that, whether it can impact the HDMI cable selection for 1080p... and clearly it does...
