It's true that it will make the black, gray, and white rendering have a bigger range in some cases. However, it will also affect your color accuracy. My TV can handle the full range of 0-255 ("blacker than black" and "whiter than white"). The problem is that most CONTENT (the TV shows and movies... probably more than 99% of what is available to watch) is produced in the 16-235 range, because most TVs cannot handle values between 0-15 and 236-255. So, if you set your TV and/or display adapter to the 0-255 range, there is a mismatch between the source material and the way you are displaying it. It doesn't only affect the whites and blacks... it affects all of the colors in between.
You've gotta understand... the numbers (0-255) apply to each of the three colors (red, green, blue). This is what gives a computer monitor 16 million colors (actually 256^3, or 16,777,216, a.k.a. "16 million colors"). You see, there are 256 levels of each color (0-255) on a computer monitor (and many high-end TVs). When all three colors are set to the same value, your eye perceives a neutral shade of gray. When all three colors are set to the maximum value (255, 255, 255 in decimal RGB... or FF, FF, FF in hexadecimal) you perceive the brightest white.
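If you want to sanity-check that arithmetic, here are a couple of lines of Python (purely illustrative, not tied to any real video pipeline) showing where the "16 million colors" figure comes from:

```python
# Purely illustrative arithmetic: 256 levels per channel, three channels.
levels_per_channel = 256              # 0-255 on a full-range display
print(levels_per_channel ** 3)        # 16777216 -- the "16 million colors"

# Equal R, G, B values sit on the gray scale; the maximum is the brightest white.
brightest_white = (255, 255, 255)     # FF, FF, FF in hexadecimal
mid_gray        = (128, 128, 128)
black           = (0, 0, 0)
```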
But, let's consider the fact that all broadcast TV shows (and more than 99% of movies) are mastered in the limited 16-235 range. If you set your dynamic range to 0-255 (assuming that your TV can handle it), your TV may display a wider range of blacks and whites. That would be wonderful if we watched black-and-white TV. However, since most programming is produced in the limited range (and in COLOR), it distorts the color of the displayed image.
Let me explain it to you this way... we all know that yellow and blue make green. If you mix EQUAL amounts of yellow and blue paint on a piece of white paper, and shine a white light on it, you get green, right? Well, that's true for subtractive colors. Those are colors that are applied to surfaces that reflect light to our eyes, like paper. We also know that the primary colors are red, blue, and yellow... right?
That's the subtractive color model.
Well, a television doesn't work on the subtractive color model. It works on the additive color model. Ever wonder how a TV can display the color YELLOW, when it only has red, blue, and GREEN to work with? Well, in the additive color model, when you mix EQUAL amounts of green and red light, you get... (drum roll please)... yellow! The primary colors in the additive color model are Red, Green, and Blue (RGB).
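As a toy illustration (plain Python again, nothing to do with any real display hardware), adding equal amounts of red and green light on an RGB device lands you on yellow:

```python
# Additive mixing: light adds, so equal red and green make yellow.
red   = (255, 0, 0)
green = (0, 255, 0)

# Add the channels and clamp to the 0-255 range of an 8-bit display.
yellow = tuple(min(r + g, 255) for r, g in zip(red, green))
print(yellow)   # (255, 255, 0) -- yellow
```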
Well, let's say that you're watching a TV show... and they filmed a perfectly yellow wall that is brightly lit. Now, remember... the content is almost always mastered in the 16-235 range (10,648,000 colors, or 220^3... because 235-16+1=220). The original source material will contain pixels that are coded as 235,235,16 (R,G,B) to make pure, bright yellow... the blue channel sits at 16, because 16 is "black" in the limited range.
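Same kind of back-of-the-envelope math for the limited range (again just illustrative Python):

```python
# How many colors fit in the limited 16-235 range, per the 235-16+1 = 220 count above.
levels = 235 - 16 + 1          # 220 usable levels per channel
print(levels ** 3)             # 10648000 -- roughly 10.6 million colors

# A bright, saturated yellow in limited-range R,G,B: blue sits at video black (16).
limited_range_yellow = (235, 235, 16)
```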
The problem is... your video adapter is set to 0-255. What happens when it receives content that contains 235,235,16?
Well, there are four possibilities, depending on how your TV (the display device) and your video adapter behave (there's a rough sketch of the arithmetic after this list)...
A) The video adapter will simply pass the 16-235 range to the display device without converting it to 0-255. The display device will chop off everything above 235, and below 16... resulting in the same picture as the director intended.
B) The video adapter will convert the 16-235 range to 0-255. The display device will chop off everything above 235 and below 16... resulting in a loss of color accuracy and a loss of contrast ratio. Black (16,16,16) gets converted to 0,0,0... and that's fine... the TV will clip it and display it as black anyway. But what happens to a dark shadow tone... let's say 32,32,32? The conversion pushes that down to roughly 19,19,19, which is barely above the TV's black clipping point of 16... so shadow detail that should be visible gets crushed to (nearly) black.
C) The video adapter will pass the 16-235 range to the display device without converting it to 0-255. The display device will display the full range of 0-255. This results in inaccurate color, and the contrast ratio suffers too, because the content expects black to be 16,16,16... but the display shows 16,16,16 as a dark gray (its black is 0,0,0). And the content expects 235,235,235 to be white... but the display shows that as a light gray too (its white is 255,255,255). Blacks look washed out and whites look dim.
D) The video adapter will convert the content's 16-235 range to 0-255, and the display will show it as 0-255 without chopping anything. This gives you the expanded black-to-white (gray) scale, and colors that are very close to the director's intent... but the 8-bit conversion involves rounding, so it's very close, but no cigar.
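If it helps to see the arithmetic behind those four cases, here's a rough Python sketch (the function names and the single test pixel are my own illustrations, not any real driver code), using the usual (value - 16) x 255 / 219 scaling for the limited-to-full expansion:

```python
# A rough sketch of the four pipelines above, applied to one near-black pixel.
# Helper names and the test value are illustrative only.

def expand_limited_to_full(v):
    """Video adapter expanding limited range (16-235) to full range (0-255)."""
    return round((v - 16) * 255 / 219)

def tv_expects_limited(v):
    """Display that expects 16-235: it clips everything below 16 to black
    and everything above 235 to peak white."""
    return min(max(v, 16), 235)

shadow = 32   # a dark, but not black, value in 16-235 content

# A) adapter passes 16-235 through, TV treats 16-235 as black-to-white: shown as mastered
print(tv_expects_limited(shadow))                           # 32

# B) adapter expands to 0-255, TV still clips at 16/235: shadow detail crushed
print(tv_expects_limited(expand_limited_to_full(shadow)))   # 19 -- barely above black

# C) adapter passes 16-235 through, TV displays full 0-255: "black" (16) shows as gray
print(shadow)                                               # 32 shown too bright

# D) adapter expands to 0-255, TV displays full 0-255: close, minus rounding error
print(expand_limited_to_full(shadow))                       # 19
```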
So, you have a 25% chance of getting it right by forcing the dynamic range to 0-255. Since more than 99% of content is mastered in 16-235, it's safer to either leave it set to "set by player" or force it to 16-235. Hopefully, if you leave it up to the player, your player (Media Center, PowerDVD, TMT, etc) will be able to detect when you are playing a Blu-ray (or Blu-ray rip) that is actually encoded in 0-255. But, that's a rare case. Most DVDs and Blu-rays are encoded in 16-235, and so is broadcast TV. You're much safer allowing the player to set the dynamic range, or just forcing it to 16-235... because that's how the vast majority of content is delivered.
I hope that makes sense.