Frame Rate

Mike88

Posts: 549
Joined: Wed Jun 20, 2012 7:50 am
Location:

HTPC Specs: Show details

Frame Rate

#1

Post by Mike88 » Mon Aug 13, 2012 7:39 am

How do I know if WMC7 is actually playing video at the correct frame rate?

I’ve experimented with MPC-HC and it indicates an OTA recording is not playing at the exact correct frame rate. From what I understand it will periodically drop or add a frame if necessary in order to maintain the proper rate. But that’s supposedly with MPC-HC using the CPU not the integrated Intel graphics.

I did use 4-1-1-info with WMC7 and it indicates the video has a frame rate of 59.9401, which I interpret as the frame rate of the actual video. When I play an MPEG-2 or VIDEO_TS recording, the frame rate comes up “not specified”.

How do I know what is really being output to my monitor or HDTV? Are frames being dropped or added in order to maintain the correct rate? If I add a video card how will I know if it changes anything, presuming something needs correcting?

barnabas1969

Posts: 5738
Joined: Tue Jun 21, 2011 7:23 pm
Location: Titusville, Florida, USA

HTPC Specs: Show details

#2

Post by barnabas1969 » Mon Aug 13, 2012 2:52 pm

Your monitor or HDTV will display whatever refresh rate is set in your video adapter's settings. In other words, Media Center does not change the refresh rate of the display adapter to match the frame rate of the content you are playing.

You can play your movie files in Total Media Theatre 5, and it will adjust the refresh rate automatically for 23.976 or 59.94 content.

richard1980

Posts: 2623
Joined: Wed Jun 08, 2011 3:15 am
Location:

HTPC Specs: Show details

#3

Post by richard1980 » Mon Aug 13, 2012 4:16 pm

Mike88 wrote:I’ve experimented with MPC-HC and it indicates an OTA recording is not playing at the exact correct frame rate. From what I understand it will periodically drop or add a frame if necessary in order to maintain the proper rate. But that’s supposedly with MPC-HC using the CPU not the integrated Intel graphics.
I believe you are talking about drop-frame timecode, which doesn't actually drop any frames at all. It just skips numbers in the counter: frame numbers 0 and 1 are skipped at the start of each minute, except for minutes divisible by 10. The frames are still there; they are just labeled 2 and 3 instead of 0 and 1. This is because TV content is not aired at exactly 30.000 or 60.000 frames per second. Instead, it is slowed down and aired at either 30*1000/1001 (29.97) or 60*1000/1001 (59.9401) frames per second. The use of drop-frame timecode allows the timecode to track with actual clock time.
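If it helps to see the counting rule concretely, here's a small sketch (my own illustration, not anything from a player's source) of the standard SMPTE drop-frame arithmetic in Python:

```python
def df_timecode(frame_count):
    """Convert a running frame count of 29.97 fps video into
    drop-frame timecode (HH:MM:SS;FF). No frames are discarded;
    two frame NUMBERS per minute are skipped, except every
    10th minute, so the timecode tracks wall-clock time."""
    fps = 30          # nominal rate used for the counter
    drop = 2          # frame numbers skipped per dropped minute
    per_min = fps * 60 - drop           # 1798 frames in a "drop" minute
    per_10min = per_min * 10 + drop     # 17982 frames per 10-minute block

    tens, rem = divmod(frame_count, per_10min)
    # the first minute of each 10-minute block keeps all 1800 numbers
    extra_minutes = 0 if rem < fps * 60 else (rem - fps * 60) // per_min + 1
    adjusted = frame_count + drop * (tens * 9 + extra_minutes)

    ff = adjusted % fps
    ss = (adjusted // fps) % 60
    mm = (adjusted // (fps * 60)) % 60
    hh = adjusted // (fps * 3600)
    return f"{hh:02d}:{mm:02d}:{ss:02d};{ff:02d}"

print(df_timecode(1800))    # 00:01:00;02 -- numbers ;00 and ;01 skipped
print(df_timecode(107892))  # 01:00:00;00 -- one real hour of 29.97 video
```

Note that exactly one hour's worth of 29.97 fps frames (107,892 of them) lands on 01:00:00;00, which is the whole point of the scheme.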

#4

Post by Mike88 » Tue Aug 14, 2012 2:44 am

In another forum there is frequent discussion that neither Intel nor AMD CPUs can play 23.976 fps properly. Some people use software or a video card to get it right; otherwise they sometimes see a glitch when a frame is dropped, which could happen anywhere from every few seconds to every few hours depending on how accurate the frame rate is. Some people notice this & others do not. This supposedly also happens with 29.97 fps video but is less noticeable.

From my beginner's knowledge of the situation, it seems the CPU runs at an even (whole-number) frame rate such as 24 or 30 fps. So the PC has to somehow adjust for this, typically using software or a video card. If these adjustments are not made, then a frame is dropped every so often so that the output averages out to the correct rate. So the frame rate is correct over a period of time, but it's a matter of how that is achieved.

All the material I’ve read centers around viewing with MPC-HC which can provide a readout & graph showing what is happening. And MPC-HC on my Intel HTPC does show inaccurate frame rates with corrections taking place every 16.66 seconds. Personally I have not noticed these glitches in the actual movies or programs. I have seen some glitches every so often in older TV programs & dismissed these because of the older content.

But I’m still learning a lot about HTPC video & am curious & was trying to find a way to view similar data when playing a movie with WMC7. I know that entering 4-1-1-info gives some data including the frame rate. But how is WMC7 achieving that rate if the CPU cannot do it? IOW does WMC7 drop frames the same as MPC-HC?

#5

Post by richard1980 » Tue Aug 14, 2012 4:05 am

16.66 seconds is 999.6 frames of 60.000 fps content. However, like I said before, content is not broadcast at 60.000 fps. It's broadcast at 59.9401 fps. 16.66 seconds is 998.6 frames of 59.9401 fps content. That's 1 frame of difference.

Like I said before, no frames are actually dropped. They are just counted differently. We were all taught to count 4 objects by saying "1, 2, 3, 4"...well that's not how you count 4 objects in drop frame timecode. Instead, you count those 4 objects by saying "1, 2, 5, 6"...it's still 4 objects, but the numbers are different.
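That arithmetic is easy to check yourself (my own back-of-envelope sketch, not anything a player reports): if a display refreshes at exactly 60.000 Hz while the content clock runs at 60000/1001 Hz, the two drift apart by one full frame roughly every 16.7 seconds:

```python
from fractions import Fraction

content = Fraction(60000, 1001)   # 59.9401... fps broadcast rate
display = Fraction(60, 1)         # exact 60.000 Hz refresh

drift = display - content          # frames of drift per second (= 60/1001)
seconds_per_frame_error = 1 / drift

print(float(seconds_per_frame_error))   # about 16.68 seconds per 1-frame error
```

That 1001/60 ≈ 16.68 s figure lines up with the "every 16.66 seconds" stopwatch measurement above, within measurement error.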

#6

Post by Mike88 » Tue Aug 14, 2012 9:39 am

The 16.66 seconds was measured with a stop watch so it may be slightly off.

I think I understand what you are saying about the timecode. But then I'm confused about all the talk about dropped frames and people seeing the glitches.

#7

Post by richard1980 » Tue Aug 14, 2012 2:06 pm

The only time frames should ever be dropped or added is when you are changing from one framerate to another framerate. As long as you are converting in multiples of the first frame rate, you'll never see a problem. For example, converting from 30.000 to 60.000 (and their slowed down counterparts) doesn't result in any problems because 60 is a multiple of 30. To convert from 30 to 60, you just show each frame twice. The end result is each source frame is still displayed for the same amount of time...1/30 of a second. (On a side note, this is the same concept used by TV manufacturers that advertise "120 Hz", "240 Hz", "480 Hz", "600 Hz", etc. refresh rates.)

But then consider a case where you need to convert 24.000 to 30.000 (or 23.976 to 29.97). 30 is not an even multiple of 24. So to get 30 frames from 24 frames, 6 frames must be added each second. There are a few different ways to do this, but no matter how you try to do it, there will be some sort of glitch. How noticeable that glitch is depends on the method used to get the 6 extra frames. This same thing applies to 23.976 to 59.94 conversion as well. To further complicate things, try converting from 23.976 to exactly 30.000 or 60.000. You'll end up with even more problems.
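One common way to get those extra frames is 3:2 pulldown: repeat source frames in an alternating 3, 2 pattern. A toy sketch of the idea (illustrative only, operating on frame labels rather than real video):

```python
def pulldown_32(frames):
    """Map 24 fps source frames onto 60 fields per second by
    repeating frames in an alternating 3, 2 pattern
    (classic 3:2 pulldown for film-to-NTSC conversion)."""
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

# 4 source frames -> 10 fields, so 24 frames/sec -> 60 fields/sec
print(pulldown_32(["A", "B", "C", "D"]))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

Because some source frames persist on screen longer than others, motion is no longer evenly spaced in time; that uneven cadence is one form of the glitch being described.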

#8

Post by Mike88 » Wed Aug 15, 2012 7:06 pm

I'm familiar with some of the problems of frame rate conversion, having converted 30.000 fps .avi files to 29.97 fps MPEG-2. I know just enough to get confused & into trouble. Most of what I've seen is in this thread:
http://www.avsforum.com/t/1333324/lets- ... ently-well

If you could, please read about the first 20 or 30 postings. This does not sound like a conversion problem but I could have a complete misunderstanding of the problem.

#9

Post by richard1980 » Wed Aug 15, 2012 11:51 pm

The issue discussed in that thread is the failure of GPUs to output exactly 23.976 fps (well, technically 24.000 * 1000/1001) when they are instructed to do so. The GPU is converting the content to a different output frame rate that is close to the desired value, but not exact. For example, a GPU might output the content at a frame rate that varies from the target by only +/- 0.001 fps. That doesn't sound like much, until you do all the math and figure out that the variance results in a 1-frame error every 16 minutes 40 seconds. Increase the variance to +/- 0.004 and that turns into a glitch every 4 minutes, 10 seconds. The more the GPU is off from the target output frame rate, the more often glitches will occur. And if you have a good enough eye to spot the glitches, I'm sure it can be very annoying.
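The relationship between the rate error and the glitch interval is simple to compute (a sketch of the same math, not a measurement tool):

```python
def seconds_per_glitch(target_fps, actual_fps):
    """Seconds until the accumulated timing error reaches one whole
    frame, at which point a frame must be dropped or repeated."""
    drift = abs(actual_fps - target_fps)   # frames of error per second
    return 1.0 / drift

print(seconds_per_glitch(23.976, 23.977))  # ~1000 s = 16 min 40 s
print(seconds_per_glitch(23.976, 23.980))  # ~250 s  = 4 min 10 s
```

The interval is just the reciprocal of the rate error, which is why halving the error doubles the time between glitches.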

#10

Post by Mike88 » Thu Aug 16, 2012 4:42 am

From what I understand, all the above frame rate data comes from MPC-HC and using ctrl+j to display it. But I get a very corrupted picture when trying to play a DVRMS video because DXVA does not work with Intel CPUs. So I had to disable the DXVA filter in MPC-HC.

But then it appears the video is using the CPU, not the GPU. And my CPU usage does indeed increase.

Can you compare what MPC-HC using the CPU does versus what WMC7 does using the GPU?

If software or video cards can affect or change the frame rate, that means there can be a difference in the method used to play a video. Not that there has to be, just that there can be.

How do I know if WMC7 is doing the same thing as MPC-HC?
Or is WMC7 possibly doing a better/worse job?

#11

Post by richard1980 » Thu Aug 16, 2012 11:36 am

Mike88 wrote:From what I understand, all the above frame rate data comes from MPC-HC and using ctrl+j to display it. But I get a very corrupted picture when trying to play a DVRMS video because DXVA does not work with Intel CPUs. So I had to disable the DXVA filter in MPC-HC.
To clarify, the DXVA problem is with MPC-HC. Intel GPUs support DXVA, but MPC-HC uses DXVA decoders that do not work correctly with all Intel GPUs.
Mike88 wrote:But then it appears the video is using the CPU, not the GPU. And my CPU usage does indeed increase.
This would be the expected behavior if you disable DXVA. Your CPU now has to do all the decoding and post-processing, whereas with DXVA enabled, a lot of those processes are off-loaded to the GPU.
Mike88 wrote:Can you compare what MPC-HC using the CPU does versus what WMC7 does using the GPU?
The difference between using DXVA and not using DXVA only affects the decoding and post processing. Final output is the responsibility of the GPU no matter what, and DXVA has nothing to do with whether or not the GPU can output the correct frame rate.
Mike88 wrote:If software or video cards can affect or change the frame rate, that means there can be a difference in the method used to play a video. Not that there has to be, just that there can be.

How do I know if WMC7 is doing the same thing as MPC-HC?
Or is WMC7 possibly doing a better/worse job?
There are a number of differences between using DXVA and not using DXVA, so yes, there is the possibility that one method works better than the other. However, as I stated above, DXVA has nothing to do with whether or not the GPU can output the correct frame rate.

#12

Post by Mike88 » Fri Aug 17, 2012 5:49 am

richard1980,

Thank you so much for the explanation on how all this works.

From what I’ve read, MPC-HC is sometimes used along with ReClock or other software in order to achieve the best possible frame rates, and I’m thinking probably along with a video card.

At this point in time I don’t notice any glitches with frame rates, but I haven’t used the HTPC a lot. If I felt a need to tweak frame rates, would a video card do the job without having to resort to extra software?
