Is there a way to determine the intended frame rate of content playing in the html video element?
Does the video element even know the intended FPS or frame count, or does it simply "guess" (maybe 24fps) and play at the guessed speed?
Here are my unsuccessful attempts:
Look for an FPS or frame-count property on the video element itself--Nope!
Look for cross-video-format header info about FPS or frame count--Nothing consistent!
Look for an event that fires when the frame changes--Nope!
My next attempt is more complicated: sample the video by capturing frames to a canvas element and count frames by detecting when the pixels change.
Does anyone have a simpler answer before I do the complicated attempt?
Knowing the frame-rate of the video wouldn't be as useful as you might think.
Browsers use a few tricks to match the frame rate of the movie to the refresh rate of the screen, so if you watch the currentTime property you'll see that the actual frame duration (== currentTime - previous currentTime) is not constant; it varies from frame to frame.
On this sample video: http://jsfiddle.net/3qs46n4z/3/ the pattern is:
4 frames at 21.3 ms + 1 frame at 32 ms + 5 frames at 21.3 ms + 1 frame at 32 ms.
So if you want to always display the latest frame on a canvas while avoiding overdraw, the solution might be to :
- On each rAF, look at the current time of the video :
• Same ? -> do nothing.
• Changed ? -> update frame.
And whatever you end up doing, comparing two currentTime values (i.e. two numbers) will be much faster than comparing two imageDatas ;-)
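The rAF polling loop above can be sketched as follows; the helper name and the element ids are my own, not from any API:

```javascript
// Minimal sketch of the rAF polling approach: redraw the canvas only when
// the video's reported currentTime has actually changed.
function makeFrameChangeDetector() {
  let lastTime = -1; // sentinel: no frame seen yet
  return function hasNewFrame(currentTime) {
    if (currentTime === lastTime) return false; // same frame -> skip the draw
    lastTime = currentTime;
    return true;
  };
}

// Browser usage (assumed element ids "vid" and "cv"):
// const video = document.getElementById('vid');
// const canvas = document.getElementById('cv');
// const ctx = canvas.getContext('2d');
// const hasNewFrame = makeFrameChangeDetector();
// function tick() {
//   if (hasNewFrame(video.currentTime)) {
//     ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
//   }
//   requestAnimationFrame(tick);
// }
// requestAnimationFrame(tick);
```

The detector itself is a pure number comparison, which is exactly why it beats pixel diffing.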
Edit: looking into the specification for evidence of this, I found a nuance in this note:
Which frame in a video stream corresponds to a particular playback position is defined by the video stream's format.
(Note of 4.8.6 at http://www.w3.org/TR/2011/WD-html5-20110113/video.html )
So strictly speaking, we can only say that (the current time is the same) implies (the frames are identical).
I can only bet that the converse holds as well => a different time means a different frame.
In the example above, Chrome is trying to match the 24 Hz of the movie on my 60 Hz screen by aiming for 45 Hz (= 60/2 + 60/4), the nearest it can get to 48 = 2 × 24. For the 21 extra frames created each second, I don't know whether it interpolates or merely duplicates frames. It surely changes depending on the browser/device (especially the GPU). I bet any recent desktop or powerful smartphone does interpolate.
Anyway, given the high cost of checking with imageData, you'd be much better off drawing twice than checking.
Rq1: I wonder to what extent using XOR mode + testing against 0, 32 bits at a time, could speed up the comparison. (getImageData is slow.)
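The 32-bits-at-a-time idea from Rq1 could be sketched like this (the function name is mine; it assumes two same-length RGBA buffers such as ImageData.data):

```javascript
// Compare two RGBA pixel buffers one 32-bit word (one pixel) at a time.
// Viewing the underlying ArrayBuffer through Uint32Array quarters the number
// of loop iterations; XOR plus a test against 0 is the trick from Rq1.
function imageDatasDiffer(bytesA, bytesB) {
  const a = new Uint32Array(bytesA.buffer, bytesA.byteOffset, bytesA.byteLength >> 2);
  const b = new Uint32Array(bytesB.buffer, bytesB.byteOffset, bytesB.byteLength >> 2);
  for (let i = 0; i < a.length; i++) {
    if ((a[i] ^ b[i]) !== 0) return true; // any differing pixel -> frames differ
  }
  return false;
}

// Browser usage sketch:
// const cur = ctx.getImageData(0, 0, w, h).data;
// if (imageDatasDiffer(cur, prev)) { /* new frame */ }
```

Note this only speeds up the compare itself; the dominant cost of getImageData remains.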
Rq2: I'm sure there's a hacky way to use playbackRate to 'sync' the video and the display, and to know which frames are genuine (== not interpolated) frames (so two passes here: 1) sync, 2) rewind and retrieve frames).
Rq3: If your purpose is to get each and every video frame, and only the video's frames, a browser is not the way to go. As explained above, (desktop) browsers interpolate to match the display frame rate as closely as possible; those frames were not in the original stream. There are even some high-end '3D' (2D + time) interpolation devices where the initial frames are not even meant to be displayed (!). On the other hand, if you are okay with the (interpolated) output stream, polling on rAF will provide every frame that you see (you can't miss a frame, except obviously if your app is busy doing something else).
Rq4: interpolation (== no duplicate frames) is 99.99% likely on a recent/decent GPU-powered desktop.
Rq5: Be sure to warm up your functions (call them 100 times on start) and to create no garbage, to avoid JIT/GC pauses.
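A tiny sketch of the Rq5 advice (the names are illustrative): run the hot function a number of times up front so the JIT optimizes it, and preallocate any buffers so the steady-state loop allocates nothing:

```javascript
// Preallocate once; reusing this buffer every frame creates no garbage,
// so the GC has nothing to collect during playback.
const scratch = new Uint32Array(64 * 36);

// Example hot function: sums a buffer using only integer arithmetic
// (the | 0 keeps the accumulator an int, avoiding boxed doubles).
function hotPath(buf) {
  let acc = 0;
  for (let i = 0; i < buf.length; i++) acc = (acc + buf[i]) | 0;
  return acc;
}

// Warm-up: call the hot path ~100 times at startup so it gets
// JIT-compiled before the real per-frame loop begins.
for (let i = 0; i < 100; i++) hotPath(scratch);
```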