Monday 15 June 2015

linux - How can I determine the length of time since the last screen refresh on X11?


I am trying to debug a machine vision application by writing a timestamp in a terminal window and then watching how long it takes the camera to 'detect' the change on the screen. My monitor has a refresh rate of 60 Hz, so the screen is updated every ~17 ms. Is there a way, from an X11 application, to determine where within that 17 ms window the refresh currently is?
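For reference, one way to get at this on Linux is to ask the kernel's DRM layer for the timestamp of the most recent vblank rather than going through X11 itself. The sketch below is only a rough illustration under some assumptions: it uses libdrm's drmWaitVBlank on /dev/dri/card0 (that device path and the presence of a KMS-capable driver are assumptions, not something the question states), and it compares against CLOCK_MONOTONIC, which matches the vblank timestamps only when the driver reports DRM_CAP_TIMESTAMP_MONOTONIC (the default on recent kernels).

/*
 * Sketch: query the timestamp of the most recent vblank through libdrm.
 * Assumes read/write access to /dev/dri/card0 and a KMS-capable driver.
 * Build with: gcc vblank.c -o vblank $(pkg-config --cflags --libs libdrm)
 */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <time.h>
#include <xf86drm.h>

int main(void)
{
    int fd = open("/dev/dri/card0", O_RDWR);   /* device path is an assumption */
    if (fd < 0) {
        perror("open /dev/dri/card0");
        return 1;
    }

    drmVBlank vbl;
    memset(&vbl, 0, sizeof(vbl));
    vbl.request.type = DRM_VBLANK_RELATIVE;    /* relative to now */
    vbl.request.sequence = 0;                  /* 0 = return immediately with the last vblank */

    if (drmWaitVBlank(fd, &vbl) != 0) {
        perror("drmWaitVBlank");
        return 1;
    }

    /* Compare the current time with the last vblank timestamp to see how far
     * into the ~17 ms refresh window (at 60 Hz) we currently are.
     * Assumes the vblank timestamps are on CLOCK_MONOTONIC. */
    struct timespec now;
    clock_gettime(CLOCK_MONOTONIC, &now);
    long elapsed_us = (now.tv_sec - vbl.reply.tval_sec) * 1000000L
                    + (now.tv_nsec / 1000 - vbl.reply.tval_usec);
    printf("last vblank: seq %u, %ld us ago\n", vbl.reply.sequence, elapsed_us);
    return 0;
}

Subtracting the last vblank timestamp from the current time gives the position within the refresh interval, which is the quantity the question asks about.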

Edit: After wrestling with the problem for nearly a day, I think the real question I should have asked is how to create a visual indication that changes fast enough to test the camera against. My working theory was that the camera was buffering frames before delivering them, because the video stream appeared to lag behind other digital events (in this case

'xrefresh' is a tool that can trigger a refresh event on the X server. It does this by painting a window of a specified color over the whole screen and then removing it, so that all the windows underneath have to be redrawn. Even with this, I was still getting very inconsistent results when comparing the captured frames against the monitor output: no matter what I tried, the video stream appeared to lag behind the expected monitor state. That could mean either that the camera was slow to deliver frames or that the monitor was slow to update. Fortunately, I finally hit on the idea of using the keyboard LEDs to verify the synchronization of the camera frames ('xset led' and 'xset -led'). That showed me immediately that it was in fact my monitor that was slow to update, not the camera that was lagging behind.
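To reproduce the keyboard LED trick programmatically rather than through xset, an X11 client can toggle an LED with XChangeKeyboardControl and log a timestamp for each toggle, giving the camera a reference event that does not depend on the monitor at all. The sketch below is a minimal illustration under assumptions: LED number 3 (usually Scroll Lock, the same number you would pass to 'xset led 3') and the half-second toggle interval are choices of mine, not anything confirmed in the post.

/*
 * Sketch: toggle keyboard LED 3 from an X11 client and log a timestamp for
 * each toggle, as a low-latency reference event the camera can film.
 * This mirrors what `xset led 3` / `xset -led 3` do from the command line.
 * Build with: gcc led_blink.c -o led_blink -lX11
 */
#include <stdio.h>
#include <time.h>
#include <X11/Xlib.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) {
        fprintf(stderr, "cannot open display\n");
        return 1;
    }

    XKeyboardControl kc;
    for (int i = 0; i < 20; i++) {
        kc.led = 3;                                   /* LED number, an assumption */
        kc.led_mode = (i % 2) ? LedModeOff : LedModeOn;
        XChangeKeyboardControl(dpy, KBLed | KBLedMode, &kc);
        XSync(dpy, False);                            /* wait until the server has processed it */

        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        printf("LED %s at %ld.%09ld\n",
               (i % 2) ? "off" : "on", (long)ts.tv_sec, ts.tv_nsec);

        struct timespec delay = { 0, 500000000L };    /* 500 ms between toggles */
        nanosleep(&delay, NULL);
    }

    XCloseDisplay(dpy);
    return 0;
}

Filming the keyboard while this runs and comparing the logged timestamps with the frames in which the LED visibly changes separates camera latency from monitor latency, which is exactly the distinction the xset experiment revealed.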
