End-to-end input latency measurement

I've been wanting to do end-to-end latency measurements for a long time, and it turns out to be easy to do. All it requires is something to generate a measurable signal on an input, and something to detect when the monitor responds to that input.

Sensing the input is trivial, since generating a signal is what mouse buttons inherently do. Here's an old mouse I modified by tacking a couple of wires across the left button, so that the click can be detected by an oscilloscope:

mouse.jpg


Sensing the change in output can be achieved with some sort of light sensor, such as a photodiode. I have some of those lying around, already bodged together into a form suitable for connecting to an oscilloscope. The particular one I used was an SLD-70BG2, which is sensitive to visible light, exactly what a monitor produces. I added a 100kΩ resistor in parallel to improve the response time. This is what it looks like:

photodiode.jpg
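
As an aside on why the resistor helps: the diode's junction capacitance and the load resistance form an RC filter, so a lower load resistance means a faster response. A rough sanity check (assuming a typical 1MΩ scope input; the junction capacitance is just a guessed ballpark, not from the datasheet):

Code:
// Rough RC estimate for the photodiode front end.
// The 100kΩ resistor sits in parallel with the scope's 1MΩ input.
const rScope = 1e6;        // ohms (typical scope input, assumed)
const rParallel = 100e3;   // ohms
const cJunction = 20e-12;  // farads - assumed ballpark, not a datasheet value
const rLoad = (rScope * rParallel) / (rScope + rParallel);
console.log(`tau ≈ ${(rLoad * cJunction * 1e6).toFixed(1)} µs`); // ≈1.8µs, far below the ms-scale latencies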


So what to measure? How about web browsers? People choose browsers based on general performance, but I've never heard anyone consider input latency differences before. For this test I wrote a simple page that flashes the screen on left click (full code below if anyone cares).

Code:
<!DOCTYPE html>
<html>
    <head>
        <meta charset="utf-8">
        <title>Mouse Click Flasher</title>
    </head>
    <body>
        <script>
            (function () {
                // White while the left button is held, black otherwise.
                const toggleBackground = event => {
                    if (event.button === 0) {
                        // Bit 0 of event.buttons is set while the left button is held.
                        document.body.style.background = event.buttons & 1 ? 'white' : 'black';
                    }
                };

                document.body.style.background = 'black';
                document.addEventListener('mousedown', toggleBackground);
                document.addEventListener('mouseup', toggleBackground);
            }());
        </script>
    </body>
</html>

Here's a pic of what it looks like on the oscilloscope:

scope.png


The blue line is the mouse button, which is pulled up to 5V normally (actually 4.7V here) and shorted to ground when the button is pressed. The red line is from the photodiode. You can clearly see the monitor's response time. There are two latencies available here: one for mouse-down and one for mouse-up. I used only the mouse-down latency, measured as the time from mouse-down to when the monitor reached 50% of the black-to-white transition (marked by the vertical lines in the pic).
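
If anyone wants to automate that step, here's a minimal sketch of how the mouse-down-to-50% time could be pulled out of exported sample data. It assumes two equally spaced sample arrays (button idling high, photodiode going low to high) and a known sample rate; the names and levels are mine, not from any particular scope's export format.

Code:
// Minimal sketch: mouse-down to 50% panel response, from two sampled traces.
// Assumes button[] and diode[] are equally spaced samples at sampleRate (Hz).
function measureLatencyMs(button, diode, sampleRate) {
    // Mouse-down: the button trace falls below half its idle (pulled-up) level.
    const down = button.findIndex(v => v < button[0] / 2);

    // 50% point of the photodiode's black-to-white swing, after the click.
    const lo = Math.min(...diode);
    const hi = Math.max(...diode);
    const lit = diode.findIndex((v, i) => i > down && v > (lo + hi) / 2);

    return (lit - down) / sampleRate * 1000;
}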

The monitor is a Samsung Neo G7, running at its default 165Hz, and with local dimming disabled for this test because it messes with the response time. It's in HDR mode and the browsers are only outputting SDR, so white sits below the panel's peak brightness, which is why it's possible for there to be overshoot on the transition from black to white.

Here's a summary of the results (all times in ms):

Code:
        Firefox     Chromium    Edge
Min     24.14       24.61       20.18
Max     45.15       42.82       49.98
Mean    32.78       33.06       34.28


Raw data below:

Code:
Firefox     Chromium    Edge
26.57       36.72       33.53
25.04       31.61       27.47
32.54       24.61       23.97
29.99       26.23       36.08
32.80       26.10       41.20
29.56       27.81       27.51
29.69       30.07       34.00
24.14       29.39       28.06
25.25       29.18       20.18
29.43       24.61       45.60
40.95       40.69       34.25
31.27       39.63       40.31
38.62       25.08       26.91
29.79       33.74       36.72
35.32       37.66       39.33
45.15       42.82       34.38
42.21       39.07       36.26
28.36       40.05       49.98
37.60       41.50       39.70
41.33       34.68       30.06
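
The summary is just the min/max/mean of each column, so it's trivial to re-derive, e.g.:

Code:
// Recompute the summary stats for one column of the raw data (times in ms).
const firefox = [26.57, 25.04, 32.54, 29.99, 32.80, 29.56, 29.69, 24.14, 25.25, 29.43,
                 40.95, 31.27, 38.62, 29.79, 35.32, 45.15, 42.21, 28.36, 37.60, 41.33];
const mean = firefox.reduce((a, b) => a + b) / firefox.length;
console.log(Math.min(...firefox), Math.max(...firefox), mean.toFixed(2)); // 24.14 45.15 32.78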

So... no meaningful difference. The range of values is quite wide, which is not surprising given that it depends on the JavaScript event loop, which isn't designed for low and consistent latency the way a game engine is.

It's not possible to work out what makes up the total latency, apart from the monitor's response time, which the photodiode measures directly. Since the measured latency covers the entire chain end-to-end, it includes all sorts of latencies that aren't normally measured, like the time the mouse takes to debounce the switch.
 
Apparently no one else (apart from Tetras) finds this interesting, but I do, so here are some more experiments.

First of all, the same measurement as before, but in a game. For this purpose I used Unreal Tournament 3, since I know how to mod it, it's old enough to run far above the monitor's refresh rate, and it's possible to cap the frame rate quickly and easily using console commands. I scripted a very simple custom weapon that turns the whole screen solid black, changing to white while the fire button is pressed. For anyone wanting to recreate this test, here is the mod, plus source code: https://mrevil.asvachin.com/comp/latency/UTMutator_Latency.zip. Extract the contents to %USERPROFILE%\Documents\My Games\Unreal Tournament 3, then select the UTMutator_Latency mutator when starting the game.

Here's what it looks like with an unlimited frame rate (>500):

centre-unlimited.png


As before, the blue trace is the mouse button (going from high to low when pressed, and back to high when released), and red is the photodiode (going from low to high when the weapon fires, and back to low when firing stops). I didn't bother taking multiple measurements and averaging them this time, but the above image is typical, at about 25ms.

Next up is one I haven't seen anyone else measure before - the time taken for the monitor to physically draw a frame from top to bottom. For this I used two photodiodes, one near the top of the screen and one near the bottom. Unlike the browser test, this time I used BPX 65 photodiodes with 510kΩ in parallel, because they are the only photodiodes I had a pair of in my parts drawer.

With the frame rate limited to 60 it looks like this:

top-bottom-60fps.png


The red trace is the top photodiode, the green one the bottom. Overall latency is pretty high now, but that's not the interesting part. The interesting part is that the top of the screen is drawn about 5ms before the bottom. Also note that the monitor overshoots quite a bit, which is typical for monitors with FreeSync and overdrive at lower frame rates.
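
That ~5ms gap fits the panel's native scan rate rather than the 60fps frame rate. A full refresh at 165Hz takes about 6.1ms, and the two photodiodes aren't quite at the extreme top and bottom edges, so a little over 5ms between them is about right if the panel always scans out at its native rate (the fraction of screen height the sensors cover is my guess):

Code:
// Scanout arithmetic: full-refresh time at the panel's native rate,
// scaled by the assumed fraction of screen height between the two sensors.
const refreshHz = 165;
const fullScanMs = 1000 / refreshHz;  // ≈6.06ms, top edge to bottom edge
const sensorSpan = 0.85;              // assumed: sensors ~85% of the height apart
console.log((fullScanMs * sensorSpan).toFixed(1) + 'ms'); // ≈5.2ms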

Now the same at 160fps (just below the monitor's 165Hz refresh rate):

top-bottom-160fps.png


Total latency is much lower now, and the overshoot is virtually gone, as expected. The difference between the top and bottom is the same though. I thought it would be less, but it's consistent with the scanout always running at the panel's native rate, regardless of frame rate.

And lastly the same again but with an uncapped frame rate, and VSync off. This requires two images, as I shall explain afterwards:

top-bottom-unlimited-normal.png


top-bottom-unlimited-torn.png


Depending on where the click falls relative to the scanout, you will see one of the above two images, more or less at random. The first image looks the same as the previous measurements. The second image, however, shows the bottom of the screen changing before the top. What's happening is that the weapon is firing part-way through a frame being drawn, so the top of the screen is still showing the frame from before firing, and the bottom the frame during firing. The top of the screen then catches up when the next frame is drawn. This is tearing.
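
Some rough arithmetic on why the torn case shows up so often here: at over 500fps the game delivers several new frames during every ~6ms scanout pass, so the fire event almost always lands part-way down the screen.

Code:
// Rough arithmetic: new frames delivered during one scanout pass.
const fps = 500;        // uncapped frame rate (>500 on this machine)
const refreshHz = 165;
console.log((fps / refreshHz).toFixed(1)); // ≈3 frames per scanout, so tears on most refreshes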

And that's one of the reasons why it's a good idea to cap the frame rate to just below your monitor's refresh rate: you get more consistent latency that way.

This also shows why you need to measure latency in the middle of the screen if you want to compare numbers with anyone else's.

Lastly, here's a photo of the whole setup, replete with random cables all over the place:

setup.jpg


The monitor under test is the one on the left. The two photodiodes are held against it by a sophisticated cardboard+tape contraption. It's positioned near the side of the monitor so the HUD doesn't interfere with the bottom photodiode. The monitor on the right shows the oscilloscope output. The oscilloscope itself is the small object with a green LED on it, on top of the PC, underneath the right monitor. The modded mouse is the one on the right on the "desk".
 