Add option to enforce a minimum latency to improve frame pacing #1139
base: master
Conversation
How was this not implemented yet?
I haven't checked the code yet, but I'll share my own thoughts. Ideally we should stabilise the entire frame path from frame capture to frame display; that way we also compensate for encoding and decoding time jitter. As the base timeline we can use frame capture timestamps (maybe we need to send them from Sunshine?), from which we can derive the delays between frames. We could even get remote VRR with this!
I'm a bit skeptical of such an end-to-end solution, since it's really hard to synchronize timestamps between two separate systems. You can't simply trust raw timestamps sent from a separate system (since their system clocks aren't synchronized), and you can't rely on an "ack" or "frame displayed" message, since that would include any network latency. Definitely an intriguing idea, just not sure how it would look in practice. Perhaps someone more knowledgeable than me could make it work.
The thing here is that you don't actually need absolute timestamps to be synchronised between the two systems. You can use the differences between remote timestamps to decide when to display frames on the local system: you keep the absolute delay unknown and just do frame pacing. The only thing you have to choose is some kind of base delay offset. If you base your local pacing timeline only on the first frame that arrives, it's likely that many frames will arrive later than needed, so it's essential to add some artificial delay. You could simply set it to a constant value, for example 100 ms after the first frame's arrival timestamp; that would give you the smoothest experience but also uncomfortable latency. So I guess the way to go here is to record frame arrival timings and calculate the delay needed to get, for example, 99.9% of frames under it. Such dynamic pacing latency may be overkill for LAN streaming, though; in that case jitter and latency are small enough that a constant (configurable) value may be fine.
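(Illustrative only, not from the Moonlight codebase: a minimal C++ sketch of the percentile idea described above, where the client records how late each frame arrives relative to a timeline built from the remote capture timestamps and then picks a pacing delay large enough to cover, say, 99.9% of observed frames. The class and names are made up.)

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Hypothetical helper: collects per-frame "lateness" samples and derives the
// artificial pacing delay needed to cover a chosen percentile of them.
class PacingDelayEstimator {
public:
    // latenessUs = (local arrival offset since the first frame)
    //            - (remote capture offset since the first frame)
    // i.e. how much later than the ideal schedule this frame arrived.
    void recordLatenessUs(int64_t latenessUs) {
        m_samples.push_back(latenessUs);
        if (m_samples.size() > kMaxSamples) {
            m_samples.erase(m_samples.begin());
        }
    }

    // Smallest delay that would have made `percentile` of the recorded frames
    // arrive before their scheduled display time.
    int64_t requiredDelayUs(double percentile = 0.999) const {
        if (m_samples.empty()) {
            return 0;
        }
        std::vector<int64_t> sorted(m_samples);
        std::sort(sorted.begin(), sorted.end());
        auto idx = static_cast<size_t>(percentile * (sorted.size() - 1));
        return sorted[idx];
    }

private:
    static constexpr size_t kMaxSamples = 1000;
    std::vector<int64_t> m_samples;
};
```

On a LAN the resulting delay would typically come out to only a few milliseconds, while over WiFi or WAN it would naturally grow with the observed jitter, which is the dynamic behaviour described above.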
I noticed that Parsec was smoother than Moonlight the last time I tried it (with comparable latency). I don't really know what they are doing, but as a commercial product they have definitely put effort into this and also have statistics from around the world to tune it. P.S. It may be a macOS-related issue; I may do some testing sometime in the future.
The thing we need here is a monotonic clock that is both monotonic and uniform, which is not always guaranteed (it seems to hold after checking the Rust docs for std::time::Instant; also see this answer on the C++ comparison).
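(For reference, a tiny C++ counterpart of that Rust Instant check, assuming a client-side pacing timeline would be built on std::chrono::steady_clock: is_steady guarantees the clock never goes backwards, although tick uniformity under suspend or frequency scaling remains a platform detail.)

```cpp
#include <chrono>
#include <iostream>

int main() {
    using clock = std::chrono::steady_clock;
    // Guaranteed by the standard: steady_clock never goes backwards.
    static_assert(clock::is_steady, "pacing needs a monotonic clock");

    auto prev = clock::now();
    // ... wait for the next frame to arrive ...
    auto now = clock::now();

    auto deltaUs = std::chrono::duration_cast<std::chrono::microseconds>(now - prev);
    std::cout << "inter-frame delta: " << deltaUs.count() << " us\n";
}
```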
Does this PR also help with audio crackling? I have an issue where, due to jitter (I think), audio is unstable. It's worse with the Android client, but it also happens with Windows as the client. The server is Sunshine on Windows (AMD encoding). Parsec runs with perfectly fine audio 100% of the time. I have tried all the settings and workarounds I have found on the internet.
Would love to see this get resurrected. It would be super to have for playing single-player action and RPG games over WiFi. I experience the exact same behavior described in the related "idea". My WiFi latency is under 5 ms the majority of the time, but even a 2 or 3 ms variance can cause frame drops. When streaming outside my home, where latency is closer to 30 ms, frames dropped due to jitter are nearly non-existent.
I was privileged to test the visual side of things (not sound yet) with +50 ms of buffer delay, which I think is a good tradeoff between queueing and still having a somewhat responsive gaming experience, and you know what?
After a discussion in Discord today, I ran some builds with the OP's patch (plus a couple of minor fixes) and they're available for a short while.
It's cool to see some interest in this PR. I've largely quit using game streaming, which is why I haven't touched this PR or rebased it recently. But if others find it useful, maybe I can touch it up. You're right that it doesn't buffer audio along with the video; that's definitely a miss. I'd also like to see a more quantitative evaluation of whether it improves pacing. The best I've done is monitoring the presentation frame time graphs on my Steam Deck.
Problem
If a user has a poor network connection (high jitter), they may experience instability in frame delivery. This manifests as stutters and is overall not a great experience. The problem is exacerbated when frames arrive near the end of the client's vsync period, since there is less time to present the frame before we roll over to the next vsync period.
Solution
To account for jitter in the network connection (or in the host PC capture rate), we can allow the user to purposefully buffer some frames. This trades latency for stability. However, we need to be careful with how the buffering is done. If we simply render a frame once the next is available, that gives us one frame of buffer. However, that doesn't solve the frame pacing issue at all, since the second frame's arrival time dictates the render time. We can't rely on any individual frame's arrival time for timing our frame rendering.
Instead, we need a consistent schedule for releasing frames for rendering. We can measure the average time a frame spends in the input queue before it is released for rendering, and based on that measurement gradually adjust our frame release schedule to target a desired minimum latency. The benefit of this approach is that the release schedule stays smooth rather than tracking any individual frame's arrival time.
This "minimum latency" approach is exactly what is implemented in this PR. (I'm also open to different naming... Maybe "buffered latency"?)
Related Items
Idea on Moonlight Ideas board: https://ideas.moonlight-stream.org/posts/251/frame-buffering-option
Testing
I've tested this (and other methods for smoothing frame delivery) extensively, and found this gives me the best experience. My hardware is:
I typically use this option with 6 milliseconds of minimum latency, vsync enabled, and frame pacing disabled (not supported on Steam Deck anyways).
I'm looking for others to test with various hardware configurations to see whether it offers a net improvement.