presentation-time is a Wayland protocol that lets clients receive feedback on when their surfaces were actually shown on screen, along with a prediction of the next presentation time. It is useful for clients such as video players and games, which need precise timing feedback to schedule content for exactly the moment it will be shown to the user. It should also be useful to e.g. GTK for timing its animations precisely.
I wanted to try implementing this myself, as I have a bit of experience around this area (with sway/wlroots and from the client side, too), but I got a little lost in all the abstractions around Clutter, stage, actors and the master clock. Is there some documentation I missed for the general structure of how things work?
How would you like it to work
When a surface is committed after requesting a wp_presentation_feedback, the object is stored. If the surface receives another commit before its buffer is sampled by the compositor for rendering, the wp_presentation_feedback->discarded event is emitted. Otherwise, when the buffer of the surface is sampled for rendering, the wp_presentation_feedback object is retained so that it is not prematurely discarded by the next commit. Then, when the frame containing the sampled contents of the surface is presented and the frame timing from DRM is received, the wp_presentation_feedback->presented event is emitted with the timing information.
Relevant links, screenshots, screencasts etc.
weston-presentation-shm is a very useful client for debugging the presentation-time implementation.