    cogl: Pick glReadPixels format by target, not source · 981b0454
    Pekka Paalanen authored
    Presumably glReadPixels itself can do pixel format conversions more
    efficiently than a fix-up conversion on the CPU afterwards. Hence, pick
    required_format based on the destination rather than the source, so
    that it has a better chance of avoiding the fix-up conversion.
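
    A minimal sketch of that idea, not the actual patch; the helper
    choose_gl_read_format() and the local variables are illustrative
    assumptions, only cogl_bitmap_get_format() and glReadPixels() are
    real API:

        /* Sketch: derive the GL read-back format from the destination
         * bitmap instead of the framebuffer's internal_format, so that
         * the driver performs any conversion inside glReadPixels and
         * the CPU fix-up pass is skipped whenever the formats already
         * match. */
        CoglPixelFormat dest_format = cogl_bitmap_get_format (bitmap);
        GLenum gl_format, gl_type;

        /* Hypothetical helper: reports the closest format GL can
         * actually deliver and fills in the matching format/type pair
         * for glReadPixels. */
        CoglPixelFormat required_format =
          choose_gl_read_format (ctx, dest_format, &gl_format, &gl_type);

        glReadPixels (x, y, width, height, gl_format, gl_type, data);

        if (required_format != dest_format)
          {
            /* Only now is a CPU-side fix-up conversion still needed. */
          }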
    
    With CoglOnscreen objects, CoglFramebuffer::internal_format (the source
    format) is also wrong. It is left at a default value and never set to
    reflect reality. In other words, read-pixels used an arbitrary
    intermediate pixel format in glReadPixels, and a fix-up conversion then
    made the result match the destination.
    
    The render buffers (GBM surface) are allocated as DRM_FORMAT_XRGB8888.
    If the destination buffer is allocated with the same format, Cogl's
    read-pixels first converts XRGB -> ABGR in glReadPixels because of the
    default format above, and then the fix-up conversion does ABGR -> XRGB.
    This case was observed with DisplayLink outputs, where the native
    renderer must use the CPU copy path to fill the "secondary GPU"
    framebuffers.
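
    For illustration only (which Cogl enum corresponds to the "ABGR"
    intermediate is an assumption, not taken from the file), the old path
    boiled down to two conversions where none was needed:

        /* The destination buffer has the same layout as the XRGB8888
         * render buffer, yet the old code still converted twice:
         *
         *   glReadPixels:  XRGB -> ABGR   (arbitrary default
         *                                  internal_format)
         *   CPU fix-up:    ABGR -> XRGB   (destination format)
         *
         * Picking required_format from the destination collapses both
         * steps into a direct read. */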
    
    This patch stops using internal_format and uses the desired destination
    format instead.
    
    _cogl_framebuffer_gl_read_pixels_into_bitmap() will still use
    internal_format to determine the alpha premultiplication state and
    multiply or un-multiply as needed. Luckily, all the formats involved in
    the DisplayLink use case are _PRE, and so is the default
    internal_format, so things work in practice.
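
    A hedged sketch of that premultiplication handling; the helper
    fixup_premult_in_place() is a hypothetical stand-in, only
    COGL_PREMULT_BIT and cogl_bitmap_get_format() are real API:

        /* After glReadPixels has filled the bitmap, compare the
         * premultiplied-alpha bit of the assumed source format
         * (internal_format) with that of the destination and convert
         * only when they disagree. */
        gboolean src_premult  = (internal_format & COGL_PREMULT_BIT) != 0;
        gboolean dest_premult =
          (cogl_bitmap_get_format (bitmap) & COGL_PREMULT_BIT) != 0;

        if (src_premult != dest_premult)
          {
            /* Hypothetical helper: multiply or un-multiply alpha in
             * place to match the destination. */
            fixup_premult_in_place (bitmap, dest_premult);
          }

        /* In the DisplayLink case both formats are _PRE, so this
         * branch is never taken. */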
    
    Furthermore, the GL texture_swizzle extension can never apply to
    glReadPixels, not even with FBOs, as found in this discussion:
    #72
    Therefore the target_format argument is hardcoded to a value that can
    never match any real format, which prevents the swizzle from being
    assumed.
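
    A sketch of that workaround; the function pixel_format_to_gl_for_read()
    is illustrative, and which impossible value the patch actually passes
    as target_format is an assumption here:

        /* GL can never apply a texture swizzle to glReadPixels, not
         * even when reading from an FBO.  Passing a target_format that
         * cannot match any real format ensures the "swizzle instead of
         * convert" shortcut is never assumed for read-backs. */
        CoglPixelFormat required_format =
          pixel_format_to_gl_for_read (ctx,
                                       dest_format,
                                       /* target_format: deliberately
                                        * impossible match */
                                       COGL_PIXEL_FORMAT_ANY,
                                       &gl_format, &gl_type);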
    
    !313
cogl-framebuffer-gl.c 52.7 KB