GNOME / gtk · Issues · #2428 (Closed)
Issue created Feb 11, 2020 by David Hogan@dqh

OpenGL extension checking bug on macOS

In gdk/gdkglcontext.c, epoxy_has_gl_extension() is used to check for extensions such as GL_EXT_framebuffer_blit. On macOS, Epoxy fails to report support for this extension, so GDK/GTK operate without it. This results in a huge performance hit for our GtkGLArea-based GTK 3 application (VICE) when running on macOS.

The functionality IS supported, even on old Macs. However, I believe that once an extension has been promoted to a core OpenGL feature, the macOS OpenGL implementation stops advertising it in the extension list. That is, if you obtain a GL context whose version has promoted that extension to core (such as a 3.2 core profile context), the extension is no longer reported.

On my machine, VICE x64sc (GTK 3) by default uses about 75% of a host CPU core. If I modify GDK to request a legacy OpenGL context instead, epoxy_has_gl_extension() reports to GDK that GL_EXT_framebuffer_blit, GL_ARB_texture_non_power_of_two, and GL_ARB_texture_rectangle are all present, and VICE suddenly only requires about 45% of a single host CPU core.

Further, if I revert that modification (obtaining an OpenGL 3.2 context again) but change GDK to pretend that the above extensions were detected, I get the same dramatic host CPU performance gain.

I would like to propose that GDK check the OpenGL context version before checking for an extension. For example, GL_EXT_framebuffer_blit was promoted to core in OpenGL 3.0, so the line:

priv->has_gl_framebuffer_blit = epoxy_has_gl_extension ("GL_EXT_framebuffer_blit");

could be changed to:

priv->has_gl_framebuffer_blit = priv->gl_version >= 30 || epoxy_has_gl_extension ("GL_EXT_framebuffer_blit");

so that whenever an OpenGL context of version 3.0 or greater is obtained, the feature is treated as available regardless of what the extension list reports.

This change alone would deliver the huge performance benefit described above to our project, but possibly every use of epoxy_has_gl_extension() should be updated to check the OpenGL context version first. Other OpenGL implementations may likewise choose not to report core features as available extensions.

Are you still accepting patches for GTK 3.x? Is there a possibility of a future GTK 3.x release?
