Backends: SDL_Renderer: Fix texture atlas format on big-endian hardware (#4927)

SDL_PIXELFORMAT_RGBA32 is intended for byte arrays where the channels
are specifically in RGBA order. However, this is not what
GetTexDataAsRGBA32 produces: it constructs an array where each pixel
is an unsigned int packed in ABGR order. On little-endian platforms
this does indeed yield an RGBA byte order, but on big-endian platforms
it yields an ABGR byte order, which does not match the pixel-format
enum.
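
To make the mismatch concrete, here is a minimal stand-alone sketch (not part
of the backend; the channel values are arbitrary examples) showing how a pixel
packed this way lands in memory on each endianness:

    /* Stand-alone illustration: ImGui packs each pixel as a 32-bit value with
     * R in the lowest byte and A in the highest (IM_COL32-style packing).
     * The byte layout of that value in memory depends on host endianness. */
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        /* A = 0x44, B = 0x33, G = 0x22, R = 0x11, packed as ABGR. */
        uint32_t pixel = (0x44u << 24) | (0x33u << 16) | (0x22u << 8) | 0x11u;

        unsigned char bytes[4];
        memcpy(bytes, &pixel, sizeof(bytes));

        /* Little-endian memory: 11 22 33 44 -> R,G,B,A (matches "RGBA32" byte order).
         * Big-endian memory:    44 33 22 11 -> A,B,G,R (does not match).           */
        printf("%02X %02X %02X %02X\n", bytes[0], bytes[1], bytes[2], bytes[3]);
        return 0;
    }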

The correct choice is the SDL_PIXELFORMAT_ABGR8888 enum, which
specifies that each pixel is a native-endian packed ABGR value,
exactly what GetTexDataAsRGBA32 produces.
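
For reference, SDL itself defines SDL_PIXELFORMAT_RGBA32 as an endian-dependent
alias, which is why the old code only happened to work on little-endian
machines. The sketch below mirrors (paraphrases, not a verbatim copy) that
logic from SDL_pixels.h; the EXAMPLE_* macro name is made up for illustration:

    /* Sketch only: assumes SDL2 headers are installed. SDL_BYTEORDER and the
     * pixel-format enums are real SDL symbols; EXAMPLE_RGBA32_ALIAS is a
     * hypothetical name standing in for SDL's own RGBA32 alias definition. */
    #define SDL_MAIN_HANDLED /* plain main(), no SDL_main entry point */
    #include <SDL.h>
    #include <stdio.h>

    #if SDL_BYTEORDER == SDL_BIG_ENDIAN
    #define EXAMPLE_RGBA32_ALIAS SDL_PIXELFORMAT_RGBA8888 /* packed R,G,B,A, high to low bits */
    #else
    #define EXAMPLE_RGBA32_ALIAS SDL_PIXELFORMAT_ABGR8888 /* packed A,B,G,R, high to low bits */
    #endif

    int main(void)
    {
        /* On little-endian hosts the alias equals ABGR8888, which is why
         * passing RGBA32 appeared correct there; on big-endian it does not. */
        printf("RGBA32 alias matches ABGR8888: %s\n",
               EXAMPLE_RGBA32_ALIAS == SDL_PIXELFORMAT_ABGR8888 ? "yes" : "no");
        return 0;
    }
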
Author:       Clownacy
Date:         2022-01-30 22:42:21 +00:00
Committed by: ocornut
Parent:       43177324c0
Commit:       c39192ba64

2 changed files with 2 additions and 1 deletion

@@ -210,7 +210,7 @@ bool ImGui_ImplSDLRenderer_CreateFontsTexture()
     io.Fonts->GetTexDataAsRGBA32(&pixels, &width, &height);   // Load as RGBA 32-bit (75% of the memory is wasted, but default font is so small) because it is more likely to be compatible with user's existing shaders. If your ImTextureId represent a higher-level concept than just a GL texture id, consider calling GetTexDataAsAlpha8() instead to save on GPU memory.

     // Upload texture to graphics system
-    bd->FontTexture = SDL_CreateTexture(bd->SDLRenderer, SDL_PIXELFORMAT_RGBA32, SDL_TEXTUREACCESS_STATIC, width, height);
+    bd->FontTexture = SDL_CreateTexture(bd->SDLRenderer, SDL_PIXELFORMAT_ABGR8888, SDL_TEXTUREACCESS_STATIC, width, height);
     if (bd->FontTexture == NULL)
     {
         SDL_Log("error creating texture");