A tool written in Rust to bleed an image's border colors into their transparent neighbors. It's primarily intended for images used in applications that apply bilinear filtering.
Using it is as simple as:
alpha_bleed <input.png> <output.png>
A very common method for scaling images up and down is bilinear filtering. Each pixel in the new image is mapped back into the original image space. This often results in coordinates that land somewhere in between the original pixels (i.e. floats as opposed to ints). The converted coordinates are then used to find the closest neighboring pixels (ints). Finally, by weighting the contribution of each neighbor, an interpolated pixel is produced for the new image.
local widthScale = oldWidth / newWidth
local heightScale = oldHeight / newHeight

for i = 0, newHeight - 1 do
    for j = 0, newWidth - 1 do
        -- map the new pixel back into the original image space
        local y = i * heightScale
        local x = j * widthScale

        -- find the closest neighboring pixels, clamped to the image bounds
        local x1 = math.floor(x)
        local x2 = math.min(oldWidth - 1, math.ceil(x))
        local y1 = math.floor(y)
        local y2 = math.min(oldHeight - 1, math.ceil(y))
    end
end
This is a flawed approach. It makes sense when every pixel is fully opaque, but not when transparency is involved.
For example, say we have a red and green pixel and we want to interpolate between the two.
The result would be:
Now let's imagine that the green pixel is fully transparent. If we do the interpolation this time, we get a value that visually doesn't make sense. The green still has an impact on the final result!
This tool helps mitigate the issue by finding all the pixels in the image that are not fully transparent, but that do have a fully transparent neighbor. It then bleeds out the r, g, and b channels into those fully transparent neighbors.
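A minimal sketch of that bleed step, assuming RGBA pixels stored row-major as `[u8; 4]` values. This single pass only fills transparent pixels that directly touch a visible one; the actual tool propagates colors further outward (the function name and layout here are illustrative, not the tool's real API):

```rust
// Bleed the RGB of visible pixels into their fully transparent neighbors.
// Transparent pixels keep alpha = 0; only their hidden color changes.
fn alpha_bleed(pixels: &mut Vec<[u8; 4]>, width: usize, height: usize) {
    let src = pixels.clone();
    for y in 0..height {
        for x in 0..width {
            let idx = y * width + x;
            if src[idx][3] != 0 {
                continue; // only rewrite fully transparent pixels
            }
            // average the RGB of all non-transparent 4-connected neighbors
            let (mut r, mut g, mut b, mut n) = (0u32, 0u32, 0u32, 0u32);
            for (dx, dy) in [(-1i32, 0i32), (1, 0), (0, -1), (0, 1)] {
                let (nx, ny) = (x as i32 + dx, y as i32 + dy);
                if nx < 0 || ny < 0 || nx >= width as i32 || ny >= height as i32 {
                    continue;
                }
                let p = src[ny as usize * width + nx as usize];
                if p[3] != 0 {
                    r += p[0] as u32;
                    g += p[1] as u32;
                    b += p[2] as u32;
                    n += 1;
                }
            }
            if n > 0 {
                // still invisible, but its color now matches its neighbors
                pixels[idx] = [(r / n) as u8, (g / n) as u8, (b / n) as u8, 0];
            }
        }
    }
}

fn main() {
    // a 2x1 image: opaque red next to a fully transparent pixel
    let mut img = vec![[255u8, 0, 0, 255], [0, 0, 0, 0]];
    alpha_bleed(&mut img, 2, 1);
    println!("{:?}", img[1]); // [255, 0, 0, 0]
}
```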
This results in a process that looks like this:
Let's see what this does to our end result with a few examples. First, let's revisit the red-with-transparent-green example.
We'd run alpha bleed on this which would result in the red bleeding into the green pixel and overwriting it:
Then when we bilinear interpolate the middle pixel we get:
Great! That's the result we'd expect!
However, alpha bleeding doesn't perfectly solve the issue. For example, let's imagine this is our image, and we are going to bleed into the transparent pixel.
In this case we get the same yellowish color as before. However, if we bilinear filter a new pixel between either the red or green, we're using this yellowish color as part of the calculation, which looks wrong:
It's not terrible, but it's not accurate either.
A better way of handling this would be for the graphics pipeline to support pre-multiplied alpha images. Instead of storing pixels as (r, g, b, a), they get stored as (r×a, g×a, b×a, a).
This causes the bilinear filtering result to be properly weighted. However, changes must also be made to the graphics pipeline to support this.
-- typical blend equation
result.rgb = (foreground.rgb * foreground.a) + (background.rgb * (1 - foreground.a))
-- pre-multiply support
result.rgb = foreground.rgb + (background.rgb * (1 - foreground.a))
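To see why pre-multiplication fixes the filtering itself, here is a toy comparison of averaging an opaque red pixel against a fully transparent green one, first with straight alpha and then pre-multiplied (a sketch; pixel values as `f32` in the 0–1 range):

```rust
// average two RGBA pixels channel by channel (the midpoint of a
// bilinear interpolation)
fn midpoint(p: [f32; 4], q: [f32; 4]) -> [f32; 4] {
    let mut out = [0.0; 4];
    for i in 0..4 {
        out[i] = (p[i] + q[i]) / 2.0;
    }
    out
}

// convert straight alpha to pre-multiplied alpha
fn premultiply(p: [f32; 4]) -> [f32; 4] {
    [p[0] * p[3], p[1] * p[3], p[2] * p[3], p[3]]
}

fn main() {
    let red = [1.0f32, 0.0, 0.0, 1.0];
    let clear_green = [0.0f32, 1.0, 0.0, 0.0];

    // straight alpha: the invisible green still tints the color
    println!("{:?}", midpoint(red, clear_green)); // [0.5, 0.5, 0.0, 0.5]

    // pre-multiplied: the transparent pixel contributes nothing,
    // giving pure red at half coverage
    println!("{:?}", midpoint(premultiply(red), premultiply(clear_green))); // [0.5, 0.0, 0.0, 0.5]
}
```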
It also means that images must be stored at a higher bit depth so that precision isn't lost. For example, 8-bit depth images would now need to be stored as 16-bit depth images.
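The precision loss at 8 bits is easy to demonstrate with integer arithmetic (a toy sketch; real pipelines may round differently):

```rust
// store a channel pre-multiplied at 8 bits
fn premultiply(r: u8, a: u8) -> u8 {
    ((r as u16 * a as u16) / 255) as u8
}

// recover the original straight-alpha channel value
fn unpremultiply(pr: u8, a: u8) -> u8 {
    if a == 0 { 0 } else { ((pr as u16 * 255) / a as u16) as u8 }
}

fn main() {
    // at low alpha, most of the color information is destroyed
    let (r, a) = (100u8, 3u8);
    let pr = premultiply(r, a);       // 100 * 3 / 255 = 1
    let back = unpremultiply(pr, a);  // 1 * 255 / 3 = 85, not 100
    println!("{} -> {} -> {}", r, pr, back);
}
```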
Alternatively, you might suggest modifying the bilinear filtering algorithm itself to properly weight the alpha channel. However, this has a performance cost, since we'd need to do quite a few more multiplications and inevitably have to temporarily convert to a higher bit depth as part of the process.
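One way such an alpha-weighted filter could look, for a mix of two pixels (an illustrative sketch, mathematically equivalent to pre-multiplying, interpolating, then un-pre-multiplying):

```rust
// interpolate two RGBA pixels, scaling each pixel's weight by its alpha
// so fully transparent pixels contribute no color at all
fn alpha_weighted_mix(p: [f32; 4], q: [f32; 4], t: f32) -> [f32; 4] {
    let wp = (1.0 - t) * p[3];
    let wq = t * q[3];
    let a = wp + wq; // the interpolated alpha is also the total weight
    if a == 0.0 {
        return [0.0, 0.0, 0.0, 0.0];
    }
    [
        (p[0] * wp + q[0] * wq) / a,
        (p[1] * wp + q[1] * wq) / a,
        (p[2] * wp + q[2] * wq) / a,
        a,
    ]
}

fn main() {
    let red = [1.0f32, 0.0, 0.0, 1.0];
    let clear_green = [0.0f32, 1.0, 0.0, 0.0];
    // the midpoint is pure red at half opacity -- no green leakage
    println!("{:?}", alpha_weighted_mix(red, clear_green, 0.5)); // [1.0, 0.0, 0.0, 0.5]
}
```

Note the extra multiplications and the division per channel: this is the per-sample cost the paragraph above refers to.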