From DisplayObject to Stage3D Texture
Flash makes it pretty easy to use any DisplayObject as a Stage3D texture. This is a great feature since you can use powerful, traditional classes like MovieClip, Sprite, TextField, and Shape to build a texture—often with vector graphics—and then use Stage3D's GPU hardware acceleration to render them with maximum performance. But this path is fraught with subtle problems, any one of which could result in poor rendering quality that's quite hard to debug. Today's article takes you through the process step by step to make sure you end up with great results.
First off, here's a high level view of the steps you must take to transform your DisplayObject into a Stage3D texture:
- Create vector graphics (e.g. explicit Shape objects, assets from Flash Pro)
- Convert to BitmapData using BitmapData.draw
- Create and upload to a Stage3D Texture
- Render triangles with a fragment shader that samples the texture
As you know, vector graphics are mathematically defined and, for all intents and purposes, of “perfect” quality at any rotation or scale. So you won’t have quality problems in step #1.
In step #2, BitmapData.draw has one parameter affecting quality: smoothing. It defaults to false, so setting it to true can really help in some cases. There is also a setting outside the call itself that affects quality: Stage.quality. If your Stage quality is low, you'll get a lower quality rendering to the BitmapData. An alternative to changing the entire Stage's quality is to use BitmapData.drawWithQuality, which accepts a quality argument directly. This has been around since Flash Player 11.3 and AIR 3.3, so it may require you to target a higher version of Flash Player than you're already using.
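As a concrete illustration of steps #1 and #2, here's a minimal sketch that rasterizes a vector Shape into a BitmapData. The 256x256 size and the circle are just placeholders for whatever vector content you actually have:

```as3
import flash.display.BitmapData;
import flash.display.Shape;
import flash.display.StageQuality;
import flash.geom.Matrix;

// Build a simple vector DisplayObject (step #1)
var shape:Shape = new Shape();
shape.graphics.beginFill(0xff0000);
shape.graphics.drawCircle(128, 128, 100);
shape.graphics.endFill();

// Rasterize it into a BitmapData (step #2)
var bmd:BitmapData = new BitmapData(256, 256, true, 0x00000000);

// Plain draw: the sixth parameter is 'smoothing'
bmd.draw(shape, new Matrix(), null, null, null, true);

// Or, if you can require Flash Player 11.3/AIR 3.3, control quality per call
bmd.drawWithQuality(shape, new Matrix(), null, null, null, true, StageQuality.BEST);
```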
Step #3 involves creating a texture, which can also cause the quality to be lower. The format parameter to Context3D.createTexture can be Context3DTextureFormat.BGRA for full quality, but the other formats all involve lower quality. The BGRA_PACKED and BGR_PACKED formats only use 16 bits per pixel, and the COMPRESSED and COMPRESSED_ALPHA formats use lossy texture formats (PVR, ETC, and DXT) via the ATF container format.
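Here's a sketch of step #3, assuming context3D is an already-obtained Context3D and bmd is the BitmapData from the previous step:

```as3
import flash.display3D.Context3D;
import flash.display3D.Context3DTextureFormat;
import flash.display3D.textures.Texture;

// Texture dimensions must be powers of two and match the BitmapData
var texture:Texture = context3D.createTexture(
    bmd.width,                     // width (power of two)
    bmd.height,                    // height (power of two)
    Context3DTextureFormat.BGRA,   // full-quality 32-bit format
    false                          // not a render-to-texture target
);

// Upload the rasterized DisplayObject as mip level 0
texture.uploadFromBitmapData(bmd, 0);
```

Note that if you only upload mip level 0, the fragment shader in the next step should sample with the mipnone flag.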
Lastly, step #4 is where your fragment shader samples the texture on a per-pixel basis. There are two possible problems in this step. First, the point you're sampling on the texture could be imprecise if, for example, integer clamping was applied at some phase. Second, textures are sampled using a variety of texture flags. The most important flags in this case are the ones controlling the filtering mode: nearest and linear. The nearest flag will result in a much blockier output since there is no smoothing applied, but you can guarantee that only colors in the texture will be used.
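For example, here's a minimal AGAL fragment shader (assembled with Adobe's AGALMiniAssembler utility class from the com.adobe.utils package) that samples the texture bound to sampler fs0 with linear filtering; swapping linear for nearest is the one-flag change that produces the blockier output:

```as3
import com.adobe.utils.AGALMiniAssembler;
import flash.display3D.Context3DProgramType;
import flash.utils.ByteArray;

// Sample the texture at the interpolated UV coordinate (v0) with linear
// filtering and no mipmapping, then write the color to the output
var fragmentSource:String =
    "tex ft0, v0, fs0 <2d, linear, mipnone, clamp>\n" +
    "mov oc, ft0";

var assembler:AGALMiniAssembler = new AGALMiniAssembler();
var fragmentBytes:ByteArray = assembler.assemble(
    Context3DProgramType.FRAGMENT,
    fragmentSource
);
```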
If you’re not sure at which step your problem is occurring, I’d recommend a guess-and-check approach until you’ve found the culprit. Here are the parameters:
- Turn on smoothing and increase the Stage.quality for your BitmapData.draw. If you can require Flash Player 11.3, switch to BitmapData.drawWithQuality instead and set smoothing to true and quality to BEST.
- Ensure your Context3D.createTexture is using the BGRA format instead of any of the others.
- Ensure your fragment shader texture sampling flags include linear and not nearest.
You may need to apply more than one of these techniques in order to solve the quality issue, but you want to apply as few of them as possible in order to keep performance high. Technique #1 will only affect the time required to create the textures, which is hopefully an infrequent process. Technique #2 will require more VRAM and more upload time than the other formats. The former will become an issue if you’re using lots of textures but the latter is only a concern at texture creation time. Finally, technique #3 will affect the drawing of every single pixel on every single frame. Luckily, most modern GPUs are so fast that the difference between linear and nearest-neighbor sampling is essentially non-existent.
#1 by Benjamin Guihaire on September 23rd, 2013
A few more notes on the subject:
Prior to 11.4, COMPRESSED_ALPHA textures are not (fully?) supported by Flash.
You can create a DXT texture from a BitmapData at runtime, but on mobile, PVR and ETC textures can't be created from a BitmapData at runtime, so the best you can do for size on mobile is to use BGRA_PACKED and BGR_PACKED.
There is another step that might be important: the creation of the mipmaps of your texture. You can halve your BitmapData multiple times using draw() with the correct matrix, but because Flash uses pre-multiplied alpha internally, this can produce poor results such as halos around transparent or semi-transparent pixels, and semi-transparent pixels getting darker and darker with each lower-resolution mipmap. One way around that problem is to write your own bitmap resizing filter (using a Pixel Bender shader, for example), or to access the pixels directly with getPixels or getPixel32 and do the resize manually.
Creating the mipmaps can be CPU intensive, so you might want to spread the work over multiple frames (one mipmap generated per frame, for example, and more per frame as the mipmaps get smaller), or skip mipmaps entirely; in that case, the shader needs to be written so that it doesn't expect mipmaps.
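For illustration, a minimal sketch of that halving approach, assuming a square, power-of-two texture and bmd from the steps above, and subject to the premultiplied-alpha caveat just described:

```as3
import flash.display.BitmapData;
import flash.display.StageQuality;
import flash.geom.Matrix;

// Naive mipmap chain: repeatedly halve the BitmapData with draw() and upload
// each level. Transparent content will show the halo/darkening artifacts
// described above; a custom resize avoids them.
var level:uint = 1;
var current:BitmapData = bmd;
while (current.width > 1)
{
    var half:BitmapData = new BitmapData(
        current.width / 2, current.height / 2, true, 0x00000000);
    half.drawWithQuality(
        current, new Matrix(0.5, 0, 0, 0.5), null, null, null, true,
        StageQuality.BEST);
    texture.uploadFromBitmapData(half, level);
    if (current != bmd)
    {
        current.dispose(); // free intermediate levels
    }
    current = half;
    level++;
}
```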
If you don't create the mipmaps, pay attention to the size of your generated texture: it should be neither too small nor too big, but roughly the size at which it will be rendered on your 3D canvas (which is achievable if you have an almost fixed camera, or if the generated texture is part of the UI rendered on the 3D canvas, for example with Starling).
Because of the pre-multiplied alpha, to render your texture in 3D you might also want to use the blending mode dst = (1) * src + (1 - alpha) * dst (instead of the default alpha blending dst = (alpha) * src + (1 - alpha) * dst).
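In Stage3D terms, that blend mode is set with something like this (a sketch, assuming context3D is your Context3D):

```as3
import flash.display3D.Context3D;
import flash.display3D.Context3DBlendFactor;

// dst = 1 * src + (1 - srcAlpha) * dst, the blend mode for premultiplied alpha
context3D.setBlendFactors(
    Context3DBlendFactor.ONE,
    Context3DBlendFactor.ONE_MINUS_SOURCE_ALPHA
);
```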
Benjamin Guihaire
http://www.guihaire.com/code