While Ren'Py is primarily used with the two-dimensional rectangular images common in visual novels, under the hood it has a model-based renderer intended to take advantage of features found in modern GPUs. This allows for a number of visual effects that would not otherwise be possible.
As a warning, this is one of the most advanced features available in Ren'Py. In many cases, it's not necessary to understand how model-based rendering works behind the scenes: features like matrixcolor and Live2D support can be used without understanding how model-based rendering works, and more such features will be added in the future. This documentation is intended for very advanced creators, and for developers looking to add to Ren'Py itself.
As of Ren'Py 7.4 (late 2020), model-based rendering needs to be enabled before it can be used. This is done by setting config.gl2 to True, using:

define config.gl2 = True

config.gl2 = False

If true, Ren'Py will default to using a model-based renderer.
As it's expected that model-based rendering will become the only renderer in the near future, the rest of this documentation is written as if model-based rendering is enabled all the time.
Model-based rendering is one of the most advanced features in Ren'Py, and this documentation may be hard to understand without first consulting the OpenGL, OpenGL ES, GLSL, and GLSL ES manuals. What's more, since portions of the models are passed directly to your GPU drivers, which may accept erroneous inputs, it's important to test on multiple kinds of hardware.
The fundamental thing that Ren'Py draws to the screen is a Model. A model consists of the following things:

- A Mesh that describes the geometry of the model.
- Zero or more Textures.
- A list of shader part names, used to select the shader program.
- Uniform values that are passed to the shader program.
- GL properties that set the OpenGL state used when drawing.
As Ren'Py usually draws more than one thing to the screen, it creates a tree of Render objects. These Render objects may have Models or other Renders as children. (A Render object can also be turned into a Model, as described below.) A Render contains:

- The children of the Render, each being either a Model or another Render.
- A Matrix that describes how the children are transformed in three-dimensional space.
- Lists of shader part names, uniforms, and GL properties that are passed down to the children.
- A clipping polygon.

Ren'Py draws the screen by performing a depth-first walk through the tree of Renders, until a Model is encountered. During this walk, Ren'Py updates a matrix transforming the location of the Model, a clipping polygon, and lists of shader parts, uniforms, and GL properties. When a Model is encountered as part of this walk, the appropriate shader program is activated on the GPU, all information is transferred, and a drawing operation occurs.
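As an illustration, the walk described above can be sketched in Python. The class and attribute names here are assumptions made for the sketch, not Ren'Py's internals, and the matrix and clipping-polygon updates are omitted:

```python
# Simplified sketch of the depth-first walk, with assumed names.
class Model:
    def __init__(self, shader_parts, uniforms):
        self.shader_parts = list(shader_parts)
        self.uniforms = dict(uniforms)

class Render:
    def __init__(self, children, shader_parts=(), uniforms=None):
        self.children = list(children)
        self.shader_parts = list(shader_parts)
        self.uniforms = dict(uniforms or {})

def walk(node, parts=(), uniforms=None, draws=None):
    """Accumulate shader parts and uniforms down the tree; each Model
    encountered becomes one draw operation."""
    draws = [] if draws is None else draws
    parts = list(parts) + node.shader_parts
    uniforms = dict(uniforms or {})
    uniforms.update(node.uniforms)

    if isinstance(node, Model):
        # All accumulated state is transferred, and a draw occurs.
        draws.append((parts, uniforms))
    else:
        for child in node.children:
            walk(child, parts, uniforms, draws)

    return draws
```

Each entry in the returned list corresponds to one drawing operation, with the shader parts and uniforms gathered from every Render above the Model.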
Ren'Py creates Models automatically as part of its normal operation. The main reason to understand where models are created is that models correspond to drawing operations, and hence are the units that shaders are applied to.
- Solid() creates a model with the u_renpy_solid_color uniform.
- Dissolve(), ImageDissolve(), AlphaDissolve(), Pixellate(), AlphaMask(), and Flatten() create models.
Transform() and ATL transforms create a model if mesh is true, or if blur is being used. In this case, the children of the Transform are rendered to textures, with the mesh of the first texture being used for the mesh associated with the model.
Not every transform creates a Model. Some transforms will simply add shaders and uniforms to a Render (such as transforms that use blur or alpha). Other transforms simply affect geometry.
A Render will create a model if its mesh attribute is True. In this case, the children of the Render are rendered to textures, with the mesh of the first texture being used for the mesh associated with the model.

It's expected that Ren'Py will add more ways of creating models in the future.
Ren'Py generates a shader program by first assembling a list of shader part names. This list consists of "renpy.geometry", the list of shader parts taken from Renders, and the list of shader parts found in the Model being drawn.
The shader parts are then deduplicated. If a shader part begins with "-", it is removed from the list, as is the rest of that part without the leading "-". (So "-renpy.geometry" will cause itself and "renpy.geometry" to be removed.)
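As a sketch, the deduplication and "-" removal could be implemented like this (the function name is an assumption for illustration, and this is not Ren'Py's actual code):

```python
def assemble_parts(parts):
    """Deduplicate shader part names, honoring the "-" removal rule."""

    # Drop duplicates while preserving order.
    seen = []
    for p in parts:
        if p not in seen:
            seen.append(p)

    # A part beginning with "-" removes both itself and the matching
    # part without the leading "-".
    removed = {p[1:] for p in seen if p.startswith("-")}
    return [p for p in seen if not p.startswith("-") and p not in removed]
```

So a list like ["renpy.geometry", "renpy.texture", "-renpy.geometry"] leaves only ["renpy.texture"].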
Ren'Py then takes the list of shader parts, and retrieves lists of variables, functions, vertex shader parts, and fragment shader parts. These are, in turn, used to generate the source code for shaders, with the parts of the vertex and fragment shaders being included in low-number to high-number priority order.
This means that any variable created by one of the shader parts will be accessible to every other fragment from any other shader in the list of shader parts. There is no scope, as in Python functions, to protect against interference between shaders.
Ren'Py keeps a cache of all combinations of shader parts that have ever been used in game/cache/shaders.txt, and loads them at startup. If major changes in shader use occur, this file should be edited or deleted so it can be re-created with valid data.
New shader parts can be created by calling the renpy.register_shader function and supplying portions of GLSL shaders.
Generally, shader parts should be of the form "namespace.part", such as "mygame.recolor" or "mylibrary.warp". Names beginning with "renpy." or "live2d." are reserved for Ren'Py, as are names beginning with _.
renpy.register_shader(name, **kwargs)

This registers a shader part. This takes name, and then keyword arguments.
The variables used by the shader part. These should be listed one per line, a storage (uniform, attribute, or varying) followed by a type, name, and semicolon. For example:
variables='''
uniform sampler2D tex0;
attribute vec2 a_tex_coord;
varying vec2 v_tex_coord;
'''
Other keyword arguments should start with vertex_
or fragment_
,
and end with an integer priority. So "fragment_200" or "vertex_300". These
give text that's placed in the appropriate shader at the given priority,
with lower priority numbers inserted before higher priority numbers.
Ren'Py supports only the following variable types: float, vec2, vec3, vec4, mat4 (which can be supplied as a Matrix), and sampler2D.

Uniform variables should begin with u_, attributes with a_, and varying variables with v_. Names starting with u_renpy_, a_renpy_, and v_renpy_ are reserved, as are the standard variables given below.
As a general sketch for priority levels, priority 100 sets up geometry, priority 200 determines the initial fragment color (gl_FragColor), and higher-numbered priorities can apply effects to alter that color.
Here's an example of a custom shader part that applies a gradient across each model it is used to render:
init python:
    renpy.register_shader("example.gradient", variables="""
        uniform vec4 u_gradient_left;
        uniform vec4 u_gradient_right;
        uniform vec2 u_model_size;
        varying float v_gradient_done;
        attribute vec4 a_position;
    """, vertex_300="""
        v_gradient_done = a_position.x / u_model_size.x;
    """, fragment_300="""
        float gradient_done = v_gradient_done;
        gl_FragColor *= mix(u_gradient_left, u_gradient_right, gradient_done);
    """)
The custom shader can then be applied using a transform:
transform gradient:
    shader "example.gradient"
    u_gradient_left (1.0, 0.0, 0.0, 1.0)
    u_gradient_right (0.0, 0.0, 1.0, 1.0)

show eileen happy at gradient
As stated before, the gradient_done variable from the example.gradient shader will be accessible to any and all other shaders applied from the same list. This can be useful for optional parts in a given shader system, but it can also lead to name collisions when using two independent shaders.
There is a config variable that can help in debugging custom shaders:

config.log_gl_shaders = False

If true, the source code for the GLSL shader programs will be written to log.txt on start.
Model-based rendering adds the following properties to ATL and Transform():

mesh
    Type: None or True or tuple
    Default: None

If not None, this Transform will be rendered as a model. This means the children of the Transform will be rendered to textures, a mesh will be created, and the shader parts applied to this Transform will be used to draw it in a single operation.
mesh_pad
    Type: None or tuple
    Default: None

If not None, this can be either a 2- or 4-component tuple. If mesh is true and this is given, this applies padding to the textures used by the mesh. A two-component tuple applies padding to the right and bottom, while a four-component tuple applies padding to the left, top, right, and bottom.
This can be used, in conjunction with the gl_pixel_perfect property, to render text into a mesh. In Ren'Py, text is rendered at the screen resolution, which might overflow the boundaries of the texture that will be applied to the mesh. Adding a few pixels of padding makes the texture bigger, which will display all pixels. For example:
transform adjust_text:
    mesh True
    mesh_pad (10, 0)
    gl_pixel_perfect True
    shader "shaders.adjust_text"
will ensure that the texture passed to the shader contains all of the pixels of the text.
shader
    Type: None or str or list of str
    Default: None

If not None, a shader part name or list of shader part names that will be applied to this Render (if a Model is created) or to the Models reached through this Render.
blend
    Type: None or str
    Default: None

If not None, this should be a string. This string is looked up in config.gl_blend_func to get the value for the gl_blend_func property. It's used to select alternate blend modes.

The default blend modes this supports are "normal", "add", "multiply", "min", and "max".
In addition, uniforms that start with u_ but not with u_renpy_ are made available as Transform properties. GL properties are made available as Transform properties starting with gl_. For example, the color_mask property is made available as gl_color_mask.
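For example, here is a sketch of a transform that sets both a custom uniform and a GL property. The mygame.effect shader part and its u_my_strength uniform are hypothetical, and would need to be registered with renpy.register_shader; gl_color_mask is the GL property described below:

```renpy
transform masked_effect:
    mesh True
    shader "mygame.effect"                   # hypothetical shader part
    u_my_strength 0.5                        # becomes the u_my_strength uniform
    gl_color_mask (True, True, True, False)  # GL property passed through to OpenGL
```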
config.gl_blend_func = { ... }

A dictionary used to map a blend mode name to a blend function. The blend function becomes the value of the gl_blend_func property, given below.
The default blend modes are:
gl_blend_func["normal"] = (GL_FUNC_ADD, GL_ONE, GL_ONE_MINUS_SRC_ALPHA, GL_FUNC_ADD, GL_ONE, GL_ONE_MINUS_SRC_ALPHA)
gl_blend_func["add"] = (GL_FUNC_ADD, GL_ONE, GL_ONE, GL_FUNC_ADD, GL_ZERO, GL_ONE)
gl_blend_func["multiply"] = (GL_FUNC_ADD, GL_DST_COLOR, GL_ONE_MINUS_SRC_ALPHA, GL_FUNC_ADD, GL_ZERO, GL_ONE)
gl_blend_func["min"] = (GL_MIN, GL_ONE, GL_ONE, GL_MIN, GL_ONE, GL_ONE)
gl_blend_func["max"] = (GL_MAX, GL_ONE, GL_ONE, GL_MAX, GL_ONE, GL_ONE)
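Following the pattern of the defaults above, a new blend mode can be registered and then selected with the blend transform property. The "screen" mode below is an illustrative sketch, not a Ren'Py built-in, and assumes premultiplied-alpha drawing:

```renpy
init python:
    from renpy.uguu import (GL_FUNC_ADD, GL_ONE,
        GL_ONE_MINUS_SRC_COLOR, GL_ONE_MINUS_SRC_ALPHA)

    # Hypothetical "screen" blend mode: dst = src + dst * (1 - src).
    config.gl_blend_func["screen"] = (
        GL_FUNC_ADD, GL_ONE, GL_ONE_MINUS_SRC_COLOR,
        GL_FUNC_ADD, GL_ONE, GL_ONE_MINUS_SRC_ALPHA)

transform screened:
    blend "screen"
```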
The following uniforms are made available to all Models.

vec2 u_model_size
    The width and height of the model.

float u_lod_bias
    The level of detail bias to apply to texture lookups. This may be set in a Transform. It defaults to config.gl_lod_bias, which defaults to -0.5, and biases Ren'Py to always pick the next bigger level and scale it down.

mat4 u_transform
    The transform used to project virtual pixels to the OpenGL viewport.

float u_time
    The time of the frame, in seconds.

vec4 u_random
    Four random numbers between 0.0 and 1.0.

vec4 u_viewport
    The x, y, width, and height of the viewport.

sampler2D tex0, sampler2D tex1, sampler2D tex2
    The textures associated with the model, if present.

vec2 res0, vec2 res1, vec2 res2
    The resolutions of the corresponding textures, if present.
The following attributes are available to all models:

vec4 a_position
    The position of the vertex being rendered.

If textures are available, so is the following attribute:

vec2 a_tex_coord
    The coordinate at which the textures are sampled.
GL properties change the global state of OpenGL, or the Model-Based renderer.
These properties can be used with a Transform, or with the Render.add_property()
function.
gl_blend_func
If present, this is expected to be a six-component tuple, which is used to set the equation used to blend the pixel being drawn with the pixel it is being drawn to, and the parameters to that equation.
Specifically, this should be (rgb_equation, src_rgb, dst_rgb, alpha_equation, src_alpha, dst_alpha). These will be used to call:
glBlendEquationSeparate(rgb_equation, alpha_equation)
glBlendFuncSeparate(src_rgb, dst_rgb, src_alpha, dst_alpha)
Please see the OpenGL documentation for what these functions do. OpenGL constants can be imported from renpy.uguu:
init python:
from renpy.uguu import GL_ONE, GL_ONE_MINUS_SRC_ALPHA
The blend
transform property is generally an easy way to
use this.
gl_color_mask

If present, this is expected to be a four-component tuple of booleans, which determines which of the red, green, blue, and alpha channels are written, by being passed to glColorMask.
gl_depth
If true, this will clear the depth buffer, and then enable depth rendering for this displayable and the children of this displayable.
Note that drawing any pixel, even transparent pixels, will update
the depth buffer. As a result, using this with images that have
transparency may lead to unexpected problems. (As an alternative,
consider the zorder
and behind
clauses of the show
statement.)
gl_pixel_perfect

When True, Ren'Py will attempt to draw the mesh so that its first vertex lines up with a screen pixel. This is mostly used in conjunction with text, to keep the text sharp.
The following properties only take effect when a texture is being created, by a Transform with mesh set, or by Model(), where these can be supplied using the property method.
gl_drawable_resolution

If present, this determines if the textures applied to a mesh are created at drawable resolution (True, the default) or at virtual resolution (False).
gl_anisotropic
If supplied, this determines if the textures applied to a mesh are created with anisotropy. Anisotropy is a feature that causes multiple texels (texture pixels) to be sampled when a texture is zoomed by a different amount in X and Y.
This defaults to true. Ren'Py sets this to False for certain effects, like the Pixellate transition.
gl_mipmap

If supplied, this determines if the textures applied to a mesh are created with mipmaps. It defaults to true.
gl_texture_wrap

When supplied, this determines how the textures applied to a mesh are wrapped. This expects a 2-component tuple, where the first component is used to set GL_TEXTURE_WRAP_S and the second component is used to set GL_TEXTURE_WRAP_T, which conventionally are the X and Y axes of the created texture.
The values should be OpenGL constants imported from renpy.uguu:
init python:
from renpy.uguu import GL_CLAMP_TO_EDGE, GL_MIRRORED_REPEAT, GL_REPEAT
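As a sketch, assuming the constants have been imported in an init python block as above (which makes them available to ATL expressions), a transform that tiles its child's texture might look like this. The shaders.tile shader part is hypothetical; it would need to sample coordinates outside the 0.0 to 1.0 range for the wrapping mode to matter:

```renpy
transform tiled:
    mesh True
    gl_texture_wrap (GL_REPEAT, GL_REPEAT)  # repeat on both axes
    shader "shaders.tile"                   # hypothetical shader part
```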
The Model displayable acts as a factory to create models for use with the model-based renderer.
Model(size=None, **properties)

This is a displayable that causes Ren'Py to create a 2D or 3D model for use with the model-based renderer. The model will be drawn in a single operation with the shaders given here, or those selected by an enclosing Transform or Displayable.
If no mesh method is called, a mesh that sets a_position and a_tex_coord to match the way Ren'Py loads textures is created if at least one texture is supplied. Otherwise, a mesh that only sets a_position is used.
All methods on this class return the displayable the method is called on, making it possible to chain calls.
child(displayable, fit=False)

This is the same as the texture method, except that the focus and main parameters are set to true.
grid_mesh(width, height)

Creates a mesh that consists of a width x height grid of evenly spaced points, connecting each point to the closest points vertically and horizontally, and dividing each rectangle in the grid so created into triangles.
property(name, value)

Sets the value of a gl property.
shader(shader)

Adds a shader to this model.
texture(displayable, focus=False, main=False, fit=False)

Adds a texture to this model, by rendering the given displayable. The first texture added will be tex0, the second tex1, and so on.
uniform(name, value)

Sets the value of a uniform that is passed to the shaders.
The Model displayable can be used in conjunction with an ATL transform and a built-in shader to create the Dissolve transform:
transform dt(delay=1.0, new_widget=None, old_widget=None):
    delay delay
    Model().texture(old_widget).child(new_widget)
    shader [ 'renpy.dissolve' ]
    u_renpy_dissolve 0.0
    linear delay u_renpy_dissolve 1.0
Using the Model displayable as the child of a displayable is incompatible
with mesh
, as the two both create models inside Ren'Py.
The following shader parts are built into Ren'Py.

renpy.geometry

Variables:
uniform mat4 u_transform;
attribute vec4 a_position;
Vertex shader:
gl_Position = u_transform * a_position;
renpy.blur

Variables:
uniform sampler2D tex0;
attribute vec2 a_tex_coord;
varying vec2 v_tex_coord;
uniform float u_renpy_blur_log2;
Vertex shader:
v_tex_coord = a_tex_coord;
Fragment shader:
gl_FragColor = vec4(0.);
float renpy_blur_norm = 0.;
for (float i = -5.; i < 1.; i += 1.) {
float renpy_blur_weight = exp(-0.5 * pow(u_renpy_blur_log2 - i, 2.));
renpy_blur_norm += renpy_blur_weight;
}
gl_FragColor += renpy_blur_norm * texture2D(tex0, v_tex_coord.xy, 0.);
for (float i = 1.; i < 14.; i += 1.) {
if (i >= u_renpy_blur_log2 + 5.) {
break;
}
float renpy_blur_weight = exp(-0.5 * pow(u_renpy_blur_log2 - i, 2.));
gl_FragColor += renpy_blur_weight * texture2D(tex0, v_tex_coord.xy, i);
renpy_blur_norm += renpy_blur_weight;
}
if (renpy_blur_norm > 0.0) {
gl_FragColor /= renpy_blur_norm;
} else {
gl_FragColor = texture2D(tex0, v_tex_coord.xy, 0.0);
}
renpy.dissolve

Variables:
uniform float u_lod_bias;
uniform sampler2D tex0;
uniform sampler2D tex1;
uniform float u_renpy_dissolve;
attribute vec2 a_tex_coord;
varying vec2 v_tex_coord;
Vertex shader:
v_tex_coord = a_tex_coord;
Fragment shader:
vec4 color0 = texture2D(tex0, v_tex_coord.st, u_lod_bias);
vec4 color1 = texture2D(tex1, v_tex_coord.st, u_lod_bias);
gl_FragColor = mix(color0, color1, u_renpy_dissolve);
renpy.imagedissolve

Variables:
uniform float u_lod_bias;
uniform sampler2D tex0;
uniform sampler2D tex1;
uniform sampler2D tex2;
uniform float u_renpy_dissolve_offset;
uniform float u_renpy_dissolve_multiplier;
attribute vec2 a_tex_coord;
varying vec2 v_tex_coord;
Vertex shader:
v_tex_coord = a_tex_coord;
Fragment shader:
vec4 color0 = texture2D(tex0, v_tex_coord.st, u_lod_bias);
vec4 color1 = texture2D(tex1, v_tex_coord.st, u_lod_bias);
vec4 color2 = texture2D(tex2, v_tex_coord.st, u_lod_bias);
float a = clamp((color0.a + u_renpy_dissolve_offset) * u_renpy_dissolve_multiplier, 0.0, 1.0);
gl_FragColor = mix(color1, color2, a);
renpy.solid

Variables:
uniform vec4 u_renpy_solid_color;
Fragment shader:
gl_FragColor = u_renpy_solid_color;
renpy.texture

Variables:
uniform float u_lod_bias;
uniform sampler2D tex0;
attribute vec2 a_tex_coord;
varying vec2 v_tex_coord;
Vertex shader:
v_tex_coord = a_tex_coord;
Fragment shader:
gl_FragColor = texture2D(tex0, v_tex_coord.xy, u_lod_bias);
renpy.matrixcolor

Variables:
uniform mat4 u_renpy_matrixcolor;
Fragment shader:
gl_FragColor = u_renpy_matrixcolor * gl_FragColor;
renpy.alpha

Variables:
uniform float u_renpy_alpha;
uniform float u_renpy_over;
Fragment shader:
gl_FragColor = gl_FragColor * vec4(u_renpy_alpha, u_renpy_alpha, u_renpy_alpha, u_renpy_alpha * u_renpy_over);