Why Is Matrix Math Different in Every Graphics Program?

Written by: Danny Spencer

What gives?

I remember being very frustrated when trying to learn about matrix math in 3D graphics software.

I wondered why matrix multiplication seemed "backwards" in certain programs.

You read an OpenGL or Blender tutorial that says "It's easy! Think of matrix multiplications as applying transformations from right-to-left!"

Then you see a DirectX or Maya tutorial that says something contradictory. "It's easy! Think of matrix multiplications as applying transformations from left-to-right!"

Here's why

There are different conventions for representing transformation matrices. The authors of various 3D packages decide to use one convention over another. It really just boils down to that.

I've come up with two core ideas that explain why matrix math appears different and confusing from one program to the next:

  1. Element order: Row-major vs Column-major order
  2. Transform style: Pre-multiply row-vector (y = xA) vs post-multiply column-vector (y = Ax)

Element order: Row-major vs Column-major order

From the Wikipedia article: https://en.wikipedia.org/wiki/Row-_and_column-major_order

Row-major and column-major order answer the question: "How do we describe a matrix as a flat list of numbers?"

Given the 4x4 matrix:

A = \begin{bmatrix} 1&0&0&5\\ 0&1&0&10\\ 0&0&1&15\\ 0&0&0&1 \end{bmatrix}

We can write it as row-major:

float A[16] = {1,0,0,5, 0,1,0,10, 0,0,1,15, 0,0,0,1};

Or as column-major:

float A[16] = {1,0,0,0, 0,1,0,0, 0,0,1,0, 5,10,15,1};

Row-major and column-major order can refer to the memory layout of a matrix data structure and how it's indexed, or to the order in which elements are written when a matrix is initialized from code. I find the latter more useful in most contexts.

Note that element ordering virtually never describes a visual representation of a matrix. If you're pretty-printing a matrix or you see one in a user interface and it's arranged in rows and columns, then they're rows and columns!

With row-major or column-major order, the math doesn't change. It only affects the way the matrix is represented as a flat list.
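
If you want to convince yourself of this, here's a quick sketch in NumPy (my own choice for illustration; NumPy isn't part of any of the programs discussed here) showing that both flat lists rebuild the same matrix once you tell the library which order to read them in:

import numpy as np

# The matrix A from above, written out in rows and columns.
A = np.array([[1, 0, 0,  5],
              [0, 1, 0, 10],
              [0, 0, 1, 15],
              [0, 0, 0,  1]])

row_major    = [1, 0, 0, 5,   0, 1, 0, 10,   0, 0, 1, 15,   0, 0, 0, 1]
column_major = [1, 0, 0, 0,   0, 1, 0, 0,    0, 0, 1, 0,    5, 10, 15, 1]

# order='C' reads the flat list row by row, order='F' reads it column by column.
assert np.array_equal(np.array(row_major).reshape(4, 4, order='C'), A)
assert np.array_equal(np.array(column_major).reshape(4, 4, order='F'), A)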

Transform style: Pre-multiply row-vector (y = xA) vs post-multiply column-vector (y = Ax)

We use the notation y = xA and y = Ax. y describes an output vector, x describes an input vector, and A describes a transformation matrix. Please do not confuse x and y with the x and y axes.

When we say y = xA, the operation looks like this when working with a 4x4 matrix and R^3 vectors:

T(\begin{bmatrix}x&y&z\end{bmatrix}) = \begin{bmatrix}x&y&z&1\end{bmatrix}\begin{bmatrix}5&0&0&0\\0&10&0&0\\0&0&15&0\\30&0&0&1\end{bmatrix}

When we say y = Ax, the operation looks like this:

T\left( \begin{bmatrix}x\\y\\z\end{bmatrix} \right) = \begin{bmatrix} 5 & 0 & 0 & 30\\ 0 & 10 & 0 & 0\\ 0 & 0 & 15 & 0\\ 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix}x\\y\\z\\1\end{bmatrix}

Both of these operations yield equivalent vectors; the only difference is that the first result is a row vector and the second is a column vector. Notice also that the two matrices are transposes of each other.
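
To check this numerically, here's another small NumPy sketch (illustrative only): transforming the same point with either style produces the same numbers.

import numpy as np

A_col = np.array([[ 5,  0,  0, 30],   # y = Ax style: translation in the last column
                  [ 0, 10,  0,  0],
                  [ 0,  0, 15,  0],
                  [ 0,  0,  0,  1]])
A_row = A_col.T                       # y = xA style: the same transform, transposed

p = np.array([1, 2, 3, 1])            # a point in homogeneous coordinates

y_col = A_col @ p                     # post-multiply a column vector
y_row = p @ A_row                     # pre-multiply a row vector

assert np.array_equal(y_col, y_row)   # both give [35, 20, 45, 1]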

Pros of y = Ax:

  - It matches the y = f(x) notation from linear algebra, and chained transforms read right-to-left: in y = BAx, A is applied first, then B.

Pros of y = xA:

  - Chained transforms read left-to-right: in y = xAB, A is applied first, then B, which is the order you'd naturally describe them in.

How do you test how a program does matrix math?

We need to figure out both the element order and the transform style.

Mini-quiz: We perform a translation of (10,20,30). What's the element order of the following matrix?

float A[16] = {1,0,0,10, 0,1,0,20, 0,0,1,30, 0,0,0,1};

a) Row-major, b) Column-major, or c) Not enough information

Answer below:

...

...

If you guessed c, then you're correct. There is not enough information. This matrix could be either row-major or column-major depending on the transform style:

Row-major and post-multiply column-vector (y = Ax):

\begin{bmatrix}x'\\y'\\z'\\1\end{bmatrix} = \begin{bmatrix}1&0&0&10\\0&1&0&20\\0&0&1&30\\0&0&0&1\end{bmatrix}\begin{bmatrix}x\\y\\z\\1\end{bmatrix}

Column-major and pre-multiply row-vector (y = xA):

\begin{bmatrix}x'&y'&z'&1\end{bmatrix} = \begin{bmatrix}x&y&z&1\end{bmatrix}\begin{bmatrix}1&0&0&0\\0&1&0&0\\0&0&1&0\\10&20&30&1\end{bmatrix}
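
Here's the quiz worked out in a small NumPy sketch (again, purely illustrative): the very same flat list produces the same translated point under both readings.

import numpy as np

flat = [1, 0, 0, 10,   0, 1, 0, 20,   0, 0, 1, 30,   0, 0, 0, 1]
p = np.array([1, 2, 3, 1])

# Reading 1: row-major element order, post-multiply column vector (y = Ax)
A = np.array(flat).reshape(4, 4, order='C')

# Reading 2: column-major element order, pre-multiply row vector (y = xA)
B = np.array(flat).reshape(4, 4, order='F')

assert np.array_equal(A @ p, p @ B)   # both give [11, 22, 33, 1]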

It's often difficult to find documentation that formally describes this for most 3D graphics software (if such documentation exists at all), so you might have to figure it out on your own.

I've used a few tricks:

  1. Build a translation by distinct amounts like (10,20,30), then print the matrix or read back its flat elements and note where those numbers land.
  2. Transform a known point and check whether the result matches y = xA or y = Ax.
  3. Check the built-in transform helpers (e.g. "translate", "rotate", "scale") and the order in which chained calls are applied.

y = xA and y = Ax don't affect basic matrix operations

Those two styles only prescribe how transform matrices are created. Given known matrices, they don't actually affect the basic matrix operations or the order of their arguments.

For example, with matrix multiplication, the argument order only appears to swap between programs because the matrices themselves are not the same (they are transposes of each other).

No matter which style a program uses, basic operations like multiplication, inversion, and transposition are written the same way across them.
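
As a sketch of what I mean (NumPy again, purely for illustration): composing and inverting transforms is written the same way in both styles; only the matrices themselves differ, by a transpose.

import numpy as np

# A translation and a scale, written in y = Ax (column-vector) style...
T_col = np.array([[1, 0, 0, 10], [0, 1, 0, 20], [0, 0, 1, 30], [0, 0, 0, 1]])
S_col = np.diag([5, 10, 15, 1])

# ...and the same transforms written in y = xA (row-vector) style.
T_row, S_row = T_col.T, S_col.T

# "Scale, then translate" in each style. The multiplication operator is the
# same; only the matrices (and therefore the reading order) differ.
M_col = T_col @ S_col            # read right-to-left
M_row = S_row @ T_row            # read left-to-right
assert np.array_equal(M_col, M_row.T)

# Inversion and transposition are likewise unaffected by the transform style.
assert np.allclose(np.linalg.inv(M_col), np.linalg.inv(M_row).T)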

Appendix: a table of matrix math conventions for popular programs

The results in this table are all from my own independent research. I've tried and tested the code in the table. Feel free to fact-check!

Each entry lists the application, its matrix initialization order*, its transform style, the language, and an example that translates by x=10, y=20, z=30.

Blender (Python): Row-major, y = Ax
cube.matrix_local = Matrix( ((1,0,0,10),(0,1,0,20),(0,0,1,30),(0,0,0,1)) )

Maya (MEL, Python): Row-major, y = xA
cmds.setAttr('pCube1.offsetParentMatrix', [1,0,0,0, 0,1,0,0, 0,0,1,0, 10,20,30,1], type='matrix')

Houdini (VEX): Row-major, y = xA
@P = @P * {1,0,0,0, 0,1,0,0, 0,0,1,0, 10,20,30,1};

Cinema 4D (Python): Column-major*, y = Ax
out = c4d.Matrix(off=c4d.Vector(10,20,30), v1=c4d.Vector(1,0,0), v2=c4d.Vector(0,1,0), v3=c4d.Vector(0,0,1)) * pos

Unreal Engine (C++): Row-major, y = xA
SetActorTransform(FTransform(FMatrix({ 1,0,0,0 }, { 0,1,0,0 }, { 0,0,1,0 }, { 10,20,30,1 })));

Unity (C#): Column-major, y = Ax

CSS (CSS): Column-major, y = Ax
transform: matrix3d(1,0,0,0, 0,1,0,0, 0,0,1,0, 10,20,30,1)

GLSL (GLSL): Column-major, y = Ax (by convention)
gl_Position = worldViewProj * mat4(1,0,0,0, 0,1,0,0, 0,0,1,0, 10,20,30,1) * pos;

HLSL (HLSL): Row-major, y = xA (by convention)
output.position = mul(mul(float4(input.position,1), float4x4(1,0,0,0, 0,1,0,0, 0,0,1,0, 10,20,30,1)), worldViewProj);

glm (C++): Column-major, y = Ax
glm::vec4 out = glm::mat4(1,0,0,0, 0,1,0,0, 0,0,1,0, 10,20,30,1) * in;

DirectXMath (C++): Row-major, y = xA
XMVECTOR out = XMVector3Transform(in, XMMATRIX(1,0,0,0, 0,1,0,0, 0,0,1,0, 10,20,30,1));

Eigen (C++): Row-major, y = Ax
Eigen::Vector4f out = Eigen::Matrix4f({ {1,0,0,10},{0,1,0,20},{0,0,1,30},{0,0,0,1} }) * in;

* Cinema 4D: The translation vector is the first column, and we effectively multiply with the column vector [1, x, y, z]^T. See: https://developers.maxon.net/docs/Cinema4DPythonSDK/html/manuals/data_algorithms/classic_api/matrix.html#matrix-fundamental

* Note: Initialization order describes how a matrix is created from scratch. This describes the element order for numbers passed to a class constructor, function, array, or inline syntax. It's usually the same as storage order, but not always (Eigen is an example of this).

The work is not shown, but "Transform style" is based on other indications such as built-in transform functions and the order in which transforms are applied (e.g. "translate", "rotate", "scale"). GLSL and HLSL are "by convention" because they don't include transform functions and they allow for both y = Ax and y = xA.