Matrix Manipulation in JavaScript for WebGL

WebGL expects the 4x4 matrices passed to the GPU to be in column-major format: the numbers in the one-dimensional array are ordered by column, first column then second column, and so on. In other words, array position 0 is row 0, column 0 of the matrix; array element 1 is row 1, column 0; ...; the top row of the second column is array element 4; and so on.
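
In general, the entry at row r, column c of the matrix lands at array position 4*c + r. For example (the variable names here are just for illustration):

    // column-major: the index of the entry at (row r, column c) is 4*c + r
    var r = 0, c = 1;
    var index = 4 * c + r;   // 4, the top row of the second column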

You can declare the identity matrix like this:

var Identity_Matrix = [ 1,0,0,0, 0,1,0,0, 0,0,1,0, 0,0,0,1 ];
Then you can write a function to return a matrix that rotates around the Z axis like this:
function my_rotate_z (angle)
{
    var rads = angle * (Math.PI / 180);      // convert degrees to radians

    var result = Identity_Matrix.slice();    // copy, so the identity matrix is not modified
    result[0] =  Math.cos(rads);
    result[1] =  Math.sin(rads);
    result[4] = -Math.sin(rads);
    result[5] =  Math.cos(rads);

    return result;
}
Then inside your render function you can do something like this:
    myMatrix = my_rotate_z (angle);
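
A fuller sketch of what render might look like (the names gl, program, numVertices, and the uniform uModelMatrix are assumptions about your setup; flatten is Angel's helper, see Note 2 below):

var angle = 0;

function render ()
{
    gl.clear(gl.COLOR_BUFFER_BIT);

    angle += 1;                              // spin one degree per frame
    var myMatrix = my_rotate_z(angle);

    // send the matrix to the vertex shader
    var loc = gl.getUniformLocation(program, "uModelMatrix");
    gl.uniformMatrix4fv(loc, false, flatten(myMatrix));

    gl.drawArrays(gl.TRIANGLES, 0, numVertices);
    requestAnimationFrame(render);
}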
Note 1: Angel's mult() expects two of his matrix objects, so you cannot use his multiply function with these flat arrays.
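
If you need to multiply two of these flat 16-element arrays yourself, a minimal sketch might look like this (my_mult is a made-up name, not a function from Angel's library):

function my_mult (a, b)
{
    // returns the matrix product a * b, where a and b are
    // flat 16-element column-major arrays
    var result = [];
    for (var col = 0; col < 4; col++) {
        for (var row = 0; row < 4; row++) {
            var sum = 0;
            for (var k = 0; k < 4; k++) {
                sum += a[4 * k + row] * b[4 * col + k];
            }
            result[4 * col + row] = sum;
        }
    }
    return result;
}

As a sanity check, my_mult(my_rotate_z(a), my_rotate_z(b)) should behave like a single rotation by a + b.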

Note 2: You still need to flatten the matrix you pass to the GPU, to be sure you send only the 16 numbers.
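
Concretely (flatten is Angel's helper; for a plain flat array, new Float32Array does the same job):

    var data = flatten(myMatrix);            // a Float32Array of 16 numbers
    var same = new Float32Array(myMatrix);   // equivalent for a plain flat array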

Note 3: The console.log function is really useful for inspecting your matrices while debugging.
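
For example, a small helper that prints a flat matrix one column per line (log_matrix is a made-up name):

function log_matrix (m)
{
    // print a flat column-major 4x4 matrix, one column per line
    for (var col = 0; col < 4; col++) {
        console.log(m.slice(4 * col, 4 * col + 4));
    }
}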