Macaulay2 can handle noncommutative rings, and for such rings there is a difference between left modules and right modules. In Macaulay2, all the modules are left modules, but matrices act on the left, too. The usual convention would be to have the matrices act on the right, so the homomorphism rule (f(av) = a f(v)) becomes a consequence of the associativity of matrix-vector-scalar multiplication ((av)f = a(vf)). Macaulay2 makes things come out okay in the end: a left R-module can be regarded naturally as a right R'-module, where R' is the opposite ring of R, obtained from the ring R by reversing the multiplication. Thus matrices over R' can act on R-modules from the left. Matrices over R in Macaulay2 are really matrices over R'.
We illustrate this state of affairs with an example over a (noncommutative) Weyl algebra. First observe the noncommutativity.
i1 : R = QQ[x,dx,WeylAlgebra=>{x=>dx}]
o1 = R
o1 : PolynomialRing, 1 differential variable(s)

i2 : x*dx
o2 = x*dx
o2 : R

i3 : dx*x
o3 = x*dx + 1
o3 : R

Now verify the module is a left module by checking associativity.
i4 : M = R^2
      2
o4 = R
o4 : R-module, free

i5 : v = M_0
o5 = | 1 |
     | 0 |
      2
o5 : R

i6 : dx*v
o6 = | dx |
     | 0  |
      2
o6 : R

i7 : x*(dx*v)
o7 = | xdx |
     | 0   |
      2
o7 : R

i8 : (x*dx)*v
o8 = | xdx |
     | 0   |
      2
o8 : R

i9 : x*(dx*v) == (x*dx)*v
o9 = true

Now make a matrix and check that left multiplication by it is a homomorphism from M to M.
i10 : f = dx * id_M
o10 = | dx 0  |
      | 0  dx |
              2       2
o10 : Matrix R  <--- R

i11 : f*(x*v)
o11 = | xdx |
      | 0   |
       2
o11 : R

i12 : x*(f*v)
o12 = | xdx |
      | 0   |
       2
o12 : R

i13 : f*(x*v) == x*(f*v)
o13 = true

Now we make another matrix and check that matrix multiplication treats the entries of the matrices as residing in the opposite ring, R'.
i14 : g = x * id_M
o14 = | x 0 |
      | 0 x |
              2       2
o14 : Matrix R  <--- R

i15 : f*g
o15 = | xdx 0   |
      | 0   xdx |
              2       2
o15 : Matrix R  <--- R

i16 : f*g == (x*dx) * id_M
o16 = true

i17 : (dx * id_M)*(x * id_M) == (x*dx) * id_M
o17 = true

Here we check that multiplication of a scalar times a matrix is compatible with multiplication of a scalar times a vector.
i18 : x * ( (dx * id_M) * v )
o18 = | xdx |
      | 0   |
       2
o18 : R

i19 : (x * (dx * id_M) ) * v
o19 = | xdx |
      | 0   |
       2
o19 : R

i20 : (x * (dx * id_M) ) * v == x * ( (dx * id_M) * v )
o20 = true

One desirable associativity rule does not hold: the one for RingElement * Matrix * Matrix, as we see in this example.
i21 : x * ( id_M * ( dx * id_M ) )
o21 = | xdx 0   |
      | 0   xdx |
              2       2
o21 : Matrix R  <--- R

i22 : (x * id_M) * ( dx * id_M )
o22 = | xdx+1 0     |
      | 0     xdx+1 |
              2       2
o22 : Matrix R  <--- R

i23 : x * ( id_M * ( dx * id_M ) ) == (x * id_M) * ( dx * id_M )
o23 = false

The reason for this discrepancy is that, as explained above, matrix multiplication is done over R', not over R.
Currently, the tensor product of a module M by the ring R works on either side and does the same thing. In other words, you can write R**M or M**R. That may change in the future.
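For example, continuing the session above, both orders of the tensor product should return the same rank-2 free module. (This is a sketch in the same session style; the input numbers i24 and i25 are hypothetical, and the displayed outputs are what one would expect rather than a captured session.)

i24 : R ** M

       2
o24 = R

i25 : (R ** M) == (M ** R)

o25 = true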