Element-wise addition/subtraction (same dimensions)
Matrix multiplication (cols of A = rows of B)
Determinant, inverse, trace (square matrices only)
Transpose and rank (any matrix)
det(2×2) = ad − bc
(AB)ᵀ = BᵀAᵀ
A × A⁻¹ = I
Note
This calculator supports matrices up to 5×5. Results are rounded to 4 decimal places. Verify manually for critical calculations.
A matrix is a rectangular array of numbers, symbols, or expressions arranged in rows and columns. The individual items in a matrix are called elements or entries. Matrices are fundamental tools in linear algebra and are used extensively in mathematics, physics, engineering, computer graphics, statistics, and many other fields. A matrix with m rows and n columns is called an m×n matrix, and the dimensions determine which operations can be performed on it.
The concept of matrices was developed in the mid-19th century by mathematicians like Arthur Cayley and James Joseph Sylvester. Today, matrices are indispensable in solving systems of linear equations, representing linear transformations, analyzing networks, processing images, and implementing machine learning algorithms. Understanding matrix operations opens doors to advanced mathematical concepts and real-world applications.
Addition and Subtraction: Two matrices can be added or subtracted only if they have the same dimensions. The operation is performed element by element—the element in row i, column j of the result is the sum (or difference) of the corresponding elements in the input matrices. Addition is commutative (A + B = B + A); subtraction, as with ordinary numbers, is not.
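The element-by-element rule above can be sketched in a few lines of Python (a minimal illustration using lists of lists; the function names are our own, not part of any particular library):

```python
def mat_add(A, B):
    """Element-wise sum; A and B must have the same dimensions."""
    assert len(A) == len(B) and len(A[0]) == len(B[0]), "dimension mismatch"
    return [[A[i][j] + B[i][j] for j in range(len(A[0]))] for i in range(len(A))]

def mat_sub(A, B):
    """Element-wise difference, same dimension requirement."""
    assert len(A) == len(B) and len(A[0]) == len(B[0]), "dimension mismatch"
    return [[A[i][j] - B[i][j] for j in range(len(A[0]))] for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(mat_add(A, B))  # [[6, 8], [10, 12]]
print(mat_sub(A, B))  # [[-4, -4], [-4, -4]]
```

Swapping the arguments to mat_add gives the same result, while swapping them in mat_sub negates every entry—exactly the commutativity difference described above.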
Multiplication: Matrix multiplication is more complex. To multiply matrix A (size m×n) by matrix B (size p×q), the number of columns in A must equal the number of rows in B (n = p). The resulting matrix will have dimensions m×q. Each element is calculated by taking the dot product of the corresponding row from A and column from B. Unlike addition, matrix multiplication is not commutative—AB generally does not equal BA.
Transpose: The transpose of a matrix A, denoted Aᵀ, is formed by interchanging its rows and columns. If A is an m×n matrix, then Aᵀ is an n×m matrix. The element at position (i,j) in A becomes the element at position (j,i) in Aᵀ. Transposition is useful in many applications, including solving systems of equations and computing matrix products.
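Transposition amounts to swapping the two loop indices (illustrative sketch):

```python
def transpose(A):
    """Aᵀ: the element at (i, j) in A becomes the element at (j, i)."""
    return [[A[i][j] for i in range(len(A))] for j in range(len(A[0]))]

A = [[1, 2, 3],
     [4, 5, 6]]          # a 2×3 matrix
print(transpose(A))      # 3×2: [[1, 4], [2, 5], [3, 6]]
```

Note how the 2×3 input becomes a 3×2 output, matching the dimension rule stated above; the identity (AB)ᵀ = BᵀAᵀ from the formula list can be checked with this function and the multiplication sketch.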
Determinant: The determinant is a scalar value computed from a square matrix that provides important information about the matrix. For a 2×2 matrix [[a,b],[c,d]], the determinant is ad−bc. For larger matrices, the determinant is calculated recursively using cofactor expansion. A non-zero determinant indicates the matrix is invertible, while a zero determinant means the matrix is singular.
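The recursive cofactor expansion described above translates almost verbatim into code (a sketch for small matrices; cofactor expansion is exponential in n, so real libraries use LU decomposition instead):

```python
def det(A):
    """Determinant of a square matrix by cofactor expansion along the first row."""
    n = len(A)
    if n == 1:
        return A[0][0]
    if n == 2:                      # base case: ad − bc
        return A[0][0] * A[1][1] - A[0][1] * A[1][0]
    total = 0
    for j in range(n):
        # minor: delete row 0 and column j
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det(minor)
    return total

print(det([[3, 8], [4, 6]]))   # 3*6 − 8*4 = -14
print(det([[1, 2], [2, 4]]))   # 0 → singular, not invertible
```

The second example shows the singularity test in action: its rows are proportional, so the determinant vanishes.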
Inverse: The inverse of a square matrix A, denoted A⁻¹, is the matrix that when multiplied by A gives the identity matrix (A × A⁻¹ = I). Not all matrices have inverses—only those with non-zero determinants (non-singular matrices) are invertible. The inverse is crucial for solving systems of linear equations: if Ax = b, then x = A⁻¹b.
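For the 2×2 case the inverse has a closed form built from the determinant ad−bc given earlier (a sketch; general n×n inversion is usually done by Gauss–Jordan elimination):

```python
def inverse_2x2(M):
    """Inverse of [[a, b], [c, d]]: (1/det) * [[d, -b], [-c, a]]."""
    (a, b), (c, d) = M
    det = a * d - b * c
    if det == 0:
        raise ValueError("singular matrix: determinant is zero")
    return [[d / det, -b / det],
            [-c / det, a / det]]

A = [[4, 7], [2, 6]]           # det = 24 − 14 = 10
print(inverse_2x2(A))          # [[0.6, -0.7], [-0.2, 0.4]]
```

Multiplying A by this result reproduces the identity matrix (up to floating-point rounding), which is exactly the defining property A × A⁻¹ = I.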
Trace and Rank: The trace of a square matrix is the sum of its diagonal elements. It's a simple but useful measure with properties like tr(A+B) = tr(A) + tr(B). The rank of a matrix is the maximum number of linearly independent rows (or columns). It determines the dimension of the vector space spanned by its rows or columns and is essential in understanding systems of equations.
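Both quantities are short to compute; the trace is a one-liner, and the rank can be found by Gaussian elimination, counting the non-zero pivots (a simplified sketch with a fixed tolerance, adequate for small well-scaled matrices):

```python
def trace(A):
    """Sum of the diagonal elements of a square matrix."""
    return sum(A[i][i] for i in range(len(A)))

def rank(A, eps=1e-9):
    """Rank via Gaussian elimination: number of non-zero pivot rows."""
    M = [row[:] for row in A]              # work on a copy
    rows, cols = len(M), len(M[0])
    r = 0                                  # next pivot row
    for c in range(cols):
        # find a row at or below r with a non-zero entry in column c
        pivot = next((i for i in range(r, rows) if abs(M[i][c]) > eps), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(r + 1, rows):       # eliminate entries below the pivot
            f = M[i][c] / M[r][c]
            M[i] = [M[i][j] - f * M[r][j] for j in range(cols)]
        r += 1
    return r

print(trace([[1, 2], [3, 4]]))   # 1 + 4 = 5
print(rank([[1, 2], [2, 4]]))    # 1 — the rows are linearly dependent
print(rank([[1, 0], [0, 1]]))    # 2 — full rank
```

The rank-1 example is the same singular matrix whose determinant was zero above: dependent rows, zero determinant, and deficient rank are three views of the same fact.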
Matrices have countless practical applications across various fields. In computer graphics, matrices represent transformations like rotation, scaling, and translation of objects in 2D and 3D space. Every video game and animation software relies heavily on matrix operations. In physics and engineering, matrices describe systems of equations, stress tensors, and quantum mechanical states.
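As a concrete taste of the graphics use case, rotating a 2D point is just a matrix–vector multiplication (an illustrative sketch, not taken from any particular graphics library):

```python
import math

def rotation_2d(theta):
    """2×2 matrix for a counter-clockwise rotation by theta radians."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s],
            [s,  c]]

def apply(M, v):
    """Multiply a 2×2 matrix by a 2-vector."""
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

# rotate the point (1, 0) by 90°: it should land at (0, 1)
p = apply(rotation_2d(math.pi / 2), [1, 0])
print([round(x, 6) for x in p])  # [0.0, 1.0]
```

Scaling and (in homogeneous coordinates) translation work the same way, which is why a whole chain of transformations can be collapsed into a single matrix product before being applied to thousands of points.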
In data science and machine learning, matrices are the backbone of neural networks and statistical analysis. Images are represented as matrices of pixel values, enabling operations like filtering, compression, and feature extraction. Google's PageRank algorithm, which revolutionized web search, uses matrix operations on massive sparse matrices representing web link structures. Understanding matrices is essential for anyone working in these rapidly growing fields.