# Current Affairs JEE Main & Advanced

#### Types of Matrices

(1) Row matrix : A matrix is said to be a row matrix or row vector if it has only one row and any number of columns.

Example : $[5\,\,\,0\,\,\,3]$ is a row matrix of order $1\times 3$ and $[2]$ is a row matrix of order $1\times 1$.

(2) Column matrix : A matrix is said to be a column matrix or column vector if it has only one column and any number of rows.

Example : $\left[ \begin{matrix} 2 \\ 3 \\ -6 \\ \end{matrix} \right]$ is a column matrix of order $3\times 1$ and $[2]$ is a column matrix of order $1\times 1$. Observe that $[2]$ is both a row matrix as well as a column matrix.

(3) Singleton matrix : If a matrix has only one element, it is called a singleton matrix. Thus, $A={{[{{a}_{ij}}]}_{m\times n}}$ is a singleton matrix if $m=n=1$.

Example : $[2],\ [3],\ [a]$ are singleton matrices.

(4) Null or zero matrix : If all the elements of a matrix are zero, it is called a zero matrix, generally denoted by $O$. Thus $A={{[{{a}_{ij}}]}_{m\times n}}$ is a zero matrix if ${{a}_{ij}}=0$ for all $i$ and $j$.

Example : $[0],\left[ \begin{matrix} 0 & 0 \\ 0 & 0 \\ \end{matrix} \right],\left[ \begin{matrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ \end{matrix} \right],[0\,\,0]$ are all zero matrices, but of different orders.

(5) Square matrix : If the number of rows equals the number of columns, the matrix is called a square matrix. Thus $A={{[{{a}_{ij}}]}_{m\times n}}$ is a square matrix if $m=n$.

Example : $\left[ \begin{matrix} {{a}_{11}} & {{a}_{12}} & {{a}_{13}} \\ {{a}_{21}} & {{a}_{22}} & {{a}_{23}} \\ {{a}_{31}} & {{a}_{32}} & {{a}_{33}} \\ \end{matrix} \right]$ is a square matrix of order $3\times 3$.

(i) If $m\ne n$, the matrix is called a rectangular matrix.

(ii) The elements of a square matrix A for which $i=j$, i.e. ${{a}_{11}},{{a}_{22}},{{a}_{33}},....,{{a}_{nn}}$, are called diagonal elements, and the line joining these elements is called the principal diagonal or leading diagonal of matrix A.
(6) Diagonal matrix : If all elements except those on the principal diagonal of a square matrix are zero, it is called a diagonal matrix. Thus a square matrix $A=[{{a}_{ij}}]$ is a diagonal matrix if ${{a}_{ij}}=0$ when $i\ne j$.

Example : $\left[ \begin{matrix} 2 & 0 & 0 \\ 0 & 3 & 0 \\ 0 & 0 & 4 \\ \end{matrix} \right]$ is a diagonal matrix of order $3\times 3$, which can be denoted by diag [2, 3, 4].

(7) Identity matrix : A square matrix in which all the elements on the main diagonal are '1' and the rest are zero is called an identity matrix or unit matrix. Thus, the square matrix $A=[{{a}_{ij}}]$ is an identity matrix if ${{a}_{ij}}=\left\{ \begin{matrix} 1, & \text{if}\,\,i=j \\ 0, & \text{if}\,\,i\ne j \\ \end{matrix} \right.$

We denote the identity matrix of order $n$ by ${{I}_{n}}$.

Example : $[1],\ \left[ \begin{matrix} 1 & 0 \\ 0 & 1 \\ \end{matrix} \right],\ \left[ \begin{matrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \\ \end{matrix} \right]$ are identity matrices of order 1, 2 and 3 respectively.
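As a quick sketch (the NumPy code here is my own illustration, not part of the original notes), these special matrices can be built directly:

```python
import numpy as np

# Diagonal matrix diag[2, 3, 4] from the example above
D = np.diag([2, 3, 4])

# Identity matrix I_3 and a 2x3 zero (null) matrix
I3 = np.eye(3, dtype=int)
Z = np.zeros((2, 3), dtype=int)

# Every off-diagonal entry of a diagonal matrix is zero
off_diag = D[~np.eye(3, dtype=bool)]
print(D)
```

Multiplying by `I3` leaves `D` unchanged, as the identity-matrix definition requires.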

#### Trace of a Matrix

The sum of the diagonal elements of a square matrix A is called the trace of matrix A, denoted by tr A.

$tr\,A=\sum\limits_{i=1}^{n}{{{a}_{ii}}}={{a}_{11}}+{{a}_{22}}+...+{{a}_{nn}}$

Properties of trace of a matrix

Let $A={{[{{a}_{ij}}]}_{n\times n}}$ and $B={{[{{b}_{ij}}]}_{n\times n}}$ be square matrices and $\lambda$ be a scalar. Then

(i) $tr(\lambda A)=\lambda \,tr(A)$

(ii) $tr(A\pm B)=tr(A)\pm tr(B)$

(iii) $tr(AB)=tr(BA)$

(iv) $tr\,(A)=tr\,({A}')$, i.e., $tr\,({{A}^{T}})$

(v) $tr\,({{I}_{n}})=n$

(vi) $tr\,(O)=0$

(vii) In general, $tr\,(AB)\ne tr\,(A)\,.\,tr\,(B)$
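These properties are easy to spot-check numerically; the following NumPy sketch (my own illustration) verifies (iii) and (vii) on a concrete pair of matrices:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

# (iii) tr(AB) = tr(BA), even though AB and BA themselves differ
tr_AB = np.trace(A @ B)
tr_BA = np.trace(B @ A)

# (vii) but tr(AB) is generally NOT tr(A) * tr(B)
tr_prod = np.trace(A) * np.trace(B)
print(tr_AB, tr_BA, tr_prod)
```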

#### Addition and Subtraction of Matrices

If $A={{[{{a}_{ij}}]}_{m\times n}}$ and $B={{[{{b}_{ij}}]}_{m\times n}}$ are two matrices of the same order, then their sum $A+B$ is the matrix whose each element is the sum of the corresponding elements, i.e., $A+B={{[{{a}_{ij}}+{{b}_{ij}}]}_{m\times n}}$.

Similarly, their subtraction $A-B$ is defined as $A-B={{[{{a}_{ij}}-{{b}_{ij}}]}_{m\times n}}$.

Matrix addition and subtraction are possible only when the matrices are of the same order.

Properties of matrix addition : If A, B and C are matrices of the same order, then

(i) $A+B=B+A$ (Commutative law)

(ii) $(A+B)+C=A+(B+C)$ (Associative law)

(iii) $A+O=O+A=A$, where O is the zero matrix, the additive identity.

(iv) $A+(-A)=O=(-A)+A$, where $(-A)$, obtained by changing the sign of every element of A, is the additive inverse.

(v) $\left. \begin{align} & A+B=A+C \\ & B+A=C+A \\ \end{align} \right\}\Rightarrow B=C$ (Cancellation law)
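A small NumPy check of these laws (my own illustration, not from the notes):

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[0, -1], [5, 2]])
C = np.array([[2, 2], [2, 2]])
O = np.zeros((2, 2), dtype=int)

commutes = np.array_equal(A + B, B + A)                 # (i)
associates = np.array_equal((A + B) + C, A + (B + C))   # (ii)
identity_ok = np.array_equal(A + O, A)                  # (iii)
inverse_ok = np.array_equal(A + (-A), O)                # (iv)
print(commutes, associates, identity_ok, inverse_ok)
```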

#### Scalar Multiplication of Matrices

Let $A={{[{{a}_{ij}}]}_{m\times n}}$ be a matrix and k a number; then the matrix obtained by multiplying every element of A by k is called the scalar multiple of A by k, denoted by kA.

Thus, if $A={{[{{a}_{ij}}]}_{m\times n}}$, then $kA=Ak={{[k{{a}_{ij}}]}_{m\times n}}$.

Properties of scalar multiplication

If A, B are matrices of the same order and $\lambda ,\,\mu$ are any two scalars, then

(i) $\lambda (A+B)=\lambda A+\lambda B$

(ii) $(\lambda +\mu )A=\lambda A+\mu A$

(iii) $\lambda (\mu A)=(\lambda \mu )A=\mu (\lambda A)$

(iv) $(-\lambda )A=-(\lambda A)=\lambda \,(-A)$
• All the laws of ordinary algebra hold for the addition or subtraction of matrices and their multiplication by scalars.
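The scalar laws above can likewise be confirmed numerically (NumPy sketch, my own illustration):

```python
import numpy as np

A = np.array([[1, -2], [0, 3]])
B = np.array([[4, 1], [2, -1]])
lam, mu = 2, 5

ok_i = np.array_equal(lam * (A + B), lam * A + lam * B)    # (i)
ok_ii = np.array_equal((lam + mu) * A, lam * A + mu * A)   # (ii)
ok_iii = np.array_equal(lam * (mu * A), (lam * mu) * A)    # (iii)
ok_iv = np.array_equal((-lam) * A, -(lam * A))             # (iv)
print(ok_i, ok_ii, ok_iii, ok_iv)
```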

#### Multiplication of Matrices

Two matrices A and B are conformable for the product AB if the number of columns in A (pre-multiplier) is the same as the number of rows in B (post-multiplier). Thus, if $A={{[{{a}_{ij}}]}_{m\times n}}$ and $B={{[{{b}_{ij}}]}_{n\times p}}$ are two matrices of order $m\times n$ and $n\times p$ respectively, then their product AB is of order $m\times p$ and is defined as

${{(AB)}_{ij}}=\sum\limits_{r=1}^{n}{{{a}_{ir}}{{b}_{rj}}}=[{{a}_{i1}}\,\,{{a}_{i2}}\,...\,{{a}_{in}}]\left[ \begin{matrix} {{b}_{1j}} \\ {{b}_{2j}} \\ \vdots \\ {{b}_{nj}} \\ \end{matrix} \right]=$ (${{i}^{th}}$ row of A)(${{j}^{th}}$ column of B)     .....(i)

where $i=1,\text{ }2,\text{ }...,m$ and $j=1,\text{ }2,\text{ }...,p$.

Now we define the product of a row matrix and a column matrix.

Let $A=\left[ {{a}_{1}}\,{{a}_{2}}\,....\,{{a}_{n}} \right]$ be a row matrix and $B=\left[ \begin{matrix} {{b}_{1}} \\ {{b}_{2}} \\ \vdots \\ {{b}_{n}} \\ \end{matrix} \right]$ be a column matrix.

Then $AB=\left[ {{a}_{1}}{{b}_{1}}+{{a}_{2}}{{b}_{2}}+....+{{a}_{n}}{{b}_{n}} \right]$     .....(ii)

Thus, from (i), ${{(AB)}_{ij}}=$ sum of the products of the elements of the ${{i}^{th}}$ row of A with the corresponding elements of the ${{j}^{th}}$ column of B.

Properties of matrix multiplication

If A, B and C are three matrices such that their products are defined, then

(i) $AB\ne BA$ in general (not commutative)

(ii) $(AB)C=A(BC)$ (Associative law)

(iii) $IA=A=AI$, where I is the identity matrix for matrix multiplication.

(iv) $A(B+C)=AB+AC$ (Distributive law)

(v) $AB=AC\not{\Rightarrow }B=C$ (Cancellation law is not applicable)

(vi) If $AB=O$, it does not follow that $A=O$ or $B=O$; the product of two non-zero matrices may be a zero matrix.
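Properties (i), (v) and (vi) are the ones that trip people up; here is a concrete NumPy sketch (my own illustration, with matrices chosen for the purpose):

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[0, 1], [1, 0]])

# (i) Not commutative: AB != BA
noncommutative = not np.array_equal(A @ B, B @ A)

# (v) AB = AC does not force B = C (here S is singular)
S = np.array([[1, 1], [1, 1]])
B1 = np.array([[1, 0], [0, 1]])
C1 = np.array([[0, 1], [1, 0]])
cancellation_fails = (np.array_equal(S @ B1, S @ C1)
                      and not np.array_equal(B1, C1))

# (vi) Two non-zero matrices whose product is the zero matrix
P = np.array([[0, 1], [0, 2]])
Q = np.array([[3, 4], [0, 0]])
zero_product = np.array_equal(P @ Q, np.zeros((2, 2), dtype=int))
print(noncommutative, cancellation_fails, zero_product)
```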

#### Positive Integral Powers of a Matrix

The positive integral powers of a matrix A are defined only when A is a square matrix. Then ${{A}^{2}}=A.A$, ${{A}^{3}}=A.A.A={{A}^{2}}A$. Also, for any positive integers $m$ and $n$,

(i) ${{A}^{m}}{{A}^{n}}={{A}^{m+n}}$

(ii) ${{({{A}^{m}})}^{n}}={{A}^{mn}}={{({{A}^{n}})}^{m}}$

(iii) ${{I}^{n}}=I,\,{{I}^{m}}=I$

(iv) ${{A}^{0}}={{I}_{n}}$, where A is a square matrix of order $n$.
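A quick numeric check of the power laws (NumPy sketch, my own illustration):

```python
import numpy as np

A = np.array([[1, 1], [0, 1]])
mp = np.linalg.matrix_power

# (i) A^2 . A^3 = A^5
law_i = np.array_equal(mp(A, 2) @ mp(A, 3), mp(A, 5))
# (ii) (A^2)^3 = A^6
law_ii = np.array_equal(mp(mp(A, 2), 3), mp(A, 6))
# (iv) A^0 = I_2
law_iv = np.array_equal(mp(A, 0), np.eye(2, dtype=int))
print(law_i, law_ii, law_iv)
```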

#### Transpose of a Matrix

The matrix obtained from a given matrix A by changing its rows into columns (or columns into rows) is called the transpose of matrix A and is denoted by ${{A}^{T}}$ or ${A}'$. From the definition it is obvious that if the order of A is $m\times n$, then the order of ${{A}^{T}}$ is $n\times m$.

Example : the transpose of ${{\left[ \begin{matrix} {{a}_{1}} & {{a}_{2}} & {{a}_{3}} \\ {{b}_{1}} & {{b}_{2}} & {{b}_{3}} \\ \end{matrix} \right]}_{2\times 3}}$ is ${{\left[ \begin{matrix} {{a}_{1}} & {{b}_{1}} \\ {{a}_{2}} & {{b}_{2}} \\ {{a}_{3}} & {{b}_{3}} \\ \end{matrix} \right]}_{3\times 2}}$

Properties of transpose : Let A and B be two matrices. Then

(i) ${{({{A}^{T}})}^{T}}=A$

(ii) ${{(A+B)}^{T}}={{A}^{T}}+{{B}^{T}}$, A and B being of the same order

(iii) ${{(kA)}^{T}}=k{{A}^{T}}$, k being any scalar (real or complex)

(iv) ${{(AB)}^{T}}={{B}^{T}}{{A}^{T}}$, A and B being conformable for the product AB

(v) ${{({{A}_{1}}{{A}_{2}}{{A}_{3}}.....{{A}_{n-1}}{{A}_{n}})}^{T}}={{A}_{n}}^{T}{{A}_{n-1}}^{T}.......{{A}_{3}}^{T}{{A}_{2}}^{T}{{A}_{1}}^{T}$

(vi) ${{I}^{T}}=I$
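Property (iv), with its reversed order, is worth checking on rectangular matrices (NumPy sketch, my own illustration):

```python
import numpy as np

A = np.array([[1, 2, 3], [4, 5, 6]])     # order 2x3
B = np.array([[1, 0], [2, 1], [0, 3]])   # order 3x2

double_transpose = np.array_equal(A.T.T, A)       # (i)
reversal = np.array_equal((A @ B).T, B.T @ A.T)   # (iv): orders force B^T A^T, not A^T B^T
print(A.T.shape, double_transpose, reversal)
```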

#### Special Types of Matrices

(1) Symmetric matrix : A square matrix $A=[{{a}_{ij}}]$ is called a symmetric matrix if ${{a}_{ij}}={{a}_{ji}}$ for all i, j, i.e., ${{A}^{T}}=A$.

Example : $\left[ \begin{matrix} a & h & g \\ h & b & f \\ g & f & c \\ \end{matrix} \right]$

(2) Skew-symmetric matrix : A square matrix $A=[{{a}_{ij}}]$ is called a skew-symmetric matrix if ${{a}_{ij}}=-{{a}_{ji}}$ for all i, j, i.e., ${{A}^{T}}=-A$.

Example : $\left[ \begin{matrix} 0 & h & g \\ -h & 0 & f \\ -g & -f & 0 \\ \end{matrix} \right]$

All principal diagonal elements of a skew-symmetric matrix are always zero, because for a diagonal element ${{a}_{ii}}=-{{a}_{ii}}\Rightarrow {{a}_{ii}}=0$.

Properties of symmetric and skew-symmetric matrices

(i) If A is a square matrix, then $A+{{A}^{T}},\,A{{A}^{T}},\,{{A}^{T}}A$ are symmetric matrices, while $A-{{A}^{T}}$ is a skew-symmetric matrix.

(ii) If A is a symmetric matrix, then $-A,\,KA,\,{{A}^{T}},\,{{A}^{n}},\,{{A}^{-1}},\,{{B}^{T}}AB$ are also symmetric matrices, where $n\in N$, $K\in R$ and B is a square matrix of the same order as A.

(iii) If A is a skew-symmetric matrix, then

(a) ${{A}^{2n}}$ is a symmetric matrix for $n\in N$.

(b) ${{A}^{2n+1}}$ is a skew-symmetric matrix for $n\in N$.

(c) kA is also a skew-symmetric matrix, where $k\in R$.

(d) ${{B}^{T}}AB$ is also a skew-symmetric matrix, where B is a square matrix of the same order as A.

(iv) If A, B are two symmetric matrices, then

(a) $A\pm B,\,\,AB+BA$ are also symmetric matrices,

(b) $AB-BA$ is a skew-symmetric matrix,

(c) AB is a symmetric matrix when $AB=BA$.

(v) If A, B are two skew-symmetric matrices, then

(a) $A\pm B,\,\,AB-BA$ are skew-symmetric matrices,

(b) $AB+BA$ is a symmetric matrix.

(vi) If A is a skew-symmetric matrix and C is a column matrix, then ${{C}^{T}}AC$ is a zero matrix.

(vii) Every square matrix A can uniquely be expressed as the sum of a symmetric and a skew-symmetric matrix, i.e., $A=\left[ \frac{1}{2}(A+{{A}^{T}}) \right]+\left[ \frac{1}{2}(A-{{A}^{T}}) \right]$.
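The symmetric/skew-symmetric decomposition in property (vii), as a short NumPy sketch (my own illustration):

```python
import numpy as np

A = np.array([[1., 7., 3.], [2., 4., 5.], [0., 6., 8.]])

S = (A + A.T) / 2   # symmetric part
K = (A - A.T) / 2   # skew-symmetric part

sym_ok = np.array_equal(S, S.T)       # S^T = S
skew_ok = np.array_equal(K, -K.T)     # K^T = -K
sum_ok = np.allclose(S + K, A)        # A = S + K
diag_zero = bool(np.all(np.diag(K) == 0))  # skew-symmetric diagonal is zero
print(sym_ok, skew_ok, sum_ok, diag_zero)
```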
(3) Singular and Non-singular matrix : A square matrix A is said to be non-singular if $|A|\ne 0$, and singular if $|A|\,=0$. Here $|A|$ (also written det A) denotes the determinant of the square matrix A.

Example : $A=\left[ \begin{matrix} 2 & 3 \\ 4 & 5 \\ \end{matrix} \right]$, then $|A|\,=\left| \,\begin{matrix} 2 & 3 \\ 4 & 5 \\\end{matrix}\, \right|=10-12=-2\Rightarrow A$ is a non-singular matrix.

(4) Hermitian and Skew-Hermitian matrix : A square matrix $A=[{{a}_{ij}}]$ is said to be a Hermitian matrix if ${{a}_{ij}}={{\bar{a}}_{ji}}\,;\,\,\forall i,j,\,\,i.e.,\,A={{A}^{\theta }}$.

Example : $\left[ \begin{matrix} a & b+ic \\ b-ic & d \\ \end{matrix} \right]\,,\,\,\left[ \begin{matrix} 3 & 3-4i & 5+2i \\ 3+4i & 5 & -2+i \\ 5-2i & -2-i & 2 \\ \end{matrix} \right]$ are Hermitian matrices.

If A is a Hermitian matrix, then ${{a}_{ii}}={{\bar{a}}_{ii}}\,\,\Rightarrow {{a}_{ii}}$ is real $\forall i$; thus every diagonal element of a Hermitian matrix must be real.

A square matrix $A=[{{a}_{ij}}]$ is said to be Skew-Hermitian if ${{a}_{ij}}=-{{\bar{a}}_{ji}}\,\,\forall i,j,\,\,i.e.,\,{{A}^{\theta }}=-A$. If A is a Skew-Hermitian matrix, then ${{a}_{ii}}=-{{\bar{a}}_{ii}}\Rightarrow {{a}_{ii}}+{{\bar{a}}_{ii}}=0$, i.e., every diagonal element is either zero or purely imaginary.
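A NumPy spot-check of singularity and the Hermitian condition (my own sketch; `.conj().T` computes the conjugate transpose ${{A}^{\theta }}$):

```python
import numpy as np

A = np.array([[2., 3.], [4., 5.]])
det_A = round(np.linalg.det(A))   # -2, so A is non-singular

# The 3x3 Hermitian example from above
H = np.array([[3, 3 - 4j, 5 + 2j],
              [3 + 4j, 5, -2 + 1j],
              [5 - 2j, -2 - 1j, 2]])

hermitian = np.array_equal(H, H.conj().T)        # H equals its conjugate transpose
diag_real = bool(np.all(np.diag(H).imag == 0))   # diagonal entries are real
print(det_A, hermitian, diag_real)
```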

#### Adjoint of a Square Matrix

Let $A=[{{a}_{ij}}]$ be a square matrix of order $n$ and let ${{C}_{ij}}$ be the cofactor of ${{a}_{ij}}$ in A. Then the transpose of the matrix of cofactors of the elements of A is called the adjoint of A, denoted by adj A.

Thus, $adj\,A={{[{{C}_{ij}}]}^{T}}\Rightarrow {{(adj\,A)}_{ij}}={{C}_{ji}}=$ cofactor of ${{a}_{ji}}$ in A.

If $A=\left[ \begin{matrix} {{a}_{11}} & {{a}_{12}} & {{a}_{13}} \\ {{a}_{21}} & {{a}_{22}} & {{a}_{23}} \\ {{a}_{31}} & {{a}_{32}} & {{a}_{33}} \\\end{matrix} \right],$ then $adj\,A={{\left[ \begin{matrix} {{C}_{11}} & {{C}_{12}} & {{C}_{13}} \\ {{C}_{21}} & {{C}_{22}} & {{C}_{23}} \\ {{C}_{31}} & {{C}_{32}} & {{C}_{33}} \\\end{matrix} \right]}^{T}}=\left[ \begin{matrix} {{C}_{11}} & {{C}_{21}} & {{C}_{31}} \\ {{C}_{12}} & {{C}_{22}} & {{C}_{32}} \\ {{C}_{13}} & {{C}_{23}} & {{C}_{33}} \\\end{matrix} \right];$ where ${{C}_{ij}}$ denotes the cofactor of ${{a}_{ij}}$ in A.

Example : $A=\left[ \begin{matrix} p & q \\r & s \\\end{matrix} \right],\,{{C}_{11}}=s,\,{{C}_{12}}=-r,\,{{C}_{21}}=-q,\,{{C}_{22}}=p$

$\therefore adj\,A={{\left[ \begin{matrix} s & -r \\ -q & p \\\end{matrix} \right]}^{T}}=\left[ \begin{matrix} s & -q \\ -r & p \\\end{matrix} \right]$

Properties of adjoint matrix : If A, B are square matrices of order $n$ and ${{I}_{n}}$ is the corresponding unit matrix, then

(i) $A(adj\,A)=|A|{{I}_{n}}=(adj\,A)A$ (thus A (adj A) is always a scalar matrix)

(ii) $|adj\,A|=|A{{|}^{n-1}}$

(iii) $adj\,(adj\,A)=|A{{|}^{n-2}}A$

(iv) $|adj\,(adj\,A)|\,=\,|A{{|}^{{{(n-1)}^{2}}}}$

(v) $adj\,({{A}^{T}})={{(adj\,A)}^{T}}$

(vi) $adj\,(AB)=(adj\,B)(adj\,A)$

(vii) $adj({{A}^{m}})={{(adj\,A)}^{m}},m\in N$

(viii) $adj(kA)={{k}^{n-1}}(adj\,A),k\in R$

(ix) $adj\,({{I}_{n}})={{I}_{n}}$

(x) $adj\,(O)=O$

(xi) A is symmetric $\Rightarrow$ adj A is also symmetric.

(xii) A is diagonal $\Rightarrow$ adj A is also diagonal.
(xiii) A is triangular $\Rightarrow$ adj A is also triangular.   (xiv) A is singular $\Rightarrow$ $|adj\,\,A|=0$
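The cofactor-transpose definition translates directly into code. In this NumPy sketch (the helper name `adjoint` is my own), property (i) is verified on a concrete matrix:

```python
import numpy as np

def adjoint(A):
    """Adjoint of a square matrix: transpose of its cofactor matrix."""
    n = A.shape[0]
    C = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            # Minor M_ij: delete row i and column j, then take the determinant
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T

A = np.array([[1., 2., 3.],
              [0., 1., 4.],
              [5., 6., 0.]])
adjA = adjoint(A)

# Property (i): A (adj A) = |A| I_n
check = np.allclose(A @ adjA, np.linalg.det(A) * np.eye(3))
print(check)
```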

#### Inverse of a Matrix

A non-singular square matrix A of order $n$ is invertible if there exists a square matrix B of the same order such that $AB={{I}_{n}}=BA$. In such a case, we say that the inverse of A is B and we write ${{A}^{-1}}=B$. The inverse of A is given by ${{A}^{-1}}=\frac{1}{|A|}.adj\,A$.

The necessary and sufficient condition for the existence of the inverse of a square matrix A is that $|A|\ne 0$.

Properties of inverse matrix :

If A and B are invertible matrices of the same order, then

(i) ${{({{A}^{-1}})}^{-1}}=A$

(ii) ${{({{A}^{T}})}^{-1}}={{({{A}^{-1}})}^{T}}$

(iii) ${{(AB)}^{-1}}={{B}^{-1}}{{A}^{-1}}$

(iv) ${{({{A}^{k}})}^{-1}}={{({{A}^{-1}})}^{k}},k\in N$ [in particular, ${{({{A}^{2}})}^{-1}}={{({{A}^{-1}})}^{2}}$]

(v) $adj({{A}^{-1}})={{(adj\,A)}^{-1}}$

(vi) $|{{A}^{-1}}|\,=\frac{1}{|A|}=\,|A{{|}^{-1}}$

(vii) A = diag $({{a}_{1}}\,{{a}_{2}}...{{a}_{n}})\Rightarrow {{A}^{-1}}=diag\,(a_{1}^{-1}\,a_{2}^{-1}...a_{n}^{-1})$

(viii) A is symmetric $\Rightarrow {{A}^{-1}}$ is also symmetric.

(ix) A is diagonal, $|A|\ne 0\,\,\Rightarrow {{A}^{-1}}$ is also diagonal.

(x) A is a scalar matrix $\Rightarrow {{A}^{-1}}$ is also a scalar matrix.

(xi) A is triangular, $|A|\ne 0\Rightarrow {{A}^{-1}}$ is also triangular.

(xii) Every invertible matrix possesses a unique inverse.

(xiii) Cancellation law with respect to multiplication : If A is a non-singular matrix, i.e., if $|A|\ne 0$, then ${{A}^{-1}}$ exists and

$AB=AC\Rightarrow {{A}^{-1}}(AB)={{A}^{-1}}(AC)\Rightarrow ({{A}^{-1}}A)B=({{A}^{-1}}A)C\Rightarrow IB=IC\Rightarrow B=C$

$\therefore AB=AC\Rightarrow B=C$ whenever $|A|\,\ne 0$.
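The formula ${{A}^{-1}}=\frac{1}{|A|}\,adj\,A$ and properties (iii) and (vi) can be checked with NumPy (my own sketch; for the $2\times 2$ case the adjoint is written out by hand from the $\left[ \begin{matrix} s & -q \\ -r & p \\ \end{matrix} \right]$ pattern above):

```python
import numpy as np

A = np.array([[2., 3.], [4., 5.]])   # |A| = -2, so A is invertible
B = np.array([[1., 1.], [0., 1.]])

Ainv = np.linalg.inv(A)

# Definition: A A^{-1} = I = A^{-1} A
def_ok = np.allclose(A @ Ainv, np.eye(2)) and np.allclose(Ainv @ A, np.eye(2))

# A^{-1} = (1/|A|) adj A, with adj [[p, q], [r, s]] = [[s, -q], [-r, p]]
adjA = np.array([[5., -3.], [-4., 2.]])
formula_ok = np.allclose(Ainv, adjA / np.linalg.det(A))

# (iii) (AB)^{-1} = B^{-1} A^{-1}
reversal_ok = np.allclose(np.linalg.inv(A @ B), np.linalg.inv(B) @ Ainv)

# (vi) |A^{-1}| = 1 / |A|
det_ok = np.isclose(np.linalg.det(Ainv), 1 / np.linalg.det(A))
print(def_ok, formula_ok, reversal_ok, det_ok)
```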
