
[Numerical Analysis] 2. Matrix

hwijin97 2022. 5. 19. 17:01

Matrix

- A rectangular array of numbers having a fixed number of rows and columns

- $m \times n$ matrix consists of $m$ rows and $n$ columns

 

  • Square matrix : #columns = #rows
  • Row matrix : #rows = 1
  • Column matrix : #columns = 1
  • Vectors : row matrix or column matrix
  • Scalar : 1 x 1 matrix
  • Diagonal matrix : $a_{ij} = 0 \text{ if } i \neq j$
  • Identity matrix : $a_{ij} = 0 \text{ if } i \neq j$, $a_{ij} = 1 \text{ if } i = j$
  • Kronecker delta : $\delta_{ij} = 0 \text{ if } i \neq j$, $\delta_{ij} = 1 \text{ if } i = j$
  • Symmetric matrix : $a_{ij} = a_{ji}$
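
A minimal NumPy sketch of these special matrices (the example values are arbitrary), checking the diagonal, identity, and symmetric conditions above:

```python
import numpy as np

# Identity matrix: a_ij = 1 if i == j, 0 otherwise
I = np.eye(3)

# Diagonal matrix: a_ij = 0 whenever i != j
D = np.diag([2.0, 5.0, 7.0])

# Symmetric matrix: a_ij = a_ji, i.e. A equals its transpose A^T
S = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 5.0],
              [3.0, 5.0, 6.0]])
print(np.allclose(S, S.T))  # True
```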

Matrix Computation

 

$A+B$, $A-B$, $kA$ : computed element-wise

 

multiplication : #columns of the 1st matrix must equal #rows of the 2nd matrix : conformable

$C = AB$

$A : l \times m,\ B : m \times n \to C : l \times n$

$c_{ij} = a_{i1}b_{1j} + a_{i2}b_{2j} + \cdots + a_{im} b_{mj}$
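
A small Python sketch of this element-wise product formula (matrices $A$ and $B$ below are arbitrary examples), compared against NumPy's built-in product:

```python
import numpy as np

def matmul(A, B):
    """Multiply an l x m matrix A by an m x n matrix B using c_ij = sum_k a_ik * b_kj."""
    l, m = len(A), len(A[0])
    m2, n = len(B), len(B[0])
    assert m == m2, "not conformable: #columns of A must equal #rows of B"
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(n)] for i in range(l)]

A = [[1, 2, 3],
     [4, 5, 6]]        # 2 x 3
B = [[7, 8],
     [9, 10],
     [11, 12]]         # 3 x 2
C = matmul(A, B)       # 2 x 2
print(C)
print(np.allclose(C, np.array(A) @ np.array(B)))  # True
```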

 

$AB \neq BA$

$AI = IA = A$

transpose of $A$ : $A^T$, where $a_{ij}^T = a_{ji}$

 

Orthogonal Matrix

$AA^T = I$ : A is an orthogonal matrix
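
For example, a 2 x 2 rotation matrix is orthogonal; a quick NumPy check (the angle is arbitrary):

```python
import numpy as np

theta = 0.3  # arbitrary rotation angle
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# A is orthogonal: its rows (and columns) are orthonormal, so A A^T = I
print(np.allclose(A @ A.T, np.eye(2)))  # True
```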

 

Matrix Properties

$A+B = B+A$

$A+(B+C) = (A+B)+C$

$(AB)C = A(BC)$

$A(B+C) = AB + AC$

$(A+B)^T = A^T + B^T$

$(kA)^T = kA^T$

$(AB)^T = B^TA^T$
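
A numerical sanity check of these properties with arbitrary random matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
A, B, C = (rng.standard_normal((3, 3)) for _ in range(3))
k = 2.5

print(np.allclose(A @ (B @ C), (A @ B) @ C))    # (AB)C = A(BC)
print(np.allclose(A @ (B + C), A @ B + A @ C))  # A(B+C) = AB + AC
print(np.allclose((A + B).T, A.T + B.T))        # (A+B)^T = A^T + B^T
print(np.allclose((k * A).T, k * A.T))          # (kA)^T = k A^T
print(np.allclose((A @ B).T, B.T @ A.T))        # (AB)^T = B^T A^T
```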

 

 

Determinants

- Defined in square matrices

- notation : $|A|$ or det(A)

- 1x1 matrix : $|A| = a_{11}$

- 2x2 matrix : $|A| = a_{11}a_{22} - a_{12}a_{21}$

 

- 3x3 matrix

Minor of element $a_{ij}$ : $|A_{ij}|$, obtained by deleting the $i$-th row and $j$-th column

Cofactor of element $a_{ij}$ : $c_{ij} = (-1)^{i+j} |A_{ij}|$

$|A_{11}|$ : determinant of the matrix obtained by deleting row 1 and column 1

$|A| = \sum_{j=1}^n a_{ij} (-1)^{i+j} |A_{ij}| = \sum_{i=1}^n a_{ij} (-1)^{i+j} |A_{ij}|$ : expansion along the $i$-th row, expansion along the $j$-th column
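
A recursive Python sketch of the cofactor expansion along the first row (exponential cost, so only practical for small matrices; the example matrix is arbitrary):

```python
import numpy as np

def minor(A, i, j):
    """Matrix obtained by deleting row i and column j (0-based indices)."""
    return [row[:j] + row[j+1:] for k, row in enumerate(A) if k != i]

def det(A):
    """Determinant by cofactor expansion along the first row."""
    n = len(A)
    if n == 1:
        return A[0][0]
    if n == 2:
        return A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return sum((-1) ** j * A[0][j] * det(minor(A, 0, j)) for j in range(n))

A = [[2, 1, 3],
     [0, 4, 1],
     [5, 2, 2]]
print(det(A), np.linalg.det(np.array(A)))  # same value (up to rounding)
```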

 

Determinants Properties

$|A| = |A^T|$

$|AB| = |A||B|$

$|B| = k|A|$, if $B$ is obtained by multiplying any one row (or column) of $A$ by a constant $k$

Interchanging any two rows (or columns) of $A$ changes the sign of $|A|$

If two rows (or columns) are identical, $|A| = 0$

Adding a multiple of one row (or column) to another row (or column) does not change the determinant
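
These properties can be verified numerically; a brief sketch with an arbitrary matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0, 3.0],
              [0.0, 4.0, 1.0],
              [5.0, 2.0, 2.0]])

print(np.isclose(np.linalg.det(A), np.linalg.det(A.T)))      # |A| = |A^T|

B = A.copy(); B[1] *= 3.0                                    # scale one row by k = 3
print(np.isclose(np.linalg.det(B), 3.0 * np.linalg.det(A)))  # |B| = k|A|

C = A.copy(); C[[0, 2]] = C[[2, 0]]                          # swap two rows
print(np.isclose(np.linalg.det(C), -np.linalg.det(A)))       # sign flips

D = A.copy(); D[2] += 4.0 * D[0]                             # add a multiple of a row
print(np.isclose(np.linalg.det(D), np.linalg.det(A)))        # determinant unchanged
```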

 

 

Inverse Matrix

- Matrix arithmetic does not define division

- The inverse matrix plays a similar role

$AA^{-1} = I$ : $AX = B \to A^{-1}AX = A^{-1}B \to X = A^{-1}B$

$a_{ij}^{-1} = \frac{(-1)^{i+j}|A_{ji}|}{|A|} $

$A^{-1} : a_{ij}^{-1} $

- $A^{-1}$ does not exist if $|A| = 0$
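
A sketch of this cofactor formula for the inverse (note the transposed minor indices $|A_{ji}|$); the example matrix is arbitrary and NumPy is used only for the minors' determinants:

```python
import numpy as np

def inverse_by_cofactors(A):
    """Inverse via a^{-1}_{ij} = (-1)^{i+j} |A_{ji}| / |A| (transposed cofactors)."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    detA = np.linalg.det(A)
    if np.isclose(detA, 0.0):
        raise ValueError("matrix is singular: |A| = 0, no inverse exists")
    inv = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            # minor |A_{ji}|: delete row j and column i (note the swapped indices)
            M = np.delete(np.delete(A, j, axis=0), i, axis=1)
            inv[i, j] = (-1) ** (i + j) * np.linalg.det(M) / detA
    return inv

A = [[4.0, 7.0],
     [2.0, 6.0]]
print(inverse_by_cofactors(A))
print(np.linalg.inv(A))  # should match
```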

 

Gauss-Jordan Elimination

 

- To calculate the inverse matrix

1. Construct an n x 2n matrix by concatenating the identity matrix to the right of the n x n matrix

$[A | I]$

- An elementary row operation is one of the following operations that can be performed on a matrix

1. Exchange two rows

2. Multiply a row by a nonzero scalar

3. Add a multiple of one row to another row

 

- Algorithm

1. Construct the augmented matrix $[A | I]$

2. Repeat the following steps for each column $j$, for $j = 1$ to $n$

3. Find the row $i$ with $i \geq j$ such that $a_{ij}$ has the largest absolute value. If no row exists for which $a_{ij} \neq 0$, no inverse matrix exists

4. If $i \neq j$, exchange row $i$ and row $j$

5. Multiply row $j$ by $\frac{1}{a_{jj}}$ => sets $a_{jj}$ to 1

6. For each row $k$ where $1 \leq k \leq n$ and $k \neq j$, add $-a_{kj}$ times row $j$ to row $k$ => sets every element in column $j$ to 0 except the one in row $j$ (a code sketch of the full procedure is given below)
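
A Python sketch of the whole procedure above, with partial pivoting as in step 3 (the test matrix is arbitrary):

```python
import numpy as np

def gauss_jordan_inverse(A, eps=1e-12):
    """Invert A by Gauss-Jordan elimination on the augmented matrix [A | I]."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    M = np.hstack([A, np.eye(n)])            # step 1: augmented n x 2n matrix [A I]
    for j in range(n):                       # step 2: sweep the columns left to right
        i = j + np.argmax(np.abs(M[j:, j]))  # step 3: pivot row with largest |a_ij|
        if abs(M[i, j]) < eps:
            raise ValueError("no inverse matrix exists (singular to working precision)")
        if i != j:
            M[[i, j]] = M[[j, i]]            # step 4: exchange rows i and j
        M[j] /= M[j, j]                      # step 5: scale pivot row so a_jj = 1
        for k in range(n):                   # step 6: eliminate column j in all other rows
            if k != j:
                M[k] -= M[k, j] * M[j]
    return M[:, n:]                          # right half is now A^{-1}

A = [[2.0, 1.0, 1.0],
     [1.0, 3.0, 2.0],
     [1.0, 0.0, 0.0]]
print(np.allclose(gauss_jordan_inverse(A) @ np.array(A), np.eye(3)))  # True
```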

 

 

Linear System

 

$AX = B \to A^{-1}AX = A^{-1}B \to X = A^{-1}B$

- when $|A| = 0$, no unique solution exists (either no solution or infinitely many)

 

Homogeneous System

$AX = 0$

- if $|A| \neq 0$, only the zero solution (or trivial solution) $X = 0$ exists

- if $|A| = 0$, a non-zero solution ($X \neq 0$) exists

 

Solving a Linear System using Gauss-Jordan Elimination ( $AX = B$, augmented matrix $[A | B]$ )
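
A brief sketch: the same row reduction applied to the augmented matrix $[A | B]$ turns it into $[I | X]$, so the right block is the solution (the system below is a made-up example):

```python
import numpy as np

def gauss_jordan_solve(A, B, eps=1e-12):
    """Solve AX = B by reducing the augmented matrix [A | B] to [I | X]."""
    A = np.asarray(A, dtype=float)
    B = np.asarray(B, dtype=float).reshape(A.shape[0], -1)
    n = A.shape[0]
    M = np.hstack([A, B])
    for j in range(n):
        i = j + np.argmax(np.abs(M[j:, j]))  # partial pivoting
        if abs(M[i, j]) < eps:
            raise ValueError("|A| = 0: no unique solution")
        if i != j:
            M[[i, j]] = M[[j, i]]
        M[j] /= M[j, j]
        for k in range(n):
            if k != j:
                M[k] -= M[k, j] * M[j]
    return M[:, n:]

A = [[3.0, 1.0], [1.0, 2.0]]
B = [[9.0], [8.0]]
X = gauss_jordan_solve(A, B)
print(X)                                 # [[2.], [3.]]
print(np.allclose(np.array(A) @ X, B))   # True
```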

 

 

Eigenvalues and Eigenvectors

- For an n x n matrix $A$, there exist a non-zero column matrix (vector) $X$ and a scalar $\lambda$ such that $AX = \lambda X$

- $\lambda$ is called an eigenvalue of $A$, and $X$ is called an eigenvector corresponding to that eigenvalue

$AX = \lambda X \to (A-\lambda I) X = 0$ : to have a non-trivial solution $X \neq 0$ in this homogeneous system, the characteristic equation $|A - \lambda I| = 0$ must be satisfied
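
A concrete 2 x 2 example (matrix chosen arbitrarily): the characteristic equation can be solved by hand and cross-checked with NumPy's eigen-solver:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Characteristic equation |A - lambda I| = 0:
#   (4 - l)(3 - l) - 1*2 = l^2 - 7l + 10 = 0  ->  l = 5 or l = 2
eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)                            # [5. 2.] (order may differ)

# Each column of eigvecs is an eigenvector X with A X = lambda X
for lam, x in zip(eigvals, eigvecs.T):
    print(np.allclose(A @ x, lam * x))    # True
```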

 

 

Diagonalization

- For n x n matrix A, we say P diagonalizes A if $P^{-1}AP$ is a diagonal matrix

- if an n x n matrix $A$ has eigenvalues $\lambda_1, \cdots, \lambda_n$ and there exist corresponding eigenvectors $X_1, X_2, \cdots, X_n$ that form a linearly independent set, then $A$ can be diagonalized

$P = [\, X_1 \ \cdots \ X_n \,]$ ; since $X_1, \cdots, X_n$ are linearly independent, $P$ is invertible

$P^{-1}AP = \Lambda$
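
A short NumPy sketch of diagonalization, reusing the arbitrary 2 x 2 example from the eigenvalue section: the eigenvector matrix $P$ gives $P^{-1}AP = \Lambda$:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Columns of P are linearly independent eigenvectors of A
eigvals, P = np.linalg.eig(A)
Lambda = np.linalg.inv(P) @ A @ P             # P^{-1} A P

print(np.round(Lambda, 10))                   # diagonal matrix of the eigenvalues
print(np.allclose(Lambda, np.diag(eigvals)))  # True
```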