Chapter 1. Vector Space
Recommended reading : 【Linear Algebra】 Linear Algebra Index
1. Matrix Operations
⑴ Definition of Matrix
① An m × n matrix A = (aᵢⱼ) has entries aᵢⱼ (i = 1, …, m; j = 1, …, n); its i-th row vector is (aᵢ₁, …, aᵢₙ) and its j-th column vector is (a₁ⱼ, …, aₘⱼ)ᵀ
② Zero matrix or null matrix : A matrix where all elements are 0
③ Square Matrix : A matrix with the same number of rows and columns (n × n)
⑵ Matrix Addition : Defined entrywise for matrices of the same size, (A + B)[i][j] = A[i][j] + B[i][j]
⑶ Matrix Multiplication
① The element at the i-th row, j-th column of matrix X is denoted X[i][j]; if A ∈ ℝ^(l×m), B ∈ ℝ^(m×n), C ∈ ℝ^(l×n), and C = A × B, then C[i][j] = Σₖ A[i][k] B[k][j], summing over k = 1, …, m
② Property 1. Generally, AB ≠ BA
③ Property 2. AB = O does not imply A = O or B = O
④ Property 3. The following relationship holds between matrix multiplication and addition
○ A(B+C) = AB + AC, (A+B)C = AC + BC
○ A(BC) = (AB)C
○ c(A + B) = cA + cB
⑤ Theorem 1. Cayley–Hamilton Theorem : For a 2 × 2 matrix A = (a, b; c, d), A² - (a + d)A + (ad - bc)E = O (a SymPy check of this appears after the code examples below)
○ Using this, for n ≥ 2, write Aⁿ = (A² - (a + d)A + (ad - bc)E) Q(A) + kA + sE; since the first term vanishes by the theorem, Aⁿ reduces to kA + sE
⑥ Programming Code
○ Using numpy
import numpy as np
# Definition of two matrices
A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])
# Output matrices
print("Matrix A:")
print(A)
print("\nMatrix B:")
print(B)
# Matrix multiplication
result = np.dot(A, B)
## Alternatively, result = A@B
print("\nNumpy matrix multiplication result:")
print(result)
○ Using sympy
from sympy import Matrix, init_printing
from IPython.display import display  # display() requires an IPython/Jupyter environment
# Initialize LaTeX-style pretty printing
init_printing()
# Definition of two matrices
A = Matrix([[1, 2], [3, 4]])
B = Matrix([[5, 6], [7, 8]])
# Output matrices
print("Matrix A:")
display(A)
print("Matrix B:")
display(B)
# Matrix multiplication
result = A * B
print("Sympy matrix multiplication result:")
display(result)
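○ Verifying Theorem 1 with sympy : a minimal sketch (assuming SymPy is installed; the matrix values are arbitrary) that checks A² - (a + d)A + (ad - bc)E = O for a 2 × 2 matrix
from sympy import Matrix, eye, zeros
# Example 2x2 matrix (any values work; chosen arbitrarily)
A = Matrix([[1, 2], [3, 4]])
a, b, c, d = A[0, 0], A[0, 1], A[1, 0], A[1, 1]
# Cayley-Hamilton for a 2x2 matrix: A^2 - (a + d)A + (ad - bc)E = O
E = eye(2)
residual = A**2 - (a + d) * A + (a * d - b * c) * E
print(residual)                   # Matrix([[0, 0], [0, 0]])
print(residual == zeros(2, 2))    # True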
⑷ Transposed Matrix
① Definition : The transpose Aᵀ of an m × n matrix A is the n × m matrix with Aᵀ[i][j] = A[j][i]
② Symmetric Matrix : A square matrix A that equals its transpose, Aᵀ = A
③ When X ∈ ℝ^(N×d) (N data points of dimension d) is given,
○ XᵀX ∈ ℝ^(d×d) is proportional to the covariance matrix (covariance = XᵀX / N) when μ = 0
○ XXᵀ ∈ ℝ^(N×N) is the Gram matrix (a similarity matrix of pairwise dot products)
④ Property 1. (Aᵀ)ᵀ = A
⑤ Property 2. (A + B)ᵀ = Aᵀ + Bᵀ
⑥ Property 3. (rA)ᵀ = rAᵀ
⑦ Property 4. Transpose of AB is (AB)ᵀ = BᵀAᵀ
⑧ Property 5. If A and B are symmetric and AB = BA, then AB is symmetric
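○ A minimal NumPy sketch (matrix and data values chosen arbitrarily) checking Property 4 and forming XᵀX and XXᵀ from ③
import numpy as np
A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])
# Property 4: (AB)^T = B^T A^T
print(np.array_equal((A @ B).T, B.T @ A.T))   # True
# X: N = 3 data points of dimension d = 2, centered so that the mean is 0
X = np.array([[1.0, 2.0], [3.0, 0.0], [2.0, 4.0]])
X = X - X.mean(axis=0)
print(X.T @ X)   # d x d matrix, proportional to the covariance matrix
print(X @ X.T)   # N x N Gram (similarity) matrix of pairwise dot products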
⑸ Trace : The sum of the diagonal entries of a square matrix, denoted trace(A) or tr(A)
① Property 1. tr(E) = n, tr(O) = 0
② Property 2. tr(A + B) = tr(A) + tr(B)
③ Property 3. tr(cA) = ctr(A), c ∈ ℝ
④ Property 4. tr(Aᵀ) = tr(A)
⑤ Property 5. If A is an m × n matrix and B is an n × m matrix, then tr(AB) = tr(BA)
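○ A short NumPy check (random integer matrices, arbitrary seed) of Properties 2, 4, and 5
import numpy as np
rng = np.random.default_rng(0)
A = rng.integers(-5, 5, size=(3, 4))   # m x n
B = rng.integers(-5, 5, size=(4, 3))   # n x m
C = rng.integers(-5, 5, size=(3, 3))
D = rng.integers(-5, 5, size=(3, 3))
print(np.trace(C + D) == np.trace(C) + np.trace(D))   # Property 2
print(np.trace(C.T) == np.trace(C))                   # Property 4
print(np.trace(A @ B) == np.trace(B @ A))             # Property 5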
⑹ Gaussian Elimination
① Linear System of Equations : AX = B with A ∈ ℝ^(m×n), X ∈ ℝ^(n×ℓ), B ∈ ℝ^(m×ℓ)
② Augmented Matrix : Refers to the m × (n + ℓ) matrix (A | B)
③ Gauss-Jordan Elimination : Operates row-wise
○ Operation 1. Swap the positions of two row vectors : Changes the order of equations
○ Operation 2. Multiply a specific row vector by c : Multiplies both sides of an equation by c
○ Operation 3. Multiply one row vector by c and add it to another row vector : Multiplies one equation by c and adds it to another equation
○ Final Step : Achieve reduced row echelon form (RREF)
④ Advantage : Can be implemented programmatically
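○ As noted in ④, the elimination is easy to run in code; a minimal sketch using SymPy's built-in rref() on an arbitrary example system
from sympy import Matrix
# Augmented matrix (A | B) for the system  x + 2y = 5,  3x + 4y = 6
aug = Matrix([[1, 2, 5],
              [3, 4, 6]])
# rref() performs Gauss-Jordan elimination and returns
# (reduced row echelon form, tuple of pivot column indices)
rref_matrix, pivots = aug.rref()
print(rref_matrix)   # Matrix([[1, 0, -4], [0, 1, 9/2]])
print(pivots)        # (0, 1)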
2. Vector Space
⑴ Conditions for Vector Space
① Identity Element for Addition : There exists 0 in V such that x + 0 = x for any x in V
② Additive Inverse : If x ∈ V, there exists -x in V such that x + (-x) = 0
③ Commutativity of Addition : If x, y ∈ V, then x + y = y + x
④ Associativity of Addition : If x, y, z ∈ V, then (x + y) + z = x + (y + z)
⑤ Distributive Law : c(x + y) = cx + cy
⑥ Distributive Law : (c₁ + c₂)x = c₁x + c₂x
⑦ Multiplicative Identity : 1x = x
⑧ Associativity of Scalar Multiplication : c₁(c₂x) = (c₁c₂)x
⑵ Linear Transformation
① Linear Transformation : A mapping T : U → V is linear if, for all x, y ∈ U and c ∈ F, T(x + y) = T(x) + T(y) and T(cx) = cT(x)
② Linear Combination : A vector of the form c₁x₁ + c₂x₂ + ⋯ + cₙxₙ with scalars c₁, …, cₙ ∈ F
③ Linearly Dependent : Some nontrivial linear combination equals the zero vector, i.e. c₁x₁ + ⋯ + cₙxₙ = 0 with the cᵢ not all zero
④ Linearly Independent : c₁x₁ + ⋯ + cₙxₙ = 0 only for c₁ = ⋯ = cₙ = 0
⑤ Spanning Set : If any vector in vector space V can be expressed as a linear combination of elements in S, then S is a spanning set
⑥ Basis : A linearly independent set that spans V (an independence check is sketched in the code below)
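○ A small SymPy sketch (example vectors chosen arbitrarily) that tests linear independence: stack the vectors as columns, row-reduce, and compare the number of pivot columns with the number of vectors
from sympy import Matrix
v1 = Matrix([1, 0, 2])
v2 = Matrix([0, 1, 1])
v3 = Matrix([1, 1, 3])   # v3 = v1 + v2, so the set is linearly dependent
M = Matrix.hstack(v1, v2, v3)
_, pivots = M.rref()
# Independent iff every column is a pivot column
print(len(pivots) == M.cols)   # False -> linearly dependent
print(pivots)                  # (0, 1): v1 and v2 form a basis of the span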
⑶ Inverse of T, T⁻¹ : If T⁻¹ : V → U exists, then T⁻¹ is also a linear transformation
⑷ Inner Product Space
① Conditions for Inner Product : ⟨x, y⟩ = ⟨y, x⟩* (conjugate symmetry), ⟨cx + y, z⟩ = c⟨x, z⟩ + ⟨y, z⟩ (linearity in the first argument), and ⟨x, x⟩ ≥ 0 with ⟨x, x⟩ = 0 iff x = 0 (positive definiteness)
② Standard Inner Product : For V = ℂⁿ, defined as ⟨x, y⟩ = Σᵢ xᵢyᵢ* (complex conjugate on the second argument)
③ Properties of Inner Product
④ Inner Product Space : A vector space with a defined inner product
○ Real Inner Product Space : When the scalar field is the set of real numbers
○ Complex Inner Product Space : When the scalar field is the set of complex numbers
⑤ Norm : ‖x‖ = √⟨x, x⟩
⑥ Theorem 1. Cauchy-Schwarz Inequality : |⟨x, y⟩| ≤ ‖x‖‖y‖
⑦ Theorem 2. Triangle Inequality : ‖x + y‖ ≤ ‖x‖ + ‖y‖
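○ A NumPy sketch (arbitrary real vectors) of the standard inner product, the induced norm, and numerical checks of Theorems 1 and 2
import numpy as np
x = np.array([1.0, -2.0, 3.0])
y = np.array([4.0, 0.0, -1.0])
inner = np.dot(x, y)          # standard inner product on R^n
norm_x = np.linalg.norm(x)    # ||x|| = sqrt(<x, x>)
norm_y = np.linalg.norm(y)
# Theorem 1 (Cauchy-Schwarz): |<x, y>| <= ||x|| ||y||
print(abs(inner) <= norm_x * norm_y)              # True
# Theorem 2 (triangle inequality): ||x + y|| <= ||x|| + ||y||
print(np.linalg.norm(x + y) <= norm_x + norm_y)   # True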
⑸ Metric Space
① Distance Function : d(x, y) ≥ 0 with d(x, y) = 0 iff x = y, d(x, y) = d(y, x), and d(x, z) ≤ d(x, y) + d(y, z)
② Relation between Norm and Distance
○ If a norm is defined, a distance can be defined by d(x, y) = ‖x - y‖
○ Having a defined distance does not necessarily imply a corresponding norm exists
③ Kernel
○ A non-linear map φ : X → H into a feature space H can be used to define the distance d differently
○ Definition of Kernel : k(xᵢ, xⱼ) ≡ φ(xᵢ)ᵀφ(xⱼ)
○ Characteristics of Kernel : Symmetric, k(xᵢ, xⱼ) = k(xⱼ, xᵢ), with a positive semidefinite kernel (Gram) matrix; the inner product in H is computed without evaluating φ explicitly
○ Examples of Kernels
○ φ(x, y) = (x, y, x² + y²)ᵀ
○ k(v, u) = uᵀv (in this case, φ(·) is the identity function)
○ Gaussian Kernel : k(v, u) = exp(-‖v - u‖² / 2σ²)
○ Polynomial Kernel : k(v, u) = (uᵀv + 1)^d
○ Sigmoid Kernel : k(v, u) = tanh(αuᵀv + β)
○ RBF Kernel : k(v, u) = exp(-γ‖v - u‖²)
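○ A minimal NumPy sketch of the kernel examples above (the parameter values σ, γ, α, β, d are arbitrary), including the kernel induced by the explicit feature map φ(x, y) = (x, y, x² + y²)ᵀ
import numpy as np
def phi(p):
    # Feature map phi(x, y) = (x, y, x^2 + y^2)^T from the example above
    x, y = p
    return np.array([x, y, x**2 + y**2])
def gaussian_kernel(v, u, sigma=1.0):
    return np.exp(-np.sum((v - u) ** 2) / (2 * sigma**2))
def polynomial_kernel(v, u, d=2):
    return (u @ v + 1) ** d
def sigmoid_kernel(v, u, alpha=0.5, beta=0.0):
    return np.tanh(alpha * (u @ v) + beta)
def rbf_kernel(v, u, gamma=1.0):
    return np.exp(-gamma * np.sum((v - u) ** 2))
v = np.array([1.0, 2.0])
u = np.array([0.5, -1.0])
print(u @ v)                # linear kernel: phi is the identity
print(phi(v) @ phi(u))      # kernel defined through the explicit feature map
print(gaussian_kernel(v, u), polynomial_kernel(v, u),
      sigmoid_kernel(v, u), rbf_kernel(v, u))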
⑹ Subspace : A subset of V that is itself a vector space under the operations of V; the span of any set of vectors in V is a subspace
Input: 2020.04.07 21:36
Modified: 2024.10.10 08:12