
The 34th National University Student Mathematics Contest – Area 1

Recommended post: 【National University Student Mathematics Contest】 Collection of National University Student Mathematics Contest Solutions


The 34th National University Student Mathematics Contest

November 14, 2015 (10:00 - 13:00)



1.

Compute the following limit.

Screenshot 2026-01-18 12.38.41 PM

Solution.

By L’Hôpital’s rule,


Screenshot 2026-01-18 12.39.15 PM



2.

For a positive integer $d$, show that any real-coefficient polynomial $\phi(x,y)=\sum_{i=0}^{2d} a_i x^i y^{2d-i}$ satisfies the following identity.


Screenshot 2026-01-18 12.39.42 PM


Solution.


Screenshot 2026-01-23 12.32.14 AM


Expanding the left-hand side yields an expression independent of $x$ and $y$, so the right-hand side, obtained by applying a transformation symmetric in $x$ and $y$, is ultimately the same as the left-hand side.

Because nature does not favor one of $x$ and $y$ over the other.


Solution Using Group Theory.

This proof uses the $SO(2)$ action and a generator of its Lie algebra, and is essentially the same idea as the previous proof.


Screenshot 2026-01-23 1.20.08 AM


Since \(\phi\) is a homogeneous polynomial of degree \(2d\) and the Laplacian \(\Delta\) lowers the degree by \(2\), it follows that \(\Delta^{d}\phi\) is a constant. Hence \(L(\text{constant})=0\), and therefore \(L(\Delta^{d}\phi)=0\).
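For instance, in the smallest case $d=1$, with $\phi = a_2x^2 + a_1xy + a_0y^2$ as in the problem statement:

\[
\Delta\phi = \frac{\partial^2\phi}{\partial x^2} + \frac{\partial^2\phi}{\partial y^2} = 2a_2 + 2a_0,
\]

which is indeed a constant.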


Screenshot 2026-01-23 1.08.52 AM


The symmetry between $x$ and $y$ stems from the rotational symmetry of $L$ and $\Delta$; that is, applying a rotation does not change these operators (they commute with the $SO(2)$ action). The rotational invariance of $L$ is easy to see by switching to polar coordinates, $x = r\cos\theta$ and $y = r\sin\theta$.
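For reference, the same invariance for $\Delta$ is visible in polar coordinates, where the Laplacian takes the well-known form

\[
\Delta = \frac{\partial^2}{\partial r^2} + \frac{1}{r}\frac{\partial}{\partial r} + \frac{1}{r^2}\frac{\partial^2}{\partial \theta^2},
\]

which has no explicit $\theta$-dependence and therefore commutes with the rotations $\theta \mapsto \theta + \alpha$.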



3.

For a positive integer $n$ ($n \ge 3$), a matrix $A=(a_{ij})$ of size $n^2 \times n^2$ is given as follows.


Screenshot 2026-01-18 12.40.34 PM


Find all eigenvalues of $A$ and the dimension of the eigenspace corresponding to each eigenvalue.

Solution.

The given matrix $A$ can be expressed as the Kronecker product $A = J \otimes (A)_{n\times n}$, where $J=\mathbf{1}\mathbf{1}^T$ is the $n\times n$ all-ones matrix (see Kronecker product). For example, when $n=3$, it is as follows.


Screenshot 2026-01-18 12.41.10 PM


The eigenvalues of $J$ are $0$ or $n$.


Screenshot 2026-01-18 12.41.37 PM
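Indeed, this can be verified directly:

\[
J\mathbf{1} = \mathbf{1}\,(\mathbf{1}^T\mathbf{1}) = n\,\mathbf{1}, \qquad \operatorname{rank} J = 1 \;\Rightarrow\; \dim\ker J = n-1,
\]

so $n$ is an eigenvalue of multiplicity $1$ and $0$ is an eigenvalue of multiplicity $n-1$.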


Also, the eigenvalues of $(A)_{n\times n}$ are $1$ or $-1$. In the following calculation, note that when computing an $(n-1)\times(n-1)$ determinant, the only nonzero terms come from the diagonal entries and the anti-diagonal entries.


Screenshot 2026-01-18 12.42.00 PM


Therefore, since each eigenvalue of $A$ is a product of an eigenvalue of $J$ and an eigenvalue of $(A)_{n\times n}$, the set of eigenvalues is $\{-n, 0, n\}$.

Meanwhile, by the following, for an eigenvector $x$ of $J$ and an eigenvector $y$ of $(A)_{n\times n}$, $x \otimes y$ is an eigenvector of $A = J \otimes (A)_{n\times n}$.


Screenshot 2026-01-18 12.42.21 PM
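Explicitly, writing $M = (A)_{n\times n}$, this is the mixed-product property of the Kronecker product: if $Jx = \lambda x$ and $My = \mu y$, then

\[
(J\otimes M)(x\otimes y) = (Jx)\otimes(My) = (\lambda x)\otimes(\mu y) = \lambda\mu\,(x\otimes y).
\]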


Therefore, referring to the characteristic equation above, we can determine the dimension of the eigenspace corresponding to each eigenvalue. Note that these dimensions must sum to $n^2$.

Case 1. $n = 2k+1$

1-1. When the eigenvalue $\mu$ of $(A)_{n\times n}$ is $1$: the dimension is $k+1$ (in the $n$-dimensional space)

1-1-1. When the eigenvalue of $J$ is $0$, so the eigenvalue of $A$ is $0 \times \mu = \mathbf{0}$: the dimension is $(n-1)\times(k+1)$ (in the $n^2$-dimensional space)

1-1-2. When the eigenvalue of $J$ is $n$, so the eigenvalue of $A$ is $n \times \mu = \mathbf{n}$: the dimension is $1\times(k+1)$ (in the $n^2$-dimensional space)

1-2. When the eigenvalue $\mu$ of $(A)_{n\times n}$ is $-1$: the dimension is $k$ (in the $n$-dimensional space)

1-2-1. When the eigenvalue of $J$ is $0$, so the eigenvalue of $A$ is $0 \times \mu = \mathbf{0}$: the dimension is $(n-1)\times k$ (in the $n^2$-dimensional space)

1-2-2. When the eigenvalue of $J$ is $n$, so the eigenvalue of $A$ is $n \times \mu = \mathbf{-n}$: the dimension is $1\times k$ (in the $n^2$-dimensional space)

Case 2. $n = 2k$

2-1. When the eigenvalue $\mu$ of $(A)_{n\times n}$ is $1$: the dimension is $k+1$ (in the $n$-dimensional space)

2-1-1. When the eigenvalue of $J$ is $0$, so the eigenvalue of $A$ is $0 \times \mu = \mathbf{0}$: the dimension is $(n-1)\times(k+1)$ (in the $n^2$-dimensional space)

2-1-2. When the eigenvalue of $J$ is $n$, so the eigenvalue of $A$ is $n \times \mu = \mathbf{n}$: the dimension is $1\times(k+1)$ (in the $n^2$-dimensional space)

2-2. When the eigenvalue $\mu$ of $(A)_{n\times n}$ is $-1$: the dimension is $k-1$ (in the $n$-dimensional space)

2-2-1. When the eigenvalue of $J$ is $0$, so the eigenvalue of $A$ is $0 \times \mu = \mathbf{0}$: the dimension is $(n-1)\times(k-1)$ (in the $n^2$-dimensional space)

2-2-2. When the eigenvalue of $J$ is $n$, so the eigenvalue of $A$ is $n \times \mu = \mathbf{-n}$: the dimension is $1\times(k-1)$ (in the $n^2$-dimensional space)

Generalizing this, we can express it as follows.


Screenshot 2026-01-18 12.43.01 PM
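As a quick numerical sanity check of this eigenvalue structure, here is a minimal sketch; the symmetric matrix `M` below is a stand-in, not the contest's $(A)_{n\times n}$, whose definition appears only in the image above.

```python
import numpy as np

# Eigenvalues of a Kronecker product are products of eigenvalues.
n = 4
rng = np.random.default_rng(0)
M = rng.standard_normal((n, n))
M = (M + M.T) / 2                       # stand-in symmetric matrix

J = np.ones((n, n))                     # J = 1 1^T, eigenvalues n and 0
A = np.kron(J, M)                       # n^2 x n^2, symmetric

eig_M = np.linalg.eigvalsh(M)
eig_A = np.sort(np.linalg.eigvalsh(A))

# Expected spectrum: 0 with multiplicity n(n-1), plus n * mu for each mu.
expected = np.sort(np.concatenate([np.zeros(n * (n - 1)), n * eig_M]))
print(np.allclose(eig_A, expected))     # True
```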



4.

Let $V$ be the real vector space consisting of real matrices of size $100 \times 100$. For a matrix $A \in V$, let $d_A$ be the dimension of the subspace $\{B \in V \mid AB = BA\}$ of $V$. If $A \in V$ satisfies the identity $A^4 - 5A^2 + 4I = O$ (where $I$ is the identity matrix), find the minimum value of $d_A$.

Solution.

The given identity has the form $p(A)=O$ for the polynomial $p(t)=t^4-5t^2+4$.

A matrix is diagonalizable if and only if its minimal polynomial splits into distinct linear factors.

Here, the minimal polynomial is the nonzero polynomial of smallest degree among those satisfying $q(A)=O$, and it is the greatest common divisor of all polynomials $q$ such that $q(A)=O$.

Since $p(t)$ factors into distinct linear factors with no repeated roots, and the minimal polynomial of $A$ divides $p$, the minimal polynomial also splits into distinct linear factors; hence $A$ is diagonalizable.

Meanwhile, since $p(t)=(t+2)(t+1)(t-1)(t-2)$, the possible eigenvalues of $A$ are $-2, -1, 1, 2$.

For a diagonalizable matrix $A$, any matrix $B$ that commutes with it (i.e., belongs to its commutant) can be represented, in a basis of eigenvectors of $A$, as a block diagonal matrix with one block per eigenvalue of $A$.

Because if $Av=\lambda v$, then $A(Bv)=B(Av)=\lambda(Bv),$ so $B$ must preserve each eigenspace of $A$.

Recall that “commuting” means satisfying $AB=BA$, and if all eigenvalues of $A$ are distinct, then a matrix $B$ commuting with a diagonal matrix $D$ must itself be diagonal.

Therefore, the form of $B$ is as follows.


Screenshot 2026-01-18 12.43.58 PM
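For example, if $A = \mathrm{Diag}(1,1,2)$, then comparing entries of $AB = BA$ gives $(\lambda_i - \lambda_j)b_{ij} = 0$, so $b_{ij} = 0$ whenever $\lambda_i \ne \lambda_j$:

\[
B = \begin{pmatrix} * & * & 0 \\ * & * & 0 \\ 0 & 0 & * \end{pmatrix},
\]

a $2\times 2$ block for the eigenvalue $1$ and a $1\times 1$ block for the eigenvalue $2$.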


If the block $B_i$ for the $i$-th eigenvalue has size $n_i \times n_i$, then with $k=4$ we have $n_1+\cdots+n_k=100$ (some $n_i$ may be $0$ if the corresponding eigenvalue does not occur).

Since an arbitrary $n_i\times n_i$ matrix is allowed within each eigenvalue block, we obtain $d_A = n_1^2 + \cdots + n_k^2$, and we can apply the Cauchy–Schwarz inequality.


Screenshot 2026-01-18 12.44.22 PM
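The displayed estimate amounts to the following: by the Cauchy–Schwarz inequality,

\[
100^2 = (n_1+n_2+n_3+n_4)^2 \le 4\,(n_1^2+n_2^2+n_3^2+n_4^2) = 4\,d_A,
\]

so $d_A \ge 2500$, with equality when $n_1=n_2=n_3=n_4=25$ (e.g., $A$ diagonal with each of $-2,-1,1,2$ occurring $25$ times). Hence the minimum value is $2500$.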


(Problem related to commuting: 2017, 36th Contest, Area 1, Problem 5)



5.

A differentiable function $f:\mathbb{R}\to\mathbb{R}$ satisfies the following conditions.


Screenshot 2026-01-18 12.44.54 PM


Prove the following inequality.


Screenshot 2026-01-18 12.45.13 PM


Solution.

First, by the Cauchy–Schwarz inequality, let us verify that the following holds.


Screenshot 2026-01-18 12.45.34 PM


The inequality checked above under the assumption $f(x)\ge 0$ is equivalent to the general conclusion of the problem.

(This observation is not strictly necessary, but it is included because it allows us to further assume that $f$ is a concave-down, bell-shaped function.)

Also, the validity of the given inequality is unchanged if we rearrange the intervals of integration appropriately.

For example, swapping the pieces of $f$ defined on $[1,2]$ and $[2,3]$, so that they are redefined on $[2,3]$ and $[1,2]$ respectively, changes neither the left-hand side nor the right-hand side.

Therefore, it suffices to verify the inequality under the assumption that $f(x)f'(x)\ge 0$ for $x \le c$ and $f(x)f'(x)\le 0$ for $x \ge c$; this special case is equivalent to the general conclusion.

This is because, by applying such rearrangements appropriately, we can always reduce to a situation where $f(x)f'(x)\ge 0$ for $x \le c$ and $f(x)f'(x)\le 0$ for $x \ge c$.

Since $f(-\infty)=f(\infty)=0$, we can obtain the following conclusion.


Screenshot 2026-01-18 12.46.19 PM



6.

Let $x(t), y(t), z(t)$ be pairwise coprime real-coefficient polynomials of degree at least $1$. For some positive integer $d$, they satisfy the following identity:

\[x(t)^d + y(t)^d = z(t)^d\]

Show that $x(t)^{d-1}$ divides $y(t)z'(t) - z(t)y'(t)$, and that $d \le 2$.

Solution.

We can obtain the following equation.


Screenshot 2026-01-18 12.46.50 PM
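One way to arrive at such an equation: differentiating the identity gives $x^{d-1}x' + y^{d-1}y' = z^{d-1}z'$; multiplying the original identity by $z'$, the differentiated one by $z$, and subtracting eliminates the $z$-terms:

\[
x^{d-1}\bigl(xz' - x'z\bigr) = -\,y^{d-1}\bigl(yz' - y'z\bigr).
\]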


Here $x(t)^{d-1}$ divides the left-hand side, and since $x(t)$ and $y(t)$ are coprime, it follows that $x(t)^{d-1}\mid\bigl(y(t)z'(t)-y'(t)z(t)\bigr)$.

Without loss of generality, assume that the degree $n$ of $x(t)$ is greater than or equal to the degree of $y(t)$. Then the degree of $z(t)$ is also $n$.

Therefore, we can show that $d \le 2$ as follows.


Screenshot 2026-01-18 12.47.10 PM
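In outline, the degree count runs as follows. First, $yz' - y'z$ is not identically zero: otherwise $(y/z)' = 0$, so $y$ would be a constant multiple of $z$, contradicting coprimality. Its degree is at most $2n-1$, so the divisibility forces

\[
(d-1)\,n \;\le\; \deg\bigl(yz' - y'z\bigr) \;\le\; 2n - 1 \;<\; 2n,
\]

hence $d-1 < 2$, i.e., $d \le 2$.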


This is related to the function-field version of Fermat’s Last Theorem, the Mason–Stothers theorem.



7.

Let $A, B$ be real symmetric $2\times 2$ matrices with eigenvalues $\lambda_1, \lambda_2$ ($\lambda_1 \ge \lambda_2 \ge 0$) and $\mu_1, \mu_2$ ($\mu_1 \ge \mu_2 \ge 0$), respectively. Prove the following inequality:

\[\mathrm{tr}(AB) \le \lambda_1\mu_1 + \lambda_2\mu_2\]

Official Solution.

The official solution is written in an easy-to-understand way, so most of the content is copied here.

Since $A, B$ are symmetric matrices, $A, B$ are orthogonally diagonalizable (spectral theorem).

Diagonalizing $A$ and $B$, there exist orthogonal matrices $P$ and $Q$ such that $A = P\,\mathrm{Diag}(\lambda_1,\lambda_2)\,P^T$ and $B = Q\,\mathrm{Diag}(\mu_1,\mu_2)\,Q^T$, hence we obtain the following.


Screenshot 2026-01-18 12.47.55 PM


Since $P^TQ$ is also orthogonal, we write $P^TQ = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$, and since each row vector and column vector must be a unit vector,

\[a^2 + b^2 = c^2 + d^2 = 1 = a^2 + c^2 = b^2 + d^2\]

holds. In particular, $a^2 = d^2$ and $b^2 = c^2$. Now, using $\lambda_1\mu_2 + \lambda_2\mu_1 \le \lambda_1\mu_1 + \lambda_2\mu_2$ (which is equivalent to $(\lambda_1-\lambda_2)(\mu_1-\mu_2)\ge 0$) and $a^2 + b^2 = 1$, we obtain


Screenshot 2026-01-18 12.48.34 PM


which proves the inequality.
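For reference, the estimate can be spelled out as follows (using $a^2=d^2$ and $b^2=c^2$): with $R = P^TQ$,

\[
\mathrm{tr}(AB) = \mathrm{tr}\bigl(\mathrm{Diag}(\lambda_1,\lambda_2)\,R\,\mathrm{Diag}(\mu_1,\mu_2)\,R^T\bigr) = \lambda_1\mu_1 a^2 + \lambda_1\mu_2 b^2 + \lambda_2\mu_1 c^2 + \lambda_2\mu_2 d^2
\]

\[
= (\lambda_1\mu_1+\lambda_2\mu_2)\,a^2 + (\lambda_1\mu_2+\lambda_2\mu_1)\,b^2 \;\le\; (\lambda_1\mu_1+\lambda_2\mu_2)(a^2+b^2) = \lambda_1\mu_1+\lambda_2\mu_2.
\]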



8.

For a positive integer $n$, a real matrix $M$ of size $n \times n$ is called an orthogonal matrix if $M^T M = I$. If there exists an invertible real matrix $P$ such that $PMP^{-1}$ is orthogonal, then $M$ is called a “matrix similar to an orthogonal matrix.” (Here $M^T$ is the transpose of $M$, and $I$ is the identity matrix.)

(i) For a real symmetric matrix $S$ whose eigenvalues are all positive, show that any matrix $A$ satisfying $A^TSA = S$ is “similar to an orthogonal matrix.”

(ii) For matrices $A, B$, if both $A$ and $\begin{pmatrix}A & 0\\ 0 & B\end{pmatrix}$ are “similar to an orthogonal matrix,” show that $B$ is also “similar to an orthogonal matrix.”

Solution.

(i)

Given any diagonal matrix $D$ with positive diagonal entries $\lambda_1, \cdots, \lambda_n$, take any orthogonal matrix $M$ (so that $M^T M = I$).

Let $P$ be the matrix obtained by multiplying the 1st, $\cdots$, $n$-th columns of $M$ by $\sqrt{\lambda_1}, \cdots, \sqrt{\lambda_n}$, respectively; that is, $P = M\,\mathrm{Diag}(\sqrt{\lambda_1},\cdots,\sqrt{\lambda_n})$. Then $P^T P = D$.

A real symmetric matrix $S$ whose eigenvalues are all positive can be orthogonally diagonalized, so $S = P_S D_S P_S^T$ with $P_S$ orthogonal and $D_S$ diagonal with positive diagonal entries.

As shown above, we can find $P$ satisfying $P^T P = D_S$, so

$S = P_S D_S P_S^T = (P P_S^T)^T (P P_S^T) = Q^T Q$

$A^TSA = A^T Q^T Q A = S = Q^T Q$

$\Leftrightarrow I = (Q^{-1})^T A^T Q^T Q A Q^{-1} = (Q A Q^{-1})^T (Q A Q^{-1})$

Since such a $Q$ exists, $QAQ^{-1}$ is orthogonal, and therefore $A$ is “similar to an orthogonal matrix.”
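A quick numerical sanity check of (i), as a minimal sketch: Cholesky stands in for the eigendecomposition construction above (either yields $Q$ with $Q^TQ=S$), and $A$ is generated so that $A^TSA=S$ holds by design.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

K = rng.standard_normal((n, n))
S = K @ K.T + n * np.eye(n)             # random positive definite symmetric S

L = np.linalg.cholesky(S)               # S = L L^T
Q = L.T                                 # so Q^T Q = S

Mo, _ = np.linalg.qr(rng.standard_normal((n, n)))   # random orthogonal matrix
A = np.linalg.inv(Q) @ Mo @ Q           # then A^T S A = S by construction

print(np.allclose(A.T @ S @ A, S))      # True: A preserves S
R = Q @ A @ np.linalg.inv(Q)
print(np.allclose(R.T @ R, np.eye(n)))  # True: Q A Q^{-1} is orthogonal
```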

(ii)

The official solution was referenced.

First, let $M(A,B)=\begin{pmatrix}A & 0\\ 0 & B\end{pmatrix}$. For an arbitrary invertible matrix $S$, the following identity holds,


Screenshot 2026-01-18 12.49.59 PM
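Concretely, blockwise conjugation gives

\[
\begin{pmatrix} S & 0 \\ 0 & I \end{pmatrix}
\begin{pmatrix} A & 0 \\ 0 & B \end{pmatrix}
\begin{pmatrix} S^{-1} & 0 \\ 0 & I \end{pmatrix}
=
\begin{pmatrix} SAS^{-1} & 0 \\ 0 & B \end{pmatrix},
\]

so $M(SAS^{-1}, B)$ is similar to $M(A,B)$, and conjugating $A$ alone loses no generality.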


Hence, without loss of generality, we may assume that $A$ is orthogonal.

Since $M(A,B)$ is “similar to an orthogonal matrix,” there is an invertible matrix $X$ such that $X M(A,B) X^{-1}$ is orthogonal, and we obtain the following.

\[M(A,B)^T X^T X M(A,B) = X^T X\]

In the end, we see that $M(A,B)$ preserves some positive definite matrix $G = X^T X$.

Since $X$ is invertible, $G = X^TX$ is necessarily positive definite and symmetric; thus, if a matrix is “similar to an orthogonal matrix,” then there exists a positive definite symmetric matrix that it preserves. (The converse direction of (i).)


Screenshot 2026-01-18 12.50.31 PM


Writing $G = \begin{pmatrix} Z_{11} & Z_{12} \\ Z_{12}^T & Z_{22} \end{pmatrix}$ and comparing the equation $M(A,B)^T G M(A,B)=G$ blockwise, we obtain three relations.

○ Upper-left: $A^T Z_{11} A = Z_{11}$

○ Upper-right: $A^T Z_{12} B = Z_{12}$

○ Lower-right: $B^T Z_{22} B = Z_{22}$

Since $Z_{22}$ is a principal submatrix of the positive definite matrix $G$, it is itself positive definite and symmetric; hence from $B^T Z_{22} B = Z_{22}$ and (i), we conclude that $B$ is “similar to an orthogonal matrix.”

Appendix

Recall that if $S = MM^T$ holds, then $M$ is called a square root of $S$.

This problem was inspired by that notion.



Posted: 2023.03.11 21:30

Revised: 2025.06.10 22:40
