
## SEEMOUS 2017 Problems

Problem 1. Let ${A \in \mathcal M_2(\Bbb{R})}$. Suppose

$\displaystyle A =\begin{pmatrix} a& b\\ c & d \end{pmatrix}$

satisfies

$\displaystyle a^2+b^2+c^2+d^2<1/5.$

Show that ${I+A}$ is invertible.

Problem 2. Let ${A,B \in \mathcal M_n(\Bbb{R})}$.

• a) Prove that there exists ${a>0}$ such that for every ${\varepsilon \in (-a,a),\varepsilon\neq 0}$ the matrix equation

$\displaystyle AX+\varepsilon X = B,\ X \in \mathcal M_n(\Bbb{R})$

has a unique solution ${X(\varepsilon) \in \mathcal M_n(\Bbb{R})}$.

• b) Prove that if ${B^2 = I_n}$ and ${A}$ is diagonalizable then

$\displaystyle \lim_{\varepsilon \rightarrow 0} \varepsilon\, \text{tr}(BX(\varepsilon)) = n-\text{rank}(A).$

Problem 3. Let ${f: \Bbb{R} \rightarrow \Bbb{R}}$ be a continuous function. Prove that

$\displaystyle \int_0^4 f(x(x-3)^2)\,dx = 2 \int_1^3 f(x(x-3)^2)\,dx.$

Problem 4. a) Let ${n \geq 0}$ be an integer. Compute ${\displaystyle \int_0^1 (1-t)^n e^t dt}$.

b) Let ${k \geq 0}$ be a fixed integer and let ${(x_n)_{n \geq k}}$ be the sequence defined by

$\displaystyle x_n = \sum_{i=k}^n {i \choose k}\left(e-1-\frac{1}{1!}-\frac{1}{2!}-\dots-\frac{1}{i!}\right).$

Prove that the sequence converges and find its limit.

Hints: 1. This must be a joke 🙂 We know that if ${\|A\|<1}$ for some matrix norm then ${I-A}$ is invertible (Neumann series). The square root of the sum of squares is such a norm (the Frobenius norm), so here ${\|A\| = \sqrt{a^2+b^2+c^2+d^2} < \sqrt{1/5} < 1}$; applying the fact to ${-A}$ shows that ${I+A}$ is invertible.
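As a quick numerical sanity check (a sketch, not part of the proof; the sampling scheme below is an arbitrary choice), one can draw random matrices satisfying the constraint and confirm that ${\det(I+A)}$ stays away from zero:

```python
import numpy as np

rng = np.random.default_rng(0)
min_det = float("inf")
for _ in range(1000):
    G = rng.uniform(-1, 1, (2, 2))
    # rescale so the squared entry sum (Frobenius norm squared) is below 1/5
    A = G * rng.uniform(0, np.sqrt(1 / 5)) / np.linalg.norm(G)
    # every eigenvalue of A has modulus <= ||A||_F < sqrt(1/5), so each
    # factor |1 + lambda| of det(I + A) is at least 1 - sqrt(1/5) > 0
    min_det = min(min_det, abs(np.linalg.det(np.eye(2) + A)))
```

Consistent with the eigenvalue bound, `min_det` never drops below ${(1-\sqrt{1/5})^2 \approx 0.31}$.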

2. The equation is ${(A+\varepsilon I)X = B}$. The matrix ${A+\varepsilon I}$ fails to be invertible for at most ${n}$ values of ${\varepsilon}$, namely ${\varepsilon = -\lambda_i}$ where ${\lambda_i}$ are the eigenvalues of ${A}$. So it suffices to choose ${a}$ smaller than the smallest absolute value of a nonzero eigenvalue of ${A}$ (if every eigenvalue is zero, any ${a>0}$ works); then ${A+\varepsilon I}$ is invertible for every ${\varepsilon \in (-a,a)}$, ${\varepsilon \neq 0}$, and the unique solution is ${X(\varepsilon)=(A+\varepsilon I)^{-1}B}$. For b) note that ${BX(\varepsilon) = B(A+\varepsilon I)^{-1}B}$, and since ${\text{tr}(XY)=\text{tr}(YX)}$ and ${B^2=I}$ we get ${\text{tr}(BX(\varepsilon))= \text{tr}((A+\varepsilon I)^{-1})}$. Since ${A}$ is diagonalizable, with eigenvalues ${\lambda_1,\dots,\lambda_n}$, we conclude that

$\displaystyle \text{tr}(BX(\varepsilon)) = \sum_{i=1}^n \frac{1}{\lambda_i+\varepsilon}$

Thus, multiplying by ${\varepsilon}$ and letting ${\varepsilon}$ go to zero, each term ${\varepsilon/(\lambda_i+\varepsilon)}$ tends to ${1}$ when ${\lambda_i=0}$ and to ${0}$ when ${\lambda_i\neq 0}$. The limit therefore counts the zero eigenvalues of ${A}$, and since ${A}$ is diagonalizable this number is precisely ${n-\text{rank}(A)}$.
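The computation above is easy to illustrate numerically (a sketch with an arbitrarily chosen example, not the official solution): take a diagonalizable ${A}$ with eigenvalues ${0,0,2,-3}$, so ${n-\text{rank}(A)=2}$, and an involution ${B}$, and watch ${\varepsilon\,\text{tr}(BX(\varepsilon))}$ approach ${2}$.

```python
import numpy as np

n = 4
# diagonalizable A with eigenvalues 0, 0, 2, -3 (so rank A = 2),
# conjugated by an invertible P so it is not simply diagonal
P = np.array([[1., 1., 0., 0.],
              [0., 1., 1., 0.],
              [0., 0., 1., 1.],
              [0., 0., 0., 1.]])
A = P @ np.diag([0., 0., 2., -3.]) @ np.linalg.inv(P)

# B swaps the first two coordinates, so B @ B = I (an involution)
B = np.eye(n)
B[[0, 1]] = B[[1, 0]]

eps = 1e-6
X = np.linalg.solve(A + eps * np.eye(n), B)   # the unique solution X(eps)
val = eps * np.trace(B @ X)                   # eps * tr(B X(eps)), close to 2
```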

3. Classic SEEMOUS stuff. Prove that the identity holds when ${f(x) = x^n}$, then extend by linearity to polynomials, and then, by density (Weierstrass approximation), to continuous functions. Proving it for a monomial might get involved, but initial computations show that the same terms appear when performing the integrations by parts on the two sides…
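As a sanity check (a sketch; the choice ${f=\cos}$ is arbitrary, and any continuous test function would do), the identity can be verified numerically with a simple midpoint rule:

```python
import math

def midpoint(f, a, b, n=200_000):
    """Composite midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

g = lambda x: x * (x - 3) ** 2   # the inner map x(x-3)^2
f = math.cos                     # an arbitrary continuous test function

lhs = midpoint(lambda x: f(g(x)), 0, 4)       # integral over [0, 4]
rhs = 2 * midpoint(lambda x: f(g(x)), 1, 3)   # twice the integral over [1, 3]
```

The two sides agree to within the quadrature error, as the identity predicts.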

4. coming…