[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.2.10

(1). The numerical radius defines a norm on $\scrL(\scrH)$.

(2). $w(UAU^*)=w(A)$ for all $U\in \U(n)$.

(3). $w(A)\leq \sen{A}\leq 2w(A)$ for all $A$.

(4). $w(A)=\sen{A}$ if (but not only if) $A$ is normal.

Solution.

(1). Nonnegativity, homogeneity, and the triangle inequality follow directly from the corresponding properties of $\sev{\cdot}$ and of the supremum; it remains to check that $w(A)=0$ forces $A=0$. Since $\sef{x,Ax}=0$ for all unit vectors $x$ implies $\sef{z,Az}=0$ for every $z$ (by scaling), the polarization identity gives $$\beex \bea w(A)=0&\ra \sef{x,Ax}=0,\quad \forall\ x:\sen{x}=1\\ &\ra \sef{y,Ax}=\frac{1}{4} \sum_{k=0}^3 i^k\sef{x+i^ky,A(x+i^ky)}=0,\quad\forall\ x,y\\ &\ra Ax=0,\quad \forall\ x\quad\sex{\mbox{take }y=Ax}\\ &\ra A=0. \eea \eeex$$
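
For completeness, the polarization identity used above can be verified by a direct expansion (the inner product is conjugate-linear in its first argument, as in the rest of these notes): $$\beex \bea \frac{1}{4}\sum_{k=0}^3 i^k\sef{x+i^ky,A(x+i^ky)} &=\frac{1}{4}\sum_{k=0}^3 i^k\sex{\sef{x,Ax}+i^k\sef{x,Ay}+i^{-k}\sef{y,Ax}+\sef{y,Ay}}\\ &=\frac{1}{4}\sex{\sef{x,Ax}\sum_{k=0}^3 i^k+\sef{x,Ay}\sum_{k=0}^3 i^{2k}+4\sef{y,Ax}+\sef{y,Ay}\sum_{k=0}^3 i^k}\\ &=\sef{y,Ax}, \eea \eeex$$ since $\sum_{k=0}^3 i^k=\sum_{k=0}^3 i^{2k}=0$.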

(2). $$\beex \bea w(UAU^*)&=\sup_{\sen{x}=1}\sev{\sef{x,UAU^*x}}\\ &=\sup_{\sen{x}=1}\sev{(U^*x)^*A(U^*x)}\\ &=\sup_{\sen{y}=1}\sev{y^*Ay}\quad\sex{y=U^*x\mbox{ ranges over all unit vectors}}\\ &=w(A). \eea \eeex$$

(3). By the Cauchy-Schwarz inequality, $$\beex \bea w(A)&=\sup_{\sen{x}=1}\sev{\sef{x,Ax}}\\ &\leq \sup_{\sen{x}=1} \sex{\sen{x}\cdot \sen{Ax}}\\ &=\sup_{\sen{x}=1}\sen{Ax}\\ &=\sen{A}. \eea \eeex$$ For the second inequality, the polarization identity from (1) and the bound $\sev{\sef{z,Az}}\leq w(A)\sen{z}^2$ (valid for every $z$, by scaling) give $$\beex \bea \sen{A}&=\sup_{\sen{x}=\sen{y}=1}\sev{\sef{y,Ax}}\\ &=\sup_{\sen{x}=\sen{y}=1} \sev{\frac{1}{4}\sum_{k=0}^3 i^k\sef{x+i^ky,A(x+i^ky)}}\\ &\leq \sup_{\sen{x}=\sen{y}=1} \frac{1}{4}\sum_{k=0}^3 \sev{\sef{x+i^ky,A(x+i^ky)}}\\ &\leq \sup_{\sen{x}=\sen{y}=1} \frac{1}{4}\sum_{k=0}^3 \sen{x+i^ky}^2\cdot w(A)\\ &=\sup_{\sen{x}=\sen{y}=1} \frac{1}{4}\cdot 4\sex{\sen{x}^2+\sen{y}^2} \cdot w(A)\quad\sex{\mbox{the cross terms cancel, since }\sum_{k=0}^3i^k=0}\\ &=2w(A). \eea \eeex$$
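
The constant $2$ in (3) cannot be improved, a standard example being the nilpotent Jordan block $$\bex J=\sex{\ba{cc}0&1\\0&0\ea}. \eex$$ For a unit vector $x=(x_1,x_2)$ we have $Jx=(x_2,0)$ and $\sef{x,Jx}=\bar x_1x_2$, so $$\bex \sen{J}=\sup_{|x_1|^2+|x_2|^2=1}\sev{x_2}=1,\qquad w(J)=\sup_{|x_1|^2+|x_2|^2=1}\sev{\bar x_1x_2}=\frac{1}{2}, \eex$$ and hence $\sen{J}=2w(J)$.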

(4). If $A$ is normal, then by the spectral theorem there exists a unitary $U$ such that $$\bex A=U\diag(\lm_1,\cdots,\lm_n)U^*, \eex$$ and hence $$\beex \bea \sen{Ax}^2&=\sef{Ax,Ax}\\ &=x^*A^*Ax\\ &=x^*U\diag(|\lm_1|^2,\cdots,|\lm_n|^2)U^*x\\ &=\sum_{i=1}^n |\lm_i|^2|y_i|^2\quad\sex{y=U^*x}\\ &\leq \max_i\sev{\lm_i}^2\sen{y}^2\\ &=\max_i\sev{\lm_i}^2\sen{x}^2\\ &\leq w(A)^2\sen{x}^2. \eea \eeex$$ The last inequality holds because every eigenvalue lies in the numerical range: if $Au=\lm_iu$ with $\sen{u}=1$, then $\sef{u,Au}=\lm_i$, so $\sev{\lm_i}\leq w(A)$. Hence $\sen{A}\leq w(A)$, which together with (3) gives $w(A)=\sen{A}$.
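
As stated in (4), normality is sufficient but not necessary. A standard non-normal example is $$\bex A=\sex{\ba{ccc}0&1&0\\0&0&0\\0&0&1\ea}. \eex$$ For a unit vector $x=(x_1,x_2,x_3)$ one has $\sen{Ax}^2=|x_2|^2+|x_3|^2\leq 1$, with equality when $x_1=0$, so $\sen{A}=1$; on the other hand $w(A)\geq\sev{\sef{e_3,Ae_3}}=1$, while $w(A)\leq\sen{A}=1$ by (3). Thus $w(A)=\sen{A}=1$ even though $AA^*=\diag(1,0,1)\neq\diag(0,1,1)=A^*A$.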
