Inner Product On The Vector Space Of Polynomials

(a) Show that a linear functional $l$ lies in $P_{n,2d}^{*}$ if and only if there exists a measure $\mu$ supported on the unit sphere $\mathbb{S}^{n-1}$ such that

(1)
\begin{align} l(f) = \int_{\mathbb{S}^{n-1}} f d\mu . \end{align}

(b) For a form $f\in\mathbb{R}[x]_{n,d}$ define $\partial f$ to be the corresponding differential operator. Show that $\langle f,g\rangle = \frac{1}{d!}\partial f(g)$ is a valid inner product on $\mathbb{R}[x]_{n,d}$.
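
As a concrete illustration (reading $\partial f$ as the operator obtained by substituting $\partial/\partial x_i$ for each $x_i$, which is how it is used in the computations below): for $f = x_1 x_2^{2} \in \mathbb{R}[x]_{2,3}$ we get $\partial f = \frac{\partial^{3}}{\partial x_1 \partial x_2^{2}}$, hence $\langle f, f\rangle = \frac{1}{3!}\,\partial f(x_1 x_2^{2}) = \frac{2}{6} = \frac{1}{3}$.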

(c) Using the inner product from part (b) we can identify $\mathbb{R}[x]_{n,d}$ with $\mathbb{R}[x]^{*}_{n,d}$ by identifying a linear functional $l$ with the form $f_l$ such that $l(g)=\langle f_l,g\rangle$ for all $g\in\mathbb{R}[x]_{n,d}$. Show that for a point evaluation $l_v$, $v=(v_1,\ldots,v_n)\in\mathbb{R}^n$, the corresponding form is $(v_1 x_1 + \dots + v_n x_n)^d = \langle v,x\rangle^d$.


(a) Let $\mu$ be a measure with these properties. Then $l(f) = \int_{\mathbb{S}^{n-1}} f \, d\mu$ is linear in $f$, since the integral is linear in $f$, and, by monotonicity of the integral, $l(f) = \int_{\mathbb{S}^{n-1}} f \, d\mu \geq 0$ for all $f\in P_{n,2d}$; hence $l\in P_{n,2d}^{*}$.

Conversely, let $l\in P_{n,2d}^{*}$. Then $l$ is a nonnegative combination of point evaluations, $l=\lambda_1 l_{v_1} + \dots + \lambda_m l_{v_m}$ with $\lambda_1,\ldots,\lambda_m\geq 0$ and $v_1,\ldots,v_m\in\mathbb{S}^{n-1}$, since the dual cone $P_{n,2d}^{*}$ is the conical hull of the point evaluations on the sphere. Then

(2)
\begin{align} l(f) = \lambda_1 f(v_1) + \dots + \lambda_m f(v_m) = \sum_{i=1}^{m} \lambda_i \int_{\mathbb{S}^{n-1}} f \, d\delta_{v_i}, \end{align}

where $\delta_{v_i}$ denotes the Dirac measure on $\mathbb{S}^{n-1}$ concentrated at $v_i$.
Define $\mu = \sum_i \lambda_i\delta_{v_i}$ as the sum of the (rescaled) Dirac point measures. Since all $\lambda_i \geq 0$, $\mu$ is a measure supported on $\mathbb{S}^{n-1}$, and by (2) it is the required measure.
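
As a small example of this construction: for $n=2$, $l = l_{e_1} + 2\, l_{e_2}$ with $e_1=(1,0)$, $e_2=(0,1) \in \mathbb{S}^{1}$, the measure $\mu = \delta_{e_1} + 2\,\delta_{e_2}$ satisfies $\int_{\mathbb{S}^{1}} f \, d\mu = f(1,0) + 2 f(0,1) = l(f)$ for every form $f$.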

(b) We have to show positive-definiteness, symmetry and linearity (in one argument; linearity in the other then follows by symmetry).

Write $f = \sum_{\alpha} c_{\alpha} x^{\alpha}$ with $|\alpha| = \alpha_1+\dots+\alpha_n = d$, and abbreviate $\partial^{\alpha} = \frac{\partial^{|\alpha|}}{\partial x_1^{\alpha_1}\cdots\partial x_n^{\alpha_n}}$, so that $\partial f = \sum_{\alpha} c_{\alpha}\, \partial^{\alpha}$.

Positive-definiteness: by (3) below, $\partial^{\alpha}(f) = \alpha!\, c_{\alpha}$ (where $\alpha! = \alpha_1!\cdots\alpha_n!$), so $\langle f,f\rangle = \frac{1}{d!} \sum_{\alpha} c_{\alpha}\, \partial^{\alpha}(f) = \frac{1}{d!} \sum_{\alpha} \alpha!\, c_{\alpha}^2 \geq 0$ with equality iff $c_{\alpha}=0$ for all $\alpha$, i.e. iff $f=0$.
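
For example, as a quick sanity check of this formula: take $n=2$, $d=2$ and $f = x_1^{2} + x_1 x_2$. Then $\partial f = \frac{\partial^{2}}{\partial x_1^{2}} + \frac{\partial^{2}}{\partial x_1 \partial x_2}$ and $\partial f(f) = 2 + 1 = 3$, so $\langle f,f\rangle = \frac{3}{2}$, in agreement with $\frac{1}{2!}\left(2!\cdot 1^{2} + 1!\,1!\cdot 1^{2}\right) = \frac{3}{2}$.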

Symmetry: Let $g=\sum_{\beta} b_{\beta} x^{\beta}$ and let $\alpha$ be an exponent vector of $f$. Then $\partial^{\alpha}(g) = \sum_{\beta} b_{\beta}\, \partial^{\alpha}(x^{\beta})$ and

(3)
\begin{align} \partial^{\alpha}(x^{\beta}) = \begin{cases} 0 & \text{if } \alpha_i>\beta_i \text{ for some } i, \\ \alpha! & \text{if } \alpha=\beta. \end{cases} \end{align}

(These are the only two cases: if $\alpha\neq\beta$, then $\alpha_i>\beta_i$ for some $i$, because $\alpha_i\leq\beta_i$ for all $i$ together with $\alpha\neq\beta$ would give $|\alpha|<|\beta|$, contradicting $|\alpha|=|\beta|=d$.)
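For instance, with $n=2$ and $\alpha=(1,1)$: $\partial^{(1,1)}(x_1 x_2) = 1!\,1! = 1$, while $\partial^{(1,1)}(x_1^{2}) = 0$ since $\alpha_2 = 1 > 0 = \beta_2$.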
Therefore, $\langle f,g\rangle = \frac{1}{d!} \sum_{\alpha} \alpha!\, c_{\alpha} b_{\alpha}$. In the same way we get $\langle g,f\rangle = \frac{1}{d!} \sum_{\beta} \beta!\, b_{\beta} c_{\beta}$, i.e. $\langle f,g\rangle = \langle g,f\rangle$.

Linearity in the second argument is immediate since differentiation is linear, so $\langle\cdot,\cdot\rangle$ is indeed an inner product on $\mathbb{R}[x]_{n,d}$.
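
For part (c), a short sketch based on the coefficient formula above: by the multinomial theorem,

(4)
\begin{align} \langle v,x\rangle^{d} = \sum_{|\alpha|=d} \frac{d!}{\alpha!}\, v^{\alpha} x^{\alpha}, \end{align}

so for any $g = \sum_{\beta} b_{\beta} x^{\beta} \in \mathbb{R}[x]_{n,d}$,

(5)
\begin{align} \langle \langle v,x\rangle^{d}, g\rangle = \frac{1}{d!} \sum_{\alpha} \alpha!\cdot\frac{d!}{\alpha!}\, v^{\alpha} b_{\alpha} = \sum_{\alpha} b_{\alpha} v^{\alpha} = g(v) = l_v(g), \end{align}

i.e. $f_{l_v} = \langle v,x\rangle^{d}$, as claimed.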
