Marcinkiewicz-Zygmund’s SLLN

December 21, 2016

First, we present the well-known Khintchine-Kolmogorov theorem.

Theorem (Khintchine-Kolmogorov). Let X_{1},X_{2},\dots be independent random variables with finite variances. If \sum_{n=1}^{\infty}\mathrm{Var}\left(X_{n}\right)<\infty,
then \sum_{n=1}^{\infty}\left(X_{n}-\mathbb{E}\left[X_{n}\right]\right) converges a.e.
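For a quick illustration (a standard example, not needed later): if \varepsilon_{1},\varepsilon_{2},\dots are i.i.d. random signs with \mathbb{P}\left(\varepsilon_{n}=\pm1\right)=\frac{1}{2}, then \mathbb{E}\left[\varepsilon_{n}/n\right]=0 and

    \[ \sum_{n=1}^{\infty}\mathrm{Var}\left(\frac{\varepsilon_{n}}{n}\right)=\sum_{n=1}^{\infty}\frac{1}{n^{2}}<\infty, \]

so the random harmonic series \sum_{n}\frac{\varepsilon_{n}}{n} converges a.e.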

Now we present the Marcinkiewicz-Zygmund theorem, which is the main object of this article.

Theorem (Marcinkiewicz-Zygmund). Suppose that X_{1},X_{2},\dots are i.i.d. with \mathbb{E}\left|X_{1}\right|^{p}<\infty for some 0<p<2. Then

    \[ \sum_{n=1}^{\infty}\left(\frac{X_{n}}{n^{\frac{1}{p}}}-\mathbb{E}\left[Y_{n}\right]\right)\quad\text{converges a.e.}, \]

where

    \[ Y_{n}=\frac{X_{n}1_{\left\{ \left|X_{n}\right|\le n^{\frac{1}{p}}\right\} }}{n^{\frac{1}{p}}}. \]
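Before the proof, a quick numerical sanity check may help build intuition. The sketch below is only an illustration, not part of the argument: it assumes NumPy, and the choices p=1.5 and a symmetric Pareto law with tail index \alpha=1.7 (so that \mathbb{E}\left|X_{1}\right|^{p}<\infty and, by symmetry, \mathbb{E}\left[Y_{n}\right]=0) are merely convenient. The theorem then predicts that the partial sums of \sum_{n}X_{n}/n^{\frac{1}{p}} settle down almost surely.

    import numpy as np

    # A minimal simulation sketch (assumptions: NumPy available; p = 1.5 and a
    # symmetric Pareto law with tail index alpha = 1.7 are illustrative choices,
    # giving E|X|^p < infinity and, by symmetry, E[Y_n] = 0).
    rng = np.random.default_rng(0)

    alpha, p, N = 1.7, 1.5, 10**6
    u = 1.0 - rng.random(N)                       # uniform on (0, 1]
    signs = rng.choice([-1.0, 1.0], size=N)       # independent random signs
    x = signs * u ** (-1.0 / alpha)               # symmetric Pareto(alpha) sample

    n = np.arange(1, N + 1)
    partial_sums = np.cumsum(x / n ** (1.0 / p))  # partial sums of sum_n X_n / n^(1/p)

    # The tail of the partial-sum sequence should fluctuate less and less.
    for k in (10**3, 10**4, 10**5, 10**6):
        print(k, partial_sums[k - 1])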

Proof. Set

    \[ A_{j}=\left\{ \left(j-1\right)^{\frac{1}{p}}<\left|X_{1}\right|\le j^{\frac{1}{p}}\right\} , \]

so that for each n the sets A_{1},\dots,A_{n} are disjoint and \bigcup_{j=1}^{n}A_{j}=\left\{ 0<\left|X_{1}\right|\le n^{\frac{1}{p}}\right\} .

Then

    \begin{align*} \sum_{n=1}^{\infty}\mathbb{E}\left|Y_{n}\right|^{2} & =\sum_{n=1}^{\infty}n^{-\frac{2}{p}}\mathbb{E}\left[\left|X_{n}\right|^{2}1_{\left\{ \left|X_{n}\right|\le n^{\frac{1}{p}}\right\} }\right]\\ & =\sum_{n=1}^{\infty}\sum_{j=1}^{n}n^{-\frac{2}{p}}\int_{A_{j}}\left|X_{1}\right|^{2}d\mathbb{P}\\ & =\sum_{j=1}^{\infty}\left(\sum_{n=j}^{\infty}n^{-\frac{2}{p}}\right)\int_{A_{j}}\left|X_{1}\right|^{2}d\mathbb{P}\\ & \le\frac{2}{2-p}\sum_{j=1}^{\infty}j^{1-\frac{2}{p}}\int_{A_{j}}\left|X_{1}\right|^{2}d\mathbb{P}, \end{align*}

where the second equality uses that X_{n} and X_{1} are identically distributed together with the partition above, and the third interchanges the order of summation (all terms are nonnegative).

The last inequality comes from the fact

    \begin{align*} \sum_{n=j}^{\infty}n^{-\frac{2}{p}} & \le j^{-\frac{2}{p}}+\int_{j}^{\infty}x^{-\frac{2}{p}}dx\\ & =j^{-\frac{2}{p}}+\frac{p}{2-p}j^{1-\frac{2}{p}}\\ & \le\left(1+\frac{p}{2-p}\right)j^{1-\frac{2}{p}}=\frac{2}{2-p}j^{1-\frac{2}{p}}, \end{align*}

where we used j^{-\frac{2}{p}}\le j^{1-\frac{2}{p}} in the last step.

Moreover, on A_{j} we have \left|X_{1}\right|\le j^{\frac{1}{p}}, hence \left|X_{1}\right|^{2}=\left|X_{1}\right|^{p}\left|X_{1}\right|^{2-p}\le j^{\frac{2}{p}-1}\left|X_{1}\right|^{p}. Combining the estimates,

    \[ \sum_{n=1}^{\infty}\mathbb{E}\left|Y_{n}\right|^{2}\le\frac{2}{2-p}\sum_{j=1}^{\infty}\int_{A_{j}}\left|X_{1}\right|^{p}d\mathbb{P}\le\frac{2}{2-p}\mathbb{E}\left|X_{1}\right|^{p}<\infty. \]

Since \mathrm{Var}\left(Y_{n}\right)\le\mathbb{E}\left|Y_{n}\right|^{2}, the Khintchine-Kolmogorov theorem gives that

    \[ \sum_{n=1}^{\infty}\left(Y_{n}-\mathbb{E}\left[Y_{n}\right]\right) \]

converges a.e. Note that

    \[ \sum_{n=1}^{\infty}\mathbb{P}\left(\frac{X_{n}}{n^{\frac{1}{p}}}\neq Y_{n}\right)=\sum_{n=1}^{\infty}\mathbb{P}\left(\left|X_{1}\right|>n^{\frac{1}{p}}\right)\le\mathbb{E}\left|X_{1}\right|^{p}<\infty. \]
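Here the equality uses that X_{n} and X_{1} are identically distributed, and the inequality is the elementary tail-sum bound applied to the nonnegative variable \left|X_{1}\right|^{p} (note that \mathbb{P}\left(\left|X_{1}\right|>n^{\frac{1}{p}}\right)=\mathbb{P}\left(\left|X_{1}\right|^{p}>n\right)):

    \[ \sum_{n=1}^{\infty}\mathbb{P}\left(\left|X_{1}\right|^{p}>n\right)\le\sum_{n=1}^{\infty}\int_{n-1}^{n}\mathbb{P}\left(\left|X_{1}\right|^{p}>t\right)dt=\int_{0}^{\infty}\mathbb{P}\left(\left|X_{1}\right|^{p}>t\right)dt=\mathbb{E}\left|X_{1}\right|^{p}. \]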

By the Borel-Cantelli lemma, almost surely \frac{X_{n}}{n^{\frac{1}{p}}}=Y_{n} for all but finitely many n. Hence

    \[ \sum_{n=1}^{\infty}\left(\frac{X_{n}}{n^{\frac{1}{p}}}-\mathbb{E}\left[Y_{n}\right]\right) \]

converges a.e. This completes the proof.
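Remark. Combined with Kronecker's lemma (if b_{n}\uparrow\infty and \sum_{n}x_{n} converges, then b_{n}^{-1}\sum_{k=1}^{n}b_{k}x_{k}\to0), applied with b_{n}=n^{\frac{1}{p}} and x_{n}=\frac{X_{n}}{n^{\frac{1}{p}}}-\mathbb{E}\left[Y_{n}\right], the theorem yields the strong-law form that gives it its name:

    \[ \frac{1}{n^{\frac{1}{p}}}\sum_{k=1}^{n}\left(X_{k}-\mathbb{E}\left[X_{1}1_{\left\{ \left|X_{1}\right|\le k^{\frac{1}{p}}\right\} }\right]\right)\longrightarrow0\quad\text{a.e.} \]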

Now we give an application of this theorem.

Problem. Given an i.i.d. sequence of random variables X_{n}, assume \mathbb{E}X_{n}=0 and

    \[ \mathbb{E}\left|X_{n}\right|\log\left(1+\left|X_{n}\right|\right)<\infty. \]

Show that \sum_{n}\frac{X_{n}}{n} converges almost surely.

Proof. Note first that X_{n}\in L^{1}. Indeed, since \log\left(1+\left|X_{n}\right|\right)\ge\log2 on \left\{ \left|X_{n}\right|>1\right\} ,

    \begin{align*} \mathbb{E}\left|X_{n}\right| & =\int_{\left\{ \left|X_{n}\right|\le1\right\} }\left|X_{n}\right|d\mathbb{P}+\int_{\left\{ \left|X_{n}\right|>1\right\} }\left|X_{n}\right|d\mathbb{P}\\ & \le\mathbb{P}\left\{ \left|X_{n}\right|\le1\right\} +\frac{1}{\log2}\int_{\left\{ \left|X_{n}\right|>1\right\} }\left|X_{n}\right|\log\left(1+\left|X_{n}\right|\right)d\mathbb{P}\\ & \le1+\frac{1}{\log2}\mathbb{E}\left|X_{n}\right|\log\left(1+\left|X_{n}\right|\right)<\infty. \end{align*}

Hence it suffices to show

(1)   \begin{equation*} \sum_{n=1}^{\infty}\frac{\mathbb{E}\left|X_{1}\right|1_{\left\{ \left|X_{1}\right|\ge n\right\} }}{n}<\infty. \end{equation*}

Indeed, suppose (1) holds. Since \mathbb{E}X_{1}=0, we have \mathbb{E}X_{1}1_{\left\{ \left|X_{1}\right|\le n\right\} }=-\mathbb{E}X_{1}1_{\left\{ \left|X_{1}\right|>n\right\} }, and therefore

    \begin{align*} \sum_{n=1}^{\infty}\left|\frac{\mathbb{E}X_{1}1_{\left\{ \left|X_{1}\right|\le n\right\} }}{n}\right| & =\sum_{n=1}^{\infty}\left|\frac{\mathbb{E}X_{1}1_{\left\{ \left|X_{1}\right|>n\right\} }}{n}\right|\\ & \le\sum_{n=1}^{\infty}\frac{\mathbb{E}\left|X_{1}\right|1_{\left\{ \left|X_{1}\right|>n\right\} }}{n}\\ & \le\sum_{n=1}^{\infty}\frac{\mathbb{E}\left|X_{1}\right|1_{\left\{ \left|X_{1}\right|\ge n\right\} }}{n}<\infty. \end{align*}

On the other hand, the Marcinkiewicz-Zygmund theorem with p=1 gives that

    \[ \sum_{n=1}^{\infty}\left(\frac{X_{n}-\mathbb{E}X_{n}1_{\left\{ \left|X_{n}\right|\le n\right\} }}{n}\right) \]

converges almost surely. Adding to this the absolutely convergent series \sum_{n=1}^{\infty}\frac{\mathbb{E}X_{1}1_{\left\{ \left|X_{1}\right|\le n\right\} }}{n}, we conclude that \sum_{n}\frac{X_{n}}{n} converges almost surely.

It remains to show that (1) holds. Let E_{k}=\left\{ k\le\left|X_{1}\right|<k+1\right\} .
Then observe that

    \begin{align*} \sum_{n=1}^{\infty}\frac{\mathbb{E}\left|X_{1}\right|1_{\left\{ \left|X_{1}\right|\ge n\right\} }}{n} & =\sum_{n=1}^{\infty}\sum_{k=n}^{\infty}\frac{\mathbb{E}\left|X_{1}\right|1_{E_{k}}}{n}\\ & =\sum_{k=1}^{\infty}\left(\sum_{n=1}^{k}\frac{1}{n}\right)\mathbb{E}\left|X_{1}\right|1_{E_{k}}\\ & \le C\sum_{k=1}^{\infty}\log\left(1+k\right)\mathbb{E}\left|X_{1}\right|1_{E_{k}}\\ & \le C\,\mathbb{E}\left|X_{1}\right|\log\left(1+\left|X_{1}\right|\right)<\infty, \end{align*}

where we used \sum_{n=1}^{k}\frac{1}{n}\le1+\log k\le C\log\left(1+k\right) for some constant C>0, and \log\left(1+k\right)\le\log\left(1+\left|X_{1}\right|\right) on E_{k}.

So we are done.
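As with the theorem itself, a small simulation may serve as a sanity check. Again this is only a sketch under illustrative assumptions: it uses NumPy, and the symmetric Pareto law with tail index \alpha=1.2 is just one convenient choice with mean zero and \mathbb{E}\left|X_{1}\right|\log\left(1+\left|X_{1}\right|\right)<\infty, so the problem predicts that \sum_{n}\frac{X_{n}}{n} converges almost surely.

    import numpy as np

    # A minimal sketch of the problem's conclusion (assumptions: NumPy available;
    # the symmetric Pareto law with tail index alpha = 1.2 has mean zero and
    # E|X| log(1 + |X|) < infinity, so sum_n X_n / n should converge a.s.).
    rng = np.random.default_rng(1)

    alpha, N = 1.2, 10**6
    u = 1.0 - rng.random(N)                  # uniform on (0, 1]
    signs = rng.choice([-1.0, 1.0], size=N)  # independent random signs
    x = signs * u ** (-1.0 / alpha)          # symmetric Pareto(alpha) sample

    n = np.arange(1, N + 1)
    partial_sums = np.cumsum(x / n)          # partial sums of sum_n X_n / n

    for k in (10**3, 10**4, 10**5, 10**6):
        print(k, partial_sums[k - 1])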
