Let $X_1, X_2, \ldots$ be mutually independent random variables such that $E(X_n) = 0$ and $E(X_n^2) = \sigma_n^2 = 1$ for all $n = 1, 2, \ldots$. For each $n = 1, 2, \ldots$ let $S_n = \sum\nolimits_{j=1}^{n} X_j$; then, by the Kolmogorov criterion for mutually independent random variables, $S_n / n^{1/2+\alpha} \to 0$ almost surely as $n \to \infty$ for any positive constant $\alpha$. A deeper understanding of this theorem will be facilitated if we know the order of magnitude of $E\{N_\infty(\alpha,\varepsilon)\}$ as $\varepsilon \to 0^+$, where $N_\infty(\alpha,\varepsilon)$ is the integer-valued random variable defined by $N_\infty(\alpha,\varepsilon) = \sum\nolimits_{n=1}^{\infty} \chi_{(\varepsilon n^{1/2+\alpha},\,\infty)}(|S_n|)$.
The present note does this for a wide class of random variables by using Esseen's theorem and the Katz–Petrov theorem.
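To indicate why tail estimates enter (a standard observation, not part of the excerpt above), note that $N_\infty(\alpha,\varepsilon)$ is a sum of indicator variables, so Tonelli's theorem gives
\[
E\{N_\infty(\alpha,\varepsilon)\} \;=\; \sum_{n=1}^{\infty} P\bigl(|S_n| > \varepsilon n^{1/2+\alpha}\bigr),
\]
and the order of magnitude of $E\{N_\infty(\alpha,\varepsilon)\}$ as $\varepsilon \to 0^+$ is therefore governed by estimates of the tail probabilities $P(|S_n| > \varepsilon n^{1/2+\alpha})$. For instance, if the $X_n$ are standard normal, then $S_n/n^{1/2}$ is exactly standard normal and the series reduces to $\sum\nolimits_{n=1}^{\infty} 2\bigl(1 - \Phi(\varepsilon n^{\alpha})\bigr)$, where $\Phi$ is the standard normal distribution function.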