Theorem. Assume that the functions $x(\cdot)$, $h(\cdot)$, $G(\cdot)$
satisfy: (i)
$0 \leqslant x(t)$, $t \in [0,\infty)$; $x(t) \to 0$
as $t \to \infty$; $x$
bounded, measurable; (ii)
$0 \leqslant h(s)$; $h(s)$ Lipschitz continuous for $s \in I$
, where $I$ is a closed interval containing the range of
$x$; $h(0) = 0$, $h'(0+) = 1$, $h''(0+) < 0$
; (iii) $G$ a probability distribution on $(0,\infty)$ having a nontrivial absolutely continuous component and finite second moment. Let
\[
Hx(t) = \int_0^t h[x(t - y)]\,dG(y)
\]
. If $0 \leqslant (x - Hx)(t) = o(t^{-2})$
, with strict inequality on the left on a set of positive measure, then
$x(t) \sim \gamma/t$ as $t \to \infty$
, where $\gamma$ is a constant depending only on $h$ and $G$. The condition
$o(t^{-2})$
is close to best possible, and cannot, e.g., be replaced by
$O(t^{-2})$.