Single-step discretization methods are considered for equations of the form
$u_t + Au = f(t,u)$
, where A is a linear positive definite operator in a Hilbert space H. It is shown that if the method is consistent with the differential equation, then the convergence is essentially of first order in the stepsize, even when the initial data v are only in H. In contrast to the situation in the linear homogeneous case, however, higher-order convergence is not possible in general without further assumptions on v.
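The first-order convergence behavior for a consistent single-step method can be illustrated with a minimal numerical sketch: backward Euler applied to a scalar instance of $u_t + Au = f(t,u)$ with a manufactured exact solution, where halving the stepsize roughly halves the error at a fixed time. The concrete choices below (A = 2, exact solution u(t) = cos t, and the particular f) are illustrative assumptions, not taken from the paper, and smooth initial data are used, so the generic first-order rate is visible.

```python
import math

def backward_euler_errors(A=2.0, T=1.0, steps_list=(80, 160, 320)):
    """Backward Euler for u_t + A*u = f(t, u), scalar model problem.

    Manufactured exact solution u(t) = cos(t); we take
        f(t, u) = u + g(t),  g(t) = -sin(t) + (A - 1)*cos(t),
    so that u' + A*u = f(t, u) holds exactly.
    (A, u, and f are illustrative assumptions, not from the paper.)
    Returns |u_N - u(T)| for each stepcount in steps_list.
    """
    errors = []
    for n in steps_list:
        k = T / n
        u = 1.0  # u(0) = cos(0); smooth initial data
        for i in range(1, n + 1):
            t = i * k  # implicit stage time t_{n+1}
            g = -math.sin(t) + (A - 1.0) * math.cos(t)
            # Implicit step: u_new + k*A*u_new = u + k*(u_new + g(t))
            #            =>  u_new = (u + k*g) / (1 + k*(A - 1))
            u = (u + k * g) / (1.0 + k * (A - 1.0))
        errors.append(abs(u - math.cos(T)))
    return errors

errs = backward_euler_errors()
ratios = [errs[i] / errs[i + 1] for i in range(len(errs) - 1)]
print(ratios)  # each ratio is close to 2: first-order convergence in k
```

Doubling the number of steps halves k, and the printed error ratios cluster near 2, the signature of O(k) convergence; the abstract's point is that this rate persists for data merely in H, while anything better requires extra assumptions on v.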