Abstract
We consider the multi-valued problem of finding all solutions of the equation f(x) = 0 in the space of functions f : [0, 1] → R whose r-th derivative f^(r), r ∈ {0, 1, 2, . . .}, exists and is Hölder continuous with exponent ϱ ∈ (0, 1]. Available algorithms use information about the values of f and/or its derivatives at n adaptively selected points, and the error between the true solution set Z(f) and the approximate solution Z_n(f) is measured by the Hausdorff distance d_H(Z(f), Z_n(f)). We show that, although the worst-case error of any algorithm is infinite, it is possible to construct nonadaptive approximations Z_n^* such that the error d_H(Z(f), Z_n^*(f)) converges to zero as n → +∞. However, the convergence can be arbitrarily slow. Specifically, for an arbitrary sequence of approximations {Z_n}_{n≥1} that use n adaptively chosen values of the function and/or its derivatives, and for an arbitrary positive sequence {τ_n}_{n≥1} converging to zero, there are functions f in our space such that sup_{n≥1} τ_n^{-1} d_H(Z(f), Z_n(f)) = +∞. We conjecture that the same lower bound holds if we allow information consisting of the values of n arbitrary, adaptively selected linear functionals at f.
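To make the error measure concrete, the following minimal sketch (not from the paper; the sets and values are illustrative assumptions) computes the Hausdorff distance d_H between two finite sets of reals, such as a true zero set Z(f) and an approximation Z_n(f):

```python
def hausdorff(A, B):
    """Hausdorff distance between two nonempty finite sets of reals:
    the larger of the two directed distances max_x min_y |x - y|."""
    def directed(X, Y):
        return max(min(abs(x - y) for y in Y) for x in X)
    return max(directed(A, B), directed(B, A))

# Hypothetical example: a spurious extra zero in the approximation
# dominates the distance, since every point of each set must be
# close to the other set.
Z_true = [0.25, 0.75]
Z_approx = [0.24, 0.74, 0.99]
print(hausdorff(Z_true, Z_approx))  # ≈ 0.24, driven by the point 0.99
```

Note that d_H is symmetric: it penalizes both missed zeros and spurious ones, which is what makes the approximation problem for the full set Z(f) delicate.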
Publisher
Ivan Franko National University of Lviv