The Kolmogorov distances between a symmetric hypergeometric law with standard deviation $\sigma$ and its usual normal approximations are computed and shown to be less than $1/(\sqrt{8\pi}\,\sigma)$, with the order $1/\sigma$ and the constant $1/\sqrt{8\pi}$ being optimal. The results of Hipp and Mattner (2007) for symmetric binomial laws are obtained as special cases.
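As a rough numerical illustration of the stated bound (a sketch, not taken from the paper: the symmetric parameters $N=20$, $K=10$, $n=10$ and the plain, uncorrected approximation $\Phi((x-\mu)/\sigma)$ are illustrative assumptions), one can compare the exact Kolmogorov distance of a symmetric hypergeometric law with $1/(\sqrt{8\pi}\,\sigma)$:

```python
# Illustrative sketch (not from the paper): Kolmogorov distance between a
# symmetric hypergeometric law and the uncorrected normal approximation
# Phi((x - mu)/sigma), compared against the bound 1/(sqrt(8*pi)*sigma).
from math import comb, erf, sqrt, pi

def hypergeom_pmf(N, K, n, k):
    """P(X = k) when drawing n without replacement from N items, K marked."""
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

def normal_cdf(x, mu, sigma):
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

def kolmogorov_distance(N, K, n):
    """sup_x |F(x) - Phi((x - mu)/sigma)|, F the hypergeometric CDF.

    F is a step function, so the supremum is attained at the jump points k,
    comparing both F(k) and the left limit F(k-) against Phi there.
    """
    mu = n * K / N
    var = n * (K / N) * (1 - K / N) * (N - n) / (N - 1)
    sigma = sqrt(var)
    F = 0.0   # running CDF value; starts as the left limit at k = 0
    dist = 0.0
    for k in range(n + 1):
        dist = max(dist, abs(F - normal_cdf(k, mu, sigma)))  # left limit
        F += hypergeom_pmf(N, K, n, k)
        dist = max(dist, abs(F - normal_cdf(k, mu, sigma)))  # value F(k)
    return dist, sigma

# Assumed symmetric example: K = N/2, so the law is symmetric about n/2.
d, sigma = kolmogorov_distance(20, 10, 10)
bound = 1.0 / (sqrt(8.0 * pi) * sigma)
print(f"distance = {d:.5f}, bound = {bound:.5f}")
```

For these parameters the distance falls just below the bound, consistent with the claimed sharpness of the constant $1/\sqrt{8\pi}$.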
Connections to Berry–Esseen type results in more general situations concerning sums of simple random samples or Bernoulli convolutions are explained.
Auxiliary results of independent interest include rather sharp normal distribution function inequalities, a simple identifiability result for hypergeometric laws, and some remarks related to Lévy’s concentration-variance inequality.