Abstract
The Johnson-Lindenstrauss transform is a dimensionality reduction technique with a wide range of applications to theoretical computer science. It is specified by a distribution over projection matrices from R^n → R^k, where k ≪ n, and states that k = O(ε^{-2} log(1/δ)) dimensions suffice to approximate the norm of any fixed vector in R^n to within a factor of 1 ± ε with probability at least 1 − δ. In this article, we show that this bound on k is optimal up to a constant factor, improving upon a previous Ω((ε^{-2} log(1/δ))/log(1/ε)) dimension bound of Alon. Our techniques are based on lower bounding the information cost of a novel one-way communication game and yield the first space lower bounds in a data stream model that depend on the error probability δ.
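As a concrete illustration of the guarantee above, the following sketch uses a Gaussian random projection, one standard instantiation of the JL distribution; the constant 8 in the choice of k and the helper name jl_project are illustrative assumptions, not taken from the article.

```python
# Minimal sketch of a Johnson-Lindenstrauss projection via a Gaussian matrix.
# Assumes k = O(eps^{-2} log(1/delta)) with an illustrative constant of 8.
import numpy as np

def jl_project(x, eps, delta, seed=0):
    rng = np.random.default_rng(seed)
    n = x.shape[0]
    k = int(np.ceil(8 * np.log(1.0 / delta) / eps**2))
    # Entries are i.i.d. N(0, 1/k), so E[||Ax||^2] = ||x||^2.
    A = rng.normal(0.0, 1.0 / np.sqrt(k), size=(k, n))
    return A @ x

# The embedded norm is within a 1 +/- eps factor of the original norm
# with probability at least 1 - delta over the draw of A.
x = np.ones(1000)
y = jl_project(x, eps=0.1, delta=0.01)
print(np.linalg.norm(y) / np.linalg.norm(x))  # close to 1
```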
For many streaming problems, the most naïve way of achieving error probability δ is to first achieve constant probability, then take the median of O(log(1/δ)) independent repetitions. Our techniques show that for a wide range of problems, this is in fact optimal! As an example, we show that estimating the ℓ_p-distance for any p ∈ [0,2] requires Ω(ε^{-2} log n log(1/δ)) space, even for vectors in {0,1}^n. This is optimal in all parameters and closes a long line of work on this problem. We also show that estimating the number of distinct elements requires Ω(ε^{-2} log(1/δ) + log n) space, which is optimal if ε^{-2} = Ω(log n). We also improve previous lower bounds for entropy in the strict turnstile and general turnstile models by a multiplicative factor of Ω(log(1/δ)). Finally, we give an application to one-way communication complexity under product distributions, showing that, unlike the case of constant δ, the VC-dimension does not characterize the complexity when δ = o(1).
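The median-of-repetitions amplification referred to above can be sketched as follows; the base estimator, the success probability 3/4, and the constant 36 are illustrative assumptions rather than constructions from the article.

```python
# Minimal sketch of naive error amplification: run a constant-success-probability
# estimator O(log(1/delta)) times independently and report the median.
import numpy as np

def amplify(base_estimator, delta, seed=0):
    rng = np.random.default_rng(seed)
    # By a Chernoff bound, the median of c*log(1/delta) repetitions of an
    # estimator that is within the target error with probability >= 3/4
    # fails with probability at most delta; c = 36 is an illustrative constant.
    reps = int(np.ceil(36 * np.log(1.0 / delta)))
    return np.median([base_estimator(rng) for _ in range(reps)])

# Toy base estimator: returns 100 +/- 10 with probability about 0.79.
true_value = 100.0
noisy = lambda rng: true_value + rng.normal(0.0, 8.0)

print(amplify(noisy, delta=1e-6))  # near 100 with probability >= 1 - delta
```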
Publisher
Association for Computing Machinery (ACM)
Subject
Mathematics (miscellaneous)
Cited by
28 articles.
1. Streaming Euclidean k-median and k-means with o(log n) Space. 2023 IEEE 64th Annual Symposium on Foundations of Computer Science (FOCS), 2023-11-06.
2. Quantum communication complexity of linear regression. ACM Transactions on Computation Theory, 2023-09-22.
3. Better Cardinality Estimators for HyperLogLog, PCSA, and Beyond. Proceedings of the 42nd ACM SIGMOD-SIGACT-SIGAI Symposium on Principles of Database Systems, 2023-06-18.
4. RidgeSketch: A Fast Sketching Based Solver for Large Scale Ridge Regression. SIAM Journal on Matrix Analysis and Applications, 2022-08-22.
5. Universal Streaming of Subset Norms. THEOR COMPUT, 2022.