Affiliation:
1. University of California, Berkeley
2. Lawrence Berkeley National Laboratory
3. University of Tokyo
Abstract
Many modern millimeter and submillimeter (“mm-wave”) telescopes for
astronomy are deploying more detectors by increasing the detector
pixel density and, with the rise of lithographed detector
architectures and high-throughput readout techniques, it is becoming
increasingly practical to overfill the focal plane. However, when the
pixel pitch p_pix is small compared to the product of
the wavelength λ and the focal ratio F, or p_pix ≲ 1.2Fλ, the Bose term of the photon noise
correlates between neighboring detector pixels due to the Hanbury
Brown and Twiss (HBT) effect. When this HBT effect is non-negligible,
the array-averaged sensitivity scales with the detector count N_det less favorably than the uncorrelated
limit of N_det^(−1/2). In this paper, we present a general
prescription to calculate this HBT correlation based on a quantum
optics formalism and extend it to polarization-sensitive detectors. We
then estimate the impact of HBT correlations on the sensitivity of a
model mm-wave telescope and discuss the implications for a focal plane
design.
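The scaling argument in the abstract can be illustrated with a standard statistics result (not taken from the paper itself): for N_det detectors with a uniform pairwise noise correlation coefficient ρ, the noise of the array average falls as sqrt((1 + (N_det − 1)ρ)/N_det) relative to a single detector, which recovers N_det^(−1/2) only when ρ = 0. A minimal sketch, with the function name and the example values of ρ chosen purely for illustration:

```python
import numpy as np

def array_noise_scaling(n_det, rho):
    """Noise of the array-averaged timestream relative to one detector,
    assuming a uniform pairwise correlation coefficient rho (illustrative
    model, not the paper's full HBT prescription)."""
    # Variance of the mean of n_det equally correlated noise streams:
    #   var_mean = (1 + (n_det - 1) * rho) / n_det
    return np.sqrt((1 + (n_det - 1) * rho) / n_det)

# Uncorrelated limit recovers the familiar N_det^(-1/2) scaling:
print(array_noise_scaling(100, 0.0))  # → 0.1
# A nonzero Bose-term correlation degrades the scaling:
print(array_noise_scaling(100, 0.2))
```

Note that as n_det grows with ρ fixed, the scaling approaches sqrt(ρ) rather than zero, which is why overfilling the focal plane yields diminishing sensitivity returns once the HBT correlation is non-negligible.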
Funder
Office of Science
Japan Society for the Promotion of
Science
Cited by
1 article.