Abstract
We use Janson's dependency criterion to prove that the distribution of $d$-descents of permutations of length $n$ converges to a normal distribution as $n$ goes to infinity. We show that this remains true even if $d$ is allowed to grow with $n$.
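For readers who want to experiment with the statistic, here is a minimal sketch that tabulates the distribution of $d$-descents over all permutations of a small $n$. It assumes the standard definition of a $d$-descent as a pair of positions $i < j \le i + d$ with $\pi(i) > \pi(j)$ (so $d = 1$ recovers ordinary descents and $d = n - 1$ counts inversions); the function name and parameters are illustrative, not taken from the paper.

```python
from itertools import permutations

def d_descents(perm, d):
    """Count d-descents of a permutation given as a sequence of distinct values.

    Assumed definition: a pair of positions (i, j), 0-indexed here, with
    i < j <= i + d and perm[i] > perm[j]. For d = 1 this reduces to ordinary
    descents; for d = len(perm) - 1 it counts all inversions.
    """
    n = len(perm)
    return sum(
        1
        for i in range(n)
        for j in range(i + 1, min(i + d, n - 1) + 1)
        if perm[i] > perm[j]
    )

if __name__ == "__main__":
    # Tabulate the distribution over all permutations of {1, ..., n} for a
    # small n; the counts already hint at the bell shape proved in the paper.
    n, d = 6, 2
    counts = {}
    for p in permutations(range(1, n + 1)):
        k = d_descents(p, d)
        counts[k] = counts.get(k, 0) + 1
    for k in sorted(counts):
        print(f"{k:2d} d-descents: {counts[k]} permutations")
```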
Publisher
The Electronic Journal of Combinatorics
Subject
Computational Theory and Mathematics, Geometry and Topology, Theoretical Computer Science, Applied Mathematics, Discrete Mathematics and Combinatorics
Cited by
3 articles.