1. A. Smola and S. Narayanamurthy, "An architecture for parallel topic models," Proc. VLDB Endow., vol. 3, no. 1–2, pp. 703–710, 2010.
2. J. Dean, G. S. Corrado, R. Monga, et al., "Large scale distributed deep networks," in Proc. Int. Conf. Neural Inf. Process. Syst. Red Hook, NY, USA: Curran Associates, 2013, pp. 1223–1231.
3. Douban Paracel. [Online]. Available: http://paracel.io/
4. M. Li, D. G. Andersen, J. W. Park, et al., "Scaling distributed machine learning with the parameter server," in Proc. 11th USENIX Symp. Oper. Syst. Des. Implement. (OSDI), 2014, pp. 583–598.
5. M. Li, Z. Li, and A. Smola, "Parameter server for distributed machine learning," in Proc. Big Learn. NIPS Workshop, 2013, pp. 1–10.