Affiliation:
1. Samsung Electronics Co., Ltd.
Abstract
As mobile phones become increasingly multifunctional, the number and size of applications installed on phones are rapidly increasing. Consequently, mobile phones require more hardware resources such as NOR/NAND flash memory and DRAM, and their production cost rises accordingly. One candidate solution for reducing production cost is demand paging using the MMU. However, demand paging causes unpredictably long page fault latency, so mobile phone manufacturers are reluctant to deploy this scheme. In this paper, we present a method that reduces the long latency of page faults by handling them in a parallelized manner, taking the characteristics of NAND-type flash memory into account. We also discuss how to modify existing page cache replacement policies so that they can exploit the benefits of the parallelized page fault handler. Experimental results show that the parallelized page fault handler improves the worst-case page fault latency significantly, by up to roughly 20%, and that the modified page cache replacement policies improve both the average and worst-case instruction fetch times.
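As a rough illustration of the parallelization idea described in the abstract (not the paper's actual implementation), the sketch below overlaps the two NAND operations a page fault can trigger: writing back a dirty victim page and reading the faulted page. The functions nand_read_page() and nand_write_page() are hypothetical stand-ins for flash driver calls, and the threading is only a user-space approximation of concurrent handling.

```c
/*
 * Minimal sketch, assuming a fault handler that must both evict a dirty
 * victim page and load the faulted page from NAND. The two NAND
 * operations are overlapped instead of being performed sequentially.
 */
#include <pthread.h>
#include <stdio.h>
#include <string.h>

#define PAGE_SIZE 2048  /* typical small-block NAND page size */

static void nand_read_page(unsigned page_no, char *buf) {
    /* placeholder: a real driver would issue a NAND read command here */
    memset(buf, 0, PAGE_SIZE);
    printf("read  NAND page %u\n", page_no);
}

static void nand_write_page(unsigned page_no, const char *buf) {
    /* placeholder: a real driver would issue a NAND program command here */
    (void)buf;
    printf("write NAND page %u\n", page_no);
}

struct writeback_job {
    unsigned page_no;
    const char *buf;
};

static void *writeback_victim(void *arg) {
    struct writeback_job *job = arg;
    nand_write_page(job->page_no, job->buf);
    return NULL;
}

/* Handle a fault on page 'faulted' while evicting dirty page 'victim'. */
static void handle_fault(unsigned faulted, unsigned victim,
                         char *frame, char *victim_frame) {
    pthread_t tid;
    struct writeback_job job = { victim, victim_frame };

    /* start the victim write-back in parallel ... */
    pthread_create(&tid, NULL, writeback_victim, &job);

    /* ... while the faulted page is read from NAND */
    nand_read_page(faulted, frame);

    /* both operations finished: the new page can now be mapped */
    pthread_join(tid, NULL);
}

int main(void) {
    static char frame[PAGE_SIZE], victim_frame[PAGE_SIZE];
    handle_fault(42, 7, frame, victim_frame);
    return 0;
}
```

Under these assumptions, the fault latency is bounded by the slower of the read and the write-back rather than their sum, which is the kind of overlap the parallelized handler aims to exploit.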
Publisher
Association for Computing Machinery (ACM)
Subject
Computer Graphics and Computer-Aided Design, Software
Cited by
1 article.