Author:
Oh Sang-Ho, Lee Su Jin, Noh Juhwan, Mo Jeonghoon
Abstract
The extensive utilization of electronic health records (EHRs) and the growth of enormous open biomedical datasets have readied the area for applications of computational and machine learning techniques to reveal fundamental patterns. This study’s goal is to develop a medical treatment recommendation system using Korean EHRs along with the Markov decision process (MDP). The sharing of EHRs by the National Health Insurance Sharing Service (NHISS) of Korea has made it possible to analyze Koreans’ medical data, which include treatments, prescriptions, and medical check-ups. After considering the merits and effectiveness of such data, we analyzed patients’ medical information and recommended optimal pharmaceutical prescriptions for diabetes, which is known to be the most burdensome disease for Koreans. We also proposed an MDP-based treatment recommendation system for diabetic patients to help doctors when prescribing diabetes medications. To build the model, we used the 11-year Korean NHISS database. To overcome the challenge of designing an MDP model, we carefully designed the states, actions, reward functions, and transition probability matrices, choosing them to balance realism against the curse of dimensionality.
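The abstract describes an MDP built from states, actions, rewards, and transition probability matrices. The sketch below is a minimal, illustrative toy version of that setup, not the paper's actual model: the three glycemic states, two prescription actions, transition probabilities, and rewards are all hypothetical placeholders, and the optimal policy is found with standard value iteration.

```python
import numpy as np

# Hypothetical toy MDP: 3 glycemic states (0 = controlled, 1 = moderate,
# 2 = severe) and 2 prescription actions. All numbers are illustrative
# assumptions, not values from the study.
n_states, n_actions = 3, 2

# P[a, s, s']: probability of moving from state s to s' under action a
P = np.array([
    [[0.80, 0.15, 0.05],   # transitions under action 0
     [0.30, 0.50, 0.20],
     [0.10, 0.30, 0.60]],
    [[0.70, 0.20, 0.10],   # transitions under action 1
     [0.40, 0.40, 0.20],
     [0.20, 0.40, 0.40]],
])

# R[s]: immediate reward for occupying each state (healthier is higher)
R = np.array([1.0, 0.0, -1.0])
gamma = 0.95  # discount factor

# Value iteration: apply the Bellman optimality update until convergence
V = np.zeros(n_states)
for _ in range(10_000):
    Q = R + gamma * (P @ V)       # Q[a, s]: value of action a in state s
    V_new = Q.max(axis=0)         # best achievable value per state
    if np.max(np.abs(V_new - V)) < 1e-9:
        V = V_new
        break
    V = V_new

policy = Q.argmax(axis=0)  # recommended action for each state
print("optimal policy:", policy)
```

In the paper's setting, the same machinery applies but the state and action spaces come from the NHISS data, which is where the tradeoff between realism and the curse of dimensionality arises: richer states make transitions harder to estimate reliably.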
Funder
National Research Foundation of Korea
Publisher
Springer Science and Business Media LLC
References (29 articles)
1. Beck, J. R. & Pauker, S. G. The Markov process in medical prognosis. Med. Decis. Making. 3(4), 419–458. https://doi.org/10.1177/0272989X8300300403 (1983).
2. Xiang, Y. & Poh, K. Time-critical dynamic decision modeling in medicine. Comput. Biol. Med. 32(2), 85–97 (2002).
3. Leong, T.Y. Dynamic decision modeling in medicine: a critique of existing formalisms. In Proc Annu Symp Comput Appl Med Care 478–484 (1993).
4. Stahl, J. E. Modelling methods for pharmacoeconomics and health technology assessment: An overview and guide. Pharmacoeconomics 26(2), 131–148. https://doi.org/10.2165/00019053-200826020-00004 (2008).
5. Schaefer, A., Bailey, M., Shechter, S. & Roberts, M. Modeling medical treatment using Markov decision processes. In Operations Research and Health Care 593–612 (2005).
Cited by
11 articles.