Affiliation:
1. University of Bordeaux 4, Pessac, France
2. University of Lille 1, Villeneuve-d’Ascq, France
Abstract
Randomness and regularities in finance are usually treated in probabilistic terms. In this paper, we develop a different approach using a non-probabilistic framework based on algorithmic information theory, initially developed by Kolmogorov (1965). We propose a generic method to estimate the Kolmogorov complexity of numeric series. This approach rests on an iterative “regularity erasing procedure” (REP) applied before running lossless compression algorithms on financial data. The REP is necessary to detect hidden structures: one must “wash out” well-established financial patterns (i.e., stylized facts) to prevent the algorithmic tools from concentrating on these non-profitable regularities. The main contribution of this article is methodological: we show that certain structural regularities, invisible to classical statistical tests, can be detected by this algorithmic method. Our final illustration on the daily Dow Jones Index reveals a weak compression rate once well-known regularities are removed from the raw data. This result could be associated with a high level of efficiency of the New York Stock Exchange, although more effective algorithmic tools might improve this compression rate by detecting new structures in the future.
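To make the compression-based approach concrete, the following Python sketch estimates a series' compressibility with a standard lossless compressor and applies two illustrative "regularity erasing" steps. It is not the authors' implementation: the choice of compressor (bz2), the 256-level discretization, the 20-observation volatility window, and the simulated price series are all assumptions made purely for illustration.

```python
import bz2

import numpy as np


def compression_rate(series: np.ndarray, n_bins: int = 256) -> float:
    """Upper-bound estimate of a series' Kolmogorov complexity via lossless
    compression: discretize to one byte per observation, compress, and report
    compressed size / raw size (values near 1.0 = essentially incompressible)."""
    lo, hi = float(series.min()), float(series.max())
    span = max(hi - lo, 1e-12)  # guard against a constant series
    codes = ((series - lo) / span * (n_bins - 1)).astype(np.uint8)
    raw = codes.tobytes()
    return len(bz2.compress(raw)) / len(raw)


rng = np.random.default_rng(0)
prices = 1000.0 + np.cumsum(rng.standard_normal(10_000))  # placeholder random walk

# Illustrative "regularity erasing" steps (stand-ins for the paper's REP):
# 1) prices -> log-returns, erasing the trend / unit-root structure;
log_returns = np.diff(np.log(prices))
# 2) standardize by a rolling volatility proxy, erasing volatility clustering.
vol = np.convolve(np.abs(log_returns), np.ones(20) / 20, mode="same") + 1e-12
residuals = log_returns / vol

print(f"raw prices:       {compression_rate(prices):.3f}")
print(f"log-returns:      {compression_rate(log_returns):.3f}")
print(f"vol-standardized: {compression_rate(residuals):.3f}")
```

A compression rate that stays close to 1.0 after each erasing step is the kind of evidence the abstract interprets as consistent with market efficiency; a rate that drops noticeably would point to residual structure.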
Subject
Computational Mathematics, Computer Science Applications, Computer Vision and Pattern Recognition, Finance
References (27 articles)
1. Allouche, “The Komornik–Loreti constant is transcendental,” American Mathematical Monthly, 2000.
2. Azhar, “Data compression techniques for stock market prediction,” Data Compression Conference (DCC ’94), 1994.
3. Bollerslev, “Generalized autoregressive conditional heteroscedasticity,” Journal of Econometrics, 1986.
4. Calvet, “Multifractality in asset returns: Theory and evidence,” Review of Economics and Statistics, 2002.
Cited by (9 articles)