Affiliation:
1. Politecnico di Torino, Torino, Italy
Abstract
In automated Visual GUI Testing (VGT) for Android devices, the available tools often suffer from low robustness to mobile fragmentation, leading to incorrect results when running the same tests on different devices.
To mitigate these issues, we evaluate two feature matching-based approaches for widget detection in VGT scripts, which use, respectively, the complete full-screen snapshot of the application (Fullscreen) and the cropped images of its widgets (Cropped) as visual locators to match on emulated devices.
Our analysis validates different feature-based visual locators across various apps and devices and evaluates their robustness in terms of cross-device portability and correctly executed interactions. We assessed our results through a comparison with two state-of-the-art tools, EyeAutomate and Sikuli.
Despite a limited increase in the computational burden, our Fullscreen approach outperformed state-of-the-art tools in terms of correctly identified locators across a wide range of devices and led to a 30% increase in passing tests.
Our work shows that VGT tools’ dependability can be improved by bridging the testing and computer vision communities. This connection enables the design of algorithms targeted to domain-specific needs and thus inherently more usable and robust.
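To make the matching step concrete, below is a minimal sketch of how a feature-based visual locator (e.g., a cropped widget image) can be located inside a live device screenshot. This is not the authors' implementation: it assumes OpenCV's SIFT detector, Lowe's ratio test, and a RANSAC-estimated homography, and the function name and file paths are hypothetical.

```python
# Minimal feature-matching sketch (illustrative, not the paper's tool):
# find a visual locator inside a device screenshot with OpenCV SIFT.
import cv2
import numpy as np

def locate_widget(locator_path: str, screenshot_path: str, min_matches: int = 10):
    """Return the (x, y) center of the matched region, or None if not found."""
    locator = cv2.imread(locator_path, cv2.IMREAD_GRAYSCALE)
    screen = cv2.imread(screenshot_path, cv2.IMREAD_GRAYSCALE)
    if locator is None or screen is None:
        raise FileNotFoundError("could not read locator or screenshot image")

    # Detect keypoints and descriptors in both images.
    sift = cv2.SIFT_create()
    kp_loc, des_loc = sift.detectAndCompute(locator, None)
    kp_scr, des_scr = sift.detectAndCompute(screen, None)
    if des_loc is None or des_scr is None:
        return None

    # k-NN matching with Lowe's ratio test to discard ambiguous matches.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    knn = matcher.knnMatch(des_loc, des_scr, k=2)
    good = [p[0] for p in knn if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]
    if len(good) < min_matches:
        return None

    # Estimate a homography mapping the locator onto the screenshot,
    # then project the locator's center to obtain the interaction point.
    src = np.float32([kp_loc[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_scr[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None:
        return None

    h, w = locator.shape
    center = np.float32([[[w / 2, h / 2]]])
    x, y = cv2.perspectiveTransform(center, H)[0][0]
    return int(x), int(y)
```

The same routine covers both settings discussed above: with a cropped widget image it corresponds to the Cropped style of locator, while passing the full reference screenshot (and projecting the widget's known position instead of the image center) approximates the Fullscreen style.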
Publisher
Association for Computing Machinery (ACM)
Cited by: 7 articles.