Authors:
Anandika Arrya, Laksono Pringgo Dwi, Suhaimi Muhammad Syaiful Amri bin, Muguro Joseph, Rusydi Muhammad Ilhamdi
Abstract
Rapid technological development has enabled communication between humans and machines using biosignals. One such biosignal is the electrooculogram (EOG), which is obtained from eye movement. Research on virtual keyboard control based on eye-gaze motion using EOG has been widely developed. Previous studies, however, mostly drew conclusions from the time consumed in typing paragraphs rather than from the number of eye-gaze motions made by the user. In this research, an adaptive virtual keyboard system controlled by EOG signals is built. The keyboard has a 7x7 layout of 49 buttons, comprising main buttons, letters, numbers, symbols, and unused buttons. The layout is divided into six zones, each requiring a different number of steps; characters located in the same zone require the same number of steps. The adaptive feature rearranges the character buttons based on previously used characters. In the experiments, 30 respondents typed 7 paragraphs on both the static and the adaptive virtual keyboard. Adaptive mode rearranges button positions based on the respondents' last k selections, with k set to 10, 30, 50, 70, and 100. The two keyboard modes are evaluated by the number of steps required to type the paragraphs. The results show that the adaptive virtual keyboard reduces the number of user steps compared with static mode: in the optimal-system test the reduction reaches 283 steps, and in the respondent tests it reaches 258 steps, or about 40% of the steps. These findings highlight the promise of EOG-driven adaptive virtual keyboards for improving typing efficiency and point to a promising direction for future human-machine interface development.
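The abstract describes an adaptive policy that re-ranks keys by their frequency within the last k selections and moves frequent keys into zones that cost fewer steps. The Python sketch below illustrates one way such a policy could be realized; the zone sizes, step costs, and the names AdaptiveLayout, record, and steps_for are assumptions made for illustration and are not taken from the paper.

```python
from collections import Counter, deque

# Hypothetical zone layout: six zones ordered by step cost. The sizes below
# sum to 49 (a 7x7 grid) but are illustrative, not the paper's actual partition.
ZONE_SIZES = [1, 8, 16, 8, 8, 8]
ZONE_STEPS = [0, 1, 2, 3, 4, 5]   # steps needed to reach a key in each zone


class AdaptiveLayout:
    """Places the most recently frequent characters into the lowest-step zones."""

    def __init__(self, characters, k):
        self.characters = list(characters)          # assignable character set
        self.history = deque(maxlen=k)              # sliding window of last k selections
        self.layout = self._assign(self.characters) # zone index per character

    def _assign(self, ordered_chars):
        # Fill zones in order of increasing step cost with the given character order.
        layout, idx = {}, 0
        for zone, size in enumerate(ZONE_SIZES):
            for ch in ordered_chars[idx:idx + size]:
                layout[ch] = zone
            idx += size
        return layout

    def record(self, ch):
        """Log a selection and rebuild the layout from the current k-window."""
        self.history.append(ch)
        counts = Counter(self.history)
        # Most frequent characters first; ties keep the default ordering.
        ordered = sorted(self.characters,
                         key=lambda c: (-counts[c], self.characters.index(c)))
        self.layout = self._assign(ordered)

    def steps_for(self, ch):
        """Step cost of selecting a character under the current layout."""
        return ZONE_STEPS[self.layout[ch]]


# Example usage (hypothetical character set and k):
kb = AdaptiveLayout("abcdefghijklmnopqrstuvwxyz0123456789 .,", k=30)
for ch in "hello world":
    kb.record(ch)
print(kb.steps_for("l"))  # 'l' is now the most frequent key, so it sits in the 0-step zone
```

Under this sketch, the evaluation reported in the abstract would correspond to summing steps_for over every character of the typed paragraphs for the static and adaptive layouts and comparing the totals.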