Affiliation:
1. Graduate School of Information Science and Technology, The University of Tokyo, Tokyo, Japan
2. Department of Computer Science and Information Engineering, National Chung Cheng University, Minhsiung, Taiwan
3. Department of Computer Science, National Yang Ming Chiao Tung University, Hsinchu, Taiwan
4. Department of Computer Science and Information Engineering and Department of Information Management, National Taiwan University, Taipei, Taiwan
Abstract
Interface icons are prevalent in various digital applications. Due to limited time and budgets, many designers rely on informal evaluation, which often results in icons with poor usability. In this paper, we propose a unique human‐in‐the‐loop framework that allows our target users, that is, novice and professional user interface (UI) designers, to improve the usability of interface icons efficiently. We formulate several usability criteria into a perceptual usability function and enable users to iteratively revise an icon set with an interactive design tool, EvIcon. We take a large‐scale pre‐trained joint image‐text embedding model (CLIP) and fine‐tune it to embed icon visuals and icon tags in the same embedding space (IconCLIP). During the revision process, our design tool provides two types of instant perceptual usability feedback. First, we provide perceptual usability feedback modelled by deep learning models trained on IconCLIP embeddings and crowdsourced perceptual ratings. Second, we use the embedding space of IconCLIP to assist users in improving the visual distinguishability among icons within the user‐prepared icon set. To provide the perceptual prediction, we compiled IconCEPT10K, the first large‐scale dataset of perceptual usability ratings over 10,000 interface icons, by conducting a crowdsourcing study. We demonstrated that our framework could benefit the interface icon revision process of UI designers with a wide range of professional experience. Moreover, the interface icons designed using our framework achieved better semantic distance and familiarity, as verified by an additional online user study.
Funder
National Science and Technology Council
Japan Society for the Promotion of Science
National Taiwan University
Subject
Computer Graphics and Computer-Aided Design