Affiliation:
1. University Islam Sultan Sharif Ali, Brunei
2. Taylor's University, Malaysia
Abstract
The increasing use of AI in modern smart cities calls for explainable artificial intelligence (XAI) systems that can improve the efficiency and effectiveness of city operations while remaining transparent, interpretable, and trustworthy. The first challenge is developing a unified XAI framework that can handle the heterogeneity of data and systems in smart cities while incorporating human factors and preferences into AI systems. The second challenge is developing new XAI methods that can cope with the complexity and scale of smart city data. Addressing ethical and legal aspects is also critical, including ensuring that AI systems are fair and unbiased, protecting citizens' privacy and security, and establishing legal frameworks. Finally, evaluating the effectiveness and usability of XAI systems is crucial for improving city operations and stakeholder trust; promising directions for XAI research in smart cities include improved visualization, human feedback, and integration.
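One family of XAI methods the abstract alludes to is model-agnostic explanation of black-box predictors. As a hedged illustration (not a method from the paper), the sketch below implements permutation feature importance for a hypothetical smart-city predictor: a feature's importance is measured by how much prediction error rises when that feature's values are shuffled. The toy `model`, its weights, and the synthetic data are all illustrative assumptions.

```python
# Minimal sketch of a model-agnostic XAI technique: permutation feature
# importance. The model and data are toy assumptions for illustration only.
import random

def model(x):
    # Hypothetical city-energy predictor; weights are illustrative only.
    # Feature 0 is deliberately far more influential than feature 1.
    return 3.0 * x[0] + 0.5 * x[1]

def mse(pred, true):
    return sum((p - t) ** 2 for p, t in zip(pred, true)) / len(true)

def permutation_importance(data, targets, feature, trials=50, seed=0):
    """Average MSE increase when one feature's column is shuffled."""
    rng = random.Random(seed)
    base = mse([model(x) for x in data], targets)
    increases = []
    for _ in range(trials):
        col = [x[feature] for x in data]
        rng.shuffle(col)
        shuffled = [list(x) for x in data]
        for row, v in zip(shuffled, col):
            row[feature] = v
        increases.append(mse([model(x) for x in shuffled], targets) - base)
    return sum(increases) / trials

rng = random.Random(1)
data = [[rng.random(), rng.random()] for _ in range(200)]
targets = [model(x) for x in data]
imp0 = permutation_importance(data, targets, 0)
imp1 = permutation_importance(data, targets, 1)
print(imp0 > imp1)  # feature 0 should dominate
```

Because the explanation only needs query access to `model`, the same procedure applies to any deployed smart-city predictor, which is why model-agnostic methods are attractive when city systems are heterogeneous.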
References (37 articles)
1. Ahmad, K., Maabreh, M., Ghaly, M., Khan, K., Qadir, J., & Al-Fuqaha, A. (2020). Developing future human-centered smart cities: Critical analysis of smart city security, interpretability, and ethical challenges. arXiv preprint arXiv:2012.09110.
2. Biometric Authentication-Based Intrusion Detection Using Artificial Intelligence Internet of Things in Smart City
3. Intelligible and Explainable Machine Learning
4. Das, A., & Rad, P. (2020). Opportunities and challenges in explainable artificial intelligence (XAI): A survey. arXiv preprint arXiv:2006.11371.
5. Embarak, O. (2022). An adaptive paradigm for smart education systems in smart cities using the internet of behaviour (IoB) and explainable artificial intelligence (XAI). Paper presented at the 2022 8th International Conference on Information Technology Trends (ITT).