Abstract
The operation management of electric taxi fleets requires the cooperative optimization of charging and dispatching. The challenge is to decide, in real time, the optimal charging station or passenger order for each vehicle in the fleet. With the rapid advancement of Vehicle Internet of Things (VIOT) technologies, this challenge can be addressed by leveraging big data analytics and machine learning algorithms, thereby contributing to smarter transportation systems. This study focuses on optimizing real-time charging and dispatching decisions for large-scale electric taxi fleets to improve their long-term benefits. To achieve this goal, a spatiotemporal decision framework based on bi-level optimization is proposed. First, a deep reinforcement learning-based model is built to estimate the value of charging and order-dispatching tasks under uncertainty. The model accounts for the long-term costs and benefits of each task and guides whether an electric taxi should prioritize charging or order dispatching for the fleet's long-term benefit. Subsequently, a combinatorial optimization approach is employed to determine the specific charging station or passenger order assigned to each vehicle. Case studies are conducted using real-world operation data from electric taxis in Hangzhou, China. The results validate the efficacy of the proposed method compared with a baseline approach. Across various fleet sizes and charging power conditions, the method significantly reduces non-service time during the charging process by optimizing charging time and location. The proposed method is found to be well suited to large-scale fleets and high-charging-power scenarios.
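To make the bi-level structure concrete, the following is a minimal illustrative sketch (not the paper's implementation): the upper level is represented by a stubbed value model standing in for the deep reinforcement learning estimator, and the lower level solves the vehicle-to-task assignment with the Hungarian algorithm via `scipy.optimize.linear_sum_assignment`. All names and data below are hypothetical.

```python
# Illustrative sketch of the bi-level decision framework, under assumed interfaces.
import numpy as np
from scipy.optimize import linear_sum_assignment


def estimate_task_values(vehicle_states, tasks):
    """Stub for the DRL value model (upper level).

    Returns an (n_vehicles, n_tasks) matrix of estimated long-term values
    of assigning each task (a charging station or a passenger order) to
    each vehicle. Here the values are random placeholders.
    """
    rng = np.random.default_rng(0)
    return rng.random((len(vehicle_states), len(tasks)))


def dispatch(vehicle_states, charging_stations, orders):
    # Upper level: estimate values over the combined set of candidate tasks.
    tasks = charging_stations + orders
    values = estimate_task_values(vehicle_states, tasks)
    # Lower level: combinatorial assignment maximizing total estimated value
    # (costs are negated because linear_sum_assignment minimizes).
    rows, cols = linear_sum_assignment(-values)
    return [(vehicle_states[v], tasks[t]) for v, t in zip(rows, cols)]


# Toy usage: 3 vehicles choosing among 2 charging stations and 2 orders.
assignments = dispatch(
    vehicle_states=["ev_1", "ev_2", "ev_3"],
    charging_stations=[{"type": "charge", "id": "cs_7"}, {"type": "charge", "id": "cs_9"}],
    orders=[{"type": "order", "id": "o_42"}, {"type": "order", "id": "o_43"}],
)
print(assignments)
```

In this sketch the value matrix would, in practice, be produced by the trained reinforcement learning model, so that the assignment step trades off immediate order revenue against the long-term benefit of timely charging.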