Abstract
Voltage-to-time and current-to-time converters have been used in many recent works as voltage-to-digital converters for artificial intelligence applications. In general, most previous designs use the current-starved technique or a capacitor-based delay unit, which is non-linear, expensive, and requires a large area. In this paper, we propose a highly linear current-to-digital converter. An optimization method is also proposed to generate an optimal converter design containing the smallest number of PMOS transistors and sensitive circuits, such as differential amplifiers. This makes our design more stable and robust against negative bias temperature instability (NBTI) and process variation. The proposed converter circuit implements a point-wise conversion from current to time, and it can be used directly in a variety of applications, such as analog-to-digital converters (ADCs) used in built-in computational random-access memory (C-RAM). The conversion gain of the proposed circuit is 3.86 ms/A, which is 52 times greater than the conversion gains of state-of-the-art designs. Furthermore, various time-to-digital converter (TDC) circuits are reviewed for use with the proposed current-to-time converter, and we recommend one circuit for a complete ADC design.
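As a rough illustration of what the reported conversion gain implies (a sketch based only on the quoted figure of 3.86 ms/A and an assumed ideal linear transfer characteristic), the current-to-time relationship can be written as

$$T_{\text{out}} = G \cdot I_{\text{in}}, \qquad G = 3.86\ \text{ms/A},$$

so that, for example, an input current of $I_{\text{in}} = 1\ \mu\text{A}$ would map to an output delay of roughly $T_{\text{out}} \approx 3.86\ \mu\text{s}$. Any offset or non-ideality terms are omitted here, as they are not specified in the abstract.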
Subject
Electrical and Electronic Engineering, Computer Networks and Communications, Hardware and Architecture, Signal Processing, Control and Systems Engineering
Cited by
4 articles.