Affiliation:
1. University of Waterloo, Faculty of Engineering, Waterloo, Canada
2. DarwinAI, Waterloo, Canada
Abstract
There can be numerous electronic components on a given PCB, making visual inspection for defects time-consuming and error-prone, especially at scale. There has thus been significant interest in automatic PCB component detection, particularly leveraging deep learning. While deep neural networks can perform such detection with high accuracy, they typically require substantial computational resources, limiting their feasibility in real-world use cases, which often involve high-volume, high-throughput detection under constrained edge computing resources. To bridge this gap between performance and resource requirements, PCBDet, an attention condenser network design, is introduced; it provides state-of-the-art inference throughput while achieving superior PCB component detection performance compared to other state-of-the-art efficient architecture designs. Experimental results show that PCBDet achieves up to 2× inference speed-up on an ARM Cortex-A72 processor compared to an EfficientNet-based design, while achieving ∼2–4% higher mAP on the FICS-PCB benchmark dataset.
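The headline claim compares inference throughput across architectures on an edge processor. As an illustration only, here is a minimal sketch of how such a throughput comparison is typically measured (the `measure_throughput` helper and `dummy_infer` stand-in model are hypothetical, not from the paper; the actual evaluation would run the compiled network on the target ARM hardware):

```python
import time

def measure_throughput(infer, batch, n_runs=50, warmup=5):
    """Average per-batch latency (s) and throughput (images/s) for a callable model."""
    for _ in range(warmup):          # warm-up runs excluded from timing
        infer(batch)
    start = time.perf_counter()
    for _ in range(n_runs):
        infer(batch)
    elapsed = time.perf_counter() - start
    latency = elapsed / n_runs
    return latency, len(batch) / latency

# Stand-in "model": any callable that consumes a batch of images.
def dummy_infer(batch):
    return [sum(img) for img in batch]

batch = [[0.0] * 1024 for _ in range(8)]
latency, throughput = measure_throughput(dummy_infer, batch)
print(f"latency: {latency * 1e3:.3f} ms/batch, throughput: {throughput:.0f} img/s")
```

A speed-up figure such as the reported 2× would then be the ratio of the two models' throughputs measured this way on the same batch and hardware.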
Publisher
Institution of Engineering and Technology (IET)
References (9 articles)
1. Lu, H., Mehta, D., Paradis, O., Asadizanjani, N., Tehranipoor, M., Woodard, D.L.: FICS-PCB: a multi-modal image dataset for automated printed circuit board visual inspection (2020). eprint.iacr.org/2020/366.pdf (accessed Aug. 12, 2022)
2. Kuo, C.-W., Ashmore, J., Huggins, D., Kira, Z.: Data-efficient graph embedding learning for PCB component detection. In: 2019 IEEE Winter Conference on Applications of Computer Vision, pp. 551–560. IEEE, Piscataway, NJ (2019)
3. Balanced-YOLOv3: addressing the imbalance problem of object detection in PCB assembly scene
4. A deep context learning based PCB defect detection model with anomalous trend alarming system
5. Wong, A., Shafiee, M.J., Abbasi, S., Nair, S., Famouri, M.: Faster attention is what you need: a fast self-attention neural network backbone architecture for the edge via double-condensing attention condensers. arXiv:2208.06980 (2022)
Cited by
1 article.