Wavelet Transforms Significantly Sparsify and Compress Tactile Interactions
Authors:
Ariel Slepyan 1, Michael Zakariaie 2, Trac Tran 1, Nitish Thakor 1,2
Affiliations:
1. Electrical and Computer Engineering Department, The Johns Hopkins University, Baltimore, MD 21218, USA
2. Biomedical Engineering Department, The Johns Hopkins University, Baltimore, MD 21218, USA
Abstract
As higher spatiotemporal resolution tactile sensing systems are developed for prosthetics, wearables, and other biomedical applications, they demand faster sampling rates and generate larger data streams. Sparsifying transformations can alleviate these requirements by enabling compressive sampling and efficient data storage through compression. However, research on the best sparsifying transforms for tactile interactions is lagging. In this work, we construct a library of orthogonal and biorthogonal wavelet transforms as sparsifying transforms for tactile interactions and compare their trade-offs in compression and sparsity. We tested the sparsifying transforms on a publicly available high-density tactile object-grasping dataset (a 548-sensor tactile glove grasping 26 objects). In addition, we investigated which wavelet transform dimensionality (1D, 2D, or 3D) best compresses these tactile interactions. Our results show that wavelet transforms are highly efficient at compressing tactile data and can lead to very sparse and compact tactile representations. Additionally, our results show that 1D transforms achieve the sparsest representations, followed by 3D, and lastly 2D. Overall, the best wavelet for coarse approximation is Symlets 4 applied temporally, which sparsifies tactile interactions to 0.5% sparsity and compresses 10-bit tactile data to an average of 0.04 bits per pixel. Future studies can leverage these results to assist in the compressive sampling of large tactile arrays and free up computational resources for real-time processing on computationally constrained mobile platforms such as neuroprosthetics.
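To make the described pipeline concrete, the sketch below illustrates the core idea with PyWavelets: decompose each taxel's time series with the Symlets-4 ("sym4") wavelet along the temporal axis, hard-threshold the coefficients to the 0.5% sparsity figure quoted above, and invert to obtain a coarse approximation. This is an illustrative reconstruction under stated assumptions, not the authors' released code; the synthetic data, array shapes, and thresholding details are placeholders standing in for the 548-sensor glove recordings.

```python
# Illustrative sketch (not the authors' code): per-taxel 1D Symlets-4 wavelet
# decomposition along the time axis, hard-thresholded to ~0.5% coefficient
# sparsity, then inverted to form the coarse approximation.
# Requires PyWavelets (pip install PyWavelets).
import numpy as np
import pywt

rng = np.random.default_rng(0)

# Synthetic grasp-like recording: 548 taxels x 1000 time samples, 10-bit range.
t = np.linspace(0.0, 1.0, 1000)
bump = 900.0 * np.exp(-((t - 0.5) ** 2) / 0.02)            # pressure bump over time
tactile = bump[None, :] * rng.uniform(0.2, 1.0, (548, 1))  # taxel-dependent gain
tactile += rng.normal(0.0, 5.0, tactile.shape)             # sensor noise

# 1D wavelet decomposition of each taxel's time series (temporal axis).
coeffs = pywt.wavedec(tactile, wavelet="sym4", axis=1)

# Global hard threshold: keep only the largest 0.5% of coefficients.
magnitudes = np.concatenate([np.abs(c).ravel() for c in coeffs])
k = max(1, int(0.005 * magnitudes.size))
threshold = np.partition(magnitudes, -k)[-k]
coeffs_sparse = [np.where(np.abs(c) >= threshold, c, 0.0) for c in coeffs]

# Reconstruct the coarse approximation and report sparsity and error.
recon = pywt.waverec(coeffs_sparse, wavelet="sym4", axis=1)[:, : tactile.shape[1]]
kept = sum(np.count_nonzero(c) for c in coeffs_sparse)
total = sum(c.size for c in coeffs_sparse)
rmse = np.sqrt(np.mean((recon - tactile) ** 2))
print(f"kept {kept}/{total} coefficients ({100 * kept / total:.2f}%), RMSE = {rmse:.2f}")
```

The 2D and 3D variants compared in the paper could be sketched analogously with pywt.wavedec2 over the spatial sensor grid and pywt.wavedecn over the full spatiotemporal volume, respectively.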