Search Results: Deepak Kadetotad

2 Results
Hierarchical Coarse-Grain Sparsity for Deep Neural Networks
Background: Recurrent neural networks (RNNs) with long short-term memory (LSTM) enable accurate automatic speech recognition (ASR), but these networks are large in size. Because of this, most speech recognition tasks are performed on cloud servers, which requires a constant internet connection, introduces privacy concerns,...
Published: 2/23/2023   |   Inventor(s): Jae-Sun Seo, Deepak Kadetotad, Chaitali Chakrabarti, Visar Berisha
Keyword(s):  
Category(s): Physical Science, Computing & Information Technology, Wireless & Networking
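The abstract does not spell out the pruning scheme, but coarse-grain sparsity is commonly realized by zeroing entire blocks of a weight matrix rather than individual weights, so only a short list of surviving blocks needs to be stored and fetched. A minimal sketch, assuming simple energy-based block selection (the function names, block size, and keep-ratio parameter are illustrative assumptions, not taken from the patent):

```python
def block_norms(W, bs):
    """Squared-L2 energy of each bs x bs block of matrix W (list of lists)."""
    rows, cols = len(W), len(W[0])
    norms = {}
    for bi in range(0, rows, bs):
        for bj in range(0, cols, bs):
            s = 0.0
            for i in range(bi, min(bi + bs, rows)):
                for j in range(bj, min(bj + bs, cols)):
                    s += W[i][j] ** 2
            norms[(bi, bj)] = s
    return norms

def prune_blocks(W, bs, keep_ratio):
    """Coarse-grain pruning: keep only the highest-energy fraction of
    whole bs x bs blocks, zeroing every weight in the dropped blocks."""
    norms = block_norms(W, bs)
    n_keep = max(1, int(len(norms) * keep_ratio))
    keep = set(sorted(norms, key=norms.get, reverse=True)[:n_keep])
    rows, cols = len(W), len(W[0])
    out = [[0.0] * cols for _ in range(rows)]
    for (bi, bj) in keep:
        for i in range(bi, min(bi + bs, rows)):
            for j in range(bj, min(bj + bs, cols)):
                out[i][j] = W[i][j]
    return out
```

Because blocks are kept or dropped as whole units, the surviving weights stay contiguous in memory, which is what makes this style of sparsity hardware-friendly compared with element-wise pruning.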
Coarse-Grain Memory Sparsification for Small-Footprint Deep Neural Networks
Recent breakthroughs in deep neural networks (DNNs) have led to improvements in state-of-the-art speech applications. Conventional DNNs have hundreds or thousands of neurons in each layer, which require a large amount of memory to store the connections between neurons. Implementing these networks in hardware requires a large memory and high computation...
Published: 2/23/2023   |   Inventor(s): Jae-Sun Seo, Chaitali Chakrabarti, Sairam Arunachalam, Deepak Kadetotad
Keyword(s):  
Category(s): Computing & Information Technology, Wireless & Networking
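The memory saving from coarse-grain sparsification comes from storage: a block-pruned matrix can be kept as a short list of surviving blocks plus their block indices instead of a full dense array. A hedged sketch of such block-compressed storage (the encoding and helper names are assumptions for illustration, not the invention's actual format):

```python
def compress_block_sparse(W, bs):
    """Store only the nonzero bs x bs blocks of W as
    ((block_row, block_col), flattened_values) pairs."""
    rows, cols = len(W), len(W[0])
    blocks = []
    for bi in range(0, rows, bs):
        for bj in range(0, cols, bs):
            vals = [W[i][j]
                    for i in range(bi, min(bi + bs, rows))
                    for j in range(bj, min(bj + bs, cols))]
            if any(vals):  # skip all-zero (pruned) blocks
                blocks.append(((bi // bs, bj // bs), vals))
    return blocks

def decompress(blocks, rows, cols, bs):
    """Rebuild the dense matrix from the compressed block list."""
    W = [[0.0] * cols for _ in range(rows)]
    for (r, c), vals in blocks:
        it = iter(vals)
        for i in range(r * bs, min(r * bs + bs, rows)):
            for j in range(c * bs, min(c * bs + bs, cols)):
                W[i][j] = next(it)
    return W
```

With half the blocks pruned, roughly half the weight storage disappears, and the per-block index overhead is tiny compared with element-wise sparse formats that need one index per weight.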