Schematic Operation of On-line Update Matrix Decomposer Architecture for Machine Learning Hardware

Researchers at GW have developed a novel, cost-effective training acceleration solution for large-scale neural networks in the field of Artificial Intelligence. The solution can be implemented in a variety of hardware applications associated with training acceleration in machine learning. It includes a computer architecture that efficiently calculates weight matrix updates for a neuromorphic network, reducing the time, area, energy, and memory needed to operate a neuromorphic hardware system.

The disclosed invention can be implemented as an apparatus, a device, a system, or a method. It can include: (i) a receiving module configured to receive input vectors associated with backpropagation-based learning in a layer of a deep neural network; (ii) a computing module configured to compute errors associated with each layer of the deep neural network and with the weight matrices applied to the input vectors. In an embodiment, the disclosed invention utilizes low-rank approximations of stochastic gradient descent to reduce the area needed to operate a neuromorphic hardware system.
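The low-rank idea can be illustrated with a short sketch. The following is a hypothetical NumPy example, not the patented implementation: it assumes a layer's batch gradient is the usual sum of per-sample outer products of backpropagated errors and inputs, and shows how keeping only the top-r singular components yields a reduced-rank update that needs far less storage than the full weight-sized gradient. All variable names and dimensions are illustrative.

```python
import numpy as np

# Illustrative sketch of a rank-reduced weight update (not the patented
# architecture). The batch gradient G = D @ X.T / batch is the standard
# backpropagation gradient for a fully connected layer.
rng = np.random.default_rng(0)
n_out, n_in, batch, lr, r = 8, 16, 32, 0.1, 2

X = rng.standard_normal((n_in, batch))   # layer input vectors (one per sample)
D = rng.standard_normal((n_out, batch))  # backpropagated error vectors

G = D @ X.T / batch                      # full batch gradient, shape (n_out, n_in)

# Keep only the top-r singular components of the gradient.
U, s, Vt = np.linalg.svd(G, full_matrices=False)
G_lowrank = (U[:, :r] * s[:r]) @ Vt[:r]

W = np.zeros((n_out, n_in))
W -= lr * G_lowrank                      # apply the reduced-rank update

# A rank-r update can be stored as r*(n_out + n_in + 1) numbers instead
# of the n_out*n_in entries of the full gradient.
print(np.linalg.matrix_rank(G_lowrank))
```

In hardware terms, the payoff is that the update can be streamed as a few vector pairs rather than a full matrix, which is one way area and memory savings of the kind described above can arise.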

Fig. 1 – Aspects of the disclosed invention

Applications:

  • Applications in the field of Artificial Intelligence including associated hardware applications and emerging hardware devices and solutions

Advantages:

  • Effective and improved training acceleration for large scale neural networks in the field of Artificial Intelligence
  • Reduced time, area, energy, memory needed to operate a neuromorphic hardware system
Patent Information:
Title: Quasi-Systolic Processor and Streaming Batch Eigenupdate Neuromorphic Machine
App Type: US Utility
Country: United States of America
Serial No.: 16/806,121
Patent No.: 11,651,231
File Date: 3/2/2020