Neuromorphic computing has become a vital part of today's AI revolution. However, the memristors widely used in such systems to compute and store data suffer from complex non-idealities: cycle-to-cycle variation (inconsistent behavior from one switching cycle to the next), device-to-device variation, and tuning failures in which a device's resistance remains stuck at a fixed level. These non-idealities introduce deviations into the underlying vector-matrix multiplications, so memristive neural networks may fall short of their target accuracies.
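The sketch below is an illustrative NumPy model (not the GW implementation) of how the three non-idealities named above can perturb a memristive vector-matrix multiplication relative to the ideal software result; the noise magnitudes and stuck-device fraction are arbitrary assumptions chosen only for demonstration.

    # Illustrative sketch: non-ideal memristive vector-matrix multiplication (VMM).
    # Noise levels and stuck-device fraction are assumed values, not measured data.
    import numpy as np

    rng = np.random.default_rng(0)

    def noisy_vmm(x, W, d2d_sigma=0.05, c2c_sigma=0.02, stuck_frac=0.01):
        """Simulate y = W @ x on a non-ideal memristive crossbar."""
        # Device-to-device variation: each conductance deviates from its target
        # by a device-specific factor (resampled here on every call for brevity).
        W_dev = W * (1.0 + d2d_sigma * rng.standard_normal(W.shape))
        # Tuning failure: a small fraction of cells ignore programming and stay
        # stuck at a fixed conductance (here, stuck at zero).
        stuck = rng.random(W.shape) < stuck_frac
        W_dev[stuck] = 0.0
        # Cycle-to-cycle variation: additional noise on every compute cycle.
        W_cycle = W_dev * (1.0 + c2c_sigma * rng.standard_normal(W.shape))
        return W_cycle @ x

    W = rng.standard_normal((64, 128))
    x = rng.standard_normal(128)
    ideal = W @ x
    hw = noisy_vmm(x, W)
    print("relative VMM error:", np.linalg.norm(hw - ideal) / np.linalg.norm(ideal))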
Researchers at GW have developed a novel algorithm that mitigates the impact of these device non-idealities at the level of individual neural network (NN) layers, aligning the outputs of memristive hardware layers with their software counterparts. The effectiveness of the algorithm was evaluated on a 20,000-device hardware prototyping platform using a continual learning problem, in which a network must learn new tasks without forgetting previously learned information. Results demonstrate that the proposed approach improves average multi-task classification accuracy from 61% to 72%.
Figure 2. Layer Ensemble Averaging.
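As a rough illustration of the layer-level idea (a minimal sketch, not the published algorithm): the same layer weights are mapped onto several redundant hardware copies and their outputs are averaged, so the layer's effective output tracks the software result more closely. The sketch reuses the hypothetical noisy_vmm() from the example above.

    # Illustrative sketch of layer-level ensemble averaging (assumed setup):
    # each copy suffers its own device variation, stuck cells, and cycle noise,
    # and averaging suppresses the uncorrelated errors.
    def ensemble_layer_output(x, W, n_copies=4):
        outputs = [noisy_vmm(x, W) for _ in range(n_copies)]
        return np.mean(outputs, axis=0)

    single = noisy_vmm(x, W)
    ensemble = ensemble_layer_output(x, W, n_copies=4)
    err = lambda y: np.linalg.norm(y - ideal) / np.linalg.norm(ideal)
    print("single-copy error:   ", err(single))
    print("4-copy ensemble error:", err(ensemble))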
Applications:
Advantages: