In a study published this week in the journal Nature Communications, researchers at IBM’s lab in Zurich, Switzerland claim to have developed a technique that achieves both energy efficiency and high accuracy on machine learning workloads using phase-change memory.

IBM’s latest work in the domain follows the introduction of the company’s phase-change memory chip for AI training. In parallel, the team experimented with training machine learning models using analog phase-change memory components.

The results suggest the approach is successful — training a ResNet model with noise achieved 93.7% accuracy on the popular CIFAR-10 data set and 71.6% top-1 accuracy on ImageNet after mapping the trained weights (i.e., the parameters that transform input data) to phase-change memory components. Moreover, after the weights of a particular model were mapped onto 723,444 phase-change memory devices in a prototype chip, accuracy stayed above 92.6% over the course of a single day.
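To give a sense of why mapping trained weights onto analog memory is challenging, here is a minimal, purely illustrative sketch — not IBM’s actual procedure. It simulates programming weights into imprecise analog devices by perturbing each stored value with small magnitude-proportional noise (the function names and the 2% noise level are assumptions for illustration), then compares the noisy dot product against the ideal one:

```python
import random

def add_device_noise(weights, rel_sigma=0.02, seed=0):
    """Simulate storing weights in analog memory: each programmed value
    is perturbed by Gaussian noise proportional to its magnitude.
    The 2% relative sigma is an illustrative assumption."""
    rng = random.Random(seed)
    return [w + rng.gauss(0.0, rel_sigma * abs(w)) for w in weights]

def dot(weights, inputs):
    """Ideal multiply-accumulate, as an analog crossbar would compute it."""
    return sum(w * x for w, x in zip(weights, inputs))

# Toy trained weights and one input vector (hypothetical values).
weights = [0.5, -1.2, 0.8, 0.1]
inputs = [1.0, 0.5, -0.3, 2.0]

ideal = dot(weights, inputs)
noisy = dot(add_device_noise(weights), inputs)
deviation = abs(ideal - noisy)
print(deviation)
```

Training with noise injected into the weights, as the researchers did, encourages the model to tolerate exactly this kind of per-device perturbation, which is why accuracy survives the transfer to physical phase-change memory.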