IBM Engineers Develop Energy-Efficient AI Chip with Phase-Change Memory

Nicholas August 23, 2023 10:37 PM

IBM has made significant strides in the development of an energy-efficient, analog AI processor. The team has successfully built a chip using phase-change memory, which has the potential to considerably reduce the energy footprint of large language models.

AI models' growing power consumption

Artificial intelligence models, particularly large ones such as ChatGPT, rely on billions of computational nodes and the connections between them. Running them consumes a considerable amount of energy, much of it spent shuttling data back and forth between processors and memory. Given the pace of AI advances and the ever-growing size of models, this power consumption problem is expected to intensify.

Innovative designs to curb energy use

In a bid to mitigate this energy problem, IBM and Intel have both explored alternative chip designs. One approach gives each artificial neuron all the memory it needs to operate locally, combining memory and processing and eliminating many of the round trips to external memory. Another goes further and performs the operations inside the memory itself, consolidating the work and cutting energy use even more.
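
As a rough illustration of why that matters, the toy Python sketch below simply counts how many weight values would have to move in each scheme; the layer size, input count, and the idea of tallying transfers are illustrative assumptions, not a description of IBM's or Intel's actual hardware.

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.standard_normal((256, 256))   # one layer's connection weights
inputs = rng.standard_normal((1000, 256))   # a stream of 1,000 input vectors

# The arithmetic is the same either way: a matrix-vector product per input.
outputs = inputs @ weights.T

# Conventional layout: every input triggers a fresh transfer of the full weight
# matrix from memory to the processor.
conventional_transfers = weights.size * len(inputs)

# Compute-in-memory layout: the weights are written into the memory array once
# and stay there, so only inputs and outputs move afterwards.
in_memory_transfers = weights.size

print(f"conventional weight transfers: {conventional_transfers:,}")  # 65,536,000
print(f"in-memory weight transfers:    {in_memory_transfers:,}")     # 65,536
```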

Phase-change memory chips for energy efficiency

IBM has taken this approach a step further. The company recently revealed a phase-change memory chip that can perform speech recognition tasks with a reasonable degree of accuracy while maintaining a significantly lower energy footprint than conventional hardware. The chip is much closer to a functional AI processor than previous prototypes.

Phase-change memory, a technology that has been in development for some time, offers a natural fit for neural network hardware. A cell is written by heating a small patch of material and controlling how quickly it cools: slow cooling lets the material settle into an ordered, low-resistance crystal, while rapid cooling freezes it into a disordered, high-resistance state. Either state stores a bit that persists until enough voltage is applied to switch it. This behavior maps well onto neural networks, where each node receives an input and, based on its state, determines how much signal to pass on to other nodes.
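
To make that behavior concrete, here is a toy Python model of a single cell; the resistance values and switching threshold are invented for illustration and are not figures from IBM's device.

```python
class PhaseChangeCell:
    R_CRYSTALLINE = 1e4   # ohms: ordered, low-resistance state
    R_AMORPHOUS = 1e6     # ohms: disordered, high-resistance state
    SWITCH_VOLTAGE = 1.5  # volts needed to re-melt the material (illustrative)

    def __init__(self):
        self.resistance = self.R_AMORPHOUS  # assume the cell starts disordered

    def program(self, voltage, slow_cool):
        """Apply a write pulse; below the threshold the state is untouched."""
        if voltage < self.SWITCH_VOLTAGE:
            return  # the stored bit is non-volatile under small voltages
        # Slow cooling lets an ordered crystal form; fast cooling freezes disorder.
        self.resistance = self.R_CRYSTALLINE if slow_cool else self.R_AMORPHOUS

    def read(self, read_voltage=0.1):
        """Read with a small voltage that cannot disturb the state (I = V / R)."""
        return read_voltage / self.resistance


cell = PhaseChangeCell()
cell.program(voltage=2.0, slow_cool=True)   # write the low-resistance state
print(cell.read())                          # larger current reads back as a "1"
```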

An interesting facet of phase-change memory is how well it can represent the strength of the connections between nodes in a neural network. Rather than storing a single bit, a cell can be operated in an analog mode: its resistance can be set anywhere between the fully on and fully off states. That smooth gradient of possible values can encode a connection weight directly, so a neural network node's behavior can be reproduced simply by passing current through a bit of phase-change memory.
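
A minimal NumPy sketch of that idea, with made-up conductance values, looks like this: Ohm's law performs each multiplication, and the currents flowing into a shared line add up, which is exactly the weighted sum a neural network node needs.

```python
import numpy as np

G_OFF, G_ON = 1e-6, 1e-4                  # siemens: high- and low-resistance limits
rng = np.random.default_rng(1)

# Each connection strength is stored as a conductance between the off and on
# limits. (Real designs often pair two cells per weight to handle signed values.)
conductances = rng.uniform(G_OFF, G_ON, size=(4, 3))

# Node inputs arrive as voltages applied along the rows of the memory array.
voltages = np.array([0.2, 0.5, 0.1, 0.3])

# Ohm's law does each multiplication (I = G * V), and the currents on a column
# simply add, so the whole weighted sum is computed by passing current through
# the memory cells themselves.
column_currents = voltages @ conductances  # one weighted sum per output node
print(column_currents)
```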

IBM's significant step toward functional AI processors

IBM's latest creation is more than a promising prototype. It contains all the hardware needed to connect individual nodes, a significant step toward handling large language models. As such, it points to a potential answer to AI's growing energy consumption and marks a meaningful milestone on the path to more energy-efficient AI technology.
