The Nervana Neural Network Processor (NNP) is expected to help companies develop new classes of AI applications and to go on sale before the end of the year. Credit: Intel

There are a number of efforts involving artificial intelligence (AI) and neural network-oriented processors under way at vendors such as IBM, Qualcomm and Google. Now you can add Intel to that list. The company has formally introduced the Nervana Neural Network Processor (NNP) for AI projects and tasks.

This isn't a new Intel design. The chips come out of Intel's $400 million acquisition of the deep learning startup Nervana Systems last year. After the acquisition, Nervana CEO Naveen Rao was put in charge of Intel's AI products group.

"The Intel Nervana NNP is a purpose-built architecture for deep learning," Rao said in a blog post formally announcing the chip. "The goal of this new architecture is to provide the needed flexibility to support all deep learning primitives while making core hardware components as efficient as possible."

He added: "We designed the Intel Nervana NNP to free us from the limitations imposed by existing hardware, which wasn't explicitly designed for AI."

That's an interesting statement, since Rao could be referring to the x86 architecture or to GPUs; Nvidia's CEO, after all, has never been shy about sniping at x86.

Rao didn't go into design specifics, saying only that the NNP does not have the standard cache hierarchy of an x86 chip and that its on-chip memory is managed directly by software (see the conceptual sketch at the end of this article). He also said the chip was designed with high-speed on- and off-chip interconnects, enabling "massive bi-directional data transfer."

Using self-learning chips to develop AI applications

Intel CEO Brian Krzanich had his own blog post on the subject. "Using Intel Nervana technology, companies will be able to develop entirely new classes of AI applications that maximize the amount of data processed and enable customers to find greater insights — transforming their businesses," he wrote.

Krzanich also revealed that Facebook was involved in the design of the processor, although he did not elaborate beyond saying Facebook worked with Intel "in close collaboration, sharing its technical insights."

Now, why would Facebook care? Because one of the potential uses for the NNP, as described by Krzanich, is in social media, to "deliver a more personalized experience to their customers and offer more targeted reach to their advertisers."

Neuromorphic chips are inspired by the human brain and designed to be self-learning: as they perform a task, they get better at it and find new ways of executing it. A recent documentary on the Japanese news channel NHK World illustrated this with an AI application that practiced millions of games of shogi, a Japanese chess-like board game, and came up with strategies that had not been programmed into it, flabbergasting both its developer and the human player it roundly trounced. It thought for itself.

All of that reminds me of Ian Malcolm's comment in Jurassic Park: "Your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should."
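To make Rao's point about software-managed on-chip memory more concrete, here is a minimal C sketch of the general idea behind a software-managed scratchpad. It is purely illustrative and assumes nothing about Intel's actual NNP hardware, tools, or programming model: the program, rather than cache hardware, decides which tile of data occupies the fast on-chip buffer at any moment.

/*
 * Conceptual sketch only: NOT Intel's NNP programming model.
 * On a cache-based CPU, hardware decides what stays on chip.
 * With a software-managed scratchpad, the program explicitly
 * copies each tile of data on chip, works on it, and copies
 * the result back out.
 */
#include <stdio.h>
#include <string.h>

#define TILE 256                    /* hypothetical on-chip buffer size */

static float scratchpad[TILE];      /* stands in for fast on-chip memory */

void scale_array(float *data, long n, float factor)
{
    for (long i = 0; i < n; i += TILE) {
        long len = (n - i < TILE) ? (n - i) : TILE;

        /* explicit transfer from off-chip memory into the scratchpad */
        memcpy(scratchpad, &data[i], len * sizeof(float));

        /* compute entirely out of the local buffer */
        for (long j = 0; j < len; j++)
            scratchpad[j] *= factor;

        /* explicit transfer back to off-chip memory */
        memcpy(&data[i], scratchpad, len * sizeof(float));
    }
}

int main(void)
{
    float v[1000];
    for (int i = 0; i < 1000; i++)
        v[i] = (float)i;

    scale_array(v, 1000, 2.0f);
    printf("%f %f\n", v[0], v[999]);   /* prints 0.000000 1998.000000 */
    return 0;
}

On real accelerators of this kind, the explicit copies would typically be DMA transfers overlapped with computation, but the basic contract is the same: software, not a cache, decides what lives in on-chip memory.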