Dissatisfied with chips on the market, Google makes a video-transcoding chip to send better-quality YouTube videos.

You know Google has more money than it could ever spend when it invests in a custom chip to do one task. And now it has done so for the third time. The search giant has developed a new chip and deployed it in its data centers to compress video content. The chips, called Video (Trans)Coding Units, or VCUs, do that faster and more efficiently than traditional CPUs.

In a blog post discussing the project, Jeff Calow, a lead software engineer at Google, said the VCU delivers the highest YouTube video quality possible on your device while consuming less bandwidth than before.

“An important thing to understand is that video is created and uploaded in a single format, but will ultimately be consumed on different devices—from your phone to your TV—at different resolutions,” he wrote. Some viewers will be streaming to a 4K TV at home, others watching on their phones. The infrastructure team’s job is to get those videos ready to watch by sending the smallest amount of data to your chosen device at the highest possible quality.

“But it’s costly and slow, and doing that processing using regular computer ‘brains’ (called CPUs) is pretty inefficient, especially as you add more and more videos,” he wrote.

Google claims the VCU is 20 to 33 times more compute-efficient than its previous optimized system, which ran on traditional x86 servers.

The project has been in the works since 2015. YouTube saw that consumers wanted higher-quality video but had to shift to more efficient video codecs to deliver it. The VP9 codec fit the bill, but it requires five times more compute resources than the older H.264 codec, Calow said. The VCU supports both of them, and the next-generation VCU will support AV1, an even more efficient codec than VP9.

“A dedicated, hard-wired processor is always going to be the fastest. Transcoding is one of those operations that doesn’t change much, so a programmable device isn’t needed,” said Jon Peddie, president of Jon Peddie Research, who follows the graphics market.

But he doubts developers will be able to use it the way they use Google’s Tensor Processing Unit (TPU) for AI. “I don’t know for sure, but I’d say it’s made by YouTube for YouTube and only YouTube. The support and documentation issues would not be worth it to them, and it would only arm a competitor,” he said.

So for now, enjoy those pretty 4K YouTube videos. And for those keeping track, the VCU is the third custom data center chip Google has designed. Before it came the TPU for AI workloads and the Titan chip for security.
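For a concrete sense of the work the VCU offloads, below is a minimal sketch of that one-source, many-renditions transcoding step run in software on a CPU with the open-source ffmpeg tool. It is an illustration only: the file names, resolution ladder, and bitrates are assumed example values, and it does not represent Google's actual pipeline or YouTube's encoder settings.

```python
# Sketch of the per-upload transcoding "ladder" described in the article:
# one uploaded source file goes in, several resolution renditions come out.
# Uses the stock ffmpeg CLI with the software VP9 encoder (libvpx-vp9).
# NOTE: the renditions and bitrates below are illustrative assumptions,
# not YouTube's real encoding settings.
import subprocess
from pathlib import Path

# (output height in pixels, target video bitrate) -- assumed example ladder
RENDITIONS = [(2160, "12M"), (1080, "4M"), (720, "1.8M"), (360, "600k")]

def transcode_vp9(source: Path, out_dir: Path) -> None:
    """Encode one uploaded file into several VP9/Opus renditions."""
    out_dir.mkdir(parents=True, exist_ok=True)
    for height, bitrate in RENDITIONS:
        target = out_dir / f"{source.stem}_{height}p.webm"
        cmd = [
            "ffmpeg", "-y", "-i", str(source),
            "-vf", f"scale=-2:{height}",      # keep aspect ratio, set height
            "-c:v", "libvpx-vp9", "-b:v", bitrate,
            "-c:a", "libopus", "-b:a", "128k",
            str(target),
        ]
        subprocess.run(cmd, check=True)

if __name__ == "__main__":
    # Hypothetical input/output paths for the example
    transcode_vp9(Path("upload.mp4"), Path("renditions"))
```

Running every rendition through a software VP9 encode like this is exactly the CPU-heavy step Calow calls costly and slow; the point of the VCU is to do that encoding in fixed-function hardware instead.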