PCIe continues to double its data rate, targeting 800G Ethernet and cloud computing.

The latest PCI Express (PCIe) specification again doubles the data rate over the previous spec. PCI Express 7.0 calls for a data rate of 128 gigatransfers per second (GT/s) and up to 512 GB/s bi-directionally via an x16 slot (not every PCI Express slot in a PC or server uses 16 transfer lanes), according to PCI-SIG, the industry group that maintains and develops the specification.

The slower previous spec, PCI Express 6.0, has yet to come to market, and doubling the rate with each version has become the norm. Double the bandwidth means more data can be fed to demanding tasks like AI, machine learning, and high-performance computing applications. The PCI-SIG said PCIe 7.0 is targeted at 800G Ethernet, AI/ML, cloud, and quantum computing, and data-intensive uses like hyperscale data centers.

PCIe 7.0 also features Pulse Amplitude Modulation with 4 levels (PAM4) signaling. The PCI-SIG added PAM4 in the PCIe 6.0 spec as a replacement for non-return-to-zero (NRZ) signaling. NRZ encoding has only two amplitude levels per clock cycle, while PAM4 has four, so PCIe 6 and 7 double the amount of data encoded in each cycle.

PCIe was created to replace AGP video card slots, and now every GPU card is PCIe-based. But more recent upgrades and revisions have been aimed at getting maximum performance out of solid-state drives (SSDs). With throughput doubling every two to three years, PCIe remains the interface of choice for keeping CPU cores fed with data from SSDs.

PCIe 6 products are expected to start shipping this year, while those supporting PCIe 7 aren't expected until 2025. The PCI-SIG comprises over 900 member companies and has delivered regular updates with predictable throughput increases each time. That's a better track record than other groups, such as those behind the SATA-drive or USB interfaces, which have languished, fallen behind, and become confusing with their numerous point revisions.
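
For readers who want to check the headline numbers, here is a minimal Python sketch of the arithmetic behind them. It uses the simplified rule of thumb that 1 GT/s carries roughly 1 Gb/s per lane (ignoring FLIT, FEC, and other protocol overhead), and that a PAM4 symbol carries log2(4) = 2 bits versus NRZ's 1. The function names are illustrative only, not part of any PCI-SIG tooling.

```python
from math import log2

def bits_per_symbol(levels: int) -> int:
    """Bits encoded per symbol for a given number of amplitude levels."""
    return int(log2(levels))

def x16_bandwidth_gbs(gt_per_s: float, lanes: int = 16) -> tuple[float, float]:
    """Rough per-direction and bidirectional bandwidth in GB/s for a PCIe link.

    Assumes ~1 bit per transfer per lane and ignores FLIT/FEC/protocol overhead.
    """
    per_direction = gt_per_s * lanes / 8      # Gb/s -> GB/s
    return per_direction, per_direction * 2   # bidirectional total

# NRZ (2 levels) vs PAM4 (4 levels): 1 bit vs 2 bits per symbol
print(bits_per_symbol(2), bits_per_symbol(4))   # 1 2

# PCIe 7.0 at 128 GT/s on an x16 slot
per_dir, bidir = x16_bandwidth_gbs(128)
print(per_dir, bidir)                           # 256.0 512.0 (GB/s)
```

Real-world deliverable throughput will be somewhat lower once encoding and protocol overhead are accounted for, but the back-of-the-envelope figures match the 512 GB/s bidirectional number PCI-SIG quotes for an x16 link.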