In February, deep within a warehouse at CERN, home of the Large Hadron Collider (LHC) – the world’s largest scientific experiment – two network engineers eagerly pressed a button. Instantly, text flashed on a screen before them, confirming success. “There was high-fiving involved,” recalls Joachim Opdenakker of SURF, a Dutch IT association serving educational and research institutions. “It was super-cool to see.”
Opdenakker and his colleague Edwin Verheul had established a new data link between the LHC in Switzerland and data storage sites in the Netherlands, achieving speeds of 800 gigabits per second (Gbps) – over 11,000 times the average UK home broadband speed. This link is designed to enhance scientists’ access to LHC experiment results.
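As a rough back-of-the-envelope check – a minimal sketch, assuming an average UK home broadband speed of around 70 megabits per second, a baseline figure not stated here – the comparison works out roughly as follows:

    # Sketch: compare the 800 Gbps link with an assumed average UK home broadband speed.
    link_bps = 800e9   # 800 gigabits per second
    home_bps = 70e6    # assumed ~70 megabits per second for an average UK home connection
    print(f"{link_bps / home_bps:,.0f} times faster")  # roughly 11,400, i.e. "over 11,000 times"

The same arithmetic, with different headline speeds, underlies the other broadband comparisons quoted later in this piece.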
A subsequent test in March, using specialized equipment from Nokia, confirmed the desired speeds were attainable. “This transponder that Nokia uses, it’s like a celebrity,” says Verheul, noting the high demand for the equipment. “We had limited time to do tests. If you have to postpone a week, then the transponder is gone.”
While this bandwidth approaches one terabit per second, it is not the fastest; some subsea cables, which bundle many fiber strands, offer total capacities several hundred times greater.
In laboratories worldwide, networking experts are developing fiber optic systems capable of transmitting data at astonishing speeds – many petabits per second (Pbps), up to 300 million times faster than the average UK home broadband connection. The potential applications for such immense bandwidth are still being explored, but engineers are focused on pushing these limits even further.
The duplex cable from CERN to data centers in the Netherlands spans nearly 1,650 kilometers (1,025 miles), running from Geneva via Paris and Brussels to Amsterdam. Achieving 800 Gbps over that distance means keeping the light pulses at sufficient power for the whole journey, so the signal has to be amplified at various points along the route, Opdenakker explains.
Collisions of subatomic particles at the LHC generate roughly one petabyte of data per second, enough to fill about 220,000 DVDs. While this data is condensed for storage and analysis, significant bandwidth is still necessary. And with an upgrade expected by 2029, the LHC anticipates producing even more data.
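The DVD comparison can be sanity-checked with simple arithmetic – a sketch assuming standard single-layer DVDs of 4.7 gigabytes each, a capacity the article does not specify:

    # Sketch: one petabyte per second expressed in single-layer DVDs (assumed 4.7 GB each).
    petabyte_bytes = 1e15
    dvd_bytes = 4.7e9   # assumed capacity of a single-layer DVD
    print(f"{petabyte_bytes / dvd_bytes:,.0f} DVDs per second")  # roughly 213,000, consistent with "about 220,000"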
“The upgrade increases the number of collisions by at least a factor of five,” says James Watt, senior vice president and general manager of optical networks at Nokia.
The current speed of 800 Gbps may soon seem slow. In November, researchers in Japan set a new world record for data transmission, achieving an astounding 22.9 Pbps. This bandwidth could provide every person on Earth, and then some, with a Netflix stream, according to Chigo Okonkwo of Eindhoven University of Technology, who was involved in the research.
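A quick estimate shows why the Netflix comparison holds up – a sketch assuming a world population of about 8 billion and a streaming bit rate of roughly 3 megabits per second, neither figure stated in the article:

    # Sketch: per-person bandwidth if 22.9 Pbps were shared across everyone on Earth.
    total_bps = 22.9e15   # 22.9 petabits per second
    population = 8e9      # assumed world population
    per_person_mbps = total_bps / population / 1e6
    print(f"{per_person_mbps:.1f} Mbps per person")  # about 2.9 Mbps, roughly one assumed ~3 Mbps stream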
This experiment involved sending a vast stream of pseudorandom data over 13 kilometers of coiled fiber optic cable in a lab. Dr. Okonkwo explains that the data’s integrity is analyzed after the transfer, to confirm it arrived quickly without accumulating excessive errors. The system used a fiber with 19 separate cores inside a single cable – a newer design, unlike the standard single-core fibers that connect many homes to the internet.
But replacing older fiber optic cables is costly and challenging, so extending their lifespan is attractive, according to Wladek Forysiak of Aston University in the UK. Recently, he and his team achieved speeds of approximately 402 terabits per second (Tbps) over a 50 km optical fiber with a single core – about 5.7 million times faster than the average UK home broadband connection.
“I think it’s a world best; we don’t know of any results that are better than that,” says Prof. Forysiak. The technique uses more wavelengths of light than usual to carry data, and relies on alternative electronic equipment at either end of the fiber to send and receive the signals – potentially easier to roll out than replacing thousands of kilometers of cable.
Martin Creaner, director general of the World Broadband Association, suggests that activities in the so-called metaverse might one day require such extreme bandwidth. His organization predicts that home broadband connections could reach up to 50 Gbps by 2030. However, reliability may be even more crucial than speed for certain applications. “For remote robotic surgery across 3,000 miles, you absolutely do not want any scenario where the network goes down,” Creaner emphasizes.
Dr. Okonkwo points out that training AI will increasingly require moving large datasets quickly, arguing that faster data transfer will be advantageous. Ian Phillips, who collaborates with Prof. Forysiak, adds that once bandwidth becomes available, uses for it soon follow: “Humanity finds a way of consuming it.”
Although data speeds of multiple petabits per second far exceed current web user needs, Lane Burdette, a research analyst at TeleGeography, notes the rapid growth in bandwidth demand. This demand is increasing at about 30% per year on transatlantic fiber optic cables. She highlights that content provision—such as social media, cloud services, and video streaming—now consumes much more bandwidth. “It used to be around 15% of international bandwidth in the early 2010s. Now it’s up to 75%. It’s absolutely massive,” she says.
In the UK, there is still work to do in bringing faster internet to the many people whose home broadband remains slow. According to Andrew Kernahan, head of public affairs at the Internet Service Providers Association, most home users can now access gigabit-per-second speeds, yet only about a third of broadband customers are opting for them. He points out that there isn’t currently a “killer app” that necessitates such high speeds, though this might change as more TV is consumed via the internet.
“There’s definitely a challenge to get the message out there and make people more aware of what they can do with the infrastructure,” Kernahan says.