Scale, scale, scale…
Put simply, the main theme of this year’s Datacloud Global Congress has been absolutely enormous numbers.
AI data centres will represent 75% of all data centre demand by 2030. Data centres will draw as much power as Japan by the same year. 600kW per rack is already on the horizon.
The atmosphere in Cannes has been that of an industry progressing at breakneck pace, with everybody trying to get a sense of what to expect in even six months’ time.
As CBRE’s Andrew Jay aptly summed up: “This is no longer about planning for the future. The future is now—and it’s coming in 100-megawatt increments.”
…but beware the rise of ‘Braggawatts’
We’ve all seen it on our LinkedIn feeds. An enormous new data centre project is triumphantly announced as being in the works. But reading the announcement more closely, it becomes clear a few things are missing – no power agreement in place, no dedicated secured connectivity, and often not even any planning permission from local authorities.
Chris Sharp, Digital Realty’s CTO, called this phenomenon ‘Braggawatts’ on stage in Cannes this year, and it has been a common theme in discussions – a rapidly growing industry like data centres is attracting lots of noise about new capacity, regardless of whether it will actually come to fruition.
The lesson? Take certain announcements with a pinch of salt.
Location is still important
AI reducing the importance of latency – and by extension, of where data centres are located – has been a well-covered topic over the past couple of years. But according to speakers on a panel looking at AI 12 months on, beware the conclusion that you can build a data centre wherever you want.
Inferencing workloads (which do not need to be near the user) do indeed account for a huge chunk of the processing power required for AI compute – up to 500MW, or half of the entire London data centre market. But companies and customers still want to be near major markets, and build strategies are being planned accordingly.
“We're betting on the traditional markets,” said Doug Adams, president of NTT Global Data Centres. “In the next four years a trillion dollars of data centres will be built, and most of that will be serviced from the major markets."
Virtus’s Neil Cresswell agreed, while highlighting the growth potential of markets adjacent to FLAP-D. “It’s not microsecond latency like in high-frequency trading, but clients want to be within 100–150km of a major metro. That’s driving growth in cities like Bristol, Birmingham, and Cambridge.”
The average person doesn’t know what a data centre is
This is scarcely believable to data centre professionals who spend eight-plus hours a day immersed in the digital infrastructure world. But as the marketing maxim goes, you are not the customer – and a session on the public perception of data centres laid bare some interesting statistics.
CyrusOne’s Emma Fryer delved into the findings of her company’s recent report on data centre perception across Europe to show the work that still needs to be done. In a survey of 13,000 people, only 7% of respondents had a negative perception of the data centre industry – far more positive than the general tone of media coverage of data centres’ impact would suggest.
But less than half of respondents could accurately define what a data centre was, even as part of a multiple-choice question (the UK got a special mention for having the lowest score here).
As Fryer put it, “people think the Internet runs on magic” – and we still have a long way to go as an industry to persuade people this is not the case.
The stranded asset risk is real
With such huge investment going into data centre provision, a large proportion of it is being spent on technology, servers, and electronics. This is the category most at risk of becoming obsolete – and of turning the entire facility in which it is housed into a stranded asset.
The issue of rapid component obsolescence was raised by Oracle’s Dan Madrigal: “We’re designing for 130kW racks today, but 600kW per rack is already on the horizon. By the time a campus is built, the technology may have shifted. Standardisation is nearly impossible when you’re moving at 90 miles an hour.”
Virtus’s Cresswell agreed, with a focus on the cooling side: “We said everything would be liquid cooled by 2027—well, that’s already happening.”
“The network is the bottleneck”
Or so it is, according to Vishnu Acharya, head of network infrastructure, Platform Engineering at Uber. So much of the conversation around data centre deployment has focused on power availability and chokepoints – but power is not the only limiting factor.
Fibre and inter-facility connectivity are coming up more and more as problems to tackle – in fact, connectivity is the main focus of the upcoming Datacloud USA & Metro Connect Fall event in September.
This is turning fibre – particularly dark fibre, which is open to multiple tenants and clients – into an attractive asset class, given that lead times and connectivity charges are both rising in many key data centre markets.

Stakeholder relationships make a huge difference
Getting a data centre built, powered and online is a giant task, and it can only happen when data centre developers and operators are on exactly the same wavelength as everyone else with an interest in the project’s success – power suppliers, local governments and policymakers, neighbouring communities, fibre providers, and so on.
This theme emerged in various contexts across the two days in Cannes. In southern Europe, for example, the trend of developers sealing power purchase agreements directly has led to some big data centre projects in recent months, particularly in Greece.
And on the legislation side, the Spanish region of Aragón has grown rapidly, partly at the expense of the capital Madrid, primarily thanks to favourable regulatory conditions. As Schneider Electric’s Thierry Chamayou put it: “You need support at every level — from land acquisition to power access — and local governments need to streamline processes.”
Southern Europe is building on its position
There is so much data centre capacity growth required that demand is flowing into every established market – and plenty in Southern Europe are feeling the benefit.
Many of the region’s hubs have benefited from their proximity to major cable landing points. And now that the AI wave is coming, they are building on this heritage to add megawattage at an unprecedented scale. “Five years ago, one megawatt in southern Europe was considered large,” said Ruben Garcia, Data Centre Sales Director at Rittal.
“Now we’re selling 30–50MW every few weeks. Madrid, Milan, Athens — these places are becoming real centres of gravity.”
Power lead times are long – so plan accordingly
The Datacloud Energy and ESG event in Brussels back in March 2025 extensively covered how data centres need to work closely with power suppliers and national grids to ensure the servers in their new facilities can stay on. This discussion continued in Cannes, with a particular focus on lead times – up to 10 years for new generation turbines, and even longer for transmission.
However, the onus is on the data centre world to adapt to this, rather than complain about it. This means prioritising power when it comes to site selection, looking to alternative fuels and sources for power, improving battery capability, and more – as well as working with grids to promote the benefits of data centres as stable power consumers and load balancers.
Data sovereignty comes with big costs
Many countries and jurisdictions now have some form of data sovereignty requirements – where data either about or from their citizens must be stored within the country’s borders.
But political will is butting up against economic reality. A panel dedicated to data sovereignty and the rise of private cloud highlighted various limiting factors for keeping data behind borders.
There are the practical aspects – data flows around the world, and even more so now that data from all over the place is being sucked into AI models. And there are the economic aspects – in-country solutions cost a lot of money, especially if real redundancy needs to be provided (which is almost always the case).
As Equinix’s CTO Justin Dustzadeh put it: “From a data sovereignty perspective, I always tell my clients: know your use case, know your considerations and know your trade-offs.”
Sustainability: mixed messages
Sustainability in data centre operations has been a massive focus area for so long that it’s almost taken as a given at industry events. But changes in the market – political, economic, power availability, and more – mean that the answer of how important sustainability is depends on who you ask.
On one hand, it is still a huge priority, for both regulatory and economic reasons. A panel on how regulation can support a sustainable and competitive European data centre industry included a range of views on this, with one speaker stating that sustainability is driven by demand and market dynamics, with regulation only becoming a factor in cases of market failure.
On the other, there is a belief that AI demand makes aiming for net zero extremely tough if enough compute power is to be provided. This, combined with onerous reporting regulations (particularly in the EU), has led to a gradual backslide in the topic’s prominence, with options such as gas-fired generators and turbines being spoken about as viable choices (particularly for reliable backup) in terms that would not have been used two or three years ago.
In conclusion, this is an area where the direction of travel is not immediately obvious – things could easily go either way in a year’s time.
The MW to GW shift is BIG
And to close, a useful illustration from QTS Data Centers’ Tag Greason about just how big an increase AI is going to bring. One million seconds (think of a megawatt) is around 11 and a half days – while one billion seconds, a gigawatt’s worth, is nearly 32 years. Something to bear in mind when thinking about the MW to GW shift…
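The arithmetic behind Greason’s analogy is easy to verify. A minimal sketch (illustrative only – the MW/GW comparison is a scale analogy about the factor of 1,000, not a unit conversion):

```python
# Sanity-checking the million-vs-billion-seconds illustration.

SECONDS_PER_DAY = 24 * 60 * 60                 # 86,400
SECONDS_PER_YEAR = 365.25 * SECONDS_PER_DAY    # ~31,557,600, averaging leap years

# One million seconds (a "megawatt" of seconds) in days:
million_seconds_days = 1_000_000 / SECONDS_PER_DAY        # ~11.6 days

# One billion seconds (a "gigawatt" of seconds) in years:
billion_seconds_years = 1_000_000_000 / SECONDS_PER_YEAR  # ~31.7 years

print(f"1 million seconds ≈ {million_seconds_days:.1f} days")
print(f"1 billion seconds ≈ {billion_seconds_years:.1f} years")
```

The same factor of 1,000 separating a MW from a GW turns a week-and-a-half into a generation – which is the scale jump the industry is now contemplating.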