{"id":5534,"date":"2026-04-15T04:08:22","date_gmt":"2026-04-15T04:08:22","guid":{"rendered":"https:\/\/www.europesays.com\/ai\/5534\/"},"modified":"2026-04-15T04:08:22","modified_gmt":"2026-04-15T04:08:22","slug":"how-neocloud-nscale-is-navigating-the-ai-infrastructure-boom","status":"publish","type":"post","link":"https:\/\/www.europesays.com\/ai\/5534\/","title":{"rendered":"How neocloud Nscale is navigating the AI infrastructure boom"},"content":{"rendered":"<p>AI is no longer a side bet for most enterprises\u2014it\u2019s quickly becoming core to how products are built, decisions are made, and work gets done. That shift is colliding with a hard physical reality: the compute behind modern AI runs hot, dense, power-hungry\u2014and remains scarce. It\u2019s pushing data center operators to deliver far more capacity, far faster, often with projects measured in the hundreds of megawatts and timelines that feel closer to months than years.<\/p>\n<p>Related insights<\/p>\n<p>The result is a new infrastructure arms race, with \u2018neoclouds\u2019 emerging alongside hyperscalers to deliver AI-first capacity. And that arms race is attracting massive amounts of capital. Data center investment has surged into the hundreds of billions of dollars annually, with PwC estimating that meeting capacity requirements will require about US$2 trillion in capex by 2030. The financing playbook is evolving in kind. New deal structures are emerging, blurring the lines between customer contracts, infrastructure finance, and hardware supply\u2014and raising new questions about risk sharing and resilience.<\/p>\n<p>Nidhi CHAPPELL has had a front-row seat to how these dynamics are reshaping strategy. She currently serves in the C-suite at Europe-based neocloud Nscale as Global President of AI Infrastructure, and previously held a senior AI infrastructure leadership role at Microsoft Azure. 
In this edited version of her interview with strategy+business, Chappell outlines what C-suite leaders often underestimate about scaling AI capacity\u2014including the operating-model and talent implications of running AI-dense infrastructure, and the growing importance of transparency in energy and water use. She also describes the shifts she sees shaping the next generation of facilities\u2014more modular, more instrumented, and more tightly orchestrated\u2014and what that means for executive decisions on capacity strategy, partner ecosystems, and the long-term trade-offs among speed, efficiency, and control.<\/p>\n<p>S+B: What strikes you most about this particular moment for the data center industry?<br \/>NIDHI CHAPPELL: The pace is unlike anything I\u2019ve seen before. In the early days of cloud, the demands were big, but they were still within the bounds of what existing data center designs could deliver. Now, we\u2019re being asked to build sites that can deliver hundreds of megawatts, support racks over 100 kW, and do it all in regions that prioritize renewable power. And do it in 12 months, not three years.<\/p>\n<p>We\u2019ve also seen the shift in density from 6 kW racks to more than 130 kW per rack in a very short space of time. That\u2019s a profound change. It impacts everything, from the power distribution system to the cooling topology to the physical structure of the data center itself. It\u2019s no longer possible to design around air cooling, as liquid and immersion cooling are now the baseline.<\/p>\n<p>From a strategy perspective, this has accelerated our move toward modular design. Traditional builds take too long and are often too rigid. By using prefabricated modules and digital twins, we can design around each graphics processing unit (GPU) generation\u2019s specific thermals and power draw before the kit even arrives. 
That allows us to deploy at speed, without sacrificing performance or sustainability.<\/p>\n<p>Finally, it\u2019s different in terms of who\u2019s at the table. AI infrastructure is now a board-level topic for banks, governments, universities, and industrial firms. That wasn\u2019t the case during the early days of cloud migration and digitization. The stakeholder mix is broader, and expectations are higher.<\/p>\n<p>S+B: How does Nscale\u2019s business model compare to those of traditional hyperscalers?<br \/>CHAPPELL: We\u2019re structured differently from a traditional hyperscaler. We\u2019re vertically integrated, which means we design, build, own, and operate our infrastructure, from the data centers themselves to the orchestration software stack running on top. This allows us to optimize for AI performance, not just generic cloud workloads. We build specifically for high-density, GPU-based systems from the ground up. The traditional hyperscaler model is largely based on general-purpose compute and long-term capital cycles built around multi-tenant availability zones.<\/p>\n<p>Performance efficiency, rapid deployment, and adaptability to new compute requirements are especially critical in AI, where hardware generations refresh every 12 to 18 months. Our approach keeps pace with that cadence and allows quick and efficient upgrades of our infrastructure.<\/p>\n<p>S+B: When cloud adoption accelerated, neoclouds emerged and then eventually many consolidated. Are there signs that this time will be different?<br \/>CHAPPELL: Nscale exists because the market has shifted. The previous generation of companies hasn\u2019t adapted to meet the needs of customers. 
Customers now expect predictable access to large, contiguous blocks of AI-ready capacity, delivered on clear timelines and run with consistent performance.<\/p>\n<p>The density, cooling requirements, and energy needs of AI workloads are rewriting the rulebook. They require a different design approach, different construction methods, and different operational disciplines. Essentially, we\u2019re building to an entirely new set of demands, and that results in a more defensible business model.<\/p>\n<p>Also, neoclouds differ significantly in how deep they go into the stack. For instance, some rent GPUs in co-location facilities and offer an API layer on top. Nscale takes full ownership from ground to cloud: we own the data centers, the software, and the hardware, which includes the power, cooling, networking, orchestration, and sustainability profile. That allows us to integrate things like closed-loop liquid cooling and digital twins into the facility architecture. Because we\u2019re architected this way, we can provide a sovereign solution so that our customers can run AI workloads under their own legal, operational, and security frameworks.<\/p>\n<p>S+B: Speaking of sovereignty, a recent survey we ran with industry executives showed data sovereignty was among their top concerns, second only to cost. Is this a trend you see among your customers?<br \/>CHAPPELL: Sovereignty is certainly becoming increasingly important across sectors, particularly in heavily regulated industries such as healthcare, finance, and government. We\u2019re building across the globe to provide sovereign AI solutions to countries that want the benefits and security of having compute located within their borders or within their regulatory ecosystem.<\/p>\n<p>S+B: There\u2019s talk of \u2018circular financing\u2019 in the AI data center boom. 
How do you view this trend, and what safeguards do you see for sustainable growth?<br \/>CHAPPELL: As demand grows for compute, it\u2019s natural to see new financing models develop around it. The capital intensity of AI infrastructure has brought in new kinds of investors. We\u2019re now seeing sovereign wealth funds, infrastructure private equity, and corporates from the chip ecosystem come to the table. It reflects just how strategic compute capacity has become.<\/p>\n<p>But it also means expectations around scale, timelines, and returns have tightened. To operate at this level, you need partners who understand the hardware refresh cycle and the realities of deployment in emerging regions. That\u2019s why vertical integration matters. It provides control over timelines and performance.<\/p>\n<p>S+B: As the number of data centers expands, what factors most worry the industry about keeping up with demand?<br \/>CHAPPELL: The big one is talent. We talk a lot about energy supply and cooling, but we don\u2019t talk enough about the people needed to build and operate these facilities. As you move into remote regions where renewable energy is available, specialist labor is harder to find.<\/p>\n<p>That\u2019s why we partner with local technical colleges and run apprenticeships to develop those skills where we have operations. But across the board, there\u2019s a need for more structured pathways into this sector. The complexity of AI infrastructure demands a workforce that understands everything from mechanical engineering to software orchestration, so knowledge sharing across disciplines is more crucial than ever before. The engineer running the liquid cooling system needs to understand the workload it\u2019s supporting. The technician doing GPU swaps should understand how model performance is affected by thermal stability. We need more cross-disciplinary training.<\/p>\n<p>What\u2019s also changing is how we operate. 
As the infrastructure becomes AI-native, operations have to become AI-native too. That means building systems in which people are augmented by automation so teams can focus on high-value and complex tasks with greater precision.<\/p>\n<p>S+B: Regarding concerns over energy and water usage, what\u2019s your message to those who are kept up at night by them?<br \/>CHAPPELL: Transparency is critical, and we need to get better as an industry at publishing real metrics, such as megawatts used, percentage from renewable sources, and efficiency benchmarks like power usage effectiveness (PUE). At Nscale, we\u2019re operating at around a 1.1 PUE, which is among the most efficient operations I\u2019ve seen in my career. We\u2019re also designing systems that capture and reuse waste heat. For example, in Glomfjord, Norway, waste heat goes directly into local aquaculture.<\/p>\n<p>The key is to design infrastructure to be efficient by default, and to use natural cooling where possible. At Glomfjord, we\u2019ve been able to eliminate diesel generator emissions and instead draw on the reliability of Norway\u2019s renewable grid and robust systems to maintain uptime. The technology to build efficiently exists; it now needs to be prioritized.<\/p>\n<p>S+B: What are some recent innovations in data center technology or operations that excite you most but that people may not be aware of?<br \/>CHAPPELL: Digital twin technology, which I mentioned earlier, is hugely valuable in enabling us to simulate the entire site\u2014power, cooling, compute\u2014before it\u2019s built. That means we can test different hardware configurations, spot thermal bottlenecks, and validate airflow or coolant routing months before deployment. 
It saves time and cuts out the guesswork.<\/p>\n<p>S+B: If you were to design the \u2018data center of the future,\u2019 what would it look like and why?<br \/>CHAPPELL: It would be fully modular, highly prefabricated, and designed for continuous operation through AI-native refresh cycles. That means swappable blocks of compute, power, and cooling that can be replaced independently.<\/p>\n<p>It would operate like a next-generation intelligence factory: producing tokens, inference results, and model training runs continuously. It would have integrated crane systems, looped liquid cooling with isolation valves, and AI-led orchestration systems that optimize energy use and performance in real time. And most importantly, it would sit close to abundant renewable energy.<\/p>\n<p>Data centers won\u2019t exist in isolation\u2014they\u2019ll be part of a telco AI fabric. A distributed layer of edge AI nodes will sit inside telco networks to deliver ultra-low latency intelligence where it\u2019s needed.<\/p>\n<p>Author profile:<\/p>\n<p>David De Lallo is a contributing editor for PwC and s+b.<\/p>\n<p>Topics: <a href=\"https:\/\/www.strategy-business.com\/tag\/energy\" rel=\"nofollow noopener\" target=\"_blank\">energy<\/a>, <a href=\"https:\/\/www.strategy-business.com\/tag\/global+expansion\" rel=\"nofollow noopener\" target=\"_blank\">global expansion<\/a>, <a href=\"https:\/\/www.strategy-business.com\/tag\/infrastructure\" rel=\"nofollow noopener\" target=\"_blank\">infrastructure<\/a>, <a href=\"https:\/\/www.strategy-business.com\/tag\/tech+sector\" rel=\"nofollow noopener\" target=\"_blank\">tech sector<\/a>, <a href=\"https:\/\/www.strategy-business.com\/tag\/telecommunications\" rel=\"nofollow noopener\" target=\"_blank\">telecommunications<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"AI is no longer a side bet for most enterprises\u2014it\u2019s quickly becoming core to how products are 
built,&hellip;\n","protected":false},"author":2,"featured_media":5535,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[2],"tags":[24,25,1066,4873,205,3972,4872],"class_list":{"0":"post-5534","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-ai","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-energy","11":"tag-global-expansion","12":"tag-infrastructure","13":"tag-tech-sector","14":"tag-telecom"},"_links":{"self":[{"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/posts\/5534","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/comments?post=5534"}],"version-history":[{"count":0,"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/posts\/5534\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/media\/5535"}],"wp:attachment":[{"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/media?parent=5534"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/categories?post=5534"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/tags?post=5534"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}