{"id":204056,"date":"2025-06-22T02:39:17","date_gmt":"2025-06-22T02:39:17","guid":{"rendered":"https:\/\/www.europesays.com\/uk\/204056\/"},"modified":"2025-06-22T02:39:17","modified_gmt":"2025-06-22T02:39:17","slug":"how-a-i-is-changing-the-way-the-world-builds-computers","status":"publish","type":"post","link":"https:\/\/www.europesays.com\/uk\/204056\/","title":{"rendered":"How A.I. Is Changing the Way the World Builds Computers"},"content":{"rendered":"\n<p class=\"g-text  svelte-wbgwfj\">This is the most fundamental change to computing since the early days of the World Wide Web. Just as companies completely rebuilt their computer systems to accommodate the new commercial internet in the 1990s, they are now rebuilding from the bottom up \u2014 from tiny components to the way that computers are housed and powered \u2014 to accommodate artificial intelligence. <\/p>\n<p class=\"g-text  svelte-wbgwfj\">Big tech companies have constructed computer data centers all over the world for two decades. The centers have been packed with computers to handle the online traffic flooding into the companies\u2019 internet services, including search engines, email applications and e-commerce sites. <\/p>\n<p class=\"g-text  svelte-wbgwfj\">But those facilities were lightweights compared with what\u2019s coming. Back in 2006, Google opened its <a href=\"https:\/\/www.nytimes.com\/2006\/06\/14\/technology\/14search.html\" target=\"_blank\" rel=\"noopener\">first data center<\/a> in The Dalles, Ore., spending an estimated $600 million to complete the facility. In January, OpenAI and several partners <a href=\"https:\/\/www.nytimes.com\/2025\/02\/08\/technology\/sam-altman-elon-musk-trump.html\" target=\"_blank\" rel=\"noopener\">announced a plan<\/a> to spend roughly $100 billion on new data centers, beginning with a campus in Texas. They plan to eventually pump an additional $400 billion into this and other facilities across the United States. 
<\/p>\n<p class=\"g-text  svelte-wbgwfj\">The change in computing is reshaping not just technology but also finance, energy and communities. Private equity firms are plowing money into data center companies. <a href=\"https:\/\/www.nytimes.com\/2024\/12\/25\/technology\/ai-data-centers-electricians.html\" target=\"_blank\" rel=\"noopener\">Electricians are flocking to areas<\/a> where the facilities are being erected. And in some places, locals are <a href=\"https:\/\/www.nytimes.com\/2024\/10\/29\/technology\/data-center-peculiar-missouri.html\" target=\"_blank\" rel=\"noopener\">pushing back against the projects<\/a>, worried that they will bring more harm than good. <\/p>\n<p class=\"g-text  svelte-wbgwfj\">For now, tech companies are asking for more computing power and more electricity than the world can provide. OpenAI hopes to raise hundreds of billions of dollars to <a href=\"https:\/\/www.nytimes.com\/2024\/09\/25\/business\/openai-plan-electricity.html\" target=\"_blank\" rel=\"noopener\">construct computer chip factories<\/a> in the Middle East. Google and Amazon recently struck deals to build and deploy <a href=\"https:\/\/www.nytimes.com\/2024\/10\/16\/business\/energy-environment\/amazon-google-microsoft-nuclear-energy.html\" target=\"_blank\" rel=\"noopener\">a new generation of nuclear reactors<\/a>. And they want to do it fast. <\/p>\n<p class=\"g-caption svelte-cu2gla\">Google\u2019s A.I. chips on a circuit board. The company needs thousands of these chips to build its chatbots and other A.I. technologies.<\/p>\n<p class=\"g-credit svelte-cu2gla\">Christie Hemm Klok for The New York Times<\/p>\n<p class=\"g-text  svelte-wbgwfj\">The bigger-is-better mantra was challenged in December when a tiny Chinese company, DeepSeek, said it had built one of the world\u2019s <a href=\"https:\/\/www.nytimes.com\/2025\/01\/23\/technology\/deepseek-china-ai-chips.html\" target=\"_blank\" rel=\"noopener\">most powerful A.I. 
systems<\/a> using <a href=\"https:\/\/www.nytimes.com\/2025\/01\/27\/technology\/what-is-deepseek-china-ai.html\" target=\"_blank\" rel=\"noopener\">far fewer computer chips<\/a> than many experts thought possible. That raised questions about Silicon Valley\u2019s frantic spending. <\/p>\n<p class=\"g-text  svelte-wbgwfj\">U.S. tech giants were unfazed. The wildly ambitious goal of many of these companies is to create artificial general intelligence, or A.G.I. \u2014 a machine that can do anything the human brain can do \u2014 and they still believe that having more computing power is essential to get there. <\/p>\n<p class=\"g-text  svelte-wbgwfj\">Amazon, Meta, Microsoft, and Google\u2019s parent company, Alphabet, <a href=\"https:\/\/www.nytimes.com\/2025\/02\/08\/technology\/deepseek-data-centers-ai.html\" target=\"_blank\" rel=\"noopener\">recently indicated<\/a> that their capital spending \u2014 which is primarily used to build data centers \u2014 could top a combined $320 billion this year. That\u2019s more than twice what they spent two years ago. <\/p>\n<p class=\"g-text  svelte-wbgwfj\">The New York Times visited five new data center campuses in California, Utah, Texas and Oklahoma and spoke with more than 50 executives, engineers, entrepreneurs and electricians to tell the story of the tech industry\u2019s insatiable hunger for this new kind of computing. <\/p>\n<p class=\"g-text  svelte-wbgwfj\">\u201cWhat was probably going to happen over the next decade has been compressed into a period of just two years,\u201d Sundar Pichai, Google\u2019s chief executive, said in an interview with The Times. \u201cA.I. is the accelerant.\u201d <\/p>\n<p>New computer chips for new A.I. <\/p>\n<p class=\"g-text  svelte-wbgwfj\">The giant leap forward in computing for A.I. was driven by a tiny ingredient: the specialized computer chips called graphics processing units, or GPUs. 
<\/p>\n<p class=\"g-text  svelte-wbgwfj\">Companies like the Silicon Valley chipmaker Nvidia originally designed these chips to render graphics for video games. But GPUs had a knack for running the math that powers what are known as neural networks, which can <a href=\"https:\/\/www.nytimes.com\/2017\/11\/28\/technology\/artificial-intelligence-research-toronto.html\" target=\"_blank\" rel=\"noopener\">learn skills<\/a> by analyzing large amounts of data. Neural networks are the basis of chatbots and other leading A.I. technologies. <\/p>\n<p>How A.I. Models Are Trained <\/p>\n<p class=\"g-leadin svelte-1so50ue\">By analyzing massive datasets, algorithms can learn to distinguish between images, in what&#8217;s called machine learning. The example below demonstrates the training process of an A.I. model to identify an image of a flower based on existing flower images.<\/p>\n<p class=\"g-source svelte-cu2gla\">Sources: IBM and Cloudflare<\/p>\n<p class=\"g-credit svelte-cu2gla\">The New York Times<\/p>\n<p class=\"g-text  svelte-wbgwfj\">In the past, computing largely relied on chips called central processing units, or CPUs. These could do many things, including the simple math that powers neural networks. <\/p>\n<p class=\"g-text  svelte-wbgwfj\">But GPUs can do this math faster \u2014 a lot faster. At any given moment, a traditional chip can do a single calculation. In that same moment, a GPU can do thousands. Computer scientists call this parallel processing. And it means neural networks can analyze more data. <\/p>\n<p class=\"g-text  svelte-wbgwfj\">\u201cThese are very different from chips used to just serve up a web page,\u201d said Vipul Ved Prakash, the chief executive of Together AI, a tech consultancy. \u201cThey run millions of calculations as a way for machines to \u2018think\u2019 about a problem.\u201d <\/p>\n<p class=\"g-text  svelte-wbgwfj\">So tech companies started using increasingly large numbers of GPUs to build increasingly powerful A.I. 
technologies. <\/p>\n<p>Difference between CPU and GPU-powered computers  <\/p>\n<p class=\"g-source svelte-cu2gla\">Sources: Nvidia, IBM and Cloudflare<\/p>\n<p class=\"g-credit svelte-cu2gla\">The New York Times<\/p>\n<p class=\"g-text  svelte-wbgwfj\">Along the way, Nvidia rebuilt its GPUs specifically for A.I., packing more transistors into each chip to run even more calculations with each passing second. In 2013, <a href=\"https:\/\/www.nytimes.com\/2017\/09\/16\/technology\/chips-off-the-old-block-computers-are-taking-design-cues-from-human-brains.html\" target=\"_blank\" rel=\"noopener\">Google began building its own A.I. chips<\/a>. <\/p>\n<p class=\"g-text  svelte-wbgwfj\">These Google and Nvidia chips were not designed to run computer operating systems and could not handle the various functions for operating a Windows laptop or an iPhone. But working together, they accelerated the creation of A.I. <\/p>\n<p class=\"g-text  svelte-wbgwfj\">\u201cThe old model lasted for about 50 years,\u201d said Norm Jouppi, a Google engineer who oversees the company\u2019s effort to build new silicon chips for A.I. \u201cNow, we have a completely different way of doing things.\u201d <\/p>\n<p>The closer the chips, the better. <\/p>\n<p class=\"g-text  svelte-wbgwfj\">It is not just the chips that are different. To get the most out of GPUs, tech companies must speed the flow of digital data among the chips. <\/p>\n<p class=\"g-text  svelte-wbgwfj\">\u201cEvery GPU needs to talk to every other GPU as fast as possible,\u201d said Dave Driggers, the chief technology officer at Cirrascale Cloud Services, which operates a data center in Austin, Texas, for the Allen Institute for Artificial Intelligence, a prominent A.I. research lab. <\/p>\n<p class=\"g-text  svelte-wbgwfj\">The closer the chips are to one another, the faster they can work. So companies are packing as many chips into a single data center as they can. 
They have also developed new hardware and cabling to rapidly stream data from chip to chip. </p>
<p class="g-caption">Meta’s Eagle Mountain data center sits in a valley beneath Utah’s Lake Mountains, south of Salt Lake City. Meta broke ground on this building after the A.I. boom erupted.</p>
<p class="g-credit">Christie Hemm Klok for The New York Times</p>
<p class="g-text">That is changing how data centers — which are essentially big buildings filled with racks of computers stacked on top of one another — work. </p>
<p class="g-text">In 2021, before the A.I. boom, Meta opened two data centers an hour south of Salt Lake City and was building three more there. These facilities — each the size of the Empire State Building, laid on its side across the desert — would help power the company’s social media apps, such as Facebook and Instagram. </p>
<p class="g-text">But after <a href="https://www.nytimes.com/2022/12/10/technology/ai-chat-bot-chatgpt.html">OpenAI released ChatGPT</a> in 2022, Meta re-evaluated its A.I. plans. It had to cram thousands of GPUs into a new data center so they could churn through the weeks and even months of calculations needed to build a single neural network and advance the company’s A.I. </p>
<p class="g-text">“Everything must function as one giant, data-center-sized supercomputer,” said Rachel Peterson, Meta’s vice president of data centers. “That is a whole different equation.” </p>
<p class="g-text">Within months, Meta broke ground on a sixth and seventh Utah data center beside the other five. In these 700,000-square-foot facilities, technicians filled each rack with hardware used to train A.I., sliding in boxy machines packed with GPUs that can cost tens of thousands of dollars.
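Driggers’s point about every GPU talking to every other GPU is also a scaling problem: the number of direct pairwise connections grows quadratically with the number of chips. A small illustrative calculation — the cluster sizes here are arbitrary examples, not figures from the article:

```python
# Direct point-to-point links needed for n chips to all talk to one
# another: n choose 2, which grows quadratically with n.
def pairwise_links(n_chips: int) -> int:
    return n_chips * (n_chips - 1) // 2

for n in (8, 1_000, 16_000):  # arbitrary example cluster sizes
    print(f"{n:>6} GPUs -> {pairwise_links(n):,} direct links")

# 8 GPUs need 28 links; 16,000 would need roughly 128 million,
# which is why large clusters rely on switched network fabrics and
# specialized cabling rather than a wire between every pair of chips.
```

This quadratic growth is one reason the "new hardware and cabling" matters as much as the chips themselves.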
<\/p>\n<p class=\"g-text  svelte-wbgwfj\">In 2023, Meta <a href=\"https:\/\/www.nytimes.com\/2023\/02\/01\/technology\/meta-restructuring-charge.html\" target=\"_blank\" rel=\"noopener\">incurred a $4.2 billion restructuring charge<\/a>, partly to redesign many of its future data center projects for A.I. Its activity was emblematic of a change happening across the tech industry. <\/p>\n<p>A.I. machines need more electricity. Much more. <\/p>\n<p class=\"g-text  svelte-wbgwfj\">New data centers packed with GPUs meant new electricity demands \u2014 so much so that the appetite for power would go through the roof. <\/p>\n<p class=\"g-text  svelte-wbgwfj\">In December 2023, Cirrascale leased a 139,000-square-foot traditional data center in Austin that drew on 5 megawatts of electricity, enough to power about 3,600 average American homes. Inside, computers were arranged in about 80 rows. Then the company ripped out the old computers to convert the facility for A.I. <\/p>\n<p class=\"g-text  svelte-wbgwfj\">The 5 megawatts that used to power a building full of CPUs is now enough to run just eight to 10 rows of computers packed with GPUs. Cirrascale can expand to about 50 megawatts of electricity from the grid, but even that would not fill the data center with GPUs. <\/p>\n<p class=\"g-text  svelte-wbgwfj\">And that is still on the small side. OpenAI aims to build about five data centers that <a href=\"https:\/\/www.nytimes.com\/2025\/02\/08\/technology\/sam-altman-elon-musk-trump.html\" target=\"_blank\" rel=\"noopener\">top the electrical use of about three million households<\/a>. 
<\/p>\n<p class=\"g-caption svelte-cu2gla\">Cirrascale\u2019s data center in Austin, Texas, draws on 5 megawatts of electricity, which can power eight to 10 rows of computers packed with GPUs.<\/p>\n<p class=\"g-credit svelte-cu2gla\">Christie Hemm Klok for The New York Times<\/p>\n<p class=\"g-text  svelte-wbgwfj\">It\u2019s not just that these data centers have more gear packed into a tighter space. The computer chips that A.I. revolves around need far more electricity than traditional chips. A typical CPU needs about 250 to 500 watts to run, while GPUs use up to 1,000 watts. <\/p>\n<p class=\"g-text  svelte-wbgwfj\">Building a data center is ultimately a negotiation with the local utility. How much power can it provide? At what cost? If it must expand the electrical grid with millions of dollars in new equipment, who pays for the upgrades? <\/p>\n<p class=\"g-text  svelte-wbgwfj\">Data centers consumed about 4.4 percent of total electricity in the United States in 2023, or more than twice as much power as the facilities used to mine cryptocurrencies. That could triple by 2028, according to a December report published by the Department of Energy. <\/p>\n<p>Power consumption by A.I. data centers <\/p>\n<p class=\"g-leadin svelte-1so50ue\">The Energy Department estimates that A.I. servers in data centers could consume as much as 326 terawatt-hours by 2028, nearly eight times what they used in 2023.<\/p>\n<p class=\"g-source svelte-cu2gla\">Source: Lawrence Berkeley National Laboratory, Energy Department<\/p>\n<p class=\"g-credit svelte-cu2gla\">The New York Times<\/p>\n<p class=\"g-text  svelte-wbgwfj\">\u201cTime is the currency in the industry right now,\u201d said Arman Shehabi, a researcher at the Lawrence Berkeley National Laboratory who led the report. 
There is a rush to keep building, he said, and “I don’t see this slowing down in the next few years.” </p>
<p class="g-text">Data center operators are now having trouble finding electrical power in the United States. In areas like Northern Virginia — the world’s biggest hub of data centers because of its proximity to underwater cables that shuttle data to and from Europe — these companies have all but exhausted the available electricity. </p>
<p class="g-text">Some A.I. giants are turning to nuclear power. Microsoft <a href="https://www.nytimes.com/2024/10/30/business/energy-environment/three-mile-island-nuclear-energy.html">is restarting</a> the Three Mile Island nuclear plant in Pennsylvania. </p>
<p class="g-text">Others are taking different routes. Elon Musk and xAI, his A.I. start-up, recently bypassed clean energy in favor of a quicker solution: <a href="https://www.npr.org/2024/09/11/nx-s1-5088134/elon-musk-ai-xai-supercomputer-memphis-pollution">installing their own gas turbines</a> at a new data center in Memphis. </p>
<p class="g-text">“My conversations have gone from ‘Where can we get some state-of-the-art chips?’ to ‘Where can we get some electrical power?’” said David Katz, a partner with Radical Ventures, a venture capital firm that invests in A.I. </p>
<p>A.I. gets so hot, only water can cool it down. </p>
<p class="g-text">These unusually dense A.I. systems have led to another change: a different way of cooling computers. </p>
<p class="g-text">A.I. systems can get very hot. As air circulates from the front of a rack and crosses the chips crunching calculations, it heats up.
At Cirrascale’s Austin data center, the temperature around one rack started at 71.2 degrees Fahrenheit on the front and ended up at 96.9 degrees on the back side. </p>
<p class="g-text">If a rack isn’t properly cooled down, the machines — and potentially the whole data center — are at risk of catching fire. </p>
<p class="g-text">Just outside Pryor, a farm-and-cattle town in the northeast corner of Oklahoma, Google is solving this problem on a massive scale. </p>
<p class="g-text">Thirteen Google data centers rise up from the grassy flatlands. This campus holds tens of thousands of racks of machines and uses hundreds of megawatts of electricity streaming from metal-and-wire power stations installed between the concrete buildings. To keep the machines from overheating, Google pumps cold water through all 13 buildings. </p>
<p class="g-text">In the past, Google’s water pipes ran through empty aisles beside the racks of computers. As the cold water moved through the pipes, it absorbed the heat from the surrounding air. But when the racks are packed with A.I. chips, the water isn’t close enough to absorb the extra heat. </p>
<p class="g-source">Source: SimScale</p>
<p class="g-credit">The New York Times</p>
<p class="g-text">Google now runs its water pipes right up next to the chips. Only then can the water absorb the heat and keep the chips working. </p>
<p class="g-source">Source: SimScale</p>
<p class="g-credit">The New York Times</p>
<p class="g-text">Pumping water through a data center filled with electrical equipment can be risky, since water can leak from the pipes onto the computer hardware. So Google treats its water with chemicals that make it less likely to conduct electricity — and less likely to damage the chips.
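The front-to-back temperature rise measured at Cirrascale hints at how much air a dense rack needs. Here is a hypothetical estimate using the basic heat-transfer relation Q = mass flow × specific heat × temperature rise; the 30-kilowatt rack power is an assumed figure for illustration, not one from the article.

```python
# Hypothetical airflow estimate for cooling one rack, using
# Q = mass_flow * specific_heat * delta_T.
# The rack power is an assumption; the temperatures are the ones
# measured at Cirrascale's Austin facility.

t_front_f = 71.2   # degrees F at the front of the rack
t_back_f = 96.9    # degrees F at the back

delta_t_c = (t_back_f - t_front_f) * 5 / 9  # ~14.3 C of heating

rack_watts = 30_000  # assumed power draw for a dense GPU rack
cp_air = 1005        # J/(kg*C), specific heat of air
rho_air = 1.2        # kg/m^3, density of air at room temperature

mass_flow = rack_watts / (cp_air * delta_t_c)  # kg of air per second
volume_flow = mass_flow / rho_air              # m^3 of air per second

print(f"~{mass_flow:.1f} kg/s, or {volume_flow:.2f} m^3 of air per second")
```

At well over a cubic meter of air per second for a single rack under these assumptions, air alone runs out of headroom quickly, which helps explain why Google moved its water so close to the chips.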
<\/p>\n<p class=\"g-text  svelte-wbgwfj\">Once the water absorbs the heat from all those chips, tech companies must also find ways of cooling the water back down. <\/p>\n<p class=\"g-text  svelte-wbgwfj\">In many cases, they do this using giant towers sitting on the roof of the data center. Some of the water evaporates from these towers, which cools the rest of it, much as people are cooled when they sweat and the sweat evaporates from their skin. <\/p>\n<p class=\"g-text  svelte-wbgwfj\">\u201cThat is what we call free cooling \u2014 the evaporation that happens naturally on a cool, dry morning,\u201d said Joe Kava, Google\u2019s vice president of data centers. <\/p>\n<p class=\"g-caption svelte-cu2gla\">Inside a Google data center, which is packed with computers that use Google\u2019s A.I. chips.<\/p>\n<p class=\"g-credit svelte-cu2gla\">Christie Hemm Klok for The New York Times<\/p>\n<p class=\"g-text  svelte-wbgwfj\">Google and other companies that use this technique must keep replenishing the water that pumps through the data center, which can strain local water supplies. <\/p>\n<p class=\"g-text  svelte-wbgwfj\">Google data centers consumed 6.1 billion gallons of water in 2023, up 17 percent from the previous year. In California, a state that faces drought, more than 250 data centers consume billions of gallons of water annually, raising alarm bells <a href=\"https:\/\/www.sacbee.com\/opinion\/op-ed\/article297294554.html\" target=\"_blank\" rel=\"noopener\">among local officials<\/a>. <\/p>\n<p class=\"g-text  svelte-wbgwfj\">Some companies, including Cirrascale, use massive chillers \u2014 essentially air-conditioners \u2014 to cool their water instead. That reduces pressure on the local water supply, because they reuse virtually all of the water. But the process requires more electrical power. <\/p>\n<p class=\"g-text  svelte-wbgwfj\">There is little end in sight. 
Last year, Google broke ground on 11 data centers in South Carolina, Indiana, Missouri and elsewhere. Meta said its newest facility, in Richland Parish, La., would be big enough to cover most of Central Park, Midtown Manhattan, Greenwich Village and the Lower East Side. </p>
<p class="g-text">“This will be a defining year for AI,” Mark Zuckerberg, Meta’s chief executive, said in January in a <a href="https://www.facebook.com/zuck/posts/pfbid0219ude255AKkmk4JAueXZeZ9zpjNYio2tBkd7bNmCaRbJ6iJaVVjypUgDg78CNdq5l">Facebook post</a> that concluded, “Let’s go build!” </p>