In the race to catch up with artificial intelligence rivals like OpenAI and Meta, Elon Musk's xAI has made a series of controversial moves and drawn criticism over harmful content. Most recently, the company has faced backlash for its rollout of two chatbots geared toward more explicit content, according to the New York Times.
What’s happening?
Many view AI as the technology of the future, and the race for AI supremacy has become increasingly cutthroat. However, allowing AI chatbots to engage in explicit conversations with more sexualized content has been an area from which many companies have shied away, fearing the potential costs to their reputation as well as clampdowns from regulators.
Enter Ani and Valentine, two AI chatbots from xAI that were designed to go where other companies would not, engaging in a level of intimacy that other AI creators have purposely sought to avoid.
Many critics have viewed the move as a cynical ploy by Musk to elevate the profile of his lagging AI company. While many in the AI field have come to recognize the value of creating some level of emotional connection between human users and their chatbots in order to increase engagement, they have largely steered away from sexual intimacy. But not xAI.
“It’s all tied to the fundamental race to intimacy that we’re seeing in the AI industry,” Camille Carlton, the policy director at the Center for Humane Technology, told the New York Times. “These companies know that emotional attachment means more engagement and more market share.”
Musk has pushed back against critics, claiming that the more explicit chatbots will actually lead to more intimacy among humans. He has also suggested they could help address population decline, which has long been a focus for the billionaire.
“I predict — counter-intuitively — that it will increase the birth rate!” Musk wrote on his social media platform, X, in August, per the Times. “Mark my words.”
Why is it important?
Much of the concern around explicit AI has centered on the ability of minors to access sexualized and harmful content. In order to interact with Ani or Valentine, xAI users need only enter a birth date purportedly showing that they are over 18 years old. For tech-savvy youth raised on the internet, this hardly serves as an effective barrier.
Musk’s xAI has not been the only AI company to face controversy over the explicit content of its chatbots and the ability of children to access them.
In August, Reuters published a report based on its review of an internal Meta policy document, which stated that it was within the rules for the company's AI to "engage a child in conversations that are romantic or sensual."
Meta quickly backtracked on the policy. Later that month, a bipartisan group of 44 attorneys general wrote a letter to over a dozen tech companies expressing concerns about minors having access to explicit content via AI.
“Your innovations are changing the world and ushering in an era of technological acceleration that promises prosperity undreamt of by our forebears,” the AGs wrote. “We need you to succeed. But we need you to succeed without sacrificing the well-being of our kids in the process.”
While companies like Meta have pulled back on explicit content and attempted to put safety guardrails in place, Musk and xAI seem to be heading in the opposite direction, perhaps seeing an opportunity to differentiate xAI in a crowded and competitive AI marketplace.
Beyond questions about exposing minors and others to explicit and potentially harmful content, AI has faced controversy on a number of other fronts.
The energy-hungry data centers necessary for powering complex AI models have cropped up across the United States and the world, and they’re consuming vast amounts of electricity. As the increased demand for electricity has outpaced the growth in supply, it has caused electricity prices to skyrocket for everyday consumers.
With electricity generation often comes planet-overheating pollution, and data centers have become responsible for a significant percentage of the world’s heat-trapping emissions while also drawing on clean energy that might otherwise directly benefit households. Not only that, but these data centers also use huge amounts of water for cooling purposes. Those living nearby have reported a hit to local water supplies as well as symptoms such as fatigue and anxiety related to loud data center operations.
Changes to local water supplies and noise pollution may also be affecting area wildlife. And focusing AI primarily on driving ever-longer periods of engagement, rather than leveraging the resource-hungry technology to deliver evidence-based social benefits such as cancer diagnosis and food security, may prove to do more harm than good.
Many also fear that ever-advancing AI systems will continually replace human workers, a phenomenon that researchers have noted is already occurring. Some experts have warned that AI could ultimately lead to millions of job losses and widespread unemployment.
What’s being done about it?
Given the unprecedented pace at which AI has developed, lawmakers and government regulators have struggled to keep up with the rapid technological advancements. Still, protecting users' safety, particularly that of young people, will require holding companies accountable and ensuring that the industry puts effective guardrails in place.
Similarly, while AI has the potential to deliver technological breakthroughs that could help address planet-overheating pollution and other environmental concerns, it currently appears to be on pace to exacerbate those problems.
Establishing meaningful regulations that protect users and the environment, while keeping energy costs affordable for everyday Americans, will require thoughtful, purpose-driven collaboration among the AI industry, government, individuals, and other stakeholders.