When this year’s class of college graduates first set foot on campus, AI as we know it today did not exist. In February, just three-plus years after ChatGPT made its very public debut, the U.S. Department of Labor issued its first-ever AI Literacy Framework. As the secretary of education outlines, the framework is meant to help educators accelerate production of “the next great American talent pipeline.”
As faculty deeply involved in shaping our own institution’s response to AI, we applaud the framework’s release as an important step toward clarity, and one that will amplify efforts already underway in California via the California State University’s AI Workforce Acceleration Board. The DOL’s framework, issued in the hopes of “providing a foundation to guide nationwide AI literacy efforts across workforce and education systems,” begins to answer the burning question of what, exactly, AI literacy should look like from a workforce-preparation perspective.
This question is top of mind not only for government officials intent on building a workforce for what they (and Big AI) would sell as a new “golden age,” but also for universities dependent on good postgraduation job-placement statistics. And, according to the newly released results from a first-of-its-kind CSU systemwide AI survey, students themselves identified the definition of AI literacy as a central concern.
Yet, unless employers clearly articulate what it means in practice, AI literacy efforts will fall short as workforce preparation. Absent defined, role-specific AI competencies, educators are left without the guidance necessary to prepare students for an AI-driven economy.
We helped develop and lead the CSU systemwide survey, which collected nearly 100,000 individual responses from students, faculty and staff in fall 2025, including participants from all 22 campuses. The largest AI survey ever undertaken in higher education, it provides an annual snapshot of how campus communities engage with AI. Among the findings, nearly 70 percent of the 80,626 CSU students who responded believed AI would be an essential part of most professions. Yet only 37 percent said the curriculum offers adequate AI exposure.
Regarding career preparation in particular, more than four in five (81.7 percent) expressed concern that AI would affect job security. A marketing student stated, “Students who learn AI now will have a massive competitive advantage. I’m teaching myself because I can’t wait for the curriculum to catch up.”
Such enthusiasm does not necessarily reflect a love of AI. Rather, it signals a pragmatic response to the need for employment and to the fears AI hype incites. As a computer science major reported, “I don’t want to use it, I HAVE TO! Because if I don’t, then I’ll be left behind.”
A notable number of students took the opposite tack. For example, one said, “I changed majors due to the use of AI. I originally wanted to pursue graphic design. The use of AI made the job market slimmer and slimmer, so I switched to social work.” For students like this one, as well as for hesitant engagers and even those more fully embracing AI, the desire to use AI responsibly ran high. Across the board, students wondered, as one physical therapy student did, about ways to use AI “without losing our ability to think for ourselves or lose our humanity.” Students’ nuanced views included the observation that “The promise of saving time only hides larger structural problems with labor and resources.”
But in practical terms, coupled with a fear of missing out or not having a competitive edge in the job market, students expressed confusion over what skills they really need. As one student reported, “I am having a hard time finding concrete information regarding what employers mean by ‘AI skills.’ What skills am I supposed to be developing?”
An engineering student asserted, “I want to learn AI tools that are actually used in my industry, not just generic chatbots. Show me what engineers actually do with AI on the job.”
Faculty members have forged productive, pedagogically relevant partnerships with industry representatives. Even so, many students’ survey responses called for more AI-relevant internship opportunities, presentations from guest speakers with real-world workplace experience and more direct instruction on the responsible use of AI.
To make relevant curricular adjustments, faculty need to know not what employers say they want but (in the words of one professor) “specific ways AI is being used in industry so I can inform students.” A firmer, better-scoped definition of AI literacy, as well as occupation-specific skill lists, is sorely needed.
Until employers know and can clearly articulate what they need (for instance, via occupation-aligned AI competency maps), our students will remain stuck in a sea of indeterminacy. This ambiguity benefits AI vendors: The anxieties it feeds fuel subscriptions. But the lack of concrete specifications regarding where and how AI fits (or doesn’t) in various career-related skill sets is unhelpful at best for our rising graduates.
And here’s where we hope the DOL framework will help. The preamble includes the following directive: “Employers should identify specific tasks where AI can augment employee capabilities and determine what level of AI literacy different roles require.” Doing this will move us away from performative invocations of AI preparedness and toward a concrete strategy for achieving firmly defined, workplace-relevant AI literacy for students who seek it.
Elisa J. Sobo is a professor of anthropology and an AI Faculty Fellow at San Diego State University.
David M. Goldberg is an associate professor of management information systems and an AI Faculty Fellow at San Diego State University.