Gen Z is increasingly using generative AI tools to obtain academic guidance. In doing so, they’re bypassing university counselors and advisors.
A new study from Arkansas State University shows that college-aged Gen Z students are changing how academic advice is sought, found, and followed. Many of today’s college students are using generative AI tools for quick answers, as opposed to meeting with professors, counselors, or career advisors. As a result, academic support is in the midst of change.
According to the study, nine out of ten students identified ChatGPT as their preferred AI tool. In fact, over 20% of students said they’d skipped an advisor meeting because AI had already answered their question. Nearly one in three felt confident making a significant academic decision based solely on AI’s advice, without first checking with another person, let alone a qualified, credentialed adult with subject-matter expertise.
What began as a shortcut for homework is turning into a broader approach to academia for many students. Today, more than half of students use AI instead of Google to get answers about school, classes, and subject matter. Twenty-two percent use AI daily to check degree requirements, interpret school policies, and understand financial aid.
Of course, speed doesn’t guarantee accuracy. Forty-one percent of students surveyed admitted they’d followed advice from AI that turned out to be wrong, and two-thirds said no adult caught the mistake. Those uncaught errors carry consequences: 74% said they wasted time, 7% missed a deadline, and 6% enrolled in the wrong class.
Students rely on AI tools like ChatGPT for academic advising mainly because they’re easier to reach. Eighty-eight percent said they prefer AI because it’s virtually always available and accessible, unlike school advisors who keep business hours and may not answer questions directly or clearly. Sixty-nine percent said they simply prefer the faster answers AI provides, and 65% said it spared them the wait for an academic advising appointment.
Most students, however, acknowledged that AI has limits compared with human advisors. Sixty-two percent believed AI should assist human advisors rather than replace them outright. Seventy-three percent said AI lacks the empathy and emotional understanding that can be critical to high-quality academic advising.
The balance between AI assistance and human help is still only beginning to shift on campuses. Fifty-nine percent of students valued human accuracy over AI’s speed, 35% valued both equally, and 6% prioritized AI’s efficiency.
Beyond speed and accuracy, there’s a third variable that’s critical to excellent advising: trust. Here, the findings around student confidence in their school and its resources should give universities pause. The study reported that 19% of students said they trust AI more than their university’s website. That trust gap is often rooted in frustration with outdated portals, confusing website layouts, and slow replies. By comparison, the answers provided by AI tools can seem faster, clearer, and more helpful.
But the quality of answers provided by AI is inconsistent. Forty-two percent of students said AI has given them incorrect information about school policies. Thirty-three percent said they’ve received plagiarism or academic integrity advice from AI that has been flat-out wrong. Thirty-one percent mentioned errors in guidance around degree planning. A quarter had problems with credit transfer details.
Academic advising has traditionally centered on mentorship. Advisors have face-to-face conversations, build trust with students, and offer long-term support across several years. Today, a rapidly emerging digital model of advising offers instant answers without trust, rapport, or relationship. Sixty percent of students said they’ve used AI to brainstorm future possibilities, including academic or career ideas. Sixty-five percent said AI has written emails to professors or departments for them.
The Arkansas State research shows that many students trust AI but tend to underestimate its flaws. As a result, some follow incorrect advice, and few of those errors ever get caught. The consequences include delayed class scheduling, missed deadlines, and other preventable mistakes.
How widespread and normalized AI use has become explains, at least in part, students’ confidence in it for academic advising. Many students see AI as a standard part of the academic process rather than a shortcut tinged with risk.
Despite this risk, most students don’t believe AI can do everything when it comes to college counseling. Seventy percent said AI can’t replace human advisors because some academic problems are too emotional or situationally dependent for machines to solve. Sixty-nine percent said advisors better understand context and individual circumstance than AI. Sixty-five percent said personal support still matters.
Taken together, these findings point to a nuanced behavioral reality: despite widespread AI use for academic advising, Gen Z isn’t eschewing human advice entirely. Instead, they’re choosing when and why they need it. For fast facts, they’ll probably ask AI. For life planning or emotional support, they’ll likely still turn to people. For now.
Of course, today’s students aren’t necessarily using AI because they’re averse to human interaction. Many are simply looking for convenience and accessibility, and AI fills that gap.
Perhaps most alarming is the problem of trust that emerges from the study’s findings. Fully 20% of students think AI is more accurate than their school’s own website. Universities may begin to lose control of their messaging if AI becomes the main source of advice.
Beyond identifying issues, the study findings implicitly outline a path forward. Sixty-two percent of students said AI should support traditional academic advising at universities. Indeed, a hybrid model is promising. For instance, AI might handle repetitive questions while advisors focus on nuanced conversations, empathy-driven mentorship, and tailored problem-solving.
Many industries already use automation for tasks and save human input for decisions that benefit from empathy or judgment. Some universities have started experimenting with AI-assisted help desks that triage complex issues to human staff.
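To make that triage-style hybrid model concrete, here is a minimal sketch in Python of how such routing might work. It assumes a simple split between routine and sensitive topics; the topic lists and the route_question function are hypothetical illustrations, not a description of any university’s actual system or of the study itself.

```python
# A minimal, hypothetical sketch of the triage idea described above: routine,
# factual questions go to an AI assistant, while nuanced or sensitive cases
# are escalated to a human advisor. The topic lists and function name are
# illustrative assumptions, not taken from the Arkansas State study.

ROUTINE_TOPICS = {"degree requirements", "add/drop deadlines", "financial aid forms"}
SENSITIVE_TOPICS = {"academic probation", "changing majors", "personal circumstances"}

def route_question(topic: str, flagged_urgent: bool = False) -> str:
    """Decide whether an AI assistant or a human advisor should handle a question."""
    if flagged_urgent or topic in SENSITIVE_TOPICS:
        return "human advisor"   # empathy-driven, context-heavy cases
    if topic in ROUTINE_TOPICS:
        return "AI assistant"    # repetitive, factual questions
    return "human advisor"       # default to a person when unsure

if __name__ == "__main__":
    print(route_question("degree requirements"))  # -> AI assistant
    print(route_question("academic probation"))   # -> human advisor
```

In practice, advising staff would define the escalation rules, with anything outside the routine list defaulting to a human so that uncertainty errs on the side of personal support.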
There are risks, however. Today’s students are learning to skip human feedback in school. By extension, they may also do the same in their careers by relying too heavily on AI for decisions that benefit from context and experience.
How students make these choices going forward will depend on whether colleges embrace these changes or fight them. Yet one thing is certain: when Gen Z says, “I’ll just ask ChatGPT,” they’re not necessarily being lazy. They’re being honest. And their honesty and evolving behavior are shifting what academic advice looks like.
Notes: This post was drafted with the assistance of AI tools and reviewed, edited, and published by humans.