A cyberpsychology expert warns that as AI advances, people become overconfident and less critical of the information it provides.
NORFOLK, Va. — Artificial intelligence (AI) tools are everywhere, helping us write, research, create and even communicate. However, as AI becomes part of everyday life, experts say our digital culture has not caught up.
Dr. Scott Debb, professor of psychology and director of the Cyberpsychology Research Lab at Norfolk State University, says the gap between what AI can do and what people understand about it is growing.
“If you don’t have the digital literacy,” Debb said, “you start to believe everything you see or hear, and this false sense of confidence happens.”
This misplaced trust, he says, stems from what psychologists call cognitive offloading, the tendency to let technology do the thinking for us.
“Every person with a phone or Wi-Fi connection has access to virtually all the information in the world,” Debb explained. “But there’s a difference between having information and using it.”
He says that when users turn to AI tools like ChatGPT or other generative platforms, they often skip the process of evaluating information altogether.
“With AI, you type in a question and it tells you what you want to hear,” Debb said. “But when we accept it at face value, we stop thinking; we become passive consumers.”
This passivity, he warns, can have repercussions beyond misinformation. It can shape how we understand truth, expertise, and even human connection.
“There is no consciousness, no conscience,” he said. “Everything it does is just a reflection of what it’s been fed, what other people have already said online.”
Debb compares the current AI boom to the early days of social media, another tool created to connect people that later revealed major flaws.
“The original intent was to keep people connected,” he said. “But without safeguards, the human impact tends to get worse before it gets better.”
Today, he says, we are at a crossroads with AI. Technology advances faster than the policies meant to regulate it. And without proper safeguards, AI could take on roles it was never designed for.
“Generative AI should not play therapist,” Debb said. “We have an opportunity right now to put in place oversight, so that people don’t end up trusting a program over a human being.”
He believes that digital literacy education, which includes understanding how algorithms work, where information comes from, and when to question it, should become as essential as traditional reading and writing skills.
“Just because we have digital technology doesn’t mean our circuits have changed,” Debb said. “It’s up to us to be critical consumers, not just consumers.”
Debb says AI itself is not the enemy. As with any tool before it, the danger comes in how it is used and whether people are equipped to use it wisely.
“Any obstacle can become an opportunity,” he said. “If we focus on awareness and accountability now, AI can still be a tool of transformation, not addiction.”