

A recent MIT study suggests that everyone in academia, as well as students and their families, should be concerned about students’ use of generative artificial intelligence tools such as ChatGPT. In the study, researchers divided participants into three groups: one used ChatGPT extensively to create and revise essays, one used Google’s search engine to assist in writing essays, and one used no technology at all.
Preliminary conclusions suggest that people who extensively used AI for writing essays had the lowest brain engagement of the three groups and “underperformed in neural, linguistic, and behavioral levels.” Translation: AI may be robbing students of the benefits of learning.
We also know that high school students are using AI to complete schoolwork (various surveys and studies report figures ranging from 33% to 86%). What’s not clear is how many use it to start the learning process, reinforce what they have learned, or test themselves, and how many use it to circumvent learning altogether.
What this all means for college students is that, yes, you need to know what AI is and how to use it responsibly, but you also need to be careful in the process.
When we talk about AI in education today, we’re primarily referring to Large Language Models (LLMs) like OpenAI’s ChatGPT, Anthropic’s Claude, and Google’s Gemini. These are sophisticated systems trained on vast amounts of text that can understand context, engage in conversations, and generate human-like responses. This is fundamentally different from older technologies like spell checkers or basic predictive text, which use simpler pattern recognition and don’t truly “understand” language. Importantly, LLMs generate responses based on patterns in their training data — they’re not primarily search engines pulling current information or fact-checkers verifying accuracy!
Use AI as a collaborator, not a replacement. The most effective approach treats AI as a thinking partner that can help you work through ideas, generate initial drafts, or automate tedious tasks like formatting or research organization. This frees you to focus on the creative, analytical, and critical thinking aspects that require human insight.
Avoid the “copy-and-paste trap.” Simply taking AI output verbatim demonstrates poor digital literacy and misses the tool’s real value. Instead, use AI to brainstorm, refine your ideas, or handle routine tasks while you maintain ownership of the intellectual work.
Your professors may allow some AI use, but it’s highly unlikely they’ll permit it for large portions of an assignment, let alone the whole thing. If you ask an AI program to solve your math problems or write your paragraphs or essays for you, you are most likely not using it ethically.
Misusing AI carries both institutional and personal consequences. Beyond potential academic penalties like failing grades or disciplinary action, over-reliance on AI robs you of the deep learning that comes from wrestling with problems yourself. The struggle of working through challenges builds critical thinking skills, resilience, and genuine understanding that no AI can provide.
Develop your own thinking first. Before turning to AI for help with assignments or problems, invest time in understanding the subject matter and forming your own initial perspective on the topic. AI works best when you can evaluate, refine, and build upon its suggestions rather than depending on it to do your thinking for you.
AI can do amazing things, but it should be used intentionally, mindfully, and ethically. If it is used to circumvent the learning process, it keeps you from the opportunity to learn how to think creatively and critically — two of the most important skills to develop during college.