How ChatGPT Changed My Data Science Classroom Overnight
I TA for CS3000, Intro to Data Science at Khoury College. Over 200 students. The course covers pandas, EDA, basic ML, the usual pipeline. I've been doing this since fall semester and I genuinely love it. Helping someone debug their first groupby operation and seeing it click is one of the best feelings in grad school.
Then ChatGPT dropped in December, and by the time students came back in January, everything was different.
The First Week Back
The signs were immediate. Office hours went from "how do I merge these DataFrames?" to "I merged these DataFrames using ChatGPT, but the output looks wrong. What happened?" The questions got more specific but also, in some cases, shallower. Students had answers they didn't understand. They could show me working code but couldn't explain why the left join produced NaN values in certain columns.
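For readers outside the course, that NaN confusion comes up constantly, and it's not a ChatGPT bug at all. A left join keeps every row from the left frame, so any key with no match on the right gets NaN in the right-hand columns. A minimal sketch (the data here is made up, not from an actual assignment):

```python
import pandas as pd

grades = pd.DataFrame({"student_id": [1, 2, 3], "grade": [90, 85, 77]})
surveys = pd.DataFrame({"student_id": [1, 3], "hours": [5, 2]})

# how="left" keeps all three grade rows; student 2 has no survey row,
# so their "hours" column is filled with NaN rather than being dropped.
merged = grades.merge(surveys, on="student_id", how="left")
print(merged)
```

The code "works" in the sense that it runs, which is exactly why a student who didn't write it can't explain the output.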
Some students were using ChatGPT as a tutor. Others were using it as a shortcut. The gap between those two approaches became obvious within days.
The Tutor vs. the Shortcut
The students who used ChatGPT well were asking it to explain concepts, then coming to office hours with follow-up questions that went deeper than the tool could go. "ChatGPT said StandardScaler assumes a Gaussian distribution. When does that assumption break down?" That's a great question. That's a student who's learning faster because they have a tireless explainer available at 2am.
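To make that follow-up concrete: standardization is just z = (x - mean) / std, which is what StandardScaler computes under the hood. It runs fine on non-Gaussian data, but with heavy tails the mean and standard deviation get dominated by outliers, which squashes everything else toward zero. A quick sketch in plain NumPy (synthetic data, hand-rolled version of the transform):

```python
import numpy as np

def standardize(x):
    # Same transform StandardScaler applies: z = (x - mean) / std
    return (x - x.mean()) / x.std()

rng = np.random.default_rng(0)
clean = rng.normal(0, 1, 1000)
with_outlier = np.append(clean, 1000.0)  # one extreme value

z = standardize(with_outlier)
# The outlier inflates the std, so the 1000 "normal" points end up
# compressed into a narrow band near zero instead of spanning ~[-3, 3].
print(abs(z[:-1]).max())
```

That's the kind of breakdown the question was pointing at, and it's a judgment call (use a robust scaler? clip? investigate the outlier?) that the tool won't make for you.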
The students who used it as a shortcut were copying solutions wholesale. And you could tell, not because the code was too good, but because it was too generic. ChatGPT writes correct but characterless code. It doesn't have the little idiosyncrasies that come from a student who learned pandas last week. The variable names are too clean. The comments are too polished. It's like reading an essay with perfect grammar but no voice.
I'm not judging either group, honestly. When I was an undergrad, I would have used it too. The temptation is completely rational. But the outcomes are measurably different.
The Professor Panic
Some faculty reacted by trying to ban it. I get the impulse, but I think that's a losing strategy. You can't enforce it. You can't detect it reliably. And more importantly, these students are going to use AI tools in every job they take after graduation. Teaching them to pretend it doesn't exist seems counterproductive.
Other professors leaned in. One restructured their assignments to require explanation alongside code. Another started giving open-ChatGPT exams where the questions were designed to be hard even with AI assistance. Those felt smarter to me.
The Irony of Teaching Data Science Right Now
Here's the thing that keeps me up at night. I'm teaching people the fundamentals of data science while the tools that automate those fundamentals are improving every month. The content of CS3000 is still valuable. Understanding how data flows through a pipeline, what transformations do, why certain visualizations reveal patterns that others hide. That's judgment, not syntax.
But the delivery has to change. If I'm teaching students to write code that ChatGPT writes better and faster, I'm wasting their time. If I'm teaching them to think critically about data, to question assumptions, to understand why a model works and not just that it works, then the course still matters. Maybe more than ever.
What I'm Actually Telling Students
When students ask me whether they should use ChatGPT, I tell them what I believe: use it, but use it the way you'd use a study partner who's read the textbook but hasn't done the homework. It can explain concepts. It can generate starting points. It can help you debug. But it hasn't sat with your specific dataset for three hours trying to figure out why the distribution is bimodal. That part is still yours.
I'm only a TA. I don't set policy or design curricula. But I'm watching this play out across 200 students in real time, and the signal is clear. The students who treat ChatGPT as a thinking tool are accelerating. The ones who treat it as an answer machine are falling behind in ways they won't notice until the midterm.
Education was already overdue for a rethink. ChatGPT just made the timeline urgent.