this post was submitted on 07 Oct 2024

Humanities & Cultures

[–] Thevenin@beehaw.org 6 points 1 month ago* (last edited 1 month ago) (1 children)

> While I don’t say this as a criticism of the author, it is worth pointing out that she’s also failed to adapt to the new technologies. She talks about how teachers will need to adapt to the new tools but ultimately places the blame on the students rather than reconsidering who her audience is.

How would you propose adapting to this? Do you believe it's the teacher's responsibility to enact this change rather than (for example) a principal or board of directors?

The average teacher does not have the luxury of choosing their audience. Ideally you'd only teach students who want to learn, but in reality teachers are given a class of students and ordered to teach them. If enough students fail their exams, or if the teacher gives up on the ones who don't care, the teacher is assumed to be at fault and gets fired.

You can theoretically change your exams so that chatbot-dependent students will fail, or lower your bar because chatbots are "good enough" for everyday life. But thanks to standardized testing, most teachers do not have the power to change their success metrics in either direction.

This article is about PhD students coasting through their technical writing courses using chatbots. This is an environment/application where the product (writing a paper) is secondary to the process (critical analysis), so being able to use a chatbot is missing the point. Even if it weren't, canceling your technical writing class and replacing it with an AI-wrangling class is not a curriculum modification but an abdication. Doing that can get your program canceled, and could even get a tenured professor fired.

The author was really stuck between a rock and a hard place. Re-evaluating the systemic circumstances that incentivize cheating is crucially important -- on that we absolutely agree -- but it's a responsibility that should be directed at those with actual power over that system.

[Edit: taking the tone down a notch.]

[–] Gaywallet@beehaw.org 2 points 1 month ago

> How would you propose adapting to this? Do you believe it’s the teacher’s responsibility to enact this change rather than (for example) a principal or board of directors?

To be clear, I'm not blaming anyone here. I think it's a tough problem, and frankly, I'm not a professional educator. I don't think it's the teacher's responsibility, and I don't blame them for a second for deciding "nah, this isn't worth my time."

> This article is about PhD students coasting through their technical writing courses using chatbots. This is an environment/application where the product (writing a paper) is secondary to the process (critical analysis), so being able to use a chatbot is missing the point.

Completely agreed here. I would have just failed the students for cheating if it were me. But to be clear, I was speaking more in the abstract, since the article is written more about the conundrum and the pattern than about a solution. The author decided to quit rather than tackle the problem, and I was interested in hearing them follow that thread a bit further, as they're the real expert here.