I don't want random techbros coming in, hence why I'm posting on Den. I hope this is ok.

I'm teaching an online composition class this summer. I got two essays from students that cited sources that don't exist. I called them out on it. Here's what happened.

One copped to using Bard, but then sent a second essay that still clearly reeks of GenAI or other horseshit.

The other copped to using a GenAI search engine unwittingly, and has tried to claim they've read things that, by all accounts, they haven't.

Normally, I would have just failed these students for writing hundreds of words on material that doesn't exist. But I really wanted them to go beyond a basic confession and explain their reasons for using it, partly because I have administrative duties around GenAI in our program this year. I wanted data for my fellow instructors (i.e., here's what the student did, here's how we can design assignments that both teach more carefully and are harder to outsource to GenAI, etc.). Instead, I've just hit a brick wall. They're insisting it was only a research error, even though these essays shouldn't exist, since the majority of each is written about things that literally aren't out there.

Again, they wrote about things that don't exist as if they do. That's GenAI in a nutshell. It's some of the most blatant shit. And these students are still trying to justify their work.

What bugs me most, however, isn't the students. It's the fact that technology like this was thrown out into the ether without any fucking guard rails. These students don't realize the problems with it, so they're fucking themselves. And while maybe they would have found some other way to do this kind of lazy work pre-ChatGPT, the accessibility of these LLMs means that more students will do stupid shit like this and fail, instead of trying to learn.

I'm very doomer about this stuff, not because of some AI takeover, but because of the total enshittification of everything. The Citations Needed episode on it was very good on the other serious labor implications as well. There's also a ton of added labor or shittiness coming for the affected fields: my instructors will have to work more for the same pay OR just not bother policing it. Either outcome is terrible. While I'm going to do my damndest to help my colleagues build assignments that remain rigorous and have guardrails against GenAI production, the fact is, eventually it's coming for all of us. And even if it doesn't take our jobs, it's going to make us all more miserable, because the structures aren't in place for FALGSC or anything. So we're going to lay people off, pay them less, remove some of the most human pursuits, and for what? A bot that's slightly more convenient and less accurate than Wikipedia?

I'd love for someone to un-doomer me about this stuff, but it's just very depressing. I needed to vent among friends. Thanks for listening folks.

I'm still a bloomer at heart, but god damn is it hard to keep up in the face of material conditions.

  • ChestRockwell [comrade/them, any] · 10 months ago

Yeah, the false negative issue is real. I actually didn't accuse them of GPT use initially, just was like "why are you talking about shit that doesn't exist?"

If you want to throw the lit one a pedagogical bone, tell them to look into including images of the text in their papers. The reason these two students immediately stood out is that the essays required them to incorporate images from the works they were discussing in lieu of block quotes, and they were the only two without images of the texts.

If nothing else, it makes the students actually find specific pages/evidence rather than having the bullshit bot generate it. It's a stopgap, but I've found it surprisingly useful, and I'm going to do more of it in the fall as a preventative measure. Also, students can do a lot of cool shit (marking up the passage, including that, etc.) to show it's really their work.

    • TerminalEncounter [she/her] · 10 months ago

Like, why even take the class and spend the money and time on it if you're just gonna turn in AI crap? They'd better not be pre-med-style people who think they're above reading and comprehending complicated texts.

      • FunkyStuff [he/him] · 10 months ago

If my college is anything to go by, 80% of the people taking humanities courses are STEM students in their last two years filling out required humanities credits. Even worse, these classes tend to be distance learning and very unlikely to weed out those kinds of students.

      • D61 [any] · 10 months ago

Like, I tried to go to college for an IT thing and had to spend two of my four years in mandatory non-IT classes. Chemistry? Yup, not relevant to the major. Survey of Calculus? Yup, had to take that twice because I cannot remember math equations to save my life. Not one single, solitary class openly used anything from the calculus. At least the Intro to Film Studies and the Feminism/Pop Culture classes I took to meet the humanities requirements were actually engaging, with real-world application in my daily life.