I don't want random techbros coming in, hence why I'm posting on Den. I hope this is ok.

I'm teaching an online composition class this summer. I got two essays from students that cited sources that don't exist. I called them out on it. Here's what happened.

One copped to using Bard, but then sent a second essay that still clearly reeks of gen AI or other horseshit.

The other copped to using a GenAI search engine unwittingly, and has tried to claim they've read things that, by all accounts, they haven't.

Normally, I would have just failed these students for writing hundreds of words on material that doesn't exist. But I really wanted them to go beyond a basic admission and explain their reasons for using this, partly because I have administrative duties around GenAI in our program this year. I wanted to gather data for my fellow instructors (i.e., here's what the student did, here's how we can design better assignments that both teach more carefully and are harder to use GenAI on, etc.). Instead, I've just hit a brick wall. They're insisting it was only a research error, even though these essays shouldn't exist, since the majority is written about things that just literally aren't out there.

Again, they wrote about things that don't exist as if they do. That's GenAI in a nutshell. It's some of the most blatant shit. And these students are still trying to justify their work.

What bugs me most, however, isn't the students. It's the fact that technology like this was thrown out into the ether without any fucking guard rails. These students don't realize the problems with it, so they're fucking themselves. And while maybe they would have found some other way to do this kind of lazy work pre-ChatGPT, the accessibility of these LLMs means that more students will do stupid shit like this and fail, instead of trying to learn.

I'm very doomer about this stuff, not because of some AI takeover, but because of the total enshittification of everything. The Citations Needed episode on it was very good on the serious labor implications as well. There's also a ton of potential added labor or shittiness in the affected fields: my instructors will have to work more for the same pay OR just not bother policing it. Either outcome is terrible. While I'm going to do my damndest to help my colleagues build assignments that remain rigorous and have guardrails against genAI production, the fact is, eventually it's coming for all of us. And even if it doesn't take our jobs, it's going to make us all more miserable, because there aren't structures in place for FALGSC or anything. So we're going to lay people off, pay them less, remove some of the most human pursuits, and for what? A bot that's slightly more convenient and less accurate than Wikipedia?

I'd love for someone to un-doomer me about this stuff, but it's just very depressing. I needed to vent among friends. Thanks for listening folks.

I'm still a bloomer at heart, but god damn is it hard to keep up in the face of material conditions.

  • JoeByeThen [he/him, they/them]
    ·
    10 months ago

    I've got a couple professors in my personal life, and the one focused on lit is basically shifting most of their writing assignments to be done in class. It seems to be the only way around it they've come up with that isn't going to create false positives that might wreck people's academic careers.

    • ChestRockwell [comrade/them, any]
      hexagon
      ·
      edit-2
      10 months ago

      Yeah, the false positive issue is real. I actually didn't accuse them of GPT use initially, just was like "why are you talking about shit that doesn't exist?"

      If you want to throw the lit one a pedagogical bone, tell them to look into having students include images of text in their papers. The reason these two students immediately stood out is that the essays required them to incorporate images from the texts they were discussing in lieu of block quotes, and they were the only two without images of the texts.

      If nothing else, it will make the students actually find specific pages/evidence, rather than having the bullshit bot generate it. It's a stopgap, but I've found it surprisingly useful and I'm going to do more of it in the fall as a preventative measure. Also, students can do a lot of cool shit (marking up the passage, including that, etc.) to show it's really their work.

      • TerminalEncounter [she/her]
        ·
        10 months ago

        Like, why even take the class and spend the money and time on it if you're just gonna turn in AI crap? Better not be pre-med-style people who think they're above reading and comprehending complicated texts.

        • FunkyStuff [he/him]
          ·
          10 months ago

          If my college is anything to go by, 80% of everyone taking courses in the humanities are STEM students in their last 2 years filling out required humanities credits. Even worse, these classes tend to be distance learning and very very unlikely to weed out these kinds of students.

        • D61 [any]
          ·
          10 months ago

          Like, I tried to go to college for some IT thing and had to spend two of my four years in mandatory non-IT-related classes. Chemistry? Yup, not relevant to the major. Survey of Calculus? Yup, had to take that twice because I cannot remember math equations to save my life. Not one, single, solitary class openly used anything from the Calculus. At least the Intro to Film Studies and the Feminism/Pop Culture classes I took to meet the humanities requirements were actually engaging classes with real-world application in my daily life.

  • Acute_Engles [he/him, any]
    ·
    10 months ago

    I feel like if I'd been given access to these LLMs in school, I'd have spent just as much time editing the essay the AI shat out as I would have on a real essay. Then bragged about how I didn't have to write it.

    Do they not edit the output?

    • red_stapler [he/him]
      ·
      edit-2
      10 months ago

      This would be me. It's probably a neurodivergence thing: I can't turn a blank Word document into what I'm thinking about to save my life, but I'll spend an hour refining an LLM prompt and then editing out the cringe hallucinations. I guess I need at least a foundation to get started? 🤷

      • ChestRockwell [comrade/them, any]
        hexagon
        ·
        edit-2
        10 months ago

        Ironically, neurodivergence is probably one of the few "legitimate" uses of this stuff that I'm ok with in the classroom. The situation you described is one I'm actually sympathetic to, and if a student actually had this as their process, I would be far more open to it. Because what you describe there is an actual writing process. Yes, it's not my process, but producing a bunch of slop then hunting through it for the useful material, editing out the hallucinations, etc. -- that's a real writing process! It's similar to a "shitty first draft" on some levels.

        I have some philosophical concerns about this still -- namely, in getting that initial output, you might start contorting yourself to "fit" the machine. Like, prompt design doesn't actually direct the machine -- the machine is directing you in many real ways. I'm very anxious about this in terms of agency, etc. -- i.e., an English language learner just washing their own voice and thinking away in the slurry of slop produced by GPT.

        But that's more a philosophical than pedagogical concern. The fact is, if neurodivergent students wanted to propose that kind of process, I'd be open to it as long as they were open with me about it. Some people write the whole essay in one night, some people produce it over many iterative steps. There's no one right way to write, but you have to do that kind of thinking/engagement that you're describing.

        P.S. my non-GPT way to get that "foundation" on the page is just copy all the relevant quotes/evidence/whatever down into a document, then start doing some commentary/explanation. Then once you have even a token bit of commentary, you can activate the dialectical process of writer/reader to produce more out of it.

        • red_stapler [he/him]
          ·
          edit-2
          10 months ago

          in getting that initial output, you might start contorting yourself to "fit" the machine. Like, prompt design doesn't actually direct the machine

          Perhaps. In my case I know roughly what output I want, so I work with various outputs until I get to something I can take over from.

          This is all academic since I’m middle aged now and I just want a quick description of my D&D character. chomsky-yes-honey

    • ChestRockwell [comrade/them, any]
      hexagon
      ·
      10 months ago

      I don't know. They clearly didn't do any checking of it, and if anything, it might have also been run through machine translation.

      This is also the citations-needed philosophy - to actually make GPT output good is probably a net zero or negative in comparison to the actual work to just... write.

      • YearOfTheCommieDesktop [they/them]
        ·
        10 months ago

        this is my philosophy on using it to write code as well. Every damn time I see someone do it, it comes out subtly or blatantly broken, and you have to develop just as much expertise to fix the subtle issues as you'd need to just write it yourself to begin with.

        What worries me is the people that use it to get a basis to work from, fix the obvious problems, and leave in place the unspoken assumptions and subtle bugs and erroneous approaches that look reasonable at a glance. I've seen my coworkers do it already. At best, it mirrors the misconceptions you give it in your prompt, at worst it comes up with its own wrong ways of doing the thing.
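        To illustrate the kind of "looks reasonable at a glance" bug I mean -- this is a made-up minimal Python sketch, not anyone's actual code:

        ```python
        def moving_average(values, window=3):
            """Average of the last `window` items -- the kind of helper an LLM
            will happily generate. It looks fine, but it silently divides by the
            full window size even when fewer values are available, skewing the
            result for short inputs."""
            recent = values[-window:]
            return sum(recent) / window  # subtle bug: should divide by len(recent)

        def moving_average_fixed(values, window=3):
            """Same helper with the divisor corrected to the actual count."""
            recent = values[-window:]
            return sum(recent) / len(recent)
        ```

        With a full window the two agree, so a happy-path test passes. Feed in only two values and the first returns 1.0 instead of 1.5 -- exactly the kind of thing that survives a glance and a quick demo.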

  • TerminalEncounter [she/her]
    ·
    edit-2
    10 months ago

    I'm trying to understand the mindset of doing this. I imagine they were hot shit, or at least pretty smart, in high school and coasted a lot, never really having to apply themselves or struggle through school. And during covid online schooling I'm sure they slacked off and built some bad habits.

    And then they encounter this problem at school, it's harder than they're used to, they can't just slack off and bullshit an essay and get a B or A off that. They could fail. And instead of asking for guidance from the prof they're indirectly paying, or asking for more time, they go straight to academic dishonesty and present someone/something else's work as their own (and in your case - they continue to lie about it lol). I just don't understand why go to cheating so quickly??? You're in school, you're paying, you have access to your instructor and presumably a writing center, you have your fellow students... like why turn in obvious chatgpt essays. Is it just anxiety or fear? Or do they think they're entitled to get away with it? I don't understand.

    • ChestRockwell [comrade/them, any]
      hexagon
      ·
      edit-2
      10 months ago

      I REMINDED THEM ALL EVERY WEEK I HAD OFFICE HOURS WEEKLY BY ZOOM APPOINTMENT AT ANY TIME AND ONLY TWO STUDENTS EVER USED IT.

      And yes, I wonder a lot about anxiety. However, I'd like to think I'm fairly approachable and try to make it clear office hours is there for them. It's really baffling.

      • TrudeauCastroson [he/him]
        ·
        edit-2
        10 months ago

        If they're like me in university they probably left it last minute and started writing it too late to use any office hours to try to improve their writing.

        This sort of thing might be mitigated by having mini pre-deadlines where you have to submit what you've written but it doesn't count for many marks. Or, if you have seminars/tutorials as part of the class, you could have students discuss the sources they've scoped out but haven't written about yet.

        It is a little bit hand-holdy for university, but if these are kids whose final high school year happened during covid, then they all have habits even worse than mine were.

        It's kind-of like doing an enforced writing-workshop since you at least get a rough draft a lot quicker and they have something to bring to any workshop.

        • ChestRockwell [comrade/them, any]
          hexagon
          ·
          edit-2
          10 months ago

          I had those draft checkpoints. After my feedback, instead of coming to office hours, they responded by spooling up the AI material.

          It's really depressing, because I even said in my feedback "you should come to office hours." Like, come talk to me about these half-baked ideas. Instead, they went to the AI to churn out crap.

          I should note, this is a summer online class. So I suspect that contributed a lot as well - they didn't "feel" like they were in class, so the material they submitted was often late and not up to snuff.

          • TrudeauCastroson [he/him]
            ·
            edit-2
            10 months ago

            Was their original material you provided feedback on written by themselves?

            If so, that's pretty strange to me. Maybe it indicates they skated by in highschool effortlessly and have no idea how to write better, and instead of trying to write better when given critique they gave up and used AI?

            I feel for you; I have no idea how to make these people want to learn and put effort in either. My major didn't require writing essays and such, but I did have to take some gen-ed classes, which I found pretty interesting.

            I did my essays within the span of 2 days though, and if I cared more about grades and it was available maybe I would've used AI, idk. I had good English marks in highschool, but my university essays were usually C- to B- because I didn't do writing workshops, and I sort of regret not putting in more effort but I was struggling with my main STEM degree focus at the time.

            Online definitely makes it harder to engage people.

            • ChestRockwell [comrade/them, any]
              hexagon
              ·
              10 months ago

              Was their original material you provided feedback on written by themselves?

              Perhaps a mix. The drafts have too little to tell. So my working theory is they just ended up behind the 8 ball and made some questionable choices.

    • YearOfTheCommieDesktop [they/them]
      ·
      edit-2
      10 months ago

      why go to cheating so quickly??? You're in school, you're paying, you have access to your instructor and presumably a writing center, you have your fellow students... like why turn in obvious chatgpt essays. Is it just anxiety or fear?

      They aren't there to learn. I know a lot of college and HS age kids and not very many of them really get into a learning mindset, they are there out of obligation, or to get something else they want, or because their parents insisted.

      Also, depending on what year they are in, they probably built these habits in HS tbh, especially if they had teachers who didn't read their work in detail or never had to diverge from "analyze this well-known novel," which I imagine the AI is more convincing at, considering the wealth of source material. I've met some of these kids; it's just the next step from the "haha I never read any assigned books, just the CliffsNotes if anything, and I still pass" mindset.

      Personally, I started to get into that "learning mindset" with certain classes in college and an extracurricular or two in HS, but in college I was already committed to a major that I already knew a lot about and that increasingly didn't even grab my interest, so I dropped out rather than force myself to go through the motions for several more years of part-time school that I mostly hated. I might go back for something like philosophy or sociology eventually, but the desire to actually learn, not just pass, has to be there.

      • BigHaas [he/him]
        ·
        10 months ago

        Exactly, the goal of college is to obtain a job permission slip as quickly, cheaply, and easily as possible.

        • YearOfTheCommieDesktop [they/them]
          ·
          10 months ago

          which is a shame, to be fair, but it isn't the kids' fault that it's being shaped to be more and more that way, so I have a hard time really blaming them. But for all our sakes I hope it can be turned around. A real education can be a profoundly liberating thing.

    • ChestRockwell [comrade/them, any]
      hexagon
      ·
      10 months ago

      I think all the credentialing is antithetical to my pedagogy. Fundamentally I try to teach as a sophist, so the students are often annoyed since the whole idea is there's no "correct" answer so much as better or worse strategies. By focusing on the process by which we arrive at truth (i.e. dialectic and different forms of evidence/truth claim), I hope for my students to come out able to recognize bullshit and misleading claims.

      It's a little bit LIB perhaps, but my class always turns in the end towards the failures of liberalism since students have to argue for solutions to systemic labor problems. More often than not they are brought to face the failures of liberal democratic institutions in protecting workers, etc. I also try to use these moments to emphasize the way that different value systems create different conclusions (i.e. liberalism will value democratic "process" over results. However if you frame your argument around results, then liberal American democracy usually isn't the answer).

      So I do wonder how much of students' drive to cheat, or to just get through with minimal effort, comes from the material conditions you talk about. After all, rhetoric and argument and writing just aren't comfortable in general. So while having no time limit and less credentialization would go a long way, there would likely still be students who don't want to get into that mode of being your own reader/critic and anticipating what others would say, which is sort of the key to rhetorical/humanistic thought.

      The nice thing is if there weren't credentials at the end like there are now, there'd be no downside to just quitting the class if you don't want to think about these things.