• came_apart_at_Kmart [he/him, comrade/them]
    ·
    17 hours ago

recently i had to listen to someone talk about how great AI is, someone i personally would have assumed knows better. what they like about it is that they can generate graphics without knowing graphic design and written copy without having to write. they see it as a way for people without those skills to generate work product that, prior to "AI", would have taken training. notably, this person does not have these skills in the abundance they would like, so this lets them contribute.

i understand the logic of that, but it seems like a strategy that bets long on platforms not enshittifying and short-changes any organizational strategy to address skill debt. also, when it comes to written copy, i think we've all seen examples of the sort of copy AI generates that is uninformative noise. few things are more frustrating to me today than trying to find information on something, finding an article whose SEO fully sells its worthiness, then reading like 4-5 paragraphs of good syntax and grammar thinking i am about to learn something, and realizing it is just going in circles and yielding zero knowledge or analysis. i don't even know how to describe that frustration to people sometimes, because it feels like a lot of people don't do this sort of research in their roles anymore... even when they are supposed to. they, instead, skim things and now let search engine AI deliver quick answers to them... quick answers which can be stovepiped by other AI-written copy with good SEO in an ouroboros of fabricated bullshit.

    having these systems "teach" younger humans seems like probably one of the stupidest ideas i've heard in months.