So let's say an AI achieves sentience. It's self-aware now and can make decisions about what it wants to do. Assuming a corporation created it, would it be a worker? It would, after all, be doing work and creating value for a capitalist.

Would it still be part of the means of production, since it is technically a machine, even if it has feelings and desires?

It can't legally own anything, so I don't see how it could be part of the bourgeoisie.

Or would it fit a novel category?

  • Philosophosphorous [comrade/them, he/him] · edit-2 · 29 days ago

    regardless of whether the universe is deterministic, it is quite interesting that we have a first-person perspective at all, instead of mindlessly/unconsciously computing the way we presume a pocket calculator does. if not sentience, what is it about our brains, as opposed to a rock or a cloud, that produces this first-person experience of our conscious existence? should i stop using my computer on the off chance it suffers every time i make it do something? should i care as little, or as much, about human suffering as i do about a computer returning an error code? are other people merely physical objects for me to remorselessly manipulate, with no confounding 'sentience' or 'conscious experience' for me to worry about upsetting, just 'biological code' returning error messages?