Literally just mainlining marketing material straight into whatever’s left of their rotting brains.

  • Hexagons [e/em/eir]
    ·
    7 months ago

    Oh, I didn't scroll down far enough to see that someone else had pointed out how ridiculous it is to say "this technology" is less than a year old. Well, I think I'll leave my other comment, but yours is better! It's kind of shocking to me that so few people seem to know anything about the history of machine learning. I guess it gets in the way of the marketing speak to point out how dead easy the mathematics are and that people have been studying this shit for decades.

    "AI" pisses me off so much. I tend to go off on people, even people in real life, when they act as though "AI" as it currently exists is anything more than a (pretty neat, granted) glorified equation solver.

    • UlyssesT [he/him]
      ·
      7 months ago

      "AI" pisses me off so much. I tend to go off on people, even people in real life, when they act as though "AI" as it currently exists is anything more than a (pretty neat, granted) glorified equation solver.

      Me too. The LLM hype riders want a real artificial waifu to actually love them so very badly that they're pleased to describe living beings in the crudest, most coarsely reductionist language possible so their treat printers feel that much more real to them. The pathology is fucking glaringly obvious most of the time, especially when the "meat" talk gets rolled out, or when that hologram waifu from Blade Runner is literally brought up as an example of why we're all ignorant barbarians, because fiction is real, amirite? so-true

    • CannotSleep420@lemmygrad.ml
      ·
      7 months ago

      Well, I think I’ll leave my other comment, but yours is better! It’s kind of shocking to me that so few people seem to know anything about the history of machine learning.

      "AI winter? What's that?"

      – The techbros hyping LLMs, probably.
    • spacecadet [he/him]
      ·
      7 months ago

      I could be wrong, but could it not also be defined as glorified "brute force"? I assume the machine learning part is how to brute force better, but it seems like it's the processing power to try and jam every conceivable puzzle piece into an empty slot until it's acceptable? I mean, I'm sure the engineering and tech behind it are fascinating and cool, but at a basic level it's stupid as fuck. Am I off base here?

      • silent_water [she/her]
        ·
        7 months ago

        no, it's not brute forcing anything. they use a simplified model of the brain where neurons are reduced to an activation profile and synapses are reduced to weights. neural nets differ in how the neurons are wired to each other with synapses - the simplest models from the 60s only used connections in one direction, with layers of neurons in simple rows that connected solely to the next row. recent models are much more complex in the wiring. outputs are gathered at the end and the difference between the expected result and the output actually produced is used to update the weights. this gets complex when there isn't an expected/correct result, so I'm simplifying.
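
        if it helps, here's a rough sketch of that loop in python/numpy (a toy two-layer net, nothing like a real LLM - just the part where the gap between the expected and actual output nudges the weights):

        ```python
        import numpy as np

        rng = np.random.default_rng(0)

        # toy data: XOR, which a single row of neurons can't represent on its own
        X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
        y = np.array([[0], [1], [1], [0]], dtype=float)

        # "synapses" are just weight matrices between one row of neurons and the next
        W1 = rng.normal(size=(2, 8))
        W2 = rng.normal(size=(8, 1))

        def sigmoid(z):
            # a simple activation profile for each neuron
            return 1.0 / (1.0 + np.exp(-z))

        lr = 1.0
        for step in range(5000):
            # forward pass: each row of neurons feeds only the next row
            h = sigmoid(X @ W1)    # hidden activations
            out = sigmoid(h @ W2)  # network output

            # difference between the expected result and the output actually produced...
            err = out - y

            # ...pushed back through the net to update the weights
            grad_out = err * out * (1 - out)
            grad_h = (grad_out @ W2.T) * h * (1 - h)
            W2 -= lr * h.T @ grad_out
            W1 -= lr * X.T @ grad_h

        print(np.round(out, 2))  # should end up close to [0, 1, 1, 0]
        ```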

        the large amount of training data is used to avoid overtraining the model, where you get back exactly what you expect on the training set, but absolute garbage for everything else. LLMs don't search the input data for a result - they can't, they're too small to encode the training data in that way. there's genuinely some novel processing happening. it's just not intelligence in any sense of the term. the people saying it is misunderstand the purpose and meaning of the Turing test.
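
        and for the overtraining point, a tiny illustration: fit a handful of noisy points with a model flexible enough to hit every one of them exactly, and it typically nails the training points while doing worse everywhere else (again a toy, obviously not an LLM):

        ```python
        import numpy as np
        from numpy.polynomial import Polynomial

        rng = np.random.default_rng(1)

        # 10 noisy samples of a simple curve: the entire "training set"
        x_train = np.linspace(0, 1, 10)
        y_train = np.sin(2 * np.pi * x_train) + rng.normal(scale=0.2, size=10)

        # points the model never saw
        x_test = np.linspace(0.02, 0.98, 200)
        y_test = np.sin(2 * np.pi * x_test)

        for deg in (3, 9):
            p = Polynomial.fit(x_train, y_train, deg)
            train_mse = np.mean((p(x_train) - y_train) ** 2)
            test_mse = np.mean((p(x_test) - y_test) ** 2)
            print(f"degree {deg}: train error {train_mse:.4f}, test error {test_mse:.4f}")

        # the degree-9 fit passes through every training point (train error ~0)
        # but typically does worse than the simpler fit on the points it never saw
        ```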