https://www.twitter.com/heavenrend/status/1793346515261432027

  • Frank [he/him, he/him]
    ·
    6 months ago

    Oh look, it has no ability to recognize or manipulate symbols, and no referents for what those symbols would even represent.

    • laziestflagellant [they/them]
      ·
      edit-2
      6 months ago

      It's funny. If you asked any of these AI-hyping effective altruists whether a calculator 'understands' what numbers mean, or whether a video game's graphics engine 'understands' what a tree is, they'd obviously say no. But since this chunky excessive calculator's outputs are words instead of numbers or textured polygons, suddenly it's sapient.

      • invalidusernamelol [he/him]
        ·
        6 months ago

        It's the British government's fault for killing Alan Turing before he could scream from the rooftops that the Turing test isn't a measure of intelligence

        • Philosoraptor [he/him, comrade/them]
          ·
          6 months ago

          The fact that it's treated that way is just evidence that none of the AI bros have actually read "Computing Machinery and Intelligence." It's like the first fucking line of the paper.

          • BeamBrain [he/him]
            ·
            6 months ago

            AI bros are the absolute worst.

            When I was in college, one of our class assignments was to use a circuit design language to produce a working, Turing-complete computer from basic components like 1-bit registers and half-adders. It really takes the mystique out of computation: you see just how basic and mechanical computers are at their core. You feed signals into input lines, and the input gets routed, stored, and/or output according to a handful of deterministic rules. At its core, every computer is doing the same thing that basic virtual computer did, just with more storage, wider bit widths, more predefined operations, and fancier ways to render its output. Once you understand that, the idea of a computer "becoming self-aware and altering its programming" is just ludicrous.
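
            As a rough sketch (in Python here rather than the circuit design language we actually used, so take it as an illustration and not the real assignment), the building blocks really are that dumb: a half-adder is just XOR and AND, and a 1-bit register just keeps its old value unless the write line is high.

            ```python
            # Toy sketch of the basic components described above, not the original class assignment.
            def half_adder(a: int, b: int) -> tuple[int, int]:
                """Add two 1-bit inputs: the sum bit is XOR, the carry bit is AND."""
                return a ^ b, a & b

            class OneBitRegister:
                """Stores a single bit; it only changes when the write-enable line is high."""
                def __init__(self) -> None:
                    self.value = 0

                def clock(self, data: int, write_enable: int) -> int:
                    if write_enable:
                        self.value = data
                    return self.value

            # 1 + 1 = 10 in binary: sum bit 0, carry bit 1
            print(half_adder(1, 1))                  # (0, 1)
            reg = OneBitRegister()
            print(reg.clock(1, write_enable=1))      # 1
            print(reg.clock(0, write_enable=0))      # still 1, nothing was written
            ```

            Everything else is just more of these wired together with routing rules.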

      • HexBroke
        ·
        edit-2
        5 months ago

        deleted by creator

    • yoink [she/her]
      ·
      6 months ago

      no but you see it's exactly like how a brain works (i have no idea how a brain works)

    • facow [he/him, any]
      ·
      edit-2
      6 months ago

      We love Chinese rooms, don't we folks?

      Just one more filing cabinet of instructions and we'll be done building god. I'm sure of it

  • Dirt_Owl [comrade/them, they/them]
    ·
    edit-2
    6 months ago

    "It still has bugs"

    Then why have you implemented it before it's safe to do so? Shit like this would have gotten most products recalled, or their makers sued, back in the day for endangering people with false information.

    Capitalism is totally off its rocker

  • Water Bowl Slime@lemmygrad.ml
    ·
    6 months ago

    Isn't that what food photographers do to make pizza cheese look stretchier? The bot should recommend nailing the pizza slice down to the table next lol

  • btfod [he/him, comrade/them]
    ·
    6 months ago

    Silly Putty is non-toxic and provides even better tack. Get the red kind and stretch it over the dough in place of sauce

  • AlicePraxis
    ·
    edit-2
    4 months ago

    deleted by creator

    • GalaxyBrain [they/them]
      ·
      6 months ago

      I know a LOT about making pizza. I don't understand how the cheese isn't sticking in the first place. Do they not cook the pizza?

      • TheLastHero [none/use name]
        ·
        6 months ago

        I can only assume Americans are using so much oil on pizza that all structural integrity is compromised and it's more like greasy tomato soup served on flatbread

        • GalaxyBrain [they/them]
          ·
          6 months ago

          Too much sauce could be the culprit as well: if the cheese is floating on a lake of sauce while melting, you've got problems. Another is LET YOUR PIZZA SET FOR LIKE 5 MINUTES. If they're eating without waiting, the cheese doesn't have a chance to resettle and will be hard to get a solid bite on, so you end up dragging the whole mass. Also likely is that they're using wayyyy too much cheese and that mozzarella is turning into a big disc of borderline bocconcini.

            • GalaxyBrain [they/them]
              ·
              6 months ago

              Resist the temptation. Just like, go into a different room for a bit. This applies to most hot foods as well. Things keep cooking after they're taken off a heat source, and waiting until it's all the way finished does make a big difference.

      • Barx [none/use name]
        ·
        6 months ago

        Maybe they're using a cheese that doesn't melt as easily and a low temp oven? It doesn't make sense to me, either.

      • TheBroodian [none/use name]
        ·
        6 months ago

        I don't know about elsewhere on the planet, but in the USA pre-shredded cheese sold at the grocery store is usually dusted with an anti-caking powder (typically cellulose or starch) to keep the shreds from re-amalgamating. Consequently, this shredded cheese takes longer and higher temperatures to melt and reincorporate unless it's rinsed off first. Most Americans aren't aware of this, so shredded cheese topping on shit often comes out badly

        • invalidusernamelol [he/him]
          ·
          6 months ago

          Sauce should really be a topping. Your base should be oil and maybe some tomato paste and garlic.

          The sauce-heavy big 3 in America really don't know how to make pizza.

  • aaaaaaadjsf [he/him, comrade/them]
    ·
    edit-2
    6 months ago

    Ok, I understand how the AI got this one wrong. You can make a "glue" by mixing flour and water together. You can also thicken a sauce by adding flour. So the AI just jumbled it all up. In its dataset it's got "flour + water = non-toxic glue," and adding flour to a sauce, which contains water, also thickens the sauce. So in the AI's world, this makes perfect sense: adding the "non-toxic glue" to the sauce will make it thicker.

    This just shows how unintelligent so-called "Artificial Intelligence" actually is. It reasons like a toddler. It can't actually think for itself; all it can do is try to link things from its dataset that it thinks are relevant to the query.

    • laziestflagellant [they/them]
      ·
      6 months ago

      You're actually giving it too much credit here. It seems to have lifted the text from a reddit joke comment that got shared/archived/reposted a lot (enough?) and was therefore one of the 'most frequent' text strings returned on the subject

    • OutrageousHairdo [he/him]
      hexagon
      ·
      6 months ago

      This is pure speculation. You can't see into its mind. Commercially implemented AIs have recommended recipes that involve poison in the past, including one for mustard gas, so to give it the benefit of the doubt and assume it was even tangentially correct is giving it more slack than it has earned.

    • ziggurter [he/him, comrade/them]
      ·
      6 months ago

      So in the best case it regularly employs one of the most basic and widely known logical fallacies, affirming the consequent (flour + water -> non-toxic glue, "therefore" non-toxic glue -> flour + water). Sorry, but if your attempt to make a computer do inductive reasoning throws out the deductive reasoning that computers, in their simplicity, have always been good at, then I think you've failed not only at "artificial intelligence" but at life.
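
      If you want it spelled out (toy Python, my own illustration, not anything the bot actually runs): "P implies Q" can be true in a row of the truth table where "Q implies P" is false, so the one never licenses the other.

      ```python
      # Toy truth-table check: P -> Q does not entail Q -> P.
      # P: "the mixture is flour + water", Q: "the mixture is a non-toxic glue".
      from itertools import product

      def implies(p: bool, q: bool) -> bool:
          return (not p) or q

      counterexamples = [(p, q) for p, q in product([False, True], repeat=2)
                         if implies(p, q) and not implies(q, p)]
      print(counterexamples)  # [(False, True)]: a non-toxic glue that isn't flour + water
      ```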

    • Thordros [he/him, comrade/them]
      ·
      6 months ago

      As a large language model, I cannot advocate for or against holding one's beer as they dive into the ol' Reddit switcharoo link chain.

  • CyborgMarx [any, any]
    ·
    6 months ago

    This shit is gonna get people killed, this is on the level of those old 4chan toxic fume "pranks"

  • D61 [any]
    ·
    6 months ago

    100% correct statement.