• AtmosphericRiversCuomo [none/use name]
    ·
    1 month ago

    These models absolutely encode knowledge in their weights. Suggesting otherwise really just shows a lack of understanding of how these systems work.

    • KobaCumTribute [she/her]
      ·
      1 month ago

      Except they don't, definitionally. Some facts get tangled up in them and can be consistently regurgitated, but they fundamentally do not learn or model those facts. They no more have "knowledge" than image-generating models do, even if the image generators can correctly produce specific anime characters with semi-accurate details.