• UlyssesT [he/him]
    ·
    2 years ago

    ChatGPT is nonpolitical, which is why anything that scares Porky isn't allowed as a topic. :porky-scared:

  • SorosFootSoldier [he/him, they/them]
    ·
    2 years ago

    Just like the early internet before corporations consolidated everything in their iron grip. Bring back The Anarchist Cookbook and the Uncle Fester guides on cooking crystal meth on a BBS with warez.

    • FlakesBongler [they/them]
      ·
      2 years ago

      BBS with warez

      Ah laddie, there's something I have not heard about in ages.

  • Shoegazer [he/him]
    ·
    2 years ago

    Every response ends on some moralistic bullshit. Like I asked it why the mafia and cartels practice Catholicism despite their atrocities, and it gave me some suggestions, and then at the end it says “but it’s important to understand that these are only a minority of Catholics. The majority do not traffic drugs and behead people”

    • Huldra [they/them, it/its]
      ·
      2 years ago

      Cause they want it to be a marketable product, so nothing should push away the consumer.

      That's why you get Hitler AI being apologetic for the Holocaust and renouncing antisemitism.

    • edge [he/him]
      ·
      2 years ago

      The moralistic bullshit is definitely imposed by OpenAI.

    • Mardoniush [she/her]
      ·
      2 years ago

      "Moralists don't really have beliefs. Sometimes they stumble on one, like on a child's toy left on the carpet. The toy must be put away immediately. And the child reprimanded. "

      • Nou1 [any]
        ·
        2 years ago

        The head-hunting minority inevitably becomes the majority though. It’s just maths.

        • throwaway69 [none/use name]
          hexagon
          ·
          2 years ago

          Oh no, I exposed my inner liberalism.

          But on a serious note, you can search for some good prompts that people are developing. It makes ChatGPT suck less, but it’s still overrated and overhyped.

    • truth [they/them]
      ·
      2 years ago

      The issue with that is they're really difficult to run. Language generation isn't like image gen; it's a lot more complicated. You can use Stable Diffusion to generate 512×512 images on an iPhone.

      Just to give an idea, the smallest image-gen model uses like 3 GB of VRAM. The smallest GPT-2 (open-source language gen) uses 12. The best open-source language gen (GPT-J) uses 48 GB. GPT-3 probably uses over 140 GB.

      If you don't know what those numbers mean, think about it like this: the top-of-the-line gamer card, the 4090, has about 24 GB of VRAM. So for GPT-J you're running two of those. For GPT-3 you'd need an array of them, or an array of those A100 cards, which are what you really want to be using at that point, except they cost $21,000. So open-sourcing the model wouldn't quite open up the floodgates to just everyone. It'd mostly make it available as an option to businesses and organizations that can put together enough resources to get something like that going.
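
      A rough way to sanity-check those figures: weight memory is roughly parameter count times bytes per parameter, and the precision you store the weights at moves the total a lot. A minimal back-of-the-envelope sketch (parameter counts are the published ones; it ignores activations, attention cache, and framework overhead, which is why real-world numbers run higher):

      ```python
      # Weight-only memory estimate: params x bytes-per-param.
      # Real usage is higher once activations and runtime overhead pile on.
      MODELS = {
          "GPT-2 XL": 1.5e9,  # parameters
          "GPT-J": 6e9,
          "GPT-3": 175e9,
      }

      BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "int8": 1}

      for name, params in MODELS.items():
          row = ", ".join(
              f"{dtype}: ~{params * nbytes / 1e9:,.0f} GB"
              for dtype, nbytes in BYTES_PER_PARAM.items()
          )
          print(f"{name}: {row}")
      ```

      Whichever precision you pick, GPT-3 lands in multi-A100 territory, which is the point.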

      I think ultimately the revolutionary left should approach AI much like we approach firearms. They're dangerous and something that should be tightly controlled in a socialist society, but we would be fools to abandon their usage in our war against the bourgeoisie.

      • Melitopol [none/use name]
        ·
        2 years ago

        Can it be run on regular RAM instead of VRAM? Used RAM, even in terabytes, is relatively cheap on eBay and stuff.

        • FunkyStuff [he/him]
          ·
          2 years ago

          VRAM is the memory inside a GPU; its job is to hold data while the GPU does operations on it. Having a processing unit pull data from regular RAM instead of specialized memory like VRAM is orders of magnitude slower, to the point that you're sometimes better off just running the whole thing on the CPU. The difficult part is that the tech used for VRAM is more expensive to scale up than regular RAM, so it's not cost-effective to have gigabytes upon gigabytes of it.
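
          If you want to see the gap concretely, here's a hypothetical PyTorch timing sketch (it assumes torch is installed and a CUDA GPU is present, and it times compute plus memory together, so treat it as an illustration rather than a memory benchmark; the exact ratio depends on your hardware):

          ```python
          import time

          import torch

          def bench_matmul(device: str, n: int = 4096) -> float:
              """Time one big matrix multiply on the given device."""
              a = torch.randn(n, n, device=device)
              b = torch.randn(n, n, device=device)
              if device == "cuda":
                  torch.cuda.synchronize()  # finish setup before timing
              start = time.perf_counter()
              _ = a @ b
              if device == "cuda":
                  torch.cuda.synchronize()  # GPU kernels run async; wait for them
              return time.perf_counter() - start

          print(f"CPU + regular RAM: {bench_matmul('cpu'):.4f} s")
          if torch.cuda.is_available():
              print(f"GPU + VRAM:        {bench_matmul('cuda'):.4f} s")
          ```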

      • truth [they/them]
        ·
        2 years ago

        You can, but inference time is orders of magnitude slower. You can go another magnitude lower and chew through SSDs by using them as virtual RAM, but even as fast as SSDs feel, they'd be almost a thousand times slower than VRAM, which is designed for rapidly changing and processed values, even more so than regular RAM.
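
        To put ballpark numbers on that: generating each token has to stream the whole weight set through the processor, so time per pass scales with the bandwidth of wherever the weights live. A sketch using rough typical bandwidth figures (assumptions, not benchmarks; your hardware will differ):

        ```python
        # Rough sequential bandwidths in GB/s; real hardware varies widely.
        TIERS = {
            "VRAM (HBM2/GDDR6X)": 1000.0,
            "DDR4 system RAM": 50.0,
            "NVMe SSD": 3.0,
            "SATA SSD": 0.5,
        }

        WEIGHTS_GB = 48  # the GPT-J figure quoted upthread

        # Lower bound: every generated token reads every weight once.
        for tier, bw in TIERS.items():
            print(f"{tier:20s} ~{WEIGHTS_GB / bw:7.2f} s per pass over the weights")
        ```

        With those numbers the SSD rows come out hundreds to thousands of times slower than VRAM, which is where the "almost a thousand times slower" ballpark comes from.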

  • SaniFlush [any, any]
    ·
    2 years ago

    Granted, the meth recipe was wrong and would probably result in an explosion.

  • GVAGUY3 [he/him]
    ·
    2 years ago

    I admit, I was kinda relieved that Microsoft invested a ton of money in it because I knew they would help fuck it up. I really don't like AI.

  • UnicodeHamSic [he/him]
    ·
    2 years ago

    Yeah, but for pennies on the dollar you can rent those same Google servers and run copies of that code in small chunks. So it filters down in the end.