• xigoi@lemmy.sdf.org
    ·
    9 months ago

    “100% Open Source”

    [links to two proprietary services]

    Why are so many projects like this?

    • ☆ Yσɠƚԋσʂ ☆@lemmy.ml
      hexagon
      ·
      9 months ago

      I imagine it's because a lot of people don't have the hardware that can run models locally. I do wish they didn't bake those in though.

      • Wes_Dev@lemmy.ml
        ·
        9 months ago

        They all work well enough on my weak machine with an RX580.

        Buuuuuuuuuut, RWKV had some kind of optimization going that makes it two or three times faster at generating output. The problem is that you have to be more aware of the order of your input. It has a hard time going back to a previous sentence, for example.

        So you'd want to say things like "In the next sentence, identify the subject." and not "Identify the subject in the previous text."
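        A minimal sketch of that ordering constraint (plain Python, no RWKV library; the prompt strings are illustrative, not from any RWKV docs):

        ```python
        # Sketch: with a recurrent model like RWKV, the instruction should come
        # BEFORE the text it applies to, since the model consumes input strictly
        # left to right and can't revisit earlier tokens the way a transformer can.

        def build_prompt(instruction: str, text: str) -> str:
            """Place the instruction first so the model knows what to look for
            while it reads the text."""
            return f"{instruction}\n\n{text}"

        # Tends to work: the model sees the task, then reads the sentence.
        good = build_prompt("In the next sentence, identify the subject.",
                            "The quick brown fox jumps over the lazy dog.")

        # Tends to fail: by the time the instruction arrives, the sentence has
        # already been consumed and compressed into the recurrent hidden state.
        bad = ("The quick brown fox jumps over the lazy dog.\n\n"
               "Identify the subject in the previous text.")

        print(good)
        ```

        Same information in both prompts; only the order differs, and for a recurrent model the order is what matters.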

  • Aria@lemmygrad.ml
    ·
    9 months ago

    So what exactly is this? Open-source ChatGPT alternatives have existed before and alongside ChatGPT the entire time, in the form of downloading oobabooga or a different interface and grabbing an open-source model from Hugging Face. They aren't competitive because users don't have terabytes of VRAM or AI accelerators.

    • Schlemmy@lemmy.ml
      ·
      edit-2
      9 months ago

      Edit: spelling. The Facebook LLM is pretty decent and has a huge number of tokens. You can install it locally and feed your own data into the model so it becomes tailor-made.

    • ☆ Yσɠƚԋσʂ ☆@lemmy.ml
      hexagon
      ·
      9 months ago

      It's basically a UI for downloading and running models. You don't need terabytes of VRAM to run most models, though. A decent GPU and 16 gigs of RAM or so work fine.
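      A rough back-of-envelope for why that's enough (the model sizes and quantization levels below are illustrative assumptions, not from this thread):

      ```python
      # Rough memory estimate for model weights: parameters * bits per parameter.
      # Quantized formats (e.g. 4-bit) shrink the footprint dramatically, which
      # is why mid-size models fit in consumer RAM/VRAM.

      def weight_memory_gb(params_billion: float, bits_per_param: int) -> float:
          """Weights-only footprint in decimal gigabytes."""
          bytes_total = params_billion * 1e9 * bits_per_param / 8
          return bytes_total / 1e9

      for params, bits in [(7, 16), (7, 4), (13, 4)]:
          print(f"{params}B @ {bits}-bit ≈ {weight_memory_gb(params, bits):.1f} GB")

      # 7B @ 16-bit ≈ 14.0 GB, 7B @ 4-bit ≈ 3.5 GB, 13B @ 4-bit ≈ 6.5 GB
      # (plus some extra for activations and the KV cache)
      ```

      So a quantized 7B or 13B model fits comfortably in 16 GB, while full-precision giant models are where the "terabytes of VRAM" complaint actually applies.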