archive: https://archive.ph/ZJdOS

  • blobjim [he/him]
    ·
    2 days ago

    Just running data through the resulting model (inference) is still somewhat expensive, since these models have so many parameters. And for a lot of uses you also want to keep training the model on the new data you're putting through it anyway.
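
    A rough sketch of that scaling intuition, assuming a dense transformer and the common rule of thumb of roughly 2 FLOPs per parameter per generated token (the model size and reply length below are made-up examples, not figures from the thread):

    ```python
    # Back-of-envelope inference cost for a dense transformer.
    # Rule of thumb: one forward pass costs ~2 FLOPs per parameter per token.
    def inference_flops(num_params: float, num_tokens: int) -> float:
        return 2.0 * num_params * num_tokens

    # Hypothetical 70-billion-parameter model producing a 500-token reply:
    print(f"{inference_flops(70e9, 500):.1e} FLOPs")  # ~7.0e+13 FLOPs
    ```

    Cheap next to a training run, but far from free once multiplied across every query.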

    • UlyssesT [he/him]
      ·
      2 days ago

      The proselytizer treated it as a gotcha, so I appreciate the additional information.

      • blobjim [he/him]
        ·
        2 days ago

        In their defense, I'm sure there are tons of actually useful machine learning models that don't use that much power once trained.

        I have an iPhone with Face ID, and I think the way they did that was to train a model on lots of people's faces, ship that expensive-to-train model with the operating system, and then have it train a little bit more as you use Face ID (there's a rough sketch of that pattern at the end of this comment). I can't imagine each run uses much power, since the algorithm runs every time you unlock the phone.

        I'm sure any model worth anything probably does require a lot of training, and therefore a lot of energy. Whether that's worth it really depends on how useful the model turns out to be.
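
        A minimal sketch of the general pattern described above (ship a pretrained embedding model, then do a cheap bit of on-device adaptation at enrollment time), not Apple's actual Face ID implementation; the embedding function, vector size, and threshold are all illustrative stand-ins:

        ```python
        import numpy as np

        def embed(face_image: np.ndarray) -> np.ndarray:
            """Stand-in for the expensive-to-train embedding network shipped with the OS."""
            # Hash-seeded fake embedding so the sketch runs without a real model.
            rng = np.random.default_rng(abs(hash(face_image.tobytes())) % (2**32))
            vec = rng.standard_normal(128)
            return vec / np.linalg.norm(vec)

        class FaceEnrollment:
            """The 'trains a little bit more on-device' step: average a few owner embeddings."""

            def __init__(self) -> None:
                self.template = None

            def enroll(self, images) -> None:
                vecs = np.stack([embed(img) for img in images])
                template = vecs.mean(axis=0)
                self.template = template / np.linalg.norm(template)

            def unlock(self, image, threshold: float = 0.8) -> bool:
                # One forward pass plus a dot product per unlock: cheap next to training.
                return float(embed(image) @ self.template) >= threshold

        # Re-using one array below only because the stand-in embedding has no notion
        # of "same face across photos"; a real network would map them close together.
        owner_photo = np.zeros((64, 64), dtype=np.uint8)
        stranger_photo = np.ones((64, 64), dtype=np.uint8)
        enrollment = FaceEnrollment()
        enrollment.enroll([owner_photo] * 3)
        print(enrollment.unlock(owner_photo))     # True: identical input, identical embedding
        print(enrollment.unlock(stranger_photo))  # almost certainly False: uncorrelated embedding
        ```

        The expensive part, training the embedding network on lots of faces, happens once upstream; what ships to the phone only has to run a forward pass and a comparison at each unlock.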