Their charter: https://openai.com/charter

OpenAI is the company behind ChatGPT, among other AI products. I try to stay out of the loop when it comes to AI because I end up hearing about it anyway, so I wasn't aware of this charter.

For the unaware, AGI stands for Artificial General Intelligence. It basically means a form of AI that is extremely advanced and general-purpose, like human intelligence. For contrast, ChatGPT and Stable Diffusion (for example) are highly specialised: the former generates text responses to text input and the latter generates images in response to text input.

Despite both of these present-day AI technologies being very impressive (even if their proprietors try to obscure the training and energy costs), the path to achieving AGI is pretty much inconceivable at present. Current AI technologies may have exploratory value towards achieving AGI in some far future, but AGI is most likely not going to be built on currently existing technologies; it is going to be a different beast altogether, if it ever exists at all.

Given this, I find it absolutely baffling that OpenAI is talking about AGI like they do. This is the same level of delusion as Elon Musk talking about Mars colonisation. But given that techbros see themselves as the stewards of the next step in civilisational evolution, I guess it should come as no surprise that they eat this shit up uncritically.

I'm not sure what role these generative AIs will play in the near future. I am trying to figure out whether they will primarily be sold to corporations to cut labour costs or to end users to boost productivity. But talk of AGI and the AI singularity and far-fetched shit like that is a pure marketing stunt.

  • albigu@lemmygrad.ml · 1 year ago (edited)

    Moore's law (and apparently now Moore himself lol) has been dead for a while, and cores can't really get much faster due to power dissipation. On the other hand, there are some really harsh physical and theoretical limits to parallelism (besides the points in that article, I've heard it's incredibly slow to share memory buses for the same memory across more than 16 cores). I wouldn't bet on us getting much more computing power than what we already have.
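    The classic statement of that theoretical limit is Amdahl's law: any serial fraction of a program caps the speedup you can get from more cores. A minimal sketch (my illustration, not from the comment; the 95% figure is an assumed example):

```python
# Amdahl's law: if a fraction p of a program parallelises perfectly,
# the speedup on n cores is 1 / ((1 - p) + p / n).
# Even with 95% of the work parallel, the speedup is capped at 20x
# no matter how many cores you throw at it.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / cores)

if __name__ == "__main__":
    for cores in (1, 4, 16, 64, 1024):
        print(f"{cores:>5} cores: {amdahl_speedup(0.95, cores):.2f}x")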

    The solution (for most of computing, not just AI) right now is to roll up our sleeves and start writing actually efficient software, because we can't expect the next generation of Intel processors to make our optimisations redundant anymore.

    That includes developing (possibly slightly worse, but) more compute-efficient ML models, but the big AI boys are allergic to that because they rely on funding from, or are directly owned by, the biggest cloud server providers.
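    One of the standard compute-efficiency tricks is quantisation: trading a little accuracy for a lot less memory and energy per weight. A toy sketch of symmetric int8 quantisation (my example, not something from the comment):

```python
# Symmetric int8 quantisation: map 32-bit float weights to signed
# 8-bit ints plus one scale factor, cutting memory per weight by 4x
# at the cost of a small rounding error.

def quantize(weights, num_bits=8):
    """Map floats to ints in [-(2^(b-1)-1), 2^(b-1)-1] plus a scale."""
    max_abs = max(abs(w) for w in weights)
    scale = max_abs / (2 ** (num_bits - 1) - 1) if max_abs else 1.0
    return [round(w / scale) for w in weights], scale

def dequantize(q_weights, scale):
    """Recover approximate floats from the quantised ints."""
    return [q * scale for q in q_weights]

weights = [0.1, -0.5, 0.25, 0.9, -0.05]
q, scale = quantize(weights)
approx = dequantize(q, scale)
max_err = max(abs(a - b) for a, b in zip(weights, approx))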

    • Lmaydev@programming.dev · 1 year ago (edited)

      They are looking at analogue chips as a potential solution.

      https://www.newscientist.com/article/2388005-analogue-chips-can-slash-the-energy-used-to-run-ai-models/

      I think minting chips for a specific AI model, rather than running it on generic chips, could be another solution.