  • gay_king_prince_charles [she/her, he/him]
    ·
    1 month ago

    It's not that they hired the wrong people, it's that LLMs struggle with both numbers and factual accuracy. This isn't a personnel issue, it's a structural issue with LLMs.

    • Xavienth@lemmygrad.ml
      ·
      1 month ago

      Because LLMs just basically appeared in Google search and it was not any Google employee's decision to implement them despite knowing they're bullshit generators /s

      • psivchaz@reddthat.com
        ·
        1 month ago

        I mean, define employee. I'm sure someone with a Chief title was the one who made the decision. Everyone else gets to do it or find another job.

          • gay_king_prince_charles [she/her, he/him]
            ·
            edit-2
            1 month ago

            I mean, LLMs are cool to work on and a fun concept. An n-dimensional regression where n is the trillions of tokens in your dataset is cool. The issue is that it is cool in the same way as a grappling hook or a blockchain.

    • UlyssesT
      ·
      edit-2
      10 days ago

      deleted by creator

      • gay_king_prince_charles [she/her, he/him]
        ·
        1 month ago

        Google gets around 9 billion searches per day. Human fact-checking of Google search quick responses would be impossible. If each fact check takes 30 seconds, you would need close to 10 million people working full time just to fact-check that.
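
        A rough version of that estimate as a script, with my assumed numbers (~9 billion searches a day, 30 seconds per check, one 8-hour shift per person):

        ```python
        # Back-of-the-envelope: full-time reviewers needed to fact-check every search.
        # All inputs are assumptions for illustration, not Google's real numbers.
        searches_per_day = 9e9       # assumed ~9 billion searches per day
        seconds_per_check = 30       # assumed 30 seconds per human fact check
        workday_seconds = 8 * 3600   # one 8-hour shift per person

        people_needed = searches_per_day * seconds_per_check / workday_seconds
        print(f"{people_needed / 1e6:.1f} million full-time fact checkers per day")
        # -> roughly 9.4 million people
        ```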

          • gay_king_prince_charles [she/her, he/him]
            ·
            1 month ago

            > also I'm pretty sure Google could hire 10 million people

            Assuming minimum wage at full time, that is 36 billion a year. Google extracts 20 billion in surplus labor per year, so no, Google could not hire 10 million people.

        • UlyssesT
          ·
          edit-2
          10 days ago

          deleted by creator

          • gay_king_prince_charles [she/her, he/him]
            ·
            1 month ago

            > Are you also suggesting it's impossible for specific times that it really matters, such as medical information?

            Firstly, how do you filter for medical information in a way that works 100% of the time? You are going to miss a lot of medical questions because NLI has countless edge cases. Secondly, you need to make sure your fact checkers are accurate, which is very hard to do. Lastly, you are still getting millions and millions of medical questions per day, and you would need tens of thousands of medical fact checkers who need to be perfectly accurate. Having fact checkers will lull people into a false sense of security, which will be very bad when they inevitably get things wrong.

            • UlyssesT
              ·
              edit-2
              10 days ago

              deleted by creator

              • gay_king_prince_charles [she/her, he/him]
                ·
                edit-2
                1 month ago

                If you see a note saying "This was confirmed to be correct by our well-trained human fact checkers" and one saying "[Gemini] can make mistakes. Check important info.", you are more likely to believe the first than the second. The solution here is to look at actual articles with credited authors, not to have an army of people reviewing every single medical query.

                • UlyssesT
                  ·
                  edit-2
                  10 days ago

                  deleted by creator

                  • gay_king_prince_charles [she/her, he/him]
                    ·
                    1 month ago

                    LLM usage here doesn't help, that's true. But medical queries weren't good before LLMs either, simply because it's an incredibly complex field with many edge cases. There is a reason self-diagnosis is dangerous, and it isn't because of technology.

                    • UlyssesT
                      ·
                      edit-2
                      10 days ago

                      deleted by creator

  • peeonyou [he/him]
    ·
    1 month ago

    i work with a bunch of former googlers and googleXers(?) and they are some of the most insufferable people on the planet

    • UlyssesT
      ·
      edit-2
      10 days ago

      deleted by creator

      • peeonyou [he/him]
        ·
        1 month ago

        also see:

        "oh hey did you know Joan Norbaberts?" "yeah, she was an amazing xingbongler!" "yeah well i heard she went to Zombubo with Dave Bilbby" "oh yeah, but did you know Dave and Zach Erawan were working in building 42 back in 2009?" "no it wasn't building 42 it was building 87, the one with the spaceship themed ball pit on floor 7" "oh, no you're thinking of building 27, building 87 had the 360 degree wall of tvs that you could zoom in from space to see peoples' nose hairs" "oh yeah, but anyway Zeny Bazinga worked as an SRE with Dave... do you know Barla Bingus?" "yeah I worked with Barla Bingus on ads, but then Zipity Duda came and requisitioned a server farm in Alabama, oh hoo hoooo" "ohhh man you remember that? yeah oh my god, and LARRY SCHDMITDT WAS SO MAD" "YEAH LOL"

        and on... and on... and on... and on..

  • Lussy [any, hy/hym]
    ·
    1 month ago

    I feel like such a shitty engineer for not remembering or having the slightest interest in even the most basic electrical shit. I don’t even get this fucking meme.

    I can do civil/mech/chem but show me electricity and I feel like I’m in preschool.

    Pre 18th century ass brain capacity

    • Xavienth@lemmygrad.ml
      ·
      edit-2
      1 month ago

      North America uses 120 V for most circuits. Power is the product of voltage and current.

      At 1 Amp, 120 watts are dissipated by the circuit. About the heat of two incandescent light bulbs.

      At 10 Amps, 1200 watts are dissipated by the circuit, about the heat of a space heater.

      At 551 Amps, 66,000 watts are dissipated by the circuit. I don't even have a good comparison. That's like the power draw of 50 homes all at once.

      The higher the gauge, the lower the diameter of the wire. The lower the diameter of the wire, the more of that 66,000 watts is going to be dissipated by the wire itself instead of the load where it is desired. At 22 gauge, basically all of it will be dissipated by the wire, at least for the first fraction of a second before the wire vaporizes in a small explosion.

      EDIT: In this scenario, the total resistance of the circuit must be at most 0.22 Ω. Otherwise, the current will not reach 551 A due to Ohm's Law, V=I×R. This resistance corresponds to a maximum length of 13 feet for copper wire and no load.
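
      If you want to check the arithmetic, here it is as a quick script (the ~16 Ω per 1000 ft figure for 22 AWG copper is my assumption from standard wire tables):

      ```python
      # Numbers behind the comment above; all inputs are assumptions for illustration.
      V = 120.0   # North American branch circuit voltage (volts)
      I = 551.0   # current from the meme (amps)

      P = V * I        # power dissipated by the whole circuit (watts)
      R_total = V / I  # total resistance needed to actually draw 551 A (Ohm's law)

      ohms_per_ft_22awg = 16.14 / 1000   # assumed ~16.14 ohm per 1000 ft of 22 AWG copper
      max_length_ft = R_total / ohms_per_ft_22awg

      print(f"P = {P:,.0f} W")                              # ~66,120 W
      print(f"R_total = {R_total:.3f} ohm")                 # ~0.218 ohm
      print(f"max wire length = {max_length_ft:.1f} ft")    # ~13.5 ft of 22 AWG, no load
      ```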

      • Lussy [any, hy/hym]
        ·
        1 month ago

        What do you mean by "dissipated by the wire instead of the load"?

        • sawne128 [he/him]
          ·
          1 month ago

          The wire heats up.

          Wires have a small resistance, which causes a voltage drop across the wire if the current is big enough (U=RI), and therefore the wire itself draws power (P=UI), which warms it up. Thinner wires have more resistance.
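
          Combining the two, the wire's share of the heat is P = R_wire × I². A toy example with made-up numbers, just to show the wire/load split:

          ```python
          # Toy series circuit: thin wire + load on 120 V. All values are made up.
          V = 120.0
          R_wire = 0.10   # assumed resistance of a long run of thin wire (ohms)
          R_load = 0.12   # assumed load resistance (ohms)

          I = V / (R_wire + R_load)   # same current through wire and load (U = R*I)
          P_wire = R_wire * I**2      # heat generated inside the wire itself
          P_load = R_load * I**2      # power actually delivered to the load

          print(f"I = {I:.0f} A, wire: {P_wire/1000:.1f} kW, load: {P_load/1000:.1f} kW")
          # -> ~545 A, with ~29.8 kW cooking the wire and ~35.7 kW reaching the load
          ```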

      • PKMKII [none/use name]
        ·
        1 month ago

        I ran this by my brother, who’s an electrician, and he inferred that might be where the number is coming from: some data on how many amps you can dump into various wire gauges before they simply stop being solids.

    • NephewAlphaBravo [he/him]
      ·
      1 month ago

      551 amps is an amount you would put in a comedic joke about using way too many amps

    • context [fae/faer, fae/faer]
      ·
      1 month ago

      imagine trying to direct the output of a fire hose on full blast through one of those thin red drinking straws that come with cocktails

        • context [fae/faer, fae/faer]
          ·
          1 month ago

          sort of but if we're extending the analogy i don't think the thin plastic drinking straw will make an effective replacement for the steel nozzle at the end of a water jet cutter, either

    • infuziSporg [e/em/eir]
      ·
      1 month ago

      0.2 amps of current going through a human torso is fatal in virtually all cases.

    • Rania 🇩🇿@lemmygrad.ml
      ·
      1 month ago

      Yeah, it took me a few minutes to realize "551 amps? that's an insane cartoonish number". I'm also in the engineering field, but not electrical stuff.

  • RION [she/her]
    ·
    1 month ago

    My job wants to use an AI medical notetaker instead of hiring someone for it... Surely nothing like this will happen :clueless:

  • FloridaBoi [he/him]
    ·
    edit-2
    1 month ago

    my company is going full steam ahead with AI stuff and a coworker (who is lebanese and we talk about palestine but he has jewish cabal conspiracy brainworms) loves the promise (fantasy?) of AI, especially GenAI. This mfer uses it to summarize short articles and write his emails. I feel like I'm a crazy person because I enjoy reading stuff and writing too.

    He sent me a demo yesterday where they had a local instance of an LLM trained on internal data, and sure enough it was able to pull info from disparate sources and it was legit kinda neat. Most of what it did was chatbot stuff but with NLP and NLG. To me, this seems like a really complicated way of having a search algorithm, which we know to be more efficient and faster, especially since it was just fetching info.

    However it was only neat bc it was running on internal data with strict boundaries. It also obscures the fact that a massive, comprehensive data dictionary had to be made and populated by people to allow for these terms/attributes/dimensions to be linked together. One of the things it did in the demo was execute SQL based off of a question like "how many of these items on this date?", which it turned into `select sum(amount) from table where report_date = date`, and it also provided graphs to show fluctuations in that data over time. I didn't validate the results, but I would hope it wouldn't make stuff up, especially since the training set was only internal. My experience with other AI apps is that you can ask the thing the same question and you'll get different results.

      • FloridaBoi [he/him]
        ·
        1 month ago

        Jfc. Like who do you blame here? The model for being stupid, or the prompter for not validating? And if they’re validating, are there any time savings?

  • Feinsteins_Ghost [he/him]
    ·
    edit-2
    1 month ago

    putting the VFD into the ketchup every single time.

    265k watts lol

    For reference, 22awg solid is telephone wire. 22awg stranded is a hair thinner. I’ve made 22awg glow red-hot by dumping 12v and just a lil bit of amps into it.

  • collapse_already@lemmy.ml
    ·
    1 month ago

    An enterprising lawyer is going to make a tidy sum when someone breathes copper vapor after following this advice.

  • nothx [he/him]
    ·
    1 month ago

    encouraging people to BE their own fuses.