
  • It is absolutely stupid, stupid to the tune of “you shouldn’t be a decision maker”, to think an LLM is a better tool for getting a quick intro to an unfamiliar topic than reading an actual intro to that topic. For most topics, Wikipedia is right there, complete with sources. For obscure things, an LLM is just going to lie to you.

    As for “looking up facts when you have trouble remembering them”, using the lie machine is a terrible idea. It’s going to say something plausible, and you tautologically are not in a position to verify it. And, as above, you’d be better off finding a reputable source. If I type in “how do i strip whitespace in python?” an LLM could very well say “it’s your_string.strip()”. That’s wrong: strip() only removes leading and trailing whitespace, not whitespace inside the string. Just send me to the fucking official docs.

    There are probably edge or special cases, but for general search on the web? LLMs are worse than search.
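    To make the strip() example concrete, here’s a quick sketch of why the one-liner answer is misleading: `str.strip()` only touches the ends of the string, and removing all whitespace needs a different tool entirely.

    ```python
    import re

    s = "  hello   world  "

    # str.strip() only removes leading and trailing whitespace.
    print(repr(s.strip()))               # 'hello   world'

    # Removing ALL whitespace needs something else, e.g. a regex:
    print(repr(re.sub(r"\s+", "", s)))   # 'helloworld'

    # Or collapsing internal runs down to single spaces:
    print(repr(" ".join(s.split())))     # 'hello world'
    ```

    Three different behaviors for one vague question, which is exactly why a pointer to the official docs beats a confident one-liner.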

  • Well, in this example, the information provided by the AI was simply wrong. If it had done what traditional search does and pointed to the organization’s website, where the hours were listed, it would have worked fine.

    This idea that “we’re all entitled to our opinion” is nonsense. That’s for when you’re a child and the topic is what flavor of jelly bean you like. It’s not for policy or things that matter. You can’t just “it’s my opinion” your way through “this algorithm is O(n^2) but I like it better than O(n) so I’m going to use it for my big website”. Or more on topic, you can’t use it for “these results are wrong but I like them better”.
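    A toy sketch of the complexity point (a hypothetical example, not from the comment): two deduplication functions that produce identical results, where preference doesn’t change which one falls over at scale.

    ```python
    def dedupe_quadratic(items):
        # O(n^2): membership test on a list rescans it for every element.
        seen = []
        for x in items:
            if x not in seen:   # O(n) scan per item
                seen.append(x)
        return seen

    def dedupe_linear(items):
        # O(n): membership test on a set is average O(1).
        seen = set()
        out = []
        for x in items:
            if x not in seen:
                seen.add(x)
                out.append(x)
        return out

    data = [3, 1, 3, 2, 1]
    print(dedupe_quadratic(data))  # [3, 1, 2]
    print(dedupe_linear(data))     # [3, 1, 2]
    ```

    Both are “correct”, and you can like whichever you want; only one of them survives a big input.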




  • If a feature is useful people will use it, be it AI or not AI.

    People will also use it if it’s not useful, if it’s the default.

    A friend of mine did a search the other day to find the hours of something, and Google’s AI lied to her. Top of the page, just completely wrong.

    Luckily I said, “That doesn’t sound right” and checked the official site, where we found the truth.

    Google is definitely forcing this on everyone, even when it’s inferior to other products. Hell, it’s inferior to their own existing product.

    But people will keep using AI, because it’s there, and it’s right most of the time.

    Google sucks. They should be broken up, and their leadership barred from working in tech. We could have had a better future. Instead we have this hallucinatory hellhole.