• kalleboo@lemmy.world
    9 months ago

    It used to be you’d search for something, click on the results, and load the page (ads and all) along with the info.

    Then Google started adding its snippets with direct answers, and yes, there was an uproar from content sites about that. But some fraction of people still clicked through for more context.

    With LLMs, all that traffic is 100% gone.

      • CrayonRosary@lemmy.world
        9 months ago

        That’s just kicking the can down the road. They’ll be exactly as trustworthy as your own brain at summarizing articles soon. What then?

        • UndercoverUlrikHD@programming.dev
          9 months ago

          I still want to know the source of what I’m being told. There are plenty of brains out there smarter than mine, and I’ll still ask them for sources.

    • fruitycoder@sh.itjust.works
      9 months ago

      There is a reason why RAG and fine-tuning are big topics in the field. General foundation models are fine for general low-risk info, but when people really care, that’s generally not enough.
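      The core idea of RAG can be sketched in a few lines (everything here, including the toy corpus and the word-overlap scoring, is illustrative, not a real library): retrieve relevant documents first, then hand them to the model together with their sources, so the answer can be checked against something.

      ```python
      # Minimal RAG sketch. A real system would use a vector database and an
      # embedding model; naive word overlap stands in for retrieval here.

      # Hypothetical toy corpus: URL -> document text.
      CORPUS = {
          "https://example.com/uptime-report": "The service had 99.9% uptime in Q3.",
          "https://example.com/pricing": "The pro plan costs $20 per month.",
      }

      def retrieve(query: str, corpus: dict[str, str], k: int = 1) -> list[tuple[str, str]]:
          """Rank documents by word overlap with the query, return the top k."""
          words = set(query.lower().split())
          scored = sorted(
              corpus.items(),
              key=lambda item: len(words & set(item[1].lower().split())),
              reverse=True,
          )
          return scored[:k]

      def build_prompt(query: str, corpus: dict[str, str]) -> str:
          """Stuff the retrieved passages, with their URLs, into the model prompt."""
          context = "\n".join(f"[{url}] {text}" for url, text in retrieve(query, corpus))
          return f"Answer using only these sources, and cite them:\n{context}\n\nQ: {query}"

      print(build_prompt("what does the pro plan cost", CORPUS))
      ```

      Because the retrieved URLs ride along in the prompt, the model can cite them, which is exactly the "I still want the source" property the thread is asking for.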

      • TheMurphy@lemmy.world
        9 months ago

        Unfortunately, most people don’t care. That’s why most get their news from Facebook or TikTok, and only read headlines.