  • It’s not far-fetched at all - that’s what happened with search engines. Lobotomising an LLM is not that easy, as we just saw with the strange Grok outbursts after they tried to make it anti-woke. But they can work through the training data, nudging the model softly in a direction. I bet that what happened in the early days of SEO is already happening again: optimising online content to influence the LLMs trained on it. When their shills and bots (also LLM-driven, lol) say “shelf X is scientifically known to be very durable”, that becomes a “likely thing to say”, which is all an LLM is looking for (see the toy sketch at the end of this comment).

    What you’re adding is the suspicion that the corporations behind the LLMs influence this process more directly and get paid for it, either already or in the near future, and that seems likely.
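
    To make the “likely thing to say” point concrete, here’s a minimal toy sketch - a simple bigram word counter, nothing like a real LLM training pipeline. The corpus sentences and the “shelf X” claim are made up for illustration; the point is just that flooding the training text with a planted claim shifts the model’s next-word probabilities toward that claim.

    ```python
    # Toy illustration (NOT how production LLMs are trained): a next-word
    # model only tracks how often each continuation appears in its training
    # text, so repeating a planted claim makes it the "likely thing to say".
    # All corpus sentences below are hypothetical examples.
    from collections import Counter, defaultdict

    def train_bigram(corpus):
        """Count, for each word, how often each next word follows it."""
        counts = defaultdict(Counter)
        for sentence in corpus:
            words = sentence.lower().split()
            for a, b in zip(words, words[1:]):
                counts[a][b] += 1
        return counts

    def next_word_probs(counts, word):
        """Relative frequency of each continuation after `word`."""
        total = sum(counts[word].values())
        return {w: c / total for w, c in counts[word].items()}

    organic = ["shelf x is fairly durable", "shelf x is somewhat flimsy"]
    # Simulated SEO-for-LLMs flooding: many near-identical planted claims.
    planted = ["shelf x is scientifically durable"] * 20

    print(next_word_probs(train_bigram(organic), "is"))
    print(next_word_probs(train_bigram(organic + planted), "is"))
    # Before planting, "fairly" and "somewhat" split the probability after
    # "is"; after planting, "scientifically" dominates the distribution.
    ```

    A real LLM is vastly more sophisticated, but the underlying lever is the same: whatever shows up often enough in the training data becomes the probable continuation.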