Experts warn that average individuals can now experience the same sycophant-induced delusions as billionaires

  • ExtremeDullard@lemmy.sdf.org · 11 days ago

    Human shrinks, just like AI chatbots, are experts at slick-talking BS and know how to manipulate people.

    The difference is that most human shrinks mean well and do try to help, while most AI chatbots are run by greedy, monopolistic Big Data for-profits whose sole purpose is to “increase engagement”.

    • chirospasm@lemmy.ml · 11 days ago

      I would suggest that counselors / therapists do, in fact, have backgrounds – educational and experiential – that underpin the ‘slick-talking BS’ you describe, but it is only slick-talking BS if you aren’t willing to consider the benefit you get from relating to them in the way they were trained to relate.

      This is important because the ‘relating’ has more of an impact on you socially than the ‘slick talk.’ It’s the ‘human-to-human’ part that sticks with us longer than self-help books, prompts us to be open and considerate toward change, and even supports our eventual ability to understand ourselves a little better.

      There is no ‘relating’ to an LLM. That LLM is weighted, in fact, to provide positive responses that satisfy the request in your text-based prompt.

      If, in an LLM therapy session, I suddenly flip the script and write, ‘Now pretend you are a far less confrontational therapist who understands my feelings on X or Y that we’ve been talking about, and who doesn’t want to press me on it as much,’ then I am no longer even superficially attempting to ‘relate.’ The cosplay of therapy is ripped away.

      The ‘relationship’ part of therapy cannot happen authentically with an LLM if I can still control the outcome.