I've become increasingly concerned about the rise in people (especially teens) using AI for advice on how to handle important interpersonal conflicts.
I've seen lots of discussion (primarily in mainstream media) about AI psychosis, and about the more obvious fear of people/kids outsourcing their thinking to chatbots and getting dumber. Both of these matter, but the outsourcing of interpersonal conflict resolution specifically seems possibly quite bad and feels underdiscussed.
I'd estimate something like ~40-60% of young Americans have put a screenshot of a text exchange into ChatGPT and asked if the other person was wrong / crazy / etc. I know people whom I'd consider to have high EQ/IQ using ChatGPT to weigh in on whether their parent or partner is gaslighting them, and then severely damaging that relationship when they eventually get the yes.
I'm curious who is working on this, or whether people think this is just an obvious issue that's super low priority.