This is a special post for quick takes by Apathy. Only they can create top-level comments. Comments here also appear on the Quick Takes page and All Posts page.

I've become increasingly concerned about a rise in people (especially teens) frequently using AI to get advice on how to navigate important interpersonal conflicts.

I've seen lots of discussion (primarily in mainstream media) about AI psychosis, and about the more obvious fear of people/kids outsourcing their thinking to chatbots and getting dumber. Both of these things matter, but the outsourcing of interpersonal conflict resolution specifically seems possibly quite bad and feels underdiscussed.

I'd estimate something like ~40-60% of young Americans have put a screenshot of a text exchange into ChatGPT and asked whether the other person was wrong / crazy / etc. I know people I'd consider to have high EQ/IQ who use ChatGPT to weigh in on whether their parent or partner is gaslighting them, and who then severely damage that relationship when they eventually get a yes.

I'm curious who is working on this, or whether people think this is an obvious issue that's nonetheless very low priority?

Even my daughter uses GPT after our arguments, and I'm very much aware of this. She even praises GPT for calming her down in her discussions with her mom. If the thing is calming her down, I guess it's both a good and a bad thing: good because she's not stressing herself out, and bad because she's probably hiding her thoughts now.
