• 1 Post
  • 28 Comments
Joined 2 years ago
Cake day: August 27th, 2023

  • Sort of, but I think “influence over emotional states” is understating it and just the tip of the iceberg. It also makes it sound passive and accidental. The real problem will be overt control, as a logical extension of the kinds of trade-offs we already see people make about, for example, data privacy. With the Replika fiasco, I bet heaps of those people would have paid good money to get their virtual love interests de-“lobotomized”.





  • That’s really interesting. Its output to this prompt totally ignored the biggest and most obviously detrimental effect of this problem at scale.

    Namely, emotional dependence will give the big tech companies that own these AIs increased power over people.

    It’s not as if these concepts aren’t widely discussed online; everything from Meta’s emotional manipulation experiments and Cambridge Analytica through to the meltdowns Replika users had over changes to the algorithm is relevant here.






  • I think I’m just going to have to agree to disagree.

    AI getting a diagnosis wrong is one thing.

    AI being built in such a way that it hands out destructive advice human scientists already know is wrong, like “vaccines cause autism”, homeopathy, etc., is a malevolent and irresponsible use of tech imo.

    To me, it’s like watching a civilization downgrading its own scientific progress.


  • I take your point. The version I heard of that joke is “the person who graduated at the bottom of their class in med school”.

    Still, at the moment we can try to avoid those doctors. I’m concerned about the popularization and replication of bad advice beyond that.

    The problem here is this tool is being marketed to GPs, not patients, so you wouldn’t necessarily know where the opinion is coming from.


  • I’d hope the bar for medical advice is higher than “better than the worst doctor”.

    Will be interesting to see where liability lies with this one. In the example given, following the advice could permanently worsen patients’ conditions.

    Given that the advice is proven to be wrong and goes against official medical guidance for doctors, that could potentially be material for a class action lawsuit.



  • It would have to be the fail rate of an average doctor, because if average doctors are the use case, then moving the bar to the fail rate of a bad doctor doesn’t make any sense. You would end up saying worse outcomes = better.

    I think the missing piece here is accountability.

    If doctors are being encouraged to give harmful, out-of-date advice, who will end up with a class action lawsuit on their hands: doctors or OE?