• Flying Squid@lemmy.world
    6 months ago

    I will keep saying this: this is a distraction from the real problem, which is that people will trust an LLM with a realistic voice and simulated emotional responses far more than plain text output, or even something like Alexa, which still sounds artificial. Factual errors and hallucinations will be more widely believed, and more people will start believing these systems are alive.

    I don’t care who it sounds like; the problem is that it sounds like a real person.