This is the part of the AI hustle that borrows directly from science fiction to make the product seem more powerful and capable than it is.

No, Fancy Autocorrect does not have 'feelings' to hurt. This isn't HAL 9000, this isn't Data from Star Trek; it's a hustle. We are nowhere near such technology. We barely understand human consciousness, let alone how to replicate it artificially. We're just fooling ourselves.

theguardian.com/technology/202

@Lazarou True, but people have an uncanny need to empathize. To quote a TV show: "We are the only species on Earth that observes Shark Week. Sharks don't even observe Shark Week, but we do. For the same reason I can pick up this pencil, tell you its name is Steve and break it, and part of you dies just a little bit on the inside, because people can connect with anything. We can sympathize with a pencil, we can forgive a shark, and we can give Ben Affleck an Academy Award for Screenwriting."
@Lazarou

Much of this conversation seems to simultaneously center on and ignore the ELIZA effect.

Chatbots are nothing new, and the term is older than I am.

"From a psychological standpoint, the ELIZA effect is the result of a subtle cognitive dissonance between the user's awareness of programming limitations and their behavior towards the output of the program."

https://en.wikipedia.org/wiki/ELIZA_effect