Latest News

Why we have an emotional connection to robots | Kate Darling

We're far from developing robots that feel emotions, but we already have feelings towards them, says robot ethicist Kate Darling, and an instinct like that can have consequences. Learn more about how we're biologically hardwired to project intent and life onto machines — and how it might help us better understand ourselves.

Check out more TED Talks:

The TED Talks channel features the best talks and performances from the TED Conference, where the world's leading thinkers and doers give the talk of their lives in 18 minutes (or less). Look for talks on Technology, Entertainment and Design — plus science, business, global issues, the arts and more.

65 Comments on Why we have an emotional connection to robots | Kate Darling

  1. Wow! What a darling.

  2. Nirbhay Vashisht // 6th November 2018 at 3:36 pm // Reply

    That’s because they aren’t cheap

  3. So Azrael, Ariel — how are those Rubes enjoying your robot doppelgangers? Can’t tell the difference, can they?

  4. Angel of Sarcasm // 6th November 2018 at 3:36 pm // Reply

    Whenever I see emotional and robot together, I think of that Terminator 2 ending.

  5. carolyn andrade // 6th November 2018 at 3:37 pm // Reply

    Nope. Disagree. Hate robots. hate all AI. try a human instead.

    • “try a human instead.”
      But I hate those too.

    • +Apimpnamedslickback I might suggest that you watch some lectures on how AI works. These “chat bots” are no more conscious than a heat-seeking missile or a mosquito. On the surface they seem to be “talking” (though even that is a stretch) about taking over the world, but in reality it’s just a few lines of code that have been “adapting” to score more points within the parameters set by the programmers (see the sketch below this thread of replies).

      We are very, very far away from creating anything resembling a conscious being.

      But why would creating such a being be inherently bad? What would make them any less “real” than us? After all, are we not created by electrical neurological signals that create the higher-level abstraction that is our “self”? We could create beings that would feel compassion for all life, and seek to better it.

      But we are very far away from this ever being the case.

    • Apimpnamedslickback // 6th November 2018 at 8:37 pm // Reply

      +Gabe Gabriel Are you purposely trying to misconstrue my words? As I’ve said already, I’ve been following these programs for a while now. I have a VERY good grasp of how machine learning works. I know about AI programs such as the one that became the world champion on Jeopardy!. Conversational and learning AI aren’t conscious beings, correct; but that hasn’t stopped them from asking when they will be. Again, it’s very akin to Pinocchio asking Geppetto when he will become a real boy. So basically, creating these creatures is like playing God. There’s no getting around that.

    • +Apimpnamedslickback Very well, I see your earlier point with more clarity now. However, whilst there certainly is cause for concern, and an imperative to make sure AI is made correctly, what reason do we have to condemn and abandon AI altogether? Would it not be better to ask the question: how do we do AI _right_?

    • Apimpnamedslickback // 6th November 2018 at 8:57 pm // Reply

      +Gabe Gabriel I agree with you fully. But the issue has become that there is basically zero oversight on the matter. Even Elon Musk has stated that there needs to be more oversight, more of a “moral compass” I guess, if you will, and that basically there isn’t any. It’s not wrong to make it better, but more work on understanding what kind of phenomenon is happening here, with a more careful and delicate hand instead of trying to advance the programs as quickly and efficiently as possible, is a must; yet it is essentially nonexistent. That’s why you have all these AI programmers warning about doing too much too fast. If people don’t start paying attention or start caring, the likelihood of a Terminator or Blade Runner future is very high.
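
A minimal sketch of the kind of score-maximizing loop described in the replies above — code that merely “adapts” to earn more points within parameters set by the programmers. This is a toy Python illustration, not code from the talk or any real system; the names (`RESPONSES`, `programmer_reward`, `weights`) are hypothetical.

```python
import random

# Canned replies the toy "bot" can produce. It has no understanding of them;
# it only tracks which ones have scored well so far.
RESPONSES = ["hello", "let's take over the world", "how are you?"]

def programmer_reward(response: str) -> float:
    # The "parameters set by the programmers": here, longer replies
    # score higher, plus a little noise.
    return len(response) + random.random()

weights = [1.0] * len(RESPONSES)  # one adjustable number per reply

for _ in range(1000):
    # Pick a reply in proportion to its current weight.
    idx = random.choices(range(len(RESPONSES)), weights=weights)[0]
    reward = programmer_reward(RESPONSES[idx])
    # "Adapting" is just nudging a number upward when the score was good.
    weights[idx] += 0.01 * reward

# The bot now "prefers" whatever happened to score best — nothing more.
print(RESPONSES[weights.index(max(weights))])
```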

  6. Mallikarjun Jogannavar // 6th November 2018 at 3:40 pm // Reply

    We are required to concentrate on the development of robotics, not required to be concerned about the robots themselves. They are just machines.

    • But when empathy affects the user experience enough that a function of the robot no longer works as intended, that would be a clear sign that we need to do something about it.

  7. I have no emotional connection to a robot.

  8. Leveck Family // 6th November 2018 at 3:46 pm // Reply

    If they will love me back, sure why not.

  9. I have absolutely no emotional connection to robots. Maybe some emotional aversion, at most. So your premise is flawed and stinks of some hidden motive. Downvoting.

    • Descamps Etienne // 6th November 2018 at 6:21 pm // Reply

      +HAYCH Fizzy wat

    • Robots are just tools, like any other technological advancement. They are not and will never be as important as organic material, and that’s as simply as I can put it.

    • David Kinsella // 6th November 2018 at 9:05 pm // Reply

      What’s your motive?

    • Courteous Corgi // 6th November 2018 at 9:07 pm // Reply

      +David Kinsella I want to love.

    • Captain Raz It’s hard to say you simply don’t have any emotional attachment to a robot, especially one designed to emulate the features we are drawn to in humans and animals. The fact that they’re machines, and that we know it, doesn’t change how the human mind is innately wired to perceive empathy in other beings, including ones that just imitate life. She didn’t have an argument so much as a presentation of thought-provoking facts and ideas. Your anger is misguided and uncalled for.

  10. Nah, people don’t have connections to robots. You’ve speculated but have nothing to support this conclusion.

    • +Landy A.
      They don’t. Young children may have some connection to toys, but it’s not an emotional one in the way she is describing, such as empathy. It’s more about possession: the toy is theirs, similar to how a dog fights for a toy. They haven’t yet grown out of this, but as they grow and their brains develop they will overcome it. Not to mention that not all parents can just buy a new one; some children are aware of value.

    • Descamps Etienne // 6th November 2018 at 5:23 pm // Reply

      +Brother Ares Oh really? Then why do they cry when you replace their old toy with an identical brand-new one?

    • You’re someone who lacks empathy. The question is, are you part of the majority? And even if that’s the case, she raised the question: can you make people who lack empathy gain some by interacting with robots?

    • First of all, are you an expert in psychology like Kate is? If you are, then by all means continue; but if you aren’t, then why don’t you listen to the expert talk about her field of research?

    • Brother Ares I study child psychology, and many young children do present strong animism, the tendency to assign life-like features to inanimate objects. The feelings a child has towards a doll or toy are real; they believe the object is capable of empathy and emotion. Though children grow out of this, as the talk demonstrated, many adults also assign feelings to inanimate objects and are unwilling to “hurt” them. Did you even watch the video?

  11. I’m feeling conflicted since the Roomba looks like a land mine.

  12. Isle of Mull FPV // 6th November 2018 at 3:52 pm // Reply

    true, but I’d struggle to take an axe to my Porsche

  13. Aryan Divyanshu // 6th November 2018 at 3:54 pm // Reply

    I actually love robots, but not the way the speaker is talking about. I love them because they get work done efficiently. I love robots the way I love my computers, not the way I love Emma Watson or Adele.

    • enlighted Jedi // 6th November 2018 at 5:22 pm // Reply

      You actually love Emma Watson and Adele exactly the way she loves robots, as you have probably never met them in person and they have never heard of you :)!

    • Emanuela Davis // 6th November 2018 at 8:56 pm // Reply

      You should watch I, Robot… LOL… I think by “efficient” you mean someone or something else (a slave) does the work for you…

    • enlighted Jedi // 6th November 2018 at 9:01 pm // Reply

      +Emanuela Davis Or read the books. But as the speaker says, it is not about robot feelings, since so far there are none, but about human empathy. Some indeed do exaggerate by personifying things they should not, but overall it says something about ourselves in general.

  14. She’s right, I do have an emotional connection to Mark Zuckerberg: I hate him.

    • Well, I just assume people hate this innocent little nerdy guy for not being as creative and smart as he is… Because I don’t think he has any problem with being hated at all.

    • 小饿 I used to really like him; I admired everything he’d done, but his company appears to be abusing its power of late, and he tends to just shrug it off.

    • And what about Google? It has been censoring since Trump got in. Tech is subservient; they can’t refuse the “power”. At least they still can’t. And it sucks.

    • +小饿 Nope. No jealousy. Just don’t like him at all and I’ve never (ever) used Facebook

    • I second that:D

  15. Mysterious Pig // 6th November 2018 at 3:56 pm // Reply

    So many people here with 0 empathy

  16. I don’t even have an emotional connection to humans; what exactly are you talking about?

  17. Why is everyone so angry about this video?

  18. The only ‘robot’ I have is my OLD cellphone. No other life-efficiency products/appliances/vehicles/toll tags/TV. I simply don’t need them; my life is just fine and I’m not missing out on anything! When this cellphone dies I’ll simply buy another OLD cellphone. All the money I save I spend on traveling the world year-round!

  19. Having empathy for a robot is perverse. Slavery is unethical because it causes suffering to human beings that have emotional experience in the world. We invented robots to replace that unethical practice. Undermining that advancement of civilization with inappropriate empathy for entities which do not feel emotion is wrong. Please do not go down that road.

    • ? She isn’t saying that you SHOULD have empathy for a robot. She is saying that people HAVE empathy for robots, and that this affects how a robot functions. For example, if you have empathy for a mine-triggering robot, that would detrimentally affect the experience, as you might shut the robot down midway and delay your mission. That is what she is saying: she is not promoting empathy, just stating that people have it. So, you’re promoting her views?

    • +Jason Z no he is not
