The Concept of 'Theory of Mind'

This week we will discuss the concept of ‘Theory of Mind.’

Read this article and watch the associated videos:  https://nobaproject.com/modules/theory-of-mind#abstract

1. Now, recall a situation in which you tried to infer what a person was thinking or feeling but you just couldn’t figure it out, and recall another situation in which you tried the same but succeeded. Which tools were you able to use in the successful case that you didn’t or couldn’t use in the failed case?

2. In the near future we will have robots that closely interact with people. Which theory of mind tools should a robot definitely have? Which ones are less important? Why?


Classmate 1

This topic is pretty apropos for a situation I encountered tonight. I was in the kitchen making dinner, and my daughter came up to me, just stood there, and stared at me. It was after school, before dinner, but after a small snack. I knew what she was about to ask me, and I said, "no." I thought she would ask if she could watch TV before dinner, so I preemptively said no. She just looked at me, confused, and said, "What?" "No, you cannot watch TV right now," I said. She replied, "I was just going to ask what was for dinner." I felt pretty small. I felt like I had been in that situation before and knew exactly what she would say, so when she said something different, I was pretty shocked. I had misread her goal, which was simply to find out what was for dinner. The context around the situation led me to believe her goal was TV.

There have been other times, after I had been on a team for a long time, when we could look at each other and know exactly what to do next. Projection is likely at work in this instance: my teammates and I are seeking the same goal, share the same background (in that situation and context), and are placed in the same environment.

Empathy is something that robots would need to have. I was trying to cancel a contract that my mother-in-law had, and I told the "person" on the other end of the chat that she was no longer with us. That person then asked if she had considered a different plan. After I said the same thing back, a similar response came back. It was not until the third time that there was a distinct difference in the reply, and I could tell it was a human. Sympathy and empathy came through in the text, and the tone changed completely. If robots cannot understand emotions, human sensitivities will go unrecognized, which will create distrust between the two. I also think joint attention would be hard to achieve; robots and humans can see or hear the same thing, but they will process it differently.


Classmate 2

Week 5: Theory of Mind

1. Now, recall a situation in which you tried to infer what a person was thinking or feeling but you just couldn’t figure it out, and recall another situation in which you tried the same but succeeded. Which tools were you able to use in the successful case that you didn’t or couldn’t use in the failed case?

In the cases where I have been successful in inferring a person's thoughts and feelings, I have engaged automatic empathy and simulation. By pausing to check my own feelings when interacting with someone, I can distinguish my mood before the interaction from my mood during it. This gives me a baseline for how the other person is feeling. Simulation has helped me draw on my own mental state to frame what another person may be feeling in a specific situation. When I have "misread" or misunderstood a person, it has often been the result of projection. Too often, I have made assumptions about a person's perception of an issue or situation and ended up causing embarrassment, tension, or offense.

2. In the near future we will have robots that closely interact with people. Which theory of mind tools should a robot definitely have? Which ones are less important? Why?

As artificial intelligence and robotics advance and applications become more accessible and common in the world, many of the theory of mind tools matter for robot/person interaction, but much depends on the role or job of the robot. First, the robot must have humanlike features for the person to begin a humanlike interaction; with this accomplished, the robot should be able to identify agents and recognize an agent's goals. It is less important that a robot differentiate between intentional and unintentional actions if it is performing a service like making coffee or assisting with grocery checkout. Mimicry and synchrony can put a person at ease and build connection; this would be important if a robot's function were to help in a medical setting, such as taking blood pressure and temperature. Automatic empathy seems so distinctively human that, except for chimpanzees, other animals do not demonstrate the behavior, so it would be unnecessary for a robot to have this tool. Joint attention is important for a robot that interacts with humans in more complex ways because it requires shared engagement and an understanding of an object's meaning. Simulation and projection are tools that robots would benefit from not having; without them, there is a sense of objectivity and less misinterpretation. Finally, it is not a robot's job to understand a person's mental state by inference; this can be achieved by asking questions and processing data.
