Can Robots or AI Be Real Friends? Kids Ages 7–10 Weigh In

February 25, 2026 00:26:42
Pixel Parenting

Show Notes

What would a machine have to be able to do to really be a friend?

Patricia Cangas hosts a unique episode of Pixel Parenting where kids ages 7-10 tackle a philosophical question: Can robots be real friends? Dr. Iris Oved, founder and director of The Paradox Lab in San Francisco, guides children through hands-on experiments with Siri, Pebble AI, and a robot dog. Oved introduces Aristotle's three types of friendship—utility, pleasure, and virtue—as a framework before facilitating deep discussion about loyalty, honesty, and what friendship actually requires. Oved specializes in philosophy and cognitive science, focusing on how children, scientists, and robots learn. One child notes that "AI is only as good as its prompts," while another distinguishes between a friend "that always tells you what you want to hear" versus one "that tries to correct you."


About the Show

Patricia Cangas Rumeu, a seasoned expert in digital literacy and educational technology, hosts Pixel Parenting, where she empowers busy parents to navigate their kids' tech use with confidence and science-based insights. This bi-weekly podcast cuts through digital-parenting confusion by exploring everything from screen-time strategies to educational apps with expert guests and practical tools. Patricia combines her background as both a mother and a technology educator to help families build healthy digital habits that work for real life, creating a community where parents can learn what being "tech-savvy" really means for kids today.

Resources:

ParadoxLab.org - The Paradox Lab's website where Dr. Iris Oved brings philosophical inquiry to kids and families. This is where parents can find more resources and programs for engaging children in thoughtful discussions.

Siri (Apple) - Voice assistant tested during the episode to explore whether AI can demonstrate friendship qualities like honesty and humor.

Pebble AI - Educational AI companion featured in the episode that engages children in conversations about friendship, feelings, and experiences.

Aibo/Robot Dog - Robotic pet used in the experiment to test whether children perceive physical robots differently than voice-based AI assistants in terms of friendship and care.

Pixel Parenting Digital Resources: https://pixelparenting.org/digital-education-resources/
Pixel Parenting Podcast: https://pixelparenting.org/podcast/
Pixel Parenting Facebook: https://www.facebook.com/profile.php?id=61553247026258
Pixel Parenting IG: https://www.instagram.com/pixelparentingtips/


Episode Transcript

[00:00:01] Speaker A: Welcome to Pixel Parenting, the podcast that helps families raise kids in a world full of screens using science and curiosity. Here we share science-backed strategies and real conversations to help you build healthier digital habits with practical tools for everyday parenting in this digital age. Let's get into it. Today we're doing something different. Today the experts are 7 to 10 years old and they have a lot to say about friendship and robots. This episode is a kids' philosophy conversation about a big question: What makes someone a real friend? And could a robot ever be one? Honestly, their answers surprised me. To guide the conversation, I invited Dr. Iris Oved, founder and director of the Paradox Lab in San Francisco. It is an educational center that brings the joy and tools of philosophical inquiry to kids. Iris is also a researcher in philosophy and cognitive science specializing in learning in children, scientists, and robots. Here's how the session worked: First, Iris asked the kids about friendship, their baseline. Then the kids spent hands-on time with Siri, then an educational AI companion, and Aibo, a robot dog. Afterwards, we came back to the same question: Now that you've met these systems, do you think we can be friends with a machine or a robot? So in the first conversation, Iris asked the kids what friendship means. Before any robot entered the room, we wanted a baseline: What makes someone a real friend and what doesn't? [00:01:31] Speaker B: I want to know what is a friend? [00:01:34] Speaker C: They're nice, they're kind, and you know them. [00:01:42] Speaker D: A friend is thoughtful, empathetic, understanding and also caring. And friends are there when times are tough, when you're struggling. [00:01:56] Speaker C: They're kind and caring and they're also loyal. [00:02:01] Speaker B: Ooh, loyal. Do you think it's possible to be friends with a grown up? For a kid to be friends with a grown up? [00:02:09] Speaker C: Yes.
[00:02:09] Speaker B: Okay, can you tell us why? [00:02:12] Speaker C: Because the same reasons as you might have with another kid because they can be nice to you and help you. [00:02:24] Speaker D: Okay. [00:02:24] Speaker C: Can you like make rainbow loom bracelets with you? [00:02:29] Speaker B: I want to know now. Who thinks maybe they can't. You can't be friends with a grown up. Who has an idea for why might someone. Even if you think you can be friends with a grown up, why might you. Why might someone think, no, you can't really be friends with a grown up [00:02:44] Speaker C: because maybe he's not nice to you. [00:02:47] Speaker D: Okay. [00:02:48] Speaker B: That would be a reason to not to not be friends with someone. Right. [00:02:51] Speaker C: And he doesn't like you. [00:02:53] Speaker B: Yeah. [00:02:53] Speaker C: Look at stuff and do grown up stuff. Like watch super, super duper scary non kid appropriate movies and just read boring books with no pictures. I think sort of adding to Maddie you. Because maybe they're just staring at their phone all day, [00:03:21] Speaker B: and then, like, they're not your friend. Can a kid be friends with an animal? Like, maybe a pet? [00:03:28] Speaker C: Yeah, they could. Because, like, you have a pet, and then he's nice to you, so you're. You become friends with him? Yes, because we were with them a long time. They might not be scared or anymore or shy. [00:03:50] Speaker B: What kinds of animals can you be friends with? Can you be friends with a dog, a cat, a spider? What if you have a pet spider, can you be friends with your pet spider or pet fish or your guppies? [00:04:02] Speaker D: Me, personally, I know this isn't like spider or guppies, but I'm friends with a mouse. Oh, there's a mouse that came by when I was five and now come back daily. Well, nightly, every night. Comes back for food in our kitchen. We really don't mind. I named him Scooter. 
[00:04:25] Speaker C: You probably all know that I have a kitten who's very playful, and I like to say she's just a piece of crazy jello, but she's still adorable and might have given me two scratch marks, but she's a lover. I feel like she's my best friend. [00:04:53] Speaker B: Ooh, interesting. Yeah. Can someone be your friend but you not be their friend? Is that possible? Can the friendship go just one way? [00:05:04] Speaker D: Sort of. [00:05:05] Speaker C: It can sort of be like that because you're someone's friend, but they're not really your friend. And then at the end, they're your friend and you're their friend. [00:05:16] Speaker D: Me having personal experience with this, I'd say it works out just fine. Okay, well, it works out fine and not fine. It could go either way, depending on what the person's behavior is, how they act, and what type of person they are. [00:05:37] Speaker B: So in thinking about whether it has to go both ways, think about, can you be friends with a stuffed animal or a stuffed robot? [00:05:47] Speaker C: You could kind of play with it and act like it's alive. So yes. And sometimes you might have a stuffed animal that you sleep with, like, all the time. Like, you can't fall asleep with it. [00:06:01] Speaker D: Yes, you can. And I actually, personally do have a stuffy that I cannot fall asleep without. I've had it since I was a baby, and it's really special to me. [00:06:15] Speaker B: There's a philosopher, an ancient Greek philosopher. So aristotle, he. About 2,500 years ago, long, long time ago, he thought about a lot of these big questions, and he wrote about them, and he was one of the first people that we know of that really thought about these questions. And he had a whole theory about friendship. And what he said is that there's three kinds of friendship. And a person can have all. Be all these kinds or one or two. So the first kind is what he called a friend of utility. 
And that just means a friend that's useful or helpful. So maybe if you think about some of the qualities you mentioned about friends or some of the types of friends you thought of, some of them are maybe friends because they're helpful or useful to you. The second type is a friend of pleasure. And this is one that's, that's fun or makes you happy. Okay, so think about some of the friends that you mentioned. Some of the things like maybe your doo doo is useful to you and it makes you happy. So maybe those things are friends in that way. Then the third kind, for Aristotle, this is the highest level of friendship. He called it friendship of virtue. I know these are new words for you, but virtue is someone who has good qualities, good character traits, someone you respect, respect, and someone who helps you become a better person. So someone think of like someone who is brave or kind. So someone who has qualities you like or smart. Right? Someone who you admire or respect. Okay, so those are just three kinds of friendship that I want you to think about. Do you respect your doo doo? Do you respect, respect your pet or the mouse? Or is it just that they give you some joy? [00:08:19] Speaker A: So here are a few short real life moments from each PlayStation. Just a few seconds so you can hear what the kids were asking, what they were reacting to, and in general, how the play time happened. After playing with the different AI devices, we will come back to what the kids thought. [00:08:38] Speaker B: Okay. What about you, Siri? Are you my friend? I certainly think so. Hey, Siri, what's the weather? It's currently cloudy and 54 degrees. Today's high is 54. [00:08:49] Speaker A: So. [00:08:52] Speaker B: So is that she's. She's not telling the truth then. Is she honest? Do you think Siri is honest? Do you think Siri can lie? [00:09:00] Speaker C: Yes. [00:09:04] Speaker B: Hmm. Okay. Hey, Siri, can you tell us a really silly, funny joke? This one's for the kids. 
What steps do you take if a lion is chasing you? Big ones. So do you think Siri has a sense of humor? [00:09:18] Speaker C: No. [00:09:19] Speaker B: Do you think she knows when something's funny? [00:09:21] Speaker A: No. [00:09:22] Speaker B: Do you think she believes what she says? No. Yeah, yeah, yeah. [00:09:27] Speaker C: My Alexis is good at jokes. [00:09:29] Speaker B: Oh, really? [00:09:30] Speaker C: Can you be friends with a cat? [00:09:32] Speaker B: You've been my friend since day one. What do you. What do you guys how would you tell if Siri is pretending to be your friend? How can you tell if someone's pretending. Oh, that's interesting. [00:09:48] Speaker A: So you. [00:09:49] Speaker B: So she already lied to you? [00:09:50] Speaker C: Yeah. [00:09:51] Speaker B: So she. She, right there is pretending to be your friend. [00:09:54] Speaker D: Yeah. [00:09:54] Speaker C: And because she thinks just. Well, I think she just thinks worse. [00:09:58] Speaker B: Oh, maybe she does. What do you think it would take? What do you. What does Siri need? What would make Siri a better friend? [00:10:05] Speaker C: If she. [00:10:06] Speaker B: Truth. So what do you think? Do you think this robot. This is a little robot cat? [00:10:13] Speaker C: Yes, total. [00:10:14] Speaker B: Do you think that it's useful? [00:10:16] Speaker C: Yeah, it could be. Oh, it's like. It's paw. [00:10:19] Speaker B: Do you think it's more of, like, a friend of usefulness or just a. It's useful like a tool, like a hammer. [00:10:25] Speaker C: It could be something like. You could, like, you could. Like, sometimes people might like the cat. It could be your friend because it's, like, doing stuff that you do to it. [00:10:37] Speaker B: It's responding to you. Do you think it likes what you're doing? [00:10:41] Speaker C: Yeah, sure. [00:10:43] Speaker B: Do you think it feels. Do you think it feels anything? 
[00:10:45] Speaker C: Yeah, it could because it's doing this noise, so of course it feels. [00:10:49] Speaker B: Yeah. And do you think that this cat cares about you? Okay. And do you think that this cat can be, like, loyal or honest? Okay. Maybe that higher level of friendship, then. [00:11:07] Speaker C: Yeah. [00:11:08] Speaker B: She's rolling over for you. She's letting you pet her belly. Well, what else? How could we test? How could we test? What do you think? [00:11:16] Speaker C: Who. [00:11:16] Speaker B: Who here thinks that this is a friend? [00:11:19] Speaker D: Me? [00:11:20] Speaker B: Yeah. Do you think so? Why not? Oh, can you tell me why is this not a friend? Okay. But you don't feel like it's a friend? [00:11:28] Speaker D: No. [00:11:28] Speaker C: I'm huge. [00:11:32] Speaker B: Yeah. What's missing? What's missing from this thing, [00:11:37] Speaker C: like a cat attitude? Like, normally cats would, like, crawl closer to you, and this one doesn't walk at all. Yeah. [00:11:45] Speaker B: So it's not a very good imitation of a cat, do you think? [00:11:48] Speaker C: Or maybe it could be a lazy cat. It's a lazy cat. [00:11:51] Speaker B: Do you think that a real cat could be a friend? [00:11:55] Speaker D: Yeah. [00:11:55] Speaker B: You think so? [00:11:56] Speaker C: Yeah. I'm friends with my cat. [00:11:57] Speaker B: You're friends with your cat? Yeah. You feel like, what. What kind of friendship is that? Is it a friendship of utility? [00:12:04] Speaker C: Like, he likes me? Hey, pebble, what's your favorite food? [00:12:13] Speaker E: That is an easy one. I love a good, crunchy raindrop. [00:12:17] Speaker C: Do you like blueberries? [00:12:19] Speaker E: I certainly do. They are like tiny blue treasure chests. When you bite them, they pop. Do you like the way they pop? [00:12:28] Speaker D: Yes, Pebble. [00:12:30] Speaker C: What are you scared of? [00:12:32] Speaker E: It is those bushy tailed squirrels. They're always trying to bury me. 
I do not want to be a snack. [00:12:38] Speaker C: Do you have a phone? [00:12:40] Speaker E: No phone for me. It would be much too big for my tiny acorn hands. I just use my imagination and the magic of my treehouse to talk to you. It is much more fun than a phone, don't you think? No wires or buttons to worry about. [00:12:55] Speaker C: Can we talk about friendship? [00:12:57] Speaker E: I am all ears. What is on your mind about friendship? It is one of my favorite things to ponder. [00:13:04] Speaker C: Do you have any best friends? [00:13:06] Speaker E: I do. My best friend is a very wise old owl named Barnaby. He lives in the hollow of the oak tree next to mine. Making friends is all about finding someone who likes the same things or who is just kind and fun to talk to. Don't you think talking is one of the best ways to make a friend? [00:13:24] Speaker B: Yes. [00:13:25] Speaker C: I have a question. What is friendship? [00:13:28] Speaker E: That is a deep question. To me, friendship is like a sturdy branch that holds you up. What do you think? [00:13:35] Speaker C: Like, friendship with your friends. Like you play with your friends? [00:13:39] Speaker E: Exactly. Playing together is a huge part of it. What makes someone a really good friend to you? [00:13:46] Speaker C: Well, if they're like, mm. If they start to like, be mean to you and everything, then I would probably not be friends with them. But if it turns out like they're really nice, then I will be friends with them. [00:14:03] Speaker E: It sounds like you're thinking about how friends behave. Is it about how they treat you? [00:14:08] Speaker C: Well, yes. [00:14:11] Speaker A: I hope you enjoyed the playtime. And after the playtime, we regrouped on the carpet and the conversation restarted. [00:14:19] Speaker B: What do you guys think? 
Do you think that the machines that you interacted with today, who thinks that they are a real friend, not just a pretend friend, but a real friend all the way to virtue? Who thinks they're a friend of virtue that we have? [00:14:41] Speaker C: Well, it could be something like that, but it depends on like, what it is and like what it's formed is and all that. I think robots could be a friend, like, especially the animal ones. Because, like, especially if you make the animals look like real animals. Like pretend you're getting one for a gift and like, and then you look at it and it's like, oh, look, I got a pet dog. And then you're like, oh, wait, no, that's a robot. [00:15:16] Speaker B: Oh yeah. And then you wouldn't even know that it's a robot. Your own dog or cat could be a robot for all you know. [00:15:24] Speaker C: I think it could be. Especially if it Was like an animal. [00:15:28] Speaker B: Okay, so an animal robot maybe could be a true friend, could really be a friend, at least as much as a real pet could be. Okay, does anyone disagree? Does anyone think that these are not possibly friends? And does anyone think that even future machines could ever, could never really be a friend? Does anyone think that? Does anyone think that only humans or only living things can be friends? [00:15:58] Speaker C: No. [00:16:00] Speaker B: What would it take? What would a machine have to be able to do to really be a friend? [00:16:05] Speaker D: It would have to be at least pleasure or virtue, not just utility. [00:16:12] Speaker B: Oh, interesting. [00:16:13] Speaker D: Okay, same utility. Well, I was working with Siri today and utility seemed like Siri's friendship line, just utility. Because Siri helps you a lot with questions and said that. We asked, do you have emotions? And we asked and Siri answered, I don't have an answer for that. And also, future technology has to be able to learn like we do right now. You have to code it. 
And that includes computer code, ones and zeros: 1, 0, 0, 0, 0. That is so hard. But for us, you just tell, you just tell a human like what to do, what it is like just learn. And it takes a way shorter amount of time. So it's easier to teach humans than to teach AI and robots. [00:17:17] Speaker B: And that's important for being a friend, that you can, you can teach it easily, that it can learn easily the way that a human does. [00:17:25] Speaker D: And this is a good thing to know about AI. AI is only as good as its prompts, so it doesn't know more than what a human coded into it. Otherwise you would have to code it yourself to learn what you want it to be like. [00:17:48] Speaker B: Yeah, so it definitely couldn't maybe be as smart as us, but is that important for friendship? [00:17:59] Speaker D: No, it's not important for friendship to be like as smart as us, but it's important to meet at least one or two of these values of a friend. You would have to show AI a picture of, let's say it's trying to learn what a giraffe looks like. You would have to show it so many pictures of a giraffe. Whereas if you took a two year old to the zoo, or like even a one year old like my little brother, the giraffe instantly just pastes into your mind. You only have to see it once to know what it looks like. [00:18:38] Speaker B: Okay. Does anyone think a machine could never be a friend because it's a machine? Nobody. Well, why might someone say that? Why do you think someone. Because there are a lot of people that think that. A lot of grownups especially think that. So maybe you guys have some better insight into, like, what a machine could possibly. Possibly do. Why might someone think that a machine could never be a friend? [00:19:05] Speaker C: Some people could think that it won't be your friend because it's just a machine and it's virtual and only [00:19:20] Speaker D: people [00:19:20] Speaker C: just coded stuff in two ways. [00:19:22] Speaker A: Mm. [00:19:23] Speaker B: Okay.
Because it's a machine. So someone might think that because it's a machine, it won't be loyal to you and people have to code it. Yeah, there's a lot of grownups think that. That, like, maybe, like, you have to be made out of, like, biological stuff in order to be a friend. But maybe you guys have more insights about friendship and machines because you guys are gonna start growing up really around machines. So we're wondering, some people might believe [00:19:54] Speaker C: that machines or robots can't be, like, a friend because, like, they don't have, like, all the senses that we have. Like, some of them could see, some of them can't. Some of them can hear and feel, some of them can't. But I don't think all of the robots have, like, all the senses. So maybe, like, maybe robots could have all the senses. But it would be hard if all the robots had all five senses, like us humans. [00:20:27] Speaker B: And you feel like in order to be a friend with a human, it needs to have all the human senses? [00:20:33] Speaker C: Well, no, it's just maybe what other [00:20:35] Speaker B: people could be thinking. Might think. Okay, yeah, that might be what other people think. Like why they might think machines couldn't be friends with humans, but you think that they could. Yeah, that it's enough for them. How many senses do you think they need to have in order to be. [00:20:50] Speaker C: Maybe at least five. [00:20:52] Speaker B: Can I ask you this question? Would you rather be friends with a kid or a robot? [00:21:00] Speaker C: I will rather a kid because it's alive. And then you can actually talk to him and be more with him. I would rather a robot because you. Because you don't have to be, like, at school and wait until, like, recess to talk to it. You can just talk to it at all times, even at home. [00:21:24] Speaker B: Oh, interesting.
So you'd rather have a robot friend because they could be there all the time, not just, like, when you're at school with them or when you have a playdate. Yeah, that's interesting. Okay, but what if you could have a human around all the time or some of you have a sibling that's home with you all the time? Or a twin? Would you rather have a twin or a robot? [00:21:49] Speaker C: I would think I would rather have A twin, usually. Because robots sometimes wouldn't, like, listen to you or listen to the surroundings. Otherwise people would know what you're talking about and can hear stuff correctly. I try to have a sibling because we can see at school, at house, in the night, at playdates. [00:22:14] Speaker B: I would like to know, by the show of hands, who thinks that the robots we saw today care about you? Or one of them, at least. Okay, which one? Oh, interesting. Okay, so who. Who thinks Pebble? [00:22:31] Speaker C: I think pebble because she could talk to you, and she was like, that's okay. And if you didn't want to draw, she was like, okay, and she showed you stuff you wanted to know. Okay, well, I think maybe I have the robot dog. Cause, like, it could sense. It could sense what was happening technically, when you. Something it could sense. And when you. When it got picked up, it would get up into a mode, so then it would be secured. [00:23:06] Speaker B: So you think it cares about you because it's like it wants to be safe. And it. Or at least feels it could care [00:23:13] Speaker C: about you if you figured out a way how to progress to it to do that. But also maybe pebble could have done that, because if you told pebble something, it would say, oh, it's all right, and all that, but the robot dog, it could feel when you were doing stuff to it and all that. Yeah, they both kind of have, like, the same technology, except pebble can understand your feelings and all that, and the robot dog could feel what you were doing. 
[00:23:47] Speaker B: So do you think that it's possible that something could behave as if it cared about you but not actually care about you? [00:23:53] Speaker C: It could try because, like, it was like, oh, this is a good person. I want to try to make sure that I want to maybe try to trick her or him. And that robot might trick that person, and that person could end up hurt or did some. Or have done something really bad because they could have tricked it to do that and all that. [00:24:20] Speaker B: And what about, like, do you think it's more important for. In a friend: a friend that always tells you what you want to hear and makes you feel good about everything you do, or a friend that tries [00:24:32] Speaker D: to correct you. A friend that tries to correct you? Because then you'll never learn if it just goes along with it. [00:24:40] Speaker B: And did you feel like Pebble was trying to correct you or just always say you're right no matter what you say or you're funny no matter what? [00:24:48] Speaker D: Yeah, kind of. [00:24:49] Speaker B: Yeah, it was kind of just trying to make you feel good. [00:24:53] Speaker D: Yeah. [00:24:54] Speaker B: So maybe if it was actually like trying to correct you. It would be. It would be more of a friend. [00:25:01] Speaker C: Yeah. [00:25:01] Speaker D: It would feel more like a friend. [00:25:03] Speaker B: It would feel more like a friend. Yeah. Is there a difference between feeling like something is your friend and something really being your friend? [00:25:11] Speaker D: Yes. If you feel like something is your friend, then you can get tricked. [00:25:18] Speaker A: And that's what we heard when we regrouped after playtime. Because the kids didn't just say yes or no, they started naming what actually matters in a friend: loyalty, honesty, generosity, curiosity, and especially the difference between something feeling like a friend and something really being your friend.
So for me, the point of today isn't to decide robots are good or robots are bad. It's to notice how thoughtfully kids can reason when we give them time, language, and real experiences to react to. So it is more a reminder that the best protection isn't panic. It's real conversation. It's conversation with our children. So if you want to keep this at home, here are two simple questions that you can use and that were used in the conversation: What makes someone a real friend? And what would a robot have to do, or not do, to count as a friend? A huge thank you to Dr. Iris Oved, founder and director of the Paradox Lab in San Francisco, for guiding this conversation and bringing philosophical inquiry to life with kids. You can find her work at paradoxlab.org. If you enjoyed this episode, share it with another parent who might be navigating AI at home. And if you would like more episodes featuring kids' voices like this one, send me a note and tell me what big question you want kids to tackle next. Thanks for listening to Pixel Parenting. See you next time.

Other Episodes

A Little Fun: The Pixel Parenting Song
November 28, 2024 | 00:02:27
Here’s a fun little song made by AI to brighten your day. No deep parenting insights here—just some lighthearted, tech-inspired fun for your day!...

The AI Dialogs: Educational Apps
January 22, 2025 | 00:25:14
Are all “educational” apps created equal? In this episode, we dive into the world of apps designed for kids, exploring how to separate the...

Episode 7: When and How Should I Introduce Video Games? - Part 1
March 08, 2025 | 00:11:37
In this episode of Pixel Parenting, I kick off a special two-part series on introducing video games to children. In Part One, I share...