278 results for “Anthropomorphobia”

Sophia the Robot has a sister: Little Sophia

Ruben Baart
February 7th 2019

Robot Sophia is pretty much the international face of the ‘modern robot’. Sophia is the Audrey Hepburn-inspired humanoid robot who stands out for several reasons. She is the first robot to be granted citizenship (in Saudi Arabia) and has met more politicians and celebrities than most people do in a lifetime. Sophia is advanced, yet expensive and not very portable.

Now the superstar robot has a little sister: Little Sophia, an educational companion for ages 7-13 getting kids excited …

People are less likely to turn a robot off if it asks them not to

Tristan Greene
October 5th 2018

A team of German researchers published a study earlier this week indicating that people can be duped into leaving a robot turned on just because it “asks” them to. The phenomenon is called personification, and it could cause our species some problems as machines become more integrated into our society. …

Sophia: World’s First Robot with a Citizenship

Jack Caulfield
December 6th 2017
Sophia the Humanoid, a robot designed to be human-like in appearance and mannerisms, was granted citizenship by Saudi Arabia, becoming the first robot to hold citizenship.

Robotic Pillow Breathes to Help You Sleep

Charlotte Kuijpers
November 22nd 2017
A smart, huggable bed partner that also improves your sleep quality. Sounds great, right? Soon, you might be able to order one for yourself: Somnox is a soft robotic pillow that gently breathes as you hold it.

Intimate Technology S01E06: Also, the Dichotomy of Pragmatism and Perversion

NextNature.net
November 14th 2017
Do we treat our technologies with more care and sentimentality these days than we did in the past?

Intimate Technology S01E03: A iReal

NextNature.net
October 24th 2017
What if your devices had a life of their own? Guy Farber’s playful short movie "A iReal" explores this very possibility.

The Human Smart Home Assistant

Charlotte Kuijpers
September 29th 2017
Artist Lauren McCarthy launched a project called LAUREN, in which she embodies the eponymous human smart home assistant.

Virtual Baby Acts and Looks Impossibly Real

Charlotte Kuijpers
September 26th 2017
A rosy-cheeked kid learns her first words, cries when her babysitter leaves, smiles when she’s happy, but she’s not real. BabyX is an AI research-in-progress by a company named Soul Machines.

Preventive Punishment for Robots

Elle Zhan Wei
August 22nd 2017
The Punishment is an installation featuring a robotic arm that perfectly mimics a kid's handwriting, repeatedly writing "I must not hurt humans".

Barbie Becomes a Hologram of Herself

Julie Reindl
March 2nd 2017
Barbie has been turned into a hologram version of herself and will now be your kids' assistant.