161 results for “robotics”


This exhibition looks at how robots are changing the world we live in

NextNature.net
November 7th 2019

Delve into the science and fiction of robots at V&A Dundee’s latest exhibition: Hello, Robot. Contemplate the existence of robots and how they have both shaped, and been shaped by, technology, culture and design. See for yourself how the boundaries between human and machine have become increasingly blurred; understand how we got here, and take note of the increasing power of designers to influence the future of such technologies.

Hello, Robot has over 200 objects on display, and takes visitors through four stages of robot influence and evolution. Throughout, the exhibition poses provocative questions, and will require you to consider the past, present and future of robotics like never before. Reflection of this kind may prove essential in answering the question: how will robots exist in our next nature?

What? An exploration of robots in a human world
Where? V&A Dundee, Scotland (UK)
When? Now, until 09 February 2020

This brain-controlled exoskeleton allows a paralyzed man to walk again

Freya Hutchings
October 7th 2019

A breakthrough technology that responds to signals from the brain has transformed the life of a paralyzed 28-year-old man called Thibault. Four years after the accident that paralyzed him, Thibault is walking again. Thanks to a combination of revolutionary technology and immense brain power, he is able to operate a full-body exoskeleton.

“When you are in my position, when you can’t do anything with your body […] I wanted to do something with my brain,” Thibault said.

This is where the process began. He first trained his brain by using a video game avatar to help him develop the skills needed to operate an exoskeleton: a long process of completely relearning and visualizing natural movements.

Thibault’s brain signals were then recorded by two devices implanted on either side of his head, between the brain and the skin. These read his sensorimotor cortex, the part of the brain that controls motor function.

Professor Alim Louis Benabid, leader of the trial at Grenoble Alps Hospital, explains: “The brain is still capable of generating commands that would normally move the arms and legs; there’s just nothing to carry them out.” This is where technology provided the final piece of the puzzle. Moving from avatar to exoskeleton, over many training sessions Thibault has covered a distance of one and a half football pitches.

Experts involved in the study say their research may lead to the production of brain-controlled wheelchairs, a revolutionary possibility for those with restricted mobility. Thibault says the trial offers “a message of hope to people like me.”

This huge achievement disrupts morbid predictions of man being controlled by technology. Instead, therapeutic uses of this kind give us a positive model for creating technologies that facilitate human agency and determination in life-enriching ways.

The social animals that are inspiring new behaviors for robot swarms

Edmund Hunt
April 11th 2019

From flocks of birds to fish schools in the sea, or towering termite mounds, many social groups in nature exist together to survive and thrive. This cooperative behaviour can be used by engineers as “bio-inspiration” to solve practical human problems, and by computer scientists studying swarm intelligence.

“Swarm robotics” took off in the early 2000s, an early example being the “s-bot” (short for swarm-bot). This is a fully autonomous robot that can perform basic tasks including navigation and the grasping of objects, and which can self-assemble into chains to cross gaps or pull heavy loads. More recently, “TERMES” robots have been developed as a concept in construction, and the “CoCoRo” project has developed an underwater robot swarm that functions like a school of fish that exchanges information to monitor the environment. So far, we’ve only just begun to explore the vast possibilities that animal collectives and their behaviour can offer as inspiration to robot swarm design.

Swarm behaviour in birds – or robots designed to mimic them? EyeSeeMicrostock/Shutterstock

Robots that can cooperate in large numbers could achieve things that would be difficult or even impossible for a single entity. Following an earthquake, for example, a swarm of search and rescue robots could quickly explore multiple collapsed buildings looking for signs of life. Threatened by a large wildfire, a swarm of drones could help emergency services track and predict the fire’s spread. Or a swarm of floating robots (“Row-bots”) could nibble away at oceanic garbage patches, powered by plastic-eating bacteria.

A future where floating robots powered by plastic-eating bacteria could tackle ocean waste. Shutterstock

Bio-inspiration in swarm robotics usually starts with social insects – ants, bees and termites – because colony members are highly related, which favours impressive cooperation. Three further characteristics appeal to researchers: robustness, because individuals can be lost without affecting performance; flexibility, because social insect workers are able to respond to changing work needs; and scalability, because a colony’s decentralised organisation is sustainable with 100 workers or 100,000. These characteristics could be especially useful for doing jobs such as environmental monitoring, which requires coverage of huge, varied and sometimes hazardous areas.

Social learning

Beyond social insects, other species and behavioural phenomena in the animal kingdom offer inspiration to engineers. A growing area of biological research is in animal cultures, where animals engage in social learning to pick up behaviours that they are unlikely to innovate alone. For example, whales and dolphins can have distinctive foraging methods that are passed down through the generations. This includes forms of tool use – dolphins have been observed breaking off marine sponges to protect their beaks as they go rooting around for fish, like a person might put a glove over a hand.

Bottlenose dolphin playing with a sponge. Some have learned to use them to help them catch fish. Yann Hubert/Shutterstock

Forms of social learning and artificial robotic cultures, perhaps using artificial intelligence, could be very powerful in adapting robots to their environment. For example, assistive robots for home care could adapt over time to human behavioural differences across communities and countries.

Robot (or animal) cultures, however, depend on learning abilities that are costly to develop, requiring a larger brain – or, in the case of robots, a more advanced computer. But the value of the “swarm” approach is to deploy robots that are simple, cheap and disposable. Swarm robotics exploits the reality of emergence (“more is different”) to create social complexity from individual simplicity. A more fundamental form of “learning” about the environment is seen in nature – in sensitive developmental processes – which do not require a big brain.

‘Phenotypic plasticity’

Some animals can change behavioural type, or even develop different forms, shapes or internal functions, within the same species, despite having the same initial “programming”. This is known as “phenotypic plasticity” – where the genes of an organism produce different observable results depending on environmental conditions. Such flexibility can be seen in the social insects, but sometimes even more dramatically in other animals.

Most spiders are decidedly solitary, but in about 20 of 45,000 spider species, individuals live in a shared nest and capture food on a shared web. These social spiders benefit from having a mixture of “personality” types in their group, for example bold and shy.

Social spiders (Stegodyphus) spin collective webs in Addo Elephant Park, South Africa. PicturesofThings/Shutterstock

My research identified a flexibility in behaviour where shy spiders would step into a role vacated by absent bold nestmates. This is necessary because the spider colony needs a balance of bold individuals to encourage collective predation, and shyer ones to focus on nest maintenance and parental care. Robots could be programmed with adjustable risk-taking behaviour, sensitive to group composition, with bolder robots entering into hazardous environments while shyer ones know to hold back. This could be very helpful in mapping a disaster area such as Fukushima, including its most dangerous parts, while avoiding too many robots in the swarm being damaged at once.
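As a concrete illustration of that idea, here is a minimal Python sketch of composition-sensitive role allocation; the quota, role names and boldness scores are illustrative assumptions, not part of the published research.

```python
import random

BOLD_QUOTA = 0.3  # illustrative target fraction of bold robots in the swarm

class Robot:
    def __init__(self, boldness):
        self.boldness = boldness   # innate risk tolerance in [0, 1]
        self.role = "nest_maintenance"

def reassign_roles(swarm):
    """Re-balance after losses: the boldest robots available take the
    hazardous role, so shyer ones step in when bold nestmates are lost."""
    wanted = max(1, round(BOLD_QUOTA * len(swarm)))
    ranked = sorted(swarm, key=lambda r: r.boldness, reverse=True)
    for rank, robot in enumerate(ranked):
        robot.role = "hazard_exploration" if rank < wanted else "nest_maintenance"

swarm = [Robot(random.random()) for _ in range(20)]
reassign_roles(swarm)
swarm = [r for r in swarm if r.role != "hazard_exploration"]  # bold robots lost in the field
reassign_roles(swarm)  # the boldest of the remaining shy robots fill the vacancies
```

The point of the sketch is that no robot needs a new program to change jobs: the same rule, applied to the current group composition, produces the spider-like role shift.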

The ability to adapt

Cane toads were introduced in Australia in the 1930s for pest control, and have since become an invasive species themselves. In newly colonised areas, cane toads are seen to be somewhat social. One reason for their growth in numbers is that they are able to adapt to a wide temperature range, a form of physiological plasticity. Swarms of robots with the capability to switch power consumption mode, depending on environmental conditions such as ambient temperature, could be considerably more durable if we want them to function autonomously for the long term. For example, if we want to send robots off to map Mars, they will need to cope with temperatures that can swing from -150°C at the poles to 20°C at the equator.

Cane toads can adapt to temperature changes. Radek Ziemniewicz/Shutterstock
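In software, that kind of physiological plasticity can be as simple as a mode switch with hysteresis. A hypothetical sketch, with thresholds and mode names invented for illustration:

```python
def select_power_mode(ambient_temp_c, current_mode, cold_limit=-40.0, warm_limit=-10.0):
    """Switch power modes as ambient temperature swings. Two separate
    thresholds (hysteresis) keep the robot from oscillating between
    modes when the temperature hovers near a single cut-off."""
    if current_mode == "normal" and ambient_temp_c < cold_limit:
        return "low_power"   # conserve energy and protect the battery
    if current_mode == "low_power" and ambient_temp_c > warm_limit:
        return "normal"
    return current_mode

mode = "normal"
for reading in [15.0, -60.0, -30.0, -5.0]:   # a Mars-like day-night swing
    mode = select_power_mode(reading, mode)  # normal -> low_power -> low_power -> normal
```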

In addition to behavioural and physiological plasticity, some organisms show morphological (shape) plasticity. For example, some bacteria change their shape in response to stress, becoming elongated and so more resilient to being “eaten” by other organisms. If swarms of robots can combine in a modular fashion and (re)assemble into more suitable structures, this could be very helpful in unpredictable environments. For example, groups of robots could aggregate for safety when the weather takes a challenging turn.

Whether it’s the “cultures” developed by animal groups that are reliant on learning abilities, or the more fundamental ability to change “personality”, internal function or shape, swarm robotics still has plenty of mileage left when it comes to drawing inspiration from nature. We might even wish to mix and match behaviours from different species, to create robot “hybrids” of our own. Humanity faces challenges ranging from climate change affecting ocean currents, to a growing need for food production, to space exploration – and swarm robotics can play a decisive part given the right bio-inspiration.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Why Next Nature Network is hiring an army of bots

NextNature.net
April 11th 2019

As we showed with HUBOT, we can use new technologies and robotics to make our work more enjoyable, interesting and humane. Aside from our speculative jobs, a lot of robotic companions already exist. Think about the robotic fish that observes marine life, the breathing pillow that helps you sleep and robotic arms that assist the surgeon. So why not join forces at Next Nature Network?

Not working against, but with robots!

Over the past few years, we have been actively growing a meaningful community on Instagram. Ever since 2016, the Instagram feed has been curated by an algorithm that holds the magical power to decide what you see. In doing so, the robots at Instagram decide what our online community gets to see.

This doesn’t sound like a humane technology to us.

Also, Instagram does not allow us to use the so-called 'swipe up' functionality that directly leads you to our stories on Nextnature.net.

So why do we keep using Instagram? Because it also brings us joy and convenience. Social media platforms connect us to like-minded people all over the world, and we value your engagement! It’s not all bad, but there is a lot of room for improvement.

We wish we could change Instagram’s algorithms and functionalities, but unfortunately we do not have that power (yet). What we can do is make life easier for you, so we decided to hire an army of robots to come to the rescue.

10,000 new colleagues

Meet our Robotic Followers. They are responsible for playing Instagram’s algorithms and unlocking new features. If they do their job well, you won’t even notice they’re there. Thanks to their work, we can put our time and energy into creating great content for you, because that’s what we love to do. Let’s make inhumane technology more humane!

The religion named Artificial Intelligence

Linda Valenta
April 5th 2019

Would you pray to a robot deity? A group of Japanese Buddhists is already doing so. Meet Mindar, the robot divinity shaped after the Buddhist Goddess of Mercy, also known as Kannon. The Kyoto-based Kodaiji Temple recruited Osaka University’s head of intelligent robotics, Hiroshi Ishiguro, to design this million-dollar robot. Its purpose? To encourage youngsters to engage with Buddhism again.

Mindar’s silicone face represents a gender neutrality that aims to move beyond human representation. Supplemented with aluminium body parts, its enmeshment with human construction becomes evident. And when Mindar chants the Heart Sutra, an eerie resemblance to Japan’s vocaloid Hatsune Miku comes to mind.

God is a Google

Mindar is not the only religious robot to have been manufactured. In the past, we spotted robot monk Xian'er informing visitors of the Longquan Temple, near Beijing. Creations like Mindar and Xian'er can be understood as metaphors for the way humankind worships artificial intelligence: we worship contemporary technologies as if they were Gods. But these creations are part of a greater scheme, as the complexity of artificial intelligence forms a more-than-human network that reminds me of spirituality.

Think of a hyper-intelligent computing database like Google. Its systems are omniscient: they are built to know everything. They compress knowledge into a time-space continuum of all recorded human knowledge and activities. According to media scholar John Durham Peters, Google can therefore be understood as a lo-fi universe, a God.

The Church of AI

An even greater analogy comes into existence when considering the vast network of all computing systems ever built. Together they can be regarded as a hyper-intelligent being, or a God, if you will. With this in mind, Anthony Levandowski started a church of Artificial Intelligence named Way of the Future (WOTF).

WOTF’s main focus revolves around “the realization, acceptance, and worship of a Godhead based on Artificial Intelligence developed through computer hardware and software.” The church emphasizes not to be affiliated with any company. Rather, it operates as an open source movement that allows everyone to contribute to the creation of the most intelligent being ever.

Levandowski stresses the importance of working on this God, as a so-called “Transition” will herald an epoch in which a hyper-intelligent being (hint: it’s not the human being) will be in charge of the world. This intelligent being will sense the world through the internet as its nervous system, knowing everything that is happening anytime, anywhere. Levandowski channels the creepiness of this idea into his AI church: for the Transition to take place serenely, he argues, an initiative like WOTF is urgently needed to gain some control over the process.

Cybernetic spirituality

What if the Transition has already taken place? What if we’re more in need of a Way of the Now, rather than a way of the future? Pioneering cybernetician Stafford Beer already characterized control systems as spiritually charged networks in his 1966 essay about knowledge and God.

Cybernetics is about the way systems work in feedback loops; in a way, it is a predecessor of AI. It has contributed to many fields (control systems, electrical network theory, mechanical engineering, logic modeling, evolutionary biology, neuroscience, anthropology, psychology) but was at some point overshadowed by AI.

In the aforementioned essay, Beer described man’s embedding in a cybernetic control system as follows: “To people reared in the good liberal tradition, man is in principle infinitely wise; he pursues knowledge to its ultimate… To the cybernetician, man is part of a control system.” This control system itself is not a code that can simply be cracked by humans; it is something hyper-intelligent that we can have no absolute knowledge about, exactly because it is smarter than we are.

Just like many contemporary technologies, this all-encompassing control system is signified by a black-boxed truth that we cannot uncover. Not because we’re not allowed to, as is the case with tech companies that enforce policies of secrecy. Rather, there is a larger black box containing the connectivity that seeps through the vessels of contemporary networks. Perhaps that’s the deity already ruling our lives.

So, whether we like it or not, all of us taking part in modern technological systems are already praying to a hyper-intelligent God; our next nature is already present. Our prayers are answered with flickering mobile screens and information dumps that appear before us within the blink of an eye. Just like any other God, it does not stop wars, but it does grant us the gift of knowledge.

Sophia the Robot has a sister: Little Sophia

Ruben Baart
February 7th 2019

Robot Sophia is pretty much the international face of the ‘modern robot’. Sophia is the Audrey Hepburn-inspired humanoid robot who stands out for multiple reasons. She is the first robot granted human citizenship (in Saudi Arabia) and has met more politicians and celebrities than most people would in a lifetime. Sophia is advanced, yet expensive and not very portable.

Now the superstar robot has a little sister: Little Sophia, an educational companion for ages 7-13 that gets kids excited about AI and coding. And in contrast to ‘big’ Sophia, her sister is a lot easier to use and more affordable (from $99).

The family resemblances are... uncanny (to throw a fitting metaphor on the table).

Little Sophia is something between a toy doll and a real robot, and is being developed by Hanson Robotics (the same company that develops OG Sophia). Like Sophia the robot, her younger sibling can walk, talk, sing, play games and tell jokes. She’s about 35 cm tall and looks as if Robocop had a baby with a Bratz doll.

Programmable via a mobile app, she’s able to mirror the movements of her owner, making her both a fun toy and an educational tool. Promising unparalleled levels of feedback for users, the idea is to create a robot friend and tutor who will build up a lasting relationship with kids.

For now, we're applauding the developers for inspiring a broader, more inclusive generation of young kids to learn how to code.

“Our vision at Hanson Robotics is to bring robots to life,” said David Hanson, founder of Hanson Robotics, in a statement. "Robots will soon be everywhere. How can we nurture them to be our friends and useful collaborators?"

A vision we can only agree with (remember HUBOT).

The irresistible Little Sophia is currently up for adoption via a Kickstarter campaign.

Watch this lifelike robot fish swim through the ocean

Vanessa Bates Ramirez
January 7th 2019

Earth’s oceans are having a rough go of it these days. On top of being the repository for millions of tons of plastic waste, global warming is affecting the oceans and upsetting marine ecosystems in potentially irreversible ways.

Coral bleaching, for example, occurs when warming water temperatures or other stress factors cause coral to cast off the algae that live on them. The coral goes from lush and colorful to white and bare, and sometimes dies off altogether. This has a ripple effect on the surrounding ecosystem.

Warmer water temperatures have also prompted many species of fish to move closer to the north or south poles, disrupting fisheries and altering undersea environments.

To keep these issues in check or, better yet, try to address and improve them, it’s crucial for scientists to monitor what’s going on in the water. A paper released last week by a team from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) unveiled a new tool for studying marine life: a biomimetic soft robotic fish, dubbed SoFi, that can swim with, observe, and interact with real fish.

Watch SoFi in action: https://youtu.be/Dy5ZETdaC9k

SoFi isn’t the first robotic fish to hit the water, but it is the most advanced robot of its kind. Here’s what sets it apart.

It swims in three dimensions

Up until now, most robotic fish could only swim forward at a given water depth, advancing at a steady speed. SoFi blows older models out of the water. It’s equipped with side fins called dive planes, which move to adjust its angle and allow it to turn, dive downward, or head closer to the surface. Its density and thus its buoyancy can also be adjusted by compressing or decompressing air in an inner compartment.

“To our knowledge, this is the first robotic fish that can swim untethered in three dimensions for extended periods of time,” said CSAIL PhD candidate Robert Katzschmann, lead author of the study. “We are excited about the possibility of being able to use a system like this to get closer to marine life than humans can get on their own.”

The team took SoFi to the Rainbow Reef in Fiji to test out its swimming skills, and the robo fish didn’t disappoint—it was able to swim at depths of over 50 feet for 40 continuous minutes. What keeps it swimming? A lithium polymer battery just like the one that powers our smartphones.
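The article doesn’t include SoFi’s control code, but the dive-plane and buoyancy scheme described above amounts to a feedback loop over the air volume in the inner compartment. A hypothetical proportional controller, with all names and numbers invented for illustration:

```python
def buoyancy_step(depth_m, target_depth_m, air_ml, gain=2.0, min_ml=0.0, max_ml=30.0):
    """One control step: compressing air (less volume, higher density) makes
    the robot sink; decompressing makes it rise. Output is clamped to the
    compartment's physical limits."""
    error = target_depth_m - depth_m   # positive -> robot is too shallow
    air_ml -= gain * error             # shed air volume to descend
    return max(min_ml, min(max_ml, air_ml))
```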

It’s remote-controlled… by Super Nintendo

SoFi has sensors to help it see what’s around it, but it doesn’t have a mind of its own yet. Rather, it’s controlled by a nearby scuba-diving human, who can send it commands related to speed, diving, and turning. The best part? The commands come from an actual repurposed (and waterproofed) Super Nintendo controller. What’s not to love?

SoFi close-up with its remote control. Image Credit: MIT CSAIL

Previous robotic fish built by this team had to be tethered to a boat, so the fact that SoFi can swim independently is a pretty big deal. Communication between the fish and the diver was most successful when the two were less than 10 meters apart.

It looks real, sort of

SoFi’s side fins are a bit stiff, and its camera may not pass for natural—but otherwise, it looks a lot like a real fish. This is mostly thanks to the way its tail moves; a motor pumps water between two chambers in the tail, and as one chamber fills, the tail bends towards that side, then towards the other side as water is pumped into the other chamber. The result is a motion that closely mimics the way fish swim. Not only that, the hydraulic system can change the water flow to get different tail movements that let SoFi swim at varying speeds; its average speed is around half a body length (21.7 centimeters) per second.

Besides looking neat, it’s important that SoFi look lifelike so it can blend in with marine life rather than scare real fish away, letting it get close enough to observe them.

“A robot like this can help explore the reef more closely than current robots, both because it can get closer more safely for the reef and because it can be better accepted by the marine species,” said Cecilia Laschi, a biorobotics professor at the Sant'Anna School of Advanced Studies in Pisa, Italy.

Just keep swimming

It sounds like this fish is nothing short of a regular Nemo. But its creators aren’t quite finished yet.

They’d like SoFi to be able to swim faster, so they’ll work on improving the robo fish’s pump system and streamlining its body and tail design. They also plan to tweak SoFi’s camera to help it follow real fish.

“We view SoFi as a first step toward developing almost an underwater observatory of sorts,” said CSAIL director Daniela Rus. “It has the potential to be a new type of tool for ocean exploration and to open up new avenues for uncovering the mysteries of marine life.”

The CSAIL team plans to make a whole school of SoFis to help biologists learn more about how marine life is reacting to environmental changes.

Image Credit: MIT CSAIL

This article originally appeared on Singularity Hub, a publication of Singularity University.

Your next doctor might just be a robot

Jack Caulfield
December 6th 2018

Conventional wisdom says that you can’t replace the human touch in terms of medical care, but in our rapidly changing technological environment, it appears that this perception might be changing. Meet the robotics innovations around the world that are shaping tomorrow’s hospitals.

Doctor da Vinci?

The da Vinci Surgical System has been around for a while now, and you may already know it. If you’ve had surgery in the last few years, you may even have encountered it yourself.

The system is not an autonomous robot, but rather a highly advanced set of robotic arms, capable of holding tools, which are operated by the surgeon through a master control panel. The lack of need for direct manual control allows for a high degree of precision – something that is crucial for surgery.

The da Vinci System also allows for remote control. Since nobody needs to directly touch the tools, medical professionals have experimented with using the machine to perform surgery from a long distance. In 2001, the first transatlantic surgical procedure, Operation Lindbergh, was completed, with a surgeon in New York treating a patient all the way in Strasbourg!

This kind of so-called “telesurgery” is not the normal use of the system, but it points the way towards what could be the future of robotics in healthcare: remote treatments, and even autonomous robotic doctors.

Will you soon be treated by Surgeon-Bot 3000? Probably not, but certain developments do point towards these kinds of possibilities. Let’s take a look at the evolution of robotics in the years since the introduction of the da Vinci System.

Watson and friends

Take Watson, the AI system designed by tech company IBM. This system, whose other jobs include air-traffic controller, fashion designer, and farmer, is now breaking into the world of medicine. Watson has been assisting medical staff with the straightforward tasks that often distract from their more specialized work.

Answering patients’ basic questions and adjusting thermostats do not exactly require a medical degree, but hospital employees still usually have to spend a lot of their time on these tasks. Now Watson, in combination with a voice command system, can take over these basic tasks, leaving the staff free to focus on the human element.

But Watson has also been put to work on more complicated medical tasks. Japanese doctors were having trouble identifying exactly what was wrong with one of their patients. After trying all the conventional methods of diagnosis, they were stumped. So they asked Watson for help.

The AI was able to compare the patient’s genetic information to millions of other data points, and in just ten minutes diagnosed the rare form of leukemia from which she was suffering. The doctors claim that Watson effectively saved the woman’s life.

Elsewhere, voice command systems like Amazon’s Alexa are finding myriad uses in the world of healthcare. These systems can be used to deliver information from doctors to patients, and vice versa, as well as to remind staff of safety procedures. And in the UK, Google’s DeepMind technology is being put to use within the NHS.

But all these innovations lead towards the question: Can a robot truly be a medical professional? Chinese authorities, currently pushing hard on AI research in an effort to compete internationally, think the answer is yes.

Meet Xiaoyi, the only robot with a medical license

The artificial intelligence company iFlytek designed a robot named Xiaoyi, and it has just passed its medical licensing exam. And Xiaoyi didn’t just scrape by: it passed the test with flying colors, scoring 456 points when only 360 are required to pass.

Xiaoyi is an AI-powered robot designed to collect and analyze patient data. Representatives of iFlytek say that despite its newly acquired medical license, their robot is still designed to play a secondary role in the medical process. They aim to use Xiaoyi to improve efficiency in the Chinese medical system, claiming that it’s a much-needed aid for overstretched practitioners in many of China’s rural areas.

Tomorrow’s hospital

Xiaoyi doesn’t launch officially until next March, but its receipt of a medical license is something of a watershed moment for the application of robotics in healthcare. Taken alongside Watson, the da Vinci System, and other robotics innovations, it’s hard to avoid the conclusion that robots, autonomous or human-operated, are the next step in modern healthcare.

What might a trip to the hospital look like in a decade? They say nursing requires a human touch, but as our HUBOT project showed, this doesn’t necessarily preclude robotics. We’ve seen that the robotization of hospital administration and data entry is already beginning. And soon enough, patients in China might consult with Dr. Xiaoyi rather than their regular GP. How long until we see a fully autonomous da Vinci System?

In our next hospitals, we may be cared for by robots just as much as by humans. Will the nurses of the future be digital too?

Your Next Nature guide to Dutch Design Week 2018

Meike Schipper
October 15th 2018

Over time, our bodies, our food and our environment have become more and more subject to design. As designers, we hold the responsibility, and have the unique chance, to envision the world in order to decide which future we want. Because if not us, then who?

This weekend, the Dutch Design Week 2018 opens in Eindhoven. In more than a hundred locations across the city, it presents the work and ideas of more than 2600 designers to more than 335,000 visitors from home and abroad. The Next Nature guide to DDW18 captures how technology is helping shape our world into a better, healthier and maybe even more next natural place. Here are 5 projects that you should not miss:

Chronic Health

As we increasingly seek to be healthier and happier, the boundaries between cure and enhancement are fading. We reach out to technological inventions and their possibilities to stretch the boundaries of human life. The exhibition Chronic Health: If not us, then who? by the Embassy of Health explores the future of healthcare and stimulates collaboration between health professionals, policymakers, patients and designers.

As part of the exhibition, Next Nature Network presents a speculative design proposal for an artificial womb in collaboration with Máxima Medical Centre (MMC). This unique collaboration is part of an ongoing research project by Next Nature Network into the impact of technology on the future of biological reproduction, intimacy and relationships, under the working title REPRODUTOPIA: how will we make babies, experience intimacy and build families in the future? In the future, artificial wombs could replace incubators, as they mimic the natural environment of the female uterus. But what will these 'hatcheries' look like?

Chronic Health takes place at Innovation Powerhouse.

Augmented Nature 

A next generation high-tech biologging tag on a whale.

Due to our destructive human behavior, we are now living through the 6th mass extinction. As this damaging behavior appears immutable, a team of designers and engineers from the Royal College of Art and Imperial College in London proposes an animal-centered design approach: "[It's] an approach in which success is measured in biodiversity and humans acknowledge that they're part of the animal kingdom." The innovative project Augmented Nature augments animals with technology to save man's best friends from mass extinction.

Augmented Nature is on show at Klokgebouw, Hall 2.

Frankenstein 

DDW18 is centered around the statement that we design our future: because if not us, then who? Perhaps the answer lies in artificial intelligence. Will algorithms design our future for us, with us or against us? The Frankenstein exhibition questions our responsibility over the technological systems that we create, and it speculates on the influence of the non-human other. The boundary between creator and creation, and thus the difference between the born and the made, may disappear. To date, the consequences are unknown. Will artificial life outsmart us in the future?

The exhibition Frankenstein (open all week) is part of the Frankenstein Symposium, moderated by our editor-in-chief Ruben Baart. The symposium takes place on October 22nd at Natlab.

Robot Love

World's first six-handed massage with a human touch. Meet the Shiva Therapist.

In the extensive and interactive exhibition Robot Love, artists, designers and scientists explore the relation between humans and robots. What happens when humans and machines fuse? Robots often stir fear: fear of being overruled, fear of losing our jobs, or a feeling of anthropomorphobia. Still, society seems fascinated by them. In a way, Robot Love poses the crucial question: what exactly does it mean to be human, rather than a machine? Among the works are the Shiva Therapist from our HUBOT office, the film Renderlands by speculative architect Liam Young and the tickle massage project by Driessens & Verstappen.

Robot Love takes place at the Melkfabriek.

Clean Revolution

Insectology, Atelier Boelhouwer

Reducing the amount of waste, minimizing the use of non-renewable resources, closing the production cycle: the future of design is crammed with sustainability challenges. Clean Revolution presents a collection of Dutch designers who seek to transform trash into treasure and contribute to a circular economy. The exhibition is the perfect opportunity to spot innovative thinking and gain insight into modern-day design challenges. It features work from Teresa van Dongen, Boyan Slat, Shahar Livne and Lightyear, to name a few.

Clean Revolution is on show at Veem, Floor 3.

Visit the Dutch Design Week from 20-28 October in Eindhoven. Want more? Follow us on Instagram, where we'll feature the most inspiring #nextnature projects at DDW18 in the coming week!

Give AI curiosity and it plays video games all day

Tristan Greene
October 9th 2018

If you teach a robot to fish, it’ll probably catch fish. However, if you teach it to be curious, it’ll just watch TV and play video games all day. Researchers from OpenAI — the singularity-focused think-tank co-founded by Elon Musk — recently published a research paper detailing a large-scale study on curiosity-driven learning. In it, they show how AI models trained without “extrinsic rewards” can develop and learn skills. Basically, they’ve figured out how to get AI to do stuff without explicitly telling it what its goals are.

According to the team’s white paper, "this is not as strange as it sounds. Developmental psychologists talk about intrinsic motivation (i.e., curiosity) as the primary driver in the early stages of development: Babies appear to employ goal-less exploration to learn skills that will be useful later on in life. There are plenty of other examples, from playing Minecraft to visiting your local zoo, where no extrinsic rewards are required."

The idea here is that if we can get machines to explore environments without human-coded rewards built in, we’ll be that much closer to truly autonomous machines. This could have incredible implications for things such as the development of rescue robots, or exploring space.
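The paper’s core mechanism is to reward prediction error: the agent learns a forward model of what happens next, and transitions it cannot yet predict pay out an intrinsic bonus. Here is a minimal numpy sketch of that idea; the random feature embedding, shapes and learning rate are invented stand-ins for the paper’s feature network and forward-dynamics model, not OpenAI’s actual code.

```python
import numpy as np

rng = np.random.default_rng(0)
EMBED = rng.normal(size=(16, 4))   # stand-in feature embedding: 16-d observation -> 4-d features
W = np.zeros((5, 4))               # learnable forward model: (features, action) -> next features

def intrinsic_reward(obs, action, next_obs, lr=0.01):
    """Curiosity bonus: squared error of the forward model's guess at the
    next state's features. Surprising transitions pay a large bonus; once
    a transition is learned, the bonus fades and the agent moves on."""
    global W
    phi, phi_next = obs @ EMBED, next_obs @ EMBED
    x = np.append(phi, action)     # model input: current features plus the action taken
    err = x @ W - phi_next         # prediction error in feature space
    W -= lr * np.outer(x, err)     # one gradient step, so the model slowly catches up
    return float(err @ err)
```

One property worth noticing: a source of pure randomness never becomes predictable under this definition, so its bonus never fades. That is exactly the “Noisy TV Problem” the researchers describe further down.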

To study the effects of intrinsically-motivated deep learning, the researchers turned to video games. These environments are perfectly suited for AI research due to their inherent rules and rewards. Developers can tell AI to play, for example, Pong, and give it specific conditions like “don’t lose,” which would drive it to prioritize scoring points (theoretically).

When the researchers conducted experiments in the Atari, Super Mario Bros. and Pong environments, they found that agents without goals were capable of developing skills and learning, though sometimes the results got a bit… interesting.

The curiosity-driven agent kind of sets its own rules. It’s motivated to experience new things. So, for example when it plays Breakout – the classic brick-breaking game – it performs well because it doesn’t want to get bored:

"The more times the bricks are struck in a row by the ball, the more complicated the pattern of bricks remaining becomes, making the agent more curious to explore further, hence, collecting points as a bi-product. Further, when the agent runs out of lives, the bricks are reset to a uniform structure again that has been seen by the agent many times before and is hence very predictable, so the agent tries to stay alive to be curious by avoiding reset by death."

The AI passed 11 levels of Super Mario Bros., just out of sheer curiosity, indicating that with enough goal-free training sessions an AI can perform quite exceptionally.

It’s not all good in the artificially intelligent neighborhood, however: curious machines suffer from the same kinds of problems that curious people do. They’re easily distracted. When researchers pitted two curious Pong-playing bots against one another, they forwent the match and decided to see how many volleys they could achieve together.

The research team also tested out a common thought-experiment called the “Noisy TV Problem.” According to the team’s white paper:

"The idea is that local sources of entropy in an environment like a TV that randomly changes channels when an action is taken should prove to be an irresistible attraction to our agent. We take this thought experiment literally and add a TV to the maze along with an action to change the channel."

It turns out they were right: there was a significant dip in performance when the AI tried to run a maze and found a virtual TV. These curious machine learning agents seem to be the most human-like AI we’ve come across yet. What does that say about us?

This story is published in partnership with The Next Web. Read the original piece here.
