488 results for “Humane Technology”

Humane Technology: technology that adapts to humans, rather than forcing humans to adapt to it. It works with our bodies, senses, and instincts, and takes human values as a cornerstone of its development.

Five ways AI could make your car as smart as a human passenger

Max Eiza
January 6th 2020

Driving long distances without a passenger can be lonely. If you’ve ever done it, you might have wished for a companion to talk to – someone emotionally intelligent who can understand you and help you on the road. The disembodied voice of SatNav helps to fill the monotonous silence, but it can’t hold a conversation or keep you safe.

Research on driverless cars is well underway, but less is heard about the work being done to make cars a smart …

These speculative ‘Next Senses’ allow you to augment your senses with technology

NextNature.net
December 30th 2019

Imagine you could communicate telepathically with a whale, listen to the WiFi networks in your environment, or experience smells through seeing color. Developments in technology give us the rare opportunity to expand and augment our sensorial capabilities, and relate to other (non-)human life forms in various hybrid forms.

The world we live in changes constantly, but the senses we use to perceive it remain the same. Next Senses explores the uncharted territory of how we could experience the world with …

The beginner’s guide to biohacking

Peter Joosten
November 19th 2019

What is the first thing that comes to mind when you hear the term 'biohacking'? Perhaps you are now thinking of a bunch of kids sitting in their kitchen with a DNA kit, (wannabe) cyborgs inserting subcutaneous chips in their bodies, or perhaps a person striving for optimum performance through a perfect lifestyle. These are all types of biohacking, but there's more to it. Here's what you need to know (and were too afraid to ask).

Bulletproof coffee

The problem …

Should men be able to give birth to children?

NextNature.net
October 30th 2019

Within a few years, it may be possible for premature babies to grow inside an artificial womb. And when that day arrives, should men be able to give birth to children? Should we externalize pregnancy with artificial wombs? And are these feminist dreams or Frankenstein nightmares? Welcome to Reprodutopia, a debate on our reproductive futures.

A new narrative

For a long time the birds and the bees served us well to explain where our children come from. Yet radical developments …

Truly smart homes could help dementia patients live independently

Dorothy Monekosso
October 28th 2019

You might already have what’s often called a “smart home”, with your lights or music connected to voice-controlled technology such as Alexa or Siri. But when researchers talk about smart homes, we usually mean technologies that use artificial intelligence to learn your habits and automatically adjust your home in response to them. Perhaps the most obvious examples of this are thermostats that learn when you are likely to be home and what temperature you prefer, and adjust themselves accordingly without …

Watch: BBC reports on world’s first artificial womb for humans

NextNature.net
October 21st 2019

The team of researchers at the Eindhoven University of Technology (with whom we previously collaborated to design a prototype for an artificial womb) has been awarded a €2.9 million grant to develop a working prototype of their artificial womb. The breakthrough raises ethical questions about the future of baby making on a global scale.

Therefore the BBC caught up with NNN designer Lisa Mandemaker, as part of their BBC 100 Women of 2019, on what it means to …

Artificial Womb receives €2.9m funding to develop prototype

Freya Hutchings
October 8th 2019

Hooray! The team of researchers at the Eindhoven University of Technology (whom we previously collaborated with to design a prototype for an artificial womb) has been awarded a €2.9 million grant to develop a working prototype of their artificial womb.

Artificial womb: a brief explainer

The artificial womb would provide premature babies with artificial respiration in conditions close to a biological womb. Oxygen and nutrients would be delivered to the baby through an umbilical cord-like tube. Inside, the baby would …

This brain-controlled exoskeleton allows a paralyzed man to walk again

Freya Hutchings
October 7th 2019

A breakthrough technology that responds to signals from the brain has transformed the life of a paralyzed 28-year-old man called Thibault. Four years after the initial incident that paralyzed him, Thibault is walking again. Thanks to a combination of revolutionary technology and immense brain power, he is able to operate a full-body exoskeleton.

“When you are in my position, when you can’t do anything with your body […] I wanted to do something with my brain,” Thibault said.

This is …

How technology bridges the generational communication gap

Freya Hutchings
September 28th 2019

Emoji, Skype, Selfies - can these communication technologies close the generation gap? In part, yes! Young people are teaching senior citizens how to use technology, and it’s benefiting both groups. Here’s how.

Sharing knowledge about technology can form both a means and an end for more meaningful connections between the elderly and the young. A growing number of initiatives are recognizing the huge potential of bringing different generations together - from reducing feelings of isolation and boredom amongst the elderly, …

Smart cities could give the visually impaired a new outlook on urban life

Drishty Sobnath and Ikram Ur Rehman
September 17th 2019

Travelling to work, meeting friends for a catch up or just doing some shopping are often taken for granted by people with no known disabilities. For the visually impaired, these seemingly simple things can be a serious challenge.

But imagine a city equipped with technology that enables the visually impaired to recognise people, places or even bank notes, helping them to live more independently whether indoors or in a public place. That’s the promise of so-called smart cities, which use …

Five ways AI could make your car as smart as a human passenger

Max Eiza
January 6th 2020

Driving long distances without a passenger can be lonely. If you’ve ever done it, you might have wished for a companion to talk to – someone emotionally intelligent who can understand you and help you on the road. The disembodied voice of SatNav helps to fill the monotonous silence, but it can’t hold a conversation or keep you safe.

Research on driverless cars is well underway, but less is heard about the work being done to make cars a smart companion for drivers. In the future, the cars still driven by humans are likely to become as sensitive and attentive to their driver’s needs as another person. Sound far-fetched? It’s closer than you might think.

1. Ask your car questions

We’re already familiar with AI in our homes and mobile phones. Siri and Alexa answer questions and find relevant search items from around the web on demand. The same will be possible in cars within the near future. Mercedes are integrating Siri into their new A-class car. The technology can recognise the driver’s voice and their way of speaking – rather than just following a basic set of commands, the AI could interpret meaning from conversation in the same way another person could.

2. From the screen to your drive

Those with longer memories may remember a talking car that was a regular on TV: Knight Rider’s super-intelligent KITT was a self-aware car that was fiercely loyal to Michael, its driver. Though KITT’s mounted flame thrower and bomb detector might not make it into commercial vehicles, drivers could talk to their cars through a smart band on their wrists. The technology is being developed to allow people to start their car before they reach it, to warm the seats, to set the destination on the navigation system, flash the lights, lock the doors and sound the horn – all from a distance with voice command.

3. Big Motor is watching you

A driver alert system already exists that, through a series of audible alerts and vibrations, tries to keep the driver awake or warn them against sudden lane departure. By 2021 though, there are plans to install in-car cameras to monitor a driver’s behaviour.

If the driver looked away from the road for a period of time, or appeared drunk or sleepy, the car would take action. This might start with slowing down and alerting a call centre for someone to check on the driver, but if the driver didn’t respond, the car could take control, slow down and park in a safe place. The potential to improve road safety is promising, but there are credible concerns for what in-car cameras could mean for individual privacy.
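
To make that escalation concrete, here is a minimal sketch of how such a monitoring loop might decide what to do. The thresholds, state fields and action names are illustrative assumptions, not any manufacturer's actual system.

from dataclasses import dataclass
from enum import Enum, auto


class Action(Enum):
    NONE = auto()            # driver is attentive, do nothing
    SLOW_AND_ALERT = auto()  # slow down and notify a call centre
    PARK_SAFELY = auto()     # take control and stop in a safe place


@dataclass
class DriverState:
    eyes_off_road_s: float    # seconds the camera has seen the driver look away
    drowsiness: float         # 0.0 (alert) to 1.0 (asleep), estimated from the camera
    responded_to_alert: bool  # did the driver react to the first warning?


def decide(state: DriverState) -> Action:
    """Escalate in two steps: warn first, intervene only if the driver stays unresponsive."""
    inattentive = state.eyes_off_road_s > 3.0 or state.drowsiness > 0.7
    if not inattentive or state.responded_to_alert:
        return Action.NONE
    if state.eyes_off_road_s > 10.0 or state.drowsiness > 0.9:
        return Action.PARK_SAFELY
    return Action.SLOW_AND_ALERT


print(decide(DriverState(eyes_off_road_s=5.0, drowsiness=0.4, responded_to_alert=False)))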

4. A cure for road rage

Increasingly intelligent and perceptive cars won’t stop at visual cues. An AI assistant has been developed which can pick up on the driver’s mood and well-being by detecting their heart rate, eye movements, facial expressions and the tone of their voice. It’s suggested the car would learn the driver’s habits and interact with them by, for example, playing the driver’s favourite music to calm them down. It can also suggest some nice places to go – perhaps a nearby café or park – where the driver could stop to improve their state of mind.

5. A butler on the road

As technology is developed to monitor the mood of drivers, the next step may be cars which can act to improve them. Autonomous vehicles which can take over driving when drivers are stressed could change the windscreen display to show photographs or peaceful scenes. Smart glass windscreens could even black out the surroundings entirely to create a tranquil space – known tentatively in ongoing research as “cocoon mode” – where the interior is invisible from outside and the occupants can rest while the car drives. Cars might even dispense snacks and drinks on demand from refrigerated cartridges, using technology that’s under development but not scheduled to make its debut until 2035.

Whether for good or ill, cars are likely to change beyond recognition in the near future. It may no longer be ridiculous to think that the wildest science fiction dreams could be driving us to work in the not so distant future.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

[post_title] => Five ways AI could make your car as smart as a human passenger [post_excerpt] => [post_status] => publish [comment_status] => open [ping_status] => closed [post_password] => [post_name] => five-ways-ai-could-make-your-car-as-smart-as-a-human-passenger [to_ping] => [pinged] => [post_modified] => 2020-01-06 14:59:04 [post_modified_gmt] => 2020-01-06 13:59:04 [post_content_filtered] => [post_parent] => 0 [guid] => https://nextnature.net/?p=126405 [menu_order] => 0 [post_type] => post [post_mime_type] => [comment_count] => 0 [filter] => raw [post_category] => 0 )[1] => WP_Post Object ( [ID] => 126419 [post_author] => 367 [post_date] => 2019-12-30 14:36:53 [post_date_gmt] => 2019-12-30 13:36:53 [post_content] =>

Imagine you could communicate telepathically with a whale, listen to the WiFi networks in your environment, or experience smells through seeing color. Developments in technology give us the rare opportunity to expand and augment our sensorial capabilities, and relate to other (non-)human life forms in various hybrid forms.

The world we live in changes constantly, but the senses we use to perceive it remain the same. Next Senses explores the uncharted territory of how we could experience the world with technology.

Ask yourself: if you could have another sense, which would you choose?

Next Senses, today

Some attempts have already been made to expand our perception with ‘next’ senses. Cyborg artist Neil Harbisson has a camera mounted on his head that translates colors into a vibration that he can hear, allowing him to hear colors.
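
As a rough illustration of that kind of sensory substitution, the sketch below maps the hue seen by a camera pixel onto an audible frequency. The frequency range and the linear mapping are assumptions made for illustration; they are not the calibration Harbisson actually uses.

import colorsys

LOW_HZ, HIGH_HZ = 220.0, 880.0  # assumed audible range for the mapping


def rgb_to_tone(r: int, g: int, b: int) -> float:
    """Return a frequency (Hz) representing the colour's hue, with red mapped lowest."""
    hue, _lightness, _saturation = colorsys.rgb_to_hls(r / 255, g / 255, b / 255)
    return LOW_HZ + hue * (HIGH_HZ - LOW_HZ)


for name, rgb in {"red": (255, 0, 0), "green": (0, 255, 0), "blue": (0, 0, 255)}.items():
    print(f"{name}: {rgb_to_tone(*rgb):.0f} Hz")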

Cybernetics professor Kevin Warwick has implanted sensors in the nerves of his left arm, which communicate with the sensors in his wife’s hand, allowing them to share the feeling of touch. There is also a group of people, so-called grinders or bio-hackers, who experiment with DIY magnetic implants with which they can detect electromagnetic fields.

Five augmented senses

Next Senses is an ongoing research project that consists of five future scenarios, set in parallel worlds where biology and technology have fully merged. Enjoy:

#1 Synesthesia

Synesthesia is a rare phenomenon where a sensation in one of the senses, such as smell, triggers a sensation in another, such as sight. Synesthetes can taste sounds, smell colors or even see scents. Now, synesthesia could be made widely accessible through technology. Imagine how your perception of the world would change when you can see certain scents.

#2 Electronic Empathy

For people who experience difficulties in identifying and describing feelings, the world can be a confusing place. They may experience an emotion, but are unsure which emotion it is. Electronic Empathy is a ‘third-eye’ implant that runs on facial recognition algorithms and projects its output directly onto the user’s field of vision. It may help those in need detect the emotion they were looking for in the first place.

#3 Skin Waves

While humans have sought to discover, name, chart and plot every inch of land on the planet, the deepest depths of our oceans remain unknown. Skin Waves allows its user to feel the frequencies of whale sounds coursing through their bones, enabling a multispecies connection that deepens their spiritual relationship to nature.

#4 Baby Code

The link between a parent and child is profound: both physical and emotional factors influence the parent-child bonding process, and this bond can only strengthen over time. Baby Code imagines a future in which parents can use sensor technology that allows them to cater to exactly what their newborn needs.

#5 WiFi Angels

WiFi radiation is all around, yet invisible to our human senses. Imagine you could hear WiFi. Every area has its own soundscape. Streets, parks, subways, hotels, highways and beaches all sound different. WiFi Angels allows its user to sense electromagnetic radiation by turning the WiFi networks around them into a choir of singing angels.

Looking for more? Good!

What senses can we develop to look at our world from an alternative perspective? Our talks present a richer understanding of nature. Our speakers present inspiring Stories for Change: new narratives on possible and preferable futures in which biology and technology are fusing. Curious? Get in touch!

[post_title] => These speculative 'Next Senses' allow you to augment your senses with technology [post_excerpt] => [post_status] => publish [comment_status] => open [ping_status] => closed [post_password] => [post_name] => five-next-senses [to_ping] => [pinged] => [post_modified] => 2019-12-30 14:40:09 [post_modified_gmt] => 2019-12-30 13:40:09 [post_content_filtered] => [post_parent] => 0 [guid] => https://nextnature.net/?p=126419 [menu_order] => 0 [post_type] => post [post_mime_type] => [comment_count] => 0 [filter] => raw [post_category] => 0 )[2] => WP_Post Object ( [ID] => 125855 [post_author] => 2180 [post_date] => 2019-11-19 12:27:45 [post_date_gmt] => 2019-11-19 11:27:45 [post_content] =>

What is the first thing that comes to mind when you hear the term 'biohacking'? Perhaps you are now thinking of a bunch of kids sitting in their kitchen with a DNA kit, (wannabe) cyborgs inserting subcutaneous chips in their bodies, or perhaps a person striving for optimum performance through a perfect lifestyle. These are all types of biohacking, but there's more to it. Here's what you need to know (and were too afraid to ask).

Bulletproof coffee

The problem with biohacking is that all the examples outlined above are true. Amateur biotechnologists, cyborgs and supporters of a healthy lifestyle all associate themselves with the term biohacking.

Within the latter group, which I call the lifestyle optimizers, Dave Asprey is the guru. Asprey is the frontman of the American brand Bulletproof. Among other things, this brand sells special coffee that you must mix with butter and coconut oil. The promised result: instant focus, without any sugar crash and hours of satiation.

A brief bio of biohacking

But what exactly do we mean when we speak of 'biohacking'? The term was first used in 1988 in an opinion piece for the Washington Post. The article described the possibilities to perform all kinds of technological experiments from your basement. This included DNA analysis, the cultivation of bacteria and testing the effect of viruses on fungi. Today, this definition is still dominant for the group of amateur biotechnologists.

Within the other two groups, the cyborgs and the lifestyle optimizers, biohacking is aimed at people. In using the term, the link to computers is made: just consider how computer hackers break into hardware and software, while biohackers grind their own wetware.

The cyborgs take this notion quite literally, by implanting technology into their bodies, whereas lifestyle followers believe that you can improve the human body and prevent aging with smart nutrition, health hacks and useful gadgets.

Steam engines and other metaphors

The comparison with computer technology comes from our current technological paradigm. Yet in the past, the paradigm of that time was used to look at the human body.

At the time of the Industrial Revolution, the human brain was considered a constellation of pipes, steam and drive shafts. The saying "blowing off some steam" is also a good example of how people saw themselves as, well, a kind of steam engine.

These days we see the brain often described as an algorithm or hard disk and the body as a battery that needs to recharge. Keeping this in mind, the idea of biohacking is not that strange.

Technology, after all, is what makes us human.

Shifting boundaries

Take something as simple as sight. In prehistoric times, your chances of survival were nil when suffering from poor vision. When the first glasses were made around 1200 AD, our ancestors most likely responded, “Your vision was given to you by God—why change that?"

As we have developed ourselves scientifically over time, so has our technology. Contact lenses are socially accepted today; will the same apply tomorrow to smart contact lenses with a Google Glass-like function?

And what about LASIK (laser-assisted in situ keratomileusis), commonly referred to as laser eye surgery? The technology is becoming more accessible, but how socially accepted is it to give yourself super vision like golf superstar Tiger Woods?

This is my point: ethical boundaries of what we find socially acceptable are constantly shifting. That is what biohacking is about. Glasses are no longer biohacking, but smart contact lenses are.

Thinking ahead, one may wonder: Will glasses at some point become out-dated? Will everyone have genetically modified eyes for optimum vision?

Chances are, the next generations of biohackers will be at the forefront of these technologies. Perhaps they will replace their biological eyes with bionic ones. Perhaps they will simply change their diet.

Just like our technology, biohacking (and the dreams and ideas we have of ourselves) moves along with the progress of mankind. As with other technological developments, it’s impossible to predict how it will evolve in the future. But there is one thing that we can be certain of: things will change.

[post_title] => The beginner's guide to biohacking [post_excerpt] => [post_status] => publish [comment_status] => open [ping_status] => closed [post_password] => [post_name] => the-beginners-guide-to-biohacking [to_ping] => [pinged] => [post_modified] => 2019-11-22 10:18:12 [post_modified_gmt] => 2019-11-22 09:18:12 [post_content_filtered] => [post_parent] => 0 [guid] => https://nextnature.net/?p=125855 [menu_order] => 0 [post_type] => post [post_mime_type] => [comment_count] => 0 [filter] => raw [post_category] => 0 )[3] => WP_Post Object ( [ID] => 120960 [post_author] => 367 [post_date] => 2019-10-30 11:57:08 [post_date_gmt] => 2019-10-30 10:57:08 [post_content] =>

Within a few years, it may be possible for premature babies to grow inside an artificial womb. And when that day arrives, should men be able to give birth to children? Should we externalize pregnancy with artificial wombs? And are these feminist dreams or Frankenstein nightmares? Welcome to Reprodutopia, a debate on our reproductive futures.

A new narrative

For a long time the birds and the bees served us well to explain where our children come from. Yet radical developments in reproductive technology force us to rewrite this story.

Artificial wombs, gene editing techniques and reprogramming adult cells into eggs or sperm cells are revolutionary ways for human beings to reproduce, and appear to be closer than any of us can imagine.

It’s time for a much-needed discussion about the way technology radically alters our attitude towards reproduction, gender, relationships and love in the 21st century. If we are to rewrite the human story, let’s make sure it becomes a story that benefits all.

[post_title] => Should men be able to give birth to children? [post_excerpt] => [post_status] => publish [comment_status] => open [ping_status] => closed [post_password] => [post_name] => reprodutopia-welcome [to_ping] => [pinged] => [post_modified] => 2019-10-31 20:56:38 [post_modified_gmt] => 2019-10-31 19:56:38 [post_content_filtered] => [post_parent] => 0 [guid] => https://nextnature.net/?p=120960 [menu_order] => 0 [post_type] => post [post_mime_type] => [comment_count] => 5 [filter] => raw [post_category] => 0 )[4] => WP_Post Object ( [ID] => 120970 [post_author] => 2214 [post_date] => 2019-10-28 16:30:46 [post_date_gmt] => 2019-10-28 15:30:46 [post_content] =>

You might already have what’s often called a “smart home”, with your lights or music connected to voice-controlled technology such as Alexa or Siri. But when researchers talk about smart homes, we usually mean technologies that use artificial intelligence to learn your habits and automatically adjust your home in response to them. Perhaps the most obvious examples of this are thermostats that learn when you are likely to be home and what temperature you prefer, and adjust themselves accordingly without you needing to change the settings.

My colleagues and I are interested in how this kind of true smart home technology could help people with dementia. We hope it could learn to recognise the different domestic activities a dementia sufferer carries out throughout the day and help them with each one. This could even lead up to the introduction of household robots to automatically assist with chores.

The growing number of people with dementia is encouraging care providers to look to technology as a way of supporting human carers and improving patients’ quality of life. In particular, we want to use technology to help people with dementia live more independently for as long as possible.

Dementia affects people’s cognitive abilities (things like perception, learning, memory and problem-solving skills). There are many ways that smart home technology can help with this. It can improve safety by automatically closing doors if they are left open or turning off cookers if they are left unattended. Bed and chair sensors or wearable devices can detect how well someone is sleeping or if they have been inactive for an unusual amount of time.

Lights, TVs and phones can be controlled by voice-activated technology or a pictorial interface for people with memory problems. Appliances such as kettles, fridges and washing machines can be controlled remotely.

People with dementia can also become disoriented, wander and get lost. Sophisticated monitoring systems using radio waves inside and GPS outside can track people’s movements and raise an alert if they travel outside a certain area.

All of the data from these devices could be fed into complex artificial intelligence that would automatically learn the typical things people do in the house. This is the classic AI problem of pattern matching (looking for and learning patterns from lots of data). To start with, the computer would build a coarse model of the inhabitants’ daily routines and would then be able to detect when something unusual is happening, such as not getting up or eating at the usual time.
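
A minimal sketch of that “coarse model” idea, assuming the sensor data has already been reduced to the hour at which a given activity first happens each day; the example data and the three-sigma threshold are illustrative assumptions, not a description of any deployed system.

from statistics import mean, stdev

# Hour of day at which the kitchen sensors first fired on previous days (assumed data).
history = [7.5, 8.0, 7.8, 8.2, 7.6, 8.1, 7.9]


def unusually_late(hour_today: float, past_hours: list, k: float = 3.0) -> bool:
    """Flag today's event if it is more than k standard deviations later than usual."""
    mu, sigma = mean(past_hours), stdev(past_hours)
    return hour_today > mu + k * max(sigma, 0.25)  # floor sigma so very regular routines don't over-trigger


print(unusually_late(8.3, history))   # within the normal routine -> False
print(unusually_late(11.0, history))  # breakfast far later than usual -> True, raise an alert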

A finer model could then represent the steps in a particular activity such as washing hands or making a cup of tea. Monitoring what the person is doing step by step means that, if they forget halfway through, the system can remind them and help them continue.

The more general model of the daily routine could use innocuous sensors such as those in beds or doors. But for the software to have a more detailed understanding of what is happening in the house you would need cameras and video processing that would be able to detect specific actions such as someone falling over. The downside to these improved models is a loss of privacy.

Future smart homes could include robot carers. Via Miriam Doerr Martin Frommherz/Shutterstock

The smart home of the future could also come equipped with a humanoid robot to help with chores. Research in this area is moving at a steady, albeit slow, pace, with Japan taking the lead with nurse robots.

The biggest challenge with robots in the home or care home is that of operating in an unstructured environment. Factory robots can operate with speed and precision because they perform specific, pre-programmed tasks in a purpose-designed space. But the average home is less structured and changes frequently as furniture, objects and people move around. This is a key problem which researchers are investigating using artificial intelligence techniques, such as capturing data from images (computer vision).

Robots don’t just have the potential to help with physical labour either. While most smart home technologies focus on mobility, strength and other physical characteristics, emotional well-being is equally important. A good example is the PARO robot, which looks like a cute toy seal but is designed to provide therapeutic emotional support and comfort.

Understanding interaction

The real smartness in all this technology comes from automatically discovering how the person interacts with their environment in order to provide support at the right moment. If we just built technology to do everything for people, it would actually reduce their independence.

For example, emotion-recognition software that judges someone’s feelings from their expression could adjust the house or suggest activities in response, for example by changing the lighting or encouraging the patient to take some exercise. As the inhabitant’s physical and cognitive decline increases, the smart house would adapt to provide more appropriate support.

There are still many challenges to overcome, from improving the reliability and robustness of sensors, to preventing annoying or disturbing alarms, to making sure the technology is safe from cybercriminals. And for all the technology, there will always be a need for a human in the loop. The technology is intended to complement human carers and must be adapted to individual users. But the potential is there for genuine smart homes to help people with dementia live richer, fuller and hopefully longer lives.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

[post_title] => Truly smart homes could help dementia patients live independently [post_excerpt] => [post_status] => publish [comment_status] => open [ping_status] => closed [post_password] => [post_name] => smart-homes-help-dementia-patients [to_ping] => [pinged] => [post_modified] => 2019-10-29 14:27:21 [post_modified_gmt] => 2019-10-29 13:27:21 [post_content_filtered] => [post_parent] => 0 [guid] => https://nextnature.net/?p=120970 [menu_order] => 0 [post_type] => post [post_mime_type] => [comment_count] => 0 [filter] => raw [post_category] => 0 )[5] => WP_Post Object ( [ID] => 124278 [post_author] => 367 [post_date] => 2019-10-21 14:52:00 [post_date_gmt] => 2019-10-21 13:52:00 [post_content] =>

The team of researchers at the Eindhoven University of Technology (with whom we previously collaborated to design a prototype for an artificial womb) has been awarded a €2.9 million grant to develop a working prototype of their artificial womb. The breakthrough raises ethical questions about the future of baby making on a global scale.

Therefore the BBC caught up with NNN designer Lisa Mandemaker, as part of their BBC 100 Women of 2019, on what it means to design an artificial womb.

The interview was recorded during the buildup of Reprodutopia, our latest exhibition that presents thought-provoking visions of reproductive technologies.

What? The Reprodutopia Clinic expo
When? From 9 October to 30 November 2019
Where? Droog Amsterdam

[post_title] => Watch: BBC reports on world's first artificial womb for humans [post_excerpt] => [post_status] => publish [comment_status] => open [ping_status] => closed [post_password] => [post_name] => watch-bbc-reports-artificial-womb [to_ping] => [pinged] => [post_modified] => 2019-10-21 14:52:03 [post_modified_gmt] => 2019-10-21 13:52:03 [post_content_filtered] => [post_parent] => 0 [guid] => https://nextnature.net/?p=124278 [menu_order] => 0 [post_type] => post [post_mime_type] => [comment_count] => 0 [filter] => raw [post_category] => 0 )[6] => WP_Post Object ( [ID] => 122728 [post_author] => 2194 [post_date] => 2019-10-08 15:16:32 [post_date_gmt] => 2019-10-08 14:16:32 [post_content] =>

Hooray! The team of researchers at the Eindhoven University of Technology (whom we previously collaborated with to design a prototype for an artificial womb) has been awarded a €2.9 million grant to develop a working prototype of their artificial womb.

Artificial womb: a brief explainer

The artificial womb would provide premature babies with artificial respiration in conditions close to a biological womb. Oxygen and nutrients would be delivered to the baby through an umbilical cord-like tube. Inside, the baby would be protected by a substance close to amniotic fluid.

Guid Oei, a professor at the university and a practicing gynaecologist, says that the conditions of current incubators are too harsh for premature babies born without fully developed lungs or intestines. As a result, attempts to deliver oxygen and nutrients directly to the organs often result in lasting damage and survival rates are low for babies less than 22 weeks old.

“Within five years it will be possible for a premature baby to continue to mature in an artificial womb”
Guid Oei, gynecologist

Indeed, the model is revolutionary in that “when we put the [baby’s] lungs back under water then they can develop, they can mature [...] the baby will receive the oxygen by the umbilical cord, just like in the natural womb,” Oei explains. The researchers hope that the artificial womb will be ready for use in clinics within five years.

The technology needed to create the artificial womb has been tested on lambs using so-called biobags. Lambs born at the equivalent of 23 weeks of human pregnancy continued to develop within the biobags and, after being removed, grew up normally.

The power of design

It's interesting to see how a visualization — that was initially created to spark conversation about scientific developments in reproductive technology — is now at the forefront of media reporting of the research grant.

The design was conceptualized and visualised by Next Nature designer-in-chief Hendrik-Jan Grievink, in close collaboration with the team of Guid Oei, for Dutch Design Week 2018.

The unique collaboration between Máxima Medical Centre and Next Nature Network is part of ongoing research into the impact of technology on the future of biological reproduction, intimacy and relationships: Welcome to Reprodutopia.

Want to see it for yourself? You can! The prototype is currently on display at the Reprodutopia expo in Amsterdam. During your visit, challenge and ask yourself: How will we live, love and reproduce in next nature?

What? The Reprodutopia Clinic expo
When? From 9 October to 30 November 2019
Where? Droog Amsterdam

This brain-controlled exoskeleton allows a paralyzed man to walk again

Freya Hutchings
October 7th 2019

A breakthrough technology that responds to signals from the brain has transformed the life of a paralyzed 28-year-old man called Thibault. Four years after the initial incident that paralyzed him, Thibault is walking again. Thanks to a combination of revolutionary technology and immense brain power, he is able to operate a full-body exoskeleton.

“When you are in my position, when you can’t do anything with your body […] I wanted to do something with my brain,” Thibault said.

This is where the process began. He first trained his brain by using a video game avatar to help him develop the skills needed to operate an exoskeleton - this involved a long process of completely relearning and visualizing natural movements.

Thibault’s brain signals were then recorded by two devices, implanted either side of his head, between the brain and the skin. These read his sensorimotor cortex, the part of the brain that controls motor function.

Professor Alim Louis Benabid, leader of the trial at Grenoble Alps Hospital, explains: “The brain is still capable of generating commands that would normally move the arms and legs, there’s just nothing to carry them out.” This is where technology was able to provide the final piece of the puzzle. Moving from avatar to exoskeleton, over many training sessions Thibault has covered the distance of one and a half football pitches.

Experts involved in the study say their research may lead to the production of brain-controlled wheelchairs - a possibility revolutionary for those with restricted mobility. Thibault says the trial offers “a message of hope to people like me.”

This huge achievement disrupts morbid predictions of man being controlled by technology. Instead, therapeutic uses of this kind give us a positive model for creating technologies that facilitate human agency and determination in life-enriching ways.

How technology bridges the generational communication gap

Freya Hutchings
September 28th 2019

Emoji, Skype, Selfies - can these communication technologies close the generation gap? In part, yes! Young people are teaching senior citizens how to use technology, and it’s benefiting both groups. Here’s how.

Sharing knowledge about technology can form both a means and an end for more meaningful connections between the elderly and the young. A growing number of initiatives are recognizing the huge potential of bringing different generations together - from reducing feelings of isolation and boredom amongst the elderly, to passing on to children the kind of wisdom that only comes from life experience.

Indeed, existing intergenerational care homes - where care for both the old and young takes place on the same site - celebrate how interactions between different age groups improve the mobility, lifespan and overall happiness of older people, while providing strong, caring relationships for the young.

Fostering connections between young and old

So, what about those who do not experience the benefits of daily interaction with young people? Signing up for iPad lessons may be the answer - and the assigned teacher could be a pleasant surprise! At a regular care home in the UK, school children visit on a weekly basis to teach older people how to use technology.

This simple idea surpassed all expectations - the collaborative venture into technology resulted in a mutually beneficial experience for both groups. As the children shared their technological skills, the elderly passed on their life experiences.

Between sessions, fascinated pupils were able to email residents questions about their life histories, and learn more about events such as the Second World War. In California, a mentoring scheme, ‘Teach Seniors Technology’, is showing the elderly how to swipe.

One participant, who at first struggled to even open her iPad, went on to print her own calendar of iPad paintings which she then sent to friends and family. In other cases, simple game consoles such as the Wii bring generations together in healthy competition through a mixture of virtual and physical gaming. These examples demonstrate how technology can succeed in fostering meaningful connections both on and offline.

What happens when old and young connect

Indeed, while the basics of email and Skype can help less mobile members of society keep in contact with friends and family, the real-life interactions that surround the development of such skills are equally as beneficial.

One young person, a volunteer for the US-based ‘Mentor Up’ scheme for senior citizens, stated, ‘I can honestly say I feel like I’ve learned more during these sessions than I’ve taught... for me, just talking with them and learning their stories is what draws me back every time.’

Apps, videos, games and the wealth of information accessible online can form a diverse library that both generations can draw on to share their life experiences, aspirations and spark joy. For example, one young mentor put his mentee back in contact with a childhood friend after finding his email address online.

Bridging the generational gap

Collaborations of this kind are groundbreaking, and crucially highlight how different generations have a lot to offer each other. Often elderly people seek social connections and a sense of purpose, while in many cases young people are less judgemental and open to new experiences. As explained above, it seems technology can act as a middle ground for realizing these needs, and can form a bridge for generational gaps.

The everyday impact of collaborative online explorations is promising: since residents of the UK care home were introduced to iPads and virtual headsets, ‘the need for antipsychotic drugs has all but disappeared, and emergency ambulance calls have fallen 29%.’ It seems that these initiatives form just the start of a different approach to caring for the elderly, essential at a time when Europe’s population is getting older.

These benefits do not exclude the young - a 2016 Stanford report concluded that ‘aging adults play critical roles in the lives of young people, especially the most vulnerable in society.’ Certainly, seeing children as our future should not involve consigning older generations to the past - the elderly play a crucial role in shaping what our society will become.

It seems, when thinking about the possibilities of technology, we should not forget the meaningful connections between people that surround it. In this case, technology is a site at which sections of society can form bonds and enrich each other's lives.

In both virtual and physical worlds, interactions of this kind improve wellbeing in powerful and mutually beneficial ways. After all, every generation has grown up with technology. This leads us to wonder whether we can imagine a future in which we grow along with our technology and find joy in its ability to bring people together, both on and offline.

Smart cities could give the visually impaired a new outlook on urban life

Drishty Sobnath and Ikram Ur Rehman
September 17th 2019

Travelling to work, meeting friends for a catch up or just doing some shopping are often taken for granted by people with no known disabilities. For the visually impaired, these seemingly simple things can be a serious challenge.

But imagine a city equipped with technology that enables the visually impaired to recognise people, places or even bank notes, helping them to live more independently whether indoors or in a public place. That’s the promise of so-called smart cities, which use things like internet-connected devices and artificial intelligence to improve services and the quality of life for their residents.

For example, the visually impaired could hugely benefit from a smart city’s enhanced transport system. “Virtual Warsaw”, a smart city project in Poland’s busy capital, is based on cutting edge technologies and aims to provide a set of “eyes” to those who have visual problems.

The city has developed a network of beacon sensors to help the visually impaired move around independently. These are small, low-cost transmitters that can be fitted to buildings and send real-time information about the surroundings to people’s phones via Bluetooth. This can include the location of building entrances, bus stops, or even empty seats on a bus or where to queue in municipal buildings.
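
In outline, the phone side of such a beacon network can be very simple: each beacon broadcasts an identifier over Bluetooth, and an app looks up a spoken description for the strongest signal it hears. The identifiers and descriptions in the sketch below are invented for illustration; a real deployment would use the city's own registry and a text-to-speech engine.

# Assumed registry mapping beacon IDs to descriptions of the surroundings (illustrative).
BEACON_INFO = {
    "beacon-001": "Main entrance of the town hall; the door is two metres ahead.",
    "beacon-002": "Bus stop for line 10; the next departure is in four minutes.",
    "beacon-003": "Ticket-office queue starts here; keep to the right.",
}


def announce(sightings: dict) -> str:
    """Return the description for the strongest (i.e. closest) beacon in range."""
    if not sightings:
        return "No beacons in range."
    nearest = max(sightings, key=sightings.get)  # signal strength in dBm, higher = closer
    return BEACON_INFO.get(nearest, "Unknown location.")


# Example: signal strengths as reported by the phone's Bluetooth scan.
print(announce({"beacon-001": -80.0, "beacon-002": -55.0}))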

In 2018, Dubai ran a pilot scheme involving an iPhone app that can convert written information in metro stations into audio instructions, helping users navigate from the entrance to the ticket machine, gate, platform and carriage.

Once travellers have arrived at their destination, smart cities can help them navigate public spaces. Simply providing better connectivity for smartphones is a good start, for example by fitting buildings with 5G-enabled small cells instead of relying on traditional masts for signal.

This would enable the visually impaired to make better use of smartphone apps such as Seeing AI and Blind Square, which can describe surroundings or give audio directions to users. Google is also developing a platform called Lookout, which uses a camera to help people identify money or recognise the colour of objects.

Better connected for real-time navigation. Via Diego Cervo/Shutterstock

But smart cities can go further with public technology. For example, they could provide automated information points with tactile maps or audio systems describing the surrounding location. If these included a camera that users can point at different buildings and other aspects of the environment, then image recognition, an application of artificial intelligence, could recognise these objects and describe them to the user.

Similarly, shopping malls could be equipped with product-recognition devices to allow shoppers to compare products in shops. These could come in the form of simple clips that can be added on top of any pair of glasses and can identify and describe a product to a user.

Smart buildings

Smart city technology can also help inside buildings. One existing example is voice-controlled home assistant technology such as Amazon Echo (Alexa) and Google Home, which can already be used to operate locks, lights and appliances or add items to a shopping list. But we should also expect home automation to go further, with sensors used to open windows and close curtains in response to changing weather conditions, and even to help people find lost objects.

Smart cities will revolutionise how people live, communicate or shop, especially for visually impaired people. We are now starting to witness the emergence of smart cities such as in Dubai, Singapore, New York and Warsaw. However, the adoption of smart city technology is still in its infancy, which is why the European Union is investing up to €1 billion in supporting projects in around 300 cities.

A recent review by professional services firm PwC found that smart city development is expected to increase steadily around the world over the next seven years, creating a US$2.5 trillion market by 2025. Urban development is growing at the fastest rate in human history. Smart city technology can help to meet some of the expectations of urban development that are growing just as fast.

This article is republished from The Conversation under a Creative Commons license. Read the original article.
