174 results for “Physical-computing”

MIT’s new voiceless interface can read the words in your head

Vanessa Bates Ramirez
October 17th 2018

The way we interact with the technology in our lives is getting progressively more seamless. If typing terms or addresses into your phone wasn’t easy enough, now you can just tell Siri to do the search or pull up the directions for you. Don’t feel like getting off the couch to flick a switch, or want your house to be lit up by the time you pull into your driveway? Just tell your Echo home assistant what you want, and …

Intimate Technology S01E07: Our Selves

NextNature.net
November 21st 2017
We think of selfhood as residing in our everyday stream of consciousness. But could we possibly have a second digital self in our online behaviors? Watch this episode of our Intimate Technology series.

Will We Share .thought Files?

Mathilde Nakken
October 10th 2016
With the evolving technique of electroencephalography (EEG), we can measure brain activity and ultimately even read the brain.

Caress of the Gaze

Van Mensvoort
October 16th 2015
Designer Behnaz Farahi envisions an interactive 3D-printed outfit that can detect the gaze of another person and respond with life-like behavior.

A Wooden Floor that Follows the Curves of the Tree it’s Made of

Van Mensvoort
September 16th 2015
A wooden floor that follows the curves of the tree it's made of, thanks to smart milling techniques.

Hacking And Controlling Cars Remotely

Robin Bergman
July 24th 2015
Two American hackers have been working on hacking cars to take over full control of vehicles.

Google's Natural Tracking Technology

Van Mensvoort
June 12th 2015
We may soon all be making natural gestures to interact with our digital devices.

Hunger Games for Robots

Yunus Emre Duyar
May 10th 2015
How would you feel if robots inherited the ethical complications of existence?

3D Holography: the Future of Medicine

Yunus Emre Duyar
April 25th 2015
A new technology aims to provide doctors with a true hologram of organs.

Hyperform: the Future of 3D Printing

Louise Huyghebaert
February 13th 2015

"However, while 3D printers are becoming increasingly accessible and capable of rivalling the quality of professional equipment, they are still inherently limited by a small print volume, placing severe constraints on the type and scale of objects we can create." Says designer Marcelo Coelho. With a very smart construction strategy in mind, Coelho developed together with designer and technologist Skylar Tibbits an algorithmic software named Hyperform.

The algorithm transforms a desired form - possibly bigger than the printer's build volume …

MIT's new voiceless interface can read the words in your head

Vanessa Bates Ramirez
October 17th 2018

The way we interact with the technology in our lives is getting progressively more seamless. If typing terms or addresses into your phone wasn’t easy enough, now you can just tell Siri to do the search or pull up the directions for you. Don’t feel like getting off the couch to flick a switch, or want your house to be lit up by the time you pull into your driveway? Just tell your Echo home assistant what you want, and presto—lights on.

Engineers have been working on various types of brain-machine interfaces to take this seamlessness one step further, be it by measuring activity in the visual cortex to recreate images, or training an algorithm to "speak" for paralyzed patients based on their brain activation patterns.

At the Association for Computing Machinery's Intelligent User Interfaces (IUI) conference in Tokyo, a team from MIT Media Lab unveiled AlterEgo, a wearable interface that "reads" the words users are thinking—without the users having to say anything out loud.

Video: https://youtu.be/RuUSc53Xpeg

If you thought Google Glass was awkward-looking, AlterEgo’s not much sleeker; the tech consists of a white plastic strip that hooks over the ear and extends below the jaw, with an additional attachment placed just under the wearer’s mouth. The strip contains electrodes that pick up neuromuscular signals, which are released when the user thinks of a certain word, silently "saying" it inside his or her head. A machine learning system then interprets the signals and identifies which words the user had in mind—and, amazingly, it does so correctly 92 percent of the time.
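
The article doesn't spell out the model, but the pipeline it implies is straightforward: slice the electrode recordings into short windows, reduce each window to simple signal features, and classify the result against a small vocabulary. Below is a minimal, hypothetical sketch in Python; the channel count, sampling rate, feature set, and nearest-centroid classifier are illustrative assumptions, not details of the MIT system.

```python
import numpy as np

# Hypothetical parameters (not from the MIT paper): 7 electrode
# channels sampled at 250 Hz, 1-second windows, a 20-word vocabulary.
N_CHANNELS, SAMPLE_RATE, VOCAB_SIZE = 7, 250, 20

def extract_features(window: np.ndarray) -> np.ndarray:
    """Reduce one (channels x samples) window of neuromuscular signals
    to simple per-channel statistics often used for surface EMG:
    mean absolute value, RMS energy, and zero-crossing count."""
    mav = np.abs(window).mean(axis=1)
    rms = np.sqrt((window ** 2).mean(axis=1))
    zero_crossings = (np.diff(np.sign(window), axis=1) != 0).sum(axis=1)
    return np.concatenate([mav, rms, zero_crossings])

# Toy training set: random noise standing in for recorded windows,
# each labeled with the index of the word being subvocalized.
rng = np.random.default_rng(0)
windows = rng.standard_normal((200, N_CHANNELS, SAMPLE_RATE))
labels = rng.integers(0, VOCAB_SIZE, size=200)
X = np.stack([extract_features(w) for w in windows])

# Nearest-centroid classifier: average the feature vectors per word,
# then label a new window by its closest centroid. This is a stand-in
# for the neural network the article describes.
centroids = np.stack([X[labels == k].mean(axis=0) for k in range(VOCAB_SIZE)])

def classify(window: np.ndarray) -> int:
    f = extract_features(window)
    return int(np.argmin(np.linalg.norm(centroids - f, axis=1)))

print("predicted word index:", classify(windows[0]))
```

A real system would swap the toy noise for recorded windows and the centroid rule for a trained neural network; with a 20-word vocabulary, chance performance is 5 percent, which is what makes the reported 92 percent notable.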

Arnav Kapur, a graduate student who led AlterEgo’s development, said, “The motivation for this was to build an IA device—an intelligence-augmentation device. Our idea was: Could we have a computing platform that’s more internal, that melds human and machine in some ways and that feels like an internal extension of our own cognition?”

It’s Not All in Your Head

Who knew your face made specific, teeny muscle movements when you think? Isn't that supposed to be the fun of thinking, that there's no way anyone but you can know what's in your head?

It turns out we have a system that prepares for physical speech; it's active even when we don't say anything out loud, and the preparation extends all the way to our muscles, which give off myoelectric signals based on what they think we're about to say.

To figure out which areas of our faces give off the strongest neuromuscular signals related to speech, the MIT team had test subjects think of and silently say (also called “subvocalize”) a sequence of words four times, with a group of 16 electrodes placed on different parts of subjects’ faces each time.

Analysis of the resulting data showed that signals from seven specific electrode locations best deciphered subvocalized words. The team fed the data to a neural network, which was able to identify patterns between certain words and the signals AlterEgo had picked up.
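
Keeping only the sensor locations whose signals best separate the words is a classic feature-selection step. Here is a hedged sketch of one common way to do it, scoring each channel with a Fisher-style separation ratio on toy data and keeping the top seven; the statistic and the data are illustrative assumptions, not the team's published method.

```python
import numpy as np

def channel_scores(X: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Score each electrode channel by a Fisher-style ratio:
    variance of the per-word means divided by the mean within-word
    variance. X: (trials, channels) features; y: (trials,) word labels."""
    classes = np.unique(y)
    class_means = np.stack([X[y == k].mean(axis=0) for k in classes])
    between = class_means.var(axis=0)
    within = np.stack([X[y == k].var(axis=0) for k in classes]).mean(axis=0)
    return between / (within + 1e-12)

# Toy data: 16 candidate electrode sites, 20 words, 400 trials.
rng = np.random.default_rng(1)
y = rng.integers(0, 20, size=400)
X = rng.standard_normal((400, 16))
X[:, :7] += 0.5 * y[:, None]  # plant signal in the first 7 channels

keep = np.argsort(channel_scores(X, y))[::-1][:7]
print("electrodes to keep:", sorted(keep.tolist()))
```

Because the informative channels are planted, the selector should recover indices 0 through 6 here; with real recordings, the ranking would come from the measured signals, as in the MIT team's electrode study.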

More Than Words

Thus far, the system’s abilities are limited to fairly straightforward words; the researchers used simple math problems and chess moves to collect initial data, with the range of users’ vocabularies limited to about 20 possible words. So while its proof of concept is pretty amazing, AlterEgo has a ways to go before it will be able to make out all your thoughts. The tech’s developers are aiming to expand its capabilities, though, and their future work will focus on collecting data for more complex words and conversations.

What’s It For?

While technologies like AlterEgo can bring convenience to our lives, we should stop and ask ourselves how much intrusiveness we’re willing to allow in exchange for just that—convenience, as opposed to need. Do I need to have electrodes read my thoughts while I’m, say, grocery shopping in order to get the best deals, or save the most time? Or can I just read price tags and walk a little faster?

When discussing the usefulness of the technology, Pattie Maes, a professor of media arts and sciences at MIT and Kapur’s thesis advisor, mentioned the inconvenience of having to take out your phone and look something up during a conversation. “My students and I have been experimenting with new form factors and new types of experience that enable people to still benefit from all the wonderful knowledge and services that these devices give us, but do it in a way that lets them remain in the present,” she said.

Thad Starner is a professor at Georgia Tech’s College of Computing. He wasn't involved in AlterEgo’s creation, but he's done a lot of work in wearable tech and was closely involved with Google Glass. Starner had some ideas about more utilitarian applications for AlterEgo, pointing out that in high-noise environments, such as on an airport’s tarmac, on the flight deck of an aircraft carrier, or in power plants or printing presses, the system would “be great to communicate with voice in an environment where you normally wouldn’t be able to.”

Starner added, “This is a system that would make sense, especially because oftentimes in these types of situations people are already wearing protective gear. For instance, if you’re a fighter pilot, or if you’re a firefighter, you’re already wearing these masks.” He also mentioned the tech would be useful for special operations and the disabled.

Gearing research for voiceless interfaces like AlterEgo towards these practical purposes would likely up support for the tech, while simultaneously taming fears of Orwellian mind-reading and invasions of mental privacy. It’s a conversation that will get louder—inside engineers’ heads and out—as progress in the field advances.

Image Credit: Lorrie Lejeune / MIT

This article originally appeared on Singularity Hub, a publication of Singularity University.
