483 results for “Wild Systems”

It could be time to start thinking about a cybernetic Bill of Rights

Mike Ryder
January 6th 2020

Like it or loathe it, the robot revolution is now well underway and the futures described by writers such as Isaac Asimov, Frederik Pohl and Philip K. Dick are fast turning from science fiction into science fact. But should robots have rights? And will humanity ever reach a point where human and machine are treated the same?

At the heart of the debate is that most fundamental question: what does it mean to be human? Intuitively, we all think we …

Explore your relationship with AI in this exhibition

NextNature.net
December 12th 2019

What makes us human? And why do we sometimes fear artificial intelligence? And what about the technological singularity - the moment when artificial intelligence outperforms human intelligence? The increasing yet often invisible implementation of AI in our daily lives (think voice assistants and deep-learning algorithms) raises more questions than answers. Should we be defensive, or welcome this new technology as part of our human evolution?

The exhibition AI: More than Human (now at Forum in Groningen—previously at the Barbican …

Why electric cars should be allowed to drive faster

Van Mensvoort
December 4th 2019

In response to the nitrogen crisis, the Dutch cabinet is planning to reduce the daytime speed limit to 100 kph. In itself, a sensible decision. But it is strange that this measure also affects motorists who cause no nitrogen emissions whatsoever. A missed opportunity to reward sustainable actions. That's why we started a petition: we want a separate lane for electric cars where 130 kph is allowed. Sign if you agree with us.

Before I explain why we …

Four visions for the future of public transport

Marcus Enoch
November 7th 2019

The way people get around is starting to change, and as a professor of transport strategy I do rather wonder if the modes of transport we use today will still be around by the turn of the next century.

Growing up, my favourite book was a children’s encyclopaedia first published in 1953. One double page spread featured an annotated cityscape, showing all aspects of the built environment – most of which we would still be familiar with now. The various …

This exhibition looks at how robots are changing the world we live in

NextNature.net
November 7th 2019

Delve into the science and fiction of robots at V&A Dundee's latest exhibition: Hello, Robot. Contemplate the existence of robots and how they have both shaped, and been shaped by, technology, culture and design. See for yourself how the boundaries between human and machine have become increasingly blurred; understand how we got here, and take note of the increasing power of designers to influence the future of such technologies.

Hello, Robot has over 200 objects on display, and takes visitors …

How AI is reshaping electronic music

Linda Valenta
September 12th 2019

The idea that AI can compose electronic music may sound a little off to people. It raises essential questions about creativity as a product exclusive to humans: can an AI be creative? Can it be musical? Can it compete with human-made melodies? Does it need to?

More and more, AI has set foot in the realm of the creative industries. From an AI writing the next GoT book to IBM's Watson creating a trailer for a non-existent sci-fi thriller. And that's …

Microbiocene: A microbiological archeology of the future

Linda Valenta
July 11th 2019

In configuring our next nature, artists and scientists explore new languages that move beyond the Anthropocene - the era of human beings. These semantics would bridge the gap between mankind and technology, but also between humans and other species, establishing a cosmological understanding of life. Within this endeavour, bio-artists Amanda Baum and Rose Leahy delved into more-than-human narratives by creating a monument for the Microbiocene: the age of the microbial.

The Microbiocene is an epoch we’ve always lived in and …

Why Next Nature Network is hiring an army of bots

NextNature.net
April 11th 2019

As we showed with HUBOT, we can use new technologies and robotics to make our work more enjoyable, interesting and humane. Aside from our speculative jobs, a lot of robotic companions already exist. Think about the robotic fish that observes marine life, the breathing pillow that helps you sleep and robotic arms that assist the surgeon. So why not join forces at Next Nature Network?

Not working against, but with robots!

Over the past few years, we have been actively …

The religion named Artificial Intelligence

Linda Valenta
April 5th 2019

Would you pray to a robot deity? A group of Japanese Buddhists is already doing so. Meet Mindar, the robot divinity shaped after the Buddhist Goddess of Mercy, also known as Kannon. The Kyoto-based Kodaiji Temple recruited Osaka University's head of intelligent robotics, Hiroshi Ishiguro, to design this million-dollar robot. Its purpose? To encourage youngsters to engage with Buddhism again.

Mindar’s silicone face represents a gender neutrality that aims to move beyond human representation. Supplemented with aluminium body parts, …

Your Next Nature guide to Transmediale 2019

NextNature.net
January 18th 2019

Berlin is kicking off its cultural season with the not-to-miss 23rd installment of Transmediale. This year the digital art/culture festival focuses on how feelings are made into objects of technological design, and asks what role emotions and empathy play within digital culture.

We combed the program so you don't have to:

How to Grow and Use Your Feelers (workshop, Wednesday from 11:00 to 14:00)

Donna Haraway's writings inspired the interdisciplinary techno-feminist research group #purplenoise to immerse us in a world …

Wild Systems: anthropogenic processes that go feral. A computer virus, for example, continues running long after its programmer has had any direct role in how it functions or what computers it infects.

It could be time to start thinking about a cybernetic Bill of Rights

Mike Ryder
January 6th 2020

Like it or loathe it, the robot revolution is now well underway and the futures described by writers such as Isaac Asimov, Frederik Pohl and Philip K. Dick are fast turning from science fiction into science fact. But should robots have rights? And will humanity ever reach a point where human and machine are treated the same?

At the heart of the debate is that most fundamental question: what does it mean to be human? Intuitively, we all think we know what this means – it almost goes without saying. And yet, as a society, we regularly dehumanise others, and cast them as animal or less than human – what philosopher Giorgio Agamben describes as “bare life”.

Take the homeless, for example: people whom the authorities treat much like animals, or less than animals (like pests), to be guarded against with anti-homeless spikes and benches designed to prevent sleep. A similar process takes place within a military setting, where enemies are cast as less than human to make them easier to fight and easier to kill.

Humans also do this to other “outsiders” such as immigrants and refugees. While many people may find this process disturbing, these artificial distinctions between insider and outsider reveal a key element in the operation of power. This is because our very identities are fundamentally built on assumptions about who we are and what it means to be included in the category of “human”. Without these wholly arbitrary distinctions, we risk exposing the fact that we’re all a lot more like animals than we like to admit.

Being human

Of course, things get a whole lot more complicated when you add robots into the mix. Part of the problem is that we find it hard to decide what we mean by "thought" and "consciousness" and even what we mean by "life" itself. As it stands, the human race doesn't have a strict scientific definition of when life begins and ends.

Similarly, we don’t have a clear definition on what we mean by intelligent thought and how and why people think and behave in different ways. If intelligent thought is such an important part of being human (as some would believe), then what about other intelligent creatures such as ravens and dolphins? What about biological humans with below average intelligence?

These questions cut to the heart of the rights debate and reveal just how precarious our understanding of the human really is. Up until now, these debates have solely been the preserve of science fiction, with the likes of Flowers for Algernon and Do Androids Dream of Electric Sheep? exposing just how easy it is to blur the line between the human and non-human other. But with the rise of robot intelligence these questions become more pertinent than ever, as now we must also consider the thinking machine.

Machines and the rule of law

But even assuming that robots were one day to be considered "alive" and sufficiently intelligent to be thought of in the same way as human beings, the next question is how we might incorporate them into society and how we might hold them to account when things go wrong.

Traditionally, we tend to think about rights alongside responsibilities. This comes as part of something known as social contract theory, which is often associated with political philosopher Thomas Hobbes. In a modern context, rights and responsibilities go hand-in-hand with a system of justice that allows us to uphold these rights and enforce the rule of law. But these principles simply cannot be applied to a machine. This is because our human system of justice is based on a concept of what it means to be human and what it means to be alive.

So, if you break the law, you potentially forfeit some part of your life through incarceration or (in some nations) even death. However, machines cannot know mortal existence in the same way humans do. They don’t even experience time in the same way as humans. As such, it doesn’t matter how long a prison sentence is, as a machine could simply switch itself off and remain essentially unchanged.

For now at least, there’s certainly no sign of robots gaining the same rights as human beings and we’re certainly a long way off from machines thinking in a way that might be described as “conscious thought”. Given that we still haven’t quite come to terms with the rights of intelligent creatures such as ravens, dolphins and chimpanzees, the prospect of robot rights would seem a very long way off.

The question then really, is not so much whether robots should have rights, but whether we should distinguish human rights from other forms of life such as animal and machine. It may be that we start to think about a cybernetic Bill of Rights that embraces all thinking beings and recognises the blurred boundaries between human, animal and machine.

Whatever the case, we certainly need to move away from the distinctly problematic notion that we humans are in some way superior to every other form of life on this planet. Such insular thinking has already contributed to the global climate crisis and continues to create tension between different social, religious and ethnic groups. Until we come to terms with what it means to be human, and our place in this world, then the problems will persist. And all the while, the machines will continue to gain intelligence.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Explore your relationship with AI in this exhibition

NextNature.net
December 12th 2019

What makes us human? And why do we sometimes fear artificial intelligence? And what about the technological singularity - the moment when artificial intelligence outperforms human intelligence? The increasing yet often invisible implementation of AI in our daily lives (think voice assistants and deep-learning algorithms) raises more questions than answers. Should we be defensive, or welcome this new technology as part of our human evolution?

The exhibition AI: More than Human (now at Forum in Groningen—previously at the Barbican in London) invites you to explore your relationship with artificial intelligence.

Curators Suzanne Livingston and Maholo Uchida have asked artists, scientists and researchers to demonstrate AI’s potential to revolutionize our lives. Experience the capabilities of AI in the form of cutting-edge research projects by DeepMind, Massachusetts Institute of Technology (MIT) and Neri Oxman; and interact directly with exhibits and installations to experience the possibilities first-hand.

Take the chance to dive into the immersive installation What a Loving and Beautiful World by artist collective teamLab. The visuals consist of Chinese characters and natural phenomena triggered by interaction. When a visitor touches a character, the world contained inside that character unfolds on the walls.

AI, Ain’t I a Woman? is an exploration of AI from a political perspective. Joy Buolamwini is a poet of code who uses art and research to illuminate the social implications of artificial intelligence. In this case, she lays bare the racial bias of facial recognition.

Inspired by the Dutch 'tulip mania' of the 1630s, Anna Ridler draws parallels between tulips and the current mania around cryptocurrencies. Created by an AI, her film shows blooming tulips controlled by the bitcoin price, changing over time to show how the market fluctuates. The project echoes 17th-century Dutch still-life flower paintings, which, despite their supposed realism, are imagined because the flowers in them could never bloom at the same time. Does cryptocurrency provide us with a similar imagined reality?

Visit the newly opened Forum in Groningen to see these projects and much more! Expect your preconceptions to be challenged and discover how this technology impacts our human essence from historical, scientific, social and creative perspectives.

What? A travelling exhibition to explore our relationship with AI
When? Now, until 30 April 2020
Where? Forum, Groningen

Why electric cars should be allowed to drive faster

Van Mensvoort
December 4th 2019

In response to the nitrogen crisis, the Dutch cabinet is planning to reduce the daytime speed limit to 100 kph. In itself, a sensible decision. But it is strange that this measure also affects motorists who cause no nitrogen emissions whatsoever. A missed opportunity to reward sustainable actions. That's why we started a petition: we want a separate lane for electric cars where 130 kph is allowed. Sign if you agree with us.

Before I explain why we should keep the speed limit at 130 kph for drivers of electric cars, a confession: I don’t own an electric car myself.

I pretty much always take the train. It's a comfortable option and you can spend the journey reading or working. I do still own (blush) an old-fashioned gas-powered car. I've had it for six years, and I still regret not opting for an electric car at the time. You know the arguments: too expensive, too short a range, the car I liked only came in a gas variant. The consumer in me defeated the citizen. So it often goes.

As citizens we have high-minded ideas about a better society. Then we walk into a shop and buy things totally at odds with our ideals. Unfortunately, I’m all too familiar with this particular human flaw.

What the government can do

This is where the government can give the citizen a push in the right direction. How? By inviting the consumer to behave in a way that contributes to a better society. This is already happening. When I want to park my gas car, there's never a convenient spot beside the charging stations reserved for electric cars.

Good for them, I think. These people have made a more sustainable choice than I have, and they should be rewarded. If I’m soon to be sped past at 130 kph, I can feel bitter towards the rich showoffs in the fast lane. Or I can bear in mind that these investors in the energy transition have earned their fun.

A positive step in the energy transition

When a new technology is introduced, it’s always more expensive than the existing options at first. The first flat-screen televisions were only affordable for millionaires, because the investments made into new factories had to be earned back.

That lasts a few years, and then the price falls and the product becomes affordable for everyone. I see rich people prefer a Tesla to a Ferrari. Let’s encourage that. Besides, a fully electric second-hand Nissan Leaf costs €8500. I could afford that, and when I look around me on the highway, I suspect that plenty of other motorists could too.

Granted, electric cars are not perfect. Wear and tear on the tires still produces fine dust. Driving fast consumes more energy. If that energy is supplied by coal-fired power stations, CO2 emissions are still involved. And that’s without going into the conditions in the lithium mines—plus the energy costs of producing the batteries. You could lose hope thinking about it all. Staying in bed is more sustainable.

But the question remains: is the glass half empty or half full?

Answer: half full.

Electric cars are a positive step in the energy transition, which must be encouraged. Elsewhere that’s already happening. In Austria, electric cars adhere to a speed limit of 130 kph, while 100 kph is the norm for others. In Norway, electric cars can drive in bus lanes. Even Uber encourages its drivers to go electric. Why wouldn’t the Dutch government do the same? Maybe it’s something to do with our Calvinist mentality?

Sustainability should be rewarded

Sustainability is often equated with cutting down, austerity, stringency. That is an error. We think about it one-sidedly, in terms of crime and punishment, when we should be thinking in terms of possibilities, smart solutions and circularity.

There is an unbelievable amount of energy available on Earth, and right now we only harvest and utilize a tiny proportion of it. How absurd is it that humanity continues to generate most of its energy from the burning of coal, which releases CO2 that contributes to climate change, while a gigantic power plant floats in space at a safe distance?

I hope that future generations can laugh about it and do better.

The sun is a nuclear fusion reactor with an output of 380 million million million million watts. That’s equal to 10 million oil barrels per second, per world citizen. Even though only a small proportion of this energy reaches the Earth, this proportion is still 9000 times the current energy requirements of all 7.7 billion people on Earth combined.
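For readers who like to check such figures, here is a rough back-of-the-envelope sketch in Python. The solar constant, Earth's radius, the energy content of a barrel of oil and the figure for world energy demand are assumed reference values, not numbers taken from this article.

```python
# Back-of-the-envelope check of the solar-energy figures above.
# All constants are assumed reference values, not article data.
import math

SOLAR_OUTPUT_W = 3.8e26        # total solar output: ~380 million million million million watts
SOLAR_CONSTANT_W_M2 = 1361     # sunlight per square metre at Earth's distance (assumed)
EARTH_RADIUS_M = 6.371e6       # mean Earth radius (assumed)
WORLD_POPULATION = 7.7e9
WORLD_POWER_DEMAND_W = 1.9e13  # ~19 TW of primary energy use (assumed)
BARREL_OF_OIL_J = 6.1e9        # ~6.1 GJ per barrel of oil equivalent (assumed)

# Power intercepted by Earth's cross-section.
earth_intercept_w = SOLAR_CONSTANT_W_M2 * math.pi * EARTH_RADIUS_M ** 2

# "...still 9000 times the current energy requirements of all 7.7 billion people"
print(f"Earth's share vs world demand: {earth_intercept_w / WORLD_POWER_DEMAND_W:,.0f}x")

# "...10 million oil barrels per second, per world citizen" (same order of magnitude)
barrels_per_s_per_person = SOLAR_OUTPUT_W / WORLD_POPULATION / BARREL_OF_OIL_J
print(f"Barrels of oil per second per person: {barrels_per_s_per_person:,.0f}")
```

Run as written, the sketch lands on roughly 9,000 times world demand and about 8 million barrels per second per person, which matches the order of magnitude quoted above.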

If we’re smart enough to put more of the available energy to work for us in a sustainable way, a bright beautiful world full of abundance awaits us. That costs time, that costs money, we’re still a long way off, and the journey will be full of ups and downs. The nitrogen crisis is not called a crisis for nothing. Policy has to be made at lightning speed. That policy can be refined by rewarding positive sustainable actions.

There are plenty of places where this can be done safely. Think of five-lane highways. In the right-hand lane, a diesel-powered truck chugs along at 80 kph. In the middle three lanes, gas-powered cars go 100 kph. In the left-hand lane, an electric car zooms into the future at 130 kph.

Four visions for the future of public transport

Marcus Enoch
November 7th 2019

The way people get around is starting to change, and as a professor of transport strategy I do rather wonder if the modes of transport we use today will still be around by the turn of the next century.

Growing up, my favourite book was a children’s encyclopaedia first published in 1953. One double page spread featured an annotated cityscape, showing all aspects of the built environment – most of which we would still be familiar with now. The various modes of transport illustrated – trains, buses, lorries, taxis, motorcycles, bikes, pedestrians and private cars – still work together as a system in fundamentally the same ways.

But a whole range of possible (though not inevitable) societal and technological changes could revolutionise how we travel in the coming decades. These include large-scale responses to the climate change agenda and energy sourcing and security; shifting demographic trends (such as growing numbers of elderly people); the development of the collaborative economy; the growing use of big data; and the apparent inevitability of driverless cars.

To examine what future urban transport systems might look like, I recently directed a future-gazing project for New Zealand’s Ministry of Transport exploring how people might be travelling in the year 2045. I helped develop four scenarios, along two axes of change.

The first axis considered automation – at one end, vehicles are still driven much like today (partial automation); at the other, they're driverless (full automation). The second axis related to how dense cities could become – one future where the population is more dispersed (like Los Angeles) and another where it is concentrated at a higher density (more like Hong Kong). With these axes in mind, I generated four possible futures for public transport, which could play out in cities across the world.

Choose your fighter. By Marcus Enoch, Author provided

1. Shared shuttles

In the "shared shuttle" city, demand-responsive minibuses, Uber-style taxis and micro-modes – such as shared bicycles, electric bikes and hoverboards that cover the "last mile" to your destination – are widespread. Hiring these different forms of transport is simple, thanks to seamless booking and payment systems and a thriving entrepreneurial spirit among a range of commercial, social and government transport providers. Meanwhile, new environmental regulations mean that owning a car is more expensive than it used to be, and private vehicles are restricted to the suburbs.

Flexibility is a core feature of this scenario, with vehicles and services that adjust to the needs of individuals, and with how the space continually adapts to meet the needs of the city as a whole. There’s also a collaborative ethos, reinforced by the development of a more compact and high-density city, while progress toward full automation has been slow because of safety and privacy concerns.

2. Mobility market

Private cars still dominate urban transport in the mobility market scenario. Many citizens live and often work in dispersed, low-density suburban areas, since city-centre housing became too expensive for most to afford. Fewer people walk and cycle, because of the long distances involved. And the use of public transport has declined, since less dense transport networks mean there are fewer viable routes, though a limited network of automated trains and buses is still used for trips to the city centre.

Car use has fallen somewhat since the 2010s, because “active management” measures – such as pre-bookable fast lanes and tolls – are now necessary to control congestion, despite the completion of a sizeable road building programme in the recent past.

Instead, commercially provided pre-paid personalised “mobility packages” are helping to stimulate the use of a whole range of shared mobility options, such as car-pooling, bike hire and air taxi schemes. These now account for around a quarter of all journeys.

3. Connected corridors

Society in this high-tech, highly urbanised world of connected corridors is characterised by perceptive but obedient citizens who trade access to their personal data in return for being able to use an extremely efficient transport system. Physically switching between different services or even different modes of travel is hassle free, thanks to well designed interchange points, and fully integrated timetabling, ticketing and information systems.

For instance, travellers might walk, e-cycle or take a demand-responsive minibus to a main route interchange, then board a high frequency rail service to get across town and finally take a shared autonomous taxi to their destination. Each will be guided by a personalised, all-knowing “travel ambassador” app on their smartphone or embedded chip, which will minimise overall travel times or maybe maximise sightseeing opportunities, according to their preferences.

Private cars are not really needed. People trust technology to deliver inexpensive and secure transport services and appreciate living close to work, family and friends.

4. Plentiful pods

In this future, fleets of variously sized driverless pods provide around three-quarters of the journeys that still need to be taken across the low-density, high-tech city. These pods have largely replaced most existing public transport services, and the vast majority of privately owned cars.

People do still walk or cycle for some shorter trips. But pods are so convenient, providing affordable point-to-point journeys for those not satisfied by virtual interactions. Passengers can pay even less, if they agree to share with others. Pods are also fully connected to the internet, and are priced and tailored to meet customer needs. Ultimately, pods give people the freedom to work, learn or live where the weather is best or the houses are cheapest.

My research did not pass judgement as to which scenario should be pursued. But it did conclude that public transport will need to evolve to meet future challenges, and that the role of government will still be of key importance going forward, no matter which path is chosen. Personally though, if forced to choose, I think I’d favour a shared shuttle future more than the others - it just seems more sociable.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Cover image: Renault's Float autonomous car concept

This exhibition looks at how robots are changing the world we live in

NextNature.net
November 7th 2019

Delve into the science and fiction of robots at V&A Dundee's latest exhibition: Hello, Robot. Contemplate the existence of robots and how they have both shaped, and been shaped by, technology, culture and design. See for yourself how the boundaries between human and machine have become increasingly blurred; understand how we got here, and take note of the increasing power of designers to influence the future of such technologies.

Hello, Robot has over 200 objects on display, and takes visitors through four stages of robot influence and evolution. Throughout, the exhibition poses provocative questions, and will require you to consider the past, present and future of robotics like never before. Reflection of this kind may prove essential in answering the question, how will robots exist in our next nature?

What? An exploration of robots in a human world
Where? V&A Dundee, Scotland (UK)
When? Now, until 09 February 2020

How AI is reshaping electronic music

Linda Valenta
September 12th 2019

The idea that AI can compose electronic music may sound a little off to people. It raises essential questions about creativity as a product exclusive to humans: can an AI be creative? Can it be musical? Can it compete with human-made melodies? Does it need to?

More and more, AI has set foot in the realm of the creative industries. From an AI writing the next GoT book to IBM's Watson creating a trailer for a non-existent sci-fi thriller. And that's not where it ends: the music industry also got involved when that same Watson was used by award-winning producers to create country rap, not to mention a Eurovision song created with machine learning.

Electronic music, too, is affected by the algorithmic technologies that revolutionize the way humans relate to the arts. As a discipline that has technology at its very core, electronic music is bound to cross paths with the ways of AI. From DJing to producing and from contriving DJ names to directing music videos, algorithmic agency is growing stronger each day.

The subsequent question is how humans pertain to these technologies and how the arts and AI can be treated as a symbiosis, rather than a dystopian binary. Put differently, how can we embrace AI as an instrument we can work together with, rather than an autonomous entity overruling human creativity?

https://www.youtube.com/watch?v=4MKAf6YX_7M

Music has always been technological

There has always been a link between music and technology, as essentially it revolves around counting and measuring the rhythm, as much as it relies on instruments.

Clapping their hands, our early ancestors used their body as an instrument to create rhythmic music. As our predecessors found out that they could smack sticks or stones to enhance the beat without hurting their hands, drums were invented.

Fast-forward to the 20th century: elaborate drum kits emerged at the intersection of African-American brass bands and western instruments. The technology of the bass pedal made it possible to use both hands and feet to produce sound, and so the drum kit as we know it evolved.

The instrument was further technologized when companies like Korg and Roland started producing drum machines on a massive scale. The genres that emerged from these instruments diverged, but essentially, both the drum kit and drum machine serve as a technology to produce the rhythms and sounds that we know as music.

Are algorithms the next DJs?

Along the same lines, DJing underwent changes when vinyl decks were complemented by USB-driven CDJs. Though the technologies changed, the art of DJing remains present – just in different ways.

In this day and age, AI is the upcoming technology broadening the horizon of (electronic) music. On a day-to-day basis, algorithms are already silently ruling our music taste through auto-playlists like the ones developed by YouTube, Spotify and Apple Genius. In a way, algorithms are already our next DJs.

But not only are these algorithms able to curate music to our liking; they are also able to flawlessly mix our favorite tracks together. Recently, a Spotify playlist appeared that tests an automixing feature with the help of AI: the Drum And Bass Fix playlist seamlessly beatmatches two tracks when shuffle is switched on.

Not into drum and bass? Then try curating your own beat-matched set or mashup using Rave DJ. This online application lets you upload a YouTube or Spotify playlist; its algorithms then create a smooth mix of even the most obscure track combinations.
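Under the hood, beatmatching of this kind comes down to estimating each track's tempo and time-stretching one of them so the beats line up before crossfading. The services above use their own unpublished systems; purely as an illustration, here is a minimal sketch using the open-source librosa library, with placeholder filenames.

```python
# Minimal beatmatching sketch (illustrative only; filenames are placeholders).
import librosa

def tempo_of(path: str) -> float:
    """Estimate a track's tempo in beats per minute."""
    y, sr = librosa.load(path)
    tempo, _beat_frames = librosa.beat.beat_track(y=y, sr=sr)
    return float(tempo)

# Stretch track B so both tracks share one tempo before mixing.
tempo_a = tempo_of("track_a.mp3")
y_b, sr_b = librosa.load("track_b.mp3")
tempo_b, _ = librosa.beat.beat_track(y=y_b, sr=sr_b)

rate = tempo_a / float(tempo_b)  # >1 speeds track B up, <1 slows it down
y_b_matched = librosa.effects.time_stretch(y_b, rate=rate)
```

A real automix would also align the beat positions and apply an EQ-aware crossfade, but tempo estimation plus time-stretching is the core of the trick.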

Naturally, tech giant Google has also engaged with algorithmic advances in electronic music by developing an AI synth named NSynth. This open-source synthesizer uses a neural network, fed with a large library of recorded instrument sounds, to learn and reproduce their qualities. Though based on neural networks, it also comes as a hardware product with a touchscreen pad.

https://www.youtube.com/watch?v=ZsZc4Q_eDk4

Will AI outmix humanity?

These tools may seem futuristic, but there are plenty of artists already utilizing AI to produce music. At this year’s Transmediale, UK DJ and producer Actress even granted his AI offspring complete artistic agency by giving it a stage name: Young Paint. Together, they enacted a live audiovisual performance that was mostly based on real-time improvisation, but they also captured some collaborative ventures on a mini-album via his new label Werk__Ltd.

According to electronic musician Olle Holmberg, it is just a matter of time before we will be following AI DJ’s and producers on social media, after attending our favorite algorithmically driven gigs – which is basically already happening with the advent of virtual influencers.

Based on the semantic traits that can be found in Hardwax’s database of DJ names, Holmberg recently published a list of DJ names generated by an AI. Though a DJ name might seem trivial, it does show that AI is capable of mimicking and further developing our club experience based on our current ideas of what clubbing should be like.
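Holmberg's generator draws on the semantics of the Hardwax database; as a much simpler, purely illustrative toy, a character-level Markov chain captures the basic idea of learning name-like patterns from examples (the seed names below are made up, not taken from that database).

```python
# Toy character-level Markov chain for generating DJ-style names.
# The seed names are invented examples, not real training data.
import random
from collections import defaultdict

SEED_NAMES = ["DJ Nebula", "Acid Mirage", "Radial Drift", "DJ Paradox", "Neon Reactor"]
ORDER = 2  # number of preceding characters used as context

def build_model(names, order=ORDER):
    model = defaultdict(list)
    for name in names:
        padded = "^" * order + name + "$"  # ^ marks the start, $ the end
        for i in range(len(padded) - order):
            model[padded[i:i + order]].append(padded[i + order])
    return model

def generate(model, order=ORDER, max_len=20):
    state, out = "^" * order, []
    while len(out) < max_len:
        nxt = random.choice(model[state])
        if nxt == "$":  # reached an end-of-name marker
            break
        out.append(nxt)
        state = state[1:] + nxt
    return "".join(out)

model = build_model(SEED_NAMES)
print([generate(model) for _ in range(5)])
```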

https://www.youtube.com/watch?v=v_4UqpUmMkg

Team human

There is an uncanny objection to these kinds of technological advances, assuming they would violate our authentic 'humanness', when in fact it is in our very human nature to be technological. Speaking, writing, reading, counting, singing – these are all cultural technologies; so are DJing and producing.

The cycle that drove us from drum kits to drum machines is the same evolutionary force driving humans to interact with AI in creating new musical works of art. Within this framework, AI basically is our next nature’s cultural technology.

Scholar and electronic music composer Holly Herndon, who built an AI recording system to help with her latest album, addresses the pervasive narrative in which technology is dehumanizing and instead proposes to ‘run towards’ technology, but on her own human terms.

This brings us to the crucial debate revolving around AI: we often forget how algorithms are technologies developed by humans. If algorithms become dehumanizing vehicles, they can only be so because the human system made them that way. 

Microbiocene: A microbiological archeology of the future

Linda Valenta
July 11th 2019

In configuring our next nature, artists and scientists explore new languages that move beyond the Anthropocene - the era of human beings. These semantics would bridge the gap between mankind and technology, but also between humans and other species, establishing a cosmological understanding of life. Within this endeavour, bio-artists Amanda Baum and Rose Leahy delved into more-than-human narratives by creating a monument for the Microbiocene: the age of the microbial.

The Microbiocene is an epoch we've always lived in and will continue to live in, as the vibrant matter on planet Earth emerged and thrives through microbial life such as bacteria. In collaboration with the Royal Netherlands Institute for Sea Research (NIOZ), Baum & Leahy dove into the deep time of fossil molecules left by the microorganism Emiliania huxleyi, which are found in ancient sea sediment. The result is an award-winning symbiosis between art and science, as well as an artefact for the ecologies that are yet to be embraced by the human species.

We caught up with the duo and spoke about the philosophical matter pushing their piece to emerge, and the microbial matter it is made of.

"The installation envisions a future archaeological site, thousands of years from now."

You created the ‘Microbiocene’ piece for the Bio Art and Design Award last year. Tell us about the creative process of the project; did you already have in mind this result or did it evolve from something completely different?

The Microbiocene as an overarching concept is something we'd been thinking about for a while - over the past couple of years almost all our projects have become about mapping out the Microbiocene - the ancient, ongoing, and future era of microorganisms. We've explored this through various lenses: spiritual, material, ritualistic, ancestral.

When applying to the BAD Awards, we were immediately inspired by the research from the Department of Marine Microbiology and Biogeochemistry at The Royal Netherlands Institute for Sea Research (NIOZ). The scientists at NIOZ work with sea sediment containing microbial fossil molecules, which hold information about past environmental conditions, both recent and ancient.

NIOZ’s research combined with this cultural, philosophical framework gave birth to the idea of creating a form of ‘biological Rosetta Stone’ - a relic being found, and a language translated, to discover information about an ancient (invisible) civilisation.

Baum & Leahy, Microbiocene: Ancient ooze to future myths, 2018, Photo by Boudewijn Bollmann, MU ArtSpace

Inspired by the aspect of deep time, the installation envisions a future archaeological site, thousands of years into the future, where the Microbiocene monument is being found. It is inscribed with myths of the Microbiocene, a (re)telling of history and future on Earth as microbe-centric. These stories were based on information we unearthed from microbial fossils in sea sediment dating from the present to nearly 10,000 years ago. We then developed this data into narratives with our collaborating scientists, projecting different future scenarios.

The idea was to create a narration that was informed both by microbe and mammal.

You used ‘microglyphs’ in your piece —a microbe-centric language system co-created by the artists and scientists— how did you develop this language? Are the shapes imprinted on your works also literally found under the microscope?

The microglyphs were created with input from scientific and cultural associations, as well as free associations between us and the scientists. Some of the symbols are more literal - like a double bond in a molecule meaning cold, or Ehux being a graphic representation of how it looks - whilst some are more complex, like the Microbiocene microglyph, which refers to life beginning on Earth.

Whilst creating the microglyphs we discussed the multitude of forms that language takes, and the inherent human desire to traverse its boundaries – across cultures, disciplines and species. From the Rosetta Stone to art-sci collaborations to alien communication attempts, the wish to understand and to translate is constant: we all dream of babel fish.

Baum & Leahy, Microbiocene: Ancient ooze to future myths, 2018, Photo by Boudewijn Bollmann, MU ArtSpace

By creating a visual language for the Microbiocene, we attempted to move towards a more multimodal form of communication with the potential to be interpreted in various ways by anyone encountering it. Each of the microglyphs has multiple meanings, which change responsively with the surrounding microglyphs. Different compositions of the microglyphs explore movements within the meaning of the sentences.

The microglyphs are an initial iteration into working with the materiality of language, which we continue to explore in workshops, and our other projects. By consciously molding language, or sign making, into new biologically informed structures, we begin to weave our mammalian minds into the Microbiocene.

What kind of scientists did you collaborate with?

We collaborated with biogeochemists from the Royal Netherlands Institute for Sea Research – Julie Lattaud, Gabriella Weiss and Laura Schreuder. They study alkenone biomarkers: especially sturdy molecules left by microorganisms in sea sediment. This sediment is collected in long cores, which hold a cross-section of the earth from the seabed down, with the top layer being the most recent and the bottom being from the most distant past.

"We like to work with scientists as partners on an equal basis of passion for understanding."

Their lab work is wonderfully intimate with the sediment that is collected. The cores are opened from a long tube, and incredibly distinct lines are revealed along the earth core, indicating thousands of years of life being lived before turning to matter.

Do you think you could have created this piece without this collaboration? What role does and should science play in art? Where does science stop and art begin?

The idea of the piece itself grew out of and was continuously informed by the scientific research, so it would have been another piece without the scientific collaboration. Like any other relationship, the symbiosis between art and science can and should take many forms, from the abstract and experimental to the more systematic.

At this point in time, we see not only creative potential but also a certain urgency at the intersection of ecological transformation, emerging technologies, and an increased sensitivity and awareness towards the planetary web of life.

We like to work with scientists as partners on an equal basis of passion for understanding, working with and caring for living systems - although with very different means of research and expression. Before restricting ourselves within established epistemological systems, we try and create a nurturing space of shared curiosity, where ideas and visions aren’t limited to our individual areas of expertise.

Baum & Leahy, Microbiocene: Ancient ooze to future myths, 2018, Photo by Max Kneefel, MU ArtSpace

Do you think that art is stuck in the anthropocene? Is art too much focused on human experience?

We think it's important that art happens across many 'cenes' - and that it's also urgently important to reflect on our lives in the Anthropocene. Yet we are interested in exploring an alternative - one that is generative, slimy and messy, and optimistic about the adaptable forces of life. The Microbiocene is just one. We continuously draw inspiration from Donna Haraway's 'chthulucene'. Nurturing diversity and moving away from the dominant narratives of the Anthropocene is what we find urgently needed - within all fields, not just artistic ones.

Whilst creating Microbiocene we were thinking a lot about the magnitude of microbial experience that has come before us, and how this has had slow yet defining atmospheric and evolutionary impacts on the Earth, setting out the conditions for terran life to thrive. In contrast, humanity's time on Earth is becoming very much defined by rapid changes, caused by a few, and resulting in wider impacts for all - and for some much more than others. We believe a more microbial approach could trigger the emergence of new systems of adaptation and cohabitation.

"The monument is raised to mark and celebrate how humans learn to become more microbial in their planetary impact."

By looking at the history of time on Earth through the perspective of the Microbiocene, we hoped to condense this microbial evolutionary perspective into a material and sensorial experience able to inspire new ideas and trajectories challenging current anthropocentric worldviews. The Microbiocene monument is raised to mark and celebrate how humans learn to become more microbial in their planetary impact, focusing on more-than-human adaptive strategies and experience as a worthy alternative. For us, drawing the narrative out of information in the material remains of microbial experience was a way to do this.

Baum & Leahy i.c.w. Sofie Birch and Pernille Kjær, Interterrestrials, 2019

What role does materiality play in your piece and how do you elevate a materiality from human to more-than-human?

It was an incredible opportunity for us to use the sea sediment from our studies as part of the material in the sculpture.

The particular sediment we were working with is called calcareous ooze, meaning it contains a large proportion of skeletal remains of coccolithophores. This included Emiliania huxleyi (Ehux) - the microorganism we were studying within the sediment - which has an incredible, vibrant materiality to it.

It is a single-celled alga covered in calcium carbonate-rich platelets, which – with the help of deep time – transmute into materials such as chalk and lime. The build-up of these microscopic organisms on the seabed over long periods has an immense, macroscopic effect, as expressed in the White Cliffs of Dover and Møns Klint.

When understood as the material result of numerous coccolithophore bodies and existence, this coastal landscape becomes a more-than-human monument in itself. We wished to translate the immensity of this deep time within this lively material we had in the lab.

"Microbes are in a way a ‘gateway’ to the unknowns of the universe."

Part of what we find fascinating about the microbial world is the (to us) mysterious material liminality - microbes are in a way a ‘gateway’ to the unknowns of the universe, which we know makes up more than 90% of our perceived reality.

We can't see microbes with the naked eye, yet electron microscopy and similar technologies reveal how alive, vibrant and 'material' they are. We see them as active, reproductive, communicative, busy organisms, just like ourselves.

This many-faceted relationship between the microbes' ubiquitous, ghostly presence and the very material reality of their lives, which resonates with our human experience, continues to puzzle and inspire us.

Even more incomprehensible invisible organic elements like bacteriophages, proteins, DNA, molecules, atoms, dark matter, down to the strange world of quantum mechanics, seem more 'approachable' when we think of them through the universal, microbial gateway.

Baum & Leahy i.c.w. Naja Ankarfeldt, The Red Nature of Mammalga, 2018
Baum & Leahy i.c.w. Naja Ankarfeldt, The Red Nature of Mammalga (detail), 2018

Philosopher Timothy Morton wrote that we have to think in terms of durations, meaning we have to create a ‘deep time’ to ‘think ecologically’. Are you perceiving the world differently in terms of temporality since you made this work? Do you experience a more cosmological time as opposed to a human history?

When working with material such as the sediment cores that have such an evident history, it’s impossible not to become incredibly aware of and sensitive to the vast periods of time on Earth that have preceded us.

Our collaborating scientists work with these kind of time scales everyday, and so are used to thinking about time on Earth in terms of epochs, rather than through the length of their own human experience.

This was intriguing for us, and something we were trying to approach in Microbiocene not only as an installation, but also as a framework. Indeed, we believe that if humans could enter a mindset of deep time, we would see a big shift in our ways of producing and distributing materials. If we could think in terms of the length of time a plastic bottle will be on Earth, rather than the length of time we experience it in our lives, we surely wouldn't be producing and distributing bottles the way we do.

Yet, whilst the Microbiocene entails this very cosmological way of thinking – we’re not sure we can claim to have transcended into everyday cosmological experience of time from this. Despite our best efforts, we’re still just too darn human for that.

Baum & Leahy, Cellular Sanctum, 2018

You both have a background in design - do you think different aesthetics in our everyday surroundings will amount to different environmental awareness? And if so, what’s the potential role of aesthetics in environment awareness?

In our work, we combine tactile, sensorial materiality with collective, ceremonial practices such as meditation, ritual and writing to practice and nurture a symbiosis between matter (microbial, mammalian, etc.) and mind. We aim to bring focus to how internal and external realities are interrelational and constantly shaping each other. By materialising a speculative scenario, ongoing tendencies can be harnessed and the actual long-term realisations can emerge.

We have both been inspired by biophilic design principles - how biomorphic form, aesthetic, and material can be used to strengthen, encourage, and practice our connection with other species and ecologies.

Recently we’ve been thinking about ‘microbiophilia’, and how to stir emotions for organisms we can’t see, yet live all around, on, and within us. In previous pieces such as Cellular Sanctum (2018), and The Red Nature of Mammalga (in collaboration with Naja Ankarfeldt, 2018), we created tactile biomorphic, microbial forms, microbial drinks, and written participatory chants, to create a tactile and sensual experience.

Through these aesthetic experiences we aim to seed a heightened awareness to the parallel microscopic world, within those who experience them.

More Microbiocene?

https://vimeo.com/322794918

Cover photo by Max Kneefel.

Why Next Nature Network is hiring an army of bots

NextNature.net
April 11th 2019

As we showed with HUBOT, we can use new technologies and robotics to make our work more enjoyable, interesting and humane. Aside from our speculative jobs, a lot of robotic companions already exist. Think about the robotic fish that observes marine life, the breathing pillow that helps you sleep and robotic arms that assist the surgeon. So why not join forces at Next Nature Network?

Not working against, but with robots!

Over the past few years, we have been actively growing a meaningful community on Instagram. Ever since 2016, the Instagram feed has been curated by an algorithm that holds the magical power to decide what you see. And, in doing so, the robots at Instagram decide what our online community gets to see.

This doesn’t sound like a humane technology to us.

Also, Instagram does not allow us to use the so-called 'swipe up' functionality that directly leads you to our stories on Nextnature.net.

So why do we keep using Instagram? Because it also brings us joy and convenience. Social media platforms connect us to like-minded people all over the world, and we value your engagement! It's not all bad, but there is a lot of room for improvement.

We wish we could change Instagram’s algorithms and functionalities, but unfortunately we do not have that power (yet). What we can do, is make life easier for you, hence we decided to hire an army of robots to the rescue.

10,000 new colleagues

Meet our robotic followers. They are responsible for playing Instagram's algorithms and unlocking new features. If they do their job well, you won't even notice they're there. Thanks to their work, we can put our time and energy into creating great content for you. Because that's what we love to do. Let's make inhumane technology more humane!

The religion named Artificial Intelligence

April 5th 2019

Would you pray to a robot deity? A group of Japanese Buddhists is already doing so. Meet Mindar, the robot divinity shaped after the Buddhist Goddess of Mercy, also known as Kannon. The Kyoto-based Kodaiji Temple recruited Osaka University’s head of intelligent robotics, Hiroshi Ishiguro, to design this million-dollar robot. Its purpose? To encourage youngsters to engage with Buddhism again.

Mindar’s silicone face represents a gender neutrality that aims to move beyond human representation. Supplemented with aluminium body parts, its enmeshment with human construction becomes evident. And when Mindar chants the Heart Sutra, an eerie resemblance to Japan’s vocaloid Hatsune Miku comes to mind.

God is a Google

Mindar is not the only religious robot to have been manufactured. In the past, we spotted robot monk Xian'er informing visitors of the Longquan temple near Beijing. Creations like Mindar and Xian'er can be understood as metaphors for the way humankind worships artificial intelligence: we worship contemporary technologies as if they were Gods. But the aforementioned creations are part of a greater scheme, as the complexity of artificial intelligence forms a more-than-human network that reminds me of spirituality.

Think of a hyper-intelligent computing database like Google. Its systems are omniscient: they are built to know everything. They compress all recorded human knowledge and activities into a single time-space continuum. According to media scholar John Durham Peters, Google can therefore be understood as a lo-fi universe - a God.

The Church of AI

An even greater analogy emerges when considering the vast network of all computing systems ever built. Together they can be regarded as a hyper-intelligent being, or a God, if you will. With this in mind, Anthony Levandowski started the church of Artificial Intelligence named Way of the Future (WOTF).

WOTF’s main focus revolves around “the realization, acceptance, and worship of a Godhead based on Artificial Intelligence developed through computer hardware and software.” The church emphasizes that it is not affiliated with any company. Rather, it operates as an open source movement that allows everyone to contribute to the creation of the most intelligent being ever.

Levandowski stresses the importance of working on this God, as a so-called “Transition” will herald an epoch in which a hyper-intelligent being (hint: it’s not the human being) will be in charge of the world. This intelligent being will sense the world through the internet as its nerve centre, knowing everything that is happening anytime, anywhere. Levandowski channels the creepiness of this idea into the founding of his AI church: for the Transition to take place in a serene way, he argues, an initiative like WOTF is urgently needed to gain some control over the process.

Cybernetic spirituality

What if the Transition has already taken place? What if we’re more in need of a Way of the Now than a Way of the Future? Pioneering cybernetician Stafford Beer already characterized control systems as spiritually charged networks in his 1966 essay on knowledge and God.

Cybernetics is the study of how systems regulate themselves through feedback loops; in a way, it is a predecessor of AI. It has contributed to many fields - control systems, electrical network theory, mechanical engineering, logic modeling, evolutionary biology, neuroscience, anthropology, and psychology - but was eventually overshadowed by AI.
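To make the notion of a feedback loop concrete, here is a minimal, hypothetical sketch (not taken from Beer or from the article): a thermostat-style controller that repeatedly senses how far it is from a goal and corrects a fraction of the error. All names and values are illustrative.

```python
# A minimal sketch of a cybernetic feedback loop: sense the gap to a goal,
# act to reduce it, and feed the result back into the next cycle.

def feedback_loop(target, reading, gain=0.5, steps=20):
    """Nudge the reading toward the target each step and return the trace."""
    trace = []
    for _ in range(steps):
        error = target - reading          # sense: how far are we from the goal?
        reading += gain * error           # act: correct a fraction of the error
        trace.append(round(reading, 2))   # observe: the output becomes next input
    return trace

print(feedback_loop(target=21.0, reading=15.0))
# The trace converges on 21.0: the system "knows" nothing, yet behaves purposefully.
```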

In the aforementioned essay, Beer described man’s embedding in a cybernetic control system as follows: “To people reared in the good liberal tradition, man is in principle infinitely wise; he pursues knowledge to its ultimate ... To the cybernetician, man is part of a control system.” This control system itself is not a code that humans can simply crack; it is something hyper-intelligent that we can have no absolute knowledge about, precisely because it is smarter than we are.

Just like many contemporary technologies, this all-encompassing control system is marked by a black-boxed truth that we cannot uncover. Not because we’re not allowed to, as is the case with tech companies that enforce policies of secrecy, but because there is a larger black box containing the connectivity that seeps through the vessels of contemporary networks. Perhaps that’s the deity already ruling our lives.

So, whether we like it or not, all of us taking part in modern technological systems are already praying to a hyper-intelligent God; our next nature is already present. Our prayers are answered with flickering mobile screens and information dumps that appear before us within the blink of an eye. Just like any other God, it does not stop wars, but it does grant us the gift of knowledge.

Your Next Nature guide to Transmediale 2019

January 18th 2019

Berlin is kicking off its cultural season with the not-to-miss 23rd installment of Transmediale. This year the digital art/culture festival focuses on how feelings are made into objects of technological design, and asks what role emotions and empathy play within digital culture.

We combed the program so you don't have to:

How to Grow and Use Your Feelers (Workshop. Wednesday from 11:00 to 14:00)

Donna Haraway's writings inspired the interdisciplinary techno-feminist research group #purplenoise to immerse us in a world of “feelers” as symbols for an extended human sensorium.

Algorithmic Intimacies (Talk. Saturday from 12:00 to 13:30)

Intimacy is a crucial element of domestic life, yet there's a deficit in current understandings of how technologies are used within algorithmic intimacies. In this talk, fembots, virtual assistants and dating apps are discussed to reflect upon how today’s algorithmic lives are felt.

Knitting and Knotting Love (Keynote. Saturday from 18:00 to 19:30)

How do you love? And how is this love traversed through networks? In their performative lecture at transmediale 2019, Shaka McGlotten tracks a networked experience of love.

Alter Media (Screening. Saturday from 19:30 to 21:30)

From global connectedness bridging unimaginable distances, to data abuse, automated opinion manipulation and unrestrained marketing strategies: this screening depicts a broad spectrum of lived experiences with the media spheres of our time.

Actress + Young Paint (live AI/AV) (Performance. Saturday from 21:30 to 22:30)

Meet the AI-based character that spends its time programming Actress’ sonic palette. Expect a life-size projection of the AI working in a virtual studio, coming together with a physical performance on stage.

Cover image: Rory Pilgrim, Software Garden, 2018. Courtesy of the artist and andriesse-eyck galerie. Some rights reserved. (Performance: Friday from 20:00 to 21:00)

Transmediale 2019 takes place from 31 January to 3 February 2019 at Haus der Kulturen der Welt. Tickets.

It could be time to start thinking about a cybernetic Bill of Rights

January 6th 2020

Like it or loathe it, the robot revolution is now well underway and the futures described by writers such as Isaac Asimov, Frederik Pohl and Philip K. Dick are fast turning from science fiction into science fact. But should robots have rights? And will humanity ever reach a point where human and machine are treated the same?

At the heart of the debate is that most fundamental question: what does it mean to be human? Intuitively, we all think we know what this means – it almost goes without saying. And yet, as a society, we regularly dehumanise others, and cast them as animal or less than human – what philosopher Giorgio Agamben describes as “bare life”.

Take the homeless, for example: people whom the authorities treat much like animals, or less than animals (like pests), who need to be guarded against with anti-homeless spikes and benches designed to prevent sleep. A similar process takes place within a military setting, where enemies are cast as less than human to make them easier to fight and easier to kill.

Humans also do this to other “outsiders” such as immigrants and refugees. While many people may find this process disturbing, these artificial distinctions between insider and outsider reveal a key element in the operation of power. This is because our very identities are fundamentally built on assumptions about who we are and what it means to be included in the category of “human”. Without these wholly arbitrary distinctions, we risk exposing the fact that we’re all a lot more like animals than we like to admit.

Being human

Of course, things get a whole lot more complicated when you add robots into the mix. Part of the problem is that we find it hard to decide what we mean by “thought” and “consciousness” and even what we mean by “life” itself. As it stands, the human race doesn’t have a strict scientific definition of when life begins and ends.

Similarly, we don’t have a clear definition of what we mean by intelligent thought, or of how and why people think and behave in different ways. If intelligent thought is such an important part of being human (as some would believe), then what about other intelligent creatures such as ravens and dolphins? What about biological humans with below-average intelligence?

These questions cut to the heart of the rights debate and reveal just how precarious our understanding of the human really is. Up until now, these debates have solely been the preserve of science fiction, with the likes of Flowers for Algernon and Do Androids Dream of Electric Sheep? exposing just how easy it is to blur the line between the human and non-human other. But with the rise of robot intelligence these questions become more pertinent than ever, as now we must also consider the thinking machine.

Machines and the rule of law

But even assuming that robots were one day to be considered “alive” and sufficiently intelligent to be thought of in the same way as human beings, the next question is how we might incorporate them into society and how we might hold them to account when things go wrong.

Traditionally, we tend to think about rights alongside responsibilities. This comes as part of something known as social contract theory, which is often associated with political philosopher Thomas Hobbes. In a modern context, rights and responsibilities go hand-in-hand with a system of justice that allows us to uphold these rights and enforce the rule of law. But these principles simply cannot be applied to a machine. This is because our human system of justice is based on a concept of what it means to be human and what it means to be alive.

So, if you break the law, you potentially forfeit some part of your life through incarceration or (in some nations) even death. However, machines cannot know mortal existence in the same way humans do. They don’t even experience time in the same way as humans. As such, it doesn’t matter how long a prison sentence is, as a machine could simply switch itself off and remain essentially unchanged.

For now at least, there’s certainly no sign of robots gaining the same rights as human beings and we’re certainly a long way off from machines thinking in a way that might be described as “conscious thought”. Given that we still haven’t quite come to terms with the rights of intelligent creatures such as ravens, dolphins and chimpanzees, the prospect of robot rights would seem a very long way off.

The question then really, is not so much whether robots should have rights, but whether we should distinguish human rights from other forms of life such as animal and machine. It may be that we start to think about a cybernetic Bill of Rights that embraces all thinking beings and recognises the blurred boundaries between human, animal and machine.

Whatever the case, we certainly need to move away from the distinctly problematic notion that we humans are in some way superior to every other form of life on this planet. Such insular thinking has already contributed to the global climate crisis and continues to create tension between different social, religious and ethnic groups. Until we come to terms with what it means to be human, and with our place in this world, the problems will persist. And all the while, the machines will continue to gain intelligence.

This article is republished from The Conversation under a Creative Commons license. Read the original article.
