480 results for “Wild Systems”

Wild Systems: anthropogenic processes that go feral. A computer virus, for example, continues running long after its programmer has had any direct role in how it functions or what computers it infects.

Four visions for the future of public transport

Marcus Enoch
November 7th 2019

The way people get around is starting to change, and as a professor of transport strategy I do rather wonder if the modes of transport we use today will still be around by the turn of the next century.

Growing up, my favourite book was a children’s encyclopaedia first published in 1953. One double page spread featured an annotated cityscape, showing all aspects of the built environment – most of which we would still be familiar with now. The various …

This exhibition looks at how robots are changing the world we live in

NextNature.net
November 7th 2019

Delve into the science and fiction of robots at V&A Dundee’s latest exhibition: Hello, Robot. Contemplate the existence of robots and how they have both shaped, and been shaped by, technology, culture and design. See for yourself how the boundaries between human and machine have become increasingly blurred; understand how we got here, and take note of the increasing power of designers to influence the future of such technologies.

Hello, Robot has over 200 objects on display, and takes visitors …

How AI is reshaping electronic music

Linda Valenta
September 12th 2019

The idea that AI can compose electronic music may sound a little off to some. It raises essential questions about creativity as a product exclusive to humans: can an AI be creative? Can it be musical? Can it compete with human-made melodies? Does it need to?

More and more, AI has set foot in the realm of the creative industries, from an AI writing the next GoT book to IBM’s Watson creating a trailer for a non-existent sci-fi thriller. And that’s …

Microbiocene: A microbiological archeology of the future

Linda Valenta
July 11th 2019

In configuring our next nature, artists and scientists explore new languages that move beyond the Anthropocene - the era of human beings. These semantics would bridge the gap between mankind and technology, but also between humans and other species, establishing a cosmological understanding of life. Within this endeavour, bio-artists Amanda Baum and Rose Leahy delved into more-than-human narratives by creating a monument for the Microbiocene: the age of the microbial.

The Microbiocene is an epoch we’ve always lived in and …

AI: More than human

NextNature.net
May 24th 2019

What makes us human? And why do we sometimes fear artificial intelligence? And what about technological singularity - the moment in time when artificial intelligence outperforms human intelligence? The increasing yet often invisible implementation of AI in our daily life (think voice assistants and deep-learning algorithms) raises more questions than answers. Should we be defensive, or welcome this new technology as part of our human evolution?

The recently opened exhibition AI: More than Human at the Barbican in London invites …

Why Next Nature Network is hiring an army of bots

NextNature.net
April 11th 2019

As we showed with HUBOT, we can use new technologies and robotics to make our work more enjoyable, interesting and humane. Aside from our speculative jobs, a lot of robotic companions already exist. Think about the robotic fish that observes marine life, the breathing pillow that helps you sleep and robotic arms that assist the surgeon. So why not join forces at Next Nature Network?

Not working against, but with robots!

Over the past few years, we have been actively …

The religion named Artificial Intelligence

Linda Valenta
April 5th 2019

Would you pray to a robot deity? A group of Japanese Buddhists is already doing so. Meet Mindar, the robot divinity shaped after the Buddhist Goddess of Mercy, also known as Kannon. Kyoto-based Kodaiji Temple recruited Osaka University’s head of intelligent robotics Hiroshi Ishiguro to design this million-dollar robot. Its purpose? To encourage youngsters to engage with Buddhism again.

Mindar’s silicone face represents a gender neutrality that aims to move beyond human representation. Supplemented with aluminium body parts, …

Your Next Nature guide to Transmediale 2019

NextNature.net
January 18th 2019

Berlin is kicking off its cultural season with the not-to-miss 23rd installment of Transmediale. This year the digital art/culture festival focuses on how feelings are made into objects of technological design, and asks what role emotions and empathy play within digital culture.

We combed the program so you don't have to:

How to Grow and Use Your Feelers (Workshop. Wednesday from 11:00 to 14:00)

Donna Haraway's writings inspired the interdisciplinary techno-feminist research group #purplenoise to immerse us in a world …

Managing the data deluge: Twitter as a tool for ecological research

Marianne Messina
December 27th 2018

As early as 2009-10, researchers were looking at Twitter data mining as a way to predict the incidence of flu. At the time, the H1N1 virus, or “swine flu,” had made the jump from swine to humans and arrived in the United States. The Centers for Disease Control and Prevention (CDC) took notice and began sponsoring research.

Eight years later, data scientists Alessandro Vespignani and his team have developed statistical models for crunching Twitter data in flu forecasting that can predict, six …

How self-driving cars will change our sex lives

Bryan Clark
November 22nd 2018

It’s clear that driverless cars will revolutionize the way we get from Point A to Point B. Perhaps less obvious is how they’ll change our sex lives…

Four visions for the future of public transport

Marcus Enoch
November 7th 2019

The way people get around is starting to change, and as a professor of transport strategy I do rather wonder if the modes of transport we use today will still be around by the turn of the next century.

Growing up, my favourite book was a children’s encyclopaedia first published in 1953. One double page spread featured an annotated cityscape, showing all aspects of the built environment – most of which we would still be familiar with now. The various modes of transport illustrated – trains, buses, lorries, taxis, motorcycles, bikes, pedestrians and private cars – still work together as a system in fundamentally the same ways.

But a whole range of possible (though not inevitable) societal and technological changes could revolutionise how we travel in the coming decades. These include large-scale responses to the climate change agenda and energy sourcing and security; shifting demographic trends (such as growing numbers of elderly people); the development of the collaborative economy; the growing use of big data; and the apparent inevitability of driverless cars.

To examine what future urban transport systems might look like, I recently directed a future-gazing project for New Zealand’s Ministry of Transport exploring how people might be travelling in the year 2045. I helped develop four scenarios, along two axes of change.

The first axis considered automation – at one end, vehicles are still driven much as they are today (partial automation). At the other, they’re driverless (full automation). The second axis related to how dense cities could become – one future where the population is more dispersed (like Los Angeles) and another where it is concentrated at a higher density (more like Hong Kong). With these axes in mind, I generated four possible futures for public transport, which could play out in cities across the world.

Choose your fighter. By Marcus Enoch, Author provided
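Read one way (an illustrative mapping of my own, based on the descriptions that follow, not the study's coding), the four scenarios are simply the cells of that two-by-two matrix:

# Illustrative sketch only: the 2x2 scenario matrix, with axis labels
# paraphrased from the article rather than taken from the study itself.
scenarios = {
    # (automation level, urban density) -> scenario
    ("partial", "dense"):     "1. Shared shuttles",
    ("partial", "dispersed"): "2. Mobility market",
    ("full",    "dense"):     "3. Connected corridors",
    ("full",    "dispersed"): "4. Plentiful pods",
}
print(scenarios[("full", "dispersed")])  # -> 4. Plentiful pods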

1. Shared shuttles

In the “shared shuttle” city, demand-responsive minibuses, Uber-style taxis and micro-modes – such as shared bicycles, electric bikes and hoverboards – that cover the “last mile” to your destination are widespread. Hiring these different forms of transport is simple, thanks to seamless booking and payment systems and a thriving entrepreneurial spirit among a range of commercial, social and government transport providers. Meanwhile, new environmental regulations mean that owning a car is more expensive than it used to be, and private vehicles are restricted to the suburbs.

Flexibility is a core feature of this scenario, with vehicles and services that adjust to the needs of individuals, and street space that continually adapts to meet the needs of the city as a whole. There’s also a collaborative ethos, reinforced by the development of a more compact and high-density city, while progress toward full automation has been slow because of safety and privacy concerns.

2. Mobility market

Private cars still dominate urban transport in the mobility market scenario. Many citizens live and often work in dispersed, low-density suburban areas, since city-centre housing became too expensive for most to afford. Fewer people walk and cycle, because of the long distances involved. And the use of public transport has declined, since less dense transport networks mean there are fewer viable routes, though a limited network of automated trains and buses is still used for trips to the city centre.

Car use has fallen somewhat since the 2010s, because “active management” measures – such as pre-bookable fast lanes and tolls – are now necessary to control congestion, despite the completion of a sizeable road building programme in the recent past.

Instead, commercially provided pre-paid personalised “mobility packages” are helping to stimulate the use of a whole range of shared mobility options, such as car-pooling, bike hire and air taxi schemes. These now account for around a quarter of all journeys.

3. Connected corridors

Society in this high-tech, highly urbanised world of connected corridors is characterised by perceptive but obedient citizens who trade access to their personal data in return for being able to use an extremely efficient transport system. Physically switching between different services or even different modes of travel is hassle free, thanks to well designed interchange points, and fully integrated timetabling, ticketing and information systems.

For instance, travellers might walk, e-cycle or take a demand-responsive minibus to a main route interchange, then board a high frequency rail service to get across town and finally take a shared autonomous taxi to their destination. Each will be guided by a personalised, all-knowing “travel ambassador” app on their smartphone or embedded chip, which will minimise overall travel times or maybe maximise sightseeing opportunities, according to their preferences.

Private cars are not really needed. People trust technology to deliver inexpensive and secure transport services and appreciate living close to work, family and friends.

4. Plentiful pods

In this future, fleets of variously sized driverless pods now provide around three-quarters of those journeys that still need to be taken across the low-density, high-tech city. These pods have largely replaced most existing public transport services, and the vast majority of privately owned cars.

People do still walk or cycle for some shorter trips. But pods are so convenient, providing affordable point-to-point journeys for those not satisfied by virtual interactions. Passengers can pay even less, if they agree to share with others. Pods are also fully connected to the internet, and are priced and tailored to meet customer needs. Ultimately, pods give people the freedom to work, learn or live where the weather is best or the houses are cheapest.

My research did not pass judgement as to which scenario should be pursued. But it did conclude that public transport will need to evolve to meet future challenges, and that the role of government will still be of key importance going forward, no matter which path is chosen. Personally though, if forced to choose, I think I’d favour a shared shuttle future more than the others - it just seems more sociable.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Cover image: Renault's Float autonomous car

This exhibition looks at how robots are changing the world we live in

NextNature.net
November 7th 2019

Delve into the science and fiction of robots at V&A Dundee’s latest exhibition: Hello, Robot. Contemplate the existence of robots and how they have both shaped, and been shaped by, technology, culture and design. See for yourself how the boundaries between human and machine have become increasingly blurred; understand how we got here, and take note of the increasing power of designers to influence the future of such technologies.

Hello, Robot has over 200 objects on display, and takes visitors through four stages of robot influence and evolution. Throughout, the exhibition poses provocative questions, and will require you to consider the past, present and future of robotics like never before. Reflection of this kind may prove essential in answering the question: how will robots exist in our next nature?

What? An exploration of robots in a human world
Where? V&A Dundee, Scotland (UK)
When? Now, until 09 February 2020

How AI is reshaping electronic music

Linda Valenta
September 12th 2019

The idea that AI can compose electronic music may sound a little off to some. It raises essential questions about creativity as a product exclusive to humans: can an AI be creative? Can it be musical? Can it compete with human-made melodies? Does it need to?

More and more, AI has set foot in the realm of the creative industries, from an AI writing the next GoT book to IBM’s Watson creating a trailer for a non-existent sci-fi thriller. And that’s not where it ends: the music industry also got involved when that same Watson was used by award-winning producers to create country rap, not to mention a Eurovision song created with machine learning.

Electronic music, too, is affected by the algorithmic technologies that revolutionize the way humans relate to the arts. As a discipline that has technology at its very core, electronic music is bound to cross paths with the ways of AI. From DJing to producing and from contriving DJ names to directing music videos, algorithmic agency is growing stronger each day.

The subsequent question is how humans relate to these technologies and how the arts and AI can be treated as a symbiosis, rather than a dystopian binary. Put differently, how can we embrace AI as an instrument to work with, rather than an autonomous entity overruling human creativity?

https://www.youtube.com/watch?v=4MKAf6YX_7M

Music has always been technological

There has always been a link between music and technology: music essentially revolves around counting and measuring rhythm, as much as it relies on instruments.

Clapping their hands, our early ancestors used their bodies as instruments to create rhythmic music. When our predecessors found that they could strike sticks or stones together to enhance the beat without hurting their hands, drums were invented.

Fast-forward to the 20th century, when elaborate drum kits emerged at the intersection of African-American brass bands and western instruments. The technology of the bass pedal made it possible to use both hands and feet to produce sound, and the drum kit as we know it evolved.

The instrument was further technologized when companies like Korg and Roland started producing drum machines on a massive scale. The genres that emerged from these instruments diverged, but essentially, both the drum kit and drum machine serve as a technology to produce the rhythms and sounds that we know as music.

Are algorithms the next DJs?

Along the same lines, DJing changed when vinyl decks were complemented by USB-driven CDJs. Though the technologies changed, the art of DJing remains present – just in different ways.

In this day and age, AI is the upcoming technology broadening the horizon of (electronic) music. On a day-to-day basis, algorithms are already silently ruling our music taste through auto-playlists like the ones developed by YouTube, Spotify and Apple Genius. In a way, algorithms are already our next DJs.

But not only are these algorithms able to curate music to our liking; they are also able to flawlessly mix our favorite tracks together. Recently, a Spotify playlist was born that tests an automixing feature with the help of AI. The Drum And Bass Fix playlist seamlessly beatmatches two tracks when shuffle is switched on.

Not into drum and bass? Then try curating your own beat-matched set or mashup by using Rave DJ. This online application lets you upload a YouTube or Spotify playlist, then uses algorithms to create a smooth mix of even the most obscure track combinations.

Naturally, tech giant Google has also engaged with algorithmic advances within the electronic music industry, developing an AI synth named NSynth. This open-source synthesizer uses Google’s neural networks to reproduce the qualities of sounds and instruments, which feed its algorithms. Though based on neural networks, it actually comes as a hardware product with a touchscreen pad.

https://www.youtube.com/watch?v=ZsZc4Q_eDk4

Will AI outmix humanity?

These tools may seem futuristic, but plenty of artists are already utilizing AI to produce music. At this year’s Transmediale, UK DJ and producer Actress even granted his AI offspring complete artistic agency by giving it a stage name: Young Paint. Together, they enacted a live audiovisual performance mostly based on real-time improvisation, but they also captured some collaborative ventures on a mini-album via his new label Werk__Ltd.

According to electronic musician Olle Holmberg, it is just a matter of time before we will be following AI DJs and producers on social media, after attending our favorite algorithmically driven gigs – which is basically already happening with the advent of virtual influencers.

Based on the semantic traits that can be found in Hardwax’s database of DJ names, Holmberg recently published a list of DJ names generated by an AI. Though a DJ name might seem trivial, it does show that AI is capable of mimicking and further developing our club experience based on our current ideas of what clubbing should be like.

https://www.youtube.com/watch?v=v_4UqpUmMkg

Team human

There is an uncanny objection to these kinds of technological advances, assuming they would violate our authentic ‘humanness’, when in fact it is in our very human nature to be technological. Speaking, writing, reading, counting, singing – these are all cultural technologies; so are DJing and producing.

The cycle that drove us from drum kits to drum machines is the same evolutionary force driving humans to interact with AI in creating new musical works of art. Within this framework, AI basically is our next nature’s cultural technology.

Scholar and electronic music composer Holly Herndon, who built an AI recording system to help with her latest album, addresses the pervasive narrative in which technology is dehumanizing and instead proposes to ‘run towards’ technology, but on her own human terms.

This brings us to the crucial debate revolving around AI: we often forget how algorithms are technologies developed by humans. If algorithms become dehumanizing vehicles, they can only be so because the human system made them that way. 

Microbiocene: A microbiological archeology of the future

Linda Valenta
July 11th 2019

In configuring our next nature, artists and scientists explore new languages that move beyond the Anthropocene - the era of human beings. These semantics would bridge the gap between mankind and technology, but also between humans and other species, establishing a cosmological understanding of life. Within this endeavour, bio-artists Amanda Baum and Rose Leahy delved into more-than-human narratives by creating a monument for the Microbiocene: the age of the microbial.

The Microbiocene is an epoch we’ve always lived in and will continue to live in, as the vibrant matter on planet Earth emerged and thrives through microbial life, e.g. bacteria. In collaboration with the Royal Netherlands Institute for Sea Research (NIOZ), Baum & Leahy dove into the deep time of microfossil molecules from Emiliania huxleyi, which are found in ancient sea sediment. The result is an award-winning symbiosis between art and science, as well as an artefact for the ecologies that are yet to be embraced by the human species.

We caught up with the duo and spoke about the philosophical matter pushing their piece to emerge, and the microbial matter it is made of.

"The installation envisions a future archaeological site, thousands of years from now."

You created the ‘Microbiocene’ piece for the Bio Art and Design Award last year. Tell us about the creative process of the project; did you already have in mind this result or did it evolve from something completely different?

The Microbiocene as an overarching concept is something we’d been thinking about for a while - over the past couple of years almost all our projects have become about mapping out the Microbiocene - the ancient, ongoing, and future era of microorganisms. We’ve explored this through various lenses: spiritual, material, ritualistic, ancestral.

When applying to the BAD Awards, we were immediately inspired by the research from the Department of Marine Microbiology and Biogeochemistry at The Royal Netherlands Institute for Sea Research (NIOZ). The scientists at NIOZ work with sea sediment containing microbial fossil molecules, which hold information about past environmental conditions, both recent and ancient.

NIOZ’s research combined with this cultural, philosophical framework gave birth to the idea of creating a form of ‘biological Rosetta Stone’ - a relic being found, and a language translated, to discover information about an ancient (invisible) civilisation.

Baum & Leahy, Microbiocene: Ancient ooze to future myths, 2018, Photo by Boudewijn Bollmann, MU ArtSpace

Inspired by the aspect of deep time, the installation envisions a future archaeological site, thousands of years into the future, where the Microbiocene monument is found. It is inscribed with myths of the Microbiocene, a microbe-centric (re)telling of history and future on Earth. These stories were based on information we unearthed from microbial fossils in sea sediment dating from the present to nearly 10,000 years ago. We then developed this data into narratives with our collaborating scientists, projecting different future scenarios.

The idea was to create a narration that was informed both by microbe and mammal.

You used ‘microglyphs’ in your piece —a microbe-centric language system co-created by the artists and scientists— how did you develop this language? Are the shapes imprinted on your works also literally found under the microscope?

The microglyphs were created with input from scientific and cultural associations, as well as free associations between us and the scientists. Some of the symbols are more literal – like a double bond in a molecule meaning cold, or Ehux being a graphic representation of how it looks – whilst some are more complex, like the Microbiocene microglyph, which refers to life beginning on Earth.

Whilst creating the microglyphs we discussed the multitude of forms that language takes, and the inherent human desire to traverse their boundaries – across cultures, disciplines and species. From the Rosetta Stone to art-sci collaborations to alien communication attempts, the wish to understand, and to translate, is constant: we all dream of babel fish.

Baum & Leahy, Microbiocene: Ancient ooze to future myths, 2018, Photo by Boudewijn Bollmann, MU ArtSpace

By creating a visual language for the Microbiocene, we attempted to move towards a more multimodal form of communication with the potential to be interpreted in various ways by anyone encountering it. Each of the microglyphs has multiple meanings, which change responsively with the surrounding microglyphs. Different compositions of the microglyphs explore movements within the meaning of the sentences.

The microglyphs are an initial iteration into working with the materiality of language, which we continue to explore in workshops, and our other projects. By consciously molding language, or sign making, into new biologically informed structures, we begin to weave our mammalian minds into the Microbiocene.

What kind of scientists did you collaborate with?

We collaborated with biogeochemists from the Royal Netherlands Institute for Sea Research – Julie Lattaud, Gabriella Weiss and Laura Schreuder. They study the alkenone biomarkers, especially sturdy molecules, left by microorganisms in sea sediment. This sediment is collected in long cores, which hold a cross section of earth from the seabed and below, with the top layer being the most recent and the bottom being from the most distant past.

"We like to work with scientists as partners on an equal basis of passion for understanding."

Their lab work is wonderfully intimate with the sediment that is collected. The cores are opened from a long tube, and incredibly distinct lines are revealed along the earth core, indicating thousands of years of life being lived before turning to matter.

Do you think you could have created this piece without this collaboration? What role does and should science play in art? Where does science stop and art begin?

The idea of the piece itself grew out of and was continuously informed by the scientific research, so it would have been another piece without the scientific collaboration. Like any other relationship, the symbiosis between art and science can and should take many forms, from the abstract and experimental to the more systematic.

At this point in time, we see not only creative potential but also a certain urgency at the intersection of ecological transformation, emerging technologies, and an increased sensitivity and awareness towards the planetary web of life.

We like to work with scientists as partners on an equal basis of passion for understanding, working with and caring for living systems - although with very different means of research and expression. Before restricting ourselves within established epistemological systems, we try and create a nurturing space of shared curiosity, where ideas and visions aren’t limited to our individual areas of expertise.

Baum & Leahy, Microbiocene: Ancient ooze to future myths, 2018, Photo by Max Kneefel, MU ArtSpace

Do you think that art is stuck in the Anthropocene? Is art too focused on human experience?

We think it’s important that art happens across many ‘cenes’ – and that it’s also urgently important to reflect on our lives in the Anthropocene. Yet we are interested in exploring an alternative – one that is generative, slimy and messy, and optimistic about the adaptable forces of life. Microbiocene is just one. We continuously draw inspiration from Donna Haraway’s ‘Chthulucene’. Nurturing diversity and moving away from dominant narratives of the Anthropocene is what we find urgently needed – within all fields, not just artistic ones.

Whilst creating Microbiocene we were thinking a lot about the magnitude of microbial experience that has come before us, and how this has had slow yet defining atmospheric and evolutionary impacts on the Earth, setting out the conditions for terran life to thrive. In contrast, humans’ time on Earth is becoming very much defined by rapid changes, caused by a few and resulting in wider impacts for all – some much more than others. We believe a more microbial approach could trigger the emergence of new systems of adaptation and cohabitation.

"The monument is raised to mark and celebrate how humans learn to become more microbial in their planetary impact."

By looking at the history of time on Earth through the perspective of the Microbiocene, we hoped to condense this microbial evolutionary perspective into a material and sensorial experience able to inspire new ideas and trajectories challenging current anthropocentric worldviews. The Microbiocene monument is raised to mark and celebrate how humans learn to become more microbial in their planetary impact, focusing on more-than-human adaptive strategies and experience as a worthy alternative. For us, drawing the narrative out of information in the material remains of microbial experience was a way to do this.

Baum & Leahy i.c.w. Sofie Birch and Pernille Kjær, Interterrestrials, 2019
Baum & Leahy i.c.w. Sofie Birch and Pernille Kjær, Interterrestrials, 2019

What role does materiality play in your piece and how do you elevate a materiality from human to more-than-human?

It was an incredible opportunity for us to use the sea sediment from our studies as part of the material in the sculpture.

The particular sediment we were working with is called calcareous ooze, meaning it contains a large proportion of the skeletal remains of coccolithophores. This included Emiliania huxleyi (Ehux) - the microorganism we were studying within the sediment - which has an incredible, vibrant materiality to it.

It is a single-celled alga covered in calcium carbonate-rich platelets, which – with the help of deep time – transmutes into materials such as chalk and lime. The build-up of these microscopic organisms on the seabed over long periods has an immense, macroscopic effect, as expressed in the White Cliffs of Dover and Møns Klint.

When understood as the material result of numerous coccolithophore bodies and existence, this coastal landscape becomes a more-than-human monument in itself. We wished to translate the immensity of this deep time within this lively material we had in the lab.

"Microbes are in a way a ‘gateway’ to the unknowns of the universe."

Part of what we find fascinating about the microbial world is the (to us) mysterious material liminality - microbes are in a way a ‘gateway’ to the unknowns of the universe, which we know makes up more than 90% of our perceived reality.

We can’t see the microbes with our naked eye, yet electron microscopy and similar technologies reveal how alive, vibrant and ‘material’ they are. We see them as active, reproductive, communicative, busy organisms, just like ourselves.

This many-faceted relationship between the microbes’ ubiquitous, ghostly presence and the very material reality of their lives, which resonates with our human experience, continues to puzzle and inspire us.

Even more incomprehensible invisible organic elements like bacteriophages, proteins, DNA, molecules, atoms, dark matter, down to the strange world of quantum mechanics, seem more ‘approachable’ when we think of them through the universal, microbial gateway.

Baum & Leahy i.c.w. Naja Ankarfeldt, The Red Nature of Mammalga, 2018
Baum & Leahy i.c.w. Naja Ankarfeldt, The Red Nature of Mammalga (detail), 2018

Philosopher Timothy Morton wrote that we have to think in terms of durations, meaning we have to create a ‘deep time’ to ‘think ecologically’. Are you perceiving the world differently in terms of temporality since you made this work? Do you experience a more cosmological time as opposed to a human history?

When working with material such as the sediment cores that have such an evident history, it’s impossible not to become incredibly aware of and sensitive to the vast periods of time on Earth that have preceded us.

Our collaborating scientists work with these kinds of time scales every day, and so are used to thinking about time on Earth in terms of epochs, rather than through the length of their own human experience.

This was intriguing for us, and something we were trying to approach in Microbiocene not only as an installation, but also as a framework. Indeed, we believe that if humans could enter a mindset of deep time, we would see a big shift in our ways of producing and distributing materials. If humans could think in terms of the length of time that a plastic bottle will be on Earth, rather than the length of time we experience it in our lives, we surely wouldn’t be producing and distributing bottles in this way.

Yet, whilst the Microbiocene entails this very cosmological way of thinking, we’re not sure we can claim to have transcended into an everyday cosmological experience of time. Despite our best efforts, we’re still just too darn human for that.

Baum & Leahy, Cellular Sanctum, 2018
Baum & Leahy, Cellular Sanctum, 2018

You both have a background in design - do you think different aesthetics in our everyday surroundings will amount to different environmental awareness? And if so, what’s the potential role of aesthetics in environmental awareness?

In our work, we combine tactile, sensorial materiality with collective, ceremonial practices such as meditation, ritual and writing to practice and nurture a symbiosis between matter (microbial, mammalian, etc.) and mind. We aim to bring focus to how internal and external realities are interrelational and constantly shaping each other. By materialising a speculative scenario, ongoing tendencies can be harnessed and the actual long-term realisations can emerge.

We have both been inspired by biophilic design principles – how biomorphic form, aesthetic, and material can be used to strengthen, encourage, and practice our connection with other species and ecologies.

Recently we’ve been thinking about ‘microbiophilia’, and how to stir emotions for organisms we can’t see, yet live all around, on, and within us. In previous pieces such as Cellular Sanctum (2018), and The Red Nature of Mammalga (in collaboration with Naja Ankarfeldt, 2018), we created tactile biomorphic, microbial forms, microbial drinks, and written participatory chants, to create a tactile and sensual experience.

Through these aesthetic experiences we aim to seed a heightened awareness of the parallel microscopic world within those who experience them.

More Microbiocene?

https://vimeo.com/322794918

Cover photo by Max Kneefel.

AI: More than human

NextNature.net
May 24th 2019

What makes us human? And why do we sometimes fear artificial intelligence? And what about technological singularity - the moment in time when artificial intelligence outperforms human intelligence? The increasing yet often invisible implementation of AI in our daily life (think voice assistants and deep-learning algorithms) raises more questions than answers. Should we be defensive, or welcome this new technology as part of our human evolution?

The recently opened exhibition AI: More than Human at the Barbican in London invites you to explore your relationship with artificial intelligence. Curators Suzanne Livingston and Maholo Uchida have invited artists, scientists and researchers to demonstrate AI’s potential to revolutionize our lives. Experience the capabilities of AI in the form of cutting-edge research projects by DeepMind, Massachusetts Institute of Technology (MIT) and Neri Oxman; and interact directly with exhibits and installations to experience the possibilities first-hand.

Take the chance to dive into the immersive installation What a Loving and Beautiful World by artist collective teamLab. The visuals consist of Chinese characters and natural phenomena triggered by interaction. When a visitor touches a character, the world contained inside that character unfolds on the walls.

AI, Ain’t I a Woman? is an exploration of AI from a political perspective. Joy Buolamwini is a poet of code who uses art and research to illuminate the social implications of artificial intelligence. In this case, she lays bare the racial bias of facial recognition.

Inspired by the Dutch ‘tulip mania’ of the 1630s, Anna Ridler draws parallels between tulips and the current mania around cryptocurrencies. Created by an AI, the film shows blooming tulips controlled by the bitcoin price, changing over time to show how the market fluctuates. The project echoes 17th-century Dutch still-life flower paintings which, despite their supposed realism, are imagined, because the flowers in them could never bloom at the same time. Does cryptocurrency provide us with a similar imagined reality?

Visit the Barbican in London to see these projects and much more! Expect your preconceptions to be challenged and discover how this technology impacts our human essence from historical, scientific, social and creative perspectives.

AI: More than Human is now on show at Barbican Centre in London until 26 August 2019.

Why Next Nature Network is hiring an army of bots

NextNature.net
April 11th 2019

As we showed with HUBOT, we can use new technologies and robotics to make our work more enjoyable, interesting and humane. Aside from our speculative jobs, a lot of robotic companions already exist. Think about the robotic fish that observes marine life, the breathing pillow that helps you sleep and robotic arms that assist the surgeon. So why not join forces at Next Nature Network?

Not working against, but with robots!

Over the past few years, we have been actively growing a meaningful community on Instagram. Ever since 2016, the Instagram feed has been curated by an algorithm that holds the magical power to decide what you see. And, in doing so, the robots at Instagram decide what our online community gets to see.

This doesn’t sound like a humane technology to us.

Also, Instagram does not allow us to use the so-called 'swipe up' functionality that would lead you directly to our stories on Nextnature.net.

So why do we keep using Instagram? Because it also brings us joy and convenience. Social media platforms connect us to like-minded people all over the world, and we value your engagement! It’s not all bad, but there is a lot of room for improvement.

We wish we could change Instagram’s algorithms and functionalities, but unfortunately we do not have that power (yet). What we can do is make life easier for you, so we decided to hire an army of robots to come to the rescue.

10,000 new colleagues

Meet our Robotic Followers. They are responsible for playing Instagram’s algorithms and unlocking new features. If they do their job well, you won’t even notice they’re there. Thanks to their work, we can put our time and energy into creating great content for you. Because that’s what we love to do. Let’s make inhumane technology more humane!

The religion named Artificial Intelligence

Linda Valenta
April 5th 2019

Would you pray to a robot deity? A group of Japanese Buddhists is already doing so. Meet Mindar, the robot divinity shaped after the Buddhist Goddess of Mercy, also known as Kannon. Kyoto-based Kodaiji Temple recruited Osaka University’s head of intelligent robotics Hiroshi Ishiguro to design this million-dollar robot. Its purpose? To encourage youngsters to engage with Buddhism again.

Mindar’s silicone face represents a gender neutrality that aims to move beyond human representation. Supplemented with aluminium body parts, its enmeshment with human construction becomes evident. And when Mindar chants the Heart Sutra, an eerie resemblance to Japan’s vocaloid Hatsune Miku comes to mind.

God is a Google

Mindar is not the only religious robot to have been manufactured. In the past, we spotted robot monk Xian'er informing visitors of the Longquan temple, near Beijing. Creations like Mindar and Xian'er can be understood as metaphors for the way humankind worships artificial intelligence: we worship contemporary technologies as if they were gods. But these creations are part of a greater scheme, as the complexity of artificial intelligence forms a more-than-human network that reminds me of spirituality.

Think of a hyper-intelligent computing database like Google. Its systems are omniscient; they are built to know everything. They compress knowledge into a time-space continuum of all recorded human knowledge and activities. According to media scholar John Durham Peters, Google can therefore be understood as a lo-fi universe - a God.

The Church of AI

An even greater analogy comes into existence when considering the vast network of all computing systems ever built. They can be regarded as a hyper-intelligent being - or a God, if you will. With this in mind, Anthony Levandowski started the church of Artificial Intelligence named Way of the Future (WOTF).

WOTF’s main focus revolves around “the realization, acceptance, and worship of a Godhead based on Artificial Intelligence developed through computer hardware and software.” The church emphasizes that it is not affiliated with any company. Rather, it operates as an open source movement that allows everyone to contribute to the creation of the most intelligent being ever.

Levandowski stresses the importance of working on this God, as a so-called “Transition” will herald an epoch in which a hyper-intelligent being (hint: it’s not the human being) will be in charge of the world. This intelligent being will sense the world through the internet as its nervous centre, knowing everything that is happening anytime, anywhere. Levandowski converts the creepiness of this idea into the rationale for his AI church: for the Transition to take place in a serene way, an initiative like WOTF is urgently needed to gain more control over the process.

Cybernetic spirituality

What if the Transition has already taken place? What if we’re more in need of a Way of the Now, rather than a way of the future? Pioneering cybernetician Stafford Beer already characterized control systems as spiritually charged networks in his 1966 essay about knowledge and God.

Cybernetics is about the way systems work in feedback loops; in a way, it is a predecessor of AI. It has contributed to many fields - control systems, electrical network theory, mechanical engineering, logic modeling, evolutionary biology, neuroscience, anthropology, and psychology - but was overshadowed by AI at some point.

In the aforementioned essay, Beer described man’s embedding in a cybernetic control system as follows: “To people reared in the good liberal tradition, man is in principle infinitely wise; he pursues knowledge to its ultimate... To the cybernetician, man is part of a control system.” This control system itself is not a code that can simply be cracked by humans - it is something hyper-intelligent that we can have no absolute knowledge about, exactly because it is smarter than we are.

Just like many contemporary technologies, this all-encompassing control system is signified by a black-boxed truth that we cannot uncover. Not because we’re not allowed to, as is the case with tech companies that reinforce policies of secrecy. Rather, there is a larger black box containing the connectivity that seeps through the vessels of contemporary networks. Perhaps that’s the deity already ruling our lives.

So, whether we like it or not, all of us taking part in modern technological systems are already praying to a hyper-intelligent God; our next nature is already present. Our prayers are answered with flickering mobile screens and information dumps that appear before us within the blink of an eye. Just like any other God, it does not stop wars, but it does grant us the gift of knowledge.

Your Next Nature guide to Transmediale 2019

NextNature.net
January 18th 2019

Berlin is kicking off its cultural season with the not-to-miss 23rd installment of Transmediale. This year the digital art/culture festival focuses on how feelings are made into objects of technological design, and asks what role emotions and empathy play within digital culture.

We combed the program so you don't have to:

How to Grow and Use Your Feelers (Workshop. Wednesday from 11:00 to 14:00)

Donna Haraway's writings inspired the interdisciplinary techno-feminist research group #purplenoise to immerse us in a world of “feelers” as symbols for an extended human sensorium.

Algorithmic Intimacies (Talk. Saturday from 12:00 to 13:30)

Intimacy is a crucial element of domestic life, yet there's a deficit in our current understanding of how technologies shape it. In this talk, fembots, virtual assistants and dating apps are discussed to reflect upon how today’s algorithmic lives are felt.

Knitting and Knotting Love (Keynote. Saturday from 18:00 to 19:30)

How do you love? And how does this love traverse networks? In their performative lecture at Transmediale 2019, Shaka McGlotten tracks a networked experience of love.

Alter Media (Screening. Saturday from 19:30 to 21:30)

From global connectedness bridging unimaginable distances to data abuse, automated opinion manipulation and unrestrained marketing strategies. This screening depicts a broad spectrum of lived experiences with the media spheres of our time.

Actress + Young Paint (live AI/AV) (Performance. Saturday from 21:30 to 22:30)

Meet the AI-based character that spends its time programming Actress’ sonic palette. Expect a life-size projection of the AI working in a virtual studio, coming together with a physical performance on stage.

Cover image: Rory Pilgrim, Software Garden, 2018. Courtesy of the artist and andriesse-eyck galerie. Some rights reserved. (Performance: Friday from 20:00 to 21:00)

Transmediale 2019 takes place from 31 January to 3 February 2019 at Haus der Kulturen der Welt. Tickets.

Managing the data deluge: Twitter as a tool for ecological research

Marianne Messina
December 27th 2018

As early as 2009-10, researchers were looking at Twitter data mining as a way to predict the incidence of flu. At the time, the H1N1 virus, or “swine flu,” had made the jump from swine to humans and arrived in the United States. The Centers for Disease Control and Prevention (CDC) took notice and began sponsoring research.

Eight years later, data scientists Alessandro Vespignani and his team have developed statistical models for crunching Twitter data in flu forecasting that can predict, six weeks out, when and where a flu outbreak might peak, with 70 to 90 percent accuracy. The Vespignani model integrates flu tweets with CDC data and other inputs of the initial flu conditions, where Twitter acts as “a proxy for monitoring infectious disease incidence.” Vespignani also noted that his model could work with many digital (e.g. social media) sources, which often come with time or location stamps.
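To give a feel for the shape of such a model – a toy sketch on synthetic data, not Vespignani's actual system, which integrates far richer inputs – weekly flu-tweet counts can act as one regressor alongside official surveillance figures when forecasting six weeks ahead:

# Toy sketch only -- not the Vespignani model. Synthetic data stand in for
# weekly CDC influenza-like-illness (ILI) rates and flu-related tweet counts.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
weeks = 104
cdc_ili = np.abs(np.sin(np.linspace(0, 4 * np.pi, weeks))) * 5 + rng.normal(0, 0.3, weeks)
flu_tweets = cdc_ili * 120 + rng.normal(0, 40, weeks)  # tweets loosely track incidence

horizon = 6  # forecast six weeks out, as in the article
X = np.column_stack([cdc_ili[:-horizon], flu_tweets[:-horizon]])
y = cdc_ili[horizon:]  # target: flu activity six weeks later

model = LinearRegression().fit(X, y)
print("in-sample R^2:", round(model.score(X, y), 2))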

Because tweets are unstructured and copious, the chance to make use of Twitter data has inspired advanced work from many sciences – statistical and computational science, behavioral and linguistic science. How do people process language or influence each other? How can we apply machine learning to sort for target data amid random human associations?

Even as the field of ecology becomes buried in data from camera traps, tracking studies, and citizen science records, it has been relatively late in launching its Twitter expedition. But in the September 2018 Methods in Ecology and Evolution, a University of Gloucestershire study by Adam Hart and colleagues looked at the reliability of Twitter data for ecological studies. Hart devised a methodology to collect, scrape, and structure the data sets of tweets about three ecological phenomena.

In some sense, they were gambling that these three cyclic ecological phenomena – the annual emergence of flying ants, the sighting of spiders in the home, the synchronized-drone-like murmurations of starlings overhead – might impress Tweeters enough to make a significant appearance on Twitter. And on searching the Twitter API for keywords or hashtags, such as #flyingants, #spider, and #murmurations, the researchers’ gamble paid off.
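In outline, the collect-and-structure step boils down to filtering tweets for a keyword or hashtag and binning the matches over time. A minimal sketch, with invented tweets and field names:

# Hypothetical sketch of structuring scraped tweets: filter for a hashtag,
# then count matching tweets per week. Rows and column names are invented.
import pandas as pd

tweets = pd.DataFrame({
    "text": ["#flyingants everywhere today!", "lovely walk in the park",
             "another #flyingants swarm", "huge #spider in the bath"],
    "created_at": pd.to_datetime(["2018-07-12", "2018-07-13",
                                  "2018-07-19", "2018-09-03"]),
})

ant_tweets = tweets[tweets["text"].str.contains("#flyingants", case=False)]
weekly_counts = ant_tweets.resample("W", on="created_at").size()
print(weekly_counts)  # a weekly time series of flying-ant reports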

“Make sure you choose something that people are likely to tweet about,” Hart said. “We still have much to learn about what motivates people to tweet about ecological phenomena and the sorts of information they are motivated to include.”

Hart and his colleagues then compared the Twitter results to published data from three citizen science (CS) studies of the same phenomena over the same time periods. The most robust Twitter samples came from tweets about spider sightings. Twitter-mining yielded fewer data points than the planned (CS) experiments – almost by a factor of ten in some cases – with starling murmurations yielding the fewest. But Hart’s team picked up the slack with data science.

“The statistical approaches we used allow for sample size in calculating significance,” Hart said. “So [a relatively low number of data points] is important, but it is allowed for in the analyses.”

Using a statistical comparison method, the Kolmogorov-Smirnov test, to study Twitter’s reliability against the CS data sets, Hart’s team was able to show a striking correlation.
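The comparison step itself is standard statistics; with invented sighting dates, a two-sample Kolmogorov-Smirnov test looks like this:

# Minimal sketch with invented data: do Twitter-mined sighting dates and
# citizen-science sighting dates plausibly share one seasonal distribution?
from scipy.stats import ks_2samp

# day-of-year of each reported sighting (hypothetical values)
twitter_days = [231, 233, 234, 234, 236, 240, 241, 245]
citizen_science_days = [230, 232, 234, 235, 235, 238, 242, 244]

stat, p_value = ks_2samp(twitter_days, citizen_science_days)
print(f"KS statistic = {stat:.3f}, p = {p_value:.3f}")
# A small statistic and large p-value mean the test finds no evidence that
# the two sets of dates come from different distributions.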

After a discussion of Twitter’s reliability with respect to determining when and where sightings actually occur, the scientists conclude that Twitter mining can be a useful tool for ecologists, particularly in phenology, the study of “nature’s calendar.”

Retrospective data mining of social media and other digital sources has generated a lot of excitement in ecology because it can shave considerable time off certain types of big data-focused research projects.

Gabriella Leighton and colleagues developed a methodology for mining Google Images in tracking where, geographically, members of the same species started to exhibit different color variations. Like Hart et al (2018), Leighton and her co-authors compared their Google Images results to known findings (Rounds, 1987) and found substantial correlation. The published paper couldn’t help mentioning time saved:

“Notably, the Google Images method took a few weeks,” Leighton et al reported, “while the more traditional data collection methods undertaken by Rounds (1987) took 3 years.”

A “quick and dirty” research option

From the point of view of Wesley Hochachka, a research associate with Cornell University’s eBird citizen science project, Twitter offers a “quick and dirty” research option.

“You’re not actually getting a sample of reality,” Hochachka told Mongabay, “you’re getting reality filtered through the curiosity of an individual person and their motivation to tell somebody else about it.”

Such individuality brings “noise,” or uncertainty, into the data sets. On a continuum of reliability between “quick and dirty” Twitter mining and long, hard traditional science, Hochachka would put citizen science somewhere in the middle. The eBird app collects data from bird watchers with a wide range of skill levels, at a rate of 7 or 8 million ornithology records a month. So data scientists have a lot to work with as they refine analytical methods to reduce uncertainty.

At the same time, the online eBird form that volunteers use to enter their bird sightings is the product of scientific design and testing. As compared with retrospective Twitter mining, the data form allows for some control – the crux of traditional science – over the data sets coming in.

According to Hochachka, the presence-only bias poses the biggest challenge to Twitter mining.

“You know when an event happened – you know when there’s been a murmuration,” Hochachka said, “but you don’t know when somebody didn’t see that starling murmuration. That’s why these data are never going to be as good as a predesigned study.”

In developing their data entry form, the eBird team sought to “minimize the proportion of data submitted in a presence-only form.” Hochachka points to a required field on the form that asks: “’Are you recording every species that you were able to see and identify?’ And if they say ‘yes’ – which, about 85 percent of the time, people say – then we know that if a certain species isn’t on that list, it was not detected by the observer.”

Without this field, the colorful eBird maps describing bird migration patterns would be much less reliable, Hochachka says. The statistical modeling that underlies the maps is predicated on a binary, yes-no analysis, where the absence of yes does not mean no.

“You actually need those nos,” Hochachka said. “If you don’t feed [the nos] into these sorts of analyses, the program itself creates them” (introducing uncertainty).
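
The logic is easy to see in miniature: absences can be inferred only from checklists flagged as complete. A toy sketch in pandas (the column names and species below are illustrative, not the real eBird schema):

```python
import pandas as pd

# Toy checklist records in the spirit of eBird's design.
records = pd.DataFrame({
    "checklist_id": [1, 1, 2, 3],
    "species": ["robin", "wren", "robin", "crow"],
})
complete = {1: True, 2: True, 3: False}  # "recording every species?" answer

rows = []
all_species = records["species"].unique()
for cl, seen in records.groupby("checklist_id")["species"]:
    for sp in all_species:
        if sp in set(seen):
            rows.append((cl, sp, 1))   # explicit presence ("yes")
        elif complete[cl]:
            rows.append((cl, sp, 0))   # inferred absence ("no")
        # incomplete checklists contribute no absence information

matrix = pd.DataFrame(rows, columns=["checklist_id", "species", "present"])
print(matrix)
```

Checklist 3, the incomplete one, contributes presences but no zeros – exactly the presence-only gap that Twitter data can never fill and that the eBird form was designed to close.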

The bird-watching form also tries to account for other human inconsistencies – the amount of time spent bird-watching, the distance traveled while watching, the time of year, even the time of day.

“There’s something called a dawn chorus,” Hochachka said. “If you were to go into the same forest at dawn and at noon, it could be deafeningly loud at dawn and utterly silent at noon. But the birds haven’t gone anywhere; they’re just not as detectable.”

A source of big data

As the challenges of big, noisy data attract researchers with specialties in advanced math and computational science, some field scientists worry that hands-on discovery is getting lost in the noise.

Interestingly, Twitter is becoming an apt platform for informal “field work” in ecology. Valuable discoveries arrive on a hashtag or handle, particularly when the tweet contains images. In Hart et al’s Twitter-mined spider study, tweeted images allowed the researchers to determine the spiders’ sex and verify the ratio of males to females.

Professor Helen Roy from the Centre for Ecology & Hydrology uses Twitter (@UKLadybirds) hand in glove with her online survey of UK ladybirds (“ladybugs” in the US).

“We have been able to improve understanding of insect invasions through our studies on the harlequin ladybird – a global invasive alien species,” Roy told Mongabay. Twitter allows Roy to conduct both data intake and educational outreach.

“Just a few days ago, someone sent a picture [via Twitter] of the harlequin ladybird eating some moth eggs,” Roy said. “This is an important example of the way in which the harlequin ladybird may be adversely affecting other species.” Like retrospective Twitter mining, this research provides a useful starting point, though it lacks the representative nature of a statistical sample.

In the end, the findings by Hart et al (2018) offer researchers a new kind of tool. With its high degree of noise and bias, Twitter mining will serve less as a substitute for a well-designed study and more as a time saver in preliminary research. Hochachka sees it as a potential “snapshot” tool: “if there was no other existing source of information,” Hochachka says, “or if you could gather information, but it would take a long time to collate it, and you wanted a snapshot right now.”

With their broad reach and growing tractability under machine learning analyses, Twitter and other social media may increasingly provide the big data that helps researchers support one avenue of research over another or suggest trends for further investigation.

This story is republished from Mongabay by Marianne Messina under a Creative Commons license. Read the original article.

Citations

Hart, A. G., Carpenter, W. S., Hlustik‐Smith, E., Reed, M., & Goodenough, A. E. (2018). Testing the potential of Twitter mining methods for data acquisition: Evaluating novel opportunities for ecological research in multiple taxa. Methods in Ecology and Evolution.

Leighton, G. R., Hugo, P. S., Roulin, A., & Amar, A. (2016). Just Google it: assessing the use of Google Images to describe geographical variation in visible traits of organisms. Methods in Ecology and Evolution, 7(9), 1060-1070.  https://besjournals.onlinelibrary.wiley.com/doi/10.1111/2041-210X.12562

Rounds, R. C. (1987). Distribution and analysis of colourmorphs of the black bear (Ursus americanus). Journal of Biogeography, 521-538.

How self-driving cars will change our sex lives

November 22nd 2018

It’s clear that driverless cars will revolutionize the way we get from Point A to Point B. Perhaps less obvious is how it’ll change our sex lives.

According to a new study published in the Annals of Tourism Research, ditching the driver could open up a whole lot of doors for transit-related sexual activity. Scott Cohen, the deputy director of research at the School of Hospitality and Tourism Management at the University of Surrey, stated in the report that approximately 60 percent of Americans have already had sex in a car.

But with driverless cars, it might not just be the occasional quickie, or the awkward post-prom encounter. Driverless cars could actually be a boon for the sex industry.

“Sex is part of urban tourism and commercialized sex is part of that too, so it is quite likely that autonomous vehicles will lead to prostitution, whether legal or illegal, to take place in moving autonomous vehicles in the future,” Cohen, who led the team behind the study, told NBC.

Of course, sex tourism was just a small piece of the puzzle. Overall, the paper focused on ways that autonomous vehicles would impact everything from sleeping to dining in the near future. The report notes that roadside diners and hotels would probably be the most affected businesses in our soon-to-come, self-driving future.

Cohen, though, doesn’t think sex in autonomous vehicles is going to happen any time soon. It won’t, he says, become a reality until “the 2040s.”

This story is published in partnership with The Next Web. Read the original piece here.

Four visions for the future of public transport

Marcus Enoch
November 7th 2019

The way people get around is starting to change, and as a professor of transport strategy I do rather wonder if the modes of transport we use today will still be around by the turn of the next century.

Growing up, my favourite book was a children’s encyclopaedia first published in 1953. One double page spread featured an annotated cityscape, showing all aspects of the built environment – most of which we would still be familiar with now. The various modes of transport illustrated – trains, buses, lorries, taxis, motorcycles, bikes, pedestrians and private cars – still work together as a system in fundamentally the same ways.

But a whole range of possible (though not inevitable) societal and technological changes could revolutionise how we travel in the coming decades. These include large-scale responses to the climate change agenda and energy sourcing and security; shifting demographic trends (such as growing numbers of elderly people); the development of the collaborative economy; the growing use of big data; and the apparent inevitability of driverless cars.

To examine what future urban transport systems might look like, I recently directed a future-gazing project for New Zealand’s Ministry of Transport exploring how people might be travelling in the year 2045. I helped develop four scenarios, along two axes of change.

The first axis considered automation – at one end, vehicles are still driven much like today (partial automation). At the other, they’re driverless (full automation). The second axis related to how dense cities could become – one future where the population is more dispersed (like Los Angeles) and another where it is concentrated at a higher density (more like Hong Kong). With these axes in mind, I generated four possible futures for public transport, which could play out in cities across the world.

Choose your fighter. Image: Marcus Enoch, author provided

1. Shared shuttles

In the “shared shuttle” city, demand-responsive minibuses, Uber-style taxis and micro-modes – such as shared bicycles, electric bikes and hoverboards – to cover the “last mile” to your destination are widespread. Hiring these different forms of transport is simple, thanks to seamless booking and payment systems and a thriving entrepreneurial spirit among a range of commercial, social and government transport providers. Meanwhile, new environmental regulations mean that owning a car is more expensive than it used to be, and private vehicles are restricted to the suburbs.

Flexibility is a core feature of this scenario, with vehicles and services that adjust to the needs of individuals, and with how the space continually adapts to meet the needs of the city as a whole. There’s also a collaborative ethos, reinforced by the development of a more compact and high-density city, while progress toward full automation has been slow because of safety and privacy concerns.

2. Mobility market

Private cars still dominate urban transport in the mobility market scenario. Many citizens live and often work in dispersed, low-density suburban areas, since city-centre housing became too expensive for most to afford. Fewer people walk and cycle, because of the long distances involved. And the use of public transport has declined, since less dense transport networks mean there are fewer viable routes, though a limited network of automated trains and buses is still used for trips to the city centre.

Car use has fallen somewhat since the 2010s, because “active management” measures – such as pre-bookable fast lanes and tolls – are now necessary to control congestion, despite the completion of a sizeable road building programme in the recent past.

Instead, commercially provided pre-paid personalised “mobility packages” are helping to stimulate the use of a whole range of shared mobility options, such as car-pooling, bike hire and air taxi schemes. These now account for around a quarter of all journeys.

3. Connected corridors

Society in this high-tech, highly urbanised world of connected corridors is characterised by perceptive but obedient citizens who trade access to their personal data in return for being able to use an extremely efficient transport system. Physically switching between different services or even different modes of travel is hassle-free, thanks to well-designed interchange points, and fully integrated timetabling, ticketing and information systems.

For instance, travellers might walk, e-cycle or take a demand-responsive minibus to a main route interchange, then board a high frequency rail service to get across town and finally take a shared autonomous taxi to their destination. Each will be guided by a personalised, all-knowing “travel ambassador” app on their smartphone or embedded chip, which will minimise overall travel times or maybe maximise sightseeing opportunities, according to their preferences.

Private cars are not really needed. People trust technology to deliver inexpensive and secure transport services and appreciate living close to work, family and friends.

4. Plentiful pods

In this future, fleets of variously-sized driverless pods now provide around three-quarters of those journeys that still need to be taken across the low-density, high-tech city. These pods have largely replaced existing public transport services, as well as the vast majority of privately-owned cars.

People do still walk or cycle for some shorter trips. But pods are highly convenient, providing affordable point-to-point journeys for those not satisfied by virtual interactions. Passengers can pay even less if they agree to share with others. Pods are also fully connected to the internet, and are priced and tailored to meet customer needs. Ultimately, pods give people the freedom to work, learn or live where the weather is best or the houses are cheapest.

My research did not pass judgement as to which scenario should be pursued. But it did conclude that public transport will need to evolve to meet future challenges, and that the role of government will still be of key importance going forward, no matter which path is chosen. Personally though, if forced to choose, I think I’d favour a shared shuttle future more than the others - it just seems more sociable.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Cover image: Renault’s Float autonomous concept car
