
Plantelligence and Ecomimesis

Plants Responding to People

Within a city like Baltimore, the landscape is generally treated as the ‘background’ for human activity – a largely undifferentiated expanse of green, with little thought given to the actual plants that fill the space. But UMBC Professor of Visual Arts Lynn Cazabon moved those plants to the foreground, working with the IRC to explore the ways that plants respond to the environment around them. She also wanted people to think about species that thrive in urban areas, and to that end she focused on one common species, Conyza canadensis, better known as horseweed, a native annual. While most people dismiss horseweed as a nuisance, or simply a weed, Cazabon is fascinated by its adaptability to the stresses of living in human-created landscapes.

Cazabon received a 2016 IRC Summer Faculty Fellowship for her project Plantelligence. “Plantelligence emerged from my interest in how plants perceive and respond to events occurring in their surrounding environment, as a means to bring attention to how global warming impacts the way plants and in turn urban landscapes are currently evolving. Recent research in the field of plant neurobiology has led to inquiries into the evolutionary purposes of the many ways that plants sense their surroundings, including through analogues to sight, hearing, touch, taste and smell as well as through perception of electrical, magnetic, and chemical input.” But plants move and react at speeds below human perception, and this is where the IRC’s research in photogrammetry, 3D modeling, animation, and virtual reality comes into play.

Plantelligence first took shape in the photogrammetry rig, where the IRC scanned growing plants every 30 minutes for about two months, ultimately generating 8 terabytes of data. But the plan to create a 3D timelapse film ran into a few temporal snags. The first was that the plants did not grow as quickly as we had hoped. The second was the time needed to process the scans themselves: the computer took 3-4 days to build a model from each scan, so processing every single scan would have taken far too long. Technical director Ryan Zuber cleaned up one plant model, retopologizing its surface into quadrangular polygons, which allowed the model to be textured and animated in Maya. Finally, the animated plants were imported into a generic virtual environment.
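
A quick back-of-the-envelope calculation makes clear why processing every scan was out of the question. This minimal Python sketch uses the figures above; the 60-day capture window and the 3.5-day average per model are assumptions for illustration:

    # Rough estimate of the capture and processing load described above.
    SCAN_INTERVAL_HOURS = 0.5      # one scan every 30 minutes
    CAPTURE_DAYS = 60              # "about two months" (assumed)
    PROCESS_DAYS_PER_SCAN = 3.5    # midpoint of the 3-4 days per model (assumed)

    scans_per_day = 24 / SCAN_INTERVAL_HOURS          # 48 scans per day
    total_scans = int(scans_per_day * CAPTURE_DAYS)   # about 2,880 scans

    serial_processing_days = total_scans * PROCESS_DAYS_PER_SCAN
    print(f"Scans captured: {total_scans}")
    print(f"Serial processing time: {serial_processing_days:,.0f} days "
          f"(roughly {serial_processing_days / 365:.0f} years)")

Under those assumptions, the roughly 2,880 scans would have needed on the order of 10,000 days of serial processing time, which is why only a single cleaned-up model was carried through to animation.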

In the summer of 2018, Cazabon displayed an updated version of this project, now called Ecomimesis, in the exhibition Hustle at Science Gallery Lab Detroit. An initiative of Michigan State University, Science Gallery Lab Detroit is part of an international network of galleries featuring projects that blend art, science, and technology, aimed at reaching young audiences. The animation was customized for the Hustle exhibition, with the virtual space designed to mirror the physical details of the gallery, so that viewers donning the Oculus headset saw over a dozen plants emerging in a virtual and slightly idealized version of the space in which they were standing. The viewer’s body is intentionally not represented in the animation, resulting in an intimate encounter with the plants as the viewer floats around and merges with them. Ecomimesis was shown at two adjacent stations in the gallery, but each viewer’s experience is unique: the plants appear at randomly generated locations and at varying points in their growth cycles, and as one plant dies, another emerges.
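
The exhibition software itself is not documented here, but the behavior described above – plants appearing at random locations, starting at random points in their growth cycles, and being replaced as they die – can be sketched in a few lines of Python. All names and parameters below are hypothetical illustrations, not the IRC’s actual code:

    import random

    MAX_PLANTS = 14             # "over a dozen plants" (assumed count)
    GROWTH_CYCLE_FRAMES = 1000  # length of one animated growth cycle (assumed)

    def spawn_plant():
        """Create a plant at a random location, partway through its growth cycle."""
        return {
            "position": (random.uniform(-5, 5), 0.0, random.uniform(-5, 5)),
            "frame": random.randint(0, GROWTH_CYCLE_FRAMES - 1),
        }

    plants = [spawn_plant() for _ in range(MAX_PLANTS)]

    def update(plants):
        """Advance every plant one frame; replace any plant that finishes its cycle."""
        for i, plant in enumerate(plants):
            plant["frame"] += 1
            if plant["frame"] >= GROWTH_CYCLE_FRAMES:  # this plant has died
                plants[i] = spawn_plant()              # a new one emerges
        return plants

Because each plant starts at a random frame and a random position, no two viewings of the piece unfold the same way, which matches the randomized experience described above.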

Production Notes

Project Director: Lynn Cazabon
IRC Technical Director - Modeling: Ryan Zuber
IRC Technical Director - Programming: Mark Jarzynski
Photogrammetry and Programming: Mark Murnane