Within a city like Baltimore, the landscape is generally considered the ‘background’ for human activity – a largely undifferentiated expanse of green, without much thought about the actual plants that fill the space. But UMBC Professor of Visual Arts Lynn Cazabon wants to move those plants to the foreground, and she is working with the IRC to explore the ways that plants respond to the environment around them. She also wants people to think about species that thrive in urban areas, and to that end is working with one common species, Conyza canadensis, better known as horseweed, a native annual. While most people think of horseweed as a nuisance, or weed, Cazabon is fascinated by its adaptability to the stresses of living in human-created landscapes.
Cazabon received a 2016 IRC Summer Faculty Fellowship for her project Plantelligence. As she explains, “Plantelligence emerged from my interest in how plants perceive and respond to events occurring in their surrounding environment, as a means to bring attention to how global warming impacts the way plants and in turn urban landscapes are currently evolving. Recent research in the field of plant neurobiology has led to inquiries into the evolutionary purposes of the many ways that plants sense their surroundings, including through analogues to sight, hearing, touch, taste and smell as well as through perception of electrical, magnetic, and chemical input.” But plants move and react at speeds below the threshold of human perception, and this is where the IRC’s research in photogrammetry, 3D modeling, animation, and virtual reality comes into play. As Cazabon explains, “I am using time-lapse photography and photogrammetry to study the movements of growing plants in order to translate these movements for human perception through animation…My goal in using VR is to create an immersive environment for the viewer which blurs conventional distinctions between inside and outside.”
Plantelligence first took shape in the photogrammetry rig, where we took scans of growing plants every 30 minutes for about two months, ultimately generating 8 terabytes of data. But the plan to create a 3D time-lapse film ran into a few temporal snags. The first was that the plants did not grow as quickly as we had hoped. The second issue involved the time needed to process the scans themselves: it took the computer 3-4 days to turn each scan into a model, so processing every single scan would have taken far too long. As a result, faculty research assistant Mark Murnane is working on a way to process the images on UMBC’s High Performance Computing Facility, which will speed up the process enough to make it feasible in the future.
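To put those numbers in perspective, here is a rough back-of-the-envelope calculation based on the figures above. The capture period of "about two months" is assumed here to be 60 days; everything else follows from the article's stated numbers (one scan every 30 minutes, 8 terabytes total, 3-4 days of processing per model on a single machine):

```python
# Rough estimate only; the 60-day capture window is an assumption
# standing in for "about two months."
SCANS_PER_DAY = 24 * 60 // 30        # one scan every 30 minutes -> 48 per day
capture_days = 60                    # assumed length of the capture period
total_scans = SCANS_PER_DAY * capture_days

total_tb = 8.0                       # total data generated, per the article
gb_per_scan = total_tb * 1024 / total_scans

days_per_model = 3.5                 # midpoint of the 3-4 day processing figure
serial_years = total_scans * days_per_model / 365

print(f"~{total_scans} scans at roughly {gb_per_scan:.1f} GB each")
print(f"Processing them one after another: ~{serial_years:.0f} years")
```

Under these assumptions the rig produced on the order of 2,880 scans, and processing them serially would take decades, which is why distributing the work across UMBC's High Performance Computing Facility is the only practical route to a full 3D time-lapse.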
We also learned a lot about the challenges of scanning a plant, because the models that were initially generated needed a lot of cleaning up by hand. Technical director Ryan Zuber calls these irregular models, with holes and deformation, “crunchy,” and he went to work smoothing them out. He cleaned up one model plant, imposing quadrangular polygons on its surface, which allow the model to be textured and animated.
But, as Zuber and Cazabon realized, it’s not easy to create and animate a realistic plant that is designed to be seen individually and up close. The horseweed has many leaves, tiny hairs, and variable textures, all of which need to be able to move independently of each other, and all of which need to be seen at multiple stages of growth. Zuber is treating each leaf as an individual ‘character’ and has built a rig that can work for all of the leaves, regardless of their specific geometry. He studied time-lapse films of plants growing in order to get a sense of the way the leaves grow and unfurl, and is now able to animate the plant.
The next step involves placing the plant in VR space where people can interact with it, which Cazabon envisions as a generic, unadorned gallery space. The goal here is to bring the outside to the inside: to isolate the plant against a neutral space that is more ideal for human perception. The final step will be to bring the animated plant and gallery environment together with custom software that will enable a viewer to explore and interactively affect the plant’s growth.