In a groundbreaking twist on a long-standing scientific endeavor, Andrew Richardson, a Regents’ Professor in the School of Informatics, Computing, and Cyber Systems, and a team of scientists across the U.S. are giving a modern upgrade to a decades-old study that focuses on the seasonal rhythms of plants and animals.
Since 2008, Richardson has collected more than 100 million images through the PhenoCam Network, a project in which hundreds of digital cameras track conditions in natural and managed ecosystems throughout the U.S. and around the world.
Data on vegetation color is extracted from the images, and changes in vegetation color provide valuable information about the passing seasons—information such as how and when crops grow and are harvested, how trees leaf out in spring and change color in fall, and how weather patterns affect what plants are doing in different regions of the U.S.
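Greenness signals of this kind are commonly summarized with a green chromatic coordinate (GCC): the share of a pixel's total brightness contributed by the green channel. A minimal sketch of that calculation, assuming plain (R, G, B) pixel tuples rather than any particular PhenoCam processing pipeline, might look like:

```python
def green_chromatic_coordinate(pixels):
    """Average green chromatic coordinate over a set of pixels.

    pixels: iterable of (R, G, B) tuples with values 0-255.
    Per pixel, GCC = G / (R + G + B); higher values indicate
    greener vegetation, so GCC rises at spring leaf-out and
    falls during autumn senescence.
    """
    total = 0.0
    count = 0
    for r, g, b in pixels:
        brightness = r + g + b
        if brightness == 0:  # skip pure-black pixels (no color information)
            continue
        total += g / brightness
        count += 1
    return total / count if count else 0.0

# Illustrative pixel values only: a green canopy scores higher
# than a brown, senescent one.
spring = [(60, 120, 50), (55, 130, 45)]
autumn = [(120, 90, 40), (130, 85, 50)]
print(green_chromatic_coordinate(spring) > green_chromatic_coordinate(autumn))  # True
```

Because GCC is a ratio rather than a raw brightness, it is relatively robust to day-to-day changes in illumination, which is one reason ratio-based indices are popular for long-running camera networks.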
The project began when Richardson was at the University of New Hampshire; it is now run in collaboration with Chris Coffey and NAU’s Applied Research Computing Group in ITS. PhenoCam images and data are stored on NAU servers and image processing leverages NAU’s high-performance computing facility, Monsoon. Richardson said the data volumes are almost mind-boggling, and they rely heavily on this collaboration to keep things running smoothly.
The National Science Foundation has awarded two grants to Richardson that will combine the widespread use of cellphones with an advanced image recognition program, helping researchers track these changes in a more timely and economical manner.
Upgrading the data collection
“Our collaborator at the University of Massachusetts, Phuc Nguyen, reached out to me with an idea to develop an application for old smartphones that would recycle the technology instead of having them end up in the landfill,” Richardson said. “The cameras we currently use cost around $750 each, and these recycled cell phones are more like $50 each. By re-using old phones, we will be cutting down on electronic waste and using that technology in an entirely new and different way.”
Nguyen and his team are testing different cellphone cameras for quality imaging sensors. They will then use a capacitor, instead of the original battery, as a power source. The capacitor will be charged using energy harvested from the environment, creating a long-lived platform for environmental monitoring.
“With this technology, we could put out thousands more cameras, creating a denser network of phenology sensors across the U.S.,” Richardson said. “First, we need to do extensive head-to-head testing against standard PhenoCam hardware. Our second collaborator on this project, Troy Gilmore at the University of Nebraska-Lincoln, is interested in using the cameras to measure stream flow and water levels in lakes. His group will be testing to see how well recycled smartphones work for that application.”
AI-powered search
The second NSF grant awarded to the project will be used to develop a cutting-edge AI platform designed to organize, tag and process the PhenoCam photographs, providing a more dynamic process that will save time.
“The AI platform is being designed,” Richardson said. “Troy’s group has been working on the prototype for a few years. The idea of this project is to make that platform more user-friendly, better documented and more robust so it can be used for a wide range of applications without requiring highly specialized skills or training. Hopefully, it will really change the scale of the kinds of information you can extract from digital photographs, while also saving a lot of time. This project is a collaboration between academic researchers and research computing specialists, which makes it a lot of fun.”
With AI processing millions of images and smartphones capturing moments from coast to coast, the study promises not just more data, but deeper insight into how our environment is changing, adding a new chapter to an ongoing project.
Jacob Blais, a PhD student in Richardson’s lab, is currently using the platform GRIME AI to help process hundreds of thousands of PhenoCam images from an experiment at the Sevilleta National Wildlife Refuge in New Mexico.
“GRIME AI is not your typical AI software,” Blais said. “It will improve the quality of the plant greenness data I use in my dissertation research by getting rid of non-vegetation areas within my images. This will increase our confidence in the scientific results and the story they tell about the impacts of climate change on desert ecosystems of the southwest U.S.”
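Excluding non-vegetation areas amounts to masking: only pixels flagged as vegetation contribute to the greenness statistic. A simple sketch of that idea, assuming a hypothetical boolean mask (e.g. produced by an AI segmentation step) rather than GRIME AI's actual interface, could be:

```python
def masked_mean_greenness(image, vegetation_mask):
    """Mean green fraction G/(R+G+B) over vegetation pixels only.

    image: list of rows of (R, G, B) tuples, values 0-255.
    vegetation_mask: same shape, True where a pixel is vegetation.
    Masking out sky, rock, or bare soil keeps those pixels from
    diluting the vegetation greenness signal.
    """
    total, count = 0.0, 0
    for row, mask_row in zip(image, vegetation_mask):
        for (r, g, b), is_vegetation in zip(row, mask_row):
            brightness = r + g + b
            if is_vegetation and brightness > 0:
                total += g / brightness
                count += 1
    return total / count if count else 0.0

# Illustrative 2x2 scene: two sky pixels (masked out) and two plant pixels.
image = [[(135, 150, 255), (40, 110, 30)],
         [(30, 100, 25), (120, 130, 240)]]
mask = [[False, True],
        [True, False]]
print(masked_mean_greenness(image, mask))  # ~0.63, from the two plant pixels
```

In a desert setting like the Sevilleta, where sparse shrubs sit against large expanses of bare soil, this kind of masking can substantially change the greenness time series, which is why cleaner masks translate into more trustworthy results.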
View images captured by the PhenoCam in Hart Prairie
View images captured by the PhenoCam in the Flagstaff Arboretum
Mariana Laas | NAU Communications
(928) 523-5050 | mariana.laas@nau.edu