WASHINGTON -- This week behind the Smithsonian Castle, a research botanist and two computer science professors unveiled Leafsnap, a free plant identification app for the iPhone (available for the iPad next week and Android this summer) with broad potential for the future of image recognition.
“The project began in 2003 when we received funding from the National Science Foundation to develop a new identification system for the 21st century,” said John Kress, a research botanist at the Smithsonian’s National Museum of Natural History. “Now we have a scientific tool for botanists as well as the public. We did this so people could know what they’re looking at, but also so they could think about conserving it.”
Kress teamed up with Peter Belhumeur of Columbia University and David Jacobs of the University of Maryland to create the world's first plant-identification mobile app. The computer scientists, experts in biometrics and face-recognition technology, spent eight years collecting and photographing leaves, building the recognition technology that separates a leaf from its background in order to identify it, and developing the matching algorithms and software.
I watched Belhumeur test the app with his iPhone, snapping a picture of a Ginkgo biloba leaf on a white piece of paper. He explained that Leafsnap is interactive: the user makes the final decision based on a ranking of candidate matches.
“It works very much like Google,” he said, “but instead of being initiated by text, it’s initiated by images.” Each leaf photograph is matched against a leaf-image library using numerous shape measurements computed at points along the leaf’s outline. The best matches are then ranked and returned to the user for final verification. Sure enough, in less than 30 seconds, the response came back from the computers up at Columbia, with Ginkgo at the top of the suggestions.
“It won’t be 100 percent, because lots of leaves have a similar leaf shape,” Belhumeur said, “but that’s why they’re ranked.” In any case, he said, it’s much faster and less frustrating than trying to identify a plant with an old-fashioned field guide. The information provided includes a description of the species, how the plant is used and where it is found.
Belhumeur said part of the project involved photographing leaves in a way that had never been done before, at a resolution so high that the detail could never be reproduced in a field guide. The brilliant photos (created by the nonprofit organization Finding Species) show each species’ leaf (both front and back), flower, seed, fruit and bark on a black background. They can be magnified down to the fur on the petiole, the stalk that attaches the leaf blade to the stem.
Currently, Leafsnap includes the 191 species of trees found in New York’s Central Park and Washington’s Rock Creek Park. By the end of this year, its library will include all 250 species found in the Northeast, and eventually it will include all the native trees in the United States, plus 50 to 100 introduced species, such as the Ginkgo.
Users can play leaf identification games, mark the species they’ve identified to add to their collection, and on a broader scale, they will be contributing to science. As they identify trees, the app automatically shares their images, species identifications and the tree’s location with a community of scientists. These scientists will use the information to map and monitor population growth and decline of trees nationwide.
Jacobs said that within a single species leaves can have quite diverse shapes, while leaves from different species are sometimes quite similar. “So one of the main technical challenges in using leaves to identify plant species has been to find effective representations of their shape, which capture their most important characteristics.”
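The pipeline the researchers describe (shape measurements computed at points along the leaf's outline, compared against a library of known species, then ranked for the user) can be sketched in a few lines. This is a toy illustration, not Leafsnap's actual method: the simple centroid-distance signature below stands in for the app's real, much richer shape descriptors, and every name in it is invented for the example.

```python
import math

def contour_signature(outline, n_samples=16):
    """Toy shape descriptor: distances from the centroid to evenly
    sampled points on the outline, normalized so the largest is 1.
    Normalizing makes the signature invariant to overall leaf size."""
    cx = sum(x for x, _ in outline) / len(outline)
    cy = sum(y for _, y in outline) / len(outline)
    dists = [math.hypot(x - cx, y - cy) for x, y in outline]
    step = len(dists) / n_samples
    sig = [dists[int(i * step)] for i in range(n_samples)]
    peak = max(sig)
    return [d / peak for d in sig]

def rank_species(query_sig, library):
    """Rank (species, signature) pairs by squared distance to the query
    signature; the user verifies the right match from the top hits."""
    def sq_dist(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return sorted(library, key=lambda entry: sq_dist(query_sig, entry[1]))

def ellipse(a, b, n=64):
    """Generate an elliptical outline as a stand-in for a leaf contour."""
    return [(a * math.cos(2 * math.pi * i / n),
             b * math.sin(2 * math.pi * i / n)) for i in range(n)]

# Toy "library": a round-leaved species versus a narrow-leaved one.
library = [("round-leaf species", contour_signature(ellipse(1.0, 1.0))),
           ("narrow-leaf species", contour_signature(ellipse(3.0, 1.0)))]

# A larger round leaf still matches the round species best,
# because the signature ignores absolute size.
query = contour_signature(ellipse(2.0, 2.0))
ranked = rank_species(query, library)
```

A production system needs far more discriminative features than this, and it must first segment the leaf from its background, which is presumably why Belhumeur photographed his Ginkgo leaf against a plain white sheet of paper.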
He said this technology could potentially be used to identify all kinds of shapes. “One thing that’s exciting for us is that this project touches on some fundamental problems in image recognition,” he said, “[such as] how do you compare two shapes? It puts emphasis on basic questions.”
This post was originally published on Smartplanet.com