The most important way to combat climate change and the greenhouse gases that cause it is to develop a more intimate understanding of the plants in our environment. Traditionally, fieldwork was done by botanists, on paper, by hand, measuring, sketching, and sampling ad tedium. Not only was this method subject to the inadequacy that is human error, it was also vastly limited by the availability of time and travel resources, meaning that the data collected was in no way comprehensive or reproducible.
For my project, I investigated the use of the 3D Structure Sensor as a tool for capturing and recording plants in the Durfee Conservatory. Along the way, I met with many difficulties, but also many rewards, which I will attempt to enumerate here.
The campus greenhouse is a surprisingly bustling place in the early spring, which taught me my first lesson about using the Structure Sensor: DO NOT let someone step between you and the object you are scanning! I had almost completed a scan of a bamboo grove, which had already taken fifteen minutes of minute position-shifting, when an inconsiderate passerby pushed past me and in front of my lens (instead of using the bridge to bypass the both of us), throwing off my field of focus so badly that it somehow turned upside-down. I abruptly stopped working, to avoid mis-scanning over my previous work:
Another huge problem with scanning plants in particular is that the 3D Structure Sensor is not sensitive enough to pick up the little things, like stems and flowers. Although this does not present much of a problem when viewing the scan in 3D hosting software, if I were to attempt to print a plant without stems, I would end up with an assortment of odd leaf clusters and a pot. It is also worth mentioning that the Structure Sensor does not do well scanning thin objects, so the leaves that were picked up were rather globular and indistinct. If I were to attempt to use my 3D model of a given plant to record its measurements, the data would bear no resemblance to the original plant:
Perhaps the most frustrating failure of the Structure Sensor was that sometimes it was unable to focus on a certain plant at all, usually as a result of its being too small. Whenever I hit the capture button, I would get a basic initial scan before that dreaded "Please move the object back into view" prompt popped up; of course, it was not possible to move the scanner back to the exact position from which I had started the scan, and knowing this, I would still try to rescue my scan for several minutes before surrendering.
Although specific dimensions were iffy at best, the sensor’s ability to map pictures over 3D scans has incredible potential. Take, for instance, this scan of a pineapple plant sprouting a new fruit:
Again, the scanner has difficulty registering plant surfaces, but when it comes to color and image overlay, the result is powerful. Given this, the future relationship between the Structure Sensor and the environment may well lie in species cataloguing; gone will be the days of hand-sketching and coloring plant parts!
The 3D Structure Sensor, as it stands, would potentially be of use in charting and cataloguing plant species. This could take the form of generating a basic backyard landscape, which could serve as a survey of the species living in an area. A crowdsourcing campaign, similar to the Backyard Bird Project, could be launched to get a sampling of species density across the country. When I attempted to scan a plant cluster, rather than individual specimens, the results were relatively rewarding:
However, in terms of my intended project to chart the change in biomass over a given time frame, the Structure Sensor falls short in the necessary areas, namely precision and ease of use.
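To give a sense of what that biomass measurement would have looked like, here is a minimal sketch, assuming a future sensor could export watertight triangle meshes of a plant. None of this reflects any actual Structure Sensor SDK; the function names and mesh format are hypothetical, and the volume comes from the standard divergence-theorem trick of summing signed tetrahedron volumes over a closed mesh.

```python
def signed_volume(vertices, triangles):
    """Volume of a closed, outward-facing triangle mesh.

    Each triangle (a, b, c) contributes dot(a, cross(b, c)) / 6,
    the signed volume of the tetrahedron it forms with the origin.
    """
    total = 0.0
    for i, j, k in triangles:
        ax, ay, az = vertices[i]
        bx, by, bz = vertices[j]
        cx, cy, cz = vertices[k]
        # dot(a, cross(b, c)), expanded component-wise
        total += (ax * (by * cz - bz * cy)
                  + ay * (bz * cx - bx * cz)
                  + az * (bx * cy - by * cx))
    return total / 6.0

def biomass_change(mesh_then, mesh_now):
    """Volume difference between two scans of the same plant.

    Each mesh is a (vertices, triangles) pair; a positive result
    would suggest growth between the two scanning sessions.
    """
    return signed_volume(*mesh_now) - signed_volume(*mesh_then)
```

Even this simple comparison presumes scans precise enough that the volume difference reflects the plant rather than the scanner's noise, which is exactly where the current sensor falls down.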
Thankfully, 3D technology is changing rapidly; maybe the sensor will be refined sometime soon, and I will be able to restart my initial biomass investigation.