I feel like drones are everywhere in the news, and never with a good connotation. Really though, there are myriad useful applications for Unmanned Aerial Systems (UAS), so let’s think of them as a tool for good rather than a weapon-carrying device.
In my case I’m thinking about agriculture, and how these tools can be used to grow crops. This topic has been in the news recently because of new FAA regulations that make it impossible for farmers to use UASs to monitor their fields.
What does flying over a field get you?
Quite a lot, mostly by collecting images. Seeing fields from above can help farmers monitor plant health, pest damage, and water use. Collecting data this way saves a great deal of time and energy over manual scouting on farms that are hundreds (or thousands) of acres.
A UAS is equipped with a number of sensors that collect data and store it for later.
One sensor that can be used to monitor crops from above is a LIDAR, which Alena wrote about a few months ago. As she explained, this sensor is useful for determining the canopy height of vegetation. This is just as useful in agriculture as in bunny habitat monitoring. If a farmer took LIDAR measurements regularly they could monitor how fast their crops were growing in a season or compare growth rate of crops in different areas of land.
Cameras that just take pictures are useful for simple counting of crops or livestock, but the really cool instruments go beyond our visual spectrum. Multispectral sensors capture image data in small chunks across the electromagnetic spectrum. Later on, the data can be compiled to show all the layers of imaging collected, or just a specific few.
When looking at vegetation, wavelengths from the red, green, and near-infrared spectra are often used for crop management analysis. The near-infrared spectrum falls between 750 and 2350 nm. Plants reflect green and infrared wavelengths, so areas that are high in vegetation, like crop fields or parks, will show up brightly at these wavelengths. Areas without much vegetation, like roads and buildings, will reflect red light.
Let’s take this a step further by applying an index.
A vegetation index allows for comparison between vegetative areas. The Normalized Difference Vegetation Index (NDVI) contrasts radiation at each pixel using this equation:
NDVI = (infrared - red)/(infrared + red)
This ratio results in a number between negative one and one for each pixel; the closer to one, the more vegetation in an area. More vegetation indicates a healthier crop. If you could pinpoint which areas of your crop weren’t healthy, it would be easier to figure out what the problem was and fix it. When growing crops, pests and nutrient deficiencies are recurring problems, so having a set of images over a period of time would be super useful for seeing how things change.
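The per-pixel math above is simple enough to sketch in a few lines of Python. The reflectance numbers here are made up for illustration, not real sensor data:

```python
import numpy as np

# Hypothetical red and near-infrared reflectance values (scaled 0-1)
# for a tiny 2x2 patch of pixels. Real imagery would be much larger.
nir = np.array([[0.60, 0.55],
                [0.30, 0.10]])
red = np.array([[0.10, 0.12],
                [0.25, 0.08]])

# NDVI = (infrared - red) / (infrared + red), computed per pixel.
ndvi = (nir - red) / (nir + red)

# Every value lands between -1 and 1; values near 1 suggest
# dense, healthy vegetation.
print(ndvi)
```

Running this on a daily series of images and watching where the NDVI values drop is exactly the kind of change-over-time monitoring described above.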
Infrared sensors on a UAS at the thermal end of the spectrum can measure the temperature of an area and help farmers predict where plants are not receiving enough water. Areas with high soil moisture will be cooler than areas with low soil moisture, and that difference shows up in images captured at longer infrared wavelengths than those used for vegetative health. Currently, there are severe droughts in western states like California and Arizona, where much of our food is grown. If a farmer wanted to target only the areas that really needed water, they could use this kind of image mapping to locate dry areas and water with more precision.
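Picking out the dry spots from a thermal image boils down to flagging the warmest pixels. Here is a minimal sketch; the temperatures and the 29 °C threshold are arbitrary example values, not a recommendation:

```python
import numpy as np

# Hypothetical surface temperatures (degrees C) from a thermal image
# of a field; in practice these come from the UAS's thermal sensor.
temps = np.array([[24.0, 25.5, 31.0],
                  [23.5, 30.5, 32.0]])

# Warmer ground tends to mean drier soil, so flag every pixel above
# a chosen threshold (29 C here, purely for illustration).
dry = temps > 29.0

print(dry)
print("fraction of field flagged as dry:", dry.mean())
```

A farmer could then irrigate only the flagged zones instead of watering the whole field uniformly.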
Using UASs close to home
An example that is close to home for many of us at FTDM is apple scab and the work PhD student Matt Wallhead is doing with Dr. Kurt D. Broders. Apple scab is a fungus that can cause serious problems for apple farmers in places that are cool and moist in the spring and fall. The fungus overwinters in fallen apple leaves. In spring, rainfall releases spores, which are carried up to the tree canopy by air currents or water. On infected trees, leaves become chlorotic (yellowed) and dark olive-colored scabs appear on leaves and fruit. In very severe cases leaves and fruit will fall off. Even if fruits are not badly infected, any scabs render the fruit unsalable.
Imagine a cool, wet spring - kind of like the past spring when winter just would not end. If you were an apple farmer you would be worried that all the moist weather would promote apple scab spore release and your crop for the year might be doomed. Fungicide can control apple scab, but it is expensive, so you wouldn’t want to spray it indiscriminately on all the apple trees in your orchard. So, you would tromp around looking for signs of scab and target those areas. In a big orchard this would be a lot of traveling around.
If you could use a UAS to collect multispectral images once a day, you could be in better shape with a lot less work. By combining layers of color and temperature data, Matt is figuring out how to pinpoint apple scab outbreaks so farmers can treat just that specific area with fungicide. (For those of you concerned about chemicals applied to food products, fungicides are not the only treatment for scab. Other methods include planting more resistant varieties and keeping the area under apple trees free of leaf debris.)
I have only touched on the applications and analysis that aerial imaging of crop fields can provide. Each crop, pest, disease, or problem is a little bit different. Isn't that what makes farming and science interesting? Taking this technology and making it affordable, user friendly, available, and legal are the next steps. With the benefits UASs and multispectral images offer, it seems likely this could happen in the not-too-distant future.