Imaging Sub-Team Fall 2014

Imaging Sub-Team Progress

UC Merced Engineering Service Learning

Unmanned Aerial Vehicles - Fall 2014

Overview

    The Unmanned Aerial Vehicles team, part of Engineering Service Learning at the University of California, Merced, is focused on bringing UAVs, or drones, into the small-scale agriculture industry by providing cheap and effective drone technology to farmers. The aim of the UAV team project is to help farmers increase the yield efficiency of their crops by improving their ability to monitor growth. By creating a system that will automatically monitor and detect crop deficiencies, the UAV team hopes to increase the amount of time a farmer has to focus on other important aspects of his crops, thereby increasing his efficiency and decreasing his costs.

    The Imaging Sub-Team, as a part of the UAV team, is focused on the data manipulation and analysis portion of the UAV project. As the UAV flies over the crop field, it takes pictures at set intervals. These pictures have to be catalogued and analyzed before they are actually useful to a farmer. The goal of the Imaging Sub-Team is not only to figure out how this can be done, but also to implement it cheaply. Thus, the Imaging Sub-Team is committed to using only free, open-source software that can be provided to farmers without any licensing issues. The sub-team also writes its own code for any portion of the project that cannot be done with existing open-source software.

    The Imaging Sub-Team is an essential part of the UAV team. Without the data analysis software it provides, the project could not focus specifically on the small-scale agriculture industry.

 

Focus

    The focus of the Imaging Sub-Team gives the project a narrow and specific scope to orient itself towards. This focus has to be specific enough that a process for analysis can be developed, yet general enough that it can be extrapolated to other areas of need. This is aligned with the goal of the project: to provide an overall analysis of a farmer’s crops, that is, to give a farmer all the tools necessary for a good overview of how his crops are growing and to show him any problems that may arise. In addition, the focus must be something that can be spotted within the UAV team’s limitations, which arise from budget and legal constraints. The largest of these constraints is the camera type: a standard consumer digital camera that takes pictures within the visible spectrum of light.

    Currently, the focus is on Pierce’s disease, a very specific disease that affects grapevines and grapevines only. It is transmitted through the bite of an insect carrying the disease (normally sharpshooters) and thus has the potential to spread rapidly. It can also kill an entire grapevine if it is not caught early enough. Since grapevines have an optimal lifespan of between 20 and 30 years, Pierce’s disease can be devastating to the overall expected output of the vine. Combined with its potential to spread, the effects can have a large impact on a single, small-scale vineyard.

    Generally, the biggest outward indicator of Pierce’s disease is leaf scorching: the discoloration of leaves due to the water and nutrient deficiencies the disease causes. This is the main reason Pierce’s disease was chosen as the focus of the Imaging Sub-Team: leaf scorching happens in every plant that has leaves and is a very good indicator of a plant’s overall health. Thus, the focus of the Imaging Sub-Team can be described as specifically Pierce’s disease, and generally leaf scorching.

 

Process

    The analysis will take place as a step-by-step process, where each step must be considered carefully. Below is an outline of the process.

 

  1. Fly a UAV over a crop field, taking pictures along the way.

  2. Collect the pictures of the field and stitch them together into a single map.

  3. From the map, pick out the actual field to be analyzed.

  4. Find problematic areas and try to diagnose the issue.

  5. Compare the current state of the field to previous flyovers.

  6. Give an output about the health of the crop and provide suggestions for improvements.

 

    All of these steps involve the Imaging Sub-Team, and all require substantial research and programming to achieve the desired results. Each step is explained in detail in the subsequent sections, including what it entails, the progress so far, and future expectations.

 

1: Flyovers and Image Capturing

    To gather the data required for analysis, a UAV with a camera attached must be flown over a crop field to take pictures. The flying is done automatically with an autopilot and falls within the domain of the Platform Sub-Team. Picture taking must also be done automatically, and in order to reduce complexity and weight, it was decided to have the autopilot perform this function as well. The autopilot will need to be programmed for this once one is chosen and arrives. Another consideration is how often photos will be taken: they can be triggered either at pre-chosen GPS waypoints or at fixed time intervals. This has yet to be decided.
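If a fixed time interval is chosen, it can be derived from the flight parameters so that consecutive photos overlap enough for stitching. Below is a minimal sketch of that calculation; the function name, the 60-degree field of view, the 60% overlap, and the flight numbers are illustrative assumptions, not values the team has chosen.

```python
import math

def capture_interval(altitude_m, fov_deg, speed_ms, overlap=0.6):
    """Seconds between photos so consecutive images overlap by `overlap`.

    Along-track ground footprint of one photo from a simple pinhole model:
        footprint = 2 * altitude * tan(fov / 2)
    The UAV may advance (1 - overlap) * footprint between shots.
    """
    footprint = 2 * altitude_m * math.tan(math.radians(fov_deg) / 2)
    advance = (1 - overlap) * footprint
    return advance / speed_ms

# Example: 50 m altitude, 60 degree along-track FOV, 10 m/s ground speed
# gives a shot roughly every 2.3 seconds.
interval = capture_interval(50, 60, 10)
```

The same footprint arithmetic also answers the waypoint option: dividing the field length by the per-shot advance gives the number of waypoints to pre-program.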

 

2: Stitching of Images

    Image stitching requires software capable of integrating multiple images with undefined points of capture into a single, large image or map (often called a ‘panorama’). This map is an aerial image of the entire field and surrounding structures. The software must be able to stitch the map together automatically and with a relatively low level of distortion and few discontinuities. Autonomy is key; a farmer will have little patience for a program that needs manual adjustment to produce the desired result. Likewise, a script or function to run the stitching software automatically, without the need for user input, is important.

    Several software packages with stitching capabilities have been evaluated for their potential. The most promising so far is Microsoft Image Composite Editor, which requires the least user input and gives the highest quality image output. That software is free, but it is not open source, so it may not be optimal for use in the project. Another candidate that is open source is Hugin; it is still being evaluated for use.
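If Hugin is adopted, its command-line utilities make an unattended stitching run possible, which fits the no-user-input requirement above. The following is a sketch of such a script, assuming the flyover photos sit in the working directory as JPEGs and Hugin's CLI tools are installed; the project and output file names are placeholders.

```shell
#!/bin/sh
# Sketch of an unattended stitching run using Hugin's command-line tools.

pto_gen -o field.pto *.jpg                        # create a project from the photos
cpfind --multirow -o field.pto field.pto          # find control points between images
autooptimiser -a -m -l -s -o field.pto field.pto  # optimise positions and lens
pano_modify --canvas=AUTO --crop=AUTO -o field.pto field.pto
nona -m TIFF_m -o part field.pto                  # remap each photo onto the canvas
enblend -o field_map.tif part*.tif                # blend remapped photos into one map
```

Whether this pipeline produces acceptable quality on low-overlap aerial imagery is exactly what the ongoing evaluation of Hugin needs to establish.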

 

3: Field Detection

    Field detection is the process by which an individual crop field is picked out from a large area that may include multiple fields on an aerial image or map. The purpose of field detection is to be able to analyze data more accurately; if areas surrounding the crop field of interest are not eliminated, then errors will be introduced when the image is analyzed and output will not be consistent with reality. For a farmer, this means that only his field will show up in the final output and not his neighbor’s.

    Work on the field detection algorithm currently involves converting the image to HSV (because HSV tends to show greater disparity between regional differences in color) and using an edge detection filter that moves over the entire stitched photo and checks for areas where color changes rapidly.
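The idea can be sketched in pure Python as a toy stand-in for a real image-processing library; the function name and the hue threshold below are illustrative assumptions, not the calibrated values the algorithm will need.

```python
import colorsys

def edge_map(rgb_rows, threshold=0.2):
    """Mark pixels where hue changes sharply toward the right or downward.

    rgb_rows: 2-D list of (r, g, b) tuples with components in [0, 1].
    Returns a 2-D list of booleans (True = likely field boundary).
    Working in HSV because hue separates a green field from bare soil
    or a road more cleanly than raw RGB differences.
    """
    hsv = [[colorsys.rgb_to_hsv(*px) for px in row] for row in rgb_rows]
    h, w = len(hsv), len(hsv[0])
    edges = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            hue = hsv[y][x][0]
            right = abs(hue - hsv[y][x + 1][0]) if x + 1 < w else 0.0
            down = abs(hue - hsv[y + 1][x][0]) if y + 1 < h else 0.0
            edges[y][x] = max(right, down) > threshold
    return edges
```

In the real pipeline this per-pixel difference would be replaced by a proper edge filter run over the stitched map, with the detected boundary then used to mask out everything outside the field of interest.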

 

4: Trouble Area Detection and Analysis

    The single most important feature of the imaging software package will be its ability to detect troubled areas of the crop field and analyze them in comparison to the rest of the field. The software must be able to pick out the troubled areas and draw an outline around them to visually show the farmer where his crops are growing poorly and may be afflicted by various ailments. GPS waypoints should also be given so the farmer knows exactly where to look when he is out in the field checking his crops. In addition, the analysis should be able to give various outputs regarding the health of the crops, including but not limited to: overall health of the entire crop field, estimated percent yield of the crop field, possible ailments afflicting each individual troubled spot (as accurately as possible), and estimated reduction of yield in each individual troubled spot.

    Some work has been done on providing an overall health measure for the field by using color thresholds to give the percentage of normally colored crops versus abnormally colored crops. For areas with a large concentration of leaf scorching (and hence many abnormally colored crops), an algorithm similar to the field detection algorithm must be used to find the boundaries of those areas. The alpha state of the analysis only provides the percentage of normally versus abnormally colored crops, as work on the field detection algorithm has not yet been completed. Additionally, research must be completed to find the optimal color thresholds for normally colored crops and for crops with various types of leaf scorching; the alpha-stage program uses a default threshold that has not been calibrated against real data and thus must be updated.
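The thresholding step can be sketched as follows; like the alpha-stage program's default, the green hue band here is an uncalibrated placeholder, and the function name is an illustrative assumption.

```python
import colorsys

# Placeholder hue band for healthy green foliage (hue in [0, 1], colorsys
# convention). The real thresholds still need to be found experimentally.
GREEN_HUE = (0.20, 0.45)

def health_percent(rgb_pixels, green_hue=GREEN_HUE):
    """Percent of pixels whose hue falls in the 'healthy' green band.

    rgb_pixels: iterable of (r, g, b) tuples with components in [0, 1].
    Scorched leaves drift toward yellow/brown hues outside the band.
    """
    pixels = list(rgb_pixels)
    if not pixels:
        return 0.0
    healthy = sum(
        1 for px in pixels
        if green_hue[0] <= colorsys.rgb_to_hsv(*px)[0] <= green_hue[1]
    )
    return 100.0 * healthy / len(pixels)
```

The same per-pixel classification also feeds the boundary-finding step: contiguous clusters of out-of-band pixels are the candidate trouble areas to be outlined and assigned GPS waypoints.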

 

5: Comparison to Past Flyovers

    Due to the nature of periodic aerial photography, it is possible to analyze data across time in addition to across space. Future versions of the analysis program should implement features that utilize this temporal data, including mapping the growth patterns of crops, mapping the spread of diseases or ailments over time, and showing a time lapse of the farmer’s field to give him a visual of how his fields are doing. This will be a great asset to the UAV team and its most important selling point. Work on creating a time lapse should begin as soon as the analysis has been completed.
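As a minimal sketch of the temporal idea, once each flyover yields an overall health figure, the simplest cross-time analysis is the change between consecutive flights; the function and data layout below are illustrative assumptions.

```python
def health_trend(flyovers):
    """Change in overall field health between consecutive flyovers.

    flyovers: list of (date_string, health_percent) in chronological order.
    Returns a list of (date_string, change_since_previous), so a spreading
    ailment shows up as a run of negative changes.
    """
    trend = []
    for (_, prev), (date, cur) in zip(flyovers, flyovers[1:]):
        trend.append((date, cur - prev))
    return trend
```

Applied per trouble spot rather than per field, the same differencing would map how an outbreak such as Pierce's disease spreads between flights.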

 

6: Program Output

    Program output is especially important with regard to the usefulness of the UAV project to the farmer. Output should be concise, easy to read, and easy to manipulate to show various data. It should be as simple as possible to avoid confusion and frustration on the part of the farmer, as users of the software cannot be expected to know anything technical; there must not be a steep learning curve. The program output should give concise data, as well as the latest aerial map of the farmer’s field with all the highlighting. Highlighted areas should be clickable so that more information can be displayed about that particular area. Additionally, the output should be navigable through time; that is, there should be the ability to look at past data and analyses.
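Alongside the interactive map, the concise textual summary could be as simple as the sketch below; the function name, field layout, and example values are illustrative assumptions, not a designed output format.

```python
def format_report(date, health_pct, trouble_spots):
    """Render one flyover's analysis results as a short plain-text summary.

    trouble_spots: list of (label, (lat, lon), note) for each flagged area,
    so the farmer gets a human-readable location for every outlined spot.
    """
    lines = [
        "Field report - %s" % date,
        "Overall health: %.1f%% of crop normally colored" % health_pct,
        "Trouble spots: %d" % len(trouble_spots),
    ]
    for label, (lat, lon), note in trouble_spots:
        lines.append("  %s at (%.5f, %.5f): %s" % (label, lat, lon, note))
    return "\n".join(lines)
```

Keeping one such summary per flyover is also what makes the output navigable through time: past reports remain readable without the program.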

 

Future Considerations

    While the current path of the program is set to analyze fields for leaf scorching, future iterations of the project should consider that the analysis program can be extrapolated to look for things like water stress, nutrient depletion, and irrigation line disruption. These can all be analyzed from the same set of images and will help a farmer understand the growth and health of his fields more completely. All of these can be analyzed concurrently and output at the same time. However, substantial research must be done within these categories to understand their visual effects before an algorithm can be developed. It is suggested that before additional analysis tools are developed, future semesters complete the algorithms and processes already set in place.

 

Acknowledgements

Below is a list of the current Imaging Sub-Team members who are diligently working towards the semester goal. Their contribution to the UAV team project as a whole is especially noteworthy.

 

Avery Berchek - aberchek@ucmerced.edu

Eric Harmon - eharmon@ucmerced.edu

Francis Espiritu - fespiritu@ucmerced.edu

Jesus Sergio Gonzalez - jgonzalezcastello@ucmerced.edu - UAV Team Leader

Joseph Kotlarek - jkotlarek@ucmerced.edu

Patrick Thavornkant - pthavornkant@ucmerced.edu

Thomas Thayer - tthayer@ucmerced.edu - Imaging Sub-Team Leader

Viridiana Navarro - vnavarro5@ucmerced.edu

Walter Vasquez - wvasquez4@ucmerced.edu

 

    Below is a list of all the other current members of the Unmanned Aerial Vehicles team, which includes members of both the Platform Sub-Team and the Marketing Sub-Team who are not involved with the Imaging Sub-Team.

 

Adrian Hernandez - ahernandez225@ucmerced.edu

Cesar Ramos - cramos8@ucmerced.edu

Christopher Reps - creps@ucmerced.edu

Justin O’Rourke - jorourke2@ucmerced.edu - Platform Sub-Team Leader

Phoskrit Manikalak - pmanikalak@ucmerced.edu

 

    Below is a list of the current supporting staff, mentors, and the community partner who helped get this project on its feet and provided guidance along the way.

 

Christopher Butler - cbutler@ucmerced.edu - Director of Engineering Service Learning

Duval Johnson - djohnson29@ucmerced.edu - Graduate Mentor

Larry Burrow - laburrow@ucanr.edu - Community Partner, Merced County Cooperative Extension

Tiekbiao Zhao - tzhao3@ucmerced.edu - Graduate Mentor

Dr. YangQuan Chen - ychen53@ucmerced.edu - Faculty Advisor

 

    Special thanks goes to the Mechatronics, Embedded Systems, and Automation (MESA) Laboratory for providing the UAV team with helpful resources and support.