
Dronalyser: Low Cost Multispectral Drone Crop Analysis

Under Supervision of Abhishek Gagneja, Assistant Professor, Bharati Vidyapeeth's College of Engineering, New Delhi

Chetan Chawla

April 28, 2019

Transcript

  1. Stats
     • Agriculture contributes 15.4% of India's GDP (Gross Domestic Product)
     • India ranks first globally
     • Human-resource involvement of 31%
     • Land occupancy of around 60.45%
  2. The Problem
     • Farmers rely on rudimentary, intuition-based methods of analysis
     • Farmers can perform only miniature-level analysis
     • Satellite analysis is expensive, has low revisit rates and coarse spatial resolution, and works only when cloud coverage is below 5%
     • RGB analysis yields no vegetation insights
     • Existing multispectral analysis is expensive
  3. The Solution
     • Android-application-based service
     • Drone traversal of the field
     • Multispectral imagery acquisition
     • On-board processing on a Raspberry Pi
  4. Drone
     • Feasible overhead imagery
     • Traverses the area along accurate positions
     • Provides high-precision imagery at spatially respectable levels
     • Good battery levels with high flight times
  5. Work Done
     • Stage 1: Developed code from scratch to fly drones in a constrained environment, to understand drone dynamics and PID control
     • Stage 2: Used a CC3D Mini flight controller to operate the drone through remote control
     • Stage 3: Using an APM 2.8 flight controller with GPS
  6. Stage 1 and Stage 2
     Stage 1:
     • Drone flies in a constrained environment
     • Overhead camera
     • Waypoint navigation using WhyCon marker detection
     • Crosses hoops and avoids obstacles through Lua and Python scripting
     Stage 2:
     • CC3D Mini flight controller with built-in firmware
     • FlySky transmitter and receiver at 2.4 GHz
  7. Stage 3 – Current Setup
     • Moved to the APM 2.8 flight controller
     • Incorporated GPS control and telemetry extensions
     • Used Mission Planner
     • Polygon method: the planner automatically divides the whole region and covers the entire area
     • The OpenDroneMap plugin in Mission Planner triggers the data-acquisition circuitry to capture images and later stitch the orthomosaic
  8. Components
     1. APM flight controller
     2. ESCs (Electronic Speed Controllers)
     3. BLDC motors
     4. Rotors
     5. Power-division circuit
     6. LiPo battery
     7. Chassis
     8. GPS
     9. Raspberry Pi 3B
     10. Pi NoIR camera with red-light filter
  9. What's NDVI?
     The Normalized Difference Vegetation Index (NDVI) quantifies vegetation by measuring the difference between near-infrared light (which vegetation strongly reflects) and red light (which vegetation absorbs). It is an indicator of healthy vegetation.
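The definition above is the standard ratio NDVI = (NIR − Red) / (NIR + Red). A minimal sketch, assuming reflectance values in [0, 1]; the function name `ndvi` and the epsilon guard are illustrative choices, not from the slides:

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel.

    nir, red: scalars or arrays of reflectance in [0, 1].
    A tiny epsilon avoids division by zero on dark pixels.
    """
    nir = np.asarray(nir, dtype=np.float64)
    red = np.asarray(red, dtype=np.float64)
    return (nir - red) / (nir + red + 1e-9)

# Healthy vegetation reflects NIR strongly and absorbs red,
# so NDVI comes out close to +1; bare soil or water is near 0 or negative.
healthy = ndvi(0.5, 0.08)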
  10. Method 1
     • We use two Raspberry Pis, one with a regular PiCam and the other with a Pi NoIR camera. The NoIR camera has the IR filter removed, so additional near-infrared light is also captured in the red channel.
     • We use the R channel of both captured images in the formula to find the NDVI.
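Method 1 can be sketched as follows, assuming the two frames are already aligned H×W×3 RGB arrays; the function name and the uint8 input format are assumptions, not from the slides:

```python
import numpy as np

def ndvi_two_camera(noir_img, picam_img):
    """Method 1 sketch: the R channel of the NoIR frame serves as the
    NIR proxy (it also contains visible red -- the flaw noted later),
    and the R channel of the regular PiCam frame serves as red.

    Both inputs: aligned HxWx3 uint8 RGB arrays.
    Returns a per-pixel NDVI array in roughly [-1, 1].
    """
    nir = noir_img[..., 0].astype(np.float64)   # R channel of NoIR frame
    red = picam_img[..., 0].astype(np.float64)  # R channel of PiCam frame
    return (nir - red) / (nir + red + 1e-9)
```

In practice the two frames must overlap and be registered first, which is exactly the pre-processing burden slide 12 complains about.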
  11. Method 2
     • As in Method 1, we use two Raspberry Pis, one with a regular PiCam and the other with a Pi NoIR camera.
     • Here we convert the images to grayscale and apply the formula to the image intensities.
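The grayscale variant can be sketched like this; the luma weights (ITU-R BT.601) are an assumed choice of grayscale conversion, since the slides don't specify one:

```python
import numpy as np

def ndvi_intensity(noir_img, picam_img):
    """Method 2 sketch: convert both frames to grayscale intensity and
    feed the intensities into the NDVI formula. This works on overall
    brightness, so it yields only an approximation of true NDVI
    (the flaw noted on the next slide).
    """
    w = np.array([0.299, 0.587, 0.114])          # BT.601 luma weights
    nir_i = noir_img.astype(np.float64) @ w      # NoIR frame intensity
    red_i = picam_img.astype(np.float64) @ w     # PiCam frame intensity
    return (nir_i - red_i) / (nir_i + red_i + 1e-9)
```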
  12. Problems with the First Two Methods
     1. Two cameras cannot be connected to one Raspberry Pi without an expensive adapter (~Rs. 7,000).
     2. NDVI calculation requires overlapping photos, which demands cumbersome, camera-placement-dependent image pre-processing.
     3. In Method 1, the R channel of the NoIR image is not purely IR but also contains red light, which cannot be removed programmatically.
     4. Method 2 works on intensity, so it yields not true NDVI but a rough approximation.
  13. Method 3
     • Here we instead produce an infrablue image, an image whose red channel contains only IR, by placing a red-light filter in front of the Pi NoIR camera.
     • Healthy plants absorb not only red but also blue light, and hence appear green. We therefore retrieve the reflected NIR from the R channel of the captured image and use blue light instead of red in the formula.
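With a single infrablue frame, both bands come from one image, so no registration between cameras is needed. A sketch of the blue-substituted index, (NIR − Blue) / (NIR + Blue); the function name and uint8 input format are assumptions:

```python
import numpy as np

def infrablue_ndvi(infrablue_img):
    """Method 3 sketch: one Pi NoIR camera behind a red-light filter.
    In the resulting infrablue frame the R channel holds NIR and the
    B channel holds blue, so blue replaces red in the NDVI formula:

        index = (NIR - Blue) / (NIR + Blue)

    Input: HxWx3 uint8 RGB array. Returns a per-pixel index array.
    """
    img = infrablue_img.astype(np.float64)
    nir, blue = img[..., 0], img[..., 2]
    return (nir - blue) / (nir + blue + 1e-9)
```

Because both channels come from the same exposure, every pixel pair is inherently co-registered, sidestepping problems 1 and 2 from the previous slide.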