Saturday, August 12, 2017

Activity 7.5 UAV Sense and Avoid Sensor

7.5 - Blog Activity: Sense and Avoid Sensor Selection

          The world’s first drone deliveries have begun trial runs in the United Kingdom and the U.S. Once primarily used by militaries, small quadcopter and octocopter drones are now so commonplace that they are for sale at home improvement stores and toy stores. People are flying drones for fun, for entertainment, and for commercial purposes as diverse as filmmaking and farming. All these uses have one thing in common: the drone’s human operator is required by law to be able to see the drone at all times. Why? The answer is simple: to make sure the drone doesn’t hit anything.

          Beyond just wanting not to crash and damage their drones or themselves, drone operators must avoid collisions with people, property, and other vehicles. Specifically, federal aviation regulations forbid aircraft – including drones – from flying “so close to another aircraft as to create a collision hazard.” The rules also require that “vigilance shall be maintained by each person operating an aircraft so as to see and avoid other aircraft.” These requirements are commonly referred to simply as “see-and-avoid”: pilots must see and avoid other traffic.

          But that places a significant limitation on drone operations. The whole point of drones is that they are unmanned. Without a human operator on board, though, how can a drone steer clear of collisions (see Fig. 1)?


                                             Fig. 1. Drone delivery

          To be practical, delivery drones would have to be able to fly long distances, well out of sight of a human operator. How, then, can the operator prevent the drone from hitting a tree, building, airplane or even another drone? Although cameras could be mounted on the drone for this purpose, current civil drone video transmission technology is limited to a range of a few miles. As a result, to perform long-distance deliveries, the drone must autonomously detect nearby objects and avoid hitting them.
          New research into sensors – at least some of which comes from the development of autonomous cars – is making increased autonomy possible for drones, potentially opening the skies to even more innovation (Conversation, 2017).

          So how can a drone detect and avoid objects along its flight path autonomously? The system that I found uses Light Detection and Ranging (LIDAR). Fig. 2 below shows how miniaturized LIDAR systems have become; they are easy to install and well suited to Unmanned Aerial Vehicles (UAVs) weighing less than 55 lbs.

Radar and lidar: Radar detects nearby objects by bouncing radio waves off them; lidar, developed more recently, uses laser beams instead of radio waves and can provide extremely detailed images of nearby features. The catch is that both radar and lidar systems have traditionally been bulky, heavy, and expensive. That makes them hard to fit on relatively small drones; also, heavier drones require more battery power to stay aloft, which requires bigger (and heavier) batteries.





                                                             Fig. 2. Small LIDAR sensor

There is hope, though. Research in obstacle sensors and collision avoidance technology for autonomous automobiles has spurred the development of small, lower-cost radar and lidar devices. Once they are sufficiently small, and energy-efficient enough not to drain drone batteries quickly, both types of sensors could help solve the drone “see-and-avoid” problem – or really, because drones don’t have eyes, the “detect-and-avoid” problem.

An in-flight view: A recent test flight at Ohio University involved a lidar sensor mounted on a drone. When the drone was approximately five feet above the ground, the lidar could create an image of its surroundings.





                                       Fig. 3. A lidar image from a drone in flight

On one side of the image in Fig. 3 were bushy-looking areas representing trees and foliage. On the other were parallel lines indicating the location of a building wall. And in the middle were circular shapes representing the ground. This sort of obstacle detection and discernment capability will be essential for routine drone operation, particularly during takeoff and landing.
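How might software tell those shapes apart? As a purely hypothetical illustration (my own sketch, not how any particular lidar product does it), the Python snippet below scores a cluster of 2D returns by the ratio of the eigenvalues of its covariance: values near zero mean the points lie along a line, as a wall would, while larger values suggest scattered returns like foliage.

```python
def flatness(points):
    """Ratio of the smaller to the larger eigenvalue of the 2D covariance
    of a point cluster. Near 0 -> points lie along a line (wall-like);
    larger values -> scattered returns (foliage-like)."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    a = sum((p[0] - mx) ** 2 for p in points) / n
    c = sum((p[1] - my) ** 2 for p in points) / n
    b = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    # Eigenvalues of the 2x2 symmetric covariance matrix [[a, b], [b, c]].
    mid = (a + c) / 2
    half = (((a - c) / 2) ** 2 + b ** 2) ** 0.5
    return (mid - half) / (mid + half) if mid + half > 0 else 0.0

wall = [(0.5 * i, 2.0 + 0.02 * (i % 3)) for i in range(20)]  # nearly collinear
bush = [(0.3 * (i % 5), 0.3 * (i % 7)) for i in range(20)]   # scattered
# The wall's ratio is near 0; the bush's ratio is much larger.
print(f"wall: {flatness(wall):.3f}  bush: {flatness(bush):.3f}")
```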
We are currently in what might be called the “Wright Brothers era” of drone development. Removing the human from the cockpit has challenged innovators and designers in many ways – including solving the task of obstacle detection. But as our technology advances, eventually – just like elevators that used to be operated by humans – people will grow used to the idea of these machines operating autonomously (Conversation, 2017).

          I would like to give a basic review of what Light Detection and Ranging (LIDAR) is and what its capabilities are.

          Most of the latest UAV lidar systems can rotate around their own axis, offering 360-degree visibility, and modern devices achieve very high data rates of over one million distance points per second. How does a lidar sensor work? The sensor emits a laser pulse and records the backscattered echo; the range to the target follows from the round-trip time of flight (range = time of flight × speed of light ÷ 2), and combining that range with the UAV’s position and attitude allows a precise computation of each echo’s position.
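To make the time-of-flight idea concrete, here is a minimal Python sketch of the computation above (my own illustration; the function names and the 2D simplification are assumptions, not anything from a vendor’s code):

```python
import math

C = 299_792_458.0  # speed of light in m/s

def range_from_time_of_flight(round_trip_s: float) -> float:
    """The pulse travels out and back, so the one-way range
    is half the total path the light covers."""
    return C * round_trip_s / 2.0

def echo_to_point(range_m, bearing_rad, uav_x, uav_y):
    """Place one echo in a 2D world frame using the UAV's position.
    (A real system would also use the UAV's full 3D attitude.)"""
    return (uav_x + range_m * math.cos(bearing_rad),
            uav_y + range_m * math.sin(bearing_rad))

# A pulse returning after about 66.7 nanoseconds hit something ~10 m away.
print(round(range_from_time_of_flight(66.7e-9), 2))  # -> 10.0
```

At the one-million-points-per-second rate quoted above, a sensor spinning at, say, 10 revolutions per second would spread roughly 100,000 such echoes around each 360-degree sweep.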
Lidar Sensors for Drone Collision Avoidance: Within a lidar sensor, many independent elements are integrated into a single device that generates the critical ranging data needed for safe navigation along with precise positioning. Lidar sensors offer obstacle detection over a wide field of view, which makes them ideal as part of a sense-and-avoid solution. Collision avoidance technology has now moved into the consumer drone sector with the highly innovative DJI Phantom 4 Pro (using 2 ultrasound sensors and 4 monocular sensors) and the Yuneec Typhoon H Pro (using the Intel RealSense R200 3D camera).
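As a rough illustration of how wide-field ranging data might feed a detect-and-avoid decision, the sketch below scans one revolution of lidar returns for the closest echo inside a forward field of view. The 60-degree field of view and 15 m safety distance are my own assumed numbers, not specifications of any product mentioned above:

```python
import math

def nearest_obstacle_ahead(scan, heading_rad,
                           fov_rad=math.radians(60), safety_m=15.0):
    """scan: list of (bearing_rad, range_m) echoes from one lidar revolution.
    Returns (angle_off_heading, range) of the closest echo inside the
    forward field of view and safety distance, or None if the way is clear."""
    closest = None
    for bearing, rng in scan:
        # Smallest signed angle between the echo bearing and our heading.
        off = (bearing - heading_rad + math.pi) % (2 * math.pi) - math.pi
        if abs(off) <= fov_rad / 2 and rng <= safety_m:
            if closest is None or rng < closest[1]:
                closest = (off, rng)
    return closest

scan = [(math.radians(a), r) for a, r in [(0, 40.0), (5, 12.0), (90, 3.0)]]
hit = nearest_obstacle_ahead(scan, heading_rad=0.0)
if hit:
    off, rng = hit
    # Steer away from the obstacle's side; a real autopilot would re-plan.
    print(f"Obstacle {rng:.1f} m ahead; yaw {'right' if off > 0 else 'left'}")
```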
Lidar Sensors for Ground and Above-Ground Imagery: The latest lidar sensors have integrated optical altimeter technology that delivers accurate distance measurements above ground level while meeting the size, weight, and cost requirements of UAV manufacturers. Agriculture and forestry use lidar to inspect vegetation such as leaves and crops, and the above-ground imagery (for example, a forest canopy) can be removed to view the ground surface beneath it (see the sketch after this section).

Lidar Drones for Structural Inspection: The best lidar sensors have powerful built-in signal processing, a large field of view, and multi-segment measurements, which generate the critical distance data and efficient obstacle detection that enable safe navigation when performing structural inspections.

Lidar Sensors at Night: Lidar sensing, also described as laser scanning, works in low-contrast or shadowy situations, even at night. The many uses of lidar include agriculture, mapping, topography, building and oil rig inspection, coastal surge modeling, hydrodynamic modeling, and digital elevation modeling.
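To show what “removing” the above-ground imagery might look like in code, here is a deliberately simplified sketch (my own, not from DroneZon): it bins 3D returns into grid cells and keeps the lowest elevation in each cell as the ground estimate, so canopy returns above it drop out. Real pipelines rely on last-return flags and slope-aware filters, and the 1 m cell size is only an assumption for the example:

```python
def ground_surface(points, cell_m=1.0):
    """points: iterable of (x, y, z) lidar returns in meters.
    Keep the lowest z per grid cell as a rough ground-elevation estimate."""
    lowest = {}
    for x, y, z in points:
        cell = (int(x // cell_m), int(y // cell_m))
        if cell not in lowest or z < lowest[cell]:
            lowest[cell] = z
    return lowest  # maps cell -> estimated ground elevation

# Two returns in one cell: a canopy hit at 12.4 m and a ground hit at 0.6 m.
points = [(0.2, 0.3, 12.4), (0.4, 0.1, 0.6), (1.5, 0.2, 0.5)]
print(ground_surface(points))  # {(0, 0): 0.6, (1, 0): 0.5}
```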
Multispectral and Photogrammetry Imagery: Lidar, multispectral, and photogrammetry imagery are all very closely related technologies. In some sectors and situations, images from all three are required to give a full analysis of the terrain, vegetation, or structure (DroneZon, 2017).

References
        DroneZon. (2017). Best lidar sensors for drones & great uses for lidar sensors. Retrieved from
        https://www.dronezon.com/learn-about-drones-quadcopters/best-lidar-sensors-for-drones-great-uses-for-lidar-sensors/

        The Conversation. (2017). Obstacle avoidance: The challenge for drone package delivery. Retrieved from
        http://theconversation.com/obstacle-avoidance-the-challenge-for-drone-package-delivery-70241

