Saturday, August 12, 2017

Activity 7.5 UAV Sense and Avoid Sensor

7.5 - Blog Activity: Sense and Avoid Sensor Selection





          The world’s first drone deliveries have begun trial runs in the United Kingdom and the U.S. Once used primarily by militaries, small quadcopter and octocopter drones are now so commonplace that they are for sale at home improvement stores and toy stores. People fly drones for fun, for entertainment, and for commercial purposes as diverse as filmmaking and farming. All these uses have one thing in common: the drone’s human operator is required by law to keep the drone in sight at all times. Why? The answer is simple: to make sure the drone doesn’t hit anything.

          Beyond not wanting to crash and damage their drones or themselves, drone operators must avoid collisions with people, property, and other vehicles. Specifically, federal aviation regulations forbid aircraft – including drones – from flying “so close to another aircraft as to create a collision hazard.” The rules also require that “vigilance shall be maintained by each person operating an aircraft so as to see and avoid other aircraft.” These requirements are commonly referred to simply as “see-and-avoid”: pilots must see and avoid other traffic. But that places a significant limitation on drone operations. The whole point of drones is that they are unmanned. Without a human operator on board (see Fig 1.), how can a drone steer clear of collisions?


                                             Fig 1. Drone delivery 

          To be practical, delivery drones would have to be able to fly long distances, well out of sight of a human operator. How, then, can the operator prevent the drone from hitting a tree, building, airplane or even another drone? Although cameras could be mounted on the drone for this purpose, current civil drone video transmission technology is limited to a range of a few miles. As a result, to perform long-distance deliveries, the drone must autonomously detect nearby objects and avoid hitting them.
New research into sensors – at least some of which comes from the development of autonomous cars – is making increased autonomy possible for drones, potentially opening the skies to even more innovation (Conversation, 2017).

          So, how can a drone autonomously avoid running into objects along its flight path? The system that I found is Light Detection and Ranging (LIDAR). Fig 2. below shows how miniaturized LIDAR systems have become; they are well suited to unmanned aerial vehicles (UAVs) that weigh less than 55 lbs. and are easy to install.

Radar and lidar: Lidar, developed more recently than radar, uses laser beams instead of radio waves and can provide extremely detailed images of nearby features. The catch is that both radar and lidar systems have traditionally been bulky, heavy, and expensive. That makes them hard to fit on relatively small drones; heavier drones also require more battery power to stay aloft, which requires bigger (and heavier) batteries.





                                                             Fig 2. Small LIDAR Sensor

There is hope, though. Research into obstacle sensors and collision-avoidance technology for autonomous automobiles has spurred the development of small, lower-cost radar and lidar devices. Once they are sufficiently small, and energy-efficient enough not to drain drone batteries quickly, both types of sensors could help solve the drone “see-and-avoid” – or really, because drones don’t have eyes, “detect-and-avoid” – problem. An in-flight view: A recent test flight at Ohio University involved a lidar sensor mounted on a drone. When the drone was approximately five feet above the ground, the lidar could create an image of its surroundings.





                                       Fig 3. A LIDAR image from a drone in flight

On one side in Fig 3., the image had bushy-looking areas representing trees and foliage. On the other, parallel lines indicated the location of a building wall. And in the middle were circular shapes representing the ground. This sort of obstacle detection and discernment capability will be essential for routine drone operation, particularly during takeoff and landing.
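To make that “discernment” idea concrete, here is a minimal sketch in Python, using invented sample data, of how a point cloud like the one in Fig 3. might be split into ground returns and obstacle returns with a simple height threshold. Real pipelines use plane fitting and clustering; this only illustrates the concept, and none of it comes from the Ohio University test.

```python
import numpy as np

def classify_points(points, ground_height=0.3):
    """points: (N, 3) array of x, y, z in meters, z measured above ground.

    Returns near the surface are treated as ground; everything taller is
    treated as a potential obstacle (trees, walls, other aircraft).
    """
    z = points[:, 2]
    ground = points[z <= ground_height]
    obstacles = points[z > ground_height]
    return ground, obstacles

# Fabricated sample cloud: a flat patch plus a "wall" two meters away.
rng = np.random.default_rng(0)
ground_patch = np.column_stack([rng.uniform(-5, 5, 200),
                                rng.uniform(-5, 5, 200),
                                rng.normal(0.0, 0.05, 200)])
wall = np.column_stack([np.full(50, 2.0),
                        rng.uniform(-2, 2, 50),
                        rng.uniform(0.5, 3.0, 50)])
cloud = np.vstack([ground_patch, wall])

ground, obstacles = classify_points(cloud)
print(f"{len(ground)} ground returns, {len(obstacles)} obstacle returns")
```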
We are currently in what might be called the “Wright Brothers era” of drone development. Removing the human from the cockpit has challenged innovators and designers in many ways – including solving the task of obstacle detection. But as our technology advances, eventually – just like elevators that used to be operated by humans – people will grow used to the idea of these machines operating autonomously (Conversation, 2017).

          I would like to give a basic review of what Light Detection and Ranging (LIDAR) is and what its capabilities are.

          Most of the latest UAV lidar systems can rotate around their own axis to offer 360-degree visibility, and modern devices achieve very high data rates of over one million distance points per second. How do lidar sensors work? The sensor emits a laser pulse and records the backscattered signal; the range to the target is half the round-trip time of flight multiplied by the speed of light. Combining each echo’s range with the UAV’s position and attitude allows a precise position to be computed for every echo.
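As a quick check on that time-of-flight relationship: because the pulse travels out to the target and back, the one-way range is (time of flight x speed of light) / 2. A minimal sketch:

```python
C = 299_792_458.0  # speed of light in m/s

def tof_range(round_trip_seconds):
    """Range from a lidar time-of-flight measurement.

    The pulse travels out and back, so the one-way distance is
    (time of flight x speed of light) / 2.
    """
    return round_trip_seconds * C / 2.0

# Example: an echo received 333 nanoseconds after the pulse was emitted
# corresponds to a target roughly 50 m away.
print(f"{tof_range(333e-9):.1f} m")  # -> 49.9 m
```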
Lidar Sensors for Drone Collision Avoidance: Within a lidar sensor, many independent elements are integrated into a single device to generate the critical ranging data needed for safe navigation and precise positioning. Lidar sensors can detect obstacles over a wide field of view, which makes them ideal as part of a sense-and-avoid solution. Collision-avoidance technology has now moved into the consumer drone sector with the highly innovative DJI Phantom 4 Pro (using two ultrasonic sensors and four monocular cameras) and the Yuneec Typhoon H Pro (using the Intel RealSense R200 3D camera), both now offering collision avoidance.
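For illustration only – this is not any vendor’s actual algorithm – the core detect-and-avoid decision can be reduced to a safety-bubble check: if any return inside the forward field of view is closer than a threshold, command an avoidance maneuver. The field-of-view and threshold values below are invented:

```python
from dataclasses import dataclass

@dataclass
class Return:
    bearing_deg: float   # azimuth relative to the nose, in degrees
    range_m: float       # measured distance, in meters

# Hypothetical parameters; real systems tune these to speed and sensor noise.
FORWARD_FOV_DEG = 60.0   # only consider returns within +/-30 deg of the nose
SAFETY_BUBBLE_M = 10.0   # minimum allowed separation

def avoid_command(returns):
    """Return 'HOLD_AND_CLIMB' if anything intrudes on the safety bubble."""
    ahead = [r for r in returns if abs(r.bearing_deg) <= FORWARD_FOV_DEG / 2]
    if ahead and min(r.range_m for r in ahead) < SAFETY_BUBBLE_M:
        return "HOLD_AND_CLIMB"
    return "CONTINUE"

print(avoid_command([Return(5.0, 8.2), Return(40.0, 3.0)]))  # HOLD_AND_CLIMB
```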
Lidar Sensors for Ground and Above-Ground Imagery: The latest lidar sensors have integrated optical altimeter technology that delivers accurate distance measurements above ground level while meeting the size, weight, and cost requirements of UAV manufacturers. Agriculture and forestry use lidar to inspect vegetation such as leaves and crops, and the above-ground imagery (for example, the forest canopy) can be removed to view the ground surface underneath.

Lidar Drones for Structural Inspection: The best lidar sensors have powerful built-in signal processing, a large field of view, and multi-segment measurements, which generate critical distance data and efficient obstacle detection to enable safe navigation when performing structural inspections.

Lidar Sensors at Night: Lidar sensing, also described as laser scanning, works in low-contrast or shadowy situations, even at night. The many uses of lidar include agriculture, mapping, topography, building and oil rig inspection, coastal surge modeling, hydrodynamic modeling, and digital elevation modeling.
Multispectral and Photogrammetry Imagery: Lidar, multispectral, and photogrammetry imagery are all very closely related technologies. In some sectors and situations, images from all three are required to give a full analysis of the terrain, vegetation, or structure (DroneZon, 2017).

References
        DroneZon. (2017). Best lidar sensors for drones & great uses for lidar sensors. Retrieved
        from https://www.dronezon.com/learn-about-drones-quadcopters/best-lidar-sensors-for-drones-
        great-uses-for-lidar-sensors/

        The Conversation. (2017). Obstacle avoidance: The challenge for drone package
        delivery. Retrieved from http://theconversation.com/obstacle-avoidance-the-challenge-for-
        drone-package-delivery-70241


Monday, August 7, 2017

Activity 6.4 Control Station Analysis




Control station for an unmanned ground or maritime (surface or undersea) system.
Unmanned Undersea Vehicles (UUVs) are the most recent addition to the autonomous vehicle world. These platforms were, and still are, being developed to replace divers and manned deep submergence vehicles (DSVs) for work in the ocean such as exploration, science, search and rescue, and ocean floor surveys. Using UUVs dramatically reduces the operational risk to the human beings involved.
Market scope: Military unmanned maritime vehicle (UMV) technology has multiple applications in mine countermeasures; anti-submarine warfare; intelligence, reconnaissance, and surveillance; port security; forward battlespace security; and many other areas of the military and security sectors (Cision, 2015).
For my discussion, I will be examining the REMUS 600 AUV. This upgraded REMUS (an evolution of the REMUS 100) offers increased endurance, payload capacity, and operating depth. Its key specifications are listed below.


  1. Vehicle diameter: 32.4 cm (12.75 in); diameter varies depending upon module (for 600 m depth configuration).
  2. Vehicle length: Min length ~2.7 m (~9 ft) Max length ~5.5 m (~18 ft); length varies depending upon module configuration.
  3. Weight in air: Min weight ~220 kg (~500 lbs) Max weight ~385 kg (~850 lbs); weight varies depending upon module configuration.
  4. Maximum operating depth: 600 m (1500 m configurations available).
  5. Power: 5.4 kWh rechargeable Lithium ion battery. (Second 5.4 kWh battery tray is optional). Exchangeable battery option available.
  6. Endurance: Typical mission endurance is up to 24 hours in the standard configuration, subject to speed, battery, and sensor configuration (see the endurance sketch after this list).
  7. Propulsion: Direct drive DC brushless motor to an open two bladed propeller.
  8. Velocity range: Up to 2.3 m/s (4.5 knots) variable over range.
  9. Control: 3 independent control fins providing yaw, pitch and roll control. Altitude, depth, yo-yo and track-line following provided. Optional forward fins available for heading control during bottom tracking with a cross current.
  10. External hook-up: Two connectors, one for shore power and one for shore data. Alternatively, 802.11G wireless network (Wi-Fi) provided via dorsal fin antenna.
  11. Casualty circuits: Ground fault, housing leak detection and all sensors and systems have operational go / no-go fault indicators.
  12. Navigation methods: Inertial, Long Baseline (LBL) Acoustic, SBAS enabled GPS, Ultra Short Baseline Acoustic and Acoustic Transponder.
  13. Communication: Acoustic modem, Iridium modem, Wi-Fi 2.4 GHz, 100 Base-T Ethernet (standard), 1000 Base-T Ethernet (optional).
  14. Software: REMUS Vehicle Interface Program (VIP), GUI-based laptop interface for programming, training, documentation, maintenance and troubleshooting.
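The endurance figure in item 6 follows directly from battery capacity and average power draw. Below is a minimal sketch; the 225 W average load is an assumption back-solved from the quoted figures (5.4 kWh / 24 h), since the source gives only capacity and endurance:

```python
# Rough endurance estimate for the REMUS 600's standard battery.
# The ~225 W average load is assumed, not published; actual draw
# varies with speed and sensor configuration.

battery_kwh = 5.4            # standard single battery tray
avg_load_w = 225.0           # assumed average power draw

endurance_h = battery_kwh * 1000.0 / avg_load_w
print(f"~{endurance_h:.0f} h")                             # ~24 h

# With the optional second 5.4 kWh tray, endurance roughly doubles:
print(f"~{2 * battery_kwh * 1000.0 / avg_load_w:.0f} h")   # ~48 h
```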

Operators can monitor the AUV's progress and status via an acoustic link. This also enables amendments to the mission plan to be sent to the vehicle, along with position updates if required. The HiPAP (KONGSBERG acoustic underwater positioning and navigation system) or Ranger positioning systems provide acoustic aiding to the on-board IMU (Inertial Measurement Unit) and DVL (Doppler Velocity Log) equipment to make the real-time position solution as accurate as possible. Some HYDROID AUVs also transmit real-time side-scan and bathymetry data back to the operator acoustically. This data is displayed on the payload computer screen to give the operations team confidence that the mission is progressing as planned and that there are no gaps in the data. When the AUVs are on the surface, they can communicate with the operator via Wi-Fi or radio. They are also equipped with GPS receivers to update the IMU position with the most accurate information available (Kongsberg, 2017).
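To illustrate how DVL velocities and occasional acoustic fixes combine into a real-time position solution, here is a highly simplified 2-D dead-reckoning sketch. A real navigator blends the fix with the inertial estimate in a Kalman filter rather than overwriting it, and all of the numbers here are invented:

```python
import math

# Simplified 2-D dead reckoning: integrate DVL velocity along the IMU
# heading, then snap to an occasional acoustic (e.g., HiPAP/LBL) fix.

x, y = 0.0, 0.0                 # position estimate, meters
heading_deg = 45.0              # from the IMU
speed_mps = 1.5                 # from the DVL
dt = 1.0                        # seconds per step

for t in range(600):            # ten minutes of dead reckoning
    x += speed_mps * math.sin(math.radians(heading_deg)) * dt
    y += speed_mps * math.cos(math.radians(heading_deg)) * dt

print(f"dead-reckoned: ({x:.0f}, {y:.0f}) m")

acoustic_fix = (640.0, 628.0)   # hypothetical acoustic position solution
x, y = acoustic_fix             # correct the accumulated drift
print(f"after fix:     ({x:.0f}, {y:.0f}) m")
```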

The REMUS 600 uses the Vehicle Interface Program (VIP), which simplifies mission planning, maintenance, and data analysis for users. Communication between the control station and the vehicle is via an Ethernet or Wi-Fi connection, which allows operators to build a mission target deck, view the target deck on a map to make adjustments or corrections, check for warning messages if the target deck is loaded incorrectly, and display vehicle status, with green indicators meaning OK and red indicators meaning a fault exists.
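The status display described above amounts to rolling each subsystem's go/no-go flag up into a color. Here is a minimal sketch of that pattern; the subsystem names and values are invented, since the source does not document VIP's internals:

```python
# Hypothetical go/no-go status roll-up like the VIP's green/red indicators.

subsystem_ok = {
    "ground_fault": True,
    "housing_leak": True,
    "dvl": True,
    "acoustic_modem": False,   # simulated fault
}

for name, ok in subsystem_ok.items():
    print(f"{name:15s} {'GREEN (OK)' if ok else 'RED (FAULT)'}")

# Overall go/no-go: the vehicle is mission-ready only if every check passes.
print("GO" if all(subsystem_ok.values()) else "NO-GO")
```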

Identify any negative issues or challenges that are currently faced by users, then recommend changes or additions. 

With the future development of UUVs and AUVs, there need to be standardized interfaces with a common architecture for communicating with and controlling the vehicles. This would minimize logistics and extend performance, so that all underwater vehicles could cooperate with other underwater systems. There also needs to be more onboard processing of raw data into information, with that information compressed before it is transmitted to other vehicles and to the ground control station aboard a ship or on land. All underwater operations have a difficult time communicating, and in the near term more research needs to be done on communication capabilities. Greater autonomy would reduce the communications burden immensely. The vehicles also need the ability to navigate and report their coordinate position to the control station in order to integrate the data from the sensors. A possible solution would be to attach small modules to the sea floor that could send acoustic transmissions or relay Global Positioning System (GPS) navigational fixes for the vehicles to follow.
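To make the "process onboard, then compress" recommendation concrete, here is a minimal sketch using only the Python standard library. The report format is invented; the point is simply that a small processed summary, not the raw sensor data, is what goes over the bandwidth-starved acoustic link:

```python
import json
import zlib

# Instead of transmitting raw ping data, the vehicle reduces it onboard
# to a small report and compresses it before sending it over the
# bandwidth-limited acoustic link.

raw_ping_samples = list(range(4096))   # stand-in for raw sonar data

report = {                             # hypothetical processed summary
    "contacts": 2,
    "min_range_m": 41.7,
    "battery_pct": 63,
    "pos": [636.4, 636.4],
}

payload = zlib.compress(json.dumps(report).encode())
print(f"raw samples: {len(raw_ping_samples) * 2} bytes as 16-bit values")
print(f"report:      {len(payload)} bytes after compression")
```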
I also think the UUV should be able to recharge itself from some type of energy station placed on the sea floor. The UUV could receive data from other vehicles in the working area via a small buoy or antenna on the ocean surface to regain its GPS fix and timing (much like realigning an INS on an airplane). Using this buoy or antenna system, the UUV could transmit its data via satellite or via an unmanned aerial vehicle (UAV) relay, providing constant, real-time warning indications to the control station.



References
           Cision. (2015). Military unmanned maritime vehicles (UMV) market report
           2016-2026. Retrieved from http://www.prnewswire.com/news-releases/military-unmanned-
           maritime-vehicles-umv-market-report-2016-2026-561412801.html

           Kongsberg. (2017). REMUS 600 autonomous underwater vehicle. Retrieved from
             https://www.km.kongsberg.com/ks/web/nokbg0240.nsf/AllWeb
             /F0437252E45256BDC12574AD004BDD4A?OpenDocument