Friday, January 30, 2015

Multicopter Photography and Racing: A Comparison of Commercially Available Platforms (UNSY 603 Activity 3.4)


Introduction
This paper will identify two commercial-off-the-shelf (COTS) Unmanned Aircraft System (UAS) platforms, each to be used in a different application. One will be used for aerial photography and videography below 400 ft. above ground level (AGL), and the other for racing utilizing a first-person view (FPV) camera. Both applications require the same basic tasks: achieve flight, carry a camera, and transmit that camera's imagery to the operator in real time. The difference between the platforms lies in the flight regimes they are expected to perform. The photo/video UAS must fly stably, carry a large camera on a mechanical stabilizing device (a gimbal), and fly for extended durations. The racing UAS needs a high power-to-weight ratio, a small imaging device, and a power system optimized for high current draw and short flight times.

Aerial Photography and Videography
DJI is a manufacturer of aerial photography and videography UAS of various sizes and configurations. The decision of which platform to use depends on factors such as the user's experience level, the size of the camera, the expected flight duration, and the user's budget. This paper will evaluate the DJI Inspire 1. While it is not the most capable UAS platform for this task, it is an excellent entry-level UAS with strong capabilities at a reasonable price point.
The Inspire 1 was released in late 2014, in time for the Christmas holiday and the Consumer Electronics Show in early 2015. This UAS was advertised as a Ready-to-Fly (RTF) platform for aerial photography and videography and marketed toward individuals with little to no experience with UAS. The Inspire 1 makes use of the operator's existing portable electronics, such as a smartphone or tablet, and allows for single- or dual-operator control (one operator flies the aircraft while the other manipulates the camera). The camera for the Inspire 1 uses a Sony 1/2.3-inch sensor capable of shooting 4K video (4096 x 2160 pixels). The sensor is a Complementary Metal-Oxide Semiconductor (CMOS) sensor, which is cheaper and easier to produce than a Charge-Coupled Device (CCD) sensor, but may cause some undesirable effects in the images. CMOS sensors scan the scene line by line, in what's called a "rolling shutter," while CCD sensors capture every pixel at once, in what's called a "global shutter." The rolling shutter becomes apparent when shooting fast-moving objects, or when the camera is moved quickly at the moment the image is taken: fast-moving objects appear slanted to one side (Fig. 1).

Fig. 1: Rolling Shutter Effect
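The line-by-line readout behind the rolling shutter effect can be sketched in a few lines of Python. The row count, per-row readout time, and object speed below are arbitrary assumed values, chosen only to make the skew obvious; they do not describe the Inspire 1's actual sensor timing.

```python
# Illustrative sketch (not vendor code): why a rolling shutter skews
# fast-moving objects. A CMOS sensor reads out row by row, so each
# lower row is sampled a little later, by which time a moving vertical
# edge has drifted sideways -- the edge records as a slanted line.

ROWS = 8               # sensor rows (tiny, for readability)
ROW_READ_TIME = 0.001  # seconds to read one row (assumed value)
OBJECT_SPEED = 2000.0  # object speed in pixels per second (assumed value)

def column_at_row(row):
    """Horizontal position of a moving vertical edge when `row` is sampled."""
    t = row * ROW_READ_TIME        # later rows are sampled later
    return int(OBJECT_SPEED * t)   # the edge has drifted right by then

frame = [column_at_row(r) for r in range(ROWS)]
print(frame)  # monotonically increasing positions: the edge appears slanted
```

With a global shutter, every row would be sampled at the same instant and the list would contain a single repeated value, i.e. a straight vertical edge.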

For aerial photography UAS, the designer usually places the sensor well below the aircraft, where it has a clear view of the ground, unobstructed by the airframe and propellers. Usually, this requires the airframe to have retractable landing gear, or landing gear that rotates with the camera (Fig. 2). The Inspire 1 utilizes a unique retract system that reconfigures the shape of the airframe after takeoff to give the camera a 360° unobstructed field of view (Fig. 3). The aircraft reconfigures itself automatically, using an ultrasonic altimeter to detect its proximity to the ground.


Fig 2: Retractable Landing Gear (Top) and Rotating Landing Gear (Bottom)


Fig. 3: DJI Inspire 1 Flying Configuration (Top) and Landing Configuration (Bottom)

With its low cost, simple operation, out-of-the-box functionality, and approximately 18-minute flight time, the DJI Inspire 1 will meet the goals of most amateur aerial photographers and videographers. If the objective is to carry larger, industry-standard cameras for professional photography and videography, the author suggests the DJI Spreading Wings S800, S900, and S1000 UAS.

First Person View (FPV) Racing
The foremost goal in racing vehicles of any type is a high power-to-weight ratio: a great deal of power compared to the total weight of the vehicle. Multicopters, and helicopters in general, were created to take off and land vertically and to maneuver in all directions. Compared to their fixed-wing counterparts, they are not particularly suited to moving in any direction at great speed. Helicopters are limited by a phenomenon known as retreating blade stall: at high forward speed, the blade on the retreating side of the rotor disk no longer moves through the air fast enough to generate lift, and the aircraft rolls sharply toward that side, often with catastrophic results. Multicopters are not limited so drastically; their speed limitation is a simple physics vector equation (Fig. 4). The aircraft requires a certain amount of thrust to sustain level flight, and not surprisingly, this thrust is equal to the aircraft's weight. To achieve lateral flight, the propulsion system must produce not only the power to support the aircraft's weight, but also the power to overcome aerodynamic drag. If the aircraft has sufficient power available, it becomes possible to angle the aircraft significantly and achieve high lateral speeds. However, because multicopters maneuver by varying the speeds of their motors, using all of the available power leaves no margin left to maneuver. The author has experienced this effect while flying a multicopter at its maximum speed, resulting in an uncontrolled descent and a subsequent high-speed crash. For this reason, multicopter autopilots are programmed with pitch and roll limits, which impose a top forward speed.


Fig. 4: Vertical and Horizontal Components of Lift
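The vector relationship in Fig. 4 can be worked through numerically: in a steady tilt, the vertical thrust component must still equal the weight (T cos θ = W), so the total required thrust grows as 1/cos θ, and the maximum usable tilt follows from the motors' maximum thrust. The numbers below are assumed examples, not measurements of any particular aircraft.

```python
import math

# Illustrative sketch of the thrust-vector relationship in Fig. 4
# (assumed example numbers, not a specific aircraft's measurements).

def thrust_required(weight_n, tilt_deg):
    """Total thrust needed to hold altitude at a given tilt angle.

    The vertical component must still equal weight: T * cos(tilt) = W,
    so T = W / cos(tilt). The horizontal component T * sin(tilt) is
    what accelerates the aircraft and overcomes drag.
    """
    return weight_n / math.cos(math.radians(tilt_deg))

def max_tilt(weight_n, max_thrust_n):
    """Steepest tilt at which the motors can still support the weight."""
    return math.degrees(math.acos(weight_n / max_thrust_n))

weight = 15.0      # aircraft weight in newtons (assumed)
max_thrust = 60.0  # a 4:1 thrust-to-weight ratio, typical of racing quads

print(round(thrust_required(weight, 45.0), 1))  # ~21.2 N at 45 degrees
print(round(max_tilt(weight, max_thrust), 1))   # ~75.5 degrees at full power
```

Note that at the maximum tilt angle the motors are already saturated, which is exactly the no-margin-to-maneuver condition described above; autopilot pitch limits are set well inside this boundary.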

The sensing system on an FPV racing multicopter needs to be lightweight, have a large field of view, and be relatively unobstructed. To reach high speeds, the aircraft must pitch forward sharply to accelerate; an unstabilized camera then faces the ground, and the pilot loses the forward view. It is therefore prudent for the camera to have a simple servo-controlled pitch stabilizer to maintain forward visibility during acceleration. Many FPV racers use the GoPro camera for its wide field of view (170°), small size, and high-resolution recording capabilities. There are certainly smaller, lighter cameras, such as the Sony board camera line, but if the pilot wishes to record the race in high definition to review later, the GoPro is a good choice. To achieve a view mostly unobstructed by propellers, the camera on an FPV racer is typically placed as far forward on the airframe as possible. There is no compelling reason to place the camera underneath the frame, as photo/video UAS do; doing so adds unnecessary drag and necessitates large landing gear, which contribute still more drag. Many FPV racers utilize a frame shape that places the midsection of the aircraft, and hence the camera, as far forward as possible, grotesquely referred to as the "dead cat" frame (Fig. 5). Most FPV racers purchase the components for their multicopter separately (frame, motors, propellers, speed controllers, battery, and electronics) and assemble the airframe themselves, which is a very cost-effective way to acquire a multicopter.



Fig. 5: “Dead Cat” Quadcopter Frame
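The servo-controlled pitch stabilizer described above amounts to a simple counter-rotation: the camera tilts up by the same angle the aircraft pitches forward, within the servo's travel. The sketch below uses assumed servo limits; a real build would read the pitch angle from the flight controller and drive the servo via PWM.

```python
# Minimal sketch of a one-axis camera pitch compensator for FPV racing.
# Servo travel limits are assumed values, not a specific product's spec.

SERVO_MIN_DEG = -45.0  # assumed mechanical travel limits
SERVO_MAX_DEG = 45.0

def camera_tilt(forward_pitch_deg):
    """Tilt the camera up by the aircraft's forward pitch angle.

    When the quad pitches 30 degrees nose-down to accelerate, the camera
    tilts 30 degrees up, keeping the horizon in view. The command is
    clamped to the servo's mechanical travel.
    """
    return max(SERVO_MIN_DEG, min(SERVO_MAX_DEG, forward_pitch_deg))

print(camera_tilt(30.0))  # moderate acceleration: fully compensated
print(camera_tilt(60.0))  # aggressive pitch: the servo saturates at its limit
```

The saturation case is why racers accept a partial loss of horizon during the hardest acceleration: a servo with unlimited travel would add weight and drag that a racing airframe cannot afford.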

As the focus of this paper is to evaluate COTS multicopters, the best off-the-shelf offering for a racing multicopter is most likely the Blade 350 QX. The 350 QX has a nearly 4:1 power-to-weight ratio, an "agility mode" that gives the pilot a great deal of controllability and maneuvering range, and a flight time of about 15 minutes. With the addition of a GoPro or similar camera, the 350 QX is a very competitive FPV racing UAS.


Fig. 6: Blade 350 QX



References
DJI Inspire 1 Specifications. (n.d.). DJI, Inc. Retrieved from http://www.dji.com/product/inspire-1/spec
Active Pixel Sensor (APS). (n.d.). Wikipedia. Retrieved from http://en.wikipedia.org/wiki/Active_pixel_sensor
Rolling-Shutter-Effekt [Rolling shutter effect]. (n.d.). Wikipedia Deutschland. Retrieved from http://de.wikipedia.org/wiki/Rolling-Shutter-Effekt
Blade 350 QX. (n.d.). Horizon Hobby, Inc. Retrieved from http://www.bladehelis.com/350qx/




Thursday, January 22, 2015

Unmanned Systems Maritime Search & Rescue (UNSY 605 Assignment 2.4)

The REMUS 6000 Autonomous Underwater Vehicle (AUV), employing a wide array of underwater sensors, was critical to finding the wreckage of Air France flight 447. AF447 was en route from Brazil to France when it encountered icing conditions over the Atlantic Ocean that blocked its pitot tubes, the probes used to measure the aircraft's airspeed. The pilots responded incorrectly, stalling the airplane and ultimately crashing it into the ocean, killing all of the occupants (Ferrante, Kutzleb & Purcell, 2011).

The REMUS 6000 incorporates numerous sensors to determine its position and navigate. Underwater navigation requires aggregating data from many sources to determine an accurate position and track. To this end, the vehicle uses an Inertial Measurement Unit (IMU) that records linear and angular accelerations to estimate its travel from a known location. Augmenting this estimate is an Acoustic Doppler Current Profiler (ADCP), which takes acoustic measurements of the vehicle's movement over the seabed. Two transducers receive acoustic position data from Deep Ocean Transmitters (DOTs) placed beforehand at known locations. The vehicle is equipped with a Global Positioning System (GPS) receiver, but this sensor is only usable at the water surface. To avoid collisions, the vehicle uses a pencil-beam sonar that informs the control system of obstacles in the AUV's immediate path that necessitate evasive action. Depth is measured using a pressure sensor combined with the ADCP measurement. The vehicle is also equipped with a conductivity (salinity) sensor and ground fault detection (Kongsberg, 2012).
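The aggregation described above can be pictured with a toy example. The sketch below is not Kongsberg's navigation algorithm; it is a minimal dead-reckoning loop with a single blended acoustic fix, using assumed velocities and an assumed blending weight, to show why periodic external fixes are needed to bound inertial drift.

```python
# Illustrative sketch (not the REMUS 6000's actual algorithm) of blending
# dead reckoning from the IMU/ADCP with an occasional acoustic fix from a
# seabed transmitter. All numbers are assumed example values.

FIX_WEIGHT = 0.8  # trust placed in an acoustic fix when one arrives (assumed)

def dead_reckon(position, velocity, dt):
    """Advance the position estimate using measured velocity over ground."""
    x, y = position
    vx, vy = velocity
    return (x + vx * dt, y + vy * dt)

def apply_fix(estimate, fix):
    """Blend an acoustic position fix into the running estimate."""
    ex, ey = estimate
    fx, fy = fix
    return (ex + FIX_WEIGHT * (fx - ex), ey + FIX_WEIGHT * (fy - ey))

# Cross-track error accumulates between fixes, then a fix pulls it back.
pos = (0.0, 0.0)
for _ in range(60):                          # one minute of dead reckoning
    pos = dead_reckon(pos, (1.5, 0.1), 1.0)  # 1.5 m/s along track, slight drift
pos = apply_fix(pos, (90.0, 0.0))            # acoustic fix: the drift was error
print(pos)
```

A production system would use a Kalman filter rather than a fixed blending weight, but the structure is the same: propagate with inertial and Doppler data, then correct with absolute acoustic (or surface GPS) fixes.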

To search the ocean floor for wreckage, the purpose-built REMUS 6000 uses several exteroceptive sensors, many developed specifically for the underwater environment. It uses EdgeTech dual-frequency side-scan sonar sensors to map the topography of the ocean floor and search for sonar returns characteristic of man-made objects, 400 to 700 meters to the left and right of the vehicle (Ferrante, Kutzleb & Purcell, 2011), flying a raster pattern to cover the search area. These sensors are similar to the side-looking radar used on aircraft, but because radio waves attenuate too quickly underwater, acoustic waves are used instead. When an object of interest is found, the REMUS 6000 employs a multi-beam profiling sonar, which creates a 3-dimensional sonar image. To confirm the existence of wreckage, the vehicle uses an electro-optical imager synchronized to a strobe light. When searching a known debris field, the vehicle employs a sub-bottom profiling sonar to search for buried debris (Woods Hole Oceanographic Institution, 2012).

Would the REMUS 6000 be more effective if paired with an Unmanned Aircraft System (UAS)? Without some idea of where to begin the search, AUVs are very ineffective tools; they scan the ocean bottom slowly (1-4 knots). Aircraft, manned and unmanned, scan the ocean surface much faster than AUVs scan the ocean floor. Any floating debris or oil slicks indicating the aircraft's last known position (LKP) will narrow the underwater search considerably and make the REMUS 6000 more effective. AF447's wreckage was found 6.5 nautical miles from its LKP (Ferrante, Kutzleb & Purcell, 2011).
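The survey speeds and swath widths cited above make the value of narrowing the search area easy to quantify. The sketch below is back-of-the-envelope arithmetic only; the 100 km² search box is a hypothetical figure, not a number from the AF447 operation.

```python
# Rough coverage arithmetic using the figures cited above: a side-scan
# swath of 400-700 m per side and survey speeds of 1-4 knots.

KNOT_MS = 0.514444  # meters per second in one knot

def coverage_rate_km2_per_hr(speed_knots, swath_per_side_m):
    """Area mapped per hour: speed times total swath (both sides)."""
    speed_ms = speed_knots * KNOT_MS
    swath_m = 2 * swath_per_side_m
    return speed_ms * swath_m * 3600 / 1e6  # m^2/s -> km^2/hr

def hours_to_search(area_km2, speed_knots, swath_per_side_m):
    """Ideal time to cover an area, ignoring turns and swath overlap."""
    return area_km2 / coverage_rate_km2_per_hr(speed_knots, swath_per_side_m)

# A hypothetical 100 km^2 box at a mid-range 3 knots and 600 m per side:
print(round(coverage_rate_km2_per_hr(3.0, 600.0), 1))  # km^2 per hour
print(round(hours_to_search(100.0, 3.0, 600.0), 1))    # hours of survey time
```

Even under these ideal assumptions the box takes on the order of a full battery endurance to cover, which is why shrinking the search area with surface clues matters so much.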

Are AUVs more or less effective at carrying underwater sensing equipment than a manned submersible? Both vehicles are capable of carrying and employing the same sensors. However, a manned submersible carries a great deal of superfluous equipment, not the least of which is its human occupants. AUVs are smaller and more hydrodynamic, require less power for operation and locomotion, and need only small pressure vessels for their equipment, flooding the remainder of the vehicle (Christ & Wernli, 2013). In the event of a catastrophic failure of the power system or pressure vessels, AUVs ideally jettison ballast and return to the surface. A manned submersible can do the same, but doing so may endanger its occupants. If an AUV can perform the same sensory tasks, perhaps this implies a moral obligation not to put humans in harm's way. Finally, the REMUS 6000 can perform its sensing missions for up to 16 hours; human factors would preclude a manned submersible from staying on station for that amount of time.

Are there any conceivable ways to improve upon the REMUS 6000's role in underwater search and rescue? Without being part of the design process, it is difficult to weigh all of the system-level trade-offs. One idea is simply to increase the vehicle's battery capacity, ideally yielding a longer time on station. However, since the data must be downloaded directly from the vehicle, perhaps 16 hours is the longest interval its designers were willing to wait for data. Another idea is to employ a small go-between AUV that offloads the large datasets from the REMUS 6000 and returns them to the control vessel.




References:
Ferrante, B., Kutzleb, B., & Purcell, M. (2011). AF447 Underwater Search and Recovery Operations: A Shared Government-Industry Process. Sterling, VA: International Society of Air Safety Investigators.
Christ, R. & Wernli, R. (2013). The ROV Manual, Second Edition. Oxford: Elsevier/ Butterworth-Heinemann Press. 
REMUS 6000 Specifications. (2012). Woods Hole Oceanographic Institution, Oceanographic Systems Laboratory. Woods Hole, MA. Retrieved 22 January 2015, from http://www.whoi.edu/page.do?pid=105976
REMUS 6000 Datasheet. (2012). Kongsberg: Hydroid, Inc. Retrieved 22 January 2015, from http://www.km.kongsberg.com/



Sunday, January 18, 2015

Gorgon Stare (UNSY 605 Activity 1.5)

"With Air Force's Gorgon Drone 'we can see everything'" (Summary)

This article highlights the latest in the Gorgon Stare saga, the story of an unmanned sensing system that breaks the mold: observing a city all at once and documenting everything it sees. The article begins with a circulated and speculated-over photo depicting a U.S. Air Force MQ-9 Reaper in Afghanistan carrying what the article surmises are new Gorgon Stare modules (called Increment II).

The article points to the limitations of current Unmanned Aircraft System (UAS) sensors, which can only observe one thing at a time (also known as the "soda straw" effect). The Gorgon Stare, it reports, can send 65 different images to users simultaneously. Perhaps the greatest achievement of this new technology is its ability to track patterns of life, offering a forensic analysis of movement.

The Gorgon Stare made a public appearance in the PBS documentary Rise of the Drones, which pointed to it as the future of unmanned warfare. The technology enabling the Gorgon Stare is disturbingly simple in theory: it is the cobbling together of 368 five-megapixel smartphone camera sensors into a single imager of roughly 1.8 billion pixels. The article explains that the name Gorgon refers to a mythical Greek creature whose unblinking gaze turned all who beheld it to stone. All in all, the name gives this futuristic sensor a science-fiction aura.

The second half of the article turns the readers' attention to the military's use of intelligence, and questions whether more information (like that provided by Gorgon Stare) is actually a solution to a problem. It details how intelligence analysts watch hours of mind-numbing surveillance video without result. One idea for managing the massive amount of imagery that Gorgon Stare will create is to catalog it by location and event: "[A]n analyst in Afghanistan can retrieve the last month's worth of bombings in a particular stretch of road with a push of a button."

Pointing to the future, the article addresses some of the Air Force's ideas of how the technology will be used. The Air Force reportedly wants to implement wide-area surveillance systems on airships, and hopes to replace boots on the ground to some degree with better sensing systems. Air Force officials also say the system has civilian potential, such as in border surveillance and natural disaster response.

References:

Nakashima, E., Whitlock, C. (2011). With Air Force's Gorgon Drone, 'we can see everything.' The Washington Post. www.washingtonpost.com. Web. Accessed 18 Jan 2015

Jennings, G. (2014). USAF Image Appears to Show Gorgon Stare Increment II in Afghanistan. IHS Janes 360. www.janes.com. Web. Accessed 18 Jan 2015


Research: History of UAS (ASCI 530 Assignment 1.5)

Though many parallels may be drawn between early and modern Unmanned Aircraft Systems (UAS), perhaps the most direct continued practice is that of the Optionally Piloted Vehicle (OPV). Early UAS prototypes made use of existing manned aircraft platforms as test beds for new autonomous technologies. Today, OPV platforms are used for the same purpose: evaluating untested UAS technology in a proven manned platform, allowing the redundant safety of a human pilot in the loop.

The Sperry Aerial Torpedo project of 1911-1918 made use of the Curtiss N-9 seaplane, a proven manned aircraft design, to create a possibly viable unmanned aerial bombing platform. At the beginning of the project, manned, controlled, heavier-than-air flight had been achieved only eight years prior, and a great many technical challenges remained to be solved before the Sperry Aerial Torpedo could succeed. Human pilots performed the takeoff and then transferred control to the Sperry autopilot. When the Curtiss airplane company created a purpose-built airframe called the "Curtiss-Sperry Flying Bomb," Lawrence Sperry elected to be the test pilot. Sperry crashed twice, once when the aircraft hit a patch of ice on takeoff, and again while transferring control to the autopilot; he was unhurt in both accidents.

A modern example of the OPV is the Centaur by Aurora Flight Sciences. Based on the Diamond DA42 manned aircraft, the Centaur provides for fully autonomous flight, manned pilotage, or a combination of the two control regimes. Its unmanned functionality may be accessed remotely or from a terminal within the aircraft. The impetus behind creating an OPV today coincides with the reasons behind early optionally piloted UAS, and adds some new ones. Like early UAS, Centaur allows component manufacturers to test new technologies in a low-risk environment for all or part of the flight profile. Centaur also offers benefits that may not have been realized in early UAS, such as mitigating legal restrictions and allowing the aircraft to be repositioned as a conventional manned aircraft.

References:
Centaur. Manassas: Aurora Flight Sciences, Inc. www.aurora.aero. (n.d.) Web. Accessed 18 Jan 2015
Hewitt-Sperry Automatic Airplane (n.d.) en.wikipedia.com. Web. Accessed 18 January 2015