Saturday, February 14, 2015

UAS in the NAS (ASCI 503 Assignment 4.5)

The busy and often congested airspace over the United States accommodates commercial, military, and recreational air traffic. With the addition of Unmanned Aircraft Systems (UAS) to this airspace, current and future stakeholders are concerned that the safety of air transportation will be put at risk. Pilots of all aircraft are required by law to maintain vigilance and avoid other aircraft when weather conditions permit1. Because the pilots of UAS are removed from their aircraft, some other means of avoiding other aircraft is necessary to adhere to this regulation.

To aid in separation, manned aircraft are monitored by Air Traffic Control (ATC) by a variety of means. At large airports, air traffic controllers sit in a control tower, an elevated building with windows, and visually monitor the aircraft. Controllers may also have a surveillance system such as radar, which measures radio reflections from aircraft to determine their positions. These aircraft may carry equipment on board (called a transponder) that transmits supplementary information to the controller, such as airspeed, altitude, aircraft identification, and coordinates. Additionally, aircraft may carry a Traffic Collision Avoidance System (TCAS) that alerts the pilot to nearby aircraft. Flights may be controlled or uncontrolled, depending on the airspace they operate in and the type of flight. Controlled aircraft must fly in controlled airspace, file a flight plan, and remain in communication with ATC. Uncontrolled aircraft must fly in certain classes of airspace, do not file flight plans, in most cases do not have to communicate with ATC, and must fly in good weather.

In order to comply with the “see and avoid” requirement, the integration of UAS into the national airspace to date has involved keeping the UAS within the visual line of sight of an observer on the ground or in a manned chase aircraft. This observer must be in direct communication with the pilot of the UAS in order to issue commands to avoid other aircraft. For some applications of UAS, this method is sufficient to carry out the task. However, the benefit of using UAS is often diminished by the limitation of a ground observer’s line of sight or the expense of concurrently flying a chase aircraft. The Federal Aviation Administration (FAA) is responsible for regulating the safety of air transportation in the United States. The FAA has moved slowly in integrating UAS, due to the complex nature of the problem and the risk of an accident. One might argue that the FAA would have chosen not to integrate UAS at all, were it not for a mandate from Congress to integrate UAS into the National Airspace System (NAS) by 2015.
The FAA’s goal in integrating UAS into the NAS is to achieve a level of safety equivalent to that of manned aircraft in the airspace. To attain this level of safety, it is important to understand the risks involved. If a commercial airliner crashes, there is the potential for hundreds of fatalities and widespread property destruction. If a small recreational passenger plane crashes, there are usually only a handful of fatalities, if any, and minimal property damage. Accordingly, there are many more regulations governing the operation of turbine-powered commercial aircraft than recreational aircraft. UAS vary in size from the Global Hawk, which has a wingspan of 116 feet, to the Hummingbird UAS, whose wingspan is only 6.3 inches. While both of these platforms share the issue of having the pilot removed from the aircraft, the potential for loss of life and property varies greatly between them. For this reason, one set of regulations will not be appropriate for all UAS.
The first step the FAA will take in creating these regulations is to divide UAS into categories. The FAA has already stated that it will not regulate UAS used for recreational purposes (no size limitation was imposed). Next, the FAA is expected to release rules for what it calls small UAS, arbitrarily defined as 55 pounds or less. A “small UAS rule” was expected from the FAA in late 2014, but had not been released as of this writing. For these small UAS, the FAA may still only allow flights within visual line of sight.
The question remains of how UAS can operate in the NAS beyond the line of sight of their operators. The aforementioned technologies for aircraft separation can be employed by UAS in some cases. UAS large enough to carry the equipment for broadcasting their position and receiving traffic data may be able to meet the level of safety of manned aircraft. However, this technology is only effective when all aircraft are equipped. The FAA has mandated that nearly all aircraft be equipped with position-reporting transponders by 2020 as part of its NextGen ATC system. These transponders are becoming smaller, lighter, and cheaper, allowing them to be used on small UAS.
While this NextGen system is put into place, would-be operators of unmanned aircraft are calling for a solution. Companies that have property and infrastructure spread out over large areas, such as pipeline and power line companies, would like to survey this infrastructure with UAS. Without some means of detecting and avoiding air traffic, this application will likely have to stay within line of sight.
Until new technology becomes commercially available to reliably detect, sense, and avoid air traffic, the FAA’s limitations on UAS are going to remain. The largest UAS may be able to remain in positively controlled airspace and rely completely on ATC for separation, but this still does not represent a level of safety equivalent to manned aircraft. Many view the forthcoming UAS regulations from the FAA as an inevitability, but the agency that regulates the world’s safest form of transportation is not going to make hasty changes that could jeopardize that record.

1Code of Federal Regulations, Title 14, Part 91.113(b)

References:
NASA Armstrong Fact Sheet: Unmanned Aircraft Systems Integration in the National Airspace System. (2014). Armstrong Flight Research Center. National Aeronautics and Space Administration. Retrieved from: http://www.nasa.gov/centers/armstrong/news/FactSheets/FS-075-DFRC.html#.VN_R0PnF_ZI
Sagetech Unmanned Solutions. (2015). Sagetech, Inc. Retrieved from: http://www.nasa.gov/centers/armstrong/news/FactSheets/FS-075-DFRC.html#.VN_R0PnF_ZI
Air Traffic NextGen Briefing. (2014). Federal Aviation Administration. Retrieved from: http://www.faa.gov/air_traffic/briefing/
FAA Modernization and Reform Act of 2012. (2012). U.S. Government Printing Office.

Sunday, February 8, 2015

Data Storage for the Black Hornet UAS (UNSY 605 Assignment 4.6)

The PD-100 Black Hornet Nano is a helicopter Unmanned Aircraft System (UAS), weighing only 18 grams (0.04 lbs), made by Prox Dynamics in Norway. It has been used by the British military in Afghanistan since 2013. The tiny UAS is capable of delivering live video and still images to operators via its handheld portable ground station. In 2014, the U.S. Army evaluated the Black Hornet under its initiative to acquire an intelligence-gathering UAS that could fit in a soldier’s pocket, called the Cargo Pocket Intelligence, Surveillance, and Reconnaissance (CP-ISR) program. The U.S. Army Natick Soldier Research, Development and Engineering Center requested several changes to the system, including the ability to see at night, fly indoors, and conform to the Army’s Digital Data Link (DDL) UAS communications standard. In late 2014, Prox Dynamics was scheduled to demonstrate a Black Hornet Version 2 (v2) that incorporates these enhanced capabilities. The author will assume, for the sake of deduction, that the Black Hornet v2 successfully met those standards.

The aim of this paper is to identify the methods, procedures, and protocols that are used to collect, transfer and store imagery on the Black Hornet UAS. Because the Black Hornet is a new, proprietary system and information about it is protected by Norwegian export controls, the exact details of the Black Hornet UAS’s functionality are not public information. Nevertheless, the author will utilize all available information and make educated deductions to arrive at the most probable description of the systems.

The first version of the Black Hornet had three electro-optical (EO) cameras: one facing forward, one facing down, and another facing 45 degrees down from the forward direction. The v2 is said to have incorporated a thermal infrared (IR) camera. Small, 1-gram EO cameras, like those used in cell phones, are ubiquitous and were likely used in the Black Hornet. These cameras draw 75 milliamps and produce 720 x 480 resolution. The IR camera selected was most likely the FLIR Lepton, a newly developed long-wave infrared sensor. The Lepton draws 150 milliamps and produces 80 x 60 resolution.

The Black Hornet initially utilized a digital data link, but the waveform and protocol were not specified. The v2 will utilize the U.S. Army Digital Data Link (DDL) protocol, which is defined as LAW Tactical 802.3. It utilizes the L- and S-bands, which together span 1 to 4 gigahertz. Existing DDL radios are too large and/or heavy to be carried by the Black Hornet, so Prox Dynamics will have to either custom-engineer a radio or increase the overall size of the Black Hornet v2. The video and still images could be compressed in any number of formats, but JPEG and MPEG* are the most likely choices.

Specification sheets for the Black Hornet advertise that the ground station can store the video and images from more than six flights. The most taxing sensor for storage is the EO camera running at 30 frames per second. If we assume that the data link throughput is 3.7 megabits per second, which is typical for an off-the-shelf DDL radio, and that the MPEG2 compression format is used, this results in 555 megabytes per flight. Multiplied by six, we know that the ground station is capable of storing at least 3.33 gigabytes of imagery. Digital stills from the EO camera will be negligibly small, at just 42 kilobytes each.
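The storage estimate above can be reproduced with simple arithmetic. The sketch below assumes a 20-minute flight (not a published Black Hornet figure) and the 3.7 Mbit/s link rate quoted above; the worst case is that the ground station records the full link throughput for the whole flight.

```python
# Sketch of the storage estimate above. The 20-minute flight time is an
# assumption chosen for illustration, not a published specification.

LINK_RATE_MBPS = 3.7   # assumed off-the-shelf DDL radio throughput
FLIGHT_MINUTES = 20    # assumed flight duration
FLIGHTS_STORED = 6     # per the product spec sheet

def megabytes_per_flight(rate_mbps: float, minutes: float) -> float:
    """Worst case: record the full link throughput for the whole flight."""
    bits = rate_mbps * 1e6 * minutes * 60
    return bits / 8 / 1e6  # bits -> bytes -> megabytes

per_flight = megabytes_per_flight(LINK_RATE_MBPS, FLIGHT_MINUTES)
total_mb = per_flight * FLIGHTS_STORED
print(f"{per_flight:.0f} MB per flight, {total_mb / 1000:.2f} GB for six flights")
```

At these assumed rates the numbers work out to 555 MB per flight and 3.33 GB for six flights, matching the figures above.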


*Joint Photographic Experts Group (JPEG) and Moving Picture Experts Group (MPEG)


References:

Sisto, J. (2014). Army Researchers Develop Cargo Pocket ISR. Defense AT&L: September-October 2014

MICRO Secure Digital Data Link - MICRO SDDL. (2012). San Diego: L-3 Southern California Microwave. Retrieved from: http://www2.l-3com.com/scm/pdf/datasheets/SCMML628_Rev%20B.pdf

Prox Dynamics Launches Midlife Upgrade Of PD-100 Black Hornet PRS. (2014). Defence Procurement International - Summer 2014. Retrieved from: http://www.proxdynamics.com/backgrounds/9389a8cd-07f6-4a48-b0a8-91a4af465863.pdf

PD-100 Black Hornet PRS System. (2012). Product brochure, Prox Dynamics. Retrieved from: http://www.marlboroughcomms.com/media/9427/black-hornet-uas.pdf

Personal Reconnaissance System: PD-100 Black Hornet. (2014). Product Brochure, Prox Dynamics. Retrieved from: http://adsinc.com/download/sell_sheets/Prox%20Dynamics%20PD-100%20Sell%20Sheet.pdf

PD-100 Black Hornet PRS: Your Personal Reconnaissance System. (2013). Product Brochure, Prox Dynamics. Retrieved from: http://www.marlboroughcomms.com/media/13459/UAS-Black-Hornet-v2.pdf

1 Gram PAL Camera. (n.d.) Product details, FPV Hobby.com. Retrieved from: http://www.fpvhobby.com/63-1-gram-nano-camera-480tvl.html

FLIR LEPTON® Long Wave Infrared (LWIR) Datasheet. (2014). FLIR Systems, Inc. Retrieved from: http://www.flir.com/cores/display/?id=62648

Nano Digital Data Link. (2014). Product brochure, Microhard Systems, Inc. Retrieved from: http://www.microhardcorp.com/brochures/Nano.DDL.Brochure.Rev.1.7.pdf

Video Space Calculator. (2015). Online tool, Digital Rebellion, LLC. Retrieved from: http://www.digitalrebellion.com/webapps/video_calc.html

Friday, January 30, 2015

Multicopter Photography and Racing: A Comparison of Commercially Available Platforms (UNSY 603 Activity 3.4)


Introduction
This paper will identify two commercial-off-the-shelf (COTS) Unmanned Aircraft Systems (UAS) platforms, each to be used in a different application. One will be used for aerial photography and videography below 400 ft. above ground level (AGL), and the other will be used for racing with a first-person view (FPV) camera. These applications require the same basic tasks: achieve flight, carry a camera, and transmit the imagery from that camera to the operator in real time. The difference between the platforms lies in the regimes of flight they are expected to carry out. The photo/video UAS will need to fly stably, carry a large camera with a mechanical stabilizing device (a gimbal), and fly for extended durations. The racing UAS will need a high power-to-weight ratio, a small imaging device, and a power system optimized for high current draw and short flight times.

Aerial Photography and Videography
DJI is a manufacturer of aerial photography and videography UAS of various sizes and configurations. The decision of which platform to use depends on factors such as the user’s experience level, the size of the camera, the expected duration of flight, and the user’s budget. This paper will evaluate the DJI Inspire 1. While it is not the most capable UAS platform for this task, it is an excellent entry-level UAS with strong capabilities at a good price point.
The Inspire 1 was released in late 2014, in time for the Christmas holiday and the Consumer Electronics Show in early 2015. This UAS was advertised as a Ready-to-Fly (RTF) platform for aerial photography and videography and marketed toward individuals with little to no experience with UAS. The Inspire 1 makes use of the operator’s existing portable electronics, such as a smartphone or tablet, and allows for single or dual operator control (one operator flies the aircraft, the other manipulates the camera system). The camera for the Inspire 1 uses a Sony 1/2.3" sensor capable of shooting 4K video (4096 x 2160 pixels). The sensor is a Complementary Metal-Oxide Semiconductor (CMOS) sensor, which is cheaper and easier to produce than a Charge-Coupled Device (CCD) sensor, but may cause some undesirable effects in the images. CMOS sensors scan line by line, in what’s called a “rolling shutter.” CCD sensors capture every pixel at once, in what’s called a “global shutter.” Rolling shutter becomes apparent when shooting fast-moving objects, or when the camera is moved quickly at the moment the image is taken. The result is that fast-moving objects appear to be slanted to one side (Fig. 1).

Fig. 1: Rolling Shutter Effect
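The magnitude of the rolling-shutter slant can be estimated from the sensor's readout time: each row is captured slightly later than the one above it, so a moving subject's bottom edge is displaced relative to its top edge. The numbers below are purely illustrative assumptions, not Inspire 1 sensor timing.

```python
import math

# Illustrative numbers only; not the Inspire 1's actual sensor timing.
READOUT_MS = 30           # assumed time to scan all rows top-to-bottom
FRAME_HEIGHT_PX = 2160    # rows in a 4K frame
OBJECT_SPEED_PX_S = 2000  # horizontal speed of a fast-moving subject

# The bottom of the object is shifted relative to the top by the distance
# it travels during the frame readout.
shift_px = OBJECT_SPEED_PX_S * (READOUT_MS / 1000.0)
skew_deg = math.degrees(math.atan2(shift_px, FRAME_HEIGHT_PX))
print(f"bottom row lags by {shift_px:.0f} px -> ~{skew_deg:.1f} degree slant")
```

A longer readout time or a faster-moving subject makes the slant proportionally worse, which is why global-shutter CCDs avoid the effect entirely.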

For aerial photography UAS, the designer usually places the sensor well below the aircraft, where it will have a good view of the ground, unobstructed by the airframe and propellers. Usually, this requires the airframe to have retractable landing gear, or landing gear that rotate with the camera (Fig. 2). The Inspire 1 utilizes a unique retract system that reconfigures the shape of the airframe after takeoff to give the camera a 360° unobstructed field of view (Fig. 3). The aircraft reconfigures itself automatically, using an ultrasonic altimeter to detect its proximity to the ground.


Fig 2: Retractable Landing Gear (Top) and Rotating Landing Gear (Bottom)


Fig. 3: DJI Inspire 1 Flying Configuration (Top) and Landing Configuration (Bottom)

With its low cost, simple operation, out-of-the-box functionality, and approximately 18-minute flight time, the DJI Inspire 1 will accomplish most amateur aerial photographers’ and videographers’ goals. If the objective is to carry larger, industry-standard cameras for professional photography and videography, the author suggests the DJI Spreading Wings S800, S900, and S1000 UAS.

First Person View (FPV) Racing
The foremost goal in racing vehicles of any type is a high power-to-weight ratio: a great deal of power compared to the total weight of the vehicle. Multicopters, and helicopters in general, were created to take off and land vertically and to maneuver in all directions. They are not particularly suited to moving in any one direction at great speed, compared to their fixed-wing counterparts. Helicopter speed is limited by a phenomenon known as retreating blade stall: when the side of the rotor disk rotating aft can no longer move through the air fast enough to generate lift, the aircraft stalls sharply to one side, invariably causing catastrophic failure. Multicopters are not limited so drastically. Their speed limitation is a simple physics vector equation (Fig. 4). The aircraft requires a certain amount of thrust to sustain level flight; not surprisingly, this required thrust is equal to the aircraft’s weight. To achieve lateral flight, the propulsion system must produce not only the power to support the aircraft’s weight, but also the power to overcome aerodynamic drag. If the aircraft has sufficient power available, it becomes possible to angle the aircraft significantly and achieve high lateral speeds. However, because multicopters maneuver by varying the speed of their motors, utilizing all of the available power leaves no remaining power to maneuver. The author has experienced this effect while flying a multicopter at its maximum speed, resulting in an uncontrolled descent and subsequent crash at high speed. For this reason, multicopter autopilots are programmed with a pitch and roll limit, which results in a top forward speed.


Fig. 4: Vertical and Horizontal Components of Lift
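The vector relationship in Fig. 4 can be checked numerically: the vertical component of thrust must equal weight, so tilting the aircraft raises the total thrust required, and the maximum steady tilt follows directly from the thrust-to-weight ratio. The weight and thrust figures below are illustrative assumptions, not data for any particular airframe.

```python
import math

# Illustrative numbers, not taken from any specific airframe.
WEIGHT_N = 15.0      # aircraft weight
MAX_THRUST_N = 30.0  # total thrust available (a 2:1 thrust-to-weight ratio)

def thrust_required(weight_n: float, tilt_deg: float) -> float:
    """Vertical thrust must still equal weight: T * cos(tilt) = W."""
    return weight_n / math.cos(math.radians(tilt_deg))

# Maximum steady tilt: all available thrust is used just to hold altitude,
# leaving nothing in reserve to maneuver (the failure mode described above).
max_tilt_deg = math.degrees(math.acos(WEIGHT_N / MAX_THRUST_N))

print(f"thrust needed at 30 deg tilt: {thrust_required(WEIGHT_N, 30):.1f} N")
print(f"tilt limit at full throttle: {max_tilt_deg:.0f} deg")
```

This is why autopilot pitch/roll limits amount to a speed limit: the steeper the allowed tilt, the more thrust is diverted into forward flight, but the less margin remains for the motor-speed variations that provide control.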

The sensing system on an FPV racing multicopter needs to be lightweight, have a large field of view, and be relatively unobstructed. To achieve high speeds, the aircraft must pitch forward sharply to accelerate; an unstabilized camera then faces toward the ground, and the pilot no longer has a view of the flight path ahead. Therefore, it is prudent that the camera have a simple servo-controlled pitch stabilizer to maintain forward visibility during acceleration. Many FPV racers use the GoPro camera for its wide field of view (170°), small size, and high-resolution recording capabilities. There are certainly smaller, lighter cameras, such as the Sony board camera line, but if the pilot wishes to record the race in high definition to review later, the GoPro is a good choice. In order to achieve a view mostly unobstructed by propellers, the camera for an FPV racer is typically placed as far forward on the airframe as possible. There is no compelling reason to place the camera underneath the frame, as photo/video UAS do; doing so adds unnecessary drag and necessitates large landing gear, which contribute further drag. Many FPV racers utilize a frame shape that places the midsection of the aircraft, and hence the camera, as far forward as possible, which is grotesquely referred to as the “dead cat” frame (Fig. 5). Most FPV racers purchase the components for their multicopter separately (frame, motors, propellers, speed controllers, battery, and electronics) and assemble the airframe themselves. This is a very cost-effective way to acquire a multicopter.



Fig. 5: “Dead Cat” Quadcopter Frame

As the focus of this paper is to evaluate COTS multicopters, the best off-the-shelf offering for a racing multicopter is most likely the Blade 350 QX. The 350 QX has a nearly 4:1 power-to-weight ratio, an “agility mode” that gives the pilot a great deal of controllability and maneuvering range, and a flight time of about 15 minutes. With the addition of a GoPro camera or similar, the 350 QX is a very competitive FPV racing UAS.


Fig. 6: Blade 350 QX



References
DJI Inspire 1 Specifications. (n.d.) DJI, Inc. Retrieved from: http://www.dji.com/product/inspire-1/spec
Active Pixel Sensor (APS). (n.d.) Wikipedia. Retrieved from: http://en.wikipedia.org/wiki/Active_pixel_sensor
Rolling-Shutter-Effekt. (n.d.) Wikipedia Deutschland. Retrieved from: http://de.wikipedia.org/wiki/Rolling-Shutter-Effekt
Blade 350 QX. (n.d.). Horizon Hobby, Inc. Retrieved from:  http://www.bladehelis.com/350qx/




Thursday, January 22, 2015

Unmanned Systems Maritime Search & Rescue (UNSY 605 Assignment 2.4)

The REMUS 6000 Autonomous Underwater Vehicle (AUV), employing a wide array of underwater sensors, was critical to finding the wreckage of Air France flight 447 (AF447). AF447 was en route from Brazil to France when it encountered icing conditions over the Atlantic Ocean that blocked its pitot tubes, the probes used to measure the aircraft's airspeed. The pilots responded incorrectly, stalling the airplane and ultimately crashing it into the ocean, killing all of the occupants (Ferrante, Kutzleb & Purcell, 2011).

The REMUS 6000 incorporates numerous sensors to determine its position and navigate. Underwater navigation requires aggregating data from many sources to determine an accurate position and track. To this end, the vehicle uses an Inertial Measurement Unit (IMU) that records linear and angular accelerations to estimate its travel from a known location. Augmenting this estimate is an Acoustic Doppler Current Profiler (ADCP), which takes acoustic measurements of the vehicle's movement over the seabed. Two transducers receive acoustic position data from Deep Ocean Transmitters (DOTs) placed beforehand in known locations. The vehicle is equipped with a Global Positioning System (GPS) receiver, but this sensor is only usable at the water's surface. To avoid collisions, the vehicle uses a pencil-beam sonar collision avoidance system that informs the control system of obstacles in the AUV's immediate path so that it can take evasive action. Depth is measured using a pressure sensor combined with the ADCP measurement. The vehicle is also equipped with a conductivity (salinity) sensor and ground fault detection (Kongsberg, 2012).
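The flavor of this sensor aggregation can be sketched in a toy dead-reckoning loop: IMU accelerations are integrated for position, while an ADCP bottom-track velocity continuously corrects the drift. A real AUV uses a full Kalman filter; the fixed blend gain and all numbers below are illustrative assumptions.

```python
# Toy one-axis fusion of IMU dead reckoning with an ADCP velocity fix.
# A real navigation system uses a Kalman filter; the fixed ADCP_WEIGHT
# gain here is a simplifying assumption for illustration.

DT = 1.0           # seconds between updates
ADCP_WEIGHT = 0.2  # how strongly the ADCP measurement corrects the estimate

def step(pos: float, vel: float, imu_accel: float, adcp_vel: float):
    """One fusion step along a single axis."""
    vel += imu_accel * DT                  # integrate IMU acceleration
    vel += ADCP_WEIGHT * (adcp_vel - vel)  # nudge toward ADCP bottom-track
    pos += vel * DT                        # integrate velocity for position
    return pos, vel

pos, vel = 0.0, 1.0  # start at a known fix, moving 1 m/s
for _ in range(60):  # one minute of travel
    # The IMU reports a small spurious acceleration (sensor bias);
    # the ADCP sees the true 1 m/s ground speed and reins the estimate in.
    pos, vel = step(pos, vel, imu_accel=0.01, adcp_vel=1.0)
print(f"estimated position after 60 s: {pos:.1f} m")
```

Without the ADCP correction, the IMU bias would compound indefinitely; with it, the velocity estimate stays bounded near the true value, which is why the two sensors are paired.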

To search the ocean floor for wreckage, the purpose-built REMUS 6000 uses several exteroceptive sensors, many developed specifically for the underwater environment. It uses EdgeTech dual-frequency side-scan sonar sensors, flown in a raster search pattern, to map the topography of the ocean floor and look for sonar returns characteristic of man-made objects, 400 to 700 meters to the left and right of the vehicle (Ferrante, Kutzleb & Purcell, 2011). These are similar to the side-looking radars used on aircraft, but radio waves dissipate too quickly underwater, so acoustic waves are used instead. When an object of interest is found, the REMUS 6000 employs a multi-beam profiling sonar, which creates a three-dimensional sonar image. To confirm the existence of wreckage, the vehicle uses an electro-optical imager synchronized to a strobe light. When searching a known debris field, the vehicle employs a sub-bottom profiling sonar to search for buried debris (Woods Hole Oceanographic Institution, 2012).

Would the REMUS 6000 be more effective if paired with an Unmanned Aircraft System (UAS)? Without some idea of where to begin the search, AUVs are very ineffective tools. AUVs scan the ocean bottom slowly (1-4 knots), while aircraft, manned and unmanned, scan the ocean surface much faster. Any floating debris or oil slicks indicative of the aircraft's last known position (LKP) will narrow the underwater search considerably and make the REMUS 6000 more effective. AF447's wreckage was found 6.5 nautical miles from its LKP (Ferrante, Kutzleb & Purcell, 2011).
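The coverage-rate gap behind this argument can be shown with rough arithmetic. The AUV swath uses the side-scan range quoted above; the aircraft's speed and swath width are assumptions for comparison only, not the performance of any particular platform.

```python
# Rough coverage-rate comparison. The AUV swath comes from the side-scan
# ranges quoted in the text; the aircraft numbers are assumed for contrast.

KNOT_M_S = 0.5144  # meters per second in one knot

def coverage_km2_per_hr(speed_kts: float, swath_m: float) -> float:
    """Area swept per hour by a sensor with the given total swath width."""
    return speed_kts * KNOT_M_S * 3600 * swath_m / 1e6

# Side-scan sonar: 3 kts, roughly 500 m to each side (1000 m total swath).
auv_rate = coverage_km2_per_hr(3, 1000)
# An assumed search aircraft scanning a ~5 km wide swath at 120 kts.
air_rate = coverage_km2_per_hr(120, 5000)
print(f"AUV: {auv_rate:.1f} km^2/hr, aircraft: {air_rate:.0f} km^2/hr")
```

Even with these conservative assumptions the aircraft covers area a couple of orders of magnitude faster, which is why surface clues from an air search so dramatically shrink the underwater problem.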

Are AUVs more or less effective at carrying underwater sensing equipment than a manned submersible? Both vehicles are capable of carrying and employing the same sensors. However, a manned submersible carries a great deal of superfluous equipment, not the least of which is its human occupants. AUVs are smaller and more hydrodynamic, require less power for operation and locomotion, and need only small pressure vessels for their equipment, flooding the remainder of the vehicle (Christ & Wernli, 2013). In the event of a catastrophic failure of the power system or pressure vessels, AUVs ideally jettison ballast and return to the surface. A manned submersible can do the same, but possibly at risk to its occupants. If an AUV can perform the same sensory tasks, perhaps this implies a moral obligation not to put humans in harm's way. Finally, the REMUS 6000 can perform its sensing missions for up to 16 hours; human factors would preclude a manned submersible from staying on station for that long.

Are there any conceivable ways to improve upon the REMUS 6000's role in underwater search and rescue? Without being part of the design process, it is difficult to understand all of the system-level trade-offs. One idea is simply to increase the vehicle's battery size, ideally resulting in a longer time on station. However, since the data must be downloaded directly from the vehicle, perhaps 16 hours is the longest interval that its designers wanted to wait for data. Another idea is to employ a small go-between AUV that offloads the large datasets from the REMUS 6000 and returns them to the control vessel.




References:
Ferrante, B., Kutzleb, B., & Purcell, M. (2011). AF447 Underwater Search and Recovery Operations: A Shared Government-Industry Process. Sterling, VA: International Society of Air Safety Investigators.
Christ, R. & Wernli, R. (2013). The ROV Manual, Second Edition. Oxford: Elsevier/ Butterworth-Heinemann Press. 
REMUS 6000 Specifications. (2012). Woods Hole Oceanographic Institution, Oceanographic Systems Laboratory. Woods Hole, MA. Retrieved 22 January 2015, from http://www.whoi.edu/page.do?pid=105976
REMUS 6000 Datasheet. (2012). Kongsberg: Hydroid, Inc. Retrieved 22 January 2015, from http://www.km.kongsberg.com/



Sunday, January 18, 2015

Gorgon Stare (UNSY 605 Activity 1.5)

"With Air Force's Gorgon Drone 'we can see everything'" (Summary)

This article highlights the latest in the Gorgon Stare saga, the story of an unmanned sensing system that breaks the mold: observing a city all at once and documenting everything it sees. The article begins with a circulated and speculated-over photo depicting a U.S. Air Force MQ-9 Reaper in Afghanistan carrying what the article surmises are new Gorgon Stare modules (called Increment II).

The article points to the limitations of current Unmanned Aircraft Systems (UAS) sensors, which can only observe one thing at a time (also known as the "soda straw" effect). The Gorgon Stare, it reports, can send 65 different images to users simultaneously. Perhaps the greatest achievement of this new technology is its ability to track patterns of life, offering a forensic analysis of movement.

The Gorgon Stare made a public appearance in the PBS documentary Rise of the Drones, where it was pointed to as the future of unmanned warfare. The technology enabling the Gorgon Stare is disturbingly simple in theory: it is the cobbling together of 368 five-megapixel smartphone cameras into a single 1.8-gigapixel sensor. The article explains that the name Gorgon refers to a mythical Greek creature whose unblinking gaze would turn all who beheld it to stone. All in all, it gives this futuristic sensor a science fiction aura.
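The composite-sensor figure is a quick piece of arithmetic, assuming the widely reported count of 368 five-megapixel cameras (treat the count as an assumption rather than an official specification):

```python
# Pixel arithmetic for the composite wide-area sensor described above.
# The 368-camera count is the commonly reported figure, assumed here.
CAMERAS = 368
MEGAPIXELS_EACH = 5
total_gigapixels = CAMERAS * MEGAPIXELS_EACH / 1000  # megapixels -> gigapixels
print(f"{total_gigapixels:.2f} gigapixels per composite frame")
```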

The second half of the article turns the readers' attention to the military's use of intelligence, and questions whether more information (like that provided by Gorgon Stare) is actually a solution to a problem. It details how intelligence analysts watch hours of mind-numbing surveillance video without result. One idea for filing the massive amount of imagery that Gorgon Stare will create is to catalog it by location and event. "[A]n analyst in Afghanistan can retrieve the last month's worth of bombings in a particular stretch of road with a push of a button."

Pointing to the future, the article addresses some of the Air Force's ideas of how the technology will be used. The Air Force reportedly wants to implement wide-area surveillance systems on airships, and hopes to replace boots on the ground to some degree with better sensing systems. Air Force officials also say the system has civilian potential, with uses in border surveillance and natural disaster response.

References:

Nakashima, E., Whitlock, C. (2011). With Air Force's Gorgon Drone, 'we can see everything.' The Washington Post. www.washingtonpost.com. Web. Accessed 18 Jan 2015

Jennings, G. (2014). USAF Image Appears to Show Gorgon Stare Increment II in Afghanistan. IHS Janes 360. www.janes.com. Web. Accessed 18 Jan 2015


Research: History of UAS (ASCI 530 Assignment 1.5)

Though many parallels may be drawn between early and modern Unmanned Aircraft Systems (UAS), perhaps the most direct continued practice is that of the Optionally Piloted Vehicle (OPV). Early UAS prototypes made use of existing manned aircraft platforms as test beds for new autonomous technologies. Today, OPV platforms are used for the same practice- evaluating untested UAS technology in a proven manned platform, allowing the redundant safety of a human pilot in the loop. 

The Sperry Aerial Torpedo project of 1911-1918 made use of the Curtiss N-9 seaplane, a proven manned aircraft design, to create a possibly viable unmanned aerial bombing platform. At the beginning of the project, manned, controlled, heavier-than-air flight had been achieved only eight years prior, and a great many technical challenges had yet to be solved before the Sperry Aerial Torpedo could succeed. Human pilots performed the takeoff and subsequently transferred control to the Sperry autopilot. When the Curtiss airplane company created a purpose-built airframe called the "Curtiss-Sperry Flying Bomb," Lawrence Sperry himself elected to be the test pilot. Sperry crashed twice: once when the aircraft hit a patch of ice on takeoff, and again while transferring control to the autopilot. He was unhurt in both accidents.

A modern example of the OPV is the Centaur by Aurora Flight Sciences. Based on the Diamond DA42 manned aircraft, the Centaur provides for fully autonomous flight, manned pilotage, or a combination of the two control regimes. Its unmanned functionality may be accessed remotely or from a terminal within the aircraft. The impetus behind creating an OPV today coincides with the reasons behind early optionally piloted UAS, and adds some new ones. Like early UAS, the Centaur allows component manufacturers to test new technologies in a low-risk environment for all or part of the flight profile. The Centaur also offers benefits that may not have been realized in early UAS, such as mitigation of legal restrictions and repositioning of the aircraft.

References:
Centaur. Manassas: Aurora Flight Sciences, Inc. www.aurora.aero. (n.d.) Web. Accessed 18 Jan 2015
Hewitt-Sperry Automatic Airplane. (n.d.) en.wikipedia.org. Web. Accessed 18 January 2015