Friday, March 13, 2015

A few thoughts on UAS crewmember selection


Different philosophies in crew member selection for UAS arise from the wide array of crew positions, UAS sizes and capabilities, and operating environments (inside the NAS or not).  UAS crews are composed of external (takeoff and landing) and internal (mission operations) pilots, sensor operators, and assorted other positions that require only marginal knowledge of air traffic control rules and procedures.  Most UAS operations occur in visual meteorological conditions and not under positive control of Air Traffic Control, and many do not occur in airspace that conflicts with manned aircraft operations (Hottman & Sortland, 2006).  UAS operations are expanding at a rate that cannot be matched by traditional flight training, so there is a need to expand crew member selection criteria beyond traditional aviation selection where it is prudent and safe to do so.  However, UAS operations are more complex than many other civil and military occupations and require different cognitive skills and mental capabilities.  In keeping with the FAA regulations currently under review, civilian operators of mini UAS would still need basic aviation-specific training on rules and UAS safety (training and qualification is the de facto selection criterion for civilian aviation operators).

I agree with qualifications tailored to the UAS platform, but not for the reasons listed by Cooke et al., who assert that qualifications should be tailored to the automation capabilities of the UAS platform.  I contend that the level of qualification required for a UAS pilot should be in concert with the level of interaction the UAS has with manned aircraft in the NAS, in addition to the roles the individual performs in the UAS mission.  Under this approach, internal pilots who have primary responsibility for monitoring a UAS while it conducts an automated mission would still be required to have full awareness of operations in the NAS and would need to be selected largely in concert with manned aircraft pilot selection (with a few notable exceptions, such as medical conditions that limit a pilot's ability to operate at altitude).  External pilots of mini and micro UAS that never operate in concert with manned aircraft should not be required to have the level of training required of pilots of larger UAS that do integrate into the NAS.  These larger UAS, operated primarily by the military, should continue to require more stringent psychological, cognitive, and personality testing due to the high-risk nature of the missions and flight operations involved.
Hottman, S. B., & Sortland, K. (2006). UAV operators, other airspace users, and regulators: Critical components of an uninhabited system. In N. J. Cooke, H. L. Pringle, H. K. Pedersen, & O. Connor (Eds.), Human Factors of Remotely Operated Vehicles.

The Touch Screen Based Operator Control Interface: an Ecological Perspective


Unmanned Aerial Systems (UAS) are developing at a very quick pace, and one rapidly developing area is micro UAS.  These small vehicles can be deployed from a backpack and controlled by a laptop or an iPad on the battlefield.  The display design for these vehicles must make efficient and safe operation of the vehicle intuitive, allowing operators ease of use in achieving the mission.  Cooke, Pringle, and Pedersen (2006) evaluate one such touch screen display.  The problems they found in testing operators working with these displays can be improved upon by looking at basic models of human information processing and ecological approaches to display design for UAS swarm operations.
The touch screen display they analyzed operates in two modes.  In map mode, a terrain map with the mission path, UAS location, and operator location is displayed, and an inset window shows imagery collected from the mission sensor.  In full screen video mode, the sensor image dominates the screen, the map display disappears, and command buttons around the outside of the screen provide operational control.  Their tests revealed crashes due to a lack of awareness of altitude and a lack of salient feedback for inputs, among other challenges (Cooke et al., 2006).
The problem of altitude awareness can be seen in the pictorial representation of both displays, where the altitude is presented in small text at the top of the screen, inside a black frame outside the primary field of attention.  In his paper on multiple resource theory, Christopher Wickens (2008) proposes a multiple resource model to explain how operators control attention.  In the case of this display, nearly all of the information is visual and focal, utilizing the same modality of perception.  Although the map and sensor information is more spatial and the altitude display is more verbal in nature, utilizing two different “codes of processing” as described by Wickens, there is a bottleneck of processing at the visual level.  In addition, the altitude readout is focal in nature, so if the operator has focused his or her attention on the pictorial representation of either the map mode or the full screen video mode, the altitude information is not triggered by the unfocused awareness of peripheral vision.  One solution for the presentation of altitude can be found in Fuchs, Borst, de Croon, van Paassen, and Mulder’s (2014) proposal for an ecological display design for swarm UAV operations.  Their ecological design provides an inset window that displays the airspeed and current waypoint of each of four UAS.  Because it is an inset on the main display, the information is brought into the awareness of the operator in a way that it is not in the touch screen display of Cooke et al.; as an inset, the altitude information is closer to the operator’s primary field of view.  Another method to increase operator attention on altitude is to add an auditory cue.  By Wickens’ (2008) model of multiple resources, employing an auditory cue would take advantage of a separate perceptual modality and better equip an operator with limited perceptual resources to attend to the altitude stimulus.  In takeoff and landing modes, the rate of auditory altitude information could be increased to support the increased situational awareness required in these stages of flight.  Finally, programming minimum safe altitudes for the flight profile and incorporating a visual and/or auditory alert would draw the operator’s awareness to the hazard.
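A minimal sketch of that last idea, assuming a hypothetical altitude-monitoring loop in the ground station software (the phase names, thresholds, and alert channels are illustrative assumptions, not features of the display Cooke et al. studied):

```python
# Sketch of a minimum-safe-altitude alert for a micro UAS ground station.
# Phase names, altitude thresholds, and alert channels are assumed for illustration.

PHASE_MIN_ALT_FT = {"takeoff": 20, "cruise": 150, "landing": 10}

def altitude_alerts(phase: str, altitude_ft: float) -> list:
    """Return the alert cues to trigger for the current altitude."""
    alerts = []
    min_alt = PHASE_MIN_ALT_FT.get(phase, 100)
    if altitude_ft < min_alt:
        # Redundant visual + auditory cues use separate perceptual
        # modalities, in the spirit of Wickens' multiple resource model.
        alerts.append(f"VISUAL: flash altitude readout ({altitude_ft:.0f} ft)")
        alerts.append("AUDIO: repeat 'ALTITUDE' tone")
    return alerts

print(altitude_alerts("cruise", 90.0))   # both cues fire below the cruise floor
```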
Lack of salient feedback for controller inputs has a direct parallel in manned aircraft operation.  Even manned aircraft do not always respond immediately to an input, as a military jet pilot who transitions to a heavy aircraft would likely report.  The difference is that the manned aircraft pilot manipulates a physical control surface, leaving little doubt, short of a failure of the flight control system, that the input has been received by the machine.  The touch interaction system studied by Cooke et al. is an interaction device with low cognitive, perceptual, and motor load and a low contribution to overall fatigue (Wickens, Lee, Liu & Becker, 2004).  However, a good design must include feedback of the control state in order to indicate the system response to the operator (Wickens et al., 2004).  In the case of the touch screen, an ideal feedback response would be a vibration such as one would experience with a modern touchscreen smartphone.  Once again, this feedback would utilize a separate perceptual modality and take full advantage of the human sensory system, informing the operator the moment the system has registered a control input.
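A rough sketch of that feedback idea, assuming a hypothetical touch handler in the control station software (the Haptics class and command strings below are stand-ins, not part of the system Cooke et al. studied):

```python
# Sketch: acknowledge a touch command the moment it is registered,
# independent of how quickly the vehicle itself responds.
# The Haptics class and command strings are illustrative assumptions.

class Haptics:
    def pulse(self, ms: int) -> None:
        print(f"[haptic] {ms} ms vibration")   # stand-in for a real vibration driver

def send_to_vehicle(command: str) -> None:
    print(f"[uplink] sending '{command}'")     # vehicle response may lag

def on_touch_command(command: str, haptics: Haptics) -> None:
    haptics.pulse(30)                          # immediate tactile acknowledgement
    print(f"[ui] '{command}' accepted")        # redundant visual acknowledgement
    send_to_vehicle(command)

on_touch_command("LOITER HERE", Haptics())
```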
The touchscreen display studied by Cooke et al. (2006) has shortcomings that they discovered even as they conducted their studies.  A more human-centered display design can reduce training time and enhance operator capabilities in high workload tasks.  Making the altitude and feedback cues more salient and utilizing different modes of perception optimize the operator’s ability to integrate all of the information critical to operating a UAS safely and conducting the mission effectively.



References
Cooke, N. J., Pringle, H., & Pedersen, H. (Eds.). (2006). Human Factors of Remotely Operated Vehicles, Volume 7. Amsterdam, NLD: JAI Press. Retrieved from http://www.ebrary.com
Fuchs, C., Borst, C., de Croon, G. C. H. E., van Paassen, M. M. (René), & Mulder, M. (2014). An Ecological Approach to the Supervisory Control of UAV Swarms. International Journal of Micro Air Vehicles, 6(4), 211–230. doi:10.1260/1756-8293.6.4.211

Wickens, C. D., Lee, J. D., Liu, Y., & Gordon Becker, S. E. (2004). An Introduction to Human Factors Engineering. Upper Saddle River, NJ: Pearson Education.

Wickens, C. D. (2008). Multiple resources and mental workload. Human Factors, 50(3), 449–55. Retrieved from http://www.ncbi.nlm.nih.gov/pubmed/18689052

 



An exercise in UAS ORM

This is a Preliminary and Operational Hazard Assessment and Review for a notional police department Honeywell T-Hawk MAV program.  The Honeywell T-Hawk MAV is a micro UAV with autonomous flight, dynamic intervention and retasking, and hover-and-stare capability (Honeywell, 2015).  The risk assessment matrix used is the FAA System Safety Process risk assessment matrix (FAA, 2015).  Risks are evaluated based on the likelihood of occurrence and the severity of the consequences; the likelihood and severity are applied to the risk assessment matrix, and a resultant overall risk code is determined.
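A small sketch of how such a matrix lookup can be encoded; the likelihood and severity categories and the cell assignments below are illustrative assumptions in the style of the FAA matrix, not the FAA's published table:

```python
# Illustrative risk assessment matrix lookup. Categories and cell values
# are assumptions loosely styled on the FAA System Safety Process matrix,
# not the FAA's published table.

RISK_MATRIX = {
    # (likelihood, severity): overall risk code
    ("frequent", "catastrophic"): "High",
    ("probable", "catastrophic"): "High",
    ("probable", "hazardous"): "Serious",
    ("remote", "catastrophic"): "Serious",
    ("remote", "major"): "Medium",
    ("extremely remote", "major"): "Medium",
    ("extremely improbable", "minor"): "Low",
}

def risk_code(likelihood: str, severity: str) -> str:
    """Return the overall risk code for a likelihood/severity pair."""
    return RISK_MATRIX.get((likelihood, severity), "Medium")

# Example: a hazard before and after a mitigation lowers its likelihood.
print(risk_code("probable", "catastrophic"))    # High
print(risk_code("extremely remote", "major"))   # Medium
```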
Two of the risks identified during development of the Preliminary Hazard List, as suggested by Cooke et al. (2006), were obstacles in the takeoff and landing profiles and mistakes in initial programming or mid-mission reprogramming of the UAS (Figure 2).  The initial likelihood and severity of these two risks resulted in overall risk assessment codes of High and Serious, respectively.  Adjusting the UAS program by creating several distinct, pre-programmed takeoff and recovery sequences, along with a periodic review (at least annually, and more frequently as required by changes in the urban landscape), reduced the risk assessment codes to Medium and Low, respectively.



An operational risk assessment and review yielded three risks in mission operations: obstacles in the search area, spatial disorientation of the operator during dynamic reconnaissance in suspect pursuit, and crashing due to staying on scene for a mission that runs longer than anticipated.  All three of these risks were deemed to be high.  For obstacle planning, two qualified planners are required to sign off on the contingency plans developed for each mission.  For spatial disorientation during dynamic missions, initial and annual spatial disorientation training will be conducted, and recovery procedures will be enacted to address spatial disorientation when it occurs.  Finally, in anticipation of missions that run longer than planned, designated “land out” areas would be identified, and beat cops would be trained in proper recovery of the UAS to avoid damaging sensitive equipment.  These land out areas would allow the UAS to extend its on-scene time without making the long trek back to base, and reduce the possibility that an operator might choose to stay on scene and sacrifice the UAS so the bad guy doesn’t get away.
The operational risk management tool is used before a mission to assess whether the mission is low, medium, or high risk.  The gain of the mission is subjectively assessed; only high-gain missions should launch when the overall risk is high.  This hazard analysis demonstrates how the process can improve operations and reduce the overall risk of accidents.  In addition, the operational risk management tool can be used by operators on a daily basis to assess the risk involved in a mission and make decisions based on the overall risk and gain of the proposed mission.
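A minimal sketch of that pre-mission launch decision, assuming the subjective gain is rated simply as "high" or "low" (the rule encoded is the one described above; the category labels are assumptions):

```python
# Illustrative pre-mission ORM check: only high-gain missions launch when
# the overall risk code is high. Risk and gain labels are assumptions.

def launch_decision(risk_code: str, mission_gain: str) -> str:
    """Return 'LAUNCH' or 'HOLD' based on overall risk and subjective gain."""
    if risk_code in ("Low", "Medium"):
        return "LAUNCH"
    if mission_gain == "high":          # High/Serious risk requires high gain
        return "LAUNCH"
    return "HOLD"

print(launch_decision("Medium", "low"))   # LAUNCH
print(launch_decision("High", "low"))     # HOLD
print(launch_decision("High", "high"))    # LAUNCH
```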



References

Cooke, N. J., Pringle, H., & Pedersen, H. (Eds.). (2006). Human Factors of Remotely Operated Vehicles, Volume 7. Amsterdam, NLD: JAI Press. Retrieved from http://www.ebrary.com

Federal Aviation Administration. (2015). System Safety Process. Retrieved from http://www.faasafety.gov/gslac/alc/libview_normal.aspx?id=6877

Honeywell. (2015). T-Hawk MAV product page. Retrieved from https://aerospace.honeywell.com/thawk





Tuesday, March 10, 2015

This discussion only touches the surface of the link between fatigue and stress in shift work such as UAS operations, and is a starting point for sharing fatigue and stress policy between manned and unmanned aircraft organizations.

Fatigue and stress are related in many ways; here I will discuss just a few of them.  The long-term effects of fatigue can be seen in the comorbidities reported by Culpepper (2010): metabolic, gastrointestinal, cardiovascular, mood, and anxiety issues have all been linked to problems caused by long-term shift work, and they are also symptoms of long-term stress.  In addition, chronic stress can disturb sleep patterns already compromised by working against natural circadian rhythms.  Research has shown that a complete lack of sleep is fatal (Pinel, 2011).
The hormone cortisol (one of the glucocorticoids released by the adrenal cortex in a natural stress response) also serves to arouse an individual from sleep in a normal circadian rhythm (Drake, 2010; Pinel, 2011).  This dual function of cortisol is a biological connection between fatigue and stress.  Circadian desynchronization (a waking cortisol signal sent during the shift worker’s scheduled sleeping time) creates a biological stress on an organism and can underlie the stress caused by the challenges and pressures of work performance (Drake, 2010).  Stress and fatigue are so closely related that Pinel (2011) laments the lack of sleep studies that control for stress, and vice versa.  Even more problematic for UAS operators, research has shown that executive function, such as might be required to control a UAS, is the cognitive ability most vulnerable to long-term chronic stress (Arnsten, Mazure & Sinha, 2012), such as might be caused by shift work disorder.
UAS and manned aircraft share sources of stress caused by both vigilance tasks and acute events (Wickens, Lee, Liu & Becker, 2004), such as emergencies and offensive missions.  UAS operations involve long-term (over months or years) shift work (manned aviation shift work is also a concern, but it varies by organization and mission and is limited to some degree by vehicle maintenance cycles and refueling requirements); manned aviation involves the physical stresses of being in the aircraft at high altitude, exposed to dynamic forces such as vibration and aerodynamic loads.  In both cases, careful management of the number of qualified personnel (allowing coverage of mission schedules without compromising crew rest requirements) and of shift schedules is paramount to addressing this dual threat to aviation safety.  This highlights one area where UAS communities would do well to learn from manned aviation: strict adherence to established crew rest requirements by both the organization and the operator can preserve protection against long-term fatigue.  The difference in motivation on the part of the operator is that the UAS operator does not encounter physical danger when fatigued, so the motivation to alter daily activities to observe crew rest is less salient.  Study in motivation and organizational psychology can suggest policies to encourage compliance with crew rest guidelines and regulations among UAS operators.
Arnsten, A., Mazure, C. M., & Sinha, R. (2012). Everyday stress can shut down the brain's chief command center. Scientific American, 306(4), 48-53.
Culpepper, L. (2010). The social and economic burden of shift work disorder. Supplement to The Journal of Family Practice, 59(1), S3-S11.
Drake, C. (2010). The characterization and pathology of circadian rhythm sleep disorders. Supplement to The Journal of Family Practice, 59(1), S12-S17.
Pinel, J. P. (2011). Biopsychology. Boston, MA: Allyn & Bacon.
Wickens, C. D., Lee, J. D., Liu, Y., & Gordon Becker, S. E. (2004). An Introduction to Human Factors Engineering. Upper Saddle River, NJ: Pearson Education.

A short discussion piece I wrote regarding spatial disorientation (SD) in unmanned aircraft operations. SD is complicated enough when you are in the seat! Conflicting or absent sensory inputs get even more complicated when the operator has a variety of experiences, locations, and displays.


The three types of spatial disorientation (SD) encountered by operators of aerial vehicles are recognized, unrecognized, and incapacitating. Recognized SD occurs when there is a conflict between the aircraft's position or motion and the operator's sensory inputs, and the operator notices it.  Unrecognized SD occurs when the aircraft is in an unintended position or motion and the operator is unaware that a change has taken place.  Incapacitating SD occurs when the operator is unable to make control inputs to correct the situation due to overwhelming conflicting sensory information.  While incapacitating SD can be catastrophic in manned aviation, it occurs rarely, if ever, in UAS operations (Cooke et al., 2006).  The influence of SD on mishaps is summarized nicely in the Cooke text: “SD Mishaps occur when inaccurate operator perceptions result in inappropriate control inputs” (Cooke et al., 2006).
In manned aircraft, SD usually arises from a mismatch between the vestibular and proprioceptive cues and the visual cues received by the operator.  The human visual system accounts for a much larger share of sensory and cognitive processing, so visual cues tend to dominate the perception of self-motion in space, and operators can find it difficult or even impossible (in the case of incapacitating SD) to override visual sensations in order to right the aircraft.  In unmanned aircraft, vestibular and proprioceptive cues are not available, so the operator relies on visual cues to judge vehicle motion and position.  This leaves the operator vulnerable to visual illusions and without the benefit of other modes of sensory input to indicate what the vehicle is doing.  UAS operators who work in a mobile ground control station, such as an automobile, another aircraft, or a ship, are subject to unrelated vestibular and proprioceptive cues that can exacerbate SD problems.  Operators with manned aircraft flying experience could be more susceptible to SD because the vestibular and proprioceptive cues they are used to receiving to complement visual references are absent.  Operators without manned aircraft experience may not understand how to compensate for visual illusions, as in the mishap cited by Cooke et al. (2006, p. 139).
There are many variations in UAS configuration and operation that make UAS operators vulnerable to SD, including the type of display (egocentric, exocentric, orientation of cardinal direction) and the type of control inputs (fully manual, fully automatic, or some combination of the two).  A common flight regime for SD is night takeoffs and landings.  One idea to compensate for SD in this stage of flight would be to increase the salience (in size, location, or color on the display) of instrument indications during night takeoffs and landings, as the lack of visual cues at night increases the number of ways in which an operator can misjudge vehicle position.  Another idea would be to increase automation during night takeoffs and landings, again to compensate for the lack of visual cues and the operator's proneness to error.
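A rough sketch of the first idea, assuming a display layer that can scale instrument emphasis by flight phase and lighting (the phase names, night flag, and emphasis multipliers are illustrative assumptions, not part of any fielded UAS display):

```python
# Sketch: boost the salience of attitude and altitude indications during
# night takeoffs and landings. Phase names, the night flag, and the
# emphasis multipliers are assumptions for illustration.

def instrument_emphasis(phase: str, night: bool) -> dict:
    """Return relative size/brightness multipliers for key instruments."""
    emphasis = {"attitude": 1.0, "altitude": 1.0, "groundspeed": 1.0}
    if phase in ("takeoff", "landing"):
        emphasis["attitude"] = 1.5
        emphasis["altitude"] = 1.5
        if night:
            # Few outside visual cues at night: emphasize instruments further.
            emphasis["attitude"] = 2.0
            emphasis["altitude"] = 2.0
    return emphasis

print(instrument_emphasis("landing", night=True))
```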
Spatial disorientation is a real concern in UAS operations, and reduction of the influence of SD in this growing industry promises to keep human factors experts employed for years to come.
Cooke, N. J., Pringle, H., & Pedersen, H. (Eds.). (2006). Human Factors of Remotely Operated Vehicles, Volume 7. Amsterdam, NLD: JAI Press. Retrieved from http://www.ebrary.com

Sunday, January 18, 2015

Novel Applications for UAS Fail to Innovate Human Error

In December of 2014, a popular restaurant chain launched what seemed to be a fun and harmless use for mini UAS: a hovering bunch of mistletoe encouraging people to share a friendly kiss over the holidays.  Never mind that the people selected might not be interested in that kind of involvement and could create a mishap through their disagreement; a guest was injured during the execution of a landing.


HFACS, the Human Factors Analysis and Classification System, is useful for understanding this accident and some of the ways human error affects UAS flight.

A notional HFACS analysis of this mishap:
Unsafe acts: 1. Skill-based error: landing the vehicle in a location not intended for landing, failing to perceive the motion of the landing zone, and allowing the aircraft to cause injury. 2. Decision error: due to the remote location of the operator, there is less of a sense of the hazard presented by the vehicle.  The decision to land the aircraft too close to an individual could have been influenced by this false perception of safety caused by distance from the vehicle.  The act could be seen as an exceptional violation, but only if a policy against flying the UAS too close to people had been in place.
Preconditions for unsafe acts: 1. The article gives little information regarding most of the preconditions.  Alcohol was in the environment, and there were likely social tensions exerting pressure on the operator and creating an adverse mental state.  Crew resource management may have also played a role, in that the guest serving as the landing zone had not been briefed on the hazards of moving her arm with a UAS on it.
Unsafe supervision: The news article seems to indicate that the operator of the UAS was an independent contractor; a lack of communication between the contractor and the parent company on the regulations and limitations of the promotion could have allowed gaps in policy to arise.  Inadequate supervision might have played a role.
Organizational influences: Finally, the organizational climate was such that the operation of the UAS took place during a “promotional period.”  This analysis is purely theoretical: a promotional event might not have received the type of hazard analysis required when operating an aerial vehicle.  The contracting company takes on the risk of operating such a vehicle even though the specifics are handled by the expertise of the contractor.  Organizational influence might have played out in that the contracting company might not have given much thought to preventing aviation mishaps at the dinner table.

Stampler, L. (2014, December 8).  TGI Fridays Mistletoe Drone Literally Cut Off Part of Someone's Nose. TIME. Retrieved from 
Leduc, P.A., Rash, C.E., and Manning, S.E. (2006, January 15). Human Factors in UAV Accidents. Retrieved from
Human Factors Analysis and Classification System. Retrieved 17 January, 2014 from Wikipedia: