for launch and recovery. These three categories are dominated by conventional helicopter configurations with a single main rotor and a tail rotor. On the other hand, most mini and micro UAS of categories IV and V are multirotor platforms (e.g., quadrotors and coaxial vehicles) that can fly outdoors as well as indoors and can be launched by hand or from small and narrow spaces.

1.2. Motivation for and Scope of This Survey

Nonmilitary research in RUAS began only in the early 1990s, but RUAS have since become a popular research area. Over the past 20 years, an enormous amount of research has gone into guidance, navigation, and control (GNC) for RUAS, resulting in varied techniques and a large number of published papers. Some survey papers have reviewed small subsets of methods in a particular area: Ollero and Merino (2004) for flight controllers, Chao, Cao, and Chen (2010) for autopilots, Goerzen, Kong, and Mettler (2010) for path planning algorithms, and Valavanis (2007) for UAS in general. Nevertheless, there is a real need for a comprehensive survey that reports and organizes the large variety of GNC methods, providing a context for viewing and comparing autonomy technologies developed for RUAS. This work was mainly motivated by the fact that we have not found a single survey specializing in GNC systems for UAS in general and RUAS in particular, despite the need for such a work after two decades of active research in these areas.

This paper provides an overview of GNC systems developed to date to increase the autonomous capabilities of unmanned rotorcraft systems. The approaches that have been reported are organized into three main categories: control, navigation, and guidance. For each category, methods are grouped at the highest level based on the autonomy level they provide, and then according to the algorithmic approach used, which in most cases is closely associated with the type of sensors used. The central objective of this survey paper is to serve the UAS research community by providing an overview of the state of the art, major milestones, and unsolved problems in the areas of GNC for RUAS. This will help researchers to reduce reinvention and enable them to better identify the key critical gaps that prevent advances in the field.

The rest of the paper is organized as follows: The main research groups involved in research and development of autonomous RUAS are presented in Section 2. Section 3 introduces autonomy aspects onboard RUAS, including the autonomy definition from the UAS perspective, autonomy levels and metrics, and the main components of a typical autonomous system. Sections 4, 5, and 6 provide a comprehensive survey of major works focusing on flight control, autonomous navigation, and guidance, respectively. Finally, discussions and conclusions about published papers and developed systems are given in Section 7.

2. RESEARCH GROUPS INVOLVED IN RUAS RESEARCH AND DEVELOPMENT

A number of research groups are working on the development of autonomy technologies for RUAS. Figure 2 lists some 27 research groups that are involved in research and development of autonomous RUAS. The list is not exhaustive and excludes military and industrial research groups.

3. AUTONOMY LEVELS FOR UNMANNED ROTORCRAFT SYSTEMS

Before reviewing recent advances in research and development of autonomous RUAS, it is important to develop a framework that provides standard definitions and metrics characterizing and measuring the autonomy level of a RUAS. In this section, the autonomy levels for unmanned rotorcraft systems (ALFURS) framework is proposed, which is based on the generic NIST ALFUS framework (Huang, Messina, & Albus, 2007) with some modifications and extensions to make it specific to RUAS and research-oriented. Another main objective of this section is to identify the key components that constitute the autonomy architecture onboard a RUAS. This section is intended to facilitate the understanding of this survey paper, but it can also serve as a common reference for the UAS community.

3.1. Terminology and Key Definitions

In this section, definitions of the terms that are most relevant to RUAS autonomy are proposed. As far as possible, consistency with the NIST ALFUS (Huang, 2008) definitions is maintained. However, for certain terms, the NIST generic definitions have been modified to better suit RUAS. For some other terms, new definitions are proposed based on information from various sources (ICAO, FAA, UAS Roadmap 2010-2035, NIST, etc.).

Definition 1. Rotorcraft: A heavier-than-air aircraft that is supported in flight by the dynamic reaction of the air against its power-driven rotors on a substantially vertical axis.

Definition 2. Rotorcraft Unmanned Aerial Vehicle (RUAV): A powered rotorcraft that does not require an onboard crew, can operate with some degree of autonomy, and can be expendable or reusable. Most RUAVs include integrated equipment such as avionics, data links, payload, and various algorithms needed for flight.

Footnote: The book (Valavanis, 2007), published in 2007, also provided some overview of recent advances in UAS, but it is more a concatenation of contributed chapters from different groups than a survey.
Footnote: NIST: National Institute of Standards and Technology.
Footnote: The Autonomy Levels For Unmanned Systems (ALFUS) Ad Hoc Workgroup is a NIST-sponsored effort that aims at formulating a logical framework for characterizing the autonomy of unmanned systems in general, covering issues of definitions, metrics, levels of autonomy, etc.
Footnote: See the ICAO and FAA definitions of aircraft.
Figure 2 lists, for each group: the name of the group and institution, rotorcraft platforms, current research areas and projects, and milestones and major achievements.

- Field Robotics Center, Carnegie Mellon University (CMU-FRC-US); www.frc.ri.cmu.edu/projects/. Platforms: Class I: Eurocopter EC135 and the Boeing Unmanned Little Bird (ULB) helicopter; Class II: Yamaha RMAX and R-50; Class IV: Otto quadrotor. Research: obstacle detection and avoidance using a 3D LIDAR; landing zone detection and safe landing using a sweeping 2D LIDAR; visual localization and perception. Milestones: successful obstacle detection and avoidance with the RMAX helicopter; automatic landing with a full-scale helicopter.

- UAV Research Facility, Georgia Institute of Technology (UAVRF-GTech-US); http://controls.ae.gatech.edu/wiki/uavrf. Platforms: Class II: Yamaha RMAX and Sung Woo Eng. Remo-H helicopter; Class IV: GTSpy (ducted fan), GTQ (quadrotor), and GTLama (coaxial). Research: rotorcraft control using adaptive techniques and neural networks; vision-based navigation; vision-based target tracking; vision-based formation flight. Milestones: adaptive autopilot for 3D trajectory tracking and flight control.

- The NASA Army Autonomous Rotorcraft Project (NASA-ARP-US); http://ti.arc.nasa.gov/projects/apex/. Platform: Class II: Yamaha RMAX helicopter. Research: 3D navigation in urban environments using vision and LIDAR; safe landing area detection (SLAD); vision-based state estimation; mission and path planning. Milestones: mapping and path planning using a spinning LIDAR; successful SLAD tests using stereo and LIDAR; mission planning for surveillance (PALACE project).

- BEAR Group, Berkeley University (BEAR-US); robotics.eecs.berkeley.edu/bear. Platforms: Class II: Yamaha RMAX and R-50 helicopters; Class III: electric Maxi-Joker helicopters. Research: no current research activities on UAS (to our knowledge). Milestones: flight control using MPC; formation flight; vision-based landing; LIDAR-based obstacle avoidance.

- UASTech Lab, Linköping University (UASTech-SE); www.ida.liu.se/divisions/aiics/aiicssite/uastech. Platforms: Class II: Yamaha RMAX helicopter; Class IV: quadrotors. Research: vision-based landing and localization; mission and path planning. Milestones: vision-based landing and localization; mission planning and execution monitoring.

- The French Aerospace Lab (ONERA), ONERA-FR; http://action.onera.fr/accueil. Platforms: Class II: Yamaha RMAX helicopters. Research: vision-based navigation (target tracking and safe landing); mission planning and decisional autonomy (ReSSAC project); mission management; UAS cooperation. Milestones: ground target tracking using vision; safe landing area detection using stereo vision.

- USL, University of South Florida (USL-USF-US); www.cse.usf.edu/usl. Platforms: Class II: Yamaha RMAX; Class III: Bergen, Raptor 90, Maxi Joker 2. Research: flight control; vision-based navigation; fault detection and isolation. Milestones: automatic flight; traffic data collection and analysis.

- Kenzo Nonami Lab, Chiba University (CHIBA-U-JP); mec2.tm.chiba-u.jp/~nonami and mec2.tm.chiba-u.jp/uav/main/. Platforms: Class II: Sky Surveyor, QTW; Class III: Hirobo SST-Eagle; Class IV: AscTec quadrotors; Class V: Epson micro Flying Robot (uFR). Research: autopilot design; formation flight control; vision-based flight; vision-based state estimation. Milestones: automatic flight of different rotorcraft platforms; formation flight of two helicopters.

- UNSW@ADFA, Australia; http://seit.unsw.adfa.edu.au. Platforms: Class II: Yamaha RMAX helicopter; Class III: Hirobo Eagle helicopter. Research: flight control; optic flow-based navigation; automatic landing on a ship. Milestones: optic flow-based terrain following using the Yamaha RMAX helicopter.

- Shenyang Institute of Automation (SIA), SIA-CN; http://av.sia.cn/en/index.php. Platforms: Class II: ServoHeli-120 (120 kg), ServoHeli-40 (40 kg). Research: avionics development/integration; flight control; path planning and UAS cooperation. Milestones: waypoint navigation with automatic flight; vision-based ground target tracking.

- German Aerospace Center (DLR); http://www.dlr.de/ft/en/desktopdefault.aspx/tabid-1377/. Platform: Class III: ARTIS helicopter (25 kg). Research: stereo vision-based mapping and obstacle avoidance; mission management. Milestones: 3D mapping and path planning using stereo vision; vision-based flight through obstacle gates.

- ARCAA (CSIRO-QUT), ARCAA-AU; http://www.arcaa.aero and http://research.ict.csiro.au/research/labs/autonomous-systems/field-robotics. Platforms: Class III: Vario helicopters; Class IV: quadrotors and octocopters. Research: robust control of rotorcraft; dependable autonomous UAS; beyond visual range (BVR) infrastructure inspection; obstacle detection and path planning using vision and LIDAR. Milestones: BVR flights; mapping and obstacle avoidance using LIDAR and/or stereo vision.

- Aerospace Systems and Control Lab, KAIST (ASCL-KR); http://ascl.kaist.ac.kr. Platforms: Class III: Voyager GSR 260 helicopter; Class IV: electric T-REX 600 helicopter. Research: UAS flight control; vision-based navigation; obstacle detection and path planning. Milestones: trajectory tracking and waypoint navigation; vision-based flight.

- National University of Singapore (NUS-SG); http://vlab.ee.nus.edu.sg/. Platforms: Class III: AF25B from Copterworks Inc., Raptor 90; Class IV: T-REX 450 helicopter, coaxial rotorcraft. Research: nonlinear control of RUAS; 3D indoor navigation; vision-based navigation. Milestones: automatic flight control; formation flight of a RUAS with a virtual leader; vision-based indoor flight.

Figure 2. Research groups working on RUAS.
- Computer Vision Group, Universidad Politécnica de Madrid (CVG-ES); http://www.vision4uav.com. Platforms: Class III: Bergen Industrial Twin and Rotomotion SR20 helicopters; Class IV: octocopter from Mikrokopter, quadrotor. Research: vision-based pose estimation; object tracking; 3D mapping using vision; vision-based flight. Milestones: vision-based state estimation and ground target tracking; visual 3D SLAM; visual servoing.

- Robotics, Vision and Control Group, University of Seville (GRVC-ES). Platforms: Class III: Heliv and HERO helicopters. Research: control of UAS; vision-based navigation and forest fire detection; cooperative perception and control of multiple UAS. Milestones: cooperative detection of forest fires using two RUAS; visual odometry and SLAM; cooperative fault detection.

- The Center for Advanced Aerospace Technologies (CATEC), CATEC-ES; http://www.catec.c. Platforms: fleet of heterogeneous UAS (about 6 fixed-wing UAS, 6 unmanned helicopters, and 10 quadrotors). Research: automatic flight control; navigation and sense and avoid; multi-vehicle coordination; avionics and onboard systems; automation and robotics. Milestones: European COMETS and AWARE projects.

- Laboratory for Autonomous Flying Robots, TU Berlin (LAFR-DE); http://pdv.cs.tu-berlin.de/lafr. Platforms: Class II: Aero-Tec CB-5000 helicopter (12-16 kg); Class IV: Mikado Logo 14, self-made quadrotor (5 kg). Research: control of VTOL UAS; collision detection and avoidance; distributed control of multiple RUAS. Milestones: load transportation using three helicopters; sensor node deployment by three autonomous helicopters.

- Autonomous Vehicles Group, Aalborg University; http://www.es.aau.dk/projects/. Platforms: Class III: Bergen Industrial Twin, 700 Nitro, Thunder Tiger E550; Class IV: IMH-120 Corona, T-Rex 450, X-3D-BL quadrotor. Research: flight control with slung load; urban UAS navigation; task allocation and coordinated flight of UAS; robust control; collision-free path generation. Milestones: control of a helicopter slung load system.

- Autonomous Helicopter Project and STARMAC project, Stanford University; http://hybrid.stanford.edu/starmac. Platforms: Class III: 90-size XCell Tempest, 90-size Synergy N9; Class IV: STARMAC platform. Research: acrobatic and aggressive flight control; learning-based control; multi-agent control. Milestones: autorotation-based landing; learning-based aerobatic flight; backflip control for a quadrotor; collision avoidance flight.

- ACFR, University of Sydney (ACFR-AU); http://www.acfr.usyd.edu.au/. Platform: Class III: UAV Vision G18 helicopter. Research: weed monitoring and animal tracking; multi-UAV active SLAM. Milestones: vision-based mapping and classification (fixed-wing); aquatic weed surveillance.

- Aerospace Controls Lab, MIT; http://acl.mit.edu. Platforms: Class IV: Dragonfly quadrotors, AscTec quadrotors. Research: UAS modeling and control; multi-UAS task assignment; multi-vehicle health management. Milestones: indoor flight using VICON; persistent mission planning; health management of UAS.

- Robust Robotics Group, MIT (RRG-US); http://groups.csail.mit.edu/rrg/. Platform: Class IV: AscTec quadrotors. Research: non-GPS indoor navigation; planning under uncertainty; SLAM-based navigation; planning for target tracking. Milestones: waypoint guidance and geolocation of ground targets; autonomous flight in unknown environments.

- Autonomous Systems Lab, ETH Zurich (ASL-CH); http://www.asl.ethz.ch/research. Platforms: Class IV: home-designed quadrotors and coaxial rotorcraft; Class V: under development for the sFly project (swarm of micro aerial vehicles). Research: AIRobots project (inspection rotorcraft); design of miniature platforms. Milestones: indoor hovering.

- Heudiasyc Lab, University of Technology of Compiègne (HDS-FR); http://www.hds.utc.fr. Platforms: Class IV: quadrotors and other home-designed multirotor configurations. Research: design of multirotor configurations; modeling and control of RUAS; vision-based navigation. Milestones: vision-based hovering.

- GRASP Lab, University of Pennsylvania (GRASP-US); http://alliance.seas.upenn.edu. Platform: Class IV: AscTec quadrotors. Research: autonomous navigation and control of quadrotors using VICON; cooperative manipulation and transportation with aerial robots. Milestones: aggressive and accurate control for 3D dynamic flight; automatic flight.

- Collaborative research between France (CEA and USI) and Australia (ANU). Platform: Class IV: home-built quadrotor. Research: optic flow-based terrain following; visual servoing. Milestones: vision-based hovering; terrain following in a structured indoor environment.

Figure 2. (Continued)
Figure 3 depicts a layered block diagram. At the top, the GUIDANCE layer comprises high-level decision-making (mission planning and execution monitoring, path planning) and low-level decision-making (trajectory generation). It exchanges information with the NAVIGATION layer (situational awareness, perception, state estimation, sensing) and sends reference trajectories to the FLIGHT CONTROL layer (3D position/velocity control, attitude control, etc.), which also receives the RUAS state. These layers run on the RUAV, the lowest layer: the rotorcraft with avionics, communication equipment, mission payload, etc. Other modules and functions include data links between the RUAS and the ground control station (GCS), which provides visualization, telemetry and data logging, and a human-robot interface (HRI), as well as links to other systems such as unmanned autonomous systems (UAS, UGV, etc.) and GPS.

Figure 3. The overall structure of guidance, navigation, and control systems onboard a RUAS.
Definition 3. Rotorcraft Unmanned Aerial or Aircraft System (RUAS): A RUAS is a physical system that includes a RUAV, a communication architecture, and a ground control station, with no human element aboard any component. The RUAS acts on the physical world for the purpose of achieving an assigned mission. Contrary to the UAS definition proposed in the US DoD UAS Roadmap 2010-2035 (Roadmap, 2010), here the human element is not part of the RUAS but rather an external system that interacts with the RUAS; see Figure 3.

Definition 4. Autonomy: The condition or quality of being self-governing. When applied to RUAS, autonomy can be defined as the RUAS's own abilities of integrated sensing, perceiving, analyzing, communicating, planning, decision-making, and acting/executing, to achieve its goals as assigned by its human operator(s) through a designed human-robot interface (HRI) or by another system that the RUAS communicates with.

Definition 5. Autonomous RUAS: A RUAS is defined to be autonomous relative to a given mission (relational notion) when it accomplishes its assigned mission successfully, within a defined scope, with or without further interaction with human or other external systems. A RUAS is fully autonomous if it accomplishes its assigned mission successfully without any intervention from a human or any other external system while adapting to operational and environmental conditions.

Footnote: The plural of RUAS will also be denoted RUAS.
Footnote: See the UAS Roadmap (Roadmap, 2010) for the definition of human element.
Footnote: "Own" implies independence from humans or any other external systems.
Definition 6. Autonomy Level (AL): The term "autonomy level" is used in different contexts in the research community. In Huang (2008) and Huang, Messina, and Albus (2007), for example, AL is equivalent to human independence (HI). In this paper, AL is defined as a set of progressive indices, typically numbers and/or names, identifying a RUAS capability for performing autonomously assigned missions. A RUAS's AL can be characterized by the missions that the RUAS is capable of performing (mission complexity or MC), the environments within which the missions are performed (environment complexity or EC), and independence from any external system including any human element (external system independence or ESI). Note that this AL definition is similar to the contextual autonomous capability (CAC) definition in the NIST ALFUS framework (Huang, 2008), except for HI, which is replaced here by ESI.
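To make this characterization concrete, the sketch below shows one way a CAC-style autonomy score could be aggregated from MC, EC, and ESI sub-metrics. It is a toy illustration only: the sub-metric names, 0-10 scales, and weights are hypothetical choices, not the NIST ALFUS scoring procedure.

```python
# Toy CAC-style aggregation over the three ALFURS axes (illustrative only).
# Sub-metric names, 0-10 scales, and weights are hypothetical assumptions.

def axis_score(submetrics: dict) -> float:
    """Average a set of 0-10 sub-metric scores into a single axis score."""
    return sum(submetrics.values()) / len(submetrics)

def autonomy_score(mc: dict, ec: dict, esi: dict,
                   weights=(0.4, 0.3, 0.3)) -> float:
    """Weighted blend of mission complexity (MC), environment
    complexity (EC), and external system independence (ESI)."""
    axes = (axis_score(mc), axis_score(ec), axis_score(esi))
    return sum(w * a for w, a in zip(weights, axes))

# Example: a simple outdoor waypoint mission flown with GPS.
mc = {"task_difficulty": 2.0, "coordination": 0.0}
ec = {"obstacle_density": 1.0, "weather_dynamics": 1.0}
esi = {"gps_independence": 0.0, "operator_independence": 3.0}
print(f"CAC-style score: {autonomy_score(mc, ec, esi):.2f} / 10")
```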
As in Clough (2002) and Merz (2004), we make a distinction between automatic, autonomous, and intelligent systems. An automatic system will do exactly as programmed because it has no capabilities of reasoning, decision-making, or planning. An autonomous system has the capability to make decisions and to plan its tasks and path in order to achieve its assigned mission. An intelligent system has the capabilities of an autonomous system plus the ability to generate its own goals from inside, by motivations, without any instruction or influence from outside. In this paper, we are interested in automatic and autonomous systems; intelligent systems are out of the scope of this paper because such systems do not yet exist for UAS. Generally, we do not want an intelligent system, but an autonomous system that does the job assigned.

3.2. Autonomy Levels and Metrics

From reviewing the RUAS literature, it became evident that there is an overall need for a comprehensive framework that allows RUAS practitioners, particularly researchers, to evaluate and characterize the autonomous capabilities of RUAS. The next paragraph gives a brief overview of autonomy-related works, which are also some of the sources that were consulted for developing the ALFURS framework.

Many of the autonomy articles use Sheridan's work (Sheridan, 1992) as a reference for an initial understanding of autonomy and human-computer interaction. In his book (Sheridan, 1992), Sheridan proposed a 10-level scale of degrees of autonomy based on who makes the decision (machine or human) and on how those decisions are executed. In 2000, Parasuraman, Sheridan, and Wickens (2000) introduced a revised model of autonomy levels based on four classes of functions: information acquisition, information analysis, decision and action selection, and action implementation. Other relevant concepts and results have been developed by academia, especially in the human-machine interaction and artificial intelligence areas (Castelfranchi & Falcone, 2003; Zeigler, 1990), as well as by NASA and the military, using mainly the OODA (Observe, Orient, Decide, and Act) loop. A more fully developed framework for defining autonomy levels for unmanned systems (ALFUS) has been proposed by a NIST-sponsored ad hoc workgroup (Huang, Messina, & Albus, 2007). In the ALFUS framework, the autonomy level, later renamed contextual autonomous capability (CAC), is measured by weighting the scores of various metrics for three aspects, or axes, which are human independence (HI), mission complexity (MC), and environmental complexity (EC). In 2002, the U.S. Air Force Research Laboratory (AFRL) presented the results of a research study on how to measure the autonomy level of a UAV (Clough, 2002). The result of this study is the autonomous control levels (ACL) chart, in which 11 autonomy levels have been identified and described. The autonomy level is determined using the OODA concept, namely, perception/situational awareness (observe), analysis/coordination (orient), decision-making (decide), and capability (act). A few other papers also briefly discussed UAS autonomy (Fabiani, Fuertes, Piquereau, Mampey, & Teichteil-Königsbuch, 2007; Lacroix, Alami, Lemaire, Hattenberger, & Gancet, 2007; Merz, 2004; Mettler et al., 2003; Roadmap, 2010), but to our knowledge there have been no other papers published about metrics and autonomy levels for UAS in general and RUAS in particular.

Although the NIST ALFUS and AFRL ACL frameworks provide significant insight and progress in the field of unmanned system autonomy characterization and evaluation, they are difficult to apply directly to RUAS, especially from the research perspective. Indeed, the AFRL ACL chart is most useful and applicable to relatively large UAS operating at high altitudes in obstacle-free environments. Furthermore, the metrics used are military scenario-oriented and are based on the OODA loop, originally developed by the military to illustrate how to take advantage of an enemy. On the other hand, ALFUS is a generic framework covering all unmanned systems, and its application to RUAS is not straightforward. It is also important to note that autonomy metrics and taxonomies have evolved and expanded in theory and practice since then. In this section, we attempt to address the challenge of RUAS autonomy characterization by proposing the autonomy levels for unmanned rotorcraft systems (ALFURS) framework. Based on previous research, including the NIST ALFUS and AFRL ACL studies, and the desire to have a research-oriented autonomy framework that better suits RUAS operating at low altitudes and in cluttered environments, the ALFURS framework was developed.

Footnote: Future Combat Systems (FCS) program, Autonomous Collaborative Operations (ACO) program, etc.
ALFURS is based on the RUAS onboard functions that enable its autonomy. These autonomy-enabling functions (AEF) can be regrouped into three main categories: guidance, navigation, and control (GNC). Before elaborating this concept, let us first define GNC systems and their relevant components or AEF when related to RUAS, as well as their interaction in a typical RUAS autonomy software implementation; see Figure 3. Indeed, we found in our literature review that GNC terms are commonly used in UAS research, but they are rarely defined and are sometimes mistakenly used.

Definition 7. Automatic Flight Control System (AFCS): Automatic control can be defined as the process of manipulating the inputs to a dynamical system to obtain a desired effect on its outputs without a human in the control loop. For RUAS, the design of flight controllers consists of synthesizing algorithms or control laws that compute inputs for vehicle actuators (rotors, aileron, elevator, etc.) to produce torques and forces that act on the vehicle in controlling its 3D motion (position, orientation, and their time derivatives). The AFCS, also called autopilot, is thus the integrated software and hardware that serve the control function as defined.

Footnote: In the remainder of the paper, the term control refers to automatic control.
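As a minimal illustration of the control function in Definition 7, the following sketch implements a textbook single-axis PID loop of the kind an AFCS might cascade for attitude and position control. The gains, saturation limit, and single-axis simplification are illustrative assumptions, not a flight-ready design.

```python
# Minimal single-axis PID controller: the basic building block many
# AFCS designs stack into cascaded attitude/velocity/position loops.

class PID:
    def __init__(self, kp: float, ki: float, kd: float, limit: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.limit = limit            # actuator saturation bound
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, reference: float, measured: float, dt: float) -> float:
        error = reference - measured
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        u = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(-self.limit, min(self.limit, u))   # saturate command

# Illustrative gains: drive roll angle (rad) toward a reference at 100 Hz.
roll_pid = PID(kp=4.0, ki=0.5, kd=0.8, limit=1.0)
torque_cmd = roll_pid.update(reference=0.1, measured=0.0, dt=0.01)
```

In a real AFCS, several such loops are typically nested (e.g., a position error feeding an attitude reference, and the attitude error feeding rotor commands) and run at a fixed rate onboard.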
Definition 8. Navigation System (NS): In the broad sense, navigation is the process of monitoring and controlling the movement of a craft or vehicle from one place to another. For RUAS, navigation can be defined as the process of data acquisition, data analysis, and extraction and inference of information about the vehicle's states and its surrounding environment, with the objective of accomplishing assigned missions successfully and safely. This information can be metric, such as distances; topological, such as landmarks; or any other attributes that are useful for mission achievement. The main autonomy-enabling functions of a navigation system, from lower to higher level, are as follows.

Sensing: A sensing system involves one or a group of devices (sensors) that respond to a specific physical phenomenon or stimulus and generate signals that reflect some features of or information about an object or a physical phenomenon. Sensors such as gyroscopes, accelerometers, magnetometers, static and dynamic pressure sensors, cameras, and lidars are commonly used onboard UAS to provide raw measurements for state estimation and perception algorithms.

State Estimation: This concerns mainly the processing of raw sensor measurements to estimate variables that are related to the vehicle's state, particularly those related to its pose and motion, such as attitude, position, and velocity. These estimates can be absolute or relative. Localization is a particular case of state estimation that is limited to position estimation relative to some map or other locations.
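A minimal sketch of the state-estimation function just defined: a complementary filter that blends gyroscope integration (accurate over short horizons) with an accelerometer tilt reference (drift-free but noisy) to estimate a single attitude angle. The blend factor and sensor conventions are assumptions; fielded RUAS typically use Kalman-type estimators over the full pose.

```python
import math

# Complementary filter: fuse high-frequency gyro integration with a
# low-frequency accelerometer tilt reference to estimate pitch (rad).

def accel_pitch(ax: float, az: float) -> float:
    """Pitch angle implied by the gravity direction in body accelerations."""
    return math.atan2(-ax, az)

def complementary_update(pitch: float, gyro_rate: float,
                         ax: float, az: float, dt: float,
                         alpha: float = 0.98) -> float:
    gyro_estimate = pitch + gyro_rate * dt    # propagate with the gyro
    accel_estimate = accel_pitch(ax, az)      # correct with the accelerometer
    return alpha * gyro_estimate + (1.0 - alpha) * accel_estimate

# One 100 Hz update with made-up raw measurements.
pitch = complementary_update(pitch=0.0, gyro_rate=0.02,
                             ax=-0.5, az=9.8, dt=0.01)
```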
Perception: RUAS perception is the ability to use inputs from sensors to build an internal model of the environment within which the vehicle is operating, and to assign entities, events, and situations perceived in the environment to classes. The classification (or recognition) process involves comparing what is observed with the RUAS's a priori knowledge (Huang, 2008). Perception can be further divided into various functions on different levels, such as mapping, obstacle and target detection, and object recognition.

Footnote: This definition is also similar to the one used in the ALFUS framework (Huang, 2008).

Situational Awareness (SA): The notion of SA is commonly used in aviation systems, and numerous definitions of SA have been proposed. In this paper, we adopt Endsley's definition (Endsley, 1999) of SA as "the perception of elements in the environment within a desirable volume of time and space, the comprehension of their meaning, and the projection of their status in the near future." SA therefore is higher than perception because it requires the comprehension of the situation and then the extrapolation or projection of this information forward in time to determine how it will affect future states of the operational environment.

Definition 9. Guidance System (GS): A guidance system can be defined as the "driver" of a RUAS that exercises planning and decision-making functions to achieve assigned missions or goals. The role of a guidance system for RUAS is to replace the cognitive processes of a human pilot and operator. It takes inputs from the navigation system and uses targeting information (mission goals) to make appropriate decisions at its high level and to generate reference trajectories and commands for the AFCS at its low level. GS decisions can also spark requests to the navigation system for new information. A guidance system comprises various autonomy-enabling functions, including trajectory generation, path planning, mission planning, and reasoning and high-level decision making.

Trajectory Generation: A trajectory generator has the role of computing different motion functions (reference position, reference heading, etc.) that are physically possible, satisfy RUAS dynamics and constraints, and can be directly used as reference trajectories for the flight controller. Reference trajectories can be preprogrammed, uploaded, or generated in real time onboard the RUAS (dynamic trajectory generation) according to the outputs of higher-level guidance modules.
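As a small example of trajectory generation, the following sketch samples a quintic (minimum-jerk-style) position reference between two waypoints, with zero velocity and acceleration at both ends so the flight controller receives a smooth, trackable reference. The profile and timing are illustrative assumptions, not a method prescribed by the survey.

```python
# Minimal trajectory generation: a quintic position reference between two
# waypoints with zero boundary velocity and acceleration, sampled for the
# flight controller.

def quintic_reference(p0: float, p1: float, T: float, t: float) -> float:
    """Smooth position reference p(t) with p(0)=p0, p(T)=p1 and zero
    velocity/acceleration at both endpoints."""
    s = min(max(t / T, 0.0), 1.0)                 # normalized time
    blend = 10 * s**3 - 15 * s**4 + 6 * s**5      # quintic blend function
    return p0 + (p1 - p0) * blend

# Sample a 5 s move from x = 0 m to x = 10 m at 50 Hz.
trajectory = [quintic_reference(0.0, 10.0, 5.0, k * 0.02) for k in range(251)]
```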
Path Planning: The process of using accumulated navigation data and a priori information to allow the RUAS to find the best and safest way to reach a goal position/configuration or to accomplish a specific task. Dynamic path planning refers to onboard, real-time path planning.
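For a concrete instance of the path planning function just defined, here is a minimal A* search on a 2D occupancy grid. The grid world, unit step costs, and Manhattan heuristic are assumptions chosen for illustration, not a planner advocated by the survey.

```python
import heapq

# Minimal A* on a 2D occupancy grid (grid[r][c] == 1 means blocked).

def astar(grid, start, goal):
    """Return a list of cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan
    open_set = [(h(start), 0, start, None)]    # (f, g, cell, parent)
    came_from = {}
    g_cost = {start: 0}
    while open_set:
        _, g, cell, parent = heapq.heappop(open_set)
        if cell in came_from:                  # already expanded
            continue
        came_from[cell] = parent
        if cell == goal:                       # reconstruct the path
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0):
                ng = g + 1
                if ng < g_cost.get(nxt, float("inf")):
                    g_cost[nxt] = ng
                    heapq.heappush(open_set, (ng + h(nxt), ng, nxt, cell))
    return None

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))
# -> [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0)]
```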
Mission Planning: The process of generating tactical goals, a route (general or specific), a commanding structure, coordination, and timing for a RUAS or a team of unmanned systems (Huang, 2008). Mission plans can be generated either in advance or in real time. They can be generated by operators or by onboard software systems, in either centralized or distributed ways. The term "dynamic mission planning" can also be used to refer to onboard, real-time mission planning.

Decision Making: The RUAS's ability to select a course of action among several alternative scenarios based on available analysis and information. The decisions reached are relevant to achieving assigned missions efficiently and safely. Decision-making processes can differ in type and complexity, ranging from low-level (e.g., fly home if the communication link is lost) to high-level decision making. Trajectory generation, path planning, and mission planning also involve some decision-making processes.

Reasoning and Cognizance: The RUAS's ability to analyze and reason using contextual associations between different entities. These are the highest-level AEF that a RUAS can perform, with varying levels of augmentation or replacement of human cognitive processes. Reasoning and cognizance occur prior to the point of decision making. Note that the transition from high-level navigation (situational awareness) to high-level guidance (reasoning and cognizance) is of course quite blurry.

For a better understanding of these GNC-related definitions, the reader is encouraged to read the NIST document (Huang, 2008) for more details about the meaning of key terms such as "mission," "goal," "operator," and "environment." Figure 3 shows a simple block diagram of key GNC functions and their interaction in a typical autonomous RUAS. Traditionally, GNC has meant the bottom blocks of Figure 3, i.e., flight control, state estimation, and trajectory generation/waypoint navigation. However, many of the research programs today are geared toward realizing all of the GNC functions in Figure 3 onboard the RUAS.

ALFURS levels are determined based on the degree of RUAS involvement and effort in performing the AEF or GNC functions. The general trend is that the RUAS autonomy level increases when the levels of its GNC functions increase. In other words, the autonomy level is higher when the GNC systems include high-level AEF and those functions are performed by the RUAS to a greater extent. Because the main focus of this paper is not on autonomy characterization, this concept is not elaborated in detail. However, it is important to note that there is a direct correspondence between GNC functions or systems and the MC, EC, and ESI metrics used in the ALFUS project. Therefore, it is possible to establish GNC metrics by mapping ALFUS metrics to the ALFURS framework. Indeed, to achieve a complex mission in a complex environment without any interaction with an external system, the RUAS needs higher levels of GNC. One of the primary motivations for using GNC as aspects or axes for characterizing the autonomy level of RUAS is to use terms and concepts that are familiar to the UAS research community. Indeed, we are interested in a framework that describes the autonomy levels in a simple but meaningful way, so that it can easily be understood and used by other researchers. The intent of this GNC-based ALFURS framework is also to help categorize the RUAS literature and research presented in Sections 4, 5, and 6. Differentiating among consecutive autonomy levels is not trivial and may even be subjective. On the other hand, autonomy levels need to be distinguished to be useful for evaluation and comparison, and to be easily usable by the research community. Therefore, an 11-level scale of autonomy, shown in Figure 4, was proposed, based on a gradual increase (autonomy as a gradual property) of GNC functions and capabilities. The main or key GNC functions that enable each autonomy level are verbally described, along with their correspondences with the MC, EC, and ESI metrics (illustrated by color gradient). The GNC-based structure of this scale is advantageous because it helps RUAS developers to easily and correctly determine the autonomy level of an existing algorithm or system, but also to identify the AEF needed to achieve a certain autonomy level during the design of a new system.

Observation 1. Although the ALFURS framework was proposed for RUAS, it can be used for UAS in general.

In addition to autonomy characterization and evaluation, two other aspects are important for comparing and evaluating RUAS or autonomy technologies: performance and dependability. Autonomy is related to what the RUAS can do (MC, EC, ESI); performance is related to how well the RUAS meets mission requirements (accuracy, time, etc.); and dependability is related to whether the RUAS accomplishes the mission without problems (success rate, failure rate, etc.). Indeed, algorithms or GNC systems that are developed to serve the same autonomy level can still be compared and evaluated based on performance and dependability metrics. This is out of the scope of this paper, but such a work would benefit the research community and UAS practitioners in general.

As a direct application of the ALFURS framework, the RUAS-related works reviewed in Sections 4, 5, and 6 are classified based on the GNC aspects and the level of autonomy they address, starting from low-level AEF such as automatic control and moving up to high-level functions such as cooperative mission planning.

Footnote: Metrics for measuring the level of GNC functions, and the process for determining the RUAS's level of autonomy using the scores of these various metrics.
Footnote: Eleven levels, to be consistent with the AFRL ACL chart and NIST ALFUS.
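Before turning to the level scale of Figure 4, the layered GNC interaction of Figure 3 can be paraphrased in code. The schematic below sketches one GNC cycle; all class and method names are hypothetical placeholders rather than any particular flight software stack.

```python
# Schematic of one guidance-navigation-control cycle, following the layers
# of Figure 3. All classes and methods are hypothetical placeholders.

class Navigation:
    def estimate_state(self, raw_sensors):          # sensing + state estimation
        ...
    def perceive(self, raw_sensors, state):         # perception / SA
        ...

class Guidance:
    def plan(self, state, world_model, mission):    # mission/path planning
        ...
    def reference(self, plan, state):               # trajectory generation
        ...

class FlightControl:
    def actuator_commands(self, reference, state):  # 3D pose control
        ...

def gnc_cycle(nav, guid, ctrl, raw_sensors, mission):
    state = nav.estimate_state(raw_sensors)         # NAVIGATION (low level)
    world = nav.perceive(raw_sensors, state)        # NAVIGATION (high level)
    plan = guid.plan(state, world, mission)         # GUIDANCE (high level)
    ref = guid.reference(plan, state)               # GUIDANCE (low level)
    return ctrl.actuator_commands(ref, state)       # FLIGHT CONTROL
```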
Figure 4 describes, for each of the eleven ALFURS levels, the key guidance, navigation, and control capabilities that enable it; the ESI, EC, and MC metrics grow with the level.

Level 10, Fully Autonomous. Guidance: human-level decision-making, accomplishment of most missions without any intervention from ES (100% ESI). Navigation: human-like navigation capabilities for most missions; fast SA that outperforms a human's; cognizant of all within the operation range. Control: same or better control performance as for a piloted aircraft in the same situation and conditions.

Level 9, Swarm Cognizance and Group Decision Making. Guidance: distributed strategic group planning, selection of strategic goals, mission execution with no supervisory assistance, negotiating with team members and ES, strategic decision-making. Navigation: long-track awareness of very complex environments and situations; inference and anticipation of other agents' intents and strategies; high-level team SA. Control: ability to choose the appropriate control architecture based on the understanding of the current situation/context and future consequences.

Level 8, Situational Awareness and Cognizance. Guidance: strategic mission planning, most supervision performed by the RUAS itself, choice of strategic goals, cognizance and higher-level strategic decision-making. Navigation: SA in extremely complex environments and situations; inference of self/others' intent; anticipation of near-future events and consequences (high-fidelity SA). Control: ability to change or switch between different control architectures based on the understanding of the current situation/context and future consequences.

Level 7, RT Collaborative Mission Planning. Guidance: collaborative mission planning and execution, evaluation and optimization of multi-vehicle mission performance, allocation of tactical tasks to each agent. Navigation: combination of the capabilities of levels 5 and 6 in highly complex, adversarial, and uncertain environments; collaborative mid-fidelity SA. Control: same as in previous levels (no additional control capabilities are required).

Level 6, Dynamic Mission Planning. Guidance: reasoning, high-level decision making, mission-driven decisions, high adaptation to mission changes, tactical task allocation, execution monitoring. Navigation: higher level of perception to recognize and classify detected objects/events and to infer some of their attributes; mid-fidelity SA. Control: same as in previous levels (no additional control capabilities are required).

Level 5, RT Cooperative Navigation and Path Planning. Guidance: collision avoidance; cooperative path planning and execution to meet common goals; swarm or group optimization. Navigation: relative navigation between RUAS, cooperative perception, data sharing, collision detection, shared low-fidelity SA. Control: distributed or centralized flight control architectures, coordinated maneuvers.

Level 4, Obstacle/Event Detection and Path Planning. Guidance: hazard avoidance, RT path planning and re-planning, event-driven decisions, robust response to mission changes. Navigation: perception capabilities for detecting obstacles, risks, targets, and environment changes; RT mapping (optional); low-fidelity SA. Control: accurate and robust 3D trajectory tracking capability is desired.

Level 3, Fault/Event Adaptive RUAS. Guidance: health diagnosis, limited adaptation, onboard conservative and low-level decisions, execution of pre-programmed tasks. Navigation: most health and status sensing by the RUAS; detection of hardware and software faults. Control: robust flight controller; reconfigurable or adaptive control to compensate for most failures and for mission and environment changes.

Level 2, External Systems Independent Navigation (e.g., non-GPS). Guidance: same as in Level 1. Navigation: all sensing and state estimation by the RUAS (no GPS); all perception and situational awareness by the human operator. Control: same as in Level 1.

Level 1, Automatic Flight Control. Guidance: pre-programmed or uploaded flight plans (waypoints, reference trajectories, etc.); all analyzing, planning, and decision-making by ES. Navigation: most sensing and state estimation by the RUAS; all perception and situational awareness by the human operator. Control: control commands are computed by the flight control system (automatic control of the RUAS 3D pose).

Level 0, Remote Control. Guidance: all guidance functions are performed by external systems (mainly a human pilot or operator). Navigation: sensing may be performed by the RUAS; all data is processed and analyzed by an external system (mainly human). Control: control commands are given by a remote ES (mainly a human pilot).

Figure 4. Illustration of ALFURS autonomy levels as a gradual increase of GNC capabilities and corresponding MC, EC, and ESI. Acronyms: ESI (external system independence), EC (environment complexity), MC (mission complexity), ES (external system), SA (situational awareness), RT (real time).
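The eleven levels above map naturally onto an ordered enumeration. A minimal transcription of the Figure 4 descriptors (names lightly normalized, purely as an illustration) could be:

```python
from enum import IntEnum

# The eleven ALFURS autonomy levels of Figure 4 as an ordered enum.

class ALFURSLevel(IntEnum):
    REMOTE_CONTROL = 0
    AUTOMATIC_FLIGHT_CONTROL = 1
    EXTERNAL_SYSTEMS_INDEPENDENT_NAVIGATION = 2   # e.g., non-GPS
    FAULT_EVENT_ADAPTIVE_RUAS = 3
    OBSTACLE_EVENT_DETECTION_AND_PATH_PLANNING = 4
    RT_COOPERATIVE_NAVIGATION_AND_PATH_PLANNING = 5
    DYNAMIC_MISSION_PLANNING = 6
    RT_COLLABORATIVE_MISSION_PLANNING = 7
    SITUATIONAL_AWARENESS_AND_COGNIZANCE = 8
    SWARM_COGNIZANCE_AND_GROUP_DECISION_MAKING = 9
    FULLY_AUTONOMOUS = 10

# Ordering comparisons then express level relations directly.
assert ALFURSLevel.DYNAMIC_MISSION_PLANNING > ALFURSLevel.AUTOMATIC_FLIGHT_CONTROL
```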