Autonomous Systems

Overview

An autonomous system is composed of several parts, and the level of detail in each part differs with scope: knowing which motors to drive in order to rotate a few degrees is a very different problem from calculating which waypoints to visit for the fastest route.

System

Perception

Sensory data is the first thing an autonomous system requires in order to function. This data can come from sensors mounted directly on the system itself, a networked mesh or cloud, transponders and other broadcasts, or publicly available information such as traffic or weather.

The data is then processed to update the system’s model of the world; a minimal sketch of such a model follows this list. The model includes:

  • Ego: Where the system is located in relation to the world and where it will be

  • Static environment: Where non-moving objects are located

  • Dynamic environment: Where moving objects are located, their behaviour and intent, and a prediction of where they will be in the future
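
As an illustration, a minimal world model could be represented with a few plain containers. This is a hypothetical sketch; the class and field names are invented, not taken from any particular autonomy stack.

    from dataclasses import dataclass, field

    @dataclass
    class Pose:
        """Position and heading of an object in the world frame."""
        x: float
        y: float
        heading_rad: float

    @dataclass
    class TrackedObject:
        """A moving object, including a short-horizon prediction of its path."""
        pose: Pose
        speed: float
        predicted_poses: list  # one expected Pose per future timestep

    @dataclass
    class WorldModel:
        ego: Pose                                             # where the system is
        static_objects: list = field(default_factory=list)    # non-moving obstacles (Pose)
        dynamic_objects: list = field(default_factory=list)   # moving objects (TrackedObject)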

Decision

Once the autonomous system has a model of the world, it can decide what action to take next. The decision can be a random draw from a set of probabilities that is updated based on the model, a selection of what the system believes to be the best action, or something else entirely. The system’s behaviour can be tuned to be more aggressive, more conservative, or to favour some actions over others.
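
For example, the two selection styles above can be captured in a few lines. This sketch is hypothetical; the action names and scores are invented.

    import random

    def decide(action_scores: dict, greedy: bool = False) -> str:
        """Pick the next action from {action: score} derived from the world model.

        greedy=True selects the best-scoring action outright; otherwise the
        scores act as weights for a random draw, so better actions are merely
        more likely. Biasing the scores is how the behaviour is made more
        aggressive or more conservative.
        """
        if greedy:
            return max(action_scores, key=action_scores.get)
        actions = list(action_scores)
        weights = list(action_scores.values())
        return random.choices(actions, weights=weights, k=1)[0]

    # e.g. usually keep the lane, but occasionally overtake
    print(decide({"keep_lane": 0.7, "overtake": 0.2, "brake": 0.1}))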

Control

Once the autonomous system has made a decision, it needs to carry that decision out in the physical world with actuators. The entire process then repeats with new sensory data.
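
Putting the three parts together gives the classic perceive-decide-act loop. The function names below are placeholders for whatever a real stack provides, and the fixed-rate scheduling is deliberately crude.

    import time

    def run(sense, decide, actuate, period_s: float = 0.02):
        """Run the perception -> decision -> control loop at a fixed rate."""
        while True:
            model = sense()          # Perception: update the world model
            action = decide(model)   # Decision: choose the next action
            actuate(action)          # Control: make it happen physically
            time.sleep(period_s)     # then repeat with new sensory data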

Scope

Each part of an autonomous system looks different at each scope, and there are many ways to divide scope. In the scope breakdowns below, every entry is labelled with the part of the system it belongs to: Perception (Ego, Static, or Dynamic), or Mission Execution (Decision and Control).

Autonomous Cars

And other ground vehicles.

Scope

The scopes for a ground vehicle: Route, Lane, and Continuous.

Route

Perception (Ego) - Road network pose:

  • Next intersection

  • Next turn

  • Distance to destination

  • Distance until empty

Perception (Static) - Global mapping:

  • Roads

  • Bridges

  • Train track crossings

  • Buildings

Perception (Dynamic) - Traffic summary:

  • Congestion

  • Road closures

  • Accidents

  • Speed traps

Mission Execution - Route navigation (sketch below):

  • Google Maps
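
At the route scope, mission execution reduces to graph search over the road network, with the traffic summary folded into the edge costs. A minimal sketch, using Dijkstra’s algorithm over an invented road graph and congestion table:

    import heapq

    def shortest_route(graph, congestion, start, goal):
        """Dijkstra over a road graph: graph[node] = [(neighbour, minutes)].

        congestion maps (node, neighbour) to a multiplier on travel time, so
        heavy traffic or closures (large factors) push the route elsewhere.
        """
        queue = [(0.0, start, [start])]
        visited = set()
        while queue:
            cost, node, path = heapq.heappop(queue)
            if node == goal:
                return cost, path
            if node in visited:
                continue
            visited.add(node)
            for neighbour, minutes in graph.get(node, []):
                factor = congestion.get((node, neighbour), 1.0)
                heapq.heappush(queue, (cost + minutes * factor, neighbour, path + [neighbour]))
        return float("inf"), []

    roads = {"A": [("B", 5), ("C", 2)], "B": [("D", 1)], "C": [("D", 8)]}
    traffic = {("A", "B"): 3.0}  # invented congestion on A -> B
    print(shortest_route(roads, traffic, "A", "D"))  # congestion makes A -> C -> D cheaper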

Lane

Perception (Ego) - Lane-level pose:

  • Distance to the left and right lane lines

  • Lane index (e.g. 0 is leftmost)

Perception (Static) - Road configuration:

  • Lane markings

  • Whether the lane is ending

  • Type of lane (e.g. bus, exit)

  • Signs and traffic signals

Perception (Dynamic) - Situation:

  • Type of object

  • Moving object intent

Mission Execution - Behavioural planning (sketch below):

  • Continue in same lane

  • Switch lanes

  • Accelerate/brake
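
A behavioural planner at this scope can be as simple as a rule table over the perceived situation. The thresholds and argument names below are illustrative assumptions, not values from any real vehicle.

    def plan_behaviour(gap_ahead_m, lane_ending, lane_speed, desired_speed):
        """Pick a lane-level behaviour from the perceived road configuration."""
        if lane_ending:
            return "switch_lanes"      # the current lane is about to end
        if gap_ahead_m < 20.0:
            return "brake"             # too close to the object ahead
        if lane_speed < desired_speed and gap_ahead_m > 100.0:
            return "accelerate"        # open road and below desired speed
        return "continue_in_lane"

    print(plan_behaviour(gap_ahead_m=150.0, lane_ending=False, lane_speed=25.0, desired_speed=30.0))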

Continuous

Perception (Ego) - System state:

  • Speed

  • Steering wheel angle

  • Oil temperature

Perception (Static) - Static objects:

  • Location

  • Time to collision

Perception (Dynamic) - Moving objects:

  • Location and direction

  • Time to collision

Mission Execution - Motion control (sketch below):

  • Actuator command
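
At the continuous scope, motion control typically closes a feedback loop around the system state. A minimal proportional-integral speed controller, with invented gains, turning a speed error into a throttle command:

    class SpeedController:
        """PI controller: speed error in, clamped throttle command out."""

        def __init__(self, kp=0.5, ki=0.1):
            self.kp, self.ki = kp, ki
            self.integral = 0.0

        def command(self, target_speed, measured_speed, dt):
            error = target_speed - measured_speed
            self.integral += error * dt
            throttle = self.kp * error + self.ki * self.integral
            return max(0.0, min(1.0, throttle))  # clamp to the actuator's range

    ctrl = SpeedController()
    print(ctrl.command(target_speed=30.0, measured_speed=25.0, dt=0.02))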

AEAC 2023 Student UAS Competition

WARG competition drone.

Scope

The scopes for the competition drone: Cruise, Search, Land, and Continuous.

Cruise

Perception (Ego) - GPS coordinates

Perception (Static) - Waypoint locations

Perception (Dynamic) - Diversion

Mission Execution - Path optimization (sketch below):

  • Order of waypoints
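
Ordering the waypoints is a travelling-salesman-style problem; a greedy nearest-neighbour pass is a cheap approximation for small missions. The coordinates below are invented.

    import math

    def order_waypoints(start, waypoints):
        """Greedy nearest-neighbour ordering of (x, y) waypoints.

        Not optimal, but cheap and usually good enough for a handful of points.
        """
        remaining = list(waypoints)
        order, current = [], start
        while remaining:
            closest = min(remaining, key=lambda w: math.dist(current, w))
            remaining.remove(closest)
            order.append(closest)
            current = closest
        return order

    print(order_waypoints((0, 0), [(5, 5), (1, 0), (0, 2)]))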

Search

Perception (Ego) - Local position relative to waypoint location

Perception (Static) - Locations of possible landing pads and false positives

Perception (Dynamic) - N/A

Mission Execution - Search and pick a landing location
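
Picking a landing location among detections, some of which are false positives, can be sketched as filtering by detection confidence and then taking the nearest credible candidate. The threshold is an assumption for illustration, not a competition rule.

    import math

    def pick_landing_pad(position, detections, min_confidence=0.8):
        """detections: [((x, y), confidence)]; return the nearest credible pad.

        Detections below min_confidence are treated as false positives.
        Returns None if nothing credible is found, i.e. keep searching.
        """
        credible = [pos for pos, conf in detections if conf >= min_confidence]
        if not credible:
            return None
        return min(credible, key=lambda pos: math.dist(position, pos))

    pads = [((10.0, 4.0), 0.95), ((3.0, 2.0), 0.40)]  # the second is likely a false positive
    print(pick_landing_pad((0.0, 0.0), pads))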

Land

Perception (Ego) - Local position relative to landing target

Perception (Static) - Single landing pad

Perception (Dynamic) - N/A

Mission Execution - Landing attempt (sketch below):

  • Descend to landing pad

  • Abort
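
The descend-or-abort choice can be framed as a simple check on the local position error relative to the pad. The tolerance below is an illustrative assumption.

    def landing_step(lateral_error_m, altitude_m, tolerance_m=0.3):
        """Decide the next landing action from local position perception.

        Descend while centred over the pad; abort (climb away and retry)
        if the drone has drifted beyond the tolerance.
        """
        if lateral_error_m > tolerance_m:
            return "abort"
        if altitude_m > 0.1:
            return "descend"
        return "landed"

    print(landing_step(lateral_error_m=0.1, altitude_m=2.5))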

Continuous

Perception (Ego) - System state:

  • Position

  • Altitude

  • Orientation (rotation)

  • Battery voltage

  • Motor RPM

Perception (Static) - Static objects:

  • Landing pad

Perception (Dynamic) - N/A

Mission Execution - Motion control (sketch below):

  • Flight controller

  • Actuator command
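
On a multirotor, the flight controller’s final step is mixing the desired thrust and body torques into individual motor commands. This X-configuration quadcopter mixer is a textbook-style sketch, not WARG’s actual firmware; sign conventions vary between flight controllers.

    def mix_quad_x(thrust, roll, pitch, yaw):
        """Map (thrust, roll, pitch, yaw) demands to four motor outputs in [0, 1].

        Motor order: front-left, front-right, rear-left, rear-right.
        Diagonal pairs spin in opposite directions, which is where the
        yaw signs come from.
        """
        def clamp(v):
            return max(0.0, min(1.0, v))

        return [
            clamp(thrust + roll + pitch - yaw),  # front-left
            clamp(thrust - roll + pitch + yaw),  # front-right
            clamp(thrust + roll - pitch + yaw),  # rear-left
            clamp(thrust - roll - pitch - yaw),  # rear-right
        ]

    print(mix_quad_x(thrust=0.5, roll=0.05, pitch=0.0, yaw=0.02))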
