Autonomous Systems
Overview
An autonomous system is composed of several parts, and the detail of each part differs at each level of scope. Knowing which motors to drive to rotate a few degrees is a very different problem from calculating which waypoints to visit for the fastest route.
System
Perception
An autonomous system first requires sensory data in order to function. This data can come from sensors mounted directly on the system itself, a networked mesh or cloud, transponders and other broadcasts, or publicly available information such as traffic and weather reports.
The data is then processed to update the system’s model of the world. This model includes:
Ego: Where the system is located in relation to the world and where it will be
Static environment: Where non-moving objects are located
Dynamic environment: Where moving objects are located, their behaviour and intent, and a prediction of where they will be in the future
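A minimal sketch of how this world model might be represented. All names and the constant-velocity prediction are illustrative assumptions, not from any particular framework:

```python
from dataclasses import dataclass, field

@dataclass
class Pose:
    # Position and heading in a shared world frame.
    x: float
    y: float
    heading: float

@dataclass
class TrackedObject:
    # A moving object with a simple constant-velocity prediction.
    pose: Pose
    vx: float
    vy: float

    def predict(self, dt: float) -> Pose:
        # Predict where the object will be dt seconds from now.
        return Pose(self.pose.x + self.vx * dt,
                    self.pose.y + self.vy * dt,
                    self.pose.heading)

@dataclass
class WorldModel:
    ego: Pose                                            # where the system is
    static_objects: list = field(default_factory=list)   # non-moving objects
    dynamic_objects: list = field(default_factory=list)  # TrackedObject instances
```

Real systems track uncertainty (e.g. covariances) alongside each estimate; this sketch keeps only the point estimates for clarity.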
Decision
Once the autonomous system has a model of the world, it can make a decision on what action to take next. This can be sampled at random from a set of probabilities that update based on the model, a selection of what it believes to be the best action, or something else. The autonomous system's behaviour can be programmed to be more aggressive, more conservative, or to favour some actions over others.
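For example, sampling an action from model-dependent probabilities could be sketched like this. The scoring inputs and the temperature knob are illustrative assumptions:

```python
import math
import random

def choose_action(actions, scores, temperature=1.0):
    # Sample an action with probability proportional to exp(score / temperature).
    # Lower temperature behaves more greedily (conservative about low-scoring
    # actions); higher temperature explores more (aggressive).
    weights = [math.exp(s / temperature) for s in scores]
    return random.choices(actions, weights=weights, k=1)[0]
```

Setting the temperature very high approaches uniform random choice; setting it very low approaches always picking the best-scoring action.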
Control
Once the autonomous system has made a decision, it needs to carry it out in the physical world through actuators. The entire process then repeats with new sensory data.
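The perception, decision, and control stages described above form a repeating cycle, which can be summarised as a loop. The function names here are placeholders for whatever each stage actually does:

```python
def run_autonomous_system(sense, update_model, decide, actuate, model, steps):
    # One possible shape of the sense-decide-act loop.
    for _ in range(steps):
        data = sense()                      # Perception: gather sensory data
        model = update_model(model, data)   # Update ego / static / dynamic model
        action = decide(model)              # Decision: pick the next action
        actuate(action)                     # Control: make it happen via actuators
    return model
```

In practice the stages often run asynchronously at different rates (perception faster than planning, for instance), but the data flow is the same.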
Scope
Each part of an autonomous system looks different at each scope, and there are many ways to divide scope.
Autonomous Cars
And other ground vehicles.
Scope | Perception (Ego) | Perception (Static) | Perception (Dynamic) | Mission Execution (Decision and Control)
---|---|---|---|---
Route | Road network pose | Global mapping | Traffic summary | Route navigation
Lane | Lane level pose | Road configuration | Situation | Behavioural planning
Continuous | System state | Static objects | Moving objects | Motion control
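At the route scope, navigation is typically a graph search over the road network. A minimal sketch using Dijkstra's algorithm, with edge costs standing in for travel time (the toy graph in the test is an assumption for illustration):

```python
import heapq

def shortest_route(graph, start, goal):
    # Dijkstra's algorithm over a road network given as
    # {node: [(neighbour, travel_cost), ...]}. Returns (cost, path).
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbour, edge_cost in graph.get(node, []):
            if neighbour not in visited:
                heapq.heappush(queue, (cost + edge_cost, neighbour, path + [neighbour]))
    return float("inf"), []
```

Production routers use far larger graphs and precomputation (e.g. contraction hierarchies), but the underlying problem is the same shortest-path search.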
AEAC 2023 Student UAS Competition
WARG competition drone.
Scope | Perception (Ego) | Perception (Static) | Perception (Dynamic) | Mission Execution (Decision and Control)
---|---|---|---|---
Cruise | GPS coordinates | Waypoint locations | Diversion | Path optimization
Search | Local position relative to waypoint location | Locations of possible landing pads and false positives | N/A | Search and pick a landing location
Land | Local position relative to landing target | Single landing pad | N/A | Landing attempt
Continuous | System state | Static objects | N/A | Motion control
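At the cruise scope, path optimization amounts to ordering the waypoints to reduce travel distance. A simple nearest-neighbour heuristic sketch; the coordinates in the test are illustrative, and this is not the actual WARG implementation:

```python
import math

def nearest_neighbour_order(start, waypoints):
    # Greedy waypoint ordering: repeatedly fly to the closest remaining
    # waypoint. A heuristic, not a guaranteed-optimal tour.
    order = []
    current = start
    remaining = list(waypoints)
    while remaining:
        nxt = min(remaining, key=lambda w: math.dist(current, w))
        order.append(nxt)
        remaining.remove(nxt)
        current = nxt
    return order
```

The exact ordering problem is the travelling salesman problem; for the handful of waypoints in a competition mission, exhaustive search or a heuristic like this is usually sufficient.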