
This is a controlled document.

Only the following members may edit:

  • Autonomy leads

  • Airside project manager

  • Xierumeng (I’m leaving the team :surebud:)

If changes are required, contact one of these members on Discord.

Overview

Audience: All members.

The airside system is an autonomous perception-decision-control system that runs on the drone. It implements the multiprocessing worker model: Python multiprocessing worker model

All workers use the local space coordinate system except for the Flight interface worker: Unit and Coordinate Conventions
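For intuition only, a world-to-local conversion could look like the hypothetical flat-earth sketch below, which maps global latitude/longitude to north/east metres relative to a home position. The team's actual conventions and math are defined in Unit and Coordinate Conventions; the coordinates used here are made up:

```python
import math

EARTH_RADIUS_M = 6371000.0  # Mean Earth radius, spherical approximation


def world_to_local(lat_deg, lon_deg, home_lat_deg, home_lon_deg):
    """Flat-earth approximation: global lat/lon to local north/east metres
    relative to a home position. Only valid over short distances."""
    d_lat = math.radians(lat_deg - home_lat_deg)
    d_lon = math.radians(lon_deg - home_lon_deg)
    north = d_lat * EARTH_RADIUS_M
    east = d_lon * EARTH_RADIUS_M * math.cos(math.radians(home_lat_deg))
    return north, east


# Roughly 100 m north and 40 m west of the home position
north, east = world_to_local(43.4332, -80.5775, 43.4323, -80.5770)
print(round(north, 1), round(east, 1))
```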

More information on autonomous systems in general: Autonomous Systems

Additional resources

System

The airside system implements multiprocessing: Python multiprocessing worker model

There are 2 dataflow paths:

  • A: Selection of landing location

  • B: Autonomous landing

(Diagram: Airside system software architecture.drawio)

Key:

  • Red: Perception

  • Orange: Tracking

  • White: Planning

  • Green: Control (also perception from drone telemetry)

  • Blue: Not the responsibility of the system

Workers

General


Flight interface worker

Interfaces with the flight controller. The output of the worker is used for perception and decision making; the input is used for controlling the drone.

Task:

  • Telemetry:

    • Polls the common flight controller for drone telemetry

    • Converts the telemetry position from world to local

    • Timestamps the telemetry and sends it to the appropriate worker(s)

  • Commands:

Video input worker

Interfaces with the camera device.

Task:

  • Gets frames from the common camera and timestamps them
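A minimal sketch of the frame-timestamping step, assuming a hypothetical `camera_read` callable rather than the real common camera API:

```python
import time
from dataclasses import dataclass


@dataclass(frozen=True)
class TimestampedFrame:
    """A camera frame paired with the time it was captured."""
    timestamp: float  # Seconds since the epoch, taken at capture time
    frame: bytes      # Placeholder for real image data (e.g. a numpy array)


def capture_frame(camera_read):
    """Reads one frame from the camera and timestamps it immediately,
    so downstream workers can match it against telemetry."""
    frame = camera_read()
    return TimestampedFrame(timestamp=time.time(), frame=frame)


# Stand-in for a real camera read
stamped = capture_frame(lambda: b"raw-image-bytes")
print(stamped.timestamp > 0)
```

Timestamping at capture time (rather than downstream) keeps the later telemetry merge accurate.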

Detect target worker

Detects objects (if any) in images.

Task:

  • Gets images and detects objects to create bounding boxes

    • There may be 0 object detections and therefore 0 bounding boxes in the list

  • Sends the bounding boxes with the forwarded timestamp

    • Decision worker controls which output queue is used
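The bounding-box output described above can be sketched as follows; `BoundingBox`, `detect_targets`, and `run_model` are illustrative names, not the real detector interface. Note the detection list may be empty:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class BoundingBox:
    """Axis-aligned bounding box in image pixel coordinates, with confidence."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float
    confidence: float


def detect_targets(image, timestamp, run_model):
    """Runs the detector on one image and forwards the image's original
    timestamp with the (possibly empty) list of bounding boxes."""
    detections = [
        BoundingBox(*box, confidence=score) for box, score in run_model(image)
    ]
    return timestamp, detections


# Stand-in detector returning one detection
fake_model = lambda image: [((10.0, 20.0, 50.0, 60.0), 0.9)]
ts, boxes = detect_targets("image", 123.456, fake_model)
print(ts, len(boxes))  # 123.456 1
```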

Decision worker

Builds a model of the world and takes action.

Input list is sorted in descending order of confidence (i.e. index 0 is best, 1 is next best, etc.).

Task:

  • Maintain model of world

  • Decide which landing pad to investigate

    • Search if there is no landing pad: TODO Write documentation

TODO CREATE DECISION WORKER DOCUMENT
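Pending the full document, a trivial sketch of how the confidence ordering of the input is used (index 0 is the best detection, and an empty list would trigger a search); the names are illustrative:

```python
def choose_landing_pad(detections_by_confidence):
    """Input is sorted in descending order of confidence, so index 0 is the
    best candidate. Returns None when there are no detections, in which case
    the decision worker would search instead."""
    if not detections_by_confidence:
        return None
    return detections_by_confidence[0]


print(choose_landing_pad(["pad-high-conf", "pad-low-conf"]))  # pad-high-conf
print(choose_landing_pad([]))  # None
```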

Selection of landing location


Data merge worker

Synchronizes telemetry and detected objects.

Task:

  • Gets telemetry data and bounding boxes along with their corresponding timestamp

  • Merges the telemetry data with the bounding boxes by closest timestamp
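The closest-timestamp merge can be sketched as a minimum over absolute time differences; the function name and data shapes below are assumptions, not the worker's real interface:

```python
def merge_by_closest_timestamp(detection_ts, telemetry):
    """Returns the telemetry entry whose timestamp is closest to the
    detection's. `telemetry` is a list of (timestamp, data) pairs."""
    return min(telemetry, key=lambda entry: abs(entry[0] - detection_ts))


telemetry_log = [(1.0, "pose-A"), (1.5, "pose-B"), (2.0, "pose-C")]
matched = merge_by_closest_timestamp(1.6, telemetry_log)
print(matched)  # (1.5, 'pose-B')
```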

Geolocation worker

Finds out where the detected object is in the world.

Task:

  • Gets telemetry data and bounding boxes

  • Converts bounding boxes to locations in the world
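One common way to convert a bounding box to a world location is a pinhole-camera projection; the sketch below assumes a camera pointing straight down and made-up parameter values, which may not match the worker's actual geometry:

```python
def pixel_to_ground_offset(pixel_x, pixel_y, image_width, image_height,
                           focal_length_px, altitude_m):
    """Pinhole-camera sketch for a camera pointing straight down: maps a pixel
    to a ground offset (metres) from the point directly below the drone, via
    similar triangles: offset = altitude * (pixel - centre) / focal length."""
    dx_px = pixel_x - image_width / 2.0
    dy_px = pixel_y - image_height / 2.0
    ground_x = altitude_m * dx_px / focal_length_px
    ground_y = altitude_m * dy_px / focal_length_px
    return ground_x, ground_y


# Bounding box centre 100 px right of image centre, 50 m altitude,
# 1000 px focal length
offset = pixel_to_ground_offset(740.0, 360.0, 1280, 720, 1000.0, 50.0)
print(offset)  # (5.0, 0.0)
```

The resulting offset, combined with the merged telemetry position, gives the detection's location in local space.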

Cluster estimation worker

Estimates the location of objects in the world based on groups of detections.

Input list length is the number of simultaneous detections in a single frame.

Task:

  • Collects a scatter plot of locations of detected objects in the world

  • Estimates the location of each object
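As an illustration of the idea only (not necessarily the algorithm the worker uses), a greedy distance-threshold clustering over the scatter of world locations, returning one centroid per estimated object:

```python
import math


def estimate_object_locations(points, radius_m=2.0):
    """Greedy clustering sketch: assigns each detection to the first cluster
    whose centroid is within `radius_m`, otherwise starts a new cluster, then
    returns one centroid per cluster as the estimated object locations."""
    clusters = []
    for point in points:
        for cluster in clusters:
            cx = sum(p[0] for p in cluster) / len(cluster)
            cy = sum(p[1] for p in cluster) / len(cluster)
            if math.hypot(point[0] - cx, point[1] - cy) <= radius_m:
                cluster.append(point)
                break
        else:
            clusters.append([point])
    return [
        (sum(p[0] for p in c) / len(c), sum(p[1] for p in c) / len(c))
        for c in clusters
    ]


# Two tight groups of detections should yield two estimated objects
detections = [(0.0, 0.0), (0.2, 0.1), (10.0, 10.0), (10.1, 9.9)]
centroids = estimate_object_locations(detections)
print(len(centroids))  # 2
```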

Autonomous landing


Autonomous landing worker

Lands on selected landing pad.

Task:

  • Gets bounding boxes

  • Converts the bounding boxes into TODO