
...

The airside system is an autonomous perception-decision-control system that runs on the drone. A diagram of the pipeline is shown below. The airside system implements the multiprocessing worker model: Python multiprocessing worker model

All workers use the local space coordinate system except for the flight interface worker: Unit and Coordinate Conventions
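As a rough illustration of the worker model (names here are hypothetical, not the actual repository code): each worker runs in its own process, blocking on an input queue, processing each item, and forwarding the result to the next worker's queue.

```python
import multiprocessing as mp


def double(item: int) -> int:
    """Stand-in for a worker's real processing step."""
    return item * 2


def worker(input_queue: mp.Queue, output_queue: mp.Queue) -> None:
    """Generic worker loop: block on input, process, forward.

    None acts as the shutdown signal (poison pill).
    """
    while True:
        item = input_queue.get()
        if item is None:
            break
        output_queue.put(double(item))


if __name__ == "__main__":
    in_q: mp.Queue = mp.Queue()
    out_q: mp.Queue = mp.Queue()
    proc = mp.Process(target=worker, args=(in_q, out_q))
    proc.start()
    for value in (1, 2, 3):
        in_q.put(value)
    in_q.put(None)  # Ask the worker to shut down
    results = [out_q.get() for _ in range(3)]
    proc.join()
```

Chaining workers this way is what lets each stage (camera input, detection, merging, geolocation) run concurrently on its own core.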

System

Diagram: airsideSystem.drawio

Multiprocessing

TODO: Link to multiprocessing page when written

Workers

...

Key:

  • Blue: Not the responsibility of the system

Workers

Worker

Description

Task

Flight interface worker

Interfaces with the flight controller. The output of the worker is used for perception and decision making; the input is used for

...

controlling the drone.

Task:

  • Telemetry:

    • Polls the common flight controller for drone telemetry

    • Converts the telemetry position from global to local

    • Timestamps the telemetry and sends it to the appropriate worker(s)

  • Commands:

    • Receives commands from decision worker

    • If applicable, converts the command from local space to global

...

Input:

  • decision_worker: AirsideCommand

Output:

  • data_merge_worker: TODO: OdometryAndTime

  • decision_worker: Tuple of:

    • TODO: telemetry

    • TODO: waypoint state

    • TODO: command ready state
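The telemetry half of the task above can be sketched as follows. The home location, the flat-earth conversion, and all type names are illustrative assumptions; the real worker uses the repository's `OdometryAndTime` type and the drone's actual home position.

```python
import math
import time
from dataclasses import dataclass

# Assumed home location (deg) for illustration only.
HOME_LAT_DEG = 43.4723
HOME_LON_DEG = -80.5449
METRES_PER_DEG_LAT = 111_320.0


@dataclass
class LocalTelemetry:
    """Hypothetical local-space telemetry message (NED, metres)."""
    north_m: float
    east_m: float
    down_m: float
    timestamp: float


def global_to_local(lat_deg: float, lon_deg: float) -> tuple[float, float]:
    """Crude flat-earth conversion from global coordinates (deg)
    to a local north/east offset (m) relative to home."""
    north = (lat_deg - HOME_LAT_DEG) * METRES_PER_DEG_LAT
    east = (lon_deg - HOME_LON_DEG) * METRES_PER_DEG_LAT * math.cos(
        math.radians(HOME_LAT_DEG)
    )
    return north, east


def make_telemetry(lat_deg: float, lon_deg: float, alt_m: float) -> LocalTelemetry:
    """Converts polled global telemetry to local space and timestamps it."""
    north, east = global_to_local(lat_deg, lon_deg)
    return LocalTelemetry(north, east, -alt_m, time.time())
```

The command half runs the same conversion in reverse before handing the command to the flight controller.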

video_input_worker

...

Video input worker

Interfaces with the camera device.

Task:

  • Gets frames from the common camera and timestamps them

Output:

...

Detect target

...

worker

...

detect_target_worker

Perception.

Input:

...

Detects objects (if any) in images.

Task:

  • Gets images and detects objects to create bounding boxes

    • There may be 0 object detections and therefore 0 bounding boxes in the list

  • Sends the bounding boxes with the forwarded timestamp

Output:

...
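The shape of the detect target worker's output can be sketched with a toy detector. The threshold rule stands in for the real model inference, and the type names are illustrative; the key points are that the detection list may be empty and that the frame's timestamp is forwarded unchanged.

```python
from dataclasses import dataclass


@dataclass
class BoundingBox:
    """Hypothetical axis-aligned box with a detection confidence."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float
    confidence: float


@dataclass
class DetectionsAndTime:
    detections: list  # May be empty: zero detections is still a valid message
    timestamp: float


def detect(row_brightness: list, timestamp: float) -> DetectionsAndTime:
    """Toy detector: any image row brighter than 0.5 becomes a
    one-row-tall 'detection'. The input frame's timestamp is
    forwarded unchanged to downstream workers."""
    boxes = [
        BoundingBox(0.0, float(i), 1.0, float(i + 1), brightness)
        for i, brightness in enumerate(row_brightness)
        if brightness > 0.5
    ]
    return DetectionsAndTime(boxes, timestamp)
```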

Data merge

...

worker

...

data_merge_worker

Perception.

Input:

  • flight_interface_worker: OdometryAndTime

  • detect_target_worker: DetectionsAndTime

Synchronizes telemetry and detected objects.

Task:

  • Gets telemetry data and bounding boxes along with their corresponding timestamp

  • Merges the telemetry data with the bounding boxes by closest timestamp

Output:

  • geolocation_worker: MergedOdometryDetections
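The closest-timestamp merge can be sketched in a few lines. The tuple layout is illustrative, not the real `OdometryAndTime`/`DetectionsAndTime` types.

```python
def merge_closest(odometry_entries, detection_timestamp):
    """Match a detection to the odometry entry whose timestamp is
    closest to the detection's timestamp.

    `odometry_entries` is a list of (timestamp, odometry) tuples.
    """
    return min(
        odometry_entries,
        key=lambda entry: abs(entry[0] - detection_timestamp),
    )


odometry = [(1.0, "odom_a"), (2.0, "odom_b"), (3.5, "odom_c")]
matched = merge_closest(odometry, 2.2)  # (2.0, "odom_b") is closest
```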

geolocation_worker

Perception.

Input:

  • data_merge_worker: MergedOdometryDetections

Geolocation worker

Determines where the detected objects are in the world.

Task:

  • Gets telemetry data and bounding boxes

  • Converts bounding boxes to locations in the world

Output:

...
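The bounding-box-to-world conversion can be sketched as a pinhole projection. The intrinsics below (`fx`, `fy`, `cx`, `cy`) are made-up values, not the real camera's calibration, and drone attitude is ignored for simplicity.

```python
def pixel_to_ground_offset(px: float, py: float, altitude_m: float,
                           fx: float = 1000.0, fy: float = 1000.0,
                           cx: float = 320.0, cy: float = 240.0):
    """Project a pixel (e.g. a bounding box centre) to a ground offset
    in metres, assuming a nadir-pointing pinhole camera.

    fx, fy are focal lengths in pixels; cx, cy is the principal point.
    """
    x = (px - cx) / fx * altitude_m
    y = (py - cy) / fy * altitude_m
    return x, y
```

Combining this offset with the merged telemetry position yields the detection's location in local space.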

Cluster estimation

...

worker

...

cluster_estimation_worker

Perception.

Input:

  • geolocation_worker: list[DetectionInWorld]

...

Estimates the location of objects in the world based on groups of detections.

Input list length is the number of simultaneous detections in a single frame.

Task:

  • Collects the world locations of detected objects into a scatter plot

  • Estimates the location of each object

...

Output:

  • controller_worker: list[ObjectInWorld]

...
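The estimation step can be sketched with a naive single-pass clustering, a stand-in for the real estimator: each point joins the first cluster whose centre is within `radius`, otherwise it starts a new cluster, and each object's estimated location is its cluster mean.

```python
import math


def estimate_clusters(points, radius=1.0):
    """Naive single-pass clustering over (x, y) world locations.

    Returns the mean position of each cluster as the estimated
    object locations.
    """
    clusters: list[list[tuple[float, float]]] = []
    for point in points:
        for members in clusters:
            cx = sum(m[0] for m in members) / len(members)
            cy = sum(m[1] for m in members) / len(members)
            if math.dist(point, (cx, cy)) <= radius:
                members.append(point)
                break
        else:
            clusters.append([point])
    return [
        (sum(m[0] for m in c) / len(c), sum(m[1] for m in c) / len(c))
        for c in clusters
    ]
```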

Decision worker

Builds a model of the world and takes action.

Input list is sorted in descending order of confidence (i.e. index 0 is best, 1 is next best, etc.)

...

decision_worker

...

Input:

  • flight_interface_worker: Tuple of:

    • TODO: telemetry

    • TODO: waypoint state

    • TODO: command ready state

  • cluster_estimation_worker: list[ObjectInWorld]

Task:

  • Maintains a model of the world

  • Decides which landing pad to investigate

    • Searches if there is no landing pad: TODO Write documentation

...

Output:

...
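The landing pad decision can be sketched as follows. Because the input list arrives sorted by descending confidence, the best candidate is always index 0. The `(action, pad)` return value is illustrative, not the real command type sent to flight_interface_worker.

```python
def choose_landing_pad(pads):
    """Pads arrive sorted by descending confidence, so index 0 is the
    best candidate. Falls back to searching when no pad is known."""
    if not pads:
        return ("search", None)  # No known pad: search instead
    return ("land", pads[0])
```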