Bootcamp Architecture
- 1 Overview
- 2 Multiprocessing
- 3 Workers
- 3.1 simulation_worker
- 3.1.1 Physical
- 3.1.2 Flight controller
- 3.1.3 Camera
- 3.2 detect_landing_pad_worker
- 3.3 geolocation_worker
- 3.4 display_worker
- 3.5 decision_worker
- 4 Simulators
- 4.1 Helpers
- 4.1.1 generate_destination
- 4.2 Decision example
- 4.3 Decision with simple waypoint
- 4.4 Decision with waypoint and landing pad
Overview
The Autonomy bootcamp is a software-in-the-loop (SITL) drone simulator. The simulator is composed of 5 worker processes; the loop is closed by commands sent from decision_worker back to simulation_worker. The simulator is synchronous: a command must be sent from decision_worker to simulation_worker every timestep.
There are 3 simulators:
Task 2: Decision example
Task 3: Decision with simple waypoint
Task 4: Decision with waypoint and landing pad
A variety of unit tests and integration tests exist to ensure there are no bugs in the starter code.
Multiprocessing
The simulator follows the Python multiprocessing worker model: each worker runs in its own process and passes its data to the next worker through queues.
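A minimal sketch of the worker model, with hypothetical worker and queue names (the real workers and the tuples they exchange are described in the next section):

```python
import multiprocessing as mp


def example_worker(input_queue: mp.Queue, output_queue: mp.Queue) -> None:
    """Reads items from input_queue, processes them, writes to output_queue."""
    while True:
        item = input_queue.get()
        if item is None:  # Sentinel value signals shutdown
            break
        output_queue.put(item)  # A real worker would transform the item here


if __name__ == "__main__":
    # Queues connect workers into a pipeline; each worker is its own process
    queue_a = mp.Queue()
    queue_b = mp.Queue()
    worker = mp.Process(target=example_worker, args=(queue_a, queue_b))
    worker.start()

    queue_a.put("camera frame")  # Main pushes work in
    print(queue_b.get())         # ...and collects the result
    queue_a.put(None)            # Shut the worker down
    worker.join()
```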
Workers
simulation_worker
Task:
Simulates a physical drone
Input:
Command
Output:
Tuple of:
DroneReport: Drone information
list: Empty list (for compatibility)
np.ndarray: Camera image
Physical
The drone flies at a height of 30 metres with a maximum speed of 5 m/s and a downwards-facing camera with a diagonal field of view of ~40°. The camera's image resolution is 1200x900 pixels, which gives a viewing area of 20x15 m on the ground. The drone does not rotate.
The drone either travels at a speed of 5 m/s or is not moving, with no transition state. An implicit Euler approximation is used to calculate the drone's position from its velocity. There is a correction step when the drone is within the acceptance boundary or overshoots its destination, which teleports the drone to the destination.
The timestep is set in main and passed to the simulation.
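A minimal sketch of the position update described above; the function name and the acceptance radius value are illustrative, not the starter code's actual names:

```python
import math

SPEED = 5.0              # m/s, from the physical model above
ACCEPTANCE_RADIUS = 0.1  # m, illustrative value


def step_position(position: tuple[float, float],
                  destination: tuple[float, float],
                  dt: float) -> tuple[float, float]:
    """One Euler step towards the destination, with the teleport correction."""
    dx = destination[0] - position[0]
    dy = destination[1] - position[1]
    distance = math.hypot(dx, dy)

    step = SPEED * dt
    # Correction: within the acceptance boundary or about to overshoot,
    # snap directly to the destination
    if distance <= ACCEPTANCE_RADIUS or step >= distance:
        return destination

    # Otherwise move at full speed along the direction of travel
    return (position[0] + step * dx / distance,
            position[1] + step * dy / distance)
```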
Flight controller
The drone has 3 statuses:
Halted: The drone is not moving.
Moving: The drone is moving.
Landed: The drone has landed and the simulator is ending.
The drone also has a global target destination. When the drone status is Halted or Landed, the destination is the same as the drone’s position.
The drone accepts several commands:
Null: Default. Does nothing, but is required to advance the simulator.
Set relative destination: Moves the drone to a destination relative to its current position. Requires the drone to be halted. The destination must also be within the flight boundary.
Halt: Stops the drone immediately at its current position.
Land: Lands the drone at its current position and ends the simulation. Requires the drone to be halted.
If the drone receives an invalid command, the command is ignored with a warning.
There is no set absolute/global destination command, which is a deliberate design choice to acclimatize the user to substandard interfaces.
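A sketch of the status and command handling described above; the enum names, function, and warning text are hypothetical stand-ins for the starter code's actual interface:

```python
import enum


class DroneStatus(enum.Enum):
    HALTED = 0
    MOVING = 1
    LANDED = 2


class CommandType(enum.Enum):
    NULL = 0                      # Default; advances the simulator
    SET_RELATIVE_DESTINATION = 1  # Requires Halted; destination in boundary
    HALT = 2                      # Stop immediately at the current position
    LAND = 3                      # Requires Halted; ends the simulation


def apply_command(status, position, destination, command, relative, in_boundary):
    """Return the new (status, destination); invalid commands are ignored with a warning."""
    if command == CommandType.SET_RELATIVE_DESTINATION:
        target = (position[0] + relative[0], position[1] + relative[1])
        if status == DroneStatus.HALTED and in_boundary(target):
            return DroneStatus.MOVING, target
        print("Warning: invalid set relative destination command ignored")
    elif command == CommandType.HALT:
        # Halting snaps the destination back to the drone's position
        return DroneStatus.HALTED, position
    elif command == CommandType.LAND:
        if status == DroneStatus.HALTED:
            return DroneStatus.LANDED, position
        print("Warning: invalid land command ignored")
    # Null command or an ignored invalid command: state is unchanged
    return status, destination
```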
Camera
The map is composed of map images and landing pad images which are dynamically stitched together at runtime. The file name of each map image is its coordinate, following the pixel system direction; a default image is used if the file for an image coordinate is missing. The flight boundary images are copies of the boundary image.
The pixel space’s y direction is opposite of the world space’s y direction. As a result:
Top right is positive in world space
Bottom right is positive in pixel space
TODO: The 2023 landing pad model is of such low quality that only the default images are used within the flight boundary to avoid false positives.
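A sketch of the stitching idea, assuming a hypothetical tile directory and file name format (the real scheme names each tile by its image coordinate in the pixel system direction):

```python
import pathlib

import cv2
import numpy as np

MAP_DIRECTORY = pathlib.Path("maps")  # Hypothetical path


def load_tile(x: int, y: int) -> np.ndarray:
    """Load the map tile at image coordinate (x, y), falling back to the default."""
    tile_path = MAP_DIRECTORY / f"{x}_{y}.png"  # Hypothetical name format
    if tile_path.exists():
        return cv2.imread(str(tile_path))
    return cv2.imread(str(MAP_DIRECTORY / "default.png"))


def stitch(x0: int, y0: int, x1: int, y1: int) -> np.ndarray:
    """Stitch the tiles in [x0, x1) x [y0, y1) into one image.

    Rows are stacked top to bottom because pixel space y grows downwards.
    """
    rows = [np.hstack([load_tile(x, y) for x in range(x0, x1)])
            for y in range(y0, y1)]
    return np.vstack(rows)
```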
detect_landing_pad_worker
Task:
Detects landing pad on camera image
Input:
Tuple of:
DroneReport: Drone information (unused)
list: Empty list (unused)
np.ndarray: Camera image (input)
Output:
Tuple of:
DroneReport: Drone information (unchanged)
list[BoundingBox]: List of bounding boxes (output)
np.ndarray: Annotated image (unchanged)
The landing pad is detected using Ultralytics with the 2023 landing pad model trained for the 2023 AEAC competition. TODO: Improved model?
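A sketch of the detection step using the Ultralytics API; the weights file name is an assumption, as the starter code ships its own path:

```python
import numpy as np
from ultralytics import YOLO

MODEL = YOLO("landing_pad_2023.pt")  # Weights file name is an assumption


def detect_landing_pads(image: np.ndarray) -> list[tuple[float, float, float, float]]:
    """Run the model on one camera image and return xyxy bounding boxes."""
    results = MODEL.predict(image, verbose=False)
    boxes = results[0].boxes  # Detections for the single input image
    # Each row is (x_min, y_min, x_max, y_max) in pixel coordinates
    return [tuple(xyxy) for xyxy in boxes.xyxy.tolist()]
```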
geolocation_worker
Task:
Converts bounding boxes to locations
Input:
Tuple of:
DroneReport: Drone information (input)
list[BoundingBox]: List of bounding boxes (input)
np.ndarray: Annotated image (unused)
Output:
Tuple of:
DroneReport: Drone information (unchanged)
list[Location]: List of landing pad positions (output)
np.ndarray: Annotated image (unchanged)
As the drone's camera always faces directly downwards with no rotation, geolocation is a simple translation and scaling operation, sketched below the following note.
The pixel space’s y direction is opposite of the world space’s y direction. As a result:
Top right is positive in world space
Bottom right is positive in pixel space
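A minimal sketch of the translation and scaling, using the camera's 1200x900 pixel resolution and 20x15 m viewing area from the physical model; the function name is illustrative:

```python
RESOLUTION = (1200, 900)  # px, from the camera model above
VIEW_AREA = (20.0, 15.0)  # m on the ground at 30 m altitude


def pixel_to_world(pixel_x: float, pixel_y: float,
                   drone_x: float, drone_y: float) -> tuple[float, float]:
    """Map a pixel in the camera image to a world position.

    The drone is at the image centre; scaling converts pixels to metres,
    and the y axis is flipped because pixel y grows downwards while
    world y grows upwards.
    """
    metres_per_pixel_x = VIEW_AREA[0] / RESOLUTION[0]
    metres_per_pixel_y = VIEW_AREA[1] / RESOLUTION[1]

    offset_x = (pixel_x - RESOLUTION[0] / 2) * metres_per_pixel_x
    offset_y = (pixel_y - RESOLUTION[1] / 2) * metres_per_pixel_y

    return drone_x + offset_x, drone_y - offset_y  # y flip
```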
display_worker
Task:
Displays the camera image and drone information
Input:
Tuple of:
DroneReport: Drone information (input)
list: List (unused)
np.ndarray: Image (input)
Output:
Tuple of:
DroneReport: Drone information (unchanged)
list: List (unchanged)
np.ndarray: Image (unchanged)
The display worker generates an information pane with the drone's information along with the provided seed. The information pane is concatenated with the input image and displayed. When the drone has landed, the display image is saved as a screenshot of the landing position.
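A sketch of the pane concatenation with OpenCV; the pane height, text layout, and file name are illustrative, not the starter code's values:

```python
import cv2
import numpy as np

PANE_HEIGHT = 120  # px, illustrative


def build_display(image: np.ndarray, lines: list[str]) -> np.ndarray:
    """Stack an information pane above the camera image (assumes a 3-channel image)."""
    pane = np.zeros((PANE_HEIGHT, image.shape[1], 3), dtype=np.uint8)
    for i, line in enumerate(lines):
        cv2.putText(pane, line, (10, 25 + 30 * i),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (255, 255, 255), 1)
    return np.vstack((pane, image))


# Usage: show each frame, and save a screenshot once the drone has landed
# frame = build_display(image, [f"seed: {seed}", f"position: {x:.1f}, {y:.1f}"])
# cv2.imshow("display", frame)
# if landed:
#     cv2.imwrite("landing.png", frame)
```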
decision_worker
Task:
Commands the drone
Input:
Tuple of:
DroneReport: Drone information (input)
list[Location]: List of landing pad positions (input)
np.ndarray: Annotated image (unused)
Output:
Command
Commands the drone every timestep with the provided information.
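A sketch of one possible per-timestep decision: fly to a waypoint, then land. The Command stand-in, status strings, and acceptance radius are hypothetical; the starter code defines its own Command class:

```python
import dataclasses
import math

ACCEPTANCE_RADIUS = 0.1  # m, illustrative


@dataclasses.dataclass
class Command:
    """Stand-in for the real Command class; names are hypothetical."""
    kind: str
    relative: tuple[float, float] = (0.0, 0.0)


def decide(status: str, position: tuple[float, float],
           waypoint: tuple[float, float]) -> Command:
    """Return the command for this timestep."""
    dx = waypoint[0] - position[0]
    dy = waypoint[1] - position[1]

    if status == "HALTED":
        if math.hypot(dx, dy) <= ACCEPTANCE_RADIUS:
            return Command("LAND")  # Landing requires the drone to be halted
        # Only a relative destination command exists (see flight controller)
        return Command("SET_RELATIVE_DESTINATION", (dx, dy))

    return Command("NULL")  # A command is required every timestep
```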
Simulators
The simulator uses a status queue from the workers to main to indicate that a worker requests the simulator to end. A worker may request the end because of an error or because the drone has landed.
At the start of the simulator, the seed is logged for reproducibility. At the end of the simulator, the drone report, waypoint destination, landing pad positions, and seed are logged.
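A minimal sketch of the status queue shutdown path, with hypothetical message values:

```python
import multiprocessing as mp


def worker(status_queue: mp.Queue) -> None:
    """Runs its loop, then asks main to end the simulator."""
    # ... per-timestep work happens here ...
    status_queue.put("landed")  # or "error" if something went wrong


if __name__ == "__main__":
    status_queue = mp.Queue()
    process = mp.Process(target=worker, args=(status_queue,))
    process.start()

    reason = status_queue.get()  # Main blocks until a worker requests the end
    print(f"Ending simulator: {reason}")
    process.join()
```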
Helpers
generate_destination
A single waypoint is generated away from the drone's initial position, but far enough within the flight boundary that the generated landing pads are also within the flight boundary.
1-3 (inclusive) landing pad positions are generated around the waypoint. The landing pads are within the image boundary seen by the camera when the drone is centred on the waypoint.
TODO: The 2023 landing pad model is of such low quality that landing pads are further constrained in the y direction to avoid overlap with the default image coordinate text.
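A sketch of the generation logic; the boundary size, margin, and minimum distance are illustrative values, and the half-view comes from the 20x15 m camera viewing area:

```python
import math
import random

BOUNDARY = 60.0          # m, illustrative half-width of the flight boundary
MARGIN = 10.0            # m, keeps pads generated around the waypoint in boundary
MIN_DISTANCE = 5.0       # m, illustrative; waypoint must not sit on the start
HALF_VIEW = (10.0, 7.5)  # m, half of the 20x15 m camera view at the waypoint


def generate_destination(rng: random.Random, initial: tuple[float, float]):
    """Generate one waypoint and 1-3 landing pads around it."""
    # Waypoint away from the initial position, well inside the flight boundary
    while True:
        waypoint = (rng.uniform(-BOUNDARY + MARGIN, BOUNDARY - MARGIN),
                    rng.uniform(-BOUNDARY + MARGIN, BOUNDARY - MARGIN))
        if math.hypot(waypoint[0] - initial[0],
                      waypoint[1] - initial[1]) > MIN_DISTANCE:
            break

    # Pads fall inside the camera view when the drone is centred on the waypoint
    pads = [(waypoint[0] + rng.uniform(-HALF_VIEW[0], HALF_VIEW[0]),
             waypoint[1] + rng.uniform(-HALF_VIEW[1], HALF_VIEW[1]))
            for _ in range(rng.randint(1, 3))]
    return waypoint, pads
```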
Decision example
No waypoint is generated. The landing pad positions are hardcoded for consistency.
The decision example simulator demonstrates a figure-8 movement of the drone within the flight boundary.
Decision with simple waypoint
The waypoint is generated. The landing pads are at the initial position of the drone and at the waypoint.
Decision with waypoint and landing pad
The waypoint and landing pad positions are generated.