LTE Communication Architecture


**UPDATE: This project is archived as of 2025. The drone will no longer be using LTE communication, as all computation will be done on a Raspberry Pi on the drone rather than on a laptop/Jetson on the ground. This was due to the cost of the data and power needed to run LTE.**

Overview

For the AEAC 2024 Competition, the Mechanical team brought up the issue of the drone being overweight. To help reduce the load, the Autonomy team suggested not putting the Jetson onboard the drone and instead running the airside system on the ground.

For the airside system to work on the ground, we will need to transmit drone telemetry and images from the $200 CV camera to the ground station.

Drone Telemetry

The airside system will run on the Jetson, which will be on the ground. To stream drone telemetry to the Jetson, we can connect the Jetson to the ZeroTier network and have it connected to a mobile phone hotspot. On the Raspberry Pi, we can configure MAVProxy to stream drone telemetry to a UDP port on the Jetson, and the FlightInterface module will be able to read MAVLink messages by specifying that UDP port in the connection address.
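As a rough sketch, this is how a MAVLink stream arriving on that UDP port can be read with pymavlink. The port number comes from this document; the connection string and message handling below are illustrative, not the actual FlightInterface code:

```python
# Minimal sketch: read the MAVLink stream that MAVProxy pushes to UDP port 15550.
from pymavlink import mavutil

# "udpin" listens on the given port for incoming MAVLink packets.
connection = mavutil.mavlink_connection("udpin:0.0.0.0:15550")
connection.wait_heartbeat()  # block until the stream is confirmed alive
print(f"Heartbeat from system {connection.target_system}")

while True:
    # Wait for the next position message and print it in human-readable units.
    msg = connection.recv_match(type="GLOBAL_POSITION_INT", blocking=True)
    print(msg.lat / 1e7, msg.lon / 1e7, msg.relative_alt / 1000.0)
```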

Connect Jetson to ZeroTier Network

Install the ZeroTier CLI and connect to the ZeroTier network.

Steps

  • Install ZeroTier with the following command (Note: make sure the Jetson is connected to an Ethernet cable first):

curl -s https://install.zerotier.com | sudo bash

  • Join the ZeroTier network (the Network ID, device address, and IP can be found in WARG’s ZeroTier account):

sudo zerotier-cli join [YOUR_NETWORK_ID]
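
  • Confirm the join succeeded by listing the networks the device has joined:

sudo zerotier-cli listnetworks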

Configure MAVProxy on Raspberry Pi

We need to configure MAVProxy on the Raspberry Pi to stream drone telemetry to UDP port 15550 on the Jetson.

Follow this document and add --out JETSON_IP_ADDRESS:15550 at the end of the ExecStart line.
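For illustration, the modified line might look like the following. Only the --out JETSON_IP_ADDRESS:15550 portion comes from this document; the mavproxy.py path and --master device are placeholders that depend on the actual service file:

ExecStart=/usr/local/bin/mavproxy.py --master=/dev/ttyAMA0 --out JETSON_IP_ADDRESS:15550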

Streaming Drone Telemetry to the Jetson

These instructions assume that the Jetson is connected to the ZeroTier network and the Raspberry Pi streams drone telemetry to a UDP port the Jetson can read from.

  1. Power on the drone with two 6S batteries in series. Note: If you are unsure how to power on the drone, ask the Electrical team for directions.

  2. Ensure that the Raspberry Pi with the LTE HAT is turned on.

  3. Open ZeroTier Central on the WARG laptop to ensure all devices are active in the private network.

    1. Remember the IP address of the Raspberry Pi (listed as WARGRPi on ZeroTier). This will be important for steps 4 and 5.

  4. Use ping IP_ADDRESS on the WARG laptop to ensure the Raspberry Pi and Jetson are online (the IP addresses can be found in ZeroTier Central). A scripted version of this check is sketched after this list.

  5. Change the connection string in the configuration.yml of the computer-vision-python repository to <IP Address of Raspberry Pi>:15550.
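
As a convenience for step 4, a small helper can ping both devices in one go. This is an illustrative sketch, not WARG code; the ping flags assume Linux, and the placeholder IPs must be replaced with the addresses from ZeroTier Central:

```python
# Hypothetical helper for step 4: ping each ZeroTier device from the WARG laptop.
import subprocess

def is_online(ip: str) -> bool:
    """Return True if the host answers a single ping within 1 second."""
    result = subprocess.run(
        ["ping", "-c", "1", "-W", "1", ip],  # Linux ping flags
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

# Placeholder IPs: use the addresses listed in ZeroTier Central.
for name, ip in [("WARGRPi", "ZEROTIER_IP_OF_RPI"), ("Jetson", "ZEROTIER_IP_OF_JETSON")]:
    print(name, "online" if is_online(ip) else "offline")
```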

Images

We can collect images from our $200 CV camera using the camera module in the common repository. We can then use the socket library in Python to transfer the images to the ground station computer. We will want to collect and send the images once we have reached the end of the mission.

Sequence of Program

  1. We will communicate with the Flight Controller via UART and use DroneKit to poll for when the mission has ended (a rough sketch follows this list).

  2. When the mission has ended, we will collect images from the camera and transmit them to the ground station.
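
A rough sketch of the polling in step 1 using DroneKit. The serial device path, baud rate, and mission-end condition are assumptions for illustration, not WARG's actual code:

```python
# Sketch: poll the flight controller over UART until the mission ends.
import time
from dronekit import connect

# Connect to the Flight Controller over the Raspberry Pi's UART (path assumed).
vehicle = connect("/dev/ttyAMA0", baud=57600, wait_ready=True)

commands = vehicle.commands
commands.download()
commands.wait_ready()

# One possible end condition: the mission pointer has passed the last waypoint.
while commands.next < commands.count:
    time.sleep(1)

print("Mission complete; begin transferring images to the ground station.")
vehicle.close()
```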

Sending Images over Sockets

See the common repository, under the image_encoding and network modules: https://uwarg-docs.atlassian.net/wiki/x/RAA_hg. As an overview, the image_encoding module encodes and decodes images in JPEG format, and the network module sends data over the network through sockets.
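
For reference, the encode/decode idea looks roughly like this with OpenCV; the actual image_encoding module's API may differ:

```python
# Illustrative JPEG encode/decode, standing in for the image_encoding module.
import cv2
import numpy as np

def encode(image: np.ndarray) -> bytes:
    """Encode a BGR image into JPEG bytes."""
    ok, buffer = cv2.imencode(".jpg", image)
    assert ok, "JPEG encoding failed"
    return buffer.tobytes()

def decode(data: bytes) -> np.ndarray:
    """Decode JPEG bytes back into a BGR image."""
    array = np.frombuffer(data, dtype=np.uint8)
    return cv2.imdecode(array, cv2.IMREAD_COLOR)
```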

Communication protocol between "server" (ground station) and "client" (drone)

Note: At the socket layer, the protocol being used is not enforced, but the receive function must know how many bytes to expect. Additionally, network buffer sizes vary across devices depending on OS and hardware, so sending too many messages at the same time may fill the buffer and overwrite previous messages (the code tries to avoid this, but testing should be done per device).

Below are the protocols as adapted for Autonomy’s image sending (images are sent at 1 Hz):

Preferred: protocol using TCP sockets (connection-based):

  1. Establish connection (start server and start client)

  2. Client sends the byte length of the encoded image to the server (big-endian unsigned int, 4 bytes)

  3. Server receives the 4 bytes representing the length of the image

  4. Client sends image to server

  5. Server receives the image

  6. Repeat steps 2-5 each time an image needs to be sent

  7. Close the connection (both server and client) once we no longer want to send images

Other models can include server responses (if so, it may be worth using the HTTP protocol instead). Data integrity is tested and guaranteed for TCP sockets.
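
A sketch of steps 2-5 with plain Python sockets. The helper names are illustrative; the network module in the common repository wraps this kind of logic:

```python
# Length-prefixed image exchange over a connected TCP socket.
import socket
import struct

def send_image(sock: socket.socket, image_bytes: bytes) -> None:
    """Client side: send a 4-byte big-endian length, then the image."""
    sock.sendall(struct.pack(">I", len(image_bytes)))
    sock.sendall(image_bytes)

def recv_exactly(sock: socket.socket, count: int) -> bytes:
    """Receive exactly `count` bytes (recv may return fewer per call)."""
    data = b""
    while len(data) < count:
        chunk = sock.recv(count - len(data))
        if not chunk:
            raise ConnectionError("Socket closed mid-message")
        data += chunk
    return data

def recv_image(sock: socket.socket) -> bytes:
    """Server side: read the 4-byte length, then exactly that many bytes."""
    (length,) = struct.unpack(">I", recv_exactly(sock, 4))
    return recv_exactly(sock, length)
```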

Very Risky: protocol using UDP sockets (connectionless):

  1. Start server (stays open forever)

  2. Client sends the byte length of the encoded image to the server (big-endian unsigned int, 4 bytes)

  3. Server receives the 4 bytes representing the length of the image

  4. Client sends image to server

  5. Server receives the image

  6. Repeat steps 2-5 each time an image needs to be sent

  7. Close the server once we no longer want to send images.

Note: UDP does not guarantee that messages are successfully delivered or that packets arrive in order, and it does not support server responses. The current implementation does not support packet numbering, so there is a chance that a large image sent through UDP (i.e. as many packets) will arrive in the wrong order and therefore be unrecoverable. Data integrity is not guaranteed and not tested. Our UDP sockets are better suited for small amounts of data that need to be sent quickly.
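
For completeness, the UDP variant looks roughly like this (helper names illustrative). The failure modes in the note above apply: either datagram can be lost or reordered, and an image larger than ~64 KB cannot fit in a single datagram at all:

```python
# Length header and image sent as two separate UDP datagrams.
import socket
import struct

def send_image_udp(sock: socket.socket, image_bytes: bytes, addr: tuple) -> None:
    """Client side: the length header and the image each go out as one datagram."""
    sock.sendto(struct.pack(">I", len(image_bytes)), addr)
    sock.sendto(image_bytes, addr)

def recv_image_udp(sock: socket.socket) -> bytes:
    """Server side: read the header datagram, then the image datagram.
    Fails if either datagram is dropped or they arrive out of order."""
    header, _ = sock.recvfrom(4)
    (length,) = struct.unpack(">I", header)
    image, _ = sock.recvfrom(length)
    return image
```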