Overview
Training is run on the WARG desktop on the Windows partition. A default Ultralytics model architecture (e.g. nano) is loaded without pretrained weights. Training configurations are described here: https://docs.ultralytics.com/usage/cfg/
Repository:
https://github.com/UWARG/model-training
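For reference, the training flow in training.py boils down to roughly the following. This is a minimal sketch only: the model size, dataset config name, and argument values here are placeholders and may differ from the actual script.
import ultralytics

# Build a nano model from its architecture definition (no pretrained weights).
model = ultralytics.YOLO("yolov8n.yaml")

# Train on the dataset described by the data config.
# See the Ultralytics configuration docs linked above for the full list of options.
model.train(data="data.yaml", epochs=100, imgsz=640)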
Software
Setup
Follow the instructions: Autonomy Workflow Software
Install packages:
pip install -r requirements.txt
Run initial training to create the configuration files:
python -m training
If it reaches the dataset checking phase, press Ctrl+C to stop the program. It will probably fail before that point with a file-not-found error; either outcome is fine, since this run only needs to generate the configuration files.
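Depending on the Ultralytics version, the settings file may also be created by simply importing the package, without starting a training run:
python -c "import ultralytics"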
Open the Ultralytics configuration file:
Windows:
C:/Users/[Username]/AppData/Roaming/Ultralytics/settings.yaml
MacOS:
~/Library/Application Support/Ultralytics/settings.yaml
Linux:
~/.config/Ultralytics/settings.yaml
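If the file is hard to locate, newer Ultralytics versions can also print the current settings from the command line:
yolo settings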
Go to the paths in the first 3 lines of the file and delete the directories they point to:
Example: if the file contains
runs_dir: C:\Users\WARG\model-training\runs
then go to model-training and delete runs
Change the first 3 lines to this:
datasets_dir: C:\Users\WARG\Ultralytics\datasets
weights_dir: C:\Users\WARG\Ultralytics\weights
runs_dir: C:\Users\WARG\Ultralytics\runs
Use other directories if desired.
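Alternatively, recent Ultralytics versions allow the same values to be updated programmatically instead of editing the YAML by hand. A sketch, using the directories above:
from ultralytics import settings

# Point Ultralytics at the shared dataset, weights, and runs directories.
settings.update({
    "datasets_dir": r"C:\Users\WARG\Ultralytics\datasets",
    "weights_dir": r"C:\Users\WARG\Ultralytics\weights",
    "runs_dir": r"C:\Users\WARG\Ultralytics\runs",
})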
Usage
Move or copy the 3 directories of the dataset (test, train, val) so that they are in the dataset directory:
C:\Users\WARG\Ultralytics\datasets\[test, train, val]
Make sure that any old datasets are out of this directory or have their test, train, val directories renamed (e.g. test_landing_pad, train-old, val0)! Hiding them in a subdirectory under datasets is not sufficient (e.g. ...\datasets\landing_pad\test might still be erroneously used). A quick sanity check is sketched below.
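The following is a hypothetical helper, not part of the repository; it assumes the dataset root shown above and flags missing or leftover split directories:
import pathlib

DATASET_DIR = pathlib.Path(r"C:\Users\WARG\Ultralytics\datasets")

# The three splits must sit directly under the dataset root.
for split in ("train", "val", "test"):
    if not (DATASET_DIR / split).is_dir():
        print(f"Missing split directory: {split}")

# Nested split directories from old datasets may still be picked up, so flag them.
for path in DATASET_DIR.rglob("*"):
    if path.is_dir() and path.name in ("train", "val", "test") and path.parent != DATASET_DIR:
        print(f"Possible leftover dataset: {path}")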
Activate the environment: Autonomy Workflow Software
Navigate into the repository and run training:
python -m training
Training will take a few hours.
If training is interrupted, change the model load path in training.py so that the latest checkpoint is loaded and training resumes from it (the surrounding code in training.py may differ slightly):
MODEL_RESUME_PATH = r"C:\Users\WARG\Ultralytics\runs\[latest training number]\last.pt"
...
model = ultralytics.YOLO(MODEL_RESUME_PATH)
model.train(
    resume=True,
    ...,
)
Where [latest training number] is the number of the most recent training run directory containing the checkpoint.
Hardware
Each epoch takes approximately 5 minutes to complete on an NVIDIA GeForce RTX 2060 with 6GB VRAM: https://www.techpowerup.com/gpu-specs/geforce-rtx-2060.c3310
There is only enough VRAM for nano and small models, not larger ones.
WARG desktop details: WARG Desktop
CUDA compatibility information: CUDA and PyTorch
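To confirm that training will actually run on the GPU (i.e. that the installed PyTorch build matches the CUDA driver), a quick check:
import torch

# True plus the GPU name indicates a working CUDA + PyTorch setup.
print(torch.cuda.is_available())
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))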