Learning Loop Trainer and Detector Node for YOLOv5 (object detection and classification of images). The deep-learning part is based on https://github.com/ultralytics/yolov5. This repository implements nodes that interact with the Zauberzeug Learning Loop using the Zauberzeug Learning Loop Node Library.
This node is used to train YOLOv5 models in the Learning Loop. It is based on this image running Python 3.12. The trainer is tested with Nvidia driver version 580.95.05 and CUDA version 13.0. Older versions may not work.
We support all native hyperparameters of YOLOv5 (cf. hyp_det.yaml for reference).
In addition, we support the following hyperparameters:
- `epochs`: The number of epochs to train the model.
- `detect_nms_conf_thres`: The confidence threshold for the NMS during inference and validation (not relevant for training).
- `detect_nms_iou_thres`: The IoU threshold for the NMS during inference and validation (not used for training).
Further, we support the following hyperparameters for point detection:
- `reset_points`: Whether to reset the size of the points after data augmentation.
- `point_sizes_by_id`: A dictionary that maps point category UUIDs to the size of the points in the output (fractional size 0-1).
- `flip_label_pairs`: A list of pairs of point UUIDs that should be swapped when a horizontal flip is applied during data augmentation.
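Put together, a hyperparameter file extending hyp_det.yaml might look like the following sketch. All values and UUIDs are illustrative placeholders, not defaults from this repository:

```yaml
epochs: 100                 # number of training epochs
detect_nms_conf_thres: 0.2  # NMS confidence threshold (inference/validation only)
detect_nms_iou_thres: 0.45  # NMS IoU threshold (inference/validation only)

# point detection options
reset_points: true          # reset point sizes after data augmentation
point_sizes_by_id:          # fractional output size (0-1) per point category UUID
  00000000-0000-0000-0000-000000000001: 0.05
flip_label_pairs:           # point UUID pairs swapped on horizontal flip
  - [00000000-0000-0000-0000-000000000002, 00000000-0000-0000-0000-000000000003]
```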
Trainer Docker-Images are published on https://hub.docker.com/r/zauberzeug/yolov5-trainer
New images can be pulled with docker pull zauberzeug/yolov5-trainer:A.B.C-nlvX.Y.Z, where A.B.C is the version of the trainer node and X.Y.Z is the version of the learning loop node library.
During development, i.e. when building the container from code, it is recommended to use the script docker.sh in the folder training to build/start/interact with the image.
When using the script, a .env file must be set up in the training folder that contains the loop-related configuration. The following variables should be set (note that some are inherited from the Zauberzeug Learning Loop Node Library):
| Name | Purpose | Value | Default | Required only with ./docker.sh |
|---|---|---|---|---|
| | | CLASSIFICATION or DETECTION | - | No |
| TRAINER_NAME | Will be the name of the container | String | - | Yes |
| LINKLL | Link the node library into the container? | TRUE/FALSE | FALSE | Yes |
| UVICORN_RELOAD | Enable hot-reload | TRUE/FALSE/0/1 | FALSE | No |
| RESTART_AFTER_TRAINING | Auto-restart after training | TRUE/FALSE/0/1 | FALSE | No |
| KEEP_OLD_TRAININGS | Do not remove old trainings, when starting a new one | TRUE/FALSE/0/1 | FALSE | No |
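A .env file in the training folder might then look like this minimal sketch. All values are illustrative, and the loop-connection variable names (LOOP_HOST etc.) are assumptions based on the Zauberzeug Learning Loop Node Library; check its documentation for the exact names your version expects:

```
# Loop connection (names assumed from the Learning Loop Node Library)
LOOP_HOST=learning-loop.example.com
LOOP_USERNAME=my-user
LOOP_PASSWORD=my-password

# Node configuration (see table above)
TRAINER_NAME=yolov5-trainer-dev
LINKLL=FALSE
UVICORN_RELOAD=TRUE
RESTART_AFTER_TRAINING=FALSE
KEEP_OLD_TRAININGS=FALSE
```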
This node is used to run YOLOv5 models for object detection. The GPU detector is tested with Nvidia driver version 580.95.05 and CUDA version 13.0. Older versions may not work.
Detector Images are published on https://hub.docker.com/r/zauberzeug/yolov5-detector. There are two variants of the detector:
- to be deployed on a regular Linux computer, e.g. running Ubuntu (referred to as cloud-detectors)
- to be deployed on a Jetson Nano running Linux4Tegra (L4T)
Mandatory parameters are those described in the Zauberzeug Learning Loop Node Library. In addition, the following parameters may be set:
| Name | Purpose | Value | Default | Required only with ./docker.sh |
|---|---|---|---|---|
| LINKLL | Link the node library into the container? | TRUE or FALSE | FALSE | Yes |
| DETECTOR_NAME | Will be the name of the container | String | - | Yes |
| WEIGHT_TYPE | Data type to convert weights to | String [FP32, FP16, INT8] | FP16 | No |
| IOU_THRESHOLD | IoU threshold for NMS | Float | 0.45 | No |
| CONF_THRESHOLD | Confidence threshold for NMS | Float | 0.2 | No |
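For local development with ./docker.sh, a detector .env might look like the following sketch. The values are illustrative, and any loop-connection variables required by the Zauberzeug Learning Loop Node Library (not shown here) must be added as well:

```
DETECTOR_NAME=yolov5-detector-dev
LINKLL=FALSE
WEIGHT_TYPE=FP16
IOU_THRESHOLD=0.45
CONF_THRESHOLD=0.2
```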
Pulled images can be run with the docker.sh script by calling ./docker.sh run-image.
Local builds can be run with ./docker.sh run.
Images can be pulled with docker pull using one of the following tags, where A.B.C is the version of the detector node and X.Y.Z is the version of the learning loop node library:

- zauberzeug/yolov5-detector:A.B.C-nlvX.Y.Z-cloud for the cloud detector (GPU)
- zauberzeug/yolov5-detector:A.B.C-nlvX.Y.Z-cloud-cpu for the cloud detector (CPU)
- zauberzeug/yolov5-detector:A.B.C-nlvX.Y.Z-L.4.T for the Jetson detector, where L.4.T is the L4T version
A detection can be requested with curl:

```
curl --request POST -H 'mac: FF:FF:FF:FF:FF' -F '[email protected]' http://localhost:8004/detect
```

or with Python:

```python
import requests

headers = {'mac': '0:0:0:0', 'tags': 'some_tag'}
with open('test.jpg', 'rb') as f:
    data = [('file', f)]
    response = requests.post(
        'http://localhost:8004/detect', files=data, headers=headers)
```

The trainer uses the yolov5_pytorch format identifier (yolov5_cla_pytorch for classification).
When it saves a model to the Learning Loop, it saves the model as yolov5_pytorch and yolov5_wts (respectively yolov5_cla_pytorch and yolov5_cla_wts for classification).
The wts formats may be used by a detector running on an NVIDIA Jetson device to create an engine file as required by tensorrtx (see https://github.com/wang-xinyu/tensorrtx/tree/master/yolov5).
This code is licensed under the AGPL-3.0 License. The code in
- trainer/app_code/yolov5
- trainer/app_code/train_cla.py
- trainer/app_code/train_det.py
- trainer/app_code/pred_cla.py
- trainer/app_code/pred_det.py
- detector_cla/app_code/yolov5
is largely based on the repository https://github.com/ultralytics/yolov5, which is also published under the AGPL-3.0 License.
Original license disclaimer in https://github.com/ultralytics/yolov5:
Ultralytics offers two licensing options to accommodate diverse use cases:
- AGPL-3.0 License: This OSI-approved open-source license is ideal for students and enthusiasts, promoting open collaboration and knowledge sharing. See the LICENSE file for more details.
- Enterprise License: Designed for commercial use, this license permits seamless integration of Ultralytics software and AI models into commercial goods and services, bypassing the open-source requirements of AGPL-3.0. If your scenario involves embedding our solutions into a commercial offering, reach out through Ultralytics Licensing.