Object detection
A step-by-step tutorial that explains how to integrate a custom object detection neural network into the Supervisely platform
Introduction
This tutorial will teach you how to integrate your custom object detection model into Supervisely using the ObjectDetectionTrainDashboad class.
The full code of the object detection sample app can be found here

How to debug this tutorial
Step 1. Prepare ~/supervisely.env file with credentials. Learn more here.
Step 2. Clone repository with source code and create Virtual Environment.
Step 3. Open the repository directory in Visual Studio Code.
Step 4. Start debugging src/main.py
Integrate your model
Integrating your own NN with ObjectDetectionTrainDashboad is straightforward:
Step 1. Define a PyTorch dataset
Step 2. Define a PyTorch object detection model
Step 3. Define a subclass of ObjectDetectionTrainDashboad and implement the train method
Step 4. Configure your dashboard using parameters and run the app. That's all.
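The four steps above can be sketched as a minimal, self-contained outline. Note that the base classes below are plain-Python stand-ins: a real app would subclass torch.utils.data.Dataset for the dataset, torch.nn.Module for the model, and ObjectDetectionTrainDashboad for the dashboard, and the constructor arguments shown here are hypothetical.

```python
# Structural sketch of the four integration steps. The stand-in base class
# TrainDashboardBase is a hypothetical simplification of ObjectDetectionTrainDashboad.

class CustomDataset:  # Step 1: dataset yielding (image, target) pairs
    def __init__(self, items):
        self.items = items

    def __len__(self):
        return len(self.items)

    def __getitem__(self, idx):
        image, target = self.items[idx]
        return image, target


class CustomDetector:  # Step 2: object detection model (stand-in for nn.Module)
    def __call__(self, images):
        # Return one (boxes, labels, scores) dict per image; dummy output here.
        return [{"boxes": [], "labels": [], "scores": []} for _ in images]


class TrainDashboardBase:  # Stand-in for ObjectDetectionTrainDashboad
    def __init__(self, model, hyperparameters):
        self.model = model
        self.hyperparameters = hyperparameters

    def train(self):
        raise NotImplementedError  # subclasses must implement train()


class CustomTrainDashboard(TrainDashboardBase):  # Step 3: implement train
    def train(self):
        epochs = self.hyperparameters.get("epochs", 1)
        for epoch in range(epochs):
            pass  # forward pass, loss, backward pass, checkpointing...
        return f"trained {epochs} epochs"


# Step 4: configure the dashboard and run the app
dashboard = CustomTrainDashboard(
    model=CustomDetector(),
    hyperparameters={"epochs": 2},
)
print(dashboard.train())
```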
How to customize the dashboard?
Configuration via parameters
This section provides detailed information about the parameters used to initialize ObjectDetectionTrainDashboad and how to change them.
pretrained_weights: Dict - defines the table of pretrained model weights in the UI
hyperparameters_categories: List - list of tab names in the hyperparameters UI.
extra_hyperparams: Dict - these hyperparameters will be appended to the end of the hyperparameters list in the tab whose name is used as the parent key.
hyperparams_edit_mode: String - the way hyperparameters are defined.
show_augmentations_ui: Bool - show/hide flag for the augmentations card
Default: True
extra_augmentation_templates: List - these augmentation templates will be added to the beginning of the list for the selector in the augmentations card.
download_batch_size: int - how much data to download per batch. Increase this value to speed up downloads on big projects.
Default: 100
loggers: List - additional user loggers
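To make the parameter list concrete, here is a hypothetical configuration dict using the names described above. The table columns, weight rows, tab names, hyperparameter entries, and edit-mode value are all placeholders, not the SDK's exact schema.

```python
# Hypothetical dashboard configuration built from the parameters listed above.
# All concrete values (rows, tab names, modes) are placeholders for illustration.
dashboard_config = {
    "pretrained_weights": {
        "columns": ["Name", "Params", "mAP"],          # placeholder table schema
        "rows": [["model_v1", "12M", "0.42"]],         # placeholder weight rows
    },
    "hyperparameters_categories": ["general", "checkpoints", "optimizer"],
    "extra_hyperparams": {
        # parent key = tab name; these entries are appended to that tab
        "general": [{"key": "warmup_epochs", "type": "int", "default": 1}],
    },
    "hyperparams_edit_mode": "ui",       # placeholder for the edit-mode setting
    "show_augmentations_ui": True,       # default: True
    "extra_augmentation_templates": [],  # prepended to the template selector
    "download_batch_size": 100,          # default: 100; raise for big projects
    "loggers": [],                       # additional user loggers
}

# In the real app these would be passed as keyword arguments, e.g.:
# dashboard = CustomTrainDashboard(**dashboard_config)
print(sorted(dashboard_config))
```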
Configuration via method re-implementation
How to change all hyperparameters in the hyperparameters card?
All you need to do is re-define the hyperparameters_ui method in your subclass of ObjectDetectionTrainDashboad
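The override can be sketched as follows. DashboardBase is a hypothetical stand-in for ObjectDetectionTrainDashboad, and a plain dict stands in for the widget tree that the real method would build from Supervisely UI widgets.

```python
# Sketch of re-defining hyperparameters_ui in a subclass. The real method
# returns Supervisely widgets; a dict of tab name -> hyperparameter names
# stands in for that widget tree here.

class DashboardBase:  # hypothetical stand-in for ObjectDetectionTrainDashboad
    def hyperparameters_ui(self):
        # default layout built by the parent class (placeholder content)
        return {"general": ["epochs", "batch_size"]}


class CustomTrainDashboard(DashboardBase):
    def hyperparameters_ui(self):
        # Completely replace the default hyperparameters layout.
        return {
            "general": ["epochs", "batch_size", "warmup_epochs"],
            "optimizer": ["lr", "momentum"],
        }


ui = CustomTrainDashboard().hyperparameters_ui()
print(sorted(ui))
```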
Additional notes
The environment variable SLY_APP_DATA_DIR in src.globals is used to provide access to app files after the app has finished. If something goes wrong in your training process at any moment, you won't lose checkpoints and other important artifacts. They will be available over SFTP.
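A minimal sketch of using this variable when saving checkpoints, assuming a local fallback path for debugging (the fallback, subdirectory name, and checkpoint filename are placeholders):

```python
import os
import tempfile
from pathlib import Path

# Checkpoints written under SLY_APP_DATA_DIR survive the app session and stay
# reachable over SFTP. The fallback path is only for local debugging.
fallback = Path(tempfile.gettempdir()) / "app_data"
data_dir = Path(os.environ.get("SLY_APP_DATA_DIR", fallback))
checkpoints_dir = data_dir / "checkpoints"
checkpoints_dir.mkdir(parents=True, exist_ok=True)

checkpoint_path = checkpoints_dir / "epoch_001.pt"
checkpoint_path.write_bytes(b"")  # a real app would save model state here
print(checkpoint_path.exists())
```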
By default, the object detection training template app uses the following directory structure from src/sly_globals:
remote_data_dir = f"/train_dashboard/{project.name}/runs/{time.strftime('%Y-%m-%d %H:%M:%S')}" - the destination dir in Team files for all training artifacts.
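The destination path can be reproduced with the standard library alone; the project name below is a placeholder, while in the app it comes from the Supervisely project object:

```python
import time

# How the destination directory in Team files is built (as in src/sly_globals).
# "my_project" is a placeholder for project.name.
project_name = "my_project"
remote_data_dir = f"/train_dashboard/{project_name}/runs/{time.strftime('%Y-%m-%d %H:%M:%S')}"
print(remote_data_dir)  # e.g. /train_dashboard/my_project/runs/2024-01-31 12:00:00
```

Because the run directory is timestamped, each training run gets its own folder and earlier artifacts are never overwritten.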