objectidentification.resources.matching package

Submodules

objectidentification.resources.matching.SiameseNetworkAPI module

class dronebuddylib.atoms.objectidentification.resources.matching.SiameseNetworkAPI.SiameseNetworkAPI(obj_tensor=None)[source]

Bases: object

get_detected_objects(img)[source]
get_embeddings(img)[source]
inference(img)[source]
inference_1(img)[source]
inference_2(img)[source]
two_image_inference(img_1, img_2)[source]
two_image_inference_difference(img_1, img_2)[source]
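The two-image entry points follow the usual Siamese pattern: embed each image with get_embeddings and compare the two embeddings with a distance score. The actual metric used by two_image_inference is not documented here; the helper below is a hypothetical, library-free sketch of that comparison step (the function name and the Euclidean metric are assumptions, not part of SiameseNetworkAPI):

```python
import math

def embedding_distance(emb_1, emb_2):
    """Euclidean distance between two embedding vectors.

    A stand-in for the comparison that two_image_inference presumably
    performs on get_embeddings(img_1) and get_embeddings(img_2); a smaller
    distance means the two crops are more likely the same object.
    """
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(emb_1, emb_2)))

# Two mock 4-dimensional embeddings: near-identical vs. orthogonal.
same_object = embedding_distance([1.0, 0.0, 0.0, 0.0], [0.9, 0.1, 0.0, 0.0])
diff_object = embedding_distance([1.0, 0.0, 0.0, 0.0], [0.0, 1.0, 0.0, 0.0])
assert same_object < diff_object
```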

objectidentification.resources.matching.dataset module

class dronebuddylib.atoms.objectidentification.resources.matching.dataset.SiameseDataset(images_folder_path, transform=None, num_samples=10, **kwargs)[source]

Bases: Dataset

create_dataset()[source]
get_sample()[source]
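A Siamese dataset typically yields image pairs with a same/different label: positive pairs drawn from one object's image folder, negative pairs from two different folders. How create_dataset builds its pairs from images_folder_path is not documented; the sketch below shows the generic pairing logic under that assumption (the folder layout and the 1 = same / 0 = different label convention are guesses, not the library's documented behaviour):

```python
import random

def make_pairs(images_by_object, num_samples, seed=0):
    """Build (path_a, path_b, label) tuples: label 1 = same object, 0 = different."""
    rng = random.Random(seed)
    names = list(images_by_object)
    pairs = []
    for _ in range(num_samples):
        if rng.random() < 0.5:
            # Positive pair: two distinct images of one object.
            name = rng.choice(names)
            a, b = rng.sample(images_by_object[name], 2)
            pairs.append((a, b, 1))
        else:
            # Negative pair: one image from each of two different objects.
            n1, n2 = rng.sample(names, 2)
            pairs.append((rng.choice(images_by_object[n1]),
                          rng.choice(images_by_object[n2]), 0))
    return pairs

data = {"mug": ["mug_0.jpg", "mug_1.jpg"], "bottle": ["bot_0.jpg", "bot_1.jpg"]}
pairs = make_pairs(data, num_samples=10)
```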

objectidentification.resources.matching.inferenceDataset module

class dronebuddylib.atoms.objectidentification.resources.matching.inferenceDataset.InferenceDataset(all_img_of_obj, crop_img_of_obj, transform=None)[source]

Bases: Dataset

dronebuddylib.atoms.objectidentification.resources.matching.inferenceDataset.load_images_from_folder(folder_path, transform=None)[source]

objectidentification.resources.matching.main module

objectidentification.resources.matching.model module

class dronebuddylib.atoms.objectidentification.resources.matching.model.SiameseModel(base_model, base_model_weights)[source]

Bases: Module

forward(img1, img2)[source]

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for the forward pass must be defined within this function, call the Module instance itself rather than forward() directly: the instance call runs the registered hooks, while a direct forward() call silently ignores them.

forward_difference(img1, img2)[source]
forward_difference_tsne(img1, img2)[source]
get_embedding(x)[source]
training: bool
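forward embeds both inputs with the shared base_model, and forward_difference evidently compares the two embeddings, with forward_difference_tsne likely exposing that comparison for t-SNE visualisation. A toy, plain-Python sketch of the pattern (the absolute element-wise difference is one common Siamese head, assumed here rather than read from the source):

```python
def get_embedding(x, weights):
    """Toy linear 'backbone': dot each weight row with the input vector."""
    return [sum(w_i * x_i for w_i, x_i in zip(row, x)) for row in weights]

def forward_difference(x1, x2, weights):
    """Absolute element-wise difference of the two branch embeddings --
    one common way a Siamese head compares its inputs."""
    e1 = get_embedding(x1, weights)
    e2 = get_embedding(x2, weights)
    return [abs(a - b) for a, b in zip(e1, e2)]

W = [[1.0, 0.0], [0.0, 1.0]]  # 2x2 identity "backbone" weights
forward_difference([1.0, 2.0], [3.0, 1.0], W)  # -> [2.0, 1.0]
```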

objectidentification.resources.matching.tune_api module

dronebuddylib.atoms.objectidentification.resources.matching.tune_api.dataloader(full_dataset, args, output_folder_path)[source]
dronebuddylib.atoms.objectidentification.resources.matching.tune_api.loadModel()[source]
dronebuddylib.atoms.objectidentification.resources.matching.tune_api.plt_metric(history, metric, title, has_valid=True)[source]

Plots the given ‘metric’ from ‘history’.

Parameters:
  • history – The history attribute of the History object returned by Model.fit.

  • metric – Metric to plot, a string value present as key in ‘history’.

  • title – A string to be used as title of plot.

  • has_valid – Boolean; true if validation data was passed to Model.fit, else false.

Returns:

None.
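plt_metric evidently reads the training curve stored under metric out of the history dict and, when has_valid is true, the matching validation curve as well. That key-selection step can be sketched without matplotlib (the 'val_' + metric key naming follows the Keras History convention implied by the docstring, which is an assumption):

```python
def metric_curves(history, metric, has_valid=True):
    """Return the curves plt_metric would draw: the training curve plus,
    when has_valid is true, the validation curve under 'val_<metric>'."""
    curves = {metric: history[metric]}
    if has_valid:
        curves["val_" + metric] = history["val_" + metric]
    return curves

hist = {"loss": [0.9, 0.5, 0.3], "val_loss": [1.0, 0.6, 0.4]}
metric_curves(hist, "loss")
```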

dronebuddylib.atoms.objectidentification.resources.matching.tune_api.predict_with_reference_images(model, image_path, reference_embeddings, device)[source]
dronebuddylib.atoms.objectidentification.resources.matching.tune_api.train(model, criterion, optimizer, trainloader, valloader, args, device, output_folder_path, lr_scheduler=None)[source]
dronebuddylib.atoms.objectidentification.resources.matching.tune_api.tune(feature_extractor_model='efficientnetv2', num_samples=100, emb_size=20, epochs=10, lr=1e-05, batch_size=4, train_val_split=0.8, num_workers=1, seed=0, output_folder_name=None, lr_scheduler=False, pretrained_weights=None)[source]
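tune's train_val_split=0.8 and seed=0 defaults imply a seeded 80/20 split of the num_samples pairs before dataloader hands train and validation loaders to train. A stdlib sketch of that split (the shuffle-then-slice strategy is an assumption about what dataloader does internally):

```python
import random

def split_indices(num_samples, train_val_split=0.8, seed=0):
    """Shuffle sample indices reproducibly, then slice into train/val sets."""
    idx = list(range(num_samples))
    random.Random(seed).shuffle(idx)
    cut = int(train_val_split * num_samples)
    return idx[:cut], idx[cut:]

train_idx, val_idx = split_indices(100)  # 80 training / 20 validation indices
```

Because the shuffle is driven by the seed argument alone, the same seed always reproduces the same split, which is what makes tune's seed=0 default meaningful for repeatable experiments.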

Module contents