
Sifat-Ahmed/SiameseNN


This repository contains the source code of a Siamese Neural Network used for object identification.

1. Folder Structure

+ dataReader
    ---- dataset_reader.py
+ json_helper
    ---- json_creator.py
    ---- json_info.py
    ---- json_parser.py
+ loss
    ---- loss_func.py
+ models
    ---- resnet.py
    ---- siamese.py
    ---- siamese2.py
    ---- Siamese_EfficientNet.py
---- config.py
---- evaluator.py
---- helper.py
---- json_creator_main.py
---- main.py

2. How to Run

2.1. Dataset Creation

  • Dataset Directory: /data/mnist/

  • command: python json_creator_main.py

  • parameters:

    • [--train-json] : 'path to train dataset folder'

    • [--train-output] : 'output filename/directory without .json'

    • [--num-train-classes] : 'number of training classes to take'

    • [--val-json] : 'path to val dataset folder'

    • [--val-output] : 'output filename/directory without .json'

    • [--num-val-classes] : 'number of validation classes to take'

    • [--test-json] : 'path to test dataset folder'

    • [--test-output] : 'output filename/directory without .json'

    • [--num-test-classes] : 'number of test classes to take'

    Example: python json_creator_main.py --train-json 'mnist' --train-output 'training_dataset' --num-train-classes 30

    This will create training_dataset.json in the default directory.
    Validation and test datasets are created in the same way, and all three splits can be created in a single run, as shown below.
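
    For example (the flag values below are illustrative):

        python json_creator_main.py --train-json 'mnist' --train-output 'training_dataset' --num-train-classes 30 \
                                    --val-json 'mnist' --val-output 'validation_dataset' --num-val-classes 10 \
                                    --test-json 'mnist' --test-output 'test_dataset' --num-test-classes 10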

    By default, only classes that contain more than 500 and fewer than 1000 images are picked.
    To change this behaviour, edit the if condition at line 50 of json_helper/json_creator.py;
    this currently has to be done manually, as there is no command-line option for it (TODO).
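
    The default filter corresponds roughly to the check below (a minimal sketch; the actual
    variable and function names in json_creator.py may differ):

        # A minimal sketch of the default class-size filter; the real code in
        # json_helper/json_creator.py (around line 50) may use different names.
        def keep_class(image_paths, min_images=500, max_images=1000):
            """Keep a class only if it has more than min_images and fewer than max_images."""
            return min_images < len(image_paths) < max_images

        # images_per_class maps a class name to its list of image paths.
        images_per_class = {"class_a": ["a1.png"] * 700, "class_b": ["b1.png"] * 120}
        selected = {name: paths for name, paths in images_per_class.items() if keep_class(paths)}
        print(list(selected))  # prints ['class_a']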

2.2. Model Training

Here is a list of available models for training:

  1. Siamese [Uses only convolution layers, no fully connected layers]
  2. SiameseNetwork [Uses convolution layers with a fully connected layer]
  3. SiameseEfficientNet [Uses EfficientNet-B0 as a feature extractor followed by a fully connected layer]
  4. ResNet50 [Uses ResNet-50 as a feature extractor followed by a fully connected layer]
  5. ResNet101 [Uses ResNet-101 as a feature extractor followed by a fully connected layer]
  6. ResNet152 [Uses ResNet-152 as a feature extractor followed by a fully connected layer]
  • Model class definitions are in the models folder; please refer to it for any required changes.
  • All models with fully connected layers have 5 output neurons, which gave the best results; output sizes of 1, 2, 8, 16 and 32 were also tried.
  • By default, main.py trains all of the models. To change this behaviour, edit main.py at line 146 (see the sketch below); no command-line argument has been added for this yet (TODO).
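
A minimal sketch of what the model selection around line 146 of main.py could look like (the actual structure, constructor arguments, and training entry point in the repository may differ):

    # main.py, around line 146 -- illustrative sketch only.
    # By default every model is trained; keeping a subset of the entries
    # below limits training to the chosen models.
    models_to_train = {
        "Siamese": Siamese(),
        "SiameseNetwork": SiameseNetwork(),
        "SiameseEfficientNet": SiameseEfficientNet(),
        "ResNet50": ResNet50(),
        "ResNet101": ResNet101(),
        "ResNet152": ResNet152(),
    }

    # e.g. train only the EfficientNet-based model:
    models_to_train = {"SiameseEfficientNet": SiameseEfficientNet()}

    for name, model in models_to_train.items():
        train_model(model, name)   # hypothetical training entry point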

2.2.1. How to Run

config.py contains all the configuration parameters:

  1. image width
  2. image height
  3. learning rate
  4. epochs
  5. criterion
  6. train batch size
  7. validation batch size
  8. test batch size
  9. number of workers
  10. transform function
  11. optimizer
  12. learning rate scheduler
  • First activate the virtual environment, if it is not already active

  • command: python main.py

  • parameters:

    • [--train-json] : 'Path to the training JSON file' (Required)
    • [--val-json] : 'Path to the validation JSON file'
  • If no validation JSON is given, a validation set is split off from the training dataset

  • Example: python main.py --train-json 'dataset_train.json'

To change any parameter, please refer to config.py
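
For reference, the values listed above could be laid out roughly as follows (a minimal sketch with illustrative names and values; the actual contents of config.py may differ):

    # config.py -- illustrative sketch only; attribute names and values are assumptions.
    import torch
    import torch.nn as nn
    import torchvision.transforms as T

    IMAGE_WIDTH = 105
    IMAGE_HEIGHT = 105
    LEARNING_RATE = 1e-3
    EPOCHS = 50
    TRAIN_BATCH_SIZE = 32
    VAL_BATCH_SIZE = 32
    TEST_BATCH_SIZE = 32
    NUM_WORKERS = 4

    TRANSFORM = T.Compose([
        T.Resize((IMAGE_HEIGHT, IMAGE_WIDTH)),
        T.ToTensor(),
    ])

    CRITERION = nn.BCEWithLogitsLoss()   # loss/loss_func.py may define a custom criterion instead

    def make_optimizer(model):
        return torch.optim.Adam(model.parameters(), lr=LEARNING_RATE)

    def make_scheduler(optimizer):
        return torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.1)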

By default, all of the models listed above are trained for 50 epochs using the parameters defined in config.py. Each model is saved under its own name whenever it reaches a new best validation loss, and the loss curves are saved under the model names as well.
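
The save-on-best-validation-loss behaviour amounts to a loop along these lines (a sketch; the helper functions and surrounding objects shown here are hypothetical and main.py may be organised differently):

    # Sketch of the checkpointing described above; train_one_epoch / validate are
    # hypothetical helpers, and model, loaders, etc. come from the surrounding script.
    import torch

    best_val_loss = float("inf")
    for epoch in range(config.EPOCHS):
        train_one_epoch(model, train_loader, optimizer, criterion)
        val_loss = validate(model, val_loader, criterion)
        if val_loss < best_val_loss:
            best_val_loss = val_loss
            torch.save(model.state_dict(), f"{model_name}.pth")   # saved under the model's name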

2.3. Testing

  • command: python test.py
  • parameters:
    • [--test-json] : 'Path to the test JSON file' (Required)
    • [--model-name] : 'Name of the model to evaluate' (Required)
  • Output:
    1. Test loss
    2. Classification report
    3. ROC curves
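
The classification report and ROC curves can be produced from the pairwise predictions with scikit-learn and matplotlib, roughly as follows (a sketch; test.py may implement this differently):

    # Sketch of producing the test outputs; y_true / y_score stand in for the
    # labels and similarity scores of the test pairs.
    import numpy as np
    import matplotlib.pyplot as plt
    from sklearn.metrics import classification_report, roc_curve, auc

    y_true = np.array([1, 0, 1, 1, 0, 0])                # 1 = same class, 0 = different class
    y_score = np.array([0.9, 0.2, 0.7, 0.4, 0.1, 0.6])   # model scores for each pair

    y_pred = (y_score > 0.5).astype(int)
    print(classification_report(y_true, y_pred))

    fpr, tpr, _ = roc_curve(y_true, y_score)
    plt.plot(fpr, tpr, label=f"AUC = {auc(fpr, tpr):.3f}")
    plt.xlabel("False positive rate")
    plt.ylabel("True positive rate")
    plt.legend()
    plt.savefig("roc_curve.png")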
