Code and models from the paper "Learning from Synthetic InSAR with Vision Transformers: The case of volcanic unrest detection", IEEE Transactions on Geoscience and Remote Sensing, 2022.
If you use the code or models in this repo, please cite our paper:
@ARTICLE{9791383,
  author={Bountos, Nikolaos Ioannis and Michail, Dimitrios and Papoutsis, Ioannis},
  journal={IEEE Transactions on Geoscience and Remote Sensing},
  title={Learning from Synthetic InSAR with Vision Transformers: The case of volcanic unrest detection},
  year={2022},
  volume={},
  number={},
  pages={1-1},
  doi={10.1109/TGRS.2022.3180891}}
You can download the pretrained models here.
Directory structure:
- Model zoo root
  - architecture
    - architecture checkpoint
  - architecture
    - ...
The available models cover the architectures evaluated in the paper (e.g. swin, deit, convit).
Model usage example:
import torch

model = torch.load('swin.pt')
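Continuing from the loading example above, a minimal inference sketch is shown below. It assumes the serialized object is the full model, a 224x224 3-channel input, and a binary output whose class indices match the 0/1 folders used elsewhere in this repo; the dummy input stands in for whatever preprocessing the training pipeline actually uses.

# Switch to inference mode.
model.eval()

# Dummy interferogram batch: 1 sample, 3 channels, 224x224 pixels (assumed input size).
x = torch.randn(1, 3, 224, 224)

with torch.no_grad():
    logits = model(x)
    pred = logits.argmax(dim=1)  # class index; 0 and 1 as in the dataset folders

print(pred.item())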
You can train a new model by executing main.py with the appropriate arguments. The encoder is automatically initialized with ImageNet-pretrained weights (a brief sketch of this initialization follows the command below). Example usage for a model based on the Swin Transformer:
python main.py --encoder=swin --synthetic_train_dataset=TRAIN_PATH --synthetic_val_dataset=VALIDATION_PATH --test_dataset=TEST_PATH --batch_size=40
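For reference, ImageNet initialization of such encoders is commonly done with timm, along the lines of the sketch below; the exact variant name and the use of timm are assumptions for illustration, not necessarily what main.py does internally.

import timm

# Hypothetical illustration: a Swin encoder with ImageNet-pretrained weights
# and a 2-class head (unrest / no unrest).
encoder = timm.create_model('swin_base_patch4_window7_224', pretrained=True, num_classes=2)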
Starting from the models pretrained on the synthetic dataset, you can proceed with the pseudo-label training process by running the pseudo_training_utils.py script.
Example:
python pseudo_training_utils.py --unlabeled_path=PATH_OF_UNLABELED_DATASET --target_path=PATH_TO_STORE_PSEUDOLABELED_SAMPLES --model_root_path=PATH_OF_DOWNLOADED_MODELS --arch=ARCHITECTURE (e.g. swin, deit, convit) --test_path=REAL_TEST_PATH --synthetic_validation_path=SYNTHETIC_VAL_PATH
Make sure the target path exists and has the following structure (a sketch for creating it follows the list):
- Pseudo_Directory (name is irrelevant)
  - 0
  - 1
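A minimal sketch for creating this layout and filling it with confidence-thresholded pseudo-labels is given below; the threshold value, the unlabeled_samples iterable, and the preprocessing are illustrative assumptions, not the exact logic of pseudo_training_utils.py.

import os
import shutil
import torch

target_path = 'PATH_TO_STORE_PSEUDOLABELED_SAMPLES'

# Create the expected class subdirectories; the names must be exactly '0' and '1'.
for cls in ('0', '1'):
    os.makedirs(os.path.join(target_path, cls), exist_ok=True)

# Hypothetical pseudo-labeling loop: keep only high-confidence predictions.
model = torch.load('swin.pt', map_location='cpu')
model.eval()

threshold = 0.9  # illustrative confidence cutoff
for path, x in unlabeled_samples:  # assumed iterable of (file path, preprocessed tensor)
    with torch.no_grad():
        probs = torch.softmax(model(x.unsqueeze(0)), dim=1).squeeze(0)
    conf, label = probs.max(dim=0)
    if conf.item() >= threshold:
        shutil.copy(path, os.path.join(target_path, str(label.item())))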
The following list contains the synthetic data used in this work:
The test set C1, as published by [1], can be found here.
The unlabeled dataset used for domain adaptation can be found here.
[1] Bountos, Nikolaos Ioannis, et al. "Self-supervised contrastive learning for volcanic unrest detection." IEEE Geoscience and Remote Sensing Letters 19 (2021): 1-5.