Experimental results and additional information related to the paper “STACKED AUTOENCODERS FOR MULTICLASS CHANGE DETECTION IN HYPERSPECTRAL IMAGES”, accepted at the International Geoscience and Remote Sensing Symposium (IGARSS 2018).
Change detection (CD) in multitemporal datasets is a key task in remote sensing. In this paper, a scheme to perform multiclass CD for remote sensing hyperspectral datasets, extracting features by means of Stacked Autoencoders (SAEs), is introduced. The scheme combines multiclass and binary CD to obtain an accurate multiclass change map. The multiclass CD begins with the fusion of the multitemporal data, followed by feature extraction by SAE. The binary CD is based on spectral information, computed through pixel-wise distances and thresholding, and also incorporates spatial information through watershed segmentation. The output of the multiclass CD is filtered using the binary CD map and later classified by a Support Vector Machine (SVM) or an Extreme Learning Machine (ELM) algorithm. The scheme was evaluated on a multitemporal hyperspectral dataset obtained from the Hyperion sensor. Experimental results show the effectiveness of the proposed scheme using SAE to extract the relevant features from the fused information when compared to other published feature extraction methods.
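For reference, a minimal sketch of the binary CD step described above, assuming a Euclidean pixel-wise spectral distance and Otsu's threshold (assumptions for illustration; the spatial refinement by watershed segmentation is omitted):

```python
import numpy as np
from skimage.filters import threshold_otsu

def binary_change_map(img_t1, img_t2):
    """Pixel-wise spectral distance between two co-registered hyperspectral
    cubes (H x W x B), followed by automatic thresholding.

    Euclidean distance and Otsu's threshold are assumptions used here for
    illustration; the watershed-based spatial refinement is not included.
    """
    diff = img_t1.astype(np.float64) - img_t2.astype(np.float64)
    distance = np.sqrt((diff ** 2).sum(axis=-1))  # H x W magnitude image
    threshold = threshold_otsu(distance)          # data-driven threshold
    return distance > threshold                   # True = changed pixel
```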
All the images are available in Matlab (.mat) format, among other formats. For further information, see the readme included with the files.
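A quick way to inspect the .mat files from Python (the file name and variable key below are hypothetical; the actual names are given in the readme):

```python
from scipy.io import loadmat

# 'hyperion_t1.mat' and the key 'img' are placeholders; check the readme
# shipped with the data for the real file names and array keys.
data = loadmat('hyperion_t1.mat')
print(data.keys())        # list the arrays stored in the file
img_t1 = data['img']      # e.g. an H x W x B hyperspectral cube
```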
* The code was run on Ubuntu 14.04.
* The Caffe framework (1.0.0-rc3) was used to perform the feature extraction by means of SAE.
* NWFE and PCA were used for comparison purposes, retaining 12 features.
* ELM and SVM were trained with 5% of the reference data available for each class (a minimal PCA + SVM baseline sketch follows this list).
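As referenced in the last item, a minimal sketch of the PCA + SVM baseline under these settings (12 retained features, 5% stratified training samples, RBF kernel); the SAE branch trained with Caffe, the NWFE extraction, and the ELM classifier are not reproduced here:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def pca_svm_baseline(X, y, n_features=12, train_fraction=0.05,
                     C=64.0, gamma=32.0):
    """Sketch of the PCA + SVM baseline.

    X: (n_pixels, n_bands) stacked multitemporal spectra of the labelled
    pixels; y: multiclass change labels. The parameter values follow the
    settings and tables in this page; everything else is an assumption
    made for illustration.
    """
    X_red = PCA(n_components=n_features).fit_transform(X)
    X_tr, X_te, y_tr, y_te = train_test_split(
        X_red, y, train_size=train_fraction, stratify=y, random_state=0)
    clf = SVC(C=C, gamma=gamma, kernel='rbf').fit(X_tr, y_tr)
    return clf.score(X_te, y_te)  # overall accuracy on the held-out pixels
```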
| Correct | Missed Alarms | False Alarms | Total Error |
|---------|---------------|--------------|-------------|
| 77020 (98.74%) | 509 | 471 | 980 (1.25%) |
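The figures above relate a predicted binary change map to the change reference: missed alarms are changed pixels labelled as unchanged, false alarms are unchanged pixels labelled as changed. A minimal sketch of the tally, assuming boolean maps restricted to the reference pixels:

```python
import numpy as np

def binary_cd_errors(predicted, reference):
    """Tally the binary CD figures reported above.

    predicted, reference: boolean arrays of the same shape
    (True = changed pixel).
    """
    correct = int(np.sum(predicted == reference))
    missed = int(np.sum(reference & ~predicted))        # changes not detected
    false_alarms = int(np.sum(~reference & predicted))  # spurious detections
    total_error = missed + false_alarms
    return correct, missed, false_alarms, total_error
```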
| Classifier | Parameters | FE | OA (%) | AA (%) | Kappa (×100) |
|------------|------------|----|--------|--------|--------------|
| ELM | N=120 | PCA | 91.73 | 76.06 | 86.83 |
| ELM | N=120 | NWFE | 91.76 | 76.75 | 86.83 |
| ELM | N=60 | SAE | 95.19 | 90.45 | 92.31 |
| SVM | C: 64.0, γ: 32.0 | PCA | 91.46 | 71.16 | 86.46 |
| SVM | C: 32.0, γ: 16.0 | NWFE | 91.29 | 90.61 | 86.05 |
| SVM | C: 32.0, γ: 0.0625 | SAE | 95.52 | 92.56 | 92.90 |
C: penalty (regularization) parameter used in the training of the SVM. γ: parameter of the Gaussian (RBF) kernel of the SVM. N: number of neurons in the hidden layer of the ELM. FE: feature extraction method.
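OA, AA and kappa in the table can be recomputed from the predicted and reference labels of the test pixels; a minimal sketch using scikit-learn (the table above reports kappa scaled by 100):

```python
import numpy as np
from sklearn.metrics import confusion_matrix, cohen_kappa_score

def multiclass_cd_metrics(y_true, y_pred):
    """Overall accuracy (OA), average accuracy (AA) and Cohen's kappa."""
    cm = confusion_matrix(y_true, y_pred)        # rows: true classes
    oa = np.trace(cm) / cm.sum()                 # fraction of correct pixels
    aa = np.mean(np.diag(cm) / cm.sum(axis=1))   # mean per-class accuracy
    kappa = cohen_kappa_score(y_true, y_pred)
    return 100.0 * oa, 100.0 * aa, 100.0 * kappa
```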
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.