Displaying all 2 publications

  1. Zhang G, Roslan SNAB, Wang C, Quan L
    Sci Rep, 2023 Sep 28;13(1):16275.
    PMID: 37770628 DOI: 10.1038/s41598-023-43317-1
    In recent years, remote sensing images of various types have found widespread applications in resource exploration, environmental protection, and land cover classification. However, relying solely on a single optical or synthetic aperture radar (SAR) image as the data source for land cover classification may not suffice to achieve the desired accuracy in ground information monitoring. One widely employed neural network for remote sensing land cover classification is the U-Net, a classical semantic segmentation network. Nonetheless, the U-Net has limitations such as poor classification accuracy, misclassification and omission of small-area terrains, and a large number of network parameters. To address these challenges, this paper proposes an improved approach that combines optical and SAR images by band stacking for land cover classification and enhances the U-Net network. The approach incorporates several modifications to the network architecture. First, the encoder-decoder framework serves as the backbone terrain-extraction network. Additionally, a convolutional block attention mechanism is introduced in the terrain extraction stage. Instead of pooling layers, convolutions with a stride of 2 are used, and the Leaky ReLU function is employed as the network's activation function. This design offers several advantages: it enhances the network's ability to capture terrain characteristics along both spatial and channel dimensions, mitigates the loss of feature-map information while reducing network parameters, and ensures non-zero gradients during training. The effectiveness of the proposed method is evaluated through land cover classification experiments conducted on optical, SAR, and combined optical-SAR datasets. The results demonstrate that the method achieves classification accuracies of 0.8905, 0.8609, and 0.908 on the three datasets, respectively, with corresponding mIoU values of 0.8104, 0.7804, and 0.8667. Compared with the traditional U-Net, the method improves both classification accuracy and mIoU.
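The architectural changes this abstract describes (channel-and-spatial attention, Leaky ReLU in place of ReLU, strided convolutions instead of pooling) can be illustrated with a minimal NumPy sketch. All function names and weights below are illustrative assumptions, not the paper's implementation: the learned MLP and 7x7 convolution of a CBAM-style block are replaced by random placeholder weights and a simple average, and the strided-convolution downsampling is only noted, not shown.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def leaky_relu(x, alpha=0.01):
    # Non-zero gradient for negative inputs, as the abstract motivates.
    return np.where(x > 0, x, alpha * x)

def channel_attention(x, reduction=4, rng=np.random.default_rng(0)):
    # x: (C, H, W). Shared two-layer MLP over global average- and max-pooled
    # channel descriptors (the channel half of a CBAM-style block).
    c = x.shape[0]
    w1 = rng.standard_normal((c // reduction, c)) * 0.1  # placeholder weights
    w2 = rng.standard_normal((c, c // reduction)) * 0.1
    avg, mx = x.mean(axis=(1, 2)), x.max(axis=(1, 2))
    att = sigmoid(w2 @ np.maximum(w1 @ avg, 0) + w2 @ np.maximum(w1 @ mx, 0))
    return x * att[:, None, None]

def spatial_attention(x):
    # Channel-wise mean and max maps; a learned 7x7 convolution would normally
    # fuse them, replaced here by a plain average for brevity.
    att = sigmoid((x.mean(axis=0) + x.max(axis=0)) / 2.0)
    return x * att[None, :, :]

def attention_block(x):
    # Attend along channels first, then along space.
    return spatial_attention(channel_attention(x))

feat = leaky_relu(np.random.default_rng(1).standard_normal((8, 16, 16)))
out = attention_block(feat)
print(out.shape)  # (8, 16, 16): attention reweights but preserves the shape
```

In the full network, a stride-2 convolution would replace each pooling layer between such blocks, halving the spatial resolution with learned weights rather than a fixed max or average.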
  2. Zhang G, Roslan SNAB, Shafri HZM, Zhao Y, Wang C, Quan L
    Sci Rep, 2024 Jul 13;14(1):16212.
    PMID: 39003342 DOI: 10.1038/s41598-024-67109-3
    Obtaining seasonable and precise crop yield information at fine resolution is very important for ensuring food security. However, the quantity and quality of available images and the selection of prediction variables often limit the performance of yield prediction. In our study, synthesized Landsat and MODIS images were used to provide remote sensing (RS) variables, which fill the missing values of Landsat images well and cover the study area completely. Deep learning (DL) was used to combine different vegetation indices (VIs) with climate data to build a wheat yield prediction model for Hebei Province (HB). The results showed that the kernel NDVI (kNDVI) and the near-infrared reflectance of vegetation (NIRv) slightly outperform the normalized difference vegetation index (NDVI) in yield prediction. The regression algorithm had a more prominent effect on yield prediction: the model using Long Short-Term Memory (LSTM) outperformed the model using Light Gradient Boosting Machine (LGBM). The model combining the LSTM algorithm and NIRv had the best prediction effect and relatively stable performance in individual years. The optimal model was then used to generate 30 m resolution wheat yield maps for the past 20 years with high overall accuracy. In addition, the optimum prediction time can be set in April, which balances performance and lead time. In general, we expect that this prediction model can provide important information for understanding and ensuring food security.
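The three vegetation indices this abstract compares have standard closed-form definitions from red and near-infrared (NIR) surface reflectance. The sketch below uses the common kNDVI simplification sigma = 0.5 * (NIR + Red), under which kNDVI reduces to tanh(NDVI^2); the reflectance values are illustrative, not taken from the paper.

```python
import numpy as np

def ndvi(nir, red):
    # Normalized difference vegetation index.
    return (nir - red) / (nir + red)

def kndvi(nir, red):
    # Kernel NDVI with the common choice sigma = 0.5 * (nir + red),
    # which reduces the RBF-kernel form to tanh(NDVI ** 2).
    return np.tanh(ndvi(nir, red) ** 2)

def nirv(nir, red):
    # Near-infrared reflectance of vegetation: NDVI scaled by NIR reflectance.
    return ndvi(nir, red) * nir

# Illustrative reflectances for a healthy-vegetation pixel.
nir, red = 0.45, 0.09
print(round(ndvi(nir, red), 3), round(kndvi(nir, red), 3), round(nirv(nir, red), 3))
```

Time series of such per-pixel indices, alongside climate variables, would form the input sequences for an LSTM-style regression model as described above.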