
Publications

NIBIO's staff publish several hundred scientific articles and research reports every year. Here you will find references and links to publications and other research and dissemination activities. The collection is updated continuously with both new and historical material. For more information about NIBIO's publications, visit NIBIO's library.

2022

To document

Abstract

Parts of the limited agricultural land area in Norway are taken up by buildings, roads, and other permanent changes every year. A method that detects such changes immediately after they take place is required in order to monitor the agricultural areas closely. To that end, Sentinel-2 satellite image time series (SITS) acquired during the summer of 2019 were used to detect agricultural areas taken up by permanent changes such as buildings and roads. A deep-learning algorithm using a 1D convolutional neural network (CNN), with the convolution applied along the temporal dimension, was applied to the SITS data. The training data were collected from the building footprint dataset, filtered using a mono-temporal image aided by the land resource map (AR5). The deep-learning model was trained and evaluated before being used for prediction in two regions of Norway, and procedures to reduce overfitting of the model to the training data were also implemented. The trained model showed a high level of accuracy and robustness when evaluated on a test dataset held out from the training process. The trained model was then used to predict new built-up areas in agricultural fields in two Sentinel-2 tiles. The prediction was able to detect areas taken up by new buildings, roads, parking areas and other similar changes. The prediction was then evaluated against the existing building footprints after a few post-processing procedures. A high percentage of the buildings were detected by the method, with the exception of small buildings. The details of the methods and the results obtained, together with a brief discussion, are presented in this paper.
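
To illustrate the kind of model the abstract describes, the sketch below shows a minimal 1D temporal CNN for per-pixel Sentinel-2 time-series classification in PyTorch. The band count, time-series length, layer widths, and dropout rate are illustrative assumptions, not the architecture used in the paper; only the general idea of convolving along the temporal dimension follows the abstract.

```python
import torch
import torch.nn as nn

# Assumed, illustrative dimensions (not taken from the paper)
N_BANDS = 10       # Sentinel-2 bands used as input channels per pixel
N_TIMESTEPS = 15   # number of acquisitions in the summer time series
N_CLASSES = 2      # new built-up change vs. unchanged agricultural land


class TemporalCNN(nn.Module):
    """Per-pixel time-series classifier with 1D convolutions over time."""

    def __init__(self):
        super().__init__()
        # Spectral bands act as channels; the convolution slides along time.
        self.features = nn.Sequential(
            nn.Conv1d(N_BANDS, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(32, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # collapse the temporal dimension
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Dropout(0.5),          # one common way to curb overfitting
            nn.Linear(32, N_CLASSES),
        )

    def forward(self, x):
        # x: (batch, N_BANDS, N_TIMESTEPS) -- one pixel's time series per sample
        return self.classifier(self.features(x))


if __name__ == "__main__":
    model = TemporalCNN()
    pixels = torch.randn(8, N_BANDS, N_TIMESTEPS)  # dummy batch of 8 pixel series
    print(model(pixels).shape)  # torch.Size([8, 2]) -- class scores per pixel
```

In such a setup each training sample is the full seasonal reflectance series of a single pixel, so the network can learn the temporal signature that separates newly built-up surfaces from cropland, rather than relying on a single acquisition date.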