Publications

NIBIO's employees publish several hundred scientific articles and research reports every year. Here you will find references and links to publications and other research and dissemination activities. The collection is updated continuously with both new and historical material. For more information about NIBIO's publications, visit NIBIO's library.

2021

Abstract

Dead greens in spring due to winterkill are common on Nordic golf courses. The objective of this research was to evaluate drop seeding, spike seeding and slit seeding of creeping bentgrass (Agrostis stolonifera L. '007') and rough bluegrass (Poa trivialis L.) in comparison with an unseeded control treatment for reestablishment of annual bluegrass (Poa annua L.) putting greens after winterkill. Three trials were conducted on golf courses in Central Sweden (60–61° N, 15–16° E, 70–170 m a.s.l.); two in 2017 with soil temperatures varying from 6 to 16 °C during the trial period, and one in 2018 with temperatures varying from 13 to 26 °C. On average for the three trials, turfgrass coverage 4 and 6 wk after seeding was better with spike seeding or slit seeding than with drop seeding, which was not different from the unseeded control. Creeping bentgrass and rough bluegrass coverage did not differ on average for the three trials, but slit-seeded rough bluegrass had better coverage after 4 wk than any of the other treatments on average for the two trials in 2017. Together with the evaluation of seed mixtures in the SCANGREEN program, this research shows that slit seeding of rough bluegrass can be recommended for faster recovery of winterkilled annual bluegrass greens in the central and northern parts of the Nordic countries. Rough bluegrass can either be seeded alone to enable faster golf course opening, or it can be seeded in a mixture with creeping bentgrass as part of a long-term strategy to replace annual bluegrass with creeping bentgrass.

Abstract

Several factors may define storability in root crops. In this paper, preliminary results are presented from two experiments performed to test factors affecting the storage quality of carrot. The study has focused on 1) soil loosening/soil compaction and 2) different carrot cultivars and root age, as given by the length of the growing period. The results so far indicate that soil compaction had few effects on the storability of carrot, but did seem to negatively affect the length of the carrots. Soil loosening reduced the occurrence of liquorice rot caused by Mycocentrospora acerina. Large differences in storability were found between the ten tested carrot cultivars, and the length of the growing period tended to be negatively correlated with storability. We conclude that a number of precautions in carrot production may increase storability and thus economic performance.

Abstract

This report presents results from field trials with apple trees in the growing seasons 2016–2020 at NLR Viken in Lier and NIBIO Ullensvang. The aim of the project was to stimulate increased production of Norwegian apples with good fruit quality, grown in an efficient manner. Field trials were carried out with chemical thinning during flowering and at the fruitlet stage using various bioregulators, mechanical thinning by machine during flowering, root pruning with a tractor-mounted knife, testing of the growth regulator Regalis, and fertigation of two apple cultivars. Controlled fertilizer application to apple trees in pots in a plastic tunnel was carried out at NIBIO Særheim. The overall objective was to test environmentally friendly techniques for optimal fertilization of apple trees in order to improve general fruit quality and raise the annual share of class 1 fruit. In addition, it was important to use sustainable methods to reduce shoot growth and regulate the crop load in the trees in order to achieve correct fruit set under Norwegian conditions.

Abstract

Crop residue incorporation is a common practice to increase or restore organic matter stocks in agricultural soils. However, this practice often increases emissions of the powerful greenhouse gas nitrous oxide (N2O). Previous meta-analyses have linked various biochemical properties of crop residues to N2O emissions, but the relationships between these properties have been overlooked, hampering our ability to predict N2O emissions from specific residues. Here we combine comprehensive databases for N2O emissions from crop residues and crop residue biochemical characteristics with a random meta-forest approach to develop a predictive framework of crop residue effects on N2O emissions. On average, crop residue incorporation increased soil N2O emissions by 43% compared with residue removal; however, crop residues led to both increases and reductions in N2O emissions. Crop residue effects on N2O emissions were best predicted by easily degradable fractions (i.e. water-soluble carbon and the soluble Van Soest fraction (NDS)), structural fractions, and the N returned with crop residues. The relationships between these biochemical properties and N2O emissions differed widely in form and direction. However, due to the strong correlations among these properties, we were able to develop a simplified classification for crop residues based on the stage of physiological maturity of the plant at which the residue was generated. This maturity criterion provided the most robust and yet simple approach to categorizing crop residues according to their potential to regulate N2O emissions. Immature residues (high water-soluble carbon, soluble NDS and total N concentrations; low relative cellulose, hemicellulose and lignin fractions; and low C:N ratio) strongly stimulated N2O emissions, whereas mature residues with the opposite characteristics had marginal effects on N2O. The most important crop types belonging to the immature residue group – cover crops, grasslands and vegetables – are important for the delivery of multiple ecosystem services. Thus, these residues should be managed properly to avoid their potentially high N2O emissions.
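
The abstract names a random meta-forest analysis of residue properties. As a rough, hypothetical illustration of the underlying idea only (ranking biochemical predictors of N2O effect sizes with a random forest), the sketch below uses invented column names (water_soluble_C, soluble_NDS, total_N, CN_ratio, cellulose, hemicellulose, lignin, lnRR_N2O) and a plain scikit-learn regressor; it is not the authors' actual pipeline, which also handles study-level weighting.

```python
# Hypothetical sketch: rank residue biochemical properties by their ability to
# predict N2O effect sizes with a random forest. Illustrative only; the paper's
# random meta-forest additionally accounts for meta-analytic study weights.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Assumed table: one row per residue observation, response = log response ratio
# (lnRR) of N2O emissions for residue incorporation vs. removal.
df = pd.read_csv("residue_n2o_effects.csv")          # hypothetical file
predictors = ["water_soluble_C", "soluble_NDS", "total_N",
              "CN_ratio", "cellulose", "hemicellulose", "lignin"]

rf = RandomForestRegressor(n_estimators=500, random_state=1)
rf.fit(df[predictors], df["lnRR_N2O"])

# Impurity-based importances: which properties best separate "immature"
# residues (strong N2O stimulation) from "mature" ones.
for name, imp in sorted(zip(predictors, rf.feature_importances_),
                        key=lambda x: -x[1]):
    print(f"{name:>15s}: {imp:.3f}")
```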

Abstract

Diameter at breast height (DBH) distributions offer valuable information for operational and strategic forest management decisions. We predicted DBH distributions using Norwegian national forest inventory and airborne laser scanning data and compared the predictive performances of linear mixed-effects (PPM), generalized linear mixed (GLM), and k-nearest neighbor (NN) models. While GLM resulted in smaller prediction errors than PPM, both were clearly outperformed by NN. We therefore studied the ability of the NN model to improve the precision of stem frequency estimates by DBH classes in the 8.7 Mha study area using a model-assisted (MA) estimator suitable for systematic sampling. MA estimates yielded efficiencies greater than or approximately equal to those of direct estimates based on field data only. The relative efficiencies (REs) associated with the MA estimates ranged between 0.95–1.47 and 0.96–1.67 for 2 and 6 cm DBH class widths, respectively, when the dominant tree species were assumed to be known. The use of a predicted tree species map, instead of the observed information, decreased the REs by up to 10%.
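
As an illustration of the model-assisted estimation and relative efficiency (RE) comparison described above, the sketch below implements a simple difference estimator and an approximate RE under simplified sampling assumptions; the plot values and variable names are invented, and the paper's variance estimator for systematic sampling is more elaborate.

```python
import numpy as np

def model_assisted_estimate(y_sample, yhat_sample, yhat_population):
    """Difference estimator: population mean of model predictions plus the
    sample mean of residuals (observed minus predicted)."""
    correction = np.mean(y_sample - yhat_sample)
    return np.mean(yhat_population) + correction

def relative_efficiency(y_sample, yhat_sample):
    """RE = variance of the direct estimator / variance of the MA estimator,
    approximated here by the ratio of y-variance to residual variance."""
    var_direct = np.var(y_sample, ddof=1)
    var_ma = np.var(y_sample - yhat_sample, ddof=1)
    return var_direct / var_ma

# Hypothetical stem counts (stems/ha) in one 2 cm DBH class on field plots,
# with ALS-based k-NN predictions for the same plots and for all map units.
y = np.array([120., 90., 150., 60., 110.])
yhat = np.array([110., 100., 140., 70., 100.])
yhat_all = np.random.default_rng(0).normal(105, 30, 10000)

print(model_assisted_estimate(y, yhat, yhat_all))
print(relative_efficiency(y, yhat))   # >1 means MA is more precise than direct
```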

Abstract

Butt rot (BR) damage of a tree results from decay caused by a pathogenic fungus. BR damages associated with Norway spruce (Picea abies [L.] Karst.) account for considerable economic losses in timber production across the northern hemisphere. While information on BR damages is critical for optimal decision-making in forest management, maps of BR damages are typically lacking in forest information systems. Timber volume damaged by BR was predicted at the stand level in Norway using harvester information from 186,026 stems (clear-cuts), remotely sensed data, and environmental data (e.g. climate and terrain characteristics). This study utilized Random Forests models with two sets of predictor variables: (1) predictor variables available after harvest (theoretical case) and (2) predictor variables available prior to harvest (mapping case). Our findings showed that forest attributes characterizing the maturity of the forest, such as remote sensing-based height, harvested timber volume and quadratic mean diameter at breast height, were among the most important predictor variables. Remotely sensed predictor variables obtained from airborne laser scanning data and Sentinel-2 imagery were more important than the environmental variables. The theoretical case with a leave-stand-out cross-validation resulted in an RMSE of 11.4 m3 · ha−1 (pseudo-R2: 0.66), whereas the mapping case resulted in a pseudo-R2 of 0.60. When spatially distinct clusters of harvested forest stands were used as units in the cross-validation, the RMSE value and pseudo-R2 associated with the mapping case were 15.6 m3 · ha−1 and 0.37, respectively. The findings associated with the different cross-validation schemes indicated that knowledge about the BR status of spatially close stands is highly important for obtaining satisfactory error rates in the mapping of BR damages.
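
The cluster-based cross-validation mentioned above can be sketched with grouped cross-validation. The example below is a hypothetical outline using scikit-learn's GroupKFold with invented feature and column names (e.g. als_height, cluster_id, br_volume_m3_ha); it is not the study's actual predictor set or model configuration.

```python
# Illustrative sketch of grouped ("leave-cluster-out") cross-validation for a
# Random Forests model of BR-damaged timber volume; all names are hypothetical.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GroupKFold

df = pd.read_csv("stands.csv")                 # hypothetical stand-level table
features = ["als_height", "volume_harvested", "qmd", "s2_ndvi",
            "temperature_sum", "slope"]
X, y, groups = df[features], df["br_volume_m3_ha"], df["cluster_id"]

# Hold out whole clusters of neighbouring stands, so spatially close stands
# never appear in both training and test folds.
preds = np.empty(len(y))
for train, test in GroupKFold(n_splits=10).split(X, y, groups):
    rf = RandomForestRegressor(n_estimators=500, random_state=1)
    rf.fit(X.iloc[train], y.iloc[train])
    preds[test] = rf.predict(X.iloc[test])

rmse = np.sqrt(np.mean((y - preds) ** 2))
pseudo_r2 = 1 - np.sum((y - preds) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"RMSE: {rmse:.1f} m3/ha, pseudo-R2: {pseudo_r2:.2f}")
```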

Abstract

Leaf blotch diseases (LBD), such as Septoria nodorum blotch (Parastagonospora nodorum), Septoria tritici blotch (Zymoseptoria tritici) and tan spot (Pyrenophora tritici-repentis), can cause severe yield losses (up to 50%) in Norwegian spring wheat (Triticum aestivum) and are mainly controlled by fungicide applications. A forecasting model to predict disease risk can be an important tool to optimize disease control. The association between specific weather variables and the development of LBD differs between wheat growth stages. In this study, a mathematical model to estimate the phenological development of spring wheat was derived based on sowing date, air temperature and photoperiod. Weather factors associated with LBD severity were then identified for selected phenological growth stages through a correlation study of LBD severity data (17 years). Although information regarding host resistance and previous crop was added to the identified weather factors, two purely weather-based risk prediction models (CART, classification and regression tree algorithm) and one black-box model (KNN, based on the k-nearest neighbor algorithm) were the most accurate in predicting moderate to high LBD severity (>5% infection). The predictive accuracy of these models (76–83%) was compared to that of two existing models used in Norway and Denmark (60 and 61% accuracy, respectively). The newly developed models performed better than the existing models, but still tended to overestimate disease risk. The specificity of the new models varied between 49 and 74%, compared to 40 and 37% for the existing models. These new models are promising decision tools to improve integrated LBD management of spring wheat in Norway.
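
For readers unfamiliar with the two model types named above, the sketch below shows a minimal CART (decision tree) and KNN classification of moderate-to-high LBD risk from weather variables, with accuracy and specificity computed as reported in the abstract; the file name, features and train/test split are hypothetical and do not reproduce the study's models.

```python
# Hypothetical sketch: a CART-style decision tree and a k-nearest-neighbor
# classifier predicting moderate-to-high LBD risk (>5% infection, coded 0/1)
# from assumed weather features; not the study's actual models or data.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score, recall_score

df = pd.read_csv("lbd_weather.csv")            # hypothetical 17-year dataset
features = ["rain_days_heading", "mean_temp_flowering", "rh_hours_gs31_39"]
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["severity_over_5pct"], test_size=0.3, random_state=1)

for name, model in [("CART", DecisionTreeClassifier(max_depth=4, random_state=1)),
                    ("KNN", KNeighborsClassifier(n_neighbors=5))]:
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    acc = accuracy_score(y_test, pred)
    spec = recall_score(y_test, pred, pos_label=0)   # specificity = true-negative rate
    print(f"{name}: accuracy={acc:.2f}, specificity={spec:.2f}")
```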