Torfinn Torp

Senior Adviser (OAP Agreement)

(+47) 466 27 834
torfinn.torp@nibio.no

Place
Ås H7

Visiting address
Høgskoleveien 7, 1433 Ås

Abstract

Weeds may reduce crop yields significantly if managed improperly. However, excessive herbicide use increases the risk of unwanted effects on ecosystems and humans, and of herbicide resistance development. Weed harrowing is a traditional method of managing weeds mechanically in organic cereals but can also be used in conventional production. The weed control efficacy of weed harrowing can be adjusted by, e.g., the angle of the tines. Due to its broadcast nature (both crop and weed plants are disturbed), weed harrowing may have relatively poor selectivity (i.e. a small ratio between weed control and crop injury). To improve selectivity, a sensor-based model that takes into account the intra-field variation in weediness and “soil density” (SD) in the upper soil layer (draft force of tines) is proposed. The suggested model is a non-linear regression model with three parameters and was based on five field trials in spring barley in SE Norway. The model predicts the optimal weed harrowing intensity (in terms of the tine angle) from the estimated total weed cover and SD per sub-field management unit, as well as a pre-set biological weed threshold (defined as the acceptable total weed cover left untreated). Weed cover and SD were estimated with RGB images (analysed with custom-made machine vision) and an electronic load cell, respectively. With current parameter values, the model should be valid for precision weed harrowing in spring barley in SE Norway. The next step is to test the model and, if successful, adapt it to more cereal species.
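
The abstract does not give the model's functional form or fitted parameter values. Purely as an illustrative sketch, the snippet below assumes a generic form in which residual weed cover decays exponentially with tine angle and denser soil dampens efficacy; the function name and the parameters a and b are hypothetical, not the values estimated from the five field trials.

```python
import numpy as np

def optimal_tine_angle(weed_cover, soil_density, threshold,
                       a=0.08, b=0.5, max_angle=90.0):
    """Illustrative sketch only. Assumed model:
        cover_left(angle) = weed_cover * exp(-a * angle / (1 + b * soil_density))
    Solving cover_left(angle) = threshold for angle gives the rule below.
    a and b are hypothetical parameters; angle is in degrees.
    """
    if weed_cover <= threshold:
        return 0.0  # weediness already below the biological threshold: skip harrowing
    angle = (1 + b * soil_density) / a * np.log(weed_cover / threshold)
    return min(angle, max_angle)

# Per-management-unit inputs: weed cover (%) from machine vision on RGB images,
# an SD proxy (draft force, arbitrary units) from the load cell.
print(optimal_tine_angle(weed_cover=12.0, soil_density=1.4, threshold=4.0))
```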

Abstract

Field experiments were conducted in 2015 and 2016 to study the effect of tillage frequency, seed rate, and glyphosate on teff and weeds. The experiments were arranged in a split plot design with three replications consisting of tillage frequency (conventional, minimum, and zero tillage) as the main plot and the combination of seed rate (5, 15, and 25 kg ha−1) and glyphosate (with and without) as subplots. Results showed that zero tillage reduced teff biomass yield by 15% compared to minimum tillage and by 26% compared to conventional tillage. Zero tillage and minimum tillage also diminished grain yield by 21% and 13%, respectively, compared to conventional tillage. Lowering the seed rate to 5 kg ha−1 reduced biomass yield by 22% and 26% compared to 15 and 25 kg ha−1, respectively. It also reduced the grain yield by around 21% compared to 15 and 25 kg ha−1 seed rates. Conventional tillage significantly diminished weed density, dry weight, and cover by 19%, 29%, and 33%, respectively, compared to zero tillage. The highest seed rate significantly reduced total weed density, dry weight, and cover by 18%, 19%, and 15%, respectively, compared to the lowest seed rate. Glyphosate did not affect weed density but reduced weed dry weight by 14% and cover by 15%. Generally, sowing teff using minimum tillage combined with glyphosate application and seed rate of 15 kg ha−1 enhanced its productivity and minimized weed effects.

Abstract

Clarireedia spp., Fusarium culmorum, and Microdochium nivale are destructive and widespread fungal pathogens causing turfgrass disease. Chemical control is a key tool for managing these diseases on golf greens but is most effective when used in a manner that reduces overall inputs, maximizes fungicide efficacy, and minimizes the risk of fungicide resistance. In this study, sensitivity to eight commonly used fungicides was tested in 13 isolates of Clarireedia spp., F. culmorum, and M. nivale via in vitro toxicity assays. Fungicide sensitivity varied significantly among the three species, with isolates of F. culmorum showing the least sensitivity. The sensitivity of M. nivale to all tested fungicides was high (with the exception of tebuconazole), but only four fungicides (Banner Maxx®, Instrata® Elite, Medallion TL, and Switch® 62,5 WG) suppressed the growth of M. nivale completely at a concentration of 1% of the recommended dose. All three fludioxonil-containing fungicides, either alone (Medallion TL) or in combination with difenoconazole (Instrata® Elite) or cyprodinil (Switch® 62,5 WG), had the same high efficacy against isolates of both M. nivale and Clarireedia spp. On average, the Clarireedia isolates tested in this study showed high sensitivity to the tested fungicides, except for Heritage (azoxystrobin). The observed variation in sensitivity among isolates within the same fungal species to different fungicides needs further investigation, as an analysis of the differences in fungal growth within each fungal group revealed a significant isolate × fungicide interaction (p < .001).

Abstract

Precision weeding or site-specific weed management (SSWM) takes into account the spatial distribution of weeds within fields to avoid unnecessary herbicide use or intensive soil disturbance (and hence energy consumption). The objective of this study was to evaluate a novel machine vision algorithm, called the ‘AI algorithm’ (referring to Artificial Intelligence), intended for post-emergence SSWM in cereals. Our conclusion is that the AI algorithm should be suitable for patch spraying with selective herbicides in small-grain cereals at early growth stages (about two leaves to early tillering). If the intended use is precision weed harrowing, in which post-harrow images can also be used to control the weed harrow intensity, the AI algorithm should be improved by including such images in the training data. Another future goal is to make the algorithm able to distinguish weed species of special interest, for example cleavers (Galium aparine L.).

Abstract

Soil organic carbon (SOC) was studied at 0–45 cm depth after 28 years of cropping with arable and mixed dairy rotations on a soil with an initial SOC level of 2.6% at 0–30 cm. Measurements included both carbon concentration (SOC%) and soil bulk density (BD). Gross C input was calculated from yields. Averaged over all systems, topsoil SOC% declined significantly (−0.20% at 0–15 cm, p = 0.04; −0.39% at 15–30 cm, p = 0.05), but changed little at 30–45 cm (+0.11%, p = 0.15). Declines in topsoil SOC% tended to be greater in arable systems than in mixed dairy systems. Changes in BD were negatively related to those in SOC%, emphasizing the need to measure both when assessing SOC stocks. The overall SOC mass at 0–45 cm declined significantly from 98 to 89 Mg ha−1, representing a loss of 0.3% yr−1 of the initial SOC. Variability within systems was high, but arable cropping showed tendencies towards high SOC losses, whilst SOC stocks appeared to be little changed in conventional mixed dairy with 50% ley and organic mixed dairy with 75% ley. The changes were related to the level of C input. Mean C input was 22% higher in mixed dairy than in arable systems.
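
The point that BD must be measured alongside SOC% follows directly from how a layer's SOC stock is computed. A minimal sketch of the standard layer-stock formula (not code from the study; the numbers below are hypothetical) shows how a rise in BD can partly mask a decline in SOC% if only concentration is reported:

```python
def soc_stock_mg_ha(soc_percent, bulk_density_g_cm3, depth_cm):
    """Standard layer-stock formula (not specific to this paper):
    SOC stock [Mg ha-1] = (SOC%/100) * BD [g cm-3] * depth [cm] * 100.
    (1 g C per cm2 of ground equals 100 Mg per ha.)
    """
    return soc_percent / 100.0 * bulk_density_g_cm3 * depth_cm * 100.0

# Hypothetical numbers for one 0-15 cm layer:
before = soc_stock_mg_ha(2.6, 1.30, 15)   # ~50.7 Mg ha-1
after  = soc_stock_mg_ha(2.4, 1.38, 15)   # ~49.7 Mg ha-1: the BD increase
print(before, after)                      # offsets part of the SOC% decline
```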

Abstract

Dollar spot, caused by at least five Clarireedia species (formerly Sclerotinia homoeocarpa F. T. Benn.), is one of the economically most important turfgrass diseases worldwide. The disease was detected for the first time in Scandinavia in 2013. There is no available information from Scandinavian variety trials on resistance to dollar spot in turfgrass species and cultivars (http://www.scanturf.org/). Our in vitro screening (in glass vials) of nine turfgrass species comprising a total of 20 cultivars showed that, on average for ten Clarireedia isolates of different origin, the ranking for dollar spot resistance in turfgrass species commonly found on Scandinavian golf courses was as follows: perennial ryegrass = slender creeping red fescue > strong creeping red fescue > Kentucky bluegrass = velvet bentgrass > colonial bentgrass = Chewings fescue ≥ creeping bentgrass = annual bluegrass. Significant differences in aggressiveness among Clarireedia isolates of different origin were found in all turfgrass species except annual bluegrass (cv. Two Putt). The U.S. C. jacksonii isolate MB-01 and the Canadian isolate SH44 were more aggressive than C. jacksonii isolates from Denmark and Sweden (14.10.DK, 14.15.SE, and 14.16.SE) in velvet bentgrass and creeping bentgrass. The Swedish isolate 14.112.SE was generally more aggressive than 14.12.NO, despite the fact that they most likely belong to the same Clarireedia sp. The U.S. C. monteithiana isolate RB-19 had aggressiveness similar to that of the Scandinavian C. jacksonii isolates, but was less aggressive than the two North American C. jacksonii isolates MB-01 and SH44. Thus, the aggressiveness of Clarireedia isolates was influenced more by their geographic origin than by the species of the isolate and/or the host turfgrass species.

Abstract

Fava bean (Vicia faba L.) yields are characterized by high variability, influenced by the agro-environmental conditions during the growing seasons. This legume crop is sensitive to water and heat stress. Adaptation depends on the efficiency with which specific cultivars use the available resources to produce biomass. This capacity is determined by the genotype and agronomic management practices. The present work aimed to uncover the influence of Baltic agro-environmental conditions (fava bean cultivar, plant density, climate, and soil features) on yield and protein content. For this, field trials were set up under Baltic agro-climatic conditions, in Latvia and Estonia, with five commercially available fava bean cultivars representing broad genetic variation (‘Gloria’, ‘Julia’, ‘Jogeva’, ‘Lielplatones’, and ‘Bauska’). The results identified ‘Bauska’, ‘Julia’, and ‘Lielplatones’ as the most productive cultivars in terms of seed yield (4.5, 3.7, and 4.6 t ha−1, respectively) and protein yield (1.39, 1.22, and 1.36 t ha−1, respectively) under Estonian and Latvian agro-climatic conditions. Sowing these specific cultivars at densities of 30–40 seeds m−2 constitutes sustainable management for fava bean production in conventional cropping systems in the Baltic region.

Abstract

The allelopathic potential of 10 teff varieties was assessed in a laboratory experiment (conducted at NIBIO, Norway) and determined with an agar-based bioassay using ryegrass and radish as model weeds. Field experiments were conducted in Tigray, Ethiopia during 2015 and 2016 to identify the most important agronomic traits of teff contributing to its weed competitive ability. A split plot design with three blocks was used, with hand weeding as the main plot and varieties as the subplot. A randomized complete block design (RCBD) with four blocks was used in the laboratory experiment. The highest potential allelopathic activity (PAA) and specific potential allelopathic activity (SPAA) were recorded from a local landrace, with an average PAA value of 11.77% and an SPAA value of 1.21%/mg, when ryegrass was used as the model weed. ‘Boset’ had the highest average PAA value of 16.25% and an SPAA value of 1.53%/mg when using radish as the model weed. The lowest PAA and SPAA values were recorded from ‘DZ-Cr-387’ when using ryegrass and radish as model weeds. Days to emergence, height, tiller number per plant, biomass yield, and PAA of the crop contributed significantly to the variance of weed biomass, cover, and density. Hence, they were the most important agronomic traits enhancing the competitive ability of teff.
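
A note on the units: SPAA normalises PAA by the dry mass of the plant sample assayed. That definition is an assumption here (the abstract does not spell it out), but on that reading the reported landrace values are internally consistent:

\[ \mathrm{SPAA} = \frac{\mathrm{PAA}}{m_{\mathrm{dry}}} \quad\Longrightarrow\quad m_{\mathrm{dry}} = \frac{11.77\,\%}{1.21\,\%\,\mathrm{mg}^{-1}} \approx 9.7\ \mathrm{mg} \]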

Abstract

Norway spruce (Picea abies) is a widely used Christmas tree species in the Nordic countries. Postharvest needle retention is an important characteristic for Christmas trees, and compared to many fir (Abies) species, Norway spruce has poor postharvest needle retention. This trait is one of the most important qualities in the choice between natural and plastic trees. In this study, current-year shoots were cut from 30 Norway spruce seedlot sources, including the most widely used Norwegian Christmas tree provenances, and tested to identify genetic variation in postharvest needle retention. Current-year shoots were collected from one field in November and December 2018, and from three fields in October, November and December 2019. The shoots were displayed indoors under controlled conditions and allowed to dry. Differences in postharvest needle retention were seen between seedlots, harvesting dates and locations. Our study indicates the potential for selecting for improved postharvest needle retention in Norway spruce seed sources. Furthermore, postharvest needle retention should be considered as one characteristic to add to the ongoing Norway spruce Christmas tree breeding program.

Abstract

The invasive slug Arion vulgaris (Gastropoda: Arionidae) is an agricultural pest and serious nuisance in gardens of Central and Northern Europe. To investigate if the success of A. vulgaris in Norway can be attributed to a release from parasites, we compared the prevalence and parasite load of nematodes and trematodes in A. vulgaris to that of three native gastropod species, A. circumscriptus, A. fasciatus and Arianta arbustorum, in SE Norway. We found A. vulgaris to have the highest prevalence of both parasite groups (49% nematodes, 76% trematodes), which does not support the parasite release hypothesis, but rather points to A. vulgaris as a potentially important intermediate host of these parasites. For trematodes the number of individuals (parasite load) did not differ among host species; for nematodes it was higher in A. vulgaris than A. fasciatus. To further compare the parasite susceptibility of the surveyed gastropods, we exposed A. vulgaris, A. fasciatus, and A. arbustorum to a slug parasitic nematode, Phasmarhabditis hermaphrodita, in the laboratory. This nematode is commercially available and widely used to control A. vulgaris. The non-target species A. fasciatus was most affected, with 100% infection, 60% mortality and significant feeding inhibition. A. vulgaris was also 100% infected, but suffered only 20% mortality and little feeding inhibition. The load of P. hermaphrodita in infected specimens was not significantly different for the two Arion species (median: 22.5 and 45, respectively). Only 35% of A. arbustorum snails were infected, none died, and parasite load was very low (median: 2). However, they showed a near complete feeding inhibition at the highest nematode dose, and avoided nematode-infested soil. Our results indicate that A. vulgaris may be less susceptible to P. hermaphrodita than the native A. fasciatus, and that non-target effects of applying this nematode in fields and gardens should be further investigated.

Abstract

Teff is a staple and well-adapted crop in Ethiopia. Weed competition and control have major effects on yields and economic returns of the crop in the country. Among weed management methods, the development and use of weed-competitive teff varieties remains the cheapest and most sustainable option. Ten teff varieties were tested for their weed competitive ability in two locations. Treatments were applied using a split plot design with three blocks at each location for two consecutive seasons. Hand weeding and non-weeded treatments were applied to whole plots, with teff varieties assigned as split plots within the whole plot. The main objective was to determine the relative competitive ability of the teff varieties. The results showed significant variation among teff varieties in their weed competitive abilities. The varieties ‘Kora’ and ‘DZ-Cr-387’ reduced weed density, dry weight, and cover significantly more than the other teff varieties. They also had the lowest yield losses, with losses of 6% in biomass yield and 18% in grain yield recorded for ‘Kora’, and 17% in biomass yield and 21% in grain yield for ‘DZ-Cr-387’. Therefore, they showed the highest weed competitive ability compared to the other varieties.

Abstract

Potato soft rot Pectobacteriaceae (SRP) cause large yield losses and are persistent in seed lots once established. In Norway, different Pectobacterium species are the predominant cause of soft rot and blackleg disease. This work aimed to evaluate the potential of real-time PCR for quantification of SRP in seed tubers, as well as to investigate the status of potato seed health with respect to SRP in Norway. A total of 34 seed potato lots, including certified seeds, were grown and monitored over three consecutive years. All seed lots contained a quantifiable amount of SRP after enrichment, with very few subsamples being free of the pathogens. A high SRP prevalence based on a qPCR assay, as well as a high symptom incidence in certified seeds, were observed, suggesting that current criteria for seed certification are insufficient to determine tuber health and predict field outcomes. Pectobacterium atrosepticum was the most abundant species in the examined seed lots and was present in all lots. The consistently good performance of first-generation seed lots with respect to blackleg and soft rot incidence, as well as the low quantity of SRP in these seed lots, demonstrated the importance of clean seed potatoes. Weather conditions during the growing season seemed to govern disease incidence and SRP prevalence more than seed grade. The impact of temperature, potato cultivar and Pectobacterium species on tuber soft rot development was further examined in tuber infection experiments, which showed that temperature was the most important factor in nearly all cultivars. Large-scale quantification of latent infection and predictive models that include contributing factors like weather, infecting bacterial species and cultivar are needed to reduce soft rot and blackleg.

Abstract

Precipitation has generally increased in Norway during the last century, and climate projections indicate a further increase. The growing season has also become longer with higher temperatures, particularly in autumn. Previous studies have shown negative effects of high temperatures and, depending upon temperature conditions, contrasting effects of waterlogging on the hardening capacity of timothy. We studied the effects of waterlogging on seedlings of timothy (Phleum pratense, cv. Noreng) under three pre-acclimation temperatures (3°C, 7°C and 12°C) in natural autumn light in a phytotron at Holt, Tromsø (69°N). After the temperature treatments, all plants were cold acclimated at 2°C for three weeks under continued waterlogging treatments. Freezing tolerance was determined by freezing intact plants in pots at incrementally decreasing temperatures in a programmable freezer. Waterlogging resulted in a higher probability of death after freezing and significantly reduced regrowth after three weeks at 18°C and 24 h light in a greenhouse. Increasing pre-acclimation temperatures also had a clear negative effect on freezing tolerance, but there was no interaction between temperature and waterlogging. The results indicate that waterlogging may have negative implications for the hardening of timothy and may contribute to reduced winter survival under the projected increase in autumn temperatures and precipitation.

Abstract

The emissions of nitrous oxide (N2O) and leaching of nitrate (NO3) from agricultural cropping systems have considerable negative impacts on climate and the environment. Although these environmental burdens are less per unit area in organic than in non-organic production on average, they are roughly similar per unit of product. If organic farming is to maintain its goal of being environmentally friendly, these loadings must be addressed. We discuss the impact of possible drivers of N2O emissions and NO3 leaching within organic arable farming practice under European climatic conditions, and potential strategies to reduce these. Organic arable crop rotations are generally diverse with the frequent use of legumes, intercropping and organic fertilisers. The soil organic matter content, the share of active organic matter, soil structure, and microbial and faunal activity are higher in such diverse rotations, and yields are lower, than in non-organic arable cropping systems based on less diverse rotations and inorganic fertilisers. Soil mineral nitrogen (SMN), N2O emissions and NO3 leaching are low under growing crops, but there is the potential for SMN accumulation and losses after crop termination, harvest or senescence. The risk of high N2O fluxes increases when large amounts of herbage or organic fertilisers with readily available nitrogen (N) and degradable carbon are incorporated into the soil or left on the surface. Freezing/thawing, drying/rewetting, compacted and/or wet soil and mechanical mixing of crop residues into the soil further enhance the risk of high N2O fluxes. N derived from soil organic matter (background emissions) does, however, seem to be the most important driver for N2O emission from organic arable crop rotations, and the correlation between yearly total N-input and N2O emissions is weak. Incorporation of N-rich plant residues or mechanical weeding followed by bare fallow conditions increases the risk of NO3 leaching. In contrast, strategic use of deep-rooted crops with long growing seasons or effective cover crops in the rotation reduces NO3 leaching risk. Enhanced recycling of herbage from green manures, crop residues and cover crops through biogas or composting may increase N efficiency and reduce N2O emissions and NO3 leaching. Mixtures of legumes (e.g. clover or vetch) and non-legumes (e.g. grasses or Brassica species) are as efficient cover crops for reducing NO3 leaching as monocultures of non-legume species. Continued regular use of cover crops has the potential to reduce NO3 leaching and enhance soil organic matter but may enhance N2O emissions. There is a need to optimise the use of crops and cover crops to enhance the synchrony of mineralisation with crop N uptake to enhance crop productivity, and this will concurrently reduce the long-term risks of NO3 leaching and N2O emissions.

Abstract

The use of blankets in horses is widespread in Northern Europe. However, horses are very adaptable to low temperatures, and the practice is questioned because blankets may hamper heat dissipation at high temperatures and also disturb free movement. The aim of the current study was to gain information about horses’ own preferences for wearing or not wearing a blanket under different weather conditions during the seasons. Ten horses usually wearing blankets and 13 horses usually not wearing blankets were kept outside in their paddock for 2 h under different weather conditions. These horses were then tested for their preference for wearing blankets (see Mejdell et al., 2016). When considering only air temperature and not the impact of other weather factors, the horses preferred to have the blanket on in 80% and 90% of the tests at air temperatures below −10 °C for horses usually wearing and not wearing blankets, respectively. As air temperature increased, the preference for keeping the blanket on decreased, and at air temperatures above 20 °C the horses preferred to remove the blanket in all the tests. According to the statistical model, the probability of choosing to have a blanket on increased with increasing wind speed, and precipitation also increased this probability. Sunshine, however, reduced the probability of choosing to wear a blanket.
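
The abstract refers to "the statistical model" without giving its specification. A binary choice like this is commonly analysed with logistic regression, so as a hedged sketch (not the authors' method), the snippet below fits one to synthetic data that merely mimics the reported directions of effect; the library choice, variable names and coefficients are all assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic data with the reported effect directions: blanket choice more
# likely with wind and precipitation, less likely with temperature and sun.
rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({
    "temp": rng.uniform(-15, 25, n),     # air temperature, deg C
    "wind": rng.uniform(0, 12, n),       # wind speed, m/s
    "precip": rng.integers(0, 2, n),     # precipitation (0/1)
    "sun": rng.integers(0, 2, n),        # sunshine (0/1)
})
logit_p = -0.25 * df.temp + 0.3 * df.wind + 1.0 * df.precip - 0.8 * df.sun
df["blanket"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

model = smf.logit("blanket ~ temp + wind + precip + sun", data=df).fit(disp=0)
print(model.params)  # fitted signs should mirror the reported effects
```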

Abstract

High concentrations of the mycotoxins HT-2 and T-2 (HT2 + T2), primarily produced by Fusarium langsethiae, have occasionally been detected in Norwegian oat grains. In this study, we identified weather variables influencing accumulation of HT2 + T2 in Norwegian oat grains. Oat grain samples from farmers’ fields were collected together with weather data (2004–2013). Spearman rank correlation coefficients were calculated between the HT2 + T2 contamination in oats at harvest and a range of weather summarisations within estimated phenological windows of growth stages in oats (tillering, flowering etc.). Furthermore, we developed a mathematical model to predict the risk of HT2 + T2 in oat grains. Our data show that adequate predictions of the risk of HT2 + T2 in oat grains at harvest can be achieved, based upon weather data observed during the growing season. Humid and cool conditions, in addition to moderate temperatures during booting, were associated with increased HT2 + T2 accumulation in harvested oat grains, whereas warm and humid weather during stem elongation and inflorescence emergence, or cool weather and absence of rain during booting, reduced the risk of HT2 + T2 accumulation. Warm and humid weather immediately after flowering increased the risk, while moderate to warm temperatures and absence of rain during dough development reduced the risk of HT2 + T2 accumulation in oat grains. Our data indicated that HT2 + T2 contamination in oats is influenced by weather conditions both pre- and post-flowering. These findings contrast with a previous study examining the risk of deoxynivalenol contamination in oats, which reported that toxin accumulation was mostly influenced by weather conditions from flowering onwards.
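
The screening step described above, rank-correlating harvest toxin levels against weather summarisations per phenological window, can be sketched in a few lines. Only scipy's spearmanr matches the named statistic; the variable names, window choices and synthetic data are illustrative assumptions.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(7)
n_fields = 50
toxin = rng.lognormal(3.0, 1.0, n_fields)  # HT2+T2 at harvest, ug/kg (synthetic)

# One weather summarisation per phenological window (hypothetical examples):
weather = {
    "rain_flowering": rng.gamma(2.0, 5.0, n_fields),   # mm rain around flowering
    "rh_booting": rng.uniform(60, 95, n_fields),       # mean RH during booting
    "temp_stem_elong": rng.uniform(8, 20, n_fields),   # mean temp, stem elongation
}
# Rank-correlate each candidate variable with the toxin level:
for name, values in weather.items():
    rho, p = spearmanr(values, toxin)
    print(f"{name}: rho={rho:.2f}, p={p:.3f}")
```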

Abstract

Monitoring changes in forest height, biomass and carbon stock is important for understanding the drivers of forest change, clarifying the geography and magnitude of the fluxes of the global carbon budget and providing input data to REDD+. The objective of this study was to investigate the feasibility of covering these monitoring needs using InSAR DEM changes over time and associated estimates of forest biomass change and corresponding net CO2 emissions. A wall-to-wall map of net forest change for Uganda, with its tropical forests, was derived from two Digital Elevation Model (DEM) datasets, namely the SRTM acquired in 2000 and TanDEM-X acquired around 2012, based on Interferometric SAR (InSAR) and the height of the phase center. Errors in the form of bias, as well as parallel lines and belts with a certain height shift in the SRTM DEM, were removed, and the difference in penetration into the forest canopy between X- and C-band SAR was corrected. On average, we estimated X-band InSAR height to decrease by 7 cm during the period 2000–2012, corresponding to an estimated annual CO2 emission of 5 Mt for the entirety of Uganda. The uncertainty of this estimate, given as a 95% confidence interval, was 2.9–7.1 Mt. The presented method has a number of issues that require further research, including the particular SRTM biases and artifact errors; the penetration difference between the X- and C-band; the final height adjustment; and the validity of a linear conversion from InSAR height change to AGB change. However, the results corresponded well to other datasets on forest change and AGB stocks, concerning both their geographical variation and their aggregated values.
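
The conversion chain from InSAR height change to CO2 can be reconstructed as back-of-envelope arithmetic. The height-to-biomass factor and the area figure below are assumptions chosen for illustration (the abstract states only that a linear conversion was used); the carbon fraction and the 44/12 CO2-to-C ratio are standard constants. With these assumed values the result lands in the same order of magnitude as the reported 5 Mt per year.

```python
# Back-of-envelope sketch of the height -> AGB -> CO2 conversion chain.
AREA_HA = 20.0e6            # approximate land area of Uganda, ha (assumed)
DH_M = -0.07                # mean InSAR height change 2000-2012, m (from abstract)
HEIGHT_TO_AGB = 20.0        # Mg AGB per ha per m of InSAR height (assumed factor)
CARBON_FRACTION = 0.47      # IPCC default carbon fraction of dry biomass
CO2_PER_C = 44.0 / 12.0     # molecular weight ratio CO2 : C
YEARS = 12                  # 2000-2012

agb_change = DH_M * HEIGHT_TO_AGB * AREA_HA            # Mg biomass, total change
co2_total = -agb_change * CARBON_FRACTION * CO2_PER_C  # Mg CO2 emitted
print(f"{co2_total / YEARS / 1e6:.1f} Mt CO2 per year")  # ~4 Mt/yr with these inputs
```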

Abstract

Blue and yellow sticky traps equipped with blue light-emitting diodes (LEDs) were evaluated for their attractiveness to the western flower thrips (Frankliniella occidentalis Pergande) and compared to similar traps without light in two greenhouses with commercial production of either mixed herbs or Alstroemeria cut flowers. Blue traps were more attractive to F. occidentalis than the yellow traps in both crops, regardless of whether they were equipped with light or not. In herbs, the blue light-equipped traps caught 1.7 to 2.5 times more thrips than blue traps without light, and 1.7 to 3.0 times more thrips than yellow traps with light. Blue light on both blue and yellow traps increased thrips catches in one out of two experiments in Alstroemeria. The blue light-equipped traps caught 3.4 and 4.0 times more thrips than blue traps without light in coloured and white Alstroemeria cultivars, respectively, whereas yellow light-equipped traps increased thrips catches 4.5 times compared to yellow traps without light in both coloured and white cultivars. The yellow light-equipped traps, however, caught only as many as or slightly more thrips than blue traps without light, and fewer thrips than the light-equipped blue traps. The relative trapping efficiency of the different combinations of trap colour and light varied with experiment, crop and Alstroemeria cultivar. This suggests that factors other than merely the addition of light influenced the thrips' phototactic response to the traps. Such factors could be differences in the relative strength of the competition between attractive signals from traps and plants in the two crops and Alstroemeria cultivars, thrips density, seasonal lighting conditions, or different pest management strategies and other operational procedures in the greenhouses. The light from the traps did not increase the thrips population on the plants below the traps. The implications of the results for thrips control and suggestions for further studies are discussed.

Abstract

The effects of a commercial seaweed (SW) product and extracts collected from wild SWs in Northern Norway on cultivable commensal intestinal bacterial groups isolated from Norwegian White sheep ewes were studied in vivo and in vitro. Faecal samples from ewes fed a supplement containing SW meal throughout the entire indoor winter period had significantly lower lactic acid bacteria (LAB) counts (P ≈ .05). The screening of extracts from red and brown SWs showed that a number of the organic extracts had an inhibitory effect on the growth of the two Enterococcus sp. isolates. The results indicate that Ascophyllum nodosum supplementation reduces LAB counts in ewes and lambs, and that extracts from this SW have an inhibitory effect on the growth of Enterococcus sp. isolates.

Abstract

More sustainable production of high-quality, nutritious food is of worldwide interest. Increasing nutrient recycling into food systems is a step in this direction. The objective of the present study was to determine the nitrogen (N) fertiliser effects of four waste-derived and organic materials in a cropping sequence of broccoli, potato and lettuce grown at two latitudes (58° and 67° N) in Norway over three years. Effects of anaerobically digested food waste (AD), shrimp shell (SS), algae meal (AM) and sheep manure (SM) at different N application rates (80 and 170 kg N ha–1 for broccoli, and 80 and 60 kg N ha–1 for potato and lettuce, respectively), as well as residual effects, were tested on crop yield, N uptake, N recovery efficiency (NRE), N balance, N content in produce, mineral N in soil, product quality parameters and the nitrate content of lettuce. Mineral fertiliser (MF) served as control. Effects on yield, N uptake, NRE, N balance and product quality parameters could to a great extent be explained by estimated potentially plant-available N, which ranked in the order AD > SS > SM > AM. Results for crops fertilised with AD and SS were not significantly different from MF at the same N application rate, while AM, in agreement with its negative effect on N mineralisation, gave negative or near-neutral effects compared to the control. No residual effect was detected after the year of application. The results showed that knowledge about the N dynamics of relevant organic waste-derived fertilisers is necessary to decide on the timing and rate of application.

Abstract

During the last ten years, the Norwegian cereal grain industry has experienced large challenges due to Fusarium spp. and Fusarium mycotoxin contamination of small-grained cereals. To prevent severely contaminated grain lots from entering the grain supply chain, it is important to establish surveys for the most prevalent Fusarium spp. and mycotoxins. The objective of our study was to quantify and calculate the associations between Fusarium spp. and mycotoxins prevalent in oats and spring wheat. In the 6-year period 2004–2009, 178 grain samples of spring wheat and 289 samples of oats were collected from farmers’ fields in South East Norway. The grains were analysed for 18 different Fusarium mycotoxins by liquid chromatography–mass spectrometry. Generally, the median mycotoxin levels were higher than reported in Norwegian studies covering previous years. The DNA content of Fusarium graminearum, Fusarium culmorum, Fusarium langsethiae, Fusarium poae and Fusarium avenaceum was determined by quantitative PCR. We identified F. graminearum as the main deoxynivalenol (DON) producer in oats and spring wheat, and F. langsethiae as the main producer of HT-2 and T-2 toxins in oats. No association was observed between the quantities of F. graminearum and F. langsethiae DNA, nor between their respective mycotoxins, in oats. F. avenaceum was one of the most prevalent Fusarium species in both oats and spring wheat. The following ranking of Fusarium species was made based on the DNA concentrations of the Fusarium spp. analysed in this survey (from high to low): F. graminearum = F. langsethiae = F. avenaceum > F. poae > F. culmorum (oats); F. graminearum = F. avenaceum > F. culmorum > F. poae = F. langsethiae (spring wheat). Our results are in agreement with recently published data indicating a shift in the relative prevalence of Fusarium species towards more F. graminearum versus F. culmorum in Norwegian oats and spring wheat.

Abstract

High concentrations of the mycotoxin deoxynivalenol (DON), produced by Fusarium graminearum, have occurred frequently in Norwegian oats in recent years. Early prediction of DON levels is important for farmers, authorities and the cereal industry. In this study, the main weather factors influencing mycotoxin accumulation were identified, and two models to predict the risk of DON in oat grains in Norway were developed: (1) a warning system for farmers to decide if and when to treat with fungicide, and (2) a model for authorities and industry to use at harvest to identify potential food safety problems. Oat grain samples from farmers’ fields were collected together with weather data (2004–2013). A mathematical model was developed and used to estimate phenological windows of growth stages in oats (tillering, flowering etc.). Weather summarisations were then calculated within these windows, and the Spearman rank correlation factor was calculated between DON contamination in oats at harvest and the weather summarisations for each phenological window. DON contamination was most clearly associated with the weather conditions around flowering and close to harvest. Warm, rainy and humid weather during and around flowering increased the risk of DON accumulation in oats, as did dry periods during germination/seedling growth and tillering. Prior to harvest, warm and humid weather conditions followed by cool and dry conditions were associated with a decreased risk of DON accumulation. A prediction model including only pre-flowering weather conditions adequately forecasted the risk of DON contamination in oats, and can aid decisions about fungicide treatments.

Abstract

A controlled climate chamber microcosm experiment was conducted to examine how light affects the hourly sporulation pattern of the beneficial mite-pathogenic fungus Neozygites floridana during a 24 h cycle over a period of eight consecutive days. This was done by inoculating two-spotted spider mites (Tetranychus urticae) with N. floridana and placing them on strawberry plants for death and sporulation. Spore (primary conidia) discharge was observed using a spore trap. Two light regimes were tested: plant growth light of 150 μmol m−2 s−1 for 12 h supplied by high-pressure sodium (HPS) lamps, followed by either (i) 4 h of 50 μmol m−2 s−1 light from similar HPS lamps and then 8 h of darkness (full HPS light + reduced HPS light + darkness), or (ii) 4 h of 50 μmol m−2 s−1 red light and then 8 h of darkness (full HPS light + red light + darkness). A clear difference in hourly primary conidia discharge pattern between the two light treatments was seen, and a significant interaction effect between light treatment and hour of day during the 24 h cycle was observed. The primary conidia discharge peak for treatment (ii), which included red light, was mainly reached within the red light hours (19:00–23:00) and the dark hours (23:00–07:00). The primary conidia discharge peak for treatment (i), with HPS light only, was mainly reached within the dark hours (23:00–07:00).

Abstract

The density and diversity of springtails (Collembola) in the upper soil layer (0–3.8 cm) were studied in a perennial grass-clover ley in NW Norway during April–June 2012. The study was part of a field experiment comparing yields and soil characteristics after application of non-digested slurry (NS) versus anaerobically digested slurry (DS) from dairy cows. In total for three sampling dates, 39 species of springtails were identified. In the Control plots receiving no manure, the density level was around 30 000 individuals (ind.) m−2 throughout the whole season. Three days after slurry application (40 t ha−1), the density of springtails had dropped significantly; from 55 214 to 7410 ind. m−2 in the NS treatment and from 41 914 to 10 260 ind. m−2 in the DS treatment. After 7 weeks the densities had increased again to 54% and 38% of the initial levels in NS and DS treatments, respectively. The springtails were divided into two ecological groups based on morphology and colour. The epigeic group comprised surface-dwelling species with eye organs and pigmentation. The endogeic group comprised soil-dwelling species lacking eye organs and pigmentation, and generally with shorter extremities than those found in the epigeic group. The negative effect of manure application on density was more severe and long-lasting in the epigeic than in the endogeic group. This effect was similar for both manure types. One species (Parisotoma notabilis) comprised 50% of the epigeic population, while three Mesaphorura spp. and Stanaphorura lubbocki comprised half the endogeic population. In general, the community structure, described by the relative abundance of each species, was more affected by manure application in the epigeic than in the endogeic group. Hence, slurry application seemed to affect surface-dwelling species more negatively than soil-dwelling species, even within the small sampling depth used here. The density of endogeic species seemed to recover faster than the density of the epigeic species. A simplified classification of epigeic and endogeic springtails, based on the presence or absence of pigmentation and eyes, may be useful in studies of soil springtails where identification of the actual species is not the primary purpose.

Abstract

Aim: The objective was to assess the nitrogen provided to following crops by peas and fava beans, with varying legume residue incorporation and use of cover crops. Methods: An organic field trial with a split-split plot design with four blocks was conducted. Whole plots (spring 2014) had legumes (peas or fava beans), and sub-plots (autumn 2014) had four autumn soil treatments combining legume residue incorporation and cover crops. The sub-sub plots (spring 2015) were with and without additional manure fertilization. Results: The root biomass of both legume pre-crops had equal nitrogen (N) concentration, but total root biomass was twice as high for fava beans as for peas (5.08 vs. 2.41 kg m−2). Fava bean pre-crop with biomass incorporation and no cover crop gave a higher broccoli yield (4.10 t ha−1) than pea pre-crop with biomass incorporation and no cover crop (2.44 t ha−1). The last crop in the rotation, lettuce, also had 94% higher yield after fava beans (6.6 t ha−1) than after peas (3.4 t ha−1). Rye as a cover crop efficiently assimilated and conserved N during winter, shown by a 4- to 5-fold reduction in soil NO3-N and a nearly 2-fold reduction in soil mineral N levels compared with open soil. Additional manure application affected crop yield, with 3- and 2-fold increases in broccoli and lettuce yields, respectively. Conclusions: Fava beans as a pre-crop resulted in higher yields of broccoli and lettuce in the following seasons than peas, which is explained by the twice as high root biomass of that crop.

Abstract

A meta-analysis based on experiments in organically cultivated grasslands in Norway was conducted to quantify the effects of management factors on herbage yield and feed quality. A dataset was collected that included 496 treatment means from experiments in five studies carried out at eight locations within the latitude range 58.8 to 69.6°N between 1993 and 2010. We tested the effects of harvesting system (two vs. three cuts annually), plant developmental stage at the first cut, growth period (temperature sum) and the herbage clover proportion. Plant maturity at the first cut and herbage clover proportion largely explained herbage yield and quality of the first cut and the annual yield. The timing of the first cut also influenced the yield and herbage quality of the second cut. The analysis confirmed the importance of legume performance for herbage yield and quality from grasslands in organic production. Estimated annual herbage DM yield harvested at a standardized plant development stage and at average clover proportion was 9% higher in the two-cut than in the three-cut system. The crude protein concentration and in vitro dry matter digestibility were 17% and 3% higher, respectively, and the NDF concentration 7% lower, in the annual herbage from the three-cut than from the two-cut system. The empirical equations developed in this study may be applied to explore different options for grassland management as a basis for ration and production planning and in scenario analyses of the economic performance of individual and model farms. The equations also reveal in numeric terms the trade-offs in management practice between high yields, yield digestibility, NDF and crude protein content in organic forage production relying on red clover N2 fixation as the engine in the system.
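
The empirical equations themselves are not reproduced in the abstract. Purely as a hypothetical sketch of their general shape, with predictors matching those tested above (plant maturity stage at first cut MS, temperature sum TS, clover proportion CP, harvesting system indicator HS) and coefficients to be taken from the study, such a yield equation might look like:

\[ \widehat{\mathrm{DM\ yield}} = \beta_0 + \beta_1\,\mathrm{MS} + \beta_2\,\mathrm{TS} + \beta_3\,\mathrm{CP} + \beta_4\,\mathrm{HS} \]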

Abstract

Embodied energy in barns is found to contribute about 10–30% of total energy use on dairy farms. Nevertheless, research on the sustainability of dairy farming has largely excluded consideration of embodied energy. The main objectives of this study were to apply an established model from the residential and commercial building sector and to estimate the amount of embodied energy in the building envelopes on 20 dairy farms in Norway. Construction techniques varied across the buildings, and our results showed that the variables which contributed most significantly to levels of embodied energy were the area per cow-place, the use of concrete in walls and insulation in concrete walls. Our findings contradict the assumption that the buildings are similar and would show no significant differences. We conclude that the methodology is sufficiently flexible to accommodate different building designs and uses of materials, and allows for an efficient means of estimating embodied energy, reducing the work compared to a full material mass calculation. Choosing a design that requires less material, or materials with a low amount of embodied energy, can significantly reduce the amount of embodied energy in buildings.

Abstract

The use of artificial freezing tests and the identification of biomarkers linked to, or directly involved in, low-temperature tolerance processes could prove useful in applied strawberry breeding. This study was conducted to identify genotypes of diploid strawberry that differ in their tolerance to low-temperature stress and to investigate whether a set of candidate proteins and metabolites correlates with the level of tolerance. Seventeen Fragaria vesca, two F. nilgerrensis, two F. nubicola, and one F. pentaphylla genotypes were evaluated for low-temperature tolerance. Estimates of the temperature at which 50% of the plants survived (LT50) ranged from −4.7 to −12.0 °C across the genotypes. Among the F. vesca genotypes, LT50 varied from −7.7 °C to −12.0 °C. Among the most tolerant were three F. vesca ssp. bracteata genotypes (FDP821, NCGR424, and NCGR502), while a F. vesca ssp. californica genotype (FDP817) was the least tolerant (LT50 −7.7 °C). Alcohol dehydrogenase (ADH), total dehydrin expression, and the content of central metabolism constituents were assayed in selected plants acclimated at 2 °C. The LT50 estimates were highly correlated with the expression of ADH (r = −0.87) and total dehydrins (r = −0.82). Compounds related to the citric acid cycle were quantified in the leaves during acclimation. While several sugars and acids were significantly correlated with the LT50 estimates early in the acclimation period, only galactinol proved to be a good LT50 predictor after 28 days of acclimation (r = 0.79). It is concluded that ADH, dehydrins, and galactinol show great potential to serve as biomarkers for cold tolerance in diploid strawberry.
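
The abstract does not state the statistical procedure behind the LT50 estimates. One common approach, shown here as a hedged sketch rather than the authors' method, is to fit a logistic survival curve over the test temperatures and read off the temperature at which predicted survival is 50%; the survival fractions below are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

def survival(temp, lt50, slope):
    """Logistic survival curve: approaches 1 at mild temperatures and 0 at
    severe frost; lt50 is the temperature giving 50% survival."""
    return 1.0 / (1.0 + np.exp(-slope * (temp - lt50)))

temps = np.array([-2.0, -4.0, -6.0, -8.0, -10.0, -12.0])   # test temperatures, deg C
surv = np.array([1.0, 0.95, 0.80, 0.45, 0.15, 0.0])        # synthetic survival fractions

# Fit the two parameters and report the estimated LT50:
(lt50, slope), _ = curve_fit(survival, temps, surv, p0=(-8.0, 1.0))
print(f"LT50 = {lt50:.1f} deg C")
```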