26 results for Field evaluation
in CentAUR: Central Archive at the University of Reading - UK
Abstract:
Two field trials were conducted using established apple (Malus cv. Golden Delicious) and pear (Pyrus communis 'Williams' Bon Chrétien') trees to assess the efficacy of three commercially available systemic inducing resistance (SIR) products, Messenger (a.i. harpin protein), Phoenix (a.i. potassium phosphite) and Rigel (a.i. salicylic acid derivative), applied at four different growth stages of tree development (bud break, green cluster, 90% petal fall, early fruitlet) against the foliar pathogens Venturia inaequalis and Venturia pirina, which cause apple and pear scab respectively. A conventional synthetic fungicide (penconazole) used within the UK for apple and pear scab control was included for comparison. Little efficacy as scab protectants was demonstrated when each SIR product and penconazole were applied at only two growth stages (bud break, green cluster). However, when these compounds were applied at three or more growth stages, efficacy as scab protectants was confirmed. The synthetic fungicide penconazole provided the greatest protection against apple and pear scab in both the 2006 and 2007 field trials. There was little difference in the magnitude of scab protection conferred by each SIR agent. Results suggest that application of at least three sprays of an appropriate SIR agent between bud break and early fruitlet formation may provide a useful addition to existing methods of apple and pear scab management under field conditions. (C) 2009 Published by Elsevier Ltd.
Abstract:
The ability of PCR to detect infections of Theileria parva, the cause of East Coast Fever, in field-collected tick and bovine samples from Tanzania was evaluated. PCR-detected infection prevalence was high (15/20, 75%) in unfed adult Rhipicephalus appendiculatus ticks that had fed as nymphs on an acutely-infected calf, but low (22/836, 2.6%) in unfed adult R. appendiculatus collected from field sites in Tanzania. Tick infection prevalence was comparable to that in previous studies that used salivary gland staining to detect T. parva infection in field-collected host-seeking ticks. Of 282 naturally-exposed zebu calves, seven had PCR-positive buffy coat samples prior to detection of Theileria spp. parasites in stained buffy coat cells or lymph node biopsies. Evidence of Theileria spp. infections was detected in stained smears of lymph node biopsies from 109 calves (38.6%) and buffy coat samples from 81 (28.7%), while buffy coat samples from 66 (23.4%) were PCR-positive for T. parva. Implications of these findings for the sensitivity and specificity of the PCR assay are discussed. (C) 2003 Elsevier Science B.V. All rights reserved.
Abstract:
A detached leaf bioassay was used to determine the influence of several film forming polymers and a conventional triazole fungicide on apple scab (Venturia inaequalis (Cooke) G. Wint.) development under in vitro laboratory conditions, supported by two field trials using established apple cv. Golden Delicious to further assess the efficacy of foliar applied film forming polymers as scab protectant compounds. All film forming polymers used in this investigation (Bond, Designer, Nu-Film P, Spray Gard, Moisturin, Companion PCT12) inhibited germination of conidia and subsequent formation of appressoria, and reduced leaf scab severity in the bioassay. Regardless of treatment, there were no obvious trends in the percentage of conidia with one to four appressoria 5 days after inoculation. The synthetic fungicide penconazole produced the greatest inhibition of germination and appressorium development, and the least leaf scab severity. Under field conditions, scab severity on leaves and fruit of apple cv. Golden Delicious treated with a film forming polymer (Bond, Spray Gard, Moisturin) was less than on untreated controls. However, the greatest protection in both field trials was provided by the synthetic fungicide penconazole. Higher chlorophyll fluorescence Fv/Fm emissions in polymer- and penconazole-treated trees indicated less damage to the leaf photosynthetic system as a result of fungal invasion. In addition, higher SPAD values, as measures of leaf chlorophyll content, were recorded in polymer- and penconazole-treated trees. Application of a film forming polymer or penconazole resulted in a higher apple yield per tree at harvest in both the 2005 and 2006 field trials compared to untreated controls. Results suggest that application of an appropriate film forming polymer may provide a useful addition to existing methods of apple scab management. (C) 2008 Elsevier Ltd. All rights reserved.
Abstract:
The Cambridge Tropospheric Trajectory model of Chemistry and Transport (CiTTyCAT), a Lagrangian chemistry model, has been evaluated using atmospheric chemical measurements collected during the East Atlantic Summer Experiment 1996 (EASE '96). This field campaign was part of the UK Natural Environment Research Council's (NERC) Atmospheric Chemistry Studies in the Oceanic Environment (ACSOE) programme, conducted at Mace Head, Republic of Ireland, during July and August 1996. The model includes a description of gas-phase tropospheric chemistry, and simple parameterisations for surface deposition, mixing from the free troposphere and emissions. The model generally compares well with the measurements and is used to study the production and loss of O3 under a variety of conditions. The mean difference between the hourly O3 concentrations calculated by the model and those measured is 0.6 ppbv with a standard deviation of 8.7 ppbv. Three specific air-flow regimes were identified during the campaign – westerly, anticyclonic (easterly) and south westerly. The westerly flow is typical of background conditions for Mace Head. However, on some occasions there was evidence of long-range transport of pollutants from North America. In periods of anticyclonic flow, air parcels had collected emissions of NOx and VOCs immediately before arriving at Mace Head, leading to O3 production. The level of calculated O3 depends critically on the precise details of the trajectory, and hence on the emissions into the air parcel. In several periods of south westerly flow, low concentrations of O3 were measured which were consistent with deposition and photochemical destruction inside the tropical marine boundary layer.
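The model-measurement comparison statistic quoted above (a mean hourly O3 difference of 0.6 ppbv with a standard deviation of 8.7 ppbv) is simply the mean and spread of the paired hourly differences. A minimal sketch, using hypothetical hourly values rather than EASE '96 data:

```python
import statistics

def model_bias(modelled, measured):
    """Mean and sample standard deviation of hourly
    model-minus-measurement differences."""
    diffs = [m - o for m, o in zip(modelled, measured)]
    return statistics.mean(diffs), statistics.stdev(diffs)

# Hypothetical hourly O3 concentrations (ppbv), not EASE '96 data:
modelled = [35.2, 40.1, 38.7, 42.0]
measured = [34.0, 41.5, 37.9, 41.2]
bias, spread = model_bias(modelled, measured)
print(f"mean difference {bias:.2f} ppbv, s.d. {spread:.2f} ppbv")
```

A small mean difference with a large standard deviation, as reported for the campaign, indicates low overall bias but substantial hour-to-hour scatter.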
Abstract:
Retinal blurring resulting from the human eye's depth of focus has been shown to assist visual perception. Infinite focal depth within stereoscopically displayed virtual environments may cause undesirable effects: for instance, objects positioned at a distance in front of or behind the observer's fixation point will be perceived in sharp focus with large disparities, thereby causing diplopia. Although published research on the incorporation of synthetically generated Depth of Field (DoF) suggests that this might enhance perceived image quality, no quantitative evidence of perceptual performance gains exists. This may be due to the difficulty of dynamically generating synthetic DoF in which focal distance is actively linked to fixation distance. In this paper, such a system is described. A desktop stereographic display is used to project a virtual scene in which synthetically generated DoF is actively controlled from vergence-derived distance. A performance evaluation experiment on this system, in which subjects carried out observations in a spatially complex virtual environment, was undertaken. The virtual environment consisted of components interconnected by pipes on a distractive background, and the subject was tasked with making an observation based on the connectivity of the components. The effects of focal depth variation under static and actively controlled focal distance conditions were investigated. The results and analysis are presented, which show that performance gains may be achieved by the addition of synthetic DoF. The merits of the application of synthetic DoF are discussed.
Abstract:
Diffuse pollution, and the contribution from agriculture in particular, has become increasingly important as pollution from point sources has been addressed by wastewater treatment. Land management approaches, such as construction of field wetlands, provide one group of mitigation options available to farmers. Although field wetlands are widely used for diffuse pollution control in temperate environments worldwide, there is a shortage of evidence for the effectiveness and viability of these mitigation options in the UK. The Mitigation Options for Phosphorus and Sediment Project aims to make recommendations regarding the design and effectiveness of field wetlands for diffuse pollution control in UK landscapes. Ten wetlands have been built on four farms in Cumbria and Leicestershire. This paper focuses on sediment retention within the wetlands, estimated from annual sediment surveys in the first two years, and discusses establishment costs. It is clear that the wetlands are effective in trapping a substantial amount of sediment. Estimates of annual sediment retention suggest higher trapping rates at sandy sites (0.5–6 t ha⁻¹ yr⁻¹) compared to silty sites (0.02–0.4 t ha⁻¹ yr⁻¹) and clay sites (0.01–0.07 t ha⁻¹ yr⁻¹). Establishment costs for the wetlands ranged from £280 to £3100 and depended more on site-specific factors, such as fencing and gateways on livestock farms, than on wetland size or design. Wetlands with lower trapping rates would also have lower maintenance costs, as dredging would be required less frequently. The results indicate that field wetlands show promise for inclusion in agri-environment schemes, particularly if capital payments can be provided for establishment, to encourage uptake of these multi-functional features.
Abstract:
For an increasing number of applications, mesoscale modelling systems now aim to better represent urban areas. The complexity of processes resolved by urban parametrization schemes varies with the application. The concept of fitness-for-purpose is therefore critical for both the choice of parametrizations and the way in which the scheme should be evaluated. A systematic and objective model response analysis procedure (Multiobjective Shuffled Complex Evolution Metropolis (MOSCEM) algorithm) is used to assess the fitness of the single-layer urban canopy parametrization implemented in the Weather Research and Forecasting (WRF) model. The scheme is evaluated regarding its ability to simulate observed surface energy fluxes and the sensitivity to input parameters. Recent amendments are described, focussing on features which improve its applicability to numerical weather prediction, such as a reduced and physically more meaningful list of input parameters. The study shows a high sensitivity of the scheme to parameters characterizing roof properties in contrast to a low response to road-related ones. Problems in partitioning of energy between turbulent sensible and latent heat fluxes are also emphasized. Some initial guidelines to prioritize efforts to obtain urban land-cover class characteristics in WRF are provided. Copyright © 2010 Royal Meteorological Society and Crown Copyright.
Abstract:
Longitudinal flow bursts observed by the European Incoherent Scatter (EISCAT) radar, in association with dayside auroral transients observed from Svalbard, have been interpreted as resulting from pulses of enhanced reconnection at the dayside magnetopause. However, an alternative model has recently been proposed for a steady rate of magnetopause reconnection, in which the bursts of longitudinal flow are due to increases in the field line curvature force, associated with the By component of the magnetosheath field. We here evaluate these two models, using observations on January 20, 1990, by EISCAT and a 630-nm all-sky camera at Ny Ålesund. For both models, we predict the behavior of both the dayside flows and the 630-nm emissions on newly opened field lines. It is shown that the signatures of steady reconnection and magnetosheath By changes could possibly resemble the observed 630-nm auroral events, but only for certain locations of the observing site, relative to the ionospheric projection of the reconnection X line: however, in such cases, the flow bursts would be seen between the 630-nm transients and not within them. On the other hand, the model of reconnection rate pulses predicts that the flows will be enhanced within each 630-nm transient auroral event. The observations on January 20, 1990, are shown to be consistent with the model of enhanced reconnection rate pulses over a background level and inconsistent with the effects of periodic enhancements of the magnitude of the magnetosheath By component. We estimate that the reconnection rate within the pulses would have to be at least an order of magnitude larger than the background level between the pulses.
Abstract:
As the ideal method of assessing the nutritive value of a feedstuff, namely offering it to the appropriate class of animal and recording the production response obtained, is neither practical nor cost-effective, a range of feed evaluation techniques have been developed. Each of these balances some degree of compromise with the practical situation against data generation. However, due to the impact of animal-feed interactions over and above that of feed composition, the target animal remains the ultimate arbiter of nutritional value. In this review, current in vitro feed evaluation techniques are examined according to the degree of animal-feed interaction. Chemical analysis provides absolute values and therefore differs from the majority of in vitro methods, which simply rank feeds. However, with no host animal involvement, estimates of nutritional value are inferred by statistical association. In addition, given the costs involved, the practical value of many analyses conducted should be reviewed. The in sacco technique has made a substantial contribution both to understanding rumen microbial degradative processes and to the rapid evaluation of feeds, especially in developing countries. However, the numerous shortfalls of the technique, common to many in vitro methods, the desire to eliminate the use of surgically modified animals for routine feed evaluation, and parallel improvements in in vitro techniques will see this technique increasingly replaced. The majority of in vitro systems use substrate disappearance to assess degradation; however, this provides no information regarding the quantity of derived end-products available to the host animal. As measurement of volatile fatty acids or microbial biomass production greatly increases analytical costs, fermentation gas release, a simple and non-destructive measurement, has been used as an alternative.
However, as gas release alone is of little use, gas-based systems in which both degradation and fermentation gas release are measured simultaneously are attracting considerable interest. Alternative microbial inocula are being considered, as is the potential of using multi-enzyme systems to examine degradation dynamics. It is concluded that while chemical analysis will continue to form an indispensable part of feed evaluation, enhanced use will be made of increasingly complex in vitro systems. It is vital, however, that the function and limitations of each methodology are fully understood and that the temptation to over-interpret the data is avoided, so that the appropriate conclusions are drawn. With careful selection and correct application, in vitro systems offer powerful research tools with which to evaluate feedstuffs. (C) 2003 Elsevier B.V. All rights reserved.
Abstract:
The effectiveness of development assistance has come under renewed scrutiny in recent years. In an era of growing economic liberalisation, research organisations are increasingly being asked to account for the use of public funds by demonstrating achievements. However, in the natural resources (NR) research field, conventional economic assessment techniques have focused on quantifying the impact achieved rather than on understanding the process that delivered it. As a result, they provide limited guidance for planners and researchers charged with selecting and implementing future research. In response, "pathways" or logic models have attracted increased interest in recent years as a remedy to this shortcoming. However, as commonly applied, these suffer from two key limitations: their ability to incorporate risk and to assess variance from plan. The paper reports the results of a case study that used a Bayesian belief network approach to address these limitations and outlines its potential value as a tool to assist the planning, monitoring and evaluation of development-orientated research.
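As a minimal illustration of the arithmetic a Bayesian belief network performs when propagating evidence, consider a hypothetical two-node network; the nodes and probabilities below are invented for illustration and are not taken from the case study:

```python
# Hypothetical two-node belief network: research outcome R -> adoption A.
p_r = 0.6                                # prior P(R = success), invented
p_a_given_r = {True: 0.7, False: 0.1}    # P(A = adopted | R), invented

# Marginalise: P(A) = sum over r of P(A | R = r) * P(R = r)
p_a = p_a_given_r[True] * p_r + p_a_given_r[False] * (1 - p_r)

# Bayes' rule: observing adoption updates belief in research success.
p_r_given_a = p_a_given_r[True] * p_r / p_a

print(f"P(adoption) = {p_a:.2f}, P(success | adoption) = {p_r_given_a:.2f}")
```

Encoding risk as conditional probabilities in this way is what lets a belief network both quantify uncertainty in a planned pathway and revise it as monitoring evidence arrives.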
Abstract:
As an immunogen of the coronavirus, the nucleoprotein (N) is a potential antigen for the serological monitoring of infectious bronchitis virus (IBV). In this report, recombinant N protein from the Beaudette strain of IBV was produced and purified from Escherichia coli as well as Sf9 (insect) cells, and used for the coating of enzyme-linked immunosorbent assay (ELISA) plates. The N protein produced in Sf9 cells was phosphorylated whereas N protein from E. coli was not. Our data indicated that N protein purified from E. coli was more sensitive to anti-IBV serum than the protein from Sf9 cells. The recombinant N protein did not react with the antisera to other avian pathogens, implying that it was specific in the recognition of IBV antibodies. In addition, the data from the detection of field samples and IBV strains indicated that using the recombinant protein as coating antigen could achieve an equivalent performance to an ELISA kit based on infected material extracts as a source of antigen(s). ELISAs based on recombinant proteins are safe (no live virus), clean (only virus antigens are present), specific (single proteins can be used) and rapid (to respond to new viral strains and strains that cannot necessarily be easily cultured).
Abstract:
Reconfigurable computing is becoming an important new alternative for implementing computations. Field programmable gate arrays (FPGAs) are the ideal integrated circuit technology to experiment with the potential benefits of using different strategies of circuit specialization by reconfiguration. The final form of the reconfiguration strategy is often non-trivial to determine. Consequently, in this paper, we examine strategies for reconfiguration and, based on our experience, propose general guidelines for the tradeoffs using an area-time metric called functional density. Three experiments are set up to explore different reconfiguration strategies for FPGAs applied to a systolic implementation of a scalar quantizer used as a case study. Quantitative results for each experiment are given. The regular nature of the example means that the results can be generalized to a wide class of industry-relevant problems based on arrays.
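The area-time tradeoff captured by functional density can be illustrated with a short sketch. The formulation and figures below are a plausible reading of an area-time metric (useful operations per unit area per unit time, with reconfiguration overhead charged to time) and are not taken from the paper's experiments:

```python
def functional_density(ops, area, compute_time, reconfig_time=0.0):
    """Functional density: useful operations per unit area per unit time.
    For a run-time-reconfigured design, reconfiguration time is charged
    to total time, so a smaller specialised circuit only wins if its
    overhead is amortised over enough work. Hypothetical formulation
    and figures, not the paper's measured results."""
    return ops / (area * (compute_time + reconfig_time))

# Static, generic circuit: larger area, no reconfiguration overhead.
generic = functional_density(ops=1e6, area=400, compute_time=0.010)
# Specialised circuit: a quarter of the area, but 50 ms to reconfigure.
specialised = functional_density(ops=1e6, area=100,
                                 compute_time=0.008, reconfig_time=0.050)
print(f"generic: {generic:.0f}, specialised: {specialised:.0f}")
```

With these invented numbers the generic circuit wins despite its larger area, showing why the choice of reconfiguration strategy is non-trivial: the benefit of specialisation depends on how often reconfiguration is needed relative to useful work.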
Abstract:
A manageable, relatively inexpensive model was constructed to predict the loss of nitrogen and phosphorus from a complex catchment to its drainage system. The model used an export coefficient approach, calculating the total nitrogen (N) and total phosphorus (P) load delivered annually to a water body as the sum of the individual loads exported from each nutrient source in its catchment. The export coefficient modelling approach permits scaling up from plot-scale experiments to the catchment scale, allowing application of findings from field experimental studies at a suitable scale for catchment management. The catchment of the River Windrush, a tributary of the River Thames, UK, was selected as the initial study site. The Windrush model predicted nitrogen and phosphorus loading within 2% of observed total nitrogen load and 0.5% of observed total phosphorus load in 1989. The export coefficient modelling approach was then validated by application in a second research basin, the catchment of Slapton Ley, south Devon, which has markedly different catchment hydrology and land use. The Slapton model was calibrated within 2% of observed total nitrogen load and 2.5% of observed total phosphorus load in 1986. Both models proved sensitive to the impact of temporal changes in land use and management on water quality in both catchments, and were therefore used to evaluate the potential impact of proposed pollution control strategies on the nutrient loading delivered to the River Windrush and Slapton Ley.
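The export coefficient calculation described above reduces to a weighted sum over nutrient sources. A minimal sketch, with invented coefficients and areas rather than Windrush or Slapton values:

```python
def catchment_load(sources):
    """Annual nutrient load (kg/yr) delivered to the water body, as the
    sum of the loads exported from each source in the catchment:
    export coefficient (kg/ha/yr) times source area (ha)."""
    return sum(coeff * area for coeff, area in sources)

# Invented phosphorus export coefficients and areas, for illustration only:
sources = [
    (0.2, 1500.0),   # arable land
    (0.1, 800.0),    # permanent pasture
    (0.02, 300.0),   # woodland
]
print(catchment_load(sources))  # total P load, kg/yr
```

Because each source contributes independently, the same structure scales directly from plot experiments (which supply the coefficients) to whole-catchment predictions, and land-use change is modelled simply by adjusting the areas.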
Abstract:
Keyphrases are added to documents to help identify the areas of interest they contain. However, in a significant proportion of papers author selected keyphrases are not appropriate for the document they accompany: for instance, they can be classificatory rather than explanatory, or they are not updated when the focus of the paper changes. As such, automated methods for improving the use of keyphrases are needed, and various methods have been published. However, each method was evaluated using a different corpus, typically one relevant to the field of study of the method’s authors. This not only makes it difficult to incorporate the useful elements of algorithms in future work, but also makes comparing the results of each method inefficient and ineffective. This paper describes the work undertaken to compare five methods across a common baseline of corpora. The methods chosen were Term Frequency, Inverse Document Frequency, the C-Value, the NC-Value, and a Synonym based approach. These methods were analysed to evaluate performance and quality of results, and to provide a future benchmark. It is shown that Term Frequency and Inverse Document Frequency were the best algorithms, with the Synonym approach following them. Following these findings, a study was undertaken into the value of using human evaluators to judge the outputs. The Synonym method was compared to the original author keyphrases of the Reuters’ News Corpus. The findings show that authors of Reuters’ news articles provide good keyphrases but that more often than not they do not provide any keyphrases.
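As an illustration of the frequency-based baselines compared here, a toy TF-IDF ranking over single tokens might look as follows. This is a sketch only: the evaluated methods operate on candidate keyphrases rather than bare tokens, and the corpus below is invented:

```python
import math
from collections import Counter

def tfidf_rank(docs, doc_index, top_n=3):
    """Rank the tokens of one document by TF-IDF against a small
    corpus; terms common to every document score zero."""
    tokenised = [doc.lower().split() for doc in docs]
    n_docs = len(tokenised)
    df = Counter(t for doc in tokenised for t in set(doc))  # document frequency
    tf = Counter(tokenised[doc_index])                      # term frequency
    scores = {t: tf[t] * math.log(n_docs / df[t]) for t in tf}
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

docs = [
    "apple scab field trial fungicide",
    "tick infection field prevalence pcr",
    "wetland sediment field retention",
]
print(tfidf_rank(docs, 0))  # 'field' occurs in every document, so it drops out
```

Evaluating such methods against a common baseline of corpora, as this paper does, is what makes their rankings comparable across studies.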