963 results for evaluating methods


Relevance: 30.00%

Abstract:

Project to evaluate the role of brassica crops in the western farming system area.

Relevance: 30.00%

Abstract:

The EZ-Ject herbicide system was evaluated as a stem-injection method for controlling woody weeds in a range of situations where traditional chemical application methods have limited scope. The equipment was trialled on three Queensland weed species, pond apple (Annona glabra), velvety tree pear (Opuntia tomentosa) and yellow oleander (Cascabela thevetia), at five cartridge densities (0, 1, 2, 3 and 4) and with two herbicides (glyphosate and imazapyr). Cartridges filled with imazapyr were significantly more effective at controlling the three woody weed species than those filled with glyphosate. Injecting plants with three imazapyr cartridges resulted in plant kills ranging from 93 to 100%, compared with glyphosate kills of 17 to 100%. Pond apple was the most susceptible species, requiring one imazapyr cartridge or two glyphosate cartridges to kill 97 and 92% of the treated plants, respectively. Plant mortality increased as the number of cartridges injected increased. Mortality did not differ significantly between treatments receiving three and four imazapyr cartridges, as both cartridge densities met the criterion of injecting one cartridge per 10 cm of basal circumference, the rate recommended by the manufacturers for treating large plants (>6.35 cm in diameter at breast height). The cost of treating a weed infestation of 1500 plants ha⁻¹ with three cartridges per tree is $1070 ha⁻¹, with labour costs accounting for 16% of the total. The high chemical costs would preclude this technique from broad-scale use, but the method could have application for treating woody weeds in sensitive, high-conservation areas.
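As a rough check on these figures, the per-cartridge cost implied by the abstract can be back-calculated from the reported $1070 ha⁻¹ total and 16% labour share. The sketch below (Python) does that and shows how the chemical bill scales with infestation density; the per-cartridge price is inferred, not reported, and is illustrative only.

```python
# Back-of-envelope reconstruction of the per-hectare treatment cost reported
# above. The per-cartridge figure is inferred from the abstract's totals
# ($1070/ha, 16% labour) and is illustrative only.
plants_per_ha = 1500
cartridges_per_plant = 3
total_cost_per_ha = 1070.0
labour_share = 0.16

consumables = total_cost_per_ha * (1 - labour_share)          # ~$898.80/ha
cost_per_cartridge = consumables / (plants_per_ha * cartridges_per_plant)
print(f"implied cost per cartridge ~ ${cost_per_cartridge:.2f}")
print(f"labour component ~ ${total_cost_per_ha * labour_share:.2f}/ha")

# How the chemical bill scales with infestation density at that implied price:
for density in (250, 500, 1500):
    chem = density * cartridges_per_plant * cost_per_cartridge
    print(f"{density:4d} plants/ha -> ~ ${chem:7.2f}/ha in cartridges alone")
```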

Relevance: 30.00%

Abstract:

Aim: To develop approaches to the evaluation of programmes whose strategic objectives are to halt or slow weed spread. Location: Australia. Methods: Key aspects in the evaluation of weed containment programmes are considered. These include the relevance of models that predict the effects of management intervention on spread, the detection of spread, evidence for containment failure and metrics for absolute or partial containment. Case studies documenting either near-absolute (Orobanche ramosa L., branched broomrape) or partial (Parthenium hysterophorus (L.) King and Robinson, parthenium) containment are presented. Results: While useful for informing containment strategies, predictive models cannot be employed in containment programme evaluation owing to the highly stochastic nature of realized weed spread. The quality of observations is critical to the timely detection of weed spread. Effectiveness of surveillance and monitoring activities will be improved by utilizing information on habitat suitability and identification of sites from which spread could most compromise containment. Proof of containment failure may be difficult to obtain. The default option of assuming that a new detection represents containment failure could lead to an underestimate of containment success, the magnitude of which will depend on how often this assumption is made. Main conclusions: Evaluation of weed containment programmes will be relatively straightforward if containment is either absolute or near-absolute and may be based on total containment area and direct measures of containment failure, for example, levels of dispersal, establishment and reproduction beyond (but proximal to) the containment line. Where containment is only partial, other measures of containment effectiveness will be required. These may include changes in the rates of detection of new infestations following the institution of interventions designed to reduce dispersal, the degree of compliance with such interventions, and the effectiveness of tactics intended to reduce fecundity or other demographic drivers of spread. © 2012 Blackwell Publishing Ltd.

Relevance: 30.00%

Abstract:

Modifications of surface materials and their effects on cleanability have important impacts in many fields of activity. In this study the primary aim was to develop radiochemical methods suitable for evaluating cleanability in material research for different environments. Another aim was to investigate the effects of surface modifications on the cleanability and surface properties of plastics, ceramics and concrete materials, and also their coatings, in conditions simulating their typical environments. Several new ⁵¹Cr- and ¹⁴C-labelled soils were developed for the test situations. The new radiochemical methods were suitable for examining different surface materials and different soil types, providing quantitative information about the amount of soil on surfaces; they also take into account soil soaked into the surfaces. The supporting methods, colorimetric determination and ATP bioluminescence, provided semi-quantitative results. The results from the radiochemical and supporting methods partly correlated with each other. From a material research point of view, numerous new materials were evaluated, including both laboratory-made model materials and commercial products. Increasing the amount of plasticizer decreased the cleanability of poly(vinyl chloride) (PVC) materials. Microstructured plastic surfaces improved the cleanability of PVC from particle soils, whereas microstructuring reduced cleanability from oil soil. In the case of glazed ceramic materials, coatings affected the cleanability. Surface roughness correlated with cleanability from particle soils, and cleanability from oil soil correlated with the contact angles. Organic particle soil was removed more efficiently from TiO2-coated ceramic surfaces after UV irradiation than without UV treatment, whereas no effect was observed on the cleanability of oil soil. Coatings improved the cleanability of concrete flooring materials intended for use in animal houses.
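A minimal sketch of the kind of quantitative readout a radiotracer cleanability test yields: residual soil is estimated from counts measured before and after cleaning, with background subtraction and an optional decay correction. The helper name, count values and timing are illustrative assumptions, not the study's procedure.

```python
def residual_soil_fraction(counts_before, counts_after, background,
                           half_life_days=None, elapsed_days=0.0):
    """Fraction of radiolabelled soil left on a surface after cleaning,
    from counts recorded before and after the cleaning step. Background
    is subtracted, and an optional decay correction (e.g. ~27.7 d for
    51Cr) brings the post-cleaning measurement back to the soiling time."""
    net_before = counts_before - background
    net_after = counts_after - background
    if half_life_days:
        net_after *= 2 ** (elapsed_days / half_life_days)
    return net_after / net_before

# Hypothetical counts for one test surface, measured one day apart
print(f"{100 * residual_soil_fraction(15800, 2100, 120, 27.7, 1.0):.1f}% of soil remains")
```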

Relevance: 30.00%

Abstract:

Background: A cancer diagnosis elicits greater distress than any other medical diagnosis, and yet very few studies have evaluated the efficacy of structured online self-help therapeutic programs to alleviate this distress. This study aims to assess the efficacy over time of an internet Cognitive Behaviour Therapy (iCBT) intervention (‘Finding My Way’) in improving distress, coping and quality of life for individuals with a recent diagnosis of early stage cancer of any type. Methods/Design: The study is a multi-site Randomised Controlled Trial (RCT) seeking to enrol 188 participants who will be randomised to either the Finding My Way intervention or an attention-control condition. Both conditions are delivered online, with 6 modules released once per week and an additional booster module released one month after program completion. Participants complete online questionnaires on 4 occasions: at baseline (immediately prior to accessing the modules); post-treatment (immediately after program completion); then three and six months later. Primary outcomes are general distress and cancer-specific distress, with secondary outcomes including Health-Related Quality of Life (HRQoL), coping, health service utilisation, intervention adherence, and user satisfaction. A range of baseline measures will be assessed as potential moderators of outcomes. Eligible participants are individuals recently diagnosed with any type of cancer, being treated with curative intent, aged over 18 years, with sufficient English language literacy, internet access, and an active email account and phone number. Participants are blinded to treatment group allocation. Randomisation is computer generated and stratified by gender. Discussion: Compared to the few prior published studies, Finding My Way will be the first adequately powered trial to offer an iCBT intervention to curatively treated patients of heterogeneous cancer types in the immediate post-diagnosis/treatment period. If found efficacious, Finding My Way will assist with overcoming common barriers to face-to-face therapy in a cost-effective and accessible way, thus helping to reduce distress after cancer diagnosis and consequently decrease the cancer burden for individuals and the health system. Trial registration: Australian New Zealand Clinical Trials Registry ACTRN12613000001796 (16.10.13).
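For readers unfamiliar with computer-generated, gender-stratified randomisation, the sketch below shows one generic way it can be implemented (permuted blocks within each stratum). The block size, seed, arm labels and function names are illustrative assumptions, not details taken from the trial protocol.

```python
import random

def permuted_block_randomisation(participants, block_size=4, seed=2013):
    """Assign participants to 'intervention' or 'attention-control' using
    permuted blocks within each gender stratum. A generic sketch of
    gender-stratified randomisation; block size and seed are illustrative."""
    rng = random.Random(seed)
    allocation = {}
    strata = {}
    for pid, gender in participants:
        strata.setdefault(gender, []).append(pid)
    for gender, ids in strata.items():
        for start in range(0, len(ids), block_size):
            block = ["intervention", "attention-control"] * (block_size // 2)
            rng.shuffle(block)  # random order of arms within the block
            for pid, arm in zip(ids[start:start + block_size], block):
                allocation[pid] = arm
    return allocation

# Hypothetical cohort of 12 participants with recorded gender
cohort = [(f"P{i:03d}", "female" if i % 2 else "male") for i in range(12)]
for pid, arm in sorted(permuted_block_randomisation(cohort).items()):
    print(pid, arm)
```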

Relevance: 30.00%

Abstract:

With the increased utilization of advanced composites in strategic industries, the concept of Structural Health Monitoring (SHM), with its inherent advantages, is gaining ground over conventional methods of NDE and NDI. The most attractive feature of this concept is on-line evaluation using embedded sensors. Consequently, developing methodologies and identifying appropriate sensors, such as PVDF films, becomes the key to exploiting the new concept, and of the methods used for on-line evaluation, acoustic emission has been the most effective. Thus, Acoustic Emission (AE) generated during static tensile loading of glass fiber reinforced plastic composites was monitored using a polyvinylidene fluoride (PVDF) film sensor. The frequency response of the film sensor was obtained with pencil lead breakage tests to choose the appropriate band of operation. The specimens considered for the experiments were chosen to characterize the differences in the operation of the failure mechanisms through AE parametric analysis. The results of the investigations, characterized using AE parameters, indicate that a PVDF film sensor is effective as an AE sensor for on-line structural health monitoring.
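AE parametric analysis typically reduces each sensor record to a handful of hit parameters. The sketch below shows a generic extraction of counts, peak amplitude, duration and rise time from a simulated burst; the threshold, sampling rate and test signal are illustrative assumptions, not the study's settings.

```python
import numpy as np

def ae_hit_parameters(signal, fs, threshold):
    """Extract classical AE hit parameters (counts, peak amplitude, duration,
    rise time) from one sensor record; 'signal' is the sensor output in volts
    and 'threshold' an absolute voltage level (both illustrative)."""
    x = np.abs(np.asarray(signal, dtype=float))
    above = np.where(x > threshold)[0]
    if above.size == 0:
        return None
    first, last = above[0], above[-1]
    peak_idx = first + np.argmax(x[first:last + 1])
    # AE counts: number of upward threshold crossings
    crossings = np.sum((x[1:] > threshold) & (x[:-1] <= threshold))
    return {
        "counts": int(crossings),
        "peak_amplitude_V": float(x[peak_idx]),
        "duration_s": (last - first) / fs,
        "rise_time_s": (peak_idx - first) / fs,
    }

# Example: a decaying 150 kHz burst sampled at 2 MHz
fs = 2e6
t = np.arange(0, 1e-3, 1 / fs)
burst = np.exp(-t / 2e-4) * np.sin(2 * np.pi * 150e3 * t)
print(ae_hit_parameters(burst, fs, threshold=0.1))
```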

Relevance: 30.00%

Abstract:

This paper evaluates methods of multiclass support vector machine (SVM) classification for effective use in distance relay coordination. It also describes a strategy of supportive systems to aid the conventional protection philosophy in situations where protection systems have maloperated and/or information is missing, and to provide selective and secure coordination. SVMs have considerable potential as zone classifiers for distance relay coordination. This typically requires a multiclass SVM classifier that can learn the relationship between the reach of the different zones and the apparent impedance trajectory during a fault. Several methods have been proposed for multiclass classification, in which several binary SVM classifiers are typically combined; some authors have instead extended binary SVM classification to a single-step optimization that considers all classes at once. In this paper, the one-step multiclass, one-against-all, and one-against-one methods are compared with respect to accuracy, number of iterations, number of support vectors, training time, and testing time. The performance analysis of these three methods is presented on three data sets belonging to training and testing patterns of three supportive systems for a region and part of a network, an equivalent 526-bus system of the practical Indian Western grid.
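A hedged illustration of the comparison described here, using scikit-learn on a synthetic three-class data set standing in for apparent-impedance samples labelled by relay zone (the paper's real data sets are not reproduced): one-against-all, one-against-one, and a single-optimization (Crammer-Singer) formulation are trained, timed and scored side by side.

```python
import time
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsRestClassifier, OneVsOneClassifier
from sklearn.svm import SVC, LinearSVC
from sklearn.metrics import accuracy_score

# Synthetic stand-in for (R, X) apparent-impedance samples labelled by zone
X, y = make_classification(n_samples=3000, n_features=2, n_informative=2,
                           n_redundant=0, n_classes=3, n_clusters_per_class=1,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "one-against-all": OneVsRestClassifier(SVC(kernel="rbf", C=10.0)),
    "one-against-one": OneVsOneClassifier(SVC(kernel="rbf", C=10.0)),
    "single optimization (Crammer-Singer)": LinearSVC(multi_class="crammer_singer",
                                                      max_iter=10000),
}

for name, model in models.items():
    t0 = time.perf_counter()
    model.fit(X_tr, y_tr)                      # training time
    t_train = time.perf_counter() - t0
    t0 = time.perf_counter()
    acc = accuracy_score(y_te, model.predict(X_te))  # testing time and accuracy
    t_test = time.perf_counter() - t0
    print(f"{name:38s} acc={acc:.3f} train={t_train:.3f}s test={t_test:.3f}s")
```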

Relevance: 30.00%

Abstract:

Instrument landing systems (ILS) and the upcoming microwave landing systems (MLS) are (or are planned to be) very important navigational aids at most major airports of the world. However, their performance is directly affected by the features of the site in which they are located. Currently, validation of the ILS performance is through costly and time-consuming experimental methods. This paper outlines a powerful and versatile analytical approach for performing the site evaluation, as an alternative to the experimental methods. The approach combines a multi-plate model for the terrain with a powerful and exhaustive ray-tracing technique and a versatile and accurate formulation for estimating the electromagnetic fields due to the array antenna in the presence of the terrain. It can model the effects of the undulation, the roughness and the impedance (depending on the soil type) of the terrain at the site. The results computed from the analytical method are compared with the actual measurements and good agreement is shown. Considerations for site effects on MLS are also outlined.
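As a much-simplified companion to the multi-plate ray-tracing model, the sketch below evaluates the classic two-ray field (direct ray plus a single ground reflection) over a flat, perfectly reflecting plane. The geometry, frequency and reflection coefficient are illustrative assumptions, not the paper's rough, finite-impedance, undulating terrain model.

```python
import numpy as np

def two_ray_field_ratio(d, h_tx, h_rx, freq_hz, gamma=-1.0):
    """Ratio of total to free-space (direct-only) field magnitude for a
    transmitter and receiver over a single flat, perfectly reflecting
    ground plane (reflection coefficient gamma = -1). A minimal stand-in
    for a multi-plate terrain model."""
    c = 3e8
    k = 2 * np.pi * freq_hz / c
    r_direct = np.sqrt(d**2 + (h_tx - h_rx)**2)
    r_reflect = np.sqrt(d**2 + (h_tx + h_rx)**2)
    total = (np.exp(-1j * k * r_direct) / r_direct
             + gamma * np.exp(-1j * k * r_reflect) / r_reflect)
    return np.abs(total) * r_direct  # normalised so the direct ray alone gives 1

# Glide-path band (~330 MHz), 3 m antenna phase centre, receiver on approach
for d in (1000.0, 3000.0, 6000.0):
    print(d, round(two_ray_field_ratio(d, h_tx=3.0, h_rx=60.0, freq_hz=330e6), 3))
```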

Relevance: 30.00%

Abstract:

Restricted Boltzmann Machines (RBMs) can be used either as classifiers or as generative models. The quality of a generative RBM is measured by the average log-likelihood on test data. Owing to the high computational complexity of evaluating the partition function, exact calculation of the test log-likelihood is very difficult. In recent years several estimation methods have been suggested for approximate computation of the test log-likelihood. In this paper we present an empirical comparison of the main estimation methods, namely the AIS algorithm for estimating the partition function, the CSL method for directly estimating the log-likelihood, and the RAISE algorithm, which combines these two ideas.
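To make the quantities concrete, the sketch below computes the exact test log-likelihood of a tiny binary RBM by brute-force enumeration of hidden states; this is precisely the computation that AIS, CSL and RAISE approximate for realistically sized models. Sizes and parameter values are illustrative.

```python
import numpy as np
from itertools import product
from scipy.special import logsumexp

def free_energy(v, W, b, c):
    """F(v) = -b.v - sum_j softplus(c_j + (W^T v)_j) for a binary RBM."""
    return -v @ b - np.sum(np.logaddexp(0.0, c + v @ W))

def exact_log_partition(W, b, c):
    """log Z by enumerating hidden states (visible units summed out
    analytically); only feasible for a handful of hidden units."""
    terms = [h @ c + np.sum(np.logaddexp(0.0, b + W @ h))
             for h in (np.array(bits, dtype=float)
                       for bits in product([0, 1], repeat=c.size))]
    return logsumexp(terms)

def avg_test_log_likelihood(V, W, b, c):
    """Average log p(v) = -F(v) - log Z over the test vectors V."""
    log_Z = exact_log_partition(W, b, c)
    return np.mean([-free_energy(v, W, b, c) - log_Z for v in V])

# Tiny RBM with illustrative random parameters and random binary test vectors
rng = np.random.default_rng(0)
n_visible, n_hidden = 6, 4
W = 0.1 * rng.standard_normal((n_visible, n_hidden))
b = np.zeros(n_visible)
c = np.zeros(n_hidden)
V_test = rng.integers(0, 2, size=(20, n_visible)).astype(float)
print("exact average test log-likelihood:", avg_test_log_likelihood(V_test, W, b, c))
```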

Relevance: 30.00%

Abstract:

The work presented here is part of a larger study to identify novel technologies and biomarkers for early Alzheimer disease (AD) detection, and it focuses on evaluating the suitability of a new approach for early AD diagnosis by non-invasive methods. The purpose is to examine, in a pilot study, the potential of applying intelligent algorithms to speech features obtained from suspected patients in order to contribute to improving the diagnosis of AD and its degree of severity. To this end, Artificial Neural Networks (ANN) have been used for the automatic classification of the two classes (AD and control subjects). Two aspects of the patients' speech have been analyzed for feature selection: Spontaneous Speech and Emotional Response. Not only linear features but also non-linear ones, such as Fractal Dimension, have been explored. The approach is non-invasive, low cost and without any side effects. The experimental results obtained were very satisfactory and promising for the early diagnosis and classification of AD patients.
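One of the non-linear features mentioned, Fractal Dimension, can be estimated from a speech frame with Higuchi's algorithm; a minimal sketch follows. The kmax value and the test signals are illustrative choices, not the study's configuration.

```python
import numpy as np

def higuchi_fd(x, kmax=10):
    """Estimate the Higuchi fractal dimension of a 1-D signal."""
    x = np.asarray(x, dtype=float)
    N = x.size
    log_k, log_L = [], []
    for k in range(1, kmax + 1):
        Lk = []
        for m in range(k):
            idx = np.arange(m, N, k)
            if idx.size < 2:
                continue
            # mean curve length of the sub-sampled series, with Higuchi's
            # normalisation factor for the number of points used
            dist = np.abs(np.diff(x[idx])).sum()
            norm = (N - 1) / ((idx.size - 1) * k)
            Lk.append(dist * norm / k)
        if Lk:
            log_k.append(np.log(1.0 / k))
            log_L.append(np.log(np.mean(Lk)))
    # slope of log L(k) versus log(1/k) is the fractal dimension
    slope, _ = np.polyfit(log_k, log_L, 1)
    return slope

rng = np.random.default_rng(0)
print("white noise:", round(higuchi_fd(rng.standard_normal(2000)), 2))  # near 2
t = np.linspace(0, 1, 2000)
print("pure tone  :", round(higuchi_fd(np.sin(2 * np.pi * 5 * t)), 2))  # near 1
```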

Relevance: 30.00%

Abstract:

In 2006, the National Marine Fisheries Service, NOAA, initiated development of a national bycatch report that would provide bycatch estimates for U.S. commercial fisheries at the fishery and species levels for fishes, marine mammals, sea turtles, and seabirds. As part of this project, the need to quantify the relative quality of available bycatch data and estimation methods was identified. Working collaboratively with fisheries managers and scientists across the nation, a system of evaluation was developed. Herein we describe the development of this system (the “tier system”), its components, and its application. We also discuss the value of the tier system in allowing fisheries managers to identify research needs and efficiently allocate limited resources toward those areas that will result in the greatest improvement to bycatch data and estimation quality.

Relevance: 30.00%

Abstract:

Management of West Coast groundfish resources by the Pacific Fishery Management Council involves Federal government and academic scientists conducting stock assessments, generally using the stock synthesis framework, applying the 40-10 rule to determine harvest guidelines for resources that are not overfished and conducting rebuilding analyses to determine harvest guidelines for resources that have been designated as overfished. However, this management system has not been evaluated in terms of its ability to satisfy the National Standard 1 goals of the Sustainable Fisheries Act. A Monte Carlo simulation framework is therefore outlined that can be used to make such evaluations. Based on simulations tailored to a situation similar to that of managing the widow rockfish (Sebastes entomelas) resource, it is shown that catches during recovery and thereafter are likely to be highly variable (up to ±30% from one year to the next). Such variability is far greater than has been presented to the decision makers to date. Reductions in interannual variability in catches through additional data collection are, however, unlikely. Rather, improved performance will probably arise from better methods for predicting future recruitment. Rebuilding analyses include quantities such as the year to which the desired probability of recovery applies. The estimates of such quantities are, however, very poorly determined.
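The 40-10 rule referred to above has a threshold-and-ramp structure; the sketch below implements a simplified version (full target catch at or above 40% depletion, zero at or below 10%, a linear ramp in between) to show the shape of the adjustment. It is not the Council's exact parameterization, and the catch figure is hypothetical.

```python
import numpy as np

def forty_ten_fraction(depletion, lower=0.10, upper=0.40):
    """Fraction of the target catch allowed under a simplified 40-10-style
    threshold rule: full catch at/above 40% of unfished biomass, zero
    at/below 10%, linear ramp in between (a simplified stand-in for the
    Council's exact adjustment)."""
    return float(np.clip((depletion - lower) / (upper - lower), 0.0, 1.0))

target_catch = 2000.0  # hypothetical fully-utilised catch (t)
for d in (0.05, 0.10, 0.25, 0.40, 0.60):
    print(f"depletion {d:.2f} -> catch limit {target_catch * forty_ten_fraction(d):7.1f} t")
```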

Relevance: 30.00%

Abstract:

Changes in the quality of intermediate moisture (IM) fish during storage at 38°C were monitored by periodically assessing the moisture content, pH, acid value, peroxide value and thiobarbituric acid (TBA) value. The results adequately portrayed the hydrolysis and peroxidation of fats and the concomitant protein degradation and crosslinking reactions that have been shown by more sophisticated methods to occur in intermediate moisture fish. Since these changes markedly affect the organoleptic quality, acceptability/shelf-life and nutritive value of IM flesh foods, their predictability by simple fat analytical techniques is of practical value where the more sophisticated monitoring techniques are not feasible.

Relevance: 30.00%

Abstract:

Iteration is unavoidable in the design process and should be incorporated when planning and managing projects in order to minimize surprises and reduce schedule distortions. However, planning and managing iteration is challenging because the relationships between its causes and effects are complex. Most approaches which use mathematical models to analyze the impact of iteration on the design process focus on a relatively small number of its causes and effects. Therefore, insights derived from these analytical models may not be robust under a broader consideration of potential influencing factors. In this article, we synthesize an explanatory framework which describes the network of causes and effects of iteration identified from the literature, and introduce an analytic approach which combines a task network modeling approach with System Dynamics simulation. Our approach models the network of causes and effects of iteration alongside the process architecture which is required to analyze the impact of iteration on design process performance. We show how this allows managers to assess the impact of changes to process architecture and to management levers which influence iterative behavior, accounting for the fact that these changes can occur simultaneously and can accumulate in non-linear ways. We also discuss how the insights resulting from this analysis can be visualized for easier consumption by project participants not familiar with simulation methods. Copyright © 2010 by ASME.
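A minimal System Dynamics flavour of the iterative behaviour discussed here is the classic rework cycle, in which a fraction of completed work is defective and returns to the backlog once discovered; the sketch below integrates it with Euler steps. All parameter values are illustrative, and the model is far simpler than the combined task-network and System Dynamics approach described in the paper.

```python
import numpy as np

def simulate_rework(total_tasks=100.0, capacity=5.0, quality=0.8,
                    discovery_delay=4.0, dt=0.25, horizon=60.0):
    """Classic rework cycle: tasks flow from work-to-do to work-done; a
    fraction (1 - quality) becomes undiscovered rework that is found after
    an average delay and returns to work-to-do. Illustrative parameters."""
    work_to_do, work_done, undiscovered = total_tasks, 0.0, 0.0
    history = []
    for step in np.arange(0.0, horizon, dt):
        completion = min(capacity, work_to_do / dt)   # tasks processed per week
        discovery = undiscovered / discovery_delay    # rework found per week
        work_to_do += (discovery - completion) * dt
        work_done += completion * quality * dt
        undiscovered += (completion * (1 - quality) - discovery) * dt
        history.append((step, work_done))
    return history

for week, done in simulate_rework()[::40]:
    print(f"week {week:5.1f}: {done:6.1f} tasks accepted")
```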

Relevance: 30.00%

Abstract:

Background: The anti-HIV-1 neutralizing antibody assay is widely used in AIDS vaccine research and other experimental and clinical studies. The vital dye staining method applied in the detection of anti-HIV-1 neutralizing antibodies has been used in many laboratories. However, unknown factor(s) in sera or plasma affected cell growth and caused apparent protection when the tested sera or plasma were continuously maintained in the cell culture. In addition, the poor solubility of neutral red in medium (such as RPMI-1640) also limited the use of this assay. Methods: In this study, the human T cell line C8166 was used as host cells, and 3-(4,5-dimethyl-2-thiazolyl)-2,5-diphenyl-2H-tetrazolium bromide (MTT), instead of neutral red, was used as the vital dye. In order to avoid the effect of the unknown factor(s), the tested sera or plasma were removed by a washout procedure after the initial 3-6 h of culture. Results: The new assay eliminated the effect of the tested sera or plasma on cell growth, improved the reliability of detection of anti-HIV-1 neutralizing antibody, and showed excellent agreement with the p24 antigen method. Conclusion: The results suggest that the improved assay is relatively simple, highly reproducible, cost-effective, and reliable for evaluating anti-HIV-1 neutralizing antibodies in sera or plasma.
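The dye-reduction readout described here is normally converted to percent protection relative to the cell and virus controls; a generic sketch of that calculation follows. The formula is the standard control-normalised form for dye-reduction neutralisation assays, and the optical densities and dilutions are hypothetical, not the study's data.

```python
def percent_neutralisation(od_sample, od_virus_control, od_cell_control):
    """Percent protection from MTT optical densities: 0% corresponds to the
    virus control (maximal cytopathic effect) and 100% to uninfected cells.
    Variable names are illustrative, not the paper's notation."""
    return 100.0 * (od_sample - od_virus_control) / (od_cell_control - od_virus_control)

# Hypothetical plate readings: serum dilution series on C8166 cells
cell_ctrl, virus_ctrl = 1.20, 0.25
for dilution, od in [(20, 1.10), (80, 0.85), (320, 0.45), (1280, 0.28)]:
    pct = percent_neutralisation(od, virus_ctrl, cell_ctrl)
    print(f"1:{dilution:<5d} {pct:5.1f}% protection")
```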