893 results for "Rapid assessment method"
Abstract:
We examined nest site selection by the Puerto Rican Parrot, a secondary cavity nester, at several spatial scales using the nest entrance as the central focal point relative to 20 habitat and spatial variables. The Puerto Rican Parrot is unique in that, since 2001, all known nesting in the wild has occurred in artificial cavities, which also provided us with an opportunity to evaluate nest site selection without confounding effects of the actual nest cavity characteristics. Because of the data limitations imposed by the small population size of this critically endangered endemic species, we employed a distribution-free statistical simulation approach to assess site selection relative to characteristics of used and unused nesting sites. Nest sites selected by Puerto Rican Parrots were characterized by greater horizontal and vertical visibility from the nest entrance, greater density of mature sierra palms, and a more westerly and leeward orientation of nest entrances than unused sites. Our results suggest that nest site selection in this species is an adaptive response to predation pressure, to which the parrots respond by selecting nest sites offering advantages in predator detection and avoidance at all stages of the nesting cycle. We conclude that identifying and replicating the "nest gestalt" of successful nesting sites may facilitate conservation efforts for this and other endangered avian species.
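The distribution-free simulation approach described above can be illustrated with a simple two-sample permutation test, which compares used and unused sites without assuming any particular sampling distribution, a useful property when the population is very small. The visibility values and variable names below are invented for illustration and are not the study's data:

```python
import random

# Hypothetical horizontal visibility (m) at used and unused nest sites;
# these numbers are illustrative only.
used = [14.2, 15.1, 13.8, 16.0, 14.9, 15.5]
unused = [11.0, 12.3, 10.8, 13.1, 11.9, 12.5]

def mean(xs):
    return sum(xs) / len(xs)

def permutation_test(a, b, n_iter=10000, seed=42):
    """Two-sample permutation test on the difference in means:
    repeatedly reshuffle group labels and see how often a difference
    at least as extreme as the observed one arises by chance."""
    rng = random.Random(seed)
    observed = mean(a) - mean(b)
    pooled = a + b
    count = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        diff = mean(pooled[:len(a)]) - mean(pooled[len(a):])
        if abs(diff) >= abs(observed):
            count += 1
    return observed, count / n_iter

obs, p = permutation_test(used, unused)
print(f"observed difference = {obs:.2f} m, p = {p:.4f}")
```

Because the null distribution is built from the data themselves, the test remains valid at the very small sample sizes a critically endangered population imposes.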
Abstract:
Pair Programming is a technique from the software development method eXtreme Programming (XP) whereby two programmers work closely together to develop a piece of software. A similar approach has been used to develop a set of Assessment Learning Objects (ALO). Three members of academic staff have developed a set of ALOs for a total of three different modules (two with overlapping content). In each case a pair programming approach was taken to the development of the ALO. In addition to demonstrating the efficiency of this approach in terms of staff time spent developing the ALOs, a statistical analysis of the outcomes for students who made use of the ALOs is used to demonstrate the effectiveness of the ALOs produced via this method.
Abstract:
The freshwaters of the Mersey Basin have been seriously polluted for over 200 years. Anecdotal evidence suggests that the water quality was relatively clean before the start of the Industrial Revolution. The development of the cotton and chemical industries increased the pollution load to rivers, and consequently a decline in biota supported by the water was observed. Industrial prosperity led to a rapid population increase and an increase in domestic effluent. Poor treatment of this waste meant that it was a significant pollutant. As industry intensified during the 19th century, the mix of pollutants grew more complex. Eventually, in the 1980s, the government acknowledged the problem and more effort was made to improve the water quality. Knowledge of social and economic history, as well as anecdotal evidence, has been used in this paper to extrapolate the changes in water quality that occurred.
Abstract:
Stable isotopic characterization of chlorine in chlorinated aliphatic pollution is potentially very valuable for risk assessment and for monitoring remediation or natural attenuation. The approach has been underused because of the complexity of the analysis and the time it takes. We have developed a new method that eliminates sample preparation. Gas chromatography produces individually eluted sample peaks for analysis. The He carrier gas is mixed with Ar and introduced directly into the torch of a multicollector ICPMS. The MC-ICPMS is run at a high mass resolution of ≥10,000 to eliminate the interference of ArH (mass 37) with 37Cl. The standardization approach is similar to that for continuous-flow stable isotope analysis, in which sample and reference materials are measured successively. We have measured PCE relative to a laboratory TCE standard mixed with the sample. Solvent samples of 200 nmol to 1.3 µmol (24–165 µg of Cl) were measured. The PCE gave the same value relative to the TCE as measured by the conventional method, with a precision of 0.12‰ (2 × standard error) but poorer precision for the smaller samples.
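The standardization described above, measuring sample and reference successively, reduces in essence to computing a delta value against the mean of the reference peaks bracketing the sample. A minimal sketch of that arithmetic, with made-up 37Cl/35Cl ratios rather than measured data:

```python
# Sketch of delta-value computation with sample-standard bracketing.
# The ratio values below are illustrative, not measurements.

def delta_per_mil(r_sample, r_standard):
    """delta 37Cl (per mil) of a sample relative to a reference ratio."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Bracketing: average the reference peaks eluting before and after the sample.
r_ref_before = 0.319766
r_ref_after = 0.319770
r_sample = 0.319801

r_ref = (r_ref_before + r_ref_after) / 2.0
print(f"delta37Cl = {delta_per_mil(r_sample, r_ref):+.2f} per mil")
```

Averaging the two bracketing reference measurements compensates for slow instrumental drift between peaks, which is the point of measuring sample and reference successively.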
Abstract:
Lava domes comprise core, carapace, and clastic talus components. They can grow endogenously, by inflation of a core, and/or exogenously, with the extrusion of shear-bounded lobes and whaleback lobes at the surface. Internal structure is paramount in determining the extent to which lava dome growth evolves stably, or conversely the propensity for collapse. The more core lava that exists within a dome, in both relative and absolute terms, the more explosive energy is available, both for large pyroclastic flows following collapse and in particular for lateral blast events following very rapid removal of lateral support to the dome. Knowledge of the location of the core lava within the dome is also relevant for hazard assessment purposes. A spreading toe, or lobe of core lava, over a talus substrate may be both relatively unstable and likely to accelerate to more violent activity during the early phases of a retrogressive collapse. Soufrière Hills Volcano, Montserrat has been erupting since 1995 and has produced numerous lava domes that have undergone repeated collapse events. We consider one continuous dome growth period, from August 2005 to May 2006, that resulted in a dome collapse event on 20th May 2006. The collapse event lasted 3 h, removing the whole dome plus dome remnants from a previous growth period in an unusually violent and rapid collapse event. We use an axisymmetrical computational Finite Element Method model for the growth and evolution of a lava dome. Our model comprises evolving core, carapace and talus components based on axisymmetrical endogenous dome growth, which permits us to model the interface between talus and core. Despite explicitly modelling only axisymmetrical endogenous dome growth, our core–talus model simulates many of the observed growth characteristics of the 2005–2006 SHV lava dome well. Further, our simulations can replicate large-scale exogenous characteristics when a considerable volume of talus has accumulated around the lower flanks of the dome. Model results suggest that dome core can override talus within a growing dome, potentially generating a region of significant weakness and a potential locus for collapse initiation.
Abstract:
A rapid capillary electrophoresis method was developed for the simultaneous determination of artificial sweeteners, preservatives and colours used as additives in carbonated soft drinks. Resolution between all additives occurring together in soft drinks was successfully achieved within a 15-min run time by employing the micellar electrokinetic chromatography mode with a 20 mM carbonate buffer at pH 9.5 as the aqueous phase and 62 mM sodium dodecyl sulfate as the micellar phase. By using a diode-array detector to monitor the UV-visible range (190-600 nm), the identity of sample components, suggested by migration time, could be confirmed by spectral matching relative to standards.
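Spectral matching of the kind used for peak confirmation can be sketched as a correlation between a peak's diode-array spectrum and the spectrum of a standard acquired under the same conditions. The absorbance values below are invented for illustration:

```python
# Illustrative sketch of confirming a peak identity by spectral matching:
# compare a UV-visible spectrum against a standard's spectrum with a
# Pearson correlation coefficient. All spectra here are made up.
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Absorbance sampled at a few wavelengths across the 190-600 nm range.
standard = [0.82, 0.65, 0.40, 0.21, 0.05, 0.02]
unknown  = [0.80, 0.66, 0.41, 0.20, 0.06, 0.02]  # candidate peak
mismatch = [0.10, 0.30, 0.70, 0.55, 0.20, 0.04]  # a different compound

print(f"match score:    {pearson(standard, unknown):.3f}")
print(f"mismatch score: {pearson(standard, mismatch):.3f}")
```

A score near 1 supports the identity suggested by migration time, while a low score flags a co-migrating but spectrally distinct component.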
Abstract:
The paper presents the method and findings of a Delphi expert survey to assess the impact of UK government farm animal welfare policy, farm assurance schemes and major food retailer specifications on the welfare of animals on farms. Two case-study livestock production systems are considered, dairy and cage egg production. The method identifies how well the various standards perform in terms of their effects on a number of key farm animal welfare variables, and provides estimates of the impact of the three types of standard on the welfare of animals on farms, taking account of producer compliance. The study highlights that there remains considerable scope for government policy, together with farm assurance schemes, to improve the welfare of farm animals by introducing standards that address key factors affecting animal welfare and by increasing compliance of livestock producers. There is a need for more comprehensive, regular and random surveys of on-farm welfare to monitor compliance with welfare standards (legislation and welfare codes) and the welfare of farm animals over time, and a need to collect farm data on the costs of compliance with standards.
Abstract:
Break-even analyses of the costs and benefits of six alternative bovine tuberculosis (bTB) control strategies were undertaken. The results show that some strategies, such as zoning, would require only relatively small reductions in bTB incidence to be cost-effective, whilst others, such as proactive badger removal, would require a substantial and relatively rapid reduction in bTB incidence to be worthwhile.
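The break-even logic can be sketched in a few lines: a strategy pays for itself when the value of the bTB breakdowns it prevents equals its cost. All figures below are hypothetical and purely illustrative, not those of the study:

```python
# Minimal break-even sketch: the fractional reduction in bTB incidence a
# control strategy must achieve for avoided-breakdown savings to offset
# its annual cost. All numbers are hypothetical.

def break_even_reduction(annual_cost, cost_per_breakdown, baseline_breakdowns):
    """Fraction by which incidence must fall so that savings from avoided
    breakdowns equal the strategy's annual cost."""
    max_saving = cost_per_breakdown * baseline_breakdowns
    return annual_cost / max_saving

# Hypothetical figures: a cheap strategy (e.g. zoning) vs. a costly one
# (e.g. proactive badger removal).
cheap = break_even_reduction(annual_cost=0.5e6, cost_per_breakdown=30_000,
                             baseline_breakdowns=400)
costly = break_even_reduction(annual_cost=6.0e6, cost_per_breakdown=30_000,
                              baseline_breakdowns=400)
print(f"cheap strategy breaks even at a {cheap:.0%} reduction in incidence")
print(f"costly strategy breaks even at a {costly:.0%} reduction")
```

The asymmetry the abstract reports falls straight out of this arithmetic: the higher a strategy's cost, the larger (and faster) the incidence reduction needed before it is worthwhile.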
Abstract:
An aggregated farm-level index, the Agri-environmental Footprint Index (AFI), based on multiple criteria methods and representing a harmonised approach to evaluation of EU agri-environmental schemes is described. The index uses a common framework for the design and evaluation of policy that can be customised to locally relevant agri-environmental issues and circumstances. Evaluation can be strictly policy-focused, or broader and more holistic in that context-relevant assessment criteria that are not necessarily considered in the evaluated policy can nevertheless be incorporated. The Index structure is flexible, and can respond to diverse local needs. The process of Index construction is interactive, engaging farmers and other relevant stakeholders in a transparent decision-making process that can ensure acceptance of the outcome, help to forge an improved understanding of local agri-environmental priorities and potentially increase awareness of the critical role of farmers in environmental management. The structure of the AFI facilitates post-evaluation analysis of relative performance in different dimensions of the agri-environment, permitting identification of current strengths and weaknesses, and enabling future improvement in policy design. Quantification of the environmental impact of agriculture beyond the stated aims of policy using an 'unweighted' form of the AFI has potential as the basis of an ongoing system of environmental audit within a specified agricultural context.
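The aggregation behind a multi-criteria index of this kind can be sketched as a weighted sum of normalised criterion scores. The criteria, weights and farm values below are invented for illustration and do not reflect the actual AFI specification:

```python
# Sketch of an aggregated multi-criteria farm index in the spirit of the
# AFI: score each locally chosen criterion on a 0-1 scale, then combine
# with stakeholder-agreed weights. All criteria and numbers are invented.

CRITERIA = {  # criterion: (weight, worst value, best value)
    "hedgerow_length_m_per_ha": (0.40, 0.0, 120.0),
    "nitrogen_surplus_kg_per_ha": (0.35, 150.0, 30.0),  # lower is better
    "habitat_area_fraction": (0.25, 0.0, 0.15),
}

def normalise(value, worst, best):
    """Linear 0-1 score, clamped; handles 'lower is better' criteria
    because worst > best reverses the direction automatically."""
    score = (value - worst) / (best - worst)
    return min(1.0, max(0.0, score))

def afi(farm):
    """Weighted sum of normalised scores; 0 = worst, 1 = best on all."""
    return sum(w * normalise(farm[c], worst, best)
               for c, (w, worst, best) in CRITERIA.items())

farm = {"hedgerow_length_m_per_ha": 90.0,
        "nitrogen_surplus_kg_per_ha": 60.0,
        "habitat_area_fraction": 0.09}
print(f"AFI score: {afi(farm):.3f}")
```

Keeping per-criterion scores visible before weighting is what makes the post-evaluation analysis of strengths and weaknesses, and the 'unweighted' audit variant, possible.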
Statistical evaluation of the fixed concentration procedure for acute inhalation toxicity assessment
Abstract:
The conventional method for the assessment of acute inhalation toxicity (OECD Test Guideline 403, 1981) uses death of animals as an endpoint to identify the median lethal concentration (LC50). A new OECD Testing Guideline called the Fixed Concentration Procedure (FCP) is being prepared to provide an alternative to Test Guideline 403. Unlike Test Guideline 403, the FCP does not provide a point estimate of the LC50, but aims to identify an airborne exposure level that causes clear signs of nonlethal toxicity. This is then used to assign classification according to the new Globally Harmonized System of Classification and Labelling scheme (GHS). The FCP has been validated using statistical simulation rather than by in vivo testing. The statistical simulation approach predicts the GHS classification outcome and the numbers of deaths and animals used in the test for imaginary substances with a range of LC50 values and dose response curve slopes. This paper describes the FCP and reports the results from the statistical simulation study assessing its properties. It is shown that the procedure will be completed with considerably less death and suffering than Test Guideline 403, and will classify substances either in the same or a more stringent GHS class than that assigned on the basis of the LC50 value.
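The simulation idea can be sketched by modelling per-animal mortality at a fixed concentration with a log-logistic dose-response curve defined by an LC50 and a slope, then counting outcomes over many virtual tests. The parameters and the decision rule below are illustrative and far simpler than the guideline's:

```python
# Simplified sketch of simulating test outcomes for an imaginary substance
# with a known LC50 and dose-response slope. The curve, animal numbers and
# decision rule are illustrative, not those of the FCP or Guideline 403.
import random

def p_death(conc, lc50, slope):
    """Log-logistic probability of death at concentration `conc`."""
    return 1.0 / (1.0 + (lc50 / conc) ** slope)

def simulate_test(conc, lc50, slope, n_animals=5, n_runs=10000, seed=1):
    """Fraction of simulated tests in which at least one animal dies."""
    rng = random.Random(seed)
    runs_with_death = 0
    for _ in range(n_runs):
        deaths = sum(rng.random() < p_death(conc, lc50, slope)
                     for _ in range(n_animals))
        if deaths > 0:
            runs_with_death += 1
    return runs_with_death / n_runs

# An imaginary substance with LC50 = 2.0 mg/L tested well below its LC50:
print(simulate_test(conc=0.5, lc50=2.0, slope=4.0))
# The same substance tested at its LC50 (per-animal death probability 0.5):
print(simulate_test(conc=2.0, lc50=2.0, slope=4.0))
```

Sweeping such simulations over grids of LC50 values and slopes is what lets a procedure's expected classifications, deaths and animal usage be compared against Test Guideline 403 without further in vivo testing.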
Abstract:
In this study, we demonstrate the suitability of the vertebrate Danio rerio (zebrafish) for functional screening of novel platelet genes in vivo by reverse genetics. Comparative transcript analysis of platelets and their precursor cell, the megakaryocyte, together with nucleated blood cell elements, endothelial cells, and erythroblasts, identified novel platelet membrane proteins with hitherto unknown roles in thrombus formation. We determined the phenotype induced by antisense morpholino oligonucleotide (MO)–based knockdown of 5 of these genes in a laser-induced arterial thrombosis model. To validate the model, the genes for platelet glycoprotein (GP) IIb and the coagulation protein factor VIII were targeted. MO-injected fish showed normal thrombus initiation but severely impaired thrombus growth, consistent with the mouse knockout phenotypes, and concomitant knockdown of both resulted in spontaneous bleeding. Knockdown of 4 of the 5 novel platelet proteins altered arterial thrombosis, as demonstrated by modified kinetics of thrombus initiation and/or development. We identified a putative role for BAMBI and LRRC32 in promotion and DCBLD2 and ESAM in inhibition of thrombus formation. We conclude that phenotypic analysis of MO-injected zebrafish is a fast and powerful method for initial screening of novel platelet proteins for function in thrombosis.
Abstract:
The technique of rapid acidification and alkylation can be used to characterise the redox status of oxidoreductases, and to determine numbers of free cysteine residues within substrate proteins. We have previously used this method to analyse interacting components of the MHC class I pathway, namely ERp57 and tapasin. Here, we have applied rapid acidification and alkylation as a novel approach to analysing the redox status of MHC class I molecules. This analysis of the redox status of the MHC class I molecules HLA-A2 and HLA-B27, the latter of which is strongly associated with a group of inflammatory arthritic disorders referred to as spondyloarthropathies, revealed structural and conformational information. We propose that this assay provides a useful tool in the study of in vivo MHC class I structure.
Abstract:
LDL oxidation may be important in atherosclerosis. Extensive oxidation of LDL by copper induces increased uptake by macrophages, but results in decomposition of hydroperoxides, making it more difficult to investigate the effects of hydroperoxides in oxidised LDL on cell function. We describe here a simple method of oxidising LDL by dialysis against copper ions at 4 °C, which inhibits the decomposition of hydroperoxides, and allows the production of LDL rich in hydroperoxides (626 ± 98 nmol/mg LDL protein) but low in oxysterols (3 ± 1 nmol 7-ketocholesterol/mg LDL protein), whilst allowing sufficient modification (2.6 ± 0.5 relative electrophoretic mobility) for rapid uptake by macrophages (5.49 ± 0.75 µg I-125-labelled hydroperoxide-rich LDL vs. 0.46 ± 0.04 µg protein/mg cell protein in 18 h for native LDL). By dialysing under the same conditions, but at 37 °C, the hydroperoxides are decomposed extensively and the LDL becomes rich in oxysterols. This novel method of oxidising LDL with high yield to either a hydroperoxide- or oxysterol-rich form by simply altering the temperature of dialysis may provide a useful tool for determining the effects of these different oxidation products on cell function.
Abstract:
The reliable assessment of the quality of protein structural models is fundamental to the progress of structural bioinformatics. The ModFOLD server provides access to two accurate techniques for the global and local prediction of the quality of 3D models of proteins. The first, ModFOLD, is a fast Model Quality Assessment Program (MQAP) used for the global assessment of either single or multiple models. The second, ModFOLDclust, is a more intensive method that carries out clustering of multiple models and provides per-residue local quality assessment.
Abstract:
Background: Selecting the highest quality 3D model of a protein structure from a number of alternatives remains an important challenge in the field of structural bioinformatics. Many Model Quality Assessment Programs (MQAPs) have been developed which adopt various strategies in order to tackle this problem, ranging from the so-called "true" MQAPs capable of producing a single energy score based on a single model, to methods which rely on structural comparisons of multiple models or additional information from meta-servers. However, it is clear that no current method can consistently separate the highest accuracy models from the lowest. In this paper, a number of the top performing MQAP methods are benchmarked in the context of the potential value that they add to protein fold recognition. Two novel methods are also described: ModSSEA, which is based on the alignment of predicted secondary structure elements, and ModFOLD, which combines several true MQAP methods using an artificial neural network. Results: The ModSSEA method is found to be an effective model quality assessment program for ranking multiple models from many servers; however, further accuracy can be gained by using the consensus approach of ModFOLD. The ModFOLD method is shown to significantly outperform the true MQAPs tested and is competitive with methods which make use of clustering or additional information from multiple servers. Several of the true MQAPs are also shown to add value to most individual fold recognition servers by improving model selection, when applied as a post filter in order to re-rank models. Conclusion: MQAPs should be benchmarked appropriately for the practical context in which they are intended to be used. Clustering based methods are the top performing MQAPs where many models are available from many servers; however, they often do not add value to individual fold recognition servers when limited models are available. Conversely, the true MQAP methods tested can often be used as effective post filters for re-ranking the few models available from individual fold recognition servers, and further improvements can be achieved using a consensus of these methods.
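The clustering principle behind the top-performing consensus methods can be sketched as scoring each model by its mean pairwise similarity to the other models, on the premise that near-native features recur across many servers while errors do not. The toy one-dimensional "models" and the similarity measure below (an RMSD mapped to a 0-1 score) are invented for illustration and are not the scoring functions used by ModFOLD or ModFOLDclust:

```python
# Sketch of consensus (clustering-based) model quality assessment:
# rank models by mean pairwise similarity to the rest of the pool.
import math

def rmsd(a, b):
    """Root-mean-square deviation between two equal-length coordinate
    lists (no superposition, for brevity)."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)) / len(a))

def similarity(a, b, d0=3.0):
    """Map an RMSD onto a 0-1 score (1 = identical models)."""
    return 1.0 / (1.0 + (rmsd(a, b) / d0) ** 2)

def consensus_scores(models):
    """Mean pairwise similarity of each model to all the others."""
    n = len(models)
    return [sum(similarity(models[i], models[j]) for j in range(n) if j != i)
            / (n - 1)
            for i in range(n)]

# Three mutually similar models and one outlier (1-D pseudo-coordinates):
models = [
    [1.0, 2.0, 3.0, 4.0],
    [1.1, 2.1, 2.9, 4.2],
    [0.9, 1.8, 3.1, 3.9],
    [7.0, 9.0, 1.0, 12.0],  # outlier
]
scores = consensus_scores(models)
print([round(s, 3) for s in scores])
```

The outlier receives the lowest consensus score, which is why such methods excel when many models from many servers are pooled, yet add little when only a handful of models from a single server are available.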