30 results for 554
in CentAUR: Central Archive University of Reading - UK
Abstract:
This paper discusses the risks of a shutdown of the thermohaline circulation (THC) for the climate system, for ecosystems in and around the North Atlantic, and for fisheries and agriculture by way of an Integrated Assessment. The climate model simulations are based on greenhouse gas scenarios for the 21st century and beyond. A shutdown of the THC, complete by 2150, is triggered if increased freshwater input from inland ice melt or enhanced runoff is assumed. The shutdown retards the greenhouse gas-induced atmospheric warming trend in the Northern Hemisphere but does not lead to a persistent net cooling. Due to the simulated THC shutdown, sea level at the North Atlantic shores rises by up to 80 cm by 2150, in addition to the global sea level rise. This is a potentially serious impact that would require expensive coastal protection measures. A reduction of marine net primary productivity is associated with the impacts of warming rather than with a THC shutdown. Regional shifts in the currents of the Nordic Seas could strongly deteriorate survival chances for cod larvae and juveniles, which could render cod fisheries unprofitable by the end of the 21st century. While regional socioeconomic impacts might be large, damages would probably be small in relation to the respective gross national products. Terrestrial ecosystem productivity is affected much more by fertilization from the increasing CO2 concentration than by a THC shutdown. In addition, the level of warming in the 22nd to 24th centuries strongly favours crop production in northern Europe, whether or not the THC shuts down. CO2 emissions corridors aimed at limiting the risk of a THC breakdown to 10% or less are narrow, requiring a departure from business-as-usual within the next few decades. The uncertainty about THC risks remains high, as seen both in the model analyses and in the elicited expert views. The overview of results presented here is the outcome of the Integrated Assessment project INTEGRATION.
Abstract:
The formation of a lava dome involves fractionation of the lava into core and clastic components. We show that for three separate, successive andesitic lava domes that grew at Soufrière Hills volcano, Montserrat, between 1999 and 2007, the volumetric proportion of the lava converted to talus or pyroclastic flow deposits was 50%–90% of the lava extruded. Currently, only 8% of the total magma extruded during the 1995–2007 eruption remains as core lava. The equivalent representation in the geological record will probably be even lower. Most of the lava extruded at the surface flowed no further than 150–300 m from the vent before disaggregation, resulting in a lava core whose shape tends to a cylinder. Moderate to high extrusion rates at the Soufrière Hills domes may have contributed to the large clastic fraction observed. Creating talus dissipates much of the energy that would otherwise be stored in the core lava of domes. The extreme hazards from large pyroclastic flows and blasts posed by wholesale collapse of a lava dome depend largely on the size of the lava core, and hence on the aggregate history of the partitioning process, not on the size of the dome.
Abstract:
The paper considers meta-analysis of diagnostic studies that use a continuous score for classification of study participants into healthy or diseased groups. Classification is often done on the basis of a threshold or cut-off value, which might vary between studies. Consequently, conventional meta-analysis methodology focusing solely on separate analyses of sensitivity and specificity might be confounded by a potentially unknown variation of the cut-off value. To cope with this phenomenon, it is suggested to use instead an overall estimate of the misclassification error, previously suggested and known as Youden's index; furthermore, it is argued that this index is less prone to between-study variation of cut-off values. A simple Mantel–Haenszel estimator, which adjusts for a potential study effect, is suggested as a summary measure of the overall misclassification error. The measure of the misclassification error based on Youden's index is advantageous in that it easily allows an extension to a likelihood approach, which can then cope with unobserved heterogeneity via a nonparametric mixture model. All methods are illustrated with an example of a diagnostic meta-analysis on duplex Doppler ultrasound, with angiography as the standard, for stroke prevention.
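Since Youden's index is simply the difference between the true-positive and false-positive rates, a Mantel–Haenszel-type pooled version can be written down directly from the study-level 2x2 tables. The sketch below is a minimal illustration of that idea using hypothetical study counts; it is not necessarily the exact estimator proposed in the paper.

```python
# Minimal sketch (hypothetical data): a Mantel-Haenszel-type pooled Youden index.
# Youden's index = sensitivity + specificity - 1 = TPR - FPR, a risk difference,
# so the classical MH risk-difference estimator gives a study-adjusted summary.

def pooled_youden_mh(tables):
    """tables: list of (tp, fn, fp, tn) tuples, one 2x2 table per study."""
    num, den = 0.0, 0.0
    for tp, fn, fp, tn in tables:
        n_dis = tp + fn              # diseased participants in this study
        n_hea = fp + tn              # healthy participants in this study
        n_tot = n_dis + n_hea
        num += (tp * n_hea - fp * n_dis) / n_tot
        den += n_dis * n_hea / n_tot
    return num / den                 # pooled estimate of sensitivity + specificity - 1

# Three hypothetical studies, each using its own (possibly different) cut-off
studies = [(45, 5, 10, 40), (30, 10, 8, 52), (60, 15, 20, 80)]
print(pooled_youden_mh(studies))
```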
Abstract:
In this paper we present results from two choice experiments (CE), designed to take account of the different negative externalities associated with pesticide use in agricultural production. For cereal production, the most probable impact of pesticide use is a reduction in environmental quality. For fruit and vegetable production, the negative externality is on consumer health. Using latent class models we find evidence of the presence of preference heterogeneity in addition to reasonably high willingness to pay (WTP) estimates for a reduction in the use of pesticides for both environmental quality and consumer health. To place our WTP estimates in a policy context we convert them into an equivalent pesticide tax by type of externality. Our tax estimates suggest that pesticide taxes based on the primary externality resulting from a particular mode of agricultural production are a credible policy option that warrants further consideration.
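As an illustration of the step from estimated choice-model coefficients to WTP, the sketch below uses the standard ratio WTP = -beta_attribute / beta_price, averaged over latent classes. All coefficient values and class shares are hypothetical and are not the paper's estimates.

```python
# Minimal sketch (hypothetical numbers): from latent-class coefficients to WTP.
# WTP for a one-unit reduction in pesticide use is -beta_reduction / beta_price;
# a share-weighted average across classes gives a sample-level WTP that could
# then be compared with a per-unit pesticide tax.

classes = [
    {"share": 0.6, "beta_reduction": 0.9, "beta_price": -0.05},   # assumed values
    {"share": 0.4, "beta_reduction": 0.3, "beta_price": -0.12},   # assumed values
]

wtp_by_class = [-c["beta_reduction"] / c["beta_price"] for c in classes]
mean_wtp = sum(c["share"] * w for c, w in zip(classes, wtp_by_class))

print("class WTPs:", [round(w, 2) for w in wtp_by_class])
print("share-weighted mean WTP:", round(mean_wtp, 2))
```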
Abstract:
We consider the case of a multicenter trial in which the center-specific sample sizes are potentially small. Under homogeneity, the conventional procedure is to pool information using a weighted estimator whose weights are the inverses of the estimated center-specific variances. Whereas this procedure is efficient under conventional asymptotics (e.g. center-specific sample sizes become large, number of centers fixed), it is commonly believed that the efficiency of this estimator also holds under meta-analytic asymptotics (e.g. center-specific sample sizes bounded and potentially small, number of centers large). In this contribution we demonstrate that this estimator fails to be efficient. In fact, it shows a persistent bias as the number of centers increases, showing that it is not meta-consistent. In addition, we show that the Cochran and Mantel–Haenszel weighted estimators are meta-consistent and, more generally, provide conditions on the weights under which the associated weighted estimator is meta-consistent.
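A small simulation makes the distinction concrete: with binary outcomes and small fixed arm sizes, the estimated center-specific variances are correlated with the center-specific estimates, so an inverse-estimated-variance pooled risk difference keeps a bias as centers accumulate, whereas a Mantel–Haenszel-weighted pool does not. The sketch below is an illustration under assumed sample sizes and event probabilities, not the paper's analysis.

```python
# Minimal sketch (assumed sizes/probabilities): inverse-estimated-variance vs
# Mantel-Haenszel weighting under meta-analytic asymptotics. Each center gives
# a risk difference from two small arms; its estimated variance is correlated
# with the estimate, so the inverse-variance pool stays biased as centers grow,
# while the MH-weighted pool centres on the true difference of 0.2.

import random, statistics

def simulate(n_centers, n_arm=5, p_treat=0.5, p_ctrl=0.3, reps=1000):
    iv_pool, mh_pool = [], []
    for _ in range(reps):
        num_iv = den_iv = num_mh = den_mh = 0.0
        for _ in range(n_centers):
            x1 = sum(random.random() < p_treat for _ in range(n_arm))
            x0 = sum(random.random() < p_ctrl for _ in range(n_arm))
            p1, p0 = x1 / n_arm, x0 / n_arm
            d = p1 - p0
            v = max(p1 * (1 - p1) / n_arm + p0 * (1 - p0) / n_arm, 1e-3)  # clamp zero variances
            num_iv += d / v
            den_iv += 1 / v
            w = n_arm * n_arm / (2 * n_arm)   # MH-type weight n1*n0/(n1+n0)
            num_mh += w * d
            den_mh += w
        iv_pool.append(num_iv / den_iv)
        mh_pool.append(num_mh / den_mh)
    return statistics.fmean(iv_pool), statistics.fmean(mh_pool)

print(simulate(20))    # many small centers
print(simulate(200))   # even more centers: the first estimate stays away from 0.2
```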
Abstract:
Recombination is thought to occur only rarely in animal mitochondrial DNA (mtDNA). However, detection of mtDNA recombination requires that cells become heteroplasmic through mutation, intramolecular recombination or 'leakage' of paternal mtDNA. Interspecific hybridization increases the probability of detecting mtDNA recombinants due to higher levels of sequence divergence and potentially higher levels of paternal leakage. During a study of historical variation in Atlantic salmon (Salmo salar) mtDNA, an individual with a recombinant haplotype containing sequence from both Atlantic salmon and brown trout (Salmo trutta) was detected. The individual was not an F1 hybrid, but it did have an unusual nuclear genotype which suggested that it was a later-generation backcross. No other similar recombinant haplotype was found from the same population or three neighbouring Atlantic salmon populations in 717 individuals collected during 1948–2002. Interspecific recombination may increase mtDNA variability within species and can have implications for phylogenetic studies.
Abstract:
"Yor" is a traditional sausage like product widely consumed in Thailand. Its textures are usually set by steaming, in this experiment ultra-high pressure was used to modify the product. Three types of hydrocolloid; carboxymethylcellulose (CMC), locust bean gum (LBG) and xanthan gum, were added to minced ostrich meat batter at concentration of 0-1% and subjected to high pressure 600 Mpa, 50 degrees C, 40 min. The treated samples were analysed for storage (G) and loss (G '') moduli by dynamic oscillatory testing as well as creep compliance for control stress measurement. Their microstructures using confocal microscopy were also examined. Hydrocolloid addition caused a significant (P < 0.05) decrease in both the G' and G '' moduli. However the loss tangent of all samples remained unchanged. Addition of hydrocolloids led to decreases in the gel network formation but appears to function as surfactant materials during the initial mixing stage as shown by the microstructure. Confocal microscopy suggested that the size of the fat droplets decreased with gum addition. The fat droplets were smallest on the addition of xanthan gum and increased in the order CMC, LBG and no added gum, respectively. Creep parameters of ostrich yors with four levels of xanthan gum addition (0.50%, 0.75%, 1.00% and 1.25%) showed an increase in the instantaneous compliance (J(0)), the retarded compliance (J(1)) and retardation time (lambda(1)) but a decrease in the viscosity (eta(0)) with increasing levels of addition. The results also suggested that the larger deformations used during creep testing might be more helpful in assessing the mechanical properties of the product than the small deformations used in oscillatory rheology. (c) 2007 Elsevier Ltd. All rights reserved.
Abstract:
Background: Screening instruments for autistic-spectrum disorders have not been compared in the same sample. Aims: To compare the Social Communication Questionnaire (SCQ), the Social Responsiveness Scale (SRS) and the Children's Communication Checklist (CCC). Method: Screening and diagnostic assessments of 119 children aged 9 to 13 years with special educational needs, with and without autistic-spectrum disorders, were weighted to estimate screen characteristics for a realistic target population. Results: The SCQ performed best (area under the receiver operating characteristic curve (AUC) = 0.90; sensitivity 0.86; specificity 0.78). The SRS had a lower AUC (0.77), with high sensitivity (0.78) and moderate specificity (0.67). The CCC had high sensitivity but lower specificity (AUC = 0.79; sensitivity 0.93; specificity 0.46). The AUC of the SRS and the CCC was lower for children with IQ < 70. Behaviour problems reduced specificity for all three instruments. Conclusions: The SCQ, SRS and CCC showed strong to moderate ability to identify autistic-spectrum disorder in this at-risk sample of school-age children with special educational needs.
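For readers unfamiliar with the screen characteristics reported above, the sketch below shows how sensitivity, specificity and the AUC can be computed from raw questionnaire scores; the scores and cut-off used are hypothetical, not the study data.

```python
# Minimal sketch (hypothetical scores): sensitivity and specificity at a cut-off,
# plus the AUC as the Mann-Whitney probability that a randomly chosen case
# scores higher than a randomly chosen non-case (ties counted as half).

def screen_stats(case_scores, noncase_scores, cutoff):
    sens = sum(s >= cutoff for s in case_scores) / len(case_scores)
    spec = sum(s < cutoff for s in noncase_scores) / len(noncase_scores)
    pairs = [(c, n) for c in case_scores for n in noncase_scores]
    auc = sum(1.0 if c > n else 0.5 if c == n else 0.0 for c, n in pairs) / len(pairs)
    return sens, spec, auc

cases = [18, 22, 15, 30, 12]       # hypothetical screening scores, ASD group
noncases = [5, 9, 14, 7, 11, 16]   # hypothetical scores, non-ASD group
print(screen_stats(cases, noncases, cutoff=15))
```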
Abstract:
There are still major challenges in the area of automatic indexing and retrieval of digital data. The main problem arises from the ever-increasing mass of digital media and the lack of efficient methods for indexing and retrieving such data based on semantic content rather than keywords. To enable intelligent web interactions, or even web filtering, we need to be capable of interpreting the information base in an intelligent manner. Research has been ongoing for some years in the field of ontological engineering with the aim of using ontologies to add knowledge to information. In this paper we describe the architecture of a system designed to automatically and intelligently index huge repositories of special-effects video clips, based on their semantic content, using a network of scalable ontologies to enable intelligent retrieval.
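As a toy illustration of concept-based rather than keyword-based indexing, the sketch below indexes clips under their annotated concepts and all broader concepts in a miniature is-a ontology, so a query for a general concept retrieves clips annotated only with its specialisations. The ontology, clip names and annotations are invented and do not reflect the system described above.

```python
# Minimal sketch (invented data): ontology-based semantic indexing of video clips.

# Concept -> broader concept (a miniature "is-a" ontology)
ontology = {"explosion": "pyrotechnics", "fireball": "pyrotechnics",
            "pyrotechnics": "special_effect", "rain": "weather_effect",
            "weather_effect": "special_effect"}

clips = {"clip_001": {"fireball"}, "clip_002": {"rain"}, "clip_003": {"explosion"}}

def expand(concept):
    """Return the concept plus all of its ancestors in the ontology."""
    seen = set()
    while concept and concept not in seen:
        seen.add(concept)
        concept = ontology.get(concept)
    return seen

# Index each clip under its annotated concepts and all broader concepts
index = {}
for clip, concepts in clips.items():
    for c in concepts:
        for term in expand(c):
            index.setdefault(term, set()).add(clip)

print(index.get("pyrotechnics"))  # retrieves both the fireball and explosion clips
```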
Abstract:
This paper presents a novel two-pass algorithm, constituted by the Linear Hashtable Motion Estimation Algorithm (LHMEA) and Hexagonal Search (HEXBS), for block-based motion compensation. Building on previous algorithms, especially the leading-edge hexagonal search (HEXBS) motion estimation algorithm, we propose the LHMEA and the Two-Pass Algorithm (TPA), introducing hashtables into video compression. We employ the LHMEA for the first-pass search over all macroblocks (MBs) in the picture. Motion vectors (MVs) generated in the first pass are then used as predictors for the second-pass HEXBS motion estimation, which searches only a small number of MBs. The evaluation of the algorithm considers three important metrics: time, compression rate and PSNR. The performance of the algorithm is evaluated using standard video sequences and the results are compared to current algorithms. Experimental results show that the proposed algorithm can offer the same compression rate as the Full Search. The LHMEA with TPA shows a significant improvement over HEXBS and indicates a direction for improving other fast motion estimation algorithms, for example Diamond Search.
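The sketch below illustrates the two-pass idea in miniature: a hash table over a cheap block feature of the reference frame proposes predictor motion vectors in the first pass, and a small local search refines them in the second pass. The block size, hash feature and square refinement pattern are simplifying assumptions, not the paper's exact LHMEA/HEXBS implementation.

```python
# Minimal sketch (simplified): two-pass block motion estimation.
# Pass 1 hashes a cheap block feature (the block mean) of the reference frame to
# propose predictor motion vectors; pass 2 refines each predictor with a small
# local search instead of a full search.

def block_mean(frame, y, x, b):
    return sum(frame[y + i][x + j] for i in range(b) for j in range(b)) // (b * b)

def sad(cur, ref, y, x, dy, dx, b):
    """Sum of absolute differences between a current block and a shifted reference block."""
    return sum(abs(cur[y + i][x + j] - ref[y + dy + i][x + dx + j])
               for i in range(b) for j in range(b))

def two_pass_me(cur, ref, b=4, radius=2):
    h, w = len(ref), len(ref[0])
    # Pass 1: hash table keyed by block mean -> first position in the reference frame
    table = {}
    for y in range(0, h - b + 1):
        for x in range(0, w - b + 1):
            table.setdefault(block_mean(ref, y, x, b), (y, x))
    vectors = {}
    for y in range(0, h - b + 1, b):
        for x in range(0, w - b + 1, b):
            ry, rx = table.get(block_mean(cur, y, x, b), (y, x))
            pdy, pdx = ry - y, rx - x          # predictor MV from the hash table
            # Pass 2: small local refinement around the predictor
            best = None
            for dy in range(pdy - radius, pdy + radius + 1):
                for dx in range(pdx - radius, pdx + radius + 1):
                    if 0 <= y + dy <= h - b and 0 <= x + dx <= w - b:
                        cost = sad(cur, ref, y, x, dy, dx, b)
                        if best is None or cost < best[0]:
                            best = (cost, dy, dx)
            vectors[(y, x)] = (best[1], best[2])
    return vectors

# Tiny synthetic example: the "current" frame is the reference shifted one pixel
# to the right, so an interior block's estimated vector points back by (0, -1).
ref = [[(i * 7 + j * 13) % 256 for j in range(16)] for i in range(16)]
cur = [[ref[i][max(j - 1, 0)] for j in range(16)] for i in range(16)]
print(two_pass_me(cur, ref)[(4, 4)])
```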