819 results for Minimal entropy martingale measure
Abstract:
The entropy budget of the coupled atmosphere–ocean general circulation model HadCM3 is calculated. Estimates of the different entropy sources and sinks of the climate system are obtained directly from the diabatic heating terms, and an approximate estimate of the planetary entropy production is also provided. The rate of material entropy production of the climate system is found to be ∼50 mW m⁻² K⁻¹, a value intermediate in the range of 30–70 mW m⁻² K⁻¹ previously reported from different models. The largest part of this is due to sensible and latent heat transport (∼38 mW m⁻² K⁻¹). Another 13 mW m⁻² K⁻¹ is due to dissipation of kinetic energy in the atmosphere by friction and Reynolds stresses. Numerical entropy production in the dynamical core of the atmosphere model is found to be about 0.7 mW m⁻² K⁻¹. The material entropy production within the ocean due to turbulent mixing is ∼1 mW m⁻² K⁻¹, a very small contribution to the material entropy production of the climate system. The rate of change of entropy of the model climate system is about 1 mW m⁻² K⁻¹ or less, which is comparable to the typical size of the fluctuations of the entropy sources due to interannual variability, and represents a more accurate closure of the budget than achieved by previous analyses. Results are similar for FAMOUS, which has a lower spatial resolution but a formulation similar to that of HadCM3, while more substantial differences are found with respect to other models, suggesting that the formulation of the model has an important influence on the climate entropy budget. Since this is the first diagnosis of the entropy budget in a climate model of the type and complexity used for projection of twenty-first-century climate change, it would be valuable if similar analyses were carried out for other such models.
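As a rough consistency check (a worked sum over the figures quoted above, not an equation from the paper), the two dominant atmospheric terms already nearly close the material budget:

\[
\dot{S}_{\mathrm{mat}} \approx \underbrace{38}_{\text{sensible + latent heat transport}} + \underbrace{13}_{\text{kinetic-energy dissipation}} = 51\ \mathrm{mW\,m^{-2}\,K^{-1}} \approx 50\ \mathrm{mW\,m^{-2}\,K^{-1}},
\]

with oceanic mixing (∼1 mW m⁻² K⁻¹) and numerical entropy production (∼0.7 mW m⁻² K⁻¹) entering only at the few-percent level.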
Abstract:
The paper considers meta-analysis of diagnostic studies that use a continuous score for classification of study participants into healthy or diseased groups. Classification is often done on the basis of a threshold or cut-off value, which might vary between studies. Consequently, conventional meta-analysis methodology focusing solely on separate analysis of sensitivity and specificity might be confounded by a potentially unknown variation of the cut-off value. To cope with this phenomenon it is suggested to use instead an overall estimate of the misclassification error, previously suggested and used as Youden's index; furthermore, it is argued that this index is less prone to between-study variation of cut-off values. A simple Mantel–Haenszel estimator as a summary measure of the overall misclassification error is suggested, which adjusts for a potential study effect. The measure of the misclassification error based on Youden's index is advantageous in that it easily allows an extension to a likelihood approach, which is then able to cope with unobserved heterogeneity via a nonparametric mixture model. All methods are illustrated with an example of a diagnostic meta-analysis on duplex Doppler ultrasound, with angiography as the standard, for stroke prevention.
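For reference, Youden's index and the misclassification error it induces have standard closed forms (textbook definitions, not notation taken from this paper): with sensitivity Se and specificity Sp,

\[
J = \mathrm{Se} + \mathrm{Sp} - 1, \qquad
\widehat{\mathrm{err}} = \tfrac{1}{2}\bigl[(1-\mathrm{Se}) + (1-\mathrm{Sp})\bigr] = \tfrac{1-J}{2},
\]

where the second expression weights the two groups equally, so a summary estimate of the overall misclassification error translates directly into a summary Youden's index regardless of how each study's cut-off splits the two error types.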
Abstract:
Jerdon's Courser Rhinoptilus bitorquatus is one of the most endangered and least understood birds in the world. It is endemic to scrub habitats in southeast India, which have been lost and degraded because of human land use. We used satellite images from 1991 and 2000 and two methods for classifying land cover to quantify the loss of Jerdon's Courser habitat. The scrub habitats on which this species depends decreased in area by 11–15% during this short period (9.6 years), predominantly as a result of scrub clearance and conversion to agriculture. The remaining scrub patches were smaller and further from human settlements in 2000 than in 1991, implying that much of the scrub loss had occurred close to human population centres. We discuss the implications of our results for the conservation of Jerdon's Courser and for the use of remote sensing methods in conservation.
Abstract:
Runoff, sediment, total phosphorus and total dissolved phosphorus losses in overland flow were measured for two years on unbounded plots cropped with wheat and oats. Half of the field was cultivated with minimum tillage (shallow tillage with a tine cultivator) and half was conventionally ploughed. Within each cultivation treatment there were different treatment areas (TAs). In the first year of the experiment, one TA was cultivated up and down the slope, one TA was cultivated on the contour, with a beetle bank acting as a vegetative barrier partway up the slope, and one had a mixed direction cultivation treatment, with cultivation and drilling conducted up and down the slope and all subsequent operations conducted on the contour. In the second year, this mixed treatment was replaced with contour cultivation. Results showed no significant reduction in runoff, sediment losses or total phosphorus losses from minimum tillage when compared to the conventional plough treatment, but there were increased losses of total dissolved phosphorus with minimum tillage. The mixed direction cultivation treatment increased surface runoff and losses of sediment and phosphorus. Increasing surface roughness with contour cultivation reduced surface runoff compared to up and down slope cultivation in both the plough and minimum tillage treatment areas, but this trend was not significant. Sediment and phosphorus losses in the contour cultivation treatment followed a very similar pattern to runoff. Combining contour cultivation with a vegetative barrier in the form of a beetle bank to reduce slope length resulted in a non-significant reduction in surface runoff, sediment and total phosphorus when compared to up and down slope cultivation, but there was a clear trend towards reduced losses. However, the addition of a beetle bank did not provide a significant reduction in runoff, sediment losses or total phosphorus losses when compared to contour cultivation, suggesting only a marginal additional benefit. The economic implications for farmers of the different treatment options are investigated in order to assess their suitability for implementation at a field scale.
Abstract:
Uncertainty plays a major part in the accuracy of a decision-making process, while its inconsistency is difficult to resolve with existing decision-making tools. Entropy has proved useful for evaluating the inconsistency of uncertainty among different respondents. The study demonstrates an entropy-based financial decision support system called e-FDSS. This integrated system provides decision support to evaluate attributes (funding options and multiple risks) available in projects. Fuzzy logic theory is included in the system to deal with the qualitative aspect of these options and risks. An adaptive genetic algorithm (AGA) is also employed to solve the decision algorithm in the system in order to provide optimal and consistent rates for these attributes. Seven simplified and parallel projects from a Hong Kong construction small and medium enterprise (SME) were assessed to evaluate the system. The results show that the system calculates risk-adjusted discount rates (RADR) of projects in an objective way. These rates discount project cash flow impartially. Inconsistency of uncertainty is also successfully evaluated by the use of the entropy method. Finally, the system identifies the favourable funding options that are managed by the SME Loan Guarantee Scheme (SGS). Based on these results, resource allocation could then be optimized and the best time to start a new project could also be identified throughout the overall project life cycle.
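e-FDSS itself is not publicly available, but the entropy step it describes is straightforward. A minimal sketch, assuming respondents' ratings are bucketed into discrete levels and inconsistency is scored with Shannon entropy (the function and data below are illustrative, not taken from the paper):

import math
from collections import Counter

def rating_entropy(ratings):
    """Shannon entropy (bits) of one attribute's ratings across respondents.

    0 bits means all respondents agree (no inconsistency); the maximum,
    log2(number of distinct levels), means ratings are spread uniformly.
    """
    n = len(ratings)
    counts = Counter(ratings)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Example: five respondents rate one project risk on a 1-5 scale.
print(rating_entropy([3, 3, 4, 3, 3]))  # ~0.72 bits: broad agreement
print(rating_entropy([1, 2, 3, 4, 5]))  # ~2.32 bits: maximal disagreement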
Abstract:
Prebiotics and probiotics are increasingly being used to produce potentially synbiotic foods, particularly through dairy products as vehicles. It is well known that both ingredients may offer benefits to improve host health. This research aimed to evaluate the prebiotic potential of novel petit-suisse cheeses using an in vitro fermentation model. Five petit-suisse cheese formulations combining candidate prebiotics (inulin, oligofructose, honey) and probiotics (Lactobacillus acidophilus, Bifidobacterium lactis) were tested in vitro using sterile, stirred, batch culture fermentations with human faecal slurry. Measurement of prebiotic effect (MPE) values were generated comparing bacterial changes through determination of maximum growth rates of groups, rate of substrate assimilation and production of lactate and short chain fatty acids. The fastest fermentation and high lactic acid production, promoting increased growth rates of bifidobacteria and lactobacilli, were achieved with the addition of prebiotics to a probiotic cheese (made using starter + probiotics). Addition of probiotic strains to a control cheese (made using just a starter culture) also resulted in high lactic acid production. The highest MPE values were obtained with the addition of prebiotics to a probiotic cheese, followed by the addition of prebiotics and/or probiotics to a control cheese. Under the in vitro conditions used, cheese made with the combination of different prebiotics and probiotics resulted in the most promising functional petit-suisse cheese. The study allowed comparison of potentially functional petit-suisse cheeses and screening of preferred synbiotic potential for future market use.
Abstract:
The efficacy of family interventions in psychosis is well documented. UK and USA schizophrenia treatment guidelines advocate the practice of family interventions within routine clinical services. However, less attention has been paid to the study of treatment fidelity and the tools used in its assessment. This study reports the inter-rater reliability of a new scale, the Family Intervention in Psychosis Adherence Scale (FIPAS), designed to assess therapist adherence to the Kuipers et al. (2002) family intervention in psychosis treatment manual. Reliability ratings were based on a sample of thirteen audiotapes drawn from a randomized controlled trial of family intervention. The results indicated that the majority of items of the FIPAS had acceptable levels of inter-rater reliability. The findings are discussed in terms of their implications for training practitioners in family interventions for psychosis and for monitoring their effectiveness.
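The abstract does not name the reliability statistic used; Cohen's kappa is a common choice for two raters scoring categorical scale items, so a sketch of that computation is shown purely as illustration (the data are invented):

import numpy as np

def cohens_kappa(r1, r2):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    po = (r1 == r2).mean()  # observed agreement
    pe = sum((r1 == c).mean() * (r2 == c).mean()  # agreement expected by chance
             for c in np.union1d(r1, r2))
    return (po - pe) / (1 - pe)

# Two raters scoring 13 tapes on a 3-point adherence item (made-up data):
print(cohens_kappa([2, 2, 1, 3, 2, 2, 1, 3, 3, 2, 1, 2, 2],
                   [2, 2, 1, 3, 2, 1, 1, 3, 3, 2, 2, 2, 2]))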
Abstract:
Objective: Community-based care for mental disorders places considerable burden on families and carers. Measuring their experiences has become a priority, but there is no consensus on appropriate instruments. We aimed to review instruments carers consider relevant to their needs and to assess the evidence for their use. Method: A literature search was conducted for outcome measures used with mental health carers. Identified instruments were assessed for their relevance to the outcomes identified by carers and for their psychometric properties. Results: Three hundred and ninety-two published articles referring to 241 outcome measures were identified, 64 of which were eligible for review (used in three or more studies). Twenty-six instruments had good psychometric properties; they measured (i) carers' well-being, (ii) the experience of caregiving and (iii) carers' needs for professional support. Conclusion: Measures exist that have been used to assess the most salient aspects of carer outcome in mental health. All require further work to establish their psychometric properties fully.
Abstract:
The human electroencephalogram (EEG) is globally characterized by a 1/f power spectrum superimposed with certain peaks, whereby the "alpha peak" in a frequency range of 8–14 Hz is the most prominent one for relaxed states of wakefulness. We present simulations of a minimal dynamical network model of leaky integrator neurons attached to the nodes of an evolving directed and weighted random graph (an Erdős–Rényi graph). We derive a model of the dendritic field potential (DFP) for the neurons leading to a simulated EEG that describes the global activity of the network. Depending on the network size, we find an oscillatory transition of the simulated EEG when the network reaches a critical connectivity. This transition, indicated by a suitably defined order parameter, is reflected by a sudden change in the network's topology when super-cycles are formed from merging isolated loops. After the oscillatory transition, the power spectra of simulated EEG time series exhibit a 1/f continuum superimposed with certain peaks.
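A minimal sketch of this kind of simulation, assuming a fixed (rather than evolving) Erdős–Rényi coupling matrix and taking the mean unit activity as a crude stand-in for the paper's DFP-based EEG; all parameter values below are illustrative, not the authors':

import numpy as np

rng = np.random.default_rng(0)
N, p, steps, dt = 200, 0.05, 4096, 1.0   # units, edge probability, time steps
tau, gain = 10.0, 1.5                    # leak time constant, coupling gain

# Directed Erdos-Renyi adjacency with random weights.
W = (rng.random((N, N)) < p) * rng.normal(0.0, 1.0, (N, N)) / np.sqrt(p * N)

x = rng.normal(0.0, 0.1, N)
eeg = np.empty(steps)
for t in range(steps):
    # Leaky integrator update: decay toward zero, sigmoid-coupled input, noise.
    x += (dt / tau) * (-x + np.tanh(gain * (W @ x)) + rng.normal(0.0, 0.05, N))
    eeg[t] = x.mean()                    # global signal, stand-in for the EEG

freqs = np.fft.rfftfreq(steps, d=dt)
power = np.abs(np.fft.rfft(eeg - eeg.mean())) ** 2
# Plot power vs. freqs on log-log axes to look for an approximate 1/f continuum.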
Abstract:
A greedy technique is proposed to construct parsimonious kernel classifiers using the orthogonal forward selection method and boosting based on the Fisher ratio for class separability. Unlike most kernel classification methods, which restrict kernel means to the training input data and use a fixed common variance for all the kernel terms, the proposed technique can tune both the mean vector and the diagonal covariance matrix of each individual kernel by incrementally maximizing the Fisher ratio for class separability. An efficient weighted optimization method is developed based on boosting to append kernels one by one in an orthogonal forward selection procedure. Experimental results obtained using this construction technique demonstrate that it offers a viable alternative to existing state-of-the-art kernel modeling methods for constructing sparse Gaussian radial basis function network classifiers that generalize well.
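A minimal sketch of one greedy selection step under assumed details: the Fisher-ratio and diagonal-covariance Gaussian kernel definitions below are standard, but the candidate generation and scoring loop are illustrative, not the paper's algorithm:

import numpy as np

def fisher_ratio(scores, labels):
    """Fisher ratio of a 1-D kernel response: (m0 - m1)^2 / (v0 + v1)."""
    a, b = scores[labels == 0], scores[labels == 1]
    return (a.mean() - b.mean()) ** 2 / (a.var() + b.var())

def gaussian_kernel(X, center, var):
    """Gaussian RBF with a diagonal covariance (one variance per feature)."""
    return np.exp(-0.5 * (((X - center) ** 2) / var).sum(axis=1))

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

# Greedy step: among candidate (mean, diagonal-variance) pairs, append the
# kernel whose response best separates the two classes by Fisher ratio.
candidates = [(X[i], np.full(2, v))
              for i in rng.choice(len(X), 10, replace=False)
              for v in (0.5, 1.0, 2.0)]
best_center, best_var = max(
    candidates, key=lambda c: fisher_ratio(gaussian_kernel(X, *c), y))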