960 results for Score Normalization
Abstract:
The current study aims to assess the applicability of direct or indirect normalization for the analysis of fractional anisotropy (FA) maps in the context of diffusion-weighted images (DWIs) contaminated by ghosting artifacts. We found that FA maps obtained by direct normalization showed generally higher anisotropy than those obtained by indirect normalization, and that the disparities were aggravated by the presence of ghosting artifacts in the DWIs. Voxel-wise statistical comparisons demonstrated that indirect normalization reduced the influence of the artifacts and enhanced the sensitivity of detecting anisotropy differences between groups. This suggests that images contaminated with ghosting artifacts can be sensibly analyzed using indirect normalization.
Abstract:
In the forecasting of binary events, verification measures that are “equitable” were defined by Gandin and Murphy to satisfy two requirements: 1) they award all random forecasting systems, including those that always issue the same forecast, the same expected score (typically zero), and 2) they are expressible as the linear weighted sum of the elements of the contingency table, where the weights are independent of the entries in the table, apart from the base rate. The authors demonstrate that the widely used “equitable threat score” (ETS), as well as numerous others, satisfies neither of these requirements and only satisfies the first requirement in the limit of an infinite sample size. Such measures are referred to as “asymptotically equitable.” In the case of ETS, the expected score of a random forecasting system is always positive and only falls below 0.01 when the number of samples is greater than around 30. Two other asymptotically equitable measures are the odds ratio skill score and the symmetric extreme dependency score, which are more strongly inequitable than ETS, particularly for rare events; for example, when the base rate is 2% and the sample size is 1000, random but unbiased forecasting systems yield an expected score of around −0.5, reducing in magnitude to −0.01 or smaller only for sample sizes exceeding 25 000. This presents a problem since these nonlinear measures have other desirable properties, in particular being reliable indicators of skill for rare events (provided that the sample size is large enough). A potential way to reconcile these properties with equitability is to recognize that Gandin and Murphy’s two requirements are independent, and the second can be safely discarded without losing the key advantages of equitability that are embodied in the first. This enables inequitable and asymptotically equitable measures to be scaled to make them equitable, while retaining their nonlinearity and other properties such as being reliable indicators of skill for rare events. It also opens up the possibility of designing new equitable verification measures.
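To make the asymptotic-equitability argument concrete, here is a minimal Python sketch (illustrative, not the paper's code) that computes the ETS from a 2×2 contingency table and estimates by Monte Carlo the positive expected score that random, unbiased forecasts receive at small sample sizes; the sample sizes and base rate below are assumptions chosen for illustration:

```python
import numpy as np

def ets(a, b, c, d):
    """Equitable threat score (Gilbert skill score) from a 2x2
    contingency table: a = hits, b = false alarms, c = misses,
    d = correct rejections."""
    n = a + b + c + d
    a_r = (a + b) * (a + c) / n          # hits expected by chance
    return (a - a_r) / (a + b + c - a_r)

# Monte Carlo estimate of the expected ETS of random, unbiased
# forecasts: positive for small n, decaying toward zero as n grows,
# which is the "asymptotic equitability" described in the abstract.
rng = np.random.default_rng(0)
base_rate = 0.5
for n in (30, 100, 1000):
    scores = []
    for _ in range(20000):
        obs = rng.random(n) < base_rate
        fcst = rng.random(n) < base_rate   # random forecast, same base rate
        a = int(np.sum(fcst & obs));  b = int(np.sum(fcst & ~obs))
        c = int(np.sum(~fcst & obs)); d = int(np.sum(~fcst & ~obs))
        if (a + b + c) - (a + b) * (a + c) / n != 0:   # skip degenerate tables
            scores.append(ets(a, b, c, d))
    print(f"n={n:5d}  mean ETS of random forecasts = {np.mean(scores):+.4f}")
```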
Abstract:
Unless the benefits to society of measures to protect and improve the welfare of animals are made transparent by valuing them, they are likely to go unrecognised and cannot easily be weighed against the costs of such measures as required, for example, by policy-makers. A simple single-measure scoring system, based on the Welfare Quality® index, is used together with a choice-experiment economic valuation method to estimate the value that people place on improvements to the welfare of different farm animal species, measured on a continuous (0-100) scale. Results from using the method on a survey sample of some 300 people show that it is able to elicit apparently credible values. The survey found that 96% of respondents thought that we have a moral obligation to safeguard the welfare of animals and that over 72% were concerned about the way farm animals are treated. Estimated mean annual willingness to pay for meat from animals with welfare improved by just one point on the scale was £5.24 for beef cattle, £4.57 for pigs and £5.10 for meat chickens. Further development of the method is required to capture the total economic value of animal welfare benefits. Despite this, the method is considered a practical means of obtaining economic values that can be used in the cost-benefit appraisal of policy measures intended to improve the welfare of animals.
Abstract:
Proper scoring rules provide a useful means to evaluate probabilistic forecasts. Independently of scoring rules, it has been argued that reliability and resolution are desirable forecast attributes. The mathematical expectation of the score admits a decomposition into reliability- and resolution-related terms, demonstrating a relationship between scoring rules and reliability/resolution. A similar decomposition holds for the empirical (i.e. sample-average) score over an archive of forecast–observation pairs. This empirical decomposition, however, provides an overly optimistic estimate of the potential score (i.e. the optimum score that could be obtained through recalibration), showing that a forecast assessment based solely on the empirical resolution and reliability terms will be misleading. The differences between the theoretical and empirical decompositions are investigated, and specific recommendations are given on how to obtain better estimators of reliability and resolution in the case of the Brier and Ignorance scoring rules.
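As a rough illustration of the empirical (sample-average) decomposition discussed above, the following Python sketch computes binned reliability, resolution and uncertainty estimates for the Brier score; the bin count and the synthetic data are assumptions for illustration, and the mismatch between BS and REL − RES + UNC on finite samples reflects exactly the estimation bias the abstract warns about:

```python
import numpy as np

def brier_decomposition(p, y, bins=10):
    """Empirical Murphy-style decomposition of the Brier score,
    BS ~ reliability - resolution + uncertainty, estimated by
    pooling forecast probabilities p into equal-width bins."""
    p, y = np.asarray(p, float), np.asarray(y, float)
    ybar = y.mean()
    edges = np.linspace(0.0, 1.0, bins + 1)
    idx = np.clip(np.digitize(p, edges[1:-1]), 0, bins - 1)
    rel = res = 0.0
    for k in range(bins):
        m = idx == k
        if m.any():
            w = m.mean()                  # fraction of cases in bin k
            pk, ok = p[m].mean(), y[m].mean()
            rel += w * (pk - ok) ** 2     # calibration error of the bin
            res += w * (ok - ybar) ** 2   # distance of bin outcomes from climatology
    unc = ybar * (1.0 - ybar)
    bs = np.mean((p - y) ** 2)
    return bs, rel, res, unc

# Perfectly reliable synthetic forecasts: even here the binned terms
# only approximate BS on a finite sample, so "potential score" read off
# the empirical terms is an optimistic estimate.
rng = np.random.default_rng(1)
p = rng.random(500)
y = (rng.random(500) < p).astype(float)
bs, rel, res, unc = brier_decomposition(p, y)
print(f"BS={bs:.4f}  REL={rel:.4f}  RES={res:.4f}  UNC={unc:.4f}")
```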
Abstract:
The continuous ranked probability score (CRPS) is a frequently used scoring rule. In contrast with many other scoring rules, the CRPS evaluates cumulative distribution functions. An ensemble of forecasts can easily be converted into a piecewise constant cumulative distribution function with steps at the ensemble members. This renders the CRPS a convenient scoring rule for the evaluation of ‘raw’ ensembles, obviating the need for sophisticated ensemble model output statistics or dressing methods prior to evaluation. In this article, a relation between the CRPS score and the quantile score is established. The evaluation of ‘raw’ ensembles using the CRPS is discussed in this light. It is shown that latent in this evaluation is an interpretation of the ensemble as quantiles but with non-uniform levels. This needs to be taken into account if the ensemble is evaluated further, for example with rank histograms.
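The stated relation between the CRPS of a raw ensemble and the quantile score can be checked directly: for m sorted members read as quantiles at the non-uniform levels τ_i = (i − 1/2)/m, the CRPS of the piecewise-constant empirical CDF equals the average of twice the pinball losses. A minimal Python sketch (illustrative, not the paper's code):

```python
import numpy as np

def crps_ensemble(ens, y):
    """CRPS of the piecewise-constant empirical CDF of an ensemble,
    via the kernel form  CRPS = E|X - y| - 0.5 E|X - X'|."""
    x = np.asarray(ens, float)
    term1 = np.mean(np.abs(x - y))
    term2 = 0.5 * np.mean(np.abs(x[:, None] - x[None, :]))
    return term1 - term2

def quantile_score(q, y, tau):
    """Pinball (quantile) loss of a quantile forecast q at level tau."""
    return (tau - (y < q)) * (y - q)

# Members interpreted as quantiles at levels (i - 1/2)/m:
rng = np.random.default_rng(2)
ens, y = rng.normal(size=20), 0.3
m = ens.size
taus = (np.arange(1, m + 1) - 0.5) / m
qs_avg = np.mean([2.0 * quantile_score(q, y, t)
                  for q, t in zip(np.sort(ens), taus)])
print(crps_ensemble(ens, y), qs_avg)   # the two agree up to rounding
```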
Abstract:
The paper traces the evolution of the tally from a receipt for cash payments into the treasury, to proof of payments made by royal officials outside the treasury, and finally to an assignment of revenue to be paid out by royal officials. Each of these processes is illustrated by examples drawn from the Exchequer records, and its significance for royal finance and for historians working on the Exchequer records is explained.
Abstract:
In this paper the properties of a hydro-meteorological forecasting system for forecasting river flows are analysed using a probabilistic forecast convergence score (FCS). The focus on fixed-event forecasts provides a forecaster's view of system behaviour and adds an important perspective to the suite of forecast verification tools commonly used in this field. A low FCS indicates a more consistent forecast. The annual maximum FCS is shown to have decreased over the last 10 years. With lead time, the FCS of the ensemble forecast decreases, whereas that of the control and high-resolution forecasts increases. The FCS is influenced by lead time, threshold, and catchment size and location, indicating that seasonality-based decision rules should be used when issuing flood warnings.
Abstract:
Cocoa flavanol (CF) intake improves endothelial function in patients with cardiovascular risk factors and disease. We investigated the effects of CF on surrogate markers of cardiovascular health in low-risk, healthy, middle-aged individuals without history, signs or symptoms of CVD. In a 1-month, open-label, one-armed pilot study, bi-daily ingestion of 450 mg of CF led to a time-dependent increase in endothelial function (measured as flow-mediated vasodilation (FMD)) that plateaued after 2 weeks. Subsequently, in a randomised, controlled, double-masked, parallel-group dietary intervention trial (Clinicaltrials.gov: NCT01799005), 100 healthy, middle-aged (35–60 years) men and women consumed either the CF-containing drink (450 mg) or a nutrient-matched CF-free control bi-daily for 1 month. The primary end point was FMD. Secondary end points included plasma lipids and blood pressure, thus enabling the calculation of Framingham Risk Scores and pulse wave velocity. At 1 month, CF increased FMD over control by 1·2 % (95 % CI 1·0, 1·4 %). CF decreased systolic and diastolic blood pressure by 4·4 mmHg (95 % CI 7·9, 0·9 mmHg) and 3·9 mmHg (95 % CI 6·7, 0·9 mmHg), pulse wave velocity by 0·4 m/s (95 % CI 0·8, 0·04 m/s), total cholesterol by 0·20 mmol/l (95 % CI 0·39, 0·01 mmol/l) and LDL-cholesterol by 0·17 mmol/l (95 % CI 0·32, 0·02 mmol/l), whereas HDL-cholesterol increased by 0·10 mmol/l (95 % CI 0·04, 0·17 mmol/l). Applying the Framingham Risk Score to these data predicted a significant lowering of 10-year risk for CHD, myocardial infarction, CVD, and death from CHD and CVD. In healthy individuals, regular CF intake improved accredited surrogate markers of cardiovascular risk, demonstrating that dietary flavanols have the potential to maintain cardiovascular health even in low-risk subjects.
Abstract:
The present work describes a new tool that helps bidders improve their competitive bidding strategies: an easy-to-use graphical tool that brings more complex decision-analysis methods to the field of competitive bidding. The graphical tool described here moves away from previous bidding models, which attempt to describe the result of an auction or tender process by modelling each possible bidder with probability density functions. As an illustration, the tool is applied to three practical cases. Theoretical and practical conclusions on the tool's potentially broad range of application are also presented.
Abstract:
The aim of this study was to evaluate working conditions in the textile industry at different stages of Corporate Social Responsibility (CSR) development, and workers' perceptions of fatigue and workability. A cross-sectional study was undertaken with 126 workers in the production areas of five Brazilian textile plants. The corporate executive officers and managers of each company provided their personal evaluations of CSR. Companies were divided into two groups (higher and lower CSR scores). Workers completed questionnaires on fatigue, workability and working conditions. Ergonomic job analysis showed similar results for working conditions, independent of CSR score. Multivariate analysis models were developed for fatigue and workability, indicating that both are associated with factors related to working conditions and to individual workers' characteristics and lifestyles. Work organization (what, how, when, where and for how long the work is done) is also a factor associated with fatigue. This study suggests that workers' opinions should be given greater consideration when companies develop their CSR programs, in particular those relating to working conditions. Relevance to industry: This paper underlines the importance of considering working conditions and workers' opinions of them, work organization, and individual workers' characteristics and lifestyles in order to restore or maintain workability and to reduce fatigue, independently of how developed a company may be in the field of Corporate Social Responsibility.
Abstract:
We present a method to determine the magnitude of the uncorrelated background distribution obtained with the event-mixing technique, through the simultaneous observation of the projectile elastic scattering in different detectors, which correspond to random coincidences. The procedure is tested with α-d angular correlation data from the ⁶Li + ⁵⁹Co reaction at E_lab = 29.6 MeV. We also show that the method can be applied using the product of singles events, when singles measurements are available.
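A hedged sketch of the general idea (not the paper's actual procedure or data): scale the event-mixed spectrum so that it reproduces the measured coincidence yield in a reference window known to contain only random coincidences, analogous to the elastic-scattering coincidences used in the paper. All names and numbers below are hypothetical:

```python
import numpy as np

def normalize_mixed_background(real, mixed, ref):
    """Scale an event-mixed background spectrum so that it matches the
    measured coincidence spectrum in a reference region assumed to be
    purely random. real, mixed: histograms with identical binning;
    ref: boolean mask selecting the reference bins."""
    scale = real[ref].sum() / mixed[ref].sum()
    return scale * mixed

# Hypothetical illustration: a correlated peak on top of a flat random
# background; the mixed-event spectrum has the right shape but an
# arbitrary overall magnitude that must be anchored in the reference window.
rng = np.random.default_rng(3)
channels = np.arange(100)
peak = 200.0 * np.exp(-0.5 * ((channels - 30) / 3.0) ** 2)
real = rng.poisson(50.0, size=100) + rng.poisson(peak)
mixed = rng.poisson(500.0, size=100)    # uncorrelated shape, arbitrary scale
ref = channels > 60                     # window with no correlated signal
bkg = normalize_mixed_background(real, mixed, ref)
signal = real - bkg                     # correlated yield after subtraction
```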
An imaginary potential with universal normalization for dissipative processes in heavy-ion reactions
Abstract:
In this work we present new coupled-channels calculations with the São Paulo potential (SPP) as the bare interaction, and an imaginary potential with system- and energy-independent normalization that has been developed to take dissipative processes in heavy-ion reactions into account. This imaginary potential is based on high-energy nucleon interactions in the nuclear medium. Our theoretical predictions for energies up to ≈100 MeV/nucleon agree very well with the experimental data for the p, n + nucleus, ¹⁶O + ²⁷Al, ¹⁶O + ⁶⁰Ni, ⁵⁸Ni + ¹²⁴Sn, and weakly bound projectile ⁷Li + ¹²⁰Sn systems.