65 results for "probability indicator"


Relevance: 20.00%

Abstract:

The jackknife method is often used for variance estimation in sample surveys but has only been developed for a limited class of sampling designs. We propose a jackknife variance estimator which is defined for any without-replacement unequal probability sampling design. We demonstrate design consistency of this estimator for a broad class of point estimators. A Monte Carlo study shows how the proposed estimator may improve on existing estimators.
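The delete-one mechanics behind such estimators can be sketched for a Horvitz-Thompson total. This is a generic textbook jackknife with an illustrative reweighting of the retained units, not the paper's proposed estimator:

```python
# Generic delete-one jackknife for a Horvitz-Thompson total.
# Illustrative sketch only: the paper refines the reweighting below
# for without-replacement unequal-probability designs.

def ht_total(y, pi):
    """Horvitz-Thompson estimator of a population total."""
    return sum(yi / pii for yi, pii in zip(y, pi))

def jackknife_variance(y, pi):
    """Delete-one jackknife variance of the HT total; when unit j is
    dropped, the remaining inclusion probabilities are scaled by (n-1)/n."""
    n = len(y)
    loo = []
    for j in range(n):
        yj = [yi for i, yi in enumerate(y) if i != j]
        pij = [pii * (n - 1) / n for i, pii in enumerate(pi) if i != j]
        loo.append(ht_total(yj, pij))
    mean_loo = sum(loo) / n
    return (n - 1) / n * sum((t - mean_loo) ** 2 for t in loo)
```

Each leave-one-out estimate is recomputed from scratch, then the scaled spread of those estimates approximates the sampling variance.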

Relevance: 20.00%

Abstract:

We present here an indicator of soil quality that evaluates soil ecosystem services through a set of five subindicators, and further combines them into a single General Indicator of Soil Quality (GISQ). We used information derived from 54 properties commonly used to describe the multifaceted aspects of soil quality. The design and calculation of the indicators were based on sequences of multivariate analyses. Subindicators evaluated the physical quality, chemical fertility, organic matter stocks, aggregation and morphology of the upper 5 cm of soil and the biodiversity of soil macrofauna. The GISQ combined the different subindicators, providing a global assessment of soil quality. Research was conducted in two hillside regions of Colombia and Nicaragua, with similar types of land use and socio-economic context. However, soil and climatic conditions differed significantly. In Nicaragua, soil quality was assessed at 61 points spaced 200 m apart on a regular grid across the landscape. In Colombia, 8 plots representing different types of land use were arbitrarily chosen in the landscape and intensively sampled. Indicators that were designed in the Nicaragua site were further applied to the Colombian site to test for their applicability. In Nicaragua, coffee plantations, fallows, pastures and forest had the highest values of GISQ (1.00; 0.80; 0.78 and 0.77, respectively) while maize crops and eroded soils (0.19 and 0.10) had the lowest values. Examination of subindicator values allowed the separate evaluation of different aspects of soil quality: subindicators of organic matter, aggregation and morphology and biodiversity of macrofauna had the maximum values in coffee plantations (0.89; 0.72 and 0.56, respectively on average) while eroded soils had the lowest values for these indicators (0.10; 0.31 and 0.33, respectively).
Indicator formulae derived from information gained at the Nicaraguan sites were not applicable to the Colombian situation and site-specific constants were calculated. This indicator allows the evaluation of soil quality and facilitates the identification of problem areas through the individual values of each subindicator. It allows monitoring of change through time and can guide the implementation of soil restoration technologies. Although GISQ formulae computed on a set of data were only valid at a regional scale, the methodology used to create these indices can be applied everywhere.

Relevance: 20.00%

Abstract:

Imputation is commonly used to compensate for item non-response in sample surveys. If we treat the imputed values as if they are true values, and then compute the variance estimates by using standard methods, such as the jackknife, we can seriously underestimate the true variances. We propose a modified jackknife variance estimator which is defined for any without-replacement unequal probability sampling design in the presence of imputation and non-negligible sampling fraction. Mean, ratio and random-imputation methods will be considered. The practical advantage of the method proposed is its breadth of applicability.

Relevance: 20.00%

Abstract:

Individual identification via DNA profiling is important in molecular ecology, particularly in the case of noninvasive sampling. A key quantity in determining the number of loci required is the probability of identity (PIave): the probability of observing two copies of any profile in the population. Previously this has been calculated assuming no inbreeding or population structure. Here we introduce formulae that account for these factors, whilst also accounting for relatedness structure in the population. These formulae are implemented in API-CALC 1.0, which calculates PIave for either a specified value or a range of values of F_IS and F_ST.
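The baseline quantity that the paper extends can be sketched as follows. This is the standard no-inbreeding, no-structure probability of identity, PI = 2(Σp_i²)² − Σp_i⁴ per locus, multiplied across independent loci; the F_IS/F_ST corrections introduced by the paper are not reproduced here:

```python
def pi_locus(freqs):
    """Baseline single-locus probability of identity (no inbreeding,
    no population structure): PI = 2*(sum p_i^2)^2 - sum p_i^4."""
    s2 = sum(p ** 2 for p in freqs)
    s4 = sum(p ** 4 for p in freqs)
    return 2.0 * s2 ** 2 - s4

def pi_multilocus(loci):
    """Under independence, PI over several loci is the per-locus product."""
    out = 1.0
    for freqs in loci:
        out *= pi_locus(freqs)
    return out
```

For a two-allele locus with equal frequencies the per-locus PI is 0.375, which illustrates why many loci are needed before profiles become effectively unique.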

Relevance: 20.00%

Abstract:

This paper deals with the energy consumption and the evaluation of the performance of air supply systems for a ventilated room involving high- and low-level supplies. The energy performance assessment is based on the airflow rate, which is related to the fan power consumption by achieving the same environmental quality performance for each case. Four different ventilation systems are considered: wall displacement ventilation, confluent jets ventilation, impinging jet ventilation and a high-level mixing ventilation system. The ventilation performance of these systems will be examined by means of achieving the same Air Distribution Index (ADI) for different cases. The widely used high-level supplies require much more fan power than low-level supplies to achieve the same value of ADI. In addition, the supply velocity, hence the supply dynamic pressure, for a high-level supply is much larger than for low-level supplies. This further increases the power consumption for high-level supply systems. The paper considers these factors and attempts to provide some guidelines on the difference in the energy consumption associated with high- and low-level air supply systems. This will be useful information for designers; to the authors' knowledge, little information is available in the literature on this aspect of room air distribution. The energy performance of the above-mentioned ventilation systems has been evaluated on the basis of the fan power consumed, which is related to the airflow rate required to provide an equivalent indoor environment. The ADI is used to evaluate the indoor environment produced in the room by the ventilation strategy being used. The results reveal that mixing ventilation requires the highest fan power and the confluent jets ventilation needs the lowest fan power in order to achieve nearly the same value of ADI.
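The fan-power argument can be made concrete with the basic relations P = Q·Δp/η and dynamic pressure q = ½ρv². The air density and fan efficiency below are illustrative assumptions, not figures from the paper:

```python
RHO_AIR = 1.2  # kg/m^3, assumed indoor air density

def dynamic_pressure(velocity):
    """Supply dynamic pressure in Pa: q = 0.5 * rho * v^2."""
    return 0.5 * RHO_AIR * velocity ** 2

def fan_power(flow_rate, pressure_rise, efficiency=0.6):
    """Fan power in W: P = Q * dp / eta, for flow rate Q (m^3/s),
    total pressure rise dp (Pa) and an assumed fan efficiency eta."""
    return flow_rate * pressure_rise / efficiency
```

Because q grows with the square of the supply velocity, a high-level supply at higher velocity pays a disproportionate pressure (and hence power) penalty for the same airflow rate.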

Relevance: 20.00%

Abstract:

Objectives: To clarify the role of growth monitoring in primary school children, including obesity, and to examine issues that might impact on the effectiveness and cost-effectiveness of such programmes. Data sources: Electronic databases were searched up to July 2005. Experts in the field were also consulted. Review methods: Data extraction and quality assessment were performed on studies meeting the review's inclusion criteria. The performance of growth monitoring to detect disorders of stature and obesity was evaluated against National Screening Committee (NSC) criteria. Results: In the 31 studies that were included in the review, there were no controlled trials of the impact of growth monitoring and no studies of the diagnostic accuracy of different methods for growth monitoring. Analysis of the studies that presented a 'diagnostic yield' of growth monitoring suggested that one-off screening might identify between 1:545 and 1:1793 new cases of potentially treatable conditions. Economic modelling suggested that growth monitoring is associated with health improvements [incremental cost per quality-adjusted life-year (QALY) of £9500] and indicated that monitoring was cost-effective 100% of the time over the given probability distributions for a willingness-to-pay threshold of £30,000 per QALY. Studies of obesity focused on the performance of body mass index against measures of body fat. A number of issues relating to human resources required for growth monitoring were identified, but data on attitudes to growth monitoring were extremely sparse. Preliminary findings from economic modelling suggested that primary prevention may be the most cost-effective approach to obesity management, but the model incorporated a great deal of uncertainty. Conclusions: This review has indicated the potential utility and cost-effectiveness of growth monitoring in terms of increased detection of stature-related disorders.
It has also pointed strongly to the need for further research. Growth monitoring does not currently meet all NSC criteria. However, it is questionable whether some of these criteria can be meaningfully applied to growth monitoring given that short stature is not a disease in itself, but is used as a marker for a range of pathologies and as an indicator of general health status. Identification of effective interventions for the treatment of obesity is likely to be considered a prerequisite to any move from monitoring to a screening programme designed to identify individual overweight and obese children. Similarly, further long-term studies of the predictors of obesity-related co-morbidities in adulthood are warranted. A cluster randomised trial comparing growth monitoring strategies with no growth monitoring in the general population would most reliably determine the clinical effectiveness of growth monitoring. Studies of diagnostic accuracy, alongside evidence of effective treatment strategies, could provide an alternative approach. In this context, careful consideration would need to be given to target conditions and intervention thresholds. Diagnostic accuracy studies would require long-term follow-up of both short and normal children to determine sensitivity and specificity of growth monitoring.
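The cost-effectiveness comparison above rests on the standard incremental cost-effectiveness ratio (ICER) and a willingness-to-pay decision rule. A minimal sketch with illustrative numbers, not the review's actual model:

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY
    of the new strategy relative to the comparator."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

def cost_effective(icer_value, threshold=30000.0):
    """Decision rule against a willingness-to-pay threshold (GBP/QALY)."""
    return icer_value <= threshold
```

An intervention with an ICER of £9500 per QALY clears the £30,000 threshold comfortably, which is the sense in which monitoring was judged cost-effective.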

Relevance: 20.00%

Abstract:

Using the classical Parzen window estimate as the target function, the kernel density estimation is formulated as a regression problem and the orthogonal forward regression technique is adopted to construct sparse kernel density estimates. The proposed algorithm incrementally minimises a leave-one-out test error score to select a sparse kernel model, and a local regularisation method is incorporated into the density construction process to further enforce sparsity. The kernel weights are finally updated using the multiplicative nonnegative quadratic programming algorithm, which has the ability to reduce the model size further. Except for the kernel width, the proposed algorithm has no other parameters that need tuning, and the user is not required to specify any additional criterion to terminate the density construction procedure. Two examples are used to demonstrate the ability of this regression-based approach to effectively construct a sparse kernel density estimate with comparable accuracy to that of the full-sample optimised Parzen window density estimate.
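The target function is the classical Parzen window estimate, which for a Gaussian kernel of width h is p̂(x) = (1/n) Σ_k N(x; x_k, h²). A minimal one-dimensional sketch of that target:

```python
import math

def parzen_density(x, samples, width):
    """Classical Parzen window estimate with Gaussian kernels:
    p(x) = (1/n) * sum_k N(x; x_k, width^2)."""
    n = len(samples)
    norm = 1.0 / (math.sqrt(2.0 * math.pi) * width)
    return sum(norm * math.exp(-0.5 * ((x - xk) / width) ** 2)
               for xk in samples) / n
```

The sparse estimator described above approximates this full-sample target with far fewer kernels, selected incrementally by orthogonal forward regression.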

Relevance: 20.00%

Abstract:

A generalized or tunable-kernel model is proposed for probability density function estimation based on an orthogonal forward regression procedure. Each stage of the density estimation process determines a tunable kernel, namely, its center vector and diagonal covariance matrix, by minimizing a leave-one-out test criterion. The kernel mixing weights of the constructed sparse density estimate are finally updated using the multiplicative nonnegative quadratic programming algorithm to ensure the nonnegative and unity constraints, and this weight-updating process additionally has the desired ability to further reduce the model size. The proposed tunable-kernel model has advantages, in terms of model generalization capability and model sparsity, over the standard fixed-kernel model that restricts kernel centers to the training data points and employs a single common kernel variance for every kernel. On the other hand, it does not optimize all the model parameters together and thus avoids the problems of high-dimensional ill-conditioned nonlinear optimization associated with the conventional finite mixture model. Several examples are included to demonstrate the ability of the proposed tunable-kernel model to construct very compact yet accurate density estimates.

Relevance: 20.00%

Abstract:

The differential phase (ΦDP) measured by polarimetric radars is recognized to be a very good indicator of the path-integrated attenuation caused by rain. Moreover, if a linear relationship is assumed between the specific differential phase (KDP) and the specific attenuation (AH) and specific differential attenuation (ADP), then attenuation can easily be corrected. The coefficients of proportionality, γH and γDP, are, however, known to depend in rain upon drop temperature, drop shapes, drop size distribution, and the presence of large drops causing Mie scattering. In this paper, the authors extensively apply a physically based method, often referred to as the "Smyth and Illingworth constraint," which uses the constraint that the value of the differential reflectivity ZDR on the far side of the storm should be low to retrieve the γDP coefficient. More than 30 convective episodes observed by the French operational C-band polarimetric Trappes radar during two summers (2005 and 2006) are used to document the variability of γDP with respect to the intrinsic three-dimensional characteristics of the attenuating cells. The Smyth and Illingworth constraint could be applied to only 20% of all attenuated rays of the 2-yr dataset so it cannot be considered the unique solution for attenuation correction in an operational setting but is useful for characterizing the properties of the strongly attenuating cells. The range of variation of γDP is shown to be extremely large, with minimal, maximal, and mean values being, respectively, equal to 0.01, 0.11, and 0.025 dB °−1. Coefficient γDP appears to be almost linearly correlated with the horizontal reflectivity (ZH), differential reflectivity (ZDR), and specific differential phase (KDP) and correlation coefficient (ρHV) of the attenuating cells. The temperature effect is negligible with respect to that of the microphysical properties of the attenuating cells.
Unusually large values of γDP, above 0.06 dB °−1, often referred to as “hot spots,” are reported for 15%—a nonnegligible figure—of the rays presenting a significant total differential phase shift (ΔϕDP > 30°). The corresponding strongly attenuating cells are shown to have extremely high ZDR (above 4 dB) and ZH (above 55 dBZ), very low ρHV (below 0.94), and high KDP (above 4° km−1). Analysis of 4 yr of observed raindrop spectra does not reproduce such low values of ρHV, suggesting that (wet) ice is likely to be present in the precipitation medium and responsible for the attenuation and high phase shifts. Furthermore, if melting ice is responsible for the high phase shifts, this suggests that KDP may not be uniquely related to rainfall rate but can result from the presence of wet ice. This hypothesis is supported by the analysis of the vertical profiles of horizontal reflectivity and the values of conventional probability of hail indexes.
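The linear correction step that the γ coefficients enable (attenuation proportional to KDP, hence path-integrated attenuation proportional to ΦDP) can be sketched as follows. Here the coefficient is a free input, whereas the paper retrieves it via the Smyth and Illingworth constraint:

```python
def correct_attenuation(z_measured, phi_dp, gamma):
    """Linear Phi_DP-based attenuation correction along a ray:
    Z_corrected(r) = Z_measured(r) + gamma * Phi_DP(r),
    with Z in dB(Z), Phi_DP in degrees, gamma in dB per degree.
    The same form applies to ZDR with gamma_DP in place of gamma."""
    return [z + gamma * phi for z, phi in zip(z_measured, phi_dp)]
```

With γ around the mean value reported above (0.025 dB per degree for ZDR; larger for ZH), a 100° differential phase shift implies several dB of correction at the far end of the ray.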

Relevance: 20.00%

Abstract:

A two-locus match probability is presented that incorporates the effects of within-subpopulation inbreeding (consanguinity) in addition to population subdivision. The usual practice of calculating multi-locus match probabilities as the product of single-locus probabilities assumes independence between loci. There are a number of population genetics phenomena that can violate this assumption: in addition to consanguinity, which increases homozygosity at all loci simultaneously, gametic disequilibrium will introduce dependence into DNA profiles. However, in forensics the latter problem is usually addressed in part by the careful choice of unlinked loci. Hence, as is conventional, we assume gametic equilibrium here, and focus instead on between-locus dependence due to consanguinity. The resulting match probability formulae are an extension of existing methods in the literature, and are shown to be more conservative than these methods in the case of double homozygote matches. For two-locus profiles involving one or more heterozygous genotypes, results are similar to, or smaller than, the existing approaches.
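The single-locus building block behind subdivision-corrected match probabilities is the standard Balding-Nichols formula with subdivision parameter θ (F_ST). The paper's two-locus extension incorporating consanguinity is not reproduced here; this sketch shows only the per-locus baseline it generalizes:

```python
def match_prob_homozygote(p, theta):
    """Balding-Nichols single-locus match probability for a homozygote
    A_iA_i profile, given allele frequency p and subdivision theta."""
    num = (2.0 * theta + (1.0 - theta) * p) * (3.0 * theta + (1.0 - theta) * p)
    return num / ((1.0 + theta) * (1.0 + 2.0 * theta))

def match_prob_heterozygote(p_i, p_j, theta):
    """Balding-Nichols match probability for a heterozygote A_iA_j."""
    num = 2.0 * (theta + (1.0 - theta) * p_i) * (theta + (1.0 - theta) * p_j)
    return num / ((1.0 + theta) * (1.0 + 2.0 * theta))
```

At θ = 0 these reduce to the Hardy-Weinberg values p² and 2p_ip_j; a positive θ inflates the homozygote match probability, which is the conservative direction the paper's consanguinity correction pushes further.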

Relevance: 20.00%

Abstract:

A new sparse kernel probability density function (pdf) estimator based on a zero-norm constraint is constructed using the classical Parzen window (PW) estimate as the target function. The so-called zero-norm of the parameters is used in order to achieve enhanced model sparsity, and it is suggested to minimize an approximate function of the zero-norm. It is shown that, under certain conditions, the kernel weights of the proposed pdf estimator based on the zero-norm approximation can be updated using the multiplicative nonnegative quadratic programming algorithm. Numerical examples are employed to demonstrate the efficacy of the proposed approach.

Relevance: 20.00%

Abstract:

This paper examines the significance of widely used leading indicators of the UK economy for predicting the cyclical pattern of commercial real estate performance. The analysis uses monthly capital value data for UK industrials, offices and retail from the Investment Property Databank (IPD). Prospective economic indicators are drawn from three sources: the series used by the US Conference Board to construct their UK leading indicator and the series deployed by two private organisations, Lombard Street Research and NTC Research, to predict UK economic activity. We first identify turning points in the capital value series, adopting techniques employed in the classical business cycle literature. We then estimate probit models using the leading economic indicators as independent variables and forecast the probability of different phases of capital values, that is, periods of declining and rising capital values. The forecast performance of the models is tested and found to be satisfactory. The predictability of lasting directional changes in property performance represents a useful tool for real estate investment decision-making.
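The forecasting step can be sketched as evaluating a fitted probit model, P(decline) = Φ(β₀ + Σ_k β_k·x_k), where Φ is the standard normal CDF. The coefficients below are illustrative placeholders, not those estimated in the paper:

```python
import math

def std_normal_cdf(z):
    """Standard normal CDF, Phi(z), via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def probit_probability(indicators, betas, intercept):
    """Probability of a capital-value decline under a fitted probit model:
    P = Phi(b0 + sum_k b_k * x_k), for leading-indicator values x_k."""
    z = intercept + sum(b * x for b, x in zip(betas, indicators))
    return std_normal_cdf(z)
```

Estimation would fit the β coefficients by maximum likelihood on the binary phase labels derived from the turning-point analysis; only the evaluation step is shown here.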

Relevance: 20.00%

Abstract:

Recent research has suggested that relatively cold UK winters are more common when solar activity is low (Lockwood et al 2010 Environ. Res. Lett. 5 024001). Solar activity during the current sunspot minimum has fallen to levels unknown since the start of the 20th century (Lockwood 2010 Proc. R. Soc. A 466 303–29) and records of past solar variations inferred from cosmogenic isotopes (Abreu et al 2008 Geophys. Res. Lett. 35 L20109) and geomagnetic activity data (Lockwood et al 2009 Astrophys. J. 700 937–44) suggest that the current grand solar maximum is coming to an end and hence that solar activity can be expected to continue to decline. Combining cosmogenic isotope data with the long record of temperatures measured in central England, we estimate how solar change could influence the probability in the future of further UK winters that are cold, relative to the hemispheric mean temperature, if all other factors remain constant. Global warming is taken into account only through the detrending using mean hemispheric temperatures. We show that some predictive skill may be obtained by including the solar effect.

Relevance: 20.00%

Abstract:

The application of automatic segmentation methods in lesion detection is desirable. However, such methods are restricted by intensity similarities between lesioned and healthy brain tissue. Using multi-spectral magnetic resonance imaging (MRI) modalities may overcome this problem but it is not always practicable. In this article, a lesion detection approach requiring a single MRI modality is presented, which is an improved method based on a recent publication. This new method assumes that low similarity should be found in lesion regions when the likeness between an intensity-based fuzzy segmentation and location-based tissue probabilities is measured. The use of a normalized similarity measurement enables the current method to fine-tune the threshold for lesion detection, thus maximizing the possibility of reaching high detection accuracy. Importantly, an extra cleaning step is included in the current approach which removes enlarged ventricles from detected lesions. The performance investigation using simulated lesions demonstrated that not only were the majority of lesions well detected but normal tissues were also identified effectively. Tests on images acquired in stroke patients further confirmed the strength of the method in lesion detection. When compared with the previous version, the current approach showed a higher sensitivity in detecting small lesions and had fewer false positives around the ventricle and the edge of the brain.
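The similarity-thresholding idea can be sketched as follows. The particular similarity measure and threshold are illustrative assumptions, since the abstract does not give the paper's exact definitions:

```python
def normalized_similarity(fuzzy, prior):
    """Voxel-wise likeness between a fuzzy tissue membership and an
    atlas-based tissue probability, rescaled to [0, 1] over the image.
    (Illustrative min/max-ratio measure; the paper's may differ.)"""
    raw = [min(f, p) / max(f, p) if max(f, p) > 0 else 1.0
           for f, p in zip(fuzzy, prior)]
    lo, hi = min(raw), max(raw)
    if hi == lo:
        return [1.0] * len(raw)
    return [(r - lo) / (hi - lo) for r in raw]

def flag_lesions(similarity, threshold=0.2):
    """Voxels with low normalized similarity are candidate lesions."""
    return [s < threshold for s in similarity]
```

Normalizing the raw similarities over the whole image is what allows the detection threshold to be tuned consistently across scans, as described above; a subsequent cleaning step would then remove flagged voxels belonging to enlarged ventricles.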