20 results for lower bound

in CentAUR: Central Archive University of Reading - UK


Relevance:

100.00%

Abstract:

The determination of the minimum size of a k-neighborhood (i.e., a neighborhood of a set of k nodes) in a given graph is essential in the analysis of diagnosability and fault tolerance of multicomputer systems. The generalized cubes include the hypercube and most hypercube variants as special cases. In this paper, we present a lower bound on the size of a k-neighborhood in n-dimensional generalized cubes, where 2n + 1 ≤ k ≤ 3n − 2. This lower bound is tight in that it is met by the n-dimensional hypercube. Our result is an extension of two previously known results.
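
Since the range 2n + 1 ≤ k ≤ 3n − 2 is small for small n, the tightness claim can be checked directly on a hypercube. Below is a minimal brute-force sketch, assuming the common convention that the neighborhood N(S) consists of the nodes outside S adjacent to some node of S; the function names are illustrative, not from the paper.

```python
from itertools import combinations

def hypercube_neighbors(v, n):
    """Neighbors of node v in the n-dimensional hypercube Q_n (single bit flips)."""
    return [v ^ (1 << i) for i in range(n)]

def min_k_neighborhood(n, k):
    """Minimum size of N(S) over all k-node sets S in Q_n, by brute force.

    N(S) is taken here as the nodes outside S adjacent to some node of S
    (one common convention; the paper's definition may differ in detail).
    """
    best = None
    for S in combinations(range(2 ** n), k):
        S_set = set(S)
        nbhd = {w for v in S for w in hypercube_neighbors(v, n)} - S_set
        if best is None or len(nbhd) < best:
            best = len(nbhd)
    return best

# Exhaustive check on Q_4 over the paper's range 2n+1 <= k <= 3n-2, i.e. k = 9, 10:
n = 4
for k in range(2 * n + 1, 3 * n - 1):
    print(k, min_k_neighborhood(n, k))
```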

Relevance:

100.00%

Abstract:

Transient and equilibrium sensitivity of Earth's climate has been calculated using global temperature, forcing and heating rate data for the period 1970–2010. We have assumed increased long-wave radiative forcing in the period due to the increase of the long-lived greenhouse gases. By assuming the change in aerosol forcing in the period to be zero, we calculate what we consider to be lower bounds to these sensitivities, as the magnitude of the negative aerosol forcing is unlikely to have diminished in this period. The radiation imbalance necessary to calculate equilibrium sensitivity is estimated from the rate of ocean heat accumulation as 0.37 ± 0.03 W m^−2 (all uncertainty estimates are 1σ). With these data, we obtain best estimates for transient climate sensitivity 0.39 ± 0.07 K (W m^−2)^−1 and equilibrium climate sensitivity 0.54 ± 0.14 K (W m^−2)^−1, equivalent to 1.5 ± 0.3 and 2.0 ± 0.5 K (3.7 W m^−2)^−1, respectively. The latter quantity is equal to the lower bound of the ‘likely’ range for this quantity given by the 2007 IPCC Assessment Report. The uncertainty attached to the lower-bound equilibrium sensitivity permits us to state, within the assumptions of this analysis, that the equilibrium sensitivity is greater than 0.31 K (W m^−2)^−1, equivalent to 1.16 K (3.7 W m^−2)^−1, at the 95% confidence level.
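
For illustration, sensitivities of this kind follow from the standard zero-dimensional energy-budget relations S_tr = ΔT/ΔF and S_eq = ΔT/(ΔF − Q), where Q is the radiation imbalance. A minimal sketch, using the abstract's Q = 0.37 W m^−2 but hypothetical placeholder values for ΔT and ΔF (the paper's inputs are not given above):

```python
# Energy-budget sensitivity estimates in the spirit of the abstract.
# Only the imbalance Q is given above; dT and dF below are illustrative
# placeholders, not values from the paper.
Q = 0.37        # radiation imbalance from ocean heat accumulation, W m^-2
dT = 0.50       # hypothetical 1970-2010 global temperature change, K
dF = 1.28       # hypothetical change in radiative forcing, W m^-2

S_transient = dT / dF          # K (W m^-2)^-1
S_equilibrium = dT / (dF - Q)  # K (W m^-2)^-1

# Express per CO2 doubling (3.7 W m^-2), as in the abstract:
print(f"transient:   {S_transient:.2f} K (W m^-2)^-1 = "
      f"{3.7 * S_transient:.1f} K per doubling")
print(f"equilibrium: {S_equilibrium:.2f} K (W m^-2)^-1 = "
      f"{3.7 * S_equilibrium:.1f} K per doubling")
```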

Relevance:

60.00%

Abstract:

Two simple and frequently used capture–recapture estimates of the population size are compared: Chao's lower-bound estimate and Zelterman's estimate allowing for contaminated distributions. In the Poisson case it is shown that if there are only counts of ones and twos, Zelterman's estimator is always bounded above by Chao's estimator. If counts larger than two exist, Zelterman's estimator becomes larger than Chao's when the ratio of the frequency of twos to that of ones is sufficiently small. A similar analysis is provided for the binomial case. For a two-component mixture of Poisson distributions the asymptotic bias of both estimators is derived, and it is shown that the Zelterman estimator can suffer from a large overestimation bias. A modified Zelterman estimator is suggested, and the bias-corrected version of Chao's estimator is also considered. All four estimators are compared in a simulation study.
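
Both estimators have simple closed forms built from the frequency f1 of units observed exactly once and f2 of units observed exactly twice: Chao's lower bound is n + f1²/(2 f2), and Zelterman's estimate is n/(1 − e^(−λ̂)) with λ̂ = 2 f2/f1, where n is the number of observed units. A minimal sketch with hypothetical counts:

```python
import math

def chao_lower_bound(freq):
    """Chao's lower-bound estimate of the population size.

    freq[j] = number of units observed exactly j times (j = 1, 2, ...).
    """
    n = sum(freq.values())              # units observed at least once
    f1, f2 = freq.get(1, 0), freq.get(2, 0)
    return n + f1 ** 2 / (2 * f2)

def zelterman(freq):
    """Zelterman's estimate: n / (1 - exp(-lam)) with lam = 2 * f2 / f1."""
    n = sum(freq.values())
    f1, f2 = freq.get(1, 0), freq.get(2, 0)
    lam = 2 * f2 / f1
    return n / (1 - math.exp(-lam))

# Hypothetical data: 50 units seen once, 20 seen twice, 5 seen three times.
freq = {1: 50, 2: 20, 3: 5}
print(chao_lower_bound(freq))   # 75 + 2500/40 = 137.5
print(zelterman(freq))          # 75 / (1 - exp(-0.8)) ~ 136.2
```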

Relevance:

60.00%

Abstract:

This paper investigates the applications of capture–recapture methods to human populations. Capture–recapture methods are commonly used in estimating the size of wildlife populations, but can also be used in epidemiology and the social sciences, for estimating the prevalence of a particular disease or the size of the homeless population in a certain area. Here we focus on estimating the prevalence of infectious diseases. Several estimators of population size are considered: the Lincoln–Petersen estimator and its modified version, the Chapman estimator, Chao's lower bound estimator, Zelterman's estimator, McKendrick's moment estimator and the maximum likelihood estimator. In order to evaluate these estimators, they are applied to real, three-source capture–recapture data. By conditioning on each source of the three-source data, we have been able to compare the estimators with the true value that they are estimating. The Chapman and Chao estimators were compared in terms of their relative bias. A variance formula derived through conditioning is suggested for Chao's estimator, and normal 95% confidence intervals are calculated for this and the Chapman estimator. We then compare the coverage of the respective confidence intervals. Furthermore, a simulation study is included to compare Chao's and Chapman's estimators. Results indicate that Chao's estimator is less biased than Chapman's estimator unless both sources are independent. Chao's estimator also has the smaller mean squared error. Finally, the implications and limitations of the above methods are discussed, with suggestions for further development.
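
The two-source estimators compared here have simple closed forms: with sample sizes n1 and n2 and overlap m, the Lincoln–Petersen estimate is n1·n2/m and the Chapman modification is (n1 + 1)(n2 + 1)/(m + 1) − 1. A minimal sketch on hypothetical two-source data:

```python
def lincoln_petersen(n1, n2, m):
    """Lincoln-Petersen estimate: n1 * n2 / m, where n1 and n2 are the two
    sample sizes and m is the number of units seen in both samples."""
    return n1 * n2 / m

def chapman(n1, n2, m):
    """Chapman's modification, defined even when m = 0 and less biased
    in small samples."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Hypothetical two-source disease register: 120 cases in source A,
# 90 in source B, 30 notified by both.
print(lincoln_petersen(120, 90, 30))  # 360.0
print(chapman(120, 90, 30))           # ~ 354.2
```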

Relevance:

60.00%

Abstract:

None of the current surveillance streams monitoring the presence of scrapie in Great Britain provides a comprehensive and unbiased estimate of the prevalence of the disease at the holding level. Previous work to estimate the under-ascertainment adjusted prevalence of scrapie in Great Britain applied multiple-list capture–recapture methods. The enforcement of new control measures on scrapie-affected holdings in 2004 has stopped the overlap between surveillance sources and, hence, the application of multiple-list capture–recapture models. Alternative methods, still within the capture–recapture methodology but relying on repeated entries in one single list, have been suggested for these situations. In this article, we apply one-list capture–recapture approaches to data held on the Scrapie Notifications Database to estimate the undetected population of scrapie-affected holdings with clinical disease in Great Britain for the years 2002, 2003, and 2004. To do so, we develop a new diagnostic tool for the indication of heterogeneity as well as a new understanding of the Zelterman and Chao lower bound estimators to account for potential unobserved heterogeneity. We demonstrate that the Zelterman estimator can be viewed as a maximum likelihood estimator for a special, locally truncated Poisson likelihood equivalent to a binomial likelihood. This understanding allows the extension of the Zelterman approach by means of logistic regression to include observed heterogeneity in the form of covariates, in the case studied here the holding size and country of origin. Our results confirm the presence of substantial unobserved heterogeneity, supporting the application of our two estimators. The total scrapie-affected holding population in Great Britain is around 300 holdings per year. None of the covariates appears to inform the model significantly.
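
A sketch of the covariate extension described here: under the Poisson model truncated to counts of one and two, P(count = 2 | count ∈ {1, 2}) = (λ/2)/(1 + λ/2), so the linear predictor of a logistic regression of the event {count = 2} models log(λ/2). The data and the log-size covariate below are hypothetical, not from the Scrapie Notifications Database:

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical one-list data: per-holding notification counts and holding size.
counts = np.array([1, 1, 2, 1, 3, 2, 1, 1, 2, 1])
size = np.array([120, 80, 200, 60, 350, 90, 150, 70, 100, 180])

# Restrict to counts of one and two; regress the event {count == 2}
# on log holding size. The fitted linear predictor estimates log(lam/2).
mask = (counts == 1) | (counts == 2)
y = (counts[mask] == 2).astype(float)
X = sm.add_constant(np.log(size[mask]))
fit = sm.GLM(y, X, family=sm.families.Binomial()).fit()

# Horvitz-Thompson-style total: each observed unit contributes
# 1 / P(observed) = 1 / (1 - exp(-lam_i)), with lam_i from its covariate.
eta = sm.add_constant(np.log(size)) @ fit.params   # linear predictor, all units
lam = 2.0 * np.exp(eta)
N_hat = np.sum(1.0 / (1.0 - np.exp(-lam)))
print(round(N_hat, 1))
```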

Relevance:

60.00%

Abstract:

In this paper, we apply one-list capture–recapture models to estimate the number of scrapie-affected holdings in Great Britain. We applied this technique to the Compulsory Scrapie Flocks Scheme dataset, in which cases from all the surveillance sources monitoring the presence of scrapie in Great Britain (the abattoir survey, the fallen stock survey and the statutory reporting of clinical cases) are gathered. Consequently, the estimates of prevalence obtained from this scheme should be comprehensive and cover all the different presentations of the disease captured individually by the surveillance sources. Two estimators were applied under the one-list approach: the Zelterman estimator and Chao's lower bound estimator. Our results could only inform with confidence the scrapie-affected holding population with clinical disease; this is around 350 holdings in Great Britain for the period under study, April 2005 to April 2006. Our models allowed stratification by surveillance source and the input of covariate information, namely holding size and country of origin. None of the covariates appears to inform the model significantly.
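
As a sketch of the stratified one-list estimation described here (with hypothetical frequency summaries, not the scheme's figures), the Zelterman estimator can be applied within each surveillance stream and the stratum estimates summed:

```python
import math

def zelterman_estimate(f1, f2, n):
    """Zelterman estimate from the singleton count f1, doubleton count f2,
    and total number of observed holdings n."""
    lam = 2 * f2 / f1
    return n / (1 - math.exp(-lam))

# Hypothetical per-source summaries: (f1, f2, observed holdings).
strata = {
    "abattoir survey":    (40, 8, 52),
    "fallen stock":       (25, 5, 33),
    "clinical reporting": (60, 15, 82),
}

# Stratified estimate: apply the estimator within each source and sum.
total = sum(zelterman_estimate(f1, f2, n) for f1, f2, n in strata.values())
print(round(total))
```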

Relevance:

60.00%

Abstract:

Recent literature has described a “transition zone” between the average top of deep convection in the Tropics and the stratosphere. Here transport across this zone is investigated using an offline trajectory model. Particles were advected by the resolved winds from the European Centre for Medium-Range Weather Forecasts reanalyses. For each boreal winter clusters of particles were released in the upper troposphere over the four main regions of tropical deep convection (Indonesia, central Pacific, South America, and Africa). Most particles remain in the troposphere, descending on average for every cluster. The horizontal components of 5-day trajectories are strongly influenced by the El Niño–Southern Oscillation (ENSO), but the Lagrangian average descent does not have a clear ENSO signature. Tropopause crossing locations are first identified by recording events when trajectories from the same release regions cross the World Meteorological Organization lapse rate tropopause. Most crossing events occur 5–15 days after release, and 30-day trajectories are sufficiently long to estimate crossing number densities. In a further two experiments, slight excursions across the lapse rate tropopause are differentiated from the drift deeper into the stratosphere by defining the “tropopause zone” as a layer bounded by the average potential temperature of the lapse rate tropopause and the profile temperature minimum. Transport upward across this zone is studied using forward trajectories released from the lower bound and back trajectories arriving at the upper bound. Histograms of particle potential temperature (θ) show marked differences between the transition zone, where there is a slow spread in θ values about a peak that shifts slowly upward, and the troposphere below 350 K. There, forward trajectories experience slow radiative cooling interspersed with bursts of convective heating, resulting in a well-mixed distribution. In contrast, θ histograms for back trajectories arriving in the stratosphere have two distinct peaks just above 300 and 350 K, indicating the sharp change from rapid convective heating in the well-mixed troposphere to slow ascent in the transition zone. Although trajectories slowly cross the tropopause zone throughout the Tropics, all three experiments show that most trajectories reaching the stratosphere from the lower troposphere within 30 days do so over the west Pacific warm pool. This preferred location moves about 30°–50° farther east in an El Niño year (1982/83) and about 30° farther west in a La Niña year (1988/89). These results could have important implications for upper-troposphere–lower-stratosphere pollution and chemistry studies.
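
A sketch of how crossing events of the kind counted here might be flagged from trajectory output: given the potential temperature along a trajectory and the two bounds of the tropopause zone, record the first time a trajectory released below the zone emerges above it. The thresholds and the synthetic trace are illustrative only:

```python
import numpy as np

def crossing_day(theta, lower, upper):
    """First time index at which a trajectory released below the tropopause
    zone (theta < lower) emerges above it (theta > upper); None otherwise.

    lower, upper: zone bounds, e.g. the mean lapse-rate-tropopause theta and
    the theta of the profile temperature minimum.
    """
    if theta[0] >= lower:               # released inside or above the zone
        return None
    above = np.nonzero(theta > upper)[0]
    return int(above[0]) if above.size else None

# Hypothetical daily theta trace for one 30-day trajectory (K):
# slow drift, then a burst of convective heating after day 10.
t = np.arange(31)
theta = 340.0 + 2.0 * t * (t > 10)
print(crossing_day(theta, lower=360.0, upper=380.0))   # 21
```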

Relevance:

60.00%

Abstract:

A common problem in many data-based modelling algorithms, such as associative memory networks, is the curse of dimensionality. In this paper, a new two-stage neurofuzzy system design and construction algorithm (NeuDeC) for nonlinear dynamical processes is introduced to effectively tackle this problem. A new simple preprocessing method is initially derived and applied to reduce the rule base, followed by a fine model detection process based on the reduced rule set using forward orthogonal least squares model structure detection. In both stages, new criteria based on A-optimality experimental design are used. In the preprocessing stage, a lower bound of the A-optimality design criterion is derived and applied as a subset selection metric, while in the later stage the A-optimality design criterion is incorporated into a new composite cost function that minimises model prediction error as well as penalising the model parameter variance. The utilisation of NeuDeC leads to unbiased model parameters with low parameter variance, with the additional benefit of a parsimonious model structure. Numerical examples are included to demonstrate the effectiveness of this new modelling approach for high-dimensional inputs.
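
A rough sketch of a forward selection loop with an A-optimality-style penalty, where each candidate's cost combines the residual sum of squares with trace((XᵀX)⁻¹), a standard A-optimality measure of parameter variance. This illustrates the idea of the composite cost, not NeuDeC itself; the weighting alpha and the toy data are hypothetical:

```python
import numpy as np

def forward_aopt(X, y, n_terms, alpha=1.0):
    """Greedy forward subset selection: at each step add the column minimising
    ||y - X_s beta||^2 + alpha * trace((X_s^T X_s)^{-1}),
    i.e. prediction error plus a parameter-variance penalty."""
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(n_terms):
        best, best_cost = None, np.inf
        for j in remaining:
            Xs = X[:, selected + [j]]
            G = Xs.T @ Xs
            if np.linalg.cond(G) > 1e12:        # skip ill-conditioned candidates
                continue
            beta = np.linalg.solve(G, Xs.T @ y)
            sse = float(np.sum((y - Xs @ beta) ** 2))
            cost = sse + alpha * float(np.trace(np.linalg.inv(G)))
            if cost < best_cost:
                best, best_cost = j, cost
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy demonstration on random regressors (not the paper's neurofuzzy basis):
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = 2.0 * X[:, 3] - 1.0 * X[:, 7] + 0.1 * rng.normal(size=200)
print(forward_aopt(X, y, n_terms=2))   # expected to pick columns 3 and 7
```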

Relevance:

60.00%

Abstract:

The Gram-Schmidt (GS) orthogonalisation procedure has been used to improve the convergence speed of least mean square (LMS) adaptive code-division multiple-access (CDMA) detectors. However, this algorithm updates two sets of parameters, namely the GS transform coefficients and the tap weights, simultaneously. Because of the additional adaptation noise introduced by the former, it is impossible to achieve the same performance as the ideal orthogonalised LMS filter, unlike the result implied in an earlier paper. The authors provide a lower bound on the minimum achievable mean squared error (MSE) as a function of the forgetting factor λ used in finding the GS transform coefficients, and propose a variable-λ algorithm to balance the conflicting requirements of good tracking and low misadjustment.
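
For readers unfamiliar with the components involved, the sketch below shows a plain LMS tap-weight update (without the Gram-Schmidt stage the paper analyses) and an exponentially weighted average whose forgetting factor exhibits the tracking-versus-noise tradeoff that motivates the proposed variable-λ algorithm. The identification problem is a toy example, not a CDMA detector:

```python
import numpy as np

def lms(x, d, n_taps=4, mu=0.05):
    """Standard LMS tap-weight adaptation: w <- w + mu * e * u."""
    w = np.zeros(n_taps)
    for k in range(n_taps - 1, len(x)):
        u = x[k - n_taps + 1 : k + 1][::-1]   # regressor [x[k], x[k-1], ...]
        e = d[k] - w @ u                      # a-priori error
        w += mu * e * u
    return w

def ewma(z, lam):
    """Exponentially weighted mean with forgetting factor lam: small lam
    tracks changes quickly but is noisy; lam near 1 is smooth but slow to
    track - the conflict a variable-lam scheme balances."""
    m, out = 0.0, []
    for v in z:
        m = lam * m + (1.0 - lam) * v
        out.append(m)
    return np.array(out)

# Toy identification problem: recover a 4-tap channel from noisy output.
rng = np.random.default_rng(1)
x = rng.normal(size=2000)
h = np.array([1.0, 0.5, -0.3, 0.1])
d = np.convolve(x, h)[: len(x)] + 0.01 * rng.normal(size=len(x))
print(np.round(lms(x, d), 2))   # should be close to h
```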

Relevance:

60.00%

Abstract:

We present an outlook on the climate system thermodynamics. First, we construct an equivalent Carnot engine with efficiency η and frame the Lorenz energy cycle in a macroscale thermodynamic context. Then, by exploiting the second law, we prove that the lower bound to the entropy production is η times the integrated absolute value of the internal entropy fluctuations. An exergetic interpretation is also proposed. Finally, the controversial maximum entropy production principle is reinterpreted as requiring the joint optimization of heat transport and mechanical work production. These results provide tools for climate change analysis and for the validation of climate models.
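
In symbols, with the standard Carnot definition of the efficiency and an assumed notation for the internal entropy fluctuations (the abstract does not fix one), the stated bound reads:

```latex
% Carnot efficiency of the equivalent engine (standard definition):
\eta \;=\; 1 - \frac{T_{\mathrm{cold}}}{T_{\mathrm{warm}}}

% The stated lower bound on the entropy production, writing
% \dot{S}_{\mathrm{fluc}} for the internal entropy fluctuations
% (notation assumed here, not taken from the paper):
S_{\mathrm{prod}} \;\ge\; \eta \int \bigl|\dot{S}_{\mathrm{fluc}}(t)\bigr|\,\mathrm{d}t
```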

Relevance:

60.00%

Abstract:

Numerical methods are described for determining robust, or well-conditioned, solutions to the problem of pole assignment by state feedback. The solutions obtained are such that the sensitivity of the assigned poles to perturbations in the system and gain matrices is minimized. It is shown that for these solutions, upper bounds on the norm of the feedback matrix and on the transient response are also minimized and a lower bound on the stability margin is maximized. A measure is derived which indicates the optimal conditioning that may be expected for a particular system with a given set of closed-loop poles, and hence the suitability of the given poles for assignment.
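
Robust pole assignment of this kind is available in SciPy, whose place_poles routine computes a well-conditioned state-feedback gain (its 'KNV0' method is attributed to Kautsky, Nichols and Van Dooren) and reports conditioning diagnostics. A minimal usage sketch on a hypothetical two-input system:

```python
import numpy as np
from scipy.signal import place_poles

# A small controllable two-input system (hypothetical example):
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])
B = np.array([[0.0, 0.0],
              [1.0, 0.0],
              [0.0, 1.0]])
poles = [-1.0, -2.0, -3.0]

fb = place_poles(A, B, poles, method="KNV0")
K = fb.gain_matrix                                   # feedback u = -K x
print(np.sort(np.linalg.eigvals(A - B @ K).real))    # the assigned poles
print(fb.nb_iter, fb.rtol)                           # convergence diagnostics
```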