82 results for Failure Probability
Abstract:
A dataset of 1,846,990 completed lactation records was created using milk recording data from 8,967 commercial dairy farms in the United Kingdom over a five-year period. Herd-specific lactation curves describing levels of milk, fat and protein by lactation number and month of calving were generated for each farm. The actual yield of milk and protein proportion at the first milk recording of individual cow lactations were compared with the levels taken from the lactation curves. Logistic regression analysis showed that cows producing milk with a lower percentage of protein than average had a significantly lower probability of being in-calf at 100 days post calving and a significantly higher probability of being culled at the end of lactation. The culling rates derived from the studied database demonstrate the current high wastage rate of commercial dairy cows. Much of this wastage is due to involuntary culling as a result of reproductive failure.
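The logistic regression referred to above relates a binary reproductive outcome to the cow's deviation from its herd-specific curve; a generic sketch of such a model (our notation, not necessarily the study's exact specification):

\[
\operatorname{logit} \Pr(\text{in-calf at 100 days post calving}) = \beta_0 + \beta_1 \,\Delta_{\text{protein}},
\]

where $\Delta_{\text{protein}}$ is the deviation of the cow's protein percentage at first recording from the herd-curve level; the reported finding corresponds to $\beta_1 > 0$, i.e. lower-than-average protein implies a lower in-calf probability.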
Abstract:
We consider the case of a multicenter trial in which the center-specific sample sizes are potentially small. Under homogeneity, the conventional procedure is to pool information using a weighted estimator where the weights are the inverses of the estimated center-specific variances. Whereas this procedure is efficient under conventional asymptotics (e.g. center-specific sample sizes become large, number of centers fixed), it is commonly believed that the efficiency of this estimator also holds under meta-analytic asymptotics (e.g. center-specific sample sizes bounded, potentially small, and number of centers large). In this contribution we demonstrate that this estimator fails to be efficient. In fact, it shows a persistent bias with increasing number of centers, showing that it is not meta-consistent. In addition, we show that the Cochran and Mantel-Haenszel weighted estimators are meta-consistent and, in more generality, provide conditions on the weights such that the associated weighted estimator is meta-consistent.
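For reference, the conventional pooled estimator in question weights each center-specific estimate by its inverse estimated variance (a generic sketch in our notation, not the paper's):

\[
\hat{\theta} = \frac{\sum_{i=1}^{k} \hat{w}_i \hat{\theta}_i}{\sum_{i=1}^{k} \hat{w}_i},
\qquad
\hat{w}_i = \frac{1}{\widehat{\operatorname{Var}}(\hat{\theta}_i)} .
\]

Under meta-analytic asymptotics the estimated weights $\hat{w}_i$ stay noisy no matter how many centers accrue, and their dependence on the same data as the $\hat{\theta}_i$ is the usual source of the persistent bias described above.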
Abstract:
The jackknife method is often used for variance estimation in sample surveys but has only been developed for a limited class of sampling designs. We propose a jackknife variance estimator which is defined for any without-replacement unequal probability sampling design. We demonstrate design consistency of this estimator for a broad class of point estimators. A Monte Carlo study shows how the proposed estimator may improve on existing estimators.
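For orientation, the classical delete-one jackknife that such proposals generalize is (textbook form, not the paper's design-specific estimator):

\[
\hat{V}_{\mathrm{jack}}(\hat{\theta}) = \frac{n-1}{n} \sum_{j=1}^{n} \bigl( \hat{\theta}_{(j)} - \hat{\theta} \bigr)^{2},
\]

where $\hat{\theta}_{(j)}$ is the point estimate recomputed with unit $j$ deleted; extending this to without-replacement unequal probability designs requires reweighting the remaining units rather than the simple deletion shown here.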
Abstract:
Imputation is commonly used to compensate for item non-response in sample surveys. If we treat the imputed values as if they are true values, and then compute the variance estimates by using standard methods, such as the jackknife, we can seriously underestimate the true variances. We propose a modified jackknife variance estimator which is defined for any without-replacement unequal probability sampling design in the presence of imputation and non-negligible sampling fraction. Mean, ratio and random-imputation methods will be considered. The practical advantage of the method proposed is its breadth of applicability.
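A minimal simulation of the phenomenon described above, assuming simple random sampling and mean imputation (illustrative assumptions only; the proposed estimator covers general without-replacement unequal probability designs):

```python
import numpy as np

rng = np.random.default_rng(42)
pop = rng.normal(50.0, 10.0, size=100_000)      # synthetic population
n, response_rate, reps = 200, 0.7, 2_000

naive_vars, errors = [], []
for _ in range(reps):
    sample = rng.choice(pop, size=n, replace=False)
    observed = rng.random(n) < response_rate    # item non-response at random
    imputed = sample.copy()
    imputed[~observed] = sample[observed].mean()  # mean imputation
    naive_vars.append(imputed.var(ddof=1) / n)  # treats imputed values as true
    errors.append(imputed.mean() - pop.mean())

print(f"average naive variance estimate: {np.mean(naive_vars):.4f}")
print(f"actual variance of the estimator: {np.var(errors):.4f}")
# The naive figure is visibly smaller: mean imputation shrinks the apparent
# spread of the completed data, so standard formulas understate the variance.
```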
Abstract:
Individual identification via DNA profiling is important in molecular ecology, particularly in the case of noninvasive sampling. A key quantity in determining the number of loci required is the probability of identity (PI_ave), the probability of observing two copies of any profile in the population. Previously this has been calculated assuming no inbreeding or population structure. Here we introduce formulae that account for these factors, whilst also accounting for relatedness structure in the population. These formulae are implemented in API-CALC 1.0, which calculates PI_ave for either a specified value or a range of values of F_IS and F_ST.
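For comparison, the standard per-locus probability of identity under random mating and no structure, which the corrected formulae implemented in API-CALC extend (a textbook sketch only):

\[
\mathrm{PI} = \sum_{i} p_i^{4} + \sum_{i<j} \bigl( 2 p_i p_j \bigr)^{2},
\]

where $p_i$ is the frequency of allele $i$; the multi-locus value is obtained by multiplying across independent loci, which is exactly where inbreeding, population structure and relatedness undermine the uncorrected calculation.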
Abstract:
Accelerated failure time models with a shared random component are described, and are used to evaluate the effect of explanatory factors and different transplant centres on survival times following kidney transplantation. Different combinations of the distribution of the random effects and baseline hazard function are considered and the fit of such models to the transplant data is critically assessed. A mixture model that combines short- and long-term components of a hazard function is then developed, which provides a more flexible model for the hazard function. The model can incorporate different explanatory variables and random effects in each component. The model is straightforward to fit using standard statistical software, and is shown to be a good fit to the transplant data. Copyright (C) 2004 John Wiley & Sons, Ltd.
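A common log-linear form for an accelerated failure time model with a shared random component, sketched in generic notation (ours, not necessarily the paper's):

\[
\log T_{ij} = \mathbf{x}_{ij}^{\top} \boldsymbol{\beta} + b_i + \sigma \varepsilon_{ij},
\]

where $T_{ij}$ is the survival time of patient $j$ at transplant centre $i$, $b_i$ is the shared centre-level random effect, and the distribution assumed for $\varepsilon_{ij}$ determines the baseline hazard (extreme-value errors give a Weibull model, for example); the paper's mixture extension additionally splits the hazard into short- and long-term components.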
Abstract:
Purpose – To evaluate the control strategy for a hybrid system of natural-ventilation wind catchers and air-conditioning, and to assess the contribution of the wind catchers to indoor air environments and to energy savings, if any. Design/methodology/approach – Most of the modeling techniques for assessing wind catcher performance are theoretical. Post-occupancy evaluation (POE) studies of buildings provide an insight into the operation of these building components and help to inform facilities managers. A POE case study is presented in this paper. Findings – Monitoring of the summer and winter month operations showed that the indoor air quality parameters were kept within the design target range. The design control strategy failed to record data regarding the operation, opening time and position of the wind catcher system. Though the implemented control strategy worked effectively in monitoring the operation of the mechanical ventilation systems, i.e. the air handling unit (AHU), it did not integrate the wind catchers with the mechanical ventilation system. Research limitations/implications – Owing to shortfalls in the control strategy implemented in this project, it was difficult to quantify and verify the contribution of the wind catchers to the internal conditions and, hence, to energy savings. Practical implications – Controlling the operation of the wind catchers via the AHU will lead to isolation of the wind catchers in the event of a malfunction of the AHU. Wind catchers will contribute to the ventilation of a space, particularly in the summer months. Originality/value – This paper demonstrates the value of POE as an indispensable tool for FM professionals. It further provides insight into the application of natural ventilation systems in buildings for healthier indoor environments at lower energy cost. The design of the control strategy for natural ventilation and air-conditioning should be considered at the design stage, involving the FM personnel.
Abstract:
In real-world environments it is usually difficult to specify the quality of a preventive maintenance (PM) action precisely. This uncertainty makes it problematic to optimise maintenance policy. This problem is tackled in this paper by assuming that the quality of a PM action is a random variable following a probability distribution. Two frequently studied PM models, a failure rate PM model and an age reduction PM model, are investigated. The optimal PM policies are presented and optimised. Numerical examples are also given.
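A minimal Monte Carlo sketch of this kind of optimisation, assuming a Weibull failure rate, minimal repair between PM actions, and a Beta-distributed age-reduction quality; every numerical choice here is an illustrative assumption, not a model from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Weibull cumulative hazard with shape beta > 1 (wear-out) and scale eta.
beta, eta = 2.5, 100.0
H = lambda t: (t / eta) ** beta
c_pm, c_rep = 1.0, 5.0                    # cost of a PM vs. a minimal repair

def cost_rate(tau, n_cycles=5_000):
    """Approximate long-run cost per unit time when PM every tau time units
    removes a random fraction q ~ Beta(4, 2) of the accumulated age."""
    age, exp_failures = 0.0, 0.0
    for _ in range(n_cycles):
        exp_failures += H(age + tau) - H(age)   # expected minimal repairs
        q = rng.beta(4.0, 2.0)                  # random PM quality in (0, 1)
        age = (1.0 - q) * (age + tau)           # imperfect age reduction
    return (n_cycles * c_pm + exp_failures * c_rep) / (n_cycles * tau)

taus = np.linspace(5.0, 60.0, 56)
best = taus[int(np.argmin([cost_rate(t) for t in taus]))]
print(f"approximately optimal PM interval: {best:.1f}")
```

With a degenerate (fixed) q the same search recovers the classical age reduction PM optimisation; the random q simply averages the cost rate over the quality distribution.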
Abstract:
The tendency to neglect base-rates in judgment under uncertainty may be "notorious," as Barbey & Sloman (B&S) suggest, but it is neither inevitable (as they document; see also Koehler 1996) nor unique. Here we would like to point out another line of evidence connecting ecological rationality to dual processes, the failure of individuals to appropriately judge cumulative probability.
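The cumulative-probability error has a one-line quantitative core: for a per-event failure probability $p$ over $n$ independent events,

\[
P(\text{at least one failure}) = 1 - (1 - p)^{n},
\]

so a seemingly negligible $p = 0.01$ already gives roughly a 63% chance of at least one failure over $n = 100$ events, which is the kind of accumulation people systematically misjudge.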
Abstract:
The availability of a network strongly depends on the frequency of service outages and the recovery time for each outage. The loss of network resources includes complete or partial failure of hardware and software components, power outages, scheduled software and hardware maintenance, operational errors such as configuration errors, and acts of nature such as floods, tornadoes and earthquakes. This paper proposes a practical approach to the enhancement of QoS routing by means of providing alternative or repair paths in the event of a breakage of a working path. The proposed scheme guarantees that every Protected Node (PN) is connected to a multi-repair path such that no further failure or breakage of single or double repair paths can cause any simultaneous loss of connectivity between an ingress node and an egress node. Links to be protected in an MPLS network are predefined and an LSP request involves the establishment of a working path. The use of multi-protection paths permits the formation of numerous protection paths, allowing greater flexibility. Our analysis examines several methods, including single, double and multi-repair routes and the prioritization of signals along the protected paths, to improve Quality of Service (QoS) and throughput and to reduce the cost of protection path placement, delay, congestion and collisions.
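The availability notions invoked above have standard quantitative forms (generic reliability expressions, not results from this paper): availability ties outage frequency to recovery time, and independent parallel repair paths multiply out unavailability:

\[
A = \frac{\mathrm{MTBF}}{\mathrm{MTBF} + \mathrm{MTTR}},
\qquad
A_{\text{paths}} = 1 - \prod_{k=1}^{m} (1 - A_k),
\]

where MTBF is the mean time between failures, MTTR the mean time to repair, and $A_k$ the availability of the $k$-th working or repair path, assumed to fail independently.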
Abstract:
Using the classical Parzen window estimate as the target function, the kernel density estimation is formulated as a regression problem and the orthogonal forward regression technique is adopted to construct sparse kernel density estimates. The proposed algorithm incrementally minimises a leave-one-out test error score to select a sparse kernel model, and a local regularisation method is incorporated into the density construction process to further enforce sparsity. The kernel weights are finally updated using the multiplicative nonnegative quadratic programming algorithm, which has the ability to reduce the model size further. Except for the kernel width, the proposed algorithm has no other parameters that need tuning, and the user is not required to specify any additional criterion to terminate the density construction procedure. Two examples are used to demonstrate the ability of this regression-based approach to effectively construct a sparse kernel density estimate with comparable accuracy to that of the full-sample optimised Parzen window density estimate.
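For readers unfamiliar with the target function, a minimal Parzen window (Gaussian kernel) estimate in one dimension looks as follows; this covers only the full-sample target, not the sparse orthogonal forward regression construction itself:

```python
import numpy as np

def parzen_window(x_train, x_eval, width):
    """Full-sample Parzen window estimate: the average of Gaussian
    bumps of a common width centred on every training point."""
    diff = (x_eval[:, None] - x_train[None, :]) / width
    kernels = np.exp(-0.5 * diff**2) / (width * np.sqrt(2.0 * np.pi))
    return kernels.mean(axis=1)

rng = np.random.default_rng(1)
train = np.concatenate([rng.normal(-2, 0.5, 150), rng.normal(1, 1.0, 150)])
grid = np.linspace(-5.0, 5.0, 201)
density = parzen_window(train, grid, width=0.3)
print(f"approximate integral (should be ~1): {density.sum() * (grid[1] - grid[0]):.3f}")
```

The sparse estimates described above keep only a small subset of such kernels (with regularised weights) while matching this full-sample target as closely as possible.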
Abstract:
Interwar British retailing has been characterized as having lower productivity, less developed managerial hierarchies and methods, and weaker scale economies than its US counterpart. This article examines comparative productivity for one major segment of large-scale retailing in both countries—the department store sector. Drawing on exceptionally detailed contemporary survey data, we show that British department stores in fact achieved superior performance in terms of operating costs, margins, profits, and stock-turn. While smaller British stores had lower labour productivity than US stores of equivalent size, TFP was generally higher for British stores, which also enjoyed stronger scale economies. We also examine the reasons behind Britain's surprisingly strong relative performance, using surviving original returns from the British surveys. Contrary to arguments that British retailers faced major barriers to the development of large-scale enterprises that could reap economies of scale and scope and invest in machinery and marketing to support the growth of their primary sales functions, we find that British department stores enthusiastically embraced the retail ‘managerial revolution’—and reaped substantial benefits from this investment.
Abstract:
A generalized or tunable-kernel model is proposed for probability density function estimation based on an orthogonal forward regression procedure. Each stage of the density estimation process determines a tunable kernel, namely, its center vector and diagonal covariance matrix, by minimizing a leave-one-out test criterion. The kernel mixing weights of the constructed sparse density estimate are finally updated using the multiplicative nonnegative quadratic programming algorithm to ensure the nonnegativity and unity constraints, and this weight-updating process additionally has the desired ability to further reduce the model size. The proposed tunable-kernel model has advantages, in terms of model generalization capability and model sparsity, over the standard fixed-kernel model that restricts kernel centers to the training data points and employs a single common kernel variance for every kernel. On the other hand, it does not optimize all the model parameters together and thus avoids the problems of high-dimensional ill-conditioned nonlinear optimization associated with the conventional finite mixture model. Several examples are included to demonstrate the ability of the proposed tunable-kernel model to construct a very compact and accurate density estimate.
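The fitted object is a sparse Gaussian mixture whose centres and diagonal covariances are both free; a minimal evaluation routine for that model form (the fitting itself, orthogonal forward regression plus the multiplicative weight updates, is not reproduced here):

```python
import numpy as np

def tunable_kernel_density(x, centers, diag_vars, weights):
    """Evaluate a sparse Gaussian mixture in which each kernel has its own
    center vector and diagonal covariance; weights are nonnegative, sum to 1."""
    x = np.atleast_2d(x)                                  # shape (n, d)
    d = x.shape[1]
    density = np.zeros(x.shape[0])
    for c, v, w in zip(centers, diag_vars, weights):
        norm = 1.0 / np.sqrt((2.0 * np.pi) ** d * np.prod(v))
        quad = ((x - c) ** 2 / v).sum(axis=1)
        density += w * norm * np.exp(-0.5 * quad)
    return density

# Two-kernel example in 2-D with unequal per-dimension variances.
centers = [np.array([0.0, 0.0]), np.array([3.0, 1.0])]
diag_vars = [np.array([1.0, 0.25]), np.array([0.5, 2.0])]
weights = np.array([0.6, 0.4])
print(tunable_kernel_density(np.array([[0.0, 0.0]]), centers, diag_vars, weights))
```

The fixed-kernel model of the preceding abstract is the special case in which every center is a training point and every diag_vars entry equals a single shared variance.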
Abstract:
A two-locus match probability is presented that incorporates the effects of within-subpopulation inbreeding (consanguinity) in addition to population subdivision. The usual practice of calculating multi-locus match probabilities as the product of single-locus probabilities assumes independence between loci. There are a number of population genetics phenomena that can violate this assumption: in addition to consanguinity, which increases homozygosity at all loci simultaneously, gametic disequilibrium will introduce dependence into DNA profiles. However, in forensics the latter problem is usually addressed in part by the careful choice of unlinked loci. Hence, as is conventional, we assume gametic equilibrium here, and focus instead on between-locus dependence due to consanguinity. The resulting match probability formulae are an extension of existing methods in the literature, and are shown to be more conservative than these methods in the case of double homozygote matches. For two-locus profiles involving one or more heterozygous genotypes, results are similar to, or smaller than, the existing approaches.
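For context, the widely used single-locus match probabilities that allow for population subdivision (the standard Balding–Nichols formulae, with $\theta$ identified with $F_{ST}$), which two-locus extensions of this kind build on:

\[
P(A_iA_i \mid A_iA_i) = \frac{[2\theta + (1-\theta)p_i][3\theta + (1-\theta)p_i]}{(1+\theta)(1+2\theta)},
\qquad
P(A_iA_j \mid A_iA_j) = \frac{2[\theta + (1-\theta)p_i][\theta + (1-\theta)p_j]}{(1+\theta)(1+2\theta)},
\]

where $p_i$, $p_j$ are allele frequencies; multiplying such single-locus values across loci is precisely the independence assumption that within-subpopulation inbreeding violates.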
Abstract:
This paper analyzes the delay performance of the Enhanced relay-enabled Distributed Coordination Function (ErDCF) for wireless ad hoc networks under ideal conditions and in the presence of transmission errors. Relays are nodes capable of supporting high data rates for other low data rate nodes. In an ideal channel, ErDCF achieves higher throughput and reduced energy consumption compared to the IEEE 802.11 Distributed Coordination Function (DCF). This gain is still maintained in the presence of errors. Relays are also expected to reduce delay; however, the delay behavior of ErDCF under transmission errors has not been known. In this work we present the impact of transmission errors on delay. It turns out that under transmission errors of sufficient magnitude to increase the number of dropped packets, packet delay is reduced. This is due to the increase in the probability of failure: as a result, the packet drop time increases, reflecting the throughput degradation.
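A toy simulation of the mechanism described above, assuming a fixed per-attempt failure probability, a constant per-attempt delay, and a retry limit of 7 (all simplifying stand-ins for the 802.11 backoff model analysed in the paper):

```python
import numpy as np

rng = np.random.default_rng(7)
retry_limit, slot = 7, 1.0        # attempts allowed per packet, delay per attempt

def delivered_delay(p_fail, packets=200_000):
    """Mean delay of successfully delivered packets when each attempt fails
    independently with probability p_fail and a packet is dropped after
    retry_limit unsuccessful attempts."""
    attempts = rng.geometric(1.0 - p_fail, size=packets)   # attempts to success
    delivered = attempts[attempts <= retry_limit]
    drop_rate = 1.0 - delivered.size / packets
    return delivered.mean() * slot, drop_rate

for p in (0.1, 0.3, 0.5, 0.7):
    d, r = delivered_delay(p)
    print(f"p_fail={p:.1f}  delivered delay={d:4.2f}  "
          f"unlimited-retry delay={slot / (1.0 - p):5.2f}  drop rate={r:.3f}")
# As p_fail grows, more packets hit the retry limit and are dropped early;
# the delay of the packets that do get through stays far below what an
# unlimited-retry scheme would incur, while throughput degrades.
```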