9 results for QIC
Abstract:
1. Ecological data sets often use clustered measurements or repeated sampling in a longitudinal design. Choosing the correct covariance structure is an important step in the analysis of such data, as the covariance describes the degree of similarity among the repeated observations.
2. Three methods for choosing the covariance are the Akaike information criterion (AIC), the quasi-information criterion (QIC) and the deviance information criterion (DIC). We compared the methods in a simulation study and on a data set that explored the effects of forest fragmentation on avian species richness over 15 years.
3. The overall success rate was 80.6% for the AIC, 29.4% for the QIC and 81.6% for the DIC. For the forest fragmentation study, the AIC and DIC selected the unstructured covariance, whereas the QIC selected the simpler autoregressive covariance. Graphical diagnostics suggested that the unstructured covariance was probably correct.
4. We recommend using the DIC for selecting the correct covariance structure.
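The information-criterion comparison described in this abstract can be sketched in a few lines: fit the same model under each candidate covariance structure, score each fit, and keep the structure with the lowest score. A minimal sketch of the AIC step follows; the log-likelihoods and parameter counts below are hypothetical illustrative values, not figures from the study.

```python
def aic(log_likelihood, n_params):
    """Akaike information criterion: AIC = 2k - 2*ln(L); smaller is better."""
    return 2 * n_params - 2 * log_likelihood

# Hypothetical fits of the same model under three candidate covariance
# structures: (log-likelihood, number of covariance parameters).
candidates = {
    "independence":   (-250.0, 1),
    "autoregressive": (-241.0, 2),
    "unstructured":   (-236.0, 6),
}

scores = {name: aic(ll, k) for name, (ll, k) in candidates.items()}
best = min(scores, key=scores.get)
print(best, scores[best])  # structure with the lowest AIC is selected
```

The richer unstructured covariance is penalized for its extra parameters but can still win when the gain in log-likelihood outweighs the penalty, which mirrors the selection pattern reported above.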
Abstract:
Six Sigma provides a framework for quality improvement and business excellence. Introduced in the 1980s in manufacturing, the concept of Six Sigma has gained popularity in service organizations. After initial success in healthcare and banking, Six Sigma has gradually gained traction in other types of service industries, including hotels and lodging. Starwood Hotels and Resorts was the first hospitality giant to embrace Six Sigma. In 2001, Starwood adopted the method to develop innovative, customer-focused solutions and to transfer these solutions throughout the global organization. To analyze Starwood's use of Six Sigma, the authors collected data from articles, interviews, presentations and speeches published in magazines, newspapers and Web sites; these sources provided corroborating detail and a basis for further inference. Financial metrics can explain the success of Six Sigma in any organization, and the research uncovered no shortage of examples of Starwood's success resulting from Six Sigma project metrics.
Abstract:
Selecting an appropriate working correlation structure is pertinent to clustered data analysis using generalized estimating equations (GEE), because an inappropriate choice leads to inefficient parameter estimation. We investigate the well-known QIC criterion for selecting a working correlation structure and find that its performance is degraded by a term that is theoretically independent of the correlation structures but must be estimated with error. This leads us to propose a correlation information criterion (CIC) that substantially improves on the QIC. Extensive simulation studies indicate that the CIC offers a remarkable improvement in selecting the correct correlation structure. We also illustrate our findings using a data set from the Madras Longitudinal Schizophrenia Study.
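The core idea of the CIC is to keep only the trace penalty from the QIC, since the quasi-likelihood term does not depend on the working correlation. A minimal sketch, assuming small hand-constructed matrices purely for illustration (the variable names and values are hypothetical, not from the paper):

```python
def trace_product(a, b):
    """trace(A @ B) for small square matrices given as nested lists."""
    n = len(a)
    return sum(a[i][j] * b[j][i] for i in range(n) for j in range(n))

def cic(omega_indep, v_robust):
    """Correlation information criterion: CIC = trace(Omega_I @ V_R).

    omega_indep: inverse of the model-based covariance computed under
    working independence; v_robust: robust (sandwich) covariance of the
    regression estimates under the candidate working correlation.
    The candidate with the smallest CIC is selected.
    """
    return trace_product(omega_indep, v_robust)

# Hypothetical 2x2 matrices for two candidate working correlations.
omega_i        = [[4.0, 0.0], [0.0, 4.0]]
v_exchangeable = [[0.30, 0.05], [0.05, 0.20]]
v_ar1          = [[0.25, 0.04], [0.04, 0.18]]

print(cic(omega_i, v_exchangeable))  # 2.0
print(cic(omega_i, v_ar1))           # 1.72 -> AR(1) preferred here
```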
Abstract:
This paper quantifies the road-traffic limits corresponding to different reference values for the sound levels at building façades, with the aim of providing easily interpretable information that can guide urban planning in meeting the applicable regulatory requirements. To that end, several typical road and urban-surroundings configurations are analysed, and some comments are offered on the most suitable options.
Abstract:
In this paper, we extend the use of the variance dispersion graph (VDG) to experiments in which the response surface (RS) design must be blocked. Through several examples we evaluate the prediction performance of RS designs in non-orthogonal block designs compared with the equivalent unblocked and orthogonally blocked designs. These examples illustrate that good prediction performance of designs in small blocks can be expected in practice. Most importantly, we show that the allocation of the treatment set to blocks can seriously affect the prediction properties of designs; thus, much care is needed in performing this allocation.
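The quantity a VDG summarizes is the scaled prediction variance N x'(X'X)^{-1} x, plotted against the distance of the prediction point from the design centre. A minimal sketch for a coded 2^2 factorial with a first-order model (an assumed toy design, not one of the paper's blocked examples):

```python
# Coded 2^2 factorial, first-order model columns [1, x1, x2].
X = [[1, -1, -1],
     [1,  1, -1],
     [1, -1,  1],
     [1,  1,  1]]
N = len(X)
p = len(X[0])

xtx = [[sum(X[r][i] * X[r][j] for r in range(N)) for j in range(p)]
       for i in range(p)]
# An orthogonal coded design makes X'X diagonal, so inversion is trivial.
assert all(xtx[i][j] == 0 for i in range(p) for j in range(p) if i != j)
inv_diag = [1.0 / xtx[i][i] for i in range(p)]

def spv(x1, x2):
    """Scaled prediction variance N * x'(X'X)^{-1} x at (x1, x2)."""
    x = [1.0, x1, x2]
    return N * sum(x[i] * x[i] * inv_diag[i] for i in range(p))

print(spv(0.0, 0.0))  # 1.0 at the design centre
print(spv(1.0, 1.0))  # 3.0 at a factorial point
```

Blocking changes X (extra block columns) and can make X'X non-diagonal, which is exactly why the paper warns that the allocation of treatments to blocks can degrade prediction properties.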
Abstract:
Recent studies have shown that the X̄ chart with variable sampling intervals (VSI) and/or with variable sample sizes (VSS) detects process shifts faster than the traditional X̄ chart. This article extends these studies to processes that are monitored by both the X̄ and R charts. A Markov chain model is used to determine the properties of the joint X̄ and R charts with variable sample sizes and sampling intervals (VSSI). The VSSI scheme improves the joint X̄ and R control chart performance in terms of the speed with which shifts in the process mean and/or variance are detected.
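The Markov chain calculation behind such properties can be sketched with the standard identity ATS = (I - Q)^{-1} h, where Q holds the transition probabilities among the transient (in-control) states and h the sampling interval used in each state. The two-state Q and h below are hypothetical illustrative values, not taken from the article:

```python
def solve2(a, b):
    """Solve a 2x2 linear system a @ x = b by Cramer's rule."""
    det = a[0][0] * a[1][1] - a[0][1] * a[1][0]
    return [(b[0] * a[1][1] - a[0][1] * b[1]) / det,
            (a[0][0] * b[1] - b[0] * a[1][0]) / det]

Q = [[0.70, 0.20],   # state 1: relaxed control (long interval)
     [0.30, 0.50]]   # state 2: tightened control (short interval)
h = [2.0, 0.5]       # sampling interval (hours) used in each state

i_minus_q = [[1 - Q[0][0], -Q[0][1]],
             [-Q[1][0], 1 - Q[1][1]]]
ats = solve2(i_minus_q, h)
print(ats)  # adjusted time to signal when starting in each state
```

The row sums of Q are below 1; the missing probability mass is absorption (a signal), which is what makes (I - Q) invertible and the expected times finite.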
Abstract:
Factorial experiments are widely used in industry to investigate the effects of process factors on quality response variables. Many food processes, for example, are not only subject to variation between days, but also between different times of the day. Removing this variation using blocking factors leads to row-column designs. In this paper, an algorithm is described for constructing factorial row-column designs when the factors are quantitative, and the data are to be analysed by fitting a polynomial model. The row-column designs are constructed using an iterative interchange search, where interchanges that result in an improvement in the weighted mean of the efficiency factors corresponding to the parameters of interest are accepted. Some examples illustrating the performance of the algorithm are given.
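The iterative interchange search described here can be sketched generically: try swapping pairs of treatments between plots, accept any swap that improves the objective, and stop when no swap helps. The objective below is a toy placeholder standing in for the paper's weighted mean of efficiency factors:

```python
def interchange_search(layout, score):
    """Pairwise-interchange hill climbing: accept improving swaps until
    no interchange increases the score."""
    layout = list(layout)
    best = score(layout)
    improved = True
    while improved:
        improved = False
        for i in range(len(layout)):
            for j in range(i + 1, len(layout)):
                layout[i], layout[j] = layout[j], layout[i]
                s = score(layout)
                if s > best:
                    best = s
                    improved = True
                else:
                    layout[i], layout[j] = layout[j], layout[i]  # undo
    return layout

# Toy objective (illustration only): each treatment t prefers plot t.
toy_score = lambda lay: -sum(abs(t - pos) for pos, t in enumerate(lay))
print(interchange_search([3, 1, 0, 2], toy_score))  # [0, 1, 2, 3]
```

Like all local searches, this can stall at a local optimum, which is why design-construction algorithms of this kind are typically run from several random starts.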
Abstract:
Varying the parameters of the X̄ chart has been explored extensively in recent years. In this paper, we extend the study of the X̄ chart with variable parameters to include variable action limits. The action limits establish whether the control should be relaxed or not. When X̄ falls near the target, the control is relaxed so that there will be more time before the next sample and/or the next sample will be smaller than usual. When X̄ falls far from the target but not in the action region, the control is tightened so that there is less time before the next sample and/or the next sample will be larger than usual. The goal is to draw the action limits wider than usual when the control is relaxed and narrower than usual when the control is tightened. This new feature then makes the X̄ chart more powerful than the CUSUM scheme in detecting shifts in the process mean.
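The zone logic described in this abstract can be sketched as a small decision function: the zone in which the current sample mean falls determines the next sample's interval, size, and action-limit width. The zone boundaries and parameter values below are hypothetical illustrations, not the paper's optimized settings:

```python
def next_parameters(z):
    """Choose the next sampling parameters from z, the absolute
    standardized distance of the current sample mean from target."""
    if z < 1.0:      # near target: relax control, widen action limits
        return {"interval": 2.0, "sample_size": 3, "limit_width": 3.5}
    elif z < 3.0:    # warning zone: tighten control, narrow action limits
        return {"interval": 0.5, "sample_size": 8, "limit_width": 2.5}
    else:            # beyond the action limits: signal out of control
        return {"signal": True}

print(next_parameters(0.4))  # relaxed regime
print(next_parameters(2.1))  # tightened regime
```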
Abstract:
The security of a passive plug-and-play QKD arrangement is analysed for finite key lengths (finite resources). The eavesdropper is assumed to have full access to the channel, so the source is treated as unknown and untrusted. The security of the BB84 protocol under collective attacks is considered within the framework of quantum adversaries, and the full treatment yields the well-known equations for the secure key rate. A numerical simulation is carried out in which a minimum number of initial parameters, such as the target total error and the number of pulses, are held constant; the remaining parameters are optimized to produce the maximum secure key rate. Two main strategies are addressed: with and without two-decoy states, including optimization of the signal-to-decoy ratio.
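The "well-known equations for the secure key rate" build on the asymptotic BB84 bound, which for a symmetric quantum bit error rate Q (and error correction at the Shannon limit) reduces to r = 1 - 2*h2(Q) with h2 the binary entropy. A minimal sketch of that baseline; the finite-key and decoy-state corrections studied in the paper reduce this rate further and are not modelled here:

```python
import math

def h2(x):
    """Binary entropy in bits."""
    if x <= 0.0 or x >= 1.0:
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def bb84_rate(qber):
    """Asymptotic BB84 secure-key fraction under collective attacks
    for a symmetric QBER: r = 1 - 2*h2(Q)."""
    return 1.0 - 2.0 * h2(qber)

print(bb84_rate(0.05))  # positive: a secure key can be distilled
print(bb84_rate(0.12))  # negative: QBER above the ~11% threshold
```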