917 results for "pure error"
Abstract:
Plasma is an innovative sterilization method characterized by low toxicity to operators and patients and by operation at temperatures close to room temperature. The use of different parameters for this method of sterilization and the corresponding results were analyzed in this study. A low-pressure inductive discharge was used to study the plasma sterilization processes. Oxygen and a mixture of oxygen and hydrogen peroxide were used as plasma source gases. The efficacy of the processes using different combinations of parameters such as plasma-generation method, type of gas, pressure, gas flow rate, temperature, power, and exposure time was evaluated. Two phases were developed for the processes, one using pure oxygen and the other a mixture of gases. Bacillus subtilis var. niger ATCC 9372 (Bacillus atrophaeus) spores inoculated on glass coverslips were used as biological indicators to evaluate the efficacy of the processes. All cycles were carried out in triplicate for different sublethal exposure times to calculate the D value by the enumeration method. The pour-plate technique was used to quantify the spores. D values between 3 and 8 min were obtained. The best results were achieved at high power levels (350 and 400 W) using pure oxygen, showing that plasma sterilization is a promising alternative to other sterilization methods. (c) 2007 Elsevier B.V. All rights reserved.
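The D value (decimal reduction time) referred to here is estimated from survivor counts at sublethal exposure times. As a sketch of the enumeration idea only, using invented spore counts rather than the study's data, the D value is the negative reciprocal of the slope of log10 survivors versus exposure time:

```python
import numpy as np

# Hypothetical survivor counts (CFU) at sublethal exposure times (min);
# illustrative values only, not data from the study.
times = np.array([0.0, 4.0, 8.0, 12.0])               # exposure time, min
survivors = np.array([1.0e6, 1.1e5, 9.0e3, 1.2e3])    # CFU recovered by pour-plate

# Fit log10(N) = a + b*t; the D value is the time for a 1-log10 reduction,
# i.e. D = -1/b (valid when inactivation is approximately log-linear).
slope, intercept = np.polyfit(times, np.log10(survivors), 1)
D_value = -1.0 / slope
print(f"Estimated D value: {D_value:.1f} min")
```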
Abstract:
This study examined the test performance of distortion product otoacoustic emissions (DPOAEs) when used as a screening tool in the school setting. A total of 1003 children (mean age 6.2 years, SD = 0.4) were tested with pure-tone screening, tympanometry, and DPOAE assessment. Optimal DPOAE test performance was determined in comparison with pure-tone screening results using clinical decision analysis. The results showed hit rates of 0.86, 0.89, and 0.90, and false alarm rates of 0.52, 0.19, and 0.22 for criterion signal-to-noise ratio (SNR) values of 4, 5, and 11 dB at 1.1, 1.9, and 3.8 kHz, respectively. DPOAE test performance was compromised at 1.1 kHz. In view of the different test performance characteristics across frequencies, the use of a fixed SNR as a pass criterion for all frequencies in DPOAE assessments is not recommended. When compared with pure-tone screening plus tympanometry results, DPOAEs alone showed poorer test performance, suggesting that their use in isolation might miss children with subtle middle ear dysfunction. However, when a test protocol incorporating both DPOAEs and tympanometry was compared with the gold standard of pure-tone screening plus tympanometry, test performance was enhanced. Given this high performance, a protocol that includes both DPOAEs and tympanometry holds promise as a useful tool in the hearing screening of schoolchildren, including difficult-to-test children.
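The hit and false alarm rates quoted above come from clinical decision analysis, i.e., cross-tabulating the DPOAE pass/refer decision at a given SNR criterion against the gold-standard result. A minimal sketch of that computation, with hypothetical SNR values and outcomes rather than study data:

```python
import numpy as np

# Hypothetical per-ear data: DPOAE signal-to-noise ratio (dB) and the
# gold-standard outcome (True = failed pure-tone screening). Illustrative only.
snr_db = np.array([14.0, 3.0, 8.5, -1.0, 6.0, 12.0, 2.0, 9.0])
gold_fail = np.array([False, True, False, True, False, False, True, False])

def dpoae_performance(snr_db, gold_fail, criterion_db):
    """Hit rate and false alarm rate for a 'refer if SNR < criterion' rule."""
    refer = snr_db < criterion_db
    hit_rate = np.sum(refer & gold_fail) / np.sum(gold_fail)          # sensitivity
    false_alarm_rate = np.sum(refer & ~gold_fail) / np.sum(~gold_fail)
    return hit_rate, false_alarm_rate

hit, fa = dpoae_performance(snr_db, gold_fail, criterion_db=5.0)
print(f"hit rate = {hit:.2f}, false alarm rate = {fa:.2f}")
```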
Abstract:
We show that quantum feedback control can be used as a quantum-error-correction process for errors induced by a weak continuous measurement. In particular, when the error model is restricted to one perfectly measured error channel per physical qubit, quantum feedback can act to perfectly protect a stabilizer codespace. Using the stabilizer formalism we derive an explicit scheme, involving feedback and an additional constant Hamiltonian, to protect an (n-1)-qubit logical state encoded in n physical qubits. This works for both Poisson (jump) and white-noise (diffusion) measurement processes. Universal quantum computation is also possible in this scheme. As an example, we show that detected-spontaneous-emission error correction with a driving Hamiltonian can greatly reduce the amount of redundancy required to protect a state from that which has been previously postulated [e.g., Alber et al., Phys. Rev. Lett. 86, 4402 (2001)].
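A quick numerical sanity check of the codespace counting implied here (not the paper's feedback scheme): any single independent stabilizer generator on n qubits, such as X⊗X⊗...⊗X taken purely as an example, fixes a +1 eigenspace of dimension 2^(n-1), which is exactly the room needed for an (n-1)-qubit logical state:

```python
import numpy as np
from functools import reduce

# One independent stabilizer generator on n qubits fixes a codespace of
# dimension 2**(n-1), i.e. n-1 logical qubits.  X⊗X⊗...⊗X is used here only
# as an example generator; it is not claimed to be the paper's choice.
X = np.array([[0.0, 1.0], [1.0, 0.0]])
n = 3
S = reduce(np.kron, [X] * n)            # stabilizer generator X⊗X⊗X

eigvals, _ = np.linalg.eigh(S)
codespace_dim = int(np.sum(np.isclose(eigvals, 1.0)))
print(codespace_dim, 2 ** (n - 1))      # both 4: room for n-1 = 2 logical qubits
```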
Abstract:
This paper presents a method for estimating the posterior probability density of the cointegrating rank of a multivariate error correction model. A second contribution is the careful elicitation of the prior for the cointegrating vectors derived from a prior on the cointegrating space. This prior obtains naturally from treating the cointegrating space as the parameter of interest in inference and overcomes problems previously encountered in Bayesian cointegration analysis. Using this new prior and a Laplace approximation, an estimator for the posterior probability of the rank is given. The approach performs well compared with information criteria in Monte Carlo experiments. (C) 2003 Elsevier B.V. All rights reserved.
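Schematically, once an approximate marginal likelihood is available for each candidate rank (in the paper, via the new prior and a Laplace approximation), the posterior over ranks is just a normalization. The sketch below uses invented log marginal likelihood values and a uniform prior on the rank; it is not the paper's estimator:

```python
import numpy as np

# Schematic only: turn approximate log marginal likelihoods for each
# cointegrating rank r = 0..3 into posterior probabilities.  The numbers are
# invented; in practice each term would come from a Laplace approximation to
# the error correction model's marginal likelihood under the chosen prior.
log_marglik = np.array([-512.3, -498.7, -497.9, -503.1])   # ranks 0..3
log_prior = np.log(np.full(4, 0.25))                       # uniform prior on rank

log_post = log_prior + log_marglik
log_post -= log_post.max()          # subtract max for numerical stability
post = np.exp(log_post)
post /= post.sum()
for r, p in enumerate(post):
    print(f"P(rank = {r} | data) = {p:.3f}")
```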
Abstract:
We calculate the stationary state of the system of two non-identical two-level atoms driven by a finite-bandwidth two-mode squeezed vacuum. It is well known that two identical two-level atoms driven by a broadband squeezed vacuum may decay to a pure state, called the pure two-atom squeezed state, and that the presence of the antisymmetric state can change its purity. Here, we show that for small interatomic separations the stationary state of two non-identical atoms is not sensitive to the presence of the antisymmetric state and is the pure two-atom squeezed state. This effect is a consequence of the fact that in the system of two non-identical atoms the antisymmetric state is no longer the trapping state. We also calculate the squeezing properties of the emitted field and find that the squeezing spectrum of the output field may exhibit larger squeezing than that in the input squeezed vacuum. Moreover, we show that squeezing in the total field attains the optimum value that can be achieved in the field emitted by two atoms.
Abstract:
An electrochemical investigation of the corrosion of pure magnesium in 1 N NaCl at different pH values was carried out using electrochemical polarisation, scanning tunnelling microscopy (STM), measurement of hydrogen gas evolution, and measurement of the elements dissolved from the magnesium specimen, determined by inductively coupled plasma atomic emission spectrophotometry (ICPAES). A partially protective surface film was a principal factor controlling corrosion. Film coverage decreased with increasing applied electrode potential. Application of a suitable external cathodic current density was shown to inhibit magnesium dissolution whilst the hydrogen evolution rate remained relatively small. This showed that cathodic protection could be used to significantly reduce magnesium corrosion. A new definition is proposed for the negative difference effect (NDE). (C) 1997 Elsevier Science Ltd.
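Hydrogen evolution serves as a measure of magnesium dissolution because the overall corrosion reaction Mg + 2H2O → Mg(OH)2 + H2 releases one mole of hydrogen per mole of magnesium dissolved. A back-of-the-envelope conversion from collected gas volume to corroded mass, with illustrative values not taken from the paper:

```python
# Convert a measured volume of evolved hydrogen into dissolved magnesium,
# assuming the overall reaction Mg + 2H2O -> Mg(OH)2 + H2 (1 mol H2 per mol Mg)
# and ideal-gas behaviour.  Values below are illustrative, not from the study.
R = 8.314          # gas constant, J/(mol*K)
T = 298.15         # temperature, K
P = 101_325        # pressure, Pa
M_MG = 24.305      # molar mass of Mg, g/mol

def mg_dissolved_grams(h2_volume_ml):
    """Mass of Mg (g) corresponding to a volume of H2 (mL) at ~25 C, 1 atm."""
    n_h2 = P * (h2_volume_ml * 1e-6) / (R * T)   # mol of H2 (ideal gas law)
    return n_h2 * M_MG                           # 1:1 mol ratio H2 : Mg

print(f"{mg_dissolved_grams(10.0):.4f} g Mg per 10 mL H2")
```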
Abstract:
Analysis of a major multi-site epidemiologic study of heart disease has required estimation of the pairwise correlations of several measurements across sub-populations. Because the measurements from each sub-population were subject to sampling variability, the Pearson product-moment estimator of these correlations produces biased estimates. This paper proposes a model that accounts for within- and between-sub-population variation, provides algorithms for obtaining maximum likelihood estimates of these correlations, and discusses several approaches for obtaining interval estimates. (C) 1997 by John Wiley & Sons, Ltd.
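The bias described here arises because each sub-population estimate carries its own sampling error, which attenuates the Pearson correlation computed across sub-populations. The sketch below illustrates the idea with a simple moment-based correction (subtracting the average sampling variances from the observed variances); it is not the paper's maximum-likelihood procedure, and the means and standard errors are invented:

```python
import numpy as np

def deattenuated_corr(x_means, y_means, x_se, y_se):
    """
    Pearson correlation of sub-population means, corrected for the sampling
    variance of each mean.  x_se and y_se are the standard errors of the means.
    Moment-based sketch only; the paper uses maximum likelihood instead.
    """
    cov_xy = np.cov(x_means, y_means)[0, 1]
    var_x = np.var(x_means, ddof=1) - np.mean(np.asarray(x_se) ** 2)
    var_y = np.var(y_means, ddof=1) - np.mean(np.asarray(y_se) ** 2)
    if var_x <= 0 or var_y <= 0:
        raise ValueError("within-group noise dominates; correction unreliable")
    return cov_xy / np.sqrt(var_x * var_y)

# Hypothetical site-level means and their standard errors (illustrative only).
x = [1.2, 1.9, 2.4, 3.1, 3.6]
y = [0.9, 1.8, 1.4, 2.7, 2.5]
print(deattenuated_corr(x, y, x_se=[0.2] * 5, y_se=[0.2] * 5))
```

With these invented numbers the corrected correlation is slightly larger than the raw Pearson value, reflecting the attenuation the abstract refers to.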
Abstract:
Background: Biochemical analysis of fluid is the primary laboratory approach in pleural effusion diagnosis. Standardization of the steps between collection and laboratory analysis is fundamental to maintain the quality of the results. We evaluated the influence of temperature and storage time on sample stability. Methods: Pleural fluid from 30 patients was submitted to analyses of proteins, albumin, lactic dehydrogenase (LDH), cholesterol, triglycerides, and glucose. Aliquots were stored at 21, 4, and -20 degrees C, and concentrations were determined after 1, 2, 3, 4, 7, and 14 days. LDH isoenzymes were quantified in 7 random samples. Results: Due to the instability of isoenzymes 4 and 5, a decrease in LDH was observed in the first 24 h in samples maintained at -20 degrees C and after 2 days when maintained at 4 degrees C. Aside from glucose, all parameters were stable through at least day 4 when stored at room temperature or at 4 degrees C. Conclusions: Temperature and storage time are potential preanalytical sources of error in pleural fluid analyses, particularly given the instability of glucose and LDH. The ideal procedure is to perform all tests immediately after collection; however, most of the tests can be done on refrigerated samples, except for LDH analysis. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
Parenteral anticoagulation is a cornerstone in the management of venous and arterial thrombosis. Unfractionated heparin has wide variability in its dose-response relationship, requiring frequent and troublesome laboratory follow-up. Because of these factors, low-molecular-weight heparin use has been increasing. Inadequate dosing has been pointed out as a potential problem, because using subjectively estimated weight instead of actual measured weight is common practice in the emergency department (ED). To evaluate the impact of inadequate weight estimation on enoxaparin dosage, we investigated the adequacy of anticoagulation of patients in a tertiary ED where subjective weight estimation is common practice. We obtained the estimated, informed, and measured weights of 28 patients in need of parenteral anticoagulation. Baseline and steady-state (after the second subcutaneous dose of enoxaparin) anti-Xa activity was obtained as a measure of adequate anticoagulation. The patients were divided into 2 groups according to anticoagulation adequacy. Of the 28 patients enrolled, 75% (group 1, n = 21) received at least 0.9 mg/kg per dose BID and 25% (group 2, n = 7) received less than 0.9 mg/kg per dose BID of enoxaparin. Only 4 (14.3%) of all patients had anti-Xa activity below the lower limit of the therapeutic range (<0.5 IU/mL), all of them from group 2. In conclusion, when weight estimation was used to determine the enoxaparin dosage, 25% of the patients were inadequately anticoagulated (anti-Xa activity <0.5 IU/mL) during the initial, crucial phase of treatment. (C) 2011 Elsevier Inc. All rights reserved.
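The adequacy classification above comes down to arithmetic on the administered dose, the measured weight, and the anti-Xa result. The sketch below applies the abstract's thresholds (at least 0.9 mg/kg per BID dose; therapeutic anti-Xa at or above 0.5 IU/mL) to invented patient records:

```python
# Classify anticoagulation adequacy as described in the abstract:
# group 1 received >= 0.9 mg/kg of enoxaparin per BID dose, group 2 received less;
# anti-Xa activity below 0.5 IU/mL is considered subtherapeutic.
# The example records are hypothetical, not patient data from the study.
DOSE_THRESHOLD_MG_PER_KG = 0.9
ANTI_XA_THERAPEUTIC_MIN = 0.5   # IU/mL

patients = [
    {"dose_mg": 80, "measured_weight_kg": 78, "anti_xa": 0.72},
    {"dose_mg": 60, "measured_weight_kg": 92, "anti_xa": 0.41},
]

for p in patients:
    dose_per_kg = p["dose_mg"] / p["measured_weight_kg"]
    group = 1 if dose_per_kg >= DOSE_THRESHOLD_MG_PER_KG else 2
    subtherapeutic = p["anti_xa"] < ANTI_XA_THERAPEUTIC_MIN
    print(f"{dose_per_kg:.2f} mg/kg -> group {group}, "
          f"subtherapeutic anti-Xa: {subtherapeutic}")
```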
Abstract:
Provision of an inert atmosphere with high-purity argon gas is recommended to prevent contamination of titanium castings, although the effects of the level of argon purity on the mechanical properties and clinical performance of Ti castings have not yet been investigated. The purpose of this study was to evaluate the effect of argon purity on the mechanical properties and microstructure of commercially pure (cp) Ti and Ti-6Al-4V alloys. The castings were made using either high-purity or industrial argon gas. The ultimate tensile strength (UTS), proportional limit (PL), elongation (EL), and microhardness (VHN) at different depths were evaluated. The microstructure of the alloys was also revealed, and the fracture mode was analyzed by scanning electron microscopy. The data from the mechanical tests and hardness measurements were subjected to two- and three-way ANOVA and Tukey's test (alpha = 0.05). The mean values of the mechanical properties were not affected by argon gas purity. Higher UTS, PL, and VHN, and lower EL, were observed for Ti-6Al-4V. The microhardness was not influenced by argon gas purity. Industrial argon gas can be used to cast cp Ti and Ti-6Al-4V.
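As a rough sketch of the statistical treatment described (a two-way ANOVA with alloy and argon purity as factors; Tukey's test omitted here), assuming pandas and statsmodels are available and using invented UTS values rather than the study's data:

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Placeholder data: ultimate tensile strength (MPa) by alloy and argon purity.
# Values are invented for illustration; they are not the study's measurements.
df = pd.DataFrame({
    "alloy": ["cpTi"] * 4 + ["Ti6Al4V"] * 4,
    "argon": ["high_purity", "industrial"] * 4,
    "uts":   [455, 448, 462, 450, 985, 978, 992, 981],
})

# Two-way ANOVA with interaction (alpha = 0.05, as in the abstract).
model = smf.ols("uts ~ C(alloy) * C(argon)", data=df).fit()
print(anova_lm(model, typ=2))
```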
Abstract:
Activated sludge models are used extensively in the study of wastewater treatment processes. While various commercial implementations of these models are available, many people need to code the models themselves using the simulation packages available to them. Quality assurance of such models is difficult. While benchmarking problems have been developed and are available, comparison of simulation data with that of commercial models leads only to the detection, not the isolation, of errors, and identifying the errors in the code is time-consuming. In this paper, we address the problem by developing a systematic and largely automated approach to the isolation of coding errors. There are three steps: firstly, possible errors are classified according to their place in the model structure and a feature matrix is established for each class of errors; secondly, an observer is designed to generate residuals such that each class of errors imposes a subspace, spanned by its feature matrix, on the residuals; finally, localising the residuals in a subspace isolates the coding errors. The algorithm proved capable of rapidly and reliably isolating a variety of single and simultaneous errors in a case study using the ASM1 activated sludge model. In this paper a newly coded model was verified against a known implementation. The method is also applicable to the simultaneous verification of any two independent implementations, and is hence useful in commercial model development.
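The isolation step amounts to asking which feature matrix's column space best explains the observed residual vector. The sketch below illustrates that subspace test with invented matrices; it simplifies away the observer design that generates the residuals in the paper:

```python
import numpy as np

def isolate_error_class(residual, feature_matrices):
    """
    Return the index of the error class whose feature matrix's column space
    leaves the smallest orthogonal component of the residual unexplained.
    Sketch of the subspace-localisation idea, not the paper's observer scheme.
    """
    scores = []
    for F in feature_matrices:
        # Project the residual onto span(F) via least squares.
        coeffs, *_ = np.linalg.lstsq(F, residual, rcond=None)
        scores.append(np.linalg.norm(residual - F @ coeffs))
    return int(np.argmin(scores))

# Invented example: 3 candidate error classes; the residual is generated from
# the subspace of class 1 plus a little noise, so class 1 should be isolated.
rng = np.random.default_rng(0)
features = [rng.standard_normal((6, 2)) for _ in range(3)]
residual = features[1] @ np.array([0.7, -1.3]) + 0.01 * rng.standard_normal(6)
print("isolated error class:", isolate_error_class(residual, features))
```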