990 results for "1 sigma error"
Abstract:
Background: Biochemical analysis of fluid is the primary laboratory approach in pleural effusion diagnosis. Standardization of the steps between collection and laboratory analysis is fundamental to maintaining the quality of the results. We evaluated the influence of temperature and storage time on sample stability. Methods: Pleural fluid from 30 patients was submitted to analyses of proteins, albumin, lactate dehydrogenase (LDH), cholesterol, triglycerides, and glucose. Aliquots were stored at 21 °C, 4 °C, and -20 °C, and concentrations were determined after 1, 2, 3, 4, 7, and 14 days. LDH isoenzymes were quantified in 7 random samples. Results: Due to the instability of isoenzymes 4 and 5, a decrease in LDH was observed in the first 24 h in samples maintained at -20 °C and after 2 days when maintained at 4 °C. Aside from glucose, all parameters were stable up to at least day 4 when stored at room temperature or at 4 °C. Conclusions: Temperature and storage time are potential sources of preanalytical error in pleural fluid analyses, mainly considering the instability of glucose and LDH. The ideal procedure is to perform all tests immediately after collection; however, most tests can be done on refrigerated samples, except for LDH analysis. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
Interleukin (IL)-1 alpha and beta are important modulators of many functions of corneal epithelial and stromal cells that occur following injury to the cornea, including the influx of bone marrow-derived inflammatory cells into the stroma attracted by chemokines released from the stroma and epithelium. In this study, we examined the effect of topical soluble IL-1 receptor antagonist on bone marrow-derived cell influx following corneal epithelial scrape injury in a mouse model. C57BL/6 mice underwent corneal epithelial scrape followed by application of IL-1 receptor antagonist (Amgen, Thousand Oaks, CA) at a concentration of 20 mg/ml or vehicle for 24 h prior to immunocytochemical detection of CD11b-positive cells in the stroma. In two experiments, topical IL-1 receptor antagonist had a marked effect in blocking cell influx. For example, in experiment 1, topical IL-1 receptor antagonist markedly reduced detectable CD11b-positive cells in the corneal stroma at 24 h after epithelial injury compared with the vehicle control (3.5 +/- 0.5 (standard error of the mean) cells/400x field and 13.9 +/- 1.2 cells/400x field, respectively; p < 0.01). A second experiment with a different observer performing cell counting gave the same result. Thus, the data demonstrate conclusively that topical IL-1 receptor antagonist markedly down-regulates CD11b-positive monocytic cell appearance in the corneal stroma. Topical IL-1 receptor antagonist could be an effective adjuvant for clinical treatment of corneal conditions in which unwanted inflammation has a role in the pathophysiology of the disorder. (c) 2008 Elsevier Ltd. All rights reserved.
Abstract:
Parenteral anticoagulation is a cornerstone in the management of venous and arterial thrombosis. Unfractionated heparin has a wide dose/response relationship, requiring frequent and troublesome laboratory follow-up. Because of all these factors, low-molecular-weight heparin use has been increasing. Inadequate dosage has been pointed out as a potential problem because the use of subjectively estimated weight instead of actual measured weight is common practice in the emergency department (ED). To evaluate the impact of inadequate weight estimation on enoxaparin dosage, we investigated the adequacy of anticoagulation of patients in a tertiary ED where subjective weight estimation is common practice. We obtained the estimated, informed, and measured weight of 28 patients in need of parenteral anticoagulation. Basal and steady-state (after the second subcutaneous dose of enoxaparin) anti-Xa activity was obtained as a measure of adequate anticoagulation. The patients were divided into 2 groups according to anticoagulation adequacy. Of the 28 patients enrolled, 75% (group 1, n = 21) received at least 0.9 mg/kg per dose BID and 25% (group 2, n = 7) received less than 0.9 mg/kg per dose BID of enoxaparin. Only 4 (14.3%) of all patients had anti-Xa activity below the lower limit of the therapeutic range (<0.5 IU/mL), all of them from group 2. In conclusion, when weight estimation was used to determine the enoxaparin dosage, 25% of the patients were inadequately anticoagulated (anti-Xa activity <0.5 IU/mL) during the initial, crucial phase of treatment. (C) 2011 Elsevier Inc. All rights reserved.
Abstract:
Activated sludge models are used extensively in the study of wastewater treatment processes. While various commercial implementations of these models are available, many people need to code models themselves using the simulation packages available to them. Quality assurance of such models is difficult. While benchmarking problems have been developed and are available, the comparison of simulation data with that of commercial models leads only to the detection, not the isolation, of errors. Identifying the errors in the code is time-consuming. In this paper, we address the problem by developing a systematic and largely automated approach to the isolation of coding errors. There are three steps: firstly, possible errors are classified according to their place in the model structure and a feature matrix is established for each class of errors. Secondly, an observer is designed to generate residuals, such that each class of errors imposes a subspace, spanned by its feature matrix, on the residuals. Finally, localising the residuals in a subspace isolates coding errors. The algorithm proved capable of rapidly and reliably isolating a variety of single and simultaneous errors in a case study using the ASM1 activated sludge model. In this paper a newly coded model was verified against a known implementation. The method is also applicable to simultaneous verification of any two independent implementations, and hence is useful in commercial model development.
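The subspace-based isolation step can be illustrated with a small numerical sketch: given a residual vector and a feature matrix for each error class, the class whose subspace leaves the smallest unexplained residual is selected. The matrices and residual below are illustrative placeholders, not data from the ASM1 case study.

```python
# Minimal sketch of subspace-based error isolation, assuming each error class i
# has a known feature matrix F_i whose columns span the residual subspace it excites.
import numpy as np

def isolate_error(residual, feature_matrices):
    """Return the index of the error class whose subspace best explains the residual."""
    best_class, best_miss = None, np.inf
    for i, F in enumerate(feature_matrices):
        # Least-squares projection of the residual onto span(F)
        coeffs, *_ = np.linalg.lstsq(F, residual, rcond=None)
        miss = np.linalg.norm(residual - F @ coeffs)   # part of the residual not explained
        if miss < best_miss:
            best_class, best_miss = i, miss
    return best_class, best_miss

# Illustrative use: three error classes in a 4-dimensional residual space
F1 = np.array([[1.0], [0.0], [0.0], [0.0]])
F2 = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
F3 = np.array([[0.0], [0.0], [0.0], [1.0]])
r = np.array([0.02, 0.9, 0.4, 0.01])                  # simulated residual
print(isolate_error(r, [F1, F2, F3]))                  # expected to point at class 1 (F2)
```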
Abstract:
Accurate habitat mapping is critical to landscape ecological studies such as those required for developing and testing Montreal Process indicator 1.1e, fragmentation of forest types. This task poses a major challenge to remote sensing, especially in mixed-species, variable-age forests such as the dry eucalypt forests of subtropical eastern Australia. In this paper, we apply an innovative approach that uses a small section of one-metre resolution airborne data to calibrate a moderate spatial resolution model (30 m resolution; scale 1:50 000) based on Landsat Thematic Mapper data to estimate canopy structural properties in St Marys State Forest, near Maryborough, south-eastern Queensland. The approach applies an image-processing model that assumes each image pixel is significantly larger than individual tree crowns and gaps to estimate crown-cover percentage, stem density, and mean crown diameter. These parameters were classified into three discrete habitat classes to match the ecology of four exudivorous arboreal species (yellow-bellied glider Petaurus australis, sugar glider P. breviceps, squirrel glider P. norfolcensis, and feathertail glider Acrobates pygmaeus) and one folivorous arboreal marsupial, the greater glider Petauroides volans. These species were targeted due to their known ecological preference for old trees with hollows and differences in their home range requirements. The overall mapping accuracy, visually assessed against transects (n = 93) interpreted from a digital orthophoto and validated in the field, was 79% (KHAT statistic = 0.72). The KHAT statistic serves as an indicator of the extent to which the percentage-correct values of the error matrix are due to ‘true’ agreement versus ‘chance’ agreement. This means that we are able to reliably report on the effect of habitat loss on target species, especially those with a large home range size (e.g. yellow-bellied glider). However, the classified habitat map failed to accurately capture the spatial patterning (e.g. patch size and shape) of stands with a trace or sub-dominance of senescent trees. This outcome makes the reporting of the effects of habitat fragmentation more problematic, especially for species with a small home range size (e.g. feathertail glider). With further model refinement and validation, however, this moderate-resolution approach offers an important, cost-effective advancement in mapping the age of dry eucalypt forests in the region.
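For context, the KHAT statistic is Cohen's kappa computed from the classification error matrix, discounting chance agreement from the observed percentage correct. A minimal sketch follows, using an illustrative 3x3 error matrix rather than the study's data.

```python
# Minimal sketch of the KHAT (Cohen's kappa) computation from a classification error matrix.
import numpy as np

def khat(error_matrix):
    m = np.asarray(error_matrix, dtype=float)
    n = m.sum()
    observed = np.trace(m) / n                                  # overall percentage correct
    expected = (m.sum(axis=0) * m.sum(axis=1)).sum() / n**2     # chance agreement
    return (observed - expected) / (1.0 - expected)

example = [[50, 3, 2],    # rows: mapped habitat class, columns: reference class
           [4, 40, 6],
           [2, 5, 45]]
print(round(khat(example), 2))
```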
Abstract:
We investigate spectral functions extracted using the maximum entropy method from correlators measured in lattice simulations of the (2+1)-dimensional four-fermion model. This model is particularly interesting because it has both a chirally broken phase with a rich spectrum of mesonic bound states and a symmetric phase where there are only resonances. In the broken phase we study the elementary fermion, pion, sigma, and massive pseudoscalar meson; our results confirm the Goldstone nature of the π and permit an estimate of the meson binding energy. We have, however, seen no signal of σ→ππ decay as the chiral limit is approached. In the symmetric phase we observe a resonance of nonzero width in qualitative agreement with analytic expectations; in addition the ultraviolet behavior of the spectral functions is consistent with the large nonperturbative anomalous dimension for fermion composite operators expected in this model.
Abstract:
For dynamic simulations to be credible, verification of the computer code must be an integral part of the modelling process. This two-part paper describes a novel approach to verification through program testing and debugging. In Part 1, a methodology is presented for detecting and isolating coding errors using back-to-back testing. Residuals are generated by comparing the output of two independent implementations in response to identical inputs. The key feature of the methodology is that a specially modified observer is created using one of the implementations, so as to impose an error-dependent structure on these residuals. Each error can be associated with a fixed and known subspace, permitting errors to be isolated to specific equations in the code. It is shown that the geometric properties extend to multiple errors in either one of the two implementations. Copyright (C) 2003 John Wiley & Sons, Ltd.
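The back-to-back testing idea can be sketched in a few lines: identical inputs are fed to two independent implementations and the residual (output difference) is monitored against a tolerance. The function names and models below are placeholders for illustration, not the paper's code.

```python
# Minimal sketch of back-to-back testing: residuals are the differences between the
# outputs of two independent implementations driven by the same inputs.
import numpy as np

def reference_model(u):
    return 0.5 * u + 1.0                      # stand-in for a trusted implementation

def candidate_model(u):
    return 0.5 * u + 1.0 + 0.01 * u**2        # stand-in with a deliberate coding error

inputs = np.linspace(0.0, 10.0, 11)
residuals = np.array([candidate_model(u) - reference_model(u) for u in inputs])

tolerance = 1e-6
if np.any(np.abs(residuals) > tolerance):
    print("Residuals exceed tolerance: implementations disagree", residuals.max())
else:
    print("Implementations agree within tolerance")
```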
Abstract:
Introduction: Visual anomalies that affect school-age children represent an important public health problem. Prevalence data are lacking in Portugal but are needed for planning vision services. This study was conducted to determine the prevalence of strabismus, decreased visual acuity, and uncorrected refractive error in Portuguese children aged 6 to 11 years. Methods and materials: A cross-sectional study was carried out on a sample of 672 school-age children (7.69 ± 1.19 years). Children received an orthoptic assessment (visual acuity, ocular alignment, and ocular movements) and non-cycloplegic autorefraction. Results: After orthoptic assessment, 13.8% of children were considered abnormal (n = 93). Manifest strabismus was found in 4% of the children. Rates of esotropia (2.1%) were slightly higher than exotropia (1.8%). Strabismus rates did not differ significantly by sex (p = 0.681) or grade (p = 0.228). Decreased visual acuity at distance was present in 11.3% of children. Visual acuity ≤20/66 (0.5 logMAR) was found in 1.3% of the children. We also found that 10.3% of children had an uncorrected refractive error. Conclusions: Strabismus affects a small proportion of Portuguese school-age children. Decreased visual acuity and uncorrected refractive error affected a significant proportion of school-age children. New policies need to be developed to address this public health problem.
Abstract:
In video communication systems, the video signals are typically compressed and sent to the decoder through an error-prone transmission channel that may corrupt the compressed signal, causing degradation of the final decoded video quality. In this context, it is possible to enhance the error resilience of typical predictive video coding schemes by drawing on principles and tools from an alternative video coding approach, the so-called Distributed Video Coding (DVC), based on Distributed Source Coding (DSC) theory. Further improvements in the decoded video quality after error-prone transmission may also be obtained by considering the perceptual relevance of the video content, as distortions occurring in different regions of a picture have a different impact on the user's final experience. In this context, this paper proposes a Perceptually Driven Error Protection (PDEP) video coding solution that enhances the error resilience of a state-of-the-art H.264/AVC predictive video codec using DSC principles and perceptual considerations. To increase the H.264/AVC error resilience performance, the main technical novelties brought by the proposed video coding solution are: (i) the design of an improved compressed-domain perceptual classification mechanism; (ii) the design of an improved transcoding tool for the DSC-based protection mechanism; and (iii) the integration of a perceptual classification mechanism into an H.264/AVC compliant codec with a DSC-based error protection mechanism. The performance results obtained show that the proposed PDEP video codec provides a better-performing alternative to traditional error protection video coding schemes, notably Forward Error Correction (FEC)-based schemes. (C) 2013 Elsevier B.V. All rights reserved.
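The general idea of protecting regions unequally according to their perceptual relevance can be sketched as follows. The class labels, protection fractions, and regions are illustrative assumptions only; the paper's actual mechanism uses DSC-based protection integrated into H.264/AVC, not this plain parity budgeting.

```python
# Illustrative sketch of unequal error protection driven by a perceptual class:
# more redundancy is allocated to perceptually relevant regions of a frame.
PROTECTION_FRACTION = {"high": 0.50, "medium": 0.25, "low": 0.10}  # parity bytes per source byte

def allocate_parity(regions):
    """regions: list of (name, size_in_bytes, perceptual_class)."""
    plan = []
    for name, size, pclass in regions:
        parity = int(size * PROTECTION_FRACTION[pclass])
        plan.append((name, size, pclass, parity))
    return plan

frame_regions = [("face", 4000, "high"), ("background", 12000, "low"), ("text overlay", 2000, "medium")]
for name, size, pclass, parity in allocate_parity(frame_regions):
    print(f"{name}: {size} B source, class {pclass}, {parity} B protection")
```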
Abstract:
The very high antiproliferative activity of [Co(Cl)(H2O)(phendione)(2)][BF4] (phendione is 1,10-phenanthroline-5,6-dione) against three human tumor cell lines (half-maximal inhibitory concentration below 1 µM) and its slight selectivity for the colorectal tumor cell line compared with healthy human fibroblasts led us to explore the mechanisms of action underlying this promising antitumor potential. As previously shown by our group, this complex induces cell cycle arrest in S phase and subsequent cell death by apoptosis, and it also reduces the expression of proteins typically upregulated in tumors. In the present work, we demonstrate that [Co(Cl)(phendione)(2)(H2O)][BF4] (1) does not reduce the viability of nontumorigenic breast epithelial cells by more than 85 % at 1 µM, (2) promotes the upregulation of proapoptotic Bax and cell-cycle-related p21, and (3) induces release of lactate dehydrogenase, which is partially reversed by ursodeoxycholic acid. DNA interaction studies were performed to uncover the genotoxicity of the complex and demonstrate that even though it displays a K_b (± standard error of the mean) of (3.48 ± 0.03) x 10^5 M^-1 and is able to produce double-strand breaks in a concentration-dependent manner, it does not exert any clastogenic effect ex vivo, ruling out DNA as a major cellular target for the complex. Steady-state and time-resolved fluorescence spectroscopy studies are indicative of a strong and specific interaction of the complex with human serum albumin, involving one binding site at a distance of approximately 1.5 nm from the Trp214 indole side chain, with log K_b of approximately 4.7, thus suggesting that this complex can be efficiently transported by albumin in the blood plasma.
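For context, binding constants of this kind are commonly extracted from fluorescence quenching titrations via the double-log relation log[(F0 - F)/F] = log K_b + n log[Q]. The sketch below shows such a fit with illustrative data points, not the measurements reported here.

```python
# Minimal sketch of extracting a binding constant and number of binding sites from
# steady-state fluorescence quenching data via the double-log plot; data are illustrative.
import numpy as np

Q = np.array([2e-6, 4e-6, 8e-6, 1.6e-5])                 # quencher (complex) concentration, M
F0, F = 100.0, np.array([88.0, 79.0, 65.0, 49.0])        # fluorescence without / with quencher

x = np.log10(Q)
y = np.log10((F0 - F) / F)
n, log_kb = np.polyfit(x, y, 1)                           # slope = binding sites, intercept = log Kb
print(f"n ~ {n:.2f}, log Kb ~ {log_kb:.2f}")
```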
Abstract:
In this paper we exploit the nonlinear properties of SiC multilayer devices to design an optical processor for error detection that enables reliable delivery of spectral data of four-wave mixing over unreliable communication channels. The SiC optical processor is realized using a double pin/pin a-SiC:H photodetector with front and back biased optical gating elements. Visible pulsed signals are transmitted together with different bit sequences. The combined optical signal is analyzed. Data show that the background acts as a selector that picks one or more states by splitting portions of the input multi-optical signals across the front and back photodiodes. Boolean operations such as XOR and three-bit addition are demonstrated optically, showing that when one or all of the inputs are present, the system behaves as an XOR gate representing the SUM, and when two or three inputs are on, the system acts as an AND gate indicating the presence of the CARRY bit. Additional parity logic operations are performed using four incoming pulsed communication channels that are transmitted and checked for errors together. As a simple example of this approach, we describe an all-optical processor for error detection and then provide an experimental demonstration of this idea. (C) 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
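The Boolean relations described above can be checked with a short sketch: the SUM of the incoming bits behaves as an XOR (set for an odd number of active inputs), the CARRY is set when two or more inputs are on, and a parity bit over four channels flags single-bit transmission errors. This is an electronic truth-table illustration, not a model of the optical device itself.

```python
# Minimal sketch of the SUM/CARRY and parity-check logic described in the abstract.
from itertools import product

def sum_carry(bits):
    ones = sum(bits)
    return ones % 2, int(ones >= 2)          # XOR-like SUM, CARRY when two or more inputs are on

def parity_check(channels, parity_bit):
    return (sum(channels) + parity_bit) % 2 == 0   # even parity assumed

for a, b, c in product((0, 1), repeat=3):
    print((a, b, c), "-> SUM, CARRY =", sum_carry((a, b, c)))

print(parity_check([1, 0, 1, 1], 1))          # True: no error detected
print(parity_check([1, 0, 0, 1], 1))          # False: single-bit error detected
```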
Abstract:
Dissertation presented to obtain the degree of Master in Electrical and Computer Engineering at the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia
Abstract:
Dissertation to obtain the degree of Master in Industrial Engineering and Management
Visual function and reading performance in children in the first cycle of basic education in the municipality of Lisbon
Abstract:
ABSTRACT - This thesis aims to contribute to the study of visual function anomalies and their influence on reading performance. Its objectives were: (1) to identify the prevalence of visual function anomalies; (2) to characterize reading performance in children with and without visual function anomalies; (3) to identify how visual function anomalies influence reading performance; and (4) to identify the impact of the variables that determine reading performance. A convenience sample was collected of 672 children from the first cycle of basic education in 11 schools in the municipality of Lisbon, aged 6 to 11 years (7.69 ± 1.19), together with 670 parents/guardians and 34 teachers. Three instruments were used for data collection: two closed-question questionnaires, a visual function assessment, and a 34-word reading test. After examination, the children were classified into two groups: normal visual function (NVF = 562) and impaired visual function (IVF = 110). A prevalence of 16.4% of children with impaired visual function was identified. In the reading test, these children read fewer words correctly (IVF = 31.00; NVF = 33.00; p < 0.001) and with lower accuracy (IVF = 91.18%; NVF = 97.06%; p < 0.001). This trend was also observed when comparing the four school years. Children with impaired visual function showed a tendency to omit letters and confuse graphemes. Fluency (IVF = 24.71; NVF = 27.39; p = 0.007) was lower in children with impaired visual function for all school years except the third. Children with uncorrected hyperopia (p = 0.003) and astigmatism (p = 0.019) read fewer words correctly (30.00; 31.00) and with lower accuracy (88.24%; 91.18%) than children without a significant refractive error (32.00; 94.12%). School performance as rated by teachers was lower in children with impaired visual function, and more than a quarter needed special support measures at school. No significant differences were found in the reading performance of children with impaired visual function across groups of parental education level. The risk of impaired reading performance was higher [OR = 4.29; 95% CI (2.49; 7.38)] in children with impaired visual function. Relative to the first school year, the second, third, and fourth years showed a lower risk of impaired reading performance. The variables teaching method, parents'/guardians' educational level, type of school (public/private), teacher's age, and teacher's years of experience were not statistically significant factors in explaining impaired reading performance once the effect of visual function was included in the model. Poor reading performance was defined as reading accuracy below 90%. This indicator can be used to identify at-risk children who require an orthoptic/ophthalmological examination to confirm or exclude visual function anomalies. This work contributes to identifying children at an educational disadvantage due to treatable visual function anomalies, and proposes a model intended to guide teachers in identifying children with poor reading performance.
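As background for the reported odds ratio, a minimal sketch of how an odds ratio and its 95% confidence interval are obtained from a 2x2 table (exposure = impaired visual function, outcome = poor reading performance). The counts below are illustrative, not the thesis data.

```python
# Minimal sketch of an odds ratio and its 95% confidence interval from a 2x2 table.
import math

a, b = 30, 80     # impaired visual function: poor readers / adequate readers (illustrative)
c, d = 50, 510    # normal visual function:   poor readers / adequate readers (illustrative)

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)          # standard error of log(OR)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```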
Abstract:
Companies in the motorcycle components sector are dealing with a dynamic environment, resulting from the introduction of new products and the increase in market demand. This dynamic environment requires frequent changes in production lines and flexibility in the processes, which can cause reductions in quality and productivity. This paper presents a Lean Six Sigma improvement project performed in a production line of a company's machining sector, in order to eliminate losses that cause low productivity, affecting the fulfillment of the production plan and customer satisfaction. The use of the Lean methodology following the DMAIC stages allowed the factors influencing the line's productivity loss to be analyzed. The major problems and causes of reduced productivity identified in this study were the lack of standardization in setup activities and excessive stoppages for process adjustment, which caused an increase in defects. Control charts, Pareto analysis, and cause-and-effect diagrams were used to analyze the problem. In the improvement stage, the changes were based on reconfiguring the line layout and modernizing the process. Overall, the project justified an investment in new equipment; defective product units were reduced by 84% and line capacity increased by 29%.
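The Pareto analysis mentioned above can be sketched in a few lines: defect causes are ranked by frequency and the cumulative share highlights the "vital few" causes to tackle first. The cause names and counts below are illustrative placeholders, not the project's data.

```python
# Minimal sketch of a Pareto analysis of defect causes (illustrative data).
causes = {"setup not standardized": 120, "process adjustment stoppage": 95,
          "tool wear": 30, "material defect": 15, "operator error": 10}

total = sum(causes.values())
cumulative = 0
for cause, count in sorted(causes.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += count
    print(f"{cause:30s} {count:4d}  {100 * cumulative / total:5.1f}% cumulative")
```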