975 results for analytical method validation
Abstract:
Dissertation submitted to obtain the Master's Degree in Mechanical Engineering
Abstract:
Dissertation presented to obtain the Ph.D. degree in Chemistry.
Abstract:
Geographic information systems make it possible to analyze, produce, and edit geographic information. However, these systems fall short in the analysis and support of complex spatial problems. Therefore, when a spatial problem such as land use management requires a multi-criteria perspective, multi-criteria decision analysis is incorporated into spatial decision support systems. The analytic hierarchy process is one of many multi-criteria decision analysis methods that can be used to support these complex problems. Using its capabilities, we develop a spatial decision support system to assist land use management. Land use management encompasses a broad spectrum of spatial decision problems. The developed decision support system had to accept various formats and types of data as input, in raster or vector format, where vector data could be of polygon, line, or point type. The system was designed to perform its analysis for the Zambezi River Valley in Mozambique, the study area. The possible solutions for the emerging problems had to cover the entire region, which required the system to process large sets of data and constantly adjust to new problems' needs. The developed decision support system is able to process thousands of alternatives using the analytic hierarchy process and produce an output suitability map for the problems faced.
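As an illustration of the weighting step behind such an AHP-based suitability analysis, the sketch below computes priority weights and a consistency ratio from a hypothetical pairwise comparison matrix; the criteria names and comparison values are assumptions, not taken from the system described above.

```python
import numpy as np

def ahp_weights(pairwise):
    """Return AHP priority weights (principal eigenvector) and consistency ratio."""
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                    # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                   # normalized priority vector
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)           # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)   # tabulated Saaty random index
    return w, ci / ri

# Hypothetical pairwise comparisons for three land-use criteria
# (e.g. slope vs. soil suitability vs. distance to rivers)
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
weights, cr = ahp_weights(A)   # weights then feed a weighted overlay of the
                               # normalized criterion layers (raster or vector)
```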
Abstract:
The moisture content in concrete structures has an important influence on their behavior and performance. Several validated numerical approaches adopt the governing equation for relative humidity fields proposed in Model Code 1990/2010. Nevertheless, there is no integrative study that addresses the choice of parameters for the simulation of the humidity diffusion phenomenon, particularly regarding the range of parameters put forward by Model Code 1990/2010. Software based on a finite difference method algorithm (1D and axisymmetric cases) is used to perform sensitivity analyses on the main parameters in a normal strength concrete. Then, based on the conclusions of the sensitivity analyses, experimental results from nine different concrete compositions are analyzed. The software is used to identify the material parameters that best fit the experimental data. In general, the model was able to satisfactorily fit the experimental results, and new correlations were proposed, particularly focusing on the boundary transfer coefficient.
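The humidity diffusion problem referred to above can be illustrated with a minimal explicit finite-difference scheme for 1D drying, using a Model Code 1990/2010-type moisture diffusivity law. The parameter values, slab geometry and simplified Dirichlet boundary below are assumptions for illustration, not the fitted values of the study (which also uses a boundary transfer coefficient).

```python
import numpy as np

# Assumed typical parameters for the nonlinear diffusivity D(h)
D1, alpha0, hc, n_exp = 1e-10, 0.05, 0.80, 15.0   # m^2/s and dimensionless

def diffusivity(h):
    """Model Code-type humidity diffusivity as a function of relative humidity h."""
    return D1 * (alpha0 + (1.0 - alpha0) / (1.0 + ((1.0 - h) / (1.0 - hc)) ** n_exp))

L, nx = 0.10, 51                 # 100 mm slab, 51 nodes
dx = L / (nx - 1)
h = np.full(nx, 0.95)            # initial internal relative humidity
h_env = 0.60                     # drying environment
dt = 3600.0                      # 1 h time step (satisfies explicit stability limit here)

for step in range(24 * 365):     # one year of drying
    D = diffusivity(h)
    flux = 0.5 * (D[1:] + D[:-1]) * (h[1:] - h[:-1]) / dx   # interface fluxes
    h[1:-1] += dt / dx * (flux[1:] - flux[:-1])             # interior update
    h[0] = h[-1] = h_env         # simplified boundary (no surface transfer coefficient)
```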
Abstract:
A conventional method for seismic strengthening of masonry walls is the external application of a reinforced concrete layer (shotcrete). However, due to the lack of analytical and experimental information on the behavior of strengthened walls, design procedures are usually based on empirical relations. Using these design procedures has resulted in massive strengthening details in retrofitting projects. This paper presents a computational framework for nonlinear analysis of strengthened masonry walls, whose versatility has been verified by comparing numerical and experimental results. Based on the developed numerical model and available experimental information, design relations and failure modes are proposed for strengthened walls in accordance with the ASCE 41 standard. Finally, a sample masonry structure has been strengthened using both the proposed method and available conventional methods. It has been shown that using the proposed method results in less extensive strengthening details and appropriate (ductile) failure modes.
Abstract:
The present paper focuses on a damage identification method based on the use of the second-order spectral properties of the nodal response processes. The explicit dependence on the frequency content of the outputs' power spectral densities makes them suitable for damage detection and localization. The well-known case study of the Z24 Bridge in Switzerland is chosen to apply and further investigate this technique with the aim of validating its reliability. Numerical simulations of the dynamic response of the structure subjected to different types of excitation are carried out to assess the variability of the spectrum-driven method with respect to both the type and the position of the excitation sources. The simulated data obtained from random vibrations, impulse, ramp and shaking forces allowed the power spectrum matrix to be built, from which the main eigenparameters of the reference and damage scenarios are extracted. Afterwards, complex eigenvectors and real eigenvalues are properly weighted and combined, and a damage index based on the difference between spectral modes is computed to pinpoint the damage. Finally, a group of vibration-based damage identification methods is selected from the literature to compare the results obtained and to evaluate the performance of the spectral index.
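A minimal sketch of a spectrum-driven workflow of this kind (PSD matrix estimation, extraction of spectral modes at a peak frequency, and a MAC-based damage index) is given below; it assumes generic multichannel acceleration records and is not the processing chain used for the Z24 Bridge.

```python
import numpy as np
from scipy.signal import csd

def psd_matrix(Y, fs, nperseg=1024):
    """Y: (n_channels, n_samples) accelerations. Returns frequencies f and S(f)."""
    n = Y.shape[0]
    f, _ = csd(Y[0], Y[0], fs=fs, nperseg=nperseg)
    S = np.zeros((len(f), n, n), dtype=complex)
    for i in range(n):
        for j in range(n):
            _, S[:, i, j] = csd(Y[i], Y[j], fs=fs, nperseg=nperseg)
    return f, S

def spectral_mode(S, f, f_peak):
    """Dominant eigenvector of the PSD matrix at the line closest to f_peak."""
    k = np.argmin(np.abs(f - f_peak))
    w, V = np.linalg.eigh(S[k])          # Hermitian eigendecomposition
    return V[:, np.argmax(w)]            # spectral mode shape

def damage_index(phi_ref, phi_dam):
    """1 - MAC between reference and damaged spectral modes (0 = no change)."""
    mac = np.abs(phi_ref.conj() @ phi_dam) ** 2 / (
        (phi_ref.conj() @ phi_ref).real * (phi_dam.conj() @ phi_dam).real)
    return 1.0 - mac
```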
Abstract:
We developed an alternative culture method, which we named PKO (initials in tribute to Petroff, Kudoh and Ogawa), for isolating Mycobacterium tuberculosis from sputum for the diagnosis of pulmonary tuberculosis (TB), and compared its performance with the Swab and Petroff methods. For the technique validation, sputum samples from patients suspected of pulmonary TB were examined by acid-fast microscopy (direct and concentrated smear) and by the PKO, Swab and Petroff methods. We found that the Petroff and PKO methods have parity in the effectiveness of M. tuberculosis isolation. However, with the PKO method, 65% of isolated strains were detected within ≤15 days, whereas with the Petroff method detection most often occurred in an interval of 16-29 days (71%). In positive smear samples, the average isolation time of PKO is only longer than that reported for Bactec 460TB. In conclusion, the exclusion of the pH neutralization stage in the PKO method reduces the manipulation of the samples, shortens the execution time of the culture compared with the Petroff method, and facilitates the qualification of professionals involved in the laboratory diagnosis of tuberculosis.
Abstract:
OBJECTIVE: This work was designed to validate the Portuguese version of the Contemplation Ladder, whose purpose is to assess the motivational stage for quitting smoking among tobacco users calling a telephone service. METHOD: A cross-sectional study was conducted in a nationwide drug use information hotline. To assess convergent validity, the correlation between the Contemplation Ladder and the previously validated URICA Scale was calculated. RESULTS: The study included 271 tobacco users. Statistically significant correlations were found between the Contemplation Ladder scores and the scores of the URICA precontemplation (r=-0.16; p<0.01), action (r=0.15; p<0.01) and maintenance (r=0.18; p<0.01) subscales. The correlation between the URICA Scale composite score and the Contemplation Ladder was also significant (r=0.31; p<0.01). CONCLUSION: The results of our study suggest that the Contemplation Ladder can be an efficient substitute for the URICA Scale (whose application takes at least 20 minutes), without submitting the interviewee to a heavy load of questions. The study presented evidence of convergent validity for the Contemplation Ladder when applied via telephone to tobacco users.
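A minimal sketch of this kind of convergent-validity check is shown below. The data are simulated and the abstract does not state which correlation coefficient was used, so both Pearson and Spearman are computed for illustration.

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

rng = np.random.default_rng(0)
ladder = rng.integers(0, 11, size=271)                      # hypothetical 0-10 ladder scores
urica_subscale = 0.2 * ladder + rng.normal(0, 2, size=271)  # hypothetical subscale scores

r_p, p_p = pearsonr(ladder, urica_subscale)
r_s, p_s = spearmanr(ladder, urica_subscale)
print(f"Pearson r={r_p:.2f} (p={p_p:.3f}); Spearman rho={r_s:.2f} (p={p_s:.3f})")
```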
Abstract:
OBJECTIVES: To describe the process of translation and linguistic and cultural validation of the Evidence-Based Practice Questionnaire for the Portuguese context: Questionário de Eficácia Clínica e Prática Baseada em Evidências (QECPBE). METHOD: A methodological, cross-sectional study was developed. The translation and back-translation were performed according to traditional standards. Principal component analysis with orthogonal rotation according to the Varimax method was used to verify the QECPBE's psychometric characteristics, followed by confirmatory factor analysis. Internal consistency was determined by Cronbach's alpha. Data were collected between December 2013 and February 2014. RESULTS: 358 nurses delivering care in a hospital facility in the north of Portugal participated in the study. The QECPBE contains 20 items and three subscales: Practice (α=0.74); Attitudes (α=0.75); Knowledge/Skills and Competencies (α=0.95), presenting an overall internal consistency of α=0.74. The tested model explained 55.86% of the variance and presented good fit: χ²(167)=520.009; p=0.0001; χ²/df=3.114; CFI=0.908; GFI=0.865; PCFI=0.798; PGFI=0.678; RMSEA=0.077 (90% CI=0.07-0.08). CONCLUSION: Confirmatory factor analysis revealed that the questionnaire is valid and appropriate to be used in the studied context.
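As an illustration of the internal-consistency step, the sketch below computes Cronbach's alpha from an item-score matrix; the respondent count, item count and scale range are assumptions for the example, not the QECPBE data.

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()      # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of total scores
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Hypothetical example: 358 respondents answering 6 items on a 7-point scale
rng = np.random.default_rng(1)
scores = rng.integers(1, 8, size=(358, 6))
alpha = cronbach_alpha(scores)
```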
Abstract:
Master's dissertation in Characterization Techniques and Chemical Analysis
Abstract:
OBJECTIVE - The aim of our study was to assess the performance of a wrist monitor, the Omron Model HEM-608, compared with the indirect method for blood pressure measurement. METHODS - Our study population consisted of 100 subjects, 29 normotensive and 71 hypertensive. Participants had their blood pressure checked 8 times with alternating techniques, 4 by the indirect method and 4 with the Omron wrist monitor. The validation criteria used to test this device were based on internationally recognized protocols. RESULTS - Our data showed that the Omron HEM-608 reached classification B for systolic and A for diastolic blood pressure according to one of these protocols. The mean differences between blood pressure values obtained with each of the methods were -2.3 ± 7.9 mmHg for systolic and 0.97 ± 5.5 mmHg for diastolic blood pressure. Therefore, we considered this type of device approved according to the selected criteria. CONCLUSION - Our study leads us to conclude that this wrist monitor is not only easy to use, but also produces results very similar to those obtained by the standard indirect method.
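A minimal sketch of the device-comparison statistics is shown below: the mean difference ± SD between paired readings and the cumulative percentages of differences within 5, 10 and 15 mmHg used by BHS-style grading protocols. The paired readings are simulated, not the study data.

```python
import numpy as np

rng = np.random.default_rng(2)
sys_reference = rng.normal(140, 20, 100)                    # indirect-method readings (simulated)
sys_device = sys_reference - 2.3 + rng.normal(0, 7.9, 100)  # wrist-monitor readings (simulated)

diff = sys_device - sys_reference
mean_diff, sd_diff = diff.mean(), diff.std(ddof=1)
within = {t: np.mean(np.abs(diff) <= t) * 100 for t in (5, 10, 15)}
print(f"mean difference {mean_diff:+.1f} ± {sd_diff:.1f} mmHg; "
      f"within 5/10/15 mmHg: {within[5]:.0f}/{within[10]:.0f}/{within[15]:.0f}%")
```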
Abstract:
Multiphase flows, hyperbolic model, Godunov method, nozzle flow, nonstrictly hyperbolic
Abstract:
A novel approach to measure carbon dioxide (CO₂) in gaseous samples, based on precise and accurate quantification against a ¹³CO₂ internal standard generated in situ, is presented. The main goal of this study was to provide an innovative headspace gas chromatography-mass spectrometry (HS-GC-MS) method applicable to the routine determination of CO₂. The main drawback of the GC methods discussed in the literature for CO₂ measurement is the lack of a specific internal standard necessary to perform quantification. CO₂ is still quantified by external calibration, without taking into account analytical problems which can often occur with gaseous samples. To avoid the manipulation of a stable isotope-labeled gas, we chose to generate an internal labeled standard gas (¹³CO₂) in situ, on the basis of the stoichiometric formation of CO₂ by the reaction of hydrochloric acid (HCl) with sodium hydrogen carbonate (NaH¹³CO₃). This method allows a precise measurement of CO₂ concentration and was validated on various human postmortem gas samples in order to study its efficiency.
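The internal-standard principle described above can be sketched as follows. The molar mass, peak areas, and the simplifying assumption of equal MS response factors for CO₂ and ¹³CO₂ (monitored, for example, at m/z 44 and 45) are illustrative only and do not reproduce the validated method's calibration.

```python
# Single-point internal-standard sketch: the amount of 13CO2 generated in the
# vial follows stoichiometrically from the moles of NaH13CO3 reacted with
# excess HCl; sample CO2 is then estimated from the peak-area ratio.

M_NaH13CO3 = 85.0   # g/mol, approximate molar mass of NaH13CO3 (assumed value)

def co2_amount_umol(area_co2, area_13co2, mass_nah13co3_g):
    """Estimate micromoles of CO2 in the headspace sample (equal response factors assumed)."""
    n_is_umol = mass_nah13co3_g / M_NaH13CO3 * 1e6   # 13CO2 internal standard generated
    return area_co2 / area_13co2 * n_is_umol

# Hypothetical peak areas and weighed carbonate mass
n_co2 = co2_amount_umol(area_co2=2.4e6, area_13co2=1.1e6, mass_nah13co3_g=0.00085)
```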
Abstract:
This technical report is a document prepared as a deliverable [D4.3 Report of the Interlinkages and forecasting prototype tool] of an EU project – DECOIN Project No. 044428 - FP6-2005-SSP-5A. The text is divided into 4 sections: (1) this short introductory section explains the purpose of the report; (2) the second section provides a general discussion of a systemic problem found in existing quantitative analyses of sustainability. It addresses the epistemological implications of complexity, which entails the need to deal with the existence of multiple scales and non-equivalent narratives (multiple dimensions/attributes) used to define sustainability issues. There is an unavoidable tension between a “steady-state view” (= the perception of what is going on now – reflecting a PAST → PRESENT view of reality) versus an “evolutionary view” (= the unknown transformation that we have to expect in the process of becoming of the observed reality and in the observer – reflecting a PRESENT → FUTURE view of reality). The section ends by listing the implications of these points for the choice of integrated packages of sustainability indicators; (3) the third section illustrates the potential of the DECOIN toolkit for the study of sustainability trade-offs and linkages across indicators, using quantitative examples taken from case studies of another EU project (SMILE). In particular, this section starts by addressing the existence of internal constraints to sustainability (economic versus social aspects). The narrative chosen for this discussion focuses on the dark side of ageing and immigration for the economic viability of social systems. The section then continues by exploring external constraints to sustainability (economic development versus the environment). The narrative chosen for this discussion focuses on the dark side of the current strategy of economic development based on externalization and the “bubbles disease”; (4) the last section presents a critical appraisal of the quality of energy data found in energy statistics. It starts with a discussion of the general goal of statistical accounting. Then it introduces the concept of multipurpose grammars. The second part uses the experience gained in the activities of the DECOIN project to answer the question: how useful are EUROSTAT energy statistics? The answer starts with an analysis of basic epistemological problems associated with the accounting of energy. This discussion leads to the acknowledgment of an important epistemological problem: the unavoidable bifurcations in the mechanism of accounting needed to generate energy statistics. Using numerical examples, the text deals with the following issues: (i) the pitfalls of the actual system of accounting in energy statistics; (ii) a critical appraisal of the actual system of accounting in BP statistics; (iii) a critical appraisal of the actual system of accounting in Eurostat statistics. The section ends by proposing an innovative method to represent energy statistics which can prove more useful for those willing to develop sustainability indicators.
Abstract:
MOTIVATION: Microarray results accumulated in public repositories are widely reused in meta-analytical studies and secondary databases. The quality of the data obtained with this technology varies from experiment to experiment, and an efficient method for quality assessment is necessary to ensure their reliability. RESULTS: The lack of a good benchmark has hampered the evaluation of existing methods for quality control. In this study, we propose a new independent quality metric that is based on evolutionary conservation of expression profiles. We show, using 11 large organ-specific datasets, that IQRray, a new quality metric developed by us, exhibits the highest correlation with this reference metric among the 14 metrics tested. IQRray outperforms other methods in the identification of poor-quality arrays in datasets composed of arrays from many independent experiments. In contrast, the performance of methods designed for detecting outliers in a single experiment, such as Normalized Unscaled Standard Error and Relative Log Expression, was low because of the inability of these methods to detect datasets containing only low-quality arrays and because their scores cannot be directly compared between experiments. AVAILABILITY AND IMPLEMENTATION: The R implementation of IQRray is available at: ftp://lausanne.isb-sib.ch/pub/databases/Bgee/general/IQRray.R. CONTACT: Marta.Rosikiewicz@unil.ch SUPPLEMENTARY INFORMATION: Supplementary data are available at Bioinformatics online.