Abstract:
Introduction – The estimation of relative renal function (RRF) by renal scintigraphy (RS) with technetium-99m-labelled dimercaptosuccinic acid (99mTc-DMSA) can be influenced by renal depth (RD), owing to the attenuation effect of the soft tissues surrounding the kidneys. Since this RD is rarely known, different attenuation correction (AC) methods have been developed, namely those using empirical formulas, such as the Raynaud, Taylor and Tonnesen methods, or the direct application of the geometric mean (GM). Objectives – To identify the influence of the different AC methods on the quantification of relative renal function by 99mTc-DMSA RS and to assess the variability of the resulting RD values. Methodology – Thirty-one patients referred for 99mTc-DMSA RS underwent the same acquisition protocol. Processing was performed by two independent operators, three times per examination, varying only the RRF determination method: Raynaud, Taylor, Tonnesen, GM, or no attenuation correction (NAC). The Friedman test was applied to study the influence of the different AC methods, and Pearson's correlation was used to assess the association and significance of the RD values with age, weight and height. Results – The Friedman test showed statistically significant differences among the methods (p<0.001), except for the comparisons NAC/Raynaud, Tonnesen/GM and Taylor/GM (p=1.000) for both kidneys. Pearson's correlation showed that weight has a strong positive correlation with all RD calculation methods. Conclusions – Of the three RD calculation methods, the Taylor method yields the RRF values closest to the GM. The choice of AC method significantly influences the quantitative RRF parameters.
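The geometric-mean and depth-correction arithmetic behind these methods can be sketched in a few lines of Python. This is an illustrative sketch, not the study's code: the attenuation coefficient and the posterior-only correction below are assumed values for demonstration.

```python
import math

def geometric_mean_counts(anterior, posterior):
    """Geometric mean of anterior and posterior kidney counts,
    which largely cancels depth-dependent attenuation."""
    return math.sqrt(anterior * posterior)

def relative_renal_function(left_counts, right_counts):
    """Relative renal function (%) attributed to the left kidney."""
    return 100.0 * left_counts / (left_counts + right_counts)

def depth_corrected_counts(posterior, depth_cm, mu=0.12):
    """Posterior-only counts corrected with a depth estimate (cm) and a
    linear attenuation coefficient mu (cm^-1); 0.12 is an assumed
    soft-tissue value for ~140 keV photons, used here for illustration."""
    return posterior * math.exp(mu * depth_cm)
```

With anterior and posterior acquisitions, the geometric mean largely cancels the depth term, which is why it serves as the reference against which the empirical depth formulas are compared.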
Abstract:
Dissertation for the Integrated Master's degree in Mechanical Engineering
Abstract:
This work explores a concept for motion detection in brain MR examinations using high channel-count RF coil arrays. It applies ultrashort (<100 μsec) free induction decay signals, making use of the knowledge that motion induces variations in these signals when compared to a reference free induction decay signal. As a proof of concept, the method was implemented in a standard structural MRI sequence. The stability of the free induction decay signal was verified in phantom experiments. Human experiments demonstrated that the observed variations in the navigator data provide a sensitive measure for detection of relevant and common subject motion patterns. The proposed methodology provides a means to monitor subject motion throughout an MRI scan while causing little or no impact on the sequence timing and image contrast. It could hence complement available motion detection and correction methods, thus further reducing motion sensitivity in MR applications.
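A minimal sketch of the navigator idea, assuming motion is flagged whenever an FID sample deviates from a motion-free reference by more than a threshold; the 5% default is an arbitrary illustrative value, not from the study:

```python
import numpy as np

def fid_motion_metric(fid, reference):
    """Normalised RMS deviation of a multi-channel FID sample from a
    motion-free reference FID."""
    fid = np.asarray(fid, dtype=complex)
    ref = np.asarray(reference, dtype=complex)
    return np.linalg.norm(fid - ref) / np.linalg.norm(ref)

def detect_motion(fids, reference, threshold=0.05):
    """Flag every repetition whose FID deviates from the reference by
    more than `threshold` (hypothetical cut-off for illustration)."""
    return [fid_motion_metric(f, reference) > threshold for f in fids]
```

In practice the per-channel sensitivity of a large coil array is what makes such a deviation metric sensitive to distinct motion patterns.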
Abstract:
One of the main problems in quantitative analysis of complex samples by x-ray fluorescence is related to interelemental (or matrix) effects. These effects appear as a result of interactions among sample elements, affecting the x-ray emission intensity in a non-linear manner. Basically, two main effects occur: intensity absorption and enhancement. The combination of these effects can lead to serious problems. Many studies have been carried out proposing mathematical methods to correct for these effects. Basic concepts and the main correction methods are discussed here.
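One widely used family of such corrections is the influence-coefficient model of Lachance–Traill, C_i = R_i (1 + Σ_j α_ij C_j), which can be solved iteratively. The sketch below is a generic illustration; the influence coefficients in any real application are empirical or computed values, and those used in testing here are hypothetical:

```python
def lachance_traill(relative_intensities, alphas, iterations=50):
    """Iteratively solve C_i = R_i * (1 + sum_j alpha_ij * C_j), where
    R_i are measured relative intensities and alphas[i][j] are
    influence coefficients describing matrix effects."""
    n = len(relative_intensities)
    conc = list(relative_intensities)  # start from the uncorrected values
    for _ in range(iterations):
        conc = [relative_intensities[i] *
                (1.0 + sum(alphas[i][j] * conc[j]
                           for j in range(n) if j != i))
                for i in range(n)]
    return conc
```

With all coefficients zero the model reduces to the uncorrected linear calibration, which makes the role of the α terms easy to verify.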
Abstract:
Because chemometrics is used extensively yet is commonly associated with applications that require purchased licenses or routines, the objective of this work was to develop a free-access exploratory data analysis software application for academic use that is easy to install and can be handled without user-level programming. The developed software, called Chemostat, employs Hierarchical Cluster Analysis (HCA), Principal Component Analysis (PCA) and interval Principal Component Analysis (iPCA), as well as correction methods, data transformation and outlier detection. Data can be imported from the clipboard, text files, ASCII files or FT-IR Perkin-Elmer “.sp” files. The software generates a variety of charts and tables that allow the analysis of results, which can be exported in several formats. The main features of the software were tested using mid-infrared and near-infrared spectra of vegetable oils and digital images obtained from different types of commercial diesel. In order to validate the results, the same data sets were analyzed using Matlab© and the results of both applications matched in various combinations. In addition to the desktop version, the reuse of algorithms allowed an online version to be provided that offers a unique experience on the web. Both applications are available in English.
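The PCA step at the core of such chemometrics tools can be sketched with a standard SVD implementation; this is a generic sketch, not Chemostat's actual code:

```python
import numpy as np

def pca(X, n_components=2):
    """Principal Component Analysis of a samples-by-variables matrix:
    mean-centre the data, then take the leading singular vectors."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * S[:n_components]   # sample coordinates
    loadings = Vt[:n_components]                      # variable weights
    explained = (S ** 2) / (S ** 2).sum()             # variance fractions
    return scores, loadings, explained[:n_components]
```

iPCA applies the same decomposition to contiguous sub-intervals of the spectral axis instead of the full spectrum.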
Abstract:
Objective: To define the effect of lipids and of dyslipidemia treatment on prostate and bladder cancers using different study designs, while accounting for several biases, particularly immortal time bias. Design: The first part uses a retrospective case-control design. A validated semi-quantitative food frequency questionnaire was used. The COX2 genotype of nine single nucleotide polymorphisms (SNPs) was measured with a TaqMan platform. Unconditional logistic regression models were used to compare the risk of a prostate cancer diagnosis and the interaction. The second part uses a retrospective cohort design based on the administrative data of the Régie de l'assurance maladie du Québec (RAMQ). Cox regression models were used to measure the association between statins and the evolution of bladder cancer. The third part takes a methodological look at immortal time bias by examining its presence in the oncology literature. Its importance is illustrated with the cohort data from the second part, and possible correction methods are applied. Results: The first study shows that a diet rich in marine omega-3 fatty acids was strongly associated with a decreased risk of aggressive prostate cancer (p<0.0001 for trend). The odds ratio for prostate cancer in the top omega-3 quartile was 0.37 (95% CI = 0.25 to 0.54). The dietary effect was modified by the COX-2 SNP rs4648310 genotype (p=0.002 for interaction). In particular, men with low omega-3 intake and the rs4648310 variant had an increased risk of prostate cancer (odds ratio = 5.49, 95% CI = 1.80 to 16.7), an effect reversed by a higher omega-3 intake.
The second study found that statin use is associated with a decreased risk of bladder cancer progression (hazard ratio = 0.44, 95% CI = 0.20 to 0.96, p=0.039). This association was even stronger for all-cause death (HR = 0.57, 95% CI = 0.43 to 0.76, p=0.0001). The statin effect appears to be dose-dependent. The third study shows that immortal time bias is frequent and important in oncological epidemiological studies. It has several aspects, some of which are best prevented at the study-design stage, and various statistical methods allow this bias to be controlled. Conclusion: 1) A diet rich in omega-3 may have a protective effect against prostate cancer. 2) Statin use may have a protective effect on the progression of non-invasive bladder cancer. Lipids appear to have an effect on urological cancers.
Abstract:
A cryptosystem using linear codes was developed in 1978 by McEliece. Later, in 1985, Niederreiter and others developed a modified version of the cryptosystem using concepts of linear codes. These systems were rarely used in practice, however, because of their large key sizes. In this study we design a cryptosystem using the concepts of algebraic geometric codes with a smaller key size. Error detection and correction can be done efficiently by simple decoding methods using the cryptosystem developed. Approach: Algebraic geometric codes are codes generated using curves. The cryptosystem uses basic concepts of elliptic curve cryptography and a generator matrix. The decrypted information takes the form of a repetition code, which reduces the complexity of the decoding procedure. Error detection and correction can be carried out efficiently by solving a simple system of linear equations, thereby combining security with error detection and correction. Results: The algorithm was implemented in MATLAB and a comparative analysis was carried out on various parameters of the system. Attacks are common to all cryptosystems, but by securely choosing the curve, the field and the representation of the field elements, these attacks can be overcome and a stable system can be generated. Conclusion: The algorithm defined here protects the information both from an intruder and from errors in the communication channel through efficient error correction methods.
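The repetition-code decoding mentioned in the abstract can be illustrated with a toy sketch. This stands in for, and greatly simplifies, the algebraic-geometric construction: it shows only how majority voting corrects channel errors, not the cryptographic layer:

```python
import random

def encode_repetition(bits, r=5):
    """Encode each bit r times (a simple linear code standing in for
    the algebraic-geometric code of the abstract)."""
    return [b for bit in bits for b in [bit] * r]

def add_errors(codeword, n_errors):
    """Flip n_errors randomly chosen bits to simulate channel noise."""
    cw = list(codeword)
    for i in random.sample(range(len(cw)), n_errors):
        cw[i] ^= 1
    return cw

def decode_repetition(codeword, r=5):
    """Majority-vote decoding: corrects up to (r-1)//2 flips per symbol."""
    return [int(sum(codeword[i:i + r]) > r // 2)
            for i in range(0, len(codeword), r)]
```

With r = 5, any two channel errors are always corrected, since each symbol group retains a majority of correct bits.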
Abstract:
Objective: To analyze the outcomes of 111 pediatric patients who underwent enucleation or evisceration at the Clínica Barraquer over 11 years (1990-2000), determining the type of correction performed and the complications associated with the surgical procedure. Methods: Retrospective review of the clinical records of all patients under 18 years of age who underwent the above procedures, collecting demographic data, diagnosis, associated ophthalmological surgeries, characteristics of implants or grafts, follow-up, and postoperative complications for all patients. Results: Fifty-five enucleations and 56 eviscerations were performed. An implant or graft was omitted in only one case. During the first 4 years of the study period, dermis-fat grafts accounted for 25.45% and nylon implants for 72.72% of the corrections performed, whereas in the last 7 years hydroxyapatite implants constituted 78.57% of the operated cases. Complications requiring some type of surgical correction were observed in 17 (15.32%) patients, with no statistically significant differences among the different corrections used. Conclusions: In the pediatric population, hydroxyapatite implants, in addition to providing excellent reconstruction of the anophthalmic orbit, improve cosmetic and motility outcomes. However, other corrections, such as dermis-fat grafts, remain an excellent alternative in our setting, considering their much lower cost. Although some complications were observed with the different types of correction, very few required reoperation. The complication rate increases in younger patients.
Abstract:
The calculation of interval forecasts for highly persistent autoregressive (AR) time series based on the bootstrap is considered. Three methods are considered for countering the small-sample bias of least-squares estimation for processes which have roots close to the unit circle: a bootstrap bias-corrected OLS estimator; the use of the Roy–Fuller estimator in place of OLS; and the use of the Andrews–Chen estimator in place of OLS. All three methods of bias correction yield superior results to the bootstrap in the absence of bias correction. Of the three correction methods, the bootstrap prediction intervals based on the Roy–Fuller estimator are generally superior to the other two. The small-sample performance of bootstrap prediction intervals based on the Roy–Fuller estimator is investigated when the order of the AR model is unknown and has to be determined using an information criterion.
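A bootstrap bias-corrected OLS estimator of the kind compared in the paper can be sketched for the AR(1) case without an intercept; this is a simplified illustration, not the authors' exact procedure:

```python
import numpy as np

def ols_ar1(y):
    """OLS estimate of phi in y_t = phi * y_{t-1} + e_t (no intercept)."""
    return np.dot(y[:-1], y[1:]) / np.dot(y[:-1], y[:-1])

def bias_corrected_ar1(y, n_boot=500, rng=None):
    """Bootstrap bias correction: phi_bc = 2*phi_hat - mean(phi_boot).
    Residuals are resampled to rebuild series from the fitted model."""
    rng = np.random.default_rng(rng)
    phi_hat = ols_ar1(y)
    resid = y[1:] - phi_hat * y[:-1]
    boots = []
    for _ in range(n_boot):
        e = rng.choice(resid, size=len(y) - 1, replace=True)
        yb = np.empty(len(y))
        yb[0] = y[0]
        for t in range(1, len(y)):
            yb[t] = phi_hat * yb[t - 1] + e[t - 1]
        boots.append(ols_ar1(yb))
    return 2.0 * phi_hat - np.mean(boots)
```

Because OLS is biased downward for roots near unity, the bootstrap mean typically sits below the original estimate, so the correction pushes the estimate back up toward the true root.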
Abstract:
Background It can be argued that adaptive designs are underused in clinical research. We have explored concerns related to inadequate reporting of such trials, which may influence their uptake. Through a careful examination of the literature, we evaluated the standards of reporting of group sequential (GS) randomised controlled trials, one form of a confirmatory adaptive design. Methods We undertook a systematic review, by searching Ovid MEDLINE from 1st January 2001 to 23rd September 2014, supplemented with trials from an audit study. We included parallel group, confirmatory, GS trials that were prospectively designed using a Frequentist approach. Eligible trials were examined for compliance in their reporting against the CONSORT 2010 checklist. In addition, as part of our evaluation, we developed a supplementary checklist to explicitly capture group sequential specific reporting aspects, and investigated how these are currently being reported. Results Of the 284 screened trials, 68 (24%) were eligible. Most trials were published in “high impact” peer-reviewed journals. Examination of trials established that 46 (68%) were stopped early, predominantly either for futility or efficacy. Suboptimal reporting compliance was found in general items relating to: access to full trial protocols; methods to generate randomisation list(s); details of randomisation concealment, and its implementation. Benchmarking against the supplementary checklist, GS aspects were largely inadequately reported. Only 3 (7%) trials which stopped early reported use of statistical bias correction. Moreover, 52 (76%) trials failed to disclose methods used to minimise the risk of operational bias, due to the knowledge or leakage of interim results. Occurrence of changes to trial methods and outcomes could not be determined in most trials, due to inaccessible protocols and amendments.
Discussion and Conclusions There are issues with the reporting of GS trials, particularly those specific to the conduct of interim analyses. Suboptimal reporting of bias correction methods could potentially imply that most GS trials stopping early report biased estimates of treatment effects. As a result, research consumers may question the credibility of findings used to change practice when trials are stopped early. These issues could be alleviated through a CONSORT extension. Assurance of scientific rigour through transparent, adequate reporting is paramount to the credibility of findings from adaptive trials. Our systematic literature search was restricted to one database due to resource constraints.
Abstract:
We developed an analytical method and constrained procedural boundary conditions that enable accurate and precise Zn isotope ratio measurements in urban aerosols. We also demonstrate the potential of this new isotope system for air pollutant source tracing. The procedural blank is around 5 ng, significantly lower than in published methods, due to a tailored ion chromatographic separation. Accurate mass bias correction using external correction with Cu is limited to a Zn sample content of approximately 50 ng, due to the combined blank contribution of Cu and Zn from the ion exchange procedure and the need to maintain a Cu/Zn ratio of approximately 1. Mass bias is instead corrected for by applying the common analyte internal standardization approach. Comparison with other mass bias correction methods demonstrates the accuracy of the method. The average precision of δ66Zn determinations in aerosols is around 0.05‰ per atomic mass unit. The method was tested on aerosols collected in São Paulo City, Brazil. The measurements reveal significant variations in δ66Zn(Imperial), ranging between -0.96 and -0.37‰ in coarse and between -1.04 and 0.02‰ in fine particulate matter. This variability suggests that Zn isotopic compositions distinguish atmospheric sources. The isotopically light signature suggests traffic as the main source. We present further δ66Zn(Imperial) data for the standard reference material NIST SRM 2783 (δ66Zn(Imperial) = 0.26 ± 0.10‰).
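External mass bias correction with Cu, as referenced above, conventionally uses the exponential fractionation law; a sketch under that assumption follows. The certified 65Cu/63Cu value is the commonly cited NIST SRM 976 ratio, and the atomic masses are rounded illustrative values:

```python
import math

# Isotope masses (u), rounded values for illustration
M_ZN66, M_ZN64 = 65.926, 63.929
M_CU65, M_CU63 = 64.928, 62.930

def beta_from_cu(measured_cu_ratio, true_cu_ratio=0.4456):
    """Fractionation factor beta from the measured vs. certified
    65Cu/63Cu ratio, using the exponential law."""
    return (math.log(true_cu_ratio / measured_cu_ratio) /
            math.log(M_CU65 / M_CU63))

def correct_zn_ratio(measured_zn_ratio, beta):
    """Apply the Cu-derived beta to the measured 66Zn/64Zn ratio."""
    return measured_zn_ratio * (M_ZN66 / M_ZN64) ** beta
```

The approach assumes Cu and Zn fractionate identically in the plasma, which is exactly the assumption that breaks down when blank contributions or a drifting Cu/Zn ratio come into play, as the abstract notes.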
Abstract:
Introduction: The use of dermal filling techniques for soft tissue augmentation has greatly increased in recent years. Hyaluronic acid is one of the most used temporary dermal fillers in the treatment of facial wrinkles, furrows, and folds due to its effectiveness and safety. Objective: To evaluate the efficacy and safety of Perfectha®, a new hyaluronic acid filler, for nasolabial folds and lip correction. Methods: Open, multicenter study comprising 87 women. Efficacy was evaluated by the Global Aesthetic Improvement Scale and the Wrinkle Severity Rating Scale. Safety was evaluated through observation and the reporting of side effects. Results: One week after the injection of the filler, improvement in nasolabial folds and lips was observed in 86% and 89% of the women, respectively. Mild or moderate transient inflammatory reaction and ecchymoses occurred in 15% and 9% of patients, respectively, mainly in nasolabial folds. Two patients presented labial herpes simplex after treatment of the lips. The good results were maintained in 76% and 57% of women for nasolabial folds and in 72% and 45% of women for lips after 3 and 6 months, respectively. Conclusion: Perfectha® was effective and safe for these indications.
Abstract:
A way to investigate turbulence is through experiments in which hot-wire measurements are performed. The analysis of the influence of a temperature gradient in turbulence on hot-wire measurements is the aim of this thesis work. To the author's knowledge, this investigation is the first attempt to document, understand and ultimately correct the effect of temperature gradients on turbulence statistics. A numerical approach is used, since instantaneous temperature and streamwise velocity fields are required to evaluate this effect. A channel flow simulation at Re_tau = 180 is analyzed to make a first evaluation of the amount of error introduced by the temperature gradient inside the domain. The hot-wire data field is obtained by processing the numerical flow field through the application of a proper version of King's law, which connects voltage, velocity and temperature. A drift in the mean streamwise velocity profile and in the rms is observed when the temperature correction is performed by means of the centerline temperature. A correct mean velocity profile is achieved by correcting temperature through its mean value at each wall-normal position, but a non-negligible error is still present in the rms. The key point for properly correcting the velocity sensed by the hot wire is knowledge of the instantaneous temperature field. For this purpose three correction methods are proposed. Finally, a numerical simulation at Re_tau = 590 is also evaluated in order to confirm the results discussed earlier.
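King's law and a temperature rescaling of the kind described above can be sketched as follows; the calibration constants, wire temperature and exponent are hypothetical values chosen for illustration:

```python
# King's law relates hot-wire voltage E to velocity U: E^2 = A + B * U**n.
A, B, N = 1.2, 0.8, 0.45   # hypothetical calibration coefficients
T_WIRE = 250.0             # wire operating temperature (C), assumed
T_REF = 20.0               # calibration fluid temperature (C), assumed

def voltage_from_velocity(u):
    """Forward King's law at the calibration temperature."""
    return (A + B * u ** N) ** 0.5

def velocity_from_voltage(e, t_fluid=T_REF):
    """Invert King's law; rescale E^2 by the overheat ratio to
    compensate for a fluid temperature differing from calibration."""
    e2 = e ** 2 * (T_WIRE - T_REF) / (T_WIRE - t_fluid)
    return ((e2 - A) / B) ** (1.0 / N)
```

Using the mean or centerline temperature in place of `t_fluid` is exactly the approximation whose residual error the thesis quantifies; the proposed methods aim at recovering the instantaneous temperature instead.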
Abstract:
The isotope composition of selenium (Se) can provide important constraints on biological, geochemical, and cosmochemical processes taking place in different reservoirs on Earth and during planet formation. To provide precise qualitative and quantitative information on these processes, accurate and highly precise isotope data need to be obtained. The currently applied ICP-MS methods for Se isotope measurements are compromised by the necessity to perform a large number of interference corrections. Differences in these correction methods can lead to discrepancies in published Se isotope values of rock standards which are significantly higher than the claimed precision. An independent analytical approach applying a double spike (DS) and state-of-the-art TIMS may yield better precision due to its smaller number of interferences, and could test the accuracy of data obtained by ICP-MS approaches. This study shows that the precision of Se isotope measurements performed with two different Thermo Scientific™ Triton™ Plus TIMS is distinctly deteriorated, to about ±1‰ (2 s.d.) for δ80/78Se, by a memory Se signal of up to several millivolts and an additional minor residual mass bias which could not be corrected for with the common isotope fractionation laws. This memory Se has a variable isotope composition, with a DS fraction of up to 20%, and accumulates with an increasing number of measurements. Thus it represents an accumulation of Se from previous Se measurements, with a potential addition from a sample or machine blank. Several cleaning techniques for the MS parts were tried to decrease the memory signal, but were not sufficient to allow precise Se isotope analysis. If these serious memory problems can be overcome in the future, the precision and accuracy of Se isotope analysis with TIMS should be significantly better than those of the current ICP-MS approaches.
Abstract:
The presented database contains time-referenced sea ice draft values from upward looking sonar (ULS) measurements in the Weddell Sea, Antarctica. The sea ice draft data can be used to infer the thickness of the ice. They were collected during the period 1990-2008. In total, the database includes measurements from 13 locations in the Weddell Sea and was generated from more than 3.7 million measurements of sea ice draft. The files contain uncorrected raw drafts, corrected drafts and the basic parameters measured by the ULS. The measurement principle, the data processing procedure and the quality control are described in detail. To account for the unknown speed of sound in the water column above the ULS, two correction methods were applied to the draft data. The first method is based on defining a reference level from the identification of open water leads. The second method uses a model of sound speed in the oceanic mixed layer and is applied to ice draft in austral winter. Both methods are discussed and their accuracy is estimated. Finally, selected results of the processing are presented.
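The two-step draft derivation and the open-water zero-level correction described above can be illustrated schematically; the instrument depth, sound speed and reference values below are hypothetical numbers for demonstration:

```python
def raw_draft(instrument_depth_m, travel_time_s, sound_speed_ms=1442.0):
    """Draft = instrument depth minus the acoustic range to the ice
    underside. The assumed sound speed stands in for the unknown true
    value in the water column above the ULS."""
    acoustic_range = sound_speed_ms * travel_time_s / 2.0
    return instrument_depth_m - acoustic_range

def zero_level_corrected(drafts, open_water_drafts):
    """Open-water (lead) correction: the apparent draft over identified
    leads should be zero, so its mean defines the offset that is
    subtracted from all draft values."""
    offset = sum(open_water_drafts) / len(open_water_drafts)
    return [d - offset for d in drafts]
```

The second method mentioned in the abstract replaces the lead-based offset with a modelled mixed-layer sound speed, which is why it is applicable in winter when open leads are scarce.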