925 results for Weighted histogram analysis method
Abstract:
BACKGROUND: Studies on hexaminolevulinate (HAL) cystoscopy report improved detection of bladder tumours. However, recent meta-analyses report conflicting effects on recurrence. OBJECTIVE: To assess available clinical data for blue light (BL) HAL cystoscopy on the detection of Ta/T1 and carcinoma in situ (CIS) tumours, and on tumour recurrence. DESIGN, SETTING, AND PARTICIPANTS: This meta-analysis reviewed raw data from prospective studies on 1345 patients with known or suspected non-muscle-invasive bladder cancer (NMIBC). INTERVENTION: A single application of HAL cystoscopy was used as an adjunct to white light (WL) cystoscopy. OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS: We studied the detection of NMIBC (intention to treat [ITT]: n=831; six studies) and recurrence (per protocol: n=634; three studies) up to 1 yr. DerSimonian and Laird's random-effects model was used to obtain pooled relative risks (RRs) and associated 95% confidence intervals (CIs) for detection outcomes. RESULTS AND LIMITATIONS: BL cystoscopy detected significantly more Ta tumours (14.7%; p<0.001; odds ratio [OR]: 4.898; 95% CI, 1.937-12.390) and CIS lesions (40.8%; p<0.001; OR: 12.372; 95% CI, 6.343-24.133) than WL cystoscopy. In 24.9% of patients, at least one additional Ta/T1 tumour was seen with BL (p<0.001); this was also significant in patients with primary (20.7%; p<0.001) and recurrent cancer (27.7%; p<0.001), and in patients at high risk (27.0%; p<0.001) and intermediate risk (35.7%; p=0.004). In 26.7% of patients, CIS was detected only by BL (p<0.001); this was also significant in patients with primary (28.0%; p<0.001) and recurrent cancer (25.0%; p<0.001). Recurrence rates up to 12 mo were significantly lower overall with BL, 34.5% versus 45.4% (p=0.006; RR: 0.761 [0.627-0.924]), and lower in patients with T1 or CIS (p=0.052; RR: 0.696 [0.482-1.003]), Ta (p=0.040; RR: 0.804 [0.653-0.991]), and in the high-risk (p=0.050) and low-risk (p=0.029) subgroups. Some subgroups had too few patients to allow statistically meaningful analysis. Heterogeneity was minimised by the statistical analysis method used. CONCLUSIONS: This meta-analysis confirms that HAL BL cystoscopy significantly improves the detection of bladder tumours, leading to a reduction of recurrence at 9-12 mo. The benefit is independent of the level of risk and is evident in patients with Ta, T1, CIS, primary, and recurrent cancer.
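The random-effects pooling named above follows a standard recipe. As context, here is a minimal sketch of the DerSimonian-Laird estimator in Python; the per-study log relative risks and variances are hypothetical placeholders, not data from this meta-analysis.

```python
# A minimal sketch of DerSimonian-Laird random-effects pooling of relative
# risks. Study data below are hypothetical; only the estimator itself
# follows the standard published formulas.
import numpy as np

def dersimonian_laird(log_rr, var):
    """Pool log relative risks with DerSimonian-Laird random effects."""
    w = 1.0 / var                           # fixed-effect (inverse-variance) weights
    y_fe = np.sum(w * log_rr) / np.sum(w)   # fixed-effect pooled estimate
    q = np.sum(w * (log_rr - y_fe) ** 2)    # Cochran's Q heterogeneity statistic
    k = len(log_rr)
    # Method-of-moments estimate of between-study variance tau^2 (floored at 0)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1.0 / (var + tau2)               # random-effects weights
    y_re = np.sum(w_re * log_rr) / np.sum(w_re)
    se = 1.0 / np.sqrt(np.sum(w_re))
    ci = (np.exp(y_re - 1.96 * se), np.exp(y_re + 1.96 * se))
    return np.exp(y_re), ci, tau2

# Hypothetical per-study log RRs and variances (three recurrence studies)
log_rr = np.array([np.log(0.72), np.log(0.81), np.log(0.75)])
var = np.array([0.012, 0.018, 0.015])
print(dersimonian_laird(log_rr, var))
```

The tau-squared term inflates the study variances to absorb between-study heterogeneity, which is the sense in which such a model accounts for heterogeneity across studies.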
Abstract:
Introduction. Genetic epidemiology is focused on the study of the genetic causes that determine health and disease in populations. To achieve this goal, a common strategy is to explore differences in genetic variability between diseased and non-diseased individuals. The usual markers of genetic variability are single nucleotide polymorphisms (SNPs), which are changes in a single base in the genome. The usual statistical approach in genetic epidemiology studies is a marginal analysis, in which each SNP is analyzed separately for association with the phenotype. Motivation. It has been observed that, for common diseases, single-SNP analysis is not very powerful for detecting causal genetic variants. In this work, we consider Gene Set Analysis (GSA) as an alternative to standard marginal association approaches. GSA aims to assess the overall association of a set of genetic variants with a phenotype and has the potential to detect subtle effects of variants in a gene or a pathway that might be missed when assessed individually. Objective. We present a new optimized implementation of a pair of gene set analysis methodologies for analyzing the individual evidence of SNPs in biological pathways. We perform a simulation study to explore the power of the proposed methodologies in a set of scenarios with different numbers of causal SNPs under different effect sizes. In addition, we compare the results with the usual single-SNP analysis method. Moreover, we show the advantage of using the proposed gene set approaches in the context of an Alzheimer's disease case-control study in which we explore the Reelin signaling pathway.
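For readers unfamiliar with GSA, the sketch below shows one common gene-set statistic, Fisher's method for combining per-SNP p-values into a single pathway-level p-value. It illustrates the general idea only, not the pair of methodologies proposed here, and the p-values are hypothetical.

```python
# A minimal sketch of one common gene-set statistic: Fisher's method for
# combining per-SNP association p-values within a pathway. Not necessarily
# the methodologies the abstract proposes; p-values below are hypothetical.
import numpy as np
from scipy import stats

def fisher_gene_set_test(snp_pvalues):
    """Combine per-SNP p-values into one pathway-level p-value.

    Assumes independent tests; SNPs in linkage disequilibrium would
    require an adjustment for correlation.
    """
    p = np.asarray(snp_pvalues)
    x = -2.0 * np.sum(np.log(p))    # Fisher's combined statistic
    df = 2 * len(p)                 # chi-square degrees of freedom
    return stats.chi2.sf(x, df)     # upper-tail pathway p-value

# Hypothetical marginal p-values for SNPs mapped to the Reelin pathway
print(fisher_gene_set_test([0.04, 0.20, 0.008, 0.35, 0.06]))
```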
Abstract:
In general, laboratory activities are costly in terms of time, space, and money. As such, the ability to provide realistically simulated laboratory data that enables students to practice data analysis techniques as a complementary activity would be expected to reduce these costs while opening up very interesting possibilities. In the present work, a novel methodology is presented for the design of analytical chemistry instrumental analysis exercises that can be automatically personalized for each student and evaluated immediately. The proposed system provides each student with a different set of experimental data, generated randomly while satisfying a set of constraints, rather than using data obtained from actual laboratory work. This allows the instructor to provide students with a set of practical problems to complement their regular laboratory work, along with the corresponding feedback provided by the system's automatic evaluation process. To this end, the Goodle Grading Management System (GMS), an innovative web-based educational tool for automating the collection and assessment of practical exercises for engineering and scientific courses, was developed. The proposed methodology takes full advantage of the Goodle GMS fusion code architecture. The design of a particular exercise is provided ad hoc by the instructor and requires basic Matlab knowledge. The system has been employed with satisfactory results in several university courses. To demonstrate the automatic evaluation process, three exercises are presented in detail. The first exercise involves a linear regression analysis of data and the calculation of the quality parameters of an instrumental analysis method. The second and third exercises address two different comparison tests: a comparison test of means and a paired t-test.
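The core idea, personalized data generation under constraints plus automatic evaluation, can be illustrated briefly. The sketch below is a hypothetical Python analogue (the actual system is Matlab-based and built on Goodle GMS): calibration data are drawn at random within instructor-set constraints, and the student's fitted slope is graded against the generator's ground truth.

```python
# A hypothetical sketch of per-student exercise generation and auto-grading.
# Constraint ranges, noise level, and tolerance are illustrative only.
import numpy as np

rng = np.random.default_rng(seed=12345)    # seed could derive from a student ID

def generate_exercise():
    """Random linear-calibration data satisfying simple constraints."""
    slope = rng.uniform(0.8, 1.2)          # constrained true sensitivity
    intercept = rng.uniform(0.0, 0.05)     # constrained true blank signal
    conc = np.linspace(0.0, 10.0, 8)       # standard concentrations
    signal = intercept + slope * conc + rng.normal(0.0, 0.01, conc.size)
    return conc, signal, slope

def grade(student_slope, true_slope, tol=0.02):
    """Automatic evaluation: accept answers within a relative tolerance."""
    return abs(student_slope - true_slope) / true_slope <= tol

conc, signal, true_slope = generate_exercise()
fitted_slope, fitted_intercept = np.polyfit(conc, signal, 1)
print(fitted_slope, fitted_intercept, grade(fitted_slope, true_slope))
```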
Abstract:
The goal of this study is to examine the intelligent home business network in order to determine which part of the network has the best financial ability to produce new business models and products/services, using financial statement analysis. A group of 377 studied limited companies is divided into four segments based on their offering in producing intelligent homes: customer service providers, system integrators, subsystem suppliers, and component suppliers. Eight key figures are calculated for each company to obtain a comprehensive view of its financial performance, after which each segment is studied statistically to determine the performance of the segment as a whole. The actual performance differences between the segments are calculated using a multi-criteria decision analysis method in which the performance of each key figure is graded and each key figure is weighted according to its importance for the goal of the study. The results of this analysis showed that subsystem suppliers have the best financial performance; system integrators are second, customer service providers third, and component suppliers fourth. None of the segments was strikingly poor, and even the component suppliers performed reasonably, so it can be said that no part of the intelligent home business network has remarkably inadequate financial ability to develop new business models and products/services.
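The grading-and-weighting step described above can be sketched compactly. In the Python sketch below, the segment names follow the abstract, but the grades and importance weights are hypothetical placeholders, not the study's actual figures.

```python
# A minimal sketch of weighted multi-criteria scoring: grade each key
# figure per segment, weight by importance, and rank the totals.
# Grades and weights are hypothetical.
import numpy as np

segments = ["customer service providers", "system integrators",
            "subsystem suppliers", "component suppliers"]

# Hypothetical grades (rows: segments, columns: eight key figures, scale 1-5)
grades = np.array([
    [3, 3, 4, 2, 3, 3, 2, 3],
    [4, 3, 4, 3, 4, 3, 3, 3],
    [4, 4, 5, 4, 4, 4, 3, 4],
    [3, 2, 3, 3, 3, 2, 3, 2],
])

# Hypothetical importance weights for the eight key figures (sum to 1)
weights = np.array([0.20, 0.15, 0.15, 0.10, 0.10, 0.10, 0.10, 0.10])

scores = grades @ weights                  # weighted score per segment
for name, score in sorted(zip(segments, scores), key=lambda t: -t[1]):
    print(f"{name}: {score:.2f}")
```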
Abstract:
In Brazil, scientific research is carried out mainly at universities, where professors coordinate research projects with the active participation of undergraduate and graduate students. However, there is no formal program for teaching and learning the scientific method. The objective of the present study was to evaluate the comprehension of the scientific method by health sciences students who participate in scientific projects in an academic research laboratory. An observational, descriptive, cross-sectional study was conducted using Edgar Morin's complexity as the theoretical reference. In a semi-structured interview, students were asked to solve an abstract logic puzzle, the Tangram. The collected data were analyzed using the hermeneutic-dialectic analysis method proposed by Minayo and discussed in terms of the theoretical reference of complexity. The students' concept of the scientific method is limited to participation in projects, stressing the execution of practical procedures as opposed to scientific thinking. The solving of the Tangram puzzle revealed that the students had difficulties in understanding questions and activities focused on subjects and their processes. Objective answers, even when dealing with personal issues, were also reflected in the students' opinions about the characteristics of a successful researcher. The students' difficulties concerning these issues may affect their scientific performance and result in poorly designed experiments. This is a preliminary study that should be extended to other centers of scientific research.
Abstract:
The aims of this study were to use the isotope analysis method to quantify the carbon of the C3 photosynthetic cycle in commercial apple nectars and to determine the legal limit needed to identify beverages that do not conform to the safety standards established by the Brazilian Ministry of Agriculture, Livestock and Food Supply. Reference beverages (apple nectars) were produced in the laboratory according to the Brazilian legislation. Adulterated nectars were also produced, with an amount of pulp juice below the permitted threshold limit value. The δ13C values of the apple nectars and their fractions (pulp and purified sugar) were measured to quantify the percentage of carbon from the C3 source. To detect adulteration, the values found were compared with the limit values established by Brazilian law. All commercial apple nectars analyzed were within the legal limits, which made it possible to identify the nectars that were in conformity with Brazilian law. The isotopic methodology developed proved efficient in quantifying the carbon of C3 origin in commercial apple nectars.
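The C3 quantification rests on a standard two-source isotope mixing calculation. The sketch below shows that formula in Python; the end-member δ13C values are typical literature figures for C3 and C4 plants, not this study's calibration.

```python
# A minimal sketch of the standard two-source isotope mixing calculation
# behind this kind of analysis: the fraction of C3-derived carbon follows
# from the sample's delta-13C and two end-member values. End-member deltas
# below are typical literature figures, not the paper's calibration.
def c3_fraction(delta_sample, delta_c3=-27.0, delta_c4=-12.0):
    """Percentage of carbon from the C3 source (e.g., apple) in a mixture."""
    return 100.0 * (delta_sample - delta_c4) / (delta_c3 - delta_c4)

# Hypothetical nectar measurement: a delta-13C of -22.5 per mil implies
# roughly 70% C3 carbon; a legal threshold would then decide conformity.
print(c3_fraction(-22.5))
```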
Abstract:
The growing interest in the use of dietary fiber in food has created the need for precise tools to describe its physical properties. This research examined two dietary fibers, from oats and from beets, in variable particle sizes. The application of automated static image analysis for describing the hydration properties and particle size distribution of dietary fiber was analyzed. Conventional tests for water holding capacity (WHC) were conducted. The particles were measured at two points: dry and after water soaking. The highest water holding capacity for oat fiber (7.00 g water/g solid) was achieved by the smaller particle size, whereas for beet fiber the water holding capacity was highest (4.20 g water/g solid) at the larger particle size. There was evidence of water absorption increasing with decreasing particle size within the same fiber source. Very strong correlations were found between particle shape parameters, such as fiber length, straightness, and width, and the hydration properties measured conventionally. The regression analysis made it possible to assess whether the automated static image analysis method could be an efficient tool for describing the hydration properties of dietary fiber. The application of the method was validated using a mathematical model that was verified against conventional WHC measurement results.
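The regression step can be illustrated with a small sketch: ordinary least squares relating image-derived shape parameters to conventionally measured WHC. All numbers below are hypothetical and only show the shape of the calculation, not the paper's model.

```python
# A minimal sketch of estimating conventionally measured water holding
# capacity (WHC) from particle shape parameters obtained by image analysis.
# Data and feature set are hypothetical.
import numpy as np

# Hypothetical measurements: columns are mean fiber length (mm),
# straightness (0-1), and width (mm) from static image analysis.
shape = np.array([
    [0.45, 0.82, 0.08],
    [0.30, 0.85, 0.06],
    [0.80, 0.78, 0.12],
    [0.60, 0.80, 0.10],
    [0.25, 0.88, 0.05],
])
whc = np.array([5.1, 6.3, 3.9, 4.5, 7.0])   # g water / g solid (conventional test)

X = np.column_stack([np.ones(len(whc)), shape])  # add intercept column
coef, *_ = np.linalg.lstsq(X, whc, rcond=None)   # ordinary least squares fit
predicted = X @ coef
r2 = 1 - np.sum((whc - predicted) ** 2) / np.sum((whc - whc.mean()) ** 2)
print(coef, r2)   # coefficients and goodness of fit
```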
Abstract:
The recent rapid development of biotechnological approaches has enabled the production of large whole-genome-level biological data sets. In order to handle these data sets, reliable and efficient automated tools and methods for data processing and result interpretation are required. Bioinformatics, as the field of studying and processing biological data, tries to answer this need by combining methods and approaches across computer science, statistics, mathematics and engineering. The need is also increasing for tools that can be used by biological researchers themselves, who may not have a strong statistical or computational background, which requires creating tools and pipelines with intuitive user interfaces, robust analysis workflows and a strong emphasis on result reporting and visualization. Within this thesis, several data analysis tools and methods have been developed for analyzing high-throughput biological data sets. These approaches, covering several aspects of high-throughput data analysis, are specifically aimed at gene expression and genotyping data, although in principle they are suitable for analyzing other data types as well. Coherent handling of the data across the various data analysis steps is highly important in order to ensure robust and reliable results. Thus, robust data analysis workflows are also described, putting the developed tools and methods into a wider context. The choice of the correct analysis method may also depend on the properties of the specific data set, and therefore guidelines for choosing an optimal method are given. The data analysis tools, methods and workflows developed within this thesis have been applied to several research studies, of which two representative examples are included in the thesis. The first study focuses on spermatogenesis in murine testis and the second one examines cell lineage specification in mouse embryonic stem cells.
Abstract:
Atrial fibrillation (AF) is an arrhythmia affecting the atria. In AF, atrial contraction is rapid and irregular, and ventricular filling becomes incomplete, which reduces cardiac output. AF can cause palpitations, fainting, chest pain, or heart failure, and it also increases the risk of stroke. Coronary artery bypass grafting (CABG) is a surgical procedure performed to restore blood flow in cases of severe coronary artery disease. Among patients who have never experienced AF, 10% to 65% develop it, most often on the second or third postoperative day. AF is particularly frequent after mitral valve surgery, occurring in about 64% of patients. The onset of postoperative AF is associated with increased morbidity and with longer and more expensive hospital stays. The mechanisms responsible for postoperative AF are not well understood, and identifying patients at high risk of AF after CABG would be useful for its prevention. The present project is based on the analysis of cardiac electrograms recorded in patients after aortocoronary bypass surgery. The first objective of the research is to investigate whether the recordings display typical changes before the onset of AF. The second objective is to identify predictive factors that make it possible to identify the patients who will develop AF. The recordings were made by Dr. Pierre Pagé's team on 137 patients treated by CABG. Three unipolar electrodes were sutured onto the atrial epicardium to record continuously during the first 4 postoperative days. The first task was to develop an algorithm to detect and distinguish atrial and ventricular activations on each channel, and to combine the activations of the three channels belonging to the same cardiac event. The algorithm was developed and optimized on a first set of markers, and its performance was evaluated on a second set. Validation software was developed to prepare these two sets and to correct the detections on all the recordings that were later used in the analyses. It was complemented by tools to form, label, and validate normal sinus beats, premature atrial and ventricular activations (PAA, PVA), and arrhythmia episodes. The preoperative clinical data were then analyzed to establish the preoperative risk of AF. Age, serum creatinine level, and a diagnosis of myocardial infarction proved to be the most important predictive factors. Although the level of preoperative risk could to some extent predict who would develop AF, it was not correlated with the time of onset of postoperative AF. For all patients who had at least one AF episode lasting 10 minutes or more, the two hours preceding the first sustained AF were analyzed. This first sustained AF was always triggered by a PAA, most often originating in the left atrium. However, during the two pre-AF hours, the distribution of PAAs, and of the fraction of them originating in the left atrium, was broad and inhomogeneous across patients.
The number of PAAs, the duration of transient arrhythmias, the sinus heart rate, and the low-frequency portion of heart rate variability (LF portion) showed significant changes in the last hour before the onset of AF. The last step was to compare patients with and without sustained AF in order to find factors capable of discriminating between the two groups. Five types of logistic regression models were compared. They had similar sensitivity, specificity, and receiver operating characteristic curves, and all of them predicted the patients without AF very poorly. A sliding-average method was proposed to improve the discrimination, especially for the patients without AF. Two models were retained, selected on the criteria of robustness, accuracy, and applicability. Around 70% of the patients without AF and 75% of the patients with AF were correctly identified in the last hour before AF. The PAA rate, the fraction of PAAs initiated in the left atrium, the pNN50, the atrioventricular conduction time, and the correlation between the latter and the heart rate were the predictive variables common to these two models.
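Two ingredients named above, the pNN50 index and the sliding-average method, are simple enough to sketch. The Python below computes pNN50 from RR intervals and smooths a series of per-window risk scores; the values are hypothetical and this is not the study's actual pipeline.

```python
# A minimal sketch of the pNN50 heart-rate-variability index and of a
# sliding (moving) average used to stabilize per-window prediction scores.
# RR intervals and scores are hypothetical.
import numpy as np

def pnn50(rr_ms):
    """Percentage of successive RR-interval differences exceeding 50 ms."""
    diffs = np.abs(np.diff(rr_ms))
    return 100.0 * np.mean(diffs > 50.0)

def moving_average(x, window=5):
    """Simple sliding mean over the last `window` scores."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="valid")

rr_ms = np.array([812, 790, 845, 798, 870, 801, 856, 799])  # hypothetical
print(pnn50(rr_ms))

# Smoothing hypothetical per-window AF-risk scores from a logistic model
scores = np.array([0.35, 0.42, 0.38, 0.55, 0.61, 0.58, 0.70, 0.66])
print(moving_average(scores, window=3))
```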
Abstract:
It is widely known that a significant fraction of the bits in a program are useless or even unused during execution. Bit-width analysis aims to find the minimum number of bits needed for each variable in the program while ensuring execution correctness, thereby saving resources. In this paper, we propose a static analysis method for bit-widths in general applications, which approximates conservatively at compile time and is independent of runtime conditions. While most related work focuses on integer applications, our method is also tailored to and applicable to floating-point variables, and could be extended, together with precision analysis, to transform floating-point numbers into fixed-point numbers. We use more precise representations for the data value ranges of both scalar and array variables, and element-level analysis is carried out for arrays. We also suggest an alternative to the standard fixed-point iterations in bi-directional range analysis. These techniques are implemented on the Trimaran compiler infrastructure and tested on a set of benchmarks to show the results.
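The flavor of range-based bit-width analysis can be shown in a few lines: propagate conservative value intervals through operations, then derive the minimal two's-complement width for the result. The Python sketch below illustrates the general technique only; the paper's compile-time analysis on Trimaran is considerably richer.

```python
# A minimal sketch of range-based bit-width analysis for integers:
# propagate conservative value intervals and derive minimum widths.
# The variable ranges are hypothetical.
def bits_needed(lo, hi):
    """Minimum two's-complement width for every value in [lo, hi]."""
    width = 1
    while not (-(1 << (width - 1)) <= lo and hi <= (1 << (width - 1)) - 1):
        width += 1
    return width

def add_range(a, b):
    """Conservative interval for x + y with x in a, y in b."""
    return (a[0] + b[0], a[1] + b[1])

def mul_range(a, b):
    """Conservative interval for x * y (check all corner products)."""
    corners = [x * y for x in a for y in b]
    return (min(corners), max(corners))

x = (0, 100)   # hypothetical range of a loop counter
y = (-8, 7)    # hypothetical range of a signed offset
s = add_range(x, y)
print(s, bits_needed(*s))   # (-8, 107) fits in 8 bits
print(mul_range(x, y))      # conservative product range
```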
Abstract:
Glioblastoma multiforme (GBM) is the most frequent brain tumor, with a poor prognosis and low sensitivity to initial treatment. The purpose of this study was to evaluate whether diffusion-weighted MRI (DW-MRI) is an early biomarker of tumor response, useful for making early treatment decisions and for obtaining prognostic information. Methodology: The search was conducted in the EMBASE, CENTRAL, and MEDLINE databases; bibliographies were also reviewed. The selected articles were observational studies (case-control, cohort, and cross-sectional); no clinical trial was found. All participants had a histopathological diagnosis of GBM, had undergone surgical resection and/or radio-chemotherapy, and had their treatment response followed with DW-MRI for at least 6 months. The data extracted independently were study type, participants, interventions, follow-up, and outcomes (survival, progression/stabilization of the disease, death). Results: Fifteen studies met the inclusion criteria. Among the DW-MRI techniques used to evaluate the radiological response to treatment were histograms of the apparent diffusion coefficient (ADC), which compared values below the mean and the 10th percentile of ADC with higher values; in general terms, a low ADC was found to be a strong predictor of survival and/or tumor progression (significant in 5 studies). Functional diffusion maps (FDM), which measured the percentage change of ADC from baseline to post-treatment, proved to be a strong predictor of survival in patients with tumor progression. Discussion: Unfortunately, the quality of the studies was intermediate to low, which limits their applicability.
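The ADC-histogram readouts used in the reviewed studies are straightforward to compute once an ADC map and a tumor region of interest are available. The sketch below, on hypothetical voxel values, derives the mean and the 10th percentile that those studies compared.

```python
# A minimal sketch of ADC-histogram metrics: the mean and 10th percentile
# of apparent diffusion coefficient values inside a tumor region of
# interest. Voxel values are hypothetical; units are 10^-3 mm^2/s as
# commonly reported.
import numpy as np

rng = np.random.default_rng(7)
adc_roi = rng.normal(loc=1.1, scale=0.25, size=5000)  # hypothetical ROI voxels

mean_adc = adc_roi.mean()
p10_adc = np.percentile(adc_roi, 10)                  # lower tail of the histogram
print(f"mean ADC = {mean_adc:.3f}, 10th percentile = {p10_adc:.3f}")
# Studies in this review compared such low-ADC measures against higher
# values as candidate predictors of survival and tumor progression.
```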
Abstract:
In this paper, we propose a new velocity constraint type for Redundant Drive Wire Mechanisms. The purpose of this paper is to demonstrate that the proposed velocity constraint module can fix the orientation of the movable part, and to use a kinematic analysis method to obtain the moving direction of the movable part. First, we discuss the necessity of this velocity constraint type and the possible applications of the proposed mechanism. Second, we derive the basic equations of a wire mechanism with this constraint type. Next, we present a method of motion analysis on the active and passive constraint spaces, which is used to find the moving direction of the movable part. Finally, we apply the above analysis method to a wire mechanism with a velocity constraint module and to a wire mechanism with four double actuator modules. By evaluating the results, we demonstrate the validity of the proposed constraint type.
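One standard way to obtain a movable part's admissible moving direction under velocity constraints is a null-space computation on the active constraint Jacobian. The sketch below illustrates that generic idea on a hypothetical planar Jacobian; it does not reproduce the paper's own constraint-space derivation.

```python
# A minimal sketch: feasible velocities of a constrained body span the
# null space of the active constraint Jacobian (rows map body velocity
# to constrained wire-length rates). The Jacobian is hypothetical.
import numpy as np
from scipy.linalg import null_space

# Hypothetical active-constraint Jacobian: two velocity constraints on a
# planar movable part with velocity (vx, vy, omega).
J_active = np.array([
    [1.0, 0.0, 0.2],
    [0.0, 1.0, -0.1],
])

directions = null_space(J_active)   # basis for admissible moving directions
print(directions)                   # here a single 3x1 direction vector
```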
Abstract:
Purpose – This case study presents an impact assessment of the Corporate Social Responsibility (CSR) programs of the TFM Company, in order to understand how they contribute to the sustainable development of the communities in the areas in which it operates. Design/Methodology/Approach – Data for this study were collected using qualitative methods that included semi-structured interviews and focus group discussions (FGDs), most of them audio- and video-recorded. Documentary analysis and a field visit were also undertaken to assess the quality of the CSR programs on the ground. The collected data were analyzed using the Seven Questions to Sustainability (7Qs) framework, an evaluation tool developed by the Mining, Minerals and Sustainable Development (MMSD) North America chapter, while the content analysis method was used to examine the interviews and FGDs of the study participants. Findings – The results show that the CSR programs of TFM SA do contribute to community development, as there have been notable changes in the communities' living conditions. Whether they have contributed to sustainable development, however, is not yet clear, as programs that enhance the capacity of communities and other stakeholders to sustain these projects beyond the implementation stage and the mine's operating lifetime still need to be considered and implemented. Originality/Value – In the DRC, there is a paucity of research studies that focus on impact assessment of CSR programs in general, and specifically of those of mining companies and their contribution to the sustainable development of local communities. Many of the available studies cover issues of minerals and conflict, or "conflict minerals" as they are mostly referred to. This study addresses this gap.
Abstract:
In this paper, a thermoeconomic analysis method based on the first and second laws of thermodynamics, applied to an evaporative cooling system coupled to an adsorption dehumidifier, is presented. The main objective is to use a method called exergetic manufacturing cost (EMC), applied to a system that operates under three different conditions, to minimize operating costs. The basic parameters are the RIP ratio (reactivation air/process air) and the reactivation air temperature. The results of this work show that the minimum reactivation temperature and the minimum RIP ratio correspond to the smallest EMC. This result can be corroborated through an energetic analysis: this case is also the one with the smallest energy loss.
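As background for any exergetic cost accounting, the sketch below computes the specific flow exergy of an air stream treated as an ideal gas; the state values are hypothetical and this is not the paper's EMC formulation, only the second-law quantity such methods build on.

```python
# A minimal sketch of the flow-exergy term underlying exergetic cost
# analyses: e = (h - h0) - T0*(s - s0), for air as an ideal gas.
# States are hypothetical.
import math

CP_AIR = 1.005   # kJ/(kg K), ideal-gas specific heat of air
R_AIR = 0.287    # kJ/(kg K), gas constant of air

def flow_exergy(T, P, T0=298.15, P0=101.325):
    """Specific flow exergy of air (kJ/kg) relative to a dead state."""
    dh = CP_AIR * (T - T0)
    ds = CP_AIR * math.log(T / T0) - R_AIR * math.log(P / P0)
    return dh - T0 * ds

# Hypothetical reactivation-air state: 80 C at atmospheric pressure
print(flow_exergy(T=353.15, P=101.325))
```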
Abstract:
This paper presents the Ergonomic Work Analysis (AET) method applied in a Brazilian dentist's office. Through the study, the constraints and the strategies for avoiding them were identified. It was found that dentists hardly ever use the positions most recommended by the International Organization for Standardization (ISO) and the Fédération Dentaire Internationale (FDI) for the patient and the dentist, supine and 9 o'clock respectively, due to the limited space and layout. Five types of treatment performed by the professional were studied. The frequency and duration of actions in these treatments were recorded, and the standard positions adopted were identified. The AET was found to be a very suitable method for grasping the dentist's activity and building a point of view of the profession, which is characterized as stressful, perfectionist, and restrictive. Time management is presented as an important strategy for controlling the tension arising from performing the treatments.