Abstract:
Introduction: A major focus of the data mining process - especially machine learning research - is to automatically learn to recognize complex patterns and to support appropriate decisions based strictly on the acquired data. Since imaging techniques such as MPI (Myocardial Perfusion Imaging) in Nuclear Cardiology can occupy a large part of the daily workflow and generate gigabytes of data, computerized analysis may offer advantages over human analysis: shorter time, homogeneity and consistency, automatic recording of analysis results, relatively low cost, etc. Objectives: The aim of this study was to evaluate the efficacy of this methodology in the assessment of MPI stress studies and in the decision on whether or not to continue the evaluation of each patient. The objective pursued was to automatically classify each patient test into one of three groups: “Positive”, “Negative” and “Indeterminate”. “Positive” patients would proceed directly to the rest part of the exam, “Negative” patients would be directly exempted from continuation, and only the “Indeterminate” group would require the clinician's analysis, thus saving clinician effort, increasing workflow fluidity at the technologist's level and probably sparing patients' time. Methods: The WEKA v3.6.2 open-source software was used to perform a comparative analysis of three WEKA algorithms (“OneR”, “J48” and “Naïve Bayes”) in a retrospective study of the “SPECT Heart Dataset”, available at the University of California, Irvine Machine Learning Repository, using the corresponding clinical results, signed off by expert nuclear cardiologists, as the reference. For evaluation purposes, criteria such as “Precision”, “Incorrectly Classified Instances” and “Receiver Operating Characteristic (ROC) Areas” were considered.
Results: The interpretation of the data suggests that the Naïve Bayes algorithm has the best performance of the three selected algorithms. Conclusions: It is believed - and apparently supported by the findings - that machine learning algorithms could significantly assist, at an intermediate level, in the analysis of scintigraphic data obtained in MPI, namely after the stress acquisition, eventually increasing the efficiency of the entire system and potentially easing the roles of both technologists and nuclear cardiologists. In the continuation of this study, it is planned to use more patient information and to significantly increase the population under study, in order to improve system accuracy.
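As a rough illustration (not the authors' code), the simplest of the three compared algorithms, OneR, can be sketched in a few lines of Python: it builds one candidate rule per attribute, mapping each attribute value to its majority class, and keeps the attribute whose rule makes the fewest training errors. The toy data below merely mimics the binary features of the SPECT Heart Dataset; it is not study data.

```python
from collections import Counter

def one_r(X, y):
    """Return (best_attribute_index, {attribute_value: predicted_class})."""
    best = None
    for a in range(len(X[0])):
        # majority class for each value of attribute a
        counts = {}
        for row, label in zip(X, y):
            counts.setdefault(row[a], Counter())[label] += 1
        rule = {v: c.most_common(1)[0][0] for v, c in counts.items()}
        errors = sum(rule[row[a]] != label for row, label in zip(X, y))
        if best is None or errors < best[0]:
            best = (errors, a, rule)
    return best[1], best[2]

# toy binary data in the spirit of the SPECT features (illustrative only)
X = [[1, 0, 1], [1, 1, 0], [0, 0, 1], [0, 1, 0], [1, 0, 0]]
y = ["pos", "pos", "neg", "neg", "pos"]
attr, rule = one_r(X, y)
print(attr, rule)  # → 0 {1: 'pos', 0: 'neg'}  (attribute 0 separates the toy data)
```

J48 (a C4.5-style decision tree) and Naïve Bayes then generalize this idea from one attribute to combinations of all attributes.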
Abstract:
Introduction: Paper and thin-layer chromatography methods are frequently used in classic Nuclear Medicine for the determination of radiochemical purity (RCP) of radiopharmaceutical preparations. An aliquot of the radiopharmaceutical to be tested is spotted at the origin of a chromatographic strip (stationary phase), which in turn is placed in a chromatographic chamber in order to separate and quantify the radiochemical species present in the preparation. There are several methods for RCP measurement, based on the use of equipment such as dose calibrators, well scintillation counters, radiochromatographic scanners and gamma cameras. The purpose of this study was to compare these quantification methods for the determination of RCP. Material and Methods: 99mTc-Tetrofosmin and 99mTc-HDP were the radiopharmaceuticals chosen as the basis for this study. For the determination of RCP of 99mTc-Tetrofosmin we used ITLC-SG (2.5 x 10 cm) and 2-butanone (99mTc-tetrofosmin Rf = 0.55, 99mTcO4- Rf = 1.0, other labeled impurities 99mTc-RH Rf = 0.0). For the determination of RCP of 99mTc-HDP, Whatman 31ET and acetone were used (99mTc-HDP Rf = 0.0, 99mTcO4- Rf = 1.0, other labeled impurities Rf = 0.0). After the development of the solvent front, the strips were allowed to dry and then imaged on the gamma camera (256x256 matrix; zoom 2; LEHR parallel-hole collimator; 5-minute image) and on the radiochromatogram scanner. The strips were then cut at Rf 0.8 in the case of 99mTc-tetrofosmin and at Rf 0.5 in the case of 99mTc-HDP. The resulting pieces were crushed in an assay tube (to minimize the effect of counting geometry) and counted in the dose calibrator and in the well scintillation counter (for 1 minute). The RCP was calculated using the formula: % 99mTc-Complex = [(99mTc-Complex) / (Total amount of 99mTc-labeled species)] x 100. Statistical analysis was done using a hypothesis test for the difference between means of independent samples.
Results: The gamma camera based method demonstrated higher operator-dependency (especially concerning the drawing of the ROIs), and the measurements obtained using the dose calibrator are very sensitive to the amount of activity spotted on the chromatographic strip, so a minimum activity of 3.7 MBq is essential to minimize quantification errors. The radiochromatographic scanner and the well scintillation counter showed concordant results and demonstrated the highest level of precision. Conclusions: The methods based on radiochromatographic scanners and well scintillation counters proved to be the most accurate and least operator-dependent.
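The RCP formula quoted in the abstract is straightforward to apply to the counts (or activities) measured for the strip pieces. The sketch below is a hypothetical illustration with made-up counter readings, not data from the study:

```python
def radiochemical_purity(complex_counts, impurity_counts):
    """% 99mTc-Complex = (99mTc-Complex / total 99mTc-labeled species) x 100.

    complex_counts: counts (or activity) of the piece containing the complex.
    impurity_counts: list of counts for the pieces containing the impurities
    (e.g. free 99mTcO4- and reduced-hydrolyzed 99mTc-RH).
    """
    total = complex_counts + sum(impurity_counts)
    return 100.0 * complex_counts / total

# hypothetical well-counter readings for the two strip pieces after cutting:
print(radiochemical_purity(96500, [2100, 1400]))  # → 96.5
```

The same function applies regardless of which instrument produced the readings, which is what makes the four quantification methods directly comparable.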
Abstract:
Master's Dissertation in Social Gerontology
Abstract:
The evolution of hybrid polyploid vertebrates, their viability and their perpetuation over evolutionary time have always been questions of great interest. However, little is known about the impact of hybridization and polyploidization on the regulatory networks that guarantee the appropriate quantitative and qualitative gene expression programme. The Squalius alburnoides complex of hybrid fish is an attractive system to address these questions, as it includes a wide variety of diploid and polyploid forms, and intricate systems of genetic exchange. Through the study of genome-specific allele expression of seven housekeeping and tissue-specific genes, we found that a gene copy silencing mechanism of dosage compensation exists throughout the distribution range of the complex. Here we show that the allele-specific patterns of silencing vary within the complex, according to the geographical origin and the type of genome involved in the hybridization process. In southern populations, triploids of S. alburnoides show an overall tendency for silencing the allele from the minority genome, while northern population polyploids exhibit preferential biallelic gene expression patterns, irrespective of genomic composition. The present findings further suggest that gene copy silencing and variable expression of specific allele combinations may be important processes in vertebrate polyploid evolution.
Abstract:
This article is an introduction to the theory of the deconstructive paradigm of cooperative learning. Hundreds of studies provide evidence that cooperative learning structures and processes increase academic achievement, strengthen lifelong learning competences and develop each student's social and personal skills more effectively and fairly than traditional learning structures in schools. Facing the challenges of our educational systems, it would be interesting to elaborate the theoretical framework of the cooperative learning discourse of the last 40 years from a practical standpoint within its theoretical and methodological context. In recent decades, the cooperative discourse has elaborated the practical and theoretical elements of cooperative learning structures and processes. We would like to summarize these elements in order to understand what kind of structural changes can make real differences in teaching and learning practice. The basic principles of cooperative structures, cooperative roles and cooperative attitudes are the main elements we briefly describe here, so as to create a framework for the theoretical and practical understanding of how the elements of cooperative learning can be introduced into our classroom practice. In my view, this complex theory of cooperative learning can be understood as a deconstructive paradigm that provides pragmatic answers to the questions of our everyday educational practice, from the classroom level to the level of the educational system, focusing on dismantling hierarchical and anti-democratic learning structures while creating cooperative structures at the same time.
Abstract:
This paper presents a contrastive approach to three different ways of building concepts, after showing the similar syntactic possibilities that coexist in terms. From the semantic point of view, however, each language family distributes meaning differently. The most important point we try to show is that the differences found in the psychological process of communicating concepts should guide the translator and the terminologist in target-text production and in the terminology planning process. Differences between languages in the information transmission process are due to the different roles played by different types of knowledge. We distinguish here analytic-descriptive knowledge and analogical knowledge, among others. We also state that neither is inherently best for determining the correctness of a term; rather, adequacy criteria must guide the selection process. This success in concept building, or term building, is important when looking at the linguistic map of the information society.
Abstract:
3rd SMTDA Conference Proceedings, 11-14 June 2014, Lisbon, Portugal.
Abstract:
The handling of waste and compost that occurs frequently in composting plants (compost turning, shredding and screening) has been shown to release dust, airborne microorganisms and their compounds into the air. Thermophilic fungi, such as A. fumigatus, have been reported, and this kind of contamination in composting facilities has been associated with increased respiratory symptoms among compost workers. This study intended to characterize fungal contamination in a fully indoor composting plant located in Portugal. Besides conventional methods, molecular biology was also applied to overcome their potential limitations.
Abstract:
The regulatory mechanisms by which hydrogen peroxide (H2O2) modulates the activity of transcription factors in bacteria (OxyR and PerR), lower eukaryotes (Yap1, Maf1, Hsf1 and Msn2/4) and mammalian cells (AP-1, NRF2, CREB, HSF1, HIF-1, TP53, NF-κB, NOTCH, SP1 and SCREB-1) are reviewed. The complexity of regulatory networks increases throughout the phylogenetic tree, reaching a high level of complexity in mammals. Multiple H2O2 sensors and pathways are triggered, converging on the regulation of transcription factors at several levels: (i) synthesis of the transcription factor, by upregulating transcription or by increasing both mRNA stability and translation; (ii) stability of the transcription factor, by decreasing its association with the ubiquitin E3 ligase complex or by inhibiting this complex; (iii) cytoplasm-nuclear traffic, by exposing/masking nuclear localization signals, or by releasing the transcription factor from partners or from membrane anchors; and (iv) DNA binding and nuclear transactivation, by modulating transcription factor affinity towards DNA, co-activators or repressors, and by targeting specific regions of chromatin to activate individual genes. We also discuss how the biological specificity of H2O2 results from diverse thiol protein sensors, whose sulfhydryl groups differ in reactivity towards H2O2 and are therefore activated by different concentrations and exposure times. The specific regulation of local H2O2 concentrations is also crucial, and results from localized H2O2 production and removal controlled by signals. Finally, we formulate equations to extract, from typical experiments, quantitative data concerning H2O2 reactivity with sensor molecules. Rate constants of 140 M⁻¹ s⁻¹ and ≥ 1.3 × 10³ M⁻¹ s⁻¹ were estimated, respectively, for the reaction of H2O2 with KEAP1 and with an unknown target that mediates NRF2 protein synthesis.
In conclusion, the multitude of H2O2 targets and mechanisms provides an opportunity for highly specific effects on gene regulation that depend on the cell type and on signals received from the cellular microenvironment.
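Rate constants like those quoted above let one estimate how quickly a sensor responds to a given H2O2 level: under pseudo-first-order conditions (H2O2 in excess and roughly constant), the sensor half-life is t½ = ln 2 / (k[H2O2]). The sketch below applies this to the KEAP1 rate constant from the abstract; the 1 µM H2O2 concentration is an illustrative assumption, not a value from the text.

```python
import math

def sensor_half_life(k, h2o2_conc):
    """Pseudo-first-order half-life (s) of a thiol sensor reacting with H2O2.

    k: second-order rate constant (M^-1 s^-1)
    h2o2_conc: H2O2 concentration (M), assumed constant (in excess)
    """
    return math.log(2) / (k * h2o2_conc)

# KEAP1 (k ≈ 140 M^-1 s^-1, from the abstract) at an assumed 1 µM H2O2:
print(sensor_half_life(140, 1e-6))  # ≈ 4951 s, i.e. on the order of an hour
```

The same arithmetic shows why targets with larger rate constants (such as the ≥ 1.3 × 10³ M⁻¹ s⁻¹ NRF2-synthesis target) respond roughly an order of magnitude faster at the same H2O2 concentration.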
Abstract:
Master's Final Project submitted for the degree of Master in Chemical Engineering
Abstract:
Finding the structure of a confined liquid crystal is a difficult task since both the density and order parameter profiles are nonuniform. Starting from a microscopic model and density-functional theory, one has to either (i) solve a nonlinear, integral Euler-Lagrange equation, or (ii) perform a direct multidimensional free energy minimization. The traditional implementations of both approaches are computationally expensive and plagued with convergence problems. Here, as an alternative, we introduce an unsupervised variant of the multilayer perceptron (MLP) artificial neural network for minimizing the free energy of a fluid of hard nonspherical particles confined between planar substrates of variable penetrability. We then test our algorithm by comparing its results for the structure (density-orientation profiles) and equilibrium free energy with those obtained by standard iterative solution of the Euler-Lagrange equations and with Monte Carlo simulation results. Very good agreement is found, and the MLP method proves competitively fast, flexible, and refinable. Furthermore, it can be readily generalized to the richer patterned-substrate geometries that are now experimentally realizable but very problematic for conventional theoretical treatments.
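A minimal sketch of the idea (not the paper's model or code): parametrize the density profile with a small MLP whose weights are adjusted to lower a discretized free-energy functional directly, with no Euler-Lagrange equation in sight. Here the functional is a toy ideal-gas one in an external potential V(z), whose exact minimizer is ρ(z) = exp(−V(z)); the paper's hard-particle functional and coupled density-orientation profiles are far richer, and a practical implementation would use analytic gradients rather than finite differences.

```python
import math, random

random.seed(0)
Z = [i / 20 for i in range(21)]             # grid on the slab [0, 1]
V = [4 * (z - 0.5) ** 2 for z in Z]         # toy confining potential

H = 6                                       # hidden units
w = [random.uniform(-0.1, 0.1) for _ in range(3 * H + 1)]

def rho(z, w):
    """One-hidden-layer tanh MLP; exp(...) keeps the density positive."""
    s = w[-1]
    for j in range(H):
        s += w[2 * H + j] * math.tanh(w[j] * z + w[H + j])
    return math.exp(s)

def free_energy(w):
    """Discretized ideal-gas functional: sum of [rho(ln rho - 1) + V rho] dz."""
    dz = Z[1] - Z[0]
    return sum((rho(z, w) * (math.log(rho(z, w)) - 1) + V[i] * rho(z, w)) * dz
               for i, z in enumerate(Z))

f_init = free_energy(w)
for step in range(500):                     # finite-difference gradient descent
    f0 = free_energy(w)
    grad = []
    for k in range(len(w)):
        w[k] += 1e-5
        grad.append((free_energy(w) - f0) / 1e-5)
        w[k] -= 1e-5
    w = [wk - 0.05 * g for wk, g in zip(w, grad)]
f_final = free_energy(w)

# the exact minimizer is rho(z) = exp(-V(z)), so rho(0.5) should approach 1
print(f_init, "->", f_final, "rho(0.5) =", round(rho(0.5, w), 3))
```

The appeal of the approach, as the abstract notes, is that the same minimization loop works unchanged for harder geometries where solving the Euler-Lagrange equations iteratively becomes unstable.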