Abstract:
Introduction – Gated Single Photon Emission Computed Tomography (Gated-SPECT) studies are among the cardiac imaging techniques that have evolved most over recent decades. For the analysis of the acquired images, the use of quantification software increases the reproducibility and accuracy of the interpretations. The aim of this study is to assess, in Gated-SPECT studies, the intra- and inter-operator variability of quantitative myocardial function and perfusion parameters obtained with the Quantitative Gated SPECT (QGS) and Quantitative Perfusion SPECT (QPS) software. Material and methods – A non-probabilistic convenience sample of 52 patients was used; all had undergone myocardial Gated-SPECT studies for clinical reasons and were part of the database of the Xeleris processing workstation at ESTeSL. The fifty-two studies were divided into two distinct groups: Group I (GI), 17 patients with normal myocardial perfusion images; Group II (GII), 35 patients with perfusion defects on the Gated-SPECT images. All studies were processed 5 times by 4 independent operators (each with 3 years of experience in Nuclear Medicine departments and an average caseload of 15 Gated-SPECT examinations per week). Intra- and inter-operator variability was assessed with the Friedman statistical test, considering α=0.01. Results and discussion – For all parameters evaluated, the corresponding p-values showed no statistically significant differences (p>α). Thus, no significant intra- or inter-operator variability was found in the processing of the myocardial Gated-SPECT studies. Conclusion – The QGS and QPS software packages are reproducible in the quantification of the function and perfusion parameters evaluated, with no variability introduced by the operator.
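As a minimal illustration of the repeatability analysis described above (not the study's data), the Friedman test over repeated processing runs could be set up as follows; the LVEF values and noise model are hypothetical:

```python
# Hypothetical sketch: Friedman test for intra-operator variability, alpha = 0.01
# as in the study. Values below are illustrative, not study data.
import numpy as np
from scipy.stats import friedmanchisquare

rng = np.random.default_rng(0)
n_patients = 17                        # e.g. the size of Group I
true_lvef = rng.normal(60, 8, n_patients)

# Five processing runs of the same studies by one operator; each run adds
# small independent "processing" noise.
runs = [true_lvef + rng.normal(0, 1.0, n_patients) for _ in range(5)]

stat, p = friedmanchisquare(*runs)     # one argument per repeated measurement
alpha = 0.01
print(f"Friedman chi2 = {stat:.2f}, p = {p:.3f}")
print("significant intra-operator variability" if p < alpha
      else "no significant intra-operator variability (p > alpha)")
```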
Abstract:
Introduction – Single photon emission computed tomography synchronized with the electrocardiographic signal (Gated-SPECT) is essential for the combined assessment of myocardial perfusion and left ventricular (LV) function. Objective – To investigate the relationship between LV function and the myocardium/right lung (M/RL) and myocardium/left lung (M/LL) uptake indices (UI) in Gated-SPECT studies with 99mTc-Tetrofosmin. Methodology – A sample of 32 patients who underwent Gated-SPECT studies for clinical indications was subdivided into two groups: Group I (GI) – patients with a clinical history of acute myocardial infarction (AMI); Group II (GII) – patients with a clinical history of ischemia. For each patient, static thoraco-abdominal images and two myocardial Gated-SPECT studies (one-day stress/rest protocol) were acquired. On the static images, regions of interest (ROIs) were defined to calculate the UIs. In the Gated-SPECT studies, the Quantitative Gated SPECT/Quantitative Perfusion SPECT software was used to calculate the Left Ventricular Ejection Fraction (LVEF). Descriptive statistics were computed to characterize the sample. Spearman's test was applied to assess the correlation between LVEF and the UIs per patient group. The Wilcoxon test was used to compare LVEF at rest and under stress. Results – In the stress Gated-SPECT studies no statistically significant correlation was found between LVEF and the UIs for GI and GII; at rest there is a statistically significant positive correlation between LVEF and the UIs for GI, whereas for GII no correlation was found. When comparing the LVEF values under stress and at rest in the two groups, statistically significant differences were found, the LVEF under stress being
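A hedged sketch of the two statistical steps named above, using simulated values rather than the study's measurements (the uptake index range and LVEF model are assumptions for illustration only):

```python
# Illustrative sketch (not study data): Spearman correlation between LVEF and an
# uptake index, and Wilcoxon signed-rank comparison of stress vs. rest LVEF.
import numpy as np
from scipy.stats import spearmanr, wilcoxon

rng = np.random.default_rng(1)
n = 32
uptake_index = rng.uniform(2.0, 6.0, n)           # hypothetical M/RL uptake index
lvef_rest = 55 + 3 * uptake_index + rng.normal(0, 4, n)
lvef_stress = lvef_rest + rng.normal(2, 3, n)     # paired stress values

rho, p_rho = spearmanr(uptake_index, lvef_rest)
print(f"Spearman rho = {rho:.2f}, p = {p_rho:.3f}")

w, p_w = wilcoxon(lvef_stress, lvef_rest)         # paired, non-parametric
print(f"Wilcoxon W = {w:.1f}, p = {p_w:.3f}")
```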
Abstract:
The mechanisms of speech production are complex and have been attracting attention from researchers in both the medical and computer vision fields. In the speech production mechanism, the study of the articulators is a complex issue, since they have a high degree of freedom during this process, namely the tongue, which makes its control and observation difficult. In this work, the tongue's shape during the articulation of the oral vowels of European Portuguese is automatically characterized by applying statistical modeling to MR images. A point distribution model is built from a set of images collected during artificially sustained articulations of European Portuguese sounds, which can extract the main characteristics of the motion of the tongue. The model built in this work allows a clearer understanding of the dynamic speech events involved in sustained articulations. The tongue shape model can also be useful for speech rehabilitation purposes, specifically to recognize the compensatory movements of the articulators during speech production.
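A point distribution model is essentially a principal component analysis of aligned landmark coordinates; the minimal sketch below uses random landmarks as stand-ins for the tongue contours segmented from the MR images:

```python
# Minimal point distribution model sketch: PCA over aligned tongue contour
# landmarks. Landmarks here are random placeholders, not the MRI data.
import numpy as np

n_shapes, n_points = 20, 30
rng = np.random.default_rng(2)
# Each training shape: n_points (x, y) landmarks flattened into one row.
shapes = rng.normal(size=(n_shapes, 2 * n_points))

mean_shape = shapes.mean(axis=0)
X = shapes - mean_shape
# Eigen-decomposition of the landmark covariance matrix.
cov = X.T @ X / (n_shapes - 1)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Keep the modes explaining ~95% of the shape variance.
k = int(np.searchsorted(np.cumsum(eigvals) / eigvals.sum(), 0.95)) + 1
P = eigvecs[:, :k]

# Any plausible shape is approximated as x = mean_shape + P @ b, with each
# mode weight b_i usually constrained to about +/- 3 sqrt(eigval_i).
b = np.zeros(k)
b[0] = 3 * np.sqrt(eigvals[0])        # exaggerate the first mode of variation
new_shape = mean_shape + P @ b
print(k, "modes retained; generated shape vector:", new_shape.shape)
```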
Abstract:
Intensity Modulated Radiotherapy (IMRT) is a technique introduced to shape dose distributions more precisely to the tumour, providing higher dose escalation in the volume to be irradiated while decreasing the dose to the organs at risk, which consequently reduces treatment toxicity. This technique is widely used in prostate and head and neck (H&N) tumours. Given the complexity of this technique and the high doses it delivers, it is necessary to ensure a safe and secure administration of the treatment through the use of quality control programmes for IMRT. The purpose of this study was to evaluate statistically the quality control measurements made for IMRT plans of prostate and H&N patients before the beginning of treatment, analysing their variations, the percentage of rejected and repeated measurements, the averages, standard deviations and proportion relations.
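A tiny sketch of the kind of summary statistics listed above, using hypothetical per-field dose deviations and an assumed ±3% tolerance (neither value comes from the study):

```python
# Illustrative sketch (hypothetical numbers): summary statistics for pre-treatment
# IMRT QC measurements, e.g. percentage dose deviations per field, with a
# +/- 3% tolerance used to flag rejected measurements.
import numpy as np

rng = np.random.default_rng(3)
deviations = rng.normal(0.5, 1.5, 200)        # % deviation, measured vs. planned

tolerance = 3.0
rejected = np.abs(deviations) > tolerance
print(f"mean = {deviations.mean():.2f}%, std = {deviations.std(ddof=1):.2f}%")
print(f"rejected = {rejected.mean() * 100:.1f}% of measurements")
```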
Abstract:
This study aimed to characterize air pollution and the associated carcinogenic risks of polycyclic aromatic hydrocarbons (PAHs) at an urban site, to identify possible emission sources of PAHs using several statistical methodologies, and to analyze the influence of other air pollutants and meteorological variables on PAH concentrations. The air quality and meteorological data were collected in Oporto, the second largest city of Portugal. Eighteen PAHs (the 16 PAHs considered by the United States Environmental Protection Agency (USEPA) as priority pollutants, dibenzo[a,l]pyrene, and benzo[j]fluoranthene) were collected daily for 24 h in air (gas phase and particles) during 40 consecutive days in November and December 2008 by constant low-flow samplers, using polytetrafluoroethylene (PTFE) membrane filters for particulate (PM10- and PM2.5-bound) PAHs and pre-cleaned polyurethane foam plugs for gaseous compounds. The other monitored air pollutants were SO2, PM10, NO2, CO, and O3; the meteorological variables were temperature, relative humidity, wind speed, total precipitation, and solar radiation. Benzo[a]pyrene reached a mean concentration of 2.02 ng m−3, surpassing the EU annual limit value. The target carcinogenic risks were equal to the health-based guideline level set by USEPA (10−6) at the studied site, with the cancer risks of eight PAHs reaching levels of 9.98×10−7 in PM10 and 1.06×10−6 in air. The applied statistical methods, correlation matrix, cluster analysis, and principal component analysis, were in agreement in the grouping of the PAHs. The groups were formed according to their chemical structure (number of rings), phase distribution, and emission sources. PAH diagnostic ratios were also calculated to evaluate the main emission sources. Diesel vehicular emissions were the major source of PAHs at the studied site. Besides that source, emissions from residential heating and an oil refinery were identified as contributing to PAH levels in the area. Additionally, principal component regression indicated that SO2, NO2, PM10, CO, and solar radiation were positively correlated with PAH concentrations, while O3, temperature, relative humidity, and wind speed were negatively correlated.
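A hedged sketch of two of the source-apportionment tools named above, principal component analysis and a diagnostic ratio, run on simulated concentrations rather than the Oporto measurements (the species list, source profiles and the Flt/(Flt+Pyr) ratio are the only assumptions beyond the abstract):

```python
# Hypothetical sketch: grouping PAH species by PCA and computing a common
# diagnostic ratio. Concentrations are simulated, not the measured data.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
n_days = 40
species = ["Phe", "Flt", "Pyr", "BaA", "Chr", "BaP", "IcdP", "BghiP"]
# Simulated daily concentrations driven by two "sources" (e.g. traffic, heating).
source1 = rng.lognormal(0, 0.4, (n_days, 1))
source2 = rng.lognormal(0, 0.4, (n_days, 1))
profile1 = rng.uniform(0.2, 1.0, (1, len(species)))
profile2 = rng.uniform(0.2, 1.0, (1, len(species)))
conc = source1 @ profile1 + source2 @ profile2 + rng.normal(0, 0.05, (n_days, len(species)))

pca = PCA(n_components=2)
pca.fit(conc)
print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 2))
# Species loading strongly on the same component tend to share an emission source.
for name, loading in zip(species, pca.components_.T):
    print(f"{name}: PC1 = {loading[0]:+.2f}, PC2 = {loading[1]:+.2f}")

# Example diagnostic ratio: Flt/(Flt+Pyr) above ~0.5 is typically attributed
# to combustion (pyrogenic) rather than petrogenic sources.
flt, pyr = conc[:, species.index("Flt")], conc[:, species.index("Pyr")]
print("mean Flt/(Flt+Pyr) =", round(float((flt / (flt + pyr)).mean()), 2))
```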
Abstract:
Modern real-time systems, with a more flexible and adaptive nature, demand approaches for timeliness evaluation based on probabilistic measures of meeting deadlines. In this context, simulation can emerge as an adequate solution to understand and analyze the timing behaviour of actual systems. However, care must be taken with the obtained outputs, at the risk of producing results that lack credibility. It is particularly important to consider that we are more interested in values from the tail of a probability distribution (near worst-case probabilities) than in deriving confidence on mean values. We approach this subject by considering the random nature of simulation output data. We start by discussing well-known approaches for estimating distributions from simulation output, and the confidence which can be applied to their mean values. This is the basis for a discussion on the applicability of such approaches to derive confidence on the tail of distributions, where the worst case is expected to be.
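One standard, distribution-free way to attach confidence to a tail value rather than a mean is through order statistics; the sketch below applies it to simulated response times (the gamma distribution and sample size are assumptions, and it presumes approximately independent observations, which raw simulation output often is not):

```python
# Sketch: a distribution-free upper confidence bound on a near worst-case
# quantile (e.g. the 99.9th percentile of response times) from simulation
# output, based on order statistics rather than confidence on the mean.
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(5)
samples = np.sort(rng.gamma(shape=2.0, scale=1.0, size=20_000))  # i.i.d. response times

p, conf = 0.999, 0.99
n = len(samples)
# Smallest k such that P(Binomial(n, p) <= k - 1) >= conf; the k-th order
# statistic is then an upper confidence bound for the p-quantile at level conf.
k = int(np.searchsorted(binom.cdf(np.arange(n + 1), n, p), conf)) + 1
if k > n:
    raise ValueError("not enough samples for this quantile/confidence level")
print(f"point estimate of {p:.3%} quantile: {np.quantile(samples, p):.3f}")
print(f"{conf:.0%} upper confidence bound: {samples[k - 1]:.3f}")
```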
Abstract:
A number of characteristics are boosting the eagerness to extend Ethernet to also cover factory-floor distributed real-time applications. Full-duplex links, non-blocking and priority-based switching, and bandwidth availability, just to mention a few, are characteristics upon which that eagerness is building. But will Ethernet technologies really manage to replace traditional Fieldbus networks? Ethernet technology, by itself, does not include features above the lower layers of the OSI communication model. In the past few years, a considerable amount of work has been devoted to the timing analysis of Ethernet-based technologies. It happens, however, that the majority of those works are restricted to the analysis of subsets of the overall computing and communication system, thus not addressing timeliness at a holistic level. To this end, we are addressing a few inter-linked research topics with the purpose of setting a framework for the development of tools suitable to extract temporal properties of Commercial-Off-The-Shelf (COTS) Ethernet-based factory-floor distributed systems. This framework is being applied to a specific COTS technology, Ethernet/IP. In this paper, we reason about the modelling and simulation of Ethernet/IP-based systems, and about the use of statistical analysis techniques to provide usable results. Discrete event simulation models of a distributed system can be a powerful tool for the timeliness evaluation of the overall system, but particular care must be taken with the results provided by traditional statistical analysis techniques.
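As a toy illustration of how discrete event simulation output feeds this kind of tail-oriented analysis (this is not the paper's Ethernet/IP model; link rate, frame size and load are hypothetical), a single switch output port can be simulated as a FIFO queue:

```python
# Toy sketch: one full-duplex switch output port modelled as a FIFO queue,
# producing per-frame latencies whose tail can then be analysed statistically.
import numpy as np

rng = np.random.default_rng(6)
link_rate = 100e6                       # 100 Mbit/s (hypothetical)
frame_bits = 1000 * 8                   # fixed 1000-byte frames (hypothetical)
service = frame_bits / link_rate        # transmission time per frame
load = 0.7                              # offered load (hypothetical)
interarrival = rng.exponential(service / load, size=200_000)

arrivals = np.cumsum(interarrival)
latencies = np.empty_like(arrivals)
port_free_at = 0.0
for i, t in enumerate(arrivals):        # FIFO queueing at the output port
    start = max(t, port_free_at)
    port_free_at = start + service
    latencies[i] = port_free_at - t     # queueing + transmission delay

print(f"mean latency      : {latencies.mean() * 1e6:.1f} us")
print(f"99.99th percentile: {np.quantile(latencies, 0.9999) * 1e6:.1f} us")
```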
Abstract:
This paper proposes a stochastic mixed-integer linear approach to deal with a short-term unit commitment problem under uncertainty in a deregulated electricity market that includes day-ahead bidding and bilateral contracts. The proposed approach considers the typical operating constraints on the thermal units and a spinning reserve. The uncertainty is due to the electricity prices, which are modeled by a scenario set, keeping the computation acceptable. Moreover, emission allowances are included so that environmental constraints can be taken into account. A case study is presented to illustrate the usefulness of the proposed approach, and an assessment of the cost of the spinning reserve is obtained by comparing the situations with and without spinning reserve.
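A minimal sketch of the idea, not the paper's formulation: one thermal unit, a two-scenario price set, a spinning reserve requirement and an emission-allowance cap, written as a stochastic MILP with PuLP. Every number is hypothetical.

```python
import pulp

T = range(3)                                  # hours
S = range(2)                                  # price scenarios
prob = {0: 0.5, 1: 0.5}
price = {(0, 0): 42, (0, 1): 55, (0, 2): 61,  # EUR/MWh per (scenario, hour)
         (1, 0): 36, (1, 1): 48, (1, 2): 70}
pmin, pmax = 50, 200                          # MW
cost = 30                                     # marginal cost, EUR/MWh
emis = 0.8                                    # tCO2/MWh
allowance = 350                               # emission allowances, tCO2
reserve = 20                                  # spinning reserve, MW

m = pulp.LpProblem("stochastic_unit_commitment", pulp.LpMaximize)
u = pulp.LpVariable.dicts("u", T, cat="Binary")                        # on/off
p = pulp.LpVariable.dicts("p", [(s, t) for s in S for t in T], lowBound=0)

# Objective: expected profit over the price scenarios.
m += pulp.lpSum(prob[s] * (price[s, t] - cost) * p[s, t] for s in S for t in T)

for s in S:
    for t in T:
        m += p[s, t] >= pmin * u[t]                     # minimum stable output
        m += p[s, t] <= (pmax - reserve) * u[t]         # headroom kept as reserve
    m += pulp.lpSum(emis * p[s, t] for t in T) <= allowance    # emission cap

m.solve(pulp.PULP_CBC_CMD(msg=False))
print(pulp.LpStatus[m.status], "expected profit:", round(pulp.value(m.objective), 1))
for t in T:
    print("hour", t, "on" if u[t].value() > 0.5 else "off",
          [round(p[s, t].value(), 1) for s in S])
```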
Abstract:
Introduction – Cutaneous malignant melanoma (CMM) is considered one of the most lethal neoplasms, and its follow-up relies, in addition to clinical examination and tumour marker analysis, on several imaging methods, such as Positron Emission Tomography/Computed Tomography (PET/CT) with 18F-fluorodeoxyglucose (18F-FDG). The present study aims to assess the usefulness of PET/CT for evaluating the extent of disease and suspected recurrence of CMM, comparing its imaging findings with those described in CT studies. Methodology – Retrospective study of 62 PET/CT examinations performed in 50 patients diagnosed with CMM. One study with an equivocal result (pulmonary nodule) was excluded. Information on the results of the anatomopathological studies and imaging examinations was obtained from the clinical history and from the medical reports of the CT and PET/CT studies. A database was built from the collected data in Excel and a descriptive statistical analysis was performed. Results – Of the PET/CT studies analysed, 31 were considered true positives (TP), 28 true negatives (TN), one false positive (FP) and one false negative (FN). The sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV) and accuracy of PET/CT for staging and assessment of suspected recurrence of CMM are, respectively, 96.9%, 96.6%, 96.9%, 96.6% and 96.7%. Of the CT results included in the statistical analysis, 14 were TP, 12 TN, three FP and five FN. The sensitivity, specificity, PPV, NPV and accuracy of CT for staging and assessment of suspected recurrence of CMM are, respectively, 73.7%, 80.0%, 82.4%, 70.6% and 76.5%. Compared with CT, PET/CT led to a change in therapeutic management in 23% of the studies. Conclusion – PET/CT is a useful examination in the evaluation of CMM, showing higher diagnostic accuracy for staging and for the assessment of suspected recurrence of CMM than CT alone.
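The reported performance figures follow directly from the confusion-matrix counts given in the abstract, as the short check below shows:

```python
# Recomputing the PET/CT and CT performance figures from the TP/TN/FP/FN counts
# reported in the abstract.
def diagnostic_metrics(tp, tn, fp, fn):
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "PPV": tp / (tp + fp),
        "NPV": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
    }

for name, counts in {"PET/CT": (31, 28, 1, 1), "CT": (14, 12, 3, 5)}.items():
    m = diagnostic_metrics(*counts)
    print(name, {k: f"{100 * v:.1f}%" for k, v in m.items()})
# PET/CT -> 96.9%, 96.6%, 96.9%, 96.6%, 96.7%; CT -> 73.7%, 80.0%, 82.4%, 70.6%, 76.5%
```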
Abstract:
This Thesis describes the application of automatic learning methods for a) the classification of organic and metabolic reactions, and b) the mapping of Potential Energy Surfaces (PES). The classification of reactions was approached with two distinct methodologies: a representation of chemical reactions based on NMR data, and a representation of chemical reactions from the reaction equation based on the physico-chemical and topological features of chemical bonds. NMR-based classification of photochemical and enzymatic reactions. Photochemical and metabolic reactions were classified by Kohonen Self-Organizing Maps (Kohonen SOMs) and Random Forests (RFs) taking as input the difference between the 1H NMR spectra of the products and the reactants. Such a representation can be applied to the automatic analysis of changes in the 1H NMR spectrum of a mixture and their interpretation in terms of the chemical reactions taking place. Examples of possible applications are the monitoring of reaction processes, evaluation of the stability of chemicals, or even the interpretation of metabonomic data. A Kohonen SOM trained with a data set of metabolic reactions catalysed by transferases was able to correctly classify 75% of an independent test set in terms of the EC number subclass. Random Forests improved the correct predictions to 79%. With photochemical reactions classified into 7 groups, an independent test set was classified with 86-93% accuracy. The data set of photochemical reactions was also used to simulate mixtures with two reactions occurring simultaneously. Kohonen SOMs and Feed-Forward Neural Networks (FFNNs) were trained to classify the reactions occurring in a mixture based on the 1H NMR spectra of the products and reactants. Kohonen SOMs allowed the correct assignment of 53-63% of the mixtures (in a test set). Counter-Propagation Neural Networks (CPNNs) gave similar results. The use of supervised learning techniques improved the results: 77% of correct assignments when an ensemble of ten FFNNs was used, and 80% when Random Forests were used. This study was performed with NMR data simulated from the molecular structure by the SPINUS program. In the design of one test set, simulated data was combined with experimental data. The results support the proposal of linking databases of chemical reactions to experimental or simulated NMR data for automatic classification of reactions and mixtures of reactions. Genome-scale classification of enzymatic reactions from their reaction equation. The MOLMAP descriptor relies on a Kohonen SOM that defines types of bonds on the basis of their physico-chemical and topological properties. The MOLMAP descriptor of a molecule represents the types of bonds available in that molecule. The MOLMAP descriptor of a reaction is defined as the difference between the MOLMAPs of the products and the reactants, and numerically encodes the pattern of bonds that are broken, changed, and made during a chemical reaction. The automatic perception of chemical similarities between metabolic reactions is required for a variety of applications, ranging from the computer validation of classification systems and genome-scale reconstruction (or comparison) of metabolic pathways to the classification of enzymatic mechanisms.
Catalytic functions of proteins are generally described by the EC numbers that are simultaneously employed as identifiers of reactions, enzymes, and enzyme genes, thus linking metabolic and genomic information. Different methods should be available to automatically compare metabolic reactions and to automatically assign EC numbers to reactions not yet officially classified. In this study, the genome-scale data set of enzymatic reactions available in the KEGG database was encoded by the MOLMAP descriptors and was submitted to Kohonen SOMs to compare the resulting map with the official EC number classification, to explore the possibility of predicting EC numbers from the reaction equation, and to assess the internal consistency of the EC classification at the class level. A general agreement with the EC classification was observed, i.e. a relationship between the similarity of MOLMAPs and the similarity of EC numbers. At the same time, MOLMAPs were able to discriminate between EC sub-subclasses. EC numbers could be assigned at the class, subclass, and sub-subclass levels with accuracies up to 92%, 80%, and 70% for independent test sets. The correspondence between chemical similarity of metabolic reactions and their MOLMAP descriptors was applied to the identification of a number of reactions mapped into the same neuron but belonging to different EC classes, which demonstrated the ability of the MOLMAP/SOM approach to verify the internal consistency of classifications in databases of metabolic reactions. RFs were also used to assign the four levels of the EC hierarchy from the reaction equation. EC numbers were correctly assigned in 95%, 90%, 85% and 86% of the cases (for independent test sets) at the class, subclass, sub-subclass and full EC number level, respectively. Experiments for the classification of reactions from the main reactants and products were performed with RFs - EC numbers were assigned at the class, subclass and sub-subclass level with accuracies of 78%, 74% and 63%, respectively. In the course of the experiments with metabolic reactions we suggested that the MOLMAP/SOM concept could be extended to the representation of other levels of metabolic information, such as metabolic pathways. Following the MOLMAP idea, the pattern of neurons activated by the reactions of a metabolic pathway is a representation of the reactions involved in that pathway - a descriptor of the metabolic pathway. This reasoning enabled the comparison of different pathways, the automatic classification of pathways, and a classification of organisms based on their biochemical machinery. The three levels of classification (from bonds to metabolic pathways) made it possible to map and perceive chemical similarities between metabolic pathways, even for pathways of different types of metabolism and pathways that do not share similarities in terms of EC numbers. Mapping of PES by neural networks (NNs). In a first series of experiments, ensembles of Feed-Forward NNs (EnsFFNNs) and Associative Neural Networks (ASNNs) were trained to reproduce PES represented by the Lennard-Jones (LJ) analytical potential function. The accuracy of the method was assessed by comparing the results of molecular dynamics simulations (thermal, structural, and dynamic properties) obtained from the NN-PES and from the LJ function. The results indicated that for LJ-type potentials, NNs can be trained to generate accurate PES to be used in molecular simulations. EnsFFNNs and ASNNs gave better results than single FFNNs.
A remarkable ability of the NN models to interpolate between distant curves and accurately reproduce potentials to be used in molecular simulations is shown. The purpose of the first study was to systematically analyse the accuracy of different NNs. Our main motivation, however, is reflected in the next study: the mapping of multidimensional PES by NNs to simulate, by Molecular Dynamics or Monte Carlo, the adsorption and self-assembly of solvated organic molecules on noble-metal electrodes. Indeed, for such complex and heterogeneous systems the development of suitable analytical functions that fit quantum mechanical interaction energies is a non-trivial or even impossible task. The data consisted of energy values, from Density Functional Theory (DFT) calculations, at different distances, for several molecular orientations and three electrode adsorption sites. The results indicate that NNs require a data set large enough to cover well the diversity of possible interaction sites, distances, and orientations. NNs trained with such data sets can perform equally well or even better than analytical functions. Therefore, they can be used in molecular simulations, particularly for the ethanol/Au(111) interface, which is the case studied in the present Thesis. Once properly trained, the networks are able to produce, as output, any required number of energy points for accurate interpolations.
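A minimal sketch of the first PES experiment described above: fitting a feed-forward network to a Lennard-Jones potential and checking its interpolation. It uses sklearn's MLPRegressor, a plain FFNN, not the EnsFFNN/ASNN models of the Thesis, and the parameter values are illustrative:

```python
# Train a small feed-forward network to reproduce a Lennard-Jones potential and
# measure the interpolation error on points not seen during training.
import numpy as np
from sklearn.neural_network import MLPRegressor

eps, sigma = 1.0, 1.0
def lj(r):
    return 4 * eps * ((sigma / r) ** 12 - (sigma / r) ** 6)

rng = np.random.default_rng(7)
r_train = rng.uniform(0.9, 3.0, 400)
net = MLPRegressor(hidden_layer_sizes=(50, 50), max_iter=20000,
                   random_state=0, tol=1e-7)
net.fit(r_train.reshape(-1, 1), lj(r_train))

r_test = np.linspace(0.95, 2.9, 200)          # interpolation points
pred = net.predict(r_test.reshape(-1, 1))
rmse = np.sqrt(np.mean((pred - lj(r_test)) ** 2))
print(f"RMSE on interpolated LJ energies: {rmse:.4f} (in units of epsilon)")
```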
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa to obtain the degree of Master in Environmental Engineering
Abstract:
The present study aims to characterize ultrafine particles emitted during gas metal arc welding of mild steel and stainless steel, using different shielding gas mixtures, and to evaluate the effect of metal transfer modes, controlled by both processing parameters and shielding gas composition, on the quantity and morphology of the ultrafine particles. It was found that the amount of emitted ultrafine particles (measured by particle number and alveolar deposited surface area) is clearly dependent on the main welding parameters, namely the current intensity and the heat input of the welding process. The emission of airborne ultrafine particles increases with the current intensity, as does the fume formation rate. When comparing the shielding gas mixtures, higher emissions were observed for more oxidizing mixtures, that is, those with higher CO2 content, which means that these mixtures originate higher concentrations of ultrafine particles (measured as number of particles per cubic centimeter of air) and higher values of alveolar deposited surface area of particles, thus resulting in a more hazardous condition regarding welders' exposure.
Abstract:
The present study is focused on the characterization of ultrafine particles emitted in the welding of steel using mixtures of Ar+CO2, and intends to analyze which are the main process parameters that may influence the emission itself. It was found that the amount of emitted ultrafine particles (measured by particle number and alveolar deposited surface area) is clearly dependent on the distance to the welding front and also on the main welding parameters, namely the current intensity and the heat input of the welding process. The emission of airborne ultrafine particles seems to increase with the current intensity, as does the fume formation rate. When comparing the tested gas mixtures, higher emissions are observed for more oxidant mixtures, that is, mixtures with higher CO2 content, which result in higher arc stability. The latter mixtures originate higher concentrations of ultrafine particles (measured as number of particles per cm3 of air) and higher values of alveolar deposited surface area of particles, thus resulting in a more hazardous condition regarding workers' exposure.
Abstract:
OBJECTIVE To estimate the budget impact of incorporating positron emission tomography (PET) in the mediastinal and distant staging of non-small cell lung cancer. METHODS The estimates were calculated by the epidemiological method for the years 2014 to 2018. Nationwide data were used for the incidence; data on the distribution of the disease's prevalence and on the technologies' accuracy came from the literature; data on the costs involved were taken from a micro-costing study and from the Brazilian Unified Health System (SUS) database. Two strategies for using PET were analyzed: offering it to all newly diagnosed patients, and restricting it to patients with negative results in previous computed tomography (CT) exams. Univariate and extreme-scenario sensitivity analyses were conducted to evaluate the influence of the sources of uncertainty in the parameters used. RESULTS The incorporation of PET-CT in SUS would imply the need for additional resources of BRL 158.1 million (USD 98.2 million) for the restricted offer and BRL 202.7 million (USD 125.9 million) for the inclusive offer over five years, with a difference of BRL 44.6 million (USD 27.7 million) between the two offer strategies within that period. In absolute terms, the total budget impact of its incorporation in SUS, over five years, would be BRL 555 million (USD 345 million) and BRL 600 million (USD 372.8 million), respectively. The cost of the PET-CT procedure was the most influential parameter in the results. In the most optimistic scenario, the additional budget impact would be reduced to BRL 86.9 million (USD 54 million) and BRL 103.8 million (USD 64.5 million), considering PET-CT for negative CT and PET-CT for all, respectively. CONCLUSIONS The incorporation of PET in the clinical staging of non-small cell lung cancer seems to be financially feasible considering the high budget of the Brazilian Ministry of Health. The potential reduction in the number of unnecessary surgeries may allow the available resources to be allocated more efficiently.
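The epidemiological method behind these estimates reduces to multiplying eligible cases by unit costs over the time horizon; the hedged sketch below illustrates the structure of that calculation with entirely hypothetical parameter values, not the figures estimated in the study:

```python
# Hedged sketch of an epidemiological budget-impact calculation: eligible
# incident cases x unit cost, summed over a 5-year horizon and compared between
# the two offer strategies. All parameter values are hypothetical.
annual_incident_cases = 25_000        # new NSCLC diagnoses per year (hypothetical)
share_staged = 0.80                   # fraction reaching staging (hypothetical)
share_ct_negative = 0.60              # fraction with negative CT (hypothetical)
unit_cost_petct = 2_000.0             # BRL per PET-CT exam (hypothetical)
years = 5

def budget_impact(eligible_fraction):
    eligible_per_year = annual_incident_cases * share_staged * eligible_fraction
    return eligible_per_year * unit_cost_petct * years

offer_all = budget_impact(1.0)
offer_ct_negative = budget_impact(share_ct_negative)
print(f"offer to all        : BRL {offer_all / 1e6:.1f} million over {years} years")
print(f"offer after neg. CT : BRL {offer_ct_negative / 1e6:.1f} million")
print(f"difference          : BRL {(offer_all - offer_ct_negative) / 1e6:.1f} million")
```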