43 results for Experimental algorithms


Relevance:

20.00%

Abstract:

This work investigates the impact of treating breast cancer using different radiation therapy (RT) techniques – forwardly-planned intensity-modulated RT (f-IMRT), inversely-planned IMRT and dynamic conformal arc RT (DCART) – and their effects on whole-breast irradiation and on the undesirable irradiation of the surrounding healthy tissues. Two algorithms of the iPlan BrainLAB treatment planning system were compared: Pencil Beam Convolution (PBC) and commercial Monte Carlo (iMC). Seven left-sided breast cancer patients who had undergone breast-conserving surgery were enrolled in the study. For each patient, four RT techniques were applied: f-IMRT, IMRT using 2 fields and 5 fields (IMRT2 and IMRT5, respectively) and DCART. The dose distributions in the planning target volume (PTV) and the doses to the organs at risk (OAR) were compared by analyzing dose–volume histograms; further statistical analysis was performed using IBM SPSS v20 software. For PBC, all techniques provided adequate coverage of the PTV. However, statistically significant dose differences were observed between the techniques in the PTV, in the OAR and also in the pattern of dose distribution spreading into normal tissues. IMRT5 and DCART spread low doses over greater volumes of normal tissue (right breast, right lung and heart) than the tangential techniques. However, IMRT5 plans improved the dose distribution in the PTV, exhibiting better conformity and homogeneity in the target and reduced high-dose percentages in the ipsilateral OAR. DCART presented no advantages over the other techniques investigated. Differences were also found between the calculation algorithms: PBC estimated higher doses for the PTV, ipsilateral lung and heart than the iMC algorithm predicted.
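
The dose comparison above reduces to extracting scalar metrics from dose–volume histograms. A minimal sketch of that kind of DVH metric computation (not the authors' code; the dose arrays and metric choices below are hypothetical illustrations):

```python
import numpy as np

def dvh_metrics(dose_in_structure, v_gy=20.0, d_pct=95.0):
    """Simple DVH metrics for the voxel doses (Gy) inside one
    structure (e.g. the PTV or an organ at risk)."""
    dose = np.asarray(dose_in_structure, dtype=float)
    # VxGy: percentage of the structure volume receiving >= x Gy.
    v = float((dose >= v_gy).mean()) * 100.0
    # Dx%: minimum dose received by the hottest x% of the volume,
    # i.e. the (100 - x)-th percentile of the voxel doses.
    d = float(np.percentile(dose, 100.0 - d_pct))
    return v, d

# Hypothetical plans: a homogeneous and a less homogeneous dose.
rng = np.random.default_rng(0)
print(dvh_metrics(rng.normal(50, 2, 10_000)))   # tight distribution
print(dvh_metrics(rng.normal(48, 6, 10_000)))   # wider distribution
```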

Relevance:

20.00%

Abstract:

Dissertation of a scientific nature submitted to obtain the degree of Master in Civil Engineering, in the specialization area of Structures.

Relevance:

20.00%

Abstract:

Dissertation submitted to obtain the degree of Master in Civil Engineering, in the specialization area of Structures.

Relevance:

20.00%

Abstract:

Final Master's project submitted to obtain the degree of Master in Chemical and Biological Engineering.

Relevance:

20.00%

Abstract:

The most general Two Higgs Doublet Model potential without explicit CP violation depends on 10 real independent parameters. Excluding spontaneous CP violation results in two 7-parameter models. Although both models give rise to 5 scalar particles and 2 mixing angles, the resulting phenomenology of the scalar sectors is different. If flavour-changing neutral currents at tree level are to be avoided, there are, in both cases, four alternative ways of introducing the fermion couplings. In one of these models the mixing angle of the CP-even sector can be chosen in such a way that the fermion couplings to the lightest scalar Higgs boson vanish. At the same time, it is possible to suppress the fermion couplings to the charged and pseudo-scalar Higgs bosons by appropriately choosing the mixing angle of the CP-odd sector. We investigate the phenomenology of both models in the fermiophobic limit and present the different branching ratios for the decays of the scalar particles. We use the present experimental results from the LEP collider to constrain the models.
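
For reference, the 10-parameter counting quoted above matches the standard form of the 2HDM potential (a textbook parameterization, not necessarily the authors' exact notation):

```latex
\begin{align*}
V(\Phi_1,\Phi_2) ={}& m_{11}^2\,\Phi_1^\dagger\Phi_1 + m_{22}^2\,\Phi_2^\dagger\Phi_2
  - \left[ m_{12}^2\,\Phi_1^\dagger\Phi_2 + \mathrm{h.c.} \right] \\
 &+ \tfrac{\lambda_1}{2}\bigl(\Phi_1^\dagger\Phi_1\bigr)^2
  + \tfrac{\lambda_2}{2}\bigl(\Phi_2^\dagger\Phi_2\bigr)^2
  + \lambda_3\bigl(\Phi_1^\dagger\Phi_1\bigr)\bigl(\Phi_2^\dagger\Phi_2\bigr)
  + \lambda_4\bigl(\Phi_1^\dagger\Phi_2\bigr)\bigl(\Phi_2^\dagger\Phi_1\bigr) \\
 &+ \left\{ \tfrac{\lambda_5}{2}\bigl(\Phi_1^\dagger\Phi_2\bigr)^2
  + \lambda_6\bigl(\Phi_1^\dagger\Phi_1\bigr)\bigl(\Phi_1^\dagger\Phi_2\bigr)
  + \lambda_7\bigl(\Phi_2^\dagger\Phi_2\bigr)\bigl(\Phi_1^\dagger\Phi_2\bigr)
  + \mathrm{h.c.} \right\}
\end{align*}
```

With all couplings taken real (no explicit CP violation), the free parameters are m11², m22², m12² and λ1 through λ7: 10 real parameters in total.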

Relevance:

20.00%

Abstract:

This work presents a laboratory study whose purpose was to assess the loss of properties of wood from old buildings when degraded by woodworm (wood-boring beetles). Old pine and poplar wood, aged between 100 and 200 years, was studied. As an initial approach, the characteristics of the wood are presented, along with those of the Pombaline and 'gaioleiro' buildings in which it was widely used, not only as a finishing element but mainly as a structural element. The various factors that lead to wood degradation are presented, as well as some methods for evaluating and diagnosing the conservation state of the wooden elements found in buildings. A study of the large woodworm and the small woodworm is also developed, covering their life cycle and the way they degrade wood, serving as a basis for the laboratory study. In the main part of the work, the degradation state of old-wood specimens measuring 30 x 30 x 90 mm was evaluated and then correlated with their compressive strength, modulus of elasticity and strain in the plastic phase. The aim is thus to study how different states of woodworm degradation influence the mechanical characteristics of wood members.

Relevance:

20.00%

Abstract:

Master's degree in Nuclear Medicine - Specialization branch: Positron Emission Tomography.

Relevance:

20.00%

Abstract:

Dissertation presented to the Escola Superior de Comunicação Social in partial fulfilment of the requirements for the master's degree in Audiovisual and Multimedia.

Relevance:

20.00%

Abstract:

This paper proposes an efficient scalable Residue Number System (RNS) architecture supporting moduli sets with an arbitrary number of channels, making it possible to achieve a larger dynamic range and a higher level of parallelism. The proposed architecture supports forward and reverse RNS conversion by reusing the arithmetic channel units. The arithmetic operations supported at the channel level include addition, subtraction, and multiplication with accumulation capability. For the reverse conversion, two algorithms are considered, one based on the Chinese Remainder Theorem and the other on Mixed-Radix Conversion, leading to implementations optimized for delay and for required circuit area. With the proposed architecture, a complete and compact RNS platform is achieved. Experimental results suggest gains of 17% in the delay of the arithmetic operations, with an area reduction of 23% relative to the RNS state of the art. When compared with a binary system, the proposed architecture performs the same computation 20 times faster while using only 10% of the circuit area resources.
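
As a software analogue of the conversions the architecture implements in hardware, a minimal sketch of forward conversion, channel-wise arithmetic and CRT-based reverse conversion (the moduli set below is a common {2^n-1, 2^n, 2^n+1} example, not necessarily the one used in the paper):

```python
from math import prod

MODULI = (7, 8, 9)  # pairwise coprime: {2^n - 1, 2^n, 2^n + 1} for n = 3

def to_rns(x, moduli=MODULI):
    """Forward conversion: binary integer -> residue channels."""
    return tuple(x % m for m in moduli)

def from_rns(residues, moduli=MODULI):
    """Reverse conversion via the Chinese Remainder Theorem:
    X = sum_i r_i * M_i * (M_i^-1 mod m_i)  (mod M), with M_i = M / m_i."""
    M = prod(moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)  # modular inverse (Python >= 3.8)
    return x % M

# Channel-wise arithmetic: each channel operates independently mod m_i,
# which is the source of the parallelism mentioned above.
a, b = to_rns(123), to_rns(321)
c = tuple((x * y) % m for x, y, m in zip(a, b, MODULI))
assert from_rns(c) == (123 * 321) % prod(MODULI)
```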

Relevance:

20.00%

Abstract:

In global scientific experiments with collaborative scenarios involving multinational teams, there are major challenges related to data access; in particular, data movements to other regions or clouds are precluded by constraints on latency, costs, data privacy and data ownership. Furthermore, each site processes local data sets using specialized algorithms and produces intermediate results that are helpful as inputs to applications running on remote sites. This paper shows how to model such collaborative scenarios as a scientific workflow implemented with AWARD (Autonomic Workflow Activities Reconfigurable and Dynamic), a decentralized framework offering a feasible solution to run the workflow activities on distributed data centers in different regions without the need for large data movements. The AWARD workflow activities are independently monitored and dynamically reconfigured and steered by different users, namely by hot-swapping the algorithms to enhance the computation results or by changing the workflow structure to support feedback dependencies, where an activity receives feedback output from a successor activity. A real implementation of one practical scenario and its execution on multiple data centers of the Amazon Cloud is presented, including experimental results with steering by multiple users.
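
AWARD's actual interface is not shown here, but the hot-swapping idea can be sketched generically (hypothetical names throughout, not the framework's API): an activity looks up its algorithm through a mutable registry on every invocation, so a steering user can replace it while the workflow keeps running.

```python
import threading

class AlgorithmRegistry:
    """Hypothetical sketch of hot-swappable workflow activity algorithms."""
    def __init__(self):
        self._lock = threading.Lock()
        self._algorithms = {}

    def register(self, activity, fn):
        """Install or replace the algorithm bound to an activity name."""
        with self._lock:
            self._algorithms[activity] = fn

    def run(self, activity, data):
        """Execute whichever algorithm is currently registered."""
        with self._lock:
            fn = self._algorithms[activity]
        return fn(data)

registry = AlgorithmRegistry()
registry.register("filter", lambda xs: [x for x in xs if x > 0])
print(registry.run("filter", [-1, 2, 3]))   # [2, 3]
# A steering user swaps in an enhanced algorithm at runtime:
registry.register("filter", lambda xs: sorted(x for x in xs if x > 0))
print(registry.run("filter", [3, -1, 2]))   # [2, 3]
```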

Relevance:

20.00%

Abstract:

Applications involving biosignals, such as Electrocardiography (ECG), are becoming more pervasive with the extension towards non-intrusive scenarios, targeting ambulatory healthcare monitoring, emotion assessment, and many other uses. In this study we introduce a new type of silver/silver chloride (Ag/AgCl) electrode based on a paper substrate and produced using an inkjet printing technique. This type of electrode can broaden the potential applications of biosignal acquisition technologies for everyday use, given the several advantages, such as cost reduction and easier recycling, that result from the approach explored in our work. We performed a comparison study to assess the quality of this new electrode type, in which ECG data was collected with three types of Ag/AgCl electrodes: i) gelled; ii) dry; iii) paper-based inkjet-printed. We also compared the performance of each electrode when signals were acquired using a professional-grade gold-standard device and a low-cost platform. Experimental results showed that data acquired using our proposed inkjet-printed electrode is highly correlated with data obtained through conventional electrodes. Moreover, the electrodes are robust to both high-end and low-end data acquisition devices.
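
The "highly correlated" claim refers to comparing time-aligned acquisitions from the two electrode types; a minimal sketch of such a comparison (hypothetical surrogate data, the study's actual processing pipeline is not described in the abstract):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation between two equally sampled signals."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return float(np.corrcoef(x, y)[0, 1])

# Surrogate signals: a "conventional" ECG-like trace, and the same
# trace plus noise standing in for the paper-based electrode.
t = np.linspace(0, 10, 5000)
conventional = np.sin(2 * np.pi * 1.2 * t) ** 16   # crude R-peak-like train
paper_based = conventional + 0.05 * np.random.default_rng(1).normal(size=t.size)
print("r =", pearson_r(conventional, paper_based)) # close to 1.0
```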

Relevance:

20.00%

Abstract:

Real structures can be thought of as assemblies of components, such as plates, shells and beams. This latter type of component is very commonly found in structures like frames, which can involve a significant degree of complexity, or as a reinforcement element for plates or shells. To obtain the desired mechanical behavior of these components, or to improve their operating conditions when rehabilitating structures, one of the possible parameters to consider, when feasible, is the location of the supports. In the present work, a beam-type structure is considered and, for a set of cases concerning different numbers and types of supports as well as different load cases, the authors optimize the location of the supports in order to minimize the maximum transverse deflection. The optimization processes are carried out using genetic algorithms. The results obtained clearly show the good performance of the proposed approach.
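
A toy version of the optimization described (a hedged sketch, not the authors' model: the chromosome encodes a single normalized support position, and the deflection evaluation is a smooth surrogate standing in for a real structural analysis):

```python
import random

def max_deflection(support_pos):
    """Stand-in for a structural analysis call (e.g. an FE beam model).
    Hypothetical surrogate with a minimum near x/L = 0.55; in the real
    study this would return the beam's maximum transverse deflection."""
    return (support_pos - 0.55) ** 2 + 0.01

def genetic_search(fitness, pop_size=30, generations=60,
                   mutation=0.1, seed=42):
    rng = random.Random(seed)
    pop = [rng.random() for _ in range(pop_size)]       # positions in [0, 1]
    for _ in range(generations):
        pop.sort(key=fitness)                           # minimize deflection
        parents = pop[: pop_size // 2]                  # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = 0.5 * (a + b)                       # arithmetic crossover
            child += rng.gauss(0.0, mutation)           # Gaussian mutation
            children.append(min(max(child, 0.0), 1.0))  # keep within the span
        pop = parents + children
    return min(pop, key=fitness)

print("best support position (x/L):", round(genetic_search(max_deflection), 3))
```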

Relevance:

20.00%

Abstract:

Introduction: Pressure ulcers are a high-cost, high-volume issue for health and medical care providers, affecting patients' recovery and psychological wellbeing. Current research on support surfaces and pressure as a risk factor in the development of pressure ulcers is not relevant to the specialised, controlled environment of the radiological setting. Method: 38 healthy participants aged 19-51 were placed supine on two different imaging surfaces. The XSENSOR pressure mapping system was used to measure the interface pressure. Data were acquired over 20 minutes, preceded by 6 minutes of settling time to reduce measurement error. Qualitative information regarding participants' opinions on pain and comfort was recorded using a questionnaire. Data analysis was performed using SPSS 22. Results: Data were collected from 30 participants aged 19 to 51 (mean 25.77, SD 7.72), with BMI from 18.7 to 33.6 (mean 24.12, SD 3.29), for the two surfaces, following eight participant exclusions due to technical faults. Total average pressure, average pressure for jeopardy areas (head, sacrum & heels) and peak pressure for jeopardy areas were calculated as interface pressure in mmHg. Qualitative data showed a significant difference in experiences of comfort and pain in the jeopardy areas (P<0.05) between the two surfaces. Conclusion: A significant difference is seen in average pressure between the two surfaces. Pain and comfort data also show a significant difference between the surfaces; both findings support the proposal for further investigation into the effects of radiological surfaces as a risk factor for the formation of pressure ulcers.

Relevance:

20.00%

Abstract:

Many learning problems require handling high-dimensional datasets with a relatively small number of instances. Learning algorithms are thus confronted with the curse of dimensionality and need to address it in order to be effective. Examples of these types of data include the bag-of-words representation in text classification problems and gene expression data for tumor detection/classification. Usually, among the high number of features characterizing the instances, many may be irrelevant (or even detrimental) for the learning tasks. There is thus a clear need for adequate techniques for feature representation, reduction, and selection, to improve both classification accuracy and memory requirements. In this paper, we propose combined unsupervised feature discretization and feature selection techniques, suitable for medium and high-dimensional datasets. The experimental results on several standard datasets, with both sparse and dense features, show the efficiency of the proposed techniques as well as improvements over previous related techniques.
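
A hedged sketch of the general pipeline described, using generic equal-frequency discretization plus a simple unsupervised relevance ranking (the paper's specific discretization and selection techniques may differ):

```python
import numpy as np
from sklearn.preprocessing import KBinsDiscretizer

def discretize_and_select(X, n_bins=5, k=100):
    """Unsupervised feature discretization followed by selection.
    Features are ranked by the dispersion of their discretized values
    (a simple unsupervised relevance proxy); the top-k are kept."""
    disc = KBinsDiscretizer(n_bins=n_bins, encode="ordinal",
                            strategy="quantile")
    Xd = disc.fit_transform(X)
    relevance = Xd.var(axis=0)              # unsupervised relevance score
    keep = np.argsort(relevance)[::-1][:k]  # indices of the top-k features
    return Xd[:, keep], keep

# Hypothetical high-dimensional, few-instance dataset.
rng = np.random.default_rng(0)
X = rng.normal(size=(80, 5000))             # 80 instances, 5000 features
X_red, selected = discretize_and_select(X, k=200)
print(X_red.shape)                          # (80, 200)
```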

Relevance:

20.00%

Abstract:

Data analytic applications are characterized by large data sets that are subject to a series of processing phases. Some of these phases are executed sequentially, but others can be executed concurrently or in parallel on clusters, grids or clouds. The MapReduce programming model has been applied to process large data sets in cluster and cloud environments. To develop an application using MapReduce, one needs to install/configure/access specific frameworks such as Apache Hadoop or Elastic MapReduce in the Amazon Cloud. It would be desirable to have more flexibility in adjusting such configurations according to the application's characteristics. Furthermore, composing the multiple phases of a data analytic application requires the specification of all the phases and their orchestration. The original MapReduce model and environment lack flexible support for such configuration and composition. Recognizing that scientific workflows have been successfully applied to modeling complex applications, this paper describes our experiments on implementing MapReduce as subworkflows in the AWARD framework (Autonomic Workflow Activities Reconfigurable and Dynamic). A text mining data analytic application is modeled as a complex workflow with multiple phases, where individual workflow nodes support MapReduce computations. As in typical MapReduce environments, the end user only needs to define the application algorithms for input data processing and for the map and reduce functions. In the paper we present experimental results from using the AWARD framework to execute MapReduce workflows deployed over multiple Amazon EC2 (Elastic Compute Cloud) instances.
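
As the abstract notes, the end user only supplies the input-processing, map and reduce functions; for a text mining application such as word counting, these take a familiar shape (a minimal sketch with a toy sequential driver, independent of AWARD's actual activity interface):

```python
from collections import defaultdict
from itertools import chain

def map_fn(document):
    """Map phase: emit (word, 1) pairs for one input document."""
    return [(word.lower(), 1) for word in document.split()]

def reduce_fn(word, counts):
    """Reduce phase: aggregate all counts emitted for one word."""
    return word, sum(counts)

def run_mapreduce(documents):
    """Toy sequential driver standing in for the distributed runtime."""
    groups = defaultdict(list)
    for word, n in chain.from_iterable(map(map_fn, documents)):
        groups[word].append(n)              # shuffle: group values by key
    return dict(reduce_fn(w, ns) for w, ns in groups.items())

print(run_mapreduce(["the quick fox", "the lazy dog", "The fox"]))
# {'the': 3, 'quick': 1, 'fox': 2, 'lazy': 1, 'dog': 1}
```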