971 results for Copy number variants
Abstract:
(...) Number bonds (whole-part diagrams) are one of the best-known teaching devices of the Singapore Method. These representations support foundational number comprehension, namely the ability to decompose quantities, and fundamental algebra (addition and subtraction). In this article, we analyse what these diagrams are, what their advantages are, and how to use them in the first year of schooling. (...)
Abstract:
Cluster analysis for categorical data has been an active area of research. A well-known problem in this area is the determination of the number of clusters, which is unknown and must be inferred from the data. In order to estimate the number of clusters, one often resorts to information criteria such as BIC (Bayesian information criterion), MML (minimum message length, proposed by Wallace and Boulton, 1968), and ICL (integrated classification likelihood). In this work, we adopt the approach developed by Figueiredo and Jain (2002) for clustering continuous data. They use an MML criterion to select the number of clusters and a variant of the EM algorithm to estimate the model parameters. This EM variant seamlessly integrates model estimation and selection in a single algorithm. For clustering categorical data, we assume a finite mixture of multinomial distributions and implement a new EM algorithm, following a previous version (Silvestre et al., 2008). Results obtained with synthetic datasets are encouraging. The main advantage of the proposed approach, compared with the criteria referred to above, is its speed of execution, which is especially relevant when dealing with large datasets.
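The core of the approach above is an EM algorithm for a finite mixture of multinomial (categorical) distributions. The following is a minimal sketch with a fixed number of components; the MML-based component selection of Figueiredo and Jain is omitted for brevity, and all function names and parameter values are illustrative, not the authors' implementation.

```python
import numpy as np

def em_multinomial_mixture(X, k, n_iter=200, seed=0):
    """EM for a finite mixture of categorical (one-trial multinomial)
    distributions. X: (n, d) integer array of category labels."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    n_cats = X.max() + 1
    onehot = np.eye(n_cats)[X]                 # (n, d, n_cats) one-hot encoding
    weights = np.full(k, 1.0 / k)              # mixing proportions
    # random initial category probabilities per component and feature
    theta = rng.dirichlet(np.ones(n_cats), size=(k, d))   # (k, d, n_cats)
    for _ in range(n_iter):
        # E-step: responsibilities from log-likelihood of each component
        log_lik = np.log(weights)[None, :] + np.einsum(
            'ndc,kdc->nk', onehot, np.log(theta + 1e-12))
        log_lik -= log_lik.max(axis=1, keepdims=True)     # numerical stability
        resp = np.exp(log_lik)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: update mixing weights and category probabilities
        weights = resp.mean(axis=0)
        theta = np.einsum('nk,ndc->kdc', resp, onehot)
        theta /= theta.sum(axis=2, keepdims=True)
    return weights, theta, resp
```

On well-separated synthetic categorical data the responsibilities recover the generating clusters; a full MML treatment would additionally start with many components and annihilate those whose weights collapse.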
Abstract:
Myocardial Perfusion Gated Single Photon Emission Tomography (Gated-SPET) imaging is used for the combined evaluation of myocardial perfusion and left ventricular (LV) function. The purpose of this study is to evaluate the influence of the total number of counts acquired from the myocardium on the calculation of myocardial functional parameters using routine software procedures. Methods: Gated-SPET studies were simulated using the Monte Carlo GATE package and the NURBS phantom. Simulated data were reconstructed and processed using the commercial software package Quantitative Gated-SPECT. The Bland-Altman and Mann-Whitney-Wilcoxon tests were used to analyze the influence of the total number of counts on the calculation of LV myocardial functional parameters. Results: In studies simulated with 3MBq in the myocardium, there were significant differences in the functional parameters left ventricular ejection fraction (LVEF), end-systolic volume (ESV), Motility and Thickness between studies acquired with 15s/projection and 30s/projection. Simulations with 4.2MBq showed significant differences in LVEF, end-diastolic volume (EDV) and Thickness, while in the simulations with 5.4MBq and 8.4MBq the differences were statistically significant for Motility and Thickness. Conclusion: The total number of counts per simulation does not significantly interfere with the determination of Gated-SPET functional parameters when using an administered average activity of 450MBq, corresponding to 5.4MBq in the myocardium.
Abstract:
This paper presents a modified Particle Swarm Optimization (PSO) methodology to solve the problem of energy resources management with high penetration of distributed generation and Electric Vehicles (EVs) with gridable capability (V2G). The objective of the day-ahead scheduling problem in this work is to minimize operation costs, namely energy costs, regarding the management of these resources in the smart grid context. The modifications applied to the PSO aimed to improve its adequacy to solve the mentioned problem. The proposed Application Specific Modified Particle Swarm Optimization (ASMPSO) includes an intelligent mechanism to adjust velocity limits during the search process, as well as self-parameterization of PSO parameters, making it more user-independent. It presents better robustness and convergence characteristics compared with the tested PSO variants, as well as better constraint handling. This enables its use for addressing real-world large-scale problems in much shorter times than deterministic methods, providing system operators with adequate decision support and achieving efficient resource scheduling, even when a significant number of alternative scenarios should be considered. The paper includes two realistic case studies with different penetration of gridable vehicles (1000 and 2000). The proposed methodology is about 2600 times faster than the Mixed-Integer Non-Linear Programming (MINLP) reference technique, reducing the time required from 25 h to 36 s for the scenario with 2000 vehicles, with about one percent of difference in the objective function cost value.
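The role of velocity limits in PSO can be illustrated with a minimal generic sketch. This uses a static velocity clamp, whereas the ASMPSO described above adjusts the limits adaptively during the search; the sphere objective, parameter values, and function names are assumptions for the demonstration only.

```python
import numpy as np

def pso_minimize(f, dim, n_particles=30, n_iter=200, bounds=(-5.0, 5.0),
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal PSO with a static velocity clamp (ASMPSO adapts this limit)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))    # particle positions
    v = np.zeros((n_particles, dim))               # particle velocities
    vmax = 0.2 * (hi - lo)                         # velocity limit
    pbest = x.copy()                               # personal bests
    pbest_val = np.apply_along_axis(f, 1, x)
    gbest = pbest[pbest_val.argmin()].copy()       # global best
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        v = np.clip(v, -vmax, vmax)                # clamp keeps search stable
        x = np.clip(x + v, lo, hi)
        vals = np.apply_along_axis(f, 1, x)
        improved = vals < pbest_val
        pbest[improved] = x[improved]
        pbest_val[improved] = vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()
```

Without the clamp, early velocities can grow until particles bounce between the bounds; tuning or adapting `vmax` is precisely the kind of mechanism the paper automates.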
Abstract:
Sleep-states are emerging as a first-class design choice in energy minimization. A side effect of this is that the release behavior of the system is affected, and subsequently the preemption relations between tasks. In a first step we have investigated how the behavior of the system, in terms of the number of task preemptions, changes at runtime, using an existing procrastination approach which utilizes sleep-states for energy-saving purposes. Our solution resulted in substantial savings in preemptions, and we expect even higher yields from alternative energy-saving algorithms. This work is intended to form the base of future research, which aims to bound the number of preemptions at analysis time and, subsequently, to determine how this may be employed in the analysis to reduce the amount of system utilization reserved to account for the preemption delay.
Abstract:
We propose an efficient algorithm to estimate the number of live computer nodes in a network. This algorithm is fully distributed, and has a time-complexity which is independent of the number of computer nodes. The algorithm is designed to take advantage of a medium access control (MAC) protocol which is prioritized; that is, if two or more messages on different nodes contend for the medium, then the node contending with the highest priority will win, and all nodes will know the priority of the winner.
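The abstract does not spell out the estimator, but one classical way to exploit such a prioritized MAC is to have every live node contend with a random priority in (0,1) each round: the revealed winning priority is the maximum of n uniform draws, from which n can be estimated. The sketch below simulates this idea; it is an illustrative construction, not necessarily the authors' algorithm.

```python
import math
import random

def estimate_live_nodes(n_true, rounds=500, seed=0):
    """Estimate the number of live nodes via a prioritized-MAC primitive.

    Each round every node contends with an independent U(0,1) priority and
    the MAC reveals the winning (maximum) priority to all nodes.  The max of
    n uniform draws has CDF m**n, so the maximum-likelihood estimate after k
    rounds with observed maxima m_1..m_k is  n_hat = -k / sum(ln m_i).
    """
    rng = random.Random(seed)
    log_sum = 0.0
    for _ in range(rounds):
        # MAC arbitration: the highest priority wins and is known to everyone
        winner = max(rng.random() for _ in range(n_true))
        log_sum += math.log(winner)
    return -rounds / log_sum
```

Note that each round costs a constant amount of channel time regardless of n, which is consistent with the time-complexity claim above.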
Abstract:
Environment monitoring has an important role in occupational exposure assessment. However, due to several factors, it is done with insufficient frequency and normally does not give the necessary information to choose the most adequate safety measures to avoid or control exposure. Identifying all the tasks developed in each workplace and conducting a task-based exposure assessment help to refine the exposure characterization and reduce assessment errors. A task-based assessment can also provide a better evaluation of exposure variability, instead of assessing personal exposures using continuous 8-hour time-weighted average measurements. Health effects related to particle exposure have mainly been investigated with mass-measuring instruments or gravimetric analysis. More recently, however, some studies support the view that size distribution and particle number concentration may have advantages over particle mass concentration for assessing the health effects of airborne particles. Several exposure assessments were performed in different occupational settings (bakery, grill house, cork industry and horse stable), applying two resources: task-based exposure assessment and particle number concentration by size. The task-based approach permitted the identification of the tasks with higher exposure to the smaller particles (0.3 μm) in the different occupational settings. The data obtained allow a more concrete and effective risk assessment and the identification of priorities for safety investments.
Abstract:
Myocardial perfusion gated-single photon emission computed tomography (gated-SPECT) imaging is used for the combined evaluation of myocardial perfusion and left ventricular (LV) function. The aim of this study is to analyze the influence of counts/pixel, and concomitantly the total counts in the myocardium, on the calculation of myocardial functional parameters. Material and methods: Gated-SPECT studies were performed using a Monte Carlo GATE simulation package and the NCAT phantom. The simulated studies used 99mTc-labeled radiopharmaceutical tracers (250, 350, 450 and 680MBq) for standard patient types, effectively corresponding to the following activities of the myocardium: 3, 4.2, 5.4-8.2MBq. All studies were simulated using 15 and 30s/projection. The simulated data were reconstructed and processed by quantitative-gated-SPECT software, and the functional parameters in the gated-SPECT images were analyzed using the Bland-Altman test and the Mann-Whitney-Wilcoxon test. Results: In studies simulated using different times (15 and 30s/projection), it was noted that for the full-body activities of 250 and 350MBq there were statistically significant differences in the parameters Motility and Thickness. For left ventricular ejection fraction (LVEF) and end-systolic volume (ESV) this was the case only for 250MBq, and for end-diastolic volume (EDV) only for 350MBq, while the studies simulated with 450 and 680MBq showed no statistically significant differences in the global functional parameters LVEF, EDV and ESV. Conclusion: The number of counts/pixel and, concomitantly, the total counts per simulation do not significantly interfere with the determination of gated-SPECT functional parameters when using the administered average activity of 450MBq, corresponding to 5.4MBq in the myocardium, for standard patient types.
Abstract:
Master's dissertation in Informatics Engineering
Abstract:
In the last twenty years genetic algorithms (GAs) were applied in a plethora of fields, such as control, system identification, robotics, planning and scheduling, image processing, and pattern and speech recognition (Bäck et al., 1997). In robotics, the problems of trajectory planning, collision avoidance and manipulator structure design considering a single criterion have been solved using several techniques (Alander, 2003). Most engineering applications require the optimization of several criteria simultaneously. Often the problems are complex, include discrete and continuous variables, and there is no prior knowledge about the search space. These kinds of problems are much more complex, since they consider multiple design criteria simultaneously within the optimization procedure. This is known as multi-criteria (or multi-objective) optimization, which has been addressed successfully through GAs (Deb, 2001). The overall aim of multi-criteria evolutionary algorithms is to achieve a set of non-dominated optimal solutions known as the Pareto front. At the end of the optimization procedure, instead of a single optimal (or near-optimal) solution, the decision maker can select a solution from the Pareto front. Some of the key issues in multi-criteria GAs are: i) the number of objectives, ii) obtaining a Pareto front as wide as possible, and iii) achieving a uniformly spread Pareto front. Indeed, multi-objective techniques using GAs have been increasing in relevance as a research area. In 1989, Goldberg suggested the use of a GA to solve multi-objective problems, and since then other researchers have been developing new methods, such as the multi-objective genetic algorithm (MOGA) (Fonseca & Fleming, 1995), the non-dominated sorted genetic algorithm (NSGA) (Deb, 2001), and the niched Pareto genetic algorithm (NPGA) (Horn et al., 1994), among several other variants (Coello, 1998).
In this work the trajectory planning problem considers: i) robots with 2 and 3 degrees of freedom (dof), ii) the inclusion of obstacles in the workspace, and iii) up to five criteria used to qualify the evolving trajectory, namely the joint traveling distance, joint velocity, end-effector/Cartesian distance, end-effector/Cartesian velocity, and energy involved. These criteria are used to minimize the joint and end-effector traveled distance, trajectory ripple, and the energy required by the manipulator to reach the destination point. Bearing these ideas in mind, this chapter addresses the planning of robot trajectories, meaning the development of an algorithm to find a continuous motion that takes the manipulator from a given starting configuration up to a desired end position without colliding with any obstacle in the workspace. The chapter is organized as follows. Section 2 describes the trajectory planning and several approaches proposed in the literature. Section 3 formulates the problem, namely the representation adopted to solve the trajectory planning and the objectives considered in the optimization. Section 4 studies the algorithm convergence. Section 5 studies a 2R manipulator (i.e., a robot with two rotational joints/links) when the trajectory optimization considers two and five objectives. Sections 6 and 7 present, respectively, the results for the 3R redundant manipulator with five objectives and other complementary experiments. Finally, section 8 draws the main conclusions.
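The notion of non-domination underlying the Pareto front can be made concrete with a short sketch (minimization is assumed, and the function names are ours, not taken from the chapter):

```python
def dominates(a, b):
    """a dominates b (minimization): no worse in every objective,
    strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]
```

A multi-objective GA maintains and refines such a non-dominated set across generations; criteria ii) and iii) above (width and uniform spread of the front) are then enforced through diversity mechanisms such as niching or crowding.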
Abstract:
Invariant integrals are derived for nematic liquid crystals and applied to materials with small Ericksen number and topological defects. The nematic material is confined between two infinite plates located at y = -h and y = h (h ∈ R+), with a semi-infinite plate at y = 0 and x < 0. Planar and homeotropic strong anchoring boundary conditions on the director field are assumed at these two infinite and semi-infinite plates, respectively. Thus, a line disclination appears in the system which coincides with the z-axis. Analytical solutions for the director field in the neighbourhood of the singularity are obtained. However, these solutions depend on an arbitrary parameter. The nematic elastic force is thus evaluated from an invariant integral of the energy-momentum tensor around a closed surface which does not contain the singularity. This allows one to determine this parameter, which is a function of the nematic cell thickness and the strength of the disclination. Analytical solutions are also deduced for the director field in the whole region using the conformal mapping method.
Abstract:
Two monoclonal antibodies anti-component 5 of Trypanosoma cruzi (I-35/115 and II-190/30) were tested in IFA and ELISA respectively against 35 T. cruzi laboratory clones. Among the 35 clones tested, 18 different isozyme patterns were detected. All clones were recognized by both monoclonal antibodies except one clone which did not react with II-190/30. These results support the universal expression of specific component 5 within the taxon T. cruzi.
Abstract:
Dissertation presented at Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia in fulfilment of the requirements for the Master's degree in Mathematics and Applications, specialization in Actuarial Sciences, Statistics and Operations Research
Abstract:
Bulletin of the Malaysian Mathematical Sciences Society, Series 2, 34(1) (2011), pp. 79–85