997 results for force-directed algorithms
Abstract:
In recent years, thanks to fast, flat-rate broadband subscriptions, the networks of broadband operators have grown into large entities. These entities are managed with various network management tools, which contain large amounts of information, at different levels, about devices and the relationships between them. Comprehending such entities without a picture built from this information is difficult and slow. Visualizing the topology of a broadband network produces a picture of the devices and the relationships between them. The visualized picture can be used as part of a network management tool, quickly giving the user a view of the network's devices and structure, i.e. its topology. In visualization, the picture-drawing problem must be converted into a graph-drawing problem, in which the network structure is treated as a graph; this makes it possible to generate the picture using automatic drawing methods. The desired appearance of the picture is produced with automatic drawing methods, which can change how devices and the relationships between them are represented, for example their shape, colour and size. Besides these representations, the most important task of the drawing methods is to compute the coordinate values for device positions, which ultimately determine the structure of the whole picture. The coordinate values are computed with layout algorithms, of which force-based algorithms are best suited to computing the positions of broadband network devices. In the practical part of this master's thesis, a broadband network topology visualization tool was implemented.
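The force-based layout approach this abstract refers to can be sketched minimally in the Fruchterman-Reingold style: all node pairs repel, edges attract like springs, and a cooling step size limits movement. This is an illustrative sketch only (constants and data are assumptions, not taken from the thesis):

```python
import math
import random

def force_directed_layout(nodes, edges, width=100.0, height=100.0,
                          iterations=200, seed=0):
    """Toy force-directed layout: pairwise repulsion, spring attraction
    along edges, and a cooling temperature that caps displacement."""
    rng = random.Random(seed)
    pos = {v: [rng.uniform(0, width), rng.uniform(0, height)] for v in nodes}
    k = math.sqrt(width * height / len(nodes))  # ideal edge length
    for it in range(iterations):
        disp = {v: [0.0, 0.0] for v in nodes}
        # repulsive forces between every pair of nodes
        for i, u in enumerate(nodes):
            for v in nodes[i + 1:]:
                dx = pos[u][0] - pos[v][0]
                dy = pos[u][1] - pos[v][1]
                d = math.hypot(dx, dy) or 1e-9
                f = k * k / d  # repulsion magnitude
                disp[u][0] += f * dx / d; disp[u][1] += f * dy / d
                disp[v][0] -= f * dx / d; disp[v][1] -= f * dy / d
        # attractive forces along edges
        for u, v in edges:
            dx = pos[u][0] - pos[v][0]
            dy = pos[u][1] - pos[v][1]
            d = math.hypot(dx, dy) or 1e-9
            f = d * d / k  # attraction magnitude
            disp[u][0] -= f * dx / d; disp[u][1] -= f * dy / d
            disp[v][0] += f * dx / d; disp[v][1] += f * dy / d
        # move each node, limited by a decreasing temperature
        t = width / 10.0 * (1.0 - it / iterations)
        for v in nodes:
            dx, dy = disp[v]
            d = math.hypot(dx, dy) or 1e-9
            pos[v][0] += dx / d * min(d, t)
            pos[v][1] += dy / d * min(d, t)
    return pos
```

Production tools typically add multilevel coarsening and spatial indexing on top of this basic loop to scale to large networks.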
Abstract:
Heuristic optimization algorithms are of great importance for solving various real-world problems. These algorithms have a wide range of applications, such as cost reduction, artificial intelligence, and medicine. Here, "cost" refers to the value of a function of several independent variables. Often, when dealing with engineering problems, we want to minimize the value of a function in order to achieve an optimum, or to maximize another parameter that increases as the cost (the value of this function) decreases. Heuristic cost-reduction algorithms work by finding the values of the independent variables for which the value of the function (the "cost") is minimal. There is an abundance of heuristic cost-reduction algorithms to choose from. We start with a discussion of various optimization algorithms, such as memetic algorithms, force-directed placement, and evolution-based algorithms. Following this initial discussion, we take up the workings of three algorithms and implement them in MATLAB. The focus of this report is to provide detailed information on the workings of three different heuristic optimization algorithms, concluding with a comparative study of their performance when implemented in MATLAB. The three algorithms we consider are the non-adaptive simulated annealing algorithm, the adaptive simulated annealing algorithm, and the random-restart hill climbing algorithm. The algorithms are heuristic in nature; that is, the solution they reach may not be the best of all solutions, but they provide a means of quickly reaching a reasonably good solution without taking an indefinite amount of time.
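The report implements its algorithms in MATLAB; purely as an illustration of one of them, here is a minimal Python sketch of random-restart hill climbing (step size, bounds, and the test function are assumptions, not the report's settings):

```python
import math
import random

def hill_climb(f, x0, step=0.1, max_iters=1000, rng=None):
    """Greedy local search: perturb the current point and keep the move
    only if it lowers the cost f."""
    rng = rng or random.Random()
    x, fx = list(x0), f(x0)
    for _ in range(max_iters):
        cand = [xi + rng.uniform(-step, step) for xi in x]
        fc = f(cand)
        if fc < fx:
            x, fx = cand, fc
    return x, fx

def random_restart_hill_climb(f, dim, bounds=(-5.0, 5.0), restarts=20, seed=0):
    """Run hill climbing from several random starting points and keep
    the best result, reducing the risk of ending in a poor local minimum."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(restarts):
        x0 = [rng.uniform(*bounds) for _ in range(dim)]
        x, fx = hill_climb(f, x0, rng=rng)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# Example: minimize the sphere function; its global minimum is 0 at the origin.
best_x, best_f = random_restart_hill_climb(lambda x: sum(xi * xi for xi in x), dim=2)
```

Simulated annealing differs mainly in sometimes accepting uphill moves with a probability that decays with a temperature schedule, which is what distinguishes the adaptive and non-adaptive variants the report compares.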
Abstract:
The amount of biological data has grown exponentially in recent decades. Modern biotechnologies, such as microarrays and next-generation sequencing, are capable of producing massive amounts of biomedical data in a single experiment. As the amount of data rapidly grows, there is an urgent need for reliable computational methods for analyzing and visualizing it. This thesis addresses this need by studying how to efficiently and reliably analyze and visualize high-dimensional data, especially data obtained from gene expression microarray experiments. First, we study ways to improve the quality of microarray data by replacing (imputing) the missing data entries with estimated values. Missing value imputation is commonly used to make originally incomplete data complete, making it easier to analyze with statistical and computational methods. Our novel approach was to use curated external biological information as a guide for the missing value imputation. Secondly, we studied the effect of missing value imputation on downstream data analysis methods such as clustering. We compared multiple recent imputation algorithms on 8 publicly available microarray data sets. It was observed that missing value imputation is indeed a rational way to improve the quality of biological data. The research revealed differences between the clustering results obtained with different imputation methods. On most data sets, the simple and fast k-NN imputation was good enough, but there was also a need for more advanced imputation methods, such as Bayesian Principal Component Analysis (BPCA). Finally, we studied the visualization of biological network data. Biological interaction networks are examples of the outcome of multiple biological experiments, such as gene microarray experiments.
Such networks are typically very large and highly connected, so fast algorithms are needed to produce visually pleasing layouts. A computationally efficient way to produce layouts of large biological interaction networks was developed. The algorithm uses multilevel optimization within a regular force-directed graph layout algorithm.
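The k-NN imputation idea the thesis evaluates can be sketched as follows: for each row with gaps, find the k nearest rows by distance over the co-observed columns and fill each gap with the neighbours' average. This is an illustrative sketch, not the thesis' implementation:

```python
import math

def knn_impute(data, k=3):
    """Impute missing entries (None) in a row-major numeric matrix.
    Distances use only columns observed in both rows; each gap is filled
    with the mean of the k nearest rows that observe that column."""
    n, m = len(data), len(data[0])
    out = [row[:] for row in data]
    for i, row in enumerate(data):
        missing = [j for j in range(m) if row[j] is None]
        if not missing:
            continue
        # distance to every other row over columns observed in both rows
        dists = []
        for i2, other in enumerate(data):
            if i2 == i:
                continue
            shared = [(row[j], other[j]) for j in range(m)
                      if row[j] is not None and other[j] is not None]
            if not shared:
                continue
            d = math.sqrt(sum((a - b) ** 2 for a, b in shared) / len(shared))
            dists.append((d, i2))
        dists.sort()
        for j in missing:
            # take the k nearest neighbours that actually observe column j
            vals = [data[i2][j] for _, i2 in dists if data[i2][j] is not None][:k]
            if vals:
                out[i][j] = sum(vals) / len(vals)
    return out
```

BPCA, by contrast, fits a probabilistic low-rank model of the whole matrix and infers the missing entries from it, which is why it can outperform k-NN on data with strong global structure.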
Abstract:
Using the MIT Serial Link Direct Drive Arm as the main experimental device, various issues in trajectory and force control of manipulators were studied in this thesis. Since accurate modeling is important for any controller, issues of estimating the dynamic model of a manipulator and its load were addressed first. Practical and effective algorithms were developed from the Newton-Euler equations to estimate the inertial parameters of manipulator rigid-body loads and links. Load estimation was implemented on both a PUMA 600 robot and the MIT Serial Link Direct Drive Arm. With the link estimation algorithm, the inertial parameters of the direct drive arm were obtained. For both load and link estimation, the estimated parameters are good models of the actual system for control purposes, since torques and forces can be predicted accurately from them. The estimated model of the direct drive arm was then used to evaluate trajectory-following performance under feedforward and computed torque control algorithms. The experimental evaluations showed that dynamic compensation can greatly improve trajectory-following accuracy. Various stability issues of force control were studied next. It was determined that there are two types of instability in force control. Dynamic instability, present in all of the previous force control algorithms discussed in this thesis, is caused by the interaction of a manipulator with a stiff environment. Kinematic instability is present only in the hybrid control algorithm of Raibert and Craig, and is caused by the interaction of the inertia matrix with the Jacobian inverse coordinate transformation in the feedback path. Several methods were suggested and demonstrated experimentally to solve these stability problems.
The results of the stability analyses were then incorporated into the implementation of a stable force/position controller on the direct drive arm, using the modified resolved acceleration method with both joint torque and wrist force sensor feedback.
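The estimation described above rests on the fact that the Newton-Euler equations are linear in the inertial parameters, so those parameters can be recovered by least squares from measured forces and motions. A toy one-dimensional analogue (hypothetical model f = m·a + c·v with made-up data, not the thesis' formulation):

```python
def estimate_params(samples):
    """Fit the linear-in-parameters model f = m*a + c*v by solving the
    2x2 normal equations. samples: list of (accel, velocity, measured force)."""
    saa = sum(a * a for a, v, f in samples)
    sav = sum(a * v for a, v, f in samples)
    svv = sum(v * v for a, v, f in samples)
    saf = sum(a * f for a, v, f in samples)
    svf = sum(v * f for a, v, f in samples)
    det = saa * svv - sav * sav  # nonzero only if the motion excites both terms
    m = (saf * svv - sav * svf) / det
    c = (saa * svf - sav * saf) / det
    return m, c
```

The same structure scales to the full rigid-body case: stack one regressor row per time sample and solve the resulting overdetermined linear system, provided the trajectory is exciting enough to keep the regressor well conditioned.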
Abstract:
With the growing generation, storage and dissemination of information in recent years, the former problem of lack of information has turned into a problem of extracting useful knowledge from the available information. Visual representations of abstract information have been used to aid the interpretation of data and to reveal otherwise hidden patterns. Information visualization seeks to amplify human cognition by exploiting human visual capabilities, making abstract information perceptible and providing the means for a human to absorb growing amounts of information with their perceptual abilities. The goal of clustering techniques is to divide a data set into several groups, placing similar data in the same group and dissimilar data in different groups. More specifically, constrained clustering aims to incorporate a priori knowledge into the clustering process, with the goal of improving clustering quality while finding solutions suited to specific tasks and interests. This dissertation studies the Interactive Visual Clustering approach, which allows the user, by interacting with a visual representation of the information, to incorporate prior knowledge about the data domain and thereby steer the resulting clustering toward their goals. This approach combines and extends techniques from interactive information visualization, force-directed graph drawing, and constrained clustering. To evaluate the performance of different user interaction strategies, comparative studies are carried out using synthetic and real data sets.
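Constrained clustering of the kind described here is often illustrated with COP-k-means: standard k-means assignment, except that each point is placed in the nearest centroid that does not violate any must-link or cannot-link constraint. The following is an illustrative sketch of that idea, not the dissertation's algorithm:

```python
import random

def cop_kmeans(points, k, must_link=(), cannot_link=(), iters=20, seed=0):
    """COP-k-means-style sketch: assign each point to the nearest centroid
    consistent with the constraints; return None if assignment fails."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    assign = [None] * len(points)

    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    def violates(i, c):
        # must-link: i must share the cluster of any already-assigned partner
        for a, b in must_link:
            j = b if a == i else a if b == i else None
            if j is not None and assign[j] is not None and assign[j] != c:
                return True
        # cannot-link: i must avoid the cluster of any already-assigned partner
        for a, b in cannot_link:
            j = b if a == i else a if b == i else None
            if j is not None and assign[j] == c:
                return True
        return False

    for _ in range(iters):
        assign = [None] * len(points)
        for i, p in enumerate(points):
            for c in sorted(range(k), key=lambda c: dist2(p, centroids[c])):
                if not violates(i, c):
                    assign[i] = c
                    break
            if assign[i] is None:
                return None  # unsatisfiable under this assignment order
        # recompute each centroid as the mean of its members
        for c in range(k):
            members = [points[i] for i in range(len(points)) if assign[i] == c]
            if members:
                centroids[c] = tuple(sum(col) / len(members)
                                     for col in zip(*members))
    return assign
```

In the interactive setting the dissertation studies, constraints of this kind would come from user gestures on the force-directed visual representation rather than from a fixed list.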
Abstract:
To investigate the control mechanisms used in adapting to position-dependent forces, subjects performed 150 horizontal reaching movements over 25 cm in the presence of a position-dependent parabolic force field (PF). The PF acted only over the first 10 cm of the movement. On every fifth trial, a virtual mechanical guide (double wall) constrained subjects to move along a straight-line path between the start and target positions. Its purpose was to register lateral force, in order to track the formation of an internal model of the force field and to look for evidence of possible alternative adaptive strategies. The force field produced a force to the right, which initially caused subjects to deviate in that direction. They reacted by producing deviations to the left, into the force field, as early as the second trial. Further adaptation resulted in rapid exponential reduction of kinematic error in the latter portion of the movement, where the greatest perturbation to the handpath was initially observed, whereas there was little modification of the handpath in the region where the PF was active. Significant force directed to counteract the PF was measured on the first guided trial, and was modified during the first half of the learning set. The total force impulse in the region of the PF increased throughout the learning trials, but it always remained less than that produced by the PF. The force profile did not resemble a mirror image of the PF, in that it tended to be more trapezoidal than parabolic in shape. As in previous studies of force-field adaptation, we found that changes in muscle activation involved a general increase in the activity of all muscles, which increased arm stiffness, and selectively greater increases in the activation of muscles which counteracted the PF. With training, activation was exponentially reduced, albeit more slowly than kinematic error.
Progressive changes in kinematics and EMG occurred predominantly in the region of the workspace beyond the force field. We suggest that constraints on muscle mechanics limit the ability of the central nervous system to employ an inverse dynamics model to nullify impulse-like forces by generating mirror-image forces. Consequently, subjects adopted a strategy of slightly overcompensating for the first half of the force field, then allowing the force field to push them in the opposite direction. Muscle activity patterns in the region beyond the boundary of the force field were subsequently adjusted because of the relatively slow response of the second-order mechanics of muscle impedance to the force impulse.
Abstract:
Marine organisms have to cope with increasing CO2 partial pressures and decreasing pH in the oceans. We elucidated the impacts of an 8-week acclimation period to four seawater pCO2 treatments (39, 113, 243 and 405 Pa/385, 1,120, 2,400 and 4,000 µatm) on mantle gene expression patterns in the blue mussel Mytilus edulis from the Baltic Sea. Based on the M. edulis mantle tissue transcriptome, the expression of several genes involved in metabolism, calcification and stress responses was assessed in the outer (marginal and pallial zone) and the inner mantle tissues (central zone) using quantitative real-time PCR. The expression of genes involved in energy and protein metabolism (F-ATPase, hexokinase and elongation factor alpha) was strongly affected by acclimation to moderately elevated CO2 partial pressures. Expression of a chitinase, potentially important for the calcification process, was strongly depressed (maximum ninefold), correlating with a linear decrease in shell growth observed in the experimental animals. Interestingly, shell matrix protein candidate genes were less affected by CO2 in both tissues. A compensatory process toward enhanced shell protection is indicated by a massive increase in the expression of tyrosinase, a gene involved in periostracum formation (maximum 220-fold). Using correlation matrices and a force-directed layout network graph, we were able to uncover possible underlying regulatory networks and the connections between different pathways, thereby providing a molecular basis of observed changes in animal physiology in response to ocean acidification.
Abstract:
Final report of the Task Force, which was directed by Illinois Senate Resolution 206 to make recommendations on the best methods to implement criminal background checks of EMTs.
Abstract:
This paper presents a family of algorithms for approximate inference in credal networks (that is, models based on directed acyclic graphs and set-valued probabilities) that contain only binary variables. Such networks can represent incomplete or vague beliefs, lack of data, and disagreements among experts; they can also encode models based on belief functions and possibilistic measures. All algorithms for approximate inference in this paper rely on exact inferences in credal networks based on polytrees with binary variables, as these inferences have polynomial complexity. We are inspired by approximate algorithms for Bayesian networks; thus the Loopy 2U algorithm resembles Loopy Belief Propagation, while the Iterated Partial Evaluation and Structured Variational 2U algorithms are, respectively, based on Localized Partial Evaluation and variational techniques.
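For orientation, the Loopy Belief Propagation that Loopy 2U resembles can be sketched for ordinary point-valued pairwise models over binary variables (Loopy 2U itself propagates interval-valued, credal quantities; this sketch is standard LBP, not the paper's algorithm):

```python
def loopy_bp(nodes, unary, pairwise, edges, iters=50):
    """Loopy belief propagation on a pairwise model over binary variables.
    unary[v] = [phi(v=0), phi(v=1)]; pairwise[(u, v)] = 2x2 table psi[xu][xv].
    Returns normalized marginal beliefs per node (exact on trees)."""
    msgs = {}
    for u, v in edges:
        msgs[(u, v)] = [1.0, 1.0]
        msgs[(v, u)] = [1.0, 1.0]
    nbrs = {v: [] for v in nodes}
    for u, v in edges:
        nbrs[u].append(v)
        nbrs[v].append(u)

    def psi(u, v, xu, xv):
        if (u, v) in pairwise:
            return pairwise[(u, v)][xu][xv]
        return pairwise[(v, u)][xv][xu]

    for _ in range(iters):
        new = {}
        for (u, v) in msgs:
            # message u -> v: marginalize u's local evidence and incoming
            # messages (except the one from v) through the edge potential
            out = [0.0, 0.0]
            for xv in (0, 1):
                total = 0.0
                for xu in (0, 1):
                    prod = unary[u][xu] * psi(u, v, xu, xv)
                    for w in nbrs[u]:
                        if w != v:
                            prod *= msgs[(w, u)][xu]
                    total += prod
                out[xv] = total
            s = out[0] + out[1]
            new[(u, v)] = [out[0] / s, out[1] / s]
        msgs = new

    beliefs = {}
    for v in nodes:
        b = [unary[v][x] for x in (0, 1)]
        for w in nbrs[v]:
            b = [b[x] * msgs[(w, v)][x] for x in (0, 1)]
        s = b[0] + b[1]
        beliefs[v] = [b[0] / s, b[1] / s]
    return beliefs
```

The credal generalization replaces each point message with bounds obtained by optimizing over the set-valued potentials, which is where the exact polytree 2U inference the paper relies on comes in.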
Abstract:
This paper delineates the development of a prototype hybrid knowledge-based system for the optimum design of liquid retaining structures, coupling a blackboard architecture, the expert system shell VISUAL RULE STUDIO, and a genetic algorithm (GA). Through custom-built interactive graphical user interfaces in a user-friendly environment, the user is guided throughout the design process, which includes preliminary design, load specification, model generation, finite element analysis, code compliance checking, and member sizing optimization. For structural optimization, the GA is applied to the minimum-cost design of structural systems with discrete reinforced concrete sections. The design of a typical liquid retaining structure is illustrated. The results demonstrate extraordinary convergence speed, as near-optimal solutions are acquired after exploring merely a small portion of the search space. This system can act as a consultant to assist novice designers in the design of liquid retaining structures.
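A GA for discrete minimum-cost section sizing of this kind can be sketched as follows. The section catalogue, demands, penalty weight, and GA settings below are all illustrative assumptions, not values from the paper:

```python
import random

# Hypothetical catalogue of discrete RC sections: (cost, capacity) per member.
SECTIONS = [(10, 50), (14, 80), (20, 120), (27, 170), (35, 230)]
DEMANDS = [60, 150, 90, 40]  # assumed required capacity per member

def fitness(genome):
    """Total cost plus a heavy penalty for any capacity shortfall,
    so infeasible designs are strongly disfavoured."""
    cost = sum(SECTIONS[g][0] for g in genome)
    shortfall = sum(max(0, d - SECTIONS[g][1]) for g, d in zip(genome, DEMANDS))
    return cost + 100 * shortfall

def genetic_algorithm(pop_size=40, generations=60, seed=0):
    rng = random.Random(seed)
    n = len(DEMANDS)
    pop = [[rng.randrange(len(SECTIONS)) for _ in range(n)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]            # elitist truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)               # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:                  # point mutation
                child[rng.randrange(n)] = rng.randrange(len(SECTIONS))
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)
```

Because the chromosome indexes a discrete catalogue directly, no rounding of continuous variables is needed, which matches the discrete-section formulation the paper describes.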
Abstract:
"Series: Solid mechanics and its applications, vol. 226"
Abstract:
BACKGROUND: Reversed shoulder arthroplasty is an accepted treatment for glenohumeral arthritis associated with rotator cuff deficiency. For most reversed shoulder prostheses, the baseplate of the glenoid component is uncemented and its primary stability is provided by a central peg and peripheral screws. Because of the importance of primary stability for good osseointegration of the baseplate, optimal fixation of the screws is crucial. In particular, the amplitude of the tightening force of the nonlocking screws is clearly associated with this stability. Since this force is unknown, it is currently not accounted for in experimental or numerical analyses. Thus, the primary goal of this work is to measure this tightening force experimentally. In addition, the tightening torque was also measured, to estimate an optimal surgical value. METHODS: An experimental setup with an instrumented baseplate was developed to measure simultaneously the tightening force, tightening torque and screwing angle of the nonlocking screws of the Aquealis reversed prosthesis. In addition, the amount of bone volume around each screw was measured with a micro-CT. Measurements were performed on 6 human cadaveric scapulae. FINDINGS: A statistically significant correlation (p<0.05, R=0.83) was obtained between the maximal tightening force and the bone volume. The relationship between the tightening torque and the bone volume was not statistically significant. INTERPRETATION: The experimental relationship presented in this paper can be used in numerical analyses to improve baseplate fixation in the glenoid bone.
Abstract:
Eighty-five of 99 Iowa counties were declared Presidential Disaster Areas for Public Assistance and/or Individual Assistance as a result of the tornadoes, storms, and floods over the incident period May 25 through August 13, 2008. Response dominated the state's attention for weeks, with a transition to recovery as local situations warranted. The widespread damage and the severity of the impact on Iowans and their communities required a statewide effort to continue moving forward despite being surrounded by adversity. By all accounts, it will take years for the state to recover from these disasters. With an eye toward the future, recovery is underway across Iowa. As part of the Rebuild Iowa efforts, the Long Term Recovery Planning Task Force was charged with responsibilities somewhat different from other topical Task Force assignments. Rather than assess damage and report on how the state might address immediate needs, the Long Term Recovery Planning Task Force is directed to discuss and discern the best approach to the lengthy recovery process. Certainly, the Governor and Lieutenant Governor expect the task to be difficult; when planning around so many critical issues and overwhelming needs, it is challenging to think of the future rather than only rising to the current day's needs.
Abstract:
The World Health Organization fracture risk assessment tool, FRAX(®), is an advance in clinical care that can assist in clinical decision-making. However, with increasing clinical utilization, numerous questions have arisen regarding how to best estimate fracture risk in an individual patient. Recognizing the need to assist clinicians in optimal use of FRAX(®), the International Osteoporosis Foundation (IOF) in conjunction with the International Society for Clinical Densitometry (ISCD) assembled an international panel of experts that ultimately developed joint Official Positions of the ISCD and IOF advising clinicians regarding FRAX(®) usage. As part of the process, the charge of the FRAX(®) Clinical Task Force was to review and synthesize data surrounding a number of recognized clinical risk factors including rheumatoid arthritis, smoking, alcohol, prior fracture, falls, bone turnover markers and glucocorticoid use. This synthesis was presented to the expert panel and constitutes the data on which the subsequent Official Positions are predicated. A summary of the Clinical Task Force composition and charge is presented here.