882 results for Computer aided network analysis
Abstract:
The Team Formation Problem (TFP) has become a well-known problem in the OR literature over the last few years. In this problem, a group of individuals whose combined skills match a required set must be chosen so as to maximise one or several positive social attributes. Specifically, the aim of the current research is two-fold. First, two new dimensions are added to the TFP by considering multiple projects and fractions of people's dedication. This new problem is named the Multiple Team Formation Problem (MTFP). Second, an optimization model consisting of a quadratic objective function, linear constraints and integer variables is proposed for the problem. The optimization model is solved by three algorithms: a Constraint Programming approach provided by a commercial solver, a Local Search heuristic and a Variable Neighbourhood Search metaheuristic. These three algorithms constitute the first attempt to solve the MTFP, with the Variable Neighbourhood Search metaheuristic being the most efficient in almost all cases. Applications of this problem commonly appear in real-life situations, particularly with the current and ongoing development of social network analysis. Therefore, this work opens multiple paths for future research.
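As a minimal sketch (not the authors' model or code), the fractional-dedication assignment and a single-variable local-search move could look like the following, with an invented pairwise-affinity objective standing in for the paper's quadratic function:

# Illustrative toy instance of a Multiple Team Formation Problem (MTFP).
# Names, data and the affinity objective are hypothetical; this only sketches
# the idea of fractional dedications and a simple improving local-search move.
import itertools
import random

PEOPLE = ["ana", "bob", "carla", "dan"]
PROJECTS = ["p1", "p2"]
FRACTIONS = [0.0, 0.5, 1.0]          # allowed dedication fractions per project
AFFINITY = {frozenset(p): random.random() for p in itertools.combinations(PEOPLE, 2)}

def objective(assign):
    """Quadratic objective: pairwise affinity weighted by shared dedication."""
    total = 0.0
    for proj in PROJECTS:
        for a, b in itertools.combinations(PEOPLE, 2):
            total += AFFINITY[frozenset((a, b))] * assign[a][proj] * assign[b][proj]
    return total

def feasible(assign):
    """Each person may dedicate at most 100% of their time across projects."""
    return all(sum(assign[p].values()) <= 1.0 for p in PEOPLE)

def local_search(iters=2000):
    # start from a random feasible assignment: each person fully on one project
    assign = {p: {proj: 0.0 for proj in PROJECTS} for p in PEOPLE}
    for p in PEOPLE:
        assign[p][random.choice(PROJECTS)] = 1.0
    best = objective(assign)
    for _ in range(iters):
        p, proj = random.choice(PEOPLE), random.choice(PROJECTS)
        old = assign[p][proj]
        assign[p][proj] = random.choice(FRACTIONS)   # single-variable move
        cand = objective(assign)
        if feasible(assign) and cand > best:
            best = cand                              # accept improving move
        else:
            assign[p][proj] = old                    # revert otherwise
    return assign, best

if __name__ == "__main__":
    solution, value = local_search()
    print(value, solution)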
Abstract:
The most significant radiation field nonuniformity is the well-known Heel effect. This nonuniform beam effect has a negative influence on the results of computer-aided diagnosis of mammograms, which is frequently used for early cancer detection. This paper presents a method to correct all pixels in the mammography image according to the excess or lack of radiation to which they have been exposed as a result of this effect. The current simulation method calculates the intensities at all points of the image plane. In the simulated image, the percentage of radiation received at each point is expressed with the center of the field as reference. In the digitized mammography, the percentages of the optical density of all the pixels of the analyzed image are also calculated. The Heel effect causes a Gaussian distribution around the anode-cathode axis and a logarithmic distribution parallel to this axis. These characteristic distributions are used to determine the center of the radiation field as well as the cathode-anode axis, allowing the correlation between these two data sets to be determined automatically. The measurements obtained with our proposed method differ on average by 2.49 mm in the direction perpendicular to the anode-cathode axis and by 2.02 mm parallel to it from those of the commercial equipment. The method eliminates around 94% of the Heel effect in the radiological image, so that objects reflect their actual x-ray absorption. To evaluate the method, experimental data were taken from known objects, but the evaluation could also be performed with clinical and digital images.
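A minimal sketch of the correction idea, assuming a hypothetical simulated relative-intensity field (Gaussian around the anode-cathode axis, logarithmic along it) with placeholder constants and geometry rather than the paper's calibration:

# Illustrative Heel-effect correction: divide each pixel by the simulated
# relative intensity it received, normalised to the field centre.
# The field model and constants below are placeholders, not the paper's values.
import numpy as np

def simulated_field(shape, sigma=400.0, k=0.15):
    """Relative intensity map: Gaussian in the distance from the anode-cathode
    axis (assumed here to run along the rows), logarithmic decrease along it,
    normalised to 1.0 at the field centre."""
    rows, cols = np.indices(shape, dtype=float)
    cr, cc = (shape[0] - 1) / 2.0, (shape[1] - 1) / 2.0
    around = np.exp(-((cols - cc) ** 2) / (2.0 * sigma ** 2))
    along = 1.0 - k * np.log1p(np.abs(rows - cr) / shape[0])
    field = around * along
    return field / field[int(cr), int(cc)]

def correct_heel(image):
    """Rescale pixels so the whole image looks as if exposed like the centre."""
    field = simulated_field(image.shape)
    return image / np.clip(field, 1e-6, None)

if __name__ == "__main__":
    img = np.random.rand(512, 512) * simulated_field((512, 512))  # toy mammogram
    print(np.std(img), np.std(correct_heel(img)))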
Abstract:
In medical processes where ionizing radiation is used, dose planning and dose delivery are the key elements for patient safety and treatment success, particularly when the dose delivered in a single treatment session can be an order of magnitude higher than the usual doses of radiotherapy. Therefore, the radiation dose should be well defined and precisely delivered to the target while minimizing radiation exposure to the surrounding normal tissues [1]. Several methods have been proposed to obtain three-dimensional (3-D) dose distributions [2, 3]. In this paper, we propose an alternative method, which can be easily implemented in any stereotactic radiosurgery center with a magnetic resonance imaging (MRI) facility. A phantom, with or without scattering centers, filled with Fricke gel solution is irradiated with a Gamma Knife(R) system at a chosen spot. The phantom can be a replica of a human organ such as the head, the breast or any other organ. It can even be constructed from a real 3-D MR image of a patient's organ using computer-aided construction and irradiated at a specific region corresponding to the tumor position determined by MRI. The spin-lattice relaxation time T1 of different parts of the irradiated phantom is determined by localized spectroscopy. The T1-weighted phantom images are used to correlate the image pixel intensities with the absorbed dose, and consequently a 3-D dose distribution with high resolution is obtained.
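The final step can be sketched as follows, assuming a hypothetical linear calibration between T1-weighted pixel intensity and absorbed dose; the coefficients would in practice be fitted from phantom regions irradiated with known doses, not taken from this paper:

# Illustrative conversion of a T1-weighted phantom volume into a 3-D dose map.
# The linear calibration below is hypothetical; its coefficients would be
# fitted from calibration voxels irradiated with known doses.
import numpy as np

def fit_calibration(intensities, known_doses):
    """Least-squares fit dose = a * intensity + b from calibration voxels."""
    a, b = np.polyfit(np.asarray(intensities), np.asarray(known_doses), deg=1)
    return a, b

def dose_map(t1_weighted_volume, a, b):
    """Apply the calibration voxel-wise to obtain a 3-D dose distribution."""
    return a * t1_weighted_volume + b

if __name__ == "__main__":
    a, b = fit_calibration([100, 150, 200, 250], [0.0, 5.0, 10.0, 15.0])
    volume = np.random.uniform(100, 250, size=(32, 32, 32))   # toy MR volume
    print(dose_map(volume, a, b).max())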
Abstract:
Estimating the sizes of hard-to-count populations is a challenging and important problem that occurs frequently in social science, public health, and public policy. This problem is particularly pressing in HIV/AIDS research because estimates of the sizes of the most at-risk populations (illicit drug users, men who have sex with men, and sex workers) are needed for designing, evaluating, and funding programs to curb the spread of the disease. A promising new approach in this area is the network scale-up method, which uses information about the personal networks of respondents to make population size estimates. However, if the target population has low social visibility, as is likely to be the case in HIV/AIDS research, scale-up estimates will be too low. In this paper we develop a game-like activity that we call the game of contacts in order to estimate the social visibility of groups, and report results from a study of heavy drug users in Curitiba, Brazil (n = 294). The game produced estimates of social visibility that were consistent with qualitative expectations but of surprising magnitude. Further, a number of checks suggest that the data are of high quality. While motivated by the specific problem of population size estimation, our method could be used by researchers more broadly and adds to long-standing efforts to combine the richness of social network analysis with the power and scale of sample surveys. (C) 2010 Elsevier B.V. All rights reserved.
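The basic scale-up estimator that such a visibility estimate adjusts can be sketched as below; the variable names and toy numbers are illustrative, not the study's data:

# Illustrative network scale-up estimate with a visibility adjustment.
# y[i]  : number of hidden-population members respondent i reports knowing
# d[i]  : respondent i's estimated personal network size (degree)
# N     : size of the general population
# visibility : estimated probability that a tie to a hidden-population member
#              is actually visible to the respondent (e.g. from the game of contacts)
def scale_up_estimate(y, d, N, visibility=1.0):
    naive = N * sum(y) / sum(d)          # basic scale-up estimator
    return naive / visibility            # correct the downward bias of low visibility

if __name__ == "__main__":
    y = [1, 0, 2, 0, 1]                  # toy reports
    d = [300, 250, 400, 150, 350]        # toy degrees
    print(scale_up_estimate(y, d, N=1_750_000, visibility=0.6))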
Abstract:
Two targets from HIV-1, reverse transcriptase (RT) and protease, were used during the past two decades for the discovery of the non-nucleoside reverse transcriptase inhibitors (NNRTI) and protease inhibitors (PI) that belong to the arsenal of antiretroviral therapy. Herein these enzymes were chosen as templates for computer-aided ligand design. Ligand- and structure-based drug design were the starting points for selecting compounds, by means of cheminformatic tools, from a database containing more than five million compounds. New promising lead structures were retrieved from the database and are available for acquisition and testing. Classes of molecules already described as NNRTI or PI in the literature also emerged and were useful to prove the reliability of the workflow, thus validating the work carried out so far. (c) 2007 Elsevier Masson SAS. All rights reserved.
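As an illustration of a ligand-based step in such a workflow, database compounds can be ranked by Tanimoto similarity of binary fingerprints to a known inhibitor; the fingerprints below are invented bit sets standing in for the output of a real cheminformatics toolkit:

# Illustrative ligand-based screening step: rank candidates by Tanimoto
# similarity of binary fingerprints to a reference inhibitor.
# The fingerprints are invented bit sets, not real molecular descriptors.
def tanimoto(a, b):
    """Tanimoto (Jaccard) similarity between two sets of 'on' bits."""
    return len(a & b) / len(a | b) if a | b else 0.0

reference = {1, 4, 7, 9, 15, 22}                      # known inhibitor (hypothetical)
database = {
    "cand_A": {1, 4, 7, 9, 15, 30},
    "cand_B": {2, 5, 8, 40, 41},
    "cand_C": {1, 4, 9, 22, 23, 24},
}

ranked = sorted(database.items(), key=lambda kv: tanimoto(reference, kv[1]), reverse=True)
for name, fp in ranked:
    print(name, round(tanimoto(reference, fp), 2))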
Abstract:
Alzheimer's disease is an ultimately fatal neurodegenerative disease, and BACE-1 has become an attractive validated target for its therapy, with more than a hundred crystal structures deposited in the PDB. In the present study, we present a new methodology that integrates ligand-based methods with structural information derived from the receptor. 128 BACE-1 inhibitors recently disclosed by GlaxoSmithKline R&D were selected specifically because the crystal structures of 9 of these compounds complexed with BACE-1, as well as five closely related analogs, have been made available. A new fragment-guided approach was designed to incorporate this wealth of structural information into a CoMFA study, and the methodology was systematically compared to other popular approaches, such as docking, for generating a molecular alignment. The influence of the partial-charge calculation method was also analyzed. Several consistent and predictive models are reported, including one with r^2 = 0.88, q^2 = 0.69 and r^2_pred = 0.72. The models obtained with the new methodology performed consistently better than those obtained with other methodologies, particularly in terms of external predictive power. Visual analysis of the contour maps in the context of the enzyme drew attention to a number of possible opportunities for the development of analogs with improved potency. These results suggest that 3D-QSAR studies may benefit from the additional structural information added by the presented methodology.
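For reference, the cross-validated and predictive statistics quoted above are conventionally defined as follows (standard CoMFA/3D-QSAR definitions, not formulas reproduced from the paper):

% Leave-one-out cross-validated q^2 and external predictive r^2 (conventional definitions)
q^{2} = 1 - \frac{\sum_{i}\bigl(y_i - \hat{y}_{i}^{\mathrm{LOO}}\bigr)^{2}}{\sum_{i}\bigl(y_i - \bar{y}\bigr)^{2}},
\qquad
r^{2}_{\mathrm{pred}} = 1 - \frac{\sum_{j \in \mathrm{test}}\bigl(y_j - \hat{y}_j\bigr)^{2}}{\sum_{j \in \mathrm{test}}\bigl(y_j - \bar{y}_{\mathrm{train}}\bigr)^{2}}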
Abstract:
Background: Several methods have been used to prevent pericardial and retrosternal adhesions, but none of them has evaluated the mesothelial regenerative hypothesis. There is evidence that mesothelial trauma reduces pericardial fibrinolytic capability and induces an adhesion process. Keratinocyte growth factor (KGF) has been shown to improve mesothelial cell proliferation. This study investigated the influence of keratinocyte growth factor in reducing post-surgical adhesions. Methods: Twelve pigs were operated on and an adhesion protocol was employed. Following a stratified randomization, the animals received a topical application of KGF or saline. At 8 weeks, intrapericardial adhesions were evaluated and a severity score was established. The time spent to dissect the adhesions and the amount of sharp dissection used were recorded. Histological sections were stained with sirius red and morphometric analyses were performed with a computer-assisted image analysis system. Results: The severity score was lower in the KGF group than in the control group (11.5 vs 17, p = 0.005). The dissection time was lower in the KGF group (9.2 +/- 1.4 min vs 33.9 +/- 9.2 min, p = 0.004) and presented a significant correlation with the severity score (r = 0.83, p = 0.001). Significantly less sharp dissection was also required in the KGF group. In addition, adhesion area and adhesion collagen were significantly lower in the KGF group than in the control group. Conclusion: The stimulation of pericardial cells with KGF reduced the intensity of postoperative adhesions and facilitated re-operation. This study suggests that mesothelial regeneration is the new horizon in anti-adhesion therapies. (C) 2008 European Association for Cardio-Thoracic Surgery. Published by Elsevier B.V. All rights reserved.
Abstract:
Background. Mesothelial injury is pivotal in the development of adhesions. An increase in the proliferation of mesothelial cells has been verified in in vitro studies with the use of keratinocyte growth factor (KGF). This study investigated the influence of KGF associated with thermo-sterilized carboxymethyl chitosan (NOCCts) in the reduction of pericardial adhesions. Methods. An induction model of pericardial adhesion was carried out in 24 pigs. Animals were randomly allocated to receive a topical application of KGF, KGF + NOCCts, NOCCts, or saline (control). At 8 weeks, intra-pericardial adhesions were evaluated and a severity score was established. The time spent to dissect the adhesions and the amount of sharp dissection used were recorded. Histologic sections were stained with sirius red for a morphometric evaluation using a computer-assisted image analysis system. Cytokeratin AE1/AE3 immunostaining was employed to identify mesothelial cells. Results. The severity score, expressed as median (minimum to maximum), was lower than in the control group (17 [15 to 18]) in the KGF + NOCCts group (7 [6 to 9], p < 0.01), followed by the KGF group (11.5 [9 to 12], 0.01 < p < 0.05) and the NOCCts group (12 [9 to 14], p > 0.05). The dissection time was significantly lower in the KGF + NOCCts group (7.1 +/- 0.6 vs 33.9 +/- 9.2 minutes, p < 0.001). Significantly less sharp dissection was also required in the KGF + NOCCts group. In the adhesion segment, a decreased collagen proportion was found in the KGF + NOCCts group (p < 0.05). Mesothelial cells were present more extensively in the groups in which KGF was delivered (p = 0.01). Conclusions. The use of KGF associated with NOCCts resulted in a synergistic action that decreased postoperative pericardial adhesions in a highly significant way. (Ann Thorac Surg 2010; 90: 566-72) (C) 2010 by The Society of Thoracic Surgeons
Abstract:
The introduction of a new technology, High Speed Downlink Packet Access (HSDPA), in Release 5 of the 3GPP specifications raises the question of its performance capabilities. HSDPA is a promising technology offering theoretical rates of up to 14.4 Mbit/s. The main objective of this thesis is to discuss the system-level performance of HSDPA. The exploration focuses mainly on the Packet Scheduler, because it is the central entity of the HSDPA design. Due to its function, the Packet Scheduler has a direct impact on HSDPA system performance. Similarly, it also determines the end-user performance and, more specifically, the relative performance between the users in the cell. The thesis analyzes several Packet Scheduling algorithms that can optimize the trade-off between system capacity and end-user performance for the traffic classes targeted in this thesis. The performance of the algorithms in the HSDPA system is evaluated through computer-aided simulations carried out under realistic conditions, so as to give precise predictions of the algorithms' efficiency. The HSDPA system simulation and the algorithms are implemented in C/C++.
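As an illustration of the scheduler trade-off being studied, a proportional fair selection rule, one commonly analysed HSDPA packet scheduler (not necessarily one of the thesis' algorithms), can be sketched as follows with invented rates and smoothing constant:

# Illustrative proportional fair (PF) scheduling decision for one TTI.
# Each user is scored by instantaneous achievable rate divided by its
# exponentially smoothed average throughput; numbers are invented.
def pf_schedule(inst_rate, avg_thr):
    """Return the user maximising r_i / T_i (proportional fair metric)."""
    return max(inst_rate, key=lambda u: inst_rate[u] / max(avg_thr[u], 1e-9))

def update_throughput(avg_thr, served_user, inst_rate, tc=100.0):
    """Exponential smoothing of average throughput with time constant tc TTIs."""
    for u in avg_thr:
        served = inst_rate[u] if u == served_user else 0.0
        avg_thr[u] = (1.0 - 1.0 / tc) * avg_thr[u] + (1.0 / tc) * served
    return avg_thr

if __name__ == "__main__":
    rates = {"u1": 1.2e6, "u2": 3.5e6, "u3": 0.6e6}   # bit/s this TTI (invented)
    thr = {"u1": 0.8e6, "u2": 3.4e6, "u3": 0.2e6}     # smoothed averages
    chosen = pf_schedule(rates, thr)
    print(chosen, update_throughput(thr, chosen, rates))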
Abstract:
In light of new technologies, this work discusses the forging process of aluminium alloys (ABNT 6061), seeking to propose a methodology based on engineering science. The aim is to minimise trial-and-error procedures in the development of forming processes. To that end, new technologies currently available, such as Computer-Aided Design (CAD), Computer-Aided Manufacturing (CAM) and process simulation (CAE), are employed. Experimental results showing the behaviour of the ABNT 6061 alloy through its flow curves, as well as the characterisation of the friction condition in the forming process by evaluating two commercially available lubricants (Deltaforge 31 and Oildag) for aluminium alloy applications, are reported in this work. A comparison between the results of a practical forging experiment and a Finite Element Method simulation using the QForm code is presented for an axisymmetric aluminium alloy part. Finally, the results obtained in the forging of an automotive component in aluminium alloy (ABNT 6061), developed in partnership with the company Dana, are analysed and compared with computational simulations performed using the Superforge code.
Abstract:
Motivated by the development of a graphical representation of networks with a large number of vertices, useful for collaborative filtering applications, this work proposes the use of cohesion surfaces over a multidimensionally scaled thematic base. To this end, it uses a combination of classical multidimensional scaling and procrustes analysis in an iterative algorithm that produces partial solutions, which are later combined into a global solution. Applied to an example of book-lending transactions from the Biblioteca Karl A. Boedecker, the proposed algorithm produces interpretable and thematically coherent outputs and exhibits a lower stress than the classical scaling solution.
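A minimal sketch of the combination described, classical multidimensional scaling on partial distance blocks followed by procrustes alignment over their overlap, with invented data and scipy's procrustes used as a stand-in for the paper's own implementation:

# Illustrative combination of classical multidimensional scaling and
# procrustes alignment: scale two overlapping partial distance matrices
# separately, then align the second solution onto the first over the
# shared points. Data and block sizes are invented.
import numpy as np
from scipy.spatial import procrustes

def classical_mds(D, dim=2):
    """Classical (Torgerson) MDS from a distance matrix."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J                 # double centring
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:dim]
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    points = rng.normal(size=(12, 2))                          # hidden "true" layout
    D = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    block1, block2 = classical_mds(D[:8, :8]), classical_mds(D[4:, 4:])
    # align the two partial solutions on their 4 shared points (indices 4..7)
    aligned1, aligned2, disparity = procrustes(block1[4:], block2[:4])
    print("procrustes disparity on the overlap:", disparity)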
Abstract:
Business strategy is a young discipline. Compared with the fields of economics and sociology, the field of business strategy can be seen as a more recently formed phenomenon, although an extremely dynamic one in its capacity to create distinct theoretical approaches. This work discusses the recent proliferation of theories in business strategy, proposing a classification model for these theories based on an empirical analysis of the model of schools of strategic thought developed by Mintzberg, Ahlstrand and Lampel in their book Safári de Estratégia (Strategy Safari, 1998). The possible consequences of the interaction between theory and practice are also discussed, presenting what we define as the platypus syndrome.
Abstract:
Recommendation systems based on indirect cooperation can be implemented in libraries through the application of network analysis concepts and procedures. A thematic distance measure, initially developed for dichotomous variables, was generalised and applied to co-occurrence matrices, making use of all the available information about users' behaviour with respect to the items they consulted. As a result, highly coherent specialised subgroups were formed, for which base lists and personalised lists were generated in the usual way. Programmable applications capable of handling matrices, such as the S-plus software, were used for the calculations (with advantages over the specialised UCINET 5.0 software) and proved sufficient for processing thematic groups of up to 10,000 users.
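A minimal sketch of the kind of computation involved: a Jaccard-like distance for dichotomous vectors generalised to co-occurrence counts, then hierarchical clustering of users into thematic subgroups; the measure and the toy matrix are illustrative, not the paper's exact formulation:

# Illustrative generalisation of a dichotomous (Jaccard-like) distance to
# count data, followed by hierarchical clustering of users into thematic
# subgroups. The distance and the toy user-item matrix are illustrative only.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

def count_distance(u, v):
    """1 - (sum of elementwise minima / sum of elementwise maxima);
    reduces to the Jaccard distance when u and v are 0/1 vectors."""
    denom = np.maximum(u, v).sum()
    return 1.0 - np.minimum(u, v).sum() / denom if denom else 0.0

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    usage = rng.poisson(0.7, size=(20, 30))        # toy user x item consultation counts
    n = usage.shape[0]
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            D[i, j] = D[j, i] = count_distance(usage[i], usage[j])
    groups = fcluster(linkage(squareform(D), method="average"), t=4, criterion="maxclust")
    print(groups)                                   # thematic subgroup label per user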
Abstract:
Last week I sat down with a Brazilian acquaintance who was shaking his head over the state of national politics. A graduate of a military high school, he'd been getting e-mails from former classmates, many of them now retired army officers, who were irate over the recent presidential elections. "We need to kick these no-good Petistas out of office," one bristled, using the derogatory shorthand for members of the ruling Workers Party, or PT in Portuguese.
Abstract:
What you see above is a graphic representation of something anyone who followed the campaign that led to the re-election of Dilma Rousseff as Brazil’s president on October 26 already knows: the election was the most polarised in the country’s history. Brazil was split down the middle, not only numerically (Dilma got 52 per cent, Aécio Neves 48) and geographically (Dilma won in the less developed north, Aécio in the more prosperous south). The twittersphere, too, was divided into two camps. Not only that; they hardly talked to each other at all.