Abstract:
The accurate identification of T-cell epitopes remains a principal goal of bioinformatics within immunology. As the immunogenicity of peptide epitopes depends on their binding to major histocompatibility complex (MHC) molecules, the prediction of binding affinity is a prerequisite to the reliable prediction of epitopes. The iterative self-consistent (ISC) partial-least-squares (PLS)-based additive method is a recently developed bioinformatic approach for predicting class II peptide-MHC binding affinity. The ISC-PLS method overcomes many of the conceptual difficulties inherent in the prediction of class II peptide-MHC affinity, such as the binding of a mixed population of peptide lengths due to the open-ended class II binding site. The method has applications in both the accurate prediction of class II epitopes and the manipulation of affinity for heteroclitic and competitor peptides. The method was applied here to six class II mouse alleles (I-Ab, I-Ad, I-Ak, I-As, I-Ed, and I-Ek) and included peptides up to 25 amino acids in length. A series of regression equations highlighting the quantitative contributions of individual amino acids at each peptide position was established. The initial model for each allele exhibited only moderate predictivity. Once the set of selected peptide subsequences had converged, the final models exhibited satisfactory predictive power. Convergence was reached between the 4th and 17th iterations, and the leave-one-out cross-validation statistical terms (q², SEP, and NC) ranged between 0.732 and 0.925, 0.418 and 0.816, and 1 and 6, respectively. The non-cross-validated statistical terms r² and SEE ranged between 0.980 and 0.995 and between 0.089 and 0.180, respectively. The peptides used in this study are available from the AntiJen database (http://www.jenner.ac.uk/AntiJen). The PLS method is available commercially in the SYBYL molecular modeling software package. The resulting models, which can be used for accurate T-cell epitope prediction, will be made freely available online (http://www.jenner.ac.uk/MHCPred).
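As a reading aid, the additive scheme described in this abstract can be written as a simple sum. This is a plausible reconstruction of the model's general form, not the exact published equation; terms for side-chain interactions, which the full method may include, are omitted:

```latex
% Additive model sketch: the predicted binding affinity of a peptide is a
% constant term plus the fitted contribution P_i(a_i) of the amino acid a_i
% occupying each peptide position i (here up to n <= 25 residues).
\log\frac{1}{IC_{50}} = \mathrm{const} + \sum_{i=1}^{n} P_i(a_i)
```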
Abstract:
Motivation: The immunogenicity of peptides depends on their ability to bind to MHC molecules. MHC binding affinity prediction methods can save significant amounts of experimental work. The class II MHC binding site is open at both ends, making epitope prediction difficult because long peptides can bind in multiple ways. Results: An iterative self-consistent partial least squares (PLS)-based additive method was applied to a set of 66 peptides no longer than 16 amino acids binding to DRB1*0401. A regression equation containing the quantitative contributions of the amino acids at each of the nine positions was generated. Its predictivity was tested using two external test sets, which gave r_pred = 0.593 and r_pred = 0.655, respectively. Furthermore, it was benchmarked using 25 known T-cell epitopes restricted by DRB1*0401, and we compared our results with four other online predictive methods. The additive method showed the best result, identifying 24 of the 25 T-cell epitopes. Availability: Peptides used in the study are available from http://www.jenner.ac.uk/JenPep. The PLS method is available commercially in the SYBYL molecular modelling software package. The final model for affinity prediction of peptides binding to the DRB1*0401 molecule is available at http://www.jenner.ac.uk/MHCPred. Models developed for DRB1*0101 and DRB1*0701 are also available in MHCPred.
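To make the regression-equation idea concrete, here is a minimal, hypothetical sketch of how such an additive model scores a nine-position binding core. All coefficients are placeholders; the real values come from the PLS fit described above:

```python
# Hypothetical sketch of an additive scoring model for a 9-mer core:
# predicted log(1/IC50) = constant + sum of position-specific amino acid
# contributions. All coefficients below are placeholders (zeros); real
# values would come from the PLS fit described in the abstract.

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

CONST = 0.0
COEFF = {(pos, aa): 0.0 for pos in range(9) for aa in AMINO_ACIDS}

def predict_affinity(nonamer: str) -> float:
    """Predicted log(1/IC50) of a nine-residue core under the additive model."""
    assert len(nonamer) == 9, "this sketch scores exactly nine positions"
    return CONST + sum(COEFF[(pos, aa)] for pos, aa in enumerate(nonamer))

print(predict_affinity("AKFVAAWTL"))  # an arbitrary example nonamer
```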
Abstract:
Due to dynamic variability, identifying the specific conditions under which non-functional requirements (NFRs) are satisfied may only be possible at runtime. It is therefore necessary to consider the dynamic treatment of relevant information in the requirements specification. The associated data can be gathered by monitoring the execution of the application and its underlying environment, supporting reasoning about how well the current application configuration fulfills the established requirements. This paper presents a dynamic decision-making infrastructure that supports both NFR representation and monitoring, and that reasons about the degree of NFR satisfaction at runtime. The infrastructure is composed of: (i) an extended feature model aligned with a domain-specific language for representing the NFRs to be monitored at runtime; (ii) a monitoring infrastructure to continuously assess NFRs at runtime; and (iii) a flexible decision-making process to select the best available configuration based on the degree of NFR satisfaction. The evaluation of the approach showed that it can choose application configurations that fit user NFRs well based on runtime information, and that the proposed infrastructure provides consistent indicators of the best application configurations for those NFRs. Finally, a benefit of our approach is that it allows us to quantify the level of satisfaction with respect to the NFR specification.
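As an illustration of the decision-making step (iii), the sketch below selects the configuration with the highest weighted NFR satisfaction. The data structures, weights, and function names are assumptions made for illustration, not the paper's actual interfaces:

```python
# Minimal sketch of configuration selection by NFR satisfaction degree.
# Assumed (not from the paper): each configuration maps monitored NFRs to a
# satisfaction degree in [0, 1], and NFRs carry user-assigned weights.

from typing import Dict

def best_configuration(
    configs: Dict[str, Dict[str, float]],  # config name -> NFR -> degree
    weights: Dict[str, float],             # NFR -> user-assigned weight
) -> str:
    """Pick the configuration with the highest weighted NFR satisfaction."""
    def score(degrees: Dict[str, float]) -> float:
        return sum(weights[nfr] * degrees.get(nfr, 0.0) for nfr in weights)
    return max(configs, key=lambda name: score(configs[name]))

configs = {
    "low_power": {"performance": 0.4, "battery": 0.9},
    "high_perf": {"performance": 0.9, "battery": 0.3},
}
print(best_configuration(configs, {"performance": 0.7, "battery": 0.3}))
```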
Abstract:
Software product line engineering promotes large-scale software reuse by developing a system family that shares a set of core features and by enabling the selection and customization of a set of variabilities that distinguish each product of the family from the others. To address time-to-market pressure, the software industry has been using the clone-and-own technique to create and manage new software products or product lines. Despite its advantages, the clone-and-own approach brings several difficulties for the evolution and reconciliation of software product lines, especially because of the code conflicts generated by the simultaneous evolution of the original software product line, called Source, and its cloned products, called Target. This thesis proposes an approach to evolve and reconcile cloned products based on mining software repositories and code conflict analysis techniques. The approach supports the identification of different kinds of code conflicts (lexical, structural, and semantic) that can occur when integrating development tasks (bug fixes, enhancements, and new use cases) from the original evolved software product line into the cloned product line. We also conducted an empirical characterization study of the code conflicts produced during the evolution and merging of two large-scale web information system product lines. The results of our study demonstrate the approach's potential to automatically or semi-automatically solve several existing code conflicts, thus helping to reduce the complexity and costs of reconciling cloned software product lines.
Abstract:
The spread of wireless networks and the growing proliferation of mobile devices require the development of mobility control mechanisms to support different traffic demands under different network conditions. A major obstacle to developing this kind of technology is the complexity involved in handling all the information about the large number of Moving Objects (MOs), as well as the signaling overhead required to manage these procedures in the network. Although several initiatives have been proposed by the scientific community to address this issue, they have not proved effective, since they depend on a request from the particular MO, which is responsible for triggering the mobility process. Moreover, they are often guided only by wireless-medium statistics, such as the Received Signal Strength Indicator (RSSI) of the candidate Point of Attachment (PoA). This work therefore develops, evaluates, and validates a communication infrastructure for Wireless Networking for Moving Objects (WiNeMO) systems that exploits the flexibility provided by the Software-Defined Networking (SDN) paradigm, in which network functions are deployed easily and efficiently by integrating the OpenFlow and IEEE 802.21 standards. For benchmarking purposes, the analysis covered both the control and data planes, demonstrating that the proposal significantly outperforms a typical IP-based SDN with QoS-enabled capabilities by allowing the network to handle multimedia traffic with optimal Quality of Service (QoS) transport and acceptable Quality of Experience (QoE) over time.
Abstract:
The maintenance and evolution of software systems have become highly critical tasks over recent years due to the diversity and high demand of features, devices, and users. Understanding and analyzing how new changes impact the quality attributes of such systems' architecture is an essential prerequisite to prevent quality deterioration during their evolution. This thesis proposes an automated approach for analyzing variation of the performance quality attribute in terms of execution time (response time). It is implemented by a framework that adopts dynamic analysis and mining software repositories techniques to provide an automated way of revealing potential sources (commits and issues) of performance variation in scenarios during software system evolution. The approach defines four phases: (i) preparation, choosing the scenarios and preparing the target releases; (ii) dynamic analysis, determining the performance of scenarios and methods by computing their execution times; (iii) variation analysis, processing and comparing the dynamic analysis results across releases; and (iv) repository mining, identifying issues and commits associated with the detected performance variation. Empirical studies were performed to evaluate the approach from different perspectives. An exploratory study analyzed the feasibility of applying the approach to systems from different domains to automatically identify source code elements with performance variation and the changes that affected those elements during an evolution. This study analyzed three systems: (i) SIGAA, a web system for academic management; (ii) ArgoUML, a UML modeling tool; and (iii) Netty, a framework for network applications. Another study performed an evolutionary analysis by applying the approach to multiple releases of Netty and of the web frameworks Wicket and Jetty. In this study, 21 releases (seven from each system) were analyzed, totaling 57 scenarios. In summary, 14 scenarios with significant performance variation were found for Netty, 13 for Wicket, and 9 for Jetty. Additionally, feedback was obtained from eight developers of these systems through an online questionnaire. Finally, in the last study, a performance regression model was developed to indicate the properties of commits that are most likely to cause performance degradation. Overall, 997 commits were mined: 103 were retrieved from degraded source code elements and 19 from optimized ones, while 875 had no impact on execution time. The number of days before a release and the day of the week proved to be the most relevant variables of performance-degrading commits in our model. The regression model's area under the Receiver Operating Characteristic (ROC) curve is 60%, meaning that using the model to decide whether a commit will cause degradation is 10% better than a random decision.
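To illustrate phase (iii), the variation analysis, the following sketch flags scenarios whose mean execution time changed beyond a relative threshold between two releases. The threshold and data layout are illustrative assumptions, not the criteria used in the thesis:

```python
# Sketch of a variation-analysis step between two releases. Assumed (not
# from the thesis): each release maps scenario names to mean execution
# times in milliseconds, and a 10% relative change counts as significant.

from typing import Dict, List, Tuple

def performance_variations(
    old: Dict[str, float],    # scenario -> mean execution time, release N
    new: Dict[str, float],    # scenario -> mean execution time, release N+1
    threshold: float = 0.10,  # flag relative changes greater than 10%
) -> List[Tuple[str, float]]:
    """Return (scenario, relative change) pairs with significant variation."""
    flagged = []
    for scenario in old.keys() & new.keys():
        change = (new[scenario] - old[scenario]) / old[scenario]
        if abs(change) > threshold:
            flagged.append((scenario, change))
    return sorted(flagged, key=lambda pair: -abs(pair[1]))

print(performance_variations({"login": 120.0, "search": 80.0},
                             {"login": 150.0, "search": 82.0}))
```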
Abstract:
Assessing the impacts of technologies can support the decision-making process. This work presents a methodology for assessing the impacts of technological innovations, adapted for the integrated assessment of agricultural biotechnologies and their applications, providing information organized according to criteria and indicators across the various dimensions in which the impacts of the agricultural biotechnology under assessment can be perceived. The method consists of a system that enables the analysis of: (i) the scenario into which the technology will be introduced, through the generation of a significance index; and (ii) the performance of the innovation, through the analysis of the performance indicators that compose a magnitude index. The system is supported by a software tool, INOVA-tec v 1.0, whose information is presented as guidance to allow an informed and well-grounded assessment.
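The abstract does not give the formulas behind the two indices, so the sketch below only illustrates one plausible reading: aggregating indicator scores into an index as a weighted mean. Indicator names, scores, and weights are hypothetical:

```python
# Illustrative aggregation only: a weighted mean of indicator scores as a
# stand-in for the significance and magnitude indices described above.

def weighted_index(scores: dict, weights: dict) -> float:
    """Weighted mean of indicator scores over the weighted indicators."""
    total = sum(weights.values())
    return sum(weights[k] * scores[k] for k in weights) / total

significance = weighted_index(
    {"market_readiness": 0.8, "regulatory_fit": 0.6},   # hypothetical
    {"market_readiness": 2.0, "regulatory_fit": 1.0},
)
magnitude = weighted_index(
    {"economic": 0.7, "environmental": 0.5, "social": 0.9},  # hypothetical
    {"economic": 1.0, "environmental": 1.0, "social": 1.0},
)
print(significance, magnitude)
```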
Abstract:
In the field of digital aerophotogrammetry, the prevailing commercial post-processing software has limitations due to two factors: (i) each country's or region's legislation requires different conventions, and (ii) companies' needs change so quickly that they do not justify the purchase of high-performance software, which may go unused after a market shift. This project was developed to address needs for automatic map processing (partitioning, error detection and correction, etc.), as well as package-to-package import/export modules, flight-route generation, and GPS interaction. This article reports on the last two aspects. Because of client requirements, delivered files must use a commercial format (DWG, DXF), but file processing must be done in other packages and formats (DGN). It was therefore necessary to design and implement a companion format that carries the information lost when commercial filters (DGN to DXF/DWG) are used; redundant import and export modules were also created to make those attributes effective. Regarding flight-route generation, this article reports the application of traditional 2D area-sweeping ("combing") algorithms, extended with geometric constraints (fixed points, offsets, sweep ordering according to the coordinates of the departure site, etc.). Given the high cost of equivalent equipment, software was also developed to translate routes between GPS formats and the country's local geographic formats. This eliminates sources of error and simplifies loading the flight plan, at a cost far below that of commercial hardware/software.
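As an illustration of the area-sweeping idea with geometric constraints, the sketch below generates parallel flight lines over a rectangular area with a boundary offset and alternating direction. Coordinates, spacing, and the rectangular-area simplification are assumptions for illustration; real plans would use the local geographic datum:

```python
# Sketch of sweep ("combing") flight-line generation over a rectangular
# 2D area, with an inset offset from the boundary and boustrophedon
# ordering so consecutive lines are flown in opposite directions.

from typing import List, Tuple

Point = Tuple[float, float]

def sweep_lines(
    xmin: float, ymin: float, xmax: float, ymax: float,
    spacing: float,       # distance between adjacent flight lines
    offset: float = 0.0,  # inset from the area boundary
) -> List[Tuple[Point, Point]]:
    """Parallel lines swept across the area, alternating direction."""
    lines = []
    y = ymin + offset
    leftward = False
    while y <= ymax - offset:
        start, end = (xmin + offset, y), (xmax - offset, y)
        lines.append((end, start) if leftward else (start, end))
        leftward = not leftward
        y += spacing
    return lines

for segment in sweep_lines(0, 0, 100, 40, spacing=10, offset=5):
    print(segment)
```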
Abstract:
In the near future, the LHC experiments will continue to be upgraded as the LHC luminosity increases, with the HL-LHC project, from the design value of 10³⁴ cm⁻²s⁻¹ to 7.5 × 10³⁴ cm⁻²s⁻¹, reaching 3000 fb⁻¹ of accumulated statistics. After the end of a data-taking period, CERN will face a long shutdown to improve overall performance by upgrading the experiments and implementing more advanced technologies and infrastructures. In particular, ATLAS will upgrade parts of the detector, the trigger, and the data acquisition system. It will also implement new strategies and algorithms for processing and transferring the data to the final storage. This PhD thesis presents a study of a new pattern recognition algorithm to be used in the trigger system, the software responsible for providing the information necessary to select physical events from background data. The idea is to use the well-known Hough Transform as an algorithm for detecting particle trajectories. The effectiveness of the algorithm has already been validated in the past, independently of particle physics applications, to detect generic shapes in images. Here, a software emulation tool is proposed for the hardware implementation of the Hough Transform, to reconstruct tracks in the ATLAS Trigger and Data Acquisition system. Until now, it has never been implemented in electronics in particle physics experiments, and a hardware implementation would provide overall latency benefits. A comparison between the simulated data and the physical system was performed on a Xilinx UltraScale+ FPGA device.
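For readers unfamiliar with the technique, the sketch below shows the classic line-detecting Hough Transform on 2D points: each hit votes for every (theta, r) parameter pair consistent with it, and collinear hits accumulate votes in a single cell. This illustrates only the general idea; the ATLAS emulation and FPGA implementation are far more elaborate:

```python
# Line-detecting Hough Transform sketch: each hit (x, y) votes, for every
# sampled angle theta, for the distance r = x*cos(theta) + y*sin(theta).
# Hits lying on one straight line pile votes onto a single (theta, r) cell.

import math
from collections import Counter

def hough_lines(hits, n_theta=180, r_step=1.0):
    """Return the (theta, r) vote accumulator built from 2D hit coordinates."""
    accumulator = Counter()
    for x, y in hits:
        for i in range(n_theta):
            theta = math.pi * i / n_theta
            r = x * math.cos(theta) + y * math.sin(theta)
            accumulator[(round(theta, 3), round(r / r_step))] += 1
    return accumulator

# Hits lying on the line y = x should produce one dominant accumulator cell.
hits = [(t, t) for t in range(10)]
(theta, r), votes = hough_lines(hits).most_common(1)[0]
print(theta, r, votes)
```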
Abstract:
In this study, 103 unrelated South American patients with mucopolysaccharidosis type II (MPS II) were investigated, aiming at the identification of disease-causing mutations in the iduronate-2-sulfatase (IDS) gene and at possible insights into the genotype-phenotype correlation. The genotyping strategy involved identifying the previously reported inversion/disruption of the IDS gene by PCR and screening for other mutations by PCR/SSCP. Exons with altered mobility on SSCP were sequenced, as were all exons of patients with no SSCP alteration. Using this strategy, we were able to find the pathogenic mutation in all patients. Alterations such as inversion/disruption and partial/total deletions of the IDS gene were found in 20/103 (19%) patients. Small insertions/deletions/indels (<22 bp) and point mutations were identified in 83/103 (81%) patients, including 30 novel mutations; except for a higher frequency of small duplications relative to small deletions, the frequencies of major and minor alterations found in our sample are in accordance with those described in the literature.
Abstract:
To analyze the effects of treatment approach on the outcomes of newborns (birth weight [BW] < 1,000 g) with patent ductus arteriosus (PDA) from the Brazilian Neonatal Research Network (BNRN): death, bronchopulmonary dysplasia (BPD), severe intraventricular hemorrhage (IVH III/IV), retinopathy of prematurity requiring surgery (ROPsur), necrotizing enterocolitis requiring surgery (NECsur), and death/BPD. This was a multicentric cohort study with retrospective data collection, including newborns (BW < 1,000 g) with gestational age (GA) < 33 weeks and echocardiographic diagnosis of PDA, from 16 neonatal units of the BNRN from January 1, 2010 to December 31, 2011. Newborns who died or were transferred by the third day of life, and those with congenital malformation or infection, were excluded. Groups: G1, conservative approach (without treatment); G2, pharmacologic (indomethacin or ibuprofen); G3, surgical ligation (independent of previous treatment). Factors analyzed: antenatal corticosteroid, cesarean section, BW, GA, 5-min Apgar score < 4, male gender, Score for Neonatal Acute Physiology Perinatal Extension (SNAPPE II), respiratory distress syndrome (RDS), late sepsis (LS), mechanical ventilation (MV), surfactant (< 2 h of life), and time on MV. Outcomes: death, O2 dependence at 36 weeks (BPD36wks), IVH III/IV, ROPsur, NECsur, and death/BPD36wks. Statistics: Student's t-test, chi-squared test, or Fisher's exact test; odds ratios (95% CI); binary logistic regression and backward stepwise multiple regression, using MedCalc (Medical Calculator) software, version 12.1.4.0; p-values < 0.05 were considered statistically significant. 1,097 newborns were selected and 494 were included: G1, 187 (37.8%); G2, 205 (41.5%); G3, 102 (20.6%). The highest mortality was observed in G1 (51.3%) and the lowest in G3 (14.7%). The highest frequencies of BPD36wks (70.6%) and ROPsur (23.5%) were observed in G3. The lowest occurrence of death/BPD36wks occurred in G2 (58.0%). Pharmacological (OR 0.29; 95% CI: 0.14-0.62) and conservative (OR 0.34; 95% CI: 0.14-0.79) treatments were protective against the outcome death/BPD36wks. The conservative approach to PDA was associated with high mortality, the surgical approach with the occurrence of BPD36wks and ROPsur, and the pharmacological treatment was protective against the outcome death/BPD36wks.
Abstract:
Sickle cell disease (SCD) pathogenesis leads to recurrent vaso-occlusive and hemolytic processes, causing numerous clinical complications, including renal damage. As vasoconstrictive mechanisms may be enhanced in SCD due to endothelial dysfunction and vasoactive protein production, we aimed to determine whether the expression of proteins of the renin-angiotensin system (RAS) is altered in an animal model of SCD. Plasma angiotensin II (Ang II) was measured in C57BL/6 (WT) mice and mice with SCD by ELISA, while quantitative PCR was used to compare the expression of the genes encoding the angiotensin II receptors 1 and 2 (AT1R and AT2R) and the angiotensin-converting enzymes (ACE1 and ACE2) in the kidneys, hearts, livers, and brains of the mice. The effects of hydroxyurea (HU; 50-75 mg/kg/day, 4 weeks) treatment on these parameters were also determined. Plasma Ang II was significantly diminished in SCD mice compared with WT mice, in association with decreased AT1R and ACE1 expression in SCD mouse kidneys. Treatment of SCD mice with HU reduced leukocyte and platelet counts and increased plasma Ang II to levels similar to those of WT mice. HU also increased AT1R and ACE2 gene expression in the kidney and heart. The results indicate an imbalanced RAS in an SCD mouse model; HU therapy may be able to restore some RAS parameters in these mice. Further investigation of Ang II production and the RAS in human SCD may be warranted, as such changes may reflect or contribute to renal damage and alterations in blood pressure.
Abstract:
A monomeric basic PLA2 (PhTX-II) with a molecular weight of 14,149.08 Da was purified to homogeneity from Porthidium hyoprora venom. Amino acid sequencing by tandem mass spectrometry revealed that PhTX-II belongs to the Asp49 PLA2 enzyme class and displays the conserved domains, such as the catalytic network, the Ca2+-binding loop, and the hydrophobic channel of access to the catalytic site, reflected in the high catalytic activity displayed by the enzyme. Moreover, PhTX-II PLA2 showed allosteric behavior, and its enzymatic activity was dependent on Ca2+. Examination of PhTX-II PLA2 by CD spectroscopy indicated a high content of alpha-helical structure, similar to the known structure of group IIA secreted phospholipases, suggesting a similar fold. PhTX-II PLA2 causes neuromuscular blockade in avian neuromuscular preparations, with a significant direct action on skeletal muscle function, and induces local edema and myotoxicity in mice. Treatment of PhTX-II with BPB resulted in complete loss of its catalytic activity, accompanied by loss of its edematogenic effect. The enzymatic activity of PhTX-II thus contributes to the neuromuscular blockade, while the local myotoxicity does not depend solely on enzymatic activity. These results show that PhTX-II is a myotoxic Asp49 PLA2 that contributes to the toxic actions caused by P. hyoprora venom.