978 results for computer forensics tools
Abstract:
The article reflects on the training requirements that the knowledge society demands of professionals. One of the most important goals a university must pursue in the knowledge society is to train competent professionals equipped with sufficient intellectual tools to confront the uncertainty of information, the awareness that it has a short-term expiry date, and the anxiety this provokes. Beyond that, professionals must also be able to define and create the working tools with which they will give meaning and effectiveness to this changeable, mutating knowledge. For this reason, the European Higher Education Area prioritizes the transversal competence of collaborative work, with the aim of promoting autonomous, committed learning adapted to the new needs of twenty-first-century business. In this context, the article presents the theoretical framework underpinning the work developed on the ACME computing platform, which combines collaborative work with blended learning. It also describes in detail some examples of wikis, the paradigm of collaborative work, created in courses taught by the Universitat de Girona in the ACME virtual space.
Abstract:
BACKGROUND: The model plant Arabidopsis thaliana (Arabidopsis) shows a wide range of genetic and trait variation among wild accessions. Because of its unparalleled biological and genomic resources, the potential of Arabidopsis for molecular genetic analysis of this natural variation has increased dramatically in recent years. SCOPE: Advanced genomics has accelerated molecular phylogenetic analysis and gene identification by quantitative trait loci (QTL) mapping and/or association mapping in Arabidopsis. In particular, QTL mapping utilizing natural accessions is now becoming a major strategy of gene isolation, offering an alternative to artificial mutant lines. Furthermore, the genomic information is used by researchers to uncover the signature of natural selection acting on the genes that contribute to phenotypic variation. The evolutionary significance of such genes has been evaluated in traits such as disease resistance and flowering time. However, although molecular hallmarks of selection have been found for the genes in question, a corresponding ecological scenario of adaptive evolution has been difficult to prove. Ecological strategies, including reciprocal transplant experiments and competition experiments, and utilizing near-isogenic lines of alleles of interest will be a powerful tool to measure the relative fitness of phenotypic and/or allelic variants. CONCLUSIONS: As the plant model organism, Arabidopsis provides a wealth of molecular background information for evolutionary genetics. Because genetic diversity between and within Arabidopsis populations is much higher than anticipated, combining this background information with ecological approaches might well establish Arabidopsis as a model organism for plant evolutionary ecology.
Abstract:
Bacterial genomes evolve through mutations, rearrangements or horizontal gene transfer. Besides the core genes encoding essential metabolic functions, bacterial genomes also harbour a number of accessory genes, acquired by horizontal gene transfer, that may be beneficial under certain environmental conditions. Horizontal gene transfer contributes to the diversification and adaptation of microorganisms, and thus has an impact on genome plasticity. A significant part of horizontal gene transfer is, or has been, facilitated by genomic islands (GEIs). GEIs are discrete DNA segments, some of which are mobile and others not, or no longer, mobile, that differ among closely related strains. A number of GEIs are capable of integrating into the host chromosome, excising, and transferring to a new host by transformation, conjugation or transduction. GEIs play a crucial role in the evolution of a broad spectrum of bacteria, as they are involved in the dissemination of variable genes, including the antibiotic resistance and virulence genes that give rise to hospital 'superbugs', as well as catabolic genes leading to the formation of new metabolic pathways. Depending on the composition of its gene modules, the same type of GEI can promote the survival of pathogenic as well as environmental bacteria.
Abstract:
Positron emission tomography (PET) is a functional, noninvasive method for imaging regional metabolic processes that is nowadays most often combined with morphological imaging by computed tomography (CT). Its use is based on the well-founded assumption that metabolic changes occur earlier in tumors than morphologic changes, adding another dimension to imaging. This article reviews the established and investigational indications and radiopharmaceuticals for PET/CT imaging of prostate cancer, bladder cancer and testicular cancer, before presenting upcoming applications in radiation therapy.
Abstract:
Since 2004, four antiangiogenic drugs have been approved for clinical use in patients with advanced solid cancers, on the basis of their capacity to improve survival in phase III clinical studies. These achievements validated the concept, introduced by Judah Folkman, that inhibiting tumor angiogenesis could control tumor growth. It has been suggested that biomarkers of angiogenesis would greatly facilitate the clinical development of antiangiogenic therapies. For these four drugs, the pharmacodynamic effects observed in early clinical studies were important for corroborating activity, but were not essential for the continuation of clinical development and approval. Furthermore, no validated biomarkers of angiogenesis or antiangiogenesis are available for routine clinical use. Thus, the quest for biomarkers of angiogenesis and their successful use in the development of antiangiogenic therapies remain challenges in clinical oncology and translational cancer research. We review critical points arising from the successful clinical trials, survey current biomarkers, and discuss their potential impact on improving the clinical use of available antiangiogenic drugs and the development of new ones.
Abstract:
This thesis is devoted to the analysis, modeling and visualization of spatially referenced environmental data using machine learning algorithms. Machine learning can broadly be considered a subfield of artificial intelligence concerned with developing techniques and algorithms that allow a machine to learn from data. In this thesis, machine learning algorithms are adapted for application to environmental data and to spatial prediction. Why machine learning? Because most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can solve classification, regression and probability density modeling problems in high-dimensional spaces composed of spatially referenced informative variables ("geo-features") in addition to geographical coordinates. Moreover, they are well suited for implementation as decision-support tools for environmental questions ranging from pattern recognition to modeling and prediction, including automatic mapping. Their efficiency is comparable to that of geostatistical models in the space of geographical coordinates, but they are indispensable for high-dimensional data that include geo-features. The most important and popular machine learning algorithms are presented theoretically and implemented as software tools for the environmental sciences.
The main algorithms described are the multilayer perceptron (MLP), the best-known algorithm in artificial intelligence; general regression neural networks (GRNN); probabilistic neural networks (PNN); self-organizing maps (SOM); Gaussian mixture models (GMM); radial basis function networks (RBF); and mixture density networks (MDN). This range of algorithms covers varied tasks such as classification, regression and probability density estimation. Exploratory data analysis (EDA) is the first step of any data analysis. In this thesis, the concepts of exploratory spatial data analysis (ESDA) are treated both with the traditional geostatistical approach of experimental variography and according to machine learning principles. Experimental variography, which studies the relations between pairs of points, is a basic tool for the geostatistical analysis of anisotropic spatial correlations, detecting the presence of spatial patterns describable by a two-point statistic. The machine learning approach to ESDA is presented through the k-nearest neighbors method, which is very simple and has excellent interpretation and visualization properties. An important part of the thesis addresses topical subjects such as the automatic mapping of spatial data. The general regression neural network is proposed as an efficient solution to this task.
The performance of the GRNN is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, on which the GRNN significantly outperformed all other methods, particularly in emergency situations. The thesis consists of four chapters: theory, applications, software tools and guided examples. An important part of the work is a collection of software tools, Machine Learning Office. This collection was developed over the past 15 years and has been used for teaching numerous courses, including international workshops in China, France, Italy, Ireland and Switzerland, as well as in fundamental and applied research projects. The case studies considered cover a wide spectrum of real low- and high-dimensional geo-environmental problems, such as air, soil and water pollution by radioactive products and heavy metals, classification of soil types and hydrogeological units, uncertainty mapping for decision support, and natural hazard assessment (landslides, avalanches). Complementary tools for exploratory data analysis and visualization were also developed, with care taken to create a friendly, easy-to-use interface. Machine Learning for geospatial data: algorithms, software tools and case studies. Abstract: The thesis is devoted to the analysis, modeling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence; it mainly concerns the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning?
In a few words, most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can find solutions to classification, regression and probability density modeling problems in high-dimensional geo-feature spaces, composed of geographical space and additional relevant spatially referenced features. They are well suited to implementation as predictive engines in decision-support systems, for the purposes of environmental data mining, including pattern recognition, modeling and prediction, as well as automatic data mapping. Their efficiency is competitive with geostatistical models in low-dimensional geographical spaces, but they are indispensable in high-dimensional geo-feature spaces. The most important and popular machine learning algorithms and models of interest for the geo- and environmental sciences are presented in detail, from theoretical description of the concepts to software implementation. The main algorithms and models considered are the following: the multilayer perceptron (a workhorse of machine learning), general regression neural networks, probabilistic neural networks, self-organising (Kohonen) maps, Gaussian mixture models, radial basis function networks, and mixture density networks. This set of models covers machine learning tasks such as classification, regression and density estimation. Exploratory data analysis (EDA) is the initial and very important part of data analysis. In this thesis, the concepts of exploratory spatial data analysis (ESDA) are considered using both the traditional geostatistical approach, such as experimental variography, and machine learning. Experimental variography is a basic tool for the geostatistical analysis of anisotropic spatial correlations, helping to reveal the presence of spatial patterns, at least those described by two-point statistics.
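The two-point statistic behind experimental variography is compact enough to sketch directly. Below is a minimal, illustrative implementation of an isotropic experimental semivariogram (an assumption-laden sketch for intuition, not code from the Machine Learning Office tools described here):

```python
import numpy as np

def experimental_variogram(coords, values, lag_edges):
    """Isotropic experimental semivariogram from scattered data.

    coords: (n, 2) array of x, y positions
    values: (n,) array of measurements
    lag_edges: bin edges for pairwise separation distances
    Returns lag-bin centers and the mean semivariance per bin.
    """
    n = len(values)
    # all pairwise distances and halved squared value differences
    d = np.sqrt(((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1))
    g = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(n, k=1)          # count each pair once
    d, g = d[iu], g[iu]
    centers, gamma = [], []
    for lo, hi in zip(lag_edges[:-1], lag_edges[1:]):
        m = (d >= lo) & (d < hi)
        if m.any():
            centers.append(0.5 * (lo + hi))
            gamma.append(g[m].mean())
    return np.array(centers), np.array(gamma)
```

For data with spatial structure, the resulting semivariance typically grows with lag distance before levelling off, which is exactly the pattern variography is used to detect.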
A machine learning approach to ESDA is presented by applying the k-nearest neighbors (k-NN) method, which is simple and has very good interpretation and visualization properties. An important part of the thesis deals with a currently hot topic, namely the automatic mapping of geospatial data. The general regression neural network (GRNN) is proposed as an efficient model for this task. The performance of the GRNN model is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under emergency conditions. The thesis consists of four chapters with the following structure: theory, applications, software tools, and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office. These tools were developed over the last 15 years and have been used both for many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and for carrying out fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, classification of soil types and hydrogeological units, decision-oriented mapping with uncertainties, and natural hazard (landslides, avalanches) assessment and susceptibility mapping. Complementary tools for exploratory data analysis and visualisation were developed as well. The software is user-friendly and easy to use.
Abstract:
Computed Tomography Angiography (CTA) images are the standard for assessing peripheral artery disease (PAD). This paper presents a Computer Aided Detection (CAD) and Computer Aided Measurement (CAM) system for PAD. The CAD stage detects the arterial network using a 3D region growing method and a fast 3D morphology operation. The CAM stage aims to accurately measure artery diameters from the detected vessel centerline, compensating for the partial volume effect using Expectation Maximization (EM) and a Markov random field (MRF). The system has been evaluated on phantom data and applied to fifteen CTA datasets, where the detection accuracy for stenosis was 88% and the measurement error was 8%.
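The 3D region-growing step can be illustrated with a small sketch. This is a generic 6-connected, intensity-windowed grower (the connectivity and the threshold window are assumptions for illustration, not the paper's exact parameters):

```python
import numpy as np
from collections import deque

def region_grow_3d(volume, seed, low, high):
    """Grow a 6-connected region from `seed`, keeping voxels whose
    intensity lies in [low, high] (e.g. a contrast-enhanced lumen)."""
    mask = np.zeros(volume.shape, dtype=bool)
    mask[seed] = True
    q = deque([seed])
    steps = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
             (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    while q:
        z, y, x = q.popleft()
        for dz, dy, dx in steps:
            nz, ny, nx = z + dz, y + dy, x + dx
            if (0 <= nz < volume.shape[0] and 0 <= ny < volume.shape[1]
                    and 0 <= nx < volume.shape[2]
                    and not mask[nz, ny, nx]
                    and low <= volume[nz, ny, nx] <= high):
                mask[nz, ny, nx] = True   # accept voxel, expand from it
                q.append((nz, ny, nx))
    return mask
```

Starting from a seed placed inside a vessel, the breadth-first expansion stops wherever the intensity leaves the window, yielding a binary mask of the connected arterial lumen.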
A biophysical model of atrial fibrillation ablation: what can a surgeon learn from a computer model?
Abstract:
AIMS: Surgical ablation procedures for treating atrial fibrillation have been shown to be highly successful. However, the ideal ablation pattern remains to be determined. This article reports a systematic study of the effectiveness of different ablation line patterns. METHODS AND RESULTS: This study of ablation line patterns was performed in a biophysical model of the human atria by combining basic lines: (i) in the right atrium: isthmus line, line between the venae cavae, and appendage line; and (ii) in the left atrium: several versions of pulmonary vein isolation, connection of the pulmonary veins, isthmus line, and appendage line. Success rates and the presence of residual atrial flutter were documented. Basic patterns yielded conversion rates of only 10-25% and 10-55% in the right and left atria, respectively. The best result for pulmonary vein isolation was obtained when a single closed line encompassed all veins (55%). Combining lines within the right or left atrium alone led to success rates of 65% and 80%, respectively. Higher rates, up to 90-100%, could be obtained when right and left lines were combined. Inclusion of a left isthmus line was found to be essential for avoiding uncommon left atrial flutter. CONCLUSION: Some of the patterns studied achieved a high conversion rate, despite using fewer lines than the Maze III procedure. The biophysical atrial model is shown to be effective in the search for promising alternative ablation strategies.
Abstract:
Motivation: The comparative analysis of gene gain and loss rates is critical for understanding the role of natural selection and adaptation in shaping gene family sizes. Studying complete genome data from closely related species allows accurate estimation of gene family turnover rates. Current methods and software tools, however, are not well designed for dealing with certain kinds of functional elements, such as microRNAs or transcription factor binding sites. Results: Here, we describe BadiRate, a new software tool to estimate family turnover rates, as well as the number of elements at internal phylogenetic nodes, by likelihood-based methods and parsimony. It implements two stochastic population models, which provide the appropriate statistical framework for testing hypotheses such as lineage-specific gene family expansions or contractions. We have assessed the accuracy of BadiRate by computer simulations, and have also illustrated its functionality by analyzing a representative empirical dataset.
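To give a feel for the gain/loss counting that family-turnover methods formalize, here is a toy linear-cost parsimony on a three-taxon tree ((A,B),C): each unit change in family size along a branch counts as one gain or loss. This is only an illustrative brute-force sketch, not BadiRate's stochastic population models:

```python
from itertools import product

def min_turnover_cherry(n_a, n_b, n_c, max_n=10):
    """Minimal total gains + losses on the tree ((A,B),C), found by
    brute force over integer family sizes at the two internal nodes."""
    best = None
    for anc_ab, root in product(range(max_n + 1), repeat=2):
        # cost = absolute size change summed over the four branches
        cost = (abs(anc_ab - n_a) + abs(anc_ab - n_b)
                + abs(root - anc_ab) + abs(root - n_c))
        best = cost if best is None else min(best, cost)
    return best
```

Likelihood-based approaches replace this minimal-change criterion with explicit birth-death rates per branch, which is what allows formal tests of lineage-specific expansions or contractions.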
Abstract:
PURPOSE OF REVIEW: The kidney plays an essential role in maintaining sodium and water balance, thereby controlling the volume and osmolarity of the extracellular body fluids, the blood volume and the blood pressure. The final adjustment of sodium and water reabsorption in the kidney takes place in cells of the distal part of the nephron, in which a set of apical and basolateral transporters participate in vectorial sodium and water transport from the tubular lumen to the interstitium and, finally, to the general circulation. According to a current model, the activity and/or cell-surface expression of these transporters is under the control of a gene network composed of hormonally regulated as well as constitutively expressed genes. It is proposed that this gene network may include new candidate genes for salt- and water-losing syndromes and for salt-sensitive hypertension. A new generation of functional genomics techniques has recently been applied to the characterization of this gene network. The purpose of this review is to summarize these studies and to discuss the potential of the different techniques for characterizing the renal transcriptome. RECENT FINDINGS: Recently, DNA microarrays and serial analysis of gene expression have been applied to characterize the kidney transcriptome in different in-vivo and in-vitro models. In these studies, a set of interesting new genes potentially involved in the regulation of sodium and water reabsorption by the kidney have been identified and are currently under detailed investigation. SUMMARY: Characterization of the kidney transcriptome is greatly expanding our knowledge of the gene networks involved in multiple kidney functions, including the maintenance of sodium and water homeostasis.
Abstract:
Tumors are diverse and heterogeneous, but all share the capacity to proliferate without control. Deregulated cell proliferation coupled with insensitivity to the apoptotic response constitutes a minimal condition for tumor evolution to occur. One of the most widely used cancer treatments today is chemotherapy, which frequently relies on chemical compounds that induce DNA damage. Anticancer agents are effective only when tumor cells are more readily killed than the surrounding normal tissue. The efficacy of these agents is partly determined by their ability to induce apoptosis. We have recently demonstrated that the protein RasGAP is an unconventional caspase substrate because it can induce both anti- and pro-apoptotic signals, depending on the extent of its cleavage by caspases. At low levels of caspase activity, RasGAP is cleaved, generating two fragments (fragment N and fragment C). Fragment N appears to be a general inhibitor of apoptosis downstream of caspase activation. At higher levels of caspase activity, the capacity of fragment N to counteract apoptosis is suppressed when it is cleaved again by caspases. This latter cleavage produces two new fragments, N1 and N2, which, unlike fragment N, efficiently sensitize cancer cells to chemotherapeutic agents. In this study, we showed that a cell-permeable peptide derived from the N2 fragment of RasGAP, hereafter called TAT-RasGAP317-326, specifically sensitizes cancer cells to three different genotoxins commonly used in anticancer treatments, in both in vitro and in vivo models. Importantly, this peptide appears to have no effect on non-cancerous cells.
We have also begun to characterize the molecular mechanisms underlying the sensitizing functions of TAT-RasGAP317-326. We demonstrated that the transcription factor p53 and a protein under its transcriptional control, named Puma, are indispensable for the activity of TAT-RasGAP317-326. We also showed that TAT-RasGAP317-326 requires the presence of a protein called G3BP1, a RasGAP-binding protein, to potentiate the effects of anticancer agents. The data obtained in this study show that it may be possible to increase the efficacy of chemotherapies with a compound capable of increasing tumor sensitivity to genotoxins, and thus to treat patients undergoing chemotherapy more effectively. Summary: Tumors are diverse and heterogeneous, but all share the ability to proliferate without control. Deregulated cell proliferation coupled with suppressed apoptotic sensitivity constitutes a minimal requirement upon which tumor evolution occurs. One of the most commonly used treatments is chemotherapy, which frequently uses chemical compounds that induce DNA damage. Anticancer agents are effective only when tumor cells are more readily killed than the surrounding normal tissue. The efficacy of these agents is partly determined by their ability to induce apoptosis. We have recently demonstrated that the protein RasGAP is an unconventional caspase substrate because it can induce both anti- and pro-apoptotic signals, depending on the extent of its cleavage by caspases. At low levels of caspase activity, RasGAP is cleaved, generating an N-terminal fragment (fragment N) and a C-terminal fragment (fragment C). Fragment N appears to be a general blocker of apoptosis downstream of caspase activation. At higher levels of caspase activity, the ability of fragment N to counteract apoptosis is suppressed when it is further cleaved. This latter cleavage event generates two fragments, N1 and N2, which, in contrast to fragment N, potently sensitize cancer cells toward apoptosis induced by DNA-damaging agents. In the present study we show that a cell-permeable peptide derived from the N2 fragment of RasGAP, hereafter called TAT-RasGAP317-326, specifically sensitizes cancer cells to three different genotoxins commonly used in chemotherapy, in in vitro and in vivo models. Importantly, this peptide appears to have no effect on non-cancerous cells. We have also started to characterize the molecular mechanisms underlying the sensitizing functions of TAT-RasGAP317-326. We have demonstrated that the p53 transcription factor and a protein under its transcriptional control, called Puma, are required for the activity of TAT-RasGAP317-326. We have also shown that TAT-RasGAP317-326 requires the presence of a protein called G3BP1, which has been shown to interact with RasGAP, to increase the effect of the DNA-damaging drug cisplatin. The data obtained in this study show that it is possible to increase the efficacy of currently used chemotherapies with a compound able to increase the efficacy of genotoxins, which could be beneficial for patients undergoing chemotherapy.