919 results for GRAPHICAL LASSO
Abstract:
The advent of retrievable caval filters was a game changer: the previously irreversible act of implanting a medical device into the main venous blood stream of the body, which required careful evaluation of the pros and cons prior to execution, suddenly became a "reversible" procedure in which potential hazards in the patient's late future lost most of their weight at the time of decision making. This review was designed to assess the rate of success of late retrieval of so-called retrievable caval filters, in order to obtain some indication of reasonable implant duration with respect to relatively "easy" implant removal with conventional means, i.e., catheters, hooks and lassos. A PubMed search (www.pubmed.gov) was performed with the search term "cava filter retrieval after 30 days clinical", and 20 reports between 1994 and 2013 dealing with late retrieval of caval filters were identified, covering approximately 7,000 devices with 600 removed filters. The maximal implant duration reported is 2,599 days, and the maximal implant duration of removed filters is also 2,599 days. The maximal duration reported with standard retrieval techniques, i.e., catheter, hook and/or lasso, is 475 days, whereas retrievals after this period required more sophisticated techniques, including lasers. The maximal implant duration for series with 100% retrieval is 84 days, which is equivalent to 12 weeks or almost 3 months. We conclude that retrievable caval filters often become permanent despite the initial decision of temporary use. However, such "forgotten" retrievable devices can still be removed with a great chance of success up to three months after implantation. Conventional percutaneous removal techniques may be sufficient up to sixteen months after implantation, whereas more sophisticated catheter techniques have been shown to be successful up to 83 months, or more than seven years, of implant duration.
Tilting, migrating, or misplaced devices should be removed early on and replaced, if indicated, with a device that is both efficient and retrievable.
Abstract:
The Computational Biophysics Group at the Universitat Pompeu Fabra (GRIB-UPF) hosts two unique computational resources dedicated to the execution of large-scale molecular dynamics (MD) simulations: (a) the ACEMD molecular-dynamics software, used on standard personal computers with graphics processing units (GPUs); and (b) the GPUGRID.net computing network, supported by users distributed worldwide who volunteer GPUs for biomedical research. We leveraged these resources and developed studies, protocols and open-source software to elucidate energetics and pathways of a number of biomolecular systems, with a special focus on flexible proteins with many degrees of freedom. First, we characterized ion permeation through the bactericidal model protein Gramicidin A, conducting one of the largest studies to date with the steered-MD biasing methodology. Next, we addressed an open problem in structural biology, the determination of drug-protein association kinetics; we reconstructed the binding free energy and the association and dissociation rates of a drug-like model system through a spatial decomposition and a Markov-chain analysis. The work was published in the Proceedings of the National Academy of Sciences and became one of the few landmark papers elucidating a ligand-binding pathway. Furthermore, we investigated the unstructured Kinase Inducible Domain (KID), a 28-residue peptide central to signalling and transcriptional response; the kinetics of this challenging system was modelled with a Markovian approach in collaboration with Frank Noé's group at the Freie Universität Berlin. The impact of the funding includes three peer-reviewed publications in high-impact journals; three more papers under review; four MD analysis components released as open-source software; MD protocols; didactic material; and code for the hosting group.
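The Markov-chain analysis mentioned above can be sketched in a few lines. This is a generic, hypothetical illustration (not the study's actual pipeline): a trajectory discretized into a handful of states yields a transition matrix, whose leading left eigenvector gives the stationary populations. The state labels, lag time, and toy trajectory are all invented.

```python
import numpy as np

def transition_matrix(dtraj, n_states, lag=1):
    """Row-normalized transition-count matrix estimated at a given lag time."""
    counts = np.zeros((n_states, n_states))
    for i, j in zip(dtraj[:-lag], dtraj[lag:]):
        counts[i, j] += 1
    counts += 1e-12                      # avoid empty rows
    return counts / counts.sum(axis=1, keepdims=True)

def stationary_distribution(T):
    """Left eigenvector of T with eigenvalue 1, normalized to sum to 1."""
    w, v = np.linalg.eig(T.T)
    pi = np.real(v[:, np.argmax(np.real(w))])
    return pi / pi.sum()

# Toy discretized trajectory over 3 states (unbound, intermediate, bound)
dtraj = np.array([0, 0, 1, 1, 2, 2, 2, 1, 0, 1, 2, 2, 1, 1, 0, 0, 1, 2])
T = transition_matrix(dtraj, n_states=3)
pi = stationary_distribution(T)
print(pi)    # equilibrium populations of the three states
```

From `T`, relaxation timescales and association/dissociation rates follow from its non-unit eigenvalues; production analyses use far finer spatial decompositions and statistical validation of the Markov assumption.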
Abstract:
A fundamental question in developmental biology is how tissues are patterned to give rise to differentiated body structures with distinct morphologies. The Drosophila wing disc offers an accessible model for understanding epithelial spatial patterning, and it has been studied extensively using genetic and molecular approaches. Bristle patterns on the thorax, which arise from the medial part of the wing disc, are a classical model of pattern formation, dependent on a pre-pattern of trans-activators and -repressors. Despite decades of molecular studies, we still know only a subset of the factors that determine the pre-pattern. We are applying a novel and interdisciplinary approach to predict regulatory interactions in this system. It is based on the description of expression patterns by simple logical relations (addition, subtraction, intersection and union) between simple shapes (graphical primitives). Similarities and relations between primitives have been shown to be predictive of regulatory relationships between the corresponding regulatory factors in other systems, such as the Drosophila egg. Furthermore, they provide the basis for dynamical models of the bristle-patterning network, which enable us to make even more detailed predictions on gene regulation and expression dynamics. We have obtained a data set of wing disc expression patterns which we are now processing to obtain average expression patterns for each gene. Through triangulation of the images we can transform the expression patterns into vectors which can easily be analysed by standard clustering methods. These analyses will allow us to identify primitives and regulatory interactions. We expect to identify new regulatory interactions and to understand the basic dynamics of the regulatory network responsible for thorax patterning.
These results will provide us with a better understanding of the rules governing gene regulatory networks in general, and provide the basis for future studies of the evolution of the thorax-patterning network in particular.
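The vectorization-and-clustering step described above can be sketched on invented data: each gene's average expression pattern is sampled at the vertices of a common triangulation (here collapsed to a 1-D coordinate for brevity), giving one vector per gene, and the vectors are compared with a correlation distance as a standard clustering method would. The gene names and pattern shapes are hypothetical.

```python
import numpy as np

n_vertices = 200
x = np.linspace(0.0, 1.0, n_vertices)            # vertex coordinate

patterns = {                                      # hypothetical genes
    "geneA": np.exp(-((x - 0.30) ** 2) / 0.01),  # anterior stripe
    "geneB": np.exp(-((x - 0.32) ** 2) / 0.01),  # overlapping stripe
    "geneC": np.exp(-((x - 0.80) ** 2) / 0.01),  # posterior stripe
}

names = list(patterns)
V = np.array([patterns[n] for n in names])       # genes x vertices

D = 1.0 - np.corrcoef(V)                         # correlation-distance matrix
print(names)
print(D.round(2))
```

Feeding `D` into any standard hierarchical clustering routine would group `geneA` with `geneB`, mirroring how shared primitives are meant to flag candidate regulatory relationships.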
Abstract:
The purpose of this project is to develop a website for a sports club that makes it possible to manage both the sporting and the administrative sides, and that makes the club known to Internet users. This report begins by explaining the motivations, objectives and planning of the project. It then situates the project within the state of the art and presents the requirements analysis, detailing what is expected of the product. Next, the behavioural model is explained, specifying the use cases, and the database design, the technologies used and a graphical presentation of the portal are presented. Finally, the errors found are discussed together with their solutions, the tests and the conclusions.
Abstract:
Part I of this series of articles focused on the construction of graphical probabilistic inference procedures, at various levels of detail, for assessing the evidential value of gunshot residue (GSR) particle evidence. The proposed models - in the form of Bayesian networks - address the issues of background presence of GSR particles, analytical performance (i.e., the efficiency of evidence searching and analysis procedures) and contamination. The use and practical implementation of Bayesian networks for case pre-assessment is also discussed. This paper, Part II, concentrates on Bayesian parameter estimation. This topic complements Part I in that it offers means for producing estimates usable for the numerical specification of the proposed probabilistic graphical models. Bayesian estimation procedures receive primary attention because they allow the scientist to combine his or her prior knowledge about the problem of interest with newly acquired experimental data. The present paper also considers further topics, such as the sensitivity of the likelihood ratio to uncertainty in parameters and the study of likelihood-ratio values obtained for members of particular populations (e.g., individuals with or without exposure to GSR).
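A minimal sketch of the kind of Bayesian parameter estimation the paper describes is a Beta prior on an unknown proportion combined with binomial survey counts. The prior parameters and the counts below are invented for illustration; they are not the paper's data.

```python
from fractions import Fraction

a, b = 1, 9          # Beta(a, b) prior on the proportion of the population
                     # carrying background GSR particles (favours low rates)
k, n = 3, 100        # hypothetical survey: k carriers among n people sampled

# Beta-Binomial conjugacy: posterior is Beta(a + k, b + n - k)
a_post, b_post = a + k, b + (n - k)
posterior_mean = Fraction(a_post, a_post + b_post)
print(posterior_mean, float(posterior_mean))     # 2/55 ≈ 0.036
```

A point estimate like this posterior mean (or the full posterior) can then be plugged into the corresponding node table of the Bayesian network, which is exactly the "numerical specification" role Part II assigns to these procedures.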
Abstract:
A traditional photonic-force microscope (PFM) produces huge data sets that require tedious numerical analysis. In this paper, we propose instead an analog signal processor to attain real-time capabilities while retaining the richness of traditional PFM data. Our system is devoted to intracellular measurements and is fully interactive through the use of a haptic joystick. Using our specialized analog hardware along with a dedicated algorithm, we can extract the full 3D stiffness matrix of the optical trap in real time, including the off-diagonal cross-terms. Our system is also capable of simultaneously recording data for subsequent offline analysis. This allows us to check that a good correlation exists between the classical analysis of stiffness and our real-time measurements. We monitor the PFM beads using an optical microscope. The force-feedback mechanism of the haptic joystick helps us in interactively guiding the bead inside living cells and collecting information from its (possibly anisotropic) environment. The instantaneous stiffness measurements are also displayed in real time on a graphical user interface. The whole system has been built and is operational; here we present early results that confirm the consistency of the real-time measurements with offline computations.
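An offline counterpart of such a stiffness measurement can be sketched on synthetic data: for a bead in a harmonic optical trap in thermal equilibrium, equipartition gives K = kB·T · C⁻¹, where C is the 3×3 covariance matrix of the bead position, so the off-diagonal cross-terms of the trap come out of the same matrix inverse. All numbers below are invented; this is not the paper's classical analysis pipeline, only the standard equipartition route.

```python
import numpy as np

kB_T = 4.11e-21                                  # J, thermal energy at ~298 K

K_true = np.array([[2.0, 0.3, 0.0],              # stiffness in N/m, with a
                   [0.3, 1.5, 0.0],              # deliberate xy cross-term
                   [0.0, 0.0, 0.8]]) * 1e-6
C_true = kB_T * np.linalg.inv(K_true)            # implied position covariance

rng = np.random.default_rng(1)
pos = rng.multivariate_normal(np.zeros(3), C_true, size=200_000)

C = np.cov(pos.T)                                # sample covariance, 3x3
K_est = kB_T * np.linalg.inv(C)                  # recovered stiffness matrix
print((K_est * 1e6).round(3))                    # in uN/m
```

With enough samples the estimate recovers the cross-term as well as the diagonal, which is the quantity the analog processor computes continuously instead of in batch.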
Abstract:
Background: Peach fruit undergoes a rapid softening process that involves a number of metabolic changes. Storing fruit at low temperatures has been widely used to extend its postharvest life. However, this leads to undesired changes, such as mealiness and browning, which affect the quality of the fruit. In this study, a 2-D DIGE approach was designed to screen for differentially accumulated proteins in peach fruit during normal softening as well as under conditions that led to fruit chilling injury. Results: The analysis allowed us to identify 43 spots - representing about 18% of the total number analyzed - that show statistically significant changes. Thirty-nine of the proteins could be identified by mass spectrometry. Some of the proteins that changed during postharvest had previously been related to peach fruit ripening and cold stress. However, we identified other proteins that had not been linked to these processes. A graphical display of the relationship between the differentially accumulated proteins was obtained using pairwise average-linkage cluster analysis and principal component analysis. Proteins such as endopolygalacturonase, catalase, NADP-dependent isocitrate dehydrogenase, pectin methylesterase and dehydrins were found to be very important for distinguishing between healthy and chill-injured fruit. A categorization of the differentially accumulated proteins was performed using Gene Ontology annotation. The results showed that the 'response to stress', 'cellular homeostasis', 'metabolism of carbohydrates' and 'amino acid metabolism' biological processes were affected the most during postharvest. Conclusions: Using a comparative proteomic approach with 2-D DIGE allowed us to identify proteins that showed stage-specific changes in their accumulation pattern. Several proteins that are related to response to stress, cellular homeostasis, cellular component organization and carbohydrate metabolism were detected as being differentially accumulated.
Finally, a significant proportion of the proteins identified had not previously been associated with softening, cold storage or chilling injury; thus, comparative proteomics has proven to be a valuable tool for understanding fruit softening and postharvest physiology.
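The kind of multivariate display described above (clustering / PCA of accumulation profiles) can be sketched on invented data: rows are protein spots, columns are postharvest conditions, and a PCA of the row-standardized profiles separates hypothetical cold-responsive spots from softening-related ones. None of these values come from the study.

```python
import numpy as np

#            harvest  softened  cold   chill-injured   (hypothetical)
X = np.array([
    [1.0, 1.1, 3.00, 5.00],   # dehydrin-like: accumulates under cold/injury
    [1.0, 0.9, 2.80, 4.60],   # second cold-responsive spot
    [1.0, 3.5, 1.20, 0.80],   # endoPG-like: up during normal softening
    [1.0, 3.2, 1.10, 0.70],   # second softening-related spot
    [1.0, 1.0, 0.95, 1.05],   # essentially unchanged
    [1.1, 1.0, 0.90, 1.00],   # essentially unchanged
])

# Standardize each spot's profile, then take principal components via SVD
Xs = (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
scores = U * s                       # spot coordinates on principal components
print(scores[:, :2].round(2))
```

Plotting the first two columns of `scores` gives the familiar picture: the two cold-responsive spots fall together, well away from the softening-related pair, which is how such displays separate healthy from chill-injured signatures.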
Abstract:
The mutual information of independent parallel Gaussian-noise channels is maximized, under an average power constraint, by independent Gaussian inputs whose power is allocated according to the waterfilling policy. In practice, discrete signalling constellations with limited peak-to-average ratios (m-PSK, m-QAM, etc.) are used in lieu of the ideal Gaussian signals. This paper gives the power allocation policy that maximizes the mutual information over parallel channels with arbitrary input distributions. Such a policy admits a graphical interpretation, referred to as mercury/waterfilling, which generalizes the waterfilling solution and retains some of its intuition. The relationship between the mutual information of Gaussian channels and the nonlinear minimum mean-square error proves key to solving the power allocation problem.
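The classical waterfilling baseline that mercury/waterfilling generalizes is easy to sketch: on parallel channels with gains g_i and power budget P, the optimum for Gaussian inputs is p_i = max(0, μ − 1/g_i), with the water level μ chosen by bisection so the whole budget is spent. The input-dependent "mercury" correction of the paper is not reproduced here; this is only the Gaussian special case.

```python
import numpy as np

def waterfill(g, P, tol=1e-12):
    """Waterfilling power allocation over channels with gains g, budget P."""
    g = np.asarray(g, dtype=float)
    lo, hi = 0.0, P + (1.0 / g).max()            # bracket for the level mu
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)
        spent = np.maximum(0.0, mu - 1.0 / g).sum()
        lo, hi = (mu, hi) if spent < P else (lo, mu)
    return np.maximum(0.0, 0.5 * (lo + hi) - 1.0 / g)

p = waterfill(g=[2.0, 1.0, 0.25], P=1.0)
print(p.round(3))     # the weakest channel gets no power
```

With gains (2, 1, 0.25) and unit budget, the level settles at μ = 1.25, allocating (0.75, 0.25, 0): the channel whose inverse gain exceeds the water level is switched off, which is the intuition the mercury/waterfilling picture preserves.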
Abstract:
We show how to build full-diversity product codes under both iterative encoding and decoding over non-ergodic channels, in the presence of block erasure and block fading. The concept of a rootcheck, or root subcode, is introduced by generalizing the same principle recently invented for low-density parity-check codes. We also describe some channel-related graphical properties of the new family of product codes, a family referred to as root product codes.
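The two-dimensional structure underlying any product code can be sketched in a few lines: an information block is encoded along rows by one component code and along columns by another. Here both components are single-parity-check codes, far weaker than the root subcodes proposed in the paper, but the row/column construction (including the checks-on-checks corner) is the same.

```python
import numpy as np

def spc_extend(M):
    """Append an even-parity bit to every row of M."""
    return np.hstack([M, M.sum(axis=1, keepdims=True) % 2])

k = np.array([[1, 0, 1],
              [0, 1, 1]])            # 2 x 3 information block

row_coded = spc_extend(k)            # 2 x 4: parity bit per row
C = spc_extend(row_coded.T).T        # 3 x 4: parity bit per column

print(C)
# Every row and every column of the codeword array has even parity,
# including the checks-on-checks corner bit.
```

Root product codes replace these trivial components with root subcodes so that, on a block-fading channel, every information bit is protected by at least one check landing in a reliably received block, which is what yields full diversity.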
Abstract:
This paper presents a webservice architecture for Statistical Machine Translation aimed at non-technical users. A workflow editor allows a user to combine different webservices using a graphical user interface. In the current state of this project, the webservices have been implemented for a range of sentential and sub-sentential aligners. A common interface and a common data format allow the user to build workflows that exchange different aligners.
Abstract:
This final-year project presents the design principles and prototype implementation of BIMS (Biomedical Information Management System), a flexible software system which provides an infrastructure to manage all information required by biomedical research projects. The BIMS project was initiated with the motivation to overcome several limitations in the medical data acquisition of some research projects in which Universitat Pompeu Fabra takes part. These limitations, stemming from the lack of control mechanisms to constrain the information submitted by clinicians, degrade data quality. BIMS can easily be adapted to manage information from a wide variety of clinical studies and is not limited to a given clinical specialty. The software can manage both textual information, like clinical data (measurements, demographics, diagnostics, etc.), and several kinds of medical images (magnetic resonance imaging, computed tomography, etc.). Moreover, BIMS provides a web-based graphical user interface and is designed to be deployed in a distributed and multiuser environment. It is built on top of open-source software products and frameworks. Specifically, BIMS has been used to represent all clinical data currently used within the CardioLab platform (an ongoing project managed by Universitat Pompeu Fabra), demonstrating that it is a solid software system which could fulfil the requirements of a real production environment.
Abstract:
Almost 30 years ago, Bayesian networks (BNs) were developed in the field of artificial intelligence as a framework to assist researchers and practitioners in applying probability theory to inference problems of more substantive size and, thus, to more realistic and practical problems. Since the late 1980s, Bayesian networks have also attracted researchers in forensic science, and this tendency has intensified considerably over the last decade. This review article provides an overview of the scientific literature describing research on Bayesian networks as a tool for studying, developing and implementing probabilistic procedures for evaluating the probative value of particular items of scientific evidence in forensic science. Primary attention is given to evaluative issues pertaining to forensic DNA profiling evidence, because this is one of the main categories of evidence whose assessment has been studied through Bayesian networks. The scope of topics is large and includes almost any aspect related to forensic DNA profiling. Typical examples are inference of source (or 'criminal identification'), relatedness testing, database searching and special trace evidence evaluation (such as mixed DNA stains or stains with low quantities of DNA). The perspective of the review presented here is not restricted exclusively to DNA evidence; it also includes relevant references to, and discussion of, both the concept of Bayesian networks and its general usage in the legal sciences as one among several graphical approaches to evidence evaluation.
Abstract:
This project focuses on studying and testing the benefits of NX Remote Desktop technology for administrative use in the Finnish Meteorological Institute's existing Linux Terminal Service Project environment. This was done because of the criticality of the system, caused by the growing number of users as the Linux Terminal Service Project system expands. Although many of the supporting tasks can be done via a Secure Shell connection, testing graphical programs or desktop behaviour in such a way is impossible. First, the basic technologies behind NX Remote Desktop were studied; then two candidate programs, FreeNX and NoMachine NX Server, were tested. Functionality and bandwidth demands were first tested in a closed local area network, and the results were studied. The better candidate was then installed on a virtual server simulating the actual Linux Terminal Service Project server at the Finnish Meteorological Institute, and a connection from the Internet was tested to see whether there were any problems with firewalls or security policies. The results are reported in this study. Studying and testing the two candidates showed that NoMachine NX Server provides better customer support and documentation. The security requirements of the Finnish Meteorological Institute also had to be considered, and since updates, along with new development tools, are announced for the next version of the program, this version was chosen. The studies also show that even though NoMachine promises a swift connection over an average 20 kbit/s of bandwidth, at least double that is needed. This project gives an overview of available remote desktop products along with their benefits. NX Remote Desktop technology is studied, and installation instructions are included. Testing is done in both the closed and the actual environment, and problems and suggestions are studied and analyzed.
The installation on the actual LTSP server has not yet been made, but a virtual server has been set up in the same place in the network topology. This ensures that, if the administrators are satisfied with the system, installation and set-up will go as described in this report.
Abstract:
Biplots are graphical displays of data matrices based on the decomposition of a matrix as the product of two matrices. Elements of these two matrices are used as coordinates for the rows and columns of the data matrix, with an interpretation of the joint presentation that relies on the properties of the scalar product. Because the decomposition is not unique, there are several alternative ways to scale the row and column points of the biplot, which can cause confusion amongst users, especially when software packages are not united in their approach to this issue. We propose a new scaling of the solution, called the standard biplot, which applies equally well to a wide variety of analyses such as correspondence analysis, principal component analysis, log-ratio analysis and the graphical results of a discriminant analysis/MANOVA, in fact to any method based on the singular-value decomposition. The standard biplot also handles data matrices with widely different levels of inherent variance. Two concepts taken from correspondence analysis are important to this idea: the weighting of row and column points, and the contributions made by the points to the solution. In the standard biplot one set of points, usually the rows of the data matrix, optimally represent the positions of the cases or sample units, which are weighted and usually standardized in some way unless the matrix contains values that are comparable in their raw form. The other set of points, usually the columns, is represented in accordance with their contributions to the low-dimensional solution. As for any biplot, the projections of the row points onto vectors defined by the column points approximate the centred and (optionally) standardized data. 
The method is illustrated with several examples to demonstrate how the standard biplot copes in different situations to give a joint map which needs only one common scale on the principal axes, thus avoiding the problem of enlarging or contracting the scale of one set of points to make the biplot readable. The proposal also solves the problem in correspondence analysis of low-frequency categories that are located on the periphery of the map, giving the false impression that they are important.
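The mechanics shared by all the biplots discussed above can be sketched on a tiny invented matrix: the SVD of the (column-centred) data supplies row coordinates F = U·S and column coordinates G = V, and the row-by-column scalar products F·Gᵀ reproduce the centred data exactly. How F and G are scaled against each other is precisely the choice the standard biplot pins down; this sketch shows only the basic decomposition.

```python
import numpy as np

X = np.array([[2.0, 0.5, 1.0],       # invented cases x variables matrix
              [1.0, 1.5, 0.0],
              [0.0, 1.0, 2.0],
              [3.0, 0.0, 1.0]])

Y = X - X.mean(axis=0)               # column-centred data matrix
U, s, Vt = np.linalg.svd(Y, full_matrices=False)

F = U * s                            # row (case) principal coordinates
G = Vt.T                             # column (variable) standard coordinates
print(np.allclose(F @ G.T, Y))       # prints True: scalar products = data
```

Plotting the first two columns of F as points and the first two rows of G as vectors gives the biplot; projecting a row point onto a column vector approximates that entry of the centred (and optionally standardized) data, as stated in the abstract.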
Abstract:
The forensic two-trace problem is a perplexing inference problem introduced by Evett (J Forensic Sci Soc 27:375-381, 1987). Different possible ways of wording the competing pair of propositions (i.e., one proposition advanced by the prosecution and one proposition advanced by the defence) led to different quantifications of the value of the evidence (Meester and Sjerps in Biometrics 59:727-732, 2003). Here, we re-examine this scenario with the aim of clarifying the interrelationships that exist between the different solutions, and in this way, produce a global vision of the problem. We propose to investigate the different expressions for evaluating the value of the evidence by using a graphical approach, i.e. Bayesian networks, to model the rationale behind each of the proposed solutions and the assumptions made on the unknown parameters in this problem.
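The kind of calculation such a network encodes can be illustrated with a deliberately simplified example: a likelihood ratio P(E | Hp) / P(E | Hd) computed by summing over an unobserved source variable. The structure and every number here are invented for illustration; this does not reproduce any of the competing two-trace solutions compared in the paper.

```python
# Toy single-trace network: H (proposition), S (source), E (match evidence)
p_suspect_is_source = {"Hp": 1.0, "Hd": 0.0}   # P(S = suspect | H), assumed
gamma = 0.05                                    # assumed random-match probability

def p_match_given_source(s):
    """P(E | S): certain match if the suspect is the source, else gamma."""
    return 1.0 if s == "suspect" else gamma

def p_evidence(h):
    """P(E | H), marginalizing over the unknown source S."""
    p_s = p_suspect_is_source[h]
    return p_s * p_match_given_source("suspect") + \
           (1 - p_s) * p_match_given_source("other")

LR = p_evidence("Hp") / p_evidence("Hd")
print(LR)   # 1 / gamma
```

The two-trace problem complicates exactly this marginalization step: with two recovered traces and differently worded propositions, the sum over sources changes, and the Bayesian-network representation makes each solution's assumptions about those sources explicit.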