50 results for Visualization Using Computer Algebra Tools
Abstract:
Protein-protein interactions encode the wiring diagram of cellular signaling pathways, and their deregulation underlies a variety of diseases, such as cancer. Inhibiting protein-protein interactions with peptide derivatives is a promising way to develop new biological and therapeutic tools. Here, we develop a general framework to computationally handle hundreds of non-natural amino acid sidechains and predict the effect of inserting them into peptides or proteins. We first generate all structural files (pdb and mol2), as well as parameters and topologies for standard molecular mechanics software (CHARMM and Gromacs). Accurate predictions of rotamer probabilities are provided using a novel combined knowledge- and physics-based strategy. Non-natural sidechains are useful for increasing peptide ligand binding affinity. Our results on non-natural mutants of a BCL9 peptide targeting beta-catenin show very good correlation between predicted and experimental binding free energies, indicating that such predictions can be used to design new inhibitors. Data generated in this work, as well as PyMOL and UCSF Chimera plug-ins for user-friendly visualization of non-natural sidechains, are all available at http://www.swisssidechain.ch. Our results enable researchers to rapidly and efficiently work with hundreds of non-natural sidechains.
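One way a combined knowledge- and physics-based rotamer score can be built is sketched below. This is an illustrative caricature only, not the actual SwissSidechain scheme: the function name `rotamer_probs`, the geometric-mixing weight `w`, and all numeric values are our own assumptions. It blends a database-frequency prior with a Boltzmann factor computed from per-rotamer energies.

```python
import numpy as np

def rotamer_probs(db_counts, energies_kcal, T=300.0, w=0.5):
    """Hypothetical combined knowledge/physics rotamer probabilities:
    geometric mix of a database frequency prior and a Boltzmann factor."""
    kT = 0.0019872 * T  # Boltzmann constant in kcal/(mol*K) times T
    p_know = db_counts / db_counts.sum()          # knowledge-based prior
    p_phys = np.exp(-energies_kcal / kT)          # physics-based weight
    p_phys /= p_phys.sum()
    p = p_know ** w * p_phys ** (1.0 - w)         # blend the two terms
    return p / p.sum()                            # renormalize

# Illustrative inputs: three rotamers, counts from a hypothetical database
# and relative energies in kcal/mol.
counts = np.array([60.0, 30.0, 10.0])
energies = np.array([0.0, 1.0, 2.0])
p = rotamer_probs(counts, energies)
```

The rotamer that is both most frequent in the database and lowest in energy ends up with the highest probability; varying `w` trades off the two sources of information.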
Abstract:
Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculations currently represent the gold standard TDM approach but require computation assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. The numbers of drugs handled by the software vary widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentration (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses a non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user friendly.
Programs vary in complexity and might not fit all healthcare settings. Each software tool must therefore be regarded with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast for routine activities, including for non-experienced users. Computer-assisted TDM is attracting growing interest and should further improve, especially in terms of information system interfacing, user friendliness, data storage capability and report generation.
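The a posteriori (Bayesian) dosage adjustment these tools perform can be sketched in miniature. Everything below is a generic illustration, not the algorithm of any reviewed program: a one-compartment IV bolus model with invented population parameters, and a grid-search MAP (maximum a posteriori) estimate of an individual's clearance and volume from a single measured concentration.

```python
import numpy as np

# Hypothetical population priors (log-normal) for a one-compartment model.
CL_POP, OMEGA_CL = 5.0, 0.30   # clearance (L/h) and its log-scale SD
V_POP,  OMEGA_V  = 50.0, 0.25  # volume of distribution (L) and log-scale SD
SIGMA = 0.20                    # residual error SD on the log scale

def concentration(dose, cl, v, t):
    """One-compartment IV bolus: C(t) = (dose / V) * exp(-(CL/V) * t)."""
    return dose / v * np.exp(-(cl / v) * t)

def map_estimate(dose, t_obs, c_obs):
    """Grid-search MAP estimate of individual CL and V from one level."""
    cl = np.logspace(np.log10(CL_POP) - 1, np.log10(CL_POP) + 1, 400)
    v = np.logspace(np.log10(V_POP) - 1, np.log10(V_POP) + 1, 400)
    CL, V = np.meshgrid(cl, v)
    pred = concentration(dose, CL, V, t_obs)
    # Negative log-posterior: data misfit plus the two prior penalties.
    nll = ((np.log(c_obs) - np.log(pred)) ** 2 / (2 * SIGMA ** 2)
           + np.log(CL / CL_POP) ** 2 / (2 * OMEGA_CL ** 2)
           + np.log(V / V_POP) ** 2 / (2 * OMEGA_V ** 2))
    i, j = np.unravel_index(np.argmin(nll), nll.shape)
    return CL[i, j], V[i, j]

# A measured level (1.5 mg/L at 12 h after a 500 mg dose) below the
# population prediction pulls the individual clearance estimate upward.
cl_i, v_i = map_estimate(dose=500.0, t_obs=12.0, c_obs=1.5)
```

Real TDM software uses full population PK models and proper optimizers; the grid search here only makes the Bayesian compromise between prior and measurement explicit.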
Abstract:
MRI visualization of devices is traditionally based on signal loss due to T2* effects originating from local susceptibility differences. To visualize nitinol devices with positive contrast, a recently introduced postprocessing method is adapted to map the induced susceptibility gradients. This method operates on regular gradient-echo MR images and maps the shift in k-space in a (small) neighborhood of every voxel by Fourier analysis followed by a center-of-mass calculation. The quantitative map of the local shifts generates the positive-contrast image of the devices, while areas without susceptibility gradients render a background with noise only. The positive signal response of this method depends only on the choice of the voxel neighborhood size. The properties of the method are explained, and the visualizations of a nitinol wire and two stents are shown for illustration.
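The postprocessing step described above (a per-voxel neighborhood FFT followed by a center-of-mass calculation) can be sketched on synthetic data. The window size, the synthetic phase ramp, and the function name `local_kspace_shift` are illustrative choices, not the published implementation:

```python
import numpy as np

def local_kspace_shift(img, w=8):
    """For each voxel, FFT a w x w neighborhood of the complex image and
    take the center of mass of the magnitude spectrum; its distance from
    the spectrum center is the local k-space shift (the positive-contrast
    value). Border voxels are left at zero."""
    ny, nx = img.shape
    shift = np.zeros((ny, nx))
    ky, kx = np.meshgrid(np.arange(w) - w // 2, np.arange(w) - w // 2,
                         indexing="ij")
    h = w // 2
    for y in range(h, ny - h):
        for x in range(h, nx - h):
            spec = np.abs(np.fft.fftshift(np.fft.fft2(img[y - h:y + h,
                                                          x - h:x + h])))
            m = spec.sum()
            cy = (ky * spec).sum() / m  # center of mass in ky
            cx = (kx * spec).sum() / m  # center of mass in kx
            shift[y, x] = np.hypot(cy, cx)
    return shift

# Synthetic "gradient-echo" image: uniform magnitude with a linear phase
# ramp in a small region, mimicking a local susceptibility gradient.
y, x = np.mgrid[0:64, 0:64]
img = np.ones((64, 64), dtype=complex)
ramp = (x >= 24) & (x < 40) & (y >= 24) & (y < 40)
img[ramp] *= np.exp(2j * np.pi * 0.25 * x[ramp])
shift_map = local_kspace_shift(img)
```

Inside the ramp region the local spectrum is displaced from the center (large shift values, hence positive contrast); in flat-phase regions the spectrum stays at DC and the map is near zero, matching the noise-only background described in the abstract.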
Abstract:
PURPOSE: To investigate the potential of free-breathing 3D steady-state free precession (SSFP) imaging with radial k-space sampling for coronary MR angiography (MRA), coronary projection MR angiography and coronary vessel wall imaging. MATERIALS AND METHODS: A navigator-gated free-breathing T2-prepared 3D SSFP sequence (TR = 6.1 ms, TE = 3.0 ms, flip angle = 120 degrees, field of view = 360 mm²) with radial k-space sampling (384 radials) was implemented for coronary MRA. For projection coronary MRA, this sequence was combined with a 2D selective aortic spin-tagging pulse. Coronary vessel wall imaging was performed using a high-resolution inversion-recovery black-blood 3D radial SSFP sequence (384 radials, TR = 5.3 ms, TE = 2.7 ms, flip angle = 55 degrees, reconstructed resolution 0.35 x 0.35 x 1.2 mm³) and a local re-inversion pulse. Six healthy volunteers (two for each sequence) were investigated. Motion artifact level was assessed by two radiologists. RESULTS: In coronary MRA, the coronary lumen was displayed with a high signal and high contrast to the surrounding tissue. Projection coronary MRA demonstrated selective visualization of the coronary lumen while surrounding tissue was almost completely suppressed. In coronary vessel wall imaging, the vessel wall was displayed with a high signal when compared to the blood pool and the surrounding tissue. No visible motion artifacts were seen. CONCLUSION: 3D radial SSFP imaging enables coronary MRA, coronary projection MRA and coronary vessel wall imaging with a low motion artifact level.
Abstract:
There has been confusion about the subunit stoichiometry of the degenerin family of ion channels. Recently, a crystal structure of acid-sensing ion channel (ASIC) 1a revealed that it assembles as a trimer. Here, we used atomic force microscopy (AFM) to image unprocessed ASIC1a bound to mica. We detected a mixture of subunit monomers, dimers and trimers. In some cases, triple-subunit clusters were clearly visible, confirming the trimeric structure of the channel, and indicating that the trimer sometimes disaggregated after adhesion to the mica surface. This AFM-based technique will now enable us to determine the subunit arrangement within heteromeric ASICs.
Abstract:
BACKGROUND AND OBJECTIVES: Microparticles (MPs) are small phospholipid vesicles of less than 1 microm, shed into the blood flow by various cell types. These MPs are involved in several biological processes and diseases. MPs have also been detected in blood products; however, their role in transfused patients is unknown. The purpose of this study was to characterize these MPs under blood bank conditions. MATERIALS AND METHODS: Qualitative and quantitative experiments using flow cytometry or proteomic techniques were performed on MPs derived from erythrocyte concentrates. In order to count MPs, they were either isolated by various centrifugation procedures or counted directly in erythrocyte concentrates. RESULTS: A 20-fold increase in MP count after 50 days of storage at 4 degrees C was observed (from 3370 +/- 1180 MPs/microl at day 5 to 64 850 +/- 37 800 MPs/microl at day 50). Proteomic analysis revealed changes in protein expression when comparing MPs to erythrocyte membranes. Finally, the expression of Rh blood group antigens was shown on MPs generated during erythrocyte storage. CONCLUSIONS: Our work provides evidence that storage of red blood cells is associated with the generation of MPs characterized by particular proteomic profiles. These results contribute to fundamental knowledge of transfused blood products.
Abstract:
In proton magnetic resonance imaging (MRI) metallic substances lead to magnetic field distortions that often result in signal voids in the adjacent anatomic structures. Thus, metallic objects and superparamagnetic iron oxide (SPIO)-labeled cells appear as hypointense artifacts that obscure the underlying anatomy. The ability to illuminate these structures with positive contrast would enhance noninvasive MR tracking of cellular therapeutics. Therefore, an MRI methodology that selectively highlights areas of metallic objects has been developed. Inversion-recovery with ON-resonant water suppression (IRON) employs inversion of the magnetization in conjunction with a spectrally-selective on-resonant saturation prepulse. If imaging is performed after these prepulses, positive signal is obtained from off-resonant protons in close proximity to the metallic objects. The first successful use of IRON to produce positive contrast in areas of metallic spheres and SPIO-labeled stem cells in vitro and in vivo is presented.
Abstract:
The aim of this research was to evaluate how fingerprint analysts would incorporate information from newly developed tools into their decision-making processes. Specifically, we assessed effects using the following: (1) a quality tool to aid in the assessment of the clarity of the friction ridge details, (2) a statistical tool to provide likelihood ratios representing the strength of the corresponding features between compared fingerprints, and (3) consensus information from a group of trained fingerprint experts. The measured variables for the effect on examiner performance were the accuracy and reproducibility of the conclusions against the ground truth (including the impact on error rates) and the analyst accuracy and variation for feature selection and comparison.
The results showed that participants using the consensus information from other fingerprint experts demonstrated more consistency and accuracy in minutiae selection. They also demonstrated higher accuracy, sensitivity, and specificity in the decisions reported. The quality tool also affected minutiae selection (which, in turn, had limited influence on the reported decisions); the statistical tool did not appear to influence the reported decisions.
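For context, a likelihood ratio expresses how much an observation should shift an examiner's belief. The identity below is the standard Bayesian odds update, not the proprietary model of the statistical tool evaluated here, and the function name is our own:

```python
def posterior_prob(prior_prob, lr):
    """Update P(same source) with a likelihood ratio LR via the odds form
    of Bayes' rule: posterior odds = prior odds * LR."""
    prior_odds = prior_prob / (1.0 - prior_prob)
    post_odds = prior_odds * lr
    return post_odds / (1.0 + post_odds)

# An LR of 1000 moves an even (0.5) prior to about 0.999, while an LR of 1
# leaves the prior unchanged: the evidence is neutral.
p_strong = posterior_prob(0.5, 1000.0)
p_neutral = posterior_prob(0.3, 1.0)
```

This separation of prior and evidence is precisely why such tools report an LR rather than a categorical decision: the examiner (or court) supplies the prior.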
Abstract:
The coverage and volume of geo-referenced datasets are extensive and incessantly growing. The systematic capture of geo-referenced information generates large volumes of spatio-temporal data to be analyzed. Clustering and visualization play a key role in exploratory data analysis and in the extraction of knowledge embedded in these data. However, the special characteristics of these data pose new challenges for visualization and clustering: complex structures, large numbers of samples, variables involved in a temporal context, high dimensionality and large variability in cluster shapes.
The central aim of my thesis is to propose new algorithms and methodologies for clustering and visualization, in order to assist knowledge extraction from spatio-temporal geo-referenced data, thus improving decision-making processes.
I present two original algorithms, one for clustering, the Fuzzy Growing Hierarchical Self-Organizing Networks (FGHSON), and one for exploratory visual data analysis, the Tree-structured Self-Organizing Maps Component Planes. In addition, I present methodologies that, combined with the FGHSON and the Tree-structured SOM Component Planes, allow the seamless and simultaneous integration of space and time in order to extract knowledge embedded in a temporal context.
The originality of the FGHSON lies in its capability to reflect the underlying structure of a dataset in a hierarchical fuzzy way. A hierarchical fuzzy representation of clusters is crucial when data include complex structures with large variability of cluster shapes, variances, densities and numbers of clusters. The most important characteristics of the FGHSON are: (1) it does not require an a priori setup of the number of clusters; (2) the algorithm executes several self-organizing processes in parallel, so when dealing with large datasets the processes can be distributed, reducing the computational cost; (3) only three parameters are necessary to set up the algorithm.
In the case of the Tree-structured SOM Component Planes, the novelty lies in the ability to create a structure that allows visual exploratory analysis of large high-dimensional datasets. This algorithm creates a hierarchical structure of Self-Organizing Map Component Planes, arranging projections of similar variables in the same branches of the tree. Hence, similarities in variables' behavior (e.g. local correlations, maximal and minimal values, and outliers) can be easily detected.
Both the FGHSON and the Tree-structured SOM Component Planes were applied to several agroecological problems, proving to be very efficient in the exploratory analysis and clustering of spatio-temporal datasets.
In this thesis I also tested three soft competitive learning algorithms: two well-known unsupervised soft competitive algorithms, namely the Self-Organizing Maps (SOMs) and the Growing Hierarchical Self-Organizing Maps (GHSOMs), and our original contribution, the FGHSON. Although the algorithms presented here have been used in several areas, to my knowledge there is no work applying and comparing the performance of those techniques on spatio-temporal geospatial data, as presented in this thesis.
I propose original methodologies to explore spatio-temporal geo-referenced datasets through time. Our approach uses time windows to capture temporal similarities and variations with the FGHSON clustering algorithm. The developed methodologies are used in two case studies. In the first, the objective was to find similar agroecozones through time; in the second, it was to find similar environmental patterns shifted in time.
Several results presented in this thesis have led to new contributions to agroecological knowledge, for instance in sugar cane and blackberry production.
Finally, in the framework of this thesis we developed several software tools: (1) a MATLAB toolbox that implements the FGHSON algorithm, and (2) a program called BIS (Bio-inspired Identification of Similar agroecozones), an interactive graphical user interface tool that integrates the FGHSON algorithm with Google Earth in order to show zones with similar agroecological characteristics.
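As background for the component-plane idea, here is a minimal classical SOM in NumPy. It implements neither the FGHSON nor the tree-structured arrangement from the thesis, just the base notion that each weight slice `weights[:, :, k]` is the component plane of variable k; the map size, learning rate and neighborhood schedule are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)

def train_som(data, rows=5, cols=5, epochs=20, lr0=0.5, sigma0=2.0):
    """Minimal sequential self-organizing map (illustrative stand-in for
    the SOM family of algorithms discussed in the thesis)."""
    n, d = data.shape
    w = rng.random((rows, cols, d))          # random initial codebook
    gy, gx = np.mgrid[0:rows, 0:cols]        # grid coordinates of units
    for e in range(epochs):
        lr = lr0 * (1 - e / epochs)          # decaying learning rate
        sigma = sigma0 * (1 - e / epochs) + 0.5  # shrinking neighborhood
        for x in data:
            dist = ((w - x) ** 2).sum(axis=2)
            by, bx = np.unravel_index(np.argmin(dist), dist.shape)
            # Gaussian neighborhood around the best-matching unit.
            h = np.exp(-((gy - by) ** 2 + (gx - bx) ** 2) / (2 * sigma ** 2))
            w += lr * h[:, :, None] * (x - w)
    return w

# Two well-separated 3-variable clusters; after training, each component
# plane (one weight slice per variable) spans the two cluster means.
data = np.vstack([rng.normal(0.2, 0.05, (50, 3)),
                  rng.normal(0.8, 0.05, (50, 3))])
weights = train_som(data)
planes = [weights[:, :, k] for k in range(3)]  # one component plane per variable
```

Inspecting the planes side by side is the exploratory step the Tree-structured SOM Component Planes automate: planes of correlated variables look alike, so grouping similar planes into tree branches makes local correlations and outliers visible at a glance.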
Abstract:
Visualization of the vascular systems of organs or of small animals is important for an assessment of basic physiological conditions, especially in studies that involve genetically manipulated mice. For a detailed morphological analysis of the vascular tree, it is necessary to demonstrate the system in its entirety. In this study, we present a new lipophilic contrast agent, Angiofil, for performing postmortem microangiography using microcomputed tomography. The new contrast agent was tested in 10 wild-type mice. Imaging of the vascular system revealed vessels down to the caliber of capillaries, and the digital three-dimensional data obtained from the scans allowed for virtual cutting, amplification, and scaling without destroying the sample. By use of computer software, parameters such as vessel length and caliber could be quantified and remapped by color coding onto the surface of the vascular system. The liquid Angiofil is easy to handle and highly radio-opaque. Because it is lipophilic, it is retained intravascularly; hence, it facilitates virtual vessel segmentation and yields an enduring signal, which is advantageous during repeated investigations or if samples need to be transported from the site of preparation to the place of analysis. These characteristics make Angiofil a promising novel contrast agent; when combined with microcomputed tomography, it has the potential to become a powerful method for rapid vascular phenotyping.
Abstract:
Introduction: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing dosage regimens based on the measurement of blood concentrations. Maintaining concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation represents the gold standard TDM approach but requires computing assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities.
Method: The literature and the Internet were searched to identify software. All programs were tested on a common personal computer. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them.
Results: 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software's characteristics. The numbers of drugs handled vary widely, and 8 programs offer users the ability to add their own drug models. 10 computer programs are able to compute Bayesian dosage adaptation based on a blood concentration (a posteriori adjustment), while 9 are also able to suggest a priori dosage regimens (prior to any blood concentration measurement) based on individual patient covariates such as age, gender and weight. Among those applying Bayesian analysis, one uses the non-parametric approach. The top 2 programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated (e.g. in terms of storage or report generation) or less user-friendly.
Conclusion: Whereas 2 integrated programs are at the top of the ranked list, such complex tools may not fit all institutions, and each software tool must be regarded with respect to the individual needs of hospitals or clinicians. Interest in computing tools to support therapeutic monitoring is still growing. Although developers have put effort into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, data storage capacity and report generation.
Abstract:
Objectives: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing dosage regimens based on the measurement of blood concentrations. Maintaining concentrations within a target range requires pharmacokinetic (PK) and clinical capabilities. Bayesian calculation represents the gold standard TDM approach but requires computing assistance. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities.
Methods: The literature and the Internet were searched to identify software. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them.
Results: 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software characteristics. The numbers of drugs handled vary from 2 to more than 180, and integration of different population types is available in some programs. Furthermore, 8 programs offer the ability to add new drug models based on population PK data. 10 computer tools incorporate Bayesian computation to predict dosage regimens (individual parameters are calculated based on population PK models). All of them are able to compute Bayesian a posteriori dosage adaptation based on a blood concentration, while 9 are also able to suggest a priori dosage regimens based only on individual patient covariates. Among those applying Bayesian analysis, MM-USC*PACK uses a non-parametric approach. The top 2 programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated or less user-friendly.
Conclusions: Whereas 2 software packages are ranked at the top of the list, such complex tools may not fit all institutions, and each program must be regarded with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast for routine activities, including for non-experienced users. Although interest in TDM tools is growing and efforts have been put into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, data storage capability and automated report generation.