69 results for Computer-aided analysis
Abstract:
Helical tomotherapy is a relatively new intensity-modulated radiation therapy (IMRT) treatment for which room shielding has to be reassessed, for the following reasons. The beam-on time needed to deliver a given target dose is increased, leading to a weekly workload typically one order of magnitude higher than that of conventional radiation therapy. In addition, the special configuration of tomotherapy units does not allow the use of standard shielding calculation methods. A conventional linear accelerator must be shielded against primary, leakage and scatter photon radiation. For tomotherapy, primary radiation is no longer the main shielding issue, since a beam stop is mounted on the gantry directly opposite the source. On the other hand, because of the longer irradiation time, accelerator head leakage becomes a major concern. An analytical model based on geometric considerations has been developed to determine leakage radiation levels throughout the room for continuous gantry rotation. Compared with leakage radiation, scatter radiation is a minor contribution. Since tomotherapy units operate at a nominal energy of 6 MV, neutron production is negligible. This work proposes a concise and conservative model for calculating the shielding requirements of the Hi-Art II TomoTherapy unit. Finally, the required concrete shielding thickness is given for several positions of interest.
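A minimal sketch, assuming a standard tenth-value-layer (TVL) attenuation approach rather than the paper's analytical model, of how a required concrete thickness can be estimated from a weekly leakage workload and a design dose limit; all numerical values (TVL, workload, distance, limit, occupancy) are illustrative assumptions:

```python
import math

# Hedged sketch of a TVL-based leakage-shielding estimate; every number below
# is an illustrative assumption, not a value taken from the study.
def concrete_thickness_cm(weekly_leakage_dose_gy_at_1m, distance_m,
                          design_limit_sv=1e-4, occupancy=1.0, tvl_cm=34.0):
    """Concrete thickness needed to attenuate head leakage to the weekly
    design limit at a point a given distance from the source."""
    # Unshielded weekly dose at the point of interest (inverse-square fall-off)
    unshielded = weekly_leakage_dose_gy_at_1m / distance_m ** 2
    # Required transmission factor and the corresponding number of tenth-value layers
    transmission = design_limit_sv / (occupancy * unshielded)
    n_tvl = max(0.0, -math.log10(transmission))
    return n_tvl * tvl_cm

# Example: hypothetical 20 Gy/week of leakage at 1 m, public-area point at 4 m
print(concrete_thickness_cm(weekly_leakage_dose_gy_at_1m=20.0, distance_m=4.0))
```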
Abstract:
An image analysis method is presented which allows for the reconstruction of the three-dimensional path of filamentous objects from two of their projections. Starting with stereo pairs, this method is used to trace the trajectory of DNA molecules embedded in vitreous ice and leads to a faithful representation of their three-dimensional shape in solution. This computer-aided reconstruction is superior to the subjective three-dimensional impression generated by observation of stereo pairs of micrographs because it enables one to look at the reconstructed molecules from any chosen direction and distance and allows quantitative analysis such as determination of distances, curvature, persistence length, and writhe of DNA molecules in solution.
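A minimal sketch of the geometric relation that underlies depth recovery from a stereo (tilt) pair; the symmetric ±tilt geometry and the angle value are assumptions made for illustration, not the paper's actual tracing procedure:

```python
import numpy as np

# Hedged sketch: recover (x, z) of traced filament points from their
# x-coordinates in the two projections of a +/- tilt pair about the y-axis.
# The tilt angle and the coordinate arrays are illustrative assumptions.
def xz_from_tilt_pair(x_plus, x_minus, tilt_deg=10.0):
    a = np.radians(tilt_deg)
    x_p = np.asarray(x_plus, float)
    x_m = np.asarray(x_minus, float)
    z = (x_p - x_m) / (2.0 * np.sin(a))   # parallax gives the depth coordinate
    x = (x_p + x_m) / (2.0 * np.cos(a))   # in-plane coordinate; y is unchanged
    return x, z
```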
Abstract:
Understanding molecular recognition is a major requirement for drug discovery and design. Physicochemical and shape complementarity between two binding partners is the driving force during complex formation. In this study, the impact of shape within this process is analyzed. Protein binding pockets and co-crystallized ligands are represented by normalized principal moments of inertia ratios (NPRs). The corresponding descriptor space is triangular, with its corners occupied by spherical, discoid, and elongated shapes. An analysis of a selected set of sc-PDB complexes suggests that pockets and bound ligands avoid spherical shapes, which are, however, prevalent in small unoccupied pockets. Furthermore, a direct shape comparison confirms previous studies showing that, on average, only one third of a pocket is filled by its bound ligand, supplemented by a 50% subpocket coverage. In this study, we found that shape complementarity is expressed by low pairwise shape distances in NPR space, short distances between the centers of mass, and small deviations in the angle between the first principal ellipsoid axes. Furthermore, it is assessed how different binding pocket parameters are related to the bioactivity and binding efficiency of the co-crystallized ligand. In addition, the performance of different shape and size parameters of pockets and ligands is evaluated in a virtual screening scenario performed on four representative targets.
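A minimal sketch of how normalized principal-moment-of-inertia ratios (NPRs) can be computed for a set of 3D coordinates, assuming unit point masses (an illustrative simplification; real descriptors may weight atoms by mass):

```python
import numpy as np

def npr(coords):
    """Return (I1/I3, I2/I3) for a 3D point set, with I1 <= I2 <= I3."""
    xyz = np.asarray(coords, dtype=float)
    xyz -= xyz.mean(axis=0)                       # shift to the center of mass
    x, y, z = xyz[:, 0], xyz[:, 1], xyz[:, 2]
    # Inertia tensor for unit point masses
    inertia = np.array([
        [np.sum(y**2 + z**2), -np.sum(x * y),      -np.sum(x * z)],
        [-np.sum(x * y),       np.sum(x**2 + z**2), -np.sum(y * z)],
        [-np.sum(x * z),      -np.sum(y * z),       np.sum(x**2 + y**2)],
    ])
    i1, i2, i3 = np.sort(np.linalg.eigvalsh(inertia))
    # Corners of the triangular NPR space: rod ~ (0, 1), disc ~ (0.5, 0.5), sphere ~ (1, 1)
    return i1 / i3, i2 / i3
```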
Abstract:
Purpose: Revolutionary endovascular treatments are on the verge of being available for the management of ascending aortic diseases. Morphometric measurements of the ascending aorta have already been performed with ECG-gated MDCT to support such therapeutic developments. However, the reliability of these measurements remains unknown. The objective of this work was to compare the intraobserver and interobserver variability of CAD (computer-aided diagnosis) versus manual measurements in the ascending aorta. Methods and materials: Twenty-six consecutive patients referred for ECG-gated CT thoracic angiography (64-row CT scanner) were evaluated. Measurements of the maximum and minimum ascending aorta diameters at mid-distance between the brachiocephalic artery and the aortic valve were obtained automatically with a commercially available CAD and manually by two observers separately. Both observers repeated the measurements during a different session at least one month after the first measurements. Intraclass correlation coefficients as well as the Bland-Altman method were used for comparison between measurements. A paired t-test was used to determine the significance of intraobserver and interobserver differences (alpha = 0.05). Results: There was a significant difference between CAD and manual measurements of the maximum diameter (p = 0.004) for the first observer, whereas the difference was significant for the minimum diameter between the second observer and the CAD (p < 0.001). Interobserver variability showed weak agreement when measurements were done manually. Intraobserver variability was lower with the CAD than with the manual measurements (limits of variability: from -0.7 to 0.9 mm for the former and from -1.2 to 1.3 mm for the latter). Conclusion: To improve the reproducibility of measurements whenever needed, pre- and post-therapeutic management of the ascending aorta may benefit from follow-up performed by a single observer with the help of CAD.
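A minimal sketch of a Bland-Altman agreement analysis of the kind used to compare CAD and manual diameter readings; the input values are hypothetical placeholders, not the study's data:

```python
import numpy as np

def bland_altman(measure_a, measure_b):
    """Return the mean bias and the 95% limits of agreement between two raters."""
    a, b = np.asarray(measure_a, float), np.asarray(measure_b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical CAD vs. manual maximum-diameter readings (mm)
cad    = [36.1, 34.8, 40.2, 33.5]
manual = [35.8, 35.0, 39.7, 33.9]
print(bland_altman(cad, manual))
```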
Abstract:
The reciprocal interaction between cancer cells and the tissue-specific stroma is critical for primary and metastatic tumor growth progression. Prostate cancer cells preferentially colonize bone (osteotropism), where they alter the physiological balance between osteoblast-mediated bone formation and osteoclast-mediated bone resorption, and elicit a prevalently osteoblastic response (osteoinduction). The molecular cues provided by osteoblasts for the survival and growth of bone metastatic prostate cancer cells are largely unknown. We exploited the sufficient divergence between human and mouse RNA sequences together with a redefinition of highly species-specific gene arrays by computer-aided and experimental exclusion of cross-hybridizing oligonucleotide probes. This strategy allowed the dissection of the stroma (mouse) from the cancer cell (human) transcriptome in bone metastasis xenograft models of human osteoinductive prostate cancer cells (VCaP and C4-2B). As a result, we generated the osteoblastic bone metastasis-associated stroma transcriptome (OB-BMST). Subtraction of genes shared by inflammation, wound healing and desmoplastic responses, and by the tissue type-independent stroma responses to a variety of non-osteotropic and osteotropic primary cancers, generated a curated gene signature ("Core" OB-BMST) putatively representing the bone marrow/bone-specific stroma response to prostate cancer-induced, osteoblastic bone metastasis. The expression pattern of three representative Core OB-BMST genes (PTN, EPHA3 and FSCN1) seems to confirm the bone specificity of this response. A robust induction of genes involved in osteogenesis and angiogenesis dominates both the OB-BMST and the Core OB-BMST. This translates into an amplification of hematopoietic and, remarkably, prostate epithelial stem cell niche components that may function as a self-reinforcing bone metastatic niche providing growth support specific for osteoinductive prostate cancer cells. The induction of this combinatorial stem cell niche is a novel mechanism that may also explain cancer cell osteotropism and local interference with hematopoiesis (myelophthisis). Accordingly, these stem cell niche components may represent innovative therapeutic targets and/or serum biomarkers in osteoblastic bone metastasis.
Abstract:
Metabolic problems lead to numerous failures during clinical trials, and much effort is now devoted to developing in silico models that predict metabolic stability and metabolites. Such models are well established for cytochromes P450 and some transferases, whereas less has been done to predict the activity of human hydrolases. The present study was undertaken to develop a computational approach able to predict the hydrolysis of novel esters by human carboxylesterase hCES2. The study first involved homology modeling of the hCES2 protein based on the model of hCES1, since the two proteins share a high degree of homology (approximately 73%). A set of 40 known substrates of hCES2 was taken from the literature; the ligands were docked in both their neutral and ionized forms using GriDock, a parallel tool based on the AutoDock 4.0 engine which can perform efficient and easy virtual screening analyses of large molecular databases exploiting multi-core architectures. Useful statistical models (e.g., r² = 0.91 for substrates in their unprotonated state) were obtained by correlating experimental pKm values with the distance between the carbon atom of the substrate's ester group and the hydroxy function of Ser228. Additional parameters in the equations accounted for hydrophobic and electrostatic interactions between substrates and contributing residues. The negatively charged residues in the hCES2 cavity explained the preference of the enzyme for neutral substrates and, more generally, suggested that ligands which interact too strongly by ionic bonds (e.g., ACE inhibitors) cannot be good CES2 substrates because they are trapped in the cavity in unproductive modes and behave as inhibitors. The effects of protonation on substrate recognition and the contrasting behavior of substrates and products were finally investigated by MD simulations of some CES2 complexes.
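A minimal sketch of the single-descriptor correlation step, fitting experimental pKm values against the docked ester-carbon-to-Ser228-hydroxyl distance by least squares; the data values and the restriction to one descriptor are illustrative assumptions (the paper's equations also include hydrophobic and electrostatic terms):

```python
import numpy as np

def fit_pkm_vs_distance(distances_angstrom, pkm):
    """Linear model pKm = a * d + b and its coefficient of determination."""
    d = np.asarray(distances_angstrom, float)
    y = np.asarray(pkm, float)
    slope, intercept = np.polyfit(d, y, 1)
    pred = slope * d + intercept
    r2 = 1.0 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
    return slope, intercept, r2

# Hypothetical distances (A) and pKm values for illustration only
print(fit_pkm_vs_distance([3.2, 3.8, 4.5, 5.1, 6.0], [4.8, 4.5, 4.1, 3.7, 3.2]))
```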
Abstract:
Trans-apical aortic valve replacement (AVR) is a new and rapidly growing therapy. However, there are only a few training opportunities. The objective of our work is to build an appropriate artificial model of the heart that can replace the use of animals for surgical training in trans-apical AVR procedures. To reduce the need for fluoroscopy, we pursued the goal of building a translucent model of the heart with nature-like dimensions. A simplified 3D model of a human heart with its aortic root was created in silico using the SolidWorks Computer-Aided Design (CAD) program. This heart model was printed with a rapid prototyping system developed by the Fab@Home project and dip-coated twice with dispersion silicone. The translucency of the heart model allows the deployment area of the valved stent to be perceived without heavy imaging support. The final model was then placed in a human manikin for surgical training on the trans-apical AVR procedure. Trans-apical AVR with all the necessary steps (puncture, wiring, catheterization, ballooning, etc.) can be performed repeatedly in this setting.
Abstract:
In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. A key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process vs. those that measure flux through the autophagy pathway (i.e., the complete process); thus, a block in macroautophagy that results in autophagosome accumulation needs to be differentiated from stimuli that result in increased autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular autophagy assays, we hope to encourage technical innovation in the field.
Abstract:
We report on the medical history of a Caucasian female smoker diagnosed with a stage IV NSCLC adenocarcinoma characterized by a rare epidermal growth factor receptor (EGFR) point mutation in exon 21, codon 843 (p.V843I/c.2527G>A/COSMIC ID 85894). This genetic alteration proved to be germline after its presence was demonstrated in chondroblasts from the bone biopsy. While this is the first description of a germline V843I mutation without a concomitant additional known EGFR-activating mutation, we modeled the EGFR ATP catalytic domain in complex with ATP, gefitinib and erlotinib using computer-aided approaches to estimate possible changes in affinity upon the V843I mutation.
Abstract:
In order to understand the relationship between executive and structural deficits in the frontal cortex of patients in normal aging or Alzheimer's disease, we studied frontal pathological changes in young and old controls compared with cases of sporadic (AD) or familial Alzheimer's disease (FAD). We performed a semi-automatic, computer-assisted analysis of the distribution of beta-amyloid (Abeta) deposits revealed by Abeta immunostaining, as well as of neurofibrillary tangles (NFT) revealed by Gallyas silver staining, in Brodmann areas 10 (frontal polar), 12 (ventro-infero-median) and 24 (anterior cingulate), using tissue samples from 5 FAD, 6 sporadic AD and 10 control brains. We also performed densitometric measurements of glial fibrillary acidic protein (GFAP), the principal component of astrocyte intermediate filaments, and of phosphorylated neurofilament H and M epitopes in areas 10 and 24. All regions studied appeared almost completely spared in normal old controls, with only the oldest exhibiting a low percentage of beta-amyloid deposits and hardly any NFT. In contrast, all AD and FAD cases were severely damaged, as shown by statistically significant increases in the percentage of beta-amyloid deposits as well as by a high number of NFT. FAD cases (all from the same family) had significantly more beta-amyloid and GFAP than sporadic AD cases in both areas 10 and 24, and significantly more NFT only in area 24. The correlation between the percentage of beta-amyloid and the number of NFT was significant only for area 24. Altogether, these data suggest that the frontal cortex can be spared by AD-type lesions in normal aging, but is severely damaged in sporadic and even more so in familial Alzheimer's disease. The frontal regions appear to be differentially vulnerable, with area 12 having the least amyloid burden, area 24 the fewest NFT, and area 10 both more amyloid and more NFT. This pattern of damage in frontal regions may represent a strong neuroanatomical substrate for the deterioration of attention and cognitive capacities, as well as for the emotional and behavioral disturbances seen in AD patients.
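A minimal sketch of an area-fraction measurement of the kind behind the reported percentages of beta-amyloid deposit, assuming a simple global threshold on a grayscale image of an immunostained section; the threshold value and input are illustrative assumptions, not the study's semi-automatic pipeline:

```python
import numpy as np

def deposit_area_percent(gray_image, threshold=0.35):
    """Percentage of the field occupied by pixels above the staining threshold."""
    img = np.asarray(gray_image, float)
    mask = img > threshold            # pixels counted as immunostained deposit
    return 100.0 * mask.sum() / mask.size

# Hypothetical 4x4 normalized-intensity field for illustration
field = np.array([[0.1, 0.2, 0.6, 0.7],
                  [0.1, 0.1, 0.5, 0.4],
                  [0.0, 0.2, 0.3, 0.2],
                  [0.1, 0.1, 0.2, 0.1]])
print(deposit_area_percent(field))    # -> 25.0 (% of field above threshold)
```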
Abstract:
The potential of type-2 fuzzy sets for managing high levels of uncertainty in the subjective knowledge of experts or in numerical information has, in recent years, been explored mainly in control and pattern classification systems. One of the main challenges in designing a type-2 fuzzy logic system is how to estimate the parameters of the type-2 fuzzy membership function (T2MF) and the footprint of uncertainty (FOU) from imperfect and noisy datasets. This paper presents an automatic approach for learning and tuning Gaussian interval type-2 membership functions (IT2MFs), with application to multi-dimensional pattern classification problems. T2MFs and their FOUs are tuned according to the uncertainties in the training dataset by a combination of genetic algorithm (GA) and cross-validation techniques. In our GA-based approach, the chromosome structure has fewer genes than in other GA methods, and chromosome initialization is more precise. The proposed approach addresses the application of an interval type-2 fuzzy logic system (IT2FLS) to the problem of nodule classification in a lung Computer-Aided Detection (CAD) system. The designed IT2FLS is compared with its type-1 fuzzy logic system (T1FLS) counterpart. The results demonstrate that the IT2FLS outperforms the T1FLS by more than 30% in terms of classification accuracy.
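A minimal sketch of a Gaussian interval type-2 membership function with an uncertain mean m in [m1, m2] and fixed sigma, whose upper and lower bounds delimit the FOU; the parameter values are illustrative and the tuning by GA/cross-validation described above is not reproduced here:

```python
import numpy as np

def it2_gaussian(x, m1, m2, sigma):
    """Lower and upper membership grades of a Gaussian IT2MF with uncertain mean."""
    x = np.asarray(x, float)

    def g(xv, m):
        return np.exp(-0.5 * ((xv - m) / sigma) ** 2)

    # Upper MF: equal to 1 on [m1, m2], Gaussian shoulders outside that interval
    upper = np.where(x < m1, g(x, m1), np.where(x > m2, g(x, m2), 1.0))
    # Lower MF: the smaller of the two shoulder Gaussians
    lower = np.where(x <= (m1 + m2) / 2.0, g(x, m2), g(x, m1))
    return lower, upper

# Illustrative FOU for a 1-D feature normalized to [0, 1]
xs = np.linspace(0.0, 1.0, 11)
lo, up = it2_gaussian(xs, m1=0.45, m2=0.55, sigma=0.1)
```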
Abstract:
The drug discovery process has recently been deeply transformed by the use of computational ligand-based or structure-based methods, which help identify and optimize lead compounds and ultimately deliver new drug candidates more quickly and at lower cost. Structure-based computational methods for drug discovery mainly involve ligand-protein docking and rapid binding free energy estimation, both of which require force field parameterization for many drug candidates. Here, we present a fast force field generation tool, called SwissParam, able to generate, for arbitrary small organic molecules, topologies and parameters based on the Merck molecular force field, but in a functional form that is compatible with the CHARMM force field. Output files can be used with CHARMM or GROMACS. The topologies and parameters generated by SwissParam are used by the docking programs EADock2 and EADock DSS to describe the small molecules to be docked, while the protein is described by the CHARMM force field, and allow them to reach success rates ranging from 56 to 78%. We have also developed a rapid binding free energy estimation approach, using SwissParam for ligands and CHARMM22/27 for proteins, which requires only a short minimization to reproduce the experimental binding free energies of 214 ligand-protein complexes involving 62 different proteins, with a standard error of 2.0 kcal/mol and a correlation coefficient of 0.74. Together, these results demonstrate the relevance of using SwissParam topologies and parameters to describe small organic molecules in computer-aided drug design applications, together with a CHARMM22/27 description of the target protein. SwissParam is available free of charge for academic users at www.swissparam.ch.
Abstract:
PURPOSE: To explore whether triaxial accelerometric measurements can be used to accurately assess the speed and incline of running in free-living conditions. METHODS: Body accelerations during running were recorded at the lower back and at the heel by a portable data logger in 20 human subjects, 10 men and 10 women. After parameterizing body accelerations, two neural networks were designed to recognize each running pattern and calculate speed and incline. Each subject ran 18 times on outdoor roads at various speeds and inclines; 12 runs were used to calibrate the neural networks, whereas the 6 other runs were used to validate the model. RESULTS: A small difference between the estimated and actual values was observed: the root-mean-square error (RMSE) was 0.12 m·s(-1) for speed and 0.014 radian (rad) (or 1.4% in absolute value) for incline. Multiple regression analysis allowed accurate prediction of speed (RMSE = 0.14 m·s(-1)) but not of incline (RMSE = 0.026 rad, or 2.6% slope). CONCLUSION: Triaxial accelerometric measurements allow an accurate estimation of running speed and of the incline of the terrain (the latter with more uncertainty). This will permit the validation of energetic results generated on the treadmill as applied to more physiological, unconstrained running conditions.
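A minimal sketch of the validation metric quoted above, the root-mean-square error between estimated and measured speed; the arrays are hypothetical values, not the study's data:

```python
import numpy as np

def rmse(predicted, actual):
    """Root-mean-square error between predictions and reference values."""
    p, a = np.asarray(predicted, float), np.asarray(actual, float)
    return np.sqrt(np.mean((p - a) ** 2))

# Hypothetical network estimates vs. reference speeds (m/s)
speed_est  = [3.02, 3.55, 4.10, 2.88]
speed_true = [2.95, 3.60, 4.20, 2.80]
print(rmse(speed_est, speed_true))   # same order of magnitude as the reported 0.12 m/s
```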
Abstract:
A method is proposed for estimating the absolute binding free energy of interaction between proteins and ligands. Conformational sampling of the protein-ligand complex is performed by molecular dynamics (MD) in vacuo, and the solvent effect is calculated a posteriori by solving the Poisson or Poisson-Boltzmann equation for selected frames of the trajectory. The binding free energy is written as a linear combination of the surface buried upon complexation, SASbur, the electrostatic interaction energy between the ligand and the protein, Eelec, and the difference between the solvation free energies of the complex and of the isolated ligand and protein, ΔGsolv. The method uses the buried surface upon complexation to account for the non-polar contribution to the binding free energy because it is less sensitive to the details of the structure than the van der Waals interaction energy. The parameters of the method were developed for a training set of 16 HIV-1 protease-inhibitor complexes of known 3D structure. A correlation coefficient of 0.91 was obtained with an unsigned mean error of 0.8 kcal/mol. When applied to a set of 25 HIV-1 protease-inhibitor complexes of unknown 3D structure, the method provides a satisfactory correlation between the calculated binding free energy and the experimental pIC50 without reparameterization.
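A minimal sketch of fitting the linear binding free energy model described above, a weighted combination of SASbur, Eelec and ΔGsolv plus a constant, to experimental affinities by least squares; the input arrays are hypothetical placeholders, not the training set of the paper:

```python
import numpy as np

def fit_binding_model(sas_buried, e_elec, dg_solv, dg_exp):
    """Least-squares fit of dG_bind ~ a*SASbur + b*Eelec + c*dGsolv + d."""
    dg_exp = np.asarray(dg_exp, float)
    X = np.column_stack([sas_buried, e_elec, dg_solv, np.ones(len(dg_exp))])
    coeffs, *_ = np.linalg.lstsq(X, dg_exp, rcond=None)
    pred = X @ coeffs
    r = np.corrcoef(pred, dg_exp)[0, 1]   # correlation of fitted vs. experimental
    return coeffs, r

# Hypothetical descriptors (A^2, kcal/mol, kcal/mol) and experimental dG (kcal/mol)
sas  = [850.0, 920.0, 780.0, 990.0, 870.0]
ele  = [-45.0, -52.0, -38.0, -60.0, -48.0]
solv = [30.0, 35.0, 26.0, 41.0, 33.0]
dg   = [-11.2, -12.0, -10.1, -12.8, -11.5]
print(fit_binding_model(sas, ele, solv, dg))
```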