48 results for Computational grids (Computer systems)
at Université de Lausanne, Switzerland
Abstract:
BACKGROUND: Molecular interaction information is a key resource in modern biomedical research. Publicly available data have previously been provided in a broad array of diverse formats, making access to these data very difficult. The publication and wide implementation of the Human Proteome Organisation Proteomics Standards Initiative Molecular Interactions (HUPO PSI-MI) format in 2004 was a major step towards the establishment of a single, unified format by which molecular interactions should be presented, but it focused purely on protein-protein interactions. RESULTS: The HUPO-PSI has further developed the PSI-MI XML schema to enable the description of interactions between a wider range of molecular types, for example nucleic acids, chemical entities, and molecular complexes. Extensive details about each supported molecular interaction can now be captured, including the biological role of each molecule within that interaction, detailed description of interacting domains, and the kinetic parameters of the interaction. The format is supported by data management and analysis tools and has been adopted by major interaction data providers. Additionally, a simpler, tab-delimited format, MITAB2.5, has been developed for the benefit of users who require only minimal information in an easy-to-access configuration. CONCLUSION: The PSI-MI XML2.5 and MITAB2.5 formats have been jointly developed by interaction data producers and providers from both the academic and commercial sectors, and are already widely implemented and well supported by an active development community. PSI-MI XML2.5 enables the description of highly detailed molecular interaction data and facilitates data exchange between databases and users without loss of information. MITAB2.5 is a simpler format appropriate for fast Perl parsing or loading into Microsoft Excel.
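As a rough illustration of how simple the tab-delimited format is to consume, the sketch below reads a MITAB2.5 file in Python (the abstract mentions Perl; Python is used here for consistency with the other examples in this listing). The 15-column layout follows the published MITAB2.5 specification; the file name and the choice of extracted columns are illustrative assumptions.

```python
import csv

# MITAB2.5 is a 15-column, tab-delimited format; columns 1-2 hold the
# unique identifiers of interactors A and B, and column 9 the publication
# (e.g. PubMed) identifiers.
MITAB25_COLUMNS = 15

def read_mitab25(path):
    """Yield (interactor_a, interactor_b, publications) triples.

    The file name and selected columns are illustrative; real files
    may start with a '#'-prefixed header line.
    """
    with open(path, newline="") as handle:
        for row in csv.reader(handle, delimiter="\t"):
            if not row or row[0].startswith("#"):
                continue  # skip the optional header/comment line
            if len(row) != MITAB25_COLUMNS:
                continue  # not a well-formed MITAB2.5 record
            yield row[0], row[1], row[8]

if __name__ == "__main__":
    for a, b, pubs in read_mitab25("interactions.mitab"):  # hypothetical file
        print(a, b, pubs)
```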
Abstract:
The dynamic properties of helix 12 in the ligand binding domain of nuclear receptors are a major determinant of AF-2 domain activity. We investigated the molecular and structural basis of helix 12 mobility, as well as the involvement of individual residues with regard to peroxisome proliferator-activated receptor alpha (PPARalpha) constitutive and ligand-dependent transcriptional activity. Functional assays of the activity of PPARalpha helix 12 mutants were combined with free energy molecular dynamics simulations. The agreement between the results from these approaches allows us to make robust claims concerning the mechanisms that govern helix 12 functions. Our data support a model in which PPARalpha helix 12 transiently adopts a relatively stable active conformation even in the absence of a ligand. This conformation provides the interface for the recruitment of a coactivator and results in constitutive activity. The receptor agonists stabilize this conformation and increase PPARalpha transcription activation potential. Finally, we disclose important functions of residues in PPARalpha AF-2, which determine the positioning of helix 12 in the active conformation in the absence of a ligand. Substitution of these residues suppresses PPARalpha constitutive activity, without changing PPARalpha ligand-dependent activation potential.
Abstract:
Objectives: We are interested in the numerical simulation of the anastomotic region between the outflow cannula of a left ventricular assist device (LVAD) and the aorta. Segmentation, geometry reconstruction and grid generation from patient-specific data remain an issue because of the variable quality of DICOM images, in particular CT scans (e.g. metallic noise of the device, non-aortic contrast phase). We propose a general framework to overcome this problem and create suitable grids for numerical simulations. Methods: Preliminary treatment of the images is performed by reducing the level window and enhancing the contrast of the greyscale image using contrast-limited adaptive histogram equalization. A gradient anisotropic diffusion filter is applied to reduce the noise. Then, watershed segmentation algorithms and mathematical morphology filters allow reconstruction of the patient geometry. This is done using the InsightToolKit library (www.itk.org). Finally, the Vascular Modeling ToolKit (www.vmtk.org) and gmsh (www.geuz.org/gmsh) are used to create the meshes for the fluid (blood) and the structure (arterial wall, outflow cannula) and to identify the boundary layers a priori. The method was tested on five patients with left ventricular assist devices who underwent a CT-scan exam. Results: The method produced good results in four patients: the anastomosis area is recovered and the generated grids are suitable for numerical simulations. In one patient the method failed to produce a good segmentation because of the small dimension of the aortic arch with respect to the image resolution. Conclusions: The described framework allows the use of data that could not otherwise be segmented by standard automatic segmentation tools. In particular, the computational grids that have been generated are suitable for simulations that take fluid-structure interaction into account. Finally, the presented method features good reproducibility and fast application.
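A minimal sketch of the preprocessing chain described above, using SimpleITK (the Python wrapping of the ITK library the authors cite). The file names and all parameter values (alpha, beta, time step, iteration count, watershed level) are illustrative assumptions, not the study's settings:

```python
import SimpleITK as sitk

# Load the CT volume and cast to float, as required by the diffusion filter.
image = sitk.ReadImage("ct_volume.nii.gz")  # hypothetical input file
image = sitk.Cast(image, sitk.sitkFloat32)

# Contrast-limited adaptive histogram equalization on the greyscale image.
clahe = sitk.AdaptiveHistogramEqualizationImageFilter()
clahe.SetAlpha(0.5)  # 0 = classical histogram equalization, 1 = unsharp mask
clahe.SetBeta(0.5)
image = clahe.Execute(image)

# Edge-preserving noise reduction with gradient anisotropic diffusion.
diffusion = sitk.GradientAnisotropicDiffusionImageFilter()
diffusion.SetTimeStep(0.0625)  # stability limit for 3D images
diffusion.SetConductanceParameter(3.0)
diffusion.SetNumberOfIterations(5)
image = diffusion.Execute(image)

# Watershed segmentation on the gradient magnitude of the filtered image.
feature = sitk.GradientMagnitude(image)
labels = sitk.MorphologicalWatershed(feature, level=1.0,
                                     markWatershedLine=False)
sitk.WriteImage(labels, "watershed_labels.nii.gz")
```

The subsequent geometry reconstruction and meshing steps depend on VMTK and gmsh and are not sketched here.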
Abstract:
PURPOSE: Afferent asymmetry of visual function is detectable in both normal and pathologic conditions. With a computerized test, we assessed the variability in measuring afferent asymmetry of the pupillary light reflex, that is, the relative afferent pupillary defect. METHODS: In ten normal subjects, pupillary responses to an alternating light stimulus were recorded with computerized infrared pupillography. The relative afferent pupillary defect for each test was determined by using a new computer analysis. The 95% confidence interval of each determination of relative afferent pupillary defect was used to represent the short-term fluctuation in its measurement. To optimize the test for clinical use, we studied the influence of stimulus intensity, duration, and number on the variability of the relative afferent pupillary defect. RESULTS: When the relative afferent pupillary defect was based on only a few light alternations (stimulus pairs), there was excessive variability in its measurement (95% confidence interval > 0.5 log units). With approximately 200 stimulus pairs, the 95% confidence interval was reduced to less than 0.1 log unit (relative afferent pupillary defect +/- 0.05 log unit). Also, there was less variability when the dark interval between alternating light stimulation was less than one second. CONCLUSIONS: Computerized infrared pupillography can standardize the alternating light test and minimize the error in quantifying a relative afferent pupillary defect. A reproducible relative afferent pupillary defect measurement is desirable for defining afferent injury and following the course of disease.
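The reported shrinkage of the confidence interval with the number of stimulus pairs is the familiar 1/sqrt(n) behaviour of a mean's standard error. A minimal sketch, assuming each stimulus pair yields one independent RAPD estimate in log units (the numbers below are simulated for illustration, not data from the study):

```python
import numpy as np

rng = np.random.default_rng(0)

def rapd_ci_half_width(samples):
    """95% CI half-width for the mean RAPD (log units)."""
    sem = samples.std(ddof=1) / np.sqrt(len(samples))
    return 1.96 * sem

# Simulated per-pair RAPD estimates: true defect 0.3 log units,
# per-pair noise of 0.35 log units (illustrative values only).
for n_pairs in (10, 50, 200):
    samples = rng.normal(0.3, 0.35, size=n_pairs)
    print(f"{n_pairs:4d} pairs -> mean +/- {rapd_ci_half_width(samples):.3f} log units")
```

With these assumed noise levels, 200 pairs give a half-width near 0.05 log units, consistent with the figure quoted in the abstract.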
Abstract:
Gene duplication and neofunctionalization are known to be important processes in the evolution of phenotypic complexity. They account for important evolutionary novelties that confer ecological adaptation, such as the major histocompatibility complex (MHC), a multigene family crucial to the vertebrate immune system. In birds, two MHC class II β (MHCIIβ) exon 3 lineages have been recently characterized, and two hypotheses for the evolutionary history of MHCIIβ lineages were proposed. These lineages could have arisen either by 1) an ancient duplication and subsequent divergence of one paralog or by 2) recent parallel duplications followed by functional convergence. Here, we compiled a data set consisting of 63 MHCIIβ exon 3 sequences from six avian orders to distinguish between these hypotheses and to understand the role of selection in the divergent evolution of the two avian MHCIIβ lineages. Based on phylogenetic reconstructions and simulations, we show that a unique duplication event preceding the major avian radiations gave rise to two ancestral MHCIIβ lineages that were each likely lost once later during avian evolution. Maximum likelihood estimation shows that following the ancestral duplication, positive selection drove a radical shift from basic to acidic amino acid composition of a protein domain facing the α-chain in the MHCII α β-heterodimer. Structural analyses of the MHCII α β-heterodimer highlight that three of these residues are potentially involved in direct interactions with the α-chain, suggesting that the shift following duplication may have been accompanied by coevolution of the interacting α- and β-chains. These results provide new insights into the long-term evolutionary relationships among avian MHC genes and open interesting perspectives for comparative and population genomic studies of avian MHC evolution.
Abstract:
BACKGROUND: Maintaining therapeutic concentrations of drugs with a narrow therapeutic window is a complex task. Several computer systems have been designed to help doctors determine optimum drug dosage. Significant improvements in health care could be achieved if computer advice improved health outcomes and could be implemented in routine practice in a cost-effective fashion. This is an updated version of an earlier Cochrane systematic review, by Walton et al, published in 2001. OBJECTIVES: To assess whether computerised advice on drug dosage has beneficial effects on the process or outcome of health care. SEARCH STRATEGY: We searched the Cochrane Effective Practice and Organisation of Care Group specialised register (June 1996 to December 2006), MEDLINE (1966 to December 2006), EMBASE (1980 to December 2006), hand searched the journal Therapeutic Drug Monitoring (1979 to March 2007) and the Journal of the American Medical Informatics Association (1996 to March 2007), as well as reference lists from primary articles. SELECTION CRITERIA: Randomised controlled trials, controlled trials, controlled before-and-after studies and interrupted time series analyses of computerised advice on drug dosage were included. The participants were health professionals responsible for patient care. The outcomes were: any objectively measured change in the behaviour of the health care provider (such as changes in the dose of drug used); any change in the health of patients resulting from computerised advice (such as adverse reactions to drugs). DATA COLLECTION AND ANALYSIS: Two reviewers independently extracted data and assessed study quality. MAIN RESULTS: Twenty-six comparisons (23 articles) were included (as compared to fifteen comparisons in the original review), covering a wide range of drugs in inpatient and outpatient settings. Interventions usually targeted doctors, although some studies attempted to influence prescriptions by pharmacists and nurses. Although all studies used reliable outcome measures, their quality was generally low. Computerised advice for drug dosage gave significant benefits by:
1. increasing the initial dose (standardised mean difference 1.12, 95% CI 0.33 to 1.92);
2. increasing serum concentrations (standardised mean difference 1.12, 95% CI 0.43 to 1.82);
3. reducing the time to therapeutic stabilisation (standardised mean difference -0.55, 95% CI -1.03 to -0.08);
4. reducing the risk of toxic drug levels (rate ratio 0.45, 95% CI 0.30 to 0.70);
5. reducing the length of hospital stay (standardised mean difference -0.35, 95% CI -0.52 to -0.17).
AUTHORS' CONCLUSIONS: This review suggests that computerised advice for drug dosage has some benefits: it increased the initial dose of drug, increased serum drug concentrations and led to more rapid therapeutic control. It also reduced the risk of toxic drug levels and the length of time spent in the hospital. However, it had no effect on adverse reactions. In addition, there was no evidence to suggest that particular decision support technical features (such as integration into a computer physician order entry system) or aspects of the organisation of care (such as the setting) could optimise the effect of computerised advice.
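The effect sizes above are standardised mean differences: the difference between group means divided by the pooled standard deviation (Cohen's d). A minimal sketch with purely illustrative numbers, not data from the review:

```python
import math

def standardised_mean_difference(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Cohen's d: difference in means over the pooled standard deviation."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                          / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

# Illustrative example: an intervention arm reaches a mean initial dose of
# 240 mg (SD 40, n = 30) versus 200 mg (SD 35, n = 30) under usual care.
d = standardised_mean_difference(240, 40, 30, 200, 35, 30)
print(f"SMD = {d:.2f}")  # about 1.06, on the order of the review's estimates
```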
Abstract:
The ATP-binding cassette (ABC) family of proteins comprises a group of membrane transporters involved in the transport of a wide variety of compounds, such as xenobiotics, vitamins, lipids, amino acids, and carbohydrates. Determining their regional expression patterns along the intestinal tract will further characterize their transport functions in the gut. The mRNA expression levels of murine ABC transporters in the duodenum, jejunum, ileum, and colon were examined using the Affymetrix MuU74v2 GeneChip set. Eight ABC transporters (Abcb2, Abcb3, Abcb9, Abcc3, Abcc6, Abcd1, Abcg5, and Abcg8) displayed significant differential gene expression along the intestinal tract, as determined by two statistical models (a global error assessment model and a classic ANOVA, both with P < 0.01). Concordance with semiquantitative real-time PCR was high. Analyzing the promoters of the differentially expressed ABC transporters did not identify common transcriptional motifs between family members or with other genes; however, the expression profile for Abcb9 was highly correlated with fibulin-1, and both genes share a common complex promoter model involving the NFkappaB, zinc binding protein factor (ZBPF), GC-box factors SP1/GC (SP1F), and early growth response factor (EGRF) transcription binding motifs. The cellular location of another of the differentially expressed ABC transporters, Abcc3, was examined by immunohistochemistry. Staining revealed that the protein is consistently expressed in the basolateral compartment of enterocytes along the anterior-posterior axis of the intestine. Furthermore, the intensity of the staining pattern is concordant with the expression profile. This agrees with previous findings in which the mRNA, protein, and transport function of Abcc3 were increased in the rat distal intestine. These data reveal regional differences in gene expression profiles along the intestinal tract and demonstrate that a complete understanding of intestinal ABC transporter function can only be achieved by examining the physiologically distinct regions of the gut.
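One of the two statistical tests named above is a classic one-way ANOVA across the four intestinal segments (the global error assessment model is specific to the study and is not reproduced here). A minimal sketch at P < 0.01, with simulated log-expression values standing in for the GeneChip data and an arbitrary sample size of five arrays per segment:

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(1)

# Simulated log-expression for one transporter in four gut segments
# (illustrative values; the study used Affymetrix MuU74v2 measurements).
duodenum = rng.normal(8.0, 0.3, size=5)
jejunum = rng.normal(8.2, 0.3, size=5)
ileum = rng.normal(9.1, 0.3, size=5)
colon = rng.normal(7.6, 0.3, size=5)

f_stat, p_value = f_oneway(duodenum, jejunum, ileum, colon)
print(f"F = {f_stat:.2f}, P = {p_value:.4g}")
print("differentially expressed" if p_value < 0.01 else "no call")
```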
Abstract:
The impact of navigator spatial resolution and navigator evaluation time on image quality in free-breathing, navigator-gated 3D coronary magnetic resonance angiography (MRA), including real-time motion correction, was investigated in a moving phantom. The objective image quality parameters signal-to-noise ratio (SNR) and vessel sharpness were compared. It was found that a short navigator evaluation time is of crucial importance for improved image quality, whereas navigator spatial resolution showed minimal influence on image quality.
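For readers unfamiliar with the SNR metric, one common convention in MR image quality assessment is the mean signal in a vessel region of interest divided by the standard deviation of background noise. The abstract does not give the study's exact definitions, so the sketch below is only a generic illustration with a simulated image:

```python
import numpy as np

def snr(signal_roi, noise_roi):
    """SNR as mean signal over the standard deviation of background noise.

    A common convention only; the study's exact SNR and vessel-sharpness
    definitions are not given in the abstract.
    """
    return signal_roi.mean() / noise_roi.std(ddof=1)

# Illustrative regions of interest drawn from a simulated image array.
image = np.random.default_rng(2).normal(100, 10, size=(128, 128))
vessel = image[60:68, 60:68]    # hypothetical vessel ROI
background = image[0:16, 0:16]  # hypothetical background ROI
print(f"SNR = {snr(vessel, background):.1f}")
```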
Abstract:
OBJECTIVE: Before a patient can be connected to a mechanical ventilator, the controls of the apparatus need to be set up appropriately. Today, this is done by the intensive care professional. With the advent of closed-loop controlled mechanical ventilation, methods will be needed to select appropriate start-up settings automatically. The objective of our study was to test such a computerized method, which could eventually be used as a start-up procedure (first 5-10 minutes of ventilation) for closed-loop controlled ventilation. DESIGN: Prospective study. SETTINGS: ICUs in two adult hospitals and one children's hospital. PATIENTS: 25 critically ill adult patients (age > or = 15 y) and 17 critically ill children, selected at random, were studied. INTERVENTIONS: To simulate 'initial connection', the patients were disconnected from their ventilator and transiently connected to a modified Hamilton AMADEUS ventilator for at most one minute. During that time they were ventilated with a fixed and standardized breath pattern (Test Breaths) based on pressure-controlled synchronized intermittent mandatory ventilation (PCSIMV). MEASUREMENTS AND MAIN RESULTS: Measurements of airway flow, airway pressure and instantaneous CO2 concentration using a mainstream CO2 analyzer were made at the mouth during application of the Test Breaths. Test Breaths were analyzed in terms of tidal volume, expiratory time constant and series dead space. Using these data, an initial ventilation pattern consisting of respiratory frequency and tidal volume was calculated. This ventilation pattern was compared to the one measured prior to the onset of the study using a two-tailed paired t-test. Additionally, it was compared to a conventional method for setting up ventilators. The computer-proposed ventilation pattern did not differ significantly from the actual pattern (p > 0.05), while the conventional method did. However, the scatter was large, and in six cases deviations in minute ventilation of more than 50% were observed. CONCLUSIONS: The analysis of standardized Test Breaths allows automatic determination of an initial ventilation pattern for intubated ICU patients. While this pattern does not seem to be superior to the one chosen by the conventional method, it is derived fully automatically and without the need for manual patient data entry such as weight or height. This makes the method potentially useful as a start-up procedure for closed-loop controlled ventilation.
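The abstract does not give the study's formulae, so the sketch below is only a plausible reconstruction of how such a pattern could be derived: it assumes the initial settings target a fixed alveolar minute ventilation via the standard relation (alveolar ventilation = frequency x (tidal volume - dead space)) and checks that the expiratory time allows roughly three expiratory time constants for near-complete exhalation. All numeric targets and the 1:1 I:E assumption are illustrative, not the study's algorithm:

```python
def initial_pattern(v_t, v_d, tau, target_va_lpm):
    """Propose a respiratory frequency from measured Test-Breath parameters.

    v_t: measured tidal volume (L); v_d: series dead space (L);
    tau: expiratory time constant (s); target_va_lpm: desired alveolar
    minute ventilation (L/min). Illustrative logic only.
    """
    freq = target_va_lpm / (v_t - v_d)  # breaths per minute
    t_exp = 60.0 / freq / 2.0           # expiratory time, assuming 1:1 I:E
    if t_exp < 3.0 * tau:               # leave ~3 time constants to exhale
        freq = 60.0 / (2.0 * 3.0 * tau) # slow down to avoid air trapping
    return freq

# Example: VT = 0.50 L, VD = 0.15 L, tau = 0.6 s, target VA = 5 L/min.
f = initial_pattern(0.50, 0.15, 0.6, 5.0)
print(f"proposed frequency: {f:.1f} /min")  # about 14 breaths per minute
```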
Abstract:
BACKGROUND AND OBJECTIVES: Experimental assessment of photodynamic therapy (PDT) for malignant pleural mesothelioma using a polyethylene glycol conjugate of meta-tetrahydroxyphenylchlorin (PEG-mTHPC). STUDY DESIGN/MATERIALS AND METHODS: (a) PDT was tested on H-meso-1 xenografts (652 nm laser light; fluence 10 J/cm(2); 0.93, 9.3, or 27.8 mg/kg of PEG-mTHPC; drug-light intervals 3-8 days). (b) Intraoperative PDT with similar treatment conditions was performed in the chest cavity of minipigs (n = 18) following extrapleural pneumonectomy (EPP), using an optical integrating balloon device combined with in situ light dosimetry. RESULTS: (a) PDT using PEG-mTHPC resulted in a larger extent of tumor necrosis than in untreated tumors (P < or = 0.01) without causing damage to normal tissue. (b) Intraoperative PDT following EPP was well tolerated in 17 of 18 animals. Mean fluences and fluence rates measured at four sites of the chest cavity ranged from 10.2 +/- 0.2 to 13.2 +/- 2.3 J/cm(2) and 5.5 +/- 1.2 to 7.9 +/- 1.7 mW/cm(2) (mean +/- SD). Histology 3 months after light delivery revealed no PDT-related tissue injury in all but one animal. CONCLUSIONS: PEG-mTHPC-mediated PDT showed selective destruction of mesothelioma xenografts without causing damage to intrathoracic organs in pigs under similar treatment conditions. The light delivery system afforded regular light distribution to different parts of the chest cavity.
Abstract:
The motivation for this research originated from the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and business applications due to their significantly lower cost than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operations, real-time control and resilience to harsh environmental conditions. This has led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and telecommunications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC started to evolve in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike in the CISC world, RISC processor architecture is a separate industry from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which give customers more choice through hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An underlying additional element of this transition is the increasing role of open source technologies, both in software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-per-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries.
Industrial automation incumbents continue to supply systems based on vertically integrated stacks of proprietary software and proprietary, mainly microprocessor-based hardware. They enjoy admirable profitability levels on a very narrow customer base, thanks to strong technology-enabled customer lock-in and the high risk their customers carry, since production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation and the competition between the incumbents: first, through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development; second, through research on process re-engineering in the case of complex-system global software support. Third, we investigate the views of the industry's actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and we conclude with our assessment of the possible routes industrial automation could take, considering the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining their own proprietary solutions. The rise of de facto standards like the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created new markets of personal computers, smartphones and tablets, and it will eventually also impact industrial automation through game-changing commoditization and related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.
Abstract:
The M-Coffee server is a web server that makes it possible to compute multiple sequence alignments (MSAs) by running several MSA methods and combining their output into one single model. This allows users to simultaneously run all their methods of choice without having to arbitrarily choose one of them. The MSA is delivered along with a local estimation of its consistency with the individual MSAs it was derived from. The computation of the consensus multiple alignment is carried out using a special mode of the T-Coffee package [Notredame, Higgins and Heringa (T-Coffee: a novel method for fast and accurate multiple sequence alignment. J. Mol. Biol. 2000; 302: 205-217); Wallace, O'Sullivan, Higgins and Notredame (M-Coffee: combining multiple sequence alignment methods with T-Coffee. Nucleic Acids Res. 2006; 34: 1692-1699)]. Given a set of sequences (DNA or proteins) in FASTA format, M-Coffee delivers a multiple alignment in the most common formats. M-Coffee is a freeware open source package distributed under a GPL license, and it is available either as a standalone package or as a web service from www.tcoffee.org.
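For users who prefer the standalone package to the web server, the consensus mode is exposed through the T-Coffee command line. A minimal Python wrapper sketch; the -mode mcoffee and -outfile flags follow the T-Coffee documentation, while the input and output file names are illustrative:

```python
import subprocess
from pathlib import Path

def run_mcoffee(fasta_path, out_path="mcoffee.aln"):
    """Combine several MSA methods into one consensus alignment.

    Requires a local T-Coffee installation; '-mode mcoffee' invokes the
    consensus meta-method described above.
    """
    subprocess.run(
        ["t_coffee", str(fasta_path), "-mode", "mcoffee",
         "-outfile", str(out_path)],
        check=True,
    )
    return Path(out_path)

if __name__ == "__main__":
    aln = run_mcoffee("sequences.fasta")  # illustrative input file
    print(f"consensus alignment written to {aln}")
```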
Abstract:
In the cerebral cortex, the activity levels of neuronal populations are continuously fluctuating. When neuronal activity, as measured using functional MRI (fMRI), is temporally coherent across two populations, those populations are said to be functionally connected. Functional connectivity has previously been shown to correlate with structural (anatomical) connectivity patterns at an aggregate level. In the present study we investigate, with the aid of computational modeling, whether systems-level properties of functional networks, including their spatial statistics and their persistence across time, can be accounted for by properties of the underlying anatomical network. We measured resting-state functional connectivity (using fMRI) and structural connectivity (using diffusion spectrum imaging tractography) in the same individuals at high resolution. Structural connectivity then provided the couplings for a model of macroscopic cortical dynamics. In both model and data, we observed (i) that strong functional connections commonly exist between regions with no direct structural connection, rendering the inference of structural connectivity from functional connectivity impractical; (ii) that indirect connections and interregional distance accounted for some of the variance in functional connectivity that was unexplained by direct structural connectivity; and (iii) that resting-state functional connectivity exhibits variability within and across both scanning sessions and model runs. These empirical and modeling results demonstrate that although resting-state functional connectivity is variable and is frequently present between regions without direct structural linkage, its strength, persistence, and spatial statistics are nevertheless constrained by the large-scale anatomical structure of the human cerebral cortex.
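Functional connectivity in this sense is commonly operationalized as the Pearson correlation between regional BOLD time series. A minimal sketch with simulated time series standing in for the fMRI data; the region count, series length, and threshold are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated resting-state BOLD signals: 66 regions x 300 time points
# (dimensions are illustrative placeholders for a cortical parcellation).
n_regions, n_timepoints = 66, 300
bold = rng.standard_normal((n_regions, n_timepoints))

# Functional connectivity: region-by-region Pearson correlation matrix.
fc = np.corrcoef(bold)

# "Functionally connected" pairs under a simple threshold (illustrative).
threshold = 0.3
pairs = np.argwhere(np.triu(fc > threshold, k=1))
print(f"{len(pairs)} region pairs exceed r = {threshold}")
```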