66 results for Software Modification
Abstract:
Machine Learning for geospatial data: algorithms, software tools and case studies
The thesis is devoted to the analysis, modeling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence that is mainly concerned with the development of techniques and algorithms allowing computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? In short, most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can find solutions to classification, regression and probability density modeling problems in high-dimensional geo-feature spaces composed of geographical space and additional relevant spatially referenced features. They are well suited to be implemented as predictive engines in decision support systems for environmental data mining, including pattern recognition, modeling and prediction as well as automatic data mapping. Their efficiency is competitive with geostatistical models in low-dimensional geographical space, but they are indispensable in high-dimensional geo-feature spaces. The most important and popular machine learning algorithms and models of interest for the geo- and environmental sciences are presented in detail, from a theoretical description of the concepts to their software implementation. The main algorithms and models considered are the following: the multilayer perceptron (a workhorse of machine learning), general regression neural networks, probabilistic neural networks, self-organising (Kohonen) maps, Gaussian mixture models, radial basis function networks, and mixture density networks. This set of models covers machine learning tasks such as classification, regression and density estimation. Exploratory data analysis (EDA) is the initial and very important part of data analysis.
In this thesis, the concepts of exploratory spatial data analysis (ESDA) are considered using both the traditional geostatistical approach, such as experimental variography, and machine learning. Experimental variography is a basic tool for the geostatistical analysis of anisotropic spatial correlations, which helps to detect the presence of spatial patterns that can be described at least by two-point statistics. A machine learning approach to ESDA is presented by applying the k-nearest neighbors (k-NN) method, which is simple and has very good interpretation and visualization properties. An important part of the thesis deals with a current hot topic, namely the automatic mapping of geospatial data. The general regression neural network (GRNN) is proposed as an efficient model to solve this task. The performance of the GRNN model is demonstrated on Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under emergency conditions. The thesis consists of four chapters and has the following structure: theory, applications, software tools, and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office. The Machine Learning Office tools were developed over the last 15 years and have been used both for many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and for carrying out fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, classification of soil types and hydro-geological units, decision-oriented mapping with uncertainties, and natural hazard (landslide, avalanche) assessment and susceptibility mapping. Complementary tools useful for exploratory data analysis and visualisation were developed as well. The software is user-friendly and easy to use.
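The GRNN proposed here for automatic mapping is, in essence, Nadaraya-Watson kernel regression: each prediction is a Gaussian distance-weighted average of the measured values, with the kernel width as the single free parameter. The sketch below is a minimal Python illustration on invented data (coordinates, values and kernel width are hypothetical, not from the thesis); in practice the width would typically be tuned by cross-validation.

```python
import numpy as np

def grnn_predict(train_xy, train_z, query_xy, sigma):
    """General Regression Neural Network (Nadaraya-Watson kernel regression):
    each prediction is a Gaussian-weighted average of the training values,
    with the kernel width sigma as the only parameter."""
    preds = []
    for q in query_xy:
        d2 = np.sum((train_xy - q) ** 2, axis=1)      # squared distances to q
        w = np.exp(-d2 / (2.0 * sigma ** 2))          # Gaussian weights
        preds.append(np.dot(w, train_z) / np.sum(w))  # weighted mean
    return np.array(preds)

# Hypothetical example: 200 scattered measurements, predicted along a transect.
rng = np.random.default_rng(0)
xy = rng.uniform(0, 100, size=(200, 2))               # sampling locations
z = np.sin(xy[:, 0] / 15.0) + 0.1 * rng.standard_normal(200)
grid = np.array([[x, 50.0] for x in np.linspace(0, 100, 11)])
print(grnn_predict(xy, z, grid, sigma=5.0))
```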
Abstract:
Biomarkers of blood lipid modification and oxidative stress have been associated with increased cardiovascular morbidity. We sought to determine whether these biomarkers were related to functional indices of stenosis severity among patients with stable coronary artery disease. We studied 197 consecutive patients with stable coronary artery disease due to single-vessel disease. Fractional flow reserve (FFR) ≤ 0.80 was assessed as the index of a functionally significant lesion. Serum levels of secretory phospholipase A2 (sPLA2) activity, secretory phospholipase A2 type IIA (sPLA2-IIA), myeloperoxidase (MPO), lipoprotein-associated phospholipase A2 (Lp-PLA2), and oxidized low-density lipoprotein (OxLDL) were assessed using commercially available assays. Patients with FFR > 0.8 had lower sPLA2 activity, sPLA2-IIA, and OxLDL levels than patients with FFR ≤ 0.8 (21.25 [16.03-27.28] vs 25.85 [20.58-34.63] U/mL, p < 0.001; 2.0 [1.5-3.4] vs 2.6 [2.0-3.4] ng/mL, p < 0.01; and 53.0 [36.0-71.0] vs 64.5 [50-89.25], p < 0.001, respectively). Patients with FFR > 0.80 had Lp-PLA2 and MPO levels similar to those with FFR ≤ 0.8. sPLA2 activity and sPLA2-IIA significantly increased the area under the curve over baseline characteristics for predicting FFR ≤ 0.8 (from 0.67 to 0.77 [95% confidence interval, CI: 0.69-0.85], p < 0.01, and from 0.67 to 0.77 [95% CI: 0.69-0.84], p < 0.01, respectively). Serum sPLA2 activity, as well as sPLA2-IIA level, is related to the functional characteristics of coronary stenoses in patients with stable coronary artery disease.
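For readers unfamiliar with the incremental-AUC analysis mentioned above, the sketch below shows, on purely synthetic data, how adding a biomarker to a baseline logistic model changes the area under the ROC curve. The variables and data are invented, the fit is evaluated in-sample for brevity, and the original study's covariates and validation scheme are not reproduced here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 197
# Hypothetical stand-ins: baseline clinical covariates and one biomarker.
baseline = rng.standard_normal((n, 3))      # e.g. age, sex, risk factors
biomarker = rng.standard_normal((n, 1))     # e.g. sPLA2 activity
y = (baseline[:, 0] + biomarker[:, 0] + rng.standard_normal(n) > 0).astype(int)  # FFR <= 0.80

auc_base = roc_auc_score(
    y, LogisticRegression().fit(baseline, y).predict_proba(baseline)[:, 1])
full = np.hstack([baseline, biomarker])
auc_full = roc_auc_score(
    y, LogisticRegression().fit(full, y).predict_proba(full)[:, 1])
print(f"AUC baseline: {auc_base:.2f}  AUC baseline + biomarker: {auc_full:.2f}")
```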
Abstract:
We tested 12 peptide derivative variants for antigen recognition and T cell receptor (TCR)-ligand binding on seven H-2Kd-restricted cytotoxic T lymphocyte (CTL) clones specific for a bifunctional photoreactive derivative of the Plasmodium berghei circumsporozoite (PbCS) peptide 252-260 (SYIPSAEKI). The derivative contained iodo-4-azidosalicylic acid in place of PbCS S-252 and 4-azidobenzoic acid on PbCS K-259. Selective photoactivation of the N-terminal photoreactive group allowed crosslinking to Kd molecules, and photoactivation of the orthogonal group allowed crosslinking to the TCR. TCR photoaffinity labeling with covalent Kd-peptide derivative complexes allowed direct assessment of TCR-ligand binding on living CTL. In most cases (over 80%), cytotoxicity (chromium release) and TCR-ligand binding differed by less than fivefold. The exceptions included (a) partial TCR agonists (8 cases), for which antigen recognition was five- to tenfold less efficient than TCR-ligand binding, (b) TCR antagonists (2 cases), which were not recognized but were capable of inhibiting recognition of the wild-type conjugate, (c) heteroclitic agonists (2 cases), for which antigen recognition was more efficient than TCR-ligand binding, and (d) one partial TCR agonist, which activated only Fas (CD95)-mediated, but not perforin/granzyme-mediated, cytotoxicity. There was no correlation between these divergences and the avidity of TCR-ligand binding, indicating that factors other than binding avidity determine the nature of the CTL response. An unexpected and novel finding was that CD8-dependent clones are clearly more inclined to TCR antagonism than CD8-independent ones. As there was no correlation between CD8 dependence and the avidity of TCR-ligand binding, the possibility is suggested that CD8 plays a critical role in aberrant CTL function.
Abstract:
We have characterized the maturation, co- and posttranslational modifications, and functional properties of the alpha(1B)-adrenergic receptor (AR) expressed in different mammalian cells transfected using conventional approaches or the Semliki Forest virus system. We found that the alpha(1B)-AR undergoes N-linked glycosylation, as demonstrated by its sensitivity to endoglycosidases and by the effect of tunicamycin on receptor maturation. Pulse-chase labeling experiments in BHK-21 cells demonstrate that the alpha(1B)-AR is synthesized as a 70 kDa core-glycosylated precursor that is converted to the 90 kDa mature form of the receptor with a half-time of approximately 2 h. N-linked glycosylation of the alpha(1B)-AR occurs at four asparagines on the N-terminus of the receptor. Mutations of the N-linked glycosylation sites did not have a significant effect on receptor function or expression. Surprisingly, receptor mutants lacking N-linked glycosylation migrated as heterogeneous bands in SDS-PAGE. Our findings demonstrate that N-linked glycosylation and phosphorylation, but not palmitoylation or O-linked glycosylation, contribute to the structural heterogeneity of the alpha(1B)-AR observed in SDS-PAGE. The modifications found are similar in the different mammalian expression systems explored. Our findings indicate that the Semliki Forest virus system can provide large amounts of functional and fully glycosylated alpha(1B)-AR protein suitable for biochemical and structural studies. The results of this study contribute to elucidating the basic steps involved in the processing of G protein-coupled receptors and to optimizing strategies for their overexpression.
Abstract:
Aim. Several software packages (SWP) and models have been released for the quantification of myocardial perfusion (MP). Although they are all validated against something, the question remains how well their values agree. The present analysis focused on the cross-comparison of three SWP for MP quantification of 13N-ammonia PET studies. Materials & Methods. 48 rest and stress MP 13N-ammonia PET studies of hypertrophic cardiomyopathy (HCM) patients (Sciagrà et al., 2009) were analysed with three SWP - Carimas, PMOD, and FlowQuant - by three observers blinded to each other's results. All SWP implement the one-tissue-compartment model (1TCM, DeGrado et al. 1996), and the first two also implement the two-tissue-compartment model (2TCM, Hutchins et al. 1990). A linear mixed model for repeated measures was fitted to the data. Where appropriate, we used Bland-Altman plots as well. Reproducibility was assessed at the global, regional and segmental levels. Intraclass correlation coefficients (ICC) and differences between the SWP and between models were obtained. ICC ≥ 0.75 indicated excellent reproducibility, 0.4 ≤ ICC < 0.75 fair to good reproducibility, and ICC < 0.4 poor reproducibility (Rosner, 2010). Results. When 1TCM MP values were compared, the SWP agreement at the global and regional levels was excellent, except for Carimas vs. PMOD at RCA (ICC=0.715) and PMOD vs. FlowQuant at LCX (ICC=0.745), which were good. In the segmental analysis, in five segments (7, 12, 13, 16, and 17) the agreement between all SWP was excellent; in the remaining 12 segments the agreement varied between the compared SWP. Carimas showed excellent agreement with FlowQuant in 13 segments and good agreement in four (1, 5, 6, 11: 0.687≤ICC≤0.73); Carimas had excellent agreement with PMOD in 11 segments, good in five (4, 9, 10, 14, 15: 0.682≤ICC≤0.737), and poor in segment 3 (ICC=0.341). PMOD had excellent agreement with FlowQuant in eight segments and substantial-to-good agreement in nine (1, 2, 3, 5, 6, 8-11: 0.585≤ICC≤0.738). Agreement between Carimas and PMOD for the 2TCM was good at the global level (ICC=0.745), excellent at LCX (0.780) and RCA (0.774), and good at LAD (0.662); agreement was excellent for ten segments, fair-to-substantial for segments 2, 3, 8, 14, 15 (0.431≤ICC≤0.681), and poor for segments 4 (0.384) and 17 (0.278). Conclusions. The three SWP used by different operators to analyse 13N-ammonia PET MP studies provide results that agree well at the global and regional levels, and mostly well even at the segmental level. Agreement is better for the 1TCM. The poor agreement at segments 4 and 17 for the 2TCM needs further clarification.
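The reproducibility grading above hinges on the intraclass correlation coefficient. Below is a minimal sketch of one common ICC form, ICC(2,1) (Shrout and Fleiss), with the Rosner-style thresholds quoted in the abstract, applied to invented MBF values; the exact ICC variant and mixed-model setup used in the study may differ.

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single measurement.
    `ratings` is an (n_subjects, k_raters) array, e.g. MBF values for n studies
    as reported by k software packages."""
    n, k = ratings.shape
    grand = ratings.mean()
    ss_rows = k * np.sum((ratings.mean(axis=1) - grand) ** 2)
    ss_cols = n * np.sum((ratings.mean(axis=0) - grand) ** 2)
    ss_err = np.sum((ratings - grand) ** 2) - ss_rows - ss_cols
    ms_r, ms_c = ss_rows / (n - 1), ss_cols / (k - 1)
    ms_e = ss_err / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

def reproducibility(icc):
    """Thresholds quoted in the abstract (Rosner, 2010)."""
    return "excellent" if icc >= 0.75 else "fair to good" if icc >= 0.4 else "poor"

# Hypothetical MBF values (mL/min/g) for 6 studies rated by 3 packages.
mbf = np.array([[0.9, 1.0, 0.95], [2.1, 2.0, 2.2], [1.5, 1.4, 1.6],
                [3.0, 2.8, 3.1], [1.1, 1.2, 1.0], [2.5, 2.6, 2.4]])
icc = icc_2_1(mbf)
print(round(icc, 3), reproducibility(icc))
```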
Abstract:
In cases of transjugular liver biopsy, the venous angle formed between the chosen hepatic vein and the vena cava main axis in a frontal plane can be large, leading to technical difficulties. In a prospective study including 139 consecutive patients who underwent transjugular liver biopsy using the Quick-Core biopsy set, the mean venous angle was equal to 49.6 degrees. For 21.1% of the patients, two attempts at hepatic venous catheterization failed because the venous angle was too large, with a mean of 69.7 degrees. In all of these patients, manual reshaping of the distal curvature of the stiffening metallic cannula, forming a new mean angle equal to 48 degrees, allowed successful completion of the procedure in less than 10 min.
Abstract:
BACKGROUND: Current bilevel positive-pressure ventilators for home noninvasive ventilation (NIV) provide physicians with software that records items important for patient monitoring, such as compliance, tidal volume (Vt), and leaks. However, to our knowledge, the validity of this information has not yet been independently assessed. METHODS: Testing was done for seven home ventilators on a bench model adapted to simulate NIV and generate unintentional leaks (ie, other than those of the mask exhalation valve). Five levels of leaks were simulated using a computer-driven solenoid valve (0-60 L/min) at different levels of inspiratory pressure (15 and 25 cm H2O) and at a fixed expiratory pressure (5 cm H2O), for a total of 10 conditions. Bench data were compared with results retrieved from the ventilator software for leaks and Vt. RESULTS: For assessing leaks, three of the devices tested were highly reliable, with a small bias (0.3-0.9 L/min), narrow limits of agreement (LA), and high correlations (R², 0.993-0.997) when comparing ventilator software and bench results; conversely, for four ventilators, bias ranged from -6.0 L/min to -25.9 L/min, exceeding -10 L/min for two devices, with wide LA and lower correlations (R², 0.70-0.98). Bias for leaks increased markedly with the magnitude of the leak in three devices. Vt was underestimated by all devices, and the bias (range, 66-236 mL) increased with higher insufflation pressures. Only two devices had a bias < 100 mL with all testing conditions considered. CONCLUSIONS: Physicians monitoring patients who use home ventilation must be aware of differences in the estimation of leaks and Vt by ventilator software. Also, leaks are reported in different ways according to the device used.
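The bias and limits-of-agreement figures quoted above come from a Bland-Altman style comparison of software readings against the bench reference. The sketch below is a minimal illustration of that computation on invented leak readings; the numbers are hypothetical, not the study data.

```python
import numpy as np

def bland_altman(device_vals, bench_vals):
    """Bias and 95% limits of agreement between values reported by the
    ventilator software and the bench reference (e.g. leak in L/min)."""
    diff = np.asarray(device_vals, float) - np.asarray(bench_vals, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical readings for one device across the 10 test conditions.
bench  = [0, 0, 15, 15, 30, 30, 45, 45, 60, 60]        # L/min (reference)
device = [0.5, 0.2, 14, 15.5, 28, 29, 43, 44, 55, 57]  # L/min (software)
bias, loa = bland_altman(device, bench)
print(f"bias = {bias:.1f} L/min, 95% limits of agreement = {loa[0]:.1f} to {loa[1]:.1f} L/min")
```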
Abstract:
Occasional Adnominal Idiom Modification - A Cognitive Linguistic Approach. From a cognitive-linguistic perspective, this paper explores alternative types of adnominal modification in occasional variants of English verbal idioms. Discussed against data extracted from the British National Corpus (BNC), the model claims that in idiom production idiomatic constructions are activated as complex linguistic schemas to code a context-specific target conceptualisation. Adnominal pre- and postmodifications are one specific form of creative alteration that adapts the idiom for this purpose. Semantically, idiom-internal NP extension is not a uniform process. It is necessary to distinguish two systematic types of adnominal modification: external and internal modification (Ernst 1981). While external NP modification has an adverbial function, i.e. it modifies the idiom as a unit, internal modification applies directly to the head noun and thus depends on the degree of motivation and analysability of a given idiom. Following the cognitive-linguistic framework, these dimensions of idiom transparency result from the language user's ability to remotivate the bipartite semantic structure by means of conceptual metaphors and metonymies.
Abstract:
OBJECTIVE: Bench evaluation of the hydrodynamic behavior of venous cannulas is a valuable technique for the analysis of their performance during cardiopulmonary bypass (CPB). The aim of this study was to investigate the effect of the internal diameter of the extracorporeal connecting tube of venous cannulas on flow rate (Q), pressure drop (delta P), and cannula resistance (delta P/Q²) values, using a computer-assisted test bench. METHODS: An in vitro circuit was set up with silicone tubing between the test cannula, encased in a movable reservoir, and a static reservoir. The delta P, defined as the difference between the drainage pressure and the preload pressure, was measured using high-fidelity Millar pressure transducers. Q was measured using an ultrasonic flowmeter. Data display and data recording were controlled using virtual instruments in a stepwise fashion. RESULTS: The 27 F smartcanula® with a 9 mm connecting tube diameter showed 17% less resistance compared to that with an 8 mm connecting tube diameter. Q values were 7.22±0.1 and 7.81±0.04 L/min for cannulas with 8 mm and 9 mm connecting tube diameters, respectively. The delta P/Q² ratio values were 72% lower for the Medtronic cannula with a 9 mm connecting tube diameter compared to that with an 8 mm connecting tube diameter. Q values for the Medtronic cannula were 3.94±0.23 and 6.58±0.04 L/min with 8 mm and 9 mm connecting tube diameters, respectively. The 27 F smartcanula® showed a 13% higher flow rate than the 28 F Medtronic cannula (unpaired Student t-test, p<0.0001). CONCLUSIONS: Our results demonstrated that Q increased while delta P and delta P/Q² values decreased significantly when the connecting tube diameter was increased for venous cannulas. The connecting tube diameter significantly affected the resistance to liquid flow through the cannula. The smartcanula® outperformed the Medtronic cannula.
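The resistance index delta P/Q² used above is simple arithmetic; the sketch below illustrates it with the reported flow rates and hypothetical pressure-drop values, since the abstract does not give the matching delta P numbers.

```python
def cannula_resistance(delta_p, flow):
    """Cannula resistance index as defined above: R = deltaP / Q^2
    (pressure drop divided by the square of the flow rate)."""
    return delta_p / flow ** 2

# Q values from the abstract (L/min); deltaP values (mmHg) are invented
# here purely for illustration.
for label, dp, q in [("smartcanula, 8 mm tube", 38.0, 7.22),
                     ("smartcanula, 9 mm tube", 36.0, 7.81)]:
    print(label, round(cannula_resistance(dp, q), 2), "mmHg/(L/min)^2")
```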
Abstract:
Expression by Saccharomyces cerevisiae of a polyhydroxyalkanoate (PHA) synthase modified at the carboxy end by the addition of a peroxisome targeting signal derived from the last 34 amino acids of the Brassica napus isocitrate lyase (ICL) and containing the terminal tripeptide Ser-Arg-Met resulted in the synthesis of PHA. The ability of the terminal peptide Ser-Arg-Met and of the 34-amino-acid peptide from the B. napus ICL to target foreign proteins to the peroxisome of S. cerevisiae was demonstrated with green fluorescent protein fusions. PHA synthesis was found to be dependent on the presence of both the enzymes generating the beta-oxidation intermediate 3-hydroxyacyl-coenzyme A (3-hydroxyacyl-[CoA]) and the peroxin-encoding PEX5 gene, demonstrating the requirement for a functional peroxisome and a beta-oxidation cycle for PHA synthesis. Using a variant of the S. cerevisiae beta-oxidation multifunctional enzyme with a mutation inactivating the B domain of the R-3-hydroxyacyl-CoA dehydrogenase, it was possible to modify the PHA monomer composition through an increase in the proportion of the short-chain monomers of five and six carbons.
Novel insulated gamma- and lenti-retroviral vectors towards safer genetic modification of stem cells
Abstract:
In otherwise successful gene therapy trials, insertional mutagenesis has resulted in leukemia. The identification of new short synthetic genetic insulator elements (GIE) that would both prevent such activation effects and shield the transgene from silencing is a main challenge. Previous attempts, e.g. with beta-globin HS4, have met with poor efficacy and genetic instability. We have investigated potential improvement with two new candidate synthetic GIEs in SIN gammaretroviral and lentiviral vectors. With each construct, two internal promoters were tested: either the strong Fr-MuLV U3 or the housekeeping hPGK. We could identify a specific combination of insulator 2 repeats that translates into the best functional activity, high titers and a boundary effect in both gammaretro- and lentivectors. In target cells, a dramatic shift in expression is observed, with a homogeneous profile whose level strictly depends on the promoter strength. These data remain stable both in HeLa cells over three months and in cord blood HSCs over two months, irrespective of the multiplicity of infection (MOI). In comparison, expression levels of control native and SIN vectors are heterogeneous, depend on the MOI and prove unstable. We have undertaken genotoxicity assessment by comparing integration patterns in human target cells sampled over three months using high-throughput pyrosequencing. Data will be presented. Further genotoxicity assessment will include in vivo studies. We have established insulated vectors which harbour both boundary and enhancer-blocking effects and remain stable in prolonged in vitro culture conditions. Work performed with support of EC-DG Research FP6-NoE, CLINIGENE: LSHB-CT-2006-018933.
Abstract:
We propose a new approach and related indicators for globally distributed software support and development, based on a three-year process improvement project in a globally distributed engineering company. The company develops, delivers and supports a complex software system with tailored hardware components and unique end-customer installations. By applying domain knowledge from operations management on lead time reduction and its multiple benefits to process performance, the workflows of globally distributed software development and multitier support processes were measured and monitored throughout the company. The results show that global end-to-end process visibility and centrally managed reporting at all levels of the organization catalyzed a change process toward significantly better performance. Thanks to the new performance indicators, based on lead times and their variation, combined with fixed control procedures, the case company was able to report faster bug-fixing cycle times, improved response times and generally better customer satisfaction in its global operations. In all, lead times to implement new features and to respond to customer issues and requests were reduced by 50%.
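As an illustration of lead-time-based indicators of this kind, the sketch below computes lead time and its variation from a hypothetical ticket log; the data, field names and chosen summary statistics are assumptions for illustration, not the company's actual reporting scheme.

```python
from datetime import datetime
from statistics import mean, median, pstdev

def lead_time_days(opened, closed):
    """Lead time of one support ticket or feature request, in days."""
    return (closed - opened).total_seconds() / 86400

# Hypothetical ticket log: (opened, closed) timestamps.
tickets = [
    (datetime(2024, 1, 2), datetime(2024, 1, 9)),
    (datetime(2024, 1, 3), datetime(2024, 1, 5)),
    (datetime(2024, 1, 4), datetime(2024, 1, 20)),
    (datetime(2024, 1, 8), datetime(2024, 1, 12)),
]
lead_times = [lead_time_days(o, c) for o, c in tickets]

# End-to-end indicators: central tendency plus variation, which could be
# reported per process (development, multitier support) and per site.
print("median lead time:", median(lead_times), "days")
print("mean lead time:", round(mean(lead_times), 1), "days")
print("variation (std dev):", round(pstdev(lead_times), 1), "days")
```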
Abstract:
BACKGROUND: Adverse effects of combination antiretroviral therapy (CART) commonly result in treatment modification and poor adherence. METHODS: We investigated predictors of toxicity-related treatment modification during the first year of CART in 1318 antiretroviral-naive human immunodeficiency virus (HIV)-infected individuals from the Swiss HIV Cohort Study who began treatment between January 1, 2005, and June 30, 2008. RESULTS: The total rate of treatment modification was 41.5 (95% confidence interval [CI], 37.6-45.8) per 100 person-years. Of these, switches or discontinuations because of drug toxicity occurred at a rate of 22.4 (95% CI, 19.5-25.6) per 100 person-years. The most frequent toxic effects were gastrointestinal tract intolerance (28.9%), hypersensitivity (18.3%), central nervous system adverse events (17.3%), and hepatic events (11.5%). In the multivariate analysis, combined zidovudine and lamivudine (hazard ratio [HR], 2.71 [95% CI, 1.95-3.83]; P < .001), nevirapine (1.95 [1.01-3.81]; P = .050), comedication for an opportunistic infection (2.24 [1.19-4.21]; P = .01), advanced age (1.21 [1.03-1.40] per 10-year increase; P = .02), female sex (1.68 [1.14-2.48]; P = .009), nonwhite ethnicity (1.71 [1.18-2.47]; P = .005), higher baseline CD4 cell count (1.19 [1.10-1.28] per 100 cells/μL increase; P < .001), and HIV RNA of more than 5.0 log10 copies/mL (1.47 [1.10-1.97]; P = .009) were associated with higher rates of treatment modification. Almost 90% of individuals with treatment-limiting toxic effects were switched to a new regimen, and 85% achieved virologic suppression to less than 50 copies/mL at 12 months, compared with 87% of those continuing CART (P = .56). CONCLUSIONS: Drug toxicity remains a frequent reason for treatment modification; however, it does not affect treatment success. Close monitoring and management of adverse effects and drug-drug interactions are crucial for the durability of CART.
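The rates quoted above follow the standard person-time calculation. The sketch below shows a minimal version with an exact Poisson confidence interval, using invented event counts and follow-up time rather than the study data.

```python
from scipy.stats import chi2

def rate_per_100py(events, person_years):
    """Incidence rate per 100 person-years with an exact Poisson 95% CI
    (Garwood interval based on the chi-square distribution)."""
    rate = 100.0 * events / person_years
    lo = 100.0 * chi2.ppf(0.025, 2 * events) / 2.0 / person_years
    hi = 100.0 * chi2.ppf(0.975, 2 * (events + 1)) / 2.0 / person_years
    return rate, lo, hi

# Hypothetical counts chosen only for illustration (the abstract reports
# rates, not the underlying event counts and follow-up time).
events, person_years = 400, 964.0
rate, lo, hi = rate_per_100py(events, person_years)
print(f"{rate:.1f} (95% CI, {lo:.1f}-{hi:.1f}) per 100 person-years")
```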