855 results for Well-established techniques


Relevance: 90.00%

Abstract:

Impairment of lung liquid absorption can lead to severe respiratory symptoms, such as those observed in pulmonary oedema. In the adult lung, liquid absorption is driven by cation transport through two pathways: a well-established amiloride-sensitive Na(+) channel (ENaC) and, more controversially, an amiloride-insensitive channel that may belong to the cyclic nucleotide-gated (CNG) channel family. Here, we show robust CNGA1 (but not CNGA2 or CNGA3) channel expression principally in rat alveolar type I cells; CNGA3 was expressed in ciliated airway epithelial cells. Using a rat in situ lung liquid clearance assay, CNG channel activation with 1 mM 8Br-cGMP resulted in an approximately 1.8-fold stimulation of lung liquid absorption. There was no stimulation by 8Br-cGMP when applied in the presence of either 100 μM L-cis-diltiazem or 100 nM pseudechetoxin (PsTx), a specific inhibitor of CNGA1 channels. Channel specificity of PsTx and amiloride was confirmed by patch clamp experiments showing that CNGA1 channels in HEK 293 cells were not inhibited by 100 μM amiloride and that recombinant αβγ-ENaC was not inhibited by 100 nM PsTx. Importantly, 8Br-cGMP stimulated lung liquid absorption in situ, even in the presence of 50 μM amiloride. Furthermore, neither L-cis-diltiazem nor PsTx affected the β(2)-adrenoceptor agonist-stimulated lung liquid absorption, but, as expected, amiloride completely ablated it. Thus, transport through alveolar CNGA1 channels, located in type I cells, underlies the amiloride-insensitive component of lung liquid reabsorption. Furthermore, our in situ data highlight the potential of CNGA1 as a novel therapeutic target for the treatment of diseases characterised by lung liquid overload.

Relevance: 90.00%

Abstract:

The concept of antibody-mediated targeting of antigenic MHC/peptide complexes on tumor cells in order to sensitize them to T-lymphocyte cytotoxicity represents an attractive new immunotherapy strategy. In vitro experiments have shown that an antibody chemically conjugated or fused to monomeric MHC/peptide can be oligomerized on the surface of tumor cells, rendering them susceptible to efficient lysis by MHC-peptide-restricted specific T-cell clones. However, this strategy has not yet been tested entirely in vivo in immunocompetent animals. To this aim, we took advantage of OT-1 mice, which have a transgenic T-cell receptor specific for the ovalbumin (ova) immunodominant peptide (257-264) expressed in the context of the MHC class I H-2K(b). We prepared and characterized conjugates between the Fab' fragment from a high-affinity monoclonal antibody to carcinoembryonic antigen (CEA) and the H-2K(b)/ova peptide complex. First, we showed in OT-1 mice that the grafting and growth of a syngeneic colon carcinoma line transfected with CEA could be specifically inhibited by systemic injections of the conjugate. Next, using CEA transgenic C57BL/6 mice adoptively transferred with OT-1 spleen cells and immunized with ovalbumin, we demonstrated that systemic injections of the anti-CEA-H-2K(b)/ova conjugate could induce specific growth inhibition and regression of well-established, palpable subcutaneous grafts from the syngeneic CEA-transfected colon carcinoma line. These results, obtained in a well-characterized syngeneic carcinoma model, demonstrate that the antibody-MHC/peptide strategy can function in vivo. Further preclinical experimental studies, using an anti-viral T-cell response, will be performed before this new form of immunotherapy can be considered for clinical use.

Relevance: 90.00%

Abstract:

Transcatheter aortic valve therapies are the newest established techniques for the treatment of high-risk patients affected by severe symptomatic aortic valve stenosis. The transapical approach requires a left anterolateral mini-thoracotomy, whereas the transfemoral method requires adequate peripheral vascular access and can be performed fully percutaneously. Alternatively, the trans-subclavian access has recently been proposed as a third promising approach. Depending on the technique, fine stent-valve positioning can be performed with or without contrast injections. The transapical echo-guided stent-valve implantation without angiography (the Lausanne technique) relies entirely on transoesophageal echocardiographic imaging for fine stent-valve positioning, and this technique has been shown to prevent the onset of postoperative contrast-related acute kidney failure. Recently published reports have shown good hospital outcomes and short-term results after transcatheter aortic valve implantation, but there are no proven advantages in using the transfemoral or the transapical technique. In particular, the transapical series have a higher mean logistic EuroSCORE of 27-35%, a procedural success rate above 95% and a mean 30-day mortality between 7.5 and 17.5%, whereas the transfemoral results show a lower logistic EuroSCORE of 23-25.5%, a procedural success rate above 90% and a 30-day mortality of 7-10.8%. Nevertheless, further clinical trials and long-term results are needed to confirm this positive trend. Future perspectives in transcatheter aortic valve therapies include the development of intravascular devices for the ablation of diseased valve leaflets and the launch of new stent-valves with improved haemodynamics, different sizes and smaller delivery systems.

Relevance: 90.00%

Abstract:

The interest in solar ultraviolet (UV) radiation from the scientific community and the general population has risen significantly in recent years because of the link between increased UV levels at the Earth's surface and depletion of ozone in the stratosphere. As a consequence of recent research, UV radiation climatologies have been developed, and the effects of some atmospheric constituents (such as ozone or aerosols) have been studied broadly. Correspondingly, there are well-established relationships between, for example, total ozone column and UV radiation levels at the Earth's surface. The effects of clouds, however, are not so well described, given the intrinsic difficulties in properly describing cloud characteristics. Nevertheless, the effect of clouds cannot be neglected, and the variability that clouds induce on UV radiation is particularly significant when short timescales are involved. In this review we present, summarize, and compare several works that deal with the effect of clouds on UV radiation. Specifically, the works reviewed here approach the issue from the empirical point of view: each gives some relationship between measured UV radiation in cloudy conditions and cloud-related information. Basically, there are two groups of methods: techniques based on observations of cloudiness (either from human observers or by using devices such as sky cameras) and techniques that use measurements of broadband solar radiation as a surrogate for cloud observations. Some techniques combine both types of information. Comparison of results from different works is addressed by using the cloud modification factor (CMF), defined as the ratio between measured UV radiation in a cloudy sky and calculated radiation for a cloudless sky. Typical CMF values for overcast skies range from 0.3 to 0.7, depending on cloud type and characteristics. Despite this large dispersion of values corresponding to the same cloud cover, it is clear that the cloud effect on UV radiation is 15–45% lower than the cloud effect on total solar radiation. The cloud effect is usually a reducing effect, but a significant number of works report an enhancement effect (that is, increased UV radiation levels at the surface) due to the presence of clouds. The review concludes with some recommendations for future studies aimed at further analyzing cloud effects on UV radiation.
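As a quick illustration of the CMF defined above, here is a minimal Python sketch; the variable names and example irradiance values are hypothetical, and in practice the clear-sky term would come from a radiative transfer model or a clear-sky parameterization:

    # Cloud modification factor: ratio of UV measured under a cloudy sky
    # to the UV calculated for a cloudless sky at the same time and place.
    def cloud_modification_factor(uv_measured, uv_clear_sky):
        return uv_measured / uv_clear_sky

    # Hypothetical overcast example (erythemally weighted irradiances, W/m^2):
    cmf = cloud_modification_factor(uv_measured=0.08, uv_clear_sky=0.16)
    print(f"CMF = {cmf:.2f}")  # 0.50, within the 0.3-0.7 overcast range cited above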

Relevance: 90.00%

Abstract:

The proportion of the population living in or around cities is greater than ever. Urban sprawl and car dependence have taken over from the pedestrian-friendly compact city. Environmental problems such as air pollution, land waste and noise, as well as health problems, are the result of this still-continuing process. Urban planners have to find solutions to these complex problems while at the same time ensuring the economic performance of the city and its surroundings. Meanwhile, an increasing quantity of socio-economic and environmental data is being acquired. In order to gain a better understanding of the processes and phenomena taking place in the complex urban environment, these data should be analysed. Numerous methods for modelling and simulating such a system exist, are still under development, and can be exploited by urban geographers to improve our understanding of the urban metabolism. Modern and innovative visualisation techniques help in communicating the results of such models and simulations. This thesis covers several methods for the analysis, modelling, simulation and visualisation of problems related to urban geography. The analysis of high-dimensional socio-economic data using artificial neural network techniques, especially self-organising maps, is shown using two examples at different scales. The problem of spatio-temporal modelling and data representation is treated and some possible solutions are shown. The simulation of urban dynamics, and more specifically of the traffic generated by commuting to work, is illustrated using multi-agent micro-simulation techniques. A section on visualisation methods presents cartograms for transforming geographic space into a feature space, and the distance circle map, a centre-based map representation particularly useful for urban agglomerations. Some issues concerning the importance of scale in urban analysis and the clustering of urban phenomena are discussed. A new approach to defining urban areas at different scales is developed, and the link with percolation theory is established. Fractal statistics, especially the lacunarity measure, and scaling laws are used to characterise urban clusters. In a final section, population evolution is modelled using a model close to the well-established gravity model. The work covers a wide range of methods useful in urban geography. These methods should be developed further and, at the same time, find their way into the daily work and decision processes of urban planners.
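The gravity model mentioned in the closing lines is the classical formulation in which the interaction between two places grows with their populations and decays with distance. A minimal Python sketch, with purely illustrative constants (k and beta would be fitted to data):

    # Classical gravity model: T_ij = k * P_i * P_j / d_ij**beta,
    # where P_i, P_j are populations and d_ij the distance between places.
    def gravity_flow(pop_i, pop_j, distance_km, k=1e-6, beta=2.0):
        return k * pop_i * pop_j / distance_km ** beta

    # Hypothetical example: towns of 50,000 and 120,000 inhabitants, 30 km apart.
    print(gravity_flow(50_000, 120_000, 30))  # predicted interaction, arbitrary units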

Relevance: 90.00%

Abstract:

Asphalt binder is typically modified with styrene-butadiene-styrene (SBS) block copolymers to improve its rheological properties and performance grade. Butadiene is the elastic and principal component of SBS polymers. For the last decade, butadiene prices have fluctuated and significantly increased, leading state highway agencies to search for economically viable alternatives to butadiene-based materials. This project reports recent advances in polymerization techniques that have enabled the synthesis of elastomeric, thermoplastic block copolymers (BCPs) comprised of styrene and soybean oil, where the "B" block in SBS polymers is replaced with polymerized triglycerides derived from soybean oil. These new biopolymers have elastomeric properties comparable to well-established butadiene-based styrenic BCPs. In this report, two biopolymer formulations are evaluated for their ability to modify asphalt binder. Laboratory blends of asphalt modified with the biopolymers are tested for their rheological properties and performance grade. Blends of asphalt modified with the biopolymers are compared to blends of asphalt modified with two commonly used commercial polymers. The viscoelastic properties of the blends show that the biopolymers improve the performance grade of the asphalt to a similar or even greater extent than the commercial SBS polymers. Results shown in this report indicate excellent potential for these biopolymers as economically and environmentally favorable alternatives to their petrochemically derived analogs.

Relevance: 90.00%

Abstract:

In a multicenter study, the new, fully automated Roche Diagnostics Elecsys HBsAg II screening assay, with improved sensitivity for HBsAg mutant detection, was compared to well-established HBsAg tests: AxSYM HBsAg V2 (Abbott), Architect HBsAg (Abbott), Advia Centaur HBsAg (Bayer), Enzygnost HBsAg 5.0 (Dade-Behring), and Vitros Eci HBsAg (Ortho). A total of 16 seroconversion panels, samples of 60 HBsAg native mutants and 31 HBsAg recombinant mutants, dilution series of NIBSC and PEI standards, 156 HBV positive samples comprising genotypes A to G, 686 preselected HBsAg positive samples from different stages of infection, 3,593 samples from daily routine, and 6,360 unselected blood donations were tested to evaluate the analytical and clinical sensitivity, the detection of mutants, and the specificity of the new assay. Elecsys HBsAg II showed statistically significantly better sensitivity in seroconversion panels than the comparator tests. Fifty-seven of 60 native mutants and all recombinant mutants were found positive. Among 156 HBV samples with different genotypes and 696 preselected HBsAg positive samples, Elecsys HBsAg II achieved a sensitivity of 100%. The lower detection limit for the NIBSC standard was calculated to be 0.025 IU/ml, and for the PEI standards ad and ay it was <0.001 and <0.005 U/ml, respectively. Within 2,724 daily routine specimens and 6,360 unselected blood donations, Elecsys HBsAg II showed a specificity of 99.97% and 99.88%, respectively. In conclusion, the new Elecsys HBsAg II shows high sensitivity for the detection of all stages of HBV infection and of HBsAg mutants, together with high specificity in blood donors, daily routine samples, and potentially interfering sera.
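For reference, the sensitivity and specificity figures quoted above are simple proportions; a minimal Python sketch with hypothetical counts (not the study's actual raw numbers, which are not given here):

    # Diagnostic performance measures used above:
    #   sensitivity = TP / (TP + FN)  -> proportion of true positives detected
    #   specificity = TN / (TN + FP)  -> proportion of true negatives correctly negative
    def sensitivity(tp, fn):
        return tp / (tp + fn)

    def specificity(tn, fp):
        return tn / (tn + fp)

    # Hypothetical example: 1 false positive among 2,724 negative routine specimens.
    print(f"{specificity(tn=2723, fp=1):.4%}")  # 99.9633%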

Relevance: 90.00%

Abstract:

In some high-risk patients, standard mitral valve replacement can represent a challenging procedure, requiring a risky, extensive decalcification of the annulus. In particular, high-risk redo patients and patients with a previously implanted transcatheter aortic valve who develop calcific mitral disease would benefit from the development of new, minimally invasive, transcatheter or hybrid techniques for mitral valve replacement. Combining transcatheter valve therapies with well-established minimally invasive techniques for mitral replacement or repair can help decrease the surgical risk and the technical complexity. Thus, placing transcatheter, balloon-expandable Sapien XT stent-valves in calcified, degenerated mitral valves through a right thoracotomy, a left atriotomy and on an on-pump fibrillating heart represents an attractive alternative to standard surgery in redo patients, in patients with concomitant transcatheter aortic stent-valves in place and in patients with a high-risk profile. We describe this hybrid technique in detail.

Relevance: 90.00%

Abstract:

The pharmaceutical industry has been facing several challenges in recent years, and optimization of its drug discovery pipeline is believed to be the only viable solution. High-throughput techniques participate actively in this optimization, especially when complemented by computational approaches aimed at rationalizing the enormous amount of information they can produce. In silico techniques, such as virtual screening or rational drug design, are now routinely used to guide drug discovery. Both rely heavily on the prediction of the molecular interaction (docking) occurring between drug-like molecules and a therapeutically relevant target. Several software packages are available to this end, but despite the very promising picture drawn in most benchmarks, they still hold several hidden weaknesses. As pointed out in several recent reviews, the docking problem is far from being solved, and there is now a need for methods able to identify binding modes with high accuracy, which is essential to reliably compute the binding free energy of the ligand. This quantity is directly linked to its affinity and can be related to its biological activity. Accurate docking algorithms are thus critical for both the discovery and the rational optimization of new drugs. In this thesis, a new docking software aiming at this goal is presented: EADock. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with sophisticated management of diversity. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins. The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure and, in contrast to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root mean square deviation (RMSD) from the crystal structure. This validation illustrates the efficiency of our sampling heuristic, as correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and to 92% when all clusters present in the last generation are taken into account. Most failures in this benchmark could be explained by the presence of crystal contacts in the experimental structure. EADock has been used to understand molecular interactions involved in the regulation of the Na,K-ATPase and in the activation of the nuclear hormone peroxisome proliferator-activated receptor α (PPARα). It also helped to understand the action of common pollutants (phthalates) on PPARγ, and the impact of biotransformations of the anticancer drug Imatinib (Gleevec®) on its binding mode to the Bcr-Abl tyrosine kinase. Finally, a fragment-based rational drug design approach using EADock was developed, leading to the successful design of new peptidic ligands for the α5β1 integrin and for human PPARα. In both cases, the designed peptides presented activities comparable to those of well-established ligands such as the anticancer drug Cilengitide and Wy14,643, respectively.
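The 2 Å criterion above rests on the root mean square deviation between a predicted pose and the crystallographic one. A minimal numpy sketch, assuming the two N×3 coordinate arrays are already superposed in the same frame (illustrative only, not EADock's internal code):

    import numpy as np

    # RMSD between predicted and crystallographic ligand coordinates (N atoms x 3).
    def rmsd(pred, crystal):
        diff = pred - crystal
        return np.sqrt((diff ** 2).sum(axis=1).mean())

    # A pose counts as a correct binding mode when its RMSD is below 2 Å.
    def is_correct_binding_mode(pred, crystal, cutoff=2.0):
        return rmsd(pred, crystal) < cutoff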

Relevance: 90.00%

Abstract:

It is well established that cancer cells can recruit CD11b(+) myeloid cells to promote tumor angiogenesis and tumor growth. Increasing interest has emerged in the identification of subpopulations of tumor-infiltrating CD11b(+) myeloid cells using flow cytometry techniques. In the literature, however, discrepancies exist on the phenotype of these cells (Coffelt et al., Am J Pathol 2010;176:1564-1576). Since flow cytometry analysis requires particular precautions for accurate sample preparation and reliable data acquisition, analysis, and interpretation, some discrepancies might be due to technical reasons rather than biological grounds. We used the syngeneic orthotopic 4T1 mammary tumor model in immunocompetent BALB/c mice to analyze and compare the phenotype of CD11b(+) myeloid cells isolated from peripheral blood and from tumors, using six-color flow cytometry. We report here that nonspecific antibody binding through Fc receptors and the presence of dead cells and cell doublets in tumor-derived samples combine to generate artifacts in the phenotype of tumor-infiltrating CD11b(+) subpopulations. We show that the heterogeneity of tumor-infiltrating CD11b(+) subpopulations analyzed without particular precautions was greatly reduced upon Fc block treatment and the exclusion of dead cells and cell doublets. Phenotyping of tumor-infiltrating CD11b(+) cells was particularly sensitive to these parameters compared to circulating CD11b(+) cells. Taken together, our results identify Fc block treatment and the exclusion of dead cells and cell doublets as simple but crucial steps for the proper analysis of tumor-infiltrating CD11b(+) cell populations.
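The two analysis-side corrections the authors recommend (doublet and dead-cell exclusion; Fc block is applied at staining) amount to boolean gates on the event data. A schematic numpy sketch with hypothetical channel names and thresholds:

    import numpy as np

    # Schematic gating: keep singlet, viable events before phenotyping.
    # Thresholds and channel names are hypothetical placeholders; real gates
    # are set per experiment on the acquired data.
    def gate_singlets_live(fsc_a, fsc_h, viability_dye, ratio_max=1.3, dye_max=1e3):
        singlets = (fsc_a / fsc_h) < ratio_max  # doublets show high FSC-A/FSC-H
        live = viability_dye < dye_max          # dye-bright events are dead cells
        return singlets & live                  # boolean mask of events to keep

    # CD11b+ subpopulations would then be analyzed only on the gated events.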

Relevance: 90.00%

Abstract:

Electrostatic phenomena were discovered long ago, but their interpretation according to well-established atomic-molecular theory is still lacking. As a result, electrostatic phenomena are often irreproducible and uncontrolled, causing serious practical problems. Highly reproducible recent experimental results on electrostatic charging from this and other laboratories are reviewed in this work, together with a description of the relevant but less common Kelvin probe and Faraday cup techniques. These results support a new model for the electrostatic charging of dielectrics and insulated metals, based on the role of a moist atmosphere as a charge reservoir.

Relevance: 90.00%

Abstract:

Poorly soluble drugs have low bioavailability, representing a major challenge for the pharmaceutical industry. Processing drugs into the nanosized range changes their physical properties, and these changes are being exploited in pharmaceutics to develop innovative formulations known as nanocrystals. The use of nanocrystals to overcome the problem of low bioavailability, and their production using different techniques such as microfluidization or high-pressure homogenization, is reviewed in this paper. Examples of drugs, cosmetics and nutraceutical ingredients are also discussed. These technologies are well established in the pharmaceutical industry and are approved by the Food and Drug Administration.

Relevance: 90.00%

Abstract:

Machine learning provides tools for the automated construction of predictive models in data-intensive areas of engineering and science. The family of regularized kernel methods has in recent years become one of the mainstream approaches to machine learning, due to a number of advantages the methods share. The approach provides theoretically well-founded solutions to the problems of under- and overfitting, allows learning from structured data, and has been empirically demonstrated to yield high predictive performance on a wide range of application domains. Historically, the problems of classification and regression have received the majority of attention in the field. In this thesis we focus on another type of learning problem: learning to rank. In learning to rank, the aim is to learn, from a set of past observations, a ranking function that can order new objects according to how well they match some underlying criterion of goodness. As an important special case of the setting, we recover the bipartite ranking problem, corresponding to maximizing the area under the ROC curve (AUC) in binary classification. Ranking applications appear in a large variety of settings; examples encountered in this thesis include document retrieval in web search, recommender systems, information extraction, and automated parsing of natural language. We consider the pairwise approach to learning to rank, where ranking models are learned by minimizing the expected probability of ranking any two randomly drawn test examples incorrectly. The development of computationally efficient kernel methods based on this approach has in the past proven to be challenging. Moreover, it is not clear which techniques for estimating the predictive performance of learned models are the most reliable in the ranking setting, or how these techniques can be implemented efficiently. The contributions of this thesis are as follows. First, we develop RankRLS, a computationally efficient kernel method for learning to rank that is based on minimizing a regularized pairwise least-squares loss. In addition to training methods, we introduce a variety of algorithms for tasks such as model selection, multi-output learning, and cross-validation, based on computational shortcuts from matrix algebra. Second, we improve the fastest known training method for the linear version of the RankSVM algorithm, one of the most well-established methods for learning to rank. Third, we study the combination of the empirical kernel map and reduced set approximation, which allows the large-scale training of kernel machines using linear solvers, and propose computationally efficient solutions to cross-validation when using this approach. Next, we explore the problem of reliable cross-validation when using AUC as a performance criterion, through an extensive simulation study. We demonstrate that the proposed leave-pair-out cross-validation approach leads to more reliable performance estimation than commonly used alternative approaches. Finally, we present a case study on applying machine learning to information extraction from biomedical literature, which combines several of the approaches considered in the thesis. The thesis is divided into two parts: Part I provides the background for the research work and summarizes the most central results, while Part II consists of the five original research articles that are the main contribution of this thesis.
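In the bipartite case described above, the pairwise ranking error equals one minus the AUC, and leave-pair-out cross-validation estimates it by holding out one positive-negative pair at a time. A minimal Python sketch; fit_score is a hypothetical stand-in for training a ranker (e.g. RankRLS) on the retained indices and scoring the held-out pair:

    from itertools import product

    # Bipartite pairwise error: fraction of (positive, negative) pairs ordered
    # incorrectly by the scores; ties count as half an error. 1 - error = AUC.
    def pairwise_error(pos_scores, neg_scores):
        bad = sum(1.0 if p < n else 0.5 if p == n else 0.0
                  for p, n in product(pos_scores, neg_scores))
        return bad / (len(pos_scores) * len(neg_scores))

    # Leave-pair-out cross-validation: each (positive, negative) pair is held
    # out in turn; fit_score(train_idx, test_idx, X, y) is assumed to train on
    # train_idx and return scores for test_idx.
    def leave_pair_out_auc(X, y, fit_score):
        pos = [i for i, v in enumerate(y) if v == 1]
        neg = [i for i, v in enumerate(y) if v == 0]
        correct = 0.0
        for i, j in product(pos, neg):
            train = [k for k in range(len(y)) if k not in (i, j)]
            s_i, s_j = fit_score(train, [i, j], X, y)
            correct += 1.0 if s_i > s_j else 0.5 if s_i == s_j else 0.0
        return correct / (len(pos) * len(neg))  # AUC estimate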

Relevance: 90.00%

Abstract:

The dogma that G protein-coupled receptors (GPCRs) activate signalling pathways only when localized at the plasma membrane has recently been called into question. Recent data indicate that certain GPCRs can also induce an intracellular response from intracellular compartments, including the nucleus. Protease-activated receptors (PARs) are members of the GPCR family. PARs are activated by cleavage of the N-terminal part of the receptor, which allows the receptor-tethered ligand to bind to its receptor pocket. Four PARs have been described: PAR1, PAR2, PAR3 and PAR4. PAR2 can elicit mitogenic effects and participate in processes such as angiogenesis and inflammation. While many intracellular effects of PAR2 can be explained by its localization at the plasma membrane, an intracrine function of PAR2 has also been proposed. However, the mechanisms by which PAR2 can drive the expression of target genes are still unknown. The aim of our study was to verify the existence of a nuclear population of PAR2. We also hypothesized that the pathways activated by PAR2 depend on its cellular localization. Using confocal microscopy and Western blotting, we demonstrated the presence of a nuclear population of PAR2. Following PAR2 stimulation, we observed increased translocation of the receptor from the plasma membrane to the nucleus. Using RT-PCR, we observed different roles for cell-surface and nuclear PAR2 in initiating gene expression. To identify the mechanisms responsible for the nuclear translocation of PAR2, we evaluated the involvement of members of the sorting nexin (SNX) family. Sorting nexins are a group of proteins with well-established transport functions. SNX1 and SNX2 have been identified as responsible for targeting PAR1 to the lysosomes. SNX11 had not yet been studied, and we hypothesized that it could be another member of the SNX family involved in PAR2 signalling. To this end, we generated stable knockdowns of SNX1, SNX2 and SNX11 in HEK293 cells. Using immunofluorescence assays, Western blotting and flow cytometry, we determined that all three SNX proteins are interaction partners of PAR2. However, only SNX11 co-localizes with PAR2 at the nucleus and is responsible for its nuclear translocation. RT-PCR experiments on the SNX knockdown cell lines demonstrated that the function of nuclear PAR2 depends mainly on SNX11; nevertheless, SNX1 and SNX2 can also influence it, suggesting that they too are part of the PAR2 signalling network. In conclusion, PAR2 is translocated from the plasma membrane to the nuclear membrane after agonist stimulation. The nuclear translocation of PAR2, through a mechanism involving SNX11, initiates intracellular effects distinct from its membrane signalling. Keywords: G protein-coupled receptors, sorting nexins, protease-activated receptors, nuclear translocation, nuclear membrane, nuclear signalling.

Relevance: 90.00%

Abstract:

When e-learning emerged 20 years ago, it consisted simply of text displayed on a computer screen, like a book. With changes and advances in technology, e-learning has come a long way, now offering personalized, interactive, content-rich educational material. Today, e-learning is transforming once again. Indeed, with the proliferation of e-learning systems and educational content authoring tools, along with established standards, it has become easier to share and reuse learning content. Moreover, with the shift to learner-centred teaching methods, together with the effect of Web 2.0 techniques and technologies, learners are no longer mere recipients of learning content but can play a more active role in enriching it. Furthermore, given the amount of information that e-learning systems can accumulate about learners, and the impact this can have on their privacy, concerns have been raised about protecting learners' privacy. To the best of our knowledge, no existing solutions address the various problems raised by these changes. In this work, we address these issues by presenting Cadmus, SHAREK, and privacy-preserving e-learning. Specifically, Cadmus is an IMS QTI-compliant web platform offering a framework and suitable tools to enable tutors to author and share test questions and exams. In particular, Cadmus provides modules such as EQRS (Exam Question Recommender System) to help tutors locate appropriate questions for their exams, ICE (Identification of Conflicts in Exams) to help resolve conflicts between questions contained in the same exam, and the Topic Tree, designed to help tutors better organize their exam questions and easily ensure coverage of the different topics addressed in the exams. SHAREK (Sharing REsources and Knowledge), on the other hand, provides a framework for getting the best of both worlds: the robustness of e-learning systems and the flexibility of the PLE (Personal Learning Environment), while allowing learners to enrich the learning content and helping them locate new learning resources. Specifically, SHAREK combines a multi-criteria recommender system with Web 2.0 techniques and technologies, such as RSS and the social web, to promote new learning resources and help learners locate suitable content. Finally, to address the various privacy needs in e-learning, we propose a framework with four privacy levels and four traceability levels, and we present ACES (Anonymous Credentials for E-learning Systems), a set of protocols based on well-established cryptographic techniques to help learners achieve their desired privacy level.
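As a toy illustration of the multi-criteria recommendation idea behind SHAREK (the criteria names, weights, and aggregation rule below are hypothetical placeholders, not the system's actual model):

    # Toy multi-criteria scoring: each learning resource is rated on several
    # criteria and ranked by a weighted aggregate. All values are illustrative.
    WEIGHTS = {"relevance": 0.5, "quality": 0.3, "freshness": 0.2}

    def aggregate_score(ratings):
        return sum(WEIGHTS[c] * ratings[c] for c in WEIGHTS)

    resources = {
        "intro-video": {"relevance": 0.9, "quality": 0.7, "freshness": 0.4},
        "lecture-notes": {"relevance": 0.6, "quality": 0.9, "freshness": 0.8},
    }
    ranked = sorted(resources, key=lambda r: aggregate_score(resources[r]), reverse=True)
    print(ranked)  # resources ordered from most to least recommended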