16 results for pacs: engineering mathematics and mathematical techniques

at Université de Lausanne, Switzerland


Relevance: 100.00%

Abstract:

State-of-the-art production technologies for conjugate vaccines are complex, multi-step processes. An alternative approach to producing glycoconjugates is based on the bacterial N-linked protein glycosylation system first described in Campylobacter jejuni. The C. jejuni N-glycosylation system has been successfully transferred into Escherichia coli, enabling in vivo production of customized recombinant glycoproteins. However, some antigenic bacterial cell surface polysaccharides, like the Vi antigen of Salmonella enterica serovar Typhi, have not been reported to be accessible to the bacterial oligosaccharyltransferase PglB, which hampers the development of novel conjugate vaccines against typhoid fever. In this report, Vi-like polysaccharide structures that can be transferred by PglB were evaluated as typhoid vaccine components. A polysaccharide fulfilling these requirements was found in Escherichia coli O121. Inactivation of wbqG, a gene encoded in the E. coli O121 O-antigen cluster, resulted in expression of O polysaccharides reactive with antibodies raised against the Vi antigen. The structure of the recombinantly expressed mutant O polysaccharide was elucidated using a novel HPLC- and mass-spectrometry-based method for purified undecaprenyl pyrophosphate (Und-PP)-linked glycans, and the presence of epitopes also found in the Vi antigen was confirmed. The mutant O antigen structure was transferred to acceptor proteins using the bacterial N-glycosylation system, and the immunogenicity of the resulting conjugates was evaluated in mice. The conjugate-induced antibodies reacted in an enzyme-linked immunosorbent assay with E. coli O121 LPS. One animal developed a significant rise in serum anti-Vi immunoglobulin titer upon immunization.

Relevance: 100.00%

Abstract:

A methodology of exploratory data analysis investigating the phenomenon of orographic precipitation enhancement is proposed. The precipitation observations obtained from three Swiss Doppler weather radars are analysed for the major precipitation event of August 2005 in the Alps. Image processing techniques are used to detect significant precipitation cells/pixels in radar images while filtering out spurious effects due to ground clutter. The contribution of topography to precipitation patterns is described by an extensive set of topographical descriptors computed from the digital elevation model at multiple spatial scales. Additionally, the motion vector field is derived from subsequent radar images and integrated into the set of topographic features to highlight the slopes exposed to the main flows. Following the exploratory data analysis with a recent spectral clustering algorithm, it is shown that orographic precipitation cells are generated under specific flow and topographic conditions. The repeatability of precipitation patterns in particular spatial locations is found to be linked to specific local terrain shapes, e.g. at the tops of hills and on the upwind side of mountains. This methodology and our empirical findings for the Alpine region provide a basis for building computational data-driven models of the orographic enhancement and triggering of precipitation. Copyright © 2011 Royal Meteorological Society.
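The clustering step mentioned above can be sketched in miniature: build a similarity graph over the samples and bipartition them by the sign pattern of the Fiedler vector of the graph Laplacian. Everything below (toy 1-D points, a Gaussian similarity with sigma = 1, a power-iteration eigen-solver) is an illustrative assumption, not the algorithm or the feature set used in the study.

```python
import math

def spectral_bipartition(points, sigma=1.0):
    """Split 1-D points into two groups via the Fiedler vector of L = D - W."""
    n = len(points)
    # Gaussian similarity graph over all point pairs.
    W = [[math.exp(-((points[i] - points[j]) ** 2) / (2 * sigma ** 2))
          for j in range(n)] for i in range(n)]
    d = [sum(row) for row in W]
    # Unnormalised graph Laplacian L = D - W (row sums are zero).
    L = [[(d[i] if i == j else 0.0) - W[i][j] for j in range(n)]
         for i in range(n)]
    # Power iteration on c*I - L, with the constant eigenvector deflated,
    # converges to the eigenvector of the second-smallest eigenvalue of L.
    c = 2.0 * max(d)
    v = [math.sin(i + 1.0) for i in range(n)]  # arbitrary start vector
    for _ in range(500):
        mean = sum(v) / n
        v = [x - mean for x in v]  # project out the constant eigenvector
        w = [c * v[i] - sum(L[i][j] * v[j] for j in range(n))
             for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    # The sign pattern of the Fiedler vector gives the two clusters.
    return [0 if x < 0.0 else 1 for x in v]

labels = spectral_bipartition([0.0, 0.1, 0.2, 5.0, 5.1, 5.2])
```

On two well-separated groups the similarity graph is nearly disconnected, so the Fiedler vector is close to piecewise constant with opposite signs on the two groups, which is what the sign split exploits.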

Relevance: 100.00%

Abstract:

Despite their limited proliferation capacity, regulatory T cells (T(regs)) constitute a population maintained over the entire lifetime of a human organism. The means by which T(regs) sustain a stable pool in vivo are controversial. Using a mathematical model, we address this issue by evaluating several biological scenarios of the origins and the proliferation capacity of two subsets of T(regs): precursor CD4(+)CD25(+)CD45RO(-) and mature CD4(+)CD25(+)CD45RO(+) cells. The lifelong dynamics of T(regs) are described by a set of ordinary differential equations, driven by a stochastic process representing the major immune reactions involving these cells. The model dynamics are validated using data from human donors of different ages. Analysis of the data led to the identification of two properties of the dynamics: (1) the equilibrium in the CD4(+)CD25(+)FoxP3(+)T(regs) population is maintained over both precursor and mature T(regs) pools together, and (2) the ratio between precursor and mature T(regs) is inverted in the early years of adulthood. Then, using the model, we identified three biologically relevant scenarios that have the above properties: (1) the unique source of mature T(regs) is the antigen-driven differentiation of precursors that acquire the mature profile in the periphery and the proliferation of T(regs) is essential for the development and the maintenance of the pool; there exist other sources of mature T(regs), such as (2) a homeostatic density-dependent regulation or (3) thymus- or effector-derived T(regs), and in both cases, antigen-induced proliferation is not necessary for the development of a stable pool of T(regs). This is the first time that a mathematical model built to describe the in vivo dynamics of regulatory T cells is validated using human data. 
The application of this model provides an invaluable tool for estimating the number of regulatory T cells as a function of time in the blood of patients who have received a solid organ transplant or suffer from an autoimmune disease.
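The lifelong two-pool dynamics described above can be caricatured with a toy ordinary-differential-equation sketch: a precursor pool P fed at a constant rate, and a mature pool M fed by conversion from P. The equations and every parameter value below are invented for illustration; they are not the model or the rates fitted in the study.

```python
def treg_two_pool(sigma=10.0, k=0.05, dp=0.01, dm=0.02, dt=0.1, steps=20000):
    """Euler-integrate a toy precursor (P) / mature (M) two-pool model:
       dP/dt = sigma - (dp + k) * P   (constant input, death, conversion out)
       dM/dt = k * P - dm * M         (conversion in, death out)"""
    P, M = 0.0, 0.0
    for _ in range(steps):
        # Simultaneous update so M sees the previous value of P.
        P, M = (P + (sigma - (dp + k) * P) * dt,
                M + (k * P - dm * M) * dt)
    return P, M

P, M = treg_two_pool()
```

With these illustrative rates the pools settle at the fixed point P* = sigma / (dp + k) and M* = k * P* / dm, which is how such compartment models produce a stable pool from a limited-proliferation population.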

Relevance: 100.00%

Abstract:

Summary: Mass spectrometry-based proteomics is the study of the proteome, the set of all expressed proteins in a cell, tissue or organism, using mass spectrometry.
Proteins are cut into smaller pieces, peptides, using proteolytic enzymes and separated using different separation techniques. The different fractions, each containing several hundred peptides, are then analyzed by mass spectrometry. The masses of the peptides entering the instrument are recorded, and each peptide is sequentially fragmented to obtain its amino acid sequence. Each peptide sequence with its corresponding mass is then searched against a protein database to identify the protein to which it belongs. This thesis presents new method developments in this field. In a first part, the thesis describes the development of identification methods. It shows the importance of protein enrichment methods for gaining access to medium-to-low-abundance proteins in a human milk sample. It uses repeated injections to increase protein coverage and confidence in identification, and demonstrates the impact of new database releases on protein identification lists. In addition, it successfully uses mass spectrometry as an alternative to antibody-based assays to validate the presence of 34 different recombinant constructs of Staphylococcus aureus pathogenic proteins expressed in a Lactococcus lactis strain. In a second part, the development of quantification methods is described. It shows new stable isotope labeling approaches based on N- and C-terminus labeling of proteins, and describes the first method for labeling carboxylic groups at the protein level using 13C stable isotopes. In addition, a new quantitative approach called ANIBAL is explained that labels all amino and carboxylic groups at the protein level.

Relevance: 100.00%

Abstract:

A recent study of a pair of sympatric species of cichlids in Lake Apoyo in Nicaragua is viewed as providing probably one of the most convincing examples of sympatric speciation to date. Here, we describe and study a stochastic, individual-based, explicit genetic model tailored for this cichlid system. Our results show that relatively rapid (<20,000 generations) colonization of a new ecological niche and (sympatric or parapatric) speciation via local adaptation and divergence in habitat and mating preferences are theoretically plausible if: (i) the number of loci underlying the traits controlling local adaptation, and habitat and mating preferences is small; (ii) the strength of selection for local adaptation is intermediate; (iii) the carrying capacity of the population is intermediate; and (iv) the effects of the loci influencing nonrandom mating are strong. We discuss patterns and timescales of ecological speciation identified by our model, and we highlight important parameters and features that need to be studied empirically to provide information that can be used to improve the biological realism and power of mathematical models of ecological speciation.
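The full individual-based, multilocus genetic model of the study is beyond a short sketch, but the selection-plus-drift machinery that such models build on can be illustrated with a generic single-locus Wright-Fisher simulation. All names and parameter values below are illustrative assumptions, not values or code from the paper.

```python
import random

def wright_fisher_selection(n=500, p0=0.05, s=0.1, generations=200, seed=1):
    """One biallelic locus: selection coefficient s favours the locally
    adapted allele; binomial resampling of n individuals adds drift."""
    random.seed(seed)
    p = p0
    for _ in range(generations):
        # Deterministic effect of selection on the allele frequency...
        p_sel = p * (1 + s) / (p * (1 + s) + (1 - p))
        # ...followed by binomial sampling of the next generation.
        p = sum(1 for _ in range(n) if random.random() < p_sel) / n
        if p in (0.0, 1.0):  # allele fixed or lost
            break
    return p

p_final = wright_fisher_selection()
```

Runs with different seeds show the interplay the abstract emphasises: with intermediate selection strength and population size, a rare locally adapted allele usually sweeps to fixation, but drift occasionally loses it.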

Relevance: 100.00%

Abstract:

We propose an in-depth study of tissue modelization and classification techniques on T1-weighted MR images. Three approaches have been taken into account to perform this validation study. Two of them are based on the Finite Gaussian Mixture (FGM) model. The first one consists only of pure Gaussian distributions (FGM-EM). The second one uses a different model for partial volume (PV) (FGM-GA). The third one is based on a Hidden Markov Random Field (HMRF) model. All methods have been tested on a digital brain phantom image considered as the ground truth. Noise and intensity non-uniformities have been added to simulate real image conditions. The effect of an anisotropic filter is also considered. Results demonstrate that methods relying on both intensity and spatial information are in general more robust to noise and inhomogeneities. However, in some cases there are no significant differences between the presented methods.
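As a minimal illustration of the Finite Gaussian Mixture idea behind the FGM-EM approach, the sketch below fits a two-component 1-D mixture to synthetic intensities with plain EM. It ignores partial volume, spatial context and everything MR-specific, so it is a didactic reduction, not one of the evaluated methods.

```python
import math
import random

def em_gmm_1d(data, n_iter=200):
    """Fit a two-component 1-D Gaussian mixture by expectation-maximisation."""
    data = sorted(data)
    n = len(data)
    # Initialise the means from the lower/upper quartiles of the sample.
    mu = [data[n // 4], data[3 * n // 4]]
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point.
        resp = []
        for x in data:
            w = [pi[j] / math.sqrt(2 * math.pi * var[j])
                 * math.exp(-(x - mu[j]) ** 2 / (2 * var[j])) for j in range(2)]
            s = w[0] + w[1]
            resp.append([wj / s for wj in w])
        # M-step: re-estimate weights, means and variances.
        for j in range(2):
            nj = sum(r[j] for r in resp)
            pi[j] = nj / n
            mu[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
            var[j] = sum(r[j] * (x - mu[j]) ** 2
                         for r, x in zip(resp, data)) / nj
            var[j] = max(var[j], 1e-6)  # guard against variance collapse
    return pi, mu, var

# Synthetic "two-tissue" intensities: two Gaussians at 0 and 5.
random.seed(0)
sample = ([random.gauss(0.0, 1.0) for _ in range(300)]
          + [random.gauss(5.0, 1.0) for _ in range(300)])
pi, mu, var = em_gmm_1d(sample)
```

A voxel would then be assigned to the tissue class with the highest posterior responsibility; the paper's point is precisely that such intensity-only models are less robust to noise than spatially regularised ones like HMRF.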

Relevance: 100.00%

Abstract:

The present thesis is a contribution to the debate on the applicability of mathematics; it examines the interplay between mathematics and the world, using historical case studies. The first part of the thesis consists of four small case studies. In chapter 1, I criticize "ante rem structuralism", proposed by Stewart Shapiro, by showing that his so-called "finite cardinal structures" are in conflict with mathematical practice. In chapter 2, I discuss Leonhard Euler's solution to the Königsberg bridges problem. I propose interpreting Euler's solution both as an explanation within mathematics and as a scientific explanation. I put the insights from the historical case to work against recent philosophical accounts of the Königsberg case. In chapter 3, I analyze the predator-prey model proposed by Lotka and Volterra. I extract some interesting philosophical lessons from Volterra's original account of the model, such as Volterra's remarks on mathematical methodology; the relation between mathematics and idealization in the construction of the model; some relevant details in the derivation of the Third Law; and notions of intervention that are motivated by one of Volterra's main mathematical tools, phase spaces. In chapter 4, I discuss scientific and mathematical attempts to explain the structure of the bee's honeycomb. In the first part, I discuss a candidate explanation, based on the mathematical Honeycomb Conjecture, presented in Lyon and Colyvan (2008). I argue that this explanation is not scientifically adequate. In the second part, I discuss other mathematical, physical and biological studies that could contribute to an explanation of the bee's honeycomb. The upshot is that most of the relevant mathematics is not yet sufficiently understood, and there is also an ongoing debate as to the biological details of the construction of the bee's honeycomb. The second part of the thesis is a larger case study from physics: the genesis of GR.
Chapter 5 is a short introduction to the history, physics and mathematics relevant to the genesis of general relativity (GR). Chapter 6 discusses the historical question as to what Marcel Grossmann contributed to the genesis of GR. I examine the so-called "Entwurf" paper, an important joint publication by Einstein and Grossmann containing the first tensorial formulation of GR. By comparing Grossmann's part with the mathematical theories he used, we can gain a better understanding of what is involved in the first steps of assimilating a mathematical theory to a physical question. In chapter 7, I introduce and discuss a recent account of the applicability of mathematics to the world, the Inferential Conception (IC), proposed by Bueno and Colyvan (2011). I give a short exposition of the IC, offer some critical remarks on the account, discuss potential philosophical objections, and propose some extensions of the IC. In chapter 8, I put the IC to work in the historical case study: the genesis of GR. I analyze three historical episodes, using the conceptual apparatus provided by the IC. In episode one, I investigate how the starting point of the application process, the "assumed structure", is chosen. Then I analyze two small application cycles that led to revisions of the initial assumed structure. In episode two, I examine how the application of "new" mathematics, the application of the Absolute Differential Calculus (ADC) to gravitational theory, meshes with the IC. In episode three, I take a closer look at two of Einstein's failed attempts to find a suitable differential operator for the field equations, and apply the conceptual tools provided by the IC so as to better understand why he erroneously rejected both the Ricci tensor and the November tensor in the Zurich Notebook.
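The predator-prey model of chapter 3 is the classical Lotka-Volterra system, which is simple enough to sketch numerically. The explicit Euler integrator and the parameter values below are illustrative choices, not Volterra's own numbers.

```python
# Classical Lotka-Volterra predator-prey system; x = prey, y = predators.

def lotka_volterra(x0, y0, alpha, beta, delta, gamma, dt=0.001, steps=20000):
    """Integrate dx/dt = alpha*x - beta*x*y and dy/dt = delta*x*y - gamma*y
    with the explicit Euler method; returns both trajectories."""
    xs, ys = [x0], [y0]
    x, y = x0, y0
    for _ in range(steps):
        x, y = (x + (alpha * x - beta * x * y) * dt,
                y + (delta * x * y - gamma * y) * dt)
        xs.append(x)
        ys.append(y)
    return xs, ys

# Illustrative parameters: coexistence equilibrium at
# x* = gamma/delta = 10, y* = alpha/beta = 10.
xs, ys = lotka_volterra(x0=10.0, y0=5.0,
                        alpha=1.0, beta=0.1, delta=0.05, gamma=0.5)
```

Plotting ys against xs traces the closed orbits in phase space that, as the abstract notes, underpin Volterra's notions of intervention.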

Relevance: 100.00%

Abstract:

Brain perfusion can be assessed by CT and MR. For CT, two major techniques are used. First, xenon CT is an equilibrium technique based on a freely diffusible tracer. The first pass of iodinated contrast injected intravenously is a second, more widely available method. Both methods are proven to be robust and quantitative, thanks to the linear relationship between contrast concentration and x-ray attenuation. For the CT methods, concerns regarding the x-ray doses delivered to the patients need to be addressed. MR can also assess brain perfusion using the first pass of a gadolinium-based contrast agent injected intravenously. This method has to be considered semi-quantitative because of the non-linear relationship between contrast concentration and MR signal changes. Arterial spin labeling is another MR method that assesses brain perfusion without injection of contrast. In this case, the blood flow in the carotids is magnetically labelled by an external radiofrequency pulse and observed during its first pass through the brain. Each of these CT and MR techniques has advantages and limits that will be illustrated and summarized.
Learning Objectives:
1. To understand and compare the different techniques for brain perfusion imaging.
2. To learn about the methods of acquisition and post-processing of brain perfusion by first pass of contrast agent for CT and MR.
3. To learn about non-contrast MR methods (arterial spin labelling).

Relevance: 100.00%

Abstract:

Current restrictions for human cell-based therapies have been related to technological limitations with regard to cellular proliferation capacity (simple culture conditions), maintenance of a differentiated phenotype for primary human cell culture, and transmission of communicable diseases. Cultured primary fetal cells from one organ donation could possibly meet the exigent and stringent technical requirements for the development of therapeutic products. Master and working cell banks from one fetal organ donation (skin) can be developed in short periods of time, and safety tests can be performed at all stages of cell banking. For therapeutic use, fetal cells can be used for up to two thirds of their life-span in an out-scaling process, with consistency shown for several biological properties, including protein concentration, gene expression and biological activity. As it is the intention that banked primary fetal cells should allow the prospective treatment of hundreds of thousands of patients from only one organ donation, it is imperative to show the consistency, traceability and safety of the process, including donor tissue selection, cell banking, cell testing and the growth of cells in out-scaling for the preparation of whole-cell tissue-engineering products.

Relevance: 100.00%

Abstract:

Rockfall is an extremely rapid process involving long travel distances. Due to these features, when an event occurs, the ability to take evasive action is practically zero and, thus, the risk of injury or loss of life is high. Damage to buildings and infrastructure is quite likely. In many cases, therefore, suitable protection measures are necessary. This contribution provides an overview of previous and current research on the main topics related to rockfall. It covers the onset of rockfall and runout modelling approaches, as well as hazard zoning and protection measures. It is the aim of this article to provide an in-depth knowledge base for researchers and practitioners involved in projects dealing with the rockfall protection of infrastructures, who may work in the fields of civil or environmental engineering, risk and safety, the earth and natural sciences.
