87 results for solve
at Université de Lausanne, Switzerland
Abstract:
Fish acute toxicity tests play an important role in environmental risk assessment and hazard classification because they allow for first estimates of the relative toxicity of various chemicals in various species. However, such tests need to be interpreted carefully. Here we briefly summarize the main issues, which are linked to the genetics and condition of the test animals, the standardized test situations, the uncertainty about whether a given test species can be considered representative of a given fish fauna, the often missing knowledge about possible interaction effects (especially with micropathogens), and statistical problems such as small sample sizes and, in some cases, pseudoreplication. We suggest that multi-factorial embryo tests on ecologically relevant species solve many of these issues, and we briefly explain how such tests could be conducted to avoid the weak points of fish acute toxicity tests.
Abstract:
The incidental discovery of a solitary pulmonary nodule on a CT scan of the chest is a very common clinical problem. The differential diagnosis is broad, but the main clinical challenge is to exclude or confirm a neoplasm. The evaluation of preexisting risk factors and the analysis of the morphological characteristics of the nodule allow the clinician to solve this challenge in a significant number of cases. When the nature of the lesion remains indeterminate, careful follow-up with volumetric determination is necessary for decision making.
Abstract:
Game theory describes and analyzes strategic interaction. A distinction is usually drawn between static games, strategic situations in which the players choose only once and simultaneously, and dynamic games, strategic situations involving sequential choices. In addition, dynamic games can be further classified according to perfect and imperfect information. A dynamic game is said to exhibit perfect information whenever, at any point of the game, every player has full informational access to all choices made so far; in the case of imperfect information, some players are not fully informed about some choices. Game-theoretic analysis proceeds in two steps. Firstly, games are modelled by so-called form structures, which extract and formalize the significant parts of the underlying strategic interaction. The basic and most commonly used models of games are the normal form, which rather sparsely describes a game merely in terms of the players' strategy sets and utilities, and the extensive form, which models a game in a more detailed way as a tree. In fact, it is standard to formalize static games with the normal form and dynamic games with the extensive form. Secondly, solution concepts are developed to solve models of games in the sense of identifying the choices that should be taken by rational players. Indeed, the ultimate objective of the classical approach to game theory, which is of normative character, is the development of a solution concept capable of identifying a unique choice for every player in an arbitrary game. However, given the large variety of games, it is not at all certain whether it is possible to devise a solution concept with such universal capability.

Alternatively, interactive epistemology provides an epistemic approach to game theory of descriptive character. This rather recent discipline analyzes the relation between knowledge, belief and choice of game-playing agents in an epistemic framework. The description of the players' choices in a given game relative to various epistemic assumptions constitutes the fundamental problem addressed by an epistemic approach to game theory. In a general sense, the objective of interactive epistemology consists in characterizing existing game-theoretic solution concepts in terms of epistemic assumptions, as well as in proposing novel solution concepts by studying the game-theoretic implications of refined or new epistemic hypotheses. Intuitively, an epistemic model of a game can be interpreted as representing the reasoning of the players. Indeed, before making a decision in a game, the players reason about the game and their respective opponents, given their knowledge and beliefs. Precisely these epistemic mental states, on which players base their decisions, are explicitly expressible in an epistemic framework.

In this PhD thesis, we consider an epistemic approach to game theory from a foundational point of view. In Chapter 1, basic game-theoretic notions as well as Aumann's epistemic framework for games are expounded and illustrated. Also, Aumann's sufficient conditions for backward induction are presented and his conceptual views discussed. In Chapter 2, Aumann's interactive epistemology is conceptually analyzed. In Chapter 3, which is based on joint work with Conrad Heilmann, a three-stage account for dynamic games is introduced and a type-based epistemic model is extended with a notion of agent connectedness; sufficient conditions for backward induction are then derived. In Chapter 4, which is based on joint work with Jérémie Cabessa, a topological approach to interactive epistemology is initiated. In particular, the epistemic-topological operator limit knowledge is defined and some of its implications for games are considered. In Chapter 5, which is based on joint work with Jérémie Cabessa and Andrés Perea, Aumann's impossibility theorem on agreeing to disagree is revisited and weakened, in the sense that possible contexts are provided in which agents can indeed agree to disagree.
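To make the notion of a solution concept for dynamic games concrete, here is a minimal sketch of backward induction on a small perfect-information game tree. This is not code from the thesis; the tree, payoffs and node names are invented purely for illustration.

```python
# A minimal sketch of backward induction on a finite perfect-information
# game tree. The tree, payoffs and names are invented for illustration.
from dataclasses import dataclass
from typing import Dict, Tuple, Union

Payoffs = Tuple[int, int]  # (utility of player 0, utility of player 1)

@dataclass
class Node:
    name: str                                    # label of the decision node
    player: int                                  # index of the player moving here
    children: Dict[str, Union["Node", Payoffs]]  # action -> subtree or terminal payoffs

def backward_induction(node):
    """Return (payoffs, strategy): the payoff vector reached under backward
    induction and, for each decision node, the action chosen there."""
    if isinstance(node, tuple):          # terminal history: nothing left to choose
        return node, {}
    best_action, best_payoffs, strategy = None, None, {}
    for action, child in node.children.items():
        payoffs, sub_strategy = backward_induction(child)
        strategy.update(sub_strategy)
        # The player moving at this node maximizes their own payoff component,
        # anticipating optimal play in every subtree.
        if best_payoffs is None or payoffs[node.player] > best_payoffs[node.player]:
            best_action, best_payoffs = action, payoffs
    strategy[node.name] = best_action
    return best_payoffs, strategy

# A two-stage dynamic game with perfect information: player 0 moves first,
# then player 1 observes that move and responds.
game = Node("root", 0, {
    "L": Node("after L", 1, {"l": (2, 1), "r": (0, 0)}),
    "R": Node("after R", 1, {"l": (1, 2), "r": (3, 0)}),
})
payoffs, strategy = backward_induction(game)
print(payoffs)    # (2, 1)
print(strategy)   # {'after L': 'l', 'after R': 'l', 'root': 'L'}
```

In this toy game, player 1 would answer L with l (payoff 1 beats 0) and R with l (2 beats 0), so player 0, anticipating this, plays L and secures 2 rather than 1: exactly the bottom-up reasoning that the epistemic conditions in Chapters 1 and 3 are meant to justify.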
Abstract:
Replacement of the hyperimmune anti-Rhesus (Rh) D immunoglobulin currently used to prevent haemolytic disease of the newborn by fully recombinant human anti-RhD antibodies would solve the current logistic problems associated with supply and demand. The combination of phage display repertoire cloning with precise selection procedures enables isolation of specific genes that can then be inserted into mammalian expression systems, allowing production of large quantities of recombinant human proteins. With the aim of selecting high-affinity anti-RhD antibodies, two human Fab libraries were constructed from a hyperimmune donor. Use of a new phage panning procedure involving bromelin-treated red blood cells enabled the isolation of two high-affinity Fab-expressing phage clones, LD-6-3 and LD-6-33, specific for RhD. These showed a novel reaction pattern by recognizing the D variants D(III), D(IVa), D(IVb), D(Va), D(VI) types I and II, D(VII), Rh33 and DFR. Full-length immunoglobulin molecules were constructed by cloning the variable regions into expression vectors containing genomic DNA encoding the immunoglobulin constant regions. We describe the first stable, suspension-growth-adapted Chinese hamster ovary (CHO) cell line producing a high-affinity recombinant human IgG1 anti-RhD antibody adapted to pilot-scale production. Evaluation of the Fc region of this recombinant antibody by either chemiluminescence or antibody-dependent cell cytotoxicity (ADCC) assays demonstrated macrophage activation and lysis of red blood cells by human lymphocytes. A consistent source of recombinant human anti-RhD immunoglobulin produced by CHO cells is expected to meet the stringent safety and regulatory requirements for prophylactic application.
Abstract:
Because Staphylococcus aureus strains contain multiple virulence factors, studying their pathogenic role by single-gene inactivation has generated equivocal results. To circumvent this problem, we have expressed specific S. aureus genes in the less virulent organism Streptococcus gordonii and tested the recombinants for a gain of function both in vitro and in vivo. Clumping factor A (ClfA) and coagulase were investigated. Both gene products were expressed functionally and with similar kinetics during growth by streptococci and staphylococci. ClfA-positive S. gordonii was more adherent to platelet-fibrin clots mimicking cardiac vegetations in vitro and more infective in rats with experimental endocarditis (P < 0.05). Moreover, deleting clfA from clfA-positive streptococcal transformants restored both the low in vitro adherence and the low in vivo infectivity of the parent. Coagulase-positive transformants, on the other hand, were neither more adherent nor more infective than the parent. Furthermore, coagulase did not increase the pathogenicity of clfA-positive streptococci when both clfA and coa genes were simultaneously expressed in an artificial minioperon in streptococci. These results definitively attribute a role to ClfA, but not coagulase, in S. aureus endovascular infections. This gain-of-function strategy might help resolve the role of individual factors in the complex S. aureus-host relationship.
Abstract:
BACKGROUND: Hypertension can be controlled adequately with existing drugs such as angiotensin-converting enzyme inhibitors or angiotensin II receptor blockers. Nevertheless, treatment success is often restricted by patients not adhering to treatment. Immunisation against angiotensin II could solve this problem. We investigated the safety and efficacy of CYT006-AngQb, a vaccine based on a virus-like particle that targets angiotensin II, to reduce ambulatory blood pressure. METHODS: In this multicentre, double-blind, randomised, placebo-controlled phase IIa trial, 72 patients with mild-to-moderate hypertension were randomly assigned with a computer-generated randomisation list to receive subcutaneous injections of either 100 μg CYT006-AngQb (n=24), 300 μg CYT006-AngQb (24), or placebo (24), at weeks 0, 4, and 12. 24-h ambulatory blood pressure was measured before treatment and at week 14. The primary outcomes were safety and tolerability. Analyses were done by intention to treat. This study is registered with ClinicalTrials.gov, number NCT00500786. FINDINGS: Two patients in the 100 μg group, three in the 300 μg group, and none in the placebo group discontinued study treatment. All patients were included in safety analyses; efficacy analyses did not include the five dropouts, for whom no data were available at week 14. Five serious adverse events were reported (two in the 100 μg group, two in the 300 μg group, and one in the placebo group); none were deemed to be treatment related. Most side-effects were mild, transient reactions at the injection site. Mild, transient influenza-like symptoms were seen in three patients in the 100 μg group, seven in the 300 μg group, and none in the placebo group. In the 300 μg group, mean ambulatory daytime blood pressure at week 14 was reduced from baseline by -9.0/-4.0 mm Hg compared with placebo (p=0.015 for systolic and 0.064 for diastolic). The 300 μg dose reduced the early-morning blood-pressure surge compared with placebo (change at 0800 h -25/-13 mm Hg; p<0.0001 for systolic, p=0.0035 for diastolic). INTERPRETATION: Immunisation with CYT006-AngQb was associated with no serious adverse events; most observed adverse events were consistent with local or systemic responses similar to those seen with other vaccines. The 300 μg dose reduced blood pressure in patients with mild-to-moderate hypertension during the daytime, especially in the early morning. FUNDING: Cytos Biotechnology AG.
Abstract:
The RsmA family of RNA-binding proteins comprises global post-transcriptional regulators that mediate extensive changes in gene expression in bacteria. They bind to, and affect the translation rate of, target mRNAs, a function that is further modulated by one or more small, untranslated, competitive regulatory RNAs. To gain new insights into the nature of this protein/RNA interaction, we used X-ray crystallography to solve the structure of the Yersinia enterocolitica RsmA homologue. RsmA consists of a dimeric beta barrel from which two alpha helices project. From structure-based alignments of the RsmA protein family from diverse bacteria, we identified key amino acid residues likely to be involved in RNA binding. Site-specific mutagenesis revealed that the arginine at position 44, located at the N terminus of the alpha helix, is essential for biological activity in vivo and RNA binding in vitro. Mutation of this site affects swarming motility, exoenzyme and secondary metabolite production in the human pathogen Pseudomonas aeruginosa, carbon metabolism in Escherichia coli, and hydrogen cyanide production in the plant-beneficial strain Pseudomonas fluorescens CHA0. R44A mutants are also unable to interact with the small untranslated RNA RsmZ. Thus, although possessing a motif similar to the KH domain of some eukaryotic RNA-binding proteins, RsmA differs substantially and incorporates a novel class of RNA-binding site.
Abstract:
This paper presents general problems and approaches for spatial data analysis using machine learning algorithms. Machine learning is a very powerful approach to adaptive data analysis, modelling and visualisation. The key feature of machine learning algorithms is that they learn from empirical data and can be used in cases when the modelled environmental phenomena are hidden, nonlinear, noisy and highly variable in space and in time. Most machine learning algorithms are universal and adaptive modelling tools developed to solve the basic problems of learning from data: classification/pattern recognition, regression/mapping and probability density modelling. In the present report some of the widely used machine learning algorithms, namely artificial neural networks (ANN) of different architectures and Support Vector Machines (SVM), are adapted to the analysis and modelling of geo-spatial data. Machine learning algorithms have an important advantage over traditional models of spatial statistics when problems are considered in high-dimensional geo-feature spaces, when the dimension of the space exceeds 5. Such features are usually generated, for example, from digital elevation models, remote sensing images, etc. An important extension of the models concerns the incorporation of real-space constraints such as geomorphology, networks, and other natural structures. Recent developments in semi-supervised learning can improve the modelling of environmental phenomena by taking geo-manifolds into account. An important part of the study deals with the analysis of relevant variables and model inputs. This problem is approached using different nonlinear feature selection/feature extraction tools. To demonstrate the application of machine learning algorithms, several interesting case studies are considered: digital soil mapping using SVM, automatic mapping of soil and water system pollution using ANN, natural hazard risk analysis (avalanches, landslides), and assessment of renewable resources (wind fields) with SVM and ANN models. The dimensionality of the spaces considered varies from 2 to more than 30. Figures 1, 2 and 3 demonstrate some results of the studies and their outputs. Finally, the results of environmental mapping are discussed and compared with traditional models of geostatistics.
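As an illustration of the kind of model the abstract describes, here is a minimal sketch of SVM classification in a small geo-feature space. It is not the authors' implementation; it assumes scikit-learn is available, and the coordinates, the elevation-like feature and the binary target are synthetic stand-ins for the paper's case-study data.

```python
# A minimal sketch of SVM classification of geo-spatial point data,
# assuming scikit-learn; all data are synthetic and purely illustrative.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic geo-features: easting, northing, and an elevation-like variable
# (in real studies such features come from digital elevation models,
# remote sensing images, etc.).
n = 500
X = np.column_stack([
    rng.uniform(0, 100, n),    # easting
    rng.uniform(0, 100, n),    # northing
    rng.normal(500, 50, n),    # elevation
])
# Synthetic binary target, e.g. "polluted" vs "clean", nonlinear in space.
y = ((X[:, 0] - 50) ** 2 + (X[:, 1] - 50) ** 2 < 900).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Standardize the features, then fit an RBF-kernel SVM, a common choice
# for nonlinear, noisy spatial phenomena.
scaler = StandardScaler().fit(X_train)
clf = SVC(kernel="rbf", C=10.0, gamma="scale")
clf.fit(scaler.transform(X_train), y_train)

print("test accuracy:", clf.score(scaler.transform(X_test), y_test))
```

The same pattern (standardize, fit, predict on held-out points) extends to regression/mapping with an SVR or ANN model; the geo-feature columns are simply swapped for those of the application at hand.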
Abstract:
Altruism is a deep and complex phenomenon that is analysed by scholars of various disciplines, including psychology, philosophy, biology, evolutionary anthropology and experimental economics. Much confusion arises in the current literature because the term altruism covers different concepts and processes across disciplines. Here we investigate the sense given to altruism when it is used in different fields and argumentative contexts. We argue that four distinct but related concepts need to be distinguished: (a) psychological altruism, the genuine motivation to improve others' interests and welfare; (b) reproductive altruism, which involves increasing others' chances of survival and reproduction at the actor's expense; (c) behavioural altruism, which involves bearing some cost in the interest of others; and (d) preference altruism, which is a preference for others' interests. We show how this conceptual clarification permits the identification of overstated claims that stem from an imprecise use of terminology. Distinguishing these four types of altruism will help to solve rhetorical conflicts that currently undermine the interdisciplinary debate about human altruism.
Abstract:
A survey was undertaken among Swiss occupational health and safety specialists in 2004 to identify uses, difficulties, and possible developments of exposure models. Occupational hygienists (121), occupational physicians (169), and safety specialists (95) were surveyed with an in-depth questionnaire. The results indicate that models are not used much in practice in Switzerland and are reserved for research groups focusing on specific topics. However, various determinants of exposure are often considered important by professionals (emission rate, work activity), and are in some cases recorded and used (room parameters, operator activity). These parameters cannot be directly included in present models. Nevertheless, more than half of the occupational hygienists think that it is important to develop quantitative exposure models. Among research institutions, however, there is considerable interest in the use of models to solve problems which are difficult to address with direct measurements, i.e. retrospective exposure assessment for specific clinical cases and prospective evaluation of new situations or estimation of the effect of selected parameters. In a recent study of cases of acute pulmonary toxicity following waterproofing-spray exposure, exposure models were used to reconstruct the exposure of a group of patients. Finally, in the context of exposure prediction, it is also important to report on a measurement database that has existed in Switzerland since 1991. [Authors]
Abstract:
Law and science have partnered in the recent past to solve major public health issues, ranging from asbestos to averting the threat of a nuclear holocaust. This paper travels to a legal and health policy frontier where no one has gone before, examining the role of precautionary principles under international law as a matter of codified international jurisprudence by analysing draft terminology from prominent sources including the Royal Commission on Environmental Pollution (UK), the Swiss Confederation, the USA (NIOSH) and the OECD. The research questions addressed are: how can the benefits of nanotechnology be realized while minimizing the risk of harm? What law, if any, applies to protect consumers (who comprise the general public, nanotechnology workers and their corporate social partners) and other stakeholders within civil society from liability? What law, if any, applies to prevent harm?
Abstract:
(Book summary) In a world society ruled by economic globalisation, by political interests and by theories such as Huntington's «clash of civilisations» that widen the gap between the North and the South, the question of the role of religion should be asked. To what extent can religion and politics work together? Can faith still be thought of as a means of saving the world? Considering that Christianity, Islam and Judaism have much in common, this collection of essays asks whether these religions can join forces for the public benefit. Senior and junior scholars from all over the world, gathered for an interdisciplinary seminar, analyse contemporary international relations and geopolitics through the prism of religion, discussing whether it can provide practical solutions to solve conflicts and increase respect for human rights.
Abstract:
Although the notion of the common Good may at first seem too ambitious, deceptive in its excessive and inaccessible promises, it is nevertheless necessary to the ethical debate in the contemporary public sphere. In this contribution, we aim to show how a critical understanding of the controversial notion of the common Good can prove compatible with a realistic and responsible consideration of conflicts of interest and with ethical deliberation. The example of the French debate on laïcité helps, in this regard, to understand the need to overcome the sterile opposition between a communitarianism pushed to the extreme and a universalism emptied of its historical and dialectical relevance.
Abstract:
QUESTIONS UNDER STUDY: Our aim was to identify the barriers young men face in consulting a health professional when they encounter sexual dysfunction, and where, if anywhere, they turn for answers. METHODS: We conducted an exploratory qualitative study including 12 young men aged 16-20 years seen in two focus groups. Discussions were triggered by vignettes about sexual dysfunction. RESULTS: Young men preferred not to talk about sexual dysfunction with anyone and to solve such problems alone, as the subject is considered intimate and embarrassing and can negatively affect their masculinity. Confidentiality appeared to be the most important criterion for disclosing an intimate subject to a health professional. Participants raised the problems of men's access to services and their lack of reasons to consult. Two criteria for addressing a problem were whether it was long-lasting or considered physical. The Internet was unanimously considered an initial way to solve a problem, which could guide them to a face-to-face consultation if necessary. CONCLUSIONS: The results suggest that Internet-based tools should be developed to become an easy access door to sexual health services for young men. Wherever they consult and for whatever problem, sexual health must be on the agenda.
Abstract:
Knockout mice lacking alpha1b noradrenergic receptors were tested in behavioural experiments to assess a possible effect of the absence of this receptor on reaction to novelty and spatial orientation. Reaction to novelty was tested in two experiments. In the first, the mice's latency to exit the first part of a two-compartment set-up was measured; the knockout mice were faster to emerge than their littermate controls. They were then tested in an open field, to which new objects were added in the second trial. In the open field without objects (first trial), the knockout mice showed greater locomotor activity (path length). The same mice then showed enhanced exploration of the newly introduced objects relative to the controls. The spatial orientation experiments were done on a homing board and in the water maze. The homing board did not yield a significant difference between the knockout and the control mice: both groups showed impaired performance when the proximal (olfactory) and distal (visual) cues were disrupted by rotation of the table. In the water maze, however, the alpha1b(-/-) mice were unable to solve the task (acquisition and retention), whereas the control mice showed good acquisition and retention. The knockout mice's incapacity to learn to reach the submerged platform was not due to an incapacity to swim, as they were as good as their control littermates at reaching the platform when it was visible.