953 results for Models and Principles


Relevance: 90.00%

Abstract:

Abstract. Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Because conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. It is shown that as the number of simulations diverges, the estimator is consistent and a higher-order expansion reveals the stochastic difference between the infeasible GMM estimator based on the same moment conditions and the simulated version. In particular, we show how to adjust standard errors to account for the simulations. Monte Carlo results show how the estimator may be applied to a range of dynamic latent variable (DLV) models, and that it performs well in comparison to several other estimators that have been proposed for DLV models.
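The kernel-smoothing step can be illustrated with a minimal sketch: a Nadaraya-Watson estimator recovering a conditional moment from a long simulation of a toy model. The model, bandwidth, and sample size here are illustrative assumptions, not the authors' setup.

```python
import numpy as np

def kernel_conditional_moment(x_sim, y_sim, x_eval, bandwidth):
    """Nadaraya-Watson estimate of E[y | x = x_eval] from simulated draws."""
    w = np.exp(-0.5 * ((x_sim - x_eval) / bandwidth) ** 2)  # Gaussian kernel weights
    return np.sum(w * y_sim) / np.sum(w)

# Toy simulable model at a trial parameter value theta = 0.5.
rng = np.random.default_rng(0)
theta = 0.5
x = rng.normal(size=100_000)                         # long simulation
y = theta * x + rng.normal(scale=0.1, size=100_000)

# Conditional moment E[y | x = 1.0]; for this toy model it equals theta.
m_hat = kernel_conditional_moment(x, y, x_eval=1.0, bandwidth=0.1)
```

A method-of-moments step would then search for the parameter value whose simulated conditional moments, computed this way, match their sample counterparts.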

Relevance: 90.00%

Abstract:

OBJECTIVES: The aim of the study was to assess whether prospective follow-up data within the Swiss HIV Cohort Study can be used to predict patients who stop smoking, or, among smokers who stop, those who start smoking again. METHODS: We built prediction models first using clinical reasoning ('clinical models') and then by selecting from numerous candidate predictors using advanced statistical methods ('statistical models'). Our clinical models were based on literature suggesting that motivation drives smoking cessation, while dependence drives relapse in those attempting to stop. Our statistical models were based on automatic variable selection using additive logistic regression with component-wise gradient boosting. RESULTS: Of 4833 smokers, 26% stopped smoking, at least temporarily; among those who stopped, 48% started smoking again. The predictive performance of our clinical and statistical models was modest. A basic clinical model for cessation, with patients classified into three motivational groups, was nearly as discriminatory as a constrained statistical model with just the most important predictors (the ratio of nonsmoking visits to total visits, alcohol or drug dependence, psychiatric comorbidities, recent hospitalization and age). A basic clinical model for relapse, based on the maximum number of cigarettes per day prior to stopping, was not as discriminatory as a constrained statistical model with just the ratio of nonsmoking visits to total visits. CONCLUSIONS: Predicting smoking cessation and relapse is difficult, so simple models are nearly as discriminatory as complex ones. Patients with a history of attempting to stop and those known to have stopped recently are the best candidates for an intervention.
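Component-wise boosting performs automatic variable selection because each step updates only the single best-fitting predictor, so predictors never selected keep a zero coefficient. A minimal sketch with squared-error loss follows (the cohort study used additive *logistic* regression; the data here are synthetic):

```python
import numpy as np

def componentwise_l2_boost(X, y, n_steps=200, nu=0.1):
    """Component-wise boosting with squared-error loss: each step fits the
    single predictor that best explains the current residual and nudges only
    that coefficient by a shrunken amount; unselected predictors stay at zero."""
    coef = np.zeros(X.shape[1])
    intercept = y.mean()
    resid = y - intercept
    for _ in range(n_steps):
        b = X.T @ resid / (X ** 2).sum(axis=0)            # univariate LS fit per predictor
        sse = ((resid[:, None] - X * b) ** 2).sum(axis=0)  # fit quality of each candidate
        j = int(np.argmin(sse))                            # best single predictor this step
        coef[j] += nu * b[j]
        resid -= nu * b[j] * X[:, j]
    return intercept, coef

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 10))                  # 10 candidate predictors
y = 2.0 * X[:, 0] - 1.0 * X[:, 3] + rng.normal(scale=0.5, size=500)
intercept, coef = componentwise_l2_boost(X, y)
selected = np.flatnonzero(np.abs(coef) > 0.5)   # only the informative predictors survive
```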

Relevance: 90.00%

Abstract:

On 4 December 2007, a 3 Mm³ landslide occurred along the northwestern shore of Chehalis Lake. The initiation zone is located at the intersection of the main valley slope and the northern sidewall of a prominent gully. The slope failure caused a displacement wave that ran up to 38 m on the opposite shore of the lake. The landslide is temporally associated with a rain-on-snow meteorological event which is thought to have triggered it. This paper describes the Chehalis Lake landslide and presents a comparison of discontinuity orientation datasets obtained using three techniques: field measurements, terrestrial photogrammetric 3D models and an airborne LiDAR digital elevation model to describe the orientation and characteristics of the five discontinuity sets present. The discontinuity orientation data are used to perform kinematic, surface wedge limit equilibrium and three-dimensional distinct element analyses. The kinematic and surface wedge analyses suggest that the location of the slope failure (intersection of the valley slope and a gully wall) has facilitated the development of the unstable rock mass which initiated as a planar sliding failure. Results from the three-dimensional distinct element analyses suggest that the presence, orientation and high persistence of a discontinuity set dipping obliquely to the slope were critical to the development of the landslide and led to a failure mechanism dominated by planar sliding. The three-dimensional distinct element modelling also suggests that the presence of a steeply dipping discontinuity set striking perpendicular to the slope and associated with a fault exerted a significant control on the volume and extent of the failed rock mass but not on the overall stability of the slope.
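A planar sliding check of the limit-equilibrium kind mentioned above can be sketched as the ratio of resisting to driving forces on a single discontinuity. The geometry and strength values below are hypothetical illustrations, not parameters of the Chehalis Lake slope:

```python
import math

def planar_sliding_fos(weight, dip_deg, friction_deg, cohesion=0.0,
                       plane_area=0.0, water_force=0.0):
    """Limit-equilibrium factor of safety for planar sliding on a single
    discontinuity: resisting forces / driving forces along the plane."""
    dip = math.radians(dip_deg)
    phi = math.radians(friction_deg)
    driving = weight * math.sin(dip)                     # component along the plane
    normal = weight * math.cos(dip) - water_force        # effective normal force
    resisting = cohesion * plane_area + normal * math.tan(phi)
    return resisting / driving

# Dry, cohesionless block on a plane dipping 45 deg with a 35 deg friction angle:
fos = planar_sliding_fos(weight=1.0e6, dip_deg=45.0, friction_deg=35.0)
```

With no cohesion or water pressure this reduces to tan(phi)/tan(dip), which is below 1.0 here, i.e. the block is unstable.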

Relevance: 90.00%

Abstract:

Résumé: Assessing exposure to occupational hazards is an important step in the analysis of a workplace. Direct measurements are rarely carried out in the workplace itself, and exposure is often estimated on the basis of expert judgement. There is therefore a strong need for simple, transparent tools that can help industrial hygiene specialists decide on exposure levels. The objective of this research is to develop and improve modelling tools for predicting exposure. As a first step, a survey of occupational hygienists was carried out in Switzerland to identify their needs (types of results, models and potential observable parameters). It showed that exposure models are hardly used in practice in Switzerland, exposure being mainly estimated on the basis of the expert's experience. Moreover, pollutant emissions and their dispersion around the source were regarded as fundamental parameters. To test the flexibility and accuracy of classical exposure models, modelling experiments were performed in concrete situations. In particular, predictive models were used to assess occupational exposure to carbon monoxide and compare it with the exposure levels reported in the literature for similar situations. Likewise, exposure to waterproofing sprays was assessed in the context of an epidemiological study of a Swiss cohort. In that case, experiments were undertaken to characterize the emission rate of waterproofing sprays, and a classical two-zone model was then used to assess aerosol dispersion in the near and far field during spraying.
Further experiments were carried out to gain a better understanding of the emission and dispersion processes of a tracer, focusing on the characterization of near-field exposure. An experimental design was developed to take simultaneous measurements at several points of an exposure chamber with direct-reading instruments. It was found that, from a statistical point of view, the compartment-based theory makes sense, although assignment to a given compartment could not be made on the basis of simple geometric considerations. In a next step, experimental data were collected from observations made in about 100 different workplaces: information on the observed determinants was linked to the exposure measurements. These data were used to improve the two-zone exposure model: a tool was developed to include specific determinants in the choice of compartment, thereby strengthening the reliability of the predictions. All these investigations improved our understanding of modelling tools and of their limitations. Integrating determinants better suited to experts' needs should encourage them to use such tools in their practice. Moreover, by raising the quality of modelling tools, this research will not only encourage their systematic use but may also improve exposure assessment based on expert judgement and, consequently, the protection of workers' health.
Abstract: Occupational exposure assessment is an important stage in the management of chemical exposures. Few direct measurements are carried out in workplaces, and exposures are often estimated based on expert judgements.
There is therefore a major requirement for simple, transparent tools to help occupational health specialists define exposure levels. The aim of the present research is to develop and improve modelling tools in order to predict exposure levels. In a first step, a survey was made among professionals to define their expectations of modelling tools (types of results, models and potential observable parameters). It was found that models are rarely used in Switzerland and that exposures are mainly estimated from the past experience of the expert. Moreover, chemical emissions and their dispersion near the source were also considered key parameters. Experimental and modelling studies were also performed in some specific cases in order to test the flexibility and drawbacks of existing tools. In particular, models were applied to assess occupational exposure to CO in different situations and compared with the exposure levels found in the literature for similar situations. Further, exposure to waterproofing sprays was studied as part of an epidemiological study on a Swiss cohort. In this case, laboratory investigations were undertaken to characterize the waterproofing overspray emission rate. A classical two-zone model was used to assess the aerosol dispersion in the near and far field during spraying. Experiments were also carried out to better understand the processes of emission and dispersion of tracer compounds, focusing on the characterization of near-field exposure. An experimental set-up was developed to perform simultaneous measurements at several points with direct-reading instruments. It was mainly found that, from a statistical point of view, the compartmental theory makes sense, but that the attribution to a given compartment could not be done by simple geometric considerations.
In a further step, the experimental data were complemented by observations made in about 100 different workplaces, including exposure measurements and observation of predefined determinants. The various data obtained were used to improve an existing two-compartment exposure model. A tool was developed to include specific determinants in the choice of the compartment, thus largely improving the reliability of the predictions. All these investigations helped improve our understanding of modelling tools and identify their limitations. The integration of more accessible determinants, in accordance with experts' needs, may indeed enhance model application in field practice. Moreover, by increasing the quality of modelling tools, this research will not only encourage their systematic use, but may also improve the conditions in which expert judgements take place, and therefore the protection of workers' health.
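The steady state of the classical two-zone (near-field/far-field) model referred to above can be sketched in a few lines. The numerical values are purely illustrative, not measured quantities from the study:

```python
def two_zone_steady_state(emission_rate, room_airflow, interzonal_airflow):
    """Steady-state concentrations of the classical two-zone model.
    emission_rate: contaminant emission G (mg/min)
    room_airflow: general ventilation rate Q (m3/min)
    interzonal_airflow: near-field/far-field exchange rate beta (m3/min)"""
    c_far = emission_rate / room_airflow                 # far field: G / Q
    c_near = c_far + emission_rate / interzonal_airflow  # near field adds G / beta
    return c_near, c_far

# Illustrative spraying scenario: G = 100 mg/min, Q = 20 m3/min, beta = 5 m3/min
c_near, c_far = two_zone_steady_state(100.0, 20.0, 5.0)
```

The worker in the near field sees the extra G/beta term on top of the room-average concentration, which is why the emission rate and near-source dispersion were flagged as key parameters.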

Relevance: 90.00%

Abstract:

Metabolic problems lead to numerous failures during clinical trials, and much effort is now devoted to developing in silico models predicting metabolic stability and metabolites. Such models are well known for cytochromes P450 and some transferases, whereas little has been done to predict the hydrolytic activity of human hydrolases. The present study was undertaken to develop a computational approach able to predict the hydrolysis of novel esters by human carboxylesterase hCES1. The study involves both docking analyses of known substrates to develop predictive models, and molecular dynamics (MD) simulations to reveal the in situ behavior of substrates and products, with particular attention being paid to the influence of their ionization state. The results emphasize some crucial properties of the hCES1 catalytic cavity, confirming that, as a trend with several exceptions, hCES1 prefers substrates with relatively smaller and somewhat polar alkyl/aryl groups and larger hydrophobic acyl moieties. The docking results underline the usefulness of the hydrophobic interaction score proposed here, which allows a robust prediction of hCES1 catalysis, while the MD simulations show the different behavior of substrates and products in the enzyme cavity, suggesting in particular that basic substrates interact with the enzyme in their unprotonated form.

Relevance: 90.00%

Abstract:

We have established the methodological and theoretical foundations for investigating the question "Do stateless nations have the right to control their own communication space?". The research fits the concept of a communication space into political theory, seeking its limits in individual rights and, from the perspective of liberalism 2, providing the justification for its control insofar as it is a platform that affects the preservation and survival of a national culture. The first article and phase of the thesis is the adaptation and definition of the concept of communication space. So far, research has proposed different models of communication space, depending on whether the emphasis is placed on the distribution and production of material marked with the symbols of the national identity of the emitting society, or on the idea of a space of communication flows fitted to a territory traditionally linked to a national identity or stateless nation. Likewise, a distinction is drawn between the emission dimension (flows going out from the territory to the world) and the reception dimension (information flows received from the world into the territory, specifically by the citizen); the intervening role of democratic institutions differs between the two dimensions, and so, therefore, do the rights affected and the theories or principles that deny or justify control of the communication space. Theories of the cognitive effects of the media have also been examined in order to relate them to nation building as symbolic and cultural cohesion. While the media cannot change minds immediately, they can, in the long run, shape a general national perception. A community is imagined, given the physical distance between its members, and social communication is, together with education, the main factor of nation building today.

Relevance: 90.00%

Abstract:

New blood vessel formation, a process referred to as angiogenesis, is essential for embryonic development and for many physiological and pathological processes during postnatal life, including cancer progression. Endothelial cell adhesion molecules of the integrin family have emerged as critical mediators and regulators of angiogenesis and vascular homeostasis. Integrins provide the physical interaction with the extracellular matrix necessary for cell adhesion, migration and positioning, and induction of signaling events essential for cell survival, proliferation and differentiation. Antagonists of integrin alpha V beta 3 suppress angiogenesis in many experimental models and are currently tested in clinical trials for their therapeutic efficacy against angiogenesis-dependent diseases, including cancer. Furthermore, interfering with signaling pathways downstream of integrins results in suppression of angiogenesis and may have relevant therapeutic implications. In this article we review the role of integrins in endothelial cell function and angiogenesis. In the light of recent advances in the field, we will discuss their relevance as a therapeutic target to suppress tumor angiogenesis.

Relevance: 90.00%

Abstract:

Vascular integrins are essential regulators and mediators of physiological and pathological angiogenesis, including tumor angiogenesis. Integrins provide the physical interaction with the extracellular matrix (ECM) necessary for cell adhesion, migration and positioning, and induce signaling events essential for cell survival, proliferation and differentiation. Integrins preferentially expressed on neovascular endothelial cells, such as alphaVbeta3 and alpha5beta1, are considered relevant targets for anti-angiogenic therapies. Anti-integrin antibodies and small-molecule integrin inhibitors suppress angiogenesis and tumor progression in many animal models, and are currently tested in clinical trials as anti-angiogenic agents. Cyclooxygenase-2 (COX-2), a key enzyme in the synthesis of prostaglandins and thromboxanes, is highly up-regulated in tumor cells, stromal cells and angiogenic endothelial cells during tumor progression. Recent experiments have demonstrated that COX-2 promotes tumor angiogenesis. Chronic intake of nonsteroidal anti-inflammatory drugs and COX-2 inhibitors significantly reduces the risk of cancer development, and this effect may be due, at least in part, to the inhibition of tumor angiogenesis. Endothelial cell COX-2 promotes integrin alphaVbeta3-mediated endothelial cell adhesion, spreading, migration and angiogenesis through the prostaglandin-cAMP-PKA-dependent activation of the small GTPase Rac. In this article, we review the role of integrins and COX-2 in angiogenesis and their crosstalk, and discuss implications relevant to their targeting to suppress tumor angiogenesis.

Relevance: 90.00%

Abstract:

BACKGROUND: Prediction of clinical course and outcome after severe traumatic brain injury (TBI) is important. OBJECTIVE: To examine whether clinical scales (Glasgow Coma Scale [GCS], Injury Severity Score [ISS], and Acute Physiology and Chronic Health Evaluation II [APACHE II]) or radiographic scales based on admission computed tomography (Marshall and Rotterdam) were associated with intensive care unit (ICU) physiology (intracranial pressure [ICP], brain tissue oxygen tension [PbtO2]) and clinical outcome after severe TBI. METHODS: One hundred one patients (median age, 41.0 years; interquartile range [26-55]) with severe TBI who had ICP and PbtO2 monitoring were identified. The relationship between admission GCS, ISS, APACHE II, Marshall and Rotterdam scores and ICP, PbtO2, and outcome was examined using mixed-effects models and logistic regression. RESULTS: Median (25%-75% interquartile range) admission GCS and APACHE II without GCS scores were 3.0 (3-7) and 11.0 (8-13), respectively. Marshall and Rotterdam scores were 3.0 (3-5) and 4.0 (4-5). Mean ICP and PbtO2 during the patients' ICU course were 15.5 ± 10.7 mm Hg and 29.9 ± 10.8 mm Hg, respectively. Three-month mortality was 37.6%. Admission GCS was not associated with mortality. APACHE II (P = .003), APACHE-non-GCS (P = .004), Marshall (P < .001), and Rotterdam scores (P < .001) were associated with mortality. No relationship between GCS, ISS, Marshall, or Rotterdam scores and subsequent ICP or PbtO2 was observed. The APACHE II score was inversely associated with median PbtO2 (P = .03) and minimum PbtO2 (P = .008) and had a stronger correlation with the amount of time of reduced PbtO2. CONCLUSION: Following severe TBI, factors associated with outcome may not always predict a patient's ICU course and, in particular, intracranial physiology.
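The logistic-regression part of such an association analysis can be sketched on synthetic data, fitted here by Newton-Raphson maximum likelihood. The "severity score" and its coefficients are invented for illustration, not values from the study:

```python
import numpy as np

def logistic_fit(x, y, n_iter=25):
    """Maximum-likelihood logistic regression (intercept + one predictor)
    fitted by Newton-Raphson (iteratively reweighted least squares)."""
    X = np.column_stack([np.ones(len(x)), x])
    beta = np.zeros(2)
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        w = p * (1.0 - p)                               # IRLS weights
        beta += np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (y - p))
    return beta

rng = np.random.default_rng(2)
score = rng.normal(10.0, 3.0, size=2000)                # hypothetical severity score
p_true = 1.0 / (1.0 + np.exp(-(-4.0 + 0.3 * score)))    # true log-odds of death
died = (rng.random(2000) < p_true).astype(float)
beta = logistic_fit(score, died)                        # estimates near (-4, 0.3)
```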

Relevance: 90.00%

Abstract:

Models predicting species spatial distribution are increasingly applied to wildlife management issues, emphasising the need for reliable methods to evaluate the accuracy of their predictions. As many available datasets (e.g. museums, herbariums, atlases) do not provide reliable information about species absences, several presence-only based analyses have been developed. However, methods to evaluate the accuracy of their predictions are few and have never been validated. The aim of this paper is to compare existing and new presence-only evaluators to usual presence/absence measures. We use a reliable, diverse presence/absence dataset of 114 plant species to test how common presence/absence indices (Kappa, MaxKappa, AUC, adjusted D²) compare to presence-only measures (AVI, CVI, Boyce index) for evaluating generalised linear models (GLMs). Moreover, we propose a new, threshold-independent evaluator, which we call the "continuous Boyce index". All indices were implemented in the BIOMAPPER software. We show that the presence-only evaluators are fairly correlated (ρ > 0.7) with the presence/absence ones. The Boyce indices are closer to AUC than to MaxKappa and are fairly insensitive to species prevalence. In addition, the Boyce indices provide predicted-to-expected ratio curves that offer further insights into model quality: robustness, habitat suitability resolution and deviation from randomness. This information helps in reclassifying predicted maps into meaningful habitat suitability classes. The continuous Boyce index is thus both a complement to the usual evaluation of presence/absence models and a reliable measure of presence-only based predictions.
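One common formulation of the Boyce index, the Spearman rank correlation between habitat-suitability classes and their predicted-to-expected (P/E) presence ratio, can be sketched as follows. This uses fixed classes rather than the continuous moving window, and the data are synthetic:

```python
import numpy as np

def boyce_index(suit_presence, suit_background, n_classes=10):
    """P/E-ratio evaluator for presence-only predictions: for each suitability
    class, P = share of presences, E = share of background sites; a good model
    has P/E increasing with suitability (rank correlation near +1)."""
    edges = np.linspace(0.0, 1.0, n_classes + 1)
    p = np.histogram(suit_presence, bins=edges)[0] / len(suit_presence)
    e = np.histogram(suit_background, bins=edges)[0] / len(suit_background)
    keep = e > 0
    pe = p[keep] / e[keep]
    ranks = np.argsort(np.argsort(pe))        # simple ranking (ignores ties)
    return np.corrcoef(np.arange(len(pe)), ranks)[0, 1]

rng = np.random.default_rng(3)
background = rng.random(10_000)           # suitability of available sites
presence = rng.beta(4.0, 1.0, 2_000)      # presences concentrated at high suitability
b = boyce_index(presence, background)     # near +1 for this informative "model"
```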

Relevance: 90.00%

Abstract:

Aim To explore the respective power of climate and topography to predict the distribution of reptiles in Switzerland, hence at a mesoscale level. A more detailed knowledge of these relationships, in combination with maps of the potential distribution derived from the models, is a valuable contribution to the design of conservation strategies. Location All of Switzerland. Methods Generalized linear models are used to derive predictive habitat distribution models from eco-geographical predictors in a geographical information system, using species data from a field survey conducted between 1980 and 1999. Results The maximum amount of deviance explained by climatic models is 65%, and 50% by topographical models. Low values were obtained with both sets of predictors for three species that are widely distributed in all parts of the country (Anguis fragilis, Coronella austriaca, and Natrix natrix), a result that suggests that including other important predictors, such as resources, should improve the models in further studies. With respect to topographical predictors, low values were also obtained for two species where we anticipated a strong response to aspect and slope, Podarcis muralis and Vipera aspis. Main conclusions Overall, both models and maps derived from climatic predictors more closely match the actual reptile distributions than those based on topography. These results suggest that the distributional limits of reptile species with a restricted range in Switzerland are largely set by climatic, predominantly temperature-related, factors.
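The "deviance explained" figure reported above can be illustrated with a short sketch: D² = 1 − residual deviance / null deviance for a binomial GLM, the GLM analogue of R². The predictor and probabilities are synthetic stand-ins:

```python
import numpy as np

def deviance_explained(y, p_hat):
    """D2 for a binomial GLM: 1 - residual deviance / null deviance."""
    p_hat = np.clip(p_hat, 1e-12, 1.0 - 1e-12)
    def dev(p):
        # binomial deviance of fitted probabilities p against observations y
        return -2.0 * np.sum(y * np.log(p) + (1.0 - y) * np.log(1.0 - p))
    null = np.full_like(p_hat, y.mean())      # intercept-only model
    return 1.0 - dev(p_hat) / dev(null)

rng = np.random.default_rng(4)
temp = rng.normal(size=1000)                          # standardized climate predictor
p_true = 1.0 / (1.0 + np.exp(-2.0 * temp))            # true presence probability
y = (rng.random(1000) < p_true).astype(float)
d2_climate = deviance_explained(y, p_true)            # informative model
d2_null = deviance_explained(y, np.full(1000, y.mean()))  # intercept-only: zero
```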

Relevance: 90.00%

Abstract:

This thesis focuses on theoretical asset pricing models and their empirical applications. I aim to investigate the following noteworthy problems: i) if the relationship between asset prices and investors' propensities to gamble and to fear disaster is time varying, ii) if the conflicting evidence for the firm and market level skewness can be explained by downside risk, iii) if costly learning drives liquidity risk. Moreover, empirical tests support the above assumptions and provide novel findings in asset pricing, investment decisions, and firms' funding liquidity. The first chapter considers a partial equilibrium model where investors have heterogeneous propensities to gamble and fear disaster. Skewness preference represents the desire to gamble, while kurtosis aversion represents fear of extreme returns. Using US data from 1988 to 2012, my model demonstrates that in bad times, risk aversion is higher, more people fear disaster, and fewer people gamble, in contrast to good times. This leads to a new empirical finding: gambling preference has a greater impact on asset prices during market downturns than during booms. The second chapter consists of two essays. The first essay introduces a formula based on the conditional CAPM for decomposing the market skewness. We find that the major market upward and downward movements can be well predicted by the asymmetric comovement of betas, which is characterized by an indicator called "Systematic Downside Risk" (SDR). We find that SDR can effectively forecast future stock market movements and we obtain out-of-sample R-squareds (compared with a strategy using the historical mean) of more than 2.27% with monthly data. The second essay reconciles a well-known empirical fact: aggregating positively skewed firm returns leads to negatively skewed market returns. We reconcile this fact through firms' greater response to negative market news than positive market news.
We also propose several market return predictors, such as downside idiosyncratic skewness. The third chapter studies funding liquidity risk based on a general equilibrium model which features two agents: one entrepreneur and one external investor. Only the investor needs to acquire information to estimate the unobservable fundamentals driving the economic outputs. The novelty is that information acquisition is more costly in bad times than in good times, i.e. a counter-cyclical information cost, as supported by previous empirical evidence. We then show that liquidity risks are principally driven by costly learning. Résumé: This thesis presents theoretical asset pricing models and their empirical applications. My aim is to study the following problems: whether the relationship between asset prices and investors' propensities to gamble and to fear disaster varies over time; whether the conflicting evidence for firm- and market-level skewness can be explained by downside risk; and whether costly learning drives liquidity risk. In addition, empirical tests support the above assumptions and provide new findings on asset pricing, investment decisions and firms' funding liquidity. The first chapter examines an equilibrium model in which investors have heterogeneous propensities to gamble and to fear disaster. Skewness preference represents the desire to gamble, while kurtosis aversion represents fear of disaster. Using US data from 1988 to 2012, my model shows that in bad times risk aversion is higher, more people fear disaster and fewer people gamble, in contrast to good times.
This leads to a new empirical finding: gambling preference has a greater impact on asset prices during market downturns than during booms. Exploiting this relationship alone would generate an annual excess return of 7.74% that is not explained by the popular factor models. The second chapter comprises two essays. The first essay introduces a formula based on the conditional CAPM for decomposing market skewness. We find that major upward and downward market movements can be predicted by the comovements of betas. An indicator called Systematic Downside Risk (SDR) is created to characterize this asymmetry in the comovements of betas. We find that SDR can effectively forecast future stock market movements, and we obtain out-of-sample R-squareds (compared with a strategy using the historical mean) of more than 2.272% with monthly data. An investor timing the market using SDR would have obtained a large increase of 0.206 in the ratio. The second essay reconciles a well-known empirical fact about firm- and market-level skewness: aggregating positively skewed firm returns leads to negatively skewed market returns. We decompose market return skewness at the firm level and reconcile this fact through firms' greater response to negative market news than to positive market news. This decomposition reveals several effective market return predictors, such as volatility-weighted idiosyncratic skewness and downside idiosyncratic skewness.
The third chapter provides a new theoretical foundation for time-varying liquidity problems within an incomplete-market environment. We propose a general equilibrium model with two agents: an entrepreneur and an external investor. Only the investor needs to learn the true state of the firm; payoff information is therefore costly. The novelty is that information acquisition is more costly in bad times than in good times, i.e. the information cost is counter-cyclical, as supported by previous empirical evidence. When a recession begins, costly learning raises liquidity premia, causing a liquidity-evaporation problem, as also supported by previous empirical evidence.
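The out-of-sample R² statistic used above (forecasts benchmarked against the expanding historical mean) can be sketched as follows. The return process, signal, and coefficient sizes are synthetic and deliberately exaggerated to make the illustration stable:

```python
import numpy as np

def out_of_sample_r2(returns, signal, window=60):
    """Out-of-sample R2 of an expanding-window predictive regression,
    benchmarked against the expanding historical-mean forecast."""
    sse_model = sse_mean = 0.0
    for t in range(window, len(returns) - 1):
        # fit r_{i+1} = a + b * signal_i on data available at time t
        slope, intercept = np.polyfit(signal[:t], returns[1:t + 1], 1)
        pred = intercept + slope * signal[t]          # model forecast of r_{t+1}
        hist_mean = returns[:t + 1].mean()            # benchmark forecast
        sse_model += (returns[t + 1] - pred) ** 2
        sse_mean += (returns[t + 1] - hist_mean) ** 2
    return 1.0 - sse_model / sse_mean

rng = np.random.default_rng(6)
n = 360                                              # 30 years of monthly data
signal = rng.normal(size=n)
returns = np.empty(n)
returns[0] = 0.0
returns[1:] = 0.5 * signal[:-1] + rng.normal(size=n - 1)  # predictable component
r2_os = out_of_sample_r2(returns, signal)
```

A positive value means the signal-based forecast beats the historical mean out of sample.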

Relevance: 90.00%

Abstract:

Summary: Division of labour is one of the most fascinating aspects of social insects. The efficient allocation of individuals to a multitude of different tasks requires a dynamic adjustment in response to the demands of a changing environment. A considerable number of theoretical models have focussed on identifying the mechanisms allowing colonies to perform efficient task allocation. The large majority of these models are built on the observation that individuals in a colony vary in their propensity (response threshold) to perform different tasks. Since individuals with a low threshold for a given task stimulus are more likely to perform that task than individuals with a high threshold, within-colony variation in individual thresholds results in colony division of labour. These theoretical models suggest that variation in individual thresholds is affected by the within-colony genetic diversity. However, the models have not considered the genetic architecture underlying the individual response thresholds. This is important because a better understanding of division of labour requires determining how genotypic variation relates to differences in within-colony response threshold distributions. In this thesis, we investigated the combined influence on task allocation efficiency of both the within-colony genetic variability (stemming from variation in the number of matings by queens) and the number of genes underlying the response thresholds. We used an agent-based simulator to model a situation where workers in a colony had to perform either a regulatory task (where the amount of a given food item in the colony had to be maintained within predefined bounds) or a foraging task (where the quantity of a second type of food item collected had to be the highest possible). The performance of colonies was a function of workers being able to perform both tasks efficiently.
To study the effect of within-colony genetic diversity, we compared the performance of colonies with queens mated with varying numbers of males. On the other hand, the influence of genetic architecture was investigated by varying the number of loci underlying the response thresholds of the foraging and regulatory tasks. Artificial evolution was used to evolve the allelic values underlying the task thresholds. The results revealed that multiple matings always translated into higher colony performance, whatever the number of loci encoding the thresholds of the regulatory and foraging tasks. However, the beneficial effect of additional matings was particularly important when the genetic architecture of queens comprised one or few genes for the foraging task's threshold. By contrast, a higher number of genes encoding the foraging task's threshold reduced colony performance, with the detrimental effect being stronger when queens had mated with several males. Finally, the number of genes determining the threshold for the regulatory task had only a minor but incremental effect on colony performance. Overall, our numerical experiments indicate the importance of considering the effects of queen mating frequency, the genetic architecture underlying task thresholds and the type of task performed when investigating the factors regulating the efficiency of division of labour in social insects. In this thesis we also investigate the task allocation efficiency of response threshold models and compare them with neural networks. While response threshold models are widely used among theoretical biologists interested in division of labour in social insects, our simulations reveal that they perform poorly compared to a neural network model. A major shortcoming of response thresholds is that they fail at one of the most crucial requirements of division of labour: the ability of individuals in a colony to efficiently switch between tasks under varying environmental conditions.
Moreover, an intrinsic property of the threshold models is that they lead to a large proportion of idle workers. Our results highlight these limitations of the response threshold models and provide an adequate substitute. Altogether, the experiments presented in this thesis provide novel contributions to the understanding of how division of labour in social insects is influenced by queen mating frequency and by the genetic architecture underlying worker task thresholds. In addition, the thesis provides a novel model of the mechanisms underlying worker task allocation that may be more generally applicable than the widely used response threshold models.
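The fixed response-threshold mechanism this summary builds on is commonly formalized (following Bonabeau and colleagues) as an engagement probability P(s) = s^n / (s^n + θ^n), where s is the task stimulus and θ an individual's threshold. The sketch below illustrates that rule together with a deliberately simplified additive genetic architecture, in which a worker's threshold is the mean of allelic values at a number of loci inherited from the queen and one of her mates. All function names, parameter values, and the inheritance scheme are illustrative assumptions, not the simulator actually used in the thesis.

```python
import random

def engagement_probability(stimulus: float, threshold: float,
                           steepness: int = 2) -> float:
    """Classic response-threshold rule: P = s^n / (s^n + theta^n)."""
    if stimulus <= 0.0:
        return 0.0
    return stimulus**steepness / (stimulus**steepness + threshold**steepness)

def worker_threshold(n_loci, queen_alleles, male_alleles, rng):
    """Illustrative additive architecture: at each locus the worker carries
    either the queen's allele or the allele of a randomly chosen mate, and
    the threshold is the mean allelic value across loci."""
    sire = rng.choice(male_alleles)
    alleles = [rng.choice((queen_alleles[i], sire[i])) for i in range(n_loci)]
    return sum(alleles) / n_loci

def simulate_colony(n_workers=100, n_loci=4, n_matings=1, steps=200, seed=0):
    """Toy colony dynamics: task demand accumulates each step; workers
    engage probabilistically, and each act of work reduces the demand."""
    rng = random.Random(seed)
    queen = [rng.random() for _ in range(n_loci)]
    mates = [[rng.random() for _ in range(n_loci)] for _ in range(n_matings)]
    thresholds = [worker_threshold(n_loci, queen, mates, rng)
                  for _ in range(n_workers)]
    stimulus, work_done = 0.0, 0
    for _ in range(steps):
        stimulus += 0.05  # demand grows over time
        for theta in thresholds:
            if rng.random() < engagement_probability(stimulus, theta):
                work_done += 1
                stimulus = max(0.0, stimulus - 0.01)  # work lowers demand
    return work_done
```

Increasing `n_matings` widens the mixture of sire genotypes from which worker thresholds are drawn, which is the kind of within-colony genetic variation the thesis manipulates; varying `n_loci` coarsens or smooths the resulting threshold distribution.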

Relevance:

90.00%

Publisher:

Abstract:

Personal and Public Involvement (PPI) is an integral element of effective commissioning and is underpinned by a core set of values and principles: involving and listening to people in order to help us make services better. It brings a number of recognised benefits if fully embraced into our culture and practice, including:

- use of service user knowledge and expertise;
- better priority setting and decision making;
- more responsive, appropriate, efficient and tailored services;
- transformation and reduction of complaints;
- increased levels of service satisfaction;
- increased dignity and self-worth.

The Public Health Agency (PHA) and Health and Social Care Board (HSCB) have now developed a joint Personal and Public Involvement (PPI) Strategy after extensive engagement and discussion. The Strategy has been approved by both organisations and is being formally consulted on from 23rd June 2011 to 15th September 2011. The Strategy is now available for your consideration. We have developed the following documents (please see attachments below):

- Valuing People, Valuing Their Participation. Involving You and Listening to You. Consultation Document.
- Valuing People, Valuing Their Participation. Involving You and Listening to You. [An Easy Read version of the Personal and Public Involvement Strategy.]
- Valuing People, Valuing Their Participation. [An Equality and Human Rights Screening of the Strategy.]
- Key Questions to guide consideration of the Personal and Public Involvement Strategy.

People are encouraged to read the Strategy and to let us have your views. There is a set of Key Questions, but any comments, ideas or suggestions that could support us in our efforts to embed Personal and Public Involvement into our culture and practice would be most welcome. Responses should be returned by 4.00pm on Thursday 15th September 2011:

By post:
Martin Quinn
Regional PPI Lead
Public Health Agency
Gransha Park House
15 Gransha Park
Londonderry
BT47 6FN

By email: siobhan.carlin@hscni.net
By telephone: (028) 7186 0086

A more detailed version of the consultation document is available by clicking here or by contacting Siobhan Carlin, email: siobhan.carlin@hscni.net, tel: (028) 7186 0086. If you require any of these documents in an alternative format, such as Braille or larger print, or in another language if you are not fluent in English, please do not hesitate to contact us. A report of the feedback received as part of this consultation can be made available upon request. Please be aware that the PHA and HSCB are also currently consulting on the Community Development Strategy; you are invited to respond to that consultation as well if appropriate.

Relevance:

90.00%

Publisher:

Abstract:

This study was carried out to evaluate the molecular pattern of all available Brazilian human T-cell lymphotropic virus type 1 Env (n = 15) and Pol (n = 43) nucleotide sequences via epitope prediction, physico-chemical analysis, and identification of potential protein sites, giving support to the Brazilian AIDS vaccine program. In 12 previously described peptides of the Env sequences we found 12 epitopes, while in 4 peptides of the Pol sequences we found 4 epitopes. The total variation in amino acid composition was 9 and 17% for human leukocyte antigen (HLA) class I and class II Env epitopes, respectively. Analysis of the Pol sequences revealed a total amino acid variation of 0.75% for HLA-I and HLA-II epitopes. In 5 of the 12 Env epitopes, the physico-chemical analysis demonstrated that the mutations magnified the antigenicity profile. The potential protein domain analysis of the Env sequences showed the loss of a CK-2 phosphorylation site caused by the D197N mutation in one epitope, and of an N-glycosylation site caused by the S246Y and V247I mutations in another epitope. In addition, the analysis of selection pressure identified 8 positively selected sites (ω = 9.59) using codon-based substitution models and maximum-likelihood methods. These findings underscore the importance of this Env region for virus fitness and for the host immune response and, therefore, for the development of vaccine candidates.
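Per-epitope amino acid variation of the kind reported above can be computed from an aligned set of sequences by comparing each alignment column with its consensus residue. The snippet below is a minimal illustration of that bookkeeping only; the function names and the toy peptides in the usage note are hypothetical and are not taken from the study's data.

```python
from collections import Counter

def consensus_residue(column):
    """Most frequent residue in one alignment column."""
    return Counter(column).most_common(1)[0][0]

def per_site_variation(aligned_seqs):
    """For each site, the fraction of sequences differing from the consensus.
    Assumes all sequences are pre-aligned and of equal length."""
    n = len(aligned_seqs)
    result = []
    for col in zip(*aligned_seqs):
        ref = consensus_residue(col)
        result.append(sum(res != ref for res in col) / n)
    return result

def total_variation_percent(aligned_seqs):
    """Mean per-site variation across the alignment, as a percentage."""
    variation = per_site_variation(aligned_seqs)
    return 100.0 * sum(variation) / len(variation)
```

For three toy nine-residue peptides differing at a single site, e.g. `["LLFGYPVYV", "LLFGYPVYV", "LLFGHPVYV"]`, one site has variation 1/3 and the total is 100 × (1/3)/9 ≈ 3.7%.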