28 results for "Environment impact"
Abstract:
Biological monitoring of occupational exposure is characterized by important variability, due both to variability in the environment and to biological differences between workers. A quantitative description and understanding of this variability is important for a dependable application of biological monitoring. This work describes this variability, using a toxicokinetic model, for a large range of chemicals for which biological reference values exist. A toxicokinetic compartmental model describing both the parent compound and its metabolites was used. For each chemical, compartments were given physiological meaning. Models were elaborated based on physiological, physicochemical, and biochemical data when available, and on half-lives and central-compartment concentrations otherwise. Fourteen chemicals were studied (arsenic, cadmium, carbon monoxide, chromium, cobalt, ethylbenzene, ethylene glycol monomethyl ether, fluorides, lead, mercury, methyl isobutyl ketone, pentachlorophenol, phenol, and toluene), representing 20 biological indicators. Occupational exposures were simulated using Monte Carlo techniques with realistic distributions of both individual physiological parameters and exposure conditions. Resulting biological indicator levels were then analyzed to identify the contributions of environmental and biological variability to total variability. Comparison of predicted biological indicator levels with biological exposure limits showed a high correlation with the model for 19 out of 20 indicators. Variability associated with changes in exposure levels (GSD of 1.5 and 2.0) is shown to be mainly influenced by the kinetics of the biological indicator: for short half-lives (less than 7 hr), the variability of the indicator is very similar to the environmental variability, whereas for longer half-lives the estimated variability decreased. Thus, with regard to variability, we conclude that, for the 14 chemicals modeled, biological monitoring would be preferable to air monitoring. [Supplementary materials are available for this article. Go to the publisher's online edition of Journal of Occupational and Environmental Hygiene for the following free supplemental resource: tables detailing the CBTK models for all 14 chemicals and the symbol nomenclature that was used.] [Authors]
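The kinetic effect described above, biomarker variability shrinking as the biological half-life grows, can be sketched with a toy Monte Carlo simulation. This is a minimal one-compartment illustration with made-up parameters, not the paper's calibrated CBTK models:

```python
import numpy as np

def simulate_biomarker(half_life_h, exposure_gsd, n_workers=10_000,
                       shifts=5, shift_gap_h=24.0, rng=None):
    """Monte Carlo sketch: one-compartment biomarker after repeated shifts.

    Exposure varies lognormally between shifts (geometric SD = exposure_gsd);
    the biomarker accumulates with first-order elimination (half_life_h).
    All parameter values are illustrative, not the paper's calibrated data.
    """
    rng = np.random.default_rng(rng)
    k = np.log(2) / half_life_h  # elimination rate constant (1/h)
    level = np.zeros(n_workers)
    for _ in range(shifts):
        # lognormal shift dose with the requested geometric standard deviation
        dose = rng.lognormal(mean=0.0, sigma=np.log(exposure_gsd), size=n_workers)
        # decay since the previous shift, then add the new shift's uptake
        level = level * np.exp(-k * shift_gap_h) + dose
    return level

for hl in (4, 40, 400):  # short vs long biological half-lives (hours)
    levels = simulate_biomarker(hl, exposure_gsd=2.0, rng=0)
    gsd = np.exp(np.std(np.log(levels)))
    print(f"half-life {hl:>3} h -> biomarker GSD ~ {gsd:.2f}")
```

A short half-life leaves the biomarker tracking the most recent shift (GSD close to the environmental 2.0), while a long half-life averages exposure over many shifts and damps the variability.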
Abstract:
This article analyzes whether and to what extent the policy environment of civil servants has an impact on their level of Public Service Motivation (PSM). It hypothesizes that public employees working in different policy domains and stages of the policy cycle are diversely motivated by four PSM orientations (Compassion, Commitment to the public interest, Self-sacrifice and Attraction to politics). The empirical results are based on a survey of 6885 Swiss civil servants. They show that those in charge of Welfare State policies are inclined to have higher levels of 'Compassion', whereas those performing core state functions report lower levels. Furthermore, employees whose main tasks are related to policy formulation display high levels of the 'Attraction to politics' dimension of PSM. This study questions the generalization of previous findings on PSM that are based on heterogeneous survey populations.
Abstract:
The current research project is both a process and impact evaluation of community policing in Switzerland's five major urban areas - Basel, Bern, Geneva, Lausanne, and Zurich. Community policing is both a philosophy and an organizational strategy that promotes a renewed partnership between the police and the community to solve problems of crime and disorder. The process evaluation data on police internal reforms were obtained through semi-structured interviews with key administrators from the five police departments as well as from police internal documents and additional public sources.
The impact evaluation uses official crime records and census statistics as contextual variables as well as Swiss Crime Survey (SCS) data on fear of crime, perceptions of disorder, and public attitudes towards the police as outcome measures. The SCS is a standing survey instrument that has polled residents of the five urban areas repeatedly since the mid-1980s. The process evaluation produced a "Calendar of Action" to create panel data to measure community policing implementation progress over six evaluative dimensions in intervals of five years between 1990 and 2010. The impact evaluation, carried out ex post facto, uses an observational design that analyzes the impact of the different community policing models between matched comparison areas across the five cities. Using ZIP code districts as proxies for urban neighborhoods, geospatial data mining algorithms serve to develop a neighborhood typology in order to match the comparison areas. To this end, both unsupervised and supervised algorithms are used to analyze high-dimensional data on crime, the socio-economic and demographic structure, and the built environment in order to classify urban neighborhoods into clusters of similar type. In a first step, self-organizing maps serve as tools to develop a clustering algorithm that reduces the within-cluster variance in the contextual variables and simultaneously maximizes the between-cluster variance in survey responses. The random forests algorithm then serves to assess the appropriateness of the resulting neighborhood typology and to select the key contextual variables in order to build a parsimonious model that makes a minimum of classification errors. 
Finally, for the impact analysis, propensity score matching methods are used to match the survey respondents of the pretest and posttest samples on age, gender, and their level of education for each neighborhood type identified within each city, before conducting a statistical test of the observed difference in the outcome measures. Moreover, all significant results were subjected to a sensitivity analysis to assess the robustness of these findings in the face of potential bias due to some unobserved covariates. The study finds that over the last fifteen years, all five police departments have undertaken major reforms of their internal organization and operating strategies and forged strategic partnerships in order to implement community policing. The resulting neighborhood typology reduced the within-cluster variance of the contextual variables and accounted for a significant share of the between-cluster variance in the outcome measures prior to treatment, suggesting that geocomputational methods help to balance the observed covariates and hence to reduce threats to the internal validity of an observational design. Finally, the impact analysis revealed that fear of crime dropped significantly over the 2000-2005 period in the neighborhoods in and around the urban centers of Bern and Zurich. These improvements are fairly robust in the face of bias due to some unobserved covariate and covary temporally and spatially with the implementation of community policing. The alternative hypothesis that the observed reductions in fear of crime were at least in part a result of community policing interventions thus appears at least as plausible as the null hypothesis of absolutely no effect, even if the observational design cannot completely rule out selection and regression to the mean as alternative explanations.
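The matching step described above can be illustrated with a small sketch. The snippet below is a generic nearest-neighbour propensity-score match on simulated covariates (stand-ins for age, gender, and education); it is not the study's exact procedure, which matched within each neighborhood type:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def propensity_match(X_pre, X_post):
    """Nearest-neighbour propensity-score matching sketch.

    Balances pretest/posttest samples on covariates before comparing
    outcomes: a simplified stand-in for the procedure described above.
    """
    X = np.vstack([X_pre, X_post])
    z = np.r_[np.zeros(len(X_pre)), np.ones(len(X_post))]  # 1 = posttest
    # propensity score: probability of belonging to the posttest sample
    ps = LogisticRegression().fit(X, z).predict_proba(X)[:, 1]
    ps_pre, ps_post = ps[z == 0], ps[z == 1]
    # match each posttest respondent to the closest pretest propensity score
    idx = np.abs(ps_pre[None, :] - ps_post[:, None]).argmin(axis=1)
    return idx  # indices into the pretest sample

rng = np.random.default_rng(0)
X_pre = rng.normal(size=(200, 3))            # age, gender, education (standardized)
X_post = rng.normal(loc=0.3, size=(150, 3))  # slightly shifted posttest sample
matches = propensity_match(X_pre, X_post)
print(matches.shape)  # one pretest match per posttest respondent
```

The matched pretest subsample `X_pre[matches]` can then be compared with the posttest sample on the outcome measures, as in the study's pretest-posttest tests.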
Abstract:
With a life expectancy at age 65 of around 20 years, damaging health risk behaviours of young-old adults have become a target for preventive action. Targeting such risk factors requires an accurate understanding of the present and past socioeconomic conditions associated with health risk behaviours. The aim of our study is to assess the impact of certain life events, as well as economic and environmental factors, on health risk behaviours. We included 1309 participants of the Lausanne Cohort Lc65+ aged 65-70 years and employed logistic regression analyses, with individuals nested within areas. The results illustrate the influence of socioeconomic factors from childhood to young-old age. Life experiences in adulthood and economic resources in young-old age are both associated with unfavourable health behaviours. Neighbourhood is a modest determinant as well, particularly regarding alcohol consumption. Therefore, prevention of health risk behaviours should focus on population subgroups defined on the basis of their socioeconomic and living contexts.
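The modelling approach above, logistic regression with individuals nested within areas, can be approximated by a fixed-effects sketch in which area indicators sit alongside individual covariates. All data, effect sizes, and variable names below are simulated for illustration only:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import OneHotEncoder

# Sketch: logistic regression of a health-risk behaviour on individual
# socioeconomic covariates plus neighbourhood (area) indicators. This is a
# fixed-effects stand-in for the nested model described in the abstract.
rng = np.random.default_rng(1)
n, n_areas = 1309, 20
area = rng.integers(0, n_areas, size=n)            # area each person lives in
ses = rng.normal(size=(n, 2))                      # e.g. income, education (simulated)
area_effect = rng.normal(scale=0.5, size=n_areas)  # neighbourhood-level variation
logit = -0.5 + 0.8 * ses[:, 0] - 0.4 * ses[:, 1] + area_effect[area]
y = rng.random(n) < 1 / (1 + np.exp(-logit))       # risky behaviour (yes/no)

# one-hot area dummies (drop one as the reference category)
dummies = OneHotEncoder(drop="first").fit_transform(area.reshape(-1, 1)).toarray()
X = np.hstack([ses, dummies])
model = LogisticRegression(max_iter=1000).fit(X, y)
print("individual-level coefficients:", model.coef_[0, :2].round(2))
```

A true multilevel (random-intercept) model would treat the area effects as draws from a distribution rather than as fixed dummies; the sketch only shows how the nesting enters the design matrix.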
Abstract:
NK cells can kill MHC-different or MHC-deficient but not syngeneic MHC-expressing target cells. This MHC class I-specific tolerance is acquired during NK cell development. MHC recognition by murine NK cells largely depends on clonally distributed Ly49 family receptors, which inhibit NK cell function upon ligand engagement. We investigated whether these receptors play a role for the development of NK cells and provide evidence that the expression of a Ly49 receptor transgene on developing NK cells endowed these cells with a significant developmental advantage over NK cells lacking such a receptor, but only if the relevant MHC ligand was present in the environment. The data suggest that the transgenic Ly49 receptor accelerates and/or rescues the development of NK cells which would otherwise fail to acquire sufficient numbers of self-MHC-specific receptors. Interestingly, the positive effect on NK cell development is most prominent when the MHC ligand is simultaneously present on both hemopoietic and nonhemopoietic cells. These findings correlate with functional data showing that MHC class I ligand on all cells is required to generate functionally mature NK cells capable of reacting to cells lacking the respective MHC ligand. We conclude that the engagement of inhibitory MHC receptors during NK cell development provides signals that are important for further NK cell differentiation and/or maturation.
Abstract:
Nanotechnology has been heralded as a "revolution" in science, for two reasons: first, because of its revolutionary view of the way in which chemicals and elements, such as gold and silver, behave, compared to the traditional scientific understanding of their properties; second, because the impact of these new discoveries, as applied to commerce, can transform daily life through consumer products ranging from sun tan lotions and cosmetics, food packaging, and paints and coatings for cars, housing and fabrics, to medicine and thousands of industrial processes. Beneficial consumer uses of nanotechnologies, already in the stream of commerce, improve coatings on inks and paints in everything from food packaging to cars. Additionally, "nanomedicine" offers the promise of diagnosis and treatment at the molecular level in order to detect and treat presymptomatic disease, or to rebuild neurons in Alzheimer's and Parkinson's disease. There is a possibility that severe complications such as stroke or heart attack may be avoided by means of prophylactic treatment of people at risk, and bone regeneration may keep many people active who never expected rehabilitation. Miniaturisation of diagnostic equipment can also reduce the amount of sampling material required for testing and medical surveillance. Miraculous developments that sound like science fiction to those who eagerly anticipate these medical products, combined with the emerging commercial impact of nanotechnology applications in consumer products, will reshape civil society permanently. Thus, everyone within the jurisdiction of the Council of Europe is an end-user of nanotechnology, even without realising that nanotechnology has touched daily life.
Abstract:
Ubiquitous computing is the emerging trend in computing systems. Based on this observation, this thesis proposes an analysis of the hardware and environmental constraints that govern pervasive platforms. These constraints have a strong impact on the programming of such platforms, so solutions are proposed to facilitate this programming at both the platform and node levels. The first contribution presented in this document combines agent-oriented programming with the principles of bio-inspiration (Phylogenesis, Ontogenesis, and Epigenesis) to program pervasive platforms such as the PERvasive computing framework for modeling comPLEX virtually Unbounded Systems platform. The second contribution proposes a method to efficiently program parallelizable applications on each computing node of this platform.
Abstract:
Many biotic and abiotic factors affect the persistence and activity of beneficial pseudomonads introduced into soil to suppress plant diseases. One such factor may be the presence of virulent bacteriophages that decimate the population of the introduced bacteria, thereby reducing their beneficial effect. We have isolated a lytic bacteriophage (phiGP100) that specifically infects the biocontrol bacterium Pseudomonas fluorescens CHA0 and some closely related Pseudomonas strains. phiGP100 was found to be a double-stranded-DNA phage with an icosahedral head, a stubby tail, and a genome size of approximately 50 kb. Replication of phiGP100 was negatively affected at temperatures higher than 25 degrees C. phiGP100 had a negative impact on the population size and the biocontrol activity of P. fluorescens strain CHA0-Rif (a rifampicin-resistant variant of CHA0) in natural soil microcosms. In the presence of phiGP100, the population size of strain CHA0-Rif in soil and on cucumber roots was reduced more than 100-fold. As a consequence, the bacterium's capacity to protect cucumber against a root disease caused by the pathogenic oomycete Pythium ultimum was entirely abolished. In contrast, the phage affected neither root colonization nor the disease-suppressive effect of a phiGP100-resistant variant of strain CHA0-Rif. To our knowledge, this study is the first to illustrate the potential of phages to impair the biocontrol performance of beneficial bacteria released into the natural soil environment.
Abstract:
Therapeutic nanoparticles (NPs) are used in nanomedicine as drug carriers or imaging agents, providing increased selectivity/specificity for diseased tissues. The first NPs in nanomedicine were developed for increasing the efficacy of known drugs displaying dose-limiting toxicity and poor bioavailability and for enhancing disease detection. Nanotechnologies have gained much interest owing to their huge potential for applications in industry and medicine. It is necessary to ensure and control the biocompatibility of the components of therapeutic NPs to guarantee that intrinsic toxicity does not overtake the benefits. In addition to monitoring their toxicity in vitro, in vivo and in silico, it is also necessary to understand their distribution in the human body, their biodegradation and excretion routes and dispersion in the environment. Therefore, a deep understanding of their interactions with living tissues and of their possible effects in the human (and animal) body is required for the safe use of nanoparticulate formulations. Obtaining this information was the main aim of the NanoTEST project, and the goals of the reports collected together in this special issue are to summarise the observations and results obtained by the participating research teams and to provide methodological tools for evaluating the biological impact of NPs.
Abstract:
Introduction: Frequent emergency department (ED) users are often vulnerable patients with many risk factors affecting their quality of life (QoL). The aim of this study was to examine to what extent a case management intervention improved frequent ED users' QoL. Methods: Data were part of a randomized, controlled trial designed to improve frequent ED users' QoL at the Lausanne University Hospital. A total of 194 frequent ED users (≥ 5 attendances during the previous 12 months; ≥ 18 years of age) were randomly assigned to the control or the intervention group. Participants in the intervention group received a case management intervention (i.e. counseling and assistance concerning social determinants of health, substance-use disorders, and access to the health-care system). QoL was evaluated using the WHOQOL-BREF at baseline and twelve months later. Four dimensions of QoL were retained: physical health, psychological health, social relationship, and environment, with scores ranging from 0 (low QoL) to 100 (high QoL).
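The 0-100 domain scores mentioned above follow the standard WHOQOL-BREF convention of rescaling mean item responses (1 to 5 Likert points) onto a 0-100 range. A minimal sketch of that transformation, with illustrative item values:

```python
def whoqol_domain_score(item_scores):
    """Rescale mean WHOQOL-BREF item responses (1-5 Likert) within a domain
    to the 0-100 range used above: 0 = lowest QoL, 100 = highest QoL.
    (Standard WHOQOL transformation; the item values below are illustrative.)
    """
    mean = sum(item_scores) / len(item_scores)
    return (mean - 1) / 4 * 100

print(whoqol_domain_score([3, 4, 4, 5]))  # -> 75.0
```

The instrument's manual also prescribes reverse-scoring for some items and handling of missing responses, which this sketch omits.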
Abstract:
PURPOSE: Laparoscopic surgery presents specific challenges, such as the reduction of a three-dimensional anatomic environment to two dimensions. The aim of this study was to investigate the impact of the loss of the third dimension on laparoscopic virtual reality (VR) performance. METHODS: We compared a group of examinees with impaired stereopsis (group 1, n = 28) to a group with accurate stereopsis (group 2, n = 29). The primary outcome was the difference between the mean total score (MTS) of all tasks taken together and the performance in task 3 (eye-hand coordination), which was a priori considered to be the most dependent on intact stereopsis. RESULTS: The MTS and performance in task 3 tended to be slightly, but not significantly, better in group 2 than in group 1 [MTS: -0.12 (95 % CI -0.32, 0.08; p = 0.234); task 3: -0.09 (95 % CI -0.29, 0.11; p = 0.385)]. The difference in MTS between group 2 under simulated impaired stereopsis (an eye patch over one eye in the second run) and the first run of group 1 was not significant (MTS: p = 0.981; task 3: p = 0.527). CONCLUSION: We were unable to demonstrate an impact of impaired stereopsis on laparoscopic VR performance. Individuals with accurate stereopsis seem to be able to compensate for the loss of the third dimension in laparoscopic VR simulations.
Abstract:
Nowadays, species distribution models (SDMs) are a widely used tool. Using different statistical approaches, these models reconstruct the realized niche of a species from presence data and a set of variables, often topoclimatic. Their range of uses is quite large, from understanding a single species' requirements, to the creation of nature reserves based on species hotspots, to modeling the impact of climate change. Most of the time these models use variables at a resolution of 50 km x 50 km or 1 km x 1 km. In some cases, however, they are used at resolutions below the kilometer scale (100 m x 100 m or 25 m x 25 m) and are then called high resolution models. Quite recently a new kind of data has emerged, enabling precision up to 1 m x 1 m and thus allowing very high resolution modeling. However, these new variables are very costly and need an important amount of time to be processed, especially when they are used in complex calculations such as model projections over large areas. Moreover, the importance of very high resolution data in SDMs has not yet been assessed and is not well understood. Some basic knowledge of what drives species presences and absences is still missing. Indeed, it is not clear whether, in mountain areas like the Alps, coarse topoclimatic gradients drive species distributions, whether fine-scale temperature or topography are more important, or whether their importance can be neglected when balanced against competition or stochasticity. In this thesis I investigated the importance of very high resolution data (2-5 m) in species distribution models, using very high resolution topographic, climatic, and edaphic variables over a 2000 m elevation gradient in the Western Swiss Alps. I also investigated more local responses to these variables for a subset of species living in this area at two precise elevation belts. 
During this thesis I showed that high resolution data require very good datasets (species and variables for the models) to produce satisfactory results. Indeed, in mountain areas, temperature is the most important factor driving species distributions, and it needs to be modeled at very fine resolution, rather than interpolated over large surfaces, to produce satisfactory results. Despite the intuitive idea that topography should be very important at high resolution, the results are mixed. Looking at the importance of variables over a large gradient, however, buffers their apparent importance: topographic factors were shown to be highly important at the subalpine level, but their importance decreases at lower elevations. Whereas at the montane level edaphic and land-use factors are more important, high resolution topographic data are more important at the subalpine level. Finally, the biggest improvement in the models happens when edaphic variables are added. Indeed, adding soil variables is of high importance, and variables like pH surpass the usual topographic variables in SDMs in terms of importance in the models. To conclude, high resolution is very important in modeling but requires very good datasets. Only increasing the resolution of the usual topoclimatic predictors is not sufficient, and the use of edaphic predictors was highlighted as fundamental to produce significantly better models. This is of primary importance, especially if these models are used to reconstruct communities or as a basis for biodiversity assessments.
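The central claim, that adding an edaphic predictor such as pH measurably improves an SDM, can be illustrated with a toy presence-absence model. Everything below is synthetic; the predictors, effect sizes, and AUC comparison are stand-ins for the thesis's real data and methods:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Sketch of the comparison discussed above: does adding an edaphic predictor
# (pH) improve a presence-absence SDM over topoclimatic predictors alone?
rng = np.random.default_rng(42)
n = 2000
temp, slope, ph = rng.normal(size=(3, n))    # standardized synthetic predictors
logit = 1.5 * temp + 0.3 * slope + 1.0 * ph  # pH genuinely matters in this toy world
presence = rng.random(n) < 1 / (1 + np.exp(-logit))

X_topo = np.c_[temp, slope]                  # topoclimatic predictors only
X_full = np.c_[temp, slope, ph]              # ... plus the edaphic predictor
train, test = np.arange(n) < 1500, np.arange(n) >= 1500

for name, X in [("topoclimatic only", X_topo), ("+ pH", X_full)]:
    m = LogisticRegression().fit(X[train], presence[train])
    auc = roc_auc_score(presence[test], m.predict_proba(X[test])[:, 1])
    print(f"{name:<18} AUC = {auc:.3f}")
```

In practice SDMs often use more flexible learners (GLM/GAM ensembles, random forests, MaxEnt), but the evaluation logic, comparing held-out discrimination with and without the candidate predictor, is the same.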
Abstract:
Exiting from the largely sterile environment of the womb, the neonatal immune system is not fully mature, and thus neonatal immune cells must simultaneously mount responses against environmental stimuli while maturing. This dynamic process of immune maturation is driven by a variety of cell-intrinsic and extrinsic factors. Recent studies have focused on some of these factors and have shed light on the mechanisms by which they drive immune maturation. We review the interactions and consequences of immune maturation during the pre- and perinatal period. We discuss environmental signals in early life that are needed for healthy immune homeostasis, and highlight detrimental factors that can set an individual on a path towards disease. This early-life period of immune maturation could hold the key to strategies for setting individuals on trajectories towards health and reduced disease susceptibility.