73 results for ATTRIBUTES
at Université de Lausanne, Switzerland
Abstract:
Among the various work stress models, one of the most popular to date is the job demands-control (JDC) model developed by Karasek (1979), which postulates that work-related strain will be highest under work conditions characterized by high demands and low autonomy. The absence of social support at work further increases negative outcomes. However, this model does not apply equally to all individuals and all cultures. In the following studies, we assessed work characteristics, personality traits, culture-driven individual attributes, and work-related health outcomes through the administration of questionnaires. The samples consist of Swiss (n = 622) and South African (n = 879) service-oriented employees (from the health, finance, education and commerce sectors) aged 18 to 65. Results generally confirm the universal contribution of high psychological demands, low decision latitude and low supervisor support at work, as well as high neuroticism, to predicting the worst health outcomes among employees in both countries. Furthermore, low neuroticism plays a moderating role between psychological demands and burnout, while high openness and high conscientiousness each play a moderating role between decision latitude and burnout in South Africa. Results also reveal that culture-driven individual attributes play a role in both countries, but in a unique manner and according to the ethnic group to which employees belong. Given that organizations are increasingly characterized by multicultural workforces as well as increasingly adverse and complex job conditions, our results help identify, in a more up-to-date and refined way, the key dynamics between the employee and the work environment in today's context.
personality, individual differences, cross-cultural comparisons, culture, work stress, burnout, employee health.
Abstract:
T cell factor-1 (TCF-1) and lymphoid enhancer-binding factor 1, the effector transcription factors of the canonical Wnt pathway, are known to be critical for normal thymocyte development. However, it is largely unknown whether they have a role in regulating mature T cell activation and T cell-mediated immune responses. In this study, we demonstrate that, like IL-7Ralpha and CD62L, TCF-1 and lymphoid enhancer-binding factor 1 exhibit dynamic expression changes during T cell responses, being highly expressed in naive T cells, downregulated in effector T cells, and upregulated again in memory T cells. Enforced expression of a p45 TCF-1 isoform limited the expansion of Ag-specific CD8 T cells in response to Listeria monocytogenes infection. However, when the p45 transgene was coupled with ectopic expression of stabilized beta-catenin, more Ag-specific memory CD8 T cells were generated, with enhanced ability to produce IL-2. Moreover, these memory CD8 T cells expanded to a larger number of secondary effectors and cleared bacteria faster when the immunized mice were rechallenged with virulent L. monocytogenes. Furthermore, in response to vaccinia virus or lymphocytic choriomeningitis virus infection, more Ag-specific memory CD8 T cells were generated in the presence of p45 and stabilized beta-catenin transgenes. Although activated Wnt signaling also resulted in larger numbers of Ag-specific memory CD4 T cells, their functional attributes and expansion after the secondary infection were not improved. Thus, constitutive activation of the canonical Wnt pathway favors memory CD8 T cell formation during initial immunization, resulting in enhanced immunity upon second encounter with the same pathogen.
Abstract:
The purpose of this paper is to review the scientific literature from August 2007 to July 2010. The review is focused on more than 420 published papers. It does not cover information from international meetings available only in abstract form. Fingermarks constitute an important chapter, with coverage of the identification process as well as detection techniques on various surfaces. We note that research has been very dense, both in exploring and understanding current detection methods and in bringing groundbreaking techniques to increase the number of marks detected on various objects. The recent report from the US National Research Council (NRC) is a milestone that has promoted a critical discussion on the state of forensic science and its associated research. We can expect a surge of interest in research relating to the cognitive aspects of mark and print comparison, the establishment of relevant forensic error rates, and statistical modelling of the selectivity of marks' attributes. Other biometric means of forensic identification, such as footmarks or earmarks, are also covered in the report. Compared to previous years, we noted a decrease in the number of submissions in these areas. No doubt the NRC report has sown the seed for further investigation of these fields as well.
Abstract:
Whether the somatosensory system, like its visual and auditory counterparts, comprises parallel functional pathways for processing identity and spatial attributes (so-called what and where pathways, respectively) has hitherto been studied in humans using neuropsychological and hemodynamic methods. Here, electrical neuroimaging of somatosensory evoked potentials (SEPs) identified the spatio-temporal mechanisms subserving vibrotactile processing during two types of blocks of trials. What blocks varied stimuli in their frequency (22.5 Hz vs. 110 Hz) independently of their location (left vs. right hand). Where blocks varied the same stimuli in their location independently of their frequency. In this way, there was a 2 × 2 within-subjects factorial design, counterbalancing the hand stimulated (left/right) and trial type (what/where). Responses to physically identical somatosensory stimuli differed within 200 ms post-stimulus onset, which is within the same timeframe we previously identified for audition (De Santis, L., Clarke, S., Murray, M.M., 2007. Automatic and intrinsic auditory "what" and "where" processing in humans revealed by electrical neuroimaging. Cereb Cortex 17, 9-17). Initially (100-147 ms), responses to each hand were stronger in the what than the where condition in a statistically indistinguishable network within the hemisphere contralateral to the stimulated hand, arguing against hemispheric specialization as the principal basis for somatosensory what and where pathways. Later (149-189 ms), responses differed topographically, indicative of the engagement of distinct configurations of brain networks. A common topography described responses to the where condition irrespective of the hand stimulated. By contrast, different topographies accounted for the what condition, also as a function of the hand stimulated.
Parallel, functionally specialized pathways are observed across sensory systems and may be indicative of a computationally advantageous organization for processing spatial and identity information.
Abstract:
Normal visual perception requires differentiating foreground from background objects. Differences in physical attributes sometimes determine this relationship. Often such differences must instead be inferred, as when two objects or their parts have the same luminance. Modal completion refers to such perceptual "filling-in" of object borders that are accompanied by concurrent brightness enhancement, in turn termed illusory contours (ICs). Amodal completion is filling-in without concurrent brightness enhancement. Presently there are controversies regarding whether both completion processes use a common neural mechanism and whether perceptual filling-in is a bottom-up, feedforward process initiating at the lowest levels of the cortical visual pathway or commences at higher-tier regions. We previously examined modal completion (Murray et al., 2002) and provided evidence that the earliest modal IC sensitivity occurs within higher-tier object recognition areas of the lateral occipital complex (LOC). We further proposed that previous observations of IC sensitivity in lower-tier regions likely reflect feedback modulation from the LOC. The present study tested these proposals, examining the commonality between modal and amodal completion mechanisms with high-density electrical mapping, spatiotemporal topographic analyses, and the local autoregressive average distributed linear inverse source estimation. A common initial mechanism for both types of completion processes (140 msec) that manifested as a modulation in response strength within higher-tier visual areas, including the LOC and parietal structures, is demonstrated, whereas differential mechanisms were evident only at a subsequent time period (240 msec), with amodal completion relying on continued strong responses in these structures.
Abstract:
1. Statistical modelling is often used to relate sparse biological survey data to remotely derived environmental predictors, thereby providing a basis for predictively mapping biodiversity across an entire region of interest. The most popular strategy for such modelling has been to model distributions of individual species one at a time. Spatial modelling of biodiversity at the community level may, however, confer significant benefits for applications involving very large numbers of species, particularly if many of these species are recorded infrequently. 2. Community-level modelling combines data from multiple species and produces information on spatial pattern in the distribution of biodiversity at a collective community level instead of, or in addition to, the level of individual species. Spatial outputs from community-level modelling include predictive mapping of community types (groups of locations with similar species composition), species groups (groups of species with similar distributions), axes or gradients of compositional variation, levels of compositional dissimilarity between pairs of locations, and various macro-ecological properties (e.g. species richness). 3. Three broad modelling strategies can be used to generate these outputs: (i) 'assemble first, predict later', in which biological survey data are first classified, ordinated or aggregated to produce community-level entities or attributes that are then modelled in relation to environmental predictors; (ii) 'predict first, assemble later', in which individual species are modelled one at a time as a function of environmental variables, to produce a stack of species distribution maps that is then subjected to classification, ordination or aggregation; and (iii) 'assemble and predict together', in which all species are modelled simultaneously, within a single integrated modelling process. 
These strategies each have particular strengths and weaknesses, depending on the intended purpose of modelling and the type, quality and quantity of data involved. 4. Synthesis and applications. The potential benefits of modelling large multispecies data sets using community-level, as opposed to species-level, approaches include faster processing, increased power to detect shared patterns of environmental response across rarely recorded species, and enhanced capacity to synthesize complex data into a form more readily interpretable by scientists and decision-makers. Community-level modelling therefore deserves to be considered more often, and more widely, as a potential alternative or supplement to modelling individual species.
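As a purely illustrative sketch of the 'predict first, assemble later' strategy (ii), the fragment below models each species separately along a single environmental gradient, stacks the predicted distribution maps, and then derives community types by grouping sites with identical predicted composition. The logistic species responses, the coefficients, and the function name are assumptions for illustration, not from the paper.

```python
import numpy as np

def predict_first_assemble_later(env, coefs, threshold=0.5):
    """Strategy (ii) sketch: one model per species, then assemble.
    env:   (n_sites,) environmental gradient values.
    coefs: (n_species, 2) hypothetical intercept and slope per species.
    Returns the stacked presence maps and a community-type label per
    site (sites sharing a predicted composition share a label)."""
    # Per-species logistic response along the gradient (species x sites).
    logits = coefs[:, [0]] + coefs[:, [1]] * env
    presence = 1.0 / (1.0 + np.exp(-logits)) > threshold
    # "Assemble later": each distinct composition pattern is a community type.
    _, community = np.unique(presence.T, axis=0, return_inverse=True)
    return presence, community
```

With two species responding in opposite directions to the gradient, sites at either end of the gradient fall into different community types, with a transitional type in between.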
Abstract:
1. Species distribution modelling is used increasingly in both applied and theoretical research to predict how species are distributed and to understand attributes of species' environmental requirements. In species distribution modelling, various statistical methods are used that combine species occurrence data with environmental spatial data layers to predict the suitability of any site for that species. While the number of data sharing initiatives involving species' occurrences in the scientific community has increased dramatically over the past few years, various data quality and methodological concerns related to using these data for species distribution modelling have not been addressed adequately. 2. We evaluated how uncertainty in georeferences and associated locational error in occurrences influence species distribution modelling using two treatments: (1) a control treatment where models were calibrated with original, accurate data and (2) an error treatment where data were first degraded spatially to simulate locational error. To incorporate error into the coordinates, we moved each coordinate with a random number drawn from the normal distribution with a mean of zero and a standard deviation of 5 km. We evaluated the influence of error on the performance of 10 commonly used distributional modelling techniques applied to 40 species in four distinct geographical regions. 3. Locational error in occurrences reduced model performance in three of these regions; relatively accurate predictions of species distributions were possible for most species, even with degraded occurrences. Two species distribution modelling techniques, boosted regression trees and maximum entropy, were the best performing models in the face of locational errors. The results obtained with boosted regression trees were only slightly degraded by errors in location, and the results obtained with the maximum entropy approach were not affected by such errors. 4. Synthesis and applications. 
To use the vast array of occurrence data that exists currently for research and management relating to the geographical ranges of species, modellers need to know the influence of locational error on model quality and whether some modelling techniques are particularly robust to error. We show that certain modelling techniques are particularly robust to a moderate level of locational error and that useful predictions of species distributions can be made even when occurrence data include some error.
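The error treatment described in point 2 can be sketched in a few lines. The function name and the use of projected coordinates in kilometres are assumptions for illustration; real occurrence records in geographic degrees would first need projecting before adding noise expressed in km.

```python
import numpy as np

def degrade_occurrences(coords_km, sd_km=5.0, seed=42):
    """Error-treatment sketch: shift each occurrence coordinate by a
    random draw from a normal distribution with mean 0 and standard
    deviation 5 km, simulating locational (georeferencing) error.
    coords_km is an (n, 2) array of projected x/y positions in km."""
    rng = np.random.default_rng(seed)
    return coords_km + rng.normal(0.0, sd_km, size=coords_km.shape)
```

Calibrating one model set on the original coordinates and another on the degraded ones, then comparing performance, reproduces the control-versus-error design described above.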
Abstract:
Species distribution models (SDMs) are widely used to explain and predict species ranges and environmental niches. They are most commonly constructed by inferring species' occurrence-environment relationships using statistical and machine-learning methods. The variety of methods that can be used to construct SDMs (e.g. generalized linear/additive models, tree-based models, maximum entropy, etc.), and the variety of ways that such models can be implemented, permits substantial flexibility in SDM complexity. Building models with an appropriate amount of complexity for the study objectives is critical for robust inference. We characterize complexity as the shape of the inferred occurrence-environment relationships and the number of parameters used to describe them, and search for insights into whether additional complexity is informative or superfluous. By building 'under fit' models, having insufficient flexibility to describe observed occurrence-environment relationships, we risk misunderstanding the factors shaping species distributions. By building 'over fit' models, with excessive flexibility, we risk inadvertently ascribing pattern to noise or building opaque models. However, model selection can be challenging, especially when comparing models constructed under different modeling approaches. Here we argue for a more pragmatic approach: researchers should constrain the complexity of their models based on study objective, attributes of the data, and an understanding of how these interact with the underlying biological processes. We discuss guidelines for balancing under fitting with over fitting and consequently how complexity affects decisions made during model building. Although some generalities are possible, our discussion reflects differences in opinions that favor simpler versus more complex models. We conclude that combining insights from both simple and complex SDM building approaches best advances our knowledge of current and future species ranges.
Abstract:
The survival, physiology and gene expression profile of the phenanthrene-degrading Sphingomonas sp. LH128 was examined after an extended period of complete nutrient starvation and compared with a non-starved population that had been harvested in exponential phase. After 6 months of starvation in an isotonic solution, only 5 % of the initial population formed culturable cells. Microscopic observation of GFP fluorescent cells, however, suggested that a larger fraction of cells (up to 80 %) were still alive and apparently had entered a viable but non-culturable (VBNC) state. The strain displayed several cellular and genetic adaptive strategies to survive long-term starvation. Flow cytometry, microscopic observation and fatty acid methyl ester (FAME) analysis showed a reduction in cell size, a change in cell shape and an increase in the degree of membrane fatty acid saturation. Transcriptome analysis showed decreased expression of genes involved in ribosomal protein biosynthesis, chromosomal replication, cell division and aromatic catabolism, and increased expression of genes involved in regulation of gene expression and efflux systems, genetic translocations, and degradation of rRNA and fatty acids. These phenotypic and transcriptomic changes were not observed after 4 h of starvation. Despite starvation, polycyclic aromatic hydrocarbon (PAH) catabolic activity resumed immediately upon exposure to phenanthrene. We conclude that a large fraction of cells maintains viability after an extended period of starvation, apparently by tuning the expression of a wide variety of cellular processes. Given these survival attributes, bacteria of the genus Sphingomonas, such as strain LH128, could be considered suitable candidates for use in remediation of nutrient-poor PAH-contaminated environments.
Abstract:
This dissertation focuses on the practice of regulatory governance, through a study of the functioning of formally independent regulatory agencies (IRAs), with special attention to their de facto independence. The research goals are grounded in a "neo-positivist" (or "reconstructed positivist") position (Hawkesworth 1992; Radaelli 2000b; Sabatier 2000). This perspective starts from the ontological assumption that even if subjective perceptions are constitutive elements of political phenomena, a real world exists beyond any social construction and can, however imperfectly, become the object of scientific inquiry. Epistemologically, it follows that hypothetical-deductive theories with explanatory aims can be tested by employing a proper methodology and set of analytical techniques. It is thus possible to make scientific inferences and general conclusions to a certain extent, according to a Bayesian conception of knowledge, in order to update the prior scientific beliefs in the truth of the related hypotheses (Howson 1998), while acknowledging the fact that the conditions of truth are at least partially subjective and historically determined (Foucault 1988; Kuhn 1970). At the same time, a sceptical position is adopted towards the supposed disjunction between facts and values and the possibility of discovering abstract universal laws in social science. It has been observed that the current version of capitalism corresponds to the golden age of regulation, and that since the 1980s no government activity in OECD countries has grown faster than regulatory functions (Jacobs 1999). Following an apparent paradox, the ongoing dynamics of liberalisation, privatisation, decartelisation, internationalisation, and regional integration hardly led to the crumbling of the state, but instead promoted a wave of regulatory growth in the face of new risks and new opportunities (Vogel 1996).
Accordingly, a new order of regulatory capitalism is rising, implying a new division of labour between state and society and entailing the expansion and intensification of regulation (Levi-Faur 2005). The previous order, relying on public ownership and public intervention and/or on sectoral self-regulation by private actors, is being replaced by a more formalised, expert-based, open, and independently regulated model of governance. Independent regulatory agencies (IRAs), that is, formally independent administrative agencies with regulatory powers that benefit from public authority delegated from political decision makers, represent the main institutional feature of regulatory governance (Gilardi 2008). IRAs constitute a relatively new technology of regulation in western Europe, at least for certain domains, but they are increasingly widespread across countries and sectors. For instance, independent regulators have been set up for regulating very diverse issues, such as general competition, banking and finance, telecommunications, civil aviation, railway services, food safety, the pharmaceutical industry, electricity, environmental protection, and personal data privacy. Two attributes of IRAs deserve a special mention. On the one hand, they are formally separated from democratic institutions and elected politicians, thus raising normative and empirical concerns about their accountability and legitimacy. On the other hand, some hard questions remain unaddressed about their role as political actors, since, together with regulatory competencies, IRAs often accumulate executive, (quasi-)legislative, and adjudicatory functions, and about their performance.
Abstract:
Phenological events - defined points in the life cycle of a plant or animal - have been regarded as highly plastic traits, reflecting flexible responses to various environmental cues. The ability of a species to track, via shifts in phenological events, the abiotic environment through time might dictate its vulnerability to future climate change. Understanding the predictors and drivers of phenological change is therefore critical. Here, we evaluated evidence for phylogenetic conservatism - the tendency for closely related species to share similar ecological and biological attributes - in phenological traits across flowering plants. We aggregated published and unpublished data on timing of first flower and first leaf, encompassing 4000 species at 23 sites across the Northern Hemisphere. We reconstructed the phylogeny for the set of included species, first, using the software program Phylomatic, and second, from DNA data. We then quantified phylogenetic conservatism in plant phenology within and across sites. We show that more closely related species tend to flower and leaf at similar times. By contrasting mean flowering times within and across sites, however, we illustrate that it is not the time of year that is conserved, but rather the phenological responses to a common set of abiotic cues. Our findings suggest that species cannot be treated as statistically independent when modelling phenological responses. Synthesis. Closely related species tend to resemble each other in the timing of their life-history events, a likely product of evolutionarily conserved responses to environmental cues. The search for the underlying drivers of phenology must therefore account for species' shared evolutionary histories.
Abstract:
OBJECTIVE: To investigate the relationship between usual and acute alcohol consumption among injured patients and, when combined, how they covary with other injury attributes. METHODS: Data from a randomised sample of 486 injured patients interviewed in an emergency department (Lausanne University Hospital, Switzerland) were analysed using the chi-square test for independence and cluster analysis. RESULTS: Acute alcohol consumption (24.7%) was associated with usual drinking and particularly with high volumes of consumption. Six injury clusters were identified. Over-representations of acute consumption were found in a cluster typical of injuries sustained through interpersonal violence and in another formed by miscellaneous circumstances. A third cluster, typical of sports injuries, was linked to a group of frequent heavy episodic drinkers (without acute consumption). CONCLUSIONS: Among injured patients, acute alcohol consumption is common and associated with usual drinking. Acute and/or usual consumption form part of some, but not all, injury clusters.
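The independence test used above can be sketched in a few lines of plain numpy. The 2 × 2 table below (acute consumption by usual heavy drinking) is invented for illustration and is not the study's data; only the overall sample size and acute-consumption rate echo the abstract.

```python
import numpy as np

def chi2_statistic(table):
    """Pearson chi-square statistic for a contingency table, as used to
    test independence between acute and usual alcohol consumption.
    Compare the result to the critical value (3.84 for 1 df, alpha = .05)."""
    obs = np.asarray(table, dtype=float)
    # Expected counts under independence: outer product of margins / total.
    expected = np.outer(obs.sum(axis=1), obs.sum(axis=0)) / obs.sum()
    return float(((obs - expected) ** 2 / expected).sum())

# Hypothetical counts: rows = acute consumption (yes/no),
# columns = usual heavy drinking (yes/no).
table = [[60, 60], [40, 326]]
```

A statistic well above the critical value would, as in the study, indicate that acute consumption is not independent of usual drinking.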
Abstract:
Mammalian genomes contain highly conserved sequences that are not functionally transcribed. These sequences are single copy and comprise approximately 1-2% of the human genome. Evolutionary analysis strongly supports their functional conservation, although their potentially diverse, functional attributes remain unknown. It is likely that genomic variation in conserved non-genic sequences is associated with phenotypic variability and human disorders. So how might their function and contribution to human disorders be examined?
Abstract:
To provide nursing practice with evidence, it is important to understand nursing phenomena in detail. Therefore, good descriptions are needed, including the identification of the characteristics and attributes of nursing phenomena at various levels of abstraction, i.e., concepts. In this article, the significance of concept development for nursing science is demonstrated by drawing on the example of 'transitoriness'. The evolutionary concept analysis proposed by Rodgers (2000) is introduced in more detail and applied to this phenomenon. The phenomenon's characteristics and attributes are identified, as well as potential areas of application. Moreover, areas are outlined in which interventions for nursing practice can be developed, implemented and evaluated. Thus, nursing practice is updated to include new findings and innovations. Through concept analysis, nursing phenomena can be described in more detail, enhanced or broadened for use in nursing practice. Such structured processes can be employed successfully for other nursing phenomena, and concept analyses can lead to the identification and concretisation of tasks for the respective scientific discipline and its professionals.