37 results for Scale Development

at Université de Lausanne, Switzerland


Relevance: 100.00%

Abstract:

Careers today increasingly require engagement in proactive career behaviors; however, there is a lack of validated measures assessing the general degree to which somebody engages in such behaviors. We describe the results of six studies with six independent samples of German university students (total N = 2,854), working professionals (total N = 561), and university graduates (N = 141) that report the development and validation of the Career Engagement Scale, a measure of the degree to which somebody is proactively developing her or his career as expressed by diverse career behaviors. The studies provide support for measurement invariance across gender and time. In support of convergent and discriminant validity, we find that career engagement is more prevalent among working professionals than among university students and that the scale has incremental validity above several specific career behaviors in its relation to vocational identity clarity and career self-efficacy beliefs among students and to job and career satisfaction among employees. In support of incremental predictive validity, beyond the effects of several more specific career behaviors, career engagement while at university predicts higher job and career satisfaction several months later, after the transition to work.
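The incremental-validity test described above can be sketched as a hierarchical regression in which the Career Engagement Scale score is added on top of the specific career behaviors. The code below is a minimal, hypothetical illustration with invented file and variable names, not the authors' actual analysis.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: one row per respondent, columns are scale scores.
df = pd.read_csv("career_survey.csv")

# Step 1: specific career behaviors only.
base = smf.ols("career_satisfaction ~ networking + planning + skill_development",
               data=df).fit()
# Step 2: add the overall career engagement score.
full = smf.ols("career_satisfaction ~ networking + planning + skill_development"
               " + career_engagement", data=df).fit()

print(f"Delta R^2 = {full.rsquared - base.rsquared:.3f}")
print(full.compare_f_test(base))  # F-test of the R^2 increment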

Relevance: 80.00%

Abstract:

OBJECTIVE: The aim of this article was to apply psychometric theory to develop and validate a visual grading scale for assessing the visual perception of digital image quality for anteroposterior (AP) pelvis radiographs. METHODS: Psychometric theory was used to guide scale development. Seven phantom and seven cadaver images of visually and objectively predetermined quality were used to help assess scale reliability and validity. A total of 151 volunteers scored the phantom images, and 184 volunteers scored the cadaver images. Factor analysis and Cronbach's alpha were used to assess scale validity and reliability. RESULTS: A 24-item scale was produced. Aggregated mean volunteer scores for each image correlated with the rank order of the visually and objectively predetermined image qualities. Scale items had good inter-item correlation (≥0.2) and high factor loadings (≥0.3). Cronbach's alpha revealed that the scale has acceptable internal reliability for both phantom and cadaver images (α = 0.8 and 0.9, respectively). Factor analysis suggested that the scale is multidimensional, i.e. it assesses multiple image quality themes. CONCLUSION: This study represents the first full development and validation of a visual image quality scale using psychometric theory. The scale is likely to have clinical, training and research applications. ADVANCES IN KNOWLEDGE: This article presents data used to create and validate a visual grading scale for radiographic examinations. The visual grading scale for AP pelvis examinations can act as a validated tool for future research, teaching and clinical evaluations of image quality.
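As a minimal sketch of the reliability step reported above, Cronbach's alpha can be computed directly from an items-by-respondents score matrix. The data below are simulated placeholders, not the study's scores.

import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: 2-D array, rows = respondents, columns = scale items."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
demo = rng.integers(1, 6, size=(150, 24)).astype(float)  # 24-item scale, Likert 1-5
print(round(cronbach_alpha(demo), 2))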

Relevance: 70.00%

Abstract:

Although brand authenticity is gaining increasing interest in consumer behavior research and managerial practice, the literature on its measurement and contribution to branding theory is still limited. This article develops an integrative framework of the concept of brand authenticity and reports the development and validation of a scale measuring consumers' perceived brand authenticity (PBA). A multi-phase scale development process resulted in a 15-item PBA scale measuring four dimensions: credibility, integrity, symbolism, and continuity. The scale is reliable across different brands and cultural contexts. We find that brand authenticity perceptions are influenced by indexical, existential, and iconic cues, with part of the latter's influence moderated by consumers' level of marketing skepticism. Results also suggest that PBA increases emotional brand attachment and word-of-mouth, and that it drives brand choice likelihood through self-congruence for consumers high in self-authenticity.
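The reported moderation (marketing skepticism moderating the effect of iconic cues) is typically tested with an interaction term in a regression. The sketch below is a hypothetical illustration with invented column names, not the authors' model.

import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pba_study.csv")  # assumed per-respondent scores
model = smf.ols("pba ~ iconic_cues * skepticism + indexical_cues + existential_cues",
                data=df).fit()
print(model.params)
print(model.pvalues["iconic_cues:skepticism"])  # the interaction term carries the moderation test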

Relevance: 60.00%

Abstract:

Malignant cells are frequently recognized and destroyed by T cells, hence the development of T cell vaccines against established tumors. The challenge is to induce protective type 1 immune responses, with efficient Th1 and CTL activation, and long-term immunological memory. These goals are similar to those in many infectious diseases, where successful immune protection is ideally induced with live vaccines. However, large-scale development of live vaccines is prevented by their very limited availability and by vector immunogenicity. Synthetic vaccines have multiple advantages. Each of their components (antigens, adjuvants, delivery systems) contributes specifically to the induction and maintenance of T cell responses. Here we summarize current experience with vaccines based on protein and peptide antigens, and discuss approaches for the molecular characterization of clonotypic T cell responses. With carefully designed step-by-step modifications of innovative vaccine formulations, T cell vaccination can be optimized towards the goal of inducing therapeutic immune responses in humans.

Relevance: 40.00%

Abstract:

Plants are essential for human society, because our daily food, construction materials and sustainable energy are derived from plant biomass. Yet, despite this importance, the multiple developmental aspects of plants are still poorly understood and represent a major challenge for science. With the emergence of high-throughput devices for genome sequencing and high-resolution imaging, data have never been so easy to collect, generating huge amounts of information. Computational analysis is one way to integrate those data and to reduce the apparent complexity towards an appropriate scale of abstraction, with the aim of eventually providing new answers and directing further research perspectives. This is the motivation behind this thesis work, i.e. the application of descriptive and predictive analytics combined with computational modeling to answer problems that revolve around morphogenesis at the subcellular and organ scale.

One of the goals of this thesis is to elucidate how the auxin-brassinosteroid phytohormone interaction determines cell growth in the root apical meristem of Arabidopsis thaliana (Arabidopsis), the plant model of reference for molecular studies. The pertinent information about signaling protein relationships was obtained from the literature to reconstruct the entire hormonal crosstalk. Due to a lack of quantitative information, we employed a qualitative modeling formalism. This work confirmed the synergistic effect of the hormonal crosstalk on cell elongation, explained some of our paradoxical mutant phenotypes and predicted a novel interaction between the BREVIS RADIX (BRX) protein and the transcription factor MONOPTEROS (MP), which turned out to be critical for the maintenance of the root meristem.

On the same subcellular scale, another study in the monocot model Brachypodium distachyon (Brachypodium) revealed an alternative wiring of auxin-ethylene crosstalk as compared to Arabidopsis. In the latter, increasing interference with auxin biosynthesis results in progressively shorter roots. By contrast, a hypomorphic Brachypodium mutant isolated in this study in an enzyme of the auxin biosynthesis pathway displayed a dramatically longer seminal root. Our morphometric analysis confirmed that more anisotropic cells (thinner and longer) are principally responsible for the mutant root phenotype. Further characterization pointed towards an inverted regulatory logic in the relation between ethylene signaling and auxin biosynthesis in Brachypodium as compared to Arabidopsis, which explains the phenotypic discrepancy.

Finally, the morphometric analysis of hypocotyl secondary growth applied in this work was performed with the image-processing pipeline of our quantitative histology method. During its secondary growth, the hypocotyl reorganizes its primary bilateral symmetry to a radial symmetry of highly specialized tissues comprising several thousand cells, starting from a few dozen. However, such a scale only permits observations in thin cross-sections, severely hampering a comprehensive analysis of the morphodynamics involved. Our quantitative histology strategy overcomes this limitation. We acquired hypocotyl cross-sections from tiled high-resolution images and extracted their information content using custom high-throughput image processing and segmentation. Coupled with an automated cell-type recognition algorithm, this allows precise quantitative characterization of vascular development and reveals developmental patterns that were not evident from visual inspection, for example the steady interspace distance of the phloem poles. Further analyses indicated a change in growth anisotropy of cambial and phloem cells, which appeared in phase with the expansion of the xylem. Combining genetic tools and computational modeling, we showed that the reorientation of the growth anisotropy axis of peripheral tissue layers only occurs when the growth rate of the central tissues is higher than that of the periphery. This was confirmed by calculating the ratio of xylem to phloem growth rates throughout secondary growth: high ratios are indeed observed, concomitant with the homogenization of cambium anisotropy. These results suggest a self-organization mechanism, promoted by a gradient of division in the cambium, that generates a pattern of mechanical constraints. This, in turn, reorients the growth anisotropy of peripheral tissues to sustain secondary growth.
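The qualitative (logical) modeling formalism mentioned above can be illustrated with a toy synchronous Boolean network. The nodes and update rules below are invented placeholders for illustration only, not the thesis's actual auxin-brassinosteroid model.

# Toy synchronous Boolean network: each node is True/False and is updated
# from the previous state. Rules are simplified placeholders.
def step(state):
    return {
        "auxin_input": state["auxin_input"],    # external inputs held fixed
        "br_input": state["br_input"],
        "auxin_signal": state["auxin_input"],
        "br_signal": state["br_input"],
        # synergy: growth requires both hormone signals
        "cell_growth": state["auxin_signal"] and state["br_signal"],
    }

state = {"auxin_input": True, "br_input": True,
         "auxin_signal": False, "br_signal": False, "cell_growth": False}
for _ in range(4):                      # iterate to a fixed point
    state = step(state)
print(state["cell_growth"])             # True only when both inputs are on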

Relevance: 30.00%

Abstract:

Introduction: Fragile X syndrome (FXS) is the most common inherited cause of intellectual disability. With no curative treatment available, current therapeutic approaches are aimed at symptom management. FXS is caused by silencing of the FMR1 gene, which encodes FMRP; loss of FMRP leads to the development of the symptoms associated with FXS. Areas covered: In this evaluation, the authors examine the role of the metabotropic glutamate receptor 5 (mGluR5) in the pathophysiology of FXS and its suitability as a target for rescuing the disease state. Furthermore, the authors review the evidence from preclinical studies of pharmacological interventions targeting mGluR5 in FXS. Lastly, the authors assess the findings from clinical studies in FXS, in particular the use of the Aberrant Behavior Checklist-Community Edition (ABC-C) and the recently developed ABC-C for FXS scale as clinical endpoints to assess disease modification in this patient population. Expert opinion: There is cautious optimism for the successful treatment of the core behavioral and cognitive symptoms of FXS, based on preclinical data in animal models and early studies in humans. However, the association between heightened mGluR5 responsiveness and the clinical phenotype in humans remains to be demonstrated. Many questions regarding the optimal treatment and outcome measures of FXS remain unanswered.

Relevance: 30.00%

Abstract:

Career adapt-ability has recently gained momentum as a psychosocial construct that not only has much to offer the field of career development, but also contributes to positive coping, adjustment and self-regulation through its four dimensions of concern, control, curiosity and confidence. The positive psychology movement, with concepts such as the orientations to happiness, explores the factors that contribute to human flourishing and optimal functioning. This research makes two main contributions: (1) it validates a German version of the Career Adapt-Abilities Scale (CAAS), and (2) it extends the contribution of adapt-abilities to the field of work stress and explores their mediating capacity in the relation between orientations to happiness and work stress. We used a representative sample of the German-speaking Swiss working population comprising 1,204 participants (49.8% women) aged between 26 and 56 (mean age = 42.04). Results indicated that the German version of the CAAS is valid, with overall high levels of model fit suggesting that the conceptual structure of career adapt-ability replicates well in this cultural context. Adapt-abilities showed a negative relationship with work stress and a positive one with orientations to happiness. The engagement and pleasure scales of orientations to happiness also correlated negatively with work stress. Moreover, career adapt-ability mediates the relationship between orientations to happiness and work stress. In-depth analysis of the mediating effect revealed that control is the only significant mediator. Thus, control may act as a mechanism through which individuals attain their desired life at work, subsequently contributing to reduced stress levels.
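A mediation result of this kind is commonly quantified as a product-of-coefficients indirect effect with a percentile bootstrap. The sketch below uses illustrative file and column names, not the study's data or exact analysis.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("caas_study.csv")  # assumed columns: happiness, control, work_stress

def indirect_effect(d):
    a = smf.ols("control ~ happiness", data=d).fit().params["happiness"]
    b = smf.ols("work_stress ~ control + happiness", data=d).fit().params["control"]
    return a * b  # mediated (indirect) effect

rng = np.random.default_rng(1)
boot = []
for _ in range(2000):
    idx = rng.integers(0, len(df), len(df))   # resample respondents with replacement
    boot.append(indirect_effect(df.iloc[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {indirect_effect(df):.3f}, 95% bootstrap CI [{lo:.3f}, {hi:.3f}]")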

Relevance: 30.00%

Abstract:

The aim of this paper is to present McHale's coparenting scale, a self-administered questionnaire enabling assessment of the quality of coparenting, and the first steps in the structural and construct validation of its French version. A total of 41 French-speaking Swiss families and 84 US families completed this questionnaire and the Dyadic Adjustment Scale, a measure of marital satisfaction. The results for the Swiss families correspond to those for the US families: first, items distributed into four factors (family integrity, conflict, affection and disparagement) and, second, a partial link was found between quality of coparenting and marital adjustment. This finding supports the construct validity of the questionnaire, reflecting the established link between these two family sub-systems. Given that coparenting quality has a major influence on children's socio-affective development, the questionnaire will be useful for assessing not just negative features of coparenting, such as conflict and disparagement, but also positive components such as warmth and support. This will be an important asset for research as well as clinical purposes.
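The four-factor structure reported above is the kind of result an exploratory factor analysis produces. The sketch below extracts four varimax-rotated factors from a hypothetical item-response file; it is an illustration, not the study's analysis.

import pandas as pd
from sklearn.decomposition import FactorAnalysis

items = pd.read_csv("coparenting_items.csv")          # respondents x questionnaire items (hypothetical)
fa = FactorAnalysis(n_components=4, rotation="varimax")
fa.fit(items.values)
loadings = pd.DataFrame(fa.components_.T, index=items.columns,
                        columns=[f"F{i+1}" for i in range(4)])
print(loadings.round(2))  # inspect which items load on which factor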

Relevance: 30.00%

Abstract:

AIM: Although acute pain is frequently reported by patients admitted to the emergency room, it is often insufficiently evaluated by physicians and is thus undertreated. With the aim of improving the care of adult patients with acute pain, we developed and implemented abbreviated clinical practice guidelines (CG) for the nursing and medical staff of our hospital's emergency room. METHODS: Our algorithm is based on practices described in the international literature and treats acute pain rapidly and effectively in parallel with diagnostic and therapeutic procedures. RESULTS: Pain was assessed using either a visual analogue scale (VAS) or a numerical rating scale (NRS) at emergency room admission and again during the hospital stay. Patients were treated with paracetamol and/or an NSAID (VAS/NRS < 4) or intravenous morphine (VAS/NRS ≥ 4). The algorithm also outlines a specific approach for patients with headache to minimise the risks inherent in non-specific treatment. In addition, it addresses the treatment of paroxysmal pain in patients with chronic pain as well as acute pain in drug addicts, and outlines measures for pain prevention prior to minor diagnostic or therapeutic procedures. CONCLUSIONS: Based on published guidelines, an abbreviated clinical algorithm (AA) was developed, and its simple format permitted widespread implementation. In contrast to international guidelines, our algorithm gives nursing staff responsibility for decision-making aspects of pain assessment and treatment in emergency room patients.
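The VAS/NRS threshold logic described above can be sketched as a simple decision rule. This is a simplified illustration of the published cut-off only, not the full clinical algorithm and not medical guidance.

def first_line_analgesia(pain_score: int) -> str:
    """pain_score: VAS/NRS value between 0 and 10 (illustrative rule only)."""
    if not 0 <= pain_score <= 10:
        raise ValueError("VAS/NRS score must be between 0 and 10")
    if pain_score < 4:
        return "paracetamol and/or NSAID"
    return "intravenous morphine"

print(first_line_analgesia(3))  # -> paracetamol and/or NSAID
print(first_line_analgesia(7))  # -> intravenous morphine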

Relevance: 30.00%

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields. The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive, low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography. The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity and the larger-scale trend of the prevailing hydraulic conductivity field. The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges.

In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems, in order to allow for a more accurate and realistic quantification of the uncertainty associated with the thus inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigate two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution. The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach proves highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
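The gradual-deformation idea mentioned above is commonly written as a cosine/sine combination of the current realization with an independent draw, which preserves the Gaussian prior statistics. The sketch below illustrates that generic mechanism under the stated assumption of a standard-normal model vector; it is not the thesis's implementation.

import numpy as np

def gradual_deformation_step(m_current, rng, theta=0.1):
    """Propose a perturbed model: cos(theta)*current + sin(theta)*new draw.

    For standard-normal m_current, the proposal is again standard normal, so the
    prior is preserved; theta controls the perturbation strength (theta -> 0 is a
    tiny step, theta = pi/2 is an independent redraw).
    """
    m_new_draw = rng.standard_normal(m_current.shape)
    return np.cos(theta) * m_current + np.sin(theta) * m_new_draw

rng = np.random.default_rng(42)
m = rng.standard_normal(10_000)                   # current model realization
proposal = gradual_deformation_step(m, rng, theta=0.2)
print(round(proposal.std(), 2))                   # ~1: prior statistics preserved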

Relevance: 30.00%

Abstract:

Every year, debris flows cause huge damage in mountainous areas. Due to population pressure in hazardous zones, the socio-economic impact is much higher than in the past. Therefore, the development of indicative susceptibility hazard maps is of primary importance, particularly in developing countries. However, the complexity of the phenomenon and the variability of local controlling factors limit the use of process-based models for a first assessment. A debris flow model has been developed for regional susceptibility assessments using a digital elevation model (DEM) and a GIS-based approach. The automatic identification of source areas and the estimation of debris flow spreading, based on GIS tools, provide a substantial basis for a preliminary susceptibility assessment at a regional scale. One of the main advantages of this model is its workability: everything is open to the user, from the choice of data to the selection of the algorithms and their parameters. The Flow-R model was tested in three different contexts, two in Switzerland and one in Pakistan, for indicative susceptibility hazard mapping. The quality of the DEM proved to be the most important parameter for obtaining reliable propagation results, but also for identifying potential debris flow sources.
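A slope-weighted multiple-flow-direction step gives a feel for how GIS-based spreading over a DEM can work. The toy code below is an assumption-laden sketch of that general idea only, not the Flow-R implementation or its actual algorithms.

import numpy as np

def spread_step(dem, susceptibility, exponent=4.0):
    """Distribute susceptibility from each cell to lower neighbours,
    weighted by (elevation drop)**exponent (illustrative rule)."""
    rows, cols = dem.shape
    out = np.zeros_like(susceptibility)
    for r in range(rows):
        for c in range(cols):
            if susceptibility[r, c] == 0.0:
                continue
            weights = {}
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    nr, nc = r + dr, c + dc
                    if (dr, dc) == (0, 0) or not (0 <= nr < rows and 0 <= nc < cols):
                        continue
                    drop = dem[r, c] - dem[nr, nc]
                    if drop > 0:
                        weights[(nr, nc)] = drop ** exponent
            if not weights:
                continue  # local sink: nothing propagates further
            total = sum(weights.values())
            for (nr, nc), w in weights.items():
                out[nr, nc] += susceptibility[r, c] * w / total
    return out

dem = np.add.outer(np.arange(5.0, 0.0, -1.0), np.zeros(5))  # simple tilted plane
sources = np.zeros_like(dem)
sources[0, 2] = 1.0                                         # one source cell
print(spread_step(dem, sources).round(2))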

Relevance: 30.00%

Abstract:

Forests are key ecosystems of the earth and are associated with a large range of functions. Many of these functions are beneficial to humans and are referred to as ecosystem services. Sustainable development requires that all relevant ecosystem services are quantified, managed and monitored equally. Natural resource management therefore targets the services associated with ecosystems. The main hypothesis of this thesis is that the spatial and temporal domains of relevant services do not correspond to a discrete forest ecosystem. As a consequence, the services are not quantified, managed and monitored in an equal and sustainable manner. The aims of the thesis were therefore to test this hypothesis, establish an improved conceptual approach and provide spatial applications for the relevant land cover and structure variables. The study was carried out in western Switzerland and based primarily on data from a countrywide landscape inventory. This inventory is part of the third Swiss national forest inventory and assesses continuous landscape variables based on a regular sampling of true-colour aerial imagery. In addition, land cover variables were derived from Landsat 5 TM passive sensor data, and land structure variables from active sensor data acquired with a small-footprint laser scanning system. The results confirmed the main hypothesis, as relevant services did not scale well with the forest ecosystem. Instead, a new conceptual approach for the sustainable management of natural resources was described. This concept quantifies the services as a continuous function of the landscape, rather than as a discrete function of the forest ecosystem. The explanatory landscape variables are therefore called continuous fields, and the forest becomes a dependent and function-driven management unit. Continuous-field mapping methods were established for land cover and structure variables. In conclusion, the discrete forest ecosystem is an adequate planning and management unit. However, monitoring the state of and trends in the sustainability of services requires them to be quantified as a continuous function of the landscape. Sustainable natural resource management therefore iteratively combines the ecosystem and gradient approaches.
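Continuous-field mapping of this kind is often implemented as a per-pixel regression of a continuous variable on spectral bands. The sketch below is a hypothetical example with invented file, band and target names, not the thesis's method or data.

import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

pixels = pd.read_csv("training_pixels.csv")       # sampled plots with band values and a cover fraction
bands = ["b1", "b2", "b3", "b4", "b5", "b7"]      # Landsat 5 TM reflective bands
X_train, X_test, y_train, y_test = train_test_split(
    pixels[bands], pixels["canopy_cover"], test_size=0.3, random_state=0)

model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(X_train, y_train)
print(f"R^2 on held-out plots: {model.score(X_test, y_test):.2f}")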

Relevance: 30.00%

Abstract:

Petroleum hydrocarbons are common contaminants in marine and freshwater aquatic habitats, often occurring as a result of oil spillage. Rapid and reliable on-site tools for measuring the bioavailable hydrocarbon fractions, i.e., those that are most likely to cause toxic effects or are available for biodegradation, would assist in assessing potential ecological damage and following the progress of cleanup operations. Here we examined the suitability of a set of different rapid bioassays (2-3 h) using bacteria expressing the LuxAB luciferase to measure the presence of short-chain linear alkanes, monoaromatic and polyaromatic compounds, biphenyls, and DNA-damaging agents in seawater after a laboratory-scale oil spill. Five independent spills of 20 mL of NSO-1 crude oil with 2 L of seawater (North Sea or Mediterranean Sea) were carried out in 5 L glass flasks for periods of up to 10 days. Bioassays readily detected ephemeral concentrations of short-chain alkanes and BTEX (i.e., benzene, toluene, ethylbenzene, and xylenes) in the seawater within minutes to hours after the spill, increasing to a maximum of up to 80 µM within 6-24 h, after which they decreased to low or undetectable levels. The strong decrease in short-chain alkanes and BTEX may have been due to their volatilization or biodegradation, which was supported by changes in the microbial community composition. Two- and three-ring PAHs appeared in the seawater phase after 24 h with a concentration up to 1 µM naphthalene equivalents and remained above 0.5 µM for the duration of the experiment. DNA-damage-sensitive bioreporters did not produce any signal with the oil-spilled aqueous-phase samples, whereas bioassays for (hydroxy)biphenyls showed occasional responses. Chemical analysis for alkanes and PAHs in contaminated seawater samples supported the bioassay data, but did not show the typical ephemeral peaks observed with the bioassays. We conclude that bacterium-based bioassays can be a suitable alternative for rapid on-site quantitative measurement of hydrocarbons in seawater.

Relevance: 30.00%

Abstract:

INTRODUCTION: Anhedonia is defined as a diminished capacity to experience pleasant emotion and is commonly included among the negative symptoms of schizophrenia. However, although patients report experiencing a lower level of pleasure than controls on retrospective questionnaires, they report experiencing as much pleasure as controls when emotion is measured online (in the moment). OBJECTIVE: The Temporal Experience of Pleasure Scale (TEPS) measures pleasure experienced in the moment and in anticipation of future activities. The TEPS is an 18-item self-report measure of anticipatory (10 items) and consummatory (eight items) pleasure. The goal of this paper is to assess the psychometric characteristics of the French translation of this scale. METHODS: The control sample was composed of 60 women and 22 men with a mean age of 38.1 years (S.D.: 10.8); 36 had no professional qualification and 46 held a professional diploma. A sample of 21 patients meeting DSM-IV-TR criteria for schizophrenia was recruited from the community psychiatry service of the Department of Psychiatry in Lausanne. There were five women and 16 men, with a mean age of 34.1 years (S.D.: 7.5); ten had a professional qualification and 11 did not. None worked in competitive employment. Their mean dose in chlorpromazine equivalents was 431 mg (S.D.: 259), and all patients were on atypical antipsychotics. The control sample completed the TEPS and the Physical Anhedonia Scale (PAS). The patient sample completed the TEPS and was independently rated on the Calgary Depression Scale and the Scale for the Assessment of Negative Symptoms (SANS). For comparison with controls, patients were matched on age, sex and professional qualification, which required the supplementary recruitment of two control subjects. RESULTS: Results with the control sample indicate that the TEPS presents acceptable internal validity, with Cronbach's alphas of 0.84 for the total scale, 0.74 for the anticipatory pleasure scale and 0.79 for the consummatory pleasure scale. The confirmatory factor analysis indicated that the model fits our data well (χ²/df = 1.333; df = 134; p < 0.0006; root mean square error of approximation, RMSEA = 0.064). External validity measured with the PAS showed r = -0.27 (p < 0.05) for the consummatory scale and r = -0.26 for the total score. Comparisons between patients and matched controls indicated that patients scored significantly lower than controls on anticipatory pleasure (t = 2.7, df = 40, two-tailed p = 0.01; Cohen's d = 0.83) and on the total TEPS score (t = 2.8, df = 40, two-tailed p = 0.01; Cohen's d = 0.87). The two samples did not differ on consummatory pleasure. The anticipatory pleasure factor and the total TEPS showed significant negative correlations with SANS anhedonia: r = -0.78 (p < 0.01) for the anticipatory factor and r = -0.61 (p < 0.01) for the total TEPS. There was also a negative correlation of r = -0.50 (p < 0.05) between the anticipatory factor and SANS avolition. These correlations were maintained in partial correlations controlling for depression and chlorpromazine equivalents. CONCLUSION: The results of this validation show that the French version of the TEPS has psychometric characteristics similar to those of the original version. These results highlight the discrepancy between direct and indirect reports of experienced pleasure in patients with schizophrenia. Patients may have difficulty anticipating the pleasure of future enjoyable activities, but not experiencing pleasure once engaged in an enjoyable activity. Medication and depression do not seem to modify our results, but this should be better controlled in a longitudinal study. The anticipatory versus consummatory pleasure distinction appears useful for the development of new psychosocial interventions tailored to improve desire in patients suffering from schizophrenia. The major limitations of the study are the small size of the patient sample and the under-representation of men in the control sample.
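Two of the statistics reported above, Cohen's d for the patient-control comparison and a partial correlation controlling for a covariate, can be sketched as follows. The data below are simulated placeholders, not the study's values.

import numpy as np

def cohens_d(a, b):
    """Standardized mean difference with a pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_sd = np.sqrt(((na - 1) * np.var(a, ddof=1) + (nb - 1) * np.var(b, ddof=1))
                        / (na + nb - 2))
    return (np.mean(a) - np.mean(b)) / pooled_sd

def partial_corr(x, y, z):
    """Correlation of x and y after regressing out covariate z from both."""
    rx = x - np.polyval(np.polyfit(z, x, 1), z)
    ry = y - np.polyval(np.polyfit(z, y, 1), z)
    return np.corrcoef(rx, ry)[0, 1]

rng = np.random.default_rng(3)
controls = rng.normal(40, 8, 21)                     # simulated anticipatory-pleasure scores
patients = rng.normal(33, 8, 21)
print(round(cohens_d(controls, patients), 2))

anhedonia = -0.5 * patients + rng.normal(0, 4, 21)   # simulated SANS anhedonia ratings
depression = rng.normal(5, 2, 21)                    # simulated depression scores
print(round(partial_corr(patients, anhedonia, depression), 2))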