137 results for Time Use
Abstract:
PURPOSE: Not in Education, Employment, or Training (NEET) youth are disengaged from major social institutions and constitute a worrying concern, yet little is known about this subgroup of vulnerable youth. This study aimed to examine whether NEET youth differ from their contemporaries in terms of personality, mental health, and substance use, and to provide a longitudinal examination of NEET status, testing its stability and its prospective pathways with mental health and substance use. METHODS: As part of the Cohort Study on Substance Use Risk Factors, 4,758 young Swiss men in their early 20s answered questions concerning their current professional and educational status, personality, substance use, and symptomatology related to mental health. Descriptive statistics, generalized linear models for cross-sectional comparisons, and cross-lagged panel models for longitudinal associations were computed. RESULTS: NEET youth represented 6.1% of the sample at baseline and 7.4% at follow-up, with 1.4% being NEET at both time points. Comparisons between NEET and non-NEET youth showed significant differences in substance use and depressive symptoms only. Longitudinal associations showed that previous mental health problems, cannabis use, and daily smoking increased the likelihood of being NEET; the reverse causal paths were nonsignificant. CONCLUSIONS: NEET status appeared to be uncommon and transient among young Swiss men, and was associated with differences in mental health and substance use but not in personality. The causal paths presented NEET status as a consequence of mental health problems and substance use rather than a cause. Additionally, this study confirmed that cannabis use and daily smoking are public health problems. Prevention programs need to focus on these vulnerable youth to prevent them from becoming disengaged.
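The cross-sectional step described in the METHODS relies on generalized linear models with NEET status as the binary outcome. As a rough, hypothetical sketch of that kind of model (simulated data and illustrative variable names, not the C-SURF dataset or the authors' code), a binomial GLM in Python could be set up as follows:

```python
# Hypothetical sketch of the cross-sectional step: a binomial GLM relating
# NEET status to substance use and depressive symptoms. Variable names and
# data are illustrative, not taken from the C-SURF study.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 4758  # sample size reported in the abstract
df = pd.DataFrame({
    "daily_smoking": rng.integers(0, 2, n),
    "cannabis_use": rng.integers(0, 2, n),
    "depressive_symptoms": rng.normal(0, 1, n),
})
# Simulated outcome with a small positive effect of each predictor.
logit = -2.8 + 0.4 * df["daily_smoking"] + 0.5 * df["cannabis_use"] + 0.3 * df["depressive_symptoms"]
df["neet"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(df[["daily_smoking", "cannabis_use", "depressive_symptoms"]])
model = sm.GLM(df["neet"], X, family=sm.families.Binomial()).fit()
print(model.summary())
print(np.exp(model.params))  # coefficients expressed as odds ratios
```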
Abstract:
Research has demonstrated that landscape- or watershed-scale processes can influence instream aquatic ecosystems through the delivery of fine sediment, solutes and organic matter. Testing such impacts upon populations of organisms (i.e. at the catchment scale) has not proven straightforward, and different studies have reached different conclusions. This is (1) partly because different studies have focused upon different scales of enquiry, but also (2) because the emphasis upon upstream land cover has rarely addressed the extent to which such land covers are hydrologically connected, and hence able to deliver diffuse pollution, to the drainage network. However, there is a third issue. In order to develop suitable hydrological models, we need to conceptualise the process cascade. To do this, we need to know what matters to the organism being impacted by the hydrological system, such that we can identify which processes need to be modelled. Acquiring such knowledge is not easy, especially for organisms like fish that might occupy very different locations in the river over relatively short periods of time. However, and inevitably, hydrological modellers have started by building up piecemeal the aspects of the problem that we think matter to fish. Herein, we report two developments: (a) for the case of sediment-associated diffuse pollution from agriculture, a risk-based modelling framework, SCIMAP, has been developed, which is distinct because of its explicit focus upon hydrological connectivity; and (b) we use spatially distributed ecological data to infer the processes, and the associated process parameters, that matter to salmonid fry. We apply the model to spatially distributed salmonid fry data from the River Eden, Cumbria, England. The analysis shows, quite surprisingly, that arable land covers are relatively unimportant as drivers of fry abundance. What matters most is intensive pasture, a land cover that could be associated with a number of stressors on salmonid fry (e.g. pesticides, fine sediment) and which allows us to identify a series of risky field locations where this land cover is readily connected to the river system by overland flow.
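SCIMAP itself is not reproduced here, but the core idea of connectivity-weighted risk can be illustrated with a deliberately simplified toy: each field contributes to in-stream diffuse pollution risk only in proportion to how hydrologically connected it is to the channel. The land-cover risk weights, connectivity values and field areas below are invented for the example.

```python
# Toy illustration (not the SCIMAP code) of a connectivity-weighted diffuse
# pollution risk: a field's land-cover risk only counts if the field is
# hydrologically connected to the river. All numbers are hypothetical.
import numpy as np

land_cover = np.array(["arable", "intensive_pasture", "woodland", "intensive_pasture"])
risk_weight = {"arable": 0.6, "intensive_pasture": 0.9, "woodland": 0.05}  # assumed weights
connectivity = np.array([0.1, 0.8, 0.3, 0.7])   # 0 = never connected, 1 = always connected
area_ha = np.array([12.0, 8.0, 20.0, 5.0])      # field areas in hectares

field_risk = np.array([risk_weight[c] for c in land_cover]) * connectivity * area_ha
print("Riskiest fields first:", np.argsort(field_risk)[::-1])
print("Total in-stream risk loading:", round(field_risk.sum(), 2))
```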
Abstract:
AIM: To assess whether repeating a grade was associated with drug use among adolescents after controlling for personal, family and school-related variables, and whether there were differences between students in mandatory and post-mandatory school. METHODS: Data were drawn from the Catalonia Adolescent Health Survey, a cross-sectional study of in-school adolescents aged 14-19 y. The index group included 366 subjects who were repeating a grade at the time the survey was carried out (old-for-grade, OFG). A control group, matched by gender and school and drawn from the grade immediately above, was randomly chosen from among all the subjects who had never repeated a grade. All statistically significant variables in the bivariate analysis were included in a multivariate analysis. In a second step, all analyses were repeated for students in mandatory (14-16 y) and post-mandatory (17-19 y) school. RESULTS: After controlling for background variables, subjects in the index group were more likely to perceive that most of their peers were using synthetic drugs, to have ever used synthetic drugs themselves, to have bad grades and to have a worse relationship with their teachers. OFG students in mandatory school were more likely to have divorced parents, bad grades and to have ever used synthetic drugs, whereas they were less likely to be regular drinkers. OFG students in post-mandatory school were more likely to have below-average grades, to be regular smokers and to perceive that most of their peers used synthetic drugs. CONCLUSIONS: When background variables are taken into consideration, the relationship between repeating a grade and drug use becomes less clear. By increasing the familial and academic support of adolescents with academic underachievement, we could reduce their drug consumption.
Abstract:
BACKGROUND: In Switzerland, nurses are allowed to prescribe and administer morphine in emergency situations without a doctor. Still, nurses and other health professionals are often reluctant to prescribe and administer morphine for pain management in patients. No valid French-language instrument is available in Switzerland to assess the attitudes of nurses and other health professionals towards the prescription and administration of morphine. In this study, we evaluated the psychometric properties of the French version of the questionnaire "Attitudes towards morphine use". METHODS: The instrument was derived from an Italian version; forward and back translations of the questionnaire were performed. Item analysis and construct validity were assessed between April and December 2010 in a cross-sectional study including five Swiss hospitals and a sample of 588 health professionals (533 nurses; mean age 38.3 ± 10.2 years). Thirty subjects participated in the test-retest reliability assessment. RESULTS: The time to complete the instrument ranged between 12 and 15 minutes, and neither floor nor ceiling effects were found. The initial 24-item instrument showed an intraclass correlation coefficient (ICC) of 0.69 (95% CI: 0.64 to 0.73, P < 0.001) and a Cronbach's α of 0.700. Factor analysis led to a six-component solution explaining 52.4% of the total variance. After excluding five items, the shortened version showed an ICC of 0.74 (95% CI, 0.70 to 0.77, P < 0.001) and a Cronbach's α of 0.741. Factor analysis led to a five-component solution explaining 54.3% of the total variance. The five components were named "risk of addiction/dependence", "operational reasons for not using morphine", "risk of escalation", "other (non-dependence) risks" and "external (non-operational) reasons". In test-retest, the shortened instrument showed an ICC of 0.797 (95% CI, 0.630 to 0.911, P < 0.001) and a Cronbach's α of 0.797. CONCLUSIONS: The 19-item shortened instrument assessing attitudes towards the prescription and administration of morphine showed adequate content and construct validity.
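One of the reliability statistics reported above, Cronbach's α, is straightforward to compute from an items-by-respondents matrix. The sketch below uses simulated responses (19 items and 588 respondents, matching the shortened form and the sample only in shape), not the study data:

```python
# Minimal sketch of the internal-consistency step: Cronbach's alpha computed
# from a respondents-by-items matrix. The responses below are simulated.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array, rows = respondents, columns = questionnaire items."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

rng = np.random.default_rng(1)
latent = rng.normal(size=(588, 1))                           # shared attitude component
responses = latent + rng.normal(scale=1.0, size=(588, 19))   # 19 items, as in the short form
print(round(cronbach_alpha(responses), 3))
```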
Abstract:
OBJECTIVE: The principal aim of this study was to develop a Swiss Food Frequency Questionnaire (FFQ) for the elderly population, for use in a study investigating the influence of nutritional factors on bone health. The secondary aim was to assess its validity and both its short-term and long-term reproducibility. DESIGN: A 4-day weighed record (4-d WR) was applied to 51 randomly selected women with a mean age of 80.3 years. Subsequently, a detailed FFQ was developed and cross-validated against a further 44 4-d WRs, and its short-term (1 month, n = 15) and long-term (12 months, n = 14) reproducibility was examined. SETTING: French-speaking part of Switzerland. SUBJECTS: The subjects were randomly selected women recruited from the Swiss Evaluation of the Methods of Measurement of Osteoporotic Fracture cohort study. RESULTS: Mean energy intakes by 4-d WR and FFQ showed no significant difference [1564.9 kcal (SD 351.1) and 1641.3 kcal (SD 523.2), respectively]. Mean crude nutrient intakes were also similar (with nonsignificant P-values for the differences in intake), and correlations between the two methods ranged from 0.13 (potassium) to 0.48 (magnesium). Similar results were found in the reproducibility studies. CONCLUSION: These findings provide evidence that this FFQ adequately estimates nutrient intakes and can be used to rank individuals within distributions of intake in specific populations.
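The validation step compares the FFQ against the 4-d WR both on mean intake and on how well the two methods rank individuals. A minimal sketch of that comparison, using simulated energy intakes rather than the study data, might look like this:

```python
# Hedged sketch of the validation comparison: paired test of mean energy intake
# and a rank correlation between the two dietary assessment methods. The arrays
# are simulated stand-ins for the FFQ and 4-d weighed record values.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
wr_energy = rng.normal(1565, 351, size=44)               # 4-d weighed record, kcal
ffq_energy = wr_energy + rng.normal(75, 300, size=44)    # FFQ, kcal

t, p = stats.ttest_rel(ffq_energy, wr_energy)            # difference in mean intake
rho, p_rho = stats.spearmanr(ffq_energy, wr_energy)      # agreement in ranking individuals
print(f"paired t-test p={p:.3f}, Spearman rho={rho:.2f}")
```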
Abstract:
Although active personal dosemeters (APDs) are not often used in hospital environments, the possibility of assessing dose and/or dose rate in real time is particularly interesting in interventional radiology and cardiology (IR/IC), since operators can receive relatively high doses while standing close to the primary radiation field. A study concerning the optimization of the use of APDs in IR/IC was performed in the framework of the ORAMED project, a Collaborative Project (2008-2011) supported by the European Commission within its 7th Framework Programme. This paper reports on tests performed with APDs on phantoms using an X-ray facility in a hospital environment, and on APDs worn by interventionalists during routine practice in different European hospitals. The behaviour of the APDs is more satisfactory in hospitals than in laboratories with respect to the influence of the tube peak high voltage and pulse width, because in hospitals the APDs are tested in scattered fields with dose equivalent rates generally lower than 1 Sv·h⁻¹.
Abstract:
Introduction: Music performance anxiety (MPA, often referred to as "stage fright") is one of the leading severe medical problems among musicians, and for about 15-25% of musicians MPA is a serious problem. Particularly high levels of MPA are observed among music students. Musical performance can induce negative emotions, including anxiety, which in some individuals can approach extreme levels of terror and take the form of a panic attack, impair the quality of the performance, lead to avoidance of performance situations, and consequently have debilitating effects on the career. Coping strategies used by musicians in their attempts to manage MPA, such as sedatives, alcohol, and β-blockers, can have deleterious health side-effects. Music ranks high in the cultural and economic life of Switzerland: in ten university music schools, students from all around the world are educated to become professional musicians. Despite the importance of musical education in Switzerland, data concerning the phenomenon of MPA are largely lacking. Goal and Methods: The main goal of this research was to survey the occurrence, experience, and management of MPA among full-time music students in French Swiss conservatories. A questionnaire was developed based on the literature and on interviews with music students and teachers, and was distributed to all the students of the conservatories of Fribourg, Geneva, Lausanne, and Neuchâtel in spring 2007. 194 students (61% women) returned the questionnaire. Results: The size of the problem: MPA is a major problem for one third of the students (ranks 3 and 4). The consequences of MPA: 22% and 35% of the students think that they have failed exams and auditions, respectively, because of MPA. Further, 25% of the students have already avoided performing and 11% have interrupted public performances because of MPA. Coping with MPA: 90% of the students have never used alcohol prior to performing, whereas 97% and 81%, respectively, have never used recreational drugs and medication. The majority of students use relaxation exercises, respiratory exercises, and meditation techniques to prepare themselves. About three quarters of the students think that the use of alcohol and recreational drugs to manage MPA is never justified, while 53% think that the use of medication is justified on some occasions. Need for information and support: 66% of the students would like to receive more support and help to cope with music performance situations; this support should mainly come from their teachers and from specialists. 53% of the students know nothing or little about possible means for the management of MPA, and about 50% consider themselves not at all or only poorly informed about the possible risks associated with the consumption of alcohol, recreational drugs, and medication for the management of performance situations. 89% would like to know more about MPA and 94% think that this topic should be discussed much more in their musical education at the conservatory. Conclusions: The results of this survey indicate that MPA is a major problem for one third of the students, with serious consequences for their careers. There is a strong need for more information and support on how to manage the stress of performance situations. The use of alcohol, recreational drugs, and medication is modest, but the students are poorly informed about the possible side-effects of these coping strategies.
It seems clear that more should be done about music performance anxiety in the French Swiss conservatories to inform, educate, and prepare the students for their future professional careers.
Abstract:
SUMMARY Objectives: To study the prevalence of psychoactive substance use disorders among suicidal adolescents; to evaluate the influence of psychoactive substance intake on the suicidal act; and to analyse the association between psychoactive substance use disorders and the risk of repeated suicidal behaviour. Methods: 186 adolescents aged 16 to 21, hospitalised for a suicide attempt or overwhelming suicidal ideation, were included. Among them, 148 were seen again for evaluation at 6 and/or 18 months. Psychiatric diagnoses based on DSM-IV criteria were established using a questionnaire, the MINI (Mini International Neuropsychiatric Interview). Results: At inclusion, 39.2% of the subjects had a psychoactive substance use disorder. Among them, a significantly higher proportion was under the influence of alcohol or drugs at the time of the suicide attempt (44.3% versus 25.4%). Of the 148 adolescents followed up and seen again at 6 or 18 months, 2 died by suicide and there were 30 repeated suicide attempts during the study. A marginally significant association was found between repeated suicidal behaviour and a diagnosis of alcohol abuse/dependence at inclusion (OR=3.3, CI 0.7-15.0; OR=2.6, CI 0.7-9.3). A history of several suicide attempts (OR=3.2; CI 1.1-10.0) and age over 19 (OR=3.2; CI 1.1-9.2) at inclusion were associated with the likelihood of death by suicide or repetition of a suicide attempt. Conclusion: Among adolescents hospitalised for a suicide attempt or overwhelming suicidal ideation, the risk of death or repetition is high. This risk is associated, among other factors, with a history of suicide attempts and with a diagnosis of psychoactive substance use disorder. Suicide risk and psychoactive substance use should therefore be assessed in adolescents, and subjects judged to be at risk should be systematically followed up after hospitalisation for suicidal behaviour. ABSTRACT Aim: To study the prevalence of psychoactive substance use disorder (PSUD) among suicidal adolescents, psychoactive substance intoxication at the moment of the attempt, and the association between PSUD at baseline and either death by suicide or repetition of suicide attempt(s). Methods: 186 adolescents aged 16 to 21 hospitalised for suicide attempt or overwhelming suicidal ideation were included (T0); 148 of them were traced again for evaluations after 6 months (T1) and/or 18 months (T2). DSM-IV diagnoses were assessed each time using the Mini International Neuropsychiatric Interview. Results: At T0, 39.2% of the subjects were found to have a PSUD. Among them, a significantly higher proportion was intoxicated at the time of the attempt than among those without PSUD (44.3% vs. 25.4%). Among the 148 adolescents who could be traced at either T1 or T2, two died from suicide and 30 repeated a suicide attempt once or more. A marginally significant association was found between death by suicide/repetition of suicide attempt and alcohol abuse/dependence at baseline (OR=3.3, CI 0.7-15.0; OR=2.6, CI 0.7-9.3). More than one suicide attempt before admission to hospital at T0 (OR=3.2; CI 1.1-10.0) and age over 19 at T0 (OR=3.2; CI 1.1-9.2) were independently associated with the likelihood of death by suicide or repetition of suicide attempt.
Conclusion: Among adolescents hospitalised for suicide attempt or overwhelming suicidal ideation, the risk of death or repetition of attempt is high and is associated with previous suicide attempts - especially among older adolescents - and also marginally associated with PSUD; these adolescents should be carefully evaluated for such risks and followed up once discharged from the hospital.
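For readers unfamiliar with how odds ratios such as OR=3.2 (CI 1.1-10.0) are obtained, the sketch below shows the standard 2x2-table calculation with a Wald-type 95% confidence interval. The cell counts are invented for illustration and are not the study's data:

```python
# Hedged sketch: odds ratio and Wald 95% CI from a 2x2 table
# (event yes/no by baseline exposure yes/no). Counts are hypothetical.
import numpy as np
from scipy import stats

a, b = 10, 20   # exposed: event / no event
c, d = 22, 96   # unexposed: event / no event

odds_ratio = (a * d) / (b * c)
se_log_or = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # standard error of log(OR)
z = stats.norm.ppf(0.975)
ci = np.exp(np.log(odds_ratio) + np.array([-z, z]) * se_log_or)
print(f"OR={odds_ratio:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```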
Abstract:
Inspired by experiments that use single-particle tracking to measure the regions of confinement of selected chromosomal regions within cell nuclei, we have developed an analytical approach that takes into account various possible positions and shapes of the confinement regions. We show, in particular, that confinement of a particle to a subregion that is entirely enclosed within a spherical volume can lead to a higher limiting value of the mean radial square displacement than that associated with a particle that can explore the entire spherical volume. Finally, we apply the theory to analyse the motion of extrachromosomal chromatin rings within nuclei of living yeast.
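The paper's analytical treatment is not reproduced here, but the flavour of confined-diffusion analysis can be illustrated with a crude Monte Carlo toy: a random walker confined either to the whole nuclear sphere or to a small sub-sphere placed near the periphery. The radii, sub-region position and diffusion coefficient below are arbitrary assumptions, and the printed quantities are generic single-particle-tracking observables (time-lag MSD plateau and mean squared distance from the nuclear centre), not the paper's specific estimator.

```python
# Toy simulation of a Brownian particle confined to a sphere, with a crude
# rejection rule at the boundary. All parameters are illustrative.
import numpy as np

def confined_walk(centre, radius, steps=20000, dt=0.001, D=1.0, seed=3):
    rng = np.random.default_rng(seed)
    centre = np.asarray(centre, dtype=float)
    pos = centre.copy()
    traj = np.empty((steps, 3))
    for i in range(steps):
        trial = pos + rng.normal(scale=np.sqrt(2 * D * dt), size=3)
        if np.linalg.norm(trial - centre) <= radius:   # reject steps leaving the region
            pos = trial
        traj[i] = pos
    return traj

def msd(traj, lag):
    d = traj[lag:] - traj[:-lag]
    return float((d ** 2).sum(axis=1).mean())

whole_nucleus = confined_walk(centre=[0.0, 0.0, 0.0], radius=1.0)
sub_region = confined_walk(centre=[0.85, 0.0, 0.0], radius=0.15)  # small region near the periphery

for name, traj in [("whole sphere", whole_nucleus), ("peripheral sub-region", sub_region)]:
    r2 = float((traj ** 2).sum(axis=1).mean())   # mean squared distance from the nuclear centre
    print(f"{name:22s}  MSD plateau ~ {msd(traj, 5000):.3f}   <r^2> from centre ~ {r2:.3f}")
```

In this toy, the time-lag MSD plateau shrinks with the size of the accessible region, while the mean squared distance from the nuclear centre can be larger for the enclosed peripheral sub-region than for the whole sphere, which is the kind of geometry-dependent effect the analytical approach is designed to handle.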
Abstract:
BACKGROUND: Years since onset of sexual intercourse (YSSI) is a rarely used variable when studying adolescents' sexual outcomes. The aim of this study is to evaluate the influence of YSSI on the adverse sexual outcomes of early sexual initiators. METHODS: Data were drawn from the 2002 Swiss Multicenter Adolescent Survey on Health database, a nationally representative cross-sectional survey including 7429 adolescents in post-mandatory school aged 16-20 years. Only adolescents reporting sexual intercourse (SI) were included (N=4388; 45% females) and divided by age of onset of SI (early initiators, age <16: N=1469, 44% females; and late initiators, age ≥16: N=2919, 46% females). Analyses were done separately by gender. Groups were compared for personal characteristics at the bivariate level. We analyzed three sexual outcomes (≥4 sexual partners, pregnancy, and non-use of condom at last SI), controlling for all significant personal variables with two logistic regressions, first using age and then YSSI as one of the confounding variables. Results are given as adjusted odds ratios (aOR) using late initiators as the reference category. RESULTS: After adjusting for YSSI instead of age, negative sexual outcomes among early initiators were no longer significant, except for multiple sexual partners among females, although at a much lower level. Early initiators were less likely to report non-use of condom at last SI when adjusting for YSSI (females: aOR=0.59 [0.44-0.79], p<0.001; males: aOR=0.71 [0.50-1.00], p=0.053). CONCLUSION: YSSI is an important explanatory variable when studying adolescents' sexuality and needs to be included in future research on adolescents' sexual health.
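The analytic idea, adjusting the early-initiation effect first for age and then for YSSI, can be sketched with a simulated dataset in which the outcome depends only on exposure time. The variable names, effect sizes and data below are hypothetical, not the survey data:

```python
# Hypothetical sketch: adjusted odds ratios for early sexual initiation, once
# controlling for age and once for YSSI, on simulated data where the outcome
# is driven only by years since initiation.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 4388
age = rng.integers(16, 21, n)
early = rng.integers(0, 2, n)
onset_age = np.where(early == 1, rng.integers(13, 16, n), rng.integers(16, 19, n))
onset_age = np.minimum(onset_age, age)               # onset cannot postdate current age
yssi = age - onset_age

p = 1 / (1 + np.exp(-(-2.0 + 0.25 * yssi)))          # risk grows with exposure time only
df = pd.DataFrame({"outcome": rng.binomial(1, p), "early": early, "age": age, "yssi": yssi})

for covariate in ("age", "yssi"):
    fit = smf.logit(f"outcome ~ early + {covariate}", data=df).fit(disp=0)
    print(f"adjusting for {covariate}: aOR(early) =", round(float(np.exp(fit.params["early"])), 2))
```

Under this simulation the early-initiation odds ratio is elevated when adjusting for age but close to 1 when adjusting for YSSI, mirroring the abstract's argument that exposure time, not early onset itself, drives several adverse outcomes.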
Abstract:
Background: Understanding the true prevalence of lymphangioleiomyomatosis (LAM) is important in estimating disease burden and targeting specific interventions. As with all rare diseases, obtaining reliable epidemiological data is difficult and requires innovative approaches. Aim: To determine the prevalence and incidence of LAM using data from patient organizations in seven countries, and to use the extent to which the prevalence of LAM varies regionally and nationally to determine whether prevalence estimates are related to health-care provision. Methods: Numbers of women with LAM were obtained from patient groups and national databases from seven countries (n = 1001). Prevalence was calculated for regions within countries using female population figures from census data. Incidence estimates were calculated for the USA, UK and Switzerland. Regional variation in prevalence and changes in incidence over time were analysed using Poisson regression and linear regression. Results: The prevalence of LAM in the seven countries ranged from 3.4 to 7.8 per million women, with significant variation both between countries and between states in the USA. This variation was related neither to the number of pulmonary specialists in the region nor to the percentage of the population with health insurance, but suggests that a large number of patients remain undiagnosed. The incidence of LAM from 2004 to 2008 ranged from 0.23 to 0.31 per million women per year in the USA, UK and Switzerland. Conclusions: Using this method, we have found that the prevalence of LAM is higher than previously recorded and that many patients with LAM are undiagnosed.
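The prevalence and incidence figures above come down to case counts divided by the female population, with Poisson regression used to compare rates. A hedged sketch of both calculations, using invented counts and populations rather than the registry data, is shown below; with only one observation per country the Poisson model is saturated and is included purely to show the log-population offset mechanics.

```python
# Sketch of the rate calculations: prevalence per million women, and a Poisson
# model of case counts with the female population as exposure offset.
# Counts and populations are invented for illustration.
import numpy as np
import statsmodels.api as sm

cases = np.array([350, 95, 20])                          # hypothetical LAM cases per country
women = np.array([55_000_000, 26_000_000, 3_500_000])    # hypothetical female populations
print("Prevalence per million women:", np.round(cases / women * 1e6, 1))

X = sm.add_constant(np.eye(3)[:, 1:])                    # reference country + two indicators
fit = sm.GLM(cases, X, family=sm.families.Poisson(), offset=np.log(women)).fit()
print(np.exp(fit.params))                                 # rate ratios vs the reference country
```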
Abstract:
Dendritic cells (DCs) are multifunctional cells that link the innate and adaptive immune systems in mammals. Several DC subsets exist, defined by their functions and by where they are located in the body. In this thesis, we studied the role of these cells during a parasitic infection. Leishmania is a parasite causing a disease called leishmaniasis, which is endemic in Africa, Asia and parts of South America. Some species cause cutaneous lesions, whereas others cause lesions of the mucosa or of internal organs. The immune system responds by generating an inflammatory response that clears the infection; a non-inflammatory response (in terms of cytokines and chemokines) leads to long-term persistence of the parasite. DCs activated by the parasite in the skin transport it to a lymph node, where several DC subsets present the Leishmania-specific antigen to T lymphocytes, which then triggers a powerful immune response against the parasite. We compared how different Leishmania species activate DCs, using several mouse models for this purpose. C57BL/6 mice are known to be resistant to L. major and susceptible to L. mexicana, whereas Balb/c mice are susceptible to both species. Using transgenic fluorescent parasites, we compared these two parasite species (L. major and L. mexicana) to determine which cells they are able to infect in vivo in a mouse model. The general role of DCs in L. major infection has already been described. In our study, we examined the requirement for CD8a+ DCs in the lymph nodes to mount a response to L. major infection. Mice lacking this DC subset are much more susceptible to infection: they have lower inflammatory markers and larger lesions. We also observed that CD8a+ DCs play a crucial role at a later stage of infection. In our laboratory, we are fortunate to have an unlimited source of CD8a+ DCs derived from a mouse genetically modified in-house. Using these CD8a+ cells, we immunised rats to produce monoclonal antibodies with specific properties, such as the identification of unique proteins present on the DC surface, antibodies that can then modulate an immune response in vivo. We are currently characterising more than 750 hybridomas generated in our laboratory.
Dendritic cells (DCs) constitute the link between the innate and adaptive arms of the immune response, since they are able to present antigen, provide co-stimulation and release cytokines and chemokines. In this thesis, we explored different DC families during parasitic infections such as leishmaniasis, caused by an intracellular parasite that infects mammals. Most self-healing cutaneous lesions are characterised by a pro-inflammatory response driven by IL-12; conversely, in the non-healing form seen in susceptible mouse models, the response is driven by IL-4 and IL-10. L. major infection has been characterised in C57BL/6 (Th1) and Balb/c (Th2) mice: in C57BL/6 mice the lesion heals, whereas in Balb/c mice it does not. We compared the activation induced in DCs as a whole by different Leishmania species and, more specifically, in the CD8a+ DCs present in the draining lymph nodes, and their role in susceptibility to L. major. These cells are specialised in the cross-presentation of exogenous antigens on MHC-I and in high IL-12 production after activation. Using bone-marrow-derived DCs, we found that L. guyanensis V+ (carrying a retrovirus) was the most efficient at activating DCs in vitro, compared with L. major, L. mexicana and L. guyanensis (V-). In vivo, however, mice infected with L. major showed the largest increase in draining lymph node size 3-6 weeks after infection, in both mouse strains (resistant C57BL/6 and susceptible Balb/c). Using a transgenic fluorescent parasite, we found that C57BL/6 mice, which are susceptible to L. mexicana, had a larger number of infected B cells and a smaller number of inflammatory monocyte-derived DCs than mice infected with L. major. The consequences of these observations are still under investigation. Mice deficient in CD8a+ and CD103+ DCs are more susceptible to L. major than WT mice: their lesions are larger and their parasite burden higher. We generated a mixed bone marrow chimera of CD11c-DTR and Batf3-/- mice to determine the time point after infection at which the lack of CD8a+ DCs contributes most to the increased susceptibility of KO mice. These mice produce more IgG1 and IgE and mount a stronger Th2 and weaker Th1 response. We found that mice depleted of CD8a+ DCs at the onset of the adaptive immune response (three weeks after infection) maintained a high rate of large lesions, similar to mice in which these cells were depleted before infection. This indicates that CD8a+ DCs are required for effective immunity during the chronic phase of L. major infection. In parallel, we also began generating monoclonal antibodies directed against activated CD8a+ DCs using lines established in our laboratory. Starting from a library of 763 hybridomas, we identified several clones of interest with a functional capacity to modulate T cell proliferation and cytokine secretion, as well as the co-stimulatory molecules present on the surface of the activated DCs themselves.
Dendritic cells (DCs) are the bridge between the innate and the adaptive arms of the immune system. They are professional antigen-presenting cells with important cytokine/chemokine release functions. In this dissertation we have focused on the study of the different subsets of DCs in parasitic infection immunity. Leishmania are intracellular parasites of many different species that infect mammals. Most self-healing cutaneous lesions are characterized by a pro-inflammatory response with IL-12, whereas high levels of cytokines such as IL-4 and IL-10 characterize susceptible mouse models. In mice, L. major infection has been well characterized in C57BL/6 mice (Th1), which form healing lesions, while Balb/c mice (Th2) form non-healing lesions. This thesis is focused on comparing DC activation at large by different strains of Leishmania and, more specifically, on dLN-resident CD8a+ DCs and their role in L. major susceptibility. This subset is specialized in cross-presentation of exogenous antigens in the MHC-I pathway and produces high levels of IL-12. Using bone-marrow-derived DCs we found that L. guyanensis V+ (carrying a retrovirus) was the most efficient at activating DCs in vitro. In vivo, however, L. major-infected mice had the largest dLNs 3-6 weeks after infection, in both genetically resistant C57BL/6 and susceptible Balb/c mice. Using transgenic fluorescent parasites, we found that C57BL/6 mice, which are susceptible to L. mexicana, had a greater number of infected B cells and fewer infected inflammatory monocyte-derived DCs than in L. major infection. Using mice deficient in CD8a+ DCs, we found that these mice were more susceptible to L. major than their WT counterparts: they developed larger lesions, had higher parasite burdens, higher levels of Th2-indicating immunoglobulins as measured by higher serum IgE levels, and lower CD4+ IFNγ+ cells. A mixed bone marrow chimera system of CD11c-DTR and Batf3-/- mice was generated to determine the time point at which the lack of CD8a+ DCs contributes most to the increased susceptibility of KO mice. We found that mice depleted of CD8a+ DCs at the advent of the adaptive response (3 weeks after infection) maintained a significantly larger lesion size, similar to mice whose cells were depleted from the onset of infection. This indicates that CD8a+ DCs are required for effective immunity in the chronic phase of L. major infection. We also began the generation of a valuable tool of monoclonal antibodies against activated CD8a+ DCs using our in-house DC line. From a library of 763 hybridomas we have identified several interesting clones with a functional ability to modulate T cell proliferation and cytokine secretion, as well as to down-modulate co-stimulatory molecules on activated DCs themselves.
Abstract:
In the first part of this research, three stages were defined for a program to increase the information extracted from ink evidence and maximise its usefulness to the criminal and civil justice system. These stages are (a) to develop a standard methodology for analysing ink samples by high-performance thin-layer chromatography (HPTLC) in a reproducible way, even when ink samples are analysed at different times, in different locations and by different examiners; (b) to compare ink samples automatically and objectively; and (c) to define and evaluate a theoretical framework for the use of ink evidence in a forensic context. This report focuses on the second of the three stages. Using the calibration and acquisition process described in the previous report, mathematical algorithms are proposed to compare ink samples automatically and objectively. The performance of these algorithms is systematically studied for various chemical and forensic conditions, using standard performance tests commonly used in biometric studies. The results show that different algorithms are best suited for different tasks. Finally, this report demonstrates how modern analytical and computer technology can be used in the field of ink examination, and how tools developed and successfully applied in other fields of forensic science can help maximise its impact within the field of questioned documents.
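The report's algorithms are not reproduced here, but the general workflow, scoring pairs of HPTLC intensity profiles and then evaluating the scores with biometric-style error rates, can be sketched as follows. The correlation score, the simulated profiles and the decision threshold are all illustrative assumptions:

```python
# Illustrative sketch (not the report's algorithms): compare chromatogram
# intensity profiles with a correlation score, then summarise the scores with
# false rejection / false acceptance rates. Profiles are simulated.
import numpy as np

def similarity(profile_a, profile_b):
    """Pearson correlation between two normalised intensity profiles."""
    a = (profile_a - profile_a.mean()) / profile_a.std()
    b = (profile_b - profile_b.mean()) / profile_b.std()
    return float((a * b).mean())

rng = np.random.default_rng(5)
reference = rng.random(200)                              # densitometric profile of one ink

# Scores for pairs of the same ink (replicate measurements) and of different inks.
genuine = [similarity(reference, reference + rng.normal(0, 0.05, 200)) for _ in range(50)]
impostor = [similarity(reference, rng.random(200)) for _ in range(50)]

threshold = 0.5                                          # hypothetical decision threshold
frr = np.mean(np.array(genuine) < threshold)             # same ink declared different
far = np.mean(np.array(impostor) >= threshold)           # different inks declared the same
print(f"FRR={frr:.2f}  FAR={far:.2f}")
```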
Abstract:
Most airborne microorganisms are natural components of our ecosystem. Soil, vegetation and animals, including humans, are sources for the aerial release of these living or dead cells. In the past, assessment of airborne microorganisms was mainly restricted to occupational health concerns; indeed, in several occupations, exposure to very high concentrations of non-infectious airborne bacteria and fungi results in allergenic, toxic or irritant reactions. Recently, the threat of bioterrorism and pandemics has highlighted the urgent need to increase knowledge of bioaerosol ecology. More fundamentally, airborne bacterial and fungal communities have begun to draw much more consideration from environmental microbiologists, who had long neglected this area. This increased interest among scientists is in great part due to the development and use of real-time PCR techniques to identify and quantify airborne microorganisms. Even if the advantages of PCR technology are obvious, researchers are confronted with new problems. This review describes the methodological state of the art in the bioaerosol field and emphasizes the future challenges and perspectives of real-time PCR-based methods for airborne microorganism studies.
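One routine quantification step behind real-time PCR-based bioaerosol work is reading sample copy numbers off a standard curve of Cq versus log10 copy number. The sketch below shows that calculation with invented Cq values; it is a generic illustration of absolute quantification, not a protocol taken from the review:

```python
# Sketch of absolute quantification by real-time PCR: fit a standard curve of
# Cq versus log10 copy number, then read unknown samples off the curve.
# All Cq values here are invented.
import numpy as np

log_copies_std = np.array([2, 3, 4, 5, 6, 7], dtype=float)   # standards: 1e2..1e7 copies
cq_std = np.array([33.1, 29.8, 26.4, 23.0, 19.7, 16.3])      # measured Cq of the standards

slope, intercept = np.polyfit(log_copies_std, cq_std, 1)      # Cq = slope*log10(copies) + intercept
efficiency = 10 ** (-1 / slope) - 1                           # amplification efficiency (~1.0 is ideal)

cq_samples = np.array([24.6, 27.9])                           # hypothetical air-sample Cq values
copies = 10 ** ((cq_samples - intercept) / slope)
print(f"efficiency={efficiency:.2f}", "estimated copies:", np.round(copies, 0))
```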
Abstract:
BACKGROUND: Co-morbidity information derived from administrative data needs to be validated to allow its regular use. We assessed the evolution in the accuracy of coding for Charlson and Elixhauser co-morbidities at three time points over a 5-year period, following the introduction of International Classification of Diseases, 10th Revision (ICD-10) coding of hospital discharges. METHODS: Cross-sectional time-trend evaluation study of coding accuracy using hospital chart data of 3,499 randomly selected patients who were discharged in 1999, 2001 and 2003 from two teaching and one non-teaching hospital in Switzerland. We measured sensitivity, positive predictive values and kappa values for agreement between administrative data coded with ICD-10 and chart data as the 'reference standard' for the recording of 36 co-morbidities. RESULTS: For the 17 Charlson co-morbidities, the sensitivity - median (min-max) - was 36.5% (17.4-64.1) in 1999, 42.5% (22.2-64.6) in 2001 and 42.8% (8.4-75.6) in 2003. For the 29 Elixhauser co-morbidities, the sensitivity was 34.2% (1.9-64.1) in 1999, 38.6% (10.5-66.5) in 2001 and 41.6% (5.1-76.5) in 2003. Between 1999 and 2003, sensitivity estimates increased for 30 co-morbidities and decreased for 6 co-morbidities; the increase in sensitivities was statistically significant for six conditions and the decrease significant for one. Kappa values increased for 29 co-morbidities and decreased for seven. CONCLUSIONS: The accuracy of administrative data in recording clinical conditions improved slightly between 1999 and 2003. These findings are of relevance to all jurisdictions introducing new coding systems, because they demonstrate a phenomenon of improved administrative data accuracy that may relate to a coding 'learning curve' with the new coding system.
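The accuracy measures used above (sensitivity, positive predictive value, and kappa for agreement between administrative coding and chart review) can each be computed from a 2x2 table per co-morbidity. A minimal sketch with made-up counts:

```python
# Minimal sketch of the agreement metrics for one co-morbidity, comparing
# administrative coding against chart review as the reference standard.
# The 2x2 counts are invented for illustration.
tp, fn = 120, 180      # present in chart: coded / not coded
fp, tn = 25, 3174      # absent in chart: coded / not coded
n = tp + fn + fp + tn

sensitivity = tp / (tp + fn)
ppv = tp / (tp + fp)
observed_agreement = (tp + tn) / n
expected_agreement = ((tp + fn) * (tp + fp) + (fp + tn) * (fn + tn)) / n**2
kappa = (observed_agreement - expected_agreement) / (1 - expected_agreement)
print(f"sensitivity={sensitivity:.2f}  PPV={ppv:.2f}  kappa={kappa:.2f}")
```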