106 results for Cattelino, Elena: Adolescents and risk
Abstract:
Sport participation has often been a topic in sports science, and it has been shown that in Europe the populations of northern and western countries are more often physically active than those of southern and eastern countries (European Commission, 2014). In Switzerland, physical activity likewise differs between the linguistic regions: the German-speaking population is more often physically active than the French- or Italian-speaking part (Stamm & Lamprecht, 2008). Structural and cultural factors have been discussed to explain these differences in sport participation. Because largely homogeneous structural conditions can be assumed within a country, the aim of this study is to analyse how socio-cultural factors relate to the sport participation of adolescents and young adults. To address this research question, Bourdieu's concept of habitus (1984) was used as the theoretical background. This sport-related concept of habitus comprises culturally determined values, attributions of meaning and patterns of action, which are socially shaped and influence individual actions and therefore also sport practice. On this basis, a qualitative study with guideline-based interviews of German-speaking (n=5) and French-speaking (n=3) adolescents and young adults aged 16 to 24 (M=21.4) was conducted in two different linguistic regions of Switzerland. The interviews were analysed with the documentary method (Bohnsack, 2010). Initial findings reveal different sport-related values, attributions of meaning and patterns of action (frameworks of orientation) concerning topics such as body, health and leisure, which correspond with the habitual sport practice in the two linguistic regions. This study illustrates that the habitus is culturally shaped and that it can help to understand the significance of socio-cultural factors for sport participation.
Abstract:
Assessing and managing risks relating to the consumption of foodstuffs by humans and to the environment has been one of the most complex legal issues in WTO law ever since the Agreement on Sanitary and Phytosanitary Measures was adopted at the end of the Uruguay Round and entered into force in 1995. The problem was expounded in a number of cases. Panels and the Appellate Body adopted different philosophies in interpreting the agreement and the basic concept of risk assessment as defined in Annex A para. 4 of the Agreement. Risk assessment entails fundamental questions of law and science, and different interpretations reflect different underlying perceptions of science and its relationship to the law. The present thesis, supported by the Swiss National Research Foundation, undertakes an in-depth analysis of these underlying perceptions. The author expounds the essence of, and differences between, positivism and relativism in philosophy and the natural sciences, and clarifies the relationship of fundamental concepts such as risk, hazard and probability. This investigation is a remarkable effort on the part of a lawyer keen to learn more about the fundamentals upon which the law – often unconsciously – is operated by the legal profession and the trade community. Based upon these insights, he turns to a critical assessment of the jurisprudence of both panels and the Appellate Body. Extensively referring to and discussing the literature, he deconstructs findings and decisions in light of implied and assumed underlying philosophies and perceptions of the relationship of law and science, in particular in the field of food standards. Finding that neither positivism nor relativism provides adequate answers, the author turns to critical rationalism and applies the methodology of falsification developed by Karl R. Popper. Critical rationalism allows discourse in science and law to be combined and helps prepare the ground for a new approach to risk assessment and risk management. Linking the problem to the doctrine of multilevel governance, the author develops a theory allocating risk assessment to international fora while leaving risk management to national and democratically accountable governments. While the author throughout the thesis questions the possibility of separating risk assessment and risk management, the thesis offers new avenues which may assist in structuring a complex and difficult problem.
Thrombophilia and risk of VTE recurrence according to the age at the time of first VTE manifestation
Abstract:
BACKGROUND Whether screening for thrombophilia is useful for patients after a first episode of venous thromboembolism (VTE) is a controversial issue. However, the impact of thrombophilia on the risk of recurrence may vary depending on the patient's age at the time of the first VTE. PATIENTS AND METHODS Of 1221 VTE patients (42% males) registered in the MAISTHRO (MAin-ISar-THROmbosis) registry, 261 experienced VTE recurrence during a 5-year follow-up after the discontinuation of anticoagulant therapy. RESULTS Thrombophilia was more common among patients with VTE recurrence than among those without (58.6% vs. 50.3%; p = 0.017). Stratifying patients by age at the time of their initial VTE, Cox proportional hazards analyses adjusted for age, sex and the presence or absence of established risk factors identified a heterozygous prothrombin (PT) G20210A mutation (hazard ratio (HR) 2.65; 95% confidence interval (CI) 1.71-4.12; p < 0.001), homozygosity/double heterozygosity for the factor V Leiden and/or PT mutation (HR 2.35; 95% CI 1.09-5.07; p = 0.030), and antithrombin deficiency (HR 2.12; 95% CI 1.12-4.10; p = 0.021) as predictors of recurrent VTE in patients aged 40 years or older, whereas lupus anticoagulants (HR 3.05; 95% CI 1.40-6.66; p = 0.005) increased the risk of recurrence in younger patients. Subgroup analyses revealed an increased risk of recurrence for a heterozygous factor V Leiden mutation only in young females without hormonal treatment, whereas the predictive value of a heterozygous PT mutation was restricted to males over the age of 40 years. CONCLUSIONS Our data do not support preferentially selecting younger patients for thrombophilia testing after a first venous thromboembolic event.
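The recurrence analysis above is an adjusted Cox proportional hazards model. Below is a minimal sketch of such a model in Python with the lifelines package; the data frame, covariate names and follow-up scheme are synthetic assumptions for illustration, not the MAISTHRO registry data or the authors' code.

```python
# A minimal sketch of an adjusted Cox proportional hazards analysis in the
# spirit of the recurrence model described above. Synthetic data; the column
# names are assumptions, not the MAISTHRO registry variables.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "age_at_first_vte": rng.integers(18, 85, n),   # years (assumed covariate)
    "male": rng.integers(0, 2, n),                 # 1 = male
    "pt_g20210a_het": rng.integers(0, 2, n),       # heterozygous PT G20210A carrier
    "provoked_first_vte": rng.integers(0, 2, n),   # established risk factor present
})

# Hypothetical follow-up after stopping anticoagulation: exponential event
# times, administratively censored at 5 years (mirroring the 5-year follow-up).
latent = rng.exponential(8.0, n)
df["time_yrs"] = np.minimum(latent, 5.0)
df["recurrence"] = (latent <= 5.0).astype(int)

cph = CoxPHFitter()
cph.fit(df, duration_col="time_yrs", event_col="recurrence")
cph.print_summary()  # hazard ratios with 95% CIs for each covariate
```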
Abstract:
BACKGROUND Physicians traditionally treat ulcerative colitis (UC) using a step-up approach. Given the paucity of data, we aimed to assess the cumulative probability of UC-related need for step-up therapy and to identify escalation-associated risk factors. METHODS Patients with UC enrolled into the Swiss IBD Cohort Study were analyzed. The following steps from the bottom to the top of the therapeutic pyramid were examined: (1) 5-aminosalicylic acid and/or rectal corticosteroids, (2) systemic corticosteroids, (3) immunomodulators (IM) (azathioprine, 6-mercaptopurine, methotrexate), (4) TNF antagonists, (5) calcineurin inhibitors, and (6) colectomy. RESULTS Data on 996 patients with UC with a median disease duration of 9 years were examined. The point estimates of cumulative use of different treatments at years 1, 5, 10, and 20 after UC diagnosis were 91%, 96%, 96%, and 97%, respectively, for 5-ASA and/or rectal corticosteroids, 63%, 69%, 72%, and 79%, respectively, for systemic corticosteroids, 43%, 57%, 59%, and 64%, respectively, for IM, 15%, 28%, and 35% (up to year 10 only), respectively, for TNF antagonists, 5%, 9%, 11%, and 12%, respectively, for calcineurin inhibitors, 1%, 5%, 9%, and 18%, respectively, for colectomy. The presence of extraintestinal manifestations and extended disease location (at least left-sided colitis) were identified as risk factors for step-up in therapy with systemic corticosteroids, IM, TNF antagonists, calcineurin inhibitors, and surgery. Cigarette smoking at diagnosis was protective against surgery. CONCLUSIONS The presence of extraintestinal manifestations, left-sided colitis, and extensive colitis/pancolitis at the time of diagnosis were associated with use of systemic corticosteroids, IM, TNF antagonists, calcineurin inhibitors, and colectomy during the disease course.
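The cumulative-use point estimates at 1, 5, 10 and 20 years are the kind of quantity typically read off a Kaplan-Meier curve of time from diagnosis to first use of a treatment step. The sketch below illustrates this under assumed, synthetic data; it is not the Swiss IBD Cohort Study analysis itself.

```python
# A minimal sketch of cumulative-use point estimates (e.g. for systemic
# corticosteroids) derived from a Kaplan-Meier fit of time to first use.
# Data and censoring scheme are synthetic assumptions.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(1)
n = 500
# Hypothetical years from UC diagnosis to first systemic corticosteroid use,
# administratively censored at 20 years of observation.
latent = rng.exponential(6.0, n)
time_observed = np.minimum(latent, 20.0)
event = (latent <= 20.0).astype(int)

kmf = KaplanMeierFitter()
kmf.fit(time_observed, event_observed=event)

years = [1, 5, 10, 20]
cumulative_use = 1 - kmf.survival_function_at_times(years)
print(cumulative_use)  # analogous to the point estimates quoted above
```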
Abstract:
IMPORTANCE Some experts suggest that serum thyrotropin levels in the upper part of the current reference range should be considered abnormal, an approach that would reclassify many individuals as having mild hypothyroidism. Health hazards associated with such thyrotropin levels are poorly documented, but conflicting evidence suggests that thyrotropin levels in the upper part of the reference range may be associated with an increased risk of coronary heart disease (CHD). OBJECTIVE To assess the association between differences in thyroid function within the reference range and CHD risk. DESIGN, SETTING, AND PARTICIPANTS Individual participant data analysis of 14 cohorts with baseline examinations between July 1972 and April 2002 and with median follow-up ranging from 3.3 to 20.0 years. Participants included 55,412 individuals with serum thyrotropin levels of 0.45 to 4.49 mIU/L and no previously known thyroid or cardiovascular disease at baseline. EXPOSURES Thyroid function as expressed by serum thyrotropin levels at baseline. MAIN OUTCOMES AND MEASURES Hazard ratios (HRs) of CHD mortality and CHD events according to thyrotropin levels after adjustment for age, sex, and smoking status. RESULTS Among 55,412 individuals, 1813 people (3.3%) died of CHD during 643,183 person-years of follow-up. In 10 cohorts with information on both nonfatal and fatal CHD events, 4666 of 48,875 individuals (9.5%) experienced a first-time CHD event during 533,408 person-years of follow-up. For each 1-mIU/L higher thyrotropin level, the HR was 0.97 (95% CI, 0.90-1.04) for CHD mortality and 1.00 (95% CI, 0.97-1.03) for a first-time CHD event. Similarly, in analyses by categories of thyrotropin, the HRs of CHD mortality (0.94 [95% CI, 0.74-1.20]) and CHD events (0.97 [95% CI, 0.83-1.13]) were similar among participants with the highest (3.50-4.49 mIU/L) compared with the lowest (0.45-1.49 mIU/L) thyrotropin levels. Subgroup analyses by sex and age group yielded similar results. CONCLUSIONS AND RELEVANCE Thyrotropin levels within the reference range are not associated with risk of CHD events or CHD mortality. This finding suggests that differences in thyroid function within the population reference range do not influence the risk of CHD. Increased CHD risk does not appear to be a reason for lowering the upper thyrotropin reference limit.
Abstract:
Data concerning the link between severity of abdominal aortic calcification (AAC) and fracture risk in postmenopausal women are discordant. This association may vary by skeletal site and duration of follow-up. Our aim was to assess the association between AAC severity and fracture risk in older women over the short and long term. This is a case-cohort study nested in a large multicenter prospective cohort study. The association between AAC and fracture was assessed using odds ratios (OR) and 95% confidence intervals (95% CI) for vertebral fractures and using hazard ratios (HR) and 95% CI for non-vertebral and hip fractures. AAC severity was evaluated from lateral spine radiographs using Kauppila's semiquantitative score. Severe AAC (AAC score 5+) was associated with a higher risk of vertebral fracture during 4 years of follow-up, after adjustment for confounders (age, BMI, walking, smoking, hip bone mineral density, prevalent vertebral fracture, systolic blood pressure, hormone replacement therapy) (OR=2.31, 95% CI: 1.24-4.30, p<0.01). In a similar model, severe AAC was associated with an increase in hip fracture risk (HR=2.88, 95% CI: 1.00-8.36, p=0.05). AAC was not associated with the risk of any non-vertebral fracture, nor with fracture risk after 15 years of follow-up. In elderly women, severe AAC is associated with a higher short-term risk of vertebral and hip fractures, but not with the long-term risk of these fractures. There is no association between AAC and the risk of non-vertebral, non-hip fracture in older women. Our findings lend further support to the hypothesis that AAC and skeletal fragility are related.
Abstract:
BACKGROUND National safety alert systems publish relevant information to improve patient safety in hospitals. However, this information has to be transformed into local action to have an effect on patient safety. We studied three research questions: How do Swiss healthcare quality and risk managers (qm/rm) see their own role in learning from safety alerts issued by the Swiss national voluntary reporting and analysis system? What are their attitudes towards and evaluations of the alerts? And which types of improvement actions were fostered by the safety alerts? METHODS A survey was developed and administered to Swiss healthcare risk and quality managers, with a response rate of 39% (n=116). Descriptive statistics are presented. RESULTS The qm/rm disseminate the alerts and communicate about them with a broad variety of professional groups. While most respondents felt that they should know the alerts and their contents, only some felt responsible for driving organizational change based on the recommendations. However, most respondents used safety alerts to back up their own patient safety goals. The alerts were evaluated positively on various dimensions such as usefulness and were considered standards of good practice by the majority of respondents. A range of organizational responses was applied, with disseminating information being the most common. An active role is related to using safety alerts to back up one's own patient safety goals. CONCLUSIONS To support an active role of qm/rm in their hospital's learning from safety alerts, appropriate organizational structures should be developed. Furthermore, qm/rm could be given dedicated information or training to act as an information hub on the issues discussed in the alerts.
Abstract:
Young people's sport activity differs considerably depending on the linguistic region in Switzerland (Lamprecht, Fischer, & Stamm, 2014). This appears to be based on cultural as well as structural differences. The question thus arises as to how differing structural conditions in communes (e.g. sport facilities, the significance of municipal sport promotion) across the linguistic regions of Switzerland cause variation in sport behaviour. Based on the theory of social action (Coleman, 1990), it is assumed that individual behaviour is determined not only by individual factors but also by the structural and socio-cultural context in which a person is socially embedded. In 33 municipalities of the German- and French-speaking regions of Switzerland, multilevel data were gathered to analyse possible influences of structural factors on sports behaviour. In an online survey, 15- to 30-year-old inhabitants (N=3677) were questioned about their sports participation as well as their perception of sport-related structural characteristics in their commune. To collect information about the communes' sport facilities, sport providers as well as representatives of the municipal administration were interviewed and document analyses were conducted. Representatives of the municipal administration attach more importance to sport promotion in the German-speaking than in the French-speaking municipalities. Young people living in the French-speaking communes are less satisfied with the sport facilities (F(1,3266)=31.31, p<.01) and are less physically active than their German-speaking counterparts (Chi2(1,N=3537)=22.51, p<.05). These first findings show the impact of structural conditions in communes on the sport participation of adolescents and young people. However, further multilevel analyses will be conducted for a better understanding of the correlations between structural conditions and the differing sports behaviour of young people. References: Coleman, J. S. (1990). Foundations of social theory. Cambridge, MA: Belknap. Lamprecht, M., Fischer, A., & Stamm, H. (2014). Sport Schweiz 2014. Sportaktivität und Sportinteresse der Schweizer Bevölkerung. Magglingen: BASPO.
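The regional comparison of sport participation above is a chi-square test on a 2x2 contingency table (linguistic region by activity status). The sketch below shows such a test with hypothetical counts chosen only so that the total matches the reported N=3537; it will not reproduce the published statistic exactly.

```python
# A minimal sketch of the kind of chi-square comparison reported above.
# The counts are illustrative assumptions; only the total echoes N=3537.
import numpy as np
from scipy.stats import chi2_contingency

table = np.array([
    [1650, 850],   # German-speaking: active / not active (hypothetical)
    [ 600, 437],   # French-speaking: active / not active (hypothetical)
])

chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"Chi2({dof}, N={table.sum()}) = {chi2:.2f}, p = {p:.4f}")
```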
Abstract:
Enzootic pneumonia (EP) caused by Mycoplasma hyopneumoniae has a significant economic impact on domestic pig production. A control program carried out from 1999 to 2003 successfully reduced disease occurrence in domestic pigs in Switzerland, but recurrent outbreaks suggested a potential role of free-ranging wild boar (Sus scrofa) as a source of re-infection. Since little is known on the epidemiology of EP in wild boar populations, our aims were: (1) to estimate the prevalence of M. hyopneumoniae infections in wild boar in Switzerland; (2) to identify risk factors for infection in wild boar; and (3) to assess whether infection in wild boar is associated with the same gross and microscopic lesions typical of EP in domestic pigs. Nasal swabs, bronchial swabs and lung samples were collected from 978 wild boar from five study areas in Switzerland between October 2011 and May 2013. Swabs were analyzed by qualitative real time PCR and a histopathological study was conducted on lung tissues. Risk factor analysis was performed using multivariable logistic regression modeling. Overall prevalence in nasal swabs was 26.2% (95% CI 23.3-29.3%) but significant geographical differences were observed. Wild boar density, occurrence of EP outbreaks in domestic pigs and young age were identified as risk factors for infection. There was a significant association between infection and lesions consistent with EP in domestic pigs. We have concluded that M. hyopneumoniae is widespread in the Swiss wild boar population, that the same risk factors for infection of domestic pigs also act as risk factors for infection of wild boar, and that infected wild boar develop lesions similar to those found in domestic pigs. However, based on our data and the outbreak pattern in domestic pigs, we propose that spillover from domestic pigs to wild boar is more likely than transmission from wild boar to pigs.
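The overall prevalence of 26.2% (95% CI 23.3-29.3%) corresponds to roughly 256 positive nasal swabs out of 978, a count back-computed from the abstract and therefore an assumption. The sketch below computes a plain exact (Clopper-Pearson) binomial interval for such a proportion; the published interval is slightly wider and may additionally reflect the sampling design, so this is an illustration rather than a reproduction.

```python
# A minimal sketch of an exact (Clopper-Pearson) confidence interval for a
# prevalence such as the nasal-swab figure quoted above. The positive count
# is back-computed from the reported 26.2% and is an assumption.
from scipy.stats import beta

def clopper_pearson(k, n, alpha=0.05):
    """Exact two-sided (1 - alpha) binomial confidence interval."""
    lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lo, hi

k, n = 256, 978
lo, hi = clopper_pearson(k, n)
print(f"prevalence = {k / n:.1%}, 95% CI {lo:.1%} to {hi:.1%}")
```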
Abstract:
Ninety-one Swiss veal farms producing under a label with improved welfare standards were visited between August and December 2014 to investigate risk factors related to antimicrobial drug use and mortality. All herds consisted of the farms' own and purchased calves, with a median of 77.4% purchased calves. The calves' mean age was 29±15 days at purchase and the fattening period lasted on average 120±28 days. The mean carcass weight was 125±12 kg. A mean of 58±33 calves were fattened per farm and year, and purchased calves were bought from a mean of 20±17 farms of origin. Antimicrobial drug treatment incidence was calculated with the defined daily dose methodology. The mean treatment incidence (TIADD) was 21±15 daily doses per calf and year. The mean mortality risk was 4.1%, calves died at a mean age of 94±50 days, and the main causes of death were bovine respiratory disease (BRD, 50%) and gastro-intestinal disease (33%). Two multivariable models were constructed, for antimicrobial drug treatment incidence (53 farms) and mortality (91 farms). No quarantine, shared air space for several groups of calves, and no clinical examination upon arrival at the farm were associated with increased antimicrobial treatment incidence. Maximum group size and weight differences >100 kg within a group were associated with increased mortality risk, while vaccination and beef breed were associated with decreased mortality risk. The majority of antimicrobial treatments (84.6%) were given as group treatments with oral powder fed through an automatic milk feeding system. Combination products containing chlortetracycline with tylosin and sulfadimidine or with spiramycin were used for 54.9%, and amoxicillin for 43.7%, of the oral group treatments. The main indication for individual treatment was BRD (73%). The mean age at the time of treatment was 51 days, corresponding to an estimated weight of 80-100 kg. Individual treatments were mainly applied by injection (88.5%) and included administration of fluoroquinolones in 38.3%, penicillins (amoxicillin or benzylpenicillin) in 25.6%, macrolides in 13.1%, tetracyclines in 12.0%, 3rd- and 4th-generation cephalosporins in 4.7%, and florfenicol in 3.9% of the cases. The present study allowed the identification of risk factors for increased antimicrobial drug treatment and mortality. This is an important basis for future studies aimed at reducing treatment incidence and mortality on veal farms. Our results indicate that improvement is needed in the selection of drugs for the treatment of veal calves according to the principles of prudent use of antibiotics.
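The defined-daily-dose treatment incidence (TIADD) reported above expresses antimicrobial use as animal daily doses per calf and year. The sketch below shows one common way such a figure can be derived; the formula variant, the assumed daily dose, standard weight and total drug amount are illustrative assumptions and not taken from the study (only the calves-per-farm and fattening-period figures echo the abstract).

```python
# A hedged sketch of a defined-daily-dose treatment incidence (TIADD):
# animal daily doses administered per calf and year, under an assumed formula.
def treatment_incidence_add(total_mg, add_mg_per_kg_day, standard_weight_kg,
                            n_calves, days_at_risk):
    n_daily_doses = total_mg / (add_mg_per_kg_day * standard_weight_kg)
    return n_daily_doses / (n_calves * days_at_risk) * 365.0

ti = treatment_incidence_add(
    total_mg=640_000,         # total active substance used on the farm (assumed)
    add_mg_per_kg_day=20.0,   # assumed animal defined daily dose
    standard_weight_kg=80.0,  # assumed standard weight at treatment
    n_calves=58,              # mean calves fattened per farm and year (abstract)
    days_at_risk=120,         # mean fattening period in days (abstract)
)
print(f"TIADD ~ {ti:.0f} daily doses per calf and year")  # lands near 21 by construction
```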
Abstract:
Giardia duodenalis is considered the most common protozoan infecting humans worldwide. Molecular characterization of G. duodenalis isolates has revealed the existence of eight groups (assemblages A to H) which differ in their host distribution. A cross-sectional study was conducted in 639 children from La Habana between January and December 2013. Two assemblage-specific PCRs were carried out for the molecular characterization. The overall prevalence of Giardia infection was 11.9%. DNA from 63 of 76 (82.9%) samples was successfully amplified by PCR-tpi, while 58 of 76 (76.3%) were detected by PCRE1-HF. Concordant results from both PCRs were obtained in 54 of 76 samples (71%). According to these analyses, assemblage B and mixed assemblages A + B account for most of the Giardia infections in the cohort of children tested. Our study thus identified assemblage B as the predominant genotype in children infected with Giardia. Univariate analysis indicated that not washing hands before eating and keeping dogs at home were significant risk factors for Giardia infection. In the future, novel molecular tools for better discrimination of assemblages at the sub-assemblage level are needed to verify possible correlations between Giardia genotypes and the symptomatology of giardiasis.
Abstract:
BACKGROUND While liver-related deaths in HIV and hepatitis C virus (HCV) co-infected individuals have declined over the last decade, hepatocellular carcinoma (HCC) may have increased. We described the epidemiology of HCC and other liver events in a multi-cohort collaboration of HIV/HCV co-infected individuals. METHODS We studied all HCV antibody-positive adults with HIV in the EuroSIDA Study, the Southern Alberta Clinic Cohort, the Canadian Co-infection Cohort, and the Swiss HIV Cohort Study from 2001 to 2014. We calculated the incidence of HCC and other liver events (defined as liver-related deaths or decompensations, excluding HCC) and used Poisson regression to estimate incidence rate ratios. RESULTS Our study comprised 7,229 HIV/HCV co-infected individuals (68% male, 90% white). During follow-up, 72 cases of HCC and 375 other liver events occurred, yielding incidence rates of 1.6 (95% confidence interval (CI): 1.3, 2.0) and 8.6 (95% CI: 7.8, 9.5) cases per 1,000 person-years of follow-up, respectively. The rate of HCC increased 11% per calendar year (95% CI: 4%, 19%) and decreased 4% for other liver events (95% CI: 2%, 7%), but only the latter remained statistically significant after adjustment for potential confounders. High age, cirrhosis, and low current CD4 cell count were associated with a higher incidence of both HCC and other liver events. CONCLUSIONS In HIV/HCV co-infected individuals, the crude incidence of HCC increased from 2001 to 2014, while other liver events declined. Individuals with cirrhosis or low current CD4 cell count are at highest risk of developing HCC or other liver events.
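The incidence rates per 1,000 person-years follow directly from event counts and person-time; for example, 72 HCC cases at 1.6 per 1,000 person-years implies roughly 45,000 person-years of follow-up. The sketch below computes an exact Poisson confidence interval for that rate under this back-computed (approximate) person-time; it is a consistency check, not the study's own analysis.

```python
# A minimal sketch of an exact Poisson confidence interval for the HCC
# incidence rate quoted above. Person-years are back-computed from the
# reported rate (72 cases at ~1.6 per 1,000 PY), so they are approximate.
from scipy.stats import chi2

def exact_poisson_rate_ci(events, person_years, alpha=0.05):
    """Exact (chi-square based) two-sided CI for a Poisson rate."""
    lo = chi2.ppf(alpha / 2, 2 * events) / 2 if events > 0 else 0.0
    hi = chi2.ppf(1 - alpha / 2, 2 * (events + 1)) / 2
    return lo / person_years, hi / person_years

events, pyears = 72, 45_000
lo, hi = exact_poisson_rate_ci(events, pyears)
print(f"HCC: {events / pyears * 1000:.1f} per 1,000 PY "
      f"(95% CI {lo * 1000:.1f} to {hi * 1000:.1f})")  # ~1.6 (1.3 to 2.0)
```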
Abstract:
Importance A key factor in assessing the effectiveness and cost-effectiveness of antiretroviral therapy (ART) as a prevention strategy is the absolute risk of HIV transmission through condomless sex with suppressed HIV-1 RNA viral load for both anal and vaginal sex. Objective To evaluate the rate of within-couple HIV transmission (heterosexual and men who have sex with men [MSM]) during periods of sex without condoms and when the HIV-positive partner had HIV-1 RNA load less than 200 copies/mL. Design, Setting, and Participants The prospective, observational PARTNER (Partners of People on ART-A New Evaluation of the Risks) study was conducted at 75 clinical sites in 14 European countries and enrolled 1166 HIV serodifferent couples (HIV-positive partner taking suppressive ART) who reported condomless sex (September 2010 to May 2014). Eligibility criteria for inclusion of couple-years of follow-up were condomless sex and HIV-1 RNA load less than 200 copies/mL. Anonymized phylogenetic analysis compared couples' HIV-1 polymerase and envelope sequences if an HIV-negative partner became infected to determine phylogenetically linked transmissions. Exposures Condomless sexual activity with an HIV-positive partner taking virally suppressive ART. Main Outcomes and Measures Risk of within-couple HIV transmission to the HIV-negative partner. Results Among 1166 enrolled couples, 888 (mean age, 42 years [IQR, 35-48]; 548 heterosexual [61.7%] and 340 MSM [38.3%]) provided 1238 eligible couple-years of follow-up (median follow-up, 1.3 years [IQR, 0.8-2.0]). At baseline, couples reported condomless sex for a median of 2 years (IQR, 0.5-6.3). Condomless sex with other partners was reported by 108 HIV-negative MSM (33%) and 21 heterosexuals (4%). During follow-up, couples reported condomless sex a median of 37 times per year (IQR, 15-71), with MSM couples reporting approximately 22 000 condomless sex acts and heterosexuals approximately 36 000. Although 11 HIV-negative partners became HIV-positive (10 MSM; 1 heterosexual; 8 reported condomless sex with other partners), no phylogenetically linked transmissions occurred over eligible couple-years of follow-up, giving a rate of within-couple HIV transmission of zero, with an upper 95% confidence limit of 0.30/100 couple-years of follow-up. The upper 95% confidence limit for condomless anal sex was 0.71 per 100 couple-years of follow-up. Conclusions and Relevance Among serodifferent heterosexual and MSM couples in which the HIV-positive partner was using suppressive ART and who reported condomless sex, during median follow-up of 1.3 years per couple, there were no documented cases of within-couple HIV transmission (upper 95% confidence limit, 0.30/100 couple-years of follow-up). Additional longer-term follow-up is necessary to provide more precise estimates of risk.
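The headline finding is a within-couple transmission rate of zero with an upper 95% confidence limit of 0.30 per 100 couple-years. With zero linked events over 1,238 eligible couple-years, the exact two-sided Poisson upper bound reproduces this figure, as the sketch below checks; the abstract does not state the study's exact method, so this is only a consistency check under a standard assumption.

```python
# A minimal sketch verifying the upper confidence limit quoted above: zero
# linked transmissions over 1,238 eligible couple-years gives an exact
# two-sided 95% Poisson upper bound of roughly 0.30 per 100 couple-years.
from scipy.stats import chi2

def poisson_upper_limit(events, exposure, alpha=0.05):
    """Upper bound of an exact two-sided (1 - alpha) Poisson CI for a rate."""
    return chi2.ppf(1 - alpha / 2, 2 * (events + 1)) / (2 * exposure)

upper = poisson_upper_limit(events=0, exposure=1238)
print(f"upper 95% limit ~ {upper * 100:.2f} per 100 couple-years")  # ~0.30
```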
Abstract:
The early detection and treatment of persons at risk for psychosis is currently regarded as a promising strategy for fighting the devastating consequences of psychotic disorders. The two current at-risk approaches, i.e., the "ultra high risk" and the "basic symptom" criteria, were mainly developed on adult samples. Initial evidence suggests, however, that they cannot simply be applied to children and adolescents. For the ultra high risk criteria, there are indications that some attenuated psychotic symptoms may be non-specific in adolescents and that brief limited intermittent symptoms are difficult to classify clinically in children when observable behavioral correlates are missing. For basic symptoms, too, only preliminary evidence of their usefulness in children and adolescents exists. Since developmental peculiarities in the assessment of basic symptoms have to be considered, a child and youth version of the Schizophrenia Proneness Instrument (SPI-CY) was developed. In conclusion, research on the clinical-prognostic validity of the at-risk criteria and on their potential adaptation to the special needs of children and adolescents is needed. If a Prodromal Risk Syndrome for Psychosis or Attenuated Psychotic Symptoms Syndrome is included in DSM-V, it has to be highlighted that its suitability for children and adolescents is only insufficiently known.