Abstract:
BACKGROUND: Cigarette smoking is often initiated at a young age, as are other risky behaviors such as alcohol drinking and cannabis and other illicit drug use. Some studies suggest that cigarette smoking may influence other risky behaviors, but little is known about the chronology in which those different habits appear. The aim of this study was to assess, among young men, which other risky behaviors were associated with cigarette smoking, as well as the joint prevalence and chronology of occurrence of those risky behaviors. METHODS: Cross-sectional analyses of a population-based census of 3526 young men attending recruitment for the Swiss army, aged between 17 and 25 years (mean age: 19 years), who completed a self-report questionnaire about their alcohol, cigarette, cannabis, and other illicit drug habits. Current smoking was defined as either regular smoking (≥1 cigarette/day, every day) or occasional smoking; binge drinking as six or more drinks at least twice a month; at-risk drinking as 21 or more drinks per week; recent cannabis use as cannabis consumption at least once during the last month; and use of illicit drugs as consumption at least once of illicit drugs other than cannabis. Age at onset was defined as the age at first cannabis use or first cigarette smoked. RESULTS: In this population of young men, the prevalence of current smoking was 51.2% (36.5% regular smoking, 14.6% occasional smoking). A majority of participants (60.1%) declared that they had ever used cannabis, and 25.2% reported recent cannabis use. 53.8% of participants had risky alcohol consumption, defined as either binge or at-risk drinking. Cigarette smoking was significantly associated with recent cannabis use (odds ratio (OR): 3.85, 95% confidence interval (CI): 3.10-4.77), binge drinking (OR: 3.48, 95% CI: 3.03-4.00), at-risk alcohol drinking (OR: 4.04, 95% CI: 3.12-5.24), and ever use of illicit drugs (OR: 4.34, 95% CI: 3.54-5.31). In a multivariate logistic regression, odds ratios for smoking were increased for cannabis users (OR: 3.10, 95% CI: 2.48-3.88), binge drinkers (OR: 1.77, 95% CI: 1.44-2.17), at-risk alcohol drinkers (OR: 2.26, 95% CI: 1.52-3.36), and ever users of illicit drugs (OR: 1.56, 95% CI: 1.20-2.03). The majority of young men (57.3%) initiated smoking before cannabis use (mean age at onset: 13.4 years), whereas only 11.1% began using cannabis before smoking cigarettes (mean age at onset: 14.4 years); 31.6% started both cannabis and tobacco at the same age (15 years). About a third of participants (30.5%) had a cluster of risky behaviors (smoking, at-risk drinking, cannabis use), and 11.0% cumulated smoking, drinking, cannabis use, and ever use of illegal drugs. More than half of smokers (59.6%) combined cannabis use and at-risk alcohol drinking, whereas only 18.5% of non-smokers did. CONCLUSIONS: The majority of young smokers initiated their risky behaviors with smoking, followed by other psychoactive drugs. Smokers are at increased risk of other risky behaviors, such as cannabis use, at-risk alcohol consumption, and illicit drug use, compared with nonsmokers. Prevention among young male adults should focus on smoking and also integrate interventions targeting other risky behaviors.
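The crude and adjusted odds ratios reported above are the exponentiated coefficients of univariate and multivariate logistic regressions. As a minimal sketch only (Python with synthetic data; every variable name is a placeholder, not the study's dataset), both kinds of OR with 95% CIs could be obtained like this:

```python
# Hypothetical illustration of crude vs. adjusted ORs from logistic regression;
# the data are random, so the printed values will not match the abstract.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 3526  # census size reported in the abstract
df = pd.DataFrame({c: rng.integers(0, 2, n) for c in
                   ["smoking", "cannabis", "binge", "at_risk_alc", "illicit_ever"]})

crude = smf.logit("smoking ~ cannabis", data=df).fit(disp=0)           # crude OR
adj = smf.logit("smoking ~ cannabis + binge + at_risk_alc + illicit_ever",
                data=df).fit(disp=0)                                    # adjusted OR

for label, model in [("crude", crude), ("adjusted", adj)]:
    or_ = np.exp(model.params["cannabis"])
    lo, hi = np.exp(model.conf_int().loc["cannabis"])
    print(f"{label} OR for cannabis: {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```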
Abstract:
There are suggestions of an inverse association of folate intake and serum folate levels with the risk of oral cavity and pharyngeal cancers (OPCs), but most studies are limited in sample size, with only a few reporting information on the source of dietary folate. Our study aims to investigate the association between folate intake and the risk of OPC within the International Head and Neck Cancer Epidemiology (INHANCE) Consortium. We analyzed pooled individual-level data from ten case-control studies participating in the INHANCE consortium, including 5,127 cases and 13,249 controls. Odds ratios (ORs) and the corresponding 95% confidence intervals (CIs) were estimated for the associations between total folate intake (from natural sources, fortification, and supplementation), as well as natural folate only, and OPC risk. We found an inverse association between total folate intake and overall OPC risk (the adjusted OR for the highest vs. the lowest quintile was 0.65, 95% CI: 0.43-0.99), with a stronger association for the oral cavity (OR = 0.57, 95% CI: 0.43-0.75). A similar, though somewhat weaker, inverse association was observed for folate intake from natural sources only in oral cavity cancer (OR = 0.64, 95% CI: 0.45-0.91). The highest OPC risk was observed in heavy alcohol drinkers with low folate intake compared with never/light drinkers with high folate intake (OR = 4.05, 95% CI: 3.43-4.79); the attributable proportion (AP) owing to interaction was 11.1% (95% CI: 1.4-20.8%). Lastly, we reported an OR of 2.73 (95% CI: 2.34-3.19) for ever tobacco users with low folate intake compared with never tobacco users with high folate intake (AP owing to interaction = 10.6%, 95% CI: 0.41-20.8%). Our pooled analysis of a large set of case-control studies supports a protective effect of total folate intake on OPC risk.
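The "attributable proportion owing to interaction" quoted above is usually computed on the additive scale as AP = RERI / OR11, with RERI = OR11 - OR10 - OR01 + 1. The sketch below shows only the arithmetic; OR11 = 4.05 is taken from the abstract, while OR10 and OR01 are invented for illustration, so the result will not reproduce the reported 11.1%.

```python
# Attributable proportion (AP) due to additive interaction, Rothman-style.
def attributable_proportion(or11: float, or10: float, or01: float) -> float:
    """AP of the doubly exposed group's risk that is attributable to interaction."""
    reri = or11 - or10 - or01 + 1.0  # relative excess risk due to interaction
    return reri / or11

# or11 from the abstract; or10 and or01 are hypothetical single-exposure ORs.
print(f"AP = {attributable_proportion(4.05, 2.6, 1.5):.1%}")  # 23.5% with these inputs
```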
Abstract:
The use of bone mineral density (BMD) for fracture discrimination may be improved by considering bone microarchitecture. Texture parameters such as the trabecular bone score (TBS) or the mean Hurst parameter (H) could help identify women at high risk of fracture within the non-osteoporotic group. The purpose of this study was to combine BMD and microarchitectural texture parameters (spine TBS and calcaneus H) for the detection of osteoporotic fractures. Two hundred fifty-five women underwent lumbar spine (LS), total hip (TH), and femoral neck (FN) DXA. Additionally, texture analyses were performed with TBS on spine DXA and with H on calcaneus radiographs. Seventy-nine women had prevalent fragility fractures. The association with fracture was evaluated by multivariate logistic regressions. The diagnostic value of each parameter, alone and in combination, was evaluated by odds ratios (ORs). The areas under the receiver operating characteristic (ROC) curves (AUCs) were assessed in models including BMD, H, and TBS. Women were also classified above and below the lowest tertile of H or TBS according to their BMD status. Women with prevalent fracture were older and had lower TBS, H, LS-BMD, and TH-BMD than women without fracture. Age-adjusted ORs were 1.66, 1.70, and 1.93 for LS-, FN-, and TH-BMD, respectively. Both TBS and H remained significantly associated with fracture after adjustment for age and TH-BMD: OR 2.07 [1.43; 3.05] and 1.47 [1.04; 2.11], respectively. The addition of texture parameters in the multivariate models did not significantly improve the ROC AUC. However, women with normal or osteopenic BMD in the lowest range of TBS or H had significantly more fractures than women above the TBS or H threshold. We have shown the potential value of texture parameters such as TBS and H, in addition to BMD, for discriminating patients with and without osteoporotic fractures. However, their added clinical value should be evaluated relative to other risk factors.
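Comparing a BMD-only model with one that adds texture parameters, as done above, amounts to comparing ROC AUCs of two logistic models. A minimal sketch on simulated data (all arrays below are placeholders for the study's actual measurements):

```python
# Does adding texture parameters (TBS, H) to BMD improve discrimination?
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 255                                # cohort size from the abstract
X_bmd = rng.normal(size=(n, 1))        # stand-in for TH-BMD
X_tex = rng.normal(size=(n, 2))        # stand-ins for TBS and H
y = rng.integers(0, 2, n)              # prevalent fracture yes/no

X_full = np.hstack([X_bmd, X_tex])
base = LogisticRegression().fit(X_bmd, y)
full = LogisticRegression().fit(X_full, y)

print("AUC BMD only   :", roc_auc_score(y, base.predict_proba(X_bmd)[:, 1]))
print("AUC BMD+texture:", roc_auc_score(y, full.predict_proba(X_full)[:, 1]))
```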
Abstract:
Background: Previous studies have shown that immigrant workers face relatively worse working and employment conditions, as well as lower rates of sickness absence, than native-born workers. This study aims to assess rates of sickness presenteeism in a sample of Spanish-born and foreign-born workers according to different characteristics. Methods: A cross-sectional survey was conducted among a convenience sample of workers (Spanish-born and foreign-born) living in four Spanish cities: Barcelona, Huelva, Madrid and Valencia (2008-2009). Sickness presenteeism information was collected through two items in the questionnaire ("Have you had health problems in the last year?" and "Have you ever had to miss work for any health problem?"); presenteeism was defined as having had a health problem (yes to the first item) without missing work (no to the second item). The analysis included the 2,059 workers (1,617 foreign-born) who reported health problems. After descriptive analyses, logistic regressions were used to establish the association between country of origin and sickness presenteeism (adjusted odds ratios, aOR; 95% confidence intervals, 95%CI). Among foreign-born workers, analyses were stratified by time spent in Spain. Results: All of the results refer to the comparison between foreign-born and Spanish-born workers as a whole, and within some categories relating to personal and occupational conditions. Foreign-born workers were more likely to report sickness presenteeism than their Spanish-born counterparts, especially those living in Spain for under 2 years [prevalence: 42% in Spanish-born and 56.3% in foreign-born; aOR 1.77, 95%CI 1.24-2.53]. Among foreign-born workers with less than 2 years in Spain, men [aOR 2.31, 95%CI 1.40-3.80], those with university studies [aOR 3.01, 95%CI 1.04-8.69], those on temporary contracts [aOR 2.26, 95%CI 1.29-3.98] and those with monthly salaries of 751-1,200€ [aOR 1.74, 95%CI 1.04-2.92] were more likely to report sickness presenteeism. Also, recent immigrants with good self-perceived health and good mental health were more likely to report presenteeism than Spanish-born workers with the same good health indicators. Conclusions: Immigrant workers report more sickness presenteeism than their Spanish-born counterparts. These results could be related to the precarious work and employment conditions of immigrants. Immigrant workers should benefit from the same standards of social security and of workplace health and safety as those enjoyed by Spanish workers.
Abstract:
BACKGROUND: The dose intensity of chemotherapy can be increased to the highest possible level by early administration of multiple and sequential high-dose cycles supported by transfusion with peripheral blood progenitor cells (PBPCs). A randomized trial was performed to test the impact of such dose intensification on the long-term survival of patients with small cell lung cancer (SCLC). METHODS: Patients who had limited or extensive SCLC with no more than two metastatic sites were randomly assigned to high-dose (High, n = 69) or standard-dose (Std, n = 71) chemotherapy with ifosfamide, carboplatin, and etoposide (ICE). High-ICE cycles were supported by transfusion with PBPCs that were collected after two cycles of treatment with epidoxorubicin at 150 mg/m(2), paclitaxel at 175 mg/m(2), and filgrastim. The primary outcome was 3-year survival. Comparisons between response rates and toxic effects within subgroups (limited or extensive disease, liver metastases or no liver metastases, Eastern Cooperative Oncology Group performance status of 0 or 1, normal or abnormal lactate dehydrogenase levels) were also performed. RESULTS: Median relative dose intensity in the High-ICE arm was 293% (range = 174%-392%) of that in the Std-ICE arm. The 3-year survival rates were 18% (95% confidence interval [CI] = 10% to 29%) and 19% (95% CI = 11% to 30%) in the High-ICE and Std-ICE arms, respectively. No differences were observed between the High-ICE and Std-ICE arms in overall response (n = 54 [78%, 95% CI = 67% to 87%] and n = 48 [68%, 95% CI = 55% to 78%], respectively) or complete response (n = 27 [39%, 95% CI = 28% to 52%] and n = 24 [34%, 95% CI = 23% to 46%], respectively). Subgroup analyses showed no benefit for any outcome from High-ICE treatment. Hematologic toxicity was substantial in the Std-ICE arm (grade ≥3 neutropenia, n = 49 [70%]; anemia, n = 17 [25%]; thrombocytopenia, n = 17 [25%]), and three patients (4%) died from toxicity. High-ICE treatment was predictably associated with severe myelosuppression, and five patients (8%) died from toxicity. CONCLUSIONS: The long-term outcome of SCLC was not improved by raising the dose intensity of ICE chemotherapy threefold.
Abstract:
BACKGROUND: HIV treatment recommendations are updated as clinical trials are published. Whether recommendations drive clinicians to change antiretroviral therapy in well-controlled patients is unexplored. METHODS: We selected patients with undetectable viral loads (VLs) on nonrecommended regimens containing double-boosted protease inhibitors (DBPIs), triple-nucleoside reverse transcriptase inhibitors (NRTIs), or didanosine (ddI) plus stavudine (d4T) at publication of the 2006 International AIDS Society recommendations. We compared their demographic and clinical characteristics with those of control patients with undetectable VLs not on these regimens and examined clinical outcomes and reasons for treatment modification. RESULTS: At inclusion, 104 patients were in the DBPI group, 436 in the triple-NRTI group, and 19 in the ddI/d4T group. By 2010, 28 patients (29%) were still on DBPIs, 204 (52%) on triple-NRTIs, and 1 (5%) on ddI plus d4T. 'Physician decision,' excluding toxicity/virological failure, drove 30% of treatment changes. Predictors of recommendation nonobservance included female sex [adjusted odds ratio (aOR) 2.69, 95% confidence interval (CI) 1 to 7.26; P = 0.01] for DBPIs, and undetectable VL (aOR 3.53, 95% CI 1.6 to 7.8; P = 0.002) and lack of cardiovascular events (aOR 2.93, 95% CI 1.23 to 6.97; P = 0.02) for triple-NRTIs. All patients on DBPIs with documented diabetes or a cardiovascular event changed treatment. Recommendation observance resulted in lower cholesterol values in the DBPI group (P = 0.06) and in more patients having an undetectable VL (P = 0.02) in the triple-NRTI group. CONCLUSION: The physician's decision is the main factor driving change from nonrecommended to recommended regimens, whereas virological suppression is associated with not switching. The positive clinical outcomes observed after switching underline the importance of observing recommendations, even in well-controlled patients.
Abstract:
Background: Following the discovery that mutant KRAS is associated with resistance to anti-epidermal growth factor receptor (EGFR) antibodies, the tumours of patients with metastatic colorectal cancer are now profiled for seven KRAS mutations before receiving cetuximab or panitumumab. However, most patients with KRAS wild-type tumours still do not respond. We studied the effect of other downstream mutations on the efficacy of cetuximab in, to our knowledge, the largest cohort to date of patients with chemotherapy-refractory metastatic colorectal cancer treated with cetuximab plus chemotherapy in the pre-KRAS selection era. Methods: 1022 tumour DNA samples (73 from fresh-frozen and 949 from formalin-fixed, paraffin-embedded tissue) from patients treated with cetuximab between 2001 and 2008 were gathered from 11 centres in seven European countries. 773 primary tumour samples had sufficient-quality DNA and were included in mutation frequency analyses; mass spectrometry genotyping of tumour samples for KRAS, BRAF, NRAS, and PIK3CA was done centrally. We analysed objective response, progression-free survival (PFS), and overall survival in molecularly defined subgroups of the 649 chemotherapy-refractory patients treated with cetuximab plus chemotherapy. Findings: 40.0% (299/747) of the tumours harboured a KRAS mutation, 14.5% (108/743) harboured a PIK3CA mutation (of which 68.5% [74/108] were located in exon 9 and 20.4% [22/108] in exon 20), 4.7% (36/761) harboured a BRAF mutation, and 2.6% (17/644) harboured an NRAS mutation. KRAS mutants did not derive benefit compared with wild types, with a response rate of 6.7% (17/253) versus 35.8% (126/352; odds ratio [OR] 0.13, 95% CI 0.07-0.22; p<0.0001), a median PFS of 12 weeks versus 24 weeks (hazard ratio [HR] 1.98, 1.66-2.36; p<0.0001), and a median overall survival of 32 weeks versus 50 weeks (1.75, 1.47-2.09; p<0.0001). In KRAS wild types, carriers of BRAF and NRAS mutations had a significantly lower response rate than did BRAF and NRAS wild types, with a response rate of 8.3% (2/24) in carriers of BRAF mutations versus 38.0% in BRAF wild types (124/326; OR 0.15, 95% CI 0.02-0.51; p=0.0012); and 7.7% (1/13) in carriers of NRAS mutations versus 38.1% in NRAS wild types (110/289; OR 0.14, 0.007-0.70; p=0.013). PIK3CA exon 9 mutations had no effect, whereas exon 20 mutations were associated with a worse outcome compared with wild types, with a response rate of 0.0% (0/9) versus 36.8% (121/329; OR 0.00, 0.00-0.89; p=0.029), a median PFS of 11.5 weeks versus 24 weeks (HR 2.52, 1.33-4.78; p=0.013), and a median overall survival of 34 weeks versus 51 weeks (3.29, 1.60-6.74; p=0.0057). Multivariate analysis and conditional inference trees confirmed that, if KRAS is not mutated, assessing BRAF, NRAS, and PIK3CA exon 20 mutations (in that order) gives additional information about outcome. Objective response rates in our series were 24.4% in the unselected population, 36.3% in the KRAS wild-type selected population, and 41.2% in the KRAS, BRAF, NRAS, and PIK3CA exon 20 wild-type population. Interpretation: While confirming the negative effect of KRAS mutations on outcome after cetuximab, we show that BRAF, NRAS, and PIK3CA exon 20 mutations are significantly associated with a low response rate. Objective response rates could be improved by additional genotyping of BRAF, NRAS, and PIK3CA exon 20 mutations in a KRAS wild-type population.
Abstract:
OBJECTIVES: Hypoglycaemia (glucose <2.2 mmol/l) is a defining feature of severe malaria, but the significance of other blood glucose levels has not previously been studied in children with severe malaria. METHODS: A prospective study of 437 consecutive children with presumed severe malaria was conducted in Mali. We defined hypoglycaemia as <2.2 mmol/l, low glycaemia as 2.2-4.4 mmol/l and hyperglycaemia as >8.3 mmol/l. Associations between glycaemia and case fatality were analysed for 418 children using logistic regression models and a receiver operating characteristic (ROC) curve. RESULTS: There was a significant difference between blood glucose levels in children who died (median 4.6 mmol/l) and survivors (median 7.6 mmol/l, P < 0.001). Case fatality declined from 61.5% among hypoglycaemic children to 46.2% among those with low glycaemia, 13.4% among those with normal glycaemia and 7.6% among those with hyperglycaemia (P < 0.001). Logistic regression showed an adjusted odds ratio (AOR) of 0.75 (0.64-0.88) for case fatality per 1 mmol/l increase in baseline blood glucose. Compared with normal blood glucose, hypoglycaemia and low glycaemia both significantly increased the odds of death (AOR 11.87, 2.10-67.00; and 5.21, 1.86-14.63, respectively), whereas hyperglycaemia reduced the odds of death (AOR 0.34, 0.13-0.91). The ROC curve [area under the curve 0.753 (95% CI 0.684-0.820)] indicated that glycaemia had moderate predictive value for death and identified an optimal threshold at glycaemia <6.1 mmol/l (sensitivity 64.5%, specificity 75.1%). CONCLUSIONS: If there is a threshold of blood glucose that defines a worse prognosis, it is at a higher level than the current definition of 2.2 mmol/l.
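The "optimal threshold" reported above is typically the point on the ROC curve that maximises the Youden index (sensitivity + specificity - 1). A sketch on simulated data (the glucose values are generated only to make the code run, so the cut-off will differ from the study's 6.1 mmol/l):

```python
# Reading an optimal cut-off off a ROC curve via the Youden index.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(1)
died = rng.integers(0, 2, 418)                    # outcome; n from the abstract
glucose = np.where(died == 1,
                   rng.normal(4.6, 2.0, 418),     # medians from the abstract
                   rng.normal(7.6, 2.0, 418))

# Lower glucose predicts death, so score with the negated value.
fpr, tpr, thresholds = roc_curve(died, -glucose)
best = np.argmax(tpr - fpr)                       # maximise the Youden index
print(f"optimal cut-off: glucose < {-thresholds[best]:.1f} mmol/l "
      f"(sensitivity {tpr[best]:.1%}, specificity {1 - fpr[best]:.1%})")
```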
Abstract:
Maternal mortality has become one of the priority public health problems, directly affecting women during pregnancy and the puerperium who belong to less favored social classes. Given this situation, the aim of this study was to identify associations between the race of women residing in the state of Paraíba and the variables age group, educational level, and type of death among women who died of maternal causes between 2000 and 2004. This was a cross-sectional study whose data source consisted of 109 maternal death certificates. Bivariate and multivariate statistical analyses were performed to evaluate the association between the variables through multiple logistic regression, and odds ratios were calculated to investigate those associations. No statistically significant association was observed between race and age or between race and educational level, but there was significant evidence that non-white women in Paraíba had a higher chance of dying from direct obstetric causes (OR = 3.55; 95% CI: 1.20-10.5). The results showed that the risk of maternal mortality in Paraíba was higher among non-white women, constituting an important expression of social inequality.
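A figure such as OR = 3.55 (95% CI: 1.20-10.5) can be obtained from a 2x2 table using Woolf's logit interval. The sketch below uses hypothetical cell counts, not the study's 109 death certificates:

```python
# Odds ratio and Woolf 95% CI from a 2x2 table (all counts hypothetical).
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a, b = exposed cases/controls; c, d = unexposed cases/controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)        # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

print("OR = %.2f (95%% CI %.2f-%.2f)" % odds_ratio_ci(30, 20, 15, 35))
```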
Abstract:
BACKGROUND: Allostatic load reflects cumulative exposure to stressors throughout the lifetime and has been associated with several adverse health outcomes. It is hypothesized that people with low socioeconomic status (SES) are exposed to higher chronic stress and therefore have greater levels of allostatic load. OBJECTIVE: To assess the association of receiving social transfers and of low education with allostatic load. METHODS: We included 3589 participants (1812 women) aged over 35 years and under retirement age from the population-based CoLaus study (Lausanne, Switzerland, 2003-2006). We computed an allostatic load index aggregating cardiovascular, metabolic, dyslipidemic and inflammatory markers. A novel index additionally including markers of oxidative stress was also examined. RESULTS: Men with low vs. high SES were more likely to have higher levels of allostatic load (odds ratio (OR) = 1.93 for receiving social transfers and 2.34 for low education; 95% CIs ranging from 1.45 to 4.17). The same patterns were observed among women. Associations persisted after controlling for health behaviors and marital status. CONCLUSIONS: Low education and receiving social transfers independently and cumulatively predict high allostatic load and dysregulation of several homeostatic systems in a Swiss population-based study. Participants with low SES are at higher risk of oxidative stress, which may justify its inclusion as a separate component of allostatic load.
Abstract:
Port-a-Cath® (PAC) devices are totally implantable ports that offer easy, long-term access to the venous circulation. They have been extensively used for intravenous therapy administration and are particularly well suited for chemotherapy in oncologic patients. Previous comparative studies have shown that these devices have the lowest catheter-related bloodstream infection rates among all intravascular access systems. However, bloodstream infection (BSI) remains a major issue of port use, and epidemiological data on PAC-associated BSI (PABSI) rates differ strongly between studies. Moreover, the current literature on PABSI risk factors is scarce and sometimes controversial. Such heterogeneity may depend on the type of population studied and on local factors. Therefore, the aim of this study was to describe the local epidemiology of and risk factors for PABSI in adult patients in our tertiary-care university hospital. We conducted a retrospective cohort study to describe the local epidemiology, and a nested case-control study to identify local risk factors for PABSI. We analyzed the medical files of adult patients who had a PAC implanted between January 1st, 2008 and December 31st, 2009 and looked for PABSI occurrence before May 1st, 2011 to define cases. Thirty-nine PABSIs occurred in this population, with an attack rate of 5.8%. We estimated an incidence rate of 0.08/1000 PAC-days using the case-control study. PABSI causative agents were mainly Gram-positive cocci (62%). We identified three predictive factors of PABSI by multivariate statistical analysis: neutropenia on the outcome date (odds ratio [OR]: 4.05; 95% confidence interval [CI]: 1.05-15.66; p=0.042), diabetes (OR: 11.53; 95% CI: 1.07-124.70; p=0.044) and having another infection than PABSI on the outcome date (OR: 6.35; 95% CI: 1.50-26.86; p=0.012). Patients suffering from acute or chronic renal failure (OR: 4.26; 95% CI: 0.94-19.21; p=0.059) or carrying another invasive device (OR: 2.99; 95% CI: 0.96-9.31; p=0.059) did not show a statistically significant increase in PABSI risk at the conventional threshold (p<0.05) but remained close to significance. Our study showed that the local epidemiology and microbiology of PABSI in our institution were similar to previous reports. A larger prospective study is required to confirm our results or to test preventive measures.
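The two rates quoted above measure different things: the attack rate is cases per device implanted, while the incidence density is cases per unit of catheter follow-up time. The denominators below are hypothetical, back-calculated only to show the arithmetic behind the reported 5.8% and 0.08/1000 PAC-days:

```python
# Attack rate vs. incidence density for PABSI (denominators are hypothetical).
cases = 39               # PABSI cases, from the abstract
devices = 674            # hypothetical number of PACs implanted
pac_days = 487_500       # hypothetical total follow-up in catheter-days

print(f"attack rate: {cases / devices:.1%}")                        # ~5.8%
print(f"incidence:   {cases / pac_days * 1000:.2f}/1000 PAC-days")  # ~0.08
```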
Abstract:
OBJECTIVES: The objectives were to identify the social and medical factors associated with frequent emergency department (ED) use and to determine whether frequent users were more likely to have a combination of these factors in a universal health insurance system. METHODS: This was a retrospective chart review case-control study comparing randomized samples of frequent users and nonfrequent users at the Lausanne University Hospital, Switzerland. The authors defined frequent users as patients with four or more ED visits within the previous 12 months. Adult patients who visited the ED between April 2008 and March 2009 (study period) were included, and patients leaving the ED without medical discharge were excluded. For each patient, the first ED electronic record within the study period was considered for data extraction. Along with basic demographics, variables of interest included social (employment or housing status) and medical (ED primary diagnosis) characteristics. Significant social and medical factors were used to construct a logistic regression model to determine factors associated with frequent ED use. In addition, the combination of social and medical factors was examined. RESULTS: A total of 359 of 1,591 frequent and 360 of 34,263 nonfrequent users were selected. Frequent users accounted for less than one-twentieth of all ED patients (4.4%) but for 12.1% of all visits (5,813 of 48,117), with a maximum of 73 ED visits. There was no difference in terms of age or sex, but more frequent users had a nationality other than Swiss or European (n = 117 [32.6%] vs. n = 83 [23.1%], p = 0.003). Adjusted multivariate analysis showed that social and specific medical vulnerability factors most increased the risk of frequent ED use: being under guardianship (adjusted odds ratio [OR] = 15.8; 95% confidence interval [CI] = 1.7 to 147.3), living closer to the ED (adjusted OR = 4.6; 95% CI = 2.8 to 7.6), being uninsured (adjusted OR = 2.5; 95% CI = 1.1 to 5.8), being unemployed or dependent on government welfare (adjusted OR = 2.1; 95% CI = 1.3 to 3.4), the number of psychiatric hospitalizations (adjusted OR = 4.6; 95% CI = 1.5 to 14.1), and the use of five or more clinical departments over 12 months (adjusted OR = 4.5; 95% CI = 2.5 to 8.1). Having two of four social factors increased the odds of frequent ED use (adjusted OR = 5.4; 95% CI = 2.9 to 9.9), and similar results were found for medical factors (adjusted OR = 7.9; 95% CI = 4.6 to 13.4). A combination of social and medical factors was markedly associated with frequent ED use, as frequent users were 10 times more likely to have three of them (out of a total of eight factors; 95% CI = 5.1 to 19.6). CONCLUSIONS: Frequent users accounted for a moderate proportion of visits at the Lausanne ED. Social and medical vulnerability factors were associated with frequent ED use. In addition, frequent users were more likely than other patients to have both social and medical vulnerabilities. Case management strategies might address the vulnerability factors of frequent users to prevent inequities in health care and related costs.
Abstract:
Purpose: To compare the sexual behavior of adolescent males who do and do not watch pornographic websites. Methods: This study was conducted as a school survey. Data were drawn from the 2002 Swiss Multicenter Adolescent Survey on Health (SMASH02) database, a survey including 7,548 adolescents aged 16-20. The setting was post-mandatory schools in Switzerland. A total of 2,891 male students who had connected to the internet in the last 30 days were enrolled and divided into two groups: boys who deliberately watched pornographic websites in the last 30 days (n = 942; 33%) and boys who did not (n = 1,949; 67%). Measures included socio-demographic characteristics; frequency of connection to the internet; and sexual behavior parameters (having a girlfriend and, if so, for more or less than 6 months; having had sexual intercourse; age at first sexual intercourse; use of a condom at last sexual intercourse; number of sexual partners; having made a partner pregnant). Results: A logistic regression was performed using STATA 9.2. The only significant socio-demographic variable was having a low socioeconomic status (adjusted odds ratio [AOR] 1.66); no difference was found for age or academic track between the two groups. Boys who watched pornographic websites were also significantly more likely to connect frequently to the internet (one day a week: AOR 1.75; several days a week: AOR 2.36; every day: AOR 3.11), to have had sexual intercourse (AOR 2.06), and to have had their first sexual intercourse before age 15 (AOR 1.48). The stability of the relationship with their girlfriend did not appear to have any influence on the search for pornography on the internet. Conclusions: About one-third of boys in our sample reported having accessed pornographic websites in the last 30 days, a proportion similar to other studies. Watching such websites increases with the frequency of connection to the internet and seems to be correlated with an earlier sexual debut among adolescent males. However, having had first sexual intercourse before age 15 is the only sexual risk behavior that seems to be increased among boys watching pornographic websites. Further studies should address the causality of this correlation and the factors influencing the search for pornography on the web among boys, in order to explore new avenues for the prevention of sexual risk behaviors. Sources of Support: The SMASH02 survey was carried out with the financial support of the Swiss Federal Office of Public Health and the participating cantons.
Abstract:
BACKGROUND: Toll-like receptors (TLRs) are essential components of the immune response to fungal pathogens. We examined the role of TLR polymorphisms in conferring a risk of invasive aspergillosis among recipients of allogeneic hematopoietic-cell transplants. METHODS: We analyzed 20 single-nucleotide polymorphisms (SNPs) in the toll-like receptor 2 gene (TLR2), the toll-like receptor 3 gene (TLR3), the toll-like receptor 4 gene (TLR4), and the toll-like receptor 9 gene (TLR9) in a cohort of 336 recipients of hematopoietic-cell transplants and their unrelated donors. The risk of invasive aspergillosis was assessed with the use of multivariate Cox regression analysis. The analysis was replicated in a validation study involving 103 case patients and 263 matched controls who received hematopoietic-cell transplants from related and unrelated donors. RESULTS: In the discovery study, two donor TLR4 haplotypes (S3 and S4) increased the risk of invasive aspergillosis (adjusted hazard ratio for S3, 2.20; 95% confidence interval [CI], 1.14 to 4.25; P=0.02; adjusted hazard ratio for S4, 6.16; 95% CI, 1.97 to 19.26; P=0.002). The haplotype S4 was present in carriers of two SNPs in strong linkage disequilibrium (1063 A/G [D299G] and 1363 C/T [T399I]) that influence TLR4 function. In the validation study, donor haplotype S4 also increased the risk of invasive aspergillosis (adjusted odds ratio, 2.49; 95% CI, 1.15 to 5.41; P=0.02); the association was present in unrelated recipients of hematopoietic-cell transplants (odds ratio, 5.00; 95% CI, 1.04 to 24.01; P=0.04) but not in related recipients (odds ratio, 2.29; 95% CI, 0.93 to 5.68; P=0.07). In the discovery study, seropositivity for cytomegalovirus (CMV) in donors or recipients, donor positivity for S4, or both, as compared with negative results for CMV and S4, were associated with an increase in the 3-year probability of invasive aspergillosis (12% vs. 1%, P=0.02) and death that was not related to relapse (35% vs. 22%, P=0.02). CONCLUSIONS: This study suggests an association between the donor TLR4 haplotype S4 and the risk of invasive aspergillosis among recipients of hematopoietic-cell transplants from unrelated donors.
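The adjusted hazard ratios above come from multivariate Cox regression. As a minimal sketch on simulated data (column names are illustrative placeholders, and the lifelines library is just one possible tool, not necessarily what the study used):

```python
# Cox proportional-hazards model yielding adjusted hazard ratios.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n = 336  # discovery cohort size from the abstract
df = pd.DataFrame({
    "time": rng.exponential(365, n),       # days to aspergillosis or censoring
    "event": rng.integers(0, 2, n),        # invasive aspergillosis yes/no
    "donor_S4": rng.integers(0, 2, n),     # donor carries TLR4 haplotype S4
    "cmv_seropos": rng.integers(0, 2, n),  # CMV serostatus
})

cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
cph.print_summary()  # the exp(coef) column gives the adjusted hazard ratios
```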
Abstract:
This study analyses the determinants of the rate of temporary employment in various OECD countries, using both macro-level data drawn from the OECD and EUROSTAT databases and micro-level data drawn from the 8th wave of the European Household Panel. A comparative analysis is carried out to test different explanations originally formulated for the Spanish case. The evidence suggests that the overall distribution of temporary employment in advanced economies cannot be explained by the characteristics of national productive structures. This evidence seems at odds with previous interpretations based on segmentation theories. As an alternative explanation, two types of supply-side factors are tested: crowding-out effects and educational gaps in the workforce. The former seems non-significant, whilst the effects of the latter disappear after controlling for the levels of institutional protection in standard employment during the 1980s. Multivariate analysis shows that only this institutional variable, together with the degree of coordinated centralisation of the collective bargaining system, has a significant impact on the distribution of temporary employment in the countries examined. On the basis of this observation, an explanation of the very high levels of temporary employment observed in Spain is proposed. This explanation is consistent with both country-specific and comparative evidence.