166 results for Nominal cohort
Abstract:
BACKGROUND: Data on the incidence of hepatitis C virus (HCV) infection among human immunodeficiency virus (HIV)-infected persons are sparse. It is controversial whether and how frequently HCV is transmitted by unprotected sexual intercourse. METHODS: We assessed the HCV seroprevalence and incidence of HCV infection in the Swiss HIV Cohort Study between 1988 and 2004. We investigated the association of HCV seroconversion with mode of HIV acquisition, sex, injection drug use (IDU), and constancy of condom use. Data on condom use or unsafe sexual behavior were prospectively collected between 2000 and 2004. RESULTS: The overall seroprevalence of HCV infection was 33% among a total of 7899 eligible participants and 90% among persons reporting IDU. We observed 104 HCV seroconversions among 3327 participants during a total follow-up time of 16,305 person-years, corresponding to an incidence of 0.64 cases per 100 person-years. The incidence among participants with a history of IDU was 7.4 cases per 100 person-years, compared with 0.23 cases per 100 person-years in patients without such a history (P<.001). In men who had sex with men (MSM) without a history of IDU who reported unsafe sex, the incidence was 0.7 cases per 100 person-years, compared with 0.2 cases per 100 person-years in those not reporting unsafe sex (P=.02), corresponding to an incidence rate ratio of 3.5 (95% confidence interval, 1.2-10.0). The hazard of acquiring HCV infection was elevated among younger participants who were MSM. CONCLUSIONS: HCV infection incidence in the Swiss HIV Cohort Study was mainly associated with IDU. In HIV-infected MSM, HCV infection was associated with unsafe sex.
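The headline rates in this abstract can be recomputed directly from the counts it reports; a quick arithmetic check (using only figures given above):

```python
# Overall HCV seroconversion incidence among participants at risk
events = 104
person_years = 16305
incidence_per_100py = events / person_years * 100
print(round(incidence_per_100py, 2))  # 0.64 cases per 100 person-years

# Incidence rate ratio: MSM reporting unsafe sex vs. those who did not
irr = 0.7 / 0.2
print(round(irr, 1))  # 3.5, matching the reported IRR (95% CI 1.2-10.0)
```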
Abstract:
BACKGROUND: Body fat changes are common in patients with HIV. For patients on protease inhibitor (PI)-based highly active antiretroviral therapy (HAART), these changes have been associated with increasing exposure to therapy in general and to stavudine in particular. Our objective is to assess whether such associations are more or less likely for patients on non-nucleoside reverse transcriptase inhibitor (NNRTI)-based HAART. METHODS: We included all antiretroviral-naive patients in the Swiss HIV Cohort Study starting HAART after April 2000 who had had body weight, CD4 cell count and plasma HIV RNA measured between 6 months before and 3 months after starting HAART, and at least one assessment of body fat changes after starting HAART. At visits scheduled every 6 months, fat loss or fat gain is reported by agreement between patient and physician. We estimate the association between reported body fat changes and both time on therapy and time on stavudine, using conditional logistic regression. RESULTS: Body fat changes were reported for 85 (9%) out of 925 patients at their first assessment; a further 165 had only one assessment. Of the remaining 675 patients, body fat changes were reported for 156 patients at a rate of 13.2 changes per 100 patient-years. Body fat changes are more likely with increasing age [odds ratio (OR) 1.18 (1.00-1.38) per 10 years], with increasing BMI [OR 1.06 (1.01-1.11)] and in those with a lower baseline CD4 cell count [OR 0.91 (0.83-1.01) per 100 cells/microl]. There is only weak evidence that body fat changes are more likely with increasing time on HAART [OR 1.16 (0.93-1.46)]. After adjusting for time on HAART, fat loss is more likely with increasing stavudine use [OR 1.70 (1.34-2.15)]. There is no evidence of an association between reported fat changes and time on NNRTI therapy relative to PI therapy in those patients who used either one therapy or the other [OR 0.98 (0.56-1.63)].
CONCLUSION: Fat loss is more likely to be reported with increasing exposure to stavudine. We find no evidence of major differences between PI and NNRTI therapy in the risk of reported body fat changes.
Abstract:
The present study was undertaken to assess the influence of childhood variables (physical and emotional) on later well-being in a group of rural Swiss (Emmental Cohort). Ours is the first prospective cohort study to span more than 50 years. It includes 1537 children who were listed and assessed in 1942 (T1) because they had difficulties in school or were otherwise behaviorally disturbed. In 1995 (T2) more than 60% of the initial population could be reassessed by our study group. Subjects who responded at T2 had more often been rated as intelligent at T1, belonged to a higher social class, were more anxious, and had more psychosocial problems at T1. Income at T2 was correlated with social class at T1. Mortality was higher among subjects who had been rated at T1 as less intelligent, less neurotic, and as having more psychosocial problems. Twice as many men died as women. The emotional situation at T2 is significantly correlated with psychological well-being at T1. Somatic complaints at T2 correlate significantly with neurotic symptoms in childhood (T1). The more intelligent the children were rated at T1, the fewer emotional and somatic complaints they voiced at T2 and the better their psychological well-being was rated (T2). In addition, the former social milieu (T1) significantly determined somatic and psychological complaints at T2. Our data reveal significant correlations between current status and childhood variables more than 50 years later in a rural Swiss cohort (Emmental Cohort).
Abstract:
AIM: To assess the clinical and radiographic outcomes of immediate transmucosal placement of implants into molar extraction sockets. STUDY DESIGN: Twelve-month multicenter prospective cohort study. MATERIAL AND METHODS: Following molar extraction, tapered implants with an endosseous diameter of 4.8 mm and a shoulder diameter of 6.5 mm were immediately placed into the sockets. Molars with evidence of acute periapical pathology were excluded. After implant placement and achievement of primary stability, flaps were repositioned and sutured allowing a non-submerged, transmucosal healing. Peri-implant marginal defects were treated according to the principles of guided bone regeneration (GBR) by means of deproteinized bovine bone mineral particles in conjunction with a bioresorbable collagen membrane. Standardized radiographs were obtained at baseline and 12 months thereafter. Changes in depth and width of the distance from the implant shoulder (IS) and from the alveolar crest (AC) to the bottom of the defect (BD) were assessed. RESULTS: Eighty-two patients (42 males and 40 females) were enrolled and followed for 12 months. They contributed 82 tapered implants. Extraction sites displayed sufficient residual bone volume to allow primary stability of all implants. Sixty-four percent of the implants were placed in the areas of teeth 36 and 46. GBR was used in conjunction with the placement of all implants. No post-surgical complications were observed. All implants healed uneventfully, yielding a survival rate of 100% and healthy soft tissue conditions after 12 months. Radiographically, statistically significant changes (P<0.0001) in mesial and distal crestal bone levels were observed from baseline to the 12-month follow-up.
CONCLUSIONS: The findings of this 12-month prospective cohort study showed that immediate transmucosal implant placement is a predictable treatment option for the replacement of mandibular and maxillary molars lost for reasons other than periodontitis, including vertical root fractures, endodontic failures and caries.
Abstract:
AIM: To evaluate the healing outcome of soft tissue dehiscence coverage at implant sites. MATERIAL AND METHODS: Ten patients with one mucosal recession defect at an implant site and a contralateral unrestored clinical crown without recession were recruited. The soft tissue recessions were surgically covered using a coronally advanced flap in combination with a free connective tissue graft. Healing was studied at 1, 3 and 6 months post-operatively. RESULTS: Soft tissue dehiscences were covered with a coronal overcompensation of the flap margin of up to 1.2 mm immediately after the procedure. After 1 month, the coverage had shrunk to a mean of 75%, after 3 months to 70% and after 6 months to 66%. CONCLUSIONS: The implant sites revealed a substantial, clinically significant improvement following coronal mucosal displacement in combination with connective tissue grafting, but in none of the sites could complete soft tissue dehiscence coverage be achieved.
Abstract:
OBJECTIVES: We sought to determine the risk of late stent thrombosis (ST) during long-term follow-up beyond 3 years, searched for predictors, and assessed the impact of ST on overall mortality. BACKGROUND: Late ST was reported to occur at an annual rate of 0.6% up to 3 years after drug-eluting stent (DES) implantation. METHODS: A total of 8,146 patients underwent percutaneous coronary intervention with a sirolimus-eluting stent (SES) (n=3,823) or paclitaxel-eluting stent (PES) (n=4,323) and were followed up to 4 years after stent implantation. Dual antiplatelet treatment was prescribed for 6 to 12 months. RESULTS: Definite ST occurred in 192 of 8,146 patients with an incidence density of 1.0/100 patient-years and a cumulative incidence of 3.3% at 4 years. The hazard of ST continued at a steady rate of 0.53% (95% confidence interval [CI]: 0.44 to 0.64) between 30 days and 4 years. Diabetes was an independent predictor of early ST (hazard ratio [HR]: 1.96; 95% CI: 1.18 to 3.28), and acute coronary syndrome (HR: 2.21; 95% CI: 1.39 to 3.51), younger age (HR: 0.97; 95% CI: 0.95 to 0.99), and use of PES (HR: 1.67; 95% CI: 1.08 to 2.56) were independent predictors of late ST. Rates of death and myocardial infarction at 4 years were 10.6% and 4.6%, respectively. CONCLUSIONS: Late ST occurs steadily at an annual rate of 0.4% to 0.6% for up to 4 years. Diabetes is an independent predictor of early ST, whereas acute coronary syndrome, younger age, and PES implantation are associated with late ST.
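As a rough consistency check (our own back-of-envelope calculation, not from the paper), a constant incidence density of 1.0 per 100 patient-years implies, under a simple exponential model, a 4-year cumulative incidence in the same range as the reported 3.3%:

```python
import math

rate_per_patient_year = 1.0 / 100   # reported incidence density of definite ST
years = 4
# Cumulative incidence under a constant hazard: 1 - exp(-rate * t)
cumulative_pct = (1 - math.exp(-rate_per_patient_year * years)) * 100
print(round(cumulative_pct, 1))  # ~3.9%; censoring by death pulls the observed 3.3% lower
```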
Abstract:
OBJECT: The effect of normobaric hyperoxia (fraction of inspired O2 [FIO2] concentration 100%) in the treatment of patients with traumatic brain injury (TBI) remains controversial. The aim of this study was to investigate the effects of normobaric hyperoxia on five cerebral metabolic indices, which have putative prognostic significance following TBI in humans. METHODS: At two independent neurointensive care units, the authors performed a prospective study of 52 patients with severe TBI who were treated for 24 hours with 100% FIO2, starting within 6 hours of admission. Data for these patients were compared with data for a cohort of 112 patients who were treated in the past; patients in the historical control group matched the patients in our study according to their Glasgow Coma Scale scores after resuscitation and their intracranial pressure within the first 8 hours after admission. Patients were monitored with the aid of intracerebral microdialysis and tissue O2 probes. RESULTS: Normobaric hyperoxia treatment resulted in a significant improvement in biochemical markers in the brain compared with the baseline measures for patients treated in our study (patients acting as their own controls) and also compared with findings from the historical control group. In the dialysate the glucose levels increased (369.02 +/- 20.1 micromol/L in the control group and 466.9 +/- 20.39 micromol/L in the 100% O2 group, p = 0.001), whereas the glutamate and lactate levels significantly decreased (p < 0.005). There were also reductions in the lactate/glucose and lactate/pyruvate ratios. Intracranial pressure in the treatment group was reduced significantly both during and after hyperoxia treatment compared with the control groups (15.03 +/- 0.8 mm Hg in the control group and 12.13 +/- 0.75 mm Hg in the 100% O2 group, p < 0.005), with no changes in cerebral perfusion pressure. Outcomes of the patients in the treatment group improved.
CONCLUSIONS: The results of the study support the hypothesis that normobaric hyperoxia improves the indices of brain oxidative metabolism in patients with severe TBI. Based on these data, further mechanistic studies and a prospective randomized controlled trial are warranted.
Abstract:
BACKGROUND: Acute respiratory infections (ARI) are a major cause of morbidity in infancy worldwide, with cough and wheeze being alarming symptoms to parents. We aimed to analyze in detail the viral aetiology of ARI with such symptoms in otherwise healthy infants, including rhinoviruses and recently discovered viruses such as human metapneumovirus (HMPV), coronavirus NL63 and HKU1, and human bocavirus (HBoV). METHODS: We prospectively followed 197 unselected infants during their first year of life and assessed clinical symptoms by weekly standardized interviews. At the first ARI with cough or wheeze, we analyzed nasal swabs by sensitive individual real time polymerase chain reaction assays targeting 16 different respiratory viruses. RESULTS: All 112 infants who had an ARI had cough, and 39 (35%) had wheeze. One or more respiratory viruses were found in 88 of 112 (79%) cases. Fifteen (17%) dual and 3 (3%) triple infections were recorded. Rhino- (23% of all viruses) and coronaviruses (18%) were most common, followed by parainfluenza viruses (17%), respiratory syncytial virus (RSV) (16%), HMPV (13%), and HBoV (5%). Together rhinoviruses, coronaviruses, HMPV, and HBoV accounted for 60% (65 of 109) of viruses. Although symptom scores and need for general practitioner (GP) consultations were highest in infants infected with RSV, they were similar in infants infected with other viruses. Viral shedding at 3 weeks occurred in 20% of cases. CONCLUSIONS: Rhinoviruses, coronaviruses, HMPV, and HBoV are common pathogens associated with respiratory symptoms in otherwise healthy infants. They should be considered in the differential diagnosis of the aetiology of ARI in this age group.
Abstract:
Respiratory infections cause considerable morbidity during infancy. The impact of innate immunity mechanisms, such as mannose-binding lectin (MBL), on respiratory symptoms remains unclear. The aims of this study were to investigate whether cord blood MBL levels are associated with respiratory symptoms during infancy and to determine the relative contribution of MBL when compared with known risk factors. This is a prospective birth cohort study including 185 healthy term infants. MBL was measured in cord blood and categorized into tertiles. Frequency and severity of respiratory symptoms were assessed weekly until age one. Association with MBL levels was analysed using multivariable random effects Poisson regression. We observed a trend towards an increased incidence rate of severe respiratory symptoms in infants in the low MBL tertile when compared with infants in the middle MBL tertile [incidence rate ratio (IRR) = 1.59; 95% confidence interval (CI): 0.95-2.66; p = 0.076]. Surprisingly, infants in the high MBL tertile suffered significantly more from severe and total respiratory symptoms than infants in the middle MBL tertile (IRR = 1.97; 95% CI: 1.20-3.25; p = 0.008). This association was pronounced in infants of parents with asthma (IRR = 3.64; 95% CI: 1.47-9.02; p = 0.005). The relative risk associated with high MBL was similar to the risk associated with well-known risk factors such as maternal smoking or childcare. In conclusion the association between low MBL levels and increased susceptibility to common respiratory infections during infancy was weaker than that previously reported. Instead, high cord blood MBL levels may represent a so far unrecognized risk factor for respiratory morbidity in infants of asthmatic parents.
Abstract:
Post-natal exposure to air pollution is associated with diminished lung growth during school age. The current authors aimed to determine whether pre-natal exposure to air pollution is associated with lung function changes in the newborn. In a prospective birth cohort of 241 healthy term-born neonates, tidal breathing, lung volume, ventilation inhomogeneity and exhaled nitric oxide (eNO) were measured during unsedated sleep at age 5 weeks. Maternal exposure to particles with a 50% cut-off aerodynamic diameter of 10 µm (PM10), nitrogen dioxide (NO2) and ozone (O3), and distance to major roads were estimated during pregnancy. The association between these exposures and lung function was assessed using linear regression. Minute ventilation was higher in infants with higher pre-natal PM10 exposure (24.9 mL/min per µg/m3 PM10). eNO was increased in infants with higher pre-natal NO2 exposure (0.98 ppb per µg/m3 NO2). Post-natal exposure to air pollution did not modify these findings. No association was found between pre-natal O3 exposure and lung function parameters. The present results suggest that pre-natal exposure to air pollution might be associated with higher respiratory need and airway inflammation in newborns. Such alterations during early lung development may be important regarding long-term respiratory morbidity.
Abstract:
BACKGROUND: Low back pain (LBP) is by far the most prevalent and costly musculoskeletal problem in our society today. Following the recommendations of the Multinational Musculoskeletal Inception Cohort Study (MMICS) Statement, our study aims to define outcome assessment tools for patients with acute LBP and the time point at which chronic LBP becomes manifest, and to identify patient characteristics which increase the risk of chronicity. METHODS: Patients with acute LBP will be recruited from clinics of general practitioners (GPs) in New Zealand (NZ) and Switzerland (CH). They will be assessed by postal survey at baseline and at 3, 6 and 12 weeks and 6 months of follow-up. The primary outcome will be disability as measured by the Oswestry Disability Index (ODI); key secondary endpoints will be general health as measured by the acute SF-12 and pain as measured on the Visual Analogue Scale (VAS). A subgroup analysis of different assessment instruments and baseline characteristics will be performed using multiple linear regression models. This study aims to examine: 1. Which biomedical, psychological, social and occupational outcome assessment tools are identifiers for the transition from acute to chronic LBP, and at which time point this transition becomes manifest. 2. Which psychosocial and occupational baseline characteristics, such as work status and period of work absenteeism, influence the course from acute to chronic LBP. 3. Differences in outcome assessment tools and baseline characteristics of patients in NZ compared with CH. DISCUSSION: This study will develop a screening tool for patients with acute LBP, to be used in GP clinics, to assess the risk of developing chronic LBP. In addition, biomedical, psychological, social and occupational patient characteristics which influence the course from acute to chronic LBP will be identified. Furthermore, an appropriate time point for follow-up assessments will be determined to detect this transition.
The generalizability of our findings will be enhanced by the international perspective of this study. TRIAL REGISTRATION: [Clinical Trial Registration Number, ACTRN12608000520336].
Abstract:
OBJECTIVES: Bone attrition probably constitutes remodeling of the bone, resulting in flattening or depression of the articular surfaces. Defining bone attrition is challenging because it is an accentuation of the normal curvature of the tibial plateaus. We aimed to define bone attrition on magnetic resonance imaging (MRI) of the knee using information from both radiographs and MRIs, and to assess whether bone attrition is common prior to end-stage osteoarthritis (OA) of the tibio-femoral joint. METHODS: All knees of participants in the community-based sample of the Framingham OA Study were evaluated for bone attrition on radiographs and MRIs. Radiographs were scored based on templates designed to outline the normal contours of the tibio-femoral joint. MRIs were analyzed using the semi-quantitative Whole-Organ Magnetic Resonance Imaging Scoring (WORMS) method. The prevalence of bone attrition was calculated using two different thresholds for MRI scores. RESULTS: Inter-observer agreement for identification of bone attrition was substantial for the radiographs (kappa=0.71, 95% CI 0.67-0.81) and moderate for MRI (kappa=0.56, 95% CI 0.40-0.72). Of 964 knees, 5.7% of the radiographs showed bone attrition. Of these, 91% of MRIs were also read as showing bone attrition. We selected a conservative threshold for bone attrition on MRI scoring (≥2 on a 0-3 scale) based on agreement with attrition on the radiograph or when bone attrition on MRI co-occurred with cartilage loss on MRI. Using this threshold for bone attrition on MRI, bone attrition was common in knees with OA. For example, in knees with mild OA but no joint space narrowing, 13 of 88 MRIs (14.8%) showed bone attrition. CONCLUSIONS: Using MRI we found that many knees with mild OA without joint space narrowing on radiographs had bone attrition, even using conservative definitions. The validity of our definition of bone attrition should be evaluated in further studies.
Bone attrition may occur in milder OA and at earlier stages of disease than previously thought.
Abstract:
BACKGROUND: In HIV type-1-infected patients starting highly active antiretroviral therapy (HAART), the prognostic value of haemoglobin when starting HAART, and of changes in haemoglobin levels, are not well defined. METHODS: We combined data from 10 prospective studies of 12,100 previously untreated individuals (25% women). A total of 4,222 patients (35%) were anaemic: 131 patients (1.1%) had severe (<8.0 g/dl), 1,120 (9%) had moderate (male 8.0-<11.0 g/dl and female 8.0-<10.0 g/dl) and 2,971 (25%) had mild (male 11.0-<13.0 g/dl and female 10.0-<12.0 g/dl) anaemia. We separately analysed progression to AIDS or death from baseline and from 6 months using Weibull models, adjusting for CD4+ T-cell count, age, sex and other variables. RESULTS: During 48,420 person-years of follow-up, 1,448 patients developed at least one AIDS event and 857 patients died. Anaemia at baseline was independently associated with higher mortality: the adjusted hazard ratio (95% confidence interval) for mild anaemia was 1.42 (1.17-1.73), for moderate anaemia 2.56 (2.07-3.18) and for severe anaemia 5.26 (3.55-7.81). Corresponding figures for progression to AIDS were 1.60 (1.37-1.86), 2.00 (1.66-2.40) and 2.24 (1.46-3.42). At 6 months the prevalence of anaemia declined to 26%. Baseline anaemia continued to predict mortality (and to a lesser extent progression to AIDS) in patients with normal haemoglobin or mild anaemia at 6 months. CONCLUSIONS: Anaemia at the start of HAART is an important factor for short- and long-term prognosis, including in patients whose haemoglobin levels improved or normalized during the first 6 months of HAART.
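The abstract fits Weibull survival models; as a minimal illustration of how an adjusted hazard ratio acts on a Weibull baseline (the shape and scale values below are invented for the sketch, not estimates from the study):

```python
import math

def weibull_survival(t, shape, scale, hr=1.0):
    """Survival at time t when the Weibull baseline hazard is multiplied by hr
    (the proportional-hazards form used for the adjusted hazard ratios)."""
    return math.exp(-hr * (t / scale) ** shape)

# Hypothetical baseline: shape 1.2, scale 10 years; HR 2.56 for moderate anaemia
s_normal = weibull_survival(5, 1.2, 10)
s_moderate = weibull_survival(5, 1.2, 10, hr=2.56)
print(round(s_normal, 3), round(s_moderate, 3))  # higher hazard, lower survival
```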
Abstract:
BACKGROUND: We aimed to study the incidence and outcome of severe traumatic brain injury (TBI) in Switzerland and to test the feasibility of a large cohort study with case identification in the first 24 hours and 6-month follow-up. METHODS: From January to June 2005, we consecutively enrolled and followed up all persons with severe TBI (Abbreviated Injury Score of the head region >3 and Glasgow Coma Scale <9) in the catchment areas of 3 Swiss medical centres with neurosurgical facilities. The primary outcome was the Extended Glasgow Outcome Scale (GOSE) after 6 months. Secondary outcomes included survival, Functional Independence Measure (FIM), and health-related quality of life (SF-12) at defined time-points up to 6 months after injury. RESULTS: We recruited 101 participants from a source population of about 2.47 million (i.e. about 33% of the Swiss population). The incidence of severe TBI was 8.2 per 100,000 person-years. The overall case fatality was 70%: 41 of 101 persons (41%) died at the scene of the accident, 23 of 60 hospitalised participants (38%) died within 48 hours, and 31 (53%) within 6 months. In all hospitalised patients, the median GOSE was 1 (range 1-8) after 6 months, and was 6 (2-8) in 6-month survivors. The median total FIM score was 125 (range 18-126); median SF-12 component measures were 44 (25-55) for the physical scale and 52 (32-65) for the mental scale. CONCLUSIONS: Severe TBI was associated with high case fatality and considerable morbidity in survivors. We demonstrated the feasibility of a multicentre cohort study in Switzerland with the aim of identifying modifiable determinants of outcome and improving current trauma care.
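The incidence estimate can be reproduced from the case count, the catchment population, and the six-month enrolment window (assuming, as a simplification, that each person in the catchment area contributed half a person-year):

```python
cases = 101
catchment_population = 2.47e6       # about 33% of the Swiss population
follow_up_years = 0.5               # consecutive enrolment, January to June 2005
person_years = catchment_population * follow_up_years
incidence_per_100k = cases / person_years * 1e5
print(round(incidence_per_100k, 1))  # 8.2 per 100,000 person-years
```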
Abstract:
BACKGROUND: Exposure to intermittent magnetic fields of 16 Hz has been shown to reduce heart rate variability, and decreased heart rate variability predicts cardiovascular mortality. We examined mortality from cardiovascular causes in railway workers exposed to varying degrees to intermittent 16.7 Hz magnetic fields. METHODS: We studied a cohort of 20,141 Swiss railway employees between 1972 and 2002, including highly exposed train drivers (median lifetime exposure 120.5 µT-years), and less or little exposed shunting yard engineers (42.1 µT-years), train attendants (13.3 µT-years) and station masters (5.7 µT-years). During 464,129 person-years of follow-up, 5,413 deaths were recorded and 3,594 deaths were attributed to cardiovascular diseases. We analyzed data using Cox proportional hazards models. RESULTS: For all cardiovascular mortality, the hazard ratio compared with station masters was 0.99 (95%CI: 0.91, 1.08) in train drivers, 1.13 (95%CI: 0.98, 1.30) in shunting yard engineers, and 1.09 (95%CI: 1.00, 1.19) in train attendants. Corresponding hazard ratios for arrhythmia-related deaths were 1.04 (95%CI: 0.68, 1.59), 0.58 (95%CI: 0.24, 1.37) and 1.30 (95%CI: 0.87, 1.93), and for acute myocardial infarction 1.00 (95%CI: 0.73, 1.36), 1.56 (95%CI: 1.04, 2.32), and 1.14 (95%CI: 0.85, 1.53). The hazard ratio for arrhythmia-related deaths per 100 µT-years of cumulative exposure was 0.94 (95%CI: 0.71, 1.24), and 0.91 (95%CI: 0.75, 1.11) for acute myocardial infarction. CONCLUSION: This study provides evidence against an association between long-term occupational exposure to intermittent 16.7 Hz magnetic fields and cardiovascular mortality.
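The per-unit hazard ratio can be translated into a contrast between the most and least exposed groups (a back-of-envelope illustration using the median lifetime exposures reported above, not a figure from the paper):

```python
# HR for arrhythmia-related death per 100 µT-years of cumulative exposure
hr_per_100_ut_years = 0.94

# Median lifetime exposure contrast: train drivers vs. station masters
contrast_ut_years = 120.5 - 5.7   # = 114.8 µT-years
# Under a log-linear dose-response, HRs multiply across exposure units
hr_contrast = hr_per_100_ut_years ** (contrast_ut_years / 100)
print(round(hr_contrast, 2))  # ~0.93: no excess risk across the exposure range
```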