81 results for Regression-based decomposition.
Abstract:
Background: Few studies have examined the 20% of individuals who never experience an episode of low back pain (LBP). To date, no investigation has compared a group who claim never to have experienced LBP in their lifetime with two population-based case–control groups with and without momentary LBP. This study investigates whether LBP-resilient workers between 50 and 65 years had better general health, demonstrated more positive health behaviour and were better able to achieve routine activities compared with both case–control groups. Methods: Forty-two LBP-resilient participants completed the same pain assessment questionnaire as a population-based LBP sample from a nationwide, large-scale cross-sectional survey in Switzerland. The LBP-resilient participants were compared pairwise to the propensity score-matched case controls by exploring differences in demographic and work characteristics, and by calculating odds ratios (ORs) and effect sizes. A discriminant analysis explored group differences, while multiple logistic regression identified the single indicators that accounted for group differences. Results: LBP-resilient participants were healthier than the case controls with momentary LBP and achieved routine activities more easily. Compared to controls without momentary LBP, LBP-resilient participants had higher vitality, a lower workload, a healthier attitude towards health and behaved more healthily by drinking less alcohol. Conclusions: Given the demonstrated difference between LBP-resilient participants and controls without momentary LBP, the question arises of what additional knowledge can be attained. Three underlying traits seem relevant to LBP resilience: personality, favourable work conditions and subjective attitudes/attributions towards health. These rationales have to be considered with respect to LBP prevention.
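The matching step described above, pairing each LBP-resilient participant with a population-based case control of similar covariate profile, can be sketched in Python. This is a minimal illustration on synthetic data; the covariates (age, workload) and pool size are hypothetical stand-ins, not the study's actual variables.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n_res, n_pop = 42, 500  # resilient participants vs. population pool

    # Hypothetical covariates: age (years) and a standardized workload measure.
    X = np.column_stack([rng.uniform(50, 65, n_res + n_pop),
                         rng.normal(0, 1, n_res + n_pop)])
    group = np.array([1] * n_res + [0] * n_pop)  # 1 = LBP-resilient

    # Propensity score: P(resilient | covariates) from a logistic regression.
    ps = LogisticRegression().fit(X, group).predict_proba(X)[:, 1]

    # 1:1 nearest-neighbour matching on the propensity score, without replacement.
    pool = list(np.flatnonzero(group == 0))
    pairs = {}
    for i in np.flatnonzero(group == 1):
        j = min(pool, key=lambda c: abs(ps[c] - ps[i]))
        pairs[i] = j
        pool.remove(j)

    print(f"{len(pairs)} matched pairs formed")

Odds ratios and effect sizes are then computed within the matched sample.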
Abstract:
PURPOSE Somatostatin-based radiopeptide treatment is generally performed using the β-emitting radionuclides 90Y or 177Lu. The present study aimed at comparing benefits and harms of both therapeutic approaches. METHODS In a comparative cohort study, patients with advanced neuroendocrine tumours underwent repeated cycles of [90Y-DOTA]-TOC or [177Lu-DOTA]-TOC until progression of disease or permanent adverse events. Multivariable Cox regression and competing risks regression were employed to examine predictors of survival and adverse events for both treatment groups. RESULTS Overall, 910 patients underwent 1,804 cycles of [90Y-DOTA]-TOC and 141 patients underwent 259 cycles of [177Lu-DOTA]-TOC. The median survival after [177Lu-DOTA]-TOC and after [90Y-DOTA]-TOC was comparable (45.5 months versus 35.9 months, hazard ratio 0.91, 95% confidence interval 0.63-1.30, p = 0.49). Subgroup analyses revealed a significantly longer survival for [177Lu-DOTA]-TOC over [90Y-DOTA]-TOC in patients with low tumour uptake, solitary lesions and extra-hepatic lesions. The rate of severe transient haematotoxicities was lower after [177Lu-DOTA]-TOC treatment (1.4% vs 10.1%, p = 0.001), while the rate of severe permanent renal toxicities was similar in both treatment groups (9.2% vs 7.8%, p = 0.32). CONCLUSION The present results revealed no difference in median overall survival after [177Lu-DOTA]-TOC and [90Y-DOTA]-TOC. Furthermore, [177Lu-DOTA]-TOC was less haematotoxic than [90Y-DOTA]-TOC.
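A multivariable Cox regression of the kind used here to compare survival between the two radiopeptides can be sketched with the lifelines package. The data are synthetic and the column names hypothetical; this shows the technique, not the study's analysis code.

    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(1)
    n = 300

    # Synthetic cohort; lu177 = 1 codes [177Lu-DOTA]-TOC, 0 codes [90Y-DOTA]-TOC.
    df = pd.DataFrame({
        "months": rng.exponential(40, n),   # follow-up time to death or censoring
        "died": rng.integers(0, 2, n),      # event indicator
        "lu177": rng.integers(0, 2, n),     # treatment arm
        "age": rng.normal(62, 10, n),       # hypothetical adjustment covariate
    })

    cph = CoxPHFitter().fit(df, duration_col="months", event_col="died")
    print(cph.summary[["exp(coef)",
                       "exp(coef) lower 95%", "exp(coef) upper 95%"]])

The exponentiated coefficient for the treatment column is the adjusted hazard ratio reported in such analyses.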
Abstract:
BACKGROUND We describe the long-term outcome after clinical introduction and dose escalation of somatostatin receptor-targeted therapy with [90Y-DOTA]-TOC in patients with metastasized neuroendocrine tumors. METHODS In a clinical phase I dose escalation study we treated patients with increasing [90Y-DOTA]-TOC activities. Multivariable Cox regression and competing risk regression were used to compare efficacy and toxicities of the different dosage protocols. RESULTS Overall, 359 patients were recruited; 60 patients were enrolled for low-dose (median: 2.4 GBq/cycle, range: 0.9-7.8 GBq/cycle), 77 patients for intermediate-dose (median: 3.3 GBq/cycle, range: 2.0-7.4 GBq/cycle) and 222 patients for high-dose (median: 6.7 GBq/cycle, range: 3.7-8.1 GBq/cycle) [90Y-DOTA]-TOC treatment. The incidences of grade 1-4 hematotoxicities were 65.0%, 64.9% and 74.8%; the incidences of grade 4/5 kidney toxicities were 8.4%, 6.5% and 14.0%; and the median survival was 39 (range: 1-158) months, 34 (range: 1-118) months and 29 (range: 1-113) months, respectively. The high-dose protocol was associated with an increased risk of kidney toxicity (hazard ratio: 3.12 (1.13-8.59) vs. intermediate dose, p = 0.03) and a shorter overall survival (hazard ratio: 2.50 (1.08-5.79) vs. low dose, p = 0.03). CONCLUSIONS Increasing [90Y-DOTA]-TOC activities may be associated with increasing hematological toxicities. The dose-related hematotoxicity profile of [90Y-DOTA]-TOC could facilitate tailoring treatment in patients with preexisting hematotoxicities. The long-term outcome results suggest that fractionated [90Y-DOTA]-TOC treatment might reduce renal toxicity and improve overall survival. (ClinicalTrials.gov number NCT00978211).
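Competing-risk regression in this setting treats death as an event that competes with kidney toxicity. As a hedged sketch (not the study's method), the cumulative incidence of toxicity under competing risks can be estimated with the Aalen-Johansen estimator in lifelines; the data and event coding below are invented for illustration.

    import numpy as np
    from lifelines import AalenJohansenFitter

    rng = np.random.default_rng(7)
    n = 359

    months = rng.exponential(30, n)
    # Event codes: 0 = censored, 1 = severe kidney toxicity, 2 = death first (competing).
    event = rng.choice([0, 1, 2], n, p=[0.5, 0.1, 0.4])

    ajf = AalenJohansenFitter()
    ajf.fit(months, event, event_of_interest=1)
    print(ajf.cumulative_density_.tail())  # cumulative incidence of kidney toxicity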
Abstract:
Objective We investigated factors associated with masked and white-coat hypertension in a Swiss population-based sample. Methods The Swiss Kidney Project on Genes in Hypertension is a family-based cross-sectional study. Office and 24-hour ambulatory blood pressure were measured using validated devices. Masked hypertension was defined as office blood pressure <140/90 mmHg and daytime ambulatory blood pressure ≥135/85 mmHg. White-coat hypertension was defined as office blood pressure ≥140/90 mmHg and daytime ambulatory blood pressure <135/85 mmHg. Mixed-effect logistic regression was used to examine the relationship of masked and white-coat hypertension with associated factors, while taking familial correlations into account. High-normal office blood pressure was defined as systolic/diastolic blood pressure within the 130–139/85–89 mmHg range. Results Among the 652 participants included in this analysis, 51% were female. Mean age (±SD) was 48 (±18) years. The proportions of participants with masked and white-coat hypertension were 15.8% and 2.6%, respectively. Masked hypertension was associated with age (odds ratio (OR) = 1.02, p = 0.012), high-normal office blood pressure (OR = 6.68, p<0.001), and obesity (OR = 3.63, p = 0.001). White-coat hypertension was significantly associated with age (OR = 1.07, p<0.001) but not with education, family history of hypertension, or physical activity. Conclusions Our findings suggest that physicians should consider ambulatory blood pressure monitoring for older individuals with high-normal office blood pressure and/or who are obese.
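The two definitions above translate directly into a classification rule. A small sketch applying the abstract's thresholds to office and daytime ambulatory readings (all in mmHg):

    def classify_bp(office_sys, office_dia, day_sys, day_dia):
        """Classify masked vs white-coat hypertension from office and
        daytime ambulatory blood pressure, per the study definitions."""
        office_high = office_sys >= 140 or office_dia >= 90
        ambulatory_high = day_sys >= 135 or day_dia >= 85
        if not office_high and ambulatory_high:
            return "masked hypertension"
        if office_high and not ambulatory_high:
            return "white-coat hypertension"
        return "concordant (both high)" if office_high else "concordant (both normal)"

    print(classify_bp(128, 82, 142, 88))  # -> masked hypertension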
Abstract:
Increased renal resistive index (RRI) has recently been associated with target organ damage and cardiovascular or renal outcomes in patients with hypertension and diabetes mellitus. However, reference values in the general population and information on familial aggregation are largely lacking. We determined the distribution of RRI, associated factors, and heritability in a population-based study. Families of European ancestry were randomly selected in 3 Swiss cities. Anthropometric parameters and cardiovascular risk factors were assessed. A renal Doppler ultrasound was performed, and RRI was measured in 3 segmental arteries of both kidneys. We used multilevel linear regression analysis to explore the factors associated with RRI, adjusting for center and family relationships. Sex-specific reference values for RRI were generated according to age. Heritability was estimated by variance components using the ASSOC program (SAGE software). Four hundred women (mean age±SD, 44.9±16.7 years) and 326 men (42.1±16.8 years) with normal renal ultrasound had mean RRI of 0.64±0.05 and 0.62±0.05, respectively (P<0.001). In multivariable analyses, RRI was positively associated with female sex, age, systolic blood pressure, and body mass index. We observed an inverse correlation with diastolic blood pressure and heart rate. Age had a nonlinear association with RRI. We found no independent association of RRI with diabetes mellitus, hypertension treatment, smoking, cholesterol levels, or estimated glomerular filtration rate. The adjusted heritability estimate was 42±8% (P<0.001). In a population-based sample with normal renal ultrasound, RRI normal values depend on sex, age, blood pressure, heart rate, and body mass index. The significant heritability of RRI suggests that genes influence this phenotype.
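The multilevel regression described here, RRI regressed on sex, age, blood pressure, heart rate, and body mass index with family as a grouping level, can be sketched as a random-intercept mixed model in statsmodels. Data and column names below are synthetic stand-ins, not the study's dataset.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    n = 726  # 400 women + 326 men, as in the study

    df = pd.DataFrame({
        "rri": rng.normal(0.63, 0.05, n),
        "female": rng.integers(0, 2, n),
        "age": rng.uniform(20, 80, n),
        "sbp": rng.normal(125, 15, n),   # systolic blood pressure
        "dbp": rng.normal(78, 10, n),    # diastolic blood pressure
        "hr": rng.normal(68, 10, n),     # heart rate
        "bmi": rng.normal(25, 4, n),
        "family_id": rng.integers(0, 200, n),
    })

    # Random intercept per family absorbs the within-family correlation.
    fit = smf.mixedlm("rri ~ female + age + sbp + dbp + hr + bmi",
                      df, groups=df["family_id"]).fit()
    print(fit.summary())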
Abstract:
Purpose Recently, multiple clinical trials have demonstrated improved outcomes in patients with metastatic colorectal cancer. This study investigated whether the improved survival is race-dependent. Patients and Methods Overall and cancer-specific survival of 77,490 White and Black patients with metastatic colorectal cancer from the 1988–2008 Surveillance Epidemiology and End Results registry were compared using unadjusted and multivariable adjusted Cox proportional hazard regression as well as competing risk analyses. Results Median age was 69 years, 47.4% were female and 86.0% White. Median survival was 11 months overall, with an overall increase from 8 to 14 months between 1988 and 2008. Overall survival increased from 8 to 14 months for White patients, and from 6 to 13 months for Black patients. After multivariable adjustment, the following parameters were associated with better survival: White race, female sex, younger age, better education, being married, higher income, living in urban areas, rectosigmoid junction or rectal cancer, undergoing cancer-directed surgery, and well/moderately differentiated or N0 tumors (p<0.05 for all covariates). Discrepancies in overall survival based on race did not change significantly over time; however, there was a significant decrease of cancer-specific survival discrepancies over time between White and Black patients, with a hazard ratio of 0.995 (95% confidence interval 0.991–1.000) per year (p=0.03). Conclusion A clinically relevant overall survival increase was found from 1988 to 2008 in this population-based analysis for both White and Black patients with metastatic colorectal cancer. Although both White and Black patients benefitted from this improvement, a slight discrepancy between the two groups remained.
Abstract:
BACKGROUND It is often assumed that blood pressure increases acutely after major stroke, resulting in so-called post-stroke hypertension. In view of evidence that the risks and benefits of blood pressure-lowering treatment in acute stroke might differ between patients with major ischaemic stroke and those with primary intracerebral haemorrhage, we compared acute-phase and premorbid blood pressure levels in these two disorders. METHODS In a population-based study in Oxfordshire, UK, we recruited all patients presenting with stroke between April 1, 2002, and March 31, 2012. We compared all acute-phase post-event blood pressure readings with premorbid readings from 10-year primary care records in all patients with acute major ischaemic stroke (National Institutes of Health Stroke Scale >3) versus those with acute intracerebral haemorrhage. FINDINGS Of 653 consecutive eligible patients, premorbid and acute-phase blood pressure readings were available for 636 (97%) individuals. Premorbid blood pressure (total readings 13,244) had been measured on a median of 17 separate occasions per patient (IQR 8-31). In patients with ischaemic stroke, the first acute-phase systolic blood pressure was much lower than after intracerebral haemorrhage (158·5 mm Hg [SD 30·1] vs 189·8 mm Hg [38·5], p<0·0001; for patients not on antihypertensive treatment 159·2 mm Hg [27·8] vs 193·4 mm Hg [37·4], p<0·0001), was little higher than premorbid levels (increase of 10·6 mm Hg vs 10-year mean premorbid level), and decreased only slightly during the first 24 h (mean decrease from <90 min to 24 h 13·6 mm Hg). By contrast with findings in ischaemic stroke, the mean first systolic blood pressure after intracerebral haemorrhage was substantially higher than premorbid levels (mean increase of 40·7 mm Hg, p<0·0001) and fell substantially in the first 24 h (mean decrease of 41·1 mm Hg; p=0·0007 for difference from decrease in ischaemic stroke). Mean systolic blood pressure also increased steeply in the days and weeks before intracerebral haemorrhage (regression p<0·0001) but not before ischaemic stroke. Consequently, the first acute-phase blood pressure reading after primary intracerebral haemorrhage was more likely than after ischaemic stroke to be the highest ever recorded (OR 3·4, 95% CI 2·3-5·2, p<0·0001). In patients with intracerebral haemorrhage seen within 90 min, the highest systolic blood pressure within 3 h of onset was 50 mm Hg higher, on average, than the maximum premorbid level whereas that after ischaemic stroke was 5·2 mm Hg lower (p<0·0001). INTERPRETATION Our findings suggest that systolic blood pressure is substantially raised compared with usual premorbid levels after intracerebral haemorrhage, whereas acute-phase systolic blood pressure after major ischaemic stroke is much closer to the accustomed long-term premorbid level, providing a potential explanation for why the risks and benefits of lowering blood pressure acutely after stroke might be expected to differ. FUNDING Wellcome Trust, Wolfson Foundation, UK Medical Research Council, Stroke Association, British Heart Foundation, National Institute for Health Research.
Abstract:
Background Tests for recent infections (TRIs) are important for HIV surveillance. We have shown that a patient's antibody pattern in a confirmatory line immunoassay (Inno-Lia) also yields information on time since infection. We have published algorithms which, with a certain sensitivity and specificity, distinguish between incident (≤12 months) and older infection. In order to use these algorithms like other TRIs, i.e., on the basis of their windows, we now determined their window periods. Methods We classified Inno-Lia results of 527 treatment-naïve patients with HIV-1 infection of ≤12 months according to incidence by 25 algorithms. The time after which all infections were ruled older, i.e. the algorithm's window, was determined by linear regression of the proportion ruled incident as a function of time since infection. Window-based incident infection rates (IIR) were determined utilizing the relationship ‘Prevalence = Incidence x Duration’ in four annual cohorts of HIV-1 notifications. Results were compared to performance-based IIR, also derived from Inno-Lia results but utilizing the relationship ‘incident = true incident + false incident’, and to the IIR derived from the BED incidence assay. Results Window periods varied between 45.8 and 130.1 days and correlated well with the algorithms' diagnostic sensitivity (R2 = 0.962; P<0.0001). Among the 25 algorithms, the mean window-based IIR among the 748 notifications of 2005/06 was 0.457, compared to 0.453 obtained for performance-based IIR with a model not correcting for selection bias. Evaluation of BED results using a window of 153 days yielded an IIR of 0.669. Window-based and performance-based IIR increased by 22.4% and 30.6%, respectively, in 2008, while 2009 and 2010 showed a return to baseline for both methods. Conclusions IIR estimates from window-based and performance-based evaluations of Inno-Lia algorithm results were similar and can be used together to assess IIR changes between annual HIV notification cohorts.
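The window-based estimate follows from rearranging 'Prevalence = Incidence x Duration': the incident infection rate is the proportion of notifications classified recent, divided by the window length expressed in years. A worked sketch; the counts and window below are hypothetical, only the 748 notifications figure comes from the abstract.

    def window_based_iir(n_recent, n_total, window_days):
        """Estimate the incident infection rate from the share of notifications
        classified 'recent' and the algorithm's window, via P = I x D."""
        prevalence = n_recent / n_total        # proportion classified recent
        duration = window_days / 365.25        # window expressed in years
        return prevalence / duration           # incident fraction per year

    # Hypothetical example: 94 of 748 notifications classified recent by an
    # algorithm with a 100-day window -> IIR of about 0.46.
    print(round(window_based_iir(94, 748, 100), 3))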
Abstract:
BACKGROUND In Switzerland, the heptavalent (PCV7) and 13-valent (PCV13) pneumococcal conjugate vaccines were recommended for all infants aged <2 years in 2007 and 2011, respectively. Due to herd effects, a protective impact on invasive pneumococcal disease (IPD) rates in adults had been expected. METHODS In this study, data from the nationwide mandatory surveillance were analyzed for all adult patients ≥16 years with IPD of known serotype/serogroup during 2003-2012. Trend analyses (for IPD cases from 2003 to 2012) and logistic regression analyses (2007-2010) were performed to identify changes in serotype distribution and the association of serotypes with age, clinical manifestations, comorbidities and case fatality. FINDINGS The proportion of PCV7 serotypes among all IPD cases (n=7678) declined significantly in adults, from 44.7% (2003) before to 16.7% (2012) after the recommendation of PCV7 (P<0.001). In contrast, the proportion of non-PCV7 serogroups/serotypes increased over the same period, both for non-PCV13 serotypes and for PCV13 serotypes not included in PCV7. Serotype distribution varied significantly across ages, clinical manifestations and comorbidities. Serotype was furthermore associated with case fatality (P=0.001). In a multivariable logistic regression model analyzing single serotypes, case fatality was increased for serotypes 3 (P=0.008), 19A (P=0.03) and 19F (P=0.005), compared with serotypes 1 and 7F. CONCLUSION There was a significant decline in PCV7 serotypes among adults with IPD in Switzerland after introduction of childhood vaccination with PCV7. Pneumococcal serotypes were associated with case fatality, age, clinical manifestation and comorbidities of IPD in adults. These results may prove useful for future vaccine recommendations for adults in Switzerland.
Abstract:
BACKGROUND & AIMS Standardized instruments are needed to assess the activity of eosinophilic esophagitis (EoE), to provide endpoints for clinical trials and observational studies. We aimed to develop and validate a patient-reported outcome (PRO) instrument and score, based on items that could account for variations in patients' assessments of disease severity. We also evaluated relationships between patients' assessment of disease severity and EoE-associated endoscopic, histologic, and laboratory findings. METHODS We collected information from 186 patients with EoE in Switzerland and the US (69.4% male; median age, 43 years) via surveys (n = 135), focus groups (n = 27), and semi-structured interviews (n = 24). Items were generated for the instruments to assess biologic activity based on physician input. Linear regression was used to quantify the extent to which variations in patient-reported disease characteristics could account for variations in patients' assessment of EoE severity. The PRO instrument was prospectively used in 153 adult patients with EoE (72.5% male; median age, 38 years), and validated in an independent group of 120 patients with EoE (60.8% male; median age, 40.5 years). RESULTS Seven PRO factors that are used to assess characteristics of dysphagia, behavioral adaptations to living with dysphagia, and pain while swallowing accounted for 67% of the variation in patients' assessment of disease severity. Based on statistical consideration and patient input, a 7-day recall period was selected. Highly active EoE, based on endoscopic and histologic findings, was associated with an increase in patient-assessed disease severity. In the validation study, the mean difference between patient assessment of EoE severity and PRO score was 0.13 (on a scale from 0 to 10). CONCLUSIONS We developed and validated an EoE scoring system based on 7 PRO items that assesses symptoms over a 7-day recall period. ClinicalTrials.gov number: NCT00939263.
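The linear-regression step here, quantifying how much of the variation in patients' global severity ratings the candidate PRO items explain, amounts to fitting the items and reading off R-squared. A sketch on synthetic data; the item scores and weights are invented for illustration.

    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(3)
    n, n_items = 186, 7

    items = rng.integers(0, 5, (n, n_items)).astype(float)  # hypothetical item scores
    weights = rng.uniform(0.2, 1.0, n_items)                # invented item weights
    severity = items @ weights + rng.normal(0, 1.5, n)      # synthetic 0-10 global rating

    r2 = LinearRegression().fit(items, severity).score(items, severity)
    print(f"Variance in patient-assessed severity explained: R^2 = {r2:.2f}")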
Abstract:
OBJECTIVES It is still debated whether pre-existing minority drug-resistant HIV-1 variants (MVs) affect the virological outcomes of first-line NNRTI-containing ART. METHODS This Europe-wide case-control study included ART-naive subjects infected with drug-susceptible HIV-1 as revealed by population sequencing, who achieved virological suppression on first-line ART including one NNRTI. Cases experienced virological failure; controls were subjects from the same cohort whose viraemia remained suppressed at a matched time since initiation of ART. Blinded, centralized 454 pyrosequencing with parallel bioinformatic analysis in two laboratories was used to identify MVs in the 1%-25% frequency range. ORs of virological failure according to MV detection were estimated by logistic regression. RESULTS Two hundred and sixty samples (76 cases and 184 controls), mostly subtype B (73.5%), were used for the analysis. Identical MVs were detected in the two laboratories. 31.6% of cases and 16.8% of controls harboured pre-existing MVs. Detection of at least one MV versus no MVs was associated with an increased risk of virological failure (OR = 2.75, 95% CI = 1.35-5.60, P = 0.005); similar associations were observed for at least one NRTI MV versus no NRTI MVs (OR = 2.27, 95% CI = 0.76-6.77, P = 0.140) and at least one NNRTI MV versus no NNRTI MVs (OR = 2.41, 95% CI = 1.12-5.18, P = 0.024). A dose-effect relationship between virological failure and mutational load was found. CONCLUSIONS Pre-existing MVs more than doubled the risk of virological failure on first-line NNRTI-based ART.
Abstract:
OBJECTIVES This study aimed to update the Logistic Clinical SYNTAX score to predict 3-year survival after percutaneous coronary intervention (PCI) and to compare its performance with that of the SYNTAX score alone. BACKGROUND The SYNTAX score is a well-established angiographic tool for predicting long-term outcomes after PCI. The Logistic Clinical SYNTAX score, developed by combining clinical variables with the anatomic SYNTAX score, has been shown to perform better than the SYNTAX score alone in predicting 1-year outcomes after PCI. However, the ability of this score to predict long-term survival is unknown. METHODS Patient-level data (N = 6,304, 399 deaths within 3 years) from 7 contemporary PCI trials were analyzed. We revised the overall risk and the predictor effects in the core model (SYNTAX score, age, creatinine clearance, and left ventricular ejection fraction) using Cox regression analysis to predict mortality at 3 years. We also updated the extended model by combining the core model with additional independent predictors of 3-year mortality (i.e., diabetes mellitus, peripheral vascular disease, and body mass index). RESULTS The revised Logistic Clinical SYNTAX models showed better discriminative ability than the anatomic SYNTAX score for the prediction of 3-year mortality after PCI (c-index: SYNTAX score, 0.61; core model, 0.71; and extended model, 0.73 in a cross-validation procedure). The extended model in particular performed better in differentiating low- and intermediate-risk groups. CONCLUSIONS Risk scores combining clinical characteristics with the anatomic SYNTAX score predict 3-year mortality substantially better than the SYNTAX score alone and should be used for long-term risk stratification of patients undergoing PCI.
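The discrimination comparison rests on the concordance (c-)index: the probability that, for a random pair of patients, the model assigns the higher risk to the one who dies earlier. A sketch of its computation with lifelines, using synthetic data and a hypothetical risk score rather than the actual SYNTAX models:

    import numpy as np
    from lifelines.utils import concordance_index

    rng = np.random.default_rng(4)
    n = 500

    time = rng.exponential(36, n)               # months to death or censoring
    died = rng.integers(0, 2, n)                # 1 = death observed
    risk_score = -time + rng.normal(0, 20, n)   # hypothetical model output (higher = riskier)

    # concordance_index expects higher scores to mean longer survival, so negate risk.
    c = concordance_index(time, -risk_score, died)
    print(f"c-index = {c:.2f}")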
Abstract:
INTRODUCTION Optimal identification of subtle cognitive impairment in the primary care setting requires a very brief tool combining (a) patients' subjective impairments, (b) cognitive testing, and (c) information from informants. The present study developed a new, very quick and easily administered case-finding tool combining these assessments ('BrainCheck') and tested the feasibility and validity of this instrument in two independent studies. METHODS We developed a case-finding tool comprising patient-directed (a) questions about memory and depression and (b) clock drawing, and (c) the informant-directed 7-item version of the Informant Questionnaire on Cognitive Decline in the Elderly (IQCODE). Feasibility study: 52 general practitioners rated the feasibility and acceptance of the patient-directed tool. Validation study: An independent group of 288 Memory Clinic patients (mean ± SD age = 76.6 ± 7.9 years, education = 12.0 ± 2.6 years; 53.8% female) with diagnoses of mild cognitive impairment (n = 80), probable Alzheimer's disease (n = 185), or major depression (n = 23) and 126 demographically matched, cognitively healthy volunteers (age = 75.2 ± 8.8 years, education = 12.5 ± 2.7 years; 40% female) participated. All patient and healthy control participants were administered the patient-directed tool, and informants of 113 patient and 70 healthy control participants completed the very short IQCODE. RESULTS Feasibility study: General practitioners rated the patient-directed tool as highly feasible and acceptable. Validation study: A Classification and Regression Tree analysis generated an algorithm to categorize patient-directed data which resulted in a correct classification rate (CCR) of 81.2% (sensitivity = 83.0%, specificity = 79.4%). Critically, the CCR of the combined patient- and informant-directed instruments (BrainCheck) reached nearly 90% (89.4%; sensitivity = 97.4%, specificity = 81.6%). CONCLUSION A new and very brief instrument for general practitioners, 'BrainCheck', combined three sources of information deemed critical for effective case-finding (patients' subjective impairments, cognitive testing, informant information) and yielded a CCR of nearly 90%. It thus provides a very efficient and valid tool to aid general practitioners in deciding whether patients with suspected cognitive impairments should be further evaluated or not ('watchful waiting').
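A Classification and Regression Tree of the kind used to derive the categorization algorithm can be sketched with scikit-learn. The features, labels, and resulting cut-offs below are synthetic placeholders, not the actual BrainCheck items or thresholds.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier, export_text

    rng = np.random.default_rng(5)
    n = 414  # 288 patients + 126 controls, as in the validation study

    # Hypothetical patient-directed features: memory complaint score,
    # depression screen score, clock-drawing score.
    X = rng.integers(0, 7, (n, 3))
    y = (X[:, 2] < 3).astype(int)  # synthetic label standing in for 'impaired'

    tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
    print(export_text(tree, feature_names=["memory", "depression", "clock"]))

The printed tree is the kind of explicit decision rule a CART analysis yields for categorizing new patients.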
Abstract:
BACKGROUND Existing prediction models for mortality in chronic obstructive pulmonary disease (COPD) patients have not yet been validated in primary care, which is where the majority of patients receive care. OBJECTIVES Our aim was to validate the ADO (age, dyspnoea, airflow obstruction) index as a predictor of 2-year mortality in 2 general practice-based COPD cohorts. METHODS Six hundred and forty-six patients with COPD with GOLD (Global Initiative for Chronic Obstructive Lung Disease) stages I-IV were enrolled by their general practitioners and followed for 2 years. The ADO regression equation was used to predict a 2-year risk of all-cause mortality in each patient and this risk was compared with the observed 2-year mortality. Discrimination and calibration were assessed as well as the strength of association between the 15-point ADO score and the observed 2-year all-cause mortality. RESULTS Fifty-two (8.1%) patients died during the 2-year follow-up period. Discrimination with the ADO index was excellent with an area under the curve of 0.78 [95% confidence interval (CI) 0.71-0.84]. Overall, the predicted and observed risks matched well and visual inspection revealed no important differences between them across 10 risk classes (p = 0.68). The odds ratio for death per point increase according to the ADO index was 1.50 (95% CI 1.31-1.71). CONCLUSIONS The ADO index showed excellent prediction properties in an out-of-population validation carried out in COPD patients from primary care settings.
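Calibration, the agreement between predicted and observed risk that this validation inspected across 10 risk classes, can be checked by binning patients on predicted risk and comparing means per bin. A sketch with hypothetical predictions, not the ADO equation itself:

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(6)
    n = 646

    predicted = rng.beta(2, 20, n)        # hypothetical 2-year death risks
    died = rng.random(n) < predicted      # outcomes drawn to be well calibrated

    df = pd.DataFrame({"predicted": predicted, "died": died})
    df["risk_class"] = pd.qcut(df["predicted"], 10, labels=False)  # 10 risk classes

    calib = df.groupby("risk_class").agg(mean_predicted=("predicted", "mean"),
                                         observed=("died", "mean"))
    print(calib.round(3))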
Abstract:
BACKGROUND Exposure to medium or high doses of ionizing radiation is a known risk factor for cancer in children. The extent to which low dose radiation from natural sources contributes to the risk of childhood cancer remains unclear. OBJECTIVES In a nationwide census-based cohort study, we investigated whether the incidence of childhood cancer was associated with background radiation from terrestrial gamma and cosmic rays. METHODS Children aged <16 years in the Swiss National Censuses in 1990 and 2000 were included. The follow-up period lasted until 2008 and incident cancer cases were identified from the Swiss Childhood Cancer Registry. A radiation model was used to predict dose rates from terrestrial and cosmic radiation at locations of residence. Cox regression models were used to assess associations between cancer risk and dose rates and cumulative dose since birth. RESULTS Among 2,093,660 children included at census, 1,782 incident cases of cancer were identified including 530 with leukemia, 328 with lymphoma, and 423 with a tumor of the central nervous system (CNS). Hazard ratios for each mSv increase in cumulative dose of external radiation were 1.03 (95% CI: 1.01, 1.05) for any cancer, 1.04 (1.00, 1.08) for leukemia, 1.01 (0.96, 1.05) for lymphoma, and 1.04 (1.00, 1.08) for CNS tumors. Adjustment for a range of potential confounders had little effect on the results. CONCLUSIONS Our study suggests that background radiation may contribute to the risk of cancer in children including leukemia and CNS tumors.