Abstract:
Objectives: To update the 2006 systematic review of the comparative benefits and harms of erythropoiesis-stimulating agent (ESA) strategies and non-ESA strategies to manage anemia in patients undergoing chemotherapy and/or radiation for malignancy (excluding myelodysplastic syndrome and acute leukemia), including the impact of alternative thresholds for initiating treatment and optimal duration of therapy. Data sources: Literature searches were updated in electronic databases (n=3), conference proceedings (n=3), and Food and Drug Administration transcripts. Multiple sources (n=13) were searched for potential gray literature. A primary source for current survival evidence was a recently published individual patient data meta-analysis. In that meta-analysis, patient data were obtained from investigators for studies enrolling more than 50 patients per arm. Because those data constitute the most currently available data for this update, as well as the source for on-study (active treatment) mortality data, we limited inclusion in the current report to studies enrolling more than 50 patients per arm to avoid potential differential endpoint ascertainment in smaller studies. Review methods: Title and abstract screening was performed by one or two (to resolve uncertainty) reviewers; potentially included publications were reviewed in full text. Two or three (to resolve disagreements) reviewers assessed trial quality. Results were independently verified and pooled for outcomes of interest. The balance of benefits and harms was examined in a decision model. Results: We evaluated evidence from 5 trials directly comparing darbepoetin with epoetin, 41 trials comparing epoetin with control, and 8 trials comparing darbepoetin with control; 5 trials evaluated early versus late (delay until Hb ≤9 to 11 g/dL) treatment. 
Trials varied according to duration, tumor types, cancer therapy, trial quality, iron supplementation, baseline hemoglobin, ESA dosing frequency (and therefore amount per dose), and dose escalation. ESAs decreased the risk of transfusion (pooled relative risk [RR], 0.58; 95% confidence interval [CI], 0.53 to 0.64; I² = 51%; 38 trials) without evidence of meaningful difference between epoetin and darbepoetin. Thromboembolic event rates were higher in ESA-treated patients (pooled RR, 1.51; 95% CI, 1.30 to 1.74; I² = 0%; 37 trials) without difference between epoetin and darbepoetin. In 14 trials reporting the Functional Assessment of Cancer Therapy (FACT)-Fatigue subscale, the most common patient-reported outcome, scores decreased by −0.6 in control arms (95% CI, −6.4 to 5.2; I² = 0%) and increased by 2.1 in ESA arms (95% CI, −3.9 to 8.1; I² = 0%). There were fewer thromboembolic and on-study mortality adverse events when ESA treatment was delayed until baseline Hb was less than 10 g/dL, in keeping with current treatment practice, but the difference in effect from early treatment was not significant, and the evidence was limited and insufficient for conclusions. No evidence informed optimal duration of therapy. Mortality was increased during the on-study period (pooled hazard ratio [HR], 1.17; 95% CI, 1.04 to 1.31; I² = 0%; 37 trials). There was one additional death for every 59 treated patients when the control-arm on-study mortality was 10 percent and one additional death for every 588 treated patients when the control-arm on-study mortality was 1 percent. A cohort decision model yielded a consistent result: greater loss of life-years when control-arm on-study mortality was higher. There was no discernible increase in mortality with ESA use over the longest available follow-up (pooled HR, 1.04; 95% CI, 0.99 to 1.10; I² = 38%; 44 trials), but many trials did not include an overall survival endpoint and potential time-dependent confounding was not considered.
Conclusions: Results of this update were consistent with the 2006 review. ESAs reduced the need for transfusions and increased the risk of thromboembolism. FACT-Fatigue scores were better with ESA use but the magnitude was less than the minimal clinically important difference. An increase in mortality accompanied the use of ESAs. An important unanswered question is whether dosing practices and overall ESA exposure might influence harms.
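The excess-death figures quoted in the results above follow from simple risk arithmetic: treating the pooled on-study hazard ratio of 1.17 as an approximate risk ratio, the absolute excess risk is the control-arm mortality times (HR − 1), and the number of treated patients per additional death is its reciprocal. A minimal sketch (the function name is mine, not from the report):

```python
def patients_per_excess_death(control_mortality: float, hr: float) -> float:
    """Approximate number of treated patients per one additional death,
    treating the hazard ratio as a risk ratio over the on-study period."""
    excess_risk = control_mortality * (hr - 1.0)  # absolute risk increase
    return 1.0 / excess_risk

# Reproducing the abstract's figures with pooled on-study HR = 1.17:
print(round(patients_per_excess_death(0.10, 1.17)))  # 59 at 10% control-arm mortality
print(round(patients_per_excess_death(0.01, 1.17)))  # 588 at 1% control-arm mortality
```

This first-order approximation reproduces the 59 and 588 reported in the abstract; a proportional-hazards calculation on the survival scale would give slightly different numbers.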
Abstract:
Reporting and publication bias is a well-known problem in meta-analysis and healthcare research. In 2002 we conducted a meta-analysis on the effects of erythropoiesis-stimulating agents (ESAs) on overall survival in cancer patients, which suggested some evidence for improved survival in patients receiving ESAs compared with controls. However, a meta-analysis of individual patient data conducted several years later showed the opposite of our first meta-analysis, that is, evidence for increased on-study mortality and reduced overall survival in cancer patients receiving ESAs. We aimed to determine whether the results of our first meta-analysis could have been affected by publication and reporting biases and, if so, whether timely access to clinical study reports and individual patient data could have prevented this. We conducted a hypothetical meta-analysis for overall survival including all studies and study data that could have been available in 2002, at the time when we conducted our first meta-analysis. Compared with our original meta-analysis, which suggested an overall survival benefit for cancer patients receiving ESAs [hazard ratio (HR) 0.81, 95% confidence interval (CI) 0.67‒0.99], our hypothetical meta-analysis based on the results of all studies conducted at the time of the first analysis did not show evidence for a beneficial effect of ESAs on overall survival (HR 0.97, 95% CI 0.83‒1.12). Thus we have to conclude that our first meta-analysis showed misleading overall survival benefits due to publication and reporting biases, which could have been prevented by timely access to clinical study reports and individual patient data. Unrestricted access to clinical study protocols including amendments, clinical study reports and individual patient data is needed to ensure timely detection of both beneficial and harmful effects of healthcare interventions.
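The pooled hazard ratios compared above come from standard meta-analytic pooling of study-level effects on the log scale. A minimal fixed-effect inverse-variance sketch on hypothetical study data (the numbers below are illustrative, not the ESA trial results):

```python
import math

def pool_hazard_ratios(studies):
    """Fixed-effect inverse-variance pooling of (HR, lower 95% CI, upper 95% CI)
    tuples on the log scale; returns (pooled HR, lower 95% CI, upper 95% CI)."""
    weights, weighted_logs = [], []
    for hr, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE recovered from CI width
        w = 1.0 / se**2                                   # inverse-variance weight
        weights.append(w)
        weighted_logs.append(w * math.log(hr))
    pooled_log = sum(weighted_logs) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return (math.exp(pooled_log),
            math.exp(pooled_log - 1.96 * pooled_se),
            math.exp(pooled_log + 1.96 * pooled_se))

# Hypothetical studies: (HR, 95% CI lower, 95% CI upper)
pooled_hr, lo, hi = pool_hazard_ratios([(0.85, 0.70, 1.03),
                                        (1.10, 0.90, 1.34),
                                        (0.95, 0.80, 1.13)])
```

As the abstract illustrates, a pooled estimate like this is only as trustworthy as the set of studies fed into it; omitting unpublished studies shifts the pooled HR without any error in the arithmetic.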
Abstract:
In the field of thrombosis and haemostasis, many preanalytical variables influence the results of coagulation assays and measures to limit potential results variations should be taken. To our knowledge, no paper describing the development and maintenance of a haemostasis biobank has been previously published. Our description of the biobank of the Swiss cohort of elderly patients with venous thromboembolism (SWITCO65+) is intended to facilitate the set-up of other biobanks in the field of thrombosis and haemostasis. SWITCO65+ is a multicentre cohort that prospectively enrolled consecutive patients aged ≥65 years with venous thromboembolism at nine Swiss hospitals from 09/2009 to 03/2012. Patients will be followed up until December 2013. The cohort includes a biobank with biological material from each participant taken at baseline and after 12 months of follow-up. Whole blood from all participants is assayed with a standard haematology panel, for which fresh samples are required. Two buffy coat vials, one PAXgene Blood RNA System tube and one EDTA-whole blood sample are also collected at baseline for RNA/DNA extraction. Blood samples are processed and vialed within 1 h of collection and transported in batches to a central laboratory where they are stored in ultra-low temperature archives. All analyses of the same type are performed in the same laboratory in batches. Using multiple core laboratories increased the speed of sample analyses and reduced storage time. After recruiting, processing and analyzing the blood of more than 1,000 patients, we determined that the adopted methods and technologies were fit-for-purpose and robust.
Abstract:
BACKGROUND Mortality risk for people with chronic kidney disease is substantially greater than that for the general population, increasing to a 7-fold greater risk for those on dialysis therapy. Higher body mass index, generally due to higher energy intake, appears protective for people on dialysis therapy, but the relationship between energy intake and survival in those with reduced kidney function is unknown. STUDY DESIGN Prospective cohort study with a median follow-up of 14.5 (IQR, 11.2-15.2) years. SETTING & PARTICIPANTS Blue Mountains Area, west of Sydney, Australia. Participants in the general community enrolled in the Blue Mountains Eye Study (n=2,664) who underwent a detailed interview, food frequency questionnaire, and physical examination including body weight, height, blood pressure, and laboratory tests. PREDICTORS Relative energy intake, food components (carbohydrates, total sugars, fat, protein, and water), and estimated glomerular filtration rate (eGFR). Relative energy intake was dichotomized at 100%, and eGFR, at 60 mL/min/1.73 m². OUTCOMES All-cause and cardiovascular mortality. MEASUREMENTS All-cause and cardiovascular mortality using unadjusted and adjusted Cox proportional hazards regression models. RESULTS 949 people died during follow-up, 318 of cardiovascular events. In people with eGFR <60 mL/min/1.73 m² (n=852), there was an increased risk of all-cause mortality (HR, 1.48; P=0.03), but no increased risk of cardiovascular mortality (HR, 1.59; P=0.1), among those with higher relative energy intake compared with those with lower relative energy intake. Increasing intake of carbohydrates (HR per 100 g/d, 1.50; P=0.04) and total sugars (HR per 100 g/d, 1.62; P=0.03) was significantly associated with increased risk of cardiovascular mortality. LIMITATIONS Under-reporting of energy intake, baseline laboratory and food intake values only, white population.
CONCLUSIONS Increasing relative energy intake was associated with increased all-cause mortality in patients with eGFR <60 mL/min/1.73 m². This effect may be mediated by the impact of higher total sugar intake on subsequent cardiovascular events.
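Both Blue Mountains analyses hinge on eGFR dichotomized at 60 mL/min/1.73 m². The abstracts do not say which estimating equation was used; purely as an illustration of how such a threshold is applied, the 2009 CKD-EPI creatinine equation can be sketched as:

```python
def ckd_epi_egfr(scr_mg_dl: float, age: int, female: bool, black: bool = False) -> float:
    """2009 CKD-EPI creatinine equation, returning eGFR in mL/min/1.73 m^2.
    Illustrative only; these studies may have used a different equation (e.g. MDRD)."""
    kappa = 0.7 if female else 0.9     # sex-specific creatinine breakpoint
    alpha = -0.329 if female else -0.411
    ratio = scr_mg_dl / kappa
    egfr = (141
            * min(ratio, 1.0) ** alpha       # applies below the breakpoint
            * max(ratio, 1.0) ** -1.209      # applies above the breakpoint
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

# A 60-year-old man with serum creatinine 1.0 mg/dL sits above the
# eGFR 60 mL/min/1.73 m^2 threshold used in these cohorts:
print(round(ckd_epi_egfr(1.0, 60, female=False)))  # roughly 81
```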
Abstract:
BACKGROUND Chronic kidney disease is associated with an increased risk of cancer, but whether reduced kidney function also leads to increased cancer mortality is uncertain. The aim of our study was to assess the independent effects of reduced kidney function on the risk of cancer death. STUDY DESIGN Prospective population-based cohort study. SETTING & PARTICIPANTS Participants of the Blue Mountains Eye Study (n=4,077; aged 49-97 years). PREDICTOR Estimated glomerular filtration rate (eGFR). OUTCOMES Overall and site-specific cancer mortality. RESULTS During a median follow-up of 12.8 (IQR, 8.6-15.8) years, 370 cancer deaths were observed in our study cohort. For every 10 mL/min/1.73 m² reduction in eGFR, there was an 18% increase in cancer-specific mortality in the fully adjusted model (P<0.001). Compared with participants with eGFR ≥60 mL/min/1.73 m², the adjusted HR for cancer-specific mortality for those with eGFR <60 mL/min/1.73 m² was 1.27 (95% CI, 1.00-1.60; P=0.05). This excess cancer mortality varied with site, with the greatest risk for breast and urinary tract cancer deaths (adjusted HRs of 1.99 [95% CI, 1.05-3.85; P=0.01] and 2.54 [95% CI, 1.02-6.44; P=0.04], respectively). LIMITATIONS Residual confounding, such as from unmeasured socioeconomic factors and the potential effects of erythropoiesis-stimulating agents on cancer deaths, may have occurred. CONCLUSIONS eGFR <60 mL/min/1.73 m² appears to be a significant risk factor for death from cancer. These effects appear to be site specific, with breast and urinary tract cancers incurring the greatest risk of death among those with reduced kidney function.
Abstract:
BACKGROUND Drinking eight glasses of fluid or water each day is widely believed to improve health, but evidence is sparse and conflicting. We aimed to investigate the association between fluid consumption and long-term mortality and kidney function. METHODS We conducted a longitudinal analysis within a prospective, population-based cohort study of 3858 men and women aged 49 years or older residing in Australia. Daily fluid intake from food and beverages, not including water, was measured using a food frequency questionnaire. We fitted multivariable-adjusted Cox proportional hazards models for all-cause and cardiovascular mortality and used a bootstrapping procedure for estimated glomerular filtration rate (eGFR). RESULTS Upper and lower quartiles of daily fluid intake corresponded to >3 L and <2 L, respectively. During a median follow-up of 13.1 years (total 43,093 years at risk), 1127 deaths (26.1 per 1000 years at risk), including 580 cardiovascular deaths (13.5 per 1000 years at risk), occurred. Daily fluid intake (per 250 mL increase) was not associated with all-cause [adjusted hazard ratio (HR) 0.99 (95% CI 0.98-1.01)] or cardiovascular mortality [HR 0.98 (95% CI 0.95-1.01)]. Overall, eGFR declined by 2.2 mL/min/1.73 m² (SD 10.9) in the 1207 (31%) participants who had repeat creatinine measurements, and this decline was not associated with fluid intake [adjusted regression coefficient 0.06 mL/min/1.73 m² per 250 mL increase (95% CI -0.03 to 0.14)]. CONCLUSIONS Fluid intake from food and beverages excluding water is not associated with improved kidney function or reduced mortality.
Abstract:
Background: New oral anticoagulants (NOACs) are predicted to become the new standard treatment for stroke prevention in patients with atrial fibrillation, and may replace vitamin K antagonists (VKAs). NOACs are prescribed less than expected, even though they do not require international normalised ratio (INR) monitoring. In this study we assessed methods for INR monitoring after the introduction of NOACs in a heterogeneous sample of countries. Methods: We asked representatives of the Vasco da Gama Movement, a network of junior and future general practitioners (GPs) in Europe, and WONCA, the World Organization of Family Doctors, to describe the way INR is monitored in their respective countries. Results: Representatives of 14 countries responded. In most countries, the INR is monitored by GPs; in some countries, these patients are treated by other specialists or in specialised anticoagulation centres. In only a few countries do anticoagulated patients monitor the INR themselves. Conclusion: Our study showed several strategies for managing anticoagulation in different countries. In most countries, the INR is monitored by GPs. These consultations offer opportunities to address other issues, such as blood pressure control or medication adherence. These factors may be considered when deciding to switch patients from VKAs to NOACs.
Abstract:
OBJECTIVES Decisions to use condoms are made within partnerships. We examined the associations between inconsistent or no condom use and individual and partnership characteristics. We also examined the relative importance of individual versus partnership factors. METHODS Cross-sectional study of heterosexual individuals enrolled from the sexually transmitted infections (STI) outpatient clinic in Amsterdam, the Netherlands, from May to August 2010. Participants completed a questionnaire about sexual behaviour with the last four partners in the preceding year. Participant and partnership factors associated with inconsistent or no condom use in steady and casual partnerships were identified. RESULTS 2144 individuals were included, reporting 6401 partnerships; 54.7% were female, the median age was 25 (IQR 22-30) years and 79.9% were Dutch. Inconsistent or no condom use occurred in 13.9% of 2387 steady partnerships and in 33.5% of 4014 casual partnerships. There was statistical evidence of associations between inconsistent condom use in steady partnerships and ethnic concordance, longer duration, higher number of sex acts, practising anal sex, and sex-related drug use. In casual partnerships, associations were found with having an older partner, ethnic concordance, longer duration, higher number of sex acts, anal sex, sex-related drug use, ongoing partnerships and concurrency. In multivariable models, partnership factors explained 50.9% of the variance in steady partnerships and 70.1% in casual partnerships, compared with 10.5% and 15.4%, respectively, for individual factors. CONCLUSIONS Among heterosexual STI clinic attendees in Amsterdam, partnership factors are more strongly associated with inconsistent condom use than characteristics of the individual.
Abstract:
Purpose To provide normal values of the cervical spinal canal and spinal cord dimensions in several planes with respect to spinal level, age, sex, and body height. Materials and Methods This study was approved by the institutional review board; all individuals provided signed informed consent. In a prospective multicenter study, two blinded raters independently examined cervical spine magnetic resonance (MR) images of 140 healthy white volunteers. The midsagittal diameters and areas of the spinal canal and spinal cord were measured at the midvertebral levels of C1, C3, and C6. A multivariate general linear model described the influence of sex, body height, age, and spinal level on the measured values. Results There were differences for sex, spinal level, the interaction between sex and level, and body height, while age had a significant yet limited influence. Normative ranges for the sagittal diameters and areas of the spinal canal and spinal cord were defined at the C1, C3, and C6 levels for men and women. In addition to the calculation of normative ranges for a specific sex, spinal level, age, and body height, data for three different height subgroups at 45 years of age were extracted. These results show a range of spinal canal dimensions at the C1 (from 10.7 to 19.7 mm), C3 (from 9.4 to 17.2 mm), and C6 (from 9.2 to 16.8 mm) levels. Conclusion The dimensions of the cervical spinal canal and cord in healthy individuals are associated with spinal level, sex, age, and height. © RSNA, 2013. Online supplemental material is available for this article.
Abstract:
Background Few studies have monitored late presentation (LP) of HIV infection over the European continent, including Eastern Europe. Study objectives were to explore the impact of LP on AIDS and mortality. Methods and Findings LP was defined in Collaboration of Observational HIV Epidemiological Research Europe (COHERE) as HIV diagnosis with a CD4 count <350/mm3 or an AIDS diagnosis within 6 months of HIV diagnosis among persons presenting for care between 1 January 2000 and 30 June 2011. Logistic regression was used to identify factors associated with LP and Poisson regression to explore the impact on AIDS/death. 84,524 individuals from 23 cohorts in 35 countries contributed data; 45,488 were LP (53.8%). LP was highest in heterosexual males (66.1%), Southern European countries (57.0%), and persons originating from Africa (65.1%). LP decreased from 57.3% in 2000 to 51.7% in 2010/2011 (adjusted odds ratio [aOR] 0.96; 95% CI 0.95–0.97). LP decreased over time in both Central and Northern Europe among homosexual men, and male and female heterosexuals, but increased over time for female heterosexuals and male intravenous drug users (IDUs) from Southern Europe and in male and female IDUs from Eastern Europe. 8,187 AIDS/deaths occurred during 327,003 person-years of follow-up. In the first year after HIV diagnosis, LP was associated with over a 13-fold increased incidence of AIDS/death in Southern Europe (adjusted incidence rate ratio [aIRR] 13.02; 95% CI 8.19–20.70) and over a 6-fold increased rate in Eastern Europe (aIRR 6.64; 95% CI 3.55–12.43). Conclusions LP has decreased over time across Europe, but remains a significant issue in the region in all HIV exposure groups. LP increased in male IDUs and female heterosexuals from Southern Europe and IDUs in Eastern Europe. LP was associated with an increased rate of AIDS/deaths, particularly in the first year after HIV diagnosis, with significant variation across Europe. 
Earlier and more widespread testing, timely referrals after testing positive, and improved retention in care strategies are required to further reduce the incidence of LP.
Reasons for heterogeneous change in lung clearance index (LCI) in children with cystic fibrosis after antibiotic treatment.
Abstract:
OBJECTIVE Measuring children's health-related quality of life (HRQOL) is of growing importance given increasing chronic diseases. By integrating HRQOL questions into the European GABRIEL study, we assessed differences in HRQOL between rural farm and non-farm children from Germany, Austria, Switzerland and Poland, related them to common childhood health problems, and compared them to a representative, mostly urban German population sample (KIGGS). METHODS The parents of 10,400 school-aged children answered comprehensive questionnaires including health-related questions and the KINDL-R questions assessing HRQOL. RESULTS Austrian children reported the highest KINDL-R scores (mean: 80.9; 95% CI [80.4, 81.4]) and Polish children the lowest (74.5; [73.9, 75.0]). Farm children reported higher KINDL-R scores than non-farm children (p = 0.002). Significantly lower scores were observed in children with allergic diseases (p < 0.001), in children with sleeping difficulties (p < 0.001) and in overweight children (p = 0.04). The German GABRIEL sample reported higher mean scores (age 7-10 years: 80.1, [79.9, 80.4]; age 11-13 years: 77.1, [74.9, 79.2]) compared to the urban KIGGS study (age 7-10 years: 79.0, [78.7, 79.3]; age 11-13 years: 75.1, [74.6, 75.6]). Socio-demographic and health-related factors could not explain the differences in HRQOL between countries. CONCLUSIONS Future increases in chronic diseases may negatively impact children's HRQOL.
Abstract:
Background. Few studies consider the incidence of individual AIDS-defining illnesses (ADIs) at higher CD4 counts, relevant on a population level for monitoring and resource allocation. Methods. Individuals from the Collaboration of Observational HIV Epidemiological Research Europe (COHERE) aged ≥14 years with ≥1 CD4 count of ≥200 cells/µL between 1998 and 2010 were included. Incidence rates (per 1000 person-years of follow-up [PYFU]) were calculated for each ADI within different CD4 strata; Poisson regression, using generalized estimating equations and robust standard errors, was used to model rates of ADIs with current CD4 ≥500 cells/µL. Results. A total of 12,135 ADIs occurred at a CD4 count of ≥200 cells/µL among 207,539 persons with 1,154,803 PYFU. Incidence rates declined from 20.5 per 1000 PYFU (95% confidence interval [CI], 20.0–21.1 per 1000 PYFU) with current CD4 200–349 cells/µL to 4.1 per 1000 PYFU (95% CI, 3.6–4.6 per 1000 PYFU) with current CD4 ≥1000 cells/µL. Persons with a current CD4 of 500–749 cells/µL had a significantly higher rate of ADIs (adjusted incidence rate ratio [aIRR], 1.20; 95% CI, 1.10–1.32), whereas those with a current CD4 of ≥1000 cells/µL had a similar rate (aIRR, 0.92; 95% CI, 0.79–1.07), compared to a current CD4 of 750–999 cells/µL. Results were consistent in persons with high or low viral load. Findings were stronger for malignant ADIs (aIRR, 1.52; 95% CI, 1.25–1.86) than for nonmalignant ADIs (aIRR, 1.12; 95% CI, 1.01–1.25), comparing persons with a current CD4 of 500–749 cells/µL to 750–999 cells/µL. Discussion. The incidence of ADIs was higher in individuals with a current CD4 count of 500–749 cells/µL compared to those with a CD4 count of 750–999 cells/µL, but did not decrease further at higher CD4 counts. Results were similar in patients virologically suppressed on combination antiretroviral therapy, suggesting that immune reconstitution is not complete until the CD4 count increases to >750 cells/µL.
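The incidence rates per 1000 PYFU quoted above, with their confidence intervals, follow from standard Poisson rate estimation. A minimal sketch using illustrative event counts (not COHERE's actual per-stratum counts, which the abstract does not report):

```python
import math

def poisson_rate_per_1000(events: int, pyfu: float):
    """Incidence rate per 1000 person-years with an approximate 95% CI,
    using the log-normal approximation rate * exp(+/- 1.96 / sqrt(events))."""
    rate = events / pyfu * 1000
    half_width = 1.96 / math.sqrt(events)  # CI half-width on the log scale
    return rate, rate * math.exp(-half_width), rate * math.exp(half_width)

# Illustrative stratum: 4600 events over 224,000 PYFU
rate, lo, hi = poisson_rate_per_1000(4600, 224_000)
```

With large event counts the CI is narrow, as in the 200–349 cells/µL stratum above; with few events (as in the highest CD4 strata) the same formula yields the wider intervals reported.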
Abstract:
Although persons infected with human immunodeficiency virus (HIV), particularly men who have sex with men, are at excess risk for anal cancer, it has been difficult to disentangle the influences of anal exposure to human papillomavirus (HPV) infection, immunodeficiency, and combined antiretroviral therapy. A case-control study that included 59 anal cancer cases and 295 individually matched controls was nested in the Swiss HIV Cohort Study (1988-2011). In a subset of 41 cases and 114 controls, HPV antibodies were tested. A majority of anal cancer cases (73%) were men who have sex with men. Current smoking was significantly associated with anal cancer (odds ratio (OR) = 2.59, 95% confidence interval (CI): 1.25, 5.34), as were antibodies against L1 (OR = 4.52, 95% CI: 2.00, 10.20) and E6 (OR = ∞, 95% CI: 4.64, ∞) of HPV16, as well as low CD4+ cell counts, whether measured at nadir (OR per 100-cell/μL decrease = 1.53, 95% CI: 1.18, 2.00) or at cancer diagnosis (OR per 100-cell/μL decrease = 1.24, 95% CI: 1.08, 1.42). However, the influence of CD4+ cell counts appeared to be strongest 6-7 years prior to anal cancer diagnosis (OR for <200 vs. ≥500 cells/μL = 14.0, 95% CI: 3.85, 50.9). Smoking cessation and avoidance of even moderate levels of immunosuppression appear to be important in reducing long-term anal cancer risks.