850 results for relative risk


Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND: In newly diagnosed patients with Hodgkin lymphoma (HL), the effect of doxorubicin, bleomycin, vinblastine and dacarbazine (ABVD)-related neutropenia on chemotherapy delivery is poorly documented. The aim of this analysis was to assess the impact of chemotherapy-induced neutropenia (CIN) on ABVD chemotherapy delivery in HL patients. STUDY DESIGN: Data from two similarly designed, prospective, observational studies conducted in the US and the EU were analysed. One hundred and fifteen HL patients who started a new course of ABVD during 2002-2005 were included. The primary objective was to document the effect of neutropenic complications on delivery of ABVD chemotherapy in HL patients. Secondary objectives were to investigate the incidence of CIN and febrile neutropenia (FN) and to compare US and EU practice with ABVD therapy in HL. Pooled data were analysed to explore univariate associations with neutropenic events. RESULTS: Chemotherapy delivery was suboptimal (relative dose intensity ≤ 85%) in 18-22% of patients. The incidence of grade 4 CIN in cycles 1-4 was lower in US patients (US 24% vs. EU 32%). Patients in both the US and the EU experienced similar rates of FN across cycles 1-4 (US 12% vs. EU 11%). Use of primary colony-stimulating factor (CSF) prophylaxis and of any CSF was more common in the US than the EU (37% vs. 4% and 78% vs. 38%, respectively). The relative risk (RR) of dose delays was 1.54 (95% confidence interval [CI] 1.08-2.23, p = 0.036) for patients with vs. without grade 4 CIN, and the RR of grade 4 CIN was 0.35 (95% CI 0.12-1.06, p = 0.046) for patients with vs. without primary CSF prophylaxis. CONCLUSIONS: In this population of HL patients, CIN was frequent and FN occurrence clinically relevant. Chemotherapy delivery was suboptimal. CSF prophylaxis appeared to reduce CIN rates.
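A relative risk like the dose-delay RR of 1.54 above is a ratio of two risks from a 2×2 table, with the confidence interval usually computed on the log scale. A minimal sketch (the counts are hypothetical, not the study's data):

```python
import math

def relative_risk(a, b, c, d, z=1.96):
    """RR and 95% CI from a 2x2 table:
    a/b = exposed with/without the event, c/d = unexposed with/without."""
    risk_exposed = a / (a + b)
    risk_unexposed = c / (c + d)
    rr = risk_exposed / risk_unexposed
    # standard error of log(RR) (Katz log method)
    se = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# hypothetical counts: 30/100 delays with grade 4 CIN, 20/100 without
rr, lo, hi = relative_risk(30, 70, 20, 80)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # RR = 1.50 (95% CI 0.92-2.46)
```

Note that with these illustrative counts the CI crosses 1, so the RR alone would not establish significance.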


Indoor residual spraying (IRS) has become an increasingly popular method of insecticide use for malaria control, and many recent studies have reported on its effectiveness in reducing malaria burden in a single community or region. There is a need for systematic review and integration of the published literature on IRS and the contextual determining factors of its success in controlling malaria. This study reports the findings of a meta-regression analysis based on 13 published studies, which were chosen from more than 400 articles through a systematic search and selection process. The summary relative risk for reducing malaria prevalence was 0.38 (95% confidence interval = 0.31-0.46), which indicated a risk reduction of 62%. However, an excessive degree of heterogeneity was found between the studies. The meta-regression analysis indicates that IRS is more effective with high initial prevalence, multiple rounds of spraying, use of DDT, and in regions with a combination of Plasmodium falciparum and P. vivax malaria.
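The summary RR of 0.38 corresponds to the stated 62% risk reduction (1 − 0.38). Pooling of this kind is commonly done with inverse-variance weights on the log scale; the per-study RRs and CIs below are hypothetical, not the 13 studies from the review:

```python
import math

# Hypothetical study-level RRs with 95% CIs (illustrative only),
# pooled on the log scale with inverse-variance (fixed-effect) weights.
studies = [(0.30, 0.20, 0.45), (0.45, 0.33, 0.61), (0.40, 0.25, 0.64)]

num = den = 0.0
for rr, lo, hi in studies:
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # back out SE from the CI
    w = 1.0 / se ** 2                                # inverse-variance weight
    num += w * math.log(rr)
    den += w

log_pooled = num / den
pooled = math.exp(log_pooled)
se_pooled = math.sqrt(1.0 / den)
ci = (math.exp(log_pooled - 1.96 * se_pooled),
      math.exp(log_pooled + 1.96 * se_pooled))
print(f"pooled RR = {pooled:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f}); "
      f"risk reduction = {1 - pooled:.0%}")
```

With the heterogeneity reported in the abstract, a random-effects model would widen this interval; the fixed-effect version is shown only for brevity.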


Approximately 45,000 individuals are hospitalized annually for burn treatment. Rehabilitation after hospitalization can offer a significant improvement in functional outcomes. Very little is known nationally about rehabilitation for burns, and practices may vary substantially by region, judging from observed Medicare post-hospitalization spending. This study was designed to measure variation in rehabilitation utilization by state of hospitalization for patients hospitalized with burn injury. This retrospective cohort study used nationally collected data over a 10-year period (2001 to 2010) from the Healthcare Cost and Utilization Project (HCUP) State Inpatient Databases (SID). Patients hospitalized for burn injury (n = 57,968) were identified by ICD-9-CM codes and examined specifically to determine whether they were discharged directly to inpatient rehabilitation after hospitalization (primary endpoint). Both unadjusted and adjusted likelihoods were calculated for each state, taking into account the effects of age, insurance status, hospitalization at a burn center, and extent of burn injury by total body surface area (TBSA). The relative risk of discharge to inpatient rehabilitation varied by as much as 6-fold among different states. Higher TBSA, having health insurance, higher age, and burn center hospitalization all increased the likelihood of discharge to inpatient rehabilitation following acute care hospitalization. There was significant variation between states in inpatient rehabilitation utilization after adjusting for variables known to affect each outcome. Future efforts should be focused on identifying the cause of this state-to-state variation, its relationship to patient outcome, and standardizing treatment across the United States.


OBJECTIVE: To ascertain the degree of variation, by state of hospitalization, in outcomes associated with traumatic brain injury (TBI) in a pediatric population. DESIGN: A retrospective cohort study of pediatric patients admitted to a hospital with a TBI. SETTING: Hospitals from states in the United States that voluntarily participate in the Agency for Healthcare Research and Quality's Healthcare Cost and Utilization Project. PARTICIPANTS: Pediatric (age ≤ 19 y) patients hospitalized for TBI (N=71,476) in the United States during 2001, 2004, 2007, and 2010. INTERVENTIONS: None. MAIN OUTCOME MEASURES: Primary outcome was proportion of patients discharged to rehabilitation after an acute care hospitalization among alive discharges. The secondary outcome was inpatient mortality. RESULTS: The relative risk of discharge to inpatient rehabilitation varied by as much as 3-fold among the states, and the relative risk of inpatient mortality varied by as much as nearly 2-fold. In the United States, approximately 1981 patients could be discharged to inpatient rehabilitation care if the observed variation in outcomes was eliminated. CONCLUSIONS: There was significant variation between states in both rehabilitation discharge and inpatient mortality after adjusting for variables known to affect each outcome. Future efforts should be focused on identifying the cause of this state-to-state variation, its relationship to patient outcome, and standardizing treatment across the United States.


BACKGROUND: Hypertension and cognitive impairment are prevalent in older people. It is known that hypertension is a direct risk factor for vascular dementia and recent studies have suggested hypertension also impacts upon prevalence of Alzheimer's disease. The question is therefore whether treatment of hypertension lowers the rate of cognitive decline. OBJECTIVES: To assess the effects of blood pressure lowering treatments for the prevention of dementia and cognitive decline in patients with hypertension but no history of cerebrovascular disease. SEARCH STRATEGY: The trials were identified through a search of CDCIG's Specialised Register, CENTRAL, MEDLINE, EMBASE, PsycINFO and CINAHL on 27 April 2005. SELECTION CRITERIA: Randomized, double-blind, placebo controlled trials in which pharmacological or non-pharmacological interventions to lower blood pressure were given for at least six months. DATA COLLECTION AND ANALYSIS: Two independent reviewers assessed trial quality and extracted data. The following outcomes were assessed: incidence of dementia, cognitive change from baseline, blood pressure level, incidence and severity of side effects and quality of life. MAIN RESULTS: Three trials including 12,091 hypertensive subjects were identified. Average age was 72.8 years. Participants were recruited from industrialised countries. Mean blood pressure at entry across the studies was 170/84 mmHg. All trials instituted a stepped care approach to hypertension treatment, starting with a calcium-channel blocker, a diuretic or an angiotensin receptor blocker. The combined result of the three trials reporting incidence of dementia indicated no significant difference between treatment and placebo (Odds Ratio (OR) = 0.89, 95% CI 0.69, 1.16). 
Blood pressure reduction resulted in an 11% relative risk reduction of dementia in patients with no prior cerebrovascular disease, but this effect was not statistically significant (p = 0.38) and there was considerable heterogeneity between the trials. The combined results from the two trials reporting change in Mini Mental State Examination (MMSE) did not indicate a benefit from treatment (Weighted Mean Difference (WMD) = 0.10, 95% CI -0.03, 0.23). Both systolic and diastolic blood pressure levels were reduced significantly in the two trials assessing this outcome (WMD = -7.53, 95% CI -8.28, -6.77 for systolic blood pressure; WMD = -3.87, 95% CI -4.25, -3.50 for diastolic blood pressure). Two trials reported adverse effects requiring discontinuation of treatment, and the combined results indicated a significant benefit from placebo (OR = 1.18, 95% CI 1.06, 1.30). When analysed separately, however, patients on placebo in SCOPE were more likely to discontinue treatment due to side effects; the converse was true in SHEP 1991. Quality of life data could not be analysed in the three studies. There was difficulty with the control group in this review, as many of the control subjects received antihypertensive treatment because their blood pressures exceeded pre-set values. In most cases the study became a comparison of the study drug against a usual antihypertensive regimen. AUTHORS' CONCLUSIONS: There was no convincing evidence from the trials identified that blood pressure lowering prevents the development of dementia or cognitive impairment in hypertensive patients with no apparent prior cerebrovascular disease. There were significant problems identified with analysing the data, however, due to the number of patients lost to follow-up and the number of placebo patients given active treatment. This introduced bias.
More robust results may be obtained by analysing one year data to reduce differential drop-out or by conducting a meta-analysis using individual patient data.


Rationale: Lung inflammation and injury are critical in cystic fibrosis. An ideal anti-inflammatory agent has not been identified, but inhaled corticosteroids are widely used despite a lack of evidence.

Objectives: To test the safety of withdrawal of inhaled corticosteroids, with the hypothesis that this would not be associated with an earlier onset of acute chest exacerbations.

Methods: Multicenter randomized double-blind placebo-controlled trial in 18 pediatric and adult UK centers. Eligibility criteria included age > 6.0 yr, FEV1 ≥ 40% predicted, and corticosteroid use > 3 mo. During the 2-mo run-in period, all patients received fluticasone; they then took either fluticasone or placebo for 6 mo.

Measurements and Main Results: Fluticasone group: n = 84, median age 14.6 yr, mean (SD) FEV1 76% (18); placebo group: n = 87, median age 15.8 yr, mean (SD) FEV1 76% (18). There was no difference in time to first exacerbation (primary outcome), with a hazard ratio (95% confidence interval) of 1.07 (0.68 to 1.70) for fluticasone versus placebo. There was no effect of age, atopy, corticosteroid dose, FEV1, or Pseudomonas aeruginosa status. There was no change in lung function or differences in antibiotic or rescue bronchodilator use. Fewer patients in the fluticasone group withdrew from the study due to lung-related adverse events (9% vs. 15%), with a relative risk (95% confidence interval) of 0.59 (0.23–1.48) for fluticasone versus placebo.

Conclusions: In this study population (applicable to 40% of patients with cystic fibrosis in the UK), it appears safe to consider stopping inhaled corticosteroids. Potential advantages will be to reduce the drug burden on patients, reduce adverse effects, and make financial savings.


The objective of the current study was to evaluate the effect of a debriefing call on nutrient intake estimates using two 3-d food diaries among women participating in the Women's Health and Interview Study (WISH) Diet Validation Study. Subjects were 207 women with complete data and six 24-h recalls (24-HR) by telephone over 8 mo, followed by two 3-d food diaries during the next 4 mo. Nutrient intake was assessed using the food diaries before and after a debriefing session by telephone. The purpose of the debriefing call was to obtain more detailed information on the types and amounts of fat in the diet. However, due to the ubiquitous nature of fat in the diet, the debriefing involved providing more specific detail on many aspects of the diet. There was a significant difference in macronutrient and micronutrient intake estimates after the debriefing. Estimates of protein, carbohydrate, and fiber intake were significantly higher, and total fat, monounsaturated fat, saturated fat, vitamin A, vitamin C, α-tocopherol, folic acid, and calcium intake were significantly lower after the debriefing (P < 0.05). The limits of agreement between the food diaries before and after the debriefing were especially large for total fat intake, which could be under- or overestimated by 15 g/d. The debriefing call improved attenuation coefficients associated with measurement error for vitamin C, folic acid, iron, α-tocopherol, vitamin A, and calcium estimates. A hypothetical relative risk (RR) of 2.0 could be attenuated to 1.16 for folic acid intake assessed without a debriefing but only to 1.61 with a debriefing. Depending on the nutrients of interest, the inclusion of a debriefing can reduce the potential attenuation of RR in studies evaluating diet-disease associations.
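In the regression-calibration framework this attenuation takes the form RR_observed = RR_true^λ, where λ is the attenuation coefficient (1 means no measurement error, values near 0 mean severe shrinkage toward the null). A short sketch; the λ values are illustrative, chosen only to reproduce the folic acid example:

```python
# Regression-calibration view of attenuation: with attenuation
# coefficient lam (0 < lam <= 1), the observed relative risk relates to
# the true one as RR_obs = RR_true ** lam.  The lam values below are
# illustrative, picked to match the abstract's folic acid figures.
def attenuated_rr(rr_true, lam):
    return rr_true ** lam

print(round(attenuated_rr(2.0, 0.21), 2))  # without debriefing -> 1.16
print(round(attenuated_rr(2.0, 0.69), 2))  # with debriefing    -> 1.61
```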


The aim of this paper is to use Markov modelling to investigate survival for particular types of kidney patients in relation to their exposure to anti-hypertensive treatment drugs. In order to monitor kidney function, an intuitive three-point assessment is proposed through the collection of blood samples in relation to Chronic Kidney Disease for Northern Ireland patients. A five-state Markov model was devised using specific transition probabilities for males and females over all age groups. These transition probabilities were then adjusted appropriately using relative risk scores for the event death for different subgroups of patients. The model was built using the TreeAge software package in order to explore the effects of anti-hypertensive drugs on patients.
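A Markov cohort model of this kind can be sketched in a few lines: a transition matrix is applied repeatedly to a state distribution, and a relative risk score scales the baseline death probability. The states, probabilities, and cycle count below are hypothetical, not those of the paper:

```python
# Minimal five-state Markov cohort sketch (hypothetical states and
# transition probabilities).  A relative risk score for death scales
# the baseline death probability in each living state.
STATES = ["CKD stage 1-2", "CKD stage 3", "CKD stage 4", "CKD stage 5", "Dead"]

def make_matrix(rr_death):
    # baseline annual transition probabilities (each row sums to 1)
    base = [
        [0.90, 0.08, 0.00, 0.00, 0.02],
        [0.00, 0.85, 0.10, 0.00, 0.05],
        [0.00, 0.00, 0.80, 0.12, 0.08],
        [0.00, 0.00, 0.00, 0.85, 0.15],
        [0.00, 0.00, 0.00, 0.00, 1.00],  # death is absorbing
    ]
    for i, row in enumerate(base[:-1]):
        extra = row[4] * (rr_death - 1)   # scale death probability by RR
        row[4] += extra
        row[i] -= extra                   # take the extra out of "stay put"
    return base

def survival(rr_death, cycles=10):
    dist = [1.0, 0.0, 0.0, 0.0, 0.0]      # whole cohort starts in stage 1-2
    m = make_matrix(rr_death)
    for _ in range(cycles):
        dist = [sum(dist[i] * m[i][j] for i in range(5)) for j in range(5)]
    return 1.0 - dist[4]                   # proportion still alive

print(f"10-cycle survival, RR 1.0: {survival(1.0):.2f}")
print(f"10-cycle survival, RR 1.5: {survival(1.5):.2f}")
```

Tools such as TreeAge wrap the same arithmetic in a graphical decision-tree interface; the matrix multiplication is the core of the computation either way.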


The aim of this cluster randomised controlled trial was to test the impact of an infection control education and training programme on meticillin-resistant Staphylococcus aureus (MRSA) prevalence in nursing homes. Nursing homes were randomised to intervention (infection control education and training programme; N = 16) or control (usual practice continued; N = 16). Staff in intervention homes were educated and trained (0, 3 and 6 months) in the principles and implementation of good infection control practice, with infection control audits conducted in all sites (0, 3, 6 and 12 months) to assess compliance with good practice. Audit scores were fed back to nursing home managers in intervention homes, together with a written report indicating where practice could be improved. Nasal swabs were taken from all consenting residents and staff at 0, 3, 6 and 12 months. The primary outcome was MRSA prevalence in residents and staff, and the secondary outcome was a change in infection control audit scores. In all, 793 residents and 338 staff were recruited at baseline. MRSA prevalence did not change during the study in residents or staff. The relative risk of a resident being colonised with MRSA in an intervention home compared with a control home at 12 months was 0.99 (95% confidence interval: 0.69, 1.42) after adjustment for clustering. Mean infection control audit scores were significantly higher in the intervention homes (82%) compared with the control homes (64%) at 12 months (P < 0.0001). Consideration should be given to other approaches which may help to reduce MRSA in this setting.


The paper extends Blackburn and Galindev's (Economics Letters, Vol. 79 (2003), pp. 417-421) stochastic growth model in which productivity growth entails both external and internal learning behaviour with a constant relative risk aversion utility function and productivity shocks. Consequently, the relationship between long-term growth and short-term volatility depends not only on the relative importance of each learning mechanism but also on a parameter measuring individuals' attitude towards risk.
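The constant relative risk aversion (CRRA) utility function referred to here is u(c) = c^(1−γ)/(1−γ), with u(c) = ln c in the limit γ → 1. A small sketch of how the risk-aversion parameter γ governs the expected-utility penalty of a mean-preserving spread in consumption:

```python
import math

# Constant relative risk aversion (CRRA) utility:
#   u(c) = c**(1 - gamma) / (1 - gamma)   for gamma != 1
#   u(c) = ln(c)                          in the limit gamma -> 1
def crra(c, gamma):
    if gamma == 1:
        return math.log(c)
    return c ** (1 - gamma) / (1 - gamma)

# A mean-preserving spread (consumption 0.5 or 1.5 with equal chance,
# mean 1.0) costs more expected utility the larger gamma is.
for gamma in (0.5, 1, 2):
    eu = (crra(0.5, gamma) + crra(1.5, gamma)) / 2
    penalty = crra(1.0, gamma) - eu        # utility lost to the spread
    print(f"gamma={gamma}: risk penalty {penalty:.3f}")
```

This is the channel through which attitudes toward risk enter the growth-volatility relationship the abstract describes.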


Evidence is unclear as to whether there is a socio-economic gradient in cerebral palsy (CP) prevalence beyond what would be expected from the socio-economic gradient for low birthweight, a strong risk factor for CP. We conducted a population-based study in five regions of the UK with CP registers, to investigate the relationship between CP prevalence and socio-economic deprivation, and how it varies by region, by birthweight and by severity and type of CP. The total study population was 1 657 569 livebirths, born between 1984 and 1997. Wards of residence were classified into five quintiles according to a census-based deprivation index, from Q1 (least deprived) to Q5 (most deprived). Socio-economic gradients were modelled by Poisson regression, and region-specific estimates combined by meta-analysis.

The prevalence of postneonatally acquired CP was 0.14 per 1000 livebirths overall. The mean deprivation gradient, expressed as the relative risk in the most deprived vs. the least deprived quintile, was 1.86 (95% confidence interval [CI] 1.19, 2.88). The prevalence of non-acquired CP was 2.22 per 1000 livebirths. For non-acquired CP the gradient was 1.16 (95% CI 1.00, 1.35). Evidence for a socio-economic gradient was strongest for spastic bilateral cases (1.32; 95% CI 1.09, 1.59) and cases with severe intellectual impairment (1.59; 95% CI 1.06, 2.39). There was evidence for differences in gradient between regions. The gradient of risk of CP among normal birthweight births was not statistically significant overall (1.21; 95% CI 0.95, 1.54), but was significant in two regions. There was non-significant evidence of a reduction in gradients over time.

The reduction of the higher rates of postneonatally acquired CP in the more socioeconomically deprived areas is a clear goal for prevention. While we found evidence for a socio-economic gradient for non-acquired CP of antenatal or perinatal origin, the picture was not consistent across regions, and there was some evidence of a decline in inequalities over time. The steeper gradients in some regions for normal birthweight cases and cases with severe intellectual impairment require further investigation.


Objective: To assess the relative risk of major congenital malformation (MCM) from in utero exposure to antiepileptic drugs (AEDs).


Critically ill patients are at heightened risk for nosocomial infections. The anaphylatoxin C5a impairs phagocytosis by neutrophils. However, the mechanisms by which this occurs and the relevance for acquisition of nosocomial infection remain undetermined. We aimed to characterize mechanisms by which C5a inhibits phagocytosis in vitro and in critically ill patients, and to define the relationship between C5a-mediated dysfunction and acquisition of nosocomial infection. In healthy human neutrophils, C5a significantly inhibited RhoA activation, preventing actin polymerization and phagocytosis. RhoA inhibition was mediated by PI3Kδ. The effects on RhoA, actin, and phagocytosis were fully reversed by GM-CSF. Parallel observations were made in neutrophils from critically ill patients, that is, impaired phagocytosis was associated with inhibition of RhoA and actin polymerization, and reversed by GM-CSF. Among a cohort of 60 critically ill patients, C5a-mediated neutrophil dysfunction (as determined by reduced CD88 expression) was a strong predictor for subsequent acquisition of nosocomial infection (relative risk, 5.8; 95% confidence interval, 1.5-22; P = .0007), and remained independent of time effects as assessed by survival analysis (hazard ratio, 5.0; 95% confidence interval, 1.3-8.3; P = .01). In conclusion, this study provides new insight into the mechanisms underlying immunocompromise in critical illness and suggests novel avenues for therapy and prevention of nosocomial infection.


Survival is reportedly worse in patients with cancer concurrently diagnosed with deep venous thrombosis. However, information on specific malignancies is limited. From a cohort study of male US veterans we identified incident cancer cases (n = 412 008) and compared survival patterns among those with versus without a history of deep venous thrombosis. Using Cox proportional hazard models, we estimated hazard ratios (HRs) and 95% confidence intervals as measures of the relative risk of dying. Individuals with (versus without) a concomitant deep venous thrombosis and cancer diagnosis had a higher risk of dying (HR = 1.38; 1.28-1.49). The most prominent excess mortality (HR = 1.29-2.55) was observed among patients diagnosed with deep venous thrombosis at the time of diagnosis of lung, gastric, prostate, bladder, or kidney cancer. Increased risk of dying was also found among cancer patients diagnosed with deep venous thrombosis 1 year (HR = 1.14; 1.07-1.22), 1-5 years (HR = 1.14; 1.10-1.19), and > 5 years (HR = 1.27; 1.23-1.31) before cancer; this was true for most cancer sites (HR = 1.17-1.64). In summary, antecedent deep venous thrombosis confers a worse prognosis upon cancer patients. Advanced stage at diagnosis, treatment effects, lifestyle factors, and comorbidity could explain differences by cancer site and time frame between a prior deep venous thrombosis diagnosis and cancer outcome.


Early meningococcal disease (MD) diagnosis is difficult. We assessed rapid molecular testing of respiratory specimens. We performed genotyping of respiratory swabs, blood, and cerebrospinal fluid from children with suspected disease and nasal swabs (NSs) from matched controls. Thirty-nine of 104 suspected cases had confirmed disease. Four controls were carriers. Throat swab ctrA and porA testing for detection of disease gave a sensitivity of 81% (17/21), specificity of 100% (44/44), positive predictive value (PPV) of 100% (17/17), negative predictive value (NPV) of 92% (44/48), and relative risk of 12. NS ctrA and porA testing gave a sensitivity of 51% (20/39), specificity of 95% (62/65), PPV of 87% (20/23), NPV of 77% (62/81), and relative risk of 4. Including only the 86 NSs taken within 48 h of presentation, the results were sensitivity of 60% (18/30), specificity of 96% (54/56), PPV of 90% (18/20), NPV of 82% (54/66), and relative risk of 5. Swab type agreement was excellent (kappa 0.80, P
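The throat-swab figures can be recomputed directly from the counts the abstract reports (17 true positives, 4 false negatives, 44 true negatives, 0 false positives):

```python
# Diagnostic-test metrics from a 2x2 table of counts; the values below
# are the throat-swab counts reported in the abstract.
def diagnostic_metrics(tp, fn, tn, fp):
    return {
        "sensitivity": tp / (tp + fn),   # 17/21
        "specificity": tn / (tn + fp),   # 44/44
        "ppv": tp / (tp + fp),           # 17/17
        "npv": tn / (tn + fn),           # 44/48
    }

m = diagnostic_metrics(tp=17, fn=4, tn=44, fp=0)
print({k: f"{v:.0%}" for k, v in m.items()})
# {'sensitivity': '81%', 'specificity': '100%', 'ppv': '100%', 'npv': '92%'}
```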