976 results for Risk adjustment


Relevance: 30.00%

Abstract:

Bipolar disorder (BD) and attention deficit/hyperactivity disorder (ADHD) may share common genetic risk factors, as indicated by the high comorbidity of BD and ADHD, their phenotypic overlap especially in pediatric populations, the high heritability of both disorders, and their co-occurrence in families. We therefore examined whether known polygenic BD risk alleles are associated with ADHD. We chose the eight best SNPs from the recent genome-wide association study (GWAS) of BD patients of German ancestry and the nine SNPs from international GWAS meeting the 'genome-wide significance' level of α = 5 × 10⁻⁸. A GWAS was performed in 495 children with ADHD and 1,300 population-based controls using HumanHap550v3 and Human660W-Quadv1 BeadArrays. We found no significant association of childhood ADHD with single BD risk alleles after adjustment for multiple testing. Yet risk alleles for BD and ADHD were directionally consistent at eight of nine loci, with the strongest support for three SNPs in or near NCAN, BRE, and LMAN2L. The polygene analysis of the BD risk alleles at all 14 loci indicated a higher probability of being a BD risk-allele carrier in the ADHD cases than in the controls. Given only moderate power to detect association with ADHD if the true effects were close to the GWAS estimates for BD, our results suggest that the contribution of BD risk variants to childhood ADHD risk is considerably lower than to BD itself. Nevertheless, our findings should encourage the search for common genetic risk factors of BD and childhood ADHD in future studies.

Relevance: 30.00%

Abstract:

Research suggests that mutans streptococci play an important role in cariogenesis in children, but the usefulness of bacterial testing in risk assessment is unknown. Our objective was to summarize the literature assessing the association of mutans streptococci and dental caries in preschool children. (Pre)Medline (1966-2003), Embase (1980-2003), the Cochrane Register of Controlled Trials (2003, issue 3), and reference lists of included studies were searched. All abstracts found by the electronic searches (n = 981) were independently scrutinized by 2 reviewers. Minimal requirements for inclusion were assessment of preschool children without caries at baseline, reporting of mutans streptococci present in saliva or plaque at baseline, and assessment of caries presence after a minimum of 6 months of follow-up. Participants' details, test methods, methodological characteristics, and findings were extracted by one reviewer and cross-checked by another. Homogeneity was tested using χ² tests. Results of plaque and saliva testing were pooled separately using a fixed-effects model. Methodological quality of the reports was low. Of the 9 studies included, data from 3 reports on plaque test assessment alone (n = 300) and from 4 reports on saliva test assessment alone (n = 451) were available for pooled analysis. The pooled risk ratio (95% CI) was 3.85 (2.48-5.96) in studies using plaque tests and 2.11 (1.47-3.02) in those using saliva tests. The presence of mutans streptococci, whether in plaque or saliva of young caries-free children, appears to be associated with a considerable increase in caries risk. Lack of adjustment for potential confounders in the original studies, however, limits the extent to which interpretations for practice can be made.
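The fixed-effects pooling used in this kind of review is inverse-variance weighting of log risk ratios, with each study's standard error recovered from its confidence interval. A minimal sketch in Python — the input numbers below are placeholders for illustration, not the review's actual study data:

```python
import math

def pooled_rr_fixed_effects(rrs, cis):
    """Fixed-effects (inverse-variance) pooling of risk ratios.

    rrs: per-study risk ratios; cis: matching (lower, upper) 95% CIs.
    The SE of each log-RR is recovered from the CI width:
    (ln(upper) - ln(lower)) / (2 * 1.96).
    """
    total_w = 0.0
    weighted_sum = 0.0
    for rr, (lo, hi) in zip(rrs, cis):
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2  # inverse-variance weight
        total_w += w
        weighted_sum += w * math.log(rr)
    log_pooled = weighted_sum / total_w
    se_pooled = math.sqrt(1.0 / total_w)
    return (math.exp(log_pooled),
            math.exp(log_pooled - 1.96 * se_pooled),
            math.exp(log_pooled + 1.96 * se_pooled))

# Illustrative inputs only — not the data from the review above:
rr, lo, hi = pooled_rr_fixed_effects([3.2, 4.5], [(1.8, 5.7), (2.1, 9.6)])
```

In the review, plaque and saliva studies were pooled separately, with χ² homogeneity testing preceding this step.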

Relevance: 30.00%

Abstract:

BACKGROUND: In the UK, population screening for unmet need has failed to improve the health of older people. Attention is turning to interventions targeted at 'at-risk' groups. Living alone in later life is seen as a potential health risk, and older people living alone are thought to be an at-risk group worthy of further intervention. AIM: To explore the clinical significance of living alone and the epidemiology of lone status as an at-risk category, by investigating associations between lone status and health behaviours, health status, and service use in non-disabled older people. DESIGN OF STUDY: Secondary analysis of baseline data from a randomised controlled trial of health risk appraisal in older people. SETTING: Four group practices in suburban London. METHOD: Sixty per cent of 2641 community-dwelling non-disabled people aged 65 years and over registered at a practice agreed to participate in the study; 84% of these returned completed questionnaires. A third of this group (n = 860, 33.1%) lived alone and two-thirds (n = 1741, 66.9%) lived with someone else. RESULTS: Those living alone were more likely to report fair or poor health, poor vision, difficulties in instrumental and basic activities of daily living, worse memory and mood, lower physical activity, poorer diet, worsening function, risk of social isolation, hazardous alcohol use, having no emergency carer, and multiple falls in the previous 12 months. After adjustment for age, sex, income, and educational attainment, living alone remained associated with multiple falls, functional impairment, poor diet, smoking status, risk of social isolation, and three self-reported chronic conditions: arthritis and/or rheumatism, glaucoma, and cataracts.
CONCLUSION: Clinicians working with independently living older people who live alone should anticipate higher levels of disease and disability in these patients, and higher health and social risks, much of which will be attributable to older age, lower educational status, and female sex. Living alone itself appears to be associated with a higher risk of falling and with constellations of pathologies, including visual loss and joint disorders. Targeted population screening using lone status may be useful in identifying older individuals at high risk of falling.

Relevance: 30.00%

Abstract:

OBJECTIVE: To prospectively evaluate outcomes of high-risk patients undergoing bilateral carotid artery stenting (CAS). METHODS: A total of 747 patients at increased risk for carotid endarterectomy (CEA) were enrolled in a prospective registry at 47 US sites in the Boston Scientific EPI: A Carotid Stenting Trial for High-Risk Surgical Patients (BEACH) trial. Among them, 78 (10.4%) patients underwent contralateral CAS more than 30 days after the primary CAS procedure. Patients were followed at 1, 6, and 12 months, and annually thereafter for 3 years. The primary endpoint was the cumulative incidence of non-Q-wave myocardial infarction within 24 hours and periprocedural events. RESULTS: Adjustment for various clinical baseline factors revealed no differences in the primary endpoint when comparing the bilateral with the pivotal groups at 30 days (odds ratio [OR] 0.8673, 95% confidence interval [CI] 0.4590-1.6389, P = .66) or at 1 year (OR 0.9102, 95% CI 0.5503-1.5053, P = .73). CONCLUSIONS: Bilateral carotid stenting is an effective treatment strategy in patients determined to be at high risk for CEA, with no increase in morbidity or mortality through one year in a prospective multicenter trial.

Relevance: 30.00%

Abstract:

BACKGROUND: Nonalcoholic fatty liver disease (NAFLD), the most common cause of liver disease in children, is associated with obesity and insulin resistance. However, the relationship between NAFLD and cardiovascular risk factors in children is not fully understood. The objective of this study was to determine the association between NAFLD and the presence of metabolic syndrome in overweight and obese children. METHODS AND RESULTS: This case-control study of 150 overweight children with biopsy-proven NAFLD and 150 overweight children without NAFLD compared rates of metabolic syndrome using Adult Treatment Panel III criteria. Cases and controls were well matched in age, sex, and severity of obesity. Children with NAFLD had significantly higher fasting glucose, insulin, total cholesterol, low-density lipoprotein cholesterol, triglycerides, systolic blood pressure, and diastolic blood pressure than overweight and obese children without NAFLD. Subjects with NAFLD also had significantly lower high-density lipoprotein cholesterol than controls. After adjustment for age, sex, race, ethnicity, body mass index, and hyperinsulinemia, children with metabolic syndrome had 5.0 (95% confidence interval, 2.6 to 9.7) times the odds of having NAFLD as overweight and obese children without metabolic syndrome. CONCLUSIONS: NAFLD in overweight and obese children is strongly associated with multiple cardiovascular risk factors. The identification of NAFLD in a child should prompt global counseling to address nutrition, physical activity, and avoidance of smoking to prevent the development of cardiovascular disease and type 2 diabetes.

Relevance: 30.00%

Abstract:

BACKGROUND: Extracapsular tumor spread (ECS) has been identified as a possible risk factor for breast cancer recurrence, but controversy exists regarding its role in decision making for regional radiotherapy. This study evaluates ECS as a predictor of local, axillary, and supraclavicular recurrence. PATIENTS AND METHODS: International Breast Cancer Study Group Trial VI accrued 1475 eligible pre- and perimenopausal women with node-positive breast cancer who were randomly assigned to receive three to nine courses of classical combination chemotherapy with cyclophosphamide, methotrexate, and fluorouracil. ECS status was determined retrospectively in 933 patients based on review of pathology reports. Cumulative incidence and hazard ratios (HRs) were estimated using methods for competing risks analysis. Adjustment factors included treatment group and baseline patient and tumor characteristics. The median follow-up was 14 years. RESULTS: In univariable analysis, ECS was significantly associated with supraclavicular recurrence (HR = 1.96; 95% confidence interval 1.23-3.13; P = 0.005). HRs for local and axillary recurrence were 1.38 (P = 0.06) and 1.81 (P = 0.11), respectively. Following adjustment for number of lymph node metastases and other baseline prognostic factors, ECS was not significantly associated with any of the three recurrence types studied. CONCLUSIONS: Our results indicate that the decision for additional regional radiotherapy should not be based solely on the presence of ECS.

Relevance: 30.00%

Abstract:

BACKGROUND: Elderly individuals who provide care to a spouse suffering from dementia bear an increased risk of coronary heart disease (CHD). OBJECTIVE: To test the hypothesis that the Framingham CHD Risk Score would be higher in dementia caregivers relative to non-caregiving controls. METHODS: We investigated 64 caregivers providing in-home care for their spouse with Alzheimer's disease and 41 gender-matched non-caregiving controls. All subjects (mean age 70 ± 8 years, 75% women, 93% Caucasian) had a negative history of CHD and cerebrovascular disease. The original Framingham CHD Risk Score was computed by adding up categorical scores for age, blood lipids, blood pressure, diabetes, and smoking, with adjustment made for sex. RESULTS: The average CHD risk score was higher in caregivers than in controls, even when covarying for socioeconomic status, health habits, medication, and psychological distress (8.0 ± 2.9 vs. 6.3 ± 3.0 points, p = 0.013). The difference showed a medium effect size (Cohen's d = 0.57). A relatively higher blood pressure in caregivers than in controls made the greatest contribution to this difference. The probability (area under the receiver operating characteristic curve) that a randomly selected caregiver had a greater CHD risk score than a randomly selected non-caregiver was 65.5%. CONCLUSIONS: Based on the Framingham CHD Risk Score, the potential to develop overt CHD in the following 10 years was predicted to be greater in dementia caregivers than in non-caregiving controls. The magnitude of the difference in CHD risk between caregivers and controls appears to be clinically relevant. Clinicians may want to monitor caregiving status as a routine part of the standard evaluation of their elderly patients' cardiovascular risk.
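The reported effect size and discrimination probability can be approximately recomputed from the group means and SDs quoted above. A sketch, assuming equal-n pooling of the SDs (a simplification, since the groups were 64 vs. 41) and normality for the AUC conversion:

```python
import math

# Values reported in the abstract: caregivers 8.0 +/- 2.9, controls 6.3 +/- 3.0.
def cohens_d(m1, sd1, m2, sd2):
    # Pooled SD assuming equal group sizes — a simplification; exact pooling
    # with n1=64 and n2=41 differs only slightly.
    sd_pooled = math.sqrt((sd1 ** 2 + sd2 ** 2) / 2)
    return (m1 - m2) / sd_pooled

def auc_from_d(d):
    # Common-language effect size under normality: AUC = Phi(d / sqrt(2));
    # with Phi(x) = 0.5 * (1 + erf(x / sqrt(2))), this reduces to erf(d / 2).
    return 0.5 * (1 + math.erf(d / 2))

d = cohens_d(8.0, 2.9, 6.3, 3.0)   # close to the reported 0.57
auc = auc_from_d(d)                # close to the reported 65.5%
```

Both values land within rounding of the figures reported in the abstract, which is a useful internal-consistency check on such summaries.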

Relevance: 30.00%

Abstract:

Because of the considerable morbidity and mortality associated with osteoporosis, it is essential to detect subjects at risk by screening methods such as bone quantitative ultrasound (QUS). Several studies have shown that QUS can predict fractures. None, however, prospectively compared different QUS devices, and few quality-control (QC) data have been published. The Swiss Evaluation of the Methods of Measurement of Osteoporotic Fracture Risk is a prospective multicenter study that compared three QUS devices for the assessment of hip fracture risk in a population of 7609 women aged ≥70 yr. Because the inclusion phase lasted 20 months and 10 centers participated in the study, QC became a major issue. We therefore developed a QC procedure to assess the stability and precision of the devices and to cross-calibrate them. Our study focuses on the two heel QUS devices. The water-bath system (Achilles+) had a higher precision than the dry system (Sahara). The QC results were highly dependent on temperature. QUS stability was acceptable, but the Sahara must be calibrated regularly. Sufficient homogeneity among the Sahara devices could be demonstrated, whereas significant differences were found among the Achilles+ devices. For speed of sound, 52% of the differences among the Achilles+ devices was explained by the water's temperature. For broadband ultrasound attenuation, however, a maximal difference of 23% persisted after adjustment for temperature. Because such differences could influence measurements in vivo, it is crucial to develop standardized phantoms for use in prospective multicenter studies.

Relevance: 30.00%

Abstract:

Background Although various soluble biomarkers are established predictors of the level of liver fibrosis, their ability to predict liver-related clinical outcomes is less clearly established, in particular among HIV/viral hepatitis co-infected persons. We investigated the ability of plasma hyaluronic acid (HA) to predict the risk of liver-related events (LRE; hepatic coma or liver-related death) in the EuroSIDA study. Methods Patients included were positive for anti-HCV and/or HBsAg with at least one available plasma sample. The earliest collected plasma sample was tested for HA (normal range 0–75 ng/mL) and levels were associated with the risk of an LRE. Change in HA per year of follow-up was estimated by measuring HA levels in the latest sample before the LRE for those experiencing this outcome (cases) and in a random selection of one sixth of the remaining patients (controls). Results During a median of 8.2 years of follow-up, 84/1252 (6.7%) patients developed an LRE. Baseline median (IQR) HA in those without and with an LRE was 31.8 (17.2–62.6) and 221.6 ng/mL (74.9–611.3), respectively (p<0.0001). After adjustment, HA levels predicted the risk of an LRE; incidence rate ratios for HA levels of 75–250 or ≥250 vs. <75 ng/mL were 5.22 (95% CI 2.86–9.26, p<0.0007) and 28.22 (95% CI 14.95–46.00, p<0.0001), respectively. Median HA levels increased substantially prior to an LRE (107.6 ng/mL, IQR 0.8 to 251.1) but remained stable in controls (1.0 ng/mL, IQR –5.1 to 8.2) (p<0.0001 comparing cases and controls), and greater increases predicted the risk of an LRE in adjusted models (p<0.001). Conclusions An elevated level of plasma HA, particularly if the level increases further over time, substantially increases the risk of an LRE over the next five years. HA is an inexpensive, standardized, and non-invasive supplement to other methods aimed at identifying HIV/viral hepatitis co-infected patients at risk of hepatic complications.

Relevance: 30.00%

Abstract:

Background Few data exist on tuberculosis (TB) incidence according to time from HIV seroconversion in high-income countries, or on whether rates following initiation of combination antiretroviral treatment (cART) differ from those soon after seroconversion. Methods Data on individuals with well-estimated dates of HIV seroconversion were used to analyse post-seroconversion TB rates, with follow-up ending at the earliest of 1 January 1997, death, or last clinic visit. TB rates were also estimated following cART initiation, with follow-up ending at the earliest of death or last clinic visit. Poisson models were used to examine the effect of current and past level of immunosuppression on TB risk after cART initiation. Results Of 19,815 individuals at risk during 1982–1996, TB incidence increased from 5.89/1000 person-years (PY) (95% CI 3.77 to 8.76) in the first year after seroconversion to 10.56 (4.83 to 20.04, p=0.01) at 10 years. Among 11,178 TB-free individuals initiating cART, the TB rate in the first year after cART initiation was 4.23/1000 PY (3.07 to 5.71) and dropped thereafter, remaining constant from year 2 onwards at an average of 1.64/1000 PY (1.29 to 2.05). Current CD4 count was inversely associated with TB rates, while nadir CD4 count was not associated with TB rates after adjustment for current CD4 count and HIV-RNA at cART initiation. Conclusions TB risk increases with duration of HIV infection in the absence of cART. Following cART initiation, TB incidence rates were lower than the levels seen immediately after seroconversion. Implementation of current recommendations to prevent TB in early HIV infection could be beneficial.
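Rates per 1000 person-years like those quoted above are simply event counts divided by accumulated follow-up time, usually with a confidence interval on the log scale. A small sketch — the counts below are illustrative, since the abstract reports only the rates, not the raw event numbers behind each stratum:

```python
import math

def incidence_rate(events, person_years, per=1000):
    """Crude incidence rate with a log-normal approximate 95% CI.

    Assumes events > 0; the SE of log(rate) is approximately 1/sqrt(events).
    """
    rate = events / person_years
    z = 1.96 / math.sqrt(events)
    return rate * per, rate * math.exp(-z) * per, rate * math.exp(z) * per

# Illustrative counts only — not the study's raw data:
rate, lo, hi = incidence_rate(50, 10000)  # 5.0 per 1000 PY plus CI
```

The study's Poisson regression generalizes this by modelling the log rate as a function of covariates such as current CD4 count.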

Relevance: 30.00%

Abstract:

Background Persons infected with human immunodeficiency virus (HIV) have increased rates of coronary artery disease (CAD). The relative contribution of genetic background, HIV-related factors, antiretroviral medications, and traditional risk factors to CAD has not been fully evaluated in the setting of HIV infection. Methods In the general population, 23 common single-nucleotide polymorphisms (SNPs) were shown to be associated with CAD through genome-wide association analysis. Using the Metabochip, we genotyped 1875 HIV-positive, white individuals enrolled in 24 HIV observational studies, including 571 participants with a first CAD event during the 9-year study period and 1304 controls matched on sex and cohort. Results A genetic risk score built from 23 CAD-associated SNPs contributed significantly to CAD (P = 2.9×10−4). In the final multivariable model, participants with an unfavorable genetic background (top genetic score quartile) had a CAD odds ratio (OR) of 1.47 (95% confidence interval [CI], 1.05–2.04). This effect was similar to hypertension (OR = 1.36; 95% CI, 1.06–1.73), hypercholesterolemia (OR = 1.51; 95% CI, 1.16–1.96), diabetes (OR = 1.66; 95% CI, 1.10–2.49), ≥1 year lopinavir exposure (OR = 1.36; 95% CI, 1.06–1.73), and current abacavir treatment (OR = 1.56; 95% CI, 1.17–2.07). The effect of the genetic risk score was additive to the effect of nongenetic CAD risk factors, and did not change after adjustment for family history of CAD. Conclusions In the setting of HIV infection, the effect of an unfavorable genetic background was similar to traditional CAD risk factors and certain adverse antiretroviral exposures. Genetic testing may provide prognostic information complementary to family history of CAD.
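A genetic risk score of the kind described is typically a sum of risk-allele counts across the selected SNPs, optionally weighted by each SNP's per-allele effect size from the reference GWAS. A minimal sketch with hypothetical SNP names and weights (not the study's 23 SNPs):

```python
def genetic_risk_score(genotypes, weights=None):
    """Genotypes: mapping SNP -> risk-allele count (0, 1, or 2).

    weights: optional mapping SNP -> per-allele effect (e.g. log-OR from the
    reference GWAS). If omitted, the score is the plain risk-allele count.
    """
    if weights is None:
        return sum(genotypes.values())
    return sum(weights[snp] * count for snp, count in genotypes.items())

# Hypothetical SNP names and weights, for illustration only:
score = genetic_risk_score({'rs_a': 2, 'rs_b': 1},
                           {'rs_a': 0.10, 'rs_b': 0.25})
```

Scores are then often split into quartiles, as in the study's comparison of the top genetic-score quartile against the rest.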

Relevance: 30.00%

Abstract:

Environmental factors can determine which group size will maximize the fitness of group members. This is particularly important in cooperative breeders, where group members often serve different purposes. Experimental studies are still lacking, however, that test whether an ecologically mediated need for help changes the propensity of dominant group members to accept immigrants. Here we manipulated the perceived risk of predation for dominant breeders of the cooperatively breeding cichlid fish Neolamprologus pulcher to test their response to unrelated and previously unknown immigrants. Potential immigrants were more readily accepted if groups were exposed to fish predators or egg predators than to herbivorous fish or control situations lacking predation risk. Our data are consistent with both risk-dilution and helping effects. Egg predators were presented before spawning, which suggests that the fish adjust acceptance rates also to a potential future threat. Dominant group members of N. pulcher apparently consider both the present and the future need for help based on ecological demand. This suggests that acceptance of immigrants and, more generally, tolerance of group members on demand could be a widespread response to ecological conditions in cooperatively breeding animals.

Relevance: 30.00%

Abstract:

BACKGROUND Empirical research has illustrated an association between study size and relative treatment effects, but conclusions about the association of study size with risk-of-bias items have been inconsistent. Small studies generally give imprecisely estimated treatment effects, and study variance can serve as a surrogate for study size. METHODS We conducted a network meta-epidemiological study analyzing 32 networks including 613 randomized controlled trials, and used Bayesian network meta-analysis and meta-regression models to evaluate the impact of trial characteristics and study variance on the results of network meta-analysis. We examined changes in relative effects and between-studies variation in network meta-regression models as a function of the variance of the observed effect size and of indicators for the adequacy of each risk-of-bias item. Adjustment was performed both within and across networks, allowing for between-networks variability. RESULTS Imprecise studies with large variances tended to exaggerate the effects of the active or new intervention in the majority of networks, with a ratio of odds ratios of 1.83 (95% CI: 1.09, 3.32). Inappropriate or unclear conduct of random sequence generation and allocation concealment, as well as lack of blinding of patients and outcome assessors, did not materially affect the summary results. Imprecise studies also appeared to be more prone to inadequate conduct. CONCLUSIONS Compared with more precise studies, studies with large variance may give substantially different answers that alter the results of network meta-analyses for dichotomous outcomes.

Relevance: 30.00%

Abstract:

Many persons in the U.S. gain weight during young adulthood, and the prevalence of obesity has been increasing among young adults. Although obesity and physical inactivity are generally recognized as risk factors for coronary heart disease (CHD), the magnitude of their effect on risk may have been seriously underestimated due to failure to adequately handle the problem of cigarette smoking. Since cigarette smoking causes weight loss, physically inactive cigarette smokers may remain relatively lean because they smoke cigarettes. We hypothesize that cigarette smoking modifies the association between weight gain during young adulthood and risk of coronary heart disease during middle age, and that the true effect of weight gain during young adulthood on risk of CHD can be assessed only in persons who have not smoked cigarettes. Specifically, we hypothesize that weight gain during young adulthood is positively associated with risk of CHD during middle age in nonsmokers, but that the association is much smaller or absent entirely among cigarette smokers. The purpose of this study was to test this hypothesis. The population for analysis comprised 1,934 middle-aged, employed men whose average age at the baseline examination was 48.7 years. Information collected at the baseline examinations in 1958 and 1959 included recalled weight at age 20, present weight, height, smoking status, and other CHD risk factors. To decrease the effect of intraindividual variation, the mean values of the 1958 and 1959 baseline examinations were used in analyses.
Change in body mass index (ΔBMI) during young adulthood was the primary exposure variable and was measured as BMI at baseline (kg/m²) minus BMI at age 20 (kg/m²). Proportional hazards regression analysis was used to generate relative risks of CHD mortality by category of ΔBMI and cigarette-smoking status after adjustment for age, family history of CVD, major organ system disease, BMI at age 20, and number of cigarettes smoked per day. Adjustment was not performed for systolic blood pressure or total serum cholesterol, as these were regarded as intervening variables. Vital status was known for all men on the 25th anniversary of their baseline examinations. In all, 705 deaths (including 319 CHD deaths) occurred over 40,136 person-years of observation. ΔBMI was positively associated with risk of CHD mortality in never-smokers, but not in ever-smokers (p for interaction = 0.067). For never-smokers with ΔBMI categorized as stable, low gain, moderate gain, and high gain, adjusted relative risks were 1.00, 1.62, 1.61, and 2.78, respectively (p for trend = 0.010). For ever-smokers in the same ΔBMI categories, adjusted relative risks were 1.00, 0.74, 1.07, and 1.06, respectively (p for trend = 0.422). These results support the research hypothesis that cigarette smoking modifies the association between weight gain and CHD mortality. The magnitude of the effect of obesity and physical inactivity on risk of coronary mortality may currently be seriously underestimated due to inadequate handling of cigarette smoking.
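The exposure variable above is simple arithmetic: BMI is weight over height squared, and ΔBMI is the difference between the two time points. A minimal sketch (the abstract does not give the cutoffs for the stable/low/moderate/high gain categories, so only the formula is shown):

```python
def bmi(weight_kg, height_m):
    # Body mass index in kg/m^2.
    return weight_kg / height_m ** 2

def delta_bmi(weight_baseline_kg, weight_age20_kg, height_m):
    # Change in BMI during young adulthood: baseline BMI minus BMI at age 20
    # (height assumed stable after age 20).
    return bmi(weight_baseline_kg, height_m) - bmi(weight_age20_kg, height_m)

# Example: a 1.75 m man who went from 70 kg at age 20 to 85 kg at baseline.
gain = delta_bmi(85, 70, 1.75)  # roughly +4.9 kg/m^2
```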

Relevance: 30.00%

Abstract:

A cohort of 418 United States Air Force (USAF) personnel from more than 15 different bases deployed to Morocco in 1994. This was the first study of its kind and was designed with two primary goals: to determine whether the USAF was medically prepared to deploy under its changing mission in the new world order, and to evaluate factors that might improve or degrade USAF medical readiness. The mean length of deployment was 21 days. The cohort was 95% male, 86% enlisted, 65% married, and 78% white. This study shows major deficiencies, indicating that the USAF medical readiness posture has not fully responded to its new mission requirements. Lack of required logistical items (e.g., mosquito nets, rain boots, DEET insecticide cream) revealed a low state of preparedness. The most notable deficiency was that 82.5% (95% CI = 78.4, 85.9) did not have permethrin-pretreated mosquito nets and 81.0% (95% CI = 76.8, 84.6) lacked mosquito net poles. Additionally, 18% were deficient in vaccinations and 36% had not received a tuberculin skin test. Excluding injections, overall compliance with preventive medicine requirements had a mean frequency of only 50.6% (95% CI = 45.36, 55.90). Several factors had a positive impact on compliance with logistical requirements. The most prominent was receiving a medical intelligence briefing from USAF Public Health. After adjustment for mobility and age, individuals who underwent a briefing were 17.2 (95% CI = 4.37, 67.99) times more likely to have received an immunoglobulin shot and 4.2 (95% CI = 1.84, 9.45) times more likely to start their antimalarial prophylaxis at the proper time. Being on mobility status had the second strongest positive effect on medical readiness.
When mobility and briefing were included in models, personnel on mobility were 2.6 (95% CI = 1.19, 5.53) times as likely to have DEET insecticide and 2.2 (95% CI = 1.16, 4.16) times as likely to have had a TB skin test. Five recommendations to improve the medical readiness of the USAF were outlined: upgrade base-level logistical support, improve medical intelligence messages, include medical requirements on travel orders, place more personnel on mobility or deploy only personnel on mobility, and conduct research dedicated to capitalizing on the powerful effect of predeployment briefings. Since this is the first study of its kind, more studies should be performed in different geographic theaters to assess medical readiness and establish acceptable compliance levels for the USAF.