984 results for "adjusted for vital effect"


Relevance: 30.00%

Abstract:

Our objective was to determine the effect of body mass index (BMI) on response to bacterial vaginosis (BV) treatment. A secondary analysis was conducted of two multicenter trials of therapy for BV and Trichomonas vaginalis. Gravidas were screened for BV between 8 and 22 weeks and randomized between 16 and 23 weeks to metronidazole or placebo. Of 1497 gravidas with asymptomatic BV and preconceptional BMI, 738 were randomized to metronidazole; BMI was divided into categories: <25, 25 to 29.9, and ≥30. Rates of BV persistence at follow-up were compared using the Mantel-Haenszel chi-square. Multiple logistic regression was used to evaluate the effect of BMI on BV persistence at follow-up, adjusting for potential confounders. No association was identified between BMI and BV rate at follow-up (P = 0.21). BMI was associated with maternal age, smoking, marital status, and black race. Compared with women with BMI <25, adjusted odds ratios (ORs) of BV at follow-up were: BMI 25 to 29.9, OR 0.66, 95% CI 0.43 to 1.02; BMI ≥30, OR 0.83, 95% CI 0.54 to 1.26. We concluded that the persistence of BV after treatment was not related to BMI.
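
A minimal sketch of the kind of adjusted logistic model the abstract describes, in Python with statsmodels; the file and column names are hypothetical, and this illustrates the method rather than the trial's actual code:

```python
# Illustrative sketch of the adjusted logistic regression described above.
# File and column names are hypothetical, not the trial's actual data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("bv_trial.csv")  # hypothetical dataset

# BMI categories from the abstract: <25 (reference), 25-29.9, >=30
df["bmi_cat"] = pd.cut(df["bmi"], bins=[0, 25, 30, np.inf],
                       right=False, labels=["<25", "25-29.9", ">=30"])

# Outcome: BV persistence at follow-up (1 = persistent), adjusted for the
# confounders named in the abstract.
model = smf.logit(
    "bv_persist ~ C(bmi_cat, Treatment('<25'))"
    " + maternal_age + smoking + marital_status + black_race",
    data=df,
).fit()

print(np.exp(model.params))      # adjusted odds ratios
print(np.exp(model.conf_int()))  # 95% confidence intervals
```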


Relevance: 30.00%

Abstract:

Many persons in the U.S. gain weight during young adulthood, and the prevalence of obesity has been increasing among young adults. Although obesity and physical inactivity are generally recognized as risk factors for coronary heart disease (CHD), the magnitude of their effect on risk may have been seriously underestimated due to failure to adequately handle the problem of cigarette smoking. Since cigarette smoking causes weight loss, physically inactive cigarette smokers may remain relatively lean because they smoke cigarettes. We hypothesize that cigarette smoking modifies the association between weight gain during young adulthood and risk of coronary heart disease during middle age, and that the true effect of weight gain during young adulthood on risk of CHD can be assessed only in persons who have not smoked cigarettes. Specifically, we hypothesize that weight gain during young adulthood is positively associated with risk of CHD during middle age in nonsmokers but that the association is much smaller or absent entirely among cigarette smokers. The purpose of this study was to test this hypothesis. The population for analysis comprised 1,934 middle-aged, employed men whose average age at the baseline examination was 48.7 years. Information collected at the baseline examinations in 1958 and 1959 included recalled weight at age 20, present weight, height, smoking status, and other CHD risk factors. To decrease the effect of intraindividual variation, the mean values of the 1958 and 1959 baseline examinations were used in analyses. Change in body mass index (ΔBMI) during young adulthood was the primary exposure variable and was measured as BMI at baseline (kg/m²) minus BMI at age 20 (kg/m²). Proportional hazards regression analysis was used to generate relative risks of CHD mortality by category of ΔBMI and cigarette smoking status after adjustment for age, family history of CVD, major organ system disease, BMI at age 20, and number of cigarettes smoked per day. Adjustment was not performed for systolic blood pressure or total serum cholesterol, as these were regarded as intervening variables. Vital status was known for all men on the 25th anniversary of their baseline examinations. 705 deaths (including 319 CHD deaths) occurred over 40,136 person-years of experience. ΔBMI was positively associated with risk of CHD mortality in never-smokers, but not in ever-smokers (p for interaction = 0.067). For never-smokers with ΔBMI of stable, low gain, moderate gain, and high gain, adjusted relative risks were 1.00, 1.62, 1.61, and 2.78, respectively (p for trend = 0.010). For ever-smokers with ΔBMI of stable, low gain, moderate gain, and high gain, adjusted relative risks were 1.00, 0.74, 1.07, and 1.06, respectively (p for trend = 0.422). These results support the research hypothesis that cigarette smoking modifies the association between weight gain and CHD mortality. Current estimates of the magnitude of effect of obesity and physical inactivity on risk of coronary mortality may be serious underestimates due to inadequate handling of cigarette smoking.
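
The analysis above can be sketched with the lifelines library: ΔBMI is computed from the two BMI measurements, and a proportional hazards model is fit separately in never- and ever-smokers. All file and column names are hypothetical:

```python
# Illustrative lifelines sketch of the proportional hazards analysis above:
# delta-BMI from the two BMI measurements, Cox models fit separately in
# never- and ever-smokers. File and column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("chd_cohort.csv")  # hypothetical dataset
df["delta_bmi"] = df["bmi_baseline"] - df["bmi_age20"]  # kg/m^2

for smoker, sub in df.groupby("ever_smoker"):
    cph = CoxPHFitter()
    cph.fit(
        sub,
        duration_col="followup_years",
        event_col="chd_death",
        # adjustment set from the abstract; delta_bmi_cat is the
        # stable / low gain / moderate gain / high gain category
        formula="C(delta_bmi_cat) + age + family_hx_cvd"
                " + organ_disease + bmi_age20 + cigs_per_day",
    )
    print("ever_smoker =", smoker)
    print(cph.hazard_ratios_)  # adjusted relative risks by category
```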

Relevance: 30.00%

Abstract:

BACKGROUND Patients with prior coronary artery bypass graft surgery (CABG) who present with an acute coronary syndrome have a high risk for recurrent events. Whether intensive antiplatelet therapy with ticagrelor might be beneficial compared with clopidogrel is unknown. In this substudy of the PLATO trial, we studied the effects of randomized treatment dependent on history of CABG. METHODS Patients participating in PLATO were classified according to whether they had undergone prior CABG. The trial's primary and secondary end points were compared using Cox proportional hazards regression. RESULTS Of the 18,613 study patients, 1,133 (6.1%) had prior CABG. Prior-CABG patients had more high-risk characteristics at study entry and a 2-fold increase in clinical events during follow-up, but less major bleeding. The primary end point (composite of cardiovascular death, myocardial infarction, and stroke) was reduced to a similar extent by ticagrelor among patients with (19.6% vs 21.4%; adjusted hazard ratio [HR], 0.91 [0.67, 1.24]) and without (9.2% vs 11.0%; adjusted HR, 0.86 [0.77, 0.96]; P(interaction) = .73) prior CABG. Major bleeding was similar with ticagrelor versus clopidogrel among patients with (8.1% vs 8.7%; adjusted HR, 0.89 [0.55, 1.47]) and without (11.8% vs 11.4%; HR, 1.08 [0.98, 1.20]; P(interaction) = .46) prior CABG. CONCLUSIONS Prior-CABG patients presenting with acute coronary syndrome are a high-risk cohort for death and recurrent cardiovascular events but have a lower risk for major bleeding. Similar to the results in no-prior-CABG patients, ticagrelor was associated with a reduction in ischemic events without an increase in major bleeding.
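
The reported interaction P values can be approximated from the subgroup hazard ratios and their 95% CIs alone, using a normal approximation on the log scale (a back-of-envelope check, not necessarily the trial's exact method); plugging in the primary end point numbers above reproduces P ≈ .73:

```python
# Back-of-envelope interaction test: compare two subgroup hazard ratios
# on the log scale, with SEs recovered from their 95% CIs.
from math import log, sqrt
from scipy.stats import norm

def se_from_ci(lo, hi):
    # SE of log(HR), assuming a symmetric 95% CI on the log scale
    return (log(hi) - log(lo)) / (2 * 1.96)

def interaction_p(hr1, ci1, hr2, ci2):
    diff = log(hr1) - log(hr2)
    se = sqrt(se_from_ci(*ci1) ** 2 + se_from_ci(*ci2) ** 2)
    return 2 * norm.sf(abs(diff) / se)

# Primary end point: prior CABG (HR 0.91) vs no prior CABG (HR 0.86)
print(interaction_p(0.91, (0.67, 1.24), 0.86, (0.77, 0.96)))  # ~0.73
```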

Relevance: 30.00%

Abstract:

The water relations of two tree species in the Euphorbiaceae were compared to test in part a hypothesis that the forest understorey plays an integral role in drought response. At Danum, Sabah, the relatively common species Dimorphocalyx muricatus is associated with ridges, whilst another species, Mallotus wrayi, occurs widely both on ridges and lower slopes. Sets of subplots within two 4-ha permanent plots in this lowland dipterocarp rain forest were positioned on ridges and lower slopes. Soil water potentials were recorded in 1995-1997, and leaf water potentials were measured on six occasions. Soil water potentials on the ridges (-0.047 MPa) were significantly lower than on the lower slopes (-0.012 MPa), but during the driest period in May 1997 they fell to similarly low levels at both sites (-0.53 MPa). A weighted 40-day accumulated rainfall index was developed to model the soil water potentials. At dry times, D. muricatus (ridge) had significantly higher pre-dawn (-0.21 v. -0.57 MPa) and mid-day (-0.59 v. -1.77 MPa) leaf water potentials than M. wrayi (mean of ridge and lower slope). Leaf osmotic potentials of M. wrayi on the ridges were lower (-1.63 MPa) than on lower slopes (-1.09 MPa), with those for D. muricatus being intermediate (-1.29 MPa): both species adjusted osmotically between wet and dry times. D. muricatus trees were more deeply rooted than M. wrayi trees (97 v. 70 cm). M. wrayi trees had greater lateral root cross-sectional areas than D. muricatus trees, although a greater proportion of this sectional area for D. muricatus was further down the soil profile. D. muricatus appeared to maintain relatively high water potentials during dry periods because of its access to deeper water supplies, and thus it largely avoided drought effects; M. wrayi seemed to be more affected by, yet tolerant of, drought and was more plastic in its response. The interaction between water availability and topography determines these species' distributions and provides insights into how rain forests can withstand occasional strong droughts.
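
The abstract does not give the exact weighting scheme of the 40-day accumulated rainfall index; a plausible sketch with a linear taper (most recent day weighted most) looks like this, purely as an assumption about the form:

```python
# Sketch of a weighted 40-day accumulated rainfall index. The linear taper
# (most recent day weighted 1.0, day 40 weighted 1/40) is an assumption;
# the paper's actual weights may differ.
import numpy as np

def rainfall_index(daily_rain_mm, window=40):
    rain = np.asarray(daily_rain_mm, dtype=float)
    weights = np.arange(window, 0, -1) / window  # 1.0 down to 1/40
    index = np.full(rain.shape, np.nan)
    for t in range(window - 1, len(rain)):
        recent = rain[t - window + 1 : t + 1][::-1]  # most recent day first
        index[t] = np.sum(weights * recent)
    return index

# Example: index on day 60 of a synthetic rainfall series (mm)
rng = np.random.default_rng(0)
print(rainfall_index(rng.gamma(0.4, 10.0, size=60))[-1])
```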

Relevance: 30.00%

Abstract:

BACKGROUND Little is known as to whether negative emotions adversely impact the prognosis of patients who undergo cardiac rehabilitation. We prospectively investigated the predictive value of state negative affect (NA) assessed at discharge from cardiac rehabilitation for prognosis, and the moderating role of positive affect (PA) on the effect of NA on outcomes. METHODS A total of 564 cardiac patients (mean age 62.49 ± 11.51 years) completed a comprehensive three-month outpatient cardiac rehabilitation program, filling in the Global Mood Scale (GMS) at discharge. The combined endpoint was cardiovascular disease (CVD)-related hospitalizations plus all-cause mortality at follow-up. Cox regression models estimated the predictive value of NA, as well as the moderating influence of PA, on outcomes. Survival models were adjusted for sociodemographic factors, traditional cardiovascular risk factors, and severity of disease. RESULTS During a mean follow-up period of 3.4 years, 71 patients were hospitalized for a CVD-related event and 15 patients died. NA score (range 0-20) was a significant and independent predictor (hazard ratio (HR) 1.091, 95% confidence interval (CI) 1.012-1.175; p = 0.023), with a three-point higher level of NA increasing the relative risk by 9.1%. Furthermore, PA interacted significantly with NA (p < 0.001). The relative risk of poor prognosis with NA was increased in patients with low PA (p = 0.012) but remained unchanged in combination with high PA (p = 0.12). CONCLUSION The combination of NA with low PA was particularly predictive of poor prognosis. Whether reduction of NA and increase of PA, particularly in those with high NA, improves outcome needs to be tested.

Relevance: 30.00%

Abstract:

PURPOSE To quantify the coinciding improvement in the clinical diagnosis of sepsis, its documentation in the electronic health records, and subsequent medical coding of sepsis for billing purposes in recent years. METHODS We examined 98,267 hospitalizations in 66,208 patients who met systemic inflammatory response syndrome criteria at a tertiary care center from 2008 to 2012. We used g-computation to estimate the causal effect of the year of hospitalization on receiving an International Classification of Diseases, Ninth Revision, Clinical Modification discharge diagnosis code for sepsis by estimating changes in the probability of getting diagnosed and coded for sepsis during the study period. RESULTS When adjusted for demographics, Charlson-Deyo comorbidity index, blood culture frequency per hospitalization, and intensive care unit admission, the causal risk difference for receiving a discharge code for sepsis per 100 hospitalizations with systemic inflammatory response syndrome, had the hospitalization occurred in 2012, was estimated to be 3.9% (95% confidence interval [CI], 3.8%-4.0%), 3.4% (95% CI, 3.3%-3.5%), 2.2% (95% CI, 2.1%-2.3%), and 0.9% (95% CI, 0.8%-1.1%) for the years 2008 to 2011, respectively. CONCLUSIONS Patients with similar characteristics and risk factors had a higher probability of being diagnosed, documented, and coded for sepsis in 2012 than in previous years, which contributed to an apparent increase in sepsis incidence.
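
A minimal g-computation sketch of this design: fit an outcome model for receiving a sepsis code, then standardize predictions over the observed covariate distribution with the hospitalization year set counterfactually. The logistic outcome model is an assumption, and file and column names are hypothetical:

```python
# Minimal g-computation sketch: outcome model for receiving a sepsis code,
# then standardization over the covariate distribution with the year set
# counterfactually. File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("sirs_hospitalizations.csv")  # hypothetical dataset

# 1. Outcome model (logistic, as an assumption) with the adjustment set
#    from the abstract.
m = smf.logit(
    "sepsis_code ~ C(year) + age + sex + charlson_deyo"
    " + blood_cultures + icu_admit",
    data=df,
).fit()

# 2. Predict for every hospitalization as if it had occurred in a given
#    year, then average: a standardized (marginal) risk.
def standardized_risk(year):
    return m.predict(df.assign(year=year)).mean()

# Causal risk differences per 100 hospitalizations, 2012 vs. earlier years
risk_2012 = standardized_risk(2012)
for y in range(2008, 2012):
    print(y, round(100 * (risk_2012 - standardized_risk(y)), 2))
```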

Relevance: 30.00%

Abstract:

B-type natriuretic peptide (BNP) levels are elevated in patients with aortic stenosis (AS) and decrease acutely after replacement of the stenotic valve. The long-term prognostic value of BNP after transcatheter aortic valve implantation (TAVI) and the relative prognostic utility of single versus serial peri-interventional measurements of BNP and N-terminal prohormone BNP (NT-pro-BNP) are unknown. This study sought to determine the impact of BNP levels on long-term outcomes after TAVI and to compare the utility of BNP versus NT-pro-BNP measured before and after intervention. We analyzed 340 patients with severe AS and baseline pre-TAVI assessment of BNP. In 219 patients, BNP and NT-pro-BNP were measured serially before and after intervention. Clinical outcomes over 2 years were recorded. Patients with high baseline BNP (highest tertile, ≥591 pg/ml) had increased risk of all-cause mortality (adjusted hazard ratio 3.16, 95% confidence interval 1.84 to 5.42; p < 0.001) and cardiovascular death at 2 years (adjusted hazard ratio 3.37, 95% confidence interval 1.78 to 6.39; p < 0.001). Outcomes were most unfavorable in patients with persistently high BNP before and after intervention. Comparing the 2 biomarkers, NT-pro-BNP levels measured after TAVI showed the highest prognostic discrimination for 2-year mortality (area under the curve 0.75; p < 0.01). Baseline-to-discharge reduction, but not baseline level, of BNP was related to New York Heart Association functional improvement. In conclusion, high preintervention BNP independently predicts 2-year outcomes after TAVI, particularly when elevated levels persist after the intervention. BNP and NT-pro-BNP and their serial periprocedural changes provide complementary prognostic information for symptomatic improvement and survival.
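
Comparing the discrimination of the two biomarkers, as done above with the area under the curve, can be sketched with scikit-learn (file and column names are illustrative):

```python
# Sketch of the discrimination comparison: ROC area under the curve for
# 2-year mortality, for each peri-interventional biomarker measurement.
# File and column names are illustrative.
import pandas as pd
from sklearn.metrics import roc_auc_score

df = pd.read_csv("tavi_cohort.csv")  # hypothetical dataset
for marker in ["bnp_baseline", "bnp_post",
               "ntprobnp_baseline", "ntprobnp_post"]:
    print(marker, round(roc_auc_score(df["death_2y"], df[marker]), 2))
```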

Relevance: 30.00%

Abstract:

BACKGROUND As access to antiretroviral therapy (ART) expands, increasing numbers of older patients will start treatment and need specialised long-term care. However, the effect of age in ART programmes in resource-constrained settings is poorly understood. The HIV epidemic is ageing rapidly and South Africa has one of the highest HIV population prevalences worldwide. We explored the effect of age on mortality of patients on ART in South Africa and whether this effect is mediated by baseline immunological status. METHODS In this retrospective cohort analysis, we studied HIV-positive patients aged 16-80 years who started ART for the first time in six large South African cohorts of the International Epidemiologic Databases to Evaluate AIDS-Southern Africa collaboration, in KwaZulu-Natal, Gauteng, and Western Cape (two primary care clinics, three hospitals, and a large rural cohort). The primary outcome was mortality. We ascertained patients' vital status through linkage to the National Population Register. We used inverse probability weighting to correct mortality for loss to follow-up. We estimated mortality using Cox's proportional hazards and competing risks regression. We tested the interaction between baseline CD4 cell count and age. FINDINGS Between Jan 1, 2004, and Dec 31, 2013, 84,078 eligible adults started ART. Of these, we followed up 83,566 patients for 174,640 patient-years. 8% (1817 of 23,258) of patients aged 16-29 years died compared with 19% (93 of 492) of patients aged 65 years or older. The age adjusted mortality hazard ratio was 2.52 (95% CI 2.01-3.17) for people aged 65 years or older compared with those 16-29 years of age. In patients starting ART with a CD4 count of less than 50 cells per μL, the adjusted mortality hazard ratio was 2.52 (2.04-3.11) for people aged 50 years or older compared with those 16-39 years old. Mortality was highest in patients with CD4 counts of less than 50 cells per μL, and 15% (1103 of 7295) of all patients aged 50 years or older starting ART were in this group. The proportion of patients aged 50 years or older enrolling in ART increased with successive years, from 6% (290 of 4999) in 2004 to 10% (961 of 9657) in 2012-13, comprising 9% of total enrolment (7295 of 83,566). At the end of the study, 6304 (14%) of 44,909 patients still alive and in care were aged 50 years or older. INTERPRETATION Health services need reorientation towards HIV diagnosis and starting of ART in older individuals. Policies are needed for long-term care of older people with HIV. FUNDING National Institutes of Health (National Institute of Allergy and Infectious Diseases), US Agency for International Development, and South African Centre for Epidemiological Modelling and Analysis.
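
A sketch of the weighting-plus-survival approach described in the methods, using statsmodels for the loss-to-follow-up model and lifelines for the weighted Cox fit; variable names are hypothetical and the retention model is a simplification of what such an analysis would actually use:

```python
# Sketch of inverse probability weighting for loss to follow-up combined
# with a Cox model. The retention model below is a simplification and all
# file and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf
from lifelines import CoxPHFitter

df = pd.read_csv("art_cohort.csv")  # hypothetical dataset

# 1. Model the probability of remaining under observation, then weight
#    each patient by its inverse.
retention = smf.logit(
    "not_lost ~ C(age_group) + sex + cd4_baseline + C(cohort)", data=df
).fit()
df["ipw"] = 1.0 / retention.predict(df)

# 2. Weighted Cox model for mortality by age group.
cph = CoxPHFitter()
cph.fit(df, duration_col="years_followed", event_col="died",
        weights_col="ipw", robust=True,
        formula="C(age_group) + sex + cd4_baseline")
print(cph.hazard_ratios_)  # age-adjusted mortality hazard ratios
```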

Relevance: 30.00%

Abstract:

OBJECTIVES This study sought to evaluate: 1) the effect of impaired renal function on long-term clinical outcomes in women undergoing percutaneous coronary intervention (PCI) with drug-eluting stent (DES); and 2) the safety and efficacy of new-generation compared with early-generation DES in women with chronic kidney disease (CKD). BACKGROUND The prevalence and effect of CKD in women undergoing PCI with DES is unclear. METHODS We pooled patient-level data for women enrolled in 26 randomized trials. The study population was categorized by creatinine clearance (CrCl) <45 ml/min, 45 to 59 ml/min, and ≥60 ml/min. The primary endpoint was the 3-year rate of major adverse cardiovascular events (MACE). Participants for whom baseline creatinine was missing were excluded from the analysis. RESULTS Of 4,217 women included in the pooled cohort treated with DES and for whom serum creatinine was available, 603 (14%) had a CrCl <45 ml/min, 811 (19%) had a CrCl 45 to 59 ml/min, and 2,803 (66%) had a CrCl ≥60 ml/min. A significant stepwise gradient in risk for MACE was observed with worsening renal function (26.6% vs. 15.8% vs. 12.9%; p < 0.01). Following multivariable adjustment, CrCl <45 ml/min was independently associated with a higher risk of MACE (adjusted hazard ratio: 1.56; 95% confidence interval: 1.23 to 1.98) and all-cause mortality (adjusted hazard ratio: 2.67; 95% confidence interval: 1.85 to 3.85). Compared with older-generation DES, the use of newer-generation DES was associated with a reduction in the risk of cardiac death, myocardial infarction, or stent thrombosis in women with CKD. The effect of new-generation DES on outcomes was uniform between women with and without CKD, with no evidence of interaction. CONCLUSIONS Among women undergoing PCI with DES, CKD is a common comorbidity associated with a strong and independent risk for MACE that is durable over 3 years. The benefits of newer-generation DES are uniform in women with or without CKD.
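
Creatinine clearance is conventionally estimated with the Cockcroft-Gault equation; whether these 26 trials used that exact formula is not stated in the abstract, so treat the sketch below as an assumption about the method:

```python
# Cockcroft-Gault creatinine clearance and the CKD categories used above.
# Whether the pooled trials used this exact equation is an assumption.
def crcl_cockcroft_gault(age_years, weight_kg, scr_mg_dl, female=True):
    crcl = (140 - age_years) * weight_kg / (72 * scr_mg_dl)
    return crcl * 0.85 if female else crcl  # 0.85 correction for women

def ckd_category(crcl):
    if crcl < 45:
        return "<45 ml/min"
    if crcl < 60:
        return "45-59 ml/min"
    return ">=60 ml/min"

# Example: a 74-year-old woman, 62 kg, serum creatinine 1.3 mg/dl
crcl = crcl_cockcroft_gault(74, 62, 1.3)
print(round(crcl, 1), ckd_category(crcl))  # 37.2 <45 ml/min
```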

Relevance: 30.00%

Abstract:

To increase the efficiency of equine semen use, it could be useful to split the artificial insemination dose and refreeze the redundant spermatozoa. In experiment I, semen of 10 sires of the Hanoverian breed, with poor and good semen freezability, was collected by artificial vagina, centrifuged, extended in INRA82 at 400 × 10⁶ sperm/mL, and automatically frozen. After this first, routinely applied freezing program, semen from each stallion was thawed, resuspended in INRA82 at 40 × 10⁶ sperm/mL, filled into 0.5-mL straws, and refrozen. These steps were repeated, and sperm concentration was adjusted to 20 × 10⁶ sperm/mL after a third freezing cycle. Regardless of stallion freezability group, sperm motility and sperm membrane integrity (FITC/PNA-Syto-PI stain) decreased stepwise after the first, second, and third freezing (62.3% ± 9.35, 24.0% ± 15.4, 3.3% ± 4.34, P ≤ .05; 29.6% ± 8.64, 14.9% ± 6.38, 8.3% ± 3.24, P ≤ .05), whereas the percentage of acrosome-reacted cells increased (19.5% ± 7.59, 23.9% ± 8.51, 29.2% ± 6.58, P ≤ .05). Sperm chromatin integrity was unaffected after multiple freeze/thaw cycles (DFI value: 18.6% ± 6.6, 17.2% ± 6.84, 17.1% ± 7.21, P > .05). In experiment II, estrous Hanoverian warmblood mares were inseminated with a total of 200 × 10⁶ spermatozoa of two stallions with either good or poor semen freezability, originating from the first, second, or third freeze/thaw cycle. First-cycle pregnancy rates were 4/10 (40%), 1/10 (10%), and 0/10 (0%), respectively. In conclusion, as expected, sperm viability of stallion spermatozoa significantly decreases as a consequence of multiple freezing. However, sperm chromatin integrity was not affected. Pregnancy rates after insemination of mares with refrozen semen are reduced.
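
The dose arithmetic implied by these concentrations is straightforward; for example, a 200 × 10⁶-sperm insemination dose drawn from second-cycle straws works out to 10 straws:

```python
# Dose arithmetic from the concentrations above: a 0.5-mL straw at
# 40 x 10^6 sperm/mL holds 20 x 10^6 sperm, so a 200 x 10^6-sperm
# insemination dose corresponds to 10 second-cycle straws.
straw_volume_ml = 0.5
concentration_per_ml = 40e6   # after the second freezing cycle
dose_total_sperm = 200e6      # total sperm per insemination

sperm_per_straw = straw_volume_ml * concentration_per_ml
print(dose_total_sperm / sperm_per_straw)  # 10.0
```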

Relevance: 30.00%

Abstract:

BACKGROUND There are concerns about the effects of in utero exposure to antiretroviral drugs (ARVs) on the development of HIV-exposed but uninfected (HEU) children. The aim of this study was to evaluate whether in utero exposure to ARVs is associated with lower birth weight/height and reduced growth during the first 2 years of life. METHODS This cohort study was conducted among HEU infants born between 1996 and 2010 in a tertiary children's hospital in Rio de Janeiro, Brazil. Weight was measured by mechanical scale, and height was measured by measuring board. Z-scores for weight-for-age (WAZ), length-for-age (LAZ), and weight-for-length were calculated. We modeled trajectories with mixed-effects models and adjusted for mother's age, CD4 cell count, viral load, year of birth, and family income. RESULTS A total of 588 HEU infants were included, of whom 155 (26%) were not exposed to ARVs, 114 (19%) were exposed early (first trimester), and 319 (54%) later. WAZ were lower among infants exposed early compared with infants exposed later: adjusted differences were -0.52 (95% confidence interval [CI]: -0.99 to -0.04, P = 0.02) at birth and -0.22 (95% CI: -0.47 to 0.04, P = 0.10) during follow-up. LAZ were lower during follow-up: -0.35 (95% CI: -0.63 to -0.08, P = 0.01). There were no differences in weight-for-length scores. Z-scores of infants exposed late during pregnancy were similar to those of unexposed infants. CONCLUSIONS In HEU children, early exposure to ARVs was associated with lower WAZ at birth and lower LAZ up to 2 years of life. Growth of HEU children needs to be monitored closely.
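
The z-scores above are anchored to growth-reference tables; a simplified sketch of the construction (real WAZ/LAZ analyses use the WHO LMS tables, and the reference values below are placeholders):

```python
# Simplified z-score construction: (observed - reference median) / SD for
# the child's age and sex. Real WAZ/LAZ analyses use the WHO LMS (Box-Cox)
# tables; the reference values below are placeholders for illustration.
def z_score(observed, ref_median, ref_sd):
    return (observed - ref_median) / ref_sd

# Hypothetical example: weight-for-age for a 12-month-old weighing 8.6 kg,
# against an illustrative reference median of 9.6 kg and SD of 1.0 kg.
print(z_score(8.6, 9.6, 1.0))  # WAZ = -1.0
```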

Relevance: 30.00%

Abstract:

Problem. Recent statistics show that over a fifth of children aged 2-5 years in 2006-2008 were overweight, with 7% above the 97th percentile of the BMI-for-age growth charts (extreme obesity). Because poor diet is an important environmental determinant of obesity and the preschool years are crucial developmentally, examination of factors related to diet in the preschool years is important for obesity prevention efforts. Objective. The goals of this study were to determine the association between BMI of the parents and the number of servings of fruits, vegetables, and whole grains (FVWG) packed; the nutrient content of preschool children's lunches; and norms and expectations about FVWG intake. Methods. This study was a cross-sectional analysis of parents enrolled in the Lunch is in the Bag program at baseline. The independent measure was weight status of the parents/caregivers, which was determined using body mass index (BMI) calculated from self-reported height and weight. BMI was classified as healthy weight (BMI <25) or overweight/obese (BMI ≥25). Outcomes for the study included the number of servings of fruits, vegetables, and whole grains (FVWG) in sack lunches, as well as the nutrient content of the lunches and psychosocial constructs related to FVWG consumption. Linear regression analysis was conducted, adjusted for confounders, to examine the associations of these outcomes with parental weight status, the main predictor. Results. A total of 132 parent/child dyads were enrolled in the study; 59.09% (n=78) of the parents/caregivers were healthy weight and 40.91% (n=54) were overweight/obese. Parents/caregivers in the study were predominantly white (68%, n=87) and had at least some college education (98%, n=128). No significant associations were found between the weight status of the parents and the servings of fruits, vegetables, and whole grains packed in preschool children's lunchboxes. The results were similar for the association of parental weight status with the nutrient contents of the packed lunches. Both healthy weight and overweight/obese parents packed less than the recommended amounts of vegetables (mean servings = 0.49 and 0.534, respectively) and whole grains (mean servings = 0.58 and 0.511, respectively). However, the intentions of the overweight/obese parents to pack vegetables and whole grains were higher than those of the healthy weight parents. Conclusion. Results from this study indicate that there are few differences in the servings of fruits, vegetables, and whole grains packed by healthy weight parents/caregivers compared with overweight/obese parents/caregivers in a high-income, well-educated population, although neither group met the recommended number of servings of vegetables or whole grains. Thus, results indicate the need for behaviorally based health promotion programs for parents, regardless of their weight status; however, this study should be replicated with larger and more diverse populations to determine whether these results hold in less homogeneous populations.
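
The weight-status classification used as the study's independent measure is a direct BMI computation; a small sketch with an illustrative example:

```python
# BMI from self-reported height and weight, with the two-category
# classification used in the study (healthy weight < 25,
# overweight/obese >= 25).
def bmi(weight_kg, height_m):
    return weight_kg / height_m ** 2

def weight_status(bmi_value):
    return "healthy weight" if bmi_value < 25 else "overweight/obese"

value = bmi(82.0, 1.68)  # example parent: 82 kg, 1.68 m
print(round(value, 1), weight_status(value))  # 29.1 overweight/obese
```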

Relevance: 30.00%

Abstract:

Background. Health care-associated catheter-related bloodstream infections (CRBSI) represent a significant public health concern in the United States. Several studies have suggested that precautions such as maximum sterile barriers and the use of antimicrobial catheters are efficacious at reducing CRBSI, but there is concern within the medical community that prolonged use of antimicrobial catheters may be associated with increased bacterial resistance. Clinical studies have shown no association, and even a significant decrease, in microbial resistance with prolonged minocycline/rifampin (M/R) catheter use. One explanation is the emergence of community-acquired methicillin-resistant Staphylococcus aureus (MRSA), which is more susceptible to antibiotics, as a cause of CRBSI. Methods. Data from 323 MRSA isolates cultured from cancer patients at The University of Texas MD Anderson Cancer Center from 1997-2007 were analyzed to determine whether there is a relationship between resistance to minocycline and rifampin and prolonged, widespread use of M/R catheters. Analysis was also conducted to determine whether there was a significant change in the prevalence of community-acquired MRSA (CA-MRSA) during this time period and whether this emergence acted as a confounder masking the true relationship between microbial resistance and prolonged M/R catheter use. Results. Our study showed that the significant (p=0.008) change in strain type over time is a confounding variable; the adjusted model showed a significant protective effect (OR 0.000281, 95% CI 1.4 × 10⁻⁴ to 5.5 × 10⁻⁴) in the relationship between MRSA resistance to minocycline and prolonged M/R catheter use. The relationship between resistance to rifampin and prolonged M/R catheter use was not significant. Conclusion. The emergence of CA-MRSA is a confounder in the relationship between resistance to minocycline and rifampin and prolonged M/R catheter use. However, even after adjustment for the more susceptible CA-MRSA, the widespread use of M/R catheters does not promote microbial resistance.

Relevance: 30.00%

Abstract:

The purpose of this study was to investigate whether an incongruence between personality characteristics of individuals and concomitant characteristics of health professional training environments on salient dimensions contributes to aspects of mental health. The dimensions examined were practical-theoretical orientation and the degree of structure-unstructure. They were selected for study because they are particularly important attributes of students and of learning environments. It was proposed that when the demand of the environment is disparate from the proclivities of the individual, strain arises. This strain was hypothesized to contribute to anxiety, depression, and subjective distress. Select subscales on the Omnibus Personality Inventory (OPI) were the operationalized measures for the personality component of the dimensions studied. An environmental index was developed to assess students' perceptions of the learning environment on these same dimensions. The Beck Depression Inventory, State-Trait Anxiety Inventory, and General Well-Being Schedule measured the outcome variables. A congruence model was employed to determine person-environment (P-E) interaction. Scores on the scales of the OPI and the environmental index were divided into high, medium, and low based on the range of scores. Congruence was defined as a match between the level of personality need and the complementary level of the perception of the environment; alternatively, incongruence was defined as a mismatch between the person and the environment. The congruent category was compared to the incongruent categories by an analysis of variance procedure. Furthermore, analyses of covariance were conducted with perceived supportiveness of the learning environment and life events external to the learning environment as the covariates. These factors were considered critical influences affecting the outcome measures. One hundred and eighty-five students (49% of the population) at the College of Optometry at the University of Houston participated in the study. Students in all four years of the program were equally represented in the study; however, the sample differed from the total population in its representation by sex, marital status, and undergraduate major. The results of the study did not support the hypotheses. Further, after adjusting for perceived supportiveness and life events external to the learning environment, there were no statistically significant differences between the congruent category and the incongruent categories. Means indicated that the study sample experienced significantly lower depression and subjective distress than the normative samples. Results are interpreted in light of their utility for future study design in the investigation of the effects of P-E interaction. Emphasized is the question of the feasibility of testing a P-E interaction model with extant groups. Recommendations for subsequent research are proposed in light of the exploratory nature of the methodology.