925 results for Negative Outcomes associated with Medication
Abstract:
OBJECTIVE: To estimate the cumulative incidence of severe complications associated with genital chlamydia infection in the general female population. METHODS: The Uppsala Women's Cohort Study was a retrospective population-based cohort study in Sweden, linking laboratory, hospital, and population registers. We estimated the cumulative incidence of hospital-diagnosed pelvic inflammatory disease, ectopic pregnancy, and infertility, and used multivariable regression models to estimate hazard ratios according to screening status. RESULTS: We analysed complete data from 43 715 women in Uppsala aged 15-24 years between January 1985 and December 1989. Follow-up until the end of 1999 included 709 000 woman-years and 3025 events. The cumulative incidence of pelvic inflammatory disease by age 35 years was 3.9% (95% CI 3.7% to 4.0%) overall: 5.6% (4.7% to 6.7%) in women who ever tested positive for chlamydia, 4.0% (3.7% to 4.4%) in those with negative tests, and 2.9% (2.7% to 3.2%) in those who were never screened. The corresponding figures were: for ectopic pregnancy, 2.3% (2.2% to 2.5%) overall, 2.7% (2.1% to 3.5%), 2.0% (1.8% to 2.3%), and 1.9% (1.7% to 2.1%); and for infertility, 4.1% (3.9% to 4.3%) overall, 6.7% (5.7% to 7.9%), 4.7% (4.4% to 5.1%), and 3.1% (2.8% to 3.3%). Low educational attainment was strongly associated with the development of all outcomes. CONCLUSIONS: The incidence of severe chlamydia-associated complications estimated from our study and other population-based studies was lower than expected. Studies that incorporate data about pelvic inflammatory disease diagnosed in primary care and behavioural risk factors would further improve our understanding of the natural history of chlamydia. Our results provide reassurance for patients, but mean that the benefits of chlamydia screening programmes might have been overestimated.
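A minimal sketch of how cumulative-incidence figures of this kind can be estimated with standard survival-analysis tooling in Python (lifelines); the input file and column names (age_at_event, pid_event, screening_status) are illustrative assumptions, and 1 − S(t) is used as a simple approximation that ignores competing risks.

```python
# Illustrative sketch only: Kaplan-Meier-based cumulative incidence of pelvic
# inflammatory disease (PID) by age 35, stratified by chlamydia screening status.
# The file and column names are assumptions, not the study's actual data.
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.read_csv("cohort.csv")  # one row per woman

kmf = KaplanMeierFitter()
for status, grp in df.groupby("screening_status"):
    # Age in years is the time scale; pid_event is 1 if PID was diagnosed in hospital.
    kmf.fit(grp["age_at_event"], event_observed=grp["pid_event"], label=str(status))
    cum_inc_35 = 1.0 - kmf.predict(35)  # 1 - S(35): cumulative incidence by age 35
    print(f"{status}: cumulative incidence of PID by age 35 = {cum_inc_35:.1%}")
```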
Abstract:
OBJECTIVE: The few long-term follow-up data for sentinel lymph node (SLN) negative breast cancer patients demonstrate a 5-year disease-free survival of 96-98%. It remains to be elucidated whether the more accurate SLN staging defines a more selective node negative patient group and whether this is associated with better overall and disease-free survival compared with level I and II axillary lymph node dissection (ALND). METHODS: Three hundred and fifty-five consecutive node negative patients with early stage breast cancer (pT1 and pT2 ≤ 3 cm, pN0/pN(SN)0) were assessed from our prospective database. Patients underwent either ALND (n=178) in 1990-1997 or SLN biopsy (n=177) in 1998-2004. All SLN were examined by step sectioning and stained with H&E and immunohistochemistry. Lymph nodes from ALND specimens were examined by standard H&E only; neither immunohistochemistry nor step sections were performed in the analysis of ALND specimens. RESULTS: The median follow-up was 49 months in the SLN and 133 months in the ALND group. Patients in the SLN group had a significantly better disease-free (p=0.008) and overall survival (p=0.034). After adjusting for other prognostic factors in Cox proportional hazard regression analysis, the SLN procedure was an independent predictor of improved disease-free (HR: 0.28, 95% CI: 0.10-0.73, p=0.009) and overall survival (HR: 0.34, 95% CI: 0.14-0.84, p=0.019). CONCLUSIONS: This is the first prospective analysis providing evidence that early stage breast cancer patients with a negative SLN have improved disease-free and overall survival compared with node negative ALND patients. This is most likely due to more accurate axillary staging in the SLN group.
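For readers unfamiliar with the adjusted analysis, the following is a minimal sketch of a Cox proportional-hazards model comparing the SLN and ALND groups while controlling for other prognostic factors; the file and all variable names (dfs_months, relapse, sln_group, age, tumour_size_cm, grade) are hypothetical.

```python
# Illustrative sketch of an adjusted Cox proportional-hazards model for
# disease-free survival; exp(coef) for sln_group approximates the hazard ratio
# for the SLN group vs. ALND. Column names are assumptions.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("breast_cohort.csv")  # one row per patient

cph = CoxPHFitter()
cph.fit(
    df[["dfs_months", "relapse", "sln_group", "age", "tumour_size_cm", "grade"]],
    duration_col="dfs_months",  # months of disease-free survival
    event_col="relapse",        # 1 = recurrence or death, 0 = censored
)
cph.print_summary()
```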
Abstract:
We investigated whether occupational role stress is associated with differential levels of the stress hormone cortisol in response to acute psychosocial stress. Forty-three medication-free nonsmoking men aged between 22 and 65 years (mean ± SEM: 44.5 ± 2) underwent an acute standardized psychosocial stress task combining public speaking and mental arithmetic in front of an audience. We assessed occupational role stress in terms of role conflict and role ambiguity (combined into a measure of role uncertainty) as well as further work characteristics and psychological control variables including time pressure, overcommitment, perfectionism, and stress appraisal. Moreover, we repeatedly measured salivary cortisol and blood pressure levels before and after stress exposure, and several times up to 60 min thereafter. Higher role uncertainty was associated with a more pronounced cortisol stress reactivity (p = .016), even when controlling for the full set of potential confounders (p < .001). Blood pressure stress reactivity was not associated with role uncertainty. Our findings suggest that occupational role stress in terms of role uncertainty acts as a background stressor that is associated with increased HPA-axis reactivity to acute stress. This may represent one mechanism by which occupational role stress precipitates adverse health outcomes.
Abstract:
BACKGROUND & AIMS Sporadic pancreatic neuroendocrine tumors (pNETs) are rare and genetically heterogeneous. Chromosome instability (CIN) has been detected in pNETs from patients with poor outcomes, but no specific genetic factors have been associated with CIN. Mutations in the death domain-associated protein gene (DAXX) or the ATR-X gene (ATRX) (both of which encode proteins involved in chromatin remodeling) have been detected in 40% of pNETs, in association with activation of alternative lengthening of telomeres. We investigated whether loss of DAXX or ATRX, and consequent alternative lengthening of telomeres, are related to CIN in pNETs. We also assessed whether loss of DAXX or ATRX is associated with specific phenotypes of pNETs. METHODS We collected well-differentiated primary pNET samples from 142 patients at the University Hospital Zurich and from 101 patients at the University Hospital Bern (both located in Switzerland). Clinical follow-up data were obtained for 149 patients from general practitioners and tumor registries. The tumors were reclassified into 3 groups according to the 2010 World Health Organization classification. Samples were analyzed by immunohistochemistry and telomeric fluorescence in situ hybridization. We correlated loss of DAXX or ATRX expression, and activation of alternative lengthening of telomeres, with data from comparative genomic hybridization array studies, as well as with clinical and pathological features of the tumors and with relapse and survival data. RESULTS Loss of DAXX or ATRX protein and alternative lengthening of telomeres were associated with CIN in pNETs. Furthermore, loss of DAXX or ATRX correlated with tumor stage and metastasis, reduced time of relapse-free survival, and decreased time of tumor-associated survival. CONCLUSIONS Loss of DAXX or ATRX is associated with CIN in pNETs and shorter survival times of patients. These results support the hypothesis that DAXX- and ATRX-negative tumors are a more aggressive subtype of pNET, and could lead to identification of strategies to target CIN in pancreatic tumors.
Abstract:
AIM To investigate risk factors for the loss of multi-rooted teeth (MRT) in subjects treated for periodontitis and enrolled in supportive periodontal therapy (SPT). MATERIAL AND METHODS A total of 172 subjects were examined before (T0) and after active periodontal therapy (APT) (T1) and following a mean of 11.5 ± 5.2 (SD) years of SPT (T2). The association of risk factors with loss of MRT was analysed with multilevel logistic regression. The tooth was the unit of analysis. RESULTS Furcation involvement (FI) = 1 before APT was not a risk factor for tooth loss compared with FI = 0 (p = 0.37). Between T0 and T2, MRT with FI = 2 (OR: 2.92, 95% CI: 1.68, 5.06, p = 0.0001) and FI = 3 (OR: 6.85, 95% CI: 3.40, 13.83, p < 0.0001) were at significantly higher risk of being lost compared with those with FI = 0. During SPT, smokers lost significantly more MRT compared with non-smokers (OR: 2.37, 95% CI: 1.05, 5.35, p = 0.04). Non-smoking and compliant subjects with FI = 0/1 at T1 lost significantly fewer MRT during SPT compared with non-compliant smokers with FI = 2 (OR: 10.11, 95% CI: 2.91, 35.11, p < 0.0001) and FI = 3 (OR: 17.18, 95% CI: 4.98, 59.28, p < 0.0001) respectively. CONCLUSIONS FI = 1 was not a risk factor for tooth loss compared with FI = 0. FI = 2/3, smoking and lack of compliance with regular SPT represented risk factors for the loss of MRT in subjects treated for periodontitis.
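As a rough illustration of the tooth-level analysis described above, the sketch below fits a logistic model for loss of multi-rooted teeth with standard errors clustered on the subject, a simpler stand-in for the multilevel logistic regression used in the study; the file and all column names are hypothetical.

```python
# Illustrative sketch: tooth-level logistic regression for loss of multi-rooted
# teeth, clustering standard errors on subject_id as a pragmatic stand-in for
# the multilevel (random-intercept) model. Column names are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

teeth = pd.read_csv("teeth.csv")  # one row per multi-rooted tooth

model = smf.logit(
    "tooth_lost ~ C(fi_degree) + smoker + compliant",
    data=teeth,
).fit(cov_type="cluster", cov_kwds={"groups": teeth["subject_id"]})

print(model.summary())
print(np.exp(model.params))  # odds ratios, e.g. FI = 2 or FI = 3 vs. FI = 0
```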
Abstract:
OBJECTIVES Although vitamin D is recognized as an important factor in bone health, its role in osteoarticular infections is unclear. We hypothesized that low vitamin D (25-hydroxycholecalciferol) levels are associated with a lower likelihood of treatment success in osteoarticular infections. METHODS This was a retrospective cohort study of patients with orthopedic infections who had a 25-hydroxycholecalciferol level drawn when their infection was diagnosed. Outcomes were determined at early (3-6 months) and late (≥6 months) follow-up after completing intravenous antibiotics. RESULTS We included 223 patients seen during an 11-month period with osteoarticular infections and baseline 25-hydroxycholecalciferol levels. During the initial inpatient management of the infection, hypovitaminosis D was identified and treated. The mean 25-hydroxycholecalciferol level was 23±14 ng/ml; 167 (75%) patients had levels <30 ng/ml. Overall, infection treatment success was 91% (159/174) at early follow-up and 88% (145/164) at late follow-up. 25-Hydroxycholecalciferol baseline levels were similar in those with and without successful clinical outcomes, both at early (25±15 vs. 21±9 ng/ml; p=0.3) and late follow-up (25±15 vs. 23±16 ng/ml; p=0.6). CONCLUSIONS To our knowledge this is the first report on hypovitaminosis D and its impact on outcomes of osteoarticular infections. Hypovitaminosis D was frequent in this cohort. With vitamin D repletion, there was no difference in treatment success whether or not patients had baseline hypovitaminosis D.
Abstract:
The global social and economic burden of HIV/AIDS is great, with over forty million people reported to be living with HIV/AIDS at the end of 2005; two million of these are children from birth to 15 years of age. Antiretroviral therapy has been shown to improve growth and survival of HIV-infected individuals. The purpose of this study is to describe a cohort of HIV-infected pediatric patients and assess the association of clinical factors with growth and mortality outcomes. This was a historical cohort study. Medical records of infants and children receiving HIV care at Mulago Pediatric Infectious Disease Clinic (PIDC) in Uganda between July 2003 and March 2006 were analyzed. Height and weight measurements were age and sex standardized to the Centers for Disease Control and Prevention (CDC) 2000 reference. Descriptive and logistic regression analyses were performed to identify covariates associated with risk of stunting or being underweight, and with mortality. Longitudinal regression analysis with a mixed model using an autoregressive covariance structure was used to compare change in height and weight before and after initiation of highly active antiretroviral therapy (HAART). The study population comprised 1059 patients 0-20 years of age, the majority of whom were aged thirteen years and below (74.6%). Mean height-for-age before initiation of HAART was in the 10th percentile, mean weight-for-age was in the 8th percentile, and the mean weight-for-height was in the 23rd percentile. Initiation of HAART resulted in improvement in both the mean standardized weight-for-age Z score and weight-for-age percentiles (p <0.001). Baseline age and weight-for-age Z score were associated with stunting (p <0.001). A negative weight-for-age Z score was associated with stunting (OR 4.60, CI 3.04-5.49). Compared with the 0-2 years age category, risk of death decreased from 84% in the >2-8 years age category to 21% in the >13 years age category (p <0.05). This pediatric population gained weight significantly more rapidly than height after starting HAART. A low weight-for-age Z score was associated with poor survival in children. These findings suggest that age, weight, and height measurements be monitored closely at Mulago PIDC.
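To make the longitudinal growth analysis concrete, here is a minimal sketch using a child-level random-intercept mixed model; the original analysis used an autoregressive covariance structure, which this simpler model does not reproduce, and the file and column names are hypothetical.

```python
# Illustrative sketch of the longitudinal growth analysis: weight-for-age Z score
# (waz) modelled over time, with a random intercept per child and an interaction
# that contrasts trajectories before vs. after HAART initiation.
# Column names are assumptions; the study's autoregressive covariance is not reproduced here.
import pandas as pd
import statsmodels.formula.api as smf

visits = pd.read_csv("growth_visits.csv")  # one row per clinic visit

m = smf.mixedlm(
    "waz ~ months_since_enrollment * on_haart + age_years",
    data=visits,
    groups=visits["patient_id"],  # repeated measures clustered within child
).fit()
print(m.summary())
```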
Abstract:
Introduction. Selectively manned units have a long, international history, both military and civilian. Some examples include SWAT teams, firefighters, the FBI, the DEA, the CIA, and military Special Operations. These special duty operators are individuals who perform a highly skilled and dangerous job in a unique environment. A significant amount of money is spent by the Department of Defense (DoD) and other federal agencies to recruit, select, train, equip and support these operators. When a critical incident or significant life event occurs that jeopardizes an operator's performance, there can be heavy losses in terms of training, time, money, and potentially, lives. In order to limit the number of critical incidents, selection processes have been developed over time to "select out" those individuals most likely to perform below desired performance standards under pressure or stress and to "select in" those with the "right stuff". This study is part of a larger program evaluation to assess markers that identify whether a person will fail under the stresses of a selectively manned unit. The primary question of the study is whether there are indicators in the selection process that signify potential negative performance at a later date. Methods. The population studied included applicants to a selectively manned DoD organization between 1993 and 2001 as part of a unit assessment and selection process (A&S). Approximately 1900 A&S records were included in the analysis. Over this nine-year period, seventy-two individuals were determined to have had a critical incident. A critical incident can take the form of problems with the law; personal, behavioral, or family problems; integrity issues; or skills deficits. Of the seventy-two individuals, fifty-four had full assessment data and subsequent supervisor performance ratings, which assessed how an individual performed while on the job. This group was compared, across a variety of variables including demographics and psychometric testing, with a group of 178 individuals who did not have a critical incident and had been determined to be good performers with positive ratings by their supervisors. Results. In approximately 2004, an online pre-screen survey was developed in the hope of pre-screening out those individuals with items that would potentially make them ineligible for selection to this organization. This survey has helped the organization increase its selection rates and save resources in the process (Patterson, Howard Smith, & Fisher, Unit Assessment and Selection Project, 2008). When the same prescreen was applied to the critical incident individuals, over 60% of them would have been flagged as unacceptable, which would have saved the organization valuable resources and heartache. There were some subtle demographic differences between the two groups (e.g., those with critical incidents were almost twice as likely to be divorced compared with the positive performers). On comparison of psychometric testing, several items were noted to differ. The two groups were similar when their IQ levels were compared using the Multidimensional Aptitude Battery (MAB). On the Minnesota Multiphasic Personality Inventory (MMPI), there appeared to be a difference on the Social Introversion scale; the critical incident group scored somewhat higher. On analysis, the number of MMPI Critical Items was similar between the two groups as well.
When scores on the NEO Personality Inventory (NEO) were compared, the critical incident individuals tended to score higher on Openness and on its subscales (Ideas, Actions, and Feelings). There was a positive correlation between the Total Neuroticism T score and the number of MMPI critical items. Conclusions. This study shows that the current pre-screening process is working and would have saved the organization significant resources. If one were to develop a profile of a candidate who could potentially suffer a critical incident and subsequently jeopardize the unit, the mission, and the safety of the public, it would look like the following: either divorced or never married; a high score on MMPI Social Introversion; a low MMPI score with an "excessive" number of MMPI critical items; and, finally, high scores on NEO Openness and its Ideas, Feelings, and Actions subscales. Based on the results of the analysis in this study, there seem to be several factors within psychometric testing that, when taken together, will aid evaluators in selecting only the highest-quality operators in order to save resources and to help protect the public from critical incidents that may adversely affect health and safety.
Abstract:
Background. Increasing rates of maternal employment highlight the need for non-maternal child care for infants at an earlier age. Several studies have shown that employment-induced maternal depression or psychological distress is associated with the child's socio-emotional and cognitive development. However, separation anxiety, a common phenomenon observed among employed mothers during the early years, has seldom been studied. Therefore, the purpose of this study was to evaluate the role of maternal separation anxiety in the child's cognitive development. Methods. Data were obtained from Phase I (birth to 36 months) of the National Institute of Child Health and Human Development Study of Early Child Care and Youth Development (NICHD SECCYD). Bivariate and multivariate analyses were performed to determine the association between separation anxiety groups and child outcomes. Multivariate analysis was also used to examine the mediating and/or moderating effects of maternal sensitivity and the moderating effect of difficult temperament. Results. Separation anxiety showed a negative association with the Bracken score (school readiness), attachment security, maternal sensitivity, and psychological state. Children whose mothers never reported high levels of separation anxiety showed higher levels of school readiness and attachment security compared to those whose mothers experienced high levels of separation anxiety at least once. There was a significant interaction between separation anxiety and maternal sensitivity for the Bracken score and attachment security, indicating a moderating effect of sensitivity. Maternal sensitivity was also found to partially mediate the association between high levels of separation anxiety and school readiness or attachment security. However, the interaction between difficult temperament and separation anxiety was not significant for any of the child outcomes. Conclusions. High levels of separation anxiety have a negative impact on school readiness, attachment security, maternal sensitivity, and psychological state. In addition, mothers who experience high levels of separation anxiety but are sensitive during mother-child interaction have children with higher school readiness and attachment security compared to those who are less sensitive. Keywords. Maternal separation anxiety, school readiness.
Abstract:
The Estudio Comunitario sobre la Salud del Niño cohort study followed 326 3- to 8-year-old Colombian children for 4 years to observe the natural history of Helicobacter pylori infection and identify risk factors for acquisition, recurrence and persistence. Acute H. pylori infection during childhood may predispose to other enteric infections and therefore increase the risk of diarrheal disease. This dissertation aimed to estimate the effect of H. pylori infection on the occurrence of diarrhea and parasitic co-infections. The analysis used Generalized Estimating Equations to obtain odds ratios to estimate relative risks for diarrhea and the Zhang-Yu algorithm to estimate relative risks for parasitic infections. Andersen-Gill models were used to estimate rate ratios for the effect of H. pylori status on the recurrence of parasitic infections. H. pylori status was classified over the entire follow-up duration into 1 of 3 categories: persistently positive, intermittently positive, and persistently negative. Multivariable models included child's sex, age, symptoms, medication use, and socio-environmental factors. H. pylori infection was weakly and imprecisely associated with diarrheal disease, which occurred at an unexpectedly low frequency in this study. Persistently H. pylori-positive children had a somewhat higher incidence of reported diarrhea than intermittently positive or persistently negative children. Stratified analysis revealed that the presence of specific helminths modified the effect of persistent H. pylori infection on diarrhea. The incidence of any parasitic infection was higher in children with persistent H. pylori infection relative to those with intermittent or persistently negative status, but this association did not hold when adjusted for the full set of selected covariates. The effects of H. pylori persistence status were similar for the occurrence or recurrence of Giardia duodenalis, Entamoeba histolytica, and Ascaris lumbricoides. These results show that H. pylori frequently co-exists with other parasites in Andean children and suggest that intermittently H. pylori-positive children might be at a lower risk of parasitic infections than persistently positive children. The relationship of H. pylori infection, helminthic infection and diarrheal disease should be further explored in studies that devote more intensive resources to accurate ascertainment of diarrhea.
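A minimal sketch of the Generalized Estimating Equations portion of the analysis, estimating odds ratios for diarrhea by H. pylori status category with repeated observations clustered within child; the file, variable names, and the reference-category label are illustrative assumptions.

```python
# Illustrative sketch: GEE logistic model for diarrhea by H. pylori status
# category, with repeated follow-up intervals clustered within child.
# Column names (diarrhea, hp_status, age_years, sex, child_id) are assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

obs = pd.read_csv("followup_intervals.csv")  # one row per child per follow-up interval

gee = smf.gee(
    "diarrhea ~ C(hp_status, Treatment('persistently_negative')) + age_years + sex",
    groups="child_id",
    data=obs,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
).fit()

print(gee.summary())
print(np.exp(gee.params))  # odds ratios relative to persistently negative children
```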
Abstract:
Poster presented at the Scientific Toxomics Meeting, 28 September 2015, Lisbon, Portugal.
Abstract:
Thesis (Master's)--University of Washington, 2016-06
Abstract:
Cancer and its treatment can affect many different aspects of quality of life. As a construct measured subjectively, quality of life shows an inconsistent relationship with objective outcome measures. That is, sometimes subjective and objective outcomes correspond with each other and sometimes they show little or no relationship. In this article, we propose a model for the relationship between subjective and objective outcomes using the example of cognitive function in people with cancer. The model and the research findings on which it is based help demonstrate that, in some circumstances, subjective measures of cognitive function correlate more strongly with psychosocial variables such as appraisal, coping, and emotions than with objective cognitive function. The model may provide a useful framework for research and clinical practice in quality of life for people with cancer.
Abstract:
This study examined relations between stress and coping predictors and negative and positive outcomes in MS caregiving. A total of 222 carers and their care-recipients completed questionnaires at Time 1 and three months later, Time 2 (n = 155). Predictors included care-recipient characteristics (age, time since diagnosis, course and life satisfaction), and Times 1 and 2 carer problems, stress appraisal and coping. Dependent variables were Time 2 negative (anxiety, depression) and positive outcomes (life satisfaction, positive affect, benefits). Regressions indicated that, overall, the hypothesised direct effects of stress appraisal and coping strategies on positive and negative outcomes were supported. The hypothesised stress-buffering effects of positive reframing coping were also supported. All but one of the coping strategies were related to both positive and negative outcomes; specifically, practical assistance coping emerged as a unique predictor of distress. Of the model predictors, care-recipient life satisfaction emerged as the strongest and most consistent predictor of both positive and negative outcomes except benefit finding. Findings support the role of care-recipient characteristics and the carer's appraisal and coping processes in shaping both positive and negative outcomes. The guiding framework and findings have the potential to inform interventions designed to promote well-being in carers.
Abstract:
This study examined the direct and stress-buffering effects of benefit finding on positive and negative outcomes. A total of 502 people with multiple sclerosis completed a questionnaire at Time 1 and, 3 months later, at Time 2 (n = 404). Measures of illness were collected at Time 1, and number of problems, stress appraisal, benefit finding, subjective health, and negative (global distress, negative affect) and positive (life satisfaction, positive affect, dyadic adjustment) outcomes were measured at Time 2. Factor analyses showed the Benefit Finding scale to have 2 dimensions: Personal Growth and Family Relations Growth. Hierarchical regressions showed that, after controlling for the effects of demographics, illness, problems, and appraisal, benefit finding had strong direct effects on the positive outcomes. Benefit finding did not have a direct effect on distress or subjective health, but had a weak association with negative affect. Family Relations Growth had a stress-buffering effect on distress.
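As an illustration of how a stress-buffering effect like the one reported for Family Relations Growth is typically tested, the sketch below adds a stress × benefit-finding interaction to a regression on distress; the file and all variable names are hypothetical.

```python
# Illustrative sketch of a stress-buffering test: buffering is indicated by a
# significant stress x benefit-finding interaction once the main effects are in
# the model. Variable names are assumptions, not the study's actual measures.
import pandas as pd
import statsmodels.formula.api as smf

d = pd.read_csv("ms_survey.csv")  # one row per participant

# Step 1: main effects of stress appraisal and benefit finding on distress
step1 = smf.ols("distress_t2 ~ stress_appraisal + family_relations_growth", data=d).fit()

# Step 2: add the product term; a significant negative interaction is consistent
# with benefit finding weakening the stress-distress relationship (buffering)
step2 = smf.ols("distress_t2 ~ stress_appraisal * family_relations_growth", data=d).fit()

print(step1.rsquared, step2.rsquared)  # R-squared change across hierarchical steps
print(step2.summary())
```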