898 results for "Cohort study"


Relevance: 60.00%

Abstract:

In a critical review of the literature assessing the efficacy of monotherapy and subsequent combination anticonvulsant therapy in the treatment of neonatal seizures, four studies were examined: three randomised controlled trials and one retrospective cohort study. Each study used phenobarbitone for monotherapy, with doses reaching a maximum of 40 mg/kg. Anticonvulsant drugs used in conjunction with phenobarbitone for combination therapy included midazolam, clonazepam, lorazepam, phenytoin and lignocaine. Each study used an electroencephalograph for seizure diagnosis and neonatal monitoring when determining therapy efficacy and final outcome assessments. Collectively, the studies suggest that neither monotherapy nor combination therapy is entirely effective in seizure control. Monotherapy demonstrated a 29–50% success rate for complete seizure control, whereas combination therapy administered after the failure of monotherapy demonstrated a success rate of 43–100%. When these trials were combined, the overall success rate for monotherapy was 44% (n = 34/78) and for combination therapy 72% (n = 56/78). Though the evidence was inconclusive, combination therapy appears to be of greater benefit to infants unresponsive to monotherapy. Further research, such as multi-site randomised controlled trials using standardised criteria and data collection, is required within this specialised area.
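As a quick arithmetic check, the pooled success percentages quoted above follow directly from the combined counts (a minimal sketch; only the pooled totals, not the per-trial counts, are given in the abstract):

```python
def pooled_success_pct(successes, total):
    """Pooled success proportion, as a whole-number percentage."""
    return round(100 * successes / total)

# Combined counts across the four studies (n = 78 infants per comparison).
monotherapy = pooled_success_pct(34, 78)   # -> 44
combination = pooled_success_pct(56, 78)   # -> 72
print(monotherapy, combination)
```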

Relevance: 60.00%

Abstract:

Objectives: To assess the validity of the Waterlow screening tool in a cohort of internal medicine patients and to identify factors contributing to pressure injury. Design: Longitudinal cohort study. Setting: A tertiary hospital in Brisbane, Australia. Participants: 274 patients admitted through the Emergency Department or outpatient clinics and expected to remain in hospital for at least three days were included in the study. The mean age was 65.3 years. Interventions: Patients were screened on admission using the Waterlow screening tool. Every second day, their pressure ulcer status was monitored and recorded. Main outcome measures: Pressure ulcer incidence. Results: Fifteen participants (5.5%) had an existing pressure ulcer and a further 12 (4.4%) developed a pressure ulcer during their hospital stay. Sensitivity of the Waterlow scale was 0.67 (95% CI: 0.35 to 0.88); specificity 0.79 (95% CI: 0.73 to 0.85); PPV 0.13 (95% CI: 0.07 to 0.24); NPV 0.98 (95% CI: 0.94 to 0.99). Conclusion: This study provides further evidence of the poor predictive validity of the Waterlow scale. A suitably powered randomised controlled trial is urgently needed to provide definitive evidence about the usefulness of the Waterlow scale compared with other screening tools and with clinical judgement.
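The validity statistics above all derive from a 2 × 2 cross-classification of the Waterlow screen against observed pressure ulcers. A minimal sketch of that arithmetic; the cell counts below are hypothetical (the abstract does not report the full table), chosen only so the point estimates reproduce those quoted:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from 2x2 counts."""
    return {
        "sensitivity": tp / (tp + fn),   # true positives among all with the outcome
        "specificity": tn / (tn + fp),   # true negatives among all without it
        "ppv": tp / (tp + fp),           # probability of outcome given a positive screen
        "npv": tn / (tn + fn),           # probability of no outcome given a negative screen
    }

# Hypothetical counts for illustration only.
m = diagnostic_metrics(tp=8, fp=52, fn=4, tn=196)
print({k: round(v, 2) for k, v in m.items()})
# -> sensitivity 0.67, specificity 0.79, ppv 0.13, npv 0.98
```

The low PPV alongside a high NPV is typical when the outcome is rare (about 4% here): a negative screen is reassuring, but most positive screens are false alarms.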

Relevance: 60.00%

Abstract:

BACKGROUND: The relationship between cigarette smoking and cardiovascular disease is well established, yet the underlying mechanisms remain unclear. Although smokers have a more atherogenic lipid profile, this may be mediated by other lifestyle-related factors. Analysis of lipoprotein subclasses by the use of nuclear magnetic resonance spectroscopy (NMR) may improve characterisation of lipoprotein abnormalities. OBJECTIVE: We used NMR spectroscopy to investigate the relationships between smoking status, lifestyle-related risk factors, and lipoproteins in a contemporary cohort. METHODS: A total of 612 participants (360 women) aged 40–69 years at baseline (1990–1994) enrolled in the Melbourne Collaborative Cohort Study had plasma lipoproteins measured with NMR. Data were analysed separately by sex. RESULTS: After adjusting for lifestyle-related risk factors, including alcohol and dietary intake, physical activity, and weight, mean total low-density lipoprotein (LDL) particle concentration was greater for female smokers than nonsmokers. Both medium- and small-LDL particle concentrations contributed to this difference. Total high-density lipoprotein (HDL) and large-HDL particle concentrations were lower for female smokers than nonsmokers. The proportion with low HDL particle number was greater for female smokers than nonsmokers. For men, there were few smoking-related differences in lipoprotein measures. CONCLUSION: Female smokers have a more atherogenic lipoprotein profile than nonsmokers. This difference is independent of other lifestyle-related risk factors. Lipoprotein profiles did not differ greatly between male smokers and nonsmokers.

Relevance: 60.00%

Abstract:

Goals: Few studies have repeatedly evaluated quality of life and potentially relevant factors in patients with benign primary brain tumor. The purpose of this study was to explore the relationships between symptom distress, functional status, depression, and quality of life prior to surgery (T1) and 1 month post-discharge (T2).

Patients and methods: This was a prospective cohort study of 58 patients with benign primary brain tumor in one teaching hospital in the Taipei area of Taiwan. The research instruments included the M.D. Anderson Symptom Inventory, the Functional Independence Measure scale, the Hospital Depression Scale, and the Functional Assessment of Cancer Therapy-Brain.

Results: Symptom distress (T1: r = −0.90, p < 0.01; T2: r = −0.52, p < 0.01), functional status (T1: r = 0.56, p < 0.01), and depression (T1: r = −0.71, p < 0.01) demonstrated significant relationships with patients' quality of life. Multivariate analysis identified that symptom distress (explaining 80.2% of variance, R²inc = 0.802, p = 0.001) and depression (explaining 5.2%, R²inc = 0.052, p < 0.001) had significant independent influences on quality of life prior to surgery (T1) after controlling for key demographic and medical variables. Furthermore, only symptom distress (explaining 27.1%, R²inc = 0.271, p = 0.001) continued to have a significant independent influence on quality of life at 1 month after discharge (T2).

Conclusions: The study highlights the potential importance of a patient's symptom distress on quality of life prior to and following surgery. Health professionals should inquire about symptom distress over time. Specific interventions for symptoms may reduce the impact of symptoms on quality of life. Additional studies should evaluate symptom distress and longer-term quality of life in patients with benign brain tumor.

Relevance: 60.00%

Abstract:

Evidence that our food environment can affect meal size is often taken to indicate a failure of ‘conscious control’. By contrast, our research suggests that ‘expected satiation’ (the fullness a food is expected to confer) predicts self-selected meal size. However, the role of meal planning as a determinant of actual meal size remains unresolved, as does the extent to which meal planning is commonplace outside the laboratory. Here, we quantified meal planning and its relation to meal size in a large-cohort study. Participants (N = 764; mean age 25.6 years; 78% female) completed a questionnaire containing items relating to their last meal. The majority (91%) of meals were consumed in their entirety. Furthermore, in 92% of these cases the participants had decided to consume the whole meal before it began. A second major objective was to explore the prospect that meal plans are revised based on within-meal experience (e.g., the development of satiation). Only 8% of participants reported ‘unexpected’ satiation that caused them to consume less than anticipated. Moreover, at the end of the meal 57% indicated that they were not fully satiated, and 29% continued eating beyond comfortable satiation (often to avoid wasting food). This pattern was moderated neither by BMI nor by dieting status, and was observed across meal types. Together, these data indicate that meals are often planned and that planning corresponds closely with the amount consumed. By contrast, we find limited evidence for within-meal modification of these plans, suggesting that ‘pre-meal cognition’ is an important determinant of meal size in humans.

Relevance: 60.00%

Abstract:

Lately, there has been increasing interest in the association between temperature and adverse birth outcomes including preterm birth (PTB) and stillbirth. PTB is a major predictor of many diseases later in life, and stillbirth is a devastating event for parents and families. The aim of this study was to assess the seasonal pattern of adverse birth outcomes, and to examine possible associations of maternal exposure to temperature with PTB and stillbirth. We also aimed to identify if there were any periods of the pregnancy where exposure to temperature was particularly harmful. A retrospective cohort study design was used and we retrieved individual birth records from the Queensland Health Perinatal Data Collection Unit for all singleton births (excluding twins and triplets) delivered in Brisbane between 1 July 2005 and 30 June 2009. We obtained weather data (including hourly relative humidity, minimum and maximum temperature) and air-pollution data (including PM10, SO2 and O3) from the Queensland Department of Environment and Resource Management. We used survival analyses with the time-dependent variables of temperature, humidity and air pollution, and the competing risks of stillbirth and live birth. To assess the monthly pattern of the birth outcomes, we fitted month of pregnancy as a time-dependent variable. We examined the seasonal pattern of the birth outcomes and the relationship between exposure to high or low temperatures and birth outcomes over the four lag weeks before birth. We further stratified by categorisation of PTB: extreme PTB (< 28 weeks of gestation), PTB (28–36 weeks of gestation), and term birth (≥ 37 weeks of gestation). Lastly, we examined the effect of temperature variation in each week of the pregnancy on birth outcomes. There was a bimodal seasonal pattern in gestation length. After adjusting for temperature, the seasonal pattern changed from bimodal, to only one peak in winter. 
The risk of stillbirth was statistically significantly lower in March compared with January. After adjusting for temperature, the March trough remained statistically significant and there was a peak in risk (not statistically significant) in winter. There was an acute effect of temperature on gestational age and stillbirth, with shortened gestation as temperature increased from 15 °C to 25 °C over the last four weeks before birth. For stillbirth, we found an increasing risk with increasing temperatures from 12 °C to approximately 20 °C, and no change in risk at temperatures above 20 °C. Certain periods of the pregnancy were more vulnerable to temperature variation. The risk of PTB (28–36 weeks of gestation) increased as temperatures increased above 21 °C. For stillbirth, the fetus was most vulnerable at less than 28 weeks of gestation, but there were also effects at 28–36 weeks of gestation. For fetuses of 37 or more weeks of gestation, increasing temperatures did not increase the risk of stillbirth. We did not find any adverse effects of cold temperature on birth outcomes in this cohort. These findings contribute to knowledge of the relationship between temperature and birth outcomes, which is particularly important in the context of climate change. The results may have implications for public health policy and planning, as they indicate that pregnant women could decrease their risk of adverse birth outcomes by avoiding exposure to high temperatures and seeking cool environments during hot days.

Relevance: 60.00%

Abstract:

Background: Known risk factors for secondary lymphedema only partially explain who develops lymphedema following cancer, suggesting that inherited genetic susceptibility may influence risk. Moreover, identification of molecular signatures could facilitate lymphedema risk prediction prior to surgery or lead to effective drug therapies for prevention or treatment. Recent advances in the molecular biology underlying development of the lymphatic system and related congenital disorders implicate a number of potential candidate genes to explore in relation to secondary lymphedema. Methods and Results: We undertook a nested case-control study, with participants who had developed lymphedema after surgical intervention within the first 18 months of their breast cancer diagnosis serving as cases (n=22) and those without lymphedema serving as controls (n=98), identified from a prospective, population-based, cohort study in Queensland, Australia. TagSNPs that covered all known genetic variation in the genes SOX18, VEGFC, VEGFD, VEGFR2, VEGFR3, RORC, FOXC2, LYVE1, ADM and PROX1 were selected for genotyping. Multiple SNPs within three receptor genes, VEGFR2, VEGFR3 and RORC, were associated with lymphedema defined by statistical significance (p<0.05) or extreme risk estimates (OR<0.5 or >2.0). Conclusions: These provocative, albeit preliminary, findings regarding possible genetic predisposition to secondary lymphedema following breast cancer treatment warrant further attention for potential replication using larger datasets.

Relevance: 60.00%

Abstract:

Introduction: The suitability of video conferencing (VC) technology for clinical purposes relevant to geriatric medicine is still being established. This project aimed to determine the validity of the diagnosis of dementia via VC. Methods: This was a multisite, noninferiority, prospective cohort study. Patients aged 50 years and older, referred by their primary care physician for cognitive assessment, were assessed at 4 memory disorder clinics. All patients were assessed independently by 2 specialist physicians. They were allocated one face-to-face (FTF) assessment (reference standard, usual clinical practice) and an additional assessment (either a usual FTF assessment or a VC assessment) on the same day. Each specialist physician had access to the patient chart and the results of a battery of standardized cognitive assessments administered FTF by the clinic nurse. Percentage agreement (P0) and the weighted kappa statistic with linear weights (Kw) were used to assess inter-rater reliability across the 2 study groups on the diagnosis of dementia (cognition normal, impaired, or demented). Results: The 205 patients were allocated to groups: videoconference (n = 100) or standard practice (n = 105); 106 were men. The mean age was 76 years (SD 9; range 51–95) and the mean Standardized Mini-Mental State Examination score was 23.9 (SD 4.7; range 9–30). Agreement for the videoconference group (P0 = 0.71; Kw = 0.52; P < .0001) and agreement for the standard practice group (P0 = 0.70; Kw = 0.50; P < .0001) were both statistically significant (P < .05). The summary kappa statistic of 0.51 (P = .84) indicated that VC was not inferior to FTF assessment. Conclusions: Previous studies have shown that preliminary standardized assessment tools can be reliably administered and scored via VC. This study focused on the geriatric assessment component of the interview (interpretation of standardized assessments, taking a history, and formulating a diagnosis by a medical specialist) and identified high levels of agreement for diagnosing dementia. A model of service incorporating either locally or remotely administered standardized assessments, together with remote specialist assessment, is a reliable process for enabling the diagnosis of dementia for isolated older adults.
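The linearly weighted kappa (Kw) used above generalises simple percentage agreement by penalising disagreements in proportion to their distance on the ordinal scale (normal / impaired / demented). A minimal stdlib-only sketch with made-up ratings, not the study's data:

```python
def weighted_kappa(r1, r2, k):
    """Linearly weighted kappa for two raters on an ordinal scale of k categories (0..k-1)."""
    n = len(r1)
    # Observed joint distribution and the two raters' marginal distributions.
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(r1, r2):
        obs[a][b] += 1 / n
    p1 = [sum(obs[i][j] for j in range(k)) for i in range(k)]
    p2 = [sum(obs[i][j] for i in range(k)) for j in range(k)]
    # Linear disagreement weights: 0 on the diagonal, growing with distance.
    w = [[abs(i - j) / (k - 1) for j in range(k)] for i in range(k)]
    d_obs = sum(w[i][j] * obs[i][j] for i in range(k) for j in range(k))
    d_exp = sum(w[i][j] * p1[i] * p2[j] for i in range(k) for j in range(k))
    return 1 - d_obs / d_exp

# Hypothetical ratings: 0 = normal, 1 = impaired, 2 = demented.
ftf = [0, 0, 1, 1, 2, 2, 2, 1, 0, 2]
vc  = [0, 1, 1, 1, 2, 2, 1, 1, 0, 2]
print(round(weighted_kappa(ftf, vc, 3), 2))   # -> 0.76
```

With linear weights, an "impaired vs demented" disagreement counts half as much as a "normal vs demented" one, which is why Kw here can exceed the raw agreement P0.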

Relevance: 60.00%

Abstract:

Background & aims: The confounding effect of disease on the outcomes of malnutrition using diagnosis-related groups (DRG) has never been studied in a multidisciplinary setting. This study aims to determine the prevalence of malnutrition in a tertiary hospital in Singapore and its impact on hospitalization outcomes and costs, controlling for DRG. Methods: This prospective cohort study included a matched case–control study. Subjective Global Assessment was used to assess the nutritional status on admission of 818 adults. Hospitalization outcomes over 3 years were adjusted for gender, age and ethnicity, and matched for DRG. Results: Malnourished patients (29%) had longer hospital stays (6.9 ± 7.3 days vs. 4.6 ± 5.6 days, p < 0.001) and were more likely to be readmitted within 15 days (adjusted relative risk = 1.9, 95% CI 1.1–3.2, p = 0.025). Within a DRG, the mean difference between the actual cost of hospitalization and the average cost was greater for malnourished than for well-nourished patients (p = 0.014). Mortality was higher in malnourished patients at 1 year (34% vs. 4.1%), 2 years (42.6% vs. 6.7%) and 3 years (48.5% vs. 9.9%); p < 0.001 for all. Overall, malnutrition was a significant predictor of mortality (adjusted hazard ratio = 4.4, 95% CI 3.3–6.0, p < 0.001). Conclusions: Malnutrition was evident in up to one third of the inpatients and led to poor hospitalization outcomes and survival, as well as increased costs of care, even after matching for DRG. Strategies to prevent and treat malnutrition in hospital and post-discharge are needed.

Relevance: 60.00%

Abstract:

Purpose: The aim of this study was to examine whether older people are prepared to engage in appropriate falls prevention strategies after discharge from hospital. Design and Methods: We used a semi-structured interview to survey older patients about to be discharged from hospital and examined their knowledge regarding falls prevention strategies to utilize in the post-discharge period. The study was part of a prospective cohort study, nested within a larger, randomized controlled trial. Participants (n = 333) were asked to suggest strategies to reduce their falls risk at home after discharge, and their responses were compared with current reported research evidence for falls prevention interventions. Results: Participants’ strategies (n = 629) were classified into 7 categories: behavioral, support while mobilizing, approach to movement, physical environment, visual, medical, and activities or exercise. Although exercise has been identified as an effective falls risk reduction strategy, only 2.9% of participants suggested engaging in exercises. Falls prevention was most often conceptualized by participants as requiring 1 (35.4%) or 2 (40.8%) strategies for avoiding an accidental event, rather than engaging in sustained, multiple risk reduction behaviors. Implications: Results demonstrate that older patients have low levels of knowledge about appropriate falls prevention strategies that could be used after discharge, in spite of their increased falls risk during this period. Findings suggest that health care workers should design and deliver falls prevention education programs specifically targeted to older people who are to be discharged from hospital.

Relevance: 60.00%

Abstract:

Objectives: Malnutrition is common in older hospitalised patients, and barriers to adequate intake in hospital limit the effectiveness of hospital-based nutrition interventions. This pilot study was undertaken to determine whether nutrition-focussed care at discharge and in the early post-hospital period is feasible and acceptable to patients and carers, and improves nutritional status. Design: Prospective cohort study. Setting: Internal medicine wards of a tertiary teaching hospital in Brisbane, Australia. Participants: Patients aged 65 and older, admitted for at least 3 days, identified as malnourished or at risk of malnutrition using the Mini Nutritional Assessment (MNA). Interventions: An interdisciplinary discharge team (specialist discharge planning nurse and accredited practicing dietitian) provided nutrition-focussed education, advice, service coordination and follow-up (home visits and telephone) for 6 weeks following hospitalisation. Measurements: Nutritional intake, weight, functional status and MNA were recorded 6 and 12 weeks after discharge. Service intensity and changes to care were noted, and hospital readmissions recorded. Service feedback from patients and carers was sought using a brief questionnaire. Results: 12 participants were enrolled during the 6-week pilot (mean age 82 years, 50% male). All received 1–2 home visits and 3–8 telephone calls. Four participants had new community services arranged, 4 were commenced on oral nutritional supplements, and 7 were referred to community dietetics services for follow-up. Two participants had a decline in MNA score of more than 10% at 12-week follow-up, while the remainder improved by at least 10%. Individualised care, including community service coordination, was valued by participants. Conclusion: The proposed model of care for older adults was feasible, acceptable to patients and carers, and associated with improved nutritional status at 12 weeks for most participants. The pilot data will be useful for the design of intervention trials.

Relevance: 60.00%

Abstract:

Objective: Although several validated nutritional screening tools have been developed to “triage” inpatients for malnutrition diagnosis and intervention, there continues to be debate in the literature as to which tool(s) clinicians should use in practice. This study compared the accuracy of seven validated screening tools in older medical inpatients against two validated nutritional assessment methods. Methods: This was a prospective cohort study of medical inpatients at least 65 y old. Malnutrition screening was conducted using seven tools recommended in evidence-based guidelines. Nutritional status was assessed by an accredited practicing dietitian using the Subjective Global Assessment (SGA) and the Mini-Nutritional Assessment (MNA). Energy intake was observed on a single day during the first week of hospitalization. Results: In this sample of 134 participants (80 ± 8 y old, 50% women), there was fair agreement between the SGA and MNA (κ = 0.53), with the MNA identifying more “at-risk” patients and the SGA better identifying existing malnutrition. Most tools were accurate in identifying patients with malnutrition as determined by the SGA, in particular the Malnutrition Screening Tool and the Nutritional Risk Screening 2002. The MNA Short Form was most accurate at identifying nutritional risk according to the MNA. No tool accurately predicted patients with inadequate energy intake in the hospital. Conclusion: Because all tools generally performed well, clinicians should consider choosing a screening tool that best aligns with their chosen nutritional assessment method and is easiest to implement in practice. This study confirmed the importance of rescreening and monitoring food intake to allow the early identification and prevention of nutritional decline in patients with a poor intake during hospitalization.

Relevance: 60.00%

Abstract:

Background: Tenofovir has been associated with renal phosphate wasting, reduced bone mineral density, and higher parathyroid hormone levels. The aim of this study was to carry out a detailed comparison of the effects of tenofovir versus non-tenofovir use on calcium, phosphate, vitamin D, parathyroid hormone (PTH), and bone mineral density. Methods: A cohort study of 56 HIV-1-infected adults on stable antiretroviral regimens at a single centre in the UK, comparing biochemical and bone mineral density parameters between patients receiving either tenofovir or another nucleoside reverse transcriptase inhibitor. Principal findings: In the unadjusted analysis, there was no significant difference between the two groups in PTH levels (tenofovir: mean 5.9 pmol/L, 95% confidence interval 5.0 to 6.8; non-tenofovir: 5.9, 4.9 to 6.9; p = 0.98). Patients on tenofovir had significantly reduced urinary calcium excretion (median 3.01 mmol/24 hours) compared with non-tenofovir users (4.56; p < 0.0001). Stratification of the analysis by age and ethnicity revealed that non-white men, but not women, on tenofovir had higher PTH levels than non-white men not on tenofovir (mean difference 3.1 pmol/L, 95% CI 0.9 to 5.3; p = 0.007). Those patients with optimal 25-hydroxyvitamin D (>75 nmol/L) on tenofovir had higher 1,25-dihydroxyvitamin D [1,25(OH)2D] (median 48 pg/mL versus 31; p = 0.012) and fractional excretion of phosphate (median 26.1% versus 14.6; p = 0.025), and lower serum phosphate (median 0.79 mmol/L versus 1.02; p = 0.040), than those not taking tenofovir. Conclusions: The effects of tenofovir on PTH levels were modified by sex and ethnicity in this cohort. Vitamin D status also modified the effects of tenofovir on serum concentrations of 1,25(OH)2D and phosphate.

Relevance: 60.00%

Abstract:

Objective: To investigate the mental and general health of infertile women who had not sought medical advice for their recognized infertility and were therefore not represented in clinical populations. Design: Longitudinal cohort study. Setting: Population based. Patient(s): Participants in the Australian Longitudinal Study on Women's Health aged 28–33 years in 2006 who had ever tried to conceive or had been pregnant (n = 5,936). Intervention(s): None. Main Outcome Measure(s): Infertility, not seeking medical advice. Result(s): Compared with fertile women (n = 4,905), infertile women (n = 1,031) had higher odds of self-reported depression (odds ratio [OR] 1.20, 95% confidence interval [CI] 1.01–1.43), endometriosis (5.43, 4.01–7.36), polycystic ovary syndrome (9.52, 7.30–12.41), irregular periods (1.99, 1.68–2.36), type II diabetes (4.70, 1.79–12.37), or gestational diabetes (1.66, 1.12–2.46). Compared with infertile women who sought medical advice (n = 728), those who had not sought medical advice (n = 303) had higher odds of self-reported depression (1.67, 1.18–2.37), other mental health problems (3.14, 1.14–8.64), urinary tract infections (1.67, 1.12–2.49), heavy periods (1.63, 1.16–2.29), or a cancer diagnosis (11.33, 2.57–49.89). Infertile women who had or had not sought medical advice had similar odds of reporting an anxiety disorder or anxiety-related symptoms. Conclusion(s): Women with self-reported depression were unlikely to have sought medical advice for infertility. Depression and depressive symptoms may be barriers to seeking medical advice for infertility.
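The odds ratios and confidence intervals reported above come from 2 × 2 tables of exposure against outcome; on the log scale, the Wald standard error is the square root of the summed reciprocal cell counts. A minimal sketch of the unadjusted formula; the counts below are hypothetical (the study's ORs are modelled estimates, so a crude table would not reproduce them exactly):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table.

    a, b: exposed with / without the outcome
    c, d: unexposed with / without the outcome
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts for illustration only (not taken from the study).
or_, lo, hi = odds_ratio_ci(a=40, b=263, c=300, d=4605)
print(round(or_, 2), round(lo, 2), round(hi, 2))   # -> 2.33 1.64 3.32
```

Note how small exposed-group counts (here a = 40) dominate the standard error, which is why rare outcomes such as the cancer diagnosis above carry very wide intervals like 2.57–49.89.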

Relevance: 60.00%

Abstract:

Background and aims: Lower-limb lymphoedema is a serious and feared sequela of treatment for gynaecological cancer. Given the limited prospective data on the incidence of, and risk factors for, lymphoedema after treatment for gynaecological cancer, we initiated a prospective cohort study in 2008. Methods: Data were available for 353 women with malignant disease. Participants were assessed before treatment (time 0) and at regular intervals after treatment for two years. Follow-up visits were grouped into time-periods of six weeks to six months (time 1), nine months to 15 months (time 2), and 18 months to 24 months (time 3). Preliminary data analyses were undertaken up to time 2 using generalised estimating equations (with the best-fitting covariance structure) to model the repeated-measures data of Functional Assessment of Cancer Therapy-General (FACT-G) quality of life (QoL) scores and self-reported swelling at each follow-up period. Results: Depending on the time-period, between 30% and 40% of patients self-reported swelling of the lower limb. The QoL of those with self-reported swelling was lower at all time-periods compared with those who did not have swelling. Mean (95% CI) FACT-G scores at times 0, 1 and 2 were 80.7 (78.2, 83.2), 83.0 (81.0, 85.0) and 86.3 (84.2, 88.4), respectively, for those with swelling, and 85.0 (83.0, 86.9), 86.0 (84.1, 88.0) and 88.9 (87.0, 90.7), respectively, for those without swelling. Conclusions: Lower-limb swelling adversely influences QoL and change in QoL over time in patients with gynaecological cancer.