20 results for early and intensive behavioural intervention

in University of Queensland eSpace - Australia


Relevance: 100.00%

Abstract:

Objective. To document symptoms associated with borderline, early and advanced ovarian cancer and to identify personal characteristics associated with early versus late diagnosis. Methods. Information concerning symptoms and diagnosis history was available from 811 women with ovarian cancer who took part in an Australian case–control study in the early 1990s. Women were classified into three groups for comparison based on their diagnosis: borderline, early (stage I–II) and advanced (stage III–IV) invasive cancer. Results. Sixteen percent of women with borderline tumors, 7% with early cancer and 4% with advanced cancer experienced no symptoms before diagnosis (P < 0.0001). Among women with symptoms, abdominal pain (44%) or swelling (39%) was most frequently reported; an abdominal mass (12%) and gynecological symptoms (12%) were less common. Compared with women with advanced stage cancer, women with early stage cancer were more likely to report an abdominal mass or urinary symptoms but less likely to report gastrointestinal problems or general malaise. General malaise and ‘other’ symptoms were least common in borderline disease. Older women, and those with higher parity or a family history of breast or ovarian cancer, were more likely to be diagnosed at an advanced stage of disease. Conclusions. Women who experience persistent or recurrent abdominal symptoms, particularly swelling and/or pain, should be encouraged to seek medical attention, and physicians should be alert to the possibility of ovarian cancer even in the absence of an abdominal mass. Further information about the prevalence of these symptoms in the general population is essential to assist physicians in patient management.

Relevance: 100.00%

Abstract:

Background: Few studies have examined the potential benefits of specialist nurse-led programs of care involving home- and clinic-based follow-up to optimise the post-discharge management of chronic heart failure (CHF). Objective: To determine the effectiveness of a hybrid program of clinic plus home-based intervention (C+HBI) in reducing recurrent hospitalisation in CHF patients. Methods: CHF patients with evidence of left ventricular systolic dysfunction admitted to two hospitals in Northern England were assigned to a C+HBI lasting 6 months post-discharge (n=58) or to usual post-discharge care (UC: n=48) via a cluster randomisation protocol. The co-primary endpoints were death or unplanned readmission (event-free survival) and the rate of recurrent, all-cause readmission within 6 months of hospital discharge. Results: During study follow-up, more UC patients had an unplanned readmission for any cause (44% vs. 22%; P=0.0191; OR 1.95, 95% CI 1.10-3.48), whilst 7 (15%) UC and 5 (9%) C+HBI patients, respectively, died (P=NS). Overall, 15 (26%) C+HBI versus 21 (44%) UC patients experienced a primary endpoint. C+HBI was associated with a non-significant 45% reduction in the risk of death or readmission when adjusting for potential confounders (RR 0.55, 95% CI 0.28-1.08; P=0.08). Overall, C+HBI patients accumulated significantly fewer unplanned readmissions (15 vs. 45: P
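The crude readmission comparison above can be sketched with a standard 2×2 odds-ratio calculation using a Wald confidence interval. This is illustrative only: the counts below are reconstructed from the abstract's percentages, and the reported OR of 1.95 was estimated under a cluster randomisation protocol, so a crude table will not reproduce it exactly.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Wald 95% CI for a 2x2 table:
    a = group 1 with event, b = group 1 without,
    c = group 2 with event, d = group 2 without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Illustrative counts derived from the abstract's percentages:
# UC: 21 of 48 readmitted; C+HBI: 13 of 58 readmitted.
or_, lo, hi = odds_ratio_ci(21, 48 - 21, 13, 58 - 13)
```

The crude estimate lands above 1 (UC patients more likely to be readmitted), with a confidence interval that comfortably contains the adjusted OR reported in the abstract.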

Relevance: 100.00%

Abstract:

Recent multidisciplinary investigations document an independent emergence of agriculture at Kuk Swamp in the highlands of Papua New Guinea. In this paper we report preliminary use-wear analysis and details of the prehistoric use of stone tools for processing starchy food and other plants at Kuk Swamp. Morphological diagnostics for starch granules are reported for two potentially significant economic species, taro (Colocasia esculenta) and yam (Dioscorea sp.), following comparisons between prehistoric and botanical reference specimens. Use-wear and residue analyses of starch granules indicate that both these species were processed on the wetland margin during the early and mid Holocene. We argue that processing of taro and yam commenced by at least 10,200 calibrated years before present (cal BP), although the taro and yam starch granules do not permit us to distinguish between wild and cultivated forms. From at least 6950 to 6440 cal BP the processing of taro, yam and other plants indicates that they are likely to have been integrated into cultivation practices on the wetland edge.

Relevance: 100.00%

Abstract:

Treatment of sepsis remains a significant challenge, with persistently high mortality and morbidity. Early and appropriate antibacterial therapy remains an important intervention for such patients. To optimise antibacterial therapy, the clinician must possess knowledge of the pharmacokinetic and pharmacodynamic properties of commonly used antibacterials and how these parameters may be affected by the constellation of pathophysiological changes occurring during sepsis. Sepsis, and the treatment thereof, increases renal preload and, via capillary permeability, leads to 'third-spacing', both resulting in higher antibacterial clearances. Alternatively, sepsis can induce multiple organ dysfunction, including renal and/or hepatic dysfunction, causing a decrease in antibacterial clearance. Aminoglycosides are concentration-dependent antibacterials and display an increased volume of distribution (Vd) in sepsis, resulting in decreased peak serum concentrations. Reduced clearance from renal dysfunction would increase the likelihood of toxicity. Individualised dosing using extended-interval dosing, which maximises the peak serum drug concentration (Cmax)/minimum inhibitory concentration (MIC) ratio, is recommended. β-Lactams and carbapenems are time-dependent antibacterials. An increase in Vd and renal clearance will require increased dosing or administration by continuous infusion. If renal impairment occurs, a corresponding dose reduction may be required. Vancomycin displays predominantly time-dependent pharmacodynamic properties and probably requires higher than conventionally recommended doses because of an increased Vd and clearance during sepsis without organ dysfunction. However, optimal dosing regimens remain unresolved. The poor penetration of vancomycin into solid organs may require alternative therapies when sepsis involves solid organs (e.g. lung).
Ciprofloxacin displays largely concentration-dependent kill characteristics, but also exerts some time-dependent effects. The Vd of ciprofloxacin is not altered with fluid shifts or over time, and thus no alteration of standard doses is required unless renal dysfunction occurs. To optimise antibacterial regimens in patients with sepsis, the pathophysiological effects of the systemic inflammatory response syndrome need consideration, in conjunction with knowledge of the different kill characteristics of the various antibacterial classes. In conclusion, certain antibacterials can have a very high Vd, leading to a low Cmax; if a high peak concentration is needed, this can result in underdosing. The Vd of certain antibacterials, namely aminoglycosides and vancomycin, changes over time, which means dosing may need to be altered over time. Some patients with serum creatinine values within the normal range can have very high drug clearances, thereby producing low serum drug levels and again leading to underdosing. Copyright © 2010 Elsevier Inc. All rights reserved.
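The relationship between volume of distribution, clearance and peak concentration described above can be sketched with a simple one-compartment pharmacokinetic model. The dose, volume and clearance figures below are hypothetical illustrations, not drug-specific recommendations; the point is that doubling Vd halves Cmax for the same dose, which is the underdosing mechanism the abstract describes.

```python
import math

def cmax_after_bolus(dose_mg, vd_l):
    # Peak concentration after an IV bolus in a one-compartment model:
    # Cmax = dose / Vd
    return dose_mg / vd_l

def conc_at_time(c0, cl_l_per_h, vd_l, t_h):
    # First-order decline: C(t) = C0 * exp(-k*t), where k = CL / Vd
    k = cl_l_per_h / vd_l
    return c0 * math.exp(-k * t_h)

# Hypothetical numbers: the same 400 mg dose in a septic patient whose
# Vd has doubled (20 L -> 40 L) yields half the peak concentration.
normal_peak = cmax_after_bolus(400, 20)   # 20 mg/L
sepsis_peak = cmax_after_bolus(400, 40)   # 10 mg/L
```

For a concentration-dependent drug judged by the Cmax/MIC ratio, the second patient would need a larger dose to reach the same peak; conversely, reduced clearance (renal dysfunction) slows the exponential decline and raises exposure.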

Relevance: 100.00%

Abstract:

Introduction: Extremely premature infants of normal intellectual ability have an increased prevalence of motor and attentional difficulties. Knowledge of the relationship between early motor difficulties and measures of attention at school age would enhance understanding of these developmental pathways, their interrelationship and opportunities for intervention. Objective: This study examines whether an association exists between early findings of minor motor difficulties and school-age clinical and psychometric measures of attention. Methodology: 45 of 60 eligible ELBW (< 1000 g) or preterm (< 27/40 gestation) infants born at the Mater Mothers' Hospital were assessed at 12 and 24 months for minor motor deficits (using the NSMDA) and at 7-9 years for attention, using clinical (Conners and Du Paul rating scales) and psychometric (assessing attention span, selective and divided attention) measures. Results: NSMDA at 12 months was associated only with the psychometric measure of verbal attention span; it was not associated with later clinical measures of attention. NSMDA at 24 months was strongly associated with specific clinical measures of attention at school age, independent of biological and social factors, but was not associated with psychometric measures of attention. Conclusion: The major finding of this study is that motor difficulties in ELBW infants at 2 years are associated with later clinical measures of attention. Possible mechanisms underlying this relationship are considered. Crown Copyright (c) 2005 Published by Elsevier Ireland Ltd. All rights reserved.

Relevance: 100.00%

Abstract:

Aim. The paper presents a study assessing the rate of adoption of a sedation scoring system and sedation guideline. Background. Clinical practice guidelines, including sedation guidelines, have been shown to improve patient outcomes by standardizing care. In particular, sedation guidelines have been shown to be beneficial for intensive care patients by reducing the duration of ventilation. Despite the acceptance that clinical practice guidelines are beneficial, adoption rates are rarely measured. Adoption data may reveal other factors which contribute to improved outcomes. Therefore, the usefulness of the guideline may be more appropriately assessed by collecting adoption data. Method. A quasi-experimental pre-intervention and post-intervention quality improvement design was used. Adoption was operationalized as documentation of the sedation score every 4 hours and use of the sedation and analgesic medications suggested in the guideline. Adoption data were collected from patients' charts on a random day of the month; all patients in the intensive care unit on that day were assigned an adoption category. Sedation scoring system adoption data were collected before implementation of a sedation guideline, which was implemented using an intensive information-giving strategy, and guideline adoption data were fed back to bedside nurses. After implementation of the guideline, adoption data were collected for both the sedation scoring system and the guideline. The data were collected in the years 2002-2004. Findings. The sedation scoring system was not used extensively in the pre-intervention phase of the study; however, this improved in the post-intervention phase. The findings suggest that the sedation guideline was gradually adopted following implementation in the post-intervention phase of the study. Field notes taken during the implementation of the sedation scoring system and the guideline reveal widespread acceptance of both. Conclusion. 
Measurement of adoption is a complex process. Appropriate operationalization contributes to greater accuracy. Further investigation is warranted to establish the intensity and extent of implementation required to positively affect patient outcomes.

Relevance: 100.00%

Abstract:

This paper presents a pilot study of a brief, group-based, cognitive-behavioural intervention for anxiety-disordered children. Five children (aged 7 to 13 years) diagnosed with a clinically significant anxiety disorder were treated with a recently developed 6-session, child-focused, cognitive-behavioural intervention that was evaluated using multiple measures (including structured diagnostic interview, self-report questionnaires and behaviour rating scales completed by parents) over four follow-up occasions (posttreatment, 3-month follow-up, 6-month follow-up and 12-month follow-up). This trial aimed to (a) evaluate the conclusion suggested by the research of Cobham, Dadds, and Spence (1998) that anxious children with non-anxious parents require a child-focused intervention only in order to demonstrate sustained clinical gains; and (b) evaluate a new and more cost-effective child-focused cognitive-behavioural intervention. Unfortunately, the return rate of the questionnaires was poor, rendering this data source of questionable value. However, diagnostic interviews (traditionally the gold standard in terms of outcome in this research area) were completed for all children at all follow-up points. Changes in diagnostic status indicated that meaningful treatment-related gains had been achieved and were maintained over the full follow-up period. The results would thus seem to support the principle of participant-intervention matching proposed by Cobham et al. (1998), as well as the utility of the briefer intervention evaluated.

Relevance: 100.00%

Abstract:

The coexistence of a swallowing impairment can severely impact upon the medical condition and recovery of a child with traumatic brain injury [ref.(1): Journal of Head Trauma Rehabilitation 9 (1) (1994) 43]. Limited data exist on the progression or outcome of dysphagia in the paediatric population with brainstem injury. The present prospective study documents the resolution of dysphagia in a 14-year-old female post-brainstem injury using clinical, radiological and endoscopic evaluations of swallowing. The subject presented with a pattern of severe oral-motor and oropharyngeal swallowing impairment post-injury that resolved rapidly for the initial 12 weeks, slowed to gradual progress for weeks 12-20, and then plateaued at 20 weeks post-injury. Whilst a clinically functional swallow was present at 10 months post-injury, radiological examination revealed a number of residual physiological impairments, reduced swallowing efficiency, and reduced independence for feeding, indicating a potential increased risk for aspiration. The data highlight the need for early and continued evaluation and intensive treatment programs, to focus on the underlying physiological swallowing impairment post-brainstem injury, and to help offset any potential deleterious effects of aspiration that may affect patient recovery, such as pneumonia. (C) 2003 Elsevier Ltd. All rights reserved.

Relevance: 100.00%

Abstract:

Background: Fetal scalp lactate testing has been shown to be as useful as pH testing, with added benefits. One remaining question is: 'What level of lactate should trigger intervention in the first stage of labour?' Aims: This study aimed to establish the lactate level in the first stage of labour that indicates the need for intervention to ensure satisfactory outcomes for both babies and mothers. Methods: A prospective study at the Mater Mothers' Hospital, Brisbane, Australia, a tertiary referral centre. One hundred and forty women in labour with non-reassuring fetal heart rate traces were tested using fetal scalp blood sampling of 5 μL of capillary blood on an Accusport (Boehringer Mannheim, East Sussex, UK) lactate meter. The decision to intervene in labour was based on clinical assessment plus a predetermined cut-off. Main outcome measures were Apgar scores, cord arterial pH, meconium-stained liquor and Intensive Care Nursery admission. Results: Two-graph receiver operating characteristic (TG-ROC) analysis showed that optimal specificity and sensitivity for predicting adverse neonatal outcomes occurred at a scalp lactate level above 4.2 mmol/L. Conclusions: Fetal blood sampling remains the standard for further investigating non-reassuring cardiotocograph (CTG) traces, even though it is a poor predictor of fetal outcomes. Scalp lactate has been shown to be at least as good a predictor as scalp pH, with the advantages of being easier, cheaper and having a lower rate of technical failure. Our study found that a cut-off fetal scalp lactate level of 4.2 mmol/L, in combination with an assessment of the entire clinical picture, is a useful tool in identifying those women who need intervention.
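The cut-off-selection idea behind TG-ROC, scanning candidate thresholds and balancing sensitivity against specificity rather than maximising either alone, can be sketched as follows. The lactate values and outcomes below are toy data invented for illustration, not the study's 140 samples.

```python
def sens_spec_at_cutoff(values, outcomes, cutoff):
    """Sensitivity and specificity of the rule 'lactate > cutoff'
    for predicting an adverse outcome (1 = adverse, 0 = normal)."""
    tp = sum(1 for v, o in zip(values, outcomes) if v > cutoff and o == 1)
    fn = sum(1 for v, o in zip(values, outcomes) if v <= cutoff and o == 1)
    tn = sum(1 for v, o in zip(values, outcomes) if v <= cutoff and o == 0)
    fp = sum(1 for v, o in zip(values, outcomes) if v > cutoff and o == 0)
    sens = tp / (tp + fn) if tp + fn else 0.0
    spec = tn / (tn + fp) if tn + fp else 0.0
    return sens, spec

# Toy data: scalp lactate (mmol/L) and whether the outcome was adverse.
lactate = [2.1, 3.0, 3.9, 4.5, 5.2, 6.0, 2.8, 4.8]
adverse = [0,   0,   0,   1,   1,   1,   0,   1]

# Scan candidate cut-offs; a TG-ROC-style choice is the threshold where
# the worse of (sensitivity, specificity) is highest, i.e. both are high.
best = max((min(sens_spec_at_cutoff(lactate, adverse, c)), c)
           for c in [3.5, 4.0, 4.2, 4.5])
```

On this toy data the scan settles on 4.2 mmol/L, mirroring the shape of the analysis in the abstract; the real TG-ROC method plots the two curves against the threshold and reads off their intersection.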

Relevance: 100.00%

Abstract:

Hydrocephalus is a condition commonly encountered in paediatric and adult neurosurgery, and cerebrospinal fluid (CSF) shunting remains the treatment of choice for many cases. Despite improvements in shunt technology and technique, morbidity and mortality remain. The incidence of early shunt obstruction is high, with later failures seen less frequently. This review aims to examine mortality associated with mechanical failure of CSF shunts within Queensland. Neurosurgical and Intensive Care databases were reviewed for cases of mortality associated with shunt failure. Eight cases were identified between 1992 and 2002, with an average age at death of 7.7 years. Deaths occurred on average 2 years after the last shunt revision. Seven of the eight patients lived outside the metropolitan area. Shunting remains an imperfect means of treating hydrocephalus. Mortality may be encountered at any time post surgery, and delays to surgical intervention influence this. Alternative measures such as third ventriculostomy or the placement of a separate access device should be considered. In the event of emergency, a spinal needle could be used to access the ventricle along the course of the ventricular catheter. (C) 2004 Elsevier Ltd. All rights reserved.