962 results for FCE LTER Mid-term Review
Abstract:
AIMS The aim of this narrative review of the literature was to examine the current state of knowledge regarding the impact of aggressive surgical interventions for severe stroke on patient and caregiver quality of life and caregiver outcomes. BACKGROUND Decompressive hemicraniectomy (DHC) is a surgical therapeutic option for treatment of massive middle cerebral artery (MCA) infarction, lobar intracerebral hemorrhage (ICH), and severe aneurysmal subarachnoid hemorrhage (aSAH). Decompressive hemicraniectomy has been shown to be effective in reducing mortality in these three life-threatening conditions. Significant functional impairment is an experience common to many severe stroke survivors worldwide, and close relatives experience decision-making difficulty when confronted with making life-or-death choices related to surgical intervention for severe stroke. DATA SOURCES Academic Search Premier, Cumulative Index to Nursing and Allied Health Literature (CINAHL), Medline, and PsycINFO. REVIEW METHODS A narrative review methodology was utilized in this review of the literature related to long-term outcomes following decompressive hemicraniectomy for stroke. The keywords decompressive hemicraniectomy, severe stroke, middle cerebral artery stroke, subarachnoid hemorrhage, lobar ICH, intracerebral hemorrhage, quality of life, caregivers, and literature review were combined to search the databases. RESULTS Good functional outcomes following DHC for life-threatening stroke have been shown to be associated with younger age and few co-morbid conditions. It was also apparent that quality of life was reduced for many stroke survivors, although it was not assessed routinely in studies. Caregiver burden has not been systematically studied in this population. CONCLUSION Most patients and caregivers in the studies reviewed agreed with the original decision to undergo DHC and would make the same decision again. However, little is known about quality of life for both patients and caregivers, or about caregiver burden, over the long term post-surgery. Further research is needed to generate information and interventions for the management of ongoing patient and carer recovery following DHC for severe stroke.
Abstract:
Cued recall and item recognition are considered the standard episodic memory retrieval tasks. However, only the neural correlates of the latter have been studied in detail with fMRI. Using an event-related fMRI experimental design that permits spoken responses, we tested hypotheses from an auto-associative model of cued recall and item recognition [Chappell, M., & Humphreys, M. S. (1994). An auto-associative neural network for sparse representations: Analysis and application to models of recognition and cued recall. Psychological Review, 101, 103-128]. In brief, the model assumes that cues elicit a network of phonological short term memory (STM) and semantic long term memory (LTM) representations distributed throughout the neocortex as patterns of sparse activations. This information is transferred to the hippocampus which converges upon the item closest to a stored pattern and outputs a response. Word pairs were learned from a study list, with one member of the pair serving as the cue at test. Unstudied words were also intermingled at test in order to provide an analogue of yes/no recognition tasks. Compared to incorrectly rejected studied items (misses) and correctly rejected (CR) unstudied items, correctly recalled items (hits) elicited increased responses in the left hippocampus and neocortical regions including the left inferior prefrontal cortex (LIPC), left mid lateral temporal cortex and inferior parietal cortex, consistent with predictions from the model. This network was very similar to that observed in yes/no recognition studies, supporting proposals that cued recall and item recognition involve common rather than separate mechanisms.
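As a rough illustration of the retrieval step this model assumes (a cue-driven pattern converging onto the closest stored item), the sketch below implements a minimal Hopfield-style auto-associative memory. It is not the Chappell and Humphreys (1994) implementation; the pattern size, the +/-1 coding and the outer-product learning rule are simplifying assumptions chosen only to show how a noisy or partial cue can settle onto a stored pattern.

```python
import numpy as np

# Minimal Hopfield-style auto-associative memory: store item patterns,
# then let a degraded cue converge to the nearest stored pattern.
# Dimensions, +/-1 coding and learning rule are illustrative assumptions.

rng = np.random.default_rng(0)
n_units, n_items = 64, 5

# Store random +/-1 "item" patterns with the outer-product (Hebbian) rule.
patterns = rng.choice([-1, 1], size=(n_items, n_units))
W = (patterns.T @ patterns) / n_units
np.fill_diagonal(W, 0)  # no self-connections

def recall(cue, steps=20):
    """Iteratively update units until the state stops changing."""
    state = cue.copy()
    for _ in range(steps):
        new_state = np.sign(W @ state)
        new_state[new_state == 0] = 1
        if np.array_equal(new_state, state):
            break
        state = new_state
    return state

# Degrade one stored item to act as the cue (flip 15% of its units).
target = patterns[2]
cue = target.copy()
flip = rng.choice(n_units, size=int(0.15 * n_units), replace=False)
cue[flip] *= -1

output = recall(cue)
print("overlap with stored item:", int(output @ target), "of", n_units)
```

With the flipped units restored by the update dynamics, the overlap typically returns to the full pattern length, which is the sense in which the network "converges upon the item closest to a stored pattern."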
Abstract:
Objective High utilisation of the emergency department (ED) among the elderly is of worldwide concern. This study aims to review the effectiveness of interventions targeting the elderly population in reducing ED utilisation. Methods Major biomedical databases were searched for relevant studies. A qualitative approach was applied to derive common themes in the myriad interventions and to critically assess the variations influencing the interventions' effectiveness. Quality of studies was appraised using the Effective Public Health Practice Project (EPHPP) tool. Results 36 studies were included. Nine of 16 community-based interventions reported significant reductions in ED utilisation. Five of 20 hospital-based interventions proved effective while another four demonstrated failure. Seven key elements were identified. Ten of 14 interventions associated with a significant reduction in ED use integrated at least three of the seven elements. All four interventions with significant negative results lacked five or more of the seven elements. Some key elements, including a multidisciplinary team and integrated primary and social care, often existed in effective interventions but were absent in all significantly ineffective ones. Conclusions The investigated interventions have mixed effectiveness. Our findings suggest the hospital-based interventions have relatively poorer effects and should be better connected to the community-based strategies. Interventions seem to achieve the most success with integration of multi-layered elements, especially when incorporating key elements such as a nurse-led multidisciplinary team, integrated social care, and strong linkages to longer-term primary and community care. Notwithstanding limitations in generalising the findings, this review builds on the growing body of evidence in this particular area.
Abstract:
Objective: To evaluate the effects of exercise on cancer-related lymphedema and related symptoms, and to determine the need for those with lymphedema to wear compression during exercise. Data Sources: CINAHL, Cochrane, Ebscohost, MEDLINE, Pubmed, ProQuest Health and Medical Complete, ProQuest Nursing and Allied Health Source, Science Direct and SPORTDiscus databases were searched for trials published prior to 1 January, 2015. Study Selection: Randomised and non-randomised controlled trials, and single-group pre-post studies, published in English were included. Twenty-one (exercise) and four (compression and exercise) studies met inclusion criteria. Data Extraction: Data was extracted into tabular format using predefined data fields by one reviewer and assessed for accuracy by a second reviewer. Study quality was evaluated using the Effective Public Health Practice Project assessment tool. Data Synthesis: Data was pooled using a random effects model to assess the effects of acute and long-term exercise on lymphedema and lymphedema-associated symptoms, with subgroup analyses for exercise mode and intervention length. There was no effect of exercise (acute or intervention) on lymphedema or associated symptoms, with standardised mean differences from all analyses ranging between −0.2 and 0.1 (p-values ≥0.22). Findings from subgroup analyses for exercise mode (aerobic, resistance, mixed, other) and intervention duration (>12 weeks or ≤12 weeks) were consistent with these findings; that is, no effect on lymphedema or associated symptoms. There were too few studies evaluating the effect of compression during regular exercise to conduct a meta-analysis. Conclusions: Individuals with secondary lymphedema can safely participate in progressive, regular exercise without experiencing a worsening of lymphedema or related symptoms. However, the results also do not suggest any improvements will occur in lymphedema. At present, there is insufficient evidence to support or refute the current clinical recommendation to wear compression garments during regular exercise.
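For readers unfamiliar with the pooling step described above, the following sketch shows a generic DerSimonian-Laird random-effects combination of standardised mean differences (SMDs). The effect sizes and variances are invented placeholders, not values extracted from this review; the example only illustrates the kind of calculation that sits behind a pooled SMD and its 95% confidence interval.

```python
import numpy as np

# Generic DerSimonian-Laird random-effects pooling of standardised mean
# differences. The inputs below are hypothetical placeholders, not data
# from the lymphedema review.

smd = np.array([-0.15, 0.05, 0.10, -0.02])  # per-study SMDs (hypothetical)
var = np.array([0.04, 0.06, 0.05, 0.03])    # per-study variances (hypothetical)

# Fixed-effect weights and pooled estimate (needed for the heterogeneity step).
w_fixed = 1.0 / var
pooled_fixed = np.sum(w_fixed * smd) / np.sum(w_fixed)

# Between-study heterogeneity (tau^2) via the DerSimonian-Laird estimator.
q = np.sum(w_fixed * (smd - pooled_fixed) ** 2)
df = len(smd) - 1
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (q - df) / c)

# Random-effects weights, pooled SMD and 95% confidence interval.
w_rand = 1.0 / (var + tau2)
pooled = np.sum(w_rand * smd) / np.sum(w_rand)
se = np.sqrt(1.0 / np.sum(w_rand))
print(f"pooled SMD = {pooled:.2f} "
      f"(95% CI {pooled - 1.96 * se:.2f} to {pooled + 1.96 * se:.2f})")
```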
Abstract:
Introduction Axillary web syndrome (AWS) can result in early post-operative and long-term difficulties following lymphadenectomy for cancer and should be recognised by clinicians. This systematic review was conducted to synthesise information on AWS clinical presentation and diagnosis, frequency, natural progression, grading, pathoaetiology, risk factors, symptoms, interventions and outcomes. Methods Electronic searches were conducted using Cochrane, Pubmed, MEDLINE, CINAHL, EMBASE, AMED, PEDro and Google Scholar until June 2013. The methodological quality of included studies was determined using the Downs and Black checklist. Narrative synthesis of results was undertaken. Results Thirty-seven studies with methodological quality scores ranging from 11 to 26 on a 28-point scale were included. AWS diagnosis relies on inspection and palpation; grading has not been validated. AWS frequency was reported in up to 85.4% of patients. Biopsies identified venous and lymphatic pathoaetiology, with five studies suggesting lymphatic involvement. Twenty-one studies reported AWS occurrence within eight post-operative weeks, but late occurrence of greater than 3 months is possible. Pain was commonly reported, with shoulder abduction more restricted than flexion. AWS symptoms usually resolve within 3 months but may persist. Risk factors may include extensiveness of surgery, younger age, lower body mass index, ethnicity and healing complications. Low-quality studies suggest that conservative approaches including analgesics, non-steroidal anti-inflammatory drugs and/or physiotherapy may be safe and effective for early symptom reduction. Conclusions AWS appears common. Current evidence for the treatment of AWS is insufficient to provide clear guidance for clinical practice. Implications for Cancer Survivors Cancer survivors should be informed about AWS. Further investigation of pathoaetiology and long-term outcomes is needed, as is research to determine effective treatment using standardised outcomes.
Abstract:
Placenta is a readily accessible translationally advantageous source of mesenchymal stem/stromal cells (MSCs) currently used in cryobanking and clinical trials. MSCs cultured from human chorion have been widely assumed to be fetal in origin, despite evidence that placental MSCs may be contaminated with maternal cells, resulting in entirely maternally derived MSC cultures. To document the frequency and determinants of maternal cell contamination in chorionic MSCs, we undertook a PRISMA-compliant systematic review of publications in the PubMed, Medline, and Embase databases (January 2000 to July 2013) on placental and/or chorionic MSCs from uncomplicated pregnancies. Of 147 studies, only 26 (18%) investigated fetal and/or maternal cell origin. After excluding studies that did not satisfy minimal MSC criteria, 7 of 15 informative studies documented MSC cultures as entirely fetal, a further 7 studies reported cultured human chorionic MSC populations to be either maternal (n=6) or mixed (n=1), whereas 1 study separately cultured pure fetal and pure maternal MSC from the same placenta. Maternal cell contamination was associated with term and chorionic membrane samples and greater passage number but was still present in 30% of studies of chorionic villous MSCs. Although most studies assume fetal origin for MSCs sourced from chorion, this systematic review documents a high incidence of maternal-origin MSC populations in placental MSC cultures. Given that fetal MSCs have more primitive properties than adult MSCs, our findings have implications for clinical trials in which knowledge of donor and tissue source is pivotal. We recommend sensitive methods to quantitate the source and purity of placental MSCs.
Abstract:
Objectives: To describe longitudinal height, weight, and body mass index changes up to 15 years after childhood liver transplantation. Study design: A retrospective chart review of patients who underwent liver transplant from 1985 to 2004 was performed. Subjects were age <18 years at transplant, survived ≥5 years, with at least 2 recorded measurements, of which one was ≥5 years post-transplant. Measurements were recorded pre-transplant and 1, 5, 10, and 15 years later. Results: Height and weight data were available in 98 and 104 patients, respectively; 47% were age <2 years at transplant; 58% were Australian, and the rest were from Japan. Height recovery continued for at least 10 years to reach the 26th percentile (Z-score -0.67) 15 years after transplant. Australians had better growth recovery and attained the 47th percentile (Z-score -0.06) at 15 years. Weight recovery was most marked in the first year and continued for 15 years, even in well-nourished children. Children who were growth impaired and malnourished at transplant exhibited the best growth, but remained significantly shorter and lighter even 15 years later. No effect of sex or age at transplant was noted on height or weight recovery. Post-transplant factors significantly impact growth recovery and likely caused the dichotomous growth recovery between Australian and Japanese children; 9% (9/98) of patients were overweight on body mass index calculations at 10-15 years but none were obese. Conclusions: After liver transplant, children can expect ongoing height and weight recovery for at least 10-15 years. Growth impairment at transplant and post-transplant care significantly impact long-term growth recovery.
Abstract:
Background Guidelines and clinical practice for the prevention of complications associated with central venous catheters (CVC) around the world vary greatly. Most institutions recommend the use of heparin to prevent occlusion; however, there is debate regarding the need for heparin, and evidence to suggest 0.9% sodium chloride (normal saline) may be as effective. The use of heparin is not without risk, may be unnecessary and is also associated with increased cost. Objectives To assess the clinical effects (benefits and harms) of intermittent flushing of heparin versus normal saline to prevent occlusion in long term central venous catheters in infants and children. Search Methods The Cochrane Vascular Trials Search Co-ordinator searched the Specialised Register (last searched April 2015) and the Cochrane Register of Studies (Issue 3, 2015). We also searched the reference lists of retrieved trials. Selection Criteria Randomised controlled trials that compared the efficacy of normal saline with heparin to prevent occlusion of long term CVCs in infants and children aged up to 18 years were included. We excluded temporary CVCs and peripherally inserted central catheters (PICC). Data Collection and Analysis Two review authors independently assessed trial inclusion criteria and trial quality, and extracted data. Rate ratios were calculated for two outcome measures - occlusion of the CVC and central line-associated blood stream infection. Other outcome measures included duration of catheter placement, inability to withdraw blood from the catheter, use of urokinase or recombinant tissue plasminogen activator, incidence of removal or re-insertion of the catheter, or both, and other CVC-related complications such as dislocation of CVCs, other CVC site infections and thrombosis. Main Results Three trials with a total of 245 participants were included in this review. The three trials directly compared the use of normal saline and heparin; however, between studies, all used different protocols for the standard and experimental arms, with different concentrations of heparin and different frequencies of flushes reported. In addition, not all studies reported on all outcomes. The quality of the evidence ranged from low to very low because there was no blinding, heterogeneity and inconsistency between studies were high, and the confidence intervals were wide. CVC occlusion was assessed in all three trials (243 participants). We were able to pool the results of two trials for the outcomes of CVC occlusion and CVC-associated blood stream infection. The estimated rate ratio for CVC occlusion per 1000 catheter days between the normal saline and heparin groups was 0.75 (95% CI 0.10 to 5.51; two studies, 229 participants; very low quality evidence). The estimated rate ratio for CVC-associated blood stream infection was 1.48 (95% CI 0.24 to 9.37; two studies, 231 participants; low quality evidence). The duration of catheter placement was reported to be similar between the two study arms in one study (203 participants). Authors' Conclusions The review found that there was not enough evidence to determine the effects of intermittent flushing of heparin versus normal saline to prevent occlusion in long term central venous catheters in infants and children. Ultimately, if this evidence were available, the development of evidence-based clinical practice guidelines and consistency of practice would be facilitated.
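To make the "rate ratio per 1000 catheter days" outcome concrete, the sketch below computes a rate ratio and an approximate 95% confidence interval from event counts and catheter-days in two arms. The counts are hypothetical, not the trial data pooled in the review, and the log-scale normal approximation is a standard simplification rather than the exact method the review authors used.

```python
import math

# Rate ratio (saline vs heparin) per catheter-day, with an approximate 95% CI
# on the log scale. All counts below are hypothetical, not data from the review.

def rate_ratio(events_a, days_a, events_b, days_b):
    """Ratio of event rates (arm A / arm B) with a log-normal 95% CI."""
    rate_a = events_a / days_a
    rate_b = events_b / days_b
    rr = rate_a / rate_b
    se_log = math.sqrt(1 / events_a + 1 / events_b)  # Poisson approximation
    lo = math.exp(math.log(rr) - 1.96 * se_log)
    hi = math.exp(math.log(rr) + 1.96 * se_log)
    return rr, lo, hi

# Example: 6 occlusions over 4200 catheter-days (saline) vs 8 over 4400 (heparin).
rr, lo, hi = rate_ratio(6, 4200, 8, 4400)
print(f"occlusions per 1000 catheter-days: saline {6/4200*1000:.2f}, heparin {8/4400*1000:.2f}")
print(f"rate ratio = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

A wide interval such as the 0.10 to 5.51 reported in the review reflects very few events and small person-time totals, which is why the evidence was graded very low quality.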
Abstract:
A genetic solution to breech strike control is attractive, as it is potentially permanent and cumulative, would not involve increased use of chemicals, and may ultimately reduce labour inputs. There appears to be significant opportunity to reduce the susceptibility of Merinos to breech strike by genetic means, although it is unlikely that in the short term breeding alone will be able to confer the degree of protection provided by mulesing and tail docking. Breeding programmes that aim to replace surgical techniques of flystrike prevention could potentially: reduce breech wrinkle; increase the area of bare skin in the perineal area; reduce tail length and wool cover on and near the tail; increase shedding of breech wool; reduce susceptibility to internal parasites and diarrhoea; and increase immunological resistance to flystrike. The likely effectiveness of these approaches is reviewed and assessed here. Any breeding programme that seeks to replace surgical mulesing and tail docking will need to make sheep sufficiently resistant that the increased requirement for other strike management procedures remains within practically acceptable bounds and that levels of strike can be contained to ethically acceptable levels.
Abstract:
Grazing is a major land use in Australia's rangelands. The 'safe' livestock carrying capacity (LCC) required to maintain resource condition is strongly dependent on climate. We reviewed: the approaches for quantifying LCC; current trends in climate and their effect on components of the grazing system; implications of the 'best estimates' of climate change projections for LCC; the agreement and disagreement between the current trends and projections; and the adequacy of current models of forage production in simulating the impact of climate change. We report the results of a sensitivity study of climate change impacts on forage production across the rangelands, and we discuss the more general issues facing grazing enterprises associated with climate change, such as 'known uncertainties' and adaptation responses (e.g. use of climate risk assessment). We found that the method of quantifying LCC from a combination of estimates (simulations) of long-term (>30 years) forage production and successful grazier experience has been well tested across northern Australian rangelands with different climatic regions. This methodology provides a sound base for the assessment of climate change impacts, even though there are many identified gaps in knowledge. The evaluation of current trends indicated substantial differences in the trends of annual rainfall (and simulated forage production) across Australian rangelands, with general increases in most western Australian rangelands (including northern regions of the Northern Territory) and decreases in eastern Australian rangelands and south-western Western Australia. Some of the projected changes in rainfall and temperature appear small compared with year-to-year variability. Nevertheless, the impacts on rangeland production systems are expected to be important in terms of required managerial and enterprise adaptations. Some important aspects of climate systems science remain unresolved, and we suggest that a risk-averse approach to rangeland management, based on the 'best estimate' projections, in combination with appropriate responses to short-term (1-5 years) climate variability, would reduce the risk of resource degradation. Climate change projections - including changes in rainfall, temperature, carbon dioxide and other climatic variables - if realised, are likely to affect forage and animal production, and ecosystem functioning. The major known uncertainties in quantifying climate change impacts are: (i) carbon dioxide effects on forage production, quality, nutrient cycling and competition between life forms (e.g. grass, shrubs and trees); and (ii) the future role of woody plants, including effects of fire, climatic extremes and management for carbon storage. In a simple example of simulating climate change impacts on forage production, we found that increased temperature (3°C) was likely to result in a decrease in forage production for most rangeland locations (e.g. -21%, calculated as an unweighted average across 90 locations). The increase in temperature exacerbated or reduced the effects of a 10% decrease/increase in rainfall, respectively (-33% or -9%). Estimates of the beneficial effects of increased CO2 (from 350 to 650 ppm) on forage production and water use efficiency indicated enhanced forage production (+26%). The increase was approximately equivalent to the decline in forage production associated with a 3°C temperature increase.
The large magnitude of these opposing effects emphasised the importance of the uncertainties in quantifying the impacts of these components of climate change. We anticipate decreases in LCC given that the 'best estimate' of climate change across the rangelands is for a decline (or little change) in rainfall and an increase in temperature. As a consequence, we suggest that public policy have regard for: the implications for livestock enterprises, regional communities, potential resource damage, animal welfare and human distress. However, the capability to quantify these warnings is yet to be developed and this important task remains as a challenge for rangeland and climate systems science.
Abstract:
We review key issues, available approaches and analyses to encourage and assist practitioners to develop sound plans to evaluate the effectiveness of weed biological control agents at various phases throughout a program. Assessing the effectiveness of prospective agents before release assists the selection process, while post-release evaluation aims to determine the extent to which agents are alleviating the ecological, social and economic impacts of the weeds. Information gathered on weed impacts prior to the initiation of a biological control program is necessary to provide baseline data and devise performance targets against which the program can subsequently be evaluated. Detailed data on weed populations, associated plant communities and, in some instances, ecosystem processes, collected at representative sites in the introduced range several years before the release of agents, can be compared with similar data collected later to assess agent effectiveness. Laboratory, glasshouse and field studies are typically used to assess agent effectiveness. While some approaches used for field studies may be influenced by confounding factors, manipulative experiments in which agents are excluded (or included) using chemicals or cages are more robust, but time-consuming and expensive to implement. Demographic modeling and benefit–cost analyses are increasingly being used to complement other studies. There is an obvious need for more investment in long-term post-release evaluation of agent effectiveness to rigorously document the outcomes of biological control programs.
Abstract:
In our recent paper [1], we discussed some potential undesirable consequences of public data archiving (PDA) with specific reference to long-term studies and proposed solutions to manage these issues. We reaffirm our commitment to data sharing and collaboration, both of which have been common and fruitful practices supported for many decades by researchers involved in long-term studies. We acknowledge the potential benefits of PDA (e.g., [2]), but believe that several potential negative consequences for science have been underestimated [1] (see also [3] and [4]). The objective of our recent paper [1] was to define practices to simultaneously maximize the benefits and minimize the potential unwanted consequences of PDA.
Abstract:
Purpose Social marketing benchmark criteria were used to understand the extent to which single-substance alcohol education programmes targeting adolescents in middle and high school settings sought to change behaviour, utilised theory, included audience research and applied the market segmentation process. The paper aims to discuss these issues. Design/methodology/approach A systematic literature review retrieved a total of 1,495 identified articles; 565 duplicates were removed. The remaining 930 articles were then screened. Articles detailing formative research or programmes targeting multiple substances, parents, families and/or communities, as well as elementary schools and universities were excluded. A total of 31 articles, encompassing 16 qualifying programmes, were selected for detailed evaluation. Findings The majority of alcohol education programmes were developed on the basis of theory and achieved short- and medium-term behavioural effects. Importantly, most programmes were universal and did not apply the full market segmentation process. Limited audience research in the form of student involvement in programme design was identified. Research limitations/implications This systematic literature review focused on single-substance alcohol education programmes targeted at middle and high school student populations, retrieving studies back to the year 2000. Originality/value The results of this systematic literature review indicate that application of the social marketing benchmark criteria of market segmentation and audience research may represent an avenue for further extending alcohol education programme effectiveness in middle and high school settings.
Abstract:
Study question Can exercise referral schemes improve health outcomes in individuals with or without pre-existing conditions? Summary answer We found weak evidence of a short term increase in physical activity and reduction in levels of depression in sedentary individuals after participation in exercise referral schemes, compared with after usual care. What is known and what this paper adds Exercise referral schemes are commonly used in primary care to promote physical activity. Evidence indicating a health benefit of these schemes is limited, so their value in primary care remains to be ascertained.
Abstract:
The Australian Longitudinal Study on Women’s Health (ALSWH) commenced in Australia in 1996 when researchers recruited approximately 40,000 women in three birth cohorts: 1973–1978, 1946–1951, and 1921–1926. Since then participants have completed surveys on a wide range of health issues, at approximately three-year intervals. This overview describes changes in physical activity (PA) over time in the mid-age and older ALSWH cohorts, and summarizes the findings of studies published to date on the determinants of PA, and its associated health outcomes in Australian women. The ALSWH data show a significant increase in PA during mid-age, and a rapid decline in activity levels when women are in their 80s. The study has demonstrated the importance of life stages and key life events as determinants of activity, the additional benefits of vigorous activity for mid-age women, and the health benefits of ‘only walking’ for older women. ALSWH researchers have also drawn attention to the benefits of activity in terms of a wide range of physical and mental health outcomes, as well as overall vitality and well-being. The data indicate that maintaining a high level of PA throughout mid and older age will not only reduce the risk of premature death, but also significantly extend the number of years of healthy life.