332 results for Routine formulas
Abstract:
Background Malnutrition and unintentional weight loss are major clinical issues in people with dementia living in residential aged care facilities (RACFs) and are associated with serious adverse outcomes. However, evidence regarding effective interventions is limited and strategies to improve the nutritional status of this population are required. This presentation describes the implementation and results of a pilot randomised controlled trial of a multi-component intervention for improving the nutritional status of RACF residents with dementia. Method Fifteen residents with moderate to severe dementia living in a secure long-term RACF participated in a five-week pilot study. Participants were randomly allocated to either an Intervention (n=8) or Control group (n=7). The intervention comprised four elements delivered in a separate dining room at lunch and dinner: the systematic reinforcement of residents’ eating behaviors using a specific communication protocol; family-style dining; high-ambiance table presentation; and routine Dietary-Nutrition Champion supervision. Control group participants ate their meals according to the facility’s standard practice. Baseline and follow-up assessments of nutritional status, food consumption, and body mass index were obtained by qualified nutritionists. Additional assessments included measures of cognitive functioning, mealtime agitation, depression, wandering status and multiple measures of intervention fidelity. Results No participant was malnourished at study commencement, and participants in both groups gained weight from baseline to follow-up; the gain was not significantly different between groups (t=0.43; p=0.67). A high degree of treatment fidelity was evident throughout the intervention. Qualitative data from staff indicate the intervention was perceived to be beneficial for residents. Conclusions This multi-component nutritional intervention was well received and was feasible in the RACF setting.
Participants’ sound nutritional status at baseline likely accounts for the lack of an intervention effect. Further research using this protocol in malnourished residents is recommended. For success, a collaborative approach between researchers and facility staff, particularly dietary staff, is essential.
Abstract:
This special edition of the International Journal of Critical Indigenous Studies focuses upon the work of scholars within the growing discipline of Aboriginal and Torres Strait Islander health studies. The lamentable state of Indigenous health in Australia is reflected in Indigenous populations elsewhere, especially where settler colonialism has left an indelible mark. This special edition therefore speaks to where Indigenous health outcomes and the efficacy of remedies are causing concern. Common to all is the demand that Indigenous people are placed front and centre of all attempts to improve health outcomes and that improvements are sought in culturally sensitive ways. Terry Dunbar presents findings from a research study that set out to investigate the Indigenous experiences of health and family services in the Northern Territory, Australia. The study asserts that cultural security is an integral and vital element of any policy that will impact upon Indigenous peoples. Dunbar concludes by arguing that in seeking positive change with regard to cultural security or otherwise, the most vociferous champions of that change are likely to be the Aboriginal communities affected. The article by Bronwyn Fredericks, Karen Adams, Sandra Angus and Melissa Walker also highlights the need to involve Aboriginal and Torres Strait Islander people, in this case women, in the design and development of strategies affecting their lives. Utilising routine communication methods such as the ‘talking circle’ and the process referred to as ‘talkin’ up’, where women ‘talk back’ to one another about issues of personal importance, the article argues that the health strategy which emerged through these consultation approaches was more accurately informed by and responsive to women’s health needs. Indeed, the resulting strategy reflected the women’s sense of themselves and the clear direction they felt their health services and policies should take.
Abstract:
In this 'Summary Guidance for Daily Practice', we describe the basic principles of prevention and management of foot problems in persons with diabetes. This summary is based on the International Working Group on the Diabetic Foot (IWGDF) Guidance 2015. There are five key elements that underpin prevention of foot problems: (1) identification of the at-risk foot; (2) regular inspection and examination of the at-risk foot; (3) education of patient, family and healthcare providers; (4) routine wearing of appropriate footwear; and (5) treatment of pre-ulcerative signs. Healthcare providers should follow a standardized and consistent strategy for evaluating a foot wound, as this will guide further evaluation and therapy. The following items must be addressed: type, cause, site and depth, and signs of infection. There are seven key elements that underpin ulcer treatment: (1) relief of pressure and protection of the ulcer; (2) restoration of skin perfusion; (3) treatment of infection; (4) metabolic control and treatment of co-morbidity; (5) local wound care; (6) education for patient and relatives; and (7) prevention of recurrence. Finally, successful efforts to prevent and manage foot problems in diabetes depend upon a well-organized team, using a holistic approach in which the ulcer is seen as a sign of multi-organ disease, and integrating the various disciplines involved.
Abstract:
Background Chronic kidney disease (CKD) leads to a range of symptoms, which are often under-recognised and little is known about the multidimensional symptom experience in advanced CKD. Objectives To examine (1) symptom burden at CKD stages 4 and 5, and dialysis modalities, and (2) demographic and renal history correlates of symptom burden. Methods Using a cross-sectional design, a convenience sample of 436 people with CKD was recruited from three hospitals. The CKD Symptom Burden Index (CKD-SBI) was used to measure the prevalence, severity, distress and frequency of 32 symptoms. Demographic and renal history data were also collected. Results Of the sample, 75.5 % were receiving dialysis (haemodialysis, n = 287; peritoneal dialysis, n = 42) and 24.5 % were not undergoing dialysis (stage 4, n = 69; stage 5, n = 38). Participants reported an average of 13.01 ± 7.67 symptoms. Fatigue and pain were common and burdensome across all symptom dimensions. While approximately one-third experienced sexual symptoms, when reported these symptoms were frequent, severe and distressing. Haemodialysis, older age and being female were independently associated with greater symptom burden. Conclusions In CKD, symptom burden is better understood when capturing the multidimensional aspects of a range of physical and psychological symptoms. Fatigue, pain and sexual dysfunction are key contributors to symptom burden, and these symptoms are often under-recognised and warrant routine assessment. The CKD-SBI offers a valuable tool for renal clinicians to assess symptom burden, leading to the commencement of timely and appropriate interventions.
Abstract:
Chronic kidney disease (CKD) is increasing globally, and in Saudi Arabia the dialysis population is growing by approximately 8% annually. CKD is associated with a high symptom burden. Previous studies have largely reported on the prevalence of symptoms only in the haemodialysis population. This study examined symptom burden across disease stages and treatment groups in advanced CKD, and their correlation with demographic and clinical factors. Using a cross-sectional design, a convenience sample of 436 patients with CKD was recruited from three hospitals in Saudi Arabia. The CKD Symptom Burden Index (CKD-SBI) was used to measure 32 CKD symptoms. Demographic and clinical data were also collected. Of the sample, 75.5% were receiving dialysis (haemodialysis, n = 287; peritoneal dialysis, n = 42) and 24.5% were non-dialysis (CKD stage 4, n = 69; CKD stage 5, n = 38). Participants reported an average of 13.01 ± 7.67 symptoms. Fatigue and pain were common and burdensome across all symptom dimensions. Approximately one-third of participants experienced sexual symptoms. Dialysis patients reported greater symptom burden, especially patients on haemodialysis. Haemodialysis treatment, older age and being female were independently associated with greater total symptom burden. In conclusion, symptom burden is high in the advanced stages of CKD, particularly among those receiving dialysis. Although fatigue, pain and sexual dysfunction are key contributors to symptom burden in CKD, these symptoms are often under-recognised and warrant routine assessment. The CKD-SBI offers a valuable tool to assess symptom burden, leading to the commencement of timely and appropriate interventions.
Abstract:
Context: Pheochromocytomas and paragangliomas (PPGLs) are heritable neoplasms that can be classified into gene-expression subtypes corresponding to their underlying specific genetic drivers. Objective: This study aimed to develop a diagnostic and research tool (Pheo-type) capable of classifying PPGL tumors into gene-expression subtypes that could be used to guide and interpret genetic testing, determine surveillance programs, and aid in elucidation of PPGL biology. Design: A compendium of published microarray data representing 205 PPGL tumors was used for the selection of subtype-specific genes that were then translated to the Nanostring gene-expression platform. A support vector machine was trained on the microarray dataset and then tested on an independent Nanostring dataset representing 38 familial and sporadic cases of PPGL of known genotype (RET, NF1, TMEM127, MAX, HRAS, VHL, and SDHx). Different classifier models involving between three and six subtypes were compared for their discrimination potential. Results: A gene set of 46 genes and six endogenous controls was selected representing six known PPGL subtypes; RTK1–3 (RET, NF1, TMEM127, and HRAS), MAX-like, VHL, and SDHx. Of 38 test cases, 34 (90%) were correctly predicted to six subtypes based on the known genotype to gene-expression subtype association. Removal of the RTK2 subtype from training, characterized by an admixture of tumor and normal adrenal cortex, improved the classification accuracy (35/38). Consolidation of RTK and pseudohypoxic PPGL subtypes to four- and then three-class architectures improved the classification accuracy for clinical application. Conclusions: The Pheo-type gene-expression assay is a reliable method for predicting PPGL genotype using routine diagnostic tumor samples.
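The classification step described in this abstract can be illustrated in miniature. The study trains a support vector machine on microarray profiles; the sketch below substitutes a much simpler nearest-centroid rule to show the same underlying idea of assigning a tumour's expression profile to its closest subtype. The subtype names follow the abstract, but the marker genes and expression values are invented purely for illustration.

```python
# Simplified stand-in for the Pheo-type classifier: assign a tumour's
# expression profile to the subtype with the nearest training centroid.
# (The actual study uses a support vector machine; values are illustrative.)

def centroid(profiles):
    """Mean expression across a subtype's training profiles."""
    n = len(profiles)
    return [sum(p[i] for p in profiles) / n for i in range(len(profiles[0]))]

def classify(sample, centroids):
    """Return the subtype whose centroid is nearest in Euclidean distance."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(centroids, key=lambda s: dist(sample, centroids[s]))

# Toy training data: expression of three hypothetical marker genes per subtype.
training = {
    "RTK":  [[9.1, 2.0, 1.5], [8.7, 2.3, 1.2]],
    "VHL":  [[2.1, 8.8, 1.9], [1.8, 9.2, 2.2]],
    "SDHx": [[1.5, 2.2, 9.0], [2.0, 1.7, 8.6]],
}
centroids = {s: centroid(ps) for s, ps in training.items()}

print(classify([8.5, 2.5, 1.0], centroids))  # an RTK-like profile → RTK
```

A real implementation would, as in the study, train on one platform (microarray) and validate on another (Nanostring), which is why the gene set is restricted to markers that translate reliably between platforms.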
Abstract:
Objectives Melanoma of the skin is the third most commonly diagnosed cancer in Australia. Given the high incidence of sunburn in children, and that the sun protection provided by parents is often infrequent and/or insufficient, this research employed qualitative methodology to examine parents' beliefs about their young child's sun safe behaviour. Methods Parents (N = 21; n = 14 mothers, n = 7 fathers) of children aged 2–5 years participated in focus groups to identify commonly held beliefs about their decision to sun protect their child. Data were analysed using thematic content analysis. Results Parents generally had knowledge of the broad sun safe recommendations; however, the specific details of the recommendations were not always known. Parents reported adopting a range of sun-protective measures for their child, which depended on the time of year. A range of advantages (e.g. reducing the risk of skin cancer, developing good habits early and parental peace of mind), disadvantages (e.g. false sense of safety and preventing vitamin D absorption), barriers (e.g. child refusal) and facilitators (e.g. routine and accessibility) to performing sun safe practices were identified. Normative pressures and expectations also affected parents' motivation to be sun safe for their child. Conclusions These identified beliefs can be used to inform interventions to improve sun safe behaviours in young children who reside in a region that has the highest skin cancer incidence in the world.
Abstract:
Background The preference amongst parents for heavier infants is in contrast to obesity prevention efforts worldwide. Parents are poor at identifying overweight in older children, but few studies have investigated maternal perception of weight status amongst toddlers and none in the Australian setting. Methods Mothers (n = 290) completed a self-administered questionnaire at child age 12–16 months, defining their child's weight status as underweight, normal weight, somewhat overweight or very overweight. Weight-for-length z-score was derived from measured weight and length, and children categorized as underweight, normal weight, at risk of overweight or obese (WHO standards). Objective classification was compared with maternal perception of weight status. Mean weight-for-length z-score was compared across categories of maternal perception using one-way ANOVA. Multinomial logistic regression was used to determine child or maternal characteristics associated with inaccurate weight perception. Results Most children (83%) were perceived as normal weight. Twenty-nine were described as underweight, although none were. Sixty-six children were at risk of overweight, but 57 of these were perceived as normal weight. Of the 14 children who were overweight, only 4 were identified as somewhat overweight by their mother. Compared with mothers who could accurately classify their normal weight child, mothers who were older had higher odds of perceiving their normal weight child as underweight, while mothers with higher body mass index had slightly higher odds of describing their overweight/at risk child as normal weight. Conclusion The leaner but healthy weight toddler was perceived as underweight, while only the heaviest children were recognized as overweight. Mothers unable to accurately identify children at risk are unlikely to act to prevent further excess weight gain.
Practitioners can lead a shift in attitudes towards weight in infants and young children, promoting routine growth monitoring and adequate but not rapid weight gain.
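The weight-for-length z-scores used to categorize the children follow the WHO LMS method, which can be sketched as below. The L, M and S values in the example are illustrative placeholders, not actual WHO growth-standard table entries, and the category cut-offs are the commonly used WHO thresholds (with overweight and obese merged for brevity).

```python
# WHO LMS method: z = ((value / M)**L - 1) / (L * S), where L (skewness),
# M (median) and S (coefficient of variation) come from the WHO
# growth-standard tables for the child's sex and length.
from math import log

def lms_zscore(value, L, M, S):
    """Convert a measurement to a z-score with the LMS method."""
    if L == 0:                          # limiting case: log transform
        return log(value / M) / S
    return ((value / M) ** L - 1) / (L * S)

def who_category(z):
    """Map a weight-for-length z-score to the categories used in the study."""
    if z < -2:
        return "underweight"
    if z <= 1:
        return "normal weight"
    if z <= 2:
        return "at risk of overweight"
    return "overweight/obese"

# Placeholder LMS values, not real WHO table entries:
z = lms_zscore(11.4, L=-0.35, M=10.2, S=0.08)   # ≈ 1.36
print(who_category(z))                           # → at risk of overweight
```

With these placeholder parameters a weight about 12% above the median lands just past the +1 cut-off, which is exactly the "at risk" band the abstract reports mothers tended to misread as normal weight.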
Abstract:
While the two decades since the study by Kavanagh et al. (1993) have given additional insights into effective dissemination of family interventions, the accompanying papers show that progress remains limited. The effectiveness trial that triggered this series of papers offers a cautionary tale. Despite management support, 30–35 hr of workshop training and training of local supervisors who could act as champions, use of the full intervention was limited. In part, this seemed due to the demanding nature of the intervention and its incompatibility with practitioners’ roles; in part, to limitations in the training, among other factors. While the accompanying papers note these and other barriers to dissemination, they miss a more disturbing finding in the original paper: Practitioners said they were using several aspects in routine care, despite being unable to accurately describe what they were. This finding highlights the risks in taking practitioners’ reports of their practice in files or supervision sessions at face value and potentially has implications for reports of other clinical work. The fidelity of disseminated treatments can only be assured by audits of practice, accompanied by affirming but also corrective feedback.
Abstract:
Background: Bhutan has reduced its malaria incidence significantly in the last 5 years, and is aiming for malaria elimination by 2016. To assist with the management of the Bhutanese malaria elimination programme a spatial decision support system (SDSS) was developed. The current study aims to describe SDSS development and evaluate SDSS utility and acceptability through informant interviews. Methods: The SDSS was developed based on the open-source Quantum geographical information system (QGIS) and piloted to support the distribution of long-lasting insecticidal nets (LLINs) and indoor residual spraying (IRS) in the two sub-districts of Samdrup Jongkhar District. It was subsequently used to support reactive case detection (RACD) in the two sub-districts of Samdrup Jongkhar and two additional sub-districts in Sarpang District. Interviews were conducted with 11 informants using the SDSS, including programme and district managers and field workers, to ascertain their perceptions of its utility and acceptability. Results: A total of 1502 households with a population of 7165 were enumerated in the four sub-districts, and a total of 3491 LLINs were distributed with one LLIN per 1.7 persons. A total of 279 households representing 728 residents were involved with RACD. Informants considered that the SDSS was an improvement on previous methods for organizing LLIN distribution, IRS and RACD, and could be easily integrated into routine malaria and other vector-borne disease surveillance systems. Informants identified some challenges at the programme and field level, including the need for more skilled personnel to manage the SDSS, and more training to improve the effectiveness of SDSS implementation and use of hardware. Conclusions: The SDSS was well accepted and informants expected its use to be extended to other malaria reporting districts and other vector-borne diseases.
Challenges associated with efficient SDSS use included adequate skills and knowledge, access to training and support, and availability of hardware including computers and global positioning system receivers.
Abstract:
Background Foot complications have been found to be predictors of mobility impairment and falls in community-dwelling elderly patients. However, few studies have investigated the link between foot complications and mobility impairment in hospital inpatient populations. The aim of this paper was to investigate the associations between mobility impairment and various foot complications in general inpatient populations. Methods Eligible participants were all adults admitted overnight, for any reason, into five diverse hospitals on one day; excluding maternity, mental health and cognitively impaired patients. Participants underwent a foot examination to clinically diagnose different foot complications, including foot wounds, infections, deformity, peripheral arterial disease and peripheral neuropathy. They were also surveyed on social determinant, medical history, self-care, footwear and foot complication history risk factors, and on mobility impairment, defined as requiring a mobility aid for mobilisation prior to hospitalisation. Results Overall, 733 participants consented; mean (±SD) age 62 (±19) years, 408 (55.8%) male, 172 (23.5%) diabetes. Mobility impairment was present in 242 (33.2%) participants; diabetes populations reported more mobility impairment than non-diabetes populations (40.7% vs 30.9%, p < 0.05). In a backwards stepwise multivariate analysis, and controlling for other risk factors, mobility impairment was independently associated with increasing years of age (OR = 1.04 (95% CI 1.02-1.05)), male gender (OR = 1.7 (1.2-2.5)), being born in Australia (OR = 1.7 (1.1-2.8)), vision impairment (OR = 2.0 (1.2-3.1)), peripheral neuropathy (OR = 3.1 (2.0-4.6)) and foot deformity (OR = 2.0 (1.3-3.0)). Conclusions These findings support the results of other large studies investigating community-dwelling elderly patients that peripheral neuropathy and foot deformity are independently associated with mobility impairment and potentially falls.
Furthermore, the findings suggest that routine clinical diagnosis of foot complications, as defined by national diabetic foot guidelines, was sufficient to determine these associated foot complication risk factors for mobility impairment. Further research is required to establish if these foot complication risk factors for mobility impairment are predictors of actual falls in the inpatient environment.
Abstract:
The suitability of human mesenchymal stem cells (hMSCs) in regenerative medicine relies on retention of their proliferative expansion potential in conjunction with the ability to differentiate toward multiple lineages. Successful utilisation of these cells in clinical applications linked to tissue regeneration requires consideration of biomarker expression, time in culture and donor age, as well as their ability to differentiate towards mesenchymal (bone, cartilage, fat) or non-mesenchymal (e.g., neural) lineages. To identify potential therapeutic suitability we examined hMSCs after extended expansion including morphological changes, potency (stemness) and multilineage potential. Commercially available hMSC populations were expanded in vitro for > 20 passages, equating to > 60 days and > 50 population doublings. Distinct growth phases (A-C) were observed during serial passaging and cells were characterised for stemness and lineage markers at representative stages (Phase A: P+5, approximately 13 days in culture; Phase B: P+7, approximately 20 days in culture; and Phase C: P+13, approximately 43 days in culture). Cell surface markers, stem cell markers and lineage-specific markers were characterised by FACS, ICC and Q-PCR revealing MSCs maintained their multilineage potential, including neural lineages throughout expansion. Co-expression of multiple lineage markers along with continued CD45 expression in MSCs did not affect completion of osteogenic and adipogenic specification or the formation of neurospheres. Improved standardised isolation and characterisation of MSCs may facilitate the identification of biomarkers to improve therapeutic efficacy to ensure increased reproducibility and routine production of MSCs for therapeutic applications including neural repair.
Abstract:
As a key component of the ocular surface required for vision, the cornea has been extensively studied as a site for cell and tissue-based therapies. Historically, these treatments have consisted of donor corneal tissue transplants, but cultivated epithelial autografts have become established over the last 15 years as a routine treatment for ocular surface disease. Ultimately, these treatments are performed with the intention of restoring corneal transparency and a smooth ocular surface. The degree of success, however, is often dependent upon the inherent level of corneal inflammation at time of treatment. In this regard, the anti-inflammatory and immuno-modulatory properties of mesenchymal stromal cells (MSC) have drawn attention to these cells as potential therapeutic agents for corneal repair. The origins for MSC-based therapies are founded in part on observations of the recruitment of endogenous bone marrow-derived cells to injured corneas; however, an increasing quantity of data is emerging for MSC administered following their isolation and ex vivo expansion from a variety of tissues including bone marrow, adipose tissue, umbilical cord and dental pulp. In brief, evidence has emerged of cultured MSC, or their secreted products, having a positive impact on corneal wound healing and retention of corneal allografts in animal models. Optimal dosage, route of administration and timing of treatment, however, all remain active areas of investigation. Intriguingly, amidst these studies, reports have emerged of MSC transdifferentiation into corneal cells. The clearest evidence has been obtained with respect to expression of markers associated with the phenotype of corneal stromal cells. In contrast, the evidence for MSC conversion to corneal epithelial cell types remains inconclusive. In any case, the conversion of MSC into corneal cells seems unlikely to be an essential requirement for their clinical use.
This field of research has recently become more complicated by reports of MSC-like properties for cultures established from the peripheral corneal stroma (limbal stroma). The relationship and relative value of corneal-MSC compared to traditional sources of MSC such as bone marrow are at present unclear. This chapter is divided into four main parts. After providing a concise overview of corneal structure and function, we will highlight the types of corneal diseases that are likely to benefit from the anti-inflammatory and immuno-modulatory properties of MSC. We will subsequently summarize the evidence supporting the case for MSC-based therapies in the treatment of corneal diseases. In the third section we will review the literature concerning the keratogenic potential of MSC. Finally, we will review the more recent literature indicating the presence of MSC-like cells derived from corneal tissue.
Abstract:
While many studies have explored conditions and consequences of information systems adoption and use, few have focused on the final stages of the information system lifecycle. In this paper, I develop a theoretical and an initial empirical contribution to understanding individuals’ intentions to discontinue the use of an information system. This understanding is important because it yields implications about maintenance, retirement, and users’ switching decisions, which ultimately can affect work performance, system effectiveness, and return on technology investments. In this paper, I offer a new conceptualization of factors determining users’ intentions to discontinue the use of information systems. I then report on a preliminary empirical test of the model using data from a field study of information system users in a promotional planning routine in a large retail organization. Results from the empirical analysis provide first empirical support for the theoretical model. I discuss the work’s implications for theory on information systems continuance and dual-factor logic in information system use. I also provide suggestions for managers dealing with cessation of information systems and broader work routine change in organizations due to information system end-of-life decisions.
Abstract:
Objective While driveway run-over incidents continue to be a cause of serious injury and deaths among young children in Australia, few empirically evaluated educational interventions have been developed which address these incidents. Addressing this gap, this study describes the development and evaluation of a paper-based driveway safety intervention targeting caregivers of children aged 5 years or younger. Design Cross-sectional survey. Method and setting Informed by previous research, the intervention targeted key caregiver safety behaviours that address driveway risks. To assess the impact of the intervention, 137 Queensland (Australia) caregivers (95.0% women; mean age = 34.97 years) were recruited. After receiving the intervention, changes to a number of outcomes such as caregiver risk perception, safety knowledge and behavioural intentions were measured. Results Findings indicated that the intervention had increased general and specific situational risk awareness and safety knowledge among a substantial proportion of participants. Close to one-quarter of the sample strongly agreed that the intervention had increased these outcomes. In addition, 71.6% of the sample reported that they intended to make changes to their routine in and around the driveway as a result of reading the intervention material, and a further quarter of participants strongly agreed that the information provided would help both themselves (26.5%) and other caregivers (33.8%) to keep their children safe in the driveway. Conclusion While the educational intervention requires further validation, findings from this study suggest that the intervention content and format increase driveway safety.