945 results for TOTAL PARENTERAL-NUTRITION
Abstract:
An assessment of the potential of Family Day Care as a nutrition promotion setting in South Australia. (Original Research). Nutrition & Dietetics: The Journal of the Dieticians Association of Australia, March 1, 2003. Authors: Daniels, Lynne A.; Franco, Bunny; McWhinnie, Julie-Anne. COPYRIGHT 2006 Dietitians Association of Australia; published under license through the Gale Group, Farmington Hills, Michigan. Objective: To assess the potential role of Family Day Care in nutrition promotion for preschool children. Design and setting: A questionnaire to examine nutrition-related issues and practices was mailed to care providers registered in the southern region of Adelaide, South Australia. Care providers also supplied a descriptive, qualitative recall of the food provided by parents or themselves to each child less than five years of age in their care on the day closest to completion of the questionnaire. Subjects: 255 care providers. The response rate was 63% and covered 643 preschool children, mean 4.6 (SD 2.8) children per carer. Results: There was clear agreement that nutrition promotion was a relevant issue for Family Day Care providers. Nutrition and food hygiene knowledge was good, but only 54% of respondents felt confident to address food quality issues with parents.
Sixty-five percent of respondents reported non-neutral approaches to food refusal and dawdling (reward, punishment, cajoling) that overrode the child's control of the amount eaten. The food recalls indicated that most children (> 75%) were offered fruit at least once. Depending on the hours in care (0 to 4, 5 to 8, greater than 8 hours), 20%, 32% and 55%, respectively, of children were offered milk and 65%, 82% and 87%, respectively, of children were offered high fat and sugar foods. Conclusions: Questionnaire responses suggest that many care providers are committed to and proactive in a range of nutrition promotion activities. There is scope for strengthening skills in the management of common problems, such as food refusal and dawdling, consistent with the current evidence for approaches to early feeding management that promote the development of healthy food preferences and eating patterns. Legitimising and empowering care providers in their nutrition promotion role requires clear policies, guidelines, adequate pre- and in-service training, suitable parent materials, and monitoring.
Abstract:
Undernutrition is common in patients admitted for surgery and is often unrecognised and untreated, and it frequently worsens in hospital. The complex synergistic relationship between nutritional status and the physiological responses to surgery puts patients at high nutritional risk. There are clear prospective associations between inadequate nutritional status and the risk of poorer outcomes for surgical patients, including infection, complications and length of stay. However, for practical and ethical reasons, evidence that nutritional interventions can significantly reduce these poor outcomes is difficult to obtain. Nevertheless, health professionals have a duty of care to ensure that patients are properly fed, by whatever means, to meet their physiological requirements.
Abstract:
Objective: To evaluate the fruit and vegetable intakes of Australian adults aged 19-64 years. Methods: Intake data were collected as part of the National Nutrition Survey 1995 representing all Australian States and Territories, including city, metropolitan, rural and remote areas. Dietary intake of 8,891 19-to-64 year-olds was assessed using a structured 24-hour recall. Intake frequency was assessed as the proportion of participants consuming fruit and vegetables on the day prior to interview, and variety was assessed as the number of subgroups of fruit and vegetables consumed. Intake levels were compared with the recommendations of the Australian Guide to Healthy Eating (AGHE). Results: Sixty-two per cent of participants consumed some fruit and 89% consumed some vegetables on the day surveyed. Males were less likely to consume fruit, and younger adults less likely to consume fruit and vegetables, compared with females and older adults respectively. Variety was primarily low (1 subcategory) for fruit and medium (3-4 subcategories) for vegetables. Thirty-two per cent of adults consumed the minimum two serves of fruit and 30% consumed the minimum five serves of vegetables as recommended by the AGHE. Eleven per cent of adults met the minimum recommendations for both fruit and vegetables. Conclusion: A large proportion of adults have fruit and vegetable intakes below the AGHE minimum recommendations. Implications: A nationally integrated, long-term campaign to increase fruit and vegetable consumption, supported by policy changes to address structural barriers to consumption, is vital to improve fruit and vegetable consumption among adults.
Abstract:
We report the long-term outcome of the flangeless, cemented all-polyethylene Exeter cup at a mean of 14.6 years (range 10-17) after operation. Of the 263 hips in 243 patients, 122 hips are still in situ, 112 patients (119 hips) have died, eighteen hips were revised, and three patients (four hips) had moved abroad and were lost to follow-up (1.5%). Radiographs demonstrated that two sockets had migrated and six more had radiolucent lines in all three zones. The Kaplan-Meier survivorship at 15 years with revision for all causes as the endpoint is 89.9% (95% CI 84.6 to 95.2%) and for aseptic cup loosening or lysis 91.7% (CI 86.6 to 96.8%). In 210 hips with a diagnosis of primary osteoarthritis, survivorship for all causes is 93.2% (95% CI 88.1 to 98.3%), and for aseptic cup loosening 95.0% (CI 90.3 to 99.7%). The cemented all-polyethylene Exeter cup has excellent long-term survivorship.
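The survivorship figures quoted above are Kaplan-Meier estimates, which handle the censoring that dominates cohorts like this one (deaths and losses to follow-up with the cup still in situ). A minimal sketch of the estimator follows; the cohort below is toy data for illustration only, not the study's records:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times:  follow-up time for each subject (e.g. years)
    events: 1 if the endpoint (e.g. revision) occurred, 0 if censored
            (death or loss to follow-up with no event)
    Returns a list of (time, survival probability) steps.
    """
    data = sorted(zip(times, events))  # order subjects by follow-up time
    n_at_risk = len(data)
    survival = 1.0
    steps = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = 0  # events at time t
        c = 0  # censored at time t
        while i < len(data) and data[i][0] == t:
            if data[i][1] == 1:
                d += 1
            else:
                c += 1
            i += 1
        if d > 0:
            # multiply by the conditional survival at this event time
            survival *= 1.0 - d / n_at_risk
            steps.append((t, survival))
        n_at_risk -= d + c  # both events and censorings leave the risk set
    return steps

# Toy cohort: 10 hips, 2 revisions (at years 4 and 10), the rest censored
times  = [2, 4, 4, 6, 8, 10, 12, 14, 15, 15]
events = [0, 1, 0, 0, 0,  1,  0,  0,  0,  0]
for t, s in kaplan_meier(times, events):
    print(t, round(s, 3))
```

The key property, visible in the code, is that censored subjects contribute to the risk set up to their censoring time but do not themselves drop the survival curve.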
Abstract:
Background: The two-stage Total Laparoscopic Hysterectomy (TLH) versus Total Abdominal Hysterectomy (TAH) for stage I endometrial cancer (LACE) randomised controlled trial was initiated in 2005. The primary objective of stage 1 was to assess whether TLH results in equivalent or improved QoL up to 6 months after surgery compared to TAH. The primary objective of stage 2 was to test the hypothesis that disease-free survival at 4.5 years is equivalent for TLH and TAH. Results addressing the primary objective of stage 1 of the LACE trial are presented here. Methods: The first 361 LACE participants (TAH n=142, TLH n=190) were enrolled in the QoL substudy at 19 centres across Australia, New Zealand and Hong Kong, and 332 completed the QoL analysis. Randomisation was performed centrally and independently from other study procedures via a computer-generated, web-based system (providing concealment of the next assigned treatment) using stratified permuted blocks of 3 and 6, and assigned patients with histologically confirmed stage I endometrioid endometrial adenocarcinoma and ECOG performance status <2 to TLH or TAH, stratified by histological grade and study centre. No blinding of patients or study personnel was attempted. QoL was measured at baseline, 1 and 4 weeks (early), and 3 and 6 months (late) after surgery using the Functional Assessment of Cancer Therapy-General (FACT-G) questionnaire. The primary endpoint was the difference between the groups in QoL change from baseline at early and late time points (a 5% difference was considered clinically significant). Analysis was performed according to the intention-to-treat principle using generalized estimating equations on differences from baseline for the early and late QoL recovery. The LACE trial is registered with clinicaltrials.gov (NCT00096408) and the Australian New Zealand Clinical Trials Registry (CTRN12606000261516).
Patients for both stages of the trial have now been recruited and are being followed up for disease-specific outcomes. Findings: The proportion of missing values at the 5%, 10%, 15% and 20% differences in the FACT-G scale was 6% (12/190) in the TLH group and 14% (20/142) in the TAH group. There were 8/332 conversions (2.4%, 7 of which were from TLH to TAH). In the early phase of recovery, patients undergoing TLH reported significantly greater improvement of QoL from baseline compared to TAH in all subscales except the emotional and social well-being subscales. Improvements in QoL up to 6 months post-surgery continued to favour TLH except for the emotional and social well-being subscales of the FACT and the visual analogue scale of the EuroQoL five dimensions (EuroQoL-VAS). Length of operating time was significantly longer in the TLH group (138±43 mins) than in the TAH group (109±34 mins; p=0.001). While the proportion of intraoperative adverse events was similar between the treatment groups (TAH 8/142, 5.6%; TLH 14/190, 7.4%; p=0.55), postoperatively twice as many patients in the TAH group experienced adverse events of CTC grade 3+ than in the TLH group (33/142, 23.2% and 22/190, 11.6%, respectively; p=0.004). Postoperative serious adverse events occurred more frequently in patients who had a TAH (27/142, 19.0%) than a TLH (15/190, 7.9%) (p=0.002). Interpretation: QoL improvements from baseline during early and later phases of recovery, and the adverse event profile, significantly favour TLH compared to TAH for patients treated for stage I endometrial cancer.
Abstract:
The effective atomic number is widely employed in radiation studies, particularly for the characterisation of interaction processes in dosimeters, biological tissues and substitute materials. Gel dosimeters are unique in that they comprise both the phantom and dosimeter material. In this work, effective atomic numbers for total and partial electron interaction processes have been calculated for the first time for a Fricke gel dosimeter, five hypoxic and nine normoxic polymer gel dosimeters. A range of biological materials are also presented for comparison. The spectrum of energies studied spans 10 keV to 100 MeV, over which the effective atomic number varies by 30%. The effective atomic numbers of gels match those of soft tissue closely over the full energy range studied; greater disparities exist at higher energies but are typically within 4%.
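For context on the quantity involved: the study above computes energy-dependent effective atomic numbers per interaction process, but the classic single-valued approximation is the Mayneord power-law form, Z_eff = (Σᵢ fᵢ Zᵢ^2.94)^(1/2.94), where fᵢ is the fraction of electrons contributed by element i. This is a different, simpler quantity than the one in the work above; the sketch below only illustrates the concept, using water (the main constituent of gel dosimeters):

```python
def z_eff_power_law(composition, m=2.94):
    """Mayneord power-law effective atomic number.

    composition: list of (Z, n_atoms) pairs for one formula unit.
    f_i is the fraction of electrons contributed by each element.
    Note: this single-number approximation differs from the
    energy-dependent effective atomic numbers computed in the study.
    """
    total_electrons = sum(z * n for z, n in composition)
    return sum((z * n / total_electrons) * z**m
               for z, n in composition) ** (1 / m)

# Water, H2O: two hydrogens (Z=1) and one oxygen (Z=8)
print(round(z_eff_power_law([(1, 2), (8, 1)]), 2))  # ~7.4, close to soft tissue
```

The closeness of this value to that of soft tissue is what makes water-based gels attractive as combined phantom and dosimeter materials.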
Abstract:
Background Most questionnaires used for physical activity (PA) surveillance have been developed for adults aged ≤65 years. Given the health benefits of PA for older adults and the aging of the population, it is important to include adults aged 65+ years in PA surveillance. However, few studies have examined how well older adults understand PA surveillance questionnaires. This study aimed to document older adults’ understanding of questions from the International PA Questionnaire (IPAQ), which is used worldwide for PA surveillance. Methods Participants were 41 community-dwelling adults aged 65-89 years. They each completed IPAQ in a face-to-face semi-structured interview, using the “think-aloud” method, in which they expressed their thoughts out loud as they answered IPAQ questions. Interviews were transcribed and coded according to a three-stage model: understanding the intent of the question; performing the primary task (conducting the mental operations required to formulate a response); and response formatting (mapping the response into pre-specified response options). Results Most difficulties occurred during the understanding and performing the primary task stages. Errors included recalling PA in an “average” week, not in the previous 7 days; including PA lasting ≤10 minutes/session; reporting the same PA twice or thrice; and including the total time of an activity for which only a part of that time was at the intensity specified in the question. Participants were unclear what activities fitted within a question’s scope and used a variety of strategies for determining the frequency and duration of their activities. Participants experienced more difficulties with the moderate-intensity PA and walking questions than with the vigorous-intensity PA questions. The sitting time question, particularly difficult for many participants, required the use of an answer strategy different from that used to answer questions about PA. 
Conclusions These findings indicate a need for caution in administering IPAQ to adults aged ≥65 years. Most errors resulted in over-reporting, although errors resulting in under-reporting were also noted. Given the nature of the errors made by participants, it is possible that similar errors occur when IPAQ is used in younger populations and that the errors identified could be minimized with small modifications to IPAQ.
Abstract:
In recent years the development and use of crash prediction models for roadway safety analyses have received substantial attention. These models, also known as safety performance functions (SPFs), relate the expected crash frequency of roadway elements (intersections, road segments, on-ramps) to traffic volumes and other geometric and operational characteristics. A commonly practiced approach for applying intersection SPFs is to assume that crash types occur in fixed proportions (e.g., rear-end crashes make up 20% of crashes, angle crashes 35%, and so forth) and then apply these fixed proportions to crash totals to estimate crash frequencies by type. As demonstrated in this paper, such a practice makes questionable assumptions and results in considerable error in estimating crash proportions. Through the use of rudimentary SPFs based solely on the annual average daily traffic (AADT) of major and minor roads, the homogeneity-in-proportions assumption is shown not to hold across AADT, because crash proportions vary as a function of both major and minor road AADT. For example, with minor road AADT of 400 vehicles per day, the proportion of intersecting-direction crashes decreases from about 50% with 2,000 major road AADT to about 15% with 82,000 AADT. Same-direction crashes increase from about 15% to 55% for the same comparison. The homogeneity-in-proportions assumption should be abandoned, and crash type models should be used to predict crash frequency by crash type. SPFs that use additional geometric variables would only exacerbate the problem quantified here. Comparison of models for different crash types using additional geometric variables remains the subject of future research.
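The contrast drawn above can be made concrete with a short sketch. The power-of-AADT functional form is standard for SPFs, but every coefficient below is hypothetical, chosen only to mirror the paper's argument that a per-type SPF lets the implied crash-type proportion vary with AADT, whereas the fixed-proportion practice cannot:

```python
import math

def spf(aadt_major, aadt_minor, a=-8.0, b=0.6, c=0.4):
    """Rudimentary safety performance function: expected crashes/year as a
    power function of major- and minor-road AADT.
    Coefficients are hypothetical, for illustration only."""
    return math.exp(a) * aadt_major**b * aadt_minor**c

# Commonly practiced approach: one fixed proportion applied to the total,
# regardless of traffic volumes.
FIXED_SAME_DIRECTION_SHARE = 0.35  # hypothetical fixed share

def same_direction_spf(aadt_major, aadt_minor):
    """Separate SPF for one crash type (same-direction crashes);
    hypothetical coefficients."""
    return spf(aadt_major, aadt_minor, a=-11.6, b=0.9, c=0.3)

# With minor-road AADT held at 400 veh/day, the type-specific model implies
# a share that grows with major-road AADT; the fixed share stays flat.
for aadt in (2_000, 82_000):
    implied = same_direction_spf(aadt, 400) / spf(aadt, 400)
    print(f"major AADT {aadt}: fixed share {FIXED_SAME_DIRECTION_SHARE:.0%}, "
          f"type-model share {implied:.0%}")
```

Because the type-specific exponents differ from the total-crash exponents, the implied proportion is itself a power function of AADT, which is exactly why the homogeneity-in-proportions assumption fails.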
Abstract:
Cutaneous cholecalciferol synthesis has not been considered in making recommendations for vitamin D intake. Our objective was to model the effects of sun exposure, vitamin D intake, and skin reflectance (pigmentation) on serum 25-hydroxyvitamin D (25[OH]D) in young adults with a wide range of skin reflectance and sun exposure. Four cohorts of participants (n = 72 total) were studied for 7-8 wk in the fall, winter, spring, and summer in Davis, CA [38.5° N, 121.7° W, Elev. 49 ft (15 m)]. Skin reflectance was measured using a spectrophotometer, vitamin D intake using food records, and sun exposure using polysulfone dosimeter badges. A multiple regression model (R² = 0.55; P < 0.0001) was developed and used to predict the serum 25(OH)D concentration for participants with low [median for African ancestry (AA)] and high [median for European ancestry (EA)] skin reflectance and with low [20th percentile, ~20 min/d, ~18% body surface area (BSA) exposed] and high (80th percentile, ~90 min/d, ~35% BSA exposed) sun exposure, assuming an intake of 200 IU/d (5 µg/d). Predicted serum 25(OH)D concentrations for AA individuals with low and high sun exposure were 24 and 42 nmol/L in the winter and 40 and 60 nmol/L in the summer. Corresponding values for EA individuals were 35 and 60 nmol/L in the winter and 58 and 85 nmol/L in the summer. To achieve 25(OH)D ≥75 nmol/L, we estimate that EA individuals with high sun exposure need 1300 IU/d vitamin D intake in the winter and AA individuals with low sun exposure need 2100-3100 IU/d year-round.
Abstract:
Objective: Diarrhoea in the enterally tube fed (ETF) intensive care unit (ICU) patient is a multifactorial problem. Diarrhoeal aetiologies in this patient cohort remain debatable; however, the consequences of diarrhoea have been well established and include electrolyte imbalance, dehydration, bacterial translocation, perianal wound contamination and sleep deprivation. This study examined the incidence of diarrhoea and explored factors contributing to the development of diarrhoea in the ETF, critically ill, adult patient. ---------- Method: After institutional ethical review and approval, a single-centre medical chart audit was undertaken to examine the incidence of diarrhoea in ETF, critically ill patients. Retrospective, non-probability sequential sampling of all emergency admission adult ICU patients who met the inclusion/exclusion criteria was used. ---------- Results: Fifty patients were audited. Faecal frequency, consistency and quantity were considered important criteria in defining ETF diarrhoea. The incidence of diarrhoea was 78%. Total patient diarrhoea days (r = 0.422; p = 0.02) and total diarrhoea frequency (r = 0.313; p = 0.027) increased when the patient was ETF for longer periods of time. Increased severity of illness, peripheral oxygen saturation (SpO2), glucose control, albumin and white cell count were found to be statistically significant factors for the development of diarrhoea. ---------- Conclusion: Diarrhoea in ETF critically ill patients is multifactorial. The early identification of diarrhoea risk factors and the development of a diarrhoea risk management algorithm are recommended.
Abstract:
Background and Significance Venous leg ulcers are a significant cause of chronic ill-health for 1–3% of those aged over 60 years, increasing in incidence with age. The condition is difficult and costly to heal, consuming 1–2.5% of total health budgets in developed countries and up to 50% of community nursing time. Unfortunately, there is a recurrence rate of 60 to 70% after healing, frequently within the first 12 months. Although some risk factors associated with higher recurrence rates have been identified (e.g. prolonged ulcer duration, deep vein thrombosis), in general there is limited evidence on treatments to effectively prevent recurrence. Patients are generally advised to undertake activities which aim to improve the impaired venous return (e.g. compression therapy, leg elevation, exercise). However, only compression therapy has some evidence to support its effectiveness in prevention, and problems with adherence to this strategy are well documented. Aim The aim of this research was to identify factors associated with recurrence by determining relationships between recurrence and demographic factors, health, physical activity, psychosocial factors and self-care activities to prevent recurrence. Methods Two studies were undertaken: a retrospective study of participants diagnosed with a venous leg ulcer which healed 12 to 36 months prior to the study (n=122); and a prospective longitudinal study of participants recruited as their ulcer healed, with data collected for 12 months following healing (n=80). Data were collected from medical records on demographics, medical history and ulcer history and treatments; and from self-report questionnaires on physical activity, nutrition, psychosocial measures, ulcer history, compression and other self-care activities. Follow-up data for the prospective study were collected every three months for 12 months after healing.
For the retrospective study, a logistic regression model determined the independent influences of variables on recurrence. For the prospective study, median time to recurrence was calculated using the Kaplan-Meier method, and a Cox proportional-hazards regression model was used to adjust for potential confounders and determine effects of preventive strategies and psychosocial factors on recurrence. Results In total, 68% of participants in the retrospective study and 44% of participants in the prospective study suffered a recurrence. After mutual adjustment for all variables in multivariable regression models, leg elevation, compression therapy, self-efficacy and physical activity were found to be consistently related to recurrence in both studies. In the retrospective study, leg elevation, wearing Class 2 or 3 compression hosiery, the level of physical activity, cardiac disease and self-efficacy scores remained significantly associated (p<0.05) with recurrence. The model was significant (p<0.001), with an R² equivalent of 0.62. Examination of relationships between psychosocial factors and adherence to wearing compression hosiery found wearing compression hosiery was significantly positively associated with participants' knowledge of the cause of their condition (p=0.002), higher self-efficacy scores (p=0.026) and lower depression scores (p=0.009). Analysis of data from the prospective study found there were 35 recurrences (44%) in the 12 months following healing, and median time to recurrence was 27 weeks. After adjustment for potential confounders, a Cox proportional-hazards regression model found that at least an hour/day of leg elevation, six or more days/week in Class 2 (20–25mmHg) or 3 (30–40mmHg) compression hosiery, higher social support scale scores and higher General Self-Efficacy scores remained significantly associated (p<0.05) with a lower risk of recurrence, while male gender and a history of DVT remained significant risk factors for recurrence.
Overall the model was significant (p<0.001), with an R² equivalent of 0.72. Conclusions The high rates of recurrence found in the studies highlight the urgent need for further information in this area to support development of effective strategies for prevention. Overall, results indicate leg elevation, physical activity, compression hosiery and strategies to improve self-efficacy are likely to prevent recurrence. In addition, optimal management of depression and strategies to improve patient knowledge and self-efficacy may positively influence adherence to compression therapy. This research provides important information for development of strategies to prevent recurrence of venous leg ulcers, with the potential to improve health and decrease health care costs in this population.
Abstract:
Background: Falls are a major health and injury problem for people with Parkinson disease (PD). Despite the severe consequences of falls, a major unresolved issue is the identification of factors that predict the risk of falls in individual patients with PD. The primary aim of this study was to prospectively determine an optimal combination of functional and disease-specific tests to predict falls in individuals with PD. ----- ----- Methods: A total of 101 people with early-stage PD undertook a battery of neurologic and functional tests in their optimally medicated state. The tests included Tinetti, Berg, Timed Up and Go, Functional Reach, and the Physiological Profile Assessment of Falls Risk; the latter assessment includes physiologic tests of visual function, proprioception, strength, cutaneous sensitivity, reaction time, and postural sway. Falls were recorded prospectively over 6 months. ----- ----- Results: Forty-eight percent of participants reported a fall, and 24% reported more than one fall. In the multivariate model, a combination of the Unified Parkinson's Disease Rating Scale (UPDRS) total score, total freezing of gait score, occurrence of symptomatic postural orthostasis, Tinetti total score, and extent of postural sway in the anterior-posterior direction produced the best sensitivity (78%) and specificity (84%) for predicting falls. From the UPDRS items, only the rapid alternating task category was an independent predictor of falls. Reduced peripheral sensation and knee extension strength in fallers contributed to increased postural instability. ----- ----- Conclusions: Falls are a significant problem in optimally medicated early-stage PD. A combination of both disease-specific and balance- and mobility-related measures can accurately predict falls in individuals with PD.
Abstract:
Background: An estimated 285 million people worldwide have diabetes and its prevalence is predicted to increase to 439 million by 2030. For the year 2010, it is estimated that 3.96 million excess deaths in the age group 20-79 years are attributable to diabetes around the world. Self-management is recognised as an integral part of diabetes care. This paper describes the protocol of a randomised controlled trial of an automated interactive telephone system aiming to improve the uptake and maintenance of essential diabetes self-management behaviours. ---------- Methods/Design: A total of 340 individuals with type 2 diabetes will be randomised, either to the routine care arm, or to the intervention arm in which participants receive the Telephone-Linked Care (TLC) Diabetes program in addition to their routine care. The intervention requires the participants to telephone the TLC Diabetes phone system weekly for 6 months. They receive the study handbook and a glucose meter linked to a data uploading device. The TLC system consists of a computer with software designed to provide monitoring, tailored feedback and education on key aspects of diabetes self-management, based on answers voiced or entered during the current or previous conversations. Data collection is conducted at baseline (Time 1), 6-month follow-up (Time 2), and 12-month follow-up (Time 3). The primary outcomes are glycaemic control (HbA1c) and quality of life (Short Form-36 Health Survey version 2). Secondary outcomes include anthropometric measures, blood pressure, blood lipid profile, psychosocial measures as well as measures of diet, physical activity, blood glucose monitoring, foot care and medication taking. Information on utilisation of healthcare services including hospital admissions, medication use and costs is collected. 
An economic evaluation is also planned. ---------- Discussion: Outcomes will provide evidence concerning the efficacy of a telephone-linked care intervention for self-management of diabetes. Furthermore, the study will provide insight into the potential for more widespread uptake of automated telehealth interventions globally.