889 results for Daily living


Relevance:

60.00%

Publisher:

Abstract:

This study established that the core principle underlying the categorisation of activities has the potential to provide more comprehensive outcomes than activity recognition, because it takes into consideration activities other than directional locomotion.


The purpose of this study was to characterise the functional outcome of 12 transfemoral amputees fitted with osseointegrated fixation using temporal gait characteristics. The objectives were (A) to present the cadence and the durations of the gait cycle, support and swing phases, with an emphasis on stride-to-stride and participant-to-participant variability, and (B) to compare these temporal variables with normative data extracted from the literature for transfemoral amputees fitted with a socket and for able-bodied participants. The temporal variables were extracted from the load applied on the residuum during straight level walking, collected at 200 Hz by a transducer. A total of 613 strides were assessed. Compared with transfemoral amputees using a socket, the cadence (46±4 strides/min) was 2% quicker, the gait cycle (1.29±0.11 s) and support phase (0.73±0.07 s, 57±3% of GC) were 3% and 6% shorter, and the swing phase (0.56±0.07 s, 43±3% of GC) was 1% longer; compared with able-bodied participants, the cadence was 11% slower and the gait cycle, support and swing phases were 9%, 6% and 13% longer, respectively. Altogether, the results indicated that the fitting of an osseointegrated fixation enabled this group of amputees to restore their locomotion to a highly functional level. Further longitudinal and cross-sectional studies would be required to confirm these outcomes. Nonetheless, the data presented can be used as a benchmark for future comparisons. They can also be used as input to generic algorithms that use templates of loading patterns to recognise activities of daily living and to detect falls.
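As a rough illustration of how temporal variables can be derived from a residuum load trace, the sketch below thresholds a synthetic 200 Hz signal to segment support and swing phases. This is not the study's actual processing pipeline; the threshold value and the square-wave trace are assumptions chosen only so the output lands near the reported means.

```python
import numpy as np

FS = 200  # sampling rate (Hz), as reported in the abstract

def temporal_variables(load, threshold=50.0):
    """Derive cadence and phase durations from a residuum load trace (N).

    `threshold` is a hypothetical load level separating support from swing.
    """
    loaded = load > threshold                          # True during support
    # support onsets: 0 -> 1 transitions of the loaded flag
    starts = np.flatnonzero(np.diff(loaded.astype(int)) == 1) + 1
    cycles = np.diff(starts) / FS                      # gait-cycle durations (s)
    support = np.array([loaded[a:b].sum() / FS
                        for a, b in zip(starts[:-1], starts[1:])])
    swing = cycles - support
    return {"cadence": 60.0 / cycles.mean(),           # strides/min
            "cycle_s": cycles.mean(),
            "support_pct": 100 * (support / cycles).mean(),
            "swing_pct": 100 * (swing / cycles).mean()}

# Synthetic trace: ten 1.29 s cycles loaded for 57% of each cycle, roughly
# matching the means reported above.
spc = int(1.29 * FS)                                   # samples per cycle
idx = np.arange(10 * spc)
load = np.where(idx % spc < int(0.57 * spc), 400.0, 0.0)
print(temporal_variables(load))
```

On this idealised trace the function recovers a cadence of about 46.5 strides/min and a 57/43 support/swing split; real load signals would need smoothing and a subject-specific threshold.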


People in developed countries are living longer with the help of medical advances. The literature has shown that older people prefer to stay independent and live at home for as long as possible. It is therefore important to find out how best to accommodate and assist them in maintaining quality of life and independence, as well as easing demands on human resources. Researchers have claimed that assistive devices support older people's independence; however, only a small number of studies on the efficiency of assistive devices have been undertaken, several of which have found that devices are not being used. The overall aim of this research was to identify whether the disuse and ineffectiveness of assistive devices are related to changes in users' abilities or to the design of the devices. The objective was to gather information from the elderly: to identify which assistive devices are and are not being used, and to gain an understanding of their attitudes towards assistive devices. The research was conducted in two phases. The initial phase involved distributing questionnaires to people over the age of fifty, asking both general questions and specific questions about the types of devices being used. In Phase Two, participants from Phase One who had come into contact with assistive devices were invited to take part in a semi-structured interview about their use of, and attitudes towards, assistive devices. Findings indicated that the reasons for the disuse of assistive devices were mostly design related: bulkiness, reliability, performance of the device and difficulty of use. The other main reason for disuse was social: elderly people preferred to undertake activities on their own and to use a device only as a precaution or when absolutely necessary. They would prefer not having to rely on the devices.
Living situation and gender did not affect the preference for assistive devices over personal assistance. The majority strongly supported the idea of remaining independent for as long as possible. In conclusion, this study proposes that, through these findings, product designers will gain a better understanding of the requirements of elderly users. This will enable designers to produce assistive devices that are more practical, personalised, reliable and easy to use, and that tie in with older people's environments. Additional research with different variables is recommended to further substantiate these findings.


Presbyopia affects individuals from the age of 45 years onwards, resulting in difficulty in accurately focusing on near objects. There are many optical corrections available, including spectacles and contact lenses, designed to enable presbyopes to see clearly at both far and near distances. However, presbyopic vision corrections also disturb aspects of visual function under certain circumstances. The impact of these changes on activities of daily living such as driving is, however, poorly understood. Therefore, the aim of this study was to determine which aspects of driving performance might be affected by wearing different types of presbyopic vision corrections. In order to achieve this aim, three experiments were undertaken. The first experiment involved administration of a questionnaire to compare the subjective driving difficulties experienced when wearing a range of common presbyopic contact lens and spectacle corrections. The questionnaire was developed and piloted, and included a series of items regarding difficulties experienced while driving under daytime and night-time conditions. Two hundred and fifty-five presbyopic patients responded to the questionnaire and were categorised into five groups: those wearing no vision correction for driving (n = 50), bifocal spectacles (BIF, n = 54), progressive addition lens spectacles (PAL, n = 50), monovision (MV, n = 53) and multifocal contact lenses (MTF CL, n = 48). Overall, ratings of satisfaction during daytime driving were relatively high for all correction types. However, MV and MTF CL wearers were significantly less satisfied with aspects of their vision during night-time than daytime driving, particularly with regard to disturbances from glare and haloes.
Progressive addition lens wearers noticed more distortion of peripheral vision, while BIF wearers reported more difficulties with tasks requiring changes in focus, and those who wore no vision correction for driving reported problems with intermediate and near tasks. Overall, the mean level of satisfaction for daytime driving was quite high for all of the groups (over 80%), with the BIF wearers being the least satisfied with their vision for driving. Conversely, at night, MTF CL wearers expressed the least satisfaction. Research into eye and head movements has become of increasing interest in driving research as it provides a means of understanding how the driver responds to visual stimuli in traffic. Previous studies have found that wearing PAL can affect eye and head movement performance, resulting in slower eye movement velocities and longer times to stabilize the gaze for fixation. These changes in eye and head movement patterns may have implications for driving safety, given that the visual tasks for driving include a range of dynamic search tasks. Therefore, the second study was designed to investigate the influence of different presbyopic corrections on driving-related eye and head movements under standardized laboratory-based conditions. Twenty presbyopes (mean age: 56.1 ± 5.7 years) who had no experience of wearing presbyopic vision corrections, apart from single vision reading spectacles, were recruited. Each participant wore five different types of vision correction: single vision distance lenses (SV), PAL, BIF, MV and MTF CL. For each visual condition, participants were required to view videotape recordings of traffic scenes, track a reference vehicle and identify a series of peripherally presented targets while their eye and head movements were recorded using the faceLAB® eye and head tracking system. Digital numerical display panels were also included as near visual stimuli (simulating the visual displays of a vehicle speedometer and radio).
The results demonstrated that the path length of eye movements while viewing and responding to driving-related traffic scenes was significantly longer when wearing BIF and PAL than MV and MTF CL. The path length of head movements was greater with SV, BIF and PAL than MV and MTF CL. Target recognition was less accurate when the near stimulus was located at eccentricities inferiorly and to the left, rather than directly below the primary position of gaze, regardless of vision correction type. The third experiment aimed to investigate the real-world driving performance of presbyopes wearing different vision corrections, measured on a closed-road circuit at night. Eye movements were recorded using the ASL Mobile Eye eye-tracking system (as the faceLAB® system proved to be impractical for use outside the laboratory). Eleven participants (mean age: 57.25 ± 5.78 years) were fitted with four types of prescribed vision corrections (SV, PAL, MV and MTF CL). The measures of driving performance on the closed-road circuit included distance to sign recognition, near target recognition, peripheral light-emitting-diode (LED) recognition, recognition and avoidance of low contrast road hazards, recognition of all the road signs, time to complete the course, and driving behaviours such as braking, accelerating and cornering. The results demonstrated that driving performance at night was most affected by MTF CL compared to PAL, resulting in shorter distances to read signs, slower driving speeds, and longer times spent fixating road signs. Monovision resulted in worse performance in distance to sign recognition compared to SV and PAL. The SV condition resulted in significantly more errors in interpreting information from in-vehicle devices, despite longer times spent fixating on these devices. Progressive addition lenses were ranked as the most preferred vision correction, while MTF CL were the least preferred vision correction for night-time driving.
This thesis addressed the research question of how presbyopic vision corrections affect driving performance, and the results of the three experiments demonstrated that the different types of presbyopic vision corrections (BIF, PAL, MV and MTF CL) can affect driving performance in different ways. Distance-related driving tasks showed reduced performance with MV and MTF CL, while tasks which involved viewing in-vehicle devices were significantly hampered by wearing SV corrections. Wearing spectacles such as SV, BIF and PAL induced greater eye and head movements in the simulated driving condition; however, this did not directly translate to impaired performance on the closed-road circuit tasks. These findings are important for understanding the influence of presbyopic vision corrections on vision under real-world driving conditions. They will also assist eye care practitioners to understand and convey to patients the potential driving difficulties associated with wearing certain types of presbyopic vision corrections, and accordingly to support them in matching patients to optical corrections which meet their visual needs.


The Achilles tendon exhibits time-dependent conditioning when isometric muscle actions are of prolonged duration, compared with those involved in dynamic activities such as walking. Since the effect of short-duration muscle activation associated with dynamic activities is yet to be established, the present study aimed to investigate the effect of incidental walking activity on Achilles tendon diametral strain. Eleven healthy male participants refrained from physical activity in excess of the walking required to carry out necessary daily tasks and wore an activity monitor during the 24 h study period. Achilles tendon diametral strain, 2 cm proximal to the calcaneal insertion, was determined from sagittal sonograms. Baseline sonographic examinations were conducted at ∼08:00 h, followed by replicate examinations at 12 and 24 h. Walking activity was measured as either present (1) or absent (0) and a linear weighting function was applied to account for the proximity of walking activity to the tendon examination time. Over the course of the day the median (min, max) Achilles tendon diametral strain was −11.4 (4.5, −25.4)%. A statistically significant relationship was evident between walking activity and diametral strain (P < 0.01), and this relationship improved when walking activity was temporally weighted (AIC 131 to 126). The results demonstrate that the short yet repetitive loads generated during activities of daily living, such as walking, are sufficient to induce appreciable time-dependent conditioning of the Achilles tendon. Implications arise for the in vivo measurement of Achilles tendon properties and the rehabilitation of tendinopathy.
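The temporal-weighting idea above, comparing an unweighted against a linearly weighted activity predictor by AIC, can be sketched as follows. The hourly activity indicators, strain values, weighting window and model form here are all fabricated for illustration; they are not the study's data or its exact weighting function.

```python
import math

def aic_linear(x, y):
    """AIC of a least-squares fit y = a + b*x (k = 3 parameters:
    intercept, slope, residual variance)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    rss = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    return n * math.log(rss / n) + 2 * 3

def weighted_activity(activity, window=8.0):
    """Linearly down-weight hourly walking indicators (oldest first) by their
    distance in time from the examination; the 8 h window is an assumption."""
    hours_before = range(len(activity), 0, -1)
    return sum(a * max(0.0, 1 - h / window)
               for a, h in zip(activity, hours_before))

# Fabricated hourly walking indicators (1 = walked) for six examinations.
acts = [[1, 1, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 1, 1],
        [1, 0, 1, 0, 1, 0, 1, 0], [0, 1, 1, 1, 0, 0, 0, 0],
        [1, 1, 1, 1, 0, 0, 0, 0], [0, 0, 0, 0, 1, 1, 1, 1]]
w = [weighted_activity(a) for a in acts]        # temporally weighted score
u = [float(sum(a)) for a in acts]               # unweighted activity count
# Fabricated strains that depend mainly on recent (weighted) activity.
strain = [-5 - 2 * wi + d
          for wi, d in zip(w, [0.1, -0.1, 0.05, -0.05, 0.02, -0.02])]
aic_w, aic_u = aic_linear(w, strain), aic_linear(u, strain)
```

Because the synthetic strains track recent walking, the weighted predictor yields the lower (better) AIC, mirroring the direction of the AIC improvement reported in the abstract.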


Background: Apart from promoting physical recovery and assisting in activities of daily living, a major challenge in stroke rehabilitation is to minimize psychosocial morbidity and to promote the reintegration of stroke survivors into their family and community. The identification of key factors influencing long-term outcome is essential in developing more effective rehabilitation measures for reducing stroke-related morbidity. The aim of this study was to test a theoretical model of predictors of participation restriction which included the direct and indirect effects between psychosocial outcomes, physical outcome, and socio-demographic variables at 12 months after stroke. Methods: Data were collected from 188 stroke survivors at 12 months following their discharge from one of two rehabilitation hospitals in Hong Kong. The settings included patients' homes and residential care facilities. Path analysis was used to test a hypothesized model of participation restriction at 12 months. Results: The path coefficients show functional ability having the largest direct effect on participation restriction (β = 0.51). The results also show that more depressive symptoms (β = -0.27), low state self-esteem (β = 0.20), female gender (β = 0.13), older age (β = -0.11) and living in a residential care facility (β = -0.12) have direct effects on participation restriction. The explanatory variables accounted for 71% of the variance in participation restriction at 12 months. Conclusion: Identification of stroke survivors at risk of high levels of participation restriction, depressive symptoms and low self-esteem will assist health professionals to devise appropriate rehabilitation interventions that target improving both physical and psychosocial functioning.


Little is known about cancer survivors' experiences with and preferences for exercise programmes offered during rehabilitation (immediately after cancer treatment). This study documented colorectal cancer survivors' experiences in an exercise rehabilitation programme and their preferences for programme content and delivery. At the completion of 12 weeks of supervised exercise, 10 participants took part in one-on-one semi-structured interviews. Data from these interviews were coded, and themes were identified using qualitative software. Key findings were that most participants experienced improvements in treatment symptoms, including reduced fatigue and increased energy and confidence to do activities of daily living. They also reported that interactions with the exercise trainer and a flexible programme delivery were important aspects of the intervention. Most participants reported that they preferred having a choice of exercise, starting to exercise within a month after completing treatment, having supervision and maintaining a one-on-one format. Frustrations included scheduling conflicts and the lack of a transition out of the programme. The findings indicate that colorectal cancer survivors experience benefits from exercise offered immediately after treatment and prefer individual attention from exercise staff. They further indicate directions for the implementation of future exercise programmes with this population.


Objectives To explore the extent of and factors associated with male residents who change wandering status after nursing home admission. Design Longitudinal design with secondary data analyses. Admissions over a 4-year period were examined using repeat assessments with the Minimum Data Set (MDS) to formulate a model of the development of wandering behavior. Setting One hundred thirty-four Veterans Administration (VA) nursing homes throughout the United States. Participants 6673 residents admitted to VA nursing homes between October 2000 and October 2004. Measurements MDS variables (cognitive impairment, mood, behavior problems, activities of daily living and wandering) included ratings recorded at residents' admission to the nursing home and at a minimum of two other time points at quarterly intervals. Results The majority (86%) of the sample were classified as non-wanderers at admission, and most of these (94%) remained non-wanderers until discharge or the end of the study. Fifty-one percent of the wanderers changed status to non-wanderers, with 6% of these residents fluctuating in status more than two times. Admission variables associated with an increased risk of changing from non-wandering to wandering status included older age, greater cognitive impairment, more socially inappropriate behavior, resisting care, easier distractibility, and needing less help with personal hygiene. Requiring assistance with locomotion and having three or more medical comorbidities were associated with a decreased chance of changing from non-wandering to wandering status. Conclusion A resident's change from non-wandering to wandering status may reflect an undetected medical event that affects cognition but spares mobility.


Aim: To explore the lived experience of being a sole mother in Taiwan. Background: The number of sole mothers in Taiwan has increased by 55% in the last decade due to changes in the social and economic status of women (e.g. earlier divorce, the development of national policies for the protection of women, the rise of feminism, and changing work practices which have seen an increase in the number of women in the workforce) (Taiwan Department of Statistics, 2010). Issues confronting sole mothers as part of daily living include an inability to cope with daily life stressors, little social support, feelings of helplessness and hopelessness, and a lack of self-confidence to assume responsibility for the physical and mental health needs of themselves and their children (Cairney, 2007; Loxton, Mooney & Young, 2006; Samuels-Dennis, 2006; Waldron et al., 1996). Although a number of studies have been conducted concerning what it means to be a sole mother, few have been undertaken in Taiwan. In light of the absence of research on this topic from a Taiwanese perspective, this study was undertaken. Design: A descriptive phenomenological approach was used for this study. Methods: In-depth audio-taped interviews were conducted with 15 sole Taiwanese mothers. The audiotapes were later transcribed, translated into English, and then back-translated into Chinese to ensure the accuracy of participants' information. Colaizzi's phenomenological approach to analysis, with one additional step (eight steps in all), informed the analytical process. Findings: The process of analysis identified six central themes: 1. Enduring the burdensome, 2. Survival means living day-by-day, 3. Living in the shadows of insomnia, depression and suicidal thoughts, 4. Living with rejection and social isolation, 5. Living with uncertainty, and 6. Transcending difficult times through being resilient.
Conclusion: For the participants of this study, the lived world of Taiwanese sole mothers was replete with daily difficulties marked by isolation, loneliness, social disapproval and rejection. Feelings of sadness and dejection were their daily companions. However, amid their myriad hardships, the participants found strength and solace in their children and close friends. Rather than succumb to the pressures of being a sole mother, the participants forged new paths spurred on by their own hopes and dreams for a better future. The findings of this study have the potential to make significant contributions to extant knowledge concerning the lived experiences of sole mothers in Taiwan.


OBJECTIVES: To identify the prevalence of geriatric syndromes in the premorbid (preadmission for falls), admission, and discharge assessment periods, and the incidence of new and significantly worsened existing syndromes at admission and discharge. DESIGN: Prospective cohort study. SETTING: Three acute care hospitals in Brisbane, Australia. PARTICIPANTS: Five hundred seventy-seven general medical patients aged 70 and older admitted to the hospital. MEASUREMENTS: Prevalence of syndromes in the premorbid (or preadmission for falls), admission, and discharge periods; incidence of new syndromes at admission and discharge; and significant worsening of existing syndromes at admission and discharge. RESULTS: The most frequently reported premorbid syndromes were bladder incontinence (44%) and impairment in any activity of daily living (ADL) (42%). A high proportion (42%) experienced at least one fall in the 90 days before admission. Two-thirds of the participants experienced between one and five syndromes (cognitive impairment, dependence in any ADL item, bladder and bowel incontinence, pressure ulcer) before admission, at admission, and at discharge. A majority experienced one or two syndromes during the premorbid (49.4%), admission (57.0%), or discharge (49.0%) assessment period. The syndromes with a higher incidence of significant worsening at discharge (out of the proportion with the syndrome present premorbidly) were ADL limitation (33%), cognitive impairment (9%), and bladder incontinence (8%). Of the syndromes examined at discharge, a higher proportion of patients experienced the following new syndromes (absent premorbidly): ADL limitation (22%) and bladder incontinence (13%). CONCLUSION: Geriatric syndromes were highly prevalent. Many patients did not return to their premorbid function and acquired new syndromes.


Background and significance: Older adults with chronic diseases are at increasing risk of hospital admission and readmission. Approximately 75% of adults have at least one chronic condition, and the odds of developing a chronic condition increase with age. Chronic diseases consume about 70% of the total Australian health expenditure, and about 59% of hospital events for chronic conditions are potentially preventable. These figures have brought to light the importance of the management of chronic disease among the growing older population. Many studies have endeavoured to develop effective chronic disease management programs by applying social cognitive theory. However, few studies have focused on chronic disease self-management in older adults at high risk of hospital readmission. Moreover, although the majority of studies have covered wide and valuable outcome measures, there is scant evidence examining fundamental health outcomes such as nutritional status, functional status and health-related quality of life. Aim: The aim of this research was to test social cognitive theory in relation to self-efficacy in managing chronic disease and three health outcomes, namely nutritional status, functional status, and health-related quality of life, in older adults at high risk of hospital readmission. Methods: A cross-sectional study design was employed for this research. Three studies were undertaken. Study One examined nutritional status and the validation of a nutritional screening tool; Study Two explored the relationships between participants' characteristics, self-efficacy beliefs, and health outcomes based on the study's hypothesized model; Study Three tested a theoretical model based on social cognitive theory, examining potential mechanisms for the mediation effects of social support and self-efficacy beliefs.
One hundred and fifty-seven patients aged 65 years and older with a medical admission and at least one risk factor for readmission were recruited. Data were collected from medical records on demographics and medical history, and from self-report questionnaires. The nutrition data were collected by two registered nurses. For Study One, a contingency table and the kappa statistic were used to determine the validity of the Malnutrition Screening Tool. In Study Two, standard multiple regression, hierarchical multiple regression and logistic regression were undertaken to determine the significant predictors for the three health outcome measures. For Study Three, a structural equation modelling approach was taken to test the hypothesized self-efficacy model. Results: The findings of Study One suggested that a high prevalence of malnutrition continues to be a concern in older adults, as the prevalence of malnutrition was 20.6% according to the Subjective Global Assessment. Additionally, the findings confirmed that the Malnutrition Screening Tool is a valid nutritional screening tool for hospitalized older adults at risk of readmission when compared with the Subjective Global Assessment, with high sensitivity (94%) and specificity (89%) and substantial agreement between the two methods (k = .74, p < .001; 95% CI .62-.86). Analysis of the data for Study Two found that depressive symptoms and perceived social support were the two strongest influential factors for self-efficacy in managing chronic disease in a hierarchical multiple regression. Results of multivariable regression models suggested that advancing age, depressive symptoms and less tangible support were three important predictors of malnutrition. In terms of functional status, a standard regression model found that social support was the strongest predictor of Instrumental Activities of Daily Living, followed by self-efficacy in managing chronic disease.
The results of standard multiple regression revealed that the number of hospital readmission risk factors adversely affected the physical component score, while depressive symptoms and self-efficacy beliefs were two significant predictors of the mental component score. In Study Three, the structural equation modelling found that self-efficacy partially mediated the effect of health characteristics and depression on health-related quality of life. The health characteristics had strong direct effects on functional status and body mass index. The results also indicated that social support partially mediated the relationship between health characteristics and functional status. With regard to the joint effects of social support and self-efficacy, social support fully mediated the effect of health characteristics on self-efficacy, and self-efficacy partially mediated the effect of social support on functional status and health-related quality of life. The results also demonstrated that the models fitted the data well, with relatively high variance explained, implying that the hypothesized constructs were highly relevant; hence the application of social cognitive theory in this context was supported. Conclusion: This thesis highlights the applicability of social cognitive theory to chronic disease self-management in older adults at risk of hospital readmission. Further studies are recommended to validate and extend the development of social cognitive theory in chronic disease self-management in older adults, to improve their nutritional and functional status and health-related quality of life.
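The Study One validation statistics (sensitivity, specificity, Cohen's kappa) all follow from a single 2x2 contingency table of screening-tool result against the reference assessment. The sketch below shows that calculation; the counts are hypothetical ones chosen to roughly match the reported prevalence and agreement, not the study's data.

```python
def screening_agreement(tp, fp, fn, tn):
    """Sensitivity, specificity and Cohen's kappa from a 2x2 table of
    screening-tool result vs. reference assessment."""
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    po = (tp + tn) / n                                  # observed agreement
    # chance agreement expected from the marginal totals
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    kappa = (po - pe) / (1 - pe)
    return sensitivity, specificity, kappa

# Hypothetical counts for 157 patients with ~21% malnutrition prevalence.
sens, spec, kappa = screening_agreement(tp=30, fp=14, fn=2, tn=111)
```

With these invented counts the function returns sensitivity ≈ 0.94, specificity ≈ 0.89 and kappa ≈ 0.72, in the same region as the values the abstract reports.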


Background: Ankle fractures are one of the more commonly occurring forms of trauma managed by orthopaedic teams worldwide. The impacts of these injuries are not restricted to the pain and disability caused at the time of the incident, but may also include long-term physical, psychological, and social consequences. There are currently no ankle fracture specific patient-reported outcome measures with a robust content foundation. This investigation aimed to develop a thematic conceptual framework of life impacts following ankle fracture from the experiences of people who have sustained ankle fractures as well as the health professionals who treat them. Methods: A qualitative investigation was undertaken using in-depth semi-structured interviews with people (n=12) who had previously sustained an ankle fracture (patients) and health professionals (n=6) who treat people with ankle fractures. Interviews were audio-recorded and transcribed. Each phrase was individually coded, grouped into categories and aligned under emerging themes by two independent researchers. Results: Saturation occurred after 10 in-depth patient interviews. Time since injury for patients ranged from 6 weeks to more than 2 years. Experience of health professionals ranged from 1 year to 16 years working with people with ankle fractures. Health professionals included an orthopaedic surgeon (1), physiotherapists (3), a podiatrist (1) and an occupational therapist (1). The emerging framework derived from patient data included eight themes (Physical, Psychological, Daily Living, Social, Occupational and Domestic, Financial, Aesthetic and Medication Taking). Health professional responses did not reveal any additional themes, but tended to focus on physical and occupational themes. Conclusions: The nature of life impact following ankle fractures can extend beyond short-term pain and discomfort into many areas of life.
The findings from this research have provided an empirically derived framework from which a condition-specific patient-reported outcome measure can be developed.


Prompted by the continuing transition to community care, mental health nurses are considering the role of social support in community adaptation. This article demonstrates the importance of distinguishing between kinds of social support and presents findings from the first-round data of a longitudinal study of community adaptation in 156 people with schizophrenia conducted in Brisbane, Australia. All clients were interviewed using the relevant subscales of the Diagnostic Interview Schedule to confirm a primary diagnosis of schizophrenia. The study set out to investigate the relationship between community adaptation and social support. Community adaptation was measured with the Brief Psychiatric Rating Scale (BPRS), the Life Skills Profile (LSP) and measures of dissatisfaction with life and problems in daily living developed by the authors. Social support was measured with the Arizona Social Support Interview Schedule (ASSIS). The BPRS and ASSIS were incorporated into a client interview conducted by trained interviewers. The LSP was completed on each client by an informal carer (parent, relative or friend) or a professional carer (case manager or other health professional) nominated by the client. Hierarchical regression analysis was used to examine the relationship between community adaptation and four sets of social support variables. Given the order in which variables were entered in the regression equations, a set of perceived social support variables was found to account for the largest unique variance in the four measures of community adaptation among the 96 people with schizophrenia for whom complete data were available from the first round of the three-wave longitudinal study. A set of the clients' subjective experiences accounted for the largest unique variance in measures of symptomatology, life skills, dissatisfaction with life, and problems in daily living. Sets of community support, household support and functional variables accounted for less variance.
Implications for mental health nursing practice are considered.
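The hierarchical regression described above attributes unique variance to a set of predictors by entering it after the other sets and taking the increment in R². A minimal sketch of that ΔR² logic is below; the data are synthetic and the variable-set names ("household", "perceived") are assumptions for illustration, not the study's actual measures.

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an OLS fit of y on X (a column of ones is added for the intercept)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

rng = np.random.default_rng(0)
n = 96                                   # size of the complete-data subsample
household = rng.normal(size=(n, 2))      # set entered first (names assumed)
perceived = rng.normal(size=(n, 2))      # set entered last
# Synthetic outcome driven mainly by perceived support, echoing the abstract.
y = perceived @ np.array([0.8, 0.5]) + household @ np.array([0.2, 0.1]) \
    + rng.normal(scale=0.5, size=n)

r2_base = r_squared(household, y)
r2_full = r_squared(np.column_stack([household, perceived]), y)
delta_r2 = r2_full - r2_base             # unique variance of the last-entered set
```

Because entry order determines which set absorbs shared variance, the same data can yield different "unique" contributions under a different ordering, which is why the abstract qualifies its finding with "given the order in which variables were entered".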


The aim of this study was to perform a biomechanical analysis of the cement-in-cement (c-in-c) technique for fixation of selected Vancouver Type B1 femoral periprosthetic fractures and to assess the degree of cement interposition at the fracture site. Six embalmed cadaveric femora were implanted with a cemented femoral stem. Vancouver Type B1 fractures were created by applying a combined axial and rotational load to failure. The femora were repaired using the c-in-c technique and reloaded to failure. The mean primary fracture torque was 117 Nm (SD 16.6, range 89–133). The mean revision fracture torque was 50 Nm (SD 16.6, range 29–74), which is above the torque previously observed for activities of daily living. Cement interposition at the fracture site was found to be minimal.


Background The largest proportion of cancer patients are aged 65 years and over. Increasing age is also associated with nutritional risk and multi-morbidities, factors which complicate the cancer treatment decision-making process in older patients. Objectives To determine whether malnutrition risk and Body Mass Index (BMI) are associated with key oncogeriatric variables as potential predictors of chemotherapy outcomes in geriatric oncology patients with solid tumours. Methods In this longitudinal study, geriatric oncology patients (aged ≥65 years) received a Comprehensive Geriatric Assessment (CGA) for baseline data collection prior to the commencement of chemotherapy treatment. Malnutrition risk was assessed using the Malnutrition Screening Tool (MST) and BMI was calculated from anthropometric data. Nutritional risk was compared with other variables collected as part of the standard CGA. Associations were determined by chi-square tests and correlations. Results Over half of the 175 geriatric oncology patients were at risk of malnutrition (53.1%) according to the MST. BMI ranged from 15.5 to 50.9 kg/m2, with 35.4% of the cohort overweight when compared to geriatric cutoffs. Malnutrition risk was more prevalent in those who were underweight (70%), although many overweight participants also presented as at risk (34%). Malnutrition risk was associated with a diagnosis of colorectal or lung cancer (p=0.001), dependence in activities of daily living (p=0.015) and impaired cognition (p=0.049). Malnutrition risk was positively associated with vulnerability to intensive cancer therapy (rho=0.16, p=0.038). Larger BMI was associated with a greater number of multi-morbidities (rho=0.27, p=0.001). Conclusions Malnutrition risk is prevalent among geriatric patients undergoing chemotherapy, is more common in colorectal and lung cancer diagnoses, is associated with impaired functionality and cognition, and negatively influences the ability to complete planned intensive chemotherapy.
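The chi-square associations reported above come from cross-tabulating malnutrition risk against categorical CGA variables. A minimal sketch of the Pearson chi-square statistic for such a 2x2 table follows; the counts are hypothetical (chosen only to be consistent with the 175-patient cohort and 53.1% at-risk rate), not the study's data.

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    chi2 = 0.0
    for obs, r, col in ((a, row1, col1), (b, row1, col2),
                        (c, row2, col1), (d, row2, col2)):
        expected = r * col / n          # expected count under independence
        chi2 += (obs - expected) ** 2 / expected
    return chi2

# Hypothetical split of the 175 patients: 50 of 70 colorectal/lung patients
# at risk of malnutrition vs. 43 of 105 patients with other tumours.
chi2 = chi_square_2x2(50, 20, 43, 62)
```

With one degree of freedom, a statistic of this size corresponds to p well below 0.01, the same direction of association the abstract reports for colorectal or lung cancer diagnoses.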