Abstract:
Individuals with limb amputation fitted with conventional socket-suspended prostheses often experience socket-related discomfort leading to a significant decrease in quality of life.[1-14] Most of these concerns can be overcome with osseointegration, a direct skeletal fixation method in which the prosthetic componentry is attached directly to the fixation, making the traditional socket system redundant. Osseointegration is performed in two stages: in stage one, a titanium implant is inserted into the marrow space of the residual limb bone; in stage two, a titanium extension is attached to the fixture. This surgical procedure is currently growing rapidly worldwide, particularly within Queensland. Whilst providing improvements in quality of life, this new method also has the potential to reduce the cost required for an amputee to ambulate during daily living. Thus, the aim of this project was to compare the differences in mean cost of services, cost of componentry and labour hours between osseointegration and traditional socket-based prostheses. Data were extracted from the Queensland Artificial Limb Services (QALS) database to determine the cost of services, type of services and labour hours required to maintain a prosthetic limb. Five male trans-femoral amputee participants (age 46.4±10.1 yrs; height 175.4±16.3 cm; mass 83.8±14.0 kg; time since second stage 22.0±8.1 mths) met the inclusion criterion of being more than 12 months post the stage two osseointegration procedure. The socket and osseointegration prosthesis variables examined were the mean hours of labour, mean cost of services and mean cost of prosthetic componentry. Statistical analyses were conducted using an ANOVA. The results identified significant differences only in the number of labour hours (p = 0.005) and cost of services (p = 0.021) between the socket and osseointegration prosthetic types. These results indicate that the cost of componentry was comparable between the two methods.
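As an illustration of the statistical comparison described above, here is a minimal sketch of a one-way ANOVA on per-participant labour hours using SciPy; all figures are hypothetical placeholders rather than QALS data.

```python
# Minimal sketch of the kind of one-way ANOVA comparison described above,
# using SciPy. The values below are hypothetical placeholders, not figures
# from the QALS database.
from scipy import stats

# Hypothetical annual labour hours, one value per participant per prosthesis type.
labour_hours_socket = [14.0, 18.5, 12.0, 20.0, 16.5]
labour_hours_osseo = [6.0, 8.5, 5.0, 9.0, 7.5]

f_stat, p_value = stats.f_oneway(labour_hours_socket, labour_hours_osseo)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
# A p-value below 0.05 would indicate a significant difference in mean
# labour hours between socket and osseointegration prostheses.
```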
Abstract:
Low levels of physical fitness often affect other aspects of living for people with intellectual disability (ID), such as dependency in carrying out activities of daily living; conversely, high levels of dependency in activities of daily living are often found where fitness is poor. The aim of the study was to explore the criterion validity of the Barthel index against a physical fitness test. An observational cross-sectional study was conducted. Data from the Barthel index and a physical fitness test were collected for 122 adults with intellectual disability. The data were analysed to determine the relationship between four categories of the physical fitness test and the Barthel index. Notably, the correlations between the Barthel index and leg, abdominal and arm strength confirm that these physical tests are predictive of the Barthel index. The balance variables, functional reach and single-leg stance with eyes open, also showed relationships with the Barthel index. We found important correlations between the physical fitness test and the Barthel index, so we can affirm that some physical fitness features are predictor variables of the Barthel index.
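To make the criterion-validity analysis concrete, the following is a minimal sketch of the kind of correlation computed above; the Barthel and strength scores are hypothetical placeholders, not study data.

```python
# Minimal sketch of a criterion-validity analysis: correlating Barthel index
# scores with a physical fitness measure. All scores are hypothetical.
import numpy as np
from scipy import stats

barthel = np.array([85, 90, 70, 100, 95, 60, 80, 75])      # Barthel index (0-100)
leg_strength = np.array([22, 25, 15, 30, 28, 10, 20, 17])  # e.g., sit-to-stand reps

r, p = stats.pearsonr(barthel, leg_strength)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
# A strong, significant correlation would support treating leg strength
# as a predictor variable of the Barthel index.
```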
Abstract:
Obstructive sleep apnoea (OSA) is a chronic condition in which the upper airways collapse repeatedly during sleep, completely or partially obstructing breathing. This obstruction leads to chronic intermittent hypoxia and severe sleep fragmentation, disrupting the restorative functions of sleep. Beebe and Gozal (2002) developed a theory which hypothesises that disruption of the restorative functions of sleep leads to chronic, low-level brain damage most evident in executive functions (EF). Neuropsychological testing of EF, volumetric MRI, magnetic resonance spectroscopy, event-related potentials and CSF biomarkers all provide support for this theory. Little research has been done to explore the nature of the subjective complaint and its impact on the activities of daily living.
Abstract:
Individuals with limb amputation fitted with conventional socket-suspended prostheses often experience socket-related discomfort leading to a significant decrease in quality of life. Bone-anchored prostheses are increasingly acknowledged as a viable alternative method of attaching an artificial limb. In this case, the prosthesis is attached directly to the residual skeleton through a percutaneous fixation. To date, a few osseointegration fixations are commercially available, and several devices are at different stages of development, particularly in Europe and the US.[1-15] Clearly, the number of surgical procedures is growing rapidly worldwide; indeed, Australia, and Queensland in particular, have one of the fastest growing populations of recipients. Previous studies involving either screw-type implants or press-fit fixations for bone anchorage have focused on biomechanical aspects as well as the clinical benefits and safety of the procedure. In principle, bone-anchored prostheses should eliminate the lifetime expenses associated with sockets and, consequently, potentially alleviate the financial burden of amputation for governmental organizations. Unfortunately, publications focusing on cost-effectiveness are sparse. In fact, only one study, published by Haggstrom et al (2012), reported that “despite significantly fewer visits for prosthetic service the annual mean costs for osseointegrated prostheses were comparable with socket-suspended prostheses”. Consequently, governmental organizations such as Queensland Artificial Limb Services (QALS) face a number of challenges in adjusting financial assistance schemes that should be fair and equitable to their clients fitted with bone-anchored prostheses. Clearly, more scientific evidence extracted from governmental databases is needed to further consolidate the analyses of the financial burden associated with both methods of attachment (i.e., conventional socket prostheses, bone-anchored prostheses). The purpose of the presentation will be to share the current outcomes of a cost-analysis study led by QALS. The specific objectives will be: • To outline methodological avenues to assess the cost-effectiveness of bone-anchored prostheses compared to conventional socket prostheses, • To highlight the potential obstacles and limitations in cost-effectiveness analyses of bone-anchored prostheses, • To present cohort results of a cost-effectiveness analysis (QALY vs cost), including the determination of fair incremental cost-effectiveness ratios (ICERs), as well as a cost-benefit analysis focusing on comparing costs and key outcome indicators (e.g., QTFA, TUG, 6MWT, activities of daily living) over QALS funding cycles for both methods of attachment.
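Since the presentation centres on determining fair ICERs, a brief sketch of the standard ICER calculation may help; all cost and QALY figures below are hypothetical placeholders.

```python
# Sketch of the standard incremental cost-effectiveness ratio (ICER)
# calculation referenced above. All figures are hypothetical placeholders.
def icer(cost_new: float, cost_old: float, qaly_new: float, qaly_old: float) -> float:
    """ICER = (difference in cost) / (difference in QALYs gained)."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical mean values per patient over a QALS funding cycle:
# bone-anchored prosthesis vs conventional socket prosthesis.
ratio = icer(cost_new=25000.0, cost_old=18000.0, qaly_new=6.5, qaly_old=5.8)
print(f"ICER = {ratio:.0f} $/QALY")
# The ICER is then compared against a willingness-to-pay threshold to
# judge whether bone-anchored prostheses are cost-effective.
```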
Abstract:
Objective People with chronic liver disease, particularly those with decompensated cirrhosis, experience several potentially debilitating complications that can have a significant impact on activities of daily living and quality of life. These impairments, combined with the associated complex treatment, mean that they face specific and high levels of supportive care needs. We aimed to review reported perspectives, experiences and concerns of people with chronic liver disease worldwide. This information is necessary to guide development of policies around supportive needs screening tools and to enable prioritisation of support services for these patients. Design Systematic searches of PubMed, MEDLINE, CINAHL and PsycINFO from the earliest records until 19 September 2014. Data were extracted using standardised forms. A qualitative, descriptive approach was used to analyse and synthesise the data. Results The initial search yielded 2598 reports: 26 studies reporting supportive care needs among patients with chronic liver disease were included, but few captured patient-reported needs, none used a validated liver disease-specific supportive care needs assessment instrument, and only three included patients with cirrhosis. Five key domains of supportive care needs were identified: informational or educational (eg, educational material, educational sessions), practical (eg, daily living), physical (eg, controlling pruritus and fatigue), patient care and support (eg, support groups), and psychological (eg, anxiety, sadness). Conclusions While several key domains of supportive care needs were identified, most studies included hepatitis patients. There is a paucity of literature describing the supportive care needs of the chronic liver disease population likely to have the most needs, namely those with cirrhosis. Assessing the supportive care needs of people with chronic liver disease has potential utility in clinical practice for facilitating timely referrals to support services.
Abstract:
A large population-based survey of persons with multiple sclerosis (MS) and their caregivers was conducted in Ontario using self-completed mailed questionnaires. The objectives included describing assistance arrangements, needs, and use of and satisfaction with services, and comparing perceptions of persons with MS and their caregivers. Response rates were 83% and 72% for those with MS and caregivers, respectively. Based on 697 respondents with MS whose mean age is 48 years, 70% are female, and 75% are married. While 24% experience no mobility restrictions, the majority require some type of aid or a wheelchair for getting around. Among 345 caregivers, who have been providing care for 9 years on average, the majority are spouses. Caregivers report providing more frequent care than do persons with MS report receiving it, particularly for the following activities of daily living: eating, meal preparation, and help with personal finances. Caregivers also report assistance of longer duration per day than do care recipients with MS. Frequency and duration of assistance are positively associated with increased MS symptom severity and reduced mobility. Generally there is no rural-urban disparity in service provision, utilization or satisfaction, and although there is a wide range of service utilization, satisfaction is consistently high. Respite care is rarely used by caregivers. Use of several services is positively associated with increased severity of MS symptoms and reduced mobility. Assistance arrangements and use of services, each from the point of view of persons with MS and their caregivers, must be taken into account in efforts to prolong home care and to postpone early institutionalization of persons with MS.
Abstract:
REVIEW QUESTION/OBJECTIVE The quantitative objectives are to identify the impact of curative colorectal cancer treatment (surgery or adjuvant therapy) on physical activity, functional status and quality of life within one year of treatment or diagnosis. INCLUSION CRITERIA Types of participants: This review will consider studies that include individuals aged 18 years and over who have been diagnosed with colorectal cancer. Types of intervention(s)/phenomena of interest: This review will consider studies that evaluate the impact of curative colorectal cancer treatment: surgery and/or adjuvant therapy. Types of outcomes: This review will consider studies that include the following outcome measures assessed within one year of diagnosis or treatment: Physical activity – any bodily movement produced by skeletal muscles resulting in energy expenditure. Physical activity is not exclusive to exercise; activities can also include walking, housework, and occupational or leisure pursuits. Physical activity can be measured objectively using pedometers or accelerometers, or subjectively using self-reported measures. Functional status – measured as the capacity to perform all activities of daily living such as walking, showering and eating, and instrumental activities of daily living such as (but not limited to) grocery shopping, housekeeping and laundry. Quality of life – defined as the individual's sense of mental, physical and psychosocial wellbeing, as measured by validated tools such as the SF-36, EORTC-QLQ-C30 or FACT-C.
Abstract:
The aims of this study were to investigate outcome and to evaluate areas of potential ongoing concern after orthotopic liver transplantation (OLT) in children. Actuarial survival in relation to age and degree of undernutrition at the time of OLT was evaluated in 53 children (age 0.58-14.2 years) undergoing OLT for end-stage liver disease. Follow-up studies of growth and quality of life were undertaken in those with a minimum follow-up period of 12 months (n = 26). The overall 3 year actuarial survival was 70%. Survival rates did not differ between age groups (actuarial 2 year survival for ages <1, 1-5 and >5 years was 70, 70 and 69% respectively) but did differ according to nutritional status at OLT (actuarial 2 year survival for children with Z scores for weight <-1 was 57%, >-1 was 95%; P = 0.004). Significant catch-up weight gain was observed by 18 months post-transplant, while height improved less rapidly. Quality of life (assessed by the Vineland Adaptive Behaviour Scales, incorporating socialization, daily living skills, communication and motor skills) was good (mean composite score 91 ± 19). All school-aged children except one were attending normal school. Two children had mild to moderate intellectual handicap related to post-operative intracerebral complications. Satisfactory long-term survival can be achieved after OLT in children regardless of age, but the importance of pre-operative nutrition is emphasized. Survivors have an excellent chance of a good quality of life and of satisfactory catch-up weight gain and growth.
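The study reports actuarial (life-table) survival; as a closely related illustration, here is a minimal sketch of a log-rank comparison of survival by nutritional status using the lifelines library, with hypothetical durations and event flags rather than patient data.

```python
# Sketch of a survival comparison in the spirit of the analysis above,
# using the lifelines library. Durations and event flags are hypothetical
# placeholders, not patient data from the study.
from lifelines.statistics import logrank_test

# Hypothetical follow-up times (years) and death indicators (1 = died),
# split by nutritional status at OLT (weight Z score below/above -1).
time_low_z = [0.5, 1.0, 2.0, 2.0, 1.5, 2.0, 0.8]
event_low_z = [1, 1, 0, 0, 1, 0, 1]
time_high_z = [2.0, 2.0, 2.0, 1.8, 2.0, 2.0, 2.0]
event_high_z = [0, 0, 0, 1, 0, 0, 0]

result = logrank_test(time_low_z, time_high_z,
                      event_observed_A=event_low_z,
                      event_observed_B=event_high_z)
print(f"log-rank p = {result.p_value:.3f}")
# A small p-value would mirror the reported survival difference by
# nutritional status (57% vs 95% two-year survival, P = 0.004).
```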
Abstract:
Sit-to-stand (STS) tests measure the ability to get up from a chair, reproducing an important component of daily living activity. As this functional task is essential for human independence, STS performance has been studied over recent decades using several methods, including electromyography. The aim of this study was to measure muscular activity and fatigue during different repetitions and speeds of STS tasks using surface electromyography of lower-limb and trunk muscles. This cross-sectional study recruited 30 healthy young adults. Average muscle activation, percentage of maximum voluntary contraction (MVC), muscle involvement in motion and fatigue were measured using surface electrodes placed on the medial gastrocnemius (MG), biceps femoris (BF), vastus medialis of the quadriceps (QM), rectus abdominis (AR), erector spinae (ES), rectus femoris (RF), soleus (SO) and tibialis anterior (TA). Five-repetition STS, 10-repetition STS and 30-second STS variants were performed. The MG, BF, QM, ES and RF muscles showed differences in muscle activation, while the QM, AR and ES muscles showed significant differences in MVC percentage. Significant differences in fatigue were also found in the QM muscle between the different STS tests. There was no statistically significant fatigue in the BF, MG and SO muscles of the leg, although there appeared to be a trend of increasing fatigue. These results could be useful in describing the functional movements of the STS test used in rehabilitation programs, notwithstanding that they were measured in healthy young subjects.
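For readers unfamiliar with the surface-EMG measures involved, the following is a minimal sketch of two standard quantities implied above: RMS amplitude (activation level) and median frequency (whose decline over repeated cycles indicates fatigue). The signal and sampling rate are assumptions, not study data.

```python
# Sketch of two common surface-EMG measures: RMS amplitude (activation)
# and median frequency (fatigue indicator). The signal here is synthetic.
import numpy as np

fs = 1000  # sampling rate in Hz (assumed)
rng = np.random.default_rng(0)
emg = rng.normal(0.0, 0.1, size=5 * fs)  # 5 s of synthetic EMG (arbitrary units)

# RMS amplitude: overall activation level of the muscle.
rms = np.sqrt(np.mean(emg ** 2))

# Median frequency: the frequency that splits the power spectrum in half.
freqs = np.fft.rfftfreq(emg.size, d=1.0 / fs)
power = np.abs(np.fft.rfft(emg)) ** 2
cumulative = np.cumsum(power)
median_freq = freqs[np.searchsorted(cumulative, cumulative[-1] / 2)]

print(f"RMS = {rms:.3f}, median frequency = {median_freq:.1f} Hz")
# Tracking median frequency across the 5-rep, 10-rep and 30-second STS
# variants would reveal the kind of fatigue trends reported for the QM muscle.
```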
Abstract:
Use of socket prostheses
Currently, for individuals with limb loss, the conventional method of attaching a prosthetic limb relies on a socket that fits over the residual limb. However, there are a number of issues concerning the use of a socket (e.g., blisters, irritation, and discomfort) that result in dissatisfaction with socket prostheses, and these ultimately lead to a significant decrease in quality of life.
Bone-anchored prostheses
Alternatively, the concept of attaching artificial limbs directly to the skeletal system has been developed (bone-anchored prostheses), as it alleviates many of the issues surrounding the conventional socket interface. Bone-anchored prostheses rely on two critical components: the implant, and the percutaneous abutment or adapter, which forms the connection for the external prosthetic system (Figure 1). To date, an implant that screws into the long bone of the residual limb has been the most common intervention. However, more recently, press-fit implants have been introduced and their use is increasing. Several other devices are currently at various stages of development, particularly in Europe and the United States.
Benefits of bone-anchored prostheses
Several key studies have demonstrated that bone-anchored prostheses have major clinical benefits when compared to socket prostheses (e.g., quality of life, prosthetic use, body image, hip range of motion, sitting comfort, ease of donning and doffing, osseoperception (proprioception), walking ability) and acceptable safety, in terms of implant stability and infection. Additionally, this method of attachment allows amputees to participate in a wide range of daily activities for a substantially longer duration. Overall, the system has demonstrated a significant enhancement to quality of life.
Challenges of direct skeletal attachment
However, due to the direct skeletal attachment, serious injury and damage can occur through excessive loading events such as a fall (e.g., component damage, peri-prosthetic fracture, hip dislocation, and femoral head fracture). These incidents are costly (e.g., replacement of components) and could require further surgical interventions. Currently, these risks are limiting the acceptance of bone-anchored technology and the substantial improvement to quality of life that this treatment offers. An in-depth investigation into these risks highlighted a clear need to re-design and improve the componentry in the system (Figure 2), to improve overall safety during excessive loading events.
Aim and purposes
The ultimate aim of this doctoral research is to improve the loading safety of bone-anchored prostheses, reducing the incidence of injury and damage through the design of load-restricting components and enabling individuals fitted with the system to partake in everyday activities with increased security and self-assurance. The safety component will be designed to release or ‘fail’ external to the limb, in a way that protects the internal bone-implant interface, thus removing the need for restorative surgery and avoiding potential damage to the bone. This requires detailed knowledge of the loads typically experienced by the limb and an understanding of potential overload situations that might occur. Hence, a comprehensive review of the loading literature surrounding bone-anchored prostheses will be conducted as part of this project, with the potential for additional experimental studies of the loads during normal activities to fill gaps in the literature.
This information will be pivotal in determining the specifications for the properties of the safety component, and the bone-implant system. The project will follow the Stanford Biodesign process for the development of the safety component.
Abstract:
To improve the rehabilitation program of individuals with transfemoral amputation fitted with a bone-anchored prosthesis based on direct measurements of the load applied on the residuum, we first need to understand the load applied on the fixation. The load applied on the residuum was therefore first measured directly during standardized activities of daily living, such as straight-line level walking, ascending and descending stairs and a ramp, and walking around a circle. Beyond these standardized activities of daily living, the load was also measured during different phases of the rehabilitation program, such as walking with walking aids and load bearing exercises (LBE).[1-15] The rehabilitation program for individuals with a transfemoral amputation fitted with an OPRA implant relies on a combination of dynamic and static load bearing exercises.[16-20] This presentation will focus on the study of a set of experimental static load bearing exercises.[1] A group of eleven individuals with unilateral transfemoral amputation fitted with an OPRA implant participated in this study. The load on the implant during the static load bearing exercises was measured using a portable system including a commercial transducer embedded in a short pylon, a laptop and a customized software package. This apparatus was previously shown to be effective in a proof-of-concept study published by Prof. Frossard.[1-9] The analysis of the static load bearing exercises included an analysis of reliability as well as loading compliance. The analysis of loading reliability showed high reliability between loading sessions, indicating correct repetition of the LBE by the participants.[1, 5] The analysis of loading compliance showed a significant lack of axial compliance, leading to a systematic underloading of the long axis of the implant during the proposed experimental static LBE.
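As a simple illustration of the loading-compliance analysis described above, here is a minimal sketch comparing measured axial loads against a prescribed target; the load values and target are hypothetical placeholders, not transducer data from the study.

```python
# Sketch of a loading-compliance calculation: comparing the axial load
# measured by a pylon-mounted transducer against the prescribed target.
# All loads are hypothetical placeholders.
import numpy as np

prescribed_load_n = 400.0  # target axial load for the exercise, in newtons
measured_loads_n = np.array([310.0, 295.0, 330.0, 305.0, 320.0])  # per trial

compliance = measured_loads_n / prescribed_load_n  # 1.0 = full compliance
print(f"mean compliance = {compliance.mean():.2f} "
      f"(SD {compliance.std(ddof=1):.2f})")
# Mean compliance well below 1.0 across sessions would correspond to the
# systematic underloading of the implant's long axis reported above.
```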
Abstract:
Objectives To investigate medication changes for older patients admitted to hospital and to explore associations between patient characteristics and polypharmacy. Design Prospective cohort study. Participants and setting Patients aged 70 years or older admitted to general medical units of 11 acute care hospitals in two Australian states between July 2005 and May 2010. All patients were assessed using the interRAI assessment system for acute care. Main outcome measures Measures of physical, cognitive and psychosocial functioning; and number of regular prescribed medications categorised into three groups: non-polypharmacy (0–4 drugs), polypharmacy (5–9 drugs) and hyperpolypharmacy (≥ 10 drugs). Results Of 1220 patients who were recruited for the study, medication records at admission were available for 1216. Mean age was 81.3 years (SD, 6.8 years), and 659 patients (54.2%) were women. For the 1187 patients with complete medication records on admission and discharge, there was a small but statistically significant increase in mean number of regular medications per day between admission and discharge (7.1 v 7.6), while the prevalence of medications such as statins (459 [38.7%] v 457 [38.5%] patients), opioid analgesics (155 [13.1%] v 166 [14.0%] patients), antipsychotics (59 [5.0%] v 65 [5.5%] patients) and benzodiazepines (122 [10.3%] v 135 [11.4%] patients) did not change significantly. Being in a higher polypharmacy category was significantly associated with increase in comorbidities (odds ratio [OR], 1.27; 95% CI, 1.20–1.34), presence of pain (OR, 1.31; 1.05–1.64), dyspnoea (OR, 1.64; 1.30–2.07) and dependence in terms of instrumental activities of daily living (OR, 1.70; 1.20–2.41). Hyperpolypharmacy was observed in 290/1216 patients (23.8%) at admission and 336/1187 patients (28.3%) on discharge, and the proportion of preventive medication in the hyperpolypharmacy category at both points in time remained high (1209/3371 [35.9%] at admission v 1508/4117 [36.6%] at discharge). Conclusions Polypharmacy is common among older people admitted to general medical units of Australian hospitals, with no clinically meaningful change to the number or classification (symptom control, prevention or both) of drugs made by treating physicians.
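A minimal sketch of the three-level medication-count classification used above; the thresholds come from the abstract, while the function itself is our illustration.

```python
# Sketch of the medication-count categorisation described in the study.
# Thresholds are taken from the abstract; the function name is hypothetical.
def polypharmacy_category(n_regular_medications: int) -> str:
    """Classify a patient's regular medication count."""
    if n_regular_medications >= 10:
        return "hyperpolypharmacy"   # >= 10 drugs
    if n_regular_medications >= 5:
        return "polypharmacy"        # 5-9 drugs
    return "non-polypharmacy"        # 0-4 drugs

print(polypharmacy_category(7))   # -> polypharmacy
print(polypharmacy_category(12))  # -> hyperpolypharmacy
```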
Abstract:
Older populations are more likely to have multiple co-morbid diseases requiring multiple treatments, which makes them large consumers of medications. As a person grows older, their ability to tolerate medications declines due to age-related changes in pharmacokinetics and pharmacodynamics, often heading along a path that leads to frailty. Frail older persons often have multiple co-morbidities with signs of impairment in activities of daily living. Prescribing drugs for these vulnerable individuals is difficult and potentially unsafe. Inappropriate prescribing in the older population can be detected using explicit (criterion-based) or implicit (judgment-based) criteria. Unfortunately, most current therapeutic guidelines are applicable only to healthy older adults and cannot be generalized to frail patients. These discrepancies should be addressed either by developing new criteria or by refining the existing tools for frail older people. The first and foremost step is to identify the frail patient in clinical practice by applying clinically validated tools. Once the frail patient has been identified, there is a need for specific measures or criteria to assess the appropriateness of therapy that consider such factors as quality of life, functional status and remaining life expectancy, and thus modified goals of care.
Abstract:
Objective: To assess the effect of graded increases in exercise-induced energy expenditure (EE) on appetite, energy intake (EI), total daily EE and body weight in men living in their normal environment and consuming their usual diets. Design: Within-subject, repeated measures design. Six men (mean (s.d.) age 31.0 (5.0) y; weight 75.1 (15.96) kg; height 1.79 (0.10) m; body mass index (BMI) 23.3 (2.4) kg/m2) were each studied three times during a 9 day protocol, corresponding to prescriptions of no exercise (control) (Nex; 0 MJ/day), a medium exercise level (Mex; ~1.6 MJ/day) and a high exercise level (Hex; ~3.2 MJ/day). On days 1-2 subjects were given a medium fat (MF) maintenance diet (1.6 × resting metabolic rate (RMR)). Measurements: On days 3-9 subjects self-recorded dietary intake using a food diary and self-weighed intake. EE was assessed by continual heart rate (HR) monitoring, using the modified FLEX method. Subjects' HR was individually calibrated against submaximal VO2 during incremental exercise tests at the beginning and end of each 9 day study period. Respiratory exchange was measured by indirect calorimetry. Subjects completed hourly hunger ratings during waking hours to record subjective sensations of hunger and appetite. Body weight was measured daily. Results: EE amounted to 11.7, 12.9 and 16.8 MJ/day (F(2,10)=48.26; P<0.001 (s.e.d.=0.55)) on the Nex, Mex and Hex treatments, respectively. The corresponding values for EI were 11.6, 11.8 and 11.8 MJ/day (F(2,10)=0.10; P=0.910 (s.e.d.=0.10)), respectively. There were no treatment effects on hunger, appetite or body weight, but there was evidence of weight loss on the Hex treatment. Conclusion: Increasing EE did not lead to compensation of EI over 7 days. However, total daily EE tended to decrease over time on the two exercise treatments. Lean men appear able to tolerate a considerable negative energy balance, induced by exercise, over 7 days without invoking compensatory increases in EI.
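To clarify how the modified FLEX heart-rate method converts heart rate into energy expenditure, here is a minimal sketch: below an individually determined FLEX heart rate, EE is taken as resting EE, and above it EE is predicted from the subject's calibrated HR-EE regression. All calibration constants below are hypothetical.

```python
# Sketch of the FLEX heart-rate method for estimating energy expenditure.
# The FLEX point, resting EE and regression constants are hypothetical
# stand-ins for values obtained from each subject's individual calibration.
def flex_ee(heart_rate: float,
            flex_hr: float = 95.0,     # FLEX point, beats/min (assumed)
            resting_ee: float = 5.0,   # resting EE, kJ/min (assumed)
            slope: float = 0.45,       # kJ/min per beat/min, from calibration
            intercept: float = -35.0) -> float:
    """Estimate minute-by-minute EE (kJ/min) from heart rate."""
    if heart_rate <= flex_hr:
        return resting_ee          # below FLEX: assume resting expenditure
    return slope * heart_rate + intercept  # above FLEX: use HR-EE regression

# Summing flex_ee over each monitored minute of the day yields total daily
# EE, analogous to the 11.7-16.8 MJ/day figures reported above.
total_kj = sum(flex_ee(hr) for hr in [70, 88, 110, 130, 95])  # sample minutes
print(f"EE over 5 sample minutes = {total_kj:.1f} kJ")
```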
Abstract:
This exhibition and catalogue provide a visual record of student work exhibited at the Australian Institute of Architects offices in Brisbane from November 15 to 29, 2010. The exhibition features the final design outcomes of the inaugural Bushfire Sustainability unit conducted at QUT in semester two, 2010. The core objective of this unit was to develop our students’ skills in collaborative practice in design, research and presentation. The theme of ‘bushfire sustainability’ was chosen because living sustainably in bushfire-prone landscapes presents a number of problems, the nature of which might only be resolved via multidisciplinary collaboration among the design disciplines. The students involved represent the disciplines of Interior Design, Landscape Architecture, Industrial Design, Architecture and Sustainability, all from within the School of Design at QUT. Fifty-five students, mostly in their third year of study, worked in teams of five (one from each discipline) to design one of a number of homes on highly bushfire-prone sites in either Western Australia or SE Queensland. This year level and interdisciplinary mix are perhaps best placed to resolve these problems: being unrestrained by the burdens of professional practice and technical overload, they retain the potential for innovative, lateral thinking across the range of spatial scales and philosophical perspectives associated with inhabitation of bushfire-prone landscapes. It is envisaged that, through the ‘vehicle’ of this design research, the students’ work will contribute to understandings of how creative design disciplines might respond to this significant national problem, which has hitherto been attended to primarily by engineering and the sciences.