300 results for Daily activities
Abstract:
Individuals with limb amputation fitted with conventional socket-suspended prostheses often experience socket-related discomfort leading to a significant decrease in quality of life. Bone-anchored prostheses are increasingly acknowledged as a viable alternative method of attaching an artificial limb. In this case, the prosthesis is attached directly to the residual skeleton through a percutaneous fixation. To date, a few osseointegration fixations are commercially available. Several devices are at different stages of development, particularly in Europe and the US. [1-15] Clearly, the number of surgical procedures is growing rapidly worldwide. Indeed, Australia, and Queensland in particular, has one of the fastest growing populations of recipients. Previous studies involving either screw-type implants or press-fit fixations for bone anchorage have focused on biomechanical aspects as well as the clinical benefits and safety of the procedure. In principle, bone-anchored prostheses should eliminate the lifetime expenses associated with sockets and, consequently, potentially alleviate the financial burden of amputation for governmental organizations. Unfortunately, publications focusing on cost-effectiveness are sparse. In fact, only one study, published by Haggstrom et al (2012), reported that "despite significantly fewer visits for prosthetic service the annual mean costs for osseointegrated prostheses were comparable with socket-suspended prostheses". Consequently, governmental organizations such as Queensland Artificial Limb Services (QALS) face a number of challenges in adjusting financial assistance schemes so that they remain fair and equitable for clients fitted with bone-anchored prostheses. Clearly, more scientific evidence extracted from governmental databases is needed to further consolidate analyses of the financial burden associated with both methods of attachment (i.e., conventional socket prostheses, bone-anchored prostheses). The purpose of the presentation is to share the current outcomes of a cost-analysis study led by QALS. The specific objectives are: • To outline methodological avenues to assess the cost-effectiveness of bone-anchored prostheses compared to conventional socket prostheses, • To highlight the potential obstacles and limitations in cost-effectiveness analyses of bone-anchored prostheses, • To present cohort results of a cost-effectiveness analysis (QALY vs cost), including the determination of fair Incremental Cost-Effectiveness Ratios (ICERs), as well as a cost-benefit analysis comparing costs and key outcome indicators (e.g., QTFA, TUG, 6MWT, activities of daily living) over QALS funding cycles for both methods of attachment.
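A note on the ICER metric named in the objectives: it is conventionally the difference in mean cost divided by the difference in mean effectiveness (here, QALYs) between the two methods of attachment. A minimal Python sketch with purely hypothetical figures (the function name and all numbers are illustrative, not QALS data):

    def icer(cost_new, cost_old, qaly_new, qaly_old):
        # Incremental cost-effectiveness ratio: extra cost per QALY gained.
        return (cost_new - cost_old) / (qaly_new - qaly_old)

    # Hypothetical mean per-client cost (AUD) and QALYs over one funding cycle.
    ratio = icer(cost_new=25000, cost_old=18000, qaly_new=0.85, qaly_old=0.70)
    print(f"ICER: {ratio:.0f} AUD per QALY gained")   # -> ICER: 46667 AUD per QALY gained

A funding body would then compare such a ratio against a willingness-to-pay threshold to judge whether the extra cost per QALY gained is acceptable.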
Abstract:
Objective People with chronic liver disease, particularly those with decompensated cirrhosis, experience several potentially debilitating complications that can have a significant impact on activities of daily living and quality of life. These impairments, combined with the associated complex treatment, mean that they face specific and high levels of supportive care needs. We aimed to review reported perspectives, experiences and concerns of people with chronic liver disease worldwide. This information is necessary to guide development of policies around supportive needs screening tools and to enable prioritisation of support services for these patients. Design Systematic searches of PubMed, MEDLINE, CINAHL and PsycINFO from the earliest records until 19 September 2014. Data were extracted using standardised forms. A qualitative, descriptive approach was utilised to analyse and synthesise data. Results The initial search yielded 2598 reports: 26 studies reporting supportive care needs among patients with chronic liver disease were included, but few captured needs reported directly by patients, none used a validated liver disease-specific supportive care needs assessment instrument, and only three included patients with cirrhosis. Five key domains of supportive care needs were identified: informational or educational (eg, educational material, educational sessions), practical (eg, daily living), physical (eg, controlling pruritus and fatigue), patient care and support (eg, support groups), and psychological (eg, anxiety, sadness). Conclusions While several key domains of supportive care needs were identified, most studies included hepatitis patients. There is a paucity of literature describing the supportive care needs of the chronic liver disease population likely to have the most needs, namely those with cirrhosis. Assessing the supportive care needs of people with chronic liver disease has potential utility in clinical practice for facilitating timely referrals to support services.
Abstract:
The multifractal properties of daily rainfall time series at stations in the Pearl River basin of China, over periods of up to 45 years, are examined using the universal multifractal approach based on the multiplicative cascade model and multifractal detrended fluctuation analysis (MF-DFA). The results from these two kinds of multifractal analyses show that the daily rainfall time series in this basin have multifractal behavior in two different time scale ranges. It is found that the empirical multifractal moment function K(q) of the daily rainfall time series can be fitted very well by the universal multifractal model (UMM). The estimated values of the conservation parameter H from the UMM for these daily rainfall data are close to zero, indicating that they correspond to conserved fields. After removing the seasonal trend in the rainfall data, the estimated values of the exponent h(2) from MF-DFA indicate that the daily rainfall time series in the Pearl River basin exhibit no long-term correlations. It is also found that K(2) and the elevation series are negatively correlated, which suggests a relationship between topography and rainfall variability.
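For readers unfamiliar with MF-DFA, the standard procedure cumulates the demeaned series into a profile, detrends it piecewise over windows of varying scale s, and reads the generalized Hurst exponent h(q) off a log-log fit of the fluctuation function; h(2) near 0.5 indicates no long-term correlations. A minimal numpy sketch of this textbook algorithm (not the authors' code), run here on a surrogate series rather than the rainfall data:

    import numpy as np

    def mfdfa(x, scales, q=2.0, order=1):
        # Fluctuation function F_q(s); h(q) is the slope of log F_q vs log s.
        x = np.asarray(x, dtype=float)
        y = np.cumsum(x - x.mean())                      # step 1: profile
        F = []
        for s in scales:
            n = len(y) // s
            # step 2: windows from both ends so no tail data are discarded
            segs = np.concatenate([y[:n * s].reshape(n, s),
                                   y[len(y) - n * s:].reshape(n, s)])
            t = np.arange(s)
            var = np.empty(len(segs))
            for v, seg in enumerate(segs):
                coef = np.polyfit(t, seg, order)         # step 3: local trend
                var[v] = np.mean((seg - np.polyval(coef, t)) ** 2)
            if q == 0:
                F.append(np.exp(0.5 * np.mean(np.log(var))))
            else:
                F.append(np.mean(var ** (q / 2.0)) ** (1.0 / q))
        return np.array(F)

    rng = np.random.default_rng(0)
    rain = rng.exponential(size=4096)                    # surrogate 'rainfall'
    scales = np.unique(np.logspace(1, 3, 12).astype(int))
    h2 = np.polyfit(np.log(scales), np.log(mfdfa(rain, scales, q=2)), 1)[0]
    print(f"h(2) = {h2:.2f}")                            # ~0.5 for uncorrelated noise

For an uncorrelated series such as this surrogate, h(2) should come out near 0.5, which is the benchmark against which the abstract's "no long-term correlations" conclusion is read.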
Abstract:
Introduction and Aims Wastewater analysis provides a non-intrusive way of measuring drug use within a population. We used this approach to determine daily use of conventional illicit drugs [cannabis, cocaine, methamphetamine and 3,4-methylenedioxymethamphetamine (MDMA)] and emerging illicit psychostimulants (benzylpiperazine, mephedrone and methylone) in two consecutive years (2010 and 2011) at an annual music festival. Design and Methods Daily composite wastewater samples, representative of the festival, were collected from the on-site wastewater treatment plant and analysed for drug metabolites. Data over the 2 years were compared using the Wilcoxon matched-pairs test. Data from the 2010 festival were compared with data collected at the same time from a nearby urban community using equivalent methods. Results Conventional illicit drugs were detected in all samples, whereas emerging illicit psychostimulants were found only on specific days. The estimated per capita consumption of MDMA, cocaine and cannabis was similar between the two festival years. Statistically significant (P < 0.05; Z = −2.0–2.2) decreases were observed in use of methamphetamine and one emerging illicit psychostimulant (benzylpiperazine). Only consumption of MDMA was elevated at the festival compared with the nearby urban community. Discussion and Conclusions Rates of substance use at this festival remained relatively consistent over the two monitoring years. Compared with the urban community, drug use among festival goers was only elevated for MDMA, confirming its popularity in music settings. Our study demonstrated that wastewater analysis can objectively capture changes in substance use at a music setting without raising major ethical issues. It would potentially allow effective assessments of drug prevention strategies in such settings in the future.
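The year-on-year comparison described above pairs the daily estimates across festivals and applies a Wilcoxon test. A minimal sketch with scipy, using made-up numbers rather than the study's data:

    import numpy as np
    from scipy.stats import wilcoxon

    # Hypothetical paired per-capita estimates (mg/day/1000 people) for one
    # drug, matched by festival day across 2010 and 2011. Not study data.
    use_2010 = np.array([120.0, 150.0, 135.0, 160.0, 140.0, 155.0])
    use_2011 = np.array([ 92.0, 109.0, 101.0, 118.0, 103.0, 116.0])

    stat, p = wilcoxon(use_2010, use_2011)   # paired, non-parametric
    print(f"W = {stat}, p = {p:.3f}")        # here p ≈ 0.031 < 0.05: a significant decrease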
Abstract:
The measurement of illicit drug metabolites in raw wastewater is increasingly being adopted as an approach to objectively monitor population-level drug use, and is an effective complement to traditional epidemiological methods. As such, it has been widely applied in western countries. In this study, we utilised this approach to assess drug use patterns over nine days during April 2011 in Hong Kong. Raw wastewater samples were collected from the largest wastewater treatment plant, serving a community of approximately 3.5 million people, and analysed for excreted drug residues including cocaine, ketamine, methamphetamine, 3,4-methylenedioxymethamphetamine (MDMA) and key metabolites using liquid chromatography coupled with tandem mass spectrometry. The overall drug use pattern determined by wastewater analysis was consistent with that seen amongst people coming into contact with services in relation to substance use; among our target drugs, ketamine (estimated consumption: 1400–1600 mg/day/1000 people) was the predominant drug, followed by methamphetamine (180–200 mg/day/1000 people), cocaine (160–180 mg/day/1000 people) and MDMA (not detected). The levels of these drugs were relatively steady throughout the monitoring period. Analysing samples at higher temporal resolution provided data on diurnal variations of drug residue loads. Elevated ratios of cocaine to benzoylecgonine were identified unexpectedly in three samples during the evening and night, providing evidence for potential dumping events of cocaine. This study provides the first application of wastewater analysis to quantitatively evaluate daily drug use in an Asian metropolitan community. Our data reinforce the benefit of wastewater monitoring to health and law enforcement authorities for strategic planning and evaluation of drug intervention strategies.
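Per-capita consumption figures of this kind are typically back-calculated from the measured metabolite concentration, the plant's daily flow, a correction factor (folding together the excretion fraction and the parent/metabolite molar-mass ratio) and the served population. A simplified sketch; all input values are hypothetical, and the 2.33 factor for cocaine via benzoylecgonine is one commonly cited in the wastewater literature, not necessarily the one used in this study:

    def consumption_per_1000(conc_ng_per_l, flow_l_per_day, population,
                             correction_factor):
        # Back-calculate drug use (mg/day/1000 people) from wastewater data.
        load_mg_per_day = conc_ng_per_l * flow_l_per_day / 1e6   # ng -> mg
        return load_mg_per_day * correction_factor / population * 1000

    # Hypothetical inputs for illustration only.
    est = consumption_per_1000(conc_ng_per_l=150,       # metabolite concentration
                               flow_l_per_day=1.5e9,    # plant daily flow
                               population=3.5e6,        # community served
                               correction_factor=2.33)  # cocaine via benzoylecgonine
    print(f"~{est:.0f} mg/day/1000 people")             # -> ~150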
Abstract:
A large population-based survey of persons with multiple sclerosis (MS) and their caregivers was conducted in Ontario using self-completed mailed questionnaires. The objectives included describing assistance arrangements, needs, and use of and satisfaction with services, and comparing the perceptions of persons with MS and their caregivers. Response rates were 83% and 72% for those with MS and caregivers, respectively. Of the 697 respondents with MS, the mean age is 48 years, 70% are female, and 75% are married. While 24% experience no mobility restrictions, the majority require some type of aid or a wheelchair for getting around. Among the 345 caregivers, who have been providing care for 9 years on average, the majority are spouses. Caregivers report providing more frequent care than persons with MS report receiving, particularly for the following activities of daily living: eating, meal preparation, and help with personal finances. Caregivers also report assistance of longer duration per day than do care recipients with MS. Frequency and duration of assistance are positively associated with increased MS symptom severity and reduced mobility. Generally, there is no rural-urban disparity in service provision, utilization or satisfaction, and although there is a wide range of service utilization, satisfaction is consistently high. Respite care is rarely used by caregivers. Use of several services is positively associated with increased severity of MS symptoms and reduced mobility. Assistance arrangements and use of services, each from the point of view of persons with MS and their caregivers, must be taken into account in efforts to prolong home care and to postpone early institutionalization of persons with MS.
Abstract:
REVIEW QUESTION/OBJECTIVE The quantitative objectives are to identify the impact of curative colorectal cancer treatment (surgery or adjuvant therapy) on physical activity, functional status and quality of life within one year of treatment or diagnosis. INCLUSION CRITERIA Types of participants: This review will consider studies that include individuals aged 18 years and over who have been diagnosed with colorectal cancer. Types of intervention(s)/phenomena of interest: This review will consider studies that evaluate the impact of curative colorectal cancer treatment: surgery and/or adjuvant therapy. Types of outcomes: This review will consider studies that include the following outcome measures assessed within one year of diagnosis or treatment: Physical activity – any bodily movement produced by skeletal muscles resulting in energy expenditure. Physical activity is not exclusive to exercise; activities can also be walking, housework, occupational or leisure activities. Physical activity can be measured objectively using pedometers or accelerometers, or subjectively using self-reported measures. Functional status – measured as the capacity to perform all activities of daily living such as walking, showering, and eating; and instrumental activities of daily living such as (but not limited to) grocery shopping, housekeeping and laundry. Quality of life – defined as the individual's perception of mental, physical and psychosocial wellbeing, as measured by validated tools such as the SF-36, EORTC-QLQ-C30, or FACT-C.
Abstract:
Social work students consistently identify their field placement as having the most impact on their learning. Despite this, research on learning activities used during placement and their impact on practice competency and social work identity is limited. This is the second paper from a research study exploring student experiences of learning on placement. Data were gathered from 263 social work students about 14 key learning activities they experienced during placement. The more regularly students engaged in learning activities with their social work supervisor, the more likely they were to report a sense of social work identity and feelings of practice competence. However, the regular use of learning activities varied widely between placements. Surprisingly, approximately half the students did not regularly have the opportunity to observe social work practice, to have their own practice observed, or to link social work theory and the Code of Ethics to their practice with their social work supervisor.
Abstract:
The aims of this study were to investigate outcome and to evaluate areas of potential ongoing concern after orthotopic liver transplantation (OLT) in children. Actuarial survival in relation to age and degree of undernutrition at the time of OLT was evaluated in 53 children (age 0.58-14.2 years) undergoing OLT for end-stage liver disease. Follow-up studies of growth and quality of life were undertaken in those with a minimum follow-up period of 12 months (n = 26). The overall 3 year actuarial survival was 70%. Survival rates did not differ between age groups (actuarial 2 year survival for ages <1, 1-5 and >5 years was 70, 70 and 69%, respectively) but did differ according to nutritional status at OLT (actuarial 2 year survival for children with Z scores for weight <-1 was 57%, >-1 was 95%; P = 0.004). Significant catch-up weight gain was observed by 18 months post-transplant, while height improved less rapidly. Quality of life (assessed by the Vineland Adaptive Behaviour Scales incorporating socialization, daily living skills, communication and motor skills) was good (mean composite score 91 ± 19). All school-aged children except one were attending normal school. Two children had mild to moderate intellectual handicap related to post-operative intracerebral complications. Satisfactory long-term survival can be achieved after OLT in children regardless of age, but the importance of pre-operative nutrition is emphasized. Survivors have an excellent chance of a good quality of life and of satisfactory catch-up weight gain and growth.
Abstract:
Background Previous studies (mostly questionnaire-based in children) suggest that outdoor activity is protective against myopia. There are few studies of young adults investigating the separate impacts of simply being outdoors and of performing physical activity. The aim was to study the relationship between the refractive error of young adults and their physical activity patterns. Methods Twenty-seven university students, aged 18 to 25 years, wore a pedometer (Omron HJ720ITE) for seven days both during the semester and holiday periods. They simultaneously recorded the type of activity performed, its duration, the number of steps taken (from the pedometer) and their location (indoors/outdoors) in a logbook. Mean spherical refractive error was used to divide participants into three groups (emmetropes: +1.00 to -0.50 D, low myopes: -0.62 to -3.00 D, higher myopes: -3.12 D or greater myopia). Results There were no significant differences between the refractive groups during the semester or holiday periods; the average daily times spent outdoors, the duration of physical activity, the ratio of physical activity performed outdoors to indoors and the amount of near work performed were similar. The peak exercise intensity was similar across all groups: approximately 100 steps per minute, a brisk walk. Up to one-third of all physical activity was performed outdoors. There were some significant differences in activities performed during semester and holiday times. For example, low myopes spent significantly less time outside (49 ± 47 versus 74 ± 41 minutes, p = 0.005) and performed less physical activity (6,388 ± 1,747 versus 6,779 ± 2,746 steps per day; p = 0.03) during the holidays compared to during semester. Conclusions The fact that all groups had similar low exercise intensity but many were not myopic suggests that physical activity levels are not critical. There were differences in the activity patterns of low myopes during semester and holiday periods. This study highlights the need for a larger longitudinal study with particular emphasis on how discretionary time is spent.
Abstract:
Purpose This study evaluated the impact of daily and weekly image-guided radiotherapy protocols in reducing setup errors and in setting appropriate margins in head and neck cancer patients. Materials and methods Interfraction and systematic shifts for the hypothetical day 1–3 plus weekly imaging protocol were extrapolated from daily imaging data from 31 patients (964 cone beam computed tomography (CBCT) scans). In addition, residual setup errors were calculated by taking the average shifts in each direction for each patient based on the first three shifts, and were presumed to represent systematic setup error. The clinical target volume (CTV) to planning target volume (PTV) margins were calculated using the van Herk formula and analysed for each protocol. Results The mean interfraction shifts for daily imaging were 0·8, 0·3 and 0·5 mm in the S-I (superior-inferior), L-R (left-right) and A-P (anterior-posterior) directions, respectively. In contrast, the mean shifts for day 1–3 plus weekly imaging were 0·9, 1·8 and 0·5 mm in the S-I, L-R and A-P directions, respectively. The mean day 1–3 residual shifts were 1·5, 2·1 and 0·7 mm in the S-I, L-R and A-P directions, respectively. No significant difference was found in the mean setup error between the daily and the hypothetical day 1–3 plus weekly protocol. However, the calculated CTV to PTV margins for the daily interfraction imaging data were 1·6, 3·8 and 1·4 mm in the S-I, L-R and A-P directions, respectively. The hypothetical day 1–3 plus weekly protocol resulted in CTV–PTV margins of 5, 4·2 and 5 mm in the S-I, L-R and A-P directions, respectively. Conclusions The results of this study show that a daily CBCT protocol reduces setup errors and allows setup margin reduction in head and neck radiotherapy compared to a weekly imaging protocol.
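The van Herk formula referred to above combines the standard deviations of the systematic (Σ) and random (σ) components of setup error into a CTV-to-PTV margin of 2.5Σ + 0.7σ. A minimal sketch with illustrative error components (not the study's measured values):

    def van_herk_margin(sigma_sys_mm, sigma_rand_mm):
        # CTV-to-PTV margin (mm): 2.5 * systematic SD + 0.7 * random SD.
        return 2.5 * sigma_sys_mm + 0.7 * sigma_rand_mm

    # Hypothetical per-direction error components (mm).
    for axis, sys_sd, rand_sd in [("S-I", 0.5, 0.5),
                                  ("L-R", 1.2, 1.1),
                                  ("A-P", 0.4, 0.6)]:
        print(axis, f"{van_herk_margin(sys_sd, rand_sd):.1f} mm")

This makes the study's conclusion concrete: the smaller the residual systematic component Σ (as with daily imaging), the more the 2.5Σ term, and hence the margin, shrinks.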
Abstract:
Improving the rehabilitation program of individuals with transfemoral amputation fitted with a bone-anchored prosthesis, based on direct measurements of the load applied on the residuum, first of all requires an understanding of the load applied on the fixation. Therefore, the load applied on the residuum was first measured directly during standardized activities of daily living, such as straight-line level walking, ascending and descending stairs and a ramp, and walking around a circle. The load was then also measured during different phases of the rehabilitation program, such as walking with walking aids and load bearing exercises. [1-15] The rehabilitation program for individuals with a transfemoral amputation fitted with an OPRA implant relies on a combination of dynamic and static load bearing exercises (LBE). [16-20] This presentation will focus on the study of a set of experimental static LBE. [1] A group of eleven individuals with unilateral transfemoral amputation fitted with an OPRA implant participated in this study. The load on the implant during the static LBE was measured using a portable system including a commercial transducer embedded in a short pylon, a laptop and a customized software package. This apparatus was previously shown to be effective in a proof-of-concept study published by Prof. Frossard. [1-9] The analysis of the static LBE covered both loading reliability and loading compliance. The analysis of loading reliability showed high reliability between the loading sessions, indicating a correct repetition of the LBE by the participants. [1, 5] The analysis of loading compliance showed a significant lack of axial compliance, leading to a systematic underloading of the long axis of the implant during the proposed experimental static LBE.
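Loading compliance in this context is commonly expressed as the measured force along the implant's long axis relative to the prescribed target load; the definition below is an assumption for illustration, and all readings and the 800 N target are hypothetical:

    import numpy as np

    def axial_compliance_pct(measured_n, prescribed_n):
        # Measured long-axis load as a percentage of the prescribed target.
        return 100.0 * np.asarray(measured_n, dtype=float) / prescribed_n

    # Hypothetical transducer readings (N) against an assumed 800 N target.
    readings = [610, 655, 640, 700, 620]
    comp = axial_compliance_pct(readings, prescribed_n=800)
    print(f"mean axial compliance: {comp.mean():.0f}%")  # consistently < 100% -> systematic underloading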
Abstract:
Objectives To investigate medication changes for older patients admitted to hospital and to explore associations between patient characteristics and polypharmacy. Design Prospective cohort study. Participants and setting Patients aged 70 years or older admitted to general medical units of 11 acute care hospitals in two Australian states between July 2005 and May 2010. All patients were assessed using the interRAI assessment system for acute care. Main outcome measures Measures of physical, cognitive and psychosocial functioning; and number of regular prescribed medications categorised into three groups: non-polypharmacy (0–4 drugs), polypharmacy (5–9 drugs) and hyperpolypharmacy (≥ 10 drugs). Results Of 1220 patients who were recruited for the study, medication records at admission were available for 1216. Mean age was 81.3 years (SD, 6.8 years), and 659 patients (54.2%) were women. For the 1187 patients with complete medication records on admission and discharge, there was a small but statistically significant increase in the mean number of regular medications per day between admission and discharge (7.1 v 7.6), while the prevalence of medications such as statins (459 [38.7%] v 457 [38.5%] patients), opioid analgesics (155 [13.1%] v 166 [14.0%] patients), antipsychotics (59 [5.0%] v 65 [5.5%] patients) and benzodiazepines (122 [10.3%] v 135 [11.4%] patients) did not change significantly. Being in a higher polypharmacy category was significantly associated with an increased number of comorbidities (odds ratio [OR], 1.27; 95% CI, 1.20–1.34), presence of pain (OR, 1.31; 1.05–1.64), dyspnoea (OR, 1.64; 1.30–2.07) and dependence in instrumental activities of daily living (OR, 1.70; 1.20–2.41). Hyperpolypharmacy was observed in 290/1216 patients (23.8%) at admission and 336/1187 patients (28.3%) on discharge, and the proportion of preventive medication in the hyperpolypharmacy category at both points in time remained high (1209/3371 [35.9%] at admission v 1508/4117 [36.6%] at discharge). Conclusions Polypharmacy is common among older people admitted to general medical units of Australian hospitals, with no clinically meaningful change to the number or classification (symptom control, prevention or both) of drugs made by treating physicians.
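The three-level classification used in this study maps to a simple threshold rule on the count of regular medications; a minimal sketch using the cut-offs given above:

    def polypharmacy_category(n_regular_meds: int) -> str:
        # Thresholds as defined in the study: 0-4, 5-9, and >= 10 drugs.
        if n_regular_meds >= 10:
            return "hyperpolypharmacy"
        if n_regular_meds >= 5:
            return "polypharmacy"
        return "non-polypharmacy"

    print(polypharmacy_category(7))   # -> polypharmacy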
Abstract:
Older populations are more likely to have multiple co-morbid diseases that require multiple treatments, which makes them large consumers of medications. As a person grows older, their ability to tolerate medications declines due to age-related changes in pharmacokinetics and pharmacodynamics, often along a path that leads to frailty. Frail older persons often have multiple co-morbidities with signs of impairment in activities of daily living. Prescribing drugs for these vulnerable individuals is difficult and potentially unsafe. Inappropriate prescribing in the older population can be detected using explicit (criterion-based) or implicit (judgment-based) criteria. Unfortunately, most current therapeutic guidelines are applicable only to healthy older adults and cannot be generalized to frail patients. These discrepancies should be addressed either by developing new criteria or by refining the existing tools for frail older people. The first and foremost step is to identify the frail patient in clinical practice by applying clinically validated tools. Once the frail patient has been identified, there is a need for specific measures or criteria to assess the appropriateness of therapy that consider factors such as quality of life, functional status and remaining life expectancy, and thus modified goals of care.
Abstract:
Creating better gameplay experiences depends upon understanding the act of gameplay. An expert focus group of games researchers, designers and players refined 16 activity categories from an existing list of 30 commonly used videogame challenges. Identifying categories of play activities has the potential to facilitate better research design and game design in the future.