300 results for Daily living
Abstract:
Objective: During hospitalisation, older people often experience functional decline that affects their future independence. The objective of this study was to evaluate the effect of a multifaceted transitional care intervention, including home-based exercise strategies, on the functional status, independence in activities of daily living, and walking ability of at-risk older people. Methods: A randomised controlled trial was undertaken in a metropolitan hospital in Australia with 128 patients (64 intervention, 64 control) aged over 65 years with an acute medical admission and at least one risk factor for hospital readmission. The intervention group received an individually tailored exercise and follow-up care program that commenced in hospital and included regular in-hospital visits by a physiotherapist and a Registered Nurse, a home visit following discharge, and regular telephone follow-up for 24 weeks following discharge. The program was designed to improve health-promoting behaviours, strength, stability, endurance and mobility. Data were collected at baseline and at 4, 12 and 24 weeks following discharge using the Index of Activities of Daily Living (ADL), the Instrumental Index of Activities of Daily Living (IADL), and the Walking Impairment Questionnaire (Modified) (WIQ). Results: Significant improvements were found in the intervention group in IADL scores (p<.001), ADL scores (p<.001), and WIQ scale scores (p<.001) in comparison with the control group. The greatest improvements were found in the first four weeks following discharge. Conclusions: Early introduction of a transitional model of care incorporating a tailored exercise program and regular telephone follow-up for hospitalised at-risk older adults can improve independence and functional ability.
Abstract:
The purpose of this preliminary study was to determine the relevance of categorizing load regime data to assess the functional output and usage of the prosthesis of lower limb amputees. The objectives were (a) to introduce a categorization of load regime, (b) to present some descriptors of each activity, and (c) to report the results for a case. The load applied on the osseointegrated fixation of one transfemoral amputee was recorded using a portable kinetic system for 5 hours. The periods of directional locomotion, localized locomotion, and stationary loading occurred during 44%, 34%, and 22% of the recording time and accounted for 51%, 38%, and 12% of the duration of the periods of activity, respectively. The absolute maximum force during directional locomotion, localized locomotion, and stationary loading was 19%, 15%, and 8% of body weight on the anteroposterior axis, 20%, 19%, and 12% on the mediolateral axis, and 121%, 106%, and 99% on the long axis, respectively. A total of 2,783 gait cycles were recorded. Approximately 10% more gait cycles and 50% more of the total impulse were identified than with conventional analyses. The proposed categorization and apparatus have the potential to complement conventional instruments, particularly for difficult cases.
Abstract:
Background: Decreased ability to perform Activities of Daily Living (ADLs) during hospitalisation has negative consequences for patients and health service delivery. Objective: To develop an Index to stratify patients at lower and higher risk of a significant decline in ability to perform ADLs at discharge. Design: Prospective two cohort study comprising a derivation cohort (n=389; mean age 82.3 years; SD = 7.1) and a validation cohort (n=153; mean age 81.5 years; SD = 6.1). Patients and setting: General medical patients aged ≥ 70 years admitted to three university-affiliated acute care hospitals in Brisbane, Australia. Measurement and main results: The short ADL Scale was used to identify a significant decline in ability to perform ADLs from premorbid to discharge. In the derivation cohort, 77 patients (19.8%) experienced a significant decline. Four significant factors were identified for patients independent at baseline: 'requiring moderate assistance to being totally dependent on others with bathing'; 'difficulty understanding others (frequently or all the time)'; 'requiring moderate assistance to being totally dependent on others with performing housework'; a 'history of experiencing at least one fall in the 90 days prior to hospital admission'; in addition to 'independent at baseline', which was protective against decline at discharge. 'Difficulty understanding others (frequently or all the time)' and 'requiring moderate assistance to being totally dependent on others with performing housework' were also predictors for patients dependent in ADLs at baseline. Sensitivity, specificity, Positive Predictive Value (PPV), and Negative Predictive Value (NPV) of the DADLD dichotomised risk scores were: 83.1% (95% CI 72.8; 90.7); 60.5% (95% CI 54.8; 65.9); 34.2% (95% CI 27.5; 41.5); 93.5% (95% CI 89.2; 96.5). In the validation cohort, 47 patients (30.7%) experienced a significant decline.
Sensitivity, specificity, PPV and NPV of the DADLD were: 78.7% (95% CI 64.3; 89.3); 69.8% (95% CI 60.1; 78.3); 53.6% (95% CI 41.2; 65.7); 88.1% (95% CI 79.2; 94.1). Conclusions: The DADLD Index is a useful tool for identifying patients at higher risk of decline in ability to perform ADLs at discharge.
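The screening metrics reported above all follow from a single 2×2 contingency table of predicted risk versus observed decline. As a rough illustration (not the study's code), the sketch below computes sensitivity, specificity, PPV and NPV; the example counts are approximate reconstructions from the derivation-cohort proportions and are illustrative only.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Screening-test metrics from a 2x2 table of flagged vs observed decline."""
    return {
        "sensitivity": tp / (tp + fn),  # decliners correctly flagged as high risk
        "specificity": tn / (tn + fp),  # non-decliners correctly flagged as low risk
        "ppv": tp / (tp + fp),          # flagged patients who actually declined
        "npv": tn / (tn + fn),          # unflagged patients who did not decline
    }

# Counts roughly consistent with the derivation cohort (n=389, 77 decliners);
# reconstructed approximations for illustration, not figures taken from the study.
m = diagnostic_metrics(tp=64, fp=123, fn=13, tn=189)
```

Note how a low prevalence of decline (about 20% in the derivation cohort) pulls the PPV down even when sensitivity is high, which is why the Index is better at ruling decline out (high NPV) than ruling it in.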
Abstract:
The monitoring of the actual activities of daily living of individuals with lower limb amputation is essential for an evidence-based fitting of the prosthesis, more particularly the choice of components (e.g., knees, ankles, feet)[1-4]. The purpose of this presentation was to give an overview of the categorization of load regime data to assess the functional output and usage of the prosthesis of lower limb amputees, as presented in several publications[5, 6]. The objectives were to present a categorization of load regime and to report the results for a case.
Abstract:
Background The purpose of this presentation is to outline the relevance of the categorization of load regime data to assess the functional output and usage of the prosthesis of lower limb amputees. The objectives are (a) to highlight the need for categorisation of activities of daily living, (b) to present a categorization of the load regime applied on the residuum, (c) to present some descriptors of the four types of activity that could be detected, and (d) to provide an example of the results for a case. Methods The load applied on the osseointegrated fixation of one transfemoral amputee was recorded using a portable kinetic system for 5 hours. The load applied on the residuum was divided into four types of activity corresponding to inactivity, stationary loading, localized locomotion and directional locomotion, as detailed in previous publications. Results The periods of directional locomotion, localized locomotion, and stationary loading occurred during 44%, 34%, and 22% of the recording time and accounted for 51%, 38%, and 12% of the duration of the periods of activity, respectively. The absolute maximum force during directional locomotion, localized locomotion, and stationary loading was 19%, 15%, and 8% of body weight on the anteroposterior axis, 20%, 19%, and 12% on the mediolateral axis, and 121%, 106%, and 99% on the long axis, respectively. A total of 2,783 gait cycles were recorded. Discussion Approximately 10% more gait cycles and 50% more of the total impulse were identified than with conventional analyses. The proposed categorization and apparatus have the potential to complement conventional instruments, particularly for difficult cases.
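The division of a long load recording into activity types lends itself to simple window-based rules. The study's actual detection rules are in the cited publications and are not reproduced here; the sketch below is a hypothetical three-way simplification (inactivity, stationary loading, locomotion) based on the mean and variability of the long-axis load per window. Separating localized from directional locomotion would require additional cues, such as step direction, beyond a single force channel. All thresholds and the window length are assumptions.

```python
import numpy as np

def categorize_windows(force_bw, fs=200, win_s=2.0,
                       load_thresh=0.05, cyclic_thresh=0.15):
    """Label fixed-length windows of a long-axis load trace (units of body weight).

    Hypothetical thresholds for illustration only.
    """
    n = int(fs * win_s)
    labels = []
    for i in range(0, len(force_bw) - n + 1, n):
        w = force_bw[i:i + n]
        if w.mean() < load_thresh:
            labels.append("inactivity")          # prosthesis essentially unloaded
        elif w.std() < cyclic_thresh:
            labels.append("stationary loading")  # loaded but steady, e.g. standing
        else:
            labels.append("locomotion")          # large cyclic loading, i.e. gait
    return labels
```

Percentages of recording time per category, like those reported above, then follow from counting the labels.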
Abstract:
Children with developmental co-ordination disorder (DCD) face evident motor difficulties in activities of daily living (ADL). Assessment of their capacity in ADL is essential for diagnosis and intervention, in order to limit the daily consequences of the disorder. The aim of this study was to systematically review potential instruments for standardized and objective assessment of children's capacity in ADL, suited for children with DCD. As a first step, the MEDLINE, EMBASE, CINAHL and PsycINFO databases were searched to identify studies that described instruments with potential for assessment of capacity in ADL. Second, instruments were included for review when two independent reviewers agreed that the instruments: (1) are standardized and objective; (2) assess at the activity level and comprise items that reflect ADL; and (3) are applicable to school-aged children who can move independently. Out of 1507 publications, 66 were selected, describing 39 instruments. Seven of these instruments fulfilled the criteria and were included for review: the Bruininks-Oseretsky Test of Motor Performance-2 (BOT2); the Do-Eat; the Movement Assessment Battery for Children-2 (MABC2); the school-Assessment of Motor and Process Skills (schoolAMPS); the Tufts Assessment of Motor Performance (TAMP); the Test of Gross Motor Development (TGMD); and the Functional Independence Measure for Children (WeeFIM). As a third step, the suitability of the included instruments for children with DCD was discussed based on the ADL comprised, ecological validity and other psychometric properties. We concluded that current instruments do not provide the comprehensive and ecologically valid assessment of capacity in ADL required for children with DCD.
Abstract:
Difficulties in the performance of activities of daily living (ADL) are a key feature of developmental coordination disorder (DCD). The DCDDaily-Q was developed to address children's motor performance in a comprehensive range of ADL. The aim of this study was to investigate the psychometric properties of this parental questionnaire. Parents of 218 five- to eight-year-old children (DCD group: N=25; reference group: N=193) completed the research version of the new DCDDaily-Q, the Movement Assessment Battery for Children-2 (MABC2) Checklist, and the Developmental Coordination Disorder Questionnaire (DCDQ). Children were assessed with the MABC2 and the DCDDaily. Item reduction analyses were performed, and the reliability (internal consistency and factor structure) and the concurrent, discriminant, and incremental validity of the DCDDaily-Q were investigated. The final version of the DCDDaily-Q comprises 23 items that cover three underlying factors and shows good internal consistency (Cronbach's α>.80). Moderate correlations were found between the DCDDaily-Q and the other instruments used (p<.001 for the reference group; p>.05 for the DCD group). Discriminant validity of the DCDDaily-Q was good for total scores (p<.001) and all 23 item scores (p<.01), indicating poorer performance in the DCD group. Sensitivity (88%) and specificity (92%) were good. The DCDDaily-Q predicted DCD better than currently used questionnaires (R2=.88). In conclusion, the DCDDaily-Q is a valid and reliable questionnaire to address children's ADL performance.
Abstract:
Background Children with developmental coordination disorder (DCD) face evident motor difficulties in daily functioning. Little is known, however, about their difficulties in specific activities of daily living (ADL). Objective The purposes of this study were: (1) to investigate differences between children with DCD and their peers with typical development for ADL performance, learning, and participation, and (2) to explore the predictive values of these aspects. Design. This was a cross-sectional study. Methods In both a clinical sample of children diagnosed with DCD (n=25 [21 male, 4 female], age range=5-8 years) and a group of peers with typical development (25 matched controls), the children’s parents completed the DCDDaily-Q. Differences in scores between the groups were investigated using t tests for performance and participation and Pearson chi-square analysis for learning. Multiple regression analyses were performed to explore the predictive values of performance, learning, and participation. Results Compared with their peers, children with DCD showed poor performance of ADL and less frequent participation in some ADL. Children with DCD demonstrated heterogeneous patterns of performance (poor in 10%-80% of the items) and learning (delayed in 0%-100% of the items). In the DCD group, delays in learning of ADL were a predictor for poor performance of ADL, and poor performance of ADL was a predictor for less frequent participation in ADL compared with the control group. Limitations A limited number of children with DCD were addressed in this study. Conclusions This study highlights the impact of DCD on children’s daily lives and the need for tailored intervention.
Abstract:
Objective To develop the DCDDaily, an instrument for objective and standardized clinical assessment of capacity in activities of daily living (ADL) in children with developmental coordination disorder (DCD), and to investigate its usability, reliability, and validity. Subjects Five- to eight-year-old children with and without DCD. Main measures The DCDDaily was developed based on a thorough review of the literature and extensive expert involvement. To investigate the usability (assessment time and feasibility), reliability (internal consistency and repeatability), and validity (concurrent and discriminant validity) of the DCDDaily, children were assessed with the DCDDaily and the Movement Assessment Battery for Children-2 Test, and their parents filled in the Movement Assessment Battery for Children-2 Checklist and the Developmental Coordination Disorder Questionnaire. Results 459 children were assessed (DCD group, n = 55; normative reference group, n = 404). Assessment was possible within 30 minutes and in any clinical setting. For internal consistency, Cronbach's α = 0.83. The intraclass correlation was 0.87 for test–retest reliability and 0.89 for inter-rater reliability. Concurrent correlations with the Movement Assessment Battery for Children-2 Test and the questionnaires were ρ = −0.494, 0.239, and −0.284 (p < 0.001). Discriminant validity measures showed significantly worse performance in the DCD group than in the control group (mean (SD) score 33 (5.6) versus 26 (4.3), p < 0.001). The area under the receiver operating characteristic curve was 0.872, and sensitivity and specificity were both 80%. Conclusions The DCDDaily is a valid and reliable instrument for clinical assessment of capacity in ADL that is feasible for use in clinical practice.
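For readers unfamiliar with the internal-consistency statistic reported in several of these abstracts, Cronbach's α compares the summed per-item variances to the variance of the total score. A minimal sketch (not the study's code) of the standard formula:

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a (respondents x items) score matrix."""
    x = np.asarray(item_scores, dtype=float)
    k = x.shape[1]                          # number of items
    item_var = x.var(axis=0, ddof=1).sum()  # sum of per-item sample variances
    total_var = x.sum(axis=1).var(ddof=1)   # variance of respondents' total scores
    return k / (k - 1) * (1.0 - item_var / total_var)
```

An α above 0.80, as reported for the DCDDaily, is conventionally taken as good internal consistency.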
Abstract:
Background: Caring for family members with dementia can be a long-term, burdensome task resulting in physical and emotional distress and impairment. Research has demonstrated significantly lower levels of self-efficacy among family caregivers of people with dementia (CGs) than among caregivers of relatives with non-dementia diseases. Intervention studies have also suggested that the mental and physical health of dementia CGs could be improved through the enhancement of their self-efficacy. However, studies are limited in terms of the influences of caregiver self-efficacy on caregiver behaviour, subjective burden and health-related quality of life. Of particular note is that there are no studies on the applicability of caregiver self-efficacy in the social context of China. Objective: The purpose of this thesis was to undertake theoretical exploration using Bandura's (1997) self-efficacy theory to 1) revise the Revised Caregiving Self-Efficacy Scale (Steffen, McKibbin, Zeiss, Gallagher-Thompson, & Bandura, 2002) into a Chinese version (C-RCSES), and 2) explore determinants of caregiver self-efficacy and the role of caregiver self-efficacy and other conceptual constructs (including CGs' socio-demographic characteristics, care recipients' (CRs') impairment and CGs' social support) in explaining and predicting caregiver behaviour, subjective burden and health-related quality of life among CGs in China. Methodology: Two studies were undertaken: a qualitative elicitation study with 10 CGs, and a cross-sectional survey with 196 CGs. In the first study, semi-structured interviews were conducted to explore caregiver behaviours and the corresponding challenges for their performance. The findings of the study assisted in the development of the initial items and domains of the C-RCSES.
Following changes to items in the scale, the second study, a cross-sectional survey with 196 CGs, was conducted to evaluate the psychometric properties of the C-RCSES and to test a hypothesised self-efficacy model of family caregiving adapted from Bandura's (1997) theory. Results: 35 items were generated from the qualitative data. The content validity of the C-RCSES was assessed and ensured in Study One before it was used for the cross-sectional survey. Eight items were removed and five subscales (caregiver self-efficacy for gathering information about treatment, symptoms and health care; obtaining support; responding to problematic behaviours; management of household, personal and medical care; and controlling upsetting thoughts about caregiving) were identified after principal component factor analysis of the cross-sectional survey data. The reliability of the scale was acceptable: the Cronbach's alpha coefficients for the whole scale and for each subscale were all over .80, and the four-week test-retest reliabilities for the whole scale and for each subscale ranged from .64 to .85. The concurrent, convergent and divergent validity were also acceptable. CGs reported moderate levels of caregiver self-efficacy. Furthermore, the level of self-efficacy for management of household, personal and medical care was relatively high in comparison to those of the other four domains of caregiver self-efficacy. Caregiver self-efficacy was also significantly influenced by CGs' socio-demographic characteristics and the caregiving external factors (CR impairment and the social support that CGs obtained). The level of caregiver behaviour that CGs reported was higher than that reported in other Chinese research. CGs' socio-demographics significantly influenced caregiver behaviour, whereas caregiver self-efficacy did not.
Regarding the two external factors, CGs who cared for highly impaired relatives reported high levels of caregiver behaviour, but social support did not influence caregiver behaviour. Regarding caregiver subjective burden and health-related quality of life, CGs reported moderate levels of subjective burden, and their level of health-related quality of life was significantly lower than that of the general population in China. The findings also indicated that CGs' subjective burden and health-related quality of life were influenced by all major factors in the hypothesised model, including CGs' socio-demographics, CRs' impairment, the social support that CGs obtained, caregiver self-efficacy and caregiver behaviour. Of these factors, caregiver self-efficacy and social support significantly reduced subjective burden and improved health-related quality of life, whereas caregiver behaviour and CRs' impairment were detrimental to CGs, increasing subjective burden and worsening health-related quality of life. Conclusion: While requiring further exploration, the qualitative study was the first qualitative research conducted in China to provide an in-depth understanding of CGs' caregiving experience, including their major caregiver behaviours and the corresponding challenges. Meanwhile, although the C-RCSES needs further psychometric testing, it is a useful tool for assessing caregiver self-efficacy in Chinese populations. Results of the qualitative and quantitative studies provide useful information for future research regarding the explanatory power of caregiver self-efficacy for caregiver behaviour, subjective burden and health-related quality of life. Additionally, integrated with Bandura's theory, the findings from the quantitative study also suggest a further study exploring the role of outcome expectations in caregiver behaviour, subjective burden and health-related quality of life.
Abstract:
This study established that the core principle underlying the categorisation of activities has the potential to provide more comprehensive outcomes than the recognition of activities, because it takes into consideration activities other than directional locomotion.
Abstract:
The purpose of this study was to characterise the functional outcome of 12 transfemoral amputees fitted with an osseointegrated fixation using temporal gait characteristics. The objectives were (A) to present the cadence and the durations of the gait cycle, support and swing phases, with an emphasis on stride-to-stride and participant-to-participant variability, and (B) to compare these temporal variables with normative data extracted from the literature for transfemoral amputees fitted with a socket and for able-bodied participants. The temporal variables were extracted from the load applied on the residuum during straight level walking, collected at 200 Hz by a transducer. A total of 613 strides were assessed. The cadence was 46±4 strides/min, the duration of the gait cycle 1.29±0.11 s, the support phase 0.73±0.07 s (57±3% of the gait cycle) and the swing phase 0.56±0.07 s (43±3% of the gait cycle). Compared with transfemoral amputees using a socket, the cadence was 2% faster and the gait cycle, support and swing phases were 3% shorter, 6% shorter and 1% longer, respectively; compared with able-bodied participants, the cadence was 11% slower and the phases 9%, 6% and 13% longer, respectively. All combined, the results indicated that the fitting of an osseointegrated fixation enabled this group of amputees to restore their locomotion to a highly functional level. Further longitudinal and cross-sectional studies would be required to confirm these outcomes. Nonetheless, the data presented can be used as a benchmark for future comparisons. They can also be used as input to generic algorithms using templates of loading patterns to recognise activities of daily living and to detect falls.
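Temporal variables of this kind are typically derived by thresholding the long-axis load to detect stance onsets (heel contact) and stance ends (toe-off). The sketch below is a simplified, hypothetical version of such event detection, not the study's actual processing; the threshold value and the assumption that the trace starts and ends unloaded are mine.

```python
import numpy as np

def temporal_gait_params(load_bw, fs=200, thresh=0.1):
    """Cadence (strides/min) and mean support/swing durations (s) from a
    long-axis load trace (units of body weight) sampled at fs Hz.

    Stance is taken as load > thresh; assumes the trace starts and ends
    unloaded so contacts and toe-offs pair up cleanly.
    """
    loaded = (np.asarray(load_bw) > thresh).astype(int)
    d = np.diff(loaded)
    contacts = np.where(d == 1)[0] + 1    # stance onsets (heel contact)
    toe_offs = np.where(d == -1)[0] + 1   # stance ends (toe-off)
    support = (toe_offs - contacts) / fs  # per-step stance duration
    stride = np.diff(contacts) / fs       # contact-to-contact stride time
    swing = stride - support[:-1]         # stride minus its stance phase
    return 60.0 / stride.mean(), support.mean(), swing.mean()
```

With the 1.29 s mean stride reported above, this definition of cadence gives 60/1.29, i.e. roughly the 46 strides/min reported.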
Abstract:
People in developed countries are living longer with the help of medical advances. The literature has shown that older people prefer to stay independent and live at home for as long as possible. It is therefore important to find out how best to accommodate and assist them in maintaining quality of life and independence, while also easing demands on human resources. Researchers have claimed that assistive devices support older people's independence; however, only a small number of studies on the efficiency of assistive devices have been undertaken, several of which have stated that devices are not being used. The overall aim of this research was to identify whether the disuse and ineffectiveness of assistive devices are related to changes in abilities or to the design of the devices. The objective was to gather information from the elderly: to identify which assistive devices are and are not being used, and to gain an understanding of their attitudes towards assistive devices. The research was conducted in two phases. The initial phase involved distributing questionnaires to people over the age of fifty, with general questions and specific questions on the types of devices being used. Phase One was followed by Phase Two, in which participants from Phase One who had come into contact with assistive devices were invited to take part in a semi-structured interview. Questions were put to the interviewees on their use of, and attitudes towards, assistive devices. Findings indicated that the reasons for disuse of assistive devices were mostly design related: bulkiness, reliability, performance of the device, and difficulty of use. The other main reason for disuse was social: elderly people preferred to undertake activities on their own and only use a device as a precaution or when absolutely necessary. They would prefer not to have to rely on the devices.
Living situation and gender did not affect the preference for the use of assistive devices over personal assistance. The majority strongly supported the idea of remaining independent for as long as possible. In conclusion, this study proposes that, through these findings, product designers will gain a better understanding of the requirements of elderly users. This will enable designers to produce assistive devices that are more practical, personalised, reliable and easy to use, and that tie in with older people's environments. Additional research with different variables is recommended to further substantiate these findings.
Abstract:
Presbyopia affects individuals from the age of 45 years onwards, resulting in difficulty in accurately focusing on near objects. There are many optical corrections available, including spectacles or contact lenses, that are designed to enable presbyopes to see clearly at both far and near distances. However, presbyopic vision corrections also disturb aspects of visual function under certain circumstances. The impact of these changes on activities of daily living, such as driving, is, however, poorly understood. Therefore, the aim of this study was to determine which aspects of driving performance might be affected by wearing different types of presbyopic vision corrections. In order to achieve this aim, three experiments were undertaken. The first experiment involved administration of a questionnaire to compare the subjective driving difficulties experienced when wearing a range of common presbyopic contact lens and spectacle corrections. The questionnaire was developed and piloted, and included a series of items regarding difficulties experienced while driving under day and night-time conditions. Two hundred and fifty-five presbyopic patients responded to the questionnaire and were categorised into five groups: those wearing no vision correction for driving (n = 50), bifocal spectacles (BIF, n = 54), progressive addition lens spectacles (PAL, n = 50), monovision (MV, n = 53) and multifocal contact lenses (MTF CL, n = 48). Overall, ratings of satisfaction during daytime driving were relatively high for all correction types. However, MV and MTF CL wearers were significantly less satisfied with aspects of their vision during night-time than daytime driving, particularly with regard to disturbances from glare and haloes.
Progressive addition lens wearers noticed more distortion of peripheral vision, while BIF wearers reported more difficulties with tasks requiring changes in focus, and those who wore no vision correction for driving reported problems with intermediate and near tasks. Overall, the mean level of satisfaction for daytime driving was quite high for all of the groups (over 80%), with the BIF wearers being the least satisfied with their vision for driving. Conversely, at night, MTF CL wearers expressed the least satisfaction. Research into eye and head movements has become of increasing interest in driving research, as it provides a means of understanding how the driver responds to visual stimuli in traffic. Previous studies have found that wearing PAL can affect eye and head movement performance, resulting in slower eye movement velocities and longer times to stabilize the gaze for fixation. These changes in eye and head movement patterns may have implications for driving safety, given that the visual tasks for driving include a range of dynamic search tasks. Therefore, the second study was designed to investigate the influence of different presbyopic corrections on driving-related eye and head movements under standardized laboratory-based conditions. Twenty presbyopes (mean age: 56.1 ± 5.7 years) who had no experience of wearing presbyopic vision corrections, apart from single vision reading spectacles, were recruited. Each participant wore five different types of vision correction: single vision distance lenses (SV), PAL, BIF, MV and MTF CL. For each visual condition, participants were required to view videotape recordings of traffic scenes, track a reference vehicle and identify a series of peripherally presented targets while their eye and head movements were recorded using the faceLAB® eye and head tracking system. Digital numerical display panels were also included as near visual stimuli (simulating the visual displays of a vehicle speedometer and radio).
The results demonstrated that the path length of eye movements while viewing and responding to driving-related traffic scenes was significantly longer when wearing BIF and PAL than MV and MTF CL. The path length of head movements was greater with SV, BIF and PAL than with MV and MTF CL. Target recognition was less accurate when the near stimulus was located at eccentricities inferiorly and to the left, rather than directly below the primary position of gaze, regardless of vision correction type. The third experiment aimed to investigate the real-world driving performance of presbyopes wearing different vision corrections, measured on a closed-road circuit at night-time. Eye movements were recorded using the ASL Mobile Eye eye tracking system (as the faceLAB® system proved to be impractical for use outside of the laboratory). Eleven participants (mean age: 57.25 ± 5.78 years) were fitted with four types of prescribed vision corrections (SV, PAL, MV and MTF CL). The measures of driving performance on the closed-road circuit included distance to sign recognition, near target recognition, peripheral light-emitting-diode (LED) recognition, recognition and avoidance of low contrast road hazards, recognition of all the road signs, time to complete the course, and driving behaviours such as braking, accelerating, and cornering. The results demonstrated that driving performance at night was most affected by MTF CL compared with PAL, resulting in shorter distances to read signs, slower driving speeds, and longer times spent fixating road signs. Monovision resulted in worse performance in the sign-reading distance task compared with SV and PAL. The SV condition resulted in significantly more errors in interpreting information from in-vehicle devices, despite longer times spent fixating on these devices. Progressive addition lenses were ranked as the most preferred vision correction, while MTF CL were the least preferred for night-time driving.
This thesis addressed the research question of how presbyopic vision corrections affect driving performance, and the results of the three experiments demonstrated that the different types of presbyopic vision corrections (e.g. BIF, PAL, MV and MTF CL) can affect driving performance in different ways. Distance-related driving tasks showed reduced performance with MV and MTF CL, while tasks which involved viewing in-vehicle devices were significantly hampered by wearing SV corrections. Wearing spectacles such as SV, BIF and PAL induced greater eye and head movements in the simulated driving condition; however, this did not directly translate to impaired performance on the closed-road circuit tasks. These findings are important for understanding the influence of presbyopic vision corrections on vision under real world driving conditions. They will also assist the eye care practitioner to understand and convey to patients the potential driving difficulties associated with wearing certain types of presbyopic vision corrections, and accordingly to support them in the process of matching patients to optical corrections which meet their visual needs.
Abstract:
The Achilles tendon has been shown to exhibit time-dependent conditioning under prolonged isometric muscle actions, in contrast to the short-duration actions involved in dynamic activities such as walking. Since the effect of short-duration muscle activation associated with dynamic activities is yet to be established, the present study aimed to investigate the effect of incidental walking activity on Achilles tendon diametral strain. Eleven healthy male participants refrained from physical activity in excess of the walking required to carry out necessary daily tasks and wore an activity monitor during the 24 h study period. Achilles tendon diametral strain, 2 cm proximal to the calcaneal insertion, was determined from sagittal sonograms. Baseline sonographic examinations were conducted at ∼08:00 h, followed by replicate examinations at 12 and 24 h. Walking activity was recorded as either present (1) or absent (0), and a linear weighting function was applied to account for the proximity of walking activity to the tendon examination time. Over the course of the day the median (min, max) Achilles tendon diametral strain was −11.4 (4.5, −25.4)%. A statistically significant relationship was evident between walking activity and diametral strain (P < 0.01), and this relationship improved when walking activity was temporally weighted (AIC 131 to 126). The results demonstrate that the short yet repetitive loads generated during activities of daily living, such as walking, are sufficient to induce appreciable time-dependent conditioning of the Achilles tendon. Implications arise for the in vivo measurement of Achilles tendon properties and the rehabilitation of tendinopathy.
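The temporal weighting step can be illustrated simply. The study describes the weighting only as linear in proximity to the examination time; the sketch below assumes one plausible form, with weights falling from 1 at the examination to 0 at the start of monitoring (the function name and interval scheme are mine, not the study's).

```python
import numpy as np

def weighted_walking_score(walk_flags, interval_times_h, exam_time_h):
    """Sum binary walking flags, linearly weighted by proximity to the exam.

    walk_flags: 1 if walking occurred in an interval, else 0.
    Weights run linearly from 0 (start of monitoring) to 1 (exam time).
    One plausible form of the study's linear weighting, not its actual code.
    """
    t = np.asarray(interval_times_h, dtype=float)
    flags = np.asarray(walk_flags, dtype=float)
    w = (t - t.min()) / (exam_time_h - t.min())  # proximity weights in [0, 1]
    return float((w * flags).sum())
```

An unweighted score would simply sum the flags; the drop in AIC reported above (131 to 126) indicates the temporally weighted form fitted diametral strain better.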