866 results for Visual analog scale (VAS)


Relevance: 100.00%

Abstract:

The measurement of alcohol craving began with single-item scales; multifactorial scales were later developed with the intention of capturing the phenomenon of craving more fully. This study examines the construct validity of a multifactorial scale, the Yale-Brown Obsessive Compulsive Scale for heavy drinking (Y-BOCS-hd), and compares its clinical utility with a single-item visual-analogue craving scale. The study includes 212 alcohol-dependent subjects (127 males, 75 females) undertaking an outpatient treatment program between 1999 and 2001. Subjects completed the Y-BOCS-hd and a single-item visual-analogue scale, in addition to measures of alcohol consumption and dependence severity. The Y-BOCS-hd had strong construct validity. Both the visual-analogue alcohol craving scale and the Y-BOCS-hd were weakly associated with pretreatment dependence severity. There was a significant association between pretreatment alcohol consumption and the visual-analogue craving scale. Neither craving measure was able to predict total program abstinence or days abstinent. The relationship between obsessive-compulsive behavior in alcohol dependence and craving remains unclear.

Relevance: 100.00%

Abstract:

Fibromyalgia (FM) is a chronic rheumatic disease of unknown aetiology, characterized by widespread myofascial pain and having a major impact on quality of life (QOL). Available pharmacotherapy for FM is only marginally effective. FM is associated with co-morbid gastrointestinal (GI) disorders and irritable bowel syndrome (IBS). There is growing evidence that diets low in FODMAPs ("fermentable oligo-, di- or mono-saccharides and polyols"), known as the Low FODMAP Diet (LFD), are effective in treating IBS. The aim of this pilot study was to examine the effects of the LFD on symptoms of FM, especially with regard to pain, QOL and GI disorders. Methods A longitudinal study using an LFD intervention was performed on 38 female patients (51 ± 10 years old) diagnosed with FM for an average of 10 years, based on the ACR (American College of Rheumatology) 2010 criteria. The study was conducted from January through May 2015, using a four-week, repeated-assessment model, as follows: Moment 0 – introduction of the protocol to participants; Moment 1 – first assessment and delivery of individual LFD dietary plans; Moment 2 – second assessment and reintroduction of FODMAPs; Moment 3 – last assessment and final nutritional counselling. The assessment tools used were the RFIQ (Revised Fibromyalgia Impact Questionnaire), FSQ (Fibromyalgia Survey Questionnaire), IBS-SSS (IBS Severity Score System), EQ-5D (Euro-QOL quality of life instrument), and VAS (Visual Analogue Scale). Daily consumption of FODMAPs was quantified based on published food content analyses. Statistical analyses included ANOVA, the non-parametric Friedman test, Student's t-tests and chi-square tests, using SPSS 22 software. Results The mean scores of the 38 participants at the beginning of the study were: FSQ (severity of FM, 0–31) – 22 ± 4.4; RFIQ (0–100) – 65 ± 17; IBS-SSS (0–500) – 275 ± 101; and EQ-5D (0–100) – 48 ± 19. Mean adherence to the dietary regimen was 86%, confirmed by a significant difference in FODMAP intakes (25 g/day vs. 2.5 g/day; p < 0.01). Comparisons between the three moments of assessment showed significant (p < 0.01) declines in VAS, FSQ and RFIQ scores, in all domains measured. A marked improvement was observed in the severity of GI symptoms, with a 50% reduction in IBS-SSS scores, to 138 ± 117, following LFD therapy. A significant correlation (r = 0.36; p < 0.05) was found between improvements in FM impact (declining scores) and gastrointestinal scores. There was also a significant correlation (r = 0.65; p < 0.01) between "satisfaction with improvement" after introduction of the LFD and "diet adherence", with satisfaction with the diet reaching 77% among participants. A significant difference was observed between patients who improved and those who did not (chi-square χ2 = 6.16; p < 0.05), indicating that the probability of improvement depends on the severity of the RFIQ score. Conclusions Implementation of diet therapy involving FODMAP restriction in this cohort of FM patients resulted in a significant reduction in GI disorders and FM symptoms, including pain scores. These results need to be extended in future, larger studies on dietary therapy for the treatment of FM. Implications To the best of current scientific knowledge, these are the first relevant results from an LFD intervention in FM; they need to be reproduced before a dietary approach to FM can be established.

Relevance: 100.00%

Abstract:

OBJECTIVE Malnutrition is common among peritoneal dialysis (PD) patients, and reduced nutrient intake contributes to this. It has long been assumed that this reflects disturbed appetite. We set out to define the appetite profiles of a group of PD patients using a novel technique. DESIGN Prospective, cross-sectional comparison of PD patients versus controls. SETTING Teaching hospital dialysis unit. PATIENTS 39 PD patients and 42 healthy controls. INTERVENTION Visual analog ratings were recorded at hourly intervals to generate daily profiles for hunger and fullness. Summary statistics were generated to compare the groups. Food intake was measured using 3-day dietary records. MAIN OUTCOME MEASURES Hunger and fullness profiles; derived hunger and fullness scores. RESULTS Controls demonstrated peaks of hunger before mealtimes, with fullness scores peaking after meals. The PD profiles had markedly reduced premeal hunger peaks; a postmeal reduction in hunger was evident, but the rest of the trace was flat. The PD fullness profile was also flatter than in the controls. Mean scores were similar despite the marked discrepancy in the profiles. The PD group had lower peak hunger and less diurnal variability in their hunger scores. They also demonstrated much less change in fullness rating around mealtimes, while their mean and peak fullness scores differed little. The reported nutrient intake was significantly lower for PD. CONCLUSION The data suggest that PD patients normalize their mean appetite perception at a lower level of nutrient intake than controls, which implies that patient-reported appetite may be misleading in clinical practice. There is a loss of the usual daily variation for the PD group, which may contribute to their reduced food intake. The technique described here could be used to assess the impact of interventions upon the abnormal PD appetite profile.
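The hourly VAS profiling described above lends itself to simple summary statistics (mean, peak and diurnal variability of hunger and fullness). Below is a minimal sketch of such a summary, assuming a long-format table with hypothetical column names (subject, group, hour, hunger, fullness); it is an illustration only, not the study's analysis code.

```python
# Minimal sketch (not the study's code): per-subject summaries of hourly
# 0-100 mm VAS appetite ratings. Column names are assumptions.
import pandas as pd

def profile_summary(ratings: pd.DataFrame) -> pd.DataFrame:
    """Mean, peak and variability of hunger/fullness for each subject."""
    return (ratings
            .groupby(["group", "subject"])
            .agg(mean_hunger=("hunger", "mean"),
                 peak_hunger=("hunger", "max"),
                 hunger_variability=("hunger", "std"),   # diurnal variability
                 mean_fullness=("fullness", "mean"),
                 peak_fullness=("fullness", "max"))
            .reset_index())

# Toy example: three hourly ratings for one PD patient and one control.
toy = pd.DataFrame({
    "subject":  [1, 1, 1, 2, 2, 2],
    "group":    ["PD", "PD", "PD", "control", "control", "control"],
    "hour":     [8, 12, 18, 8, 12, 18],
    "hunger":   [20, 25, 22, 10, 70, 15],
    "fullness": [60, 55, 58, 30, 80, 40],
})
print(profile_summary(toy))
```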

Relevance: 100.00%

Abstract:

Previous studies have shown that exercise (Ex) interventions create a stronger coupling between energy intake (EI) and energy expenditure (EE), leading to greater homeostasis of the energy-balance (EB) regulatory system, compared with a diet intervention, where an un-coupling between EI and EE occurs. The benefits of weight loss from Ex and diet interventions depend greatly on compensatory responses. The present study investigated an 8-week medium-term Ex and diet intervention program (the Ex intervention comprised 500 kcal of EE five days per week over four weeks at 65-75% maximal heart rate, whereas the diet intervention comprised a 500 kcal decrease in EI five days per week over four weeks) and its effects on compensatory responses and appetite regulation among healthy individuals, using a between- and within-subjects design. The effects of an acute dietary manipulation on appetite and compensatory behaviours, and whether a diet and/or Ex intervention pre-disposes individuals to disturbances in EB homeostasis, were tested. Energy intake at an ad libitum lunch test meal after a high- and a low-energy breakfast pre-load (the high-energy pre-load contained 556 kcal and the low-energy pre-load contained 239 kcal) was measured in the Baseline (Weeks -4 to 0) and Intervention (Weeks 0 to 4) phases in 13 healthy volunteers (three males and ten females; mean age 35 years [SD ± 9] and mean BMI 25 kg/m2 [SD ± 3.8]); participants in each group were Ex = 7 and diet = 5 (one female in the diet group dropped out midway), so 12 participants completed the study. At Weeks -4, 0 and 4, visual analogue scales (VAS) were used to assess hunger and satiety, and liking and wanting (L&W) for nutrient and taste preferences were assessed using a computer-based system (E-Prime v1.1.4). Ad libitum test meal EI was consistently lower after the high-energy (HE) pre-load than after the low-energy (LE) pre-load; however, this pattern was not consistent during the diet intervention. A pre-load x group interaction on ad libitum test meal EI suggested that, during the intervention phase, the Ex group showed improved sensitivity to the energy content of the two pre-loads and improved compensation at the ad libitum test meal, whereas the diet group's ability to differentiate between the two pre-loads decreased and its compensation was poorer (F[1,10] = 2.88, not statistically significant). This study supports previous findings on the effects that Ex and diet interventions have on appetite and compensatory responses: Ex increases, and diet decreases, energy-balance sensitivity.
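A conventional way to quantify the pre-load compensation discussed above (not spelled out in the abstract) is a percent compensation index: the difference in ad libitum test-meal intake between the low- and high-energy pre-load conditions divided by the difference in pre-load energy. The sketch below uses the pre-load energies given in the abstract (556 and 239 kcal) but hypothetical test-meal intakes.

```python
# Hedged illustration (not from the thesis): a percent compensation index
# for a pre-load paradigm. 100% means the ad libitum test-meal intake fully
# adjusted for the extra energy in the high-energy pre-load; 0% means no
# adjustment at all.
HE_PRELOAD_KCAL = 556   # high-energy breakfast pre-load (from the abstract)
LE_PRELOAD_KCAL = 239   # low-energy breakfast pre-load (from the abstract)

def percent_compensation(ei_after_le: float, ei_after_he: float) -> float:
    """Percent compensation at the ad libitum test meal.

    ei_after_le / ei_after_he: test-meal energy intakes (kcal) after the
    low- and high-energy pre-loads (hypothetical values in the example).
    """
    preload_difference = HE_PRELOAD_KCAL - LE_PRELOAD_KCAL   # 317 kcal
    return 100.0 * (ei_after_le - ei_after_he) / preload_difference

# Example with made-up intakes: eating 250 kcal less after the HE pre-load
# corresponds to roughly 79% compensation for the 317 kcal difference.
print(f"{percent_compensation(900, 650):.0f}% compensation")
```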

Relevance: 100.00%

Abstract:

Diet-induced thermogenesis (DIT) is the energy expended as a consequence of meal consumption, and reflects the energy required for the processing and digestion of the food consumed throughout each day. Although DIT is the total energy expended across a day in digestive processes in response to a number of meals, most studies measure thermogenesis in response to a single meal (meal-induced thermogenesis: MIT) as a representation of an individual's thermogenic response to acute food ingestion. As a component of energy expenditure, DIT may have a contributing role in weight gain and weight loss. While the evidence is inconsistent, research has tended to reveal a suppressed MIT response in obese compared with lean individuals, identifying individuals with efficient storage of food energy and hence a greater tendency for weight gain. Appetite is another factor regulating body weight through its influence on energy intake. Preliminary research has shown a potential link between MIT and postprandial appetite, as both are responses to food ingestion and both depend on the macronutrient content of the food. There is growing interest in understanding how both MIT and appetite are modified by changes in diet, activity levels and body size. However, the findings from MIT research have been highly inconsistent, potentially because of the vastly divergent protocols used for its measurement. Therefore, the main theme of this thesis was, firstly, to address some of the methodological issues associated with measuring MIT. Additionally, this thesis aimed to measure postprandial appetite simultaneously with MIT, to test for relationships between these meal-induced variables, and to assess the changes that occur in MIT and postprandial appetite during periods of energy restriction (ER) and following weight loss. Two separate studies were conducted to achieve these aims. Given the increasing prevalence of obesity, it is important to develop accurate methodologies for measuring the components potentially contributing to its development and to understand the variability within these variables. Therefore, the aim of Study One was to establish a protocol for measuring the thermogenic response to a single test meal (MIT), as a representation of DIT across a day. This was done by determining the reproducibility of MIT with a continuous measurement protocol and determining the effect of measurement duration. The benefit of a fixed resting metabolic rate (RMR), a single measure of RMR used to calculate each subsequent measure of MIT, compared with separate baseline RMRs, measured immediately prior to each MIT test meal, was also assessed to determine which method had greater reproducibility. Subsidiary aims were to measure postprandial appetite simultaneously with MIT, to determine its reproducibility between days, and to assess potential relationships between these two variables. Ten healthy individuals (5 males, 5 females, age = 30.2 ± 7.6 years, BMI = 22.3 ± 1.9 kg/m2, fat mass = 27.6 ± 5.9%) undertook three testing sessions within a 1-4 week period. During the first visit, participants had their body composition measured using DXA for descriptive purposes, then had an initial 30-minute measure of RMR to familiarise them with the testing and to be used as a fixed baseline for calculating MIT. During the second and third testing sessions, MIT was measured.
Measures of RMR and MIT were undertaken using a metabolic cart with a ventilated hood to measure energy expenditure via indirect calorimetry, with participants in a semi-reclined position. The procedure on each MIT test day was: 1) a baseline RMR measured for 30 minutes; 2) a 15-minute break in the measure to consume a standard 576 kcal breakfast (54.3% CHO, 14.3% PRO, 31.4% FAT) comprising muesli, milk, toast, butter, jam and juice; and 3) six hours of measuring MIT, with two ten-minute breaks at 3 and 4.5 hours for participants to visit the bathroom. On the MIT test days, participants rated their subjective appetite, alertness and comfort on visual analogue scales (VAS) pre- and post-breakfast and then at 45-minute intervals. Prior to each test, participants were required to have fasted for 12 hours and to have undertaken no high-intensity physical activity for the previous 48 hours. Despite no significant group changes in the MIT response between days, individual variability was high, with an average between-day CV of 33%, which was not significantly improved (to 31%) by the use of a fixed RMR. The 95% limits of agreement, which ranged from 9.9% of energy intake (%EI) to -10.7%EI with the baseline RMRs and from 9.6%EI to -12.4%EI with the fixed RMR, indicated very large differences relative to the size of the average MIT response (MIT 1: 8.4%EI, 13.3%EI; MIT 2: 8.8%EI, 14.7%EI; baseline and fixed RMRs, respectively); a brief computational sketch of these reproducibility statistics follows this abstract. After just three hours, the between-day CV with the baseline RMR was 26%, which may indicate enhanced MIT reproducibility with shorter measurement durations. On average, 76, 89 and 96% of the six-hour MIT response was completed within three, four and five hours, respectively. Strong correlations were found between MIT at each of these time points and the total six-hour MIT (r = 0.990 to 0.998; P < 0.01). The proportion of the six-hour MIT completed at 3, 4 and 5 hours was reproducible (between-day CVs ≤ 8.5%), indicating that shorter durations can be used on repeated occasions with a similar percentage of the total response completed. There was no strong evidence of any relationship between the magnitude of the MIT response and subjective postprandial appetite. Given that a six-hour protocol places a considerable burden on participants, these results suggest that a post-meal measurement period of only three hours is sufficient to produce valid information on the metabolic response to a meal. However, while there was no mean change in MIT between test days, individual variability was large, and further research is required to better understand which factors best explain the between-day variability in this physiological measure. With such a high prevalence of obesity, dieting has become a necessity to reduce body weight. However, during periods of ER, metabolic and appetite adaptations can occur which may impede weight loss. Understanding how metabolic and appetite factors change during ER and weight loss is important for designing optimal weight-loss protocols. The purpose of Study Two was to measure the changes in the MIT response and subjective postprandial appetite during either continuous (CONT) or intermittent (INT) ER and during subsequent post-diet energy balance (post-diet EB). Thirty-six obese male participants were randomly assigned to either the CONT group (age = 38.6 ± 7.0 years, weight = 109.8 ± 9.2 kg, fat mass = 38.2 ± 5.2%) or the INT group (age = 39.1 ± 9.1 years, weight = 107.1 ± 12.5 kg, fat mass = 39.6 ± 6.8%).
The study was divided into three phases: a four-week baseline (BL) phase, during which participants were provided with a diet to maintain body weight; an ER phase lasting either 16 (CONT) or 30 (INT) weeks, during which participants were provided with a diet supplying 67% of their energy balance requirements to induce weight loss; and an eight-week post-diet EB phase, providing a diet to maintain body weight after weight loss. The INT ER phase was delivered as eight two-week blocks of ER interspersed with two-week blocks designed to achieve weight maintenance. Energy requirements for each phase were predicted based on measured RMR, and were adjusted throughout the study to account for changes in RMR. All participants completed MIT and appetite tests during the BL and ER phases. Nine CONT and 15 INT participants completed the post-diet EB MIT tests, and 15 CONT and 14 INT participants completed the post-diet EB appetite tests. The MIT test day protocol was as follows: 1) a baseline RMR measured for 30 minutes; 2) a 15-minute break in the measure to consume a standard breakfast meal (874 kcal, 53.3% CHO, 14.5% PRO, 32.2% FAT); and 3) three hours of measuring MIT. MIT was calculated as the energy expenditure above the pre-meal RMR. Appetite test days were undertaken on a separate day using the same 576 kcal breakfast used in Study One. VAS were used to assess appetite pre- and post-breakfast, at one hour post-breakfast and then a further three times at 45-minute intervals. Appetite ratings for hunger and fullness were calculated as both the intra-meal change in appetite and the area under the curve (AUC). The three-hour MIT responses at BL, ER and post-diet EB were, respectively, 5.4 ± 1.4%EI, 5.1 ± 1.3%EI and 5.0 ± 0.8%EI for the CONT group and 4.4 ± 1.0%EI, 4.7 ± 1.0%EI and 4.8 ± 0.8%EI for the INT group. Compared with BL, neither group had significant changes in the MIT response during ER or post-diet EB. There were no significant time-by-group interactions (p = 0.17), indicating a similar response to ER and post-diet EB in both groups. Contrary to what was hypothesised, there was a significant increase in postprandial AUC fullness in response to ER in both groups (p < 0.05); however, there were no significant changes in any of the other postprandial hunger or fullness variables. Despite no changes in MIT in either the CONT or the INT group in response to ER or post-diet EB, and only a minor increase in postprandial AUC fullness, individual changes in MIT and postprandial appetite in response to ER were large. However, those with the greatest MIT changes did not have the greatest changes in postprandial appetite. This study shows that postprandial appetite and MIT are unlikely to be altered during ER and are unlikely to hinder weight loss. Additionally, there were no changes in MIT in response to weight loss, indicating that body weight did not influence the magnitude of the MIT response. There were large individual changes in both variables; however, further research is required to determine whether these were real compensatory responses to ER or simply between-day variation. Overall, the results of this thesis add to the current literature by demonstrating the large variability of continuous MIT measurements, which makes it difficult to compare MIT between groups and in response to diet interventions. This thesis also provides evidence that shorter measures may provide equally valid information about the total MIT response and can therefore be used in future research to reduce the burden of long measurement durations.
This thesis indicates that MIT and postprandial subjective appetite are most likely independent of each other. This thesis also shows that, on average, energy restriction was not associated with compensatory changes in MIT and postprandial appetite that would have impeded weight loss. However, the large inter-individual variability supports the need to examine individual responses in more detail.
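Study One above summarises between-day reproducibility of MIT as a within-subject coefficient of variation (about 33%) and as 95% limits of agreement of roughly ±10 %EI. The sketch below shows how such statistics can be computed from two repeated measures per participant; the data are synthetic and the implementation is an assumption, not the thesis code.

```python
# Assumed implementation (synthetic data): between-day CV and Bland-Altman
# 95% limits of agreement for paired MIT measurements expressed as % of
# energy intake (%EI).
import numpy as np

def between_day_cv(day1: np.ndarray, day2: np.ndarray) -> float:
    """Mean within-subject CV (%) across participants for two repeats."""
    pairs = np.column_stack([day1, day2])
    within_sd = pairs.std(axis=1, ddof=1)
    within_mean = pairs.mean(axis=1)
    return float(np.mean(within_sd / within_mean) * 100)

def limits_of_agreement(day1: np.ndarray, day2: np.ndarray) -> tuple[float, float]:
    """Bland-Altman 95% limits of agreement (bias +/- 1.96 SD of differences)."""
    diff = day1 - day2
    bias, sd = diff.mean(), diff.std(ddof=1)
    return bias - 1.96 * sd, bias + 1.96 * sd

# Toy example: MIT (%EI) measured on two test days in five participants.
mit_day1 = np.array([8.1, 9.5, 7.2, 10.4, 6.8])
mit_day2 = np.array([9.0, 7.8, 8.5, 9.1, 8.2])
print(f"between-day CV: {between_day_cv(mit_day1, mit_day2):.1f}%")
print("95% limits of agreement (%EI):", limits_of_agreement(mit_day1, mit_day2))
```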

Relevance: 100.00%

Abstract:

Background Aphasia is an acquired language disorder that can present a significant barrier to patient involvement in healthcare decisions. Speech-language pathologists (SLPs) are viewed as experts in the field of communication. However, many SLP students do not receive practical training in techniques to communicate with people with aphasia (PWA) until they encounter PWA during clinical education placements. Methods This study investigated the confidence and knowledge of SLP students in communicating with PWA prior to clinical placements, using a customised questionnaire. Confidence in communicating with PWA was assessed using a 100-point visual analogue scale. Linear and logistic regressions were used to examine the associations between confidence and age, and between confidence and course type (graduate-entry masters or undergraduate), respectively. Knowledge of strategies to assist communication with PWA was examined by asking respondents to list specific strategies that could assist communication with PWA. Results SLP students were not confident about the prospect of communicating with PWA, reporting a median of 29 points (interquartile range 17–47) on the visual analogue confidence scale. Only four (8.2%) respondents rated their confidence greater than 55 (out of 100). Regression analyses indicated no relationship between confidence and students' age (p = 0.31, r-squared = 0.02), or between confidence and course type (p = 0.22, pseudo r-squared = 0.03). Students displayed limited knowledge of communication strategies. Thematic analysis of the strategies revealed four overarching themes: Physical, Verbal Communication, Visual Information and Environmental Changes. While most students identified potential use of resources (such as images and written information), fewer students identified strategies to alter their verbal communication (such as reduced speech rate). Conclusions SLP students who had received aphasia-related theoretical coursework, but had not commenced clinical placements with PWA, were not confident in their ability to communicate with PWA. Students may benefit from an educational intervention or curriculum modification incorporating practical training in effective strategies to communicate with PWA before they encounter PWA in clinical settings. Ensuring that students have confidence in, and knowledge of, potential communication strategies to assist communication with PWA may allow them to focus their learning experiences on more specific clinical domains, such as clinical reasoning, rather than on building foundation interpersonal communication skills.
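As a rough illustration of the two models described above (a linear regression of VAS confidence on age, and a logistic regression involving the binary course type), the sketch below fits both on synthetic data with statsmodels. The column names and exact model specification are assumptions, not the study's analysis code.

```python
# Hedged sketch (synthetic data): one plausible way to fit the two models
# described in the abstract. Column names (confidence, age, grad_entry)
# are assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "confidence": rng.uniform(0, 100, 49),     # 0-100 point VAS confidence score
    "age": rng.normal(23, 4, 49).round(),
    "grad_entry": rng.integers(0, 2, 49),      # 1 = graduate-entry masters
})

# Linear regression: confidence ~ age
linear = sm.OLS(df["confidence"], sm.add_constant(df["age"])).fit()
print(linear.params, linear.rsquared)

# Logistic regression: course type ~ confidence (binary outcome)
logistic = sm.Logit(df["grad_entry"], sm.add_constant(df["confidence"])).fit(disp=0)
print(logistic.params, logistic.prsquared)     # McFadden pseudo r-squared
```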

Relevance: 100.00%

Abstract:

Despite the prevalence of acute cough (<2 weeks duration) in children, the burden to parents and families is largely unknown. The objectives of this study were to determine the parental burden of children's acute cough, and to evaluate psychological and other influences on the reported burden of acute cough in children. Methods Parents of children with a current acute cough (<2 weeks) at enrolment completed 4 questionnaires (the state trait anxiety inventory (STAI); the short form health survey (SF-8); the depression, anxiety and stress 21-item scale (DASS21); and our preliminary 48-item parent acute cough specific quality of life (PAC-QOL48) questionnaire). In the PAC-QOL48, lower scores reflect worse QOL. Results The median age of the 104 children enrolled was 2.63 (IQR 1.42, 4.79) years; 54 were boys. Median length of cough at enrolment was 3 (IQR 2, 5) days. The median total PAC-QOL48 score of parents enrolled at presentation to the emergency department (n = 70) was significantly worse than that of parents enrolled through the community (n = 24) (p < 0.01). More than half (n = 55) had sought medical assistance more than once for the current acute coughing illness. PAC-QOL48 score was significantly negatively correlated with the verbal category descriptive and visual analogue scale cough scores (Spearman r = −0.26, p = 0.05 and r = −0.46, p = 0.01 respectively) and with the DASS21 total score (r = −0.36, p = 0.01), but not with the child's age. Conclusions Consistent with data on chronic cough, stress was the predominant factor in parental burden. This study highlights the ongoing need for clinicians to be cognizant of parental worries and concerns when their children are coughing, and for further research into safe and effective therapies for acute cough in children.
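The correlations reported above are Spearman rank correlations between the PAC-QOL48 total score and the cough and DASS21 scores. A minimal sketch on synthetic data (not the study data) showing how such a correlation is obtained:

```python
# Minimal sketch (synthetic data): Spearman rank correlation of the kind
# reported between PAC-QOL48 total score and a VAS cough score.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n = 104
vas_cough = rng.uniform(0, 100, n)                        # worse cough = higher score
pac_qol48 = 240 - 0.5 * vas_cough + rng.normal(0, 40, n)  # lower score = worse QOL

rho, p = spearmanr(pac_qol48, vas_cough)
print(f"Spearman r = {rho:.2f}, p = {p:.3f}")             # expect a negative correlation
```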

Relevance: 100.00%

Abstract:

This study investigated the influence of two different intensities of acute interval exercise on food preferences and appetite sensations in overweight and obese men. Twelve overweight/obese males (age = 29.0 ± 4.1 years; BMI = 29.1 ± 2.4 kg/m2) completed three exercise sessions: an initial graded exercise test and two interval cycling sessions, moderate-intensity (MIIT) and high-intensity (HIIT), performed on separate days in a counterbalanced order. The MIIT session involved cycling in 5-minute repetitions at workloads alternating between 20% below and 20% above the intensity of maximal fat oxidation. The HIIT session consisted of alternating bouts of 15 seconds of cycling at 85% VO2max and 15 seconds of unloaded recovery. Appetite sensations and food preferences were measured immediately before and after the exercise sessions using the Visual Analogue Scale and the Liking & Wanting experimental procedure. Results indicated that liking significantly increased and wanting significantly decreased in all food categories after both MIIT and HIIT. There were no differences between MIIT and HIIT in their effects on appetite sensations or Liking & Wanting. In conclusion, manipulating the intensity of acute interval exercise did not affect appetite or food preferences.

Relevance: 100.00%

Abstract:

Objectives To compare the efficacy of two exercise programs in reducing pain and disability for individuals with non-specific low back pain (NSLBP), and to examine the underlying mechanical factors related to pain and disability in individuals with NSLBP. Design A single-blind, randomized controlled trial. Methods Eighty participants were recruited from eleven community-based general medical practices and randomized into two groups completing either a lumbopelvic motor control program or a combined lumbopelvic motor control and progressive hip strengthening exercise therapy program. All participants received an education session, 6 rehabilitation sessions including real-time ultrasound training, and a home-based exercise program manual and log book. The primary outcomes were pain (0–100 mm visual analogue scale) and disability (Oswestry Disability Index V2). The secondary outcomes were hip strength (N/kg) and two-dimensional frontal plane biomechanics (°) measured during the static Trendelenburg test and while walking. All outcomes were measured at baseline and at 6-week follow-up. Results There was no statistically significant difference between groups in the change in pain (mean difference = -4.0 mm, t = -1.07, p = 0.29, 95% CI -11.5 to 3.5) or disability (mean difference = -0.3%, t = -0.19, p = 0.85, 95% CI -3.5 to 2.8). Within-group comparisons revealed clinically meaningful reductions in pain for both Group One (mean change = -20.9 mm, 95% CI -25.7 to -16.1) and Group Two (mean change = -24.9 mm, 95% CI -30.8 to -19.0). Conclusion Both exercise programs had similar efficacy in reducing pain. The addition of hip strengthening exercises to a motor control exercise program does not appear to result in improved clinical outcomes for pain in individuals with non-specific low back pain.
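The within-group results above are mean changes in 0-100 mm VAS pain with 95% confidence intervals. The sketch below computes a within-group mean change and a t-distribution-based 95% CI on synthetic data; it is an assumed approach, not necessarily the trial's exact analysis.

```python
# Illustrative sketch (synthetic data): within-group mean change in
# 0-100 mm VAS pain with a 95% confidence interval.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
baseline = rng.uniform(40, 80, 40)           # 40 hypothetical participants
week6 = baseline - rng.normal(21, 15, 40)    # ~21 mm average improvement

change = week6 - baseline
mean_change = change.mean()
sem = stats.sem(change)
ci_low, ci_high = stats.t.interval(0.95, len(change) - 1,
                                   loc=mean_change, scale=sem)
print(f"mean change = {mean_change:.1f} mm (95% CI {ci_low:.1f} to {ci_high:.1f})")
```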

Relevance: 100.00%

Abstract:

Background Resistance exercise is emerging as a potential adjunct therapy to aid in the management of breast cancer-related lymphoedema (BCRL). However, the mechanisms underlying the relationships between the acute and long-term benefits of resistance exercise on BCRL are not well understood. Purpose To examine the acute inflammatory response to upper-body resistance exercise in women with BCRL, and to compare these effects between resistance exercises involving low, moderate and high loads. The impact on lymphoedema status and associated symptoms was also compared. Methods Twenty-one women aged 62 ± 10 years with mild to severe BCRL participated in the study. Participants completed low-load (15-20 repetition maximum), moderate-load (10-12 repetition maximum) and high-load (6-8 repetition maximum) exercise sessions, each consisting of three sets of six upper-body resistance exercises. Sessions were completed in a randomized order, separated by a seven- to ten-day wash-out period. Venous blood samples were obtained to assess markers of exercise-induced muscle damage and inflammation (creatine kinase [CK], C-reactive protein [CRP], interleukin-6 [IL-6] and tumour necrosis factor-alpha [TNF-α]). Lymphoedema status was assessed using bioimpedance spectroscopy and arm circumferences, and associated symptoms were assessed using visual analogue scales (VAS) for pain, heaviness and tightness. Measurements were conducted before and 24 hours after the exercise sessions. Results No significant changes in CK, CRP, IL-6 or TNF-α were observed following the low-, moderate- or high-load resistance exercise sessions. There were no significant changes in arm swelling or symptom severity scores across the three resistance exercise conditions. Conclusions The magnitude of acute exercise-induced inflammation following upper-body resistance exercise in women with BCRL does not vary between resistance exercise loads. Given these observations, moderate- to high-load resistance training is recommended for this patient population, as these loads prompt superior physiological and functional benefits.

Relevance: 100.00%

Abstract:

Purpose: To evaluate the efficacy and safety of adalimumab in patients with non-radiographic axial spondyloarthritis (nr-axSpA). Methods: Patients fulfilled Assessment of Spondyloarthritis international Society (ASAS) criteria for axial spondyloarthritis, had a Bath Ankylosing Spondylitis Disease Activity Index (BASDAI) score of ≥ 4, total back pain score of ≥ 4 (10 cm visual analogue scale) and inadequate response, intolerance or contraindication to non-steroidal anti-inflammatory drugs (NSAIDs); patients fulfilling modified New York criteria for ankylosing spondylitis were excluded. Patients were randomised to adalimumab (N=91) or placebo (N=94). The primary endpoint was the percentage of patients achieving ASAS40 at week 12. Efficacy assessments included BASDAI and Ankylosing Spondylitis Disease Activity Score (ASDAS). MRI was performed at baseline and week 12 and scored using the Spondyloarthritis Research Consortium of Canada (SPARCC) index. Results: Significantly more patients in the adalimumab group achieved ASAS40 at week 12 compared with patients in the placebo group (36% vs 15%, p<0.001). Significant clinical improvements based on other ASAS responses, ASDAS and BASDAI were also detected at week 12 with adalimumab treatment, as were improvements in quality of life measures. Inflammation in the spine and sacroiliac joints on MRI significantly decreased after 12 weeks of adalimumab treatment. Shorter disease duration, younger age, elevated baseline C-reactive protein or higher SPARCC MRI sacroiliac joint scores were associated with better week 12 responses to adalimumab. The safety profile was consistent with what is known for adalimumab in ankylosing spondylitis and other diseases. Conclusions: In patients with nr-axSpA, adalimumab treatment resulted in effective control of disease activity, decreased inflammation and improved quality of life compared with placebo. Results from ABILITY-1 suggest that adalimumab has a positive benefit-risk profile in active nr-axSpA patients with inadequate response to NSAIDs.
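The primary endpoint above compares ASAS40 responder proportions of roughly 36% of 91 adalimumab-treated patients versus 15% of 94 placebo-treated patients. As an illustration only (not necessarily the trial's statistical method), such a comparison of proportions can be tested with a chi-square test on the 2×2 contingency table:

```python
# Hedged sketch: comparing ASAS40 responder proportions (about 36% of 91 vs
# 15% of 94, as reported) with a chi-square test on the 2x2 table. The
# trial's actual statistical method may differ.
import numpy as np
from scipy.stats import chi2_contingency

responders_ada, n_ada = round(0.36 * 91), 91   # ~33 of 91
responders_pbo, n_pbo = round(0.15 * 94), 94   # ~14 of 94

table = np.array([
    [responders_ada, n_ada - responders_ada],
    [responders_pbo, n_pbo - responders_pbo],
])
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.4f}")
```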

Relevance: 100.00%

Abstract:

OBJECTIVE To develop a short and easy-to-use questionnaire to measure the use and usability of custom-made orthopaedic shoes, and to investigate its reproducibility. DESIGN Development of the questionnaire (Monitor Orthopaedic Shoes) was based on a literature search, expert interviews, 2 expert meetings, and exploration and testing of reproducibility. The questionnaire comprises 2 parts: a pre part, measuring expectations; and a post part, measuring experiences. PATIENTS The pre part of the final version was completed twice by 37 first-time users before delivery of their orthopaedic shoes. The post part of the final version was completed twice by 39 first-time users who had worn their orthopaedic shoes for 2–4 months. RESULTS High reproducibility scores (Cohen's kappa > 0.60 or intraclass correlation > 0.70) were found for all but one question in both parts of the final version of the Monitor Orthopaedic Shoes questionnaire. The smallest real difference on a visual analogue scale (100 mm) ranged from 21 to 50 mm. It took patients approximately 15 minutes to complete one part. CONCLUSION Monitor Orthopaedic Shoes is a practical and reproducible questionnaire that can measure relevant aspects of the use and usability of orthopaedic shoes from a patient's perspective.
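Reproducibility above is summarised with Cohen's kappa (> 0.60), intraclass correlation (> 0.70) and the smallest real difference on a 100 mm visual analogue scale. The sketch below computes Cohen's kappa for a repeated categorical answer and derives a smallest real difference from an assumed ICC via the conventional formulas SEM = SD × sqrt(1 − ICC) and SRD = 1.96 × sqrt(2) × SEM; these are standard formulas, not necessarily the authors' exact computations.

```python
# Hedged sketch: reproducibility statistics of the kind reported. The SRD
# formula used here (1.96 * sqrt(2) * SEM, with SEM = SD * sqrt(1 - ICC))
# is a standard choice, not necessarily the authors' exact computation.
import math
from sklearn.metrics import cohen_kappa_score

# Test-retest answers to one categorical question (synthetic).
first  = ["yes", "no", "yes", "yes", "no", "yes", "no", "yes"]
second = ["yes", "no", "yes", "no",  "no", "yes", "no", "yes"]
print("Cohen's kappa:", round(cohen_kappa_score(first, second), 2))

def smallest_real_difference(sd_mm: float, icc: float) -> float:
    """SRD (mm) on a 100 mm VAS from a between-subject SD and a test-retest ICC."""
    sem = sd_mm * math.sqrt(1 - icc)
    return 1.96 * math.sqrt(2) * sem

# Hypothetical values: an SD of 25 mm and an ICC of 0.75 give an SRD of
# about 35 mm, within the 21-50 mm range reported in the abstract.
print("SRD:", round(smallest_real_difference(sd_mm=25.0, icc=0.75), 1), "mm")
```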

Relevance: 100.00%

Abstract:

Continuous epidural analgesia (CEA) and continuous spinal postoperative analgesia (CSPA) provided by a mixture of local anaesthetic and opioid are widely used for postoperative pain relief. With the introduction of so-called microcatheters, for example, CSPA found its way into orthopaedic surgery in particular. These techniques, however, may be associated with dose-dependent side-effects such as hypotension, weakness in the legs, and nausea and vomiting. At times, they may fail to offer sufficient analgesia, e.g. because of a misplaced catheter. The correct position of an epidural catheter might be confirmed by the supposedly easy and reliable epidural stimulation test (EST). The aims of this thesis were to determine a) whether the efficacy, tolerability and reliability of CEA might be improved by adding the α2-adrenergic agonists adrenaline and clonidine to CEA, and by the repeated use of EST during CEA; and b) the feasibility of CSPA given through a microcatheter after vascular surgery. Studies I–IV were double-blinded, randomized, controlled trials; Study V was of a diagnostic, prospective nature. Patients underwent arterial bypass surgery of the legs (I, n=50; IV, n=46), total knee arthroplasty (II, n=70; III, n=72), and abdominal surgery or thoracotomy (V, n=30). Postoperative lumbar CEA consisted of regular mixtures of ropivacaine and fentanyl, either without or with adrenaline (2 µg/ml (I) and 4 µg/ml (II)) or clonidine (2 µg/ml (III)). CSPA (IV) was given through a microcatheter (28G) and contained either ropivacaine (max. 2 mg/h) or a mixture of ropivacaine (max. 1 mg/h) and morphine (max. 8 µg/h). Epidural catheter tip position (V) was evaluated both by EST, at the moment of catheter placement and several times during CEA, and by epidurography as the reference diagnostic test. CEA and CSPA were administered for 24 or 48 h. Study parameters included pain scores assessed with a visual analogue scale, requirements for rescue pain medication, vital signs, and side-effects. Adrenaline (I and II) had no beneficial influence on the efficacy or tolerability of CEA; the total amounts of epidurally infused drugs were even increased in the adrenaline group in Study II (p=0.02, RM ANOVA). Clonidine (III) augmented pain relief with lower amounts of epidurally infused drugs (p=0.01, RM ANOVA) and a reduced need for rescue oxycodone given i.m. (p=0.027, MW-U; median difference 3 mg (95% CI 0–7 mg)). Clonidine did not contribute to sedation, and its influence on haemodynamics was minimal. CSPA (IV) provided satisfactory pain relief with only limited blockade of the legs (no inter-group differences). EST (V) was often associated with technical problems and difficulties of interpretation; for example, it failed to identify the four patients whose catheters were already outside the spinal canal at the time of catheter placement. As adjuvants to lumbar CEA, clonidine only slightly improved pain relief, while adrenaline did not provide any benefit. The role of EST applied at the time of epidural catheter placement, or repeatedly during CEA, remains open. The microcatheter CSPA technique appeared effective and reliable, but needs to be compared with routine CEA after peripheral arterial bypass surgery.

Relevance: 100.00%

Abstract:

The aim of this study was to evaluate, using visual analogue scale questionnaires, the perception of pain after insertion of the first orthodontic archwire, comparing the analgesic effect of ibuprofen, acetaminophen, placebo and chewing gum. The study was also based on the hypotheses that ibuprofen, acetaminophen and chewing gum would be more effective than placebo in controlling pain of orthodontic origin, and that chewing gum could be an alternative to ibuprofen and acetaminophen in managing dental pain of orthodontic origin. Forty-one patients from the Orthodontics Clinic of the School of Dentistry of the Universidade do Estado do Rio de Janeiro took part in this study. Patients were randomly allocated to five different groups: placebo, acetaminophen 500 mg, ibuprofen 400 mg, chewing gum, and control. All subjects had brackets with .022" slots bonded to their teeth and molars banded in one of the arches. The placebo, ibuprofen and acetaminophen groups were instructed to take one capsule of the respective compound immediately after insertion of the initial .014 nickel-titanium archwire and, if pain persisted, every 6 hours for one week. The chewing gum group was instructed to chew one piece of gum for 5 minutes immediately after insertion of the initial .014 nickel-titanium archwire and, if pain persisted, for 5 minutes every 6 hours for one week. The control group received no pain control method. Subjects were instructed to record on visual analogue scales, during the first 24 hours at 09:00, 13:00, 17:00 and 21:00, their perception of spontaneous pain and of pain during chewing. From the third to the twenty-first day, ratings were recorded at only two time points, 09:00 and 21:00. Descriptive statistical analysis led to the conclusion that placebo was more effective than ibuprofen, acetaminophen and chewing gum in controlling orthodontic pain, both for spontaneous pain and for pain during chewing. The chewing gum group was as effective as acetaminophen in controlling spontaneous pain 24 hours after insertion of the initial archwire. For relief of pain during chewing, chewing gum may be an alternative to pharmacological management of orthodontic pain.

Relevance: 100.00%

Abstract:

Background: Cancer-related fatigue (CRF) is considered the most severe, debilitating and under-managed symptom of cancer. Patients receiving chemotherapy experience high levels of CRF, which profoundly impacts their lives. Aims: 1) To explore and measure CRF and determine the most effective self-care strategies used to combat CRF in a cohort of patients with a diagnosis of cancer (breast cancer, colorectal cancer, Hodgkin's and non-Hodgkin's lymphoma). 2) To explore self-care agency and its relationship to CRF. Method: A mixed-methods study incorporating a descriptive, comparative, correlational design and qualitative descriptions of patients' (n=362) experiences, gleaned through open-ended questions and use of a diary. The study utilised the Revised Piper Fatigue Scale, the Appraisal of Self-Care Agency scale, and a researcher-developed Fatigue Visual Analogue Scale, Fatigue Self-Care Survey and Diary. Findings: Having breast cancer, Hodgkin's lymphoma or non-Hodgkin's lymphoma, and using the strategies of counselling, taking a 20–30 minute nap, resting and sleeping, self-monitoring and complementary therapies, were all associated with increased odds of developing fatigue. Increased self-care agency, being divorced or separated, being widowed, increased length of time since commencement of chemotherapy, engagement in exercise, and socializing were associated with a reduced risk of developing fatigue. Females had 20% higher fatigue levels than males (p < .001). Receiving support was the strategy used most frequently and rated most effective. Fatigue was very problematic and distressing; four key qualitative categories emerged: the behavioural, affective, sensory and cognitive impacts. Keeping a diary was considered very beneficial and cathartic. Conclusions: Fatigue severely impacted the daily lives of patients undergoing chemotherapy. There is a range of self-care strategies that patients should be encouraged to use, e.g. exercise, socializing and enhancement of psychological well-being. The enhancement of self-care agency and the use of diaries should also be considered.