994 results for Chronic poverty
Abstract:
Most people complain about being tired and wish they could sleep in for a few more hours instead of going to work. However, 'being tired' has a whole different meaning for people living with chronic fatigue syndrome, which is also known as myalgic encephalomyelitis.
Abstract:
Chronic myeloid leukemia (CML) is a malignant clonal blood disease that originates from a pluripotent hematopoietic stem cell. The cytogenetic hallmark of CML, the Philadelphia chromosome (Ph), is formed as a result of a reciprocal translocation between chromosomes 9 and 22, which leads to the formation of a chimeric BCR-ABL fusion gene. The BCR-ABL protein is a constitutively active tyrosine kinase that changes the adhesion properties of cells, constitutively activates mitogenic signaling, enhances cell proliferation and reduces apoptosis. This results in leukemic growth and the clinical disease, CML. With the advent of targeted therapies against the BCR-ABL fusion protein, the treatment of CML has changed considerably during the past decade. In this thesis, the clinical significance of different diagnostic methods and new prognostic factors in CML has been assessed. First, the association between two different methods for measuring CML disease burden (RQ-PCR and high mitotic index metaphase FISH) was assessed in bone marrow and peripheral blood samples. The correlation between positive RQ-PCR and metaphase FISH samples was high. However, RQ-PCR was more sensitive and yielded measurable transcripts in 40% of the samples that were negative by metaphase FISH. The study established a laboratory-specific conversion factor for setting up the International Scale when standardizing RQ-PCR measurements (see the formula sketch after this abstract). Secondly, the amount of minimal residual disease (MRD) after allogeneic hematopoietic stem cell transplantation (alloHSCT) was determined. For this, metaphase FISH was performed on the bone marrow samples of 102 CML patients. Most (68%) had no residual cells during the entire follow-up time. Some patients (12%) had minor (<1%) MRD, which decreased even further with time, whereas 19% had a progressive rise in MRD that exceeded 1%, or had more than 1% residual cells when first detected. Residual cells were not eradicated spontaneously if the frequency of Ph+ cells exceeded 1% during follow-up. Next, the impact of deletions in the derivative chromosome 9 was examined. Deletions were observed in 15% of the CML patients who later received alloHSCT. After alloHSCT, there was no difference in the total relapse rate between patients with and without deletions. Nor did the estimates of overall survival, transplant-related mortality, leukemia-free survival and relapse-free time show any difference between these groups. When conventional treatment regimens are used, der(9) status could be an important criterion, in conjunction with other prognostic factors, when allogeneic transplantation is considered. The significance of der(9) deletions for patients treated with tyrosine kinase inhibitors is not clear and requires further investigation. In addition to der(9) status, the significance of bone marrow lymphocytosis as a prognostic factor in CML was assessed. Bone marrow lymphocytosis during imatinib therapy was a positive predictive factor and heralded an optimal response. When combined with major cytogenetic response at three months of treatment, bone marrow lymphocytosis predicted a prognostically important major molecular response at 18 months of imatinib treatment. Although validation of these findings is warranted, determination of the bone marrow lymphocyte count could already be included in the evaluation of early response to imatinib treatment. Finally, BCR-ABL kinase domain mutations were studied in CML patients resistant to imatinib treatment.
Point mutations detected in the kinase domain were the same as previously reported, but other sequence variants, e.g. deletions or exon splicing, were also found. The clinical significance of the other variations remains to be determined.
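To make the role of such a conversion factor concrete, the following is a minimal sketch of how RQ-PCR results are conventionally expressed on the International Scale; the choice of control gene and the numerical value of the factor are laboratory-specific and are not given in the abstract.

\[
\mathrm{BCR\text{-}ABL}^{\mathrm{IS}}\,(\%) \;=\; \mathrm{CF} \times \frac{\text{BCR-ABL transcript copies}}{\text{control-gene transcript copies}} \times 100
\]

On this scale, 100% corresponds to the standardized baseline of untreated disease and 0.1% corresponds to a major molecular response; the laboratory-specific conversion factor CF aligns a local assay with that common reference.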
Abstract:
Having worked on the serotonin (5-hydroxytryptamine, 5-HT) 5-HT2B receptor for several years, we read with great interest the review by Hertz et al. (2015). Previous studies from our group demonstrated that a direct injection into the mouse raphe nucleus of the 5-HT2B agonist BW723C86 increases extracellular levels of serotonin, an effect that can be blocked by the selective 5-HT2B receptor antagonist RS127445 (Doly et al., 2008, 2009). We also reported that an acute injection of paroxetine (2 mg/kg) in mice knocked out for the 5-HT2B receptor gene, or in wild-type mice injected with RS127445 (0.5 mg/kg), triggers a strong reduction in the extracellular accumulation of 5-HT in the hippocampus (Diaz et al., 2012). Following these observations, we showed that acute and chronic BW723C86 injection (3 mg/kg) can mimic the behavioral and biochemical antidepressant effects of fluoxetine (3 mg/kg) and paroxetine (1 mg/kg) in mice (Diaz and Maroteaux, 2011; Diaz et al., 2012)...
Abstract:
Future time perspective - the way individuals perceive their remaining time in life - importantly influences socio-emotional goals and motivational outcomes. Recently, researchers have called for studies that investigate relationships between personality and future time perspective. Using a cross-lagged panel design, this study investigated effects of chronic regulatory focus dimensions (promotion and prevention orientation) on future time perspective dimensions (focus on opportunities and limitations). Survey data were collected twice, separated by a 3-month time lag, from 85 participants. Results of structural equation modeling showed that promotion orientation had a positive lagged effect on focus on opportunities, and prevention orientation had a positive lagged effect on focus on limitations.
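As an illustration of the cross-lagged logic only (not the authors' actual structural equation model), the sketch below fits the two lagged paths as ordinary least-squares regressions with statsmodels; the data file and variable names are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical two-wave panel data: one row per participant,
# columns measured at time 1 (_t1) and time 2 (_t2).
df = pd.read_csv("panel_waves.csv")

# Cross-lagged path: promotion orientation at T1 predicting
# focus on opportunities at T2, controlling for its T1 level.
m_opp = smf.ols(
    "opportunities_t2 ~ promotion_t1 + prevention_t1 + opportunities_t1",
    data=df,
).fit()

# Cross-lagged path: prevention orientation at T1 predicting
# focus on limitations at T2, controlling for its T1 level.
m_lim = smf.ols(
    "limitations_t2 ~ promotion_t1 + prevention_t1 + limitations_t1",
    data=df,
).fit()

print(m_opp.params)
print(m_lim.params)
```

A full cross-lagged panel analysis would estimate both equations simultaneously in an SEM framework, as the study did; the regression version above only conveys which lagged effects are being tested.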
Abstract:
Poor farmers have often been blamed for environmental problems in developing countries. It has been argued that the struggle for survival forces them to use land and other natural resources in a short-sighted way. Few studies on the subject, however, have supported this claim; the degree of household poverty and the environmental impact a household causes have not been successfully linked. To clarify the poverty-environment debate, Thomas Reardon and Steven Vosti developed the concept of investment poverty. It identifies the possibly large group of farming households that are not poor by conventional poverty measures, but whose welfare is not sufficiently above the poverty lines to allow the household to invest in more sustainable land use. Reardon and Vosti also emphasized the effect of assets on household welfare, believing that assets influence production and investment decisions. This study seeks to answer two questions: How can investment poverty be understood and measured? And what is the welfare-enhancing effect of farming households' assets? For this study, 402 farming households were interviewed in Central America, in the province of Herrera in the Republic of Panama. The welfare of these households was measured by their consumption, and local poverty lines were calculated from local food prices. In Herrera, a person needs on average 494 dollars per year to obtain adequate nutrition, or 876 dollars per year to cover other essential expenses in addition to food. 15.4% of the households studied fell below the food poverty line, i.e. the extreme poverty line, and 33.6% were moderately poor, meaning that they achieved adequate nutrition but could not afford their other basic needs. Thus, 51% of the households studied were above both poverty lines. There are significant differences between these poverty groups not only in household assets, income and investment strategies, but also in household structure, living environment and access to services. Measuring investment poverty proved challenging. In Herrera, farmers do not make investments purely for environmental protection, nor could the sustainability of land use otherwise be linked to the level of household welfare. Investment poverty was therefore sought as the level of welfare below which productive land improvement investments are no longer directly related to welfare. Such investments include, for example, planted fences, fertilization and the cultivation of improved pasture types. It was found that when household welfare falls below 1,000 dollars per person per year, such productive land improvement investments become very rare. The investment poverty line is thus roughly twice the cost of adequate nutrition, and 42.3% of the households studied were above it. Typically for these households, both spouses work, are highly educated and active in their community, the farm is more productive, more demanding crops are grown, and they have accumulated more assets than households living below the investment poverty line. This study questioned the common assumption that assets invariably benefit a farming household. Accordingly, the effect of assets on household welfare was examined by identifying the pathways through which a household's land, livestock, education and working-age members could increase its welfare. These welfare mechanisms were also thought to depend on many intervening factors.
For example, education could increase welfare if it helped secure better-paid work or start a business; but these mechanisms may be affected by, say, distance from towns or whether the household owns a vehicle. Among the poorest households, education was indeed the only form of assets studied that promoted household welfare, whereas land, livestock or labor did not help households rise out of poverty. Among wealthier households, by contrast, higher welfare was produced not only by education but also by land and labor, although this depended on many intervening variables, such as production inputs. There is thus no automatic mechanism by which assets improve household welfare. Although the wealthy generally have more livestock than the poorer, no mechanism was found in this data through which the number of livestock would produce higher welfare for farming households. Strategies for accumulating and utilizing assets also change as welfare grows, and they are affected by many external factors. The relationship between the environment and poverty thus remains unclear. Overcoming poverty requires, in the long run, that farming households rise above the investment poverty line. They would then be able to start accumulating assets and investing in more sustainable land use. At present, however, for a large share of Herreran households that line is far out of reach. How can a household reach consumption of over a thousand dollars per member if its standard of living does not even provide adequate nutrition? And even if welfare improved, environmental improvements are not necessarily to be expected if herds grow and erosion-prone pastures expand.
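As a minimal sketch of the classification used above (thresholds taken from the abstract: a 494-dollar food line, an 876-dollar total poverty line, and an approximately 1,000-dollar investment-poverty line, all per person per year), the snippet below labels a household by its annual per-person consumption; it is an illustration, not the study's actual procedure.

```python
# Poverty lines per person per year (USD), as reported in the abstract.
FOOD_LINE = 494         # extreme (food) poverty line
TOTAL_LINE = 876        # total poverty line (food plus other basic needs)
INVESTMENT_LINE = 1000  # approximate investment-poverty line (~2x the food line)

def classify_household(consumption_per_person: float) -> str:
    """Return a poverty category for annual per-person consumption in USD."""
    if consumption_per_person < FOOD_LINE:
        return "extremely poor"
    if consumption_per_person < TOTAL_LINE:
        return "moderately poor"
    if consumption_per_person < INVESTMENT_LINE:
        return "above the poverty lines but investment-poor"
    return "above the investment-poverty line"

if __name__ == "__main__":
    for consumption in (400, 700, 950, 1500):
        print(consumption, "->", classify_household(consumption))
```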
Abstract:
Dietary nitrate (NO3−) supplementation with beetroot juice (BR) over 4–6 days has been shown to reduce the O2 cost of submaximal exercise and to improve exercise tolerance. However, it is not known whether shorter (or longer) periods of supplementation have similar (or greater) effects. We therefore investigated the effects of acute and chronic NO3− supplementation on resting blood pressure (BP) and the physiological responses to moderate-intensity exercise and ramp incremental cycle exercise in eight healthy subjects. Following baseline tests, the subjects were assigned in a balanced crossover design to receive BR (0.5 l/day; 5.2 mmol of NO3−/day) and placebo (PL; 0.5 l/day low-calorie juice cordial) treatments. The exercise protocol (two moderate-intensity step tests followed by a ramp test) was repeated 2.5 h following first ingestion (0.5 liter) and after 5 and 15 days of BR and PL. Plasma nitrite concentration (baseline: 454 ± 81 nM) was significantly elevated (+39% at 2.5 h postingestion; +25% at 5 days; +46% at 15 days; P < 0.05) and systolic and diastolic BP (baseline: 127 ± 6 and 72 ± 5 mmHg, respectively) were reduced by ∼4% throughout the BR supplementation period (P < 0.05). Compared with PL, the steady-state V̇o2 during moderate exercise was reduced by ∼4% after 2.5 h and remained similarly reduced after 5 and 15 days of BR (P < 0.05). The ramp test peak power and the work rate at the gas exchange threshold (baseline: 322 ± 67 W and 89 ± 15 W, respectively) were elevated after 15 days of BR (331 ± 68 W and 105 ± 28 W; P < 0.05) but not PL (323 ± 68 W and 84 ± 18 W). These results indicate that dietary NO3− supplementation acutely reduces BP and the O2 cost of submaximal exercise and that these effects are maintained for at least 15 days if supplementation is continued.
Abstract:
- Objective This study examined chronic disease risks and the use of a smartphone activity tracking application during an intervention in Australian truck drivers (April-October 2014). - Methods Forty-four men (mean age=47.5 [SD 9.8] years) completed baseline health measures, and were subsequently offered access to a free wrist-worn activity tracker and smartphone application (Jawbone UP) to monitor step counts and dietary choices during a 20-week intervention. Chronic disease risks were evaluated against guidelines; weekly step count and dietary logs registered by drivers in the application were analysed to evaluate use of the Jawbone UP. - Results Chronic disease risks were high (e.g. 97% high waist circumference [≥94 cm]). Eighteen drivers (41%) did not start the intervention; smartphone technical barriers were the main reason for dropout. Across the 20 weeks, drivers who used the Jawbone UP logged step counts for an average of 6 [SD 1] days/week; mean step counts remained consistent across the intervention (weeks 1–4=8,743 [SD 2,867] steps/day; weeks 17–20=8,994 [SD 3,478] steps/day). The median number of dietary logs significantly decreased from the start (17 [IQR 38] logs/week) to the end of the intervention (0 [IQR 23] logs/week; p<0.01); the median proportion of healthy diet choices relative to total diet choices logged increased across the intervention (weeks 1–4=38 [IQR 21]%; weeks 17–20=58 [IQR 18]%). - Conclusions Step counts were more successfully monitored than dietary choices in those drivers who used the Jawbone UP. - Implications Smartphone technology facilitated active living and healthy dietary choices, but also impeded intervention engagement in a number of these high-risk Australian truck drivers.
Abstract:
- Background Nilotinib and dasatinib are now being considered as alternatives to imatinib for the first-line treatment of chronic myeloid leukaemia (CML). - Objective This technology assessment reviews the available evidence for the clinical effectiveness and cost-effectiveness of dasatinib, nilotinib and standard-dose imatinib for the first-line treatment of Philadelphia chromosome-positive CML. - Data sources Databases [including MEDLINE (Ovid), EMBASE, Current Controlled Trials, ClinicalTrials.gov, the US Food and Drug Administration website and the European Medicines Agency website] were searched from the search end date of the last technology appraisal report on this topic (October 2002) to September 2011. - Review methods A systematic review of clinical effectiveness and cost-effectiveness studies; a review of surrogate relationships with survival; a review and critique of manufacturer submissions; and a model-based economic analysis. - Results Two clinical trials (dasatinib vs imatinib and nilotinib vs imatinib) were included in the effectiveness review. Survival was not significantly different for dasatinib or nilotinib compared with imatinib with the 24-month follow-up data available. The rates of complete cytogenetic response (CCyR) and major molecular response (MMR) were higher for patients receiving dasatinib than for those receiving imatinib at 12 months' follow-up (CCyR 83% vs 72%, p < 0.001; MMR 46% vs 28%, p < 0.0001). The rates of CCyR and MMR were higher for patients receiving nilotinib than for those receiving imatinib at 12 months' follow-up (CCyR 80% vs 65%, p < 0.001; MMR 44% vs 22%, p < 0.0001). An indirect comparison analysis showed no difference between dasatinib and nilotinib for CCyR or MMR rates at 12 months' follow-up (CCyR, odds ratio 1.09, 95% CI 0.61 to 1.92; MMR, odds ratio 1.28, 95% CI 0.77 to 2.16). There is observational association evidence from imatinib studies supporting the use of CCyR and MMR at 12 months as surrogates for overall all-cause survival and progression-free survival in patients with CML in chronic phase. In the cost-effectiveness modelling, scenario analyses were provided to reflect the extensive structural uncertainty and the different approaches to estimating overall survival. First-line dasatinib is predicted to provide very poor value for money compared with first-line imatinib, with deterministic incremental cost-effectiveness ratios (ICERs) of between £256,000 and £450,000 per quality-adjusted life-year (QALY); how such an ICER compares with a willingness-to-pay threshold is sketched after this abstract. Conversely, first-line nilotinib provided favourable ICERs at the willingness-to-pay threshold of £20,000-30,000 per QALY. - Limitations Immaturity of empirical trial data relative to life expectancy, forcing either reliance on surrogate relationships or cumulative survival/treatment duration assumptions. - Conclusions From the two trials available, dasatinib and nilotinib have a statistically significant advantage compared with imatinib as measured by MMR or CCyR. Taking into account the treatment pathways for patients with CML, i.e. assuming the use of second-line nilotinib, first-line nilotinib appears to be more cost-effective than first-line imatinib. Dasatinib was not cost-effective compared with imatinib and nilotinib if decision thresholds of £20,000 per QALY or £30,000 per QALY were used. Uncertainty in the cost-effectiveness analysis would be substantially reduced with better and more UK-specific data on the incidence and cost of stem cell transplantation in patients with chronic-phase CML.
- Funding The Health Technology Assessment Programme of the National Institute for Health Research.
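As a minimal sketch of how an ICER relates to the £20,000-30,000 willingness-to-pay threshold discussed above, the snippet below computes an incremental cost-effectiveness ratio from hypothetical lifetime cost and QALY totals; the numbers are illustrative and are not taken from the assessment's model.

```python
def icer(cost_new: float, qaly_new: float, cost_ref: float, qaly_ref: float) -> float:
    """Incremental cost-effectiveness ratio: extra cost per extra QALY gained."""
    return (cost_new - cost_ref) / (qaly_new - qaly_ref)

# Hypothetical lifetime totals for a new first-line therapy vs the comparator.
ratio = icer(cost_new=210_000.0, qaly_new=10.2, cost_ref=120_000.0, qaly_ref=9.9)

THRESHOLD = 30_000.0  # upper end of the usual willingness-to-pay range (GBP per QALY)
print(f"ICER: £{ratio:,.0f} per QALY; cost-effective at £30,000/QALY: {ratio <= THRESHOLD}")
```

With these illustrative inputs the extra £90,000 buys only 0.3 QALYs, giving an ICER of £300,000 per QALY, which is why ratios in that range are judged poor value at conventional thresholds.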
Abstract:
Objectives In 2012, the National Institute for Health and Care Excellence assessed dasatinib, nilotinib, and standard-dose imatinib as first-line treatment of chronic phase chronic myelogenous leukemia (CML). Licensing of these alternative treatments was based on randomized controlled trials assessing complete cytogenetic response (CCyR) and major molecular response (MMR) at 12 months as primary end points. We use this case study to illustrate the validation of CCyR and MMR as surrogate outcomes for overall survival in CML and how this evidence was used to inform National Institute for Health and Care Excellence’s recommendation on the public funding of these first-line treatments for CML. Methods We undertook a systematic review and meta-analysis to quantify the association between CCyR and MMR at 12 months and overall survival in patients with chronic phase CML. We estimated life expectancy by extrapolating long-term survival from the weighted overall survival stratified according to the achievement of CCyR and MMR. Results Five studies provided data on the observational association between CCyR or MMR and overall survival. Based on the pooled association between CCyR and MMR and overall survival, our modeling showed comparable predicted mean duration of survival (21–23 years) following first-line treatment with imatinib, dasatinib, or nilotinib. Conclusions This case study illustrates the consideration of surrogate outcome evidence in health technology assessment. Although it is often recommended that the acceptance of surrogate outcomes be based on randomized controlled trial data demonstrating an association between the treatment effect on both the surrogate outcome and the final outcome, this case study shows that policymakers may be willing to accept a lower level of evidence (i.e., observational association).
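As a minimal sketch of the weighting idea described above (not the authors' actual extrapolation model), the snippet below combines simple exponential mean-survival extrapolations for responders and non-responders, weighted by the probability of achieving the surrogate response; all parameter values are hypothetical.

```python
def mean_survival_exponential(annual_mortality_rate: float) -> float:
    """Mean survival (years) under a simple exponential survival model."""
    return 1.0 / annual_mortality_rate

def expected_survival(p_response: float,
                      rate_responders: float,
                      rate_non_responders: float) -> float:
    """Overall mean survival weighted by the probability of surrogate response."""
    return (p_response * mean_survival_exponential(rate_responders)
            + (1.0 - p_response) * mean_survival_exponential(rate_non_responders))

# Hypothetical inputs: 80% achieve CCyR at 12 months; lower mortality if they do.
print(round(expected_survival(p_response=0.80,
                              rate_responders=0.04,
                              rate_non_responders=0.12), 1), "years")
```

The actual analysis used pooled observational associations and more flexible long-term survival curves; the sketch only shows how response-stratified survival is weighted into a single life-expectancy estimate per treatment.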
Abstract:
- BACKGROUND Chronic diseases are increasing worldwide and have become a significant burden to those affected. Disease-specific education programs have demonstrated improved outcomes, although people forget information quickly or recall it incorrectly. The teach-back method was introduced in an attempt to reinforce patient education. To date, the evidence regarding the effectiveness of health education employing the teach-back method in improving care has not yet been reviewed systematically. - OBJECTIVES This systematic review examined the evidence on using the teach-back method in health education programs for improving adherence and self-management of people with chronic disease. - INCLUSION CRITERIA Types of participants: Adults aged 18 years and over with one or more chronic diseases. Types of intervention: All types of interventions that included the teach-back method in an education program for people with chronic diseases. The comparator was chronic disease education programs that did not involve the teach-back method. Types of studies: Randomized and non-randomized controlled trials, cohort studies, before-after studies and case-control studies. Types of outcomes: The outcomes of interest were adherence, self-management, disease-specific knowledge, readmission, knowledge retention, self-efficacy and quality of life. - SEARCH STRATEGY Searches were conducted in CINAHL, MEDLINE, EMBASE, Cochrane CENTRAL, Web of Science, ProQuest Nursing and Allied Health Source, and Google Scholar databases. Search terms were combined with AND or OR in search strings. Reference lists of included articles were also searched for further potential references. - METHODOLOGICAL QUALITY Two reviewers conducted quality appraisal of papers using the Joanna Briggs Institute Meta-Analysis of Statistics Assessment and Review Instrument. - DATA EXTRACTION Data were extracted using the Joanna Briggs Institute Meta-Analysis of Statistics Assessment and Review Instrument data extraction instruments. - DATA SYNTHESIS There was significant heterogeneity in the selected studies; hence a meta-analysis was not possible and the results are presented in narrative form. - RESULTS Of the 21 articles retrieved in full, 12 on the use of the teach-back method met the inclusion criteria and were selected for analysis. Four studies confirmed improved disease-specific knowledge in intervention participants. One study showed a statistically significant improvement in adherence to medication and diet among patients with type 2 diabetes in the intervention group compared with the control group (p < 0.001). Two studies found statistically significant improvements in self-efficacy (p = 0.0026 and p < 0.001) in the intervention groups. One study examined quality of life in heart failure patients, but it did not improve with the intervention (p = 0.59). Five studies found reductions in readmission and hospitalization rates, but these were not always statistically significant. Two studies showed improvement in daily weighing among heart failure participants, and in adherence to diet, exercise and foot care among those with type 2 diabetes. - CONCLUSION Overall, the teach-back method showed positive effects across a wide range of health care outcomes, although these were not always statistically significant. Studies in this systematic review revealed improved outcomes in disease-specific knowledge, adherence, self-efficacy and inhaler technique.
A positive but inconsistent trend was also seen in improved self-care and reduced hospital readmission rates. There was limited evidence of improvement in quality of life or disease-related knowledge retention.
Abstract:
Objective The objective of this study was to investigate the effect of chronic kidney disease (CKD) stage 4-5 and dialysis treatment on the incidence of foot ulceration and major lower extremity amputation in comparison with CKD stage 3. Methods In this retrospective study, all individuals who visited our hospital between 2006 and 2012 because of CKD stages 3 to 5 or dialysis treatment were included. Medical records were reviewed for the incidence of foot ulceration and major amputation. The time from CKD 3, CKD 4-5, and dialysis treatment until first foot ulceration and first major lower extremity amputation was calculated and analyzed with Kaplan-Meier curves and a multivariate Cox proportional hazards model. Diabetes mellitus, peripheral arterial disease, peripheral neuropathy, and foot deformities were included as potential confounders. Results A total of 669 individuals were included: 539 in CKD 3, 540 in CKD 4-5, and 259 in dialysis treatment (individuals could progress from one group to the next). Unadjusted foot ulcer incidence rates per 1000 patients per year were 12 for CKD 3, 47 for CKD 4-5, and 104 for dialysis (P < .001). In multivariate analyses, the hazard ratio for incidence of foot ulceration was 4.0 (95% confidence interval [CI], 2.6-6.3) in CKD 4-5 and 7.6 (95% CI, 4.8-12.1) in dialysis treatment compared with CKD 3. Hazard ratios for incidence of major amputation were 9.5 (95% CI, 2.1-43.0) and 15 (95% CI, 3.3-71.0), respectively. Conclusions CKD 4-5 and dialysis treatment are independent risk factors for foot ulceration and major amputation compared with CKD 3. Maximum effort is needed in daily clinical practice to prevent foot ulcers and their devastating consequences in all individuals with CKD 4-5 or dialysis treatment.
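As a minimal sketch of the time-to-event analysis described above (a Kaplan-Meier curve plus a multivariate Cox model adjusting for the listed confounders), the snippet below uses the lifelines package; the data file and column names are hypothetical.

```python
import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter

# Hypothetical dataset: one row per individual, with follow-up time in years,
# an event indicator for first foot ulcer, and baseline covariates.
df = pd.read_csv("ckd_foot_ulcer.csv")

# Kaplan-Meier curve for one exposure group (e.g. patients on dialysis).
kmf = KaplanMeierFitter()
dialysis = df[df["group"] == "dialysis"]
kmf.fit(dialysis["years_to_ulcer"], event_observed=dialysis["foot_ulcer"], label="dialysis")

# Multivariate Cox proportional hazards model adjusting for the confounders
# named in the abstract (diabetes, peripheral arterial disease, neuropathy,
# foot deformity), with CKD 3 as the reference exposure category.
cph = CoxPHFitter()
cph.fit(
    df[["years_to_ulcer", "foot_ulcer", "ckd_4_5", "dialysis_treatment",
        "diabetes", "pad", "neuropathy", "foot_deformity"]],
    duration_col="years_to_ulcer",
    event_col="foot_ulcer",
)
cph.print_summary()  # hazard ratios with 95% confidence intervals
```

The coefficients on the exposure indicators correspond to the adjusted hazard ratios reported in the abstract (e.g. 4.0 for CKD 4-5 and 7.6 for dialysis versus CKD 3); the same setup with an amputation event column would give the amputation hazard ratios.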