Abstract:
BACKGROUND: In many countries, primary care physicians determine whether or not older drivers are fit to drive. Little is known, however, about the effects of cognitive decline on driving performance and the means to detect it. This study explores to what extent the trail making test (TMT) can give clinicians an indication of their older patients' on-road driving performance in the context of cognitive decline. METHODS: This translational study was nested within a cohort study and an exploratory psychophysics study. The target population consisted of older drivers without major cognitive or physical disorders. We therefore recruited and tested 404 home-dwelling drivers, aged 70 years or more and holding valid drivers' licenses, who volunteered to participate in a driving refresher course. Forty-five drivers also agreed to undergo further testing at our lab. On-road driving performance was evaluated by instructors during a validated 45-minute open-road circuit, and drivers were classified as excellent, good, moderate, or poor according to their score on a standardized evaluation of on-road driving performance. RESULTS: The area under the receiver operating characteristic curve for detecting poorly performing drivers was 0.668 (CI95% 0.558 to 0.778) for the TMT-A and 0.662 (CI95% 0.542 to 0.783) for the TMT-B. TMT performance was related to contrast sensitivity, motion direction, orientation discrimination, working memory, verbal fluency, and literacy. Older patients with a TMT-A ≥ 54 seconds or a TMT-B ≥ 150 seconds had a threefold (CI95% 1.3 to 7.0) increased risk of performing poorly during the on-road evaluation. The TMT had a sensitivity of 63.6%, a specificity of 64.9%, a positive predictive value of 9.5%, and a negative predictive value of 96.9%. CONCLUSION: In screening settings, the TMT would lead clinicians to needlessly consider driving cessation in nine out of ten drivers who screen positive. Given the important negative impact this could have on older drivers, this study confirms that the TMT is not specific enough for clinicians to justify driving cessation without complementary investigation of driving behaviors.
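The screening arithmetic behind these figures follows from Bayes' rule: with a low prevalence of poor performers, even a moderately specific test yields a low positive predictive value. A minimal sketch in Python, assuming the roughly 5.5% prevalence implied by the reported predictive values (the prevalence itself is not stated in the abstract):

```python
# Predictive values of a binary screening test from sensitivity, specificity,
# and prevalence. The ~5.5% prevalence of poor on-road performance is an
# assumption back-calculated from the reported PPV/NPV.

def predictive_values(sensitivity: float, specificity: float, prevalence: float):
    """Return (PPV, NPV) via Bayes' rule."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    true_neg = specificity * (1 - prevalence)
    false_neg = (1 - sensitivity) * prevalence
    return true_pos / (true_pos + false_pos), true_neg / (true_neg + false_neg)

ppv, npv = predictive_values(sensitivity=0.636, specificity=0.649, prevalence=0.055)
print(f"PPV = {ppv:.1%}, NPV = {npv:.1%}")  # ~9.5% and ~96.8%, close to the reported values
```

At a PPV of 9.5%, roughly nine out of ten positive screens are false alarms, which is the "nine out of ten" figure in the conclusion.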
Abstract:
Mucocutaneous leishmaniasis is caused by infection with intracellular parasites of the Leishmania Viannia subgenus, including Leishmania guyanensis. The pathology develops after parasite dissemination to nasopharyngeal tissues, where destructive metastatic lesions form with chronic inflammation. The mechanisms involved in lesion development are currently poorly understood. Here we show that metastasizing parasites have a high Leishmania RNA virus-1 (LRV1) burden that is recognized by the host Toll-like receptor 3 (TLR3) to induce proinflammatory cytokines and chemokines. Paradoxically, these TLR3-mediated immune responses rendered mice more susceptible to infection, with increased footpad swelling and parasitemia. Thus, LRV1 in the metastasizing parasites subverted the host immune response to Leishmania and promoted parasite persistence.
Abstract:
BACKGROUND: Postmenopausal women with hormone receptor-positive early breast cancer have a persistent, long-term risk of breast-cancer recurrence and death. Therefore, trials assessing endocrine therapies for this patient population need extended follow-up. We present an update of efficacy outcomes in the Breast International Group (BIG) 1-98 study at 8·1 years median follow-up. METHODS: BIG 1-98 is a randomised, phase 3, double-blind trial of postmenopausal women with hormone receptor-positive early breast cancer that compares 5 years of tamoxifen or letrozole monotherapy, or sequential treatment with 2 years of one of these drugs followed by 3 years of the other. Randomisation was done with permuted blocks, and stratified according to the two-arm or four-arm randomisation option, participating institution, and chemotherapy use. Patients, investigators, data managers, and medical reviewers were masked. The primary efficacy endpoint was disease-free survival (events were invasive breast cancer relapse, second primaries [contralateral breast and non-breast], or death without a previous cancer event). Secondary endpoints were overall survival, distant recurrence-free interval (DRFI), and breast cancer-free interval (BCFI). The monotherapy comparison included patients randomly assigned to tamoxifen or letrozole for 5 years. In 2005, after a significant disease-free survival benefit was reported for letrozole as compared with tamoxifen, a protocol amendment facilitated the crossover to letrozole of patients who were still receiving tamoxifen alone; Cox models and Kaplan-Meier estimates with inverse probability of censoring weighting (IPCW) are used to account for the selective crossover to letrozole of patients (n=619) in the tamoxifen arm. The comparison of sequential treatments with letrozole monotherapy included patients enrolled and randomly assigned to letrozole for 5 years, letrozole for 2 years followed by tamoxifen for 3 years, or tamoxifen for 2 years followed by letrozole for 3 years. Treatment has ended for all patients, and detailed safety results for adverse events that occurred during the 5 years of treatment have been reported elsewhere. Follow-up is continuing for those enrolled in the four-arm option. BIG 1-98 is registered at ClinicalTrials.gov, number NCT00004205. FINDINGS: 8010 patients were included in the trial, with a median follow-up of 8·1 years (range 0-12·4). 2459 were randomly assigned to monotherapy with tamoxifen for 5 years and 2463 to monotherapy with letrozole for 5 years. In the four-arm option of the trial, 1546 were randomly assigned to letrozole for 5 years, 1548 to tamoxifen for 5 years, 1540 to letrozole for 2 years followed by tamoxifen for 3 years, and 1548 to tamoxifen for 2 years followed by letrozole for 3 years. At a median follow-up of 8·7 years from randomisation (range 0-12·4), letrozole monotherapy was significantly better than tamoxifen, whether by IPCW or intention-to-treat analysis (IPCW disease-free survival HR 0·82 [95% CI 0·74-0·92], overall survival HR 0·79 [0·69-0·90], DRFI HR 0·79 [0·68-0·92], BCFI HR 0·80 [0·70-0·92]; intention-to-treat disease-free survival HR 0·86 [0·78-0·96], overall survival HR 0·87 [0·77-0·999], DRFI HR 0·86 [0·74-0·998], BCFI HR 0·86 [0·76-0·98]). At a median follow-up of 8·0 years from randomisation (range 0-11·2) for the comparison of the sequential groups with letrozole monotherapy, there were no statistically significant differences in any of the four endpoints for either sequence.
Eight-year intention-to-treat estimates (each with SE ≤1·1%) for letrozole monotherapy, letrozole followed by tamoxifen, and tamoxifen followed by letrozole were 78·6%, 77·8%, and 77·3% for disease-free survival; 87·5%, 87·7%, and 85·9% for overall survival; 89·9%, 88·7%, and 88·1% for DRFI; and 86·1%, 85·3%, and 84·3% for BCFI. INTERPRETATION: For postmenopausal women with endocrine-responsive early breast cancer, letrozole monotherapy reduces breast-cancer recurrence and mortality compared with tamoxifen monotherapy. Sequential treatments involving tamoxifen and letrozole do not improve outcome compared with letrozole monotherapy, but might be useful strategies when considering an individual patient's risk of recurrence and treatment tolerability. FUNDING: Novartis, United States National Cancer Institute, International Breast Cancer Study Group.
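For readers unfamiliar with IPCW, the idea is to censor patients at crossover and re-weight the remaining observations by the inverse probability of staying uncensored, so the comparison is not biased by who chose to cross over. Below is a minimal sketch with the `lifelines` package on simulated data with hypothetical precomputed weights; it illustrates the general technique, not the trial's actual analysis code.

```python
# Sketch: a weighted Cox model in the spirit of IPCW. Follow-up is assumed
# already censored at crossover, and `ipc_weight` holds hypothetical
# precomputed inverse-probability-of-censoring weights.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(42)
n = 200
letrozole = rng.integers(0, 2, n)                    # 1 = letrozole arm, 0 = tamoxifen
scale = np.where(letrozole == 1, 12.5, 10.0)         # longer mean event time on letrozole
time_to_event = rng.exponential(scale)               # simulated event times (years)
df = pd.DataFrame({
    "years": np.minimum(time_to_event, 8.0),         # administrative censoring at 8 years
    "event": (time_to_event < 8.0).astype(int),
    "letrozole": letrozole,
    "ipc_weight": rng.uniform(0.8, 1.6, n),          # hypothetical IPC weights
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="event",
        weights_col="ipc_weight", robust=True)       # robust SEs for non-integer weights
print(cph.hazard_ratios_["letrozole"])               # weighted HR, letrozole vs tamoxifen
```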
Abstract:
BACKGROUND: The risk of falls is the most commonly cited reason for not providing oral anticoagulation, although the risk of bleeding associated with falls on oral anticoagulants is still debated. We aimed to evaluate whether patients on oral anticoagulation at high risk of falls have an increased risk of major bleeding. METHODS: We prospectively studied consecutive adult medical patients who were discharged on oral anticoagulants. The outcome was the time to a first major bleed within a 12-month follow-up period; analyses were adjusted for age, sex, alcohol abuse, number of drugs, concomitant treatment with antiplatelet agents, and history of stroke or transient ischemic attack. RESULTS: Among the 515 enrolled patients, 35 had a first major bleed during follow-up (incidence rate: 7.5 per 100 patient-years). Overall, 308 patients (59.8%) were at high risk of falls, and these patients had a nonsignificantly higher crude incidence rate of major bleeding than patients at low risk of falls (8.0 vs 6.8 per 100 patient-years, P=.64). In multivariate analysis, a high falls risk was not statistically significantly associated with the risk of a major bleed (hazard ratio 1.09; 95% confidence interval, 0.54-2.21). Only 3 major bleeds occurred directly after a fall (incidence rate: 0.6 per 100 patient-years). CONCLUSIONS: In this prospective cohort, patients on oral anticoagulants at high risk of falls did not have a significantly increased risk of major bleeds. These findings suggest that being at risk of falls is not a valid reason to avoid oral anticoagulants in medical patients.
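As a quick illustration of the rate arithmetic, an incidence rate per 100 patient-years is simply events divided by accumulated person-time. The person-years figure below is back-calculated from the reported rate, not given in the abstract:

```python
# Sketch of the incidence-rate arithmetic: 35 bleeds at 7.5 per 100
# patient-years implies roughly 467 patient-years of follow-up (an
# inference; the abstract does not report person-time directly).
events = 35
patient_years = 467  # assumed, back-calculated from the reported rate
rate_per_100py = 100 * events / patient_years
print(f"{rate_per_100py:.1f} major bleeds per 100 patient-years")  # ~7.5
```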
Abstract:
The aim of this study was to determine the effect of using video analysis software on the interrater reliability of visual assessments of gait videos in children with cerebral palsy. Two clinicians viewed the same random selection of 20 sagittal and frontal video recordings of 12 children with cerebral palsy routinely acquired during outpatient rehabilitation clinics. Both observers rated these videos in a random sequence for each lower limb using the Observational Gait Scale, once with standard video software and once with video analysis software (Dartfish®) capable of angle and timing measurements. The video analysis software improved interrater agreement, measured by weighted Cohen's kappa, for the total score (κ 0.778→0.809) and for all items that required angle and/or timing measurements (knee position mid-stance κ 0.344→0.591; hindfoot position mid-stance κ 0.160→0.346; foot contact mid-stance κ 0.700→0.854; timing of heel rise κ 0.769→0.835). The use of video analysis software is an efficient approach to improving the reliability of visual video assessments.
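Weighted kappa, the agreement statistic used above, credits partial agreement between ordinal scores rather than counting only exact matches. A minimal sketch with scikit-learn, using invented rater scores; linear weighting is an assumption, since the abstract does not state the weighting scheme:

```python
# Weighted Cohen's kappa for two raters scoring the same items on an
# ordinal scale. The scores are invented for illustration.
from sklearn.metrics import cohen_kappa_score

rater_a = [3, 2, 2, 1, 0, 3, 2, 1, 2, 3]  # hypothetical ordinal gait-item scores
rater_b = [3, 2, 1, 1, 0, 3, 3, 1, 2, 2]

kappa = cohen_kappa_score(rater_a, rater_b, weights="linear")
print(f"weighted kappa = {kappa:.3f}")
```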
Abstract:
Postmenopausal women are at greater risk for hypertension-related cardiovascular disease. Antihypertensive therapy may help alleviate arterial stiffness, a potentially modifiable risk factor of hypertension. This randomized controlled study investigated the difference between an angiotensin receptor blocker and a calcium channel blocker in reducing arterial stiffness. Overall, 125 postmenopausal hypertensive women (age, 61.4±6 years; systolic blood pressure/diastolic blood pressure [SBP/DBP], 158±11/92±9 mm Hg) were randomized to valsartan 320 mg±hydrochlorothiazide (HCTZ) (n=63) or amlodipine 10 mg±HCTZ (n=62). The primary outcome was the change in carotid-to-femoral pulse wave velocity (PWV) after 38 weeks of treatment. Both treatments lowered peripheral blood pressure (BP) (-22.9/-10.9 mm Hg for valsartan and -25.2/-11.7 mm Hg for amlodipine, P=not significant) and central BP (-15.7/-7.6 mm Hg for valsartan and -19.2/-10.3 mm Hg for amlodipine, P<.05 for central DBP). Both treatments reduced the carotid-femoral PWV to a similar extent (-1.9 vs -1.7 m/s; P=not significant). Amlodipine was associated with a higher incidence of peripheral edema than valsartan (77% vs 14%, P<.001). BP lowering in postmenopausal women led to a reduction in arterial stiffness as assessed by PWV measurement. Both regimens reduced PWV to a similar degree after 38 weeks of treatment despite differences in central BP lowering, suggesting that the effect of valsartan on PWV is mediated through nonhemodynamic effects.
Abstract:
Background: General practitioners play a central role in taking deprivation into consideration when caring for patients in primary care. Validated questions to identify deprivation in primary-care practices are still lacking. For both clinical and research purposes, this study therefore aims to develop and validate a standardized instrument measuring both material and social deprivation at an individual level. Methods: The Deprivation in Primary Care Questionnaire (DiPCare-Q) was developed using qualitative and quantitative approaches between 2008 and 2011. A systematic review identified 199 questions related to deprivation; using judgmental item quality, these were reduced to 38 questions. Two focus groups (primary-care physicians, and primary-care researchers), structured interviews (10 laymen), and think-aloud interviews (eight cleaning staff) assured face validity. Item response theory analysis was then used to derive the DiPCare-Q index using data from a random sample of 200 patients (derivation set), who completed the questionnaire a second time over the phone. For construct and criterion validity, the final 16 questions were administered to a random sample of 1,898 patients attending one of 47 different private primary-care practices in western Switzerland (validation set), along with questions on subjective social status (subjective SES ladder), education, source of income, welfare status, and subjective poverty. Results: Deprivation was defined along three distinct dimensions: material deprivation (eight items), social deprivation (five items), and health deprivation (three items). Item consistency was high in both the derivation set (KR20 = 0.827) and the validation set (KR20 = 0.778). The DiPCare-Q index was reliable (ICC = 0.847). For construct validity, the DiPCare-Q index correlated with patients' estimation of their position on the subjective SES ladder (rs = 0.539). This position was correlated with both material and social deprivation independently, suggesting two separate mechanisms enhancing the feeling of deprivation. Conclusion: The DiPCare-Q is a rapid, reliable, and validated instrument for measuring both material and social deprivation in primary care. Questions from the DiPCare-Q are easy to use when investigating patients' social history and could improve clinicians' ability to detect underlying social distress related to deprivation.
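The KR20 statistic reported here is the Kuder-Richardson 20 coefficient, the analogue of Cronbach's alpha for dichotomous items: KR20 = k/(k-1) · (1 - Σ p_i q_i / σ²), where k is the number of items, p_i the proportion endorsing item i, q_i = 1 - p_i, and σ² the variance of total scores. A minimal sketch on simulated yes/no responses (the data are invented, not the study's):

```python
# Kuder-Richardson 20 internal consistency for dichotomous (0/1) items.
# Responses are simulated from a shared latent trait so that consistency
# is high; all numbers are illustrative.
import numpy as np

def kr20(items: np.ndarray) -> float:
    """items: (n_respondents, k_items) array of 0/1 responses."""
    k = items.shape[1]
    p = items.mean(axis=0)                      # proportion endorsing each item
    item_var = (p * (1 - p)).sum()              # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_var / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))              # one latent deprivation trait per patient
answers = (latent + rng.normal(size=(200, 16)) > 0.8).astype(int)  # 16 items
print(f"KR20 = {kr20(answers):.3f}")            # high, since items share a trait
```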
Abstract:
Images of myocardial strain can be used to diagnose heart disease, plan and monitor treatment, and learn about cardiac structure and function. Three-dimensional (3D) strain is typically quantified using many magnetic resonance (MR) images obtained in two or three orthogonal planes. Problems with this approach include long scan times, image misregistration, and through-plane motion. This article presents a novel method for calculating cardiac 3D strain using a stack of two or more images acquired in only one orientation. The zHARP pulse sequence encodes in-plane motion using MR tagging and out-of-plane motion using phase encoding, and has previously been shown to be capable of computing 3D displacement within a single image plane. Here, data from two adjacent image planes are combined to yield a 3D strain tensor at each pixel; stacks of zHARP images can thus be used to derive stacked arrays of 3D strain tensors without imaging multiple orientations and without numerical interpolation. The performance and accuracy of the method are demonstrated in vitro on a phantom and in vivo in four healthy adult human subjects.
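The per-pixel quantity the method delivers can be illustrated with the standard continuum-mechanics definition of strain: given a 3D displacement field u(X), the deformation gradient is F = I + ∂u/∂X and the Green-Lagrange strain is E = (FᵀF − I)/2. The sketch below applies this definition to a synthetic two-slice displacement field; it illustrates the tensor being computed, not the zHARP phase-processing pipeline itself:

```python
# Green-Lagrange strain tensor at each voxel from a 3D displacement field,
# here two adjacent slices as in the method above. Displacements are synthetic.
import numpy as np

def green_lagrange_strain(u, spacing=(1.0, 1.0, 1.0)):
    """u: (3, nz, ny, nx) displacement field -> (nz, ny, nx, 3, 3) strain."""
    grads = np.stack([np.stack(np.gradient(u[i], *spacing), axis=0)
                      for i in range(3)], axis=0)         # grads[i, j] = du_i/dx_j
    F = np.moveaxis(grads, (0, 1), (-2, -1)) + np.eye(3)  # deformation gradient F = I + du/dX
    E = 0.5 * (np.swapaxes(F, -1, -2) @ F - np.eye(3))    # E = (F^T F - I) / 2
    return E

u = np.random.default_rng(1).normal(scale=0.01, size=(3, 2, 64, 64))  # two 64x64 slices
E = green_lagrange_strain(u)
print(E.shape)  # (2, 64, 64, 3, 3): one symmetric strain tensor per voxel
```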
Abstract:
The effects of liming and NPK fertilization on the nutritional status and dry rubber yield of clone RRIM 600 were evaluated. A randomized block design with four replications was used, in a split-plot arrangement. The plots tested two controls (no fertilization and no liming; no fertilization with liming) and six treatments with liming and fertilization (N1P1K0, N2P2K0, N1P1K1, N2P2K1, N1P1K2, and N2P2K2). The annual NPK rates corresponded to 40 and 80 kg ha-1 of N, 17.5 and 35.0 kg ha-1 of P2O5, and 0, 33.2, and 66.4 kg ha-1 of K2O. The subplots tested the tapping systems ½S d/4 6 d/7 ET 2.5% LaPa 1/1 10/y and ½S d/6 6 d/7 ET 5.0% LaPa 1/1 10/y. The treatments had a significant effect on leaf N, P, S, Cu, and Zn concentrations. Applying N at either rate did not raise its leaf concentration. Increasing K2O rates in the presence of N2P2 decreased leaf zinc concentration. The highest dry rubber yield (1,778.9 kg ha-1), averaged over the three years, was obtained with the N2P2K1 + liming treatment under both tapping systems.
Abstract:
The study was carried out from May to October 1991 in the experimental field of the olericulture sector at Ufla, Lavras, MG, Brazil, to evaluate the influence of paclobutrazol doses on the control of secondary growth (pseudo-tillering) and on the morphological and commercial characteristics of garlic (Allium sativum L.). A randomized block design was used, with four doses of paclobutrazol (0, 500, 1,000, and 1,500 mg a.i. L-1) and five replications. Increasing paclobutrazol concentrations reduced plant height and the number of leaves per plant at 60 and 90 days after planting. Total and commercial bulb yields responded significantly to paclobutrazol doses, with concentrations of 725 and 778 mg L-1 providing the highest yields. The percentage of bulbs with secondary growth showed a quadratic response to increasing paclobutrazol doses, with a concentration of 1,163 mg L-1 giving the greatest reduction in secondary growth. A concentration of 744 mg L-1 of paclobutrazol gave the highest mean bulb weight; no significant differences among treatments were found for the number of cloves per bulb.
Abstract:
OBJECTIVES: The Registro Informatizado de Enfermedad TromboEmbólica (RIETE) database was used to investigate whether neurosurgical patients with venous thromboembolism (VTE) were more likely to die of bleeding or of VTE, and the influence of anticoagulation on these outcomes. METHODS: Clinical characteristics, treatment details, and 3-month outcomes were assessed in patients who developed VTE after neurosurgery. RESULTS: Of 40 663 patients enrolled, 392 (0.96%) had VTE within 60 days after neurosurgery. Most patients in the cohort (89%) received initial therapy with low-molecular-weight heparin (33% received subtherapeutic doses). In the first week, 10 (2.6%) patients died (8 with pulmonary embolism [PE], no bleeding deaths; P = .005). After the first week, 20 (5.1%) patients died (2 with fatal bleeding, none from PE). Overall, this cohort was more likely to develop a fatal PE than a fatal bleed (8 vs 2 deaths, P = .058). CONCLUSIONS: Neurosurgical patients developing VTE were more likely to die from PE than from bleeding in the first week, despite anticoagulation.