498 results for BRAZILIAN PATIENTS
Abstract:
Background: Seizures and interictal spikes in mesial temporal lobe epilepsy (MTLE) affect a network of brain regions rather than a single epileptic focus. Simultaneous electroencephalography and functional magnetic resonance imaging (EEG-fMRI) studies have demonstrated a functional network in which hemodynamic changes are time-locked to spikes. However, whether this reflects the propagation of neuronal activity from a focus or, conversely, the activation of a network linked to spike generation remains unknown. The functional connectivity (FC) changes prior to spikes may provide information about the connectivity changes that lead to the generation of spikes. We used EEG-fMRI to investigate FC changes immediately prior to the appearance of interictal spikes on EEG in patients with MTLE. Methods/principal findings: Fifteen patients with MTLE underwent continuous EEG-fMRI during rest. Spikes were identified on EEG and three 10 s epochs were defined relative to spike onset: spike (0–10 s), pre-spike (−10 to 0 s), and rest (−20 to −10 s, with no previous spikes in the preceding 45 s). Significant spike-related activation in the hippocampus ipsilateral to the seizure focus was found compared to the pre-spike and rest epochs. The peak voxel within the hippocampus ipsilateral to the seizure focus was used as a seed region for FC analysis in the three conditions. A significant change in FC patterns was observed before the appearance of electrographic spikes. Specifically, there was a significant loss of coherence between the two hippocampi during the pre-spike period compared to the spike and rest states. Conclusion/significance: In keeping with previous findings of abnormal inter-hemispheric hippocampal connectivity in MTLE, our findings specifically link reduced connectivity to the period immediately before spikes. This brief decoupling is consistent with a deficit in mutual (inter-hemispheric) hippocampal inhibition that may predispose to spike generation.
Abstract:
PURPOSE: The prevalence of anaplastic lymphoma kinase (ALK) gene fusion (ALK positivity) in early-stage non-small-cell lung cancer (NSCLC) varies by population examined and detection method used. The Lungscape ALK project was designed to address the prevalence and prognostic impact of ALK positivity in resected lung adenocarcinoma in a primarily European population. METHODS: Analysis of ALK status was performed by immunohistochemistry (IHC) and fluorescence in situ hybridization (FISH) in tissue sections of 1,281 patients with adenocarcinoma in the European Thoracic Oncology Platform Lungscape iBiobank. Positive patients were matched with negative patients in a 1:2 ratio, both for IHC and for FISH testing. Testing was performed in 16 participating centers, using the same protocol after passing external quality assessment. RESULTS: Positive ALK IHC staining was present in 80 patients (prevalence of 6.2%; 95% CI, 4.9% to 7.6%). Of these, 28 patients were ALK FISH positive, corresponding to a lower bound for the prevalence of FISH positivity of 2.2%. FISH specificity was 100%, and FISH sensitivity was 35.0% (95% CI, 24.7% to 46.5%), with a sensitivity value of 81.3% (95% CI, 63.6% to 92.8%) for IHC 2+/3+ patients. The hazard of death for FISH-positive patients was lower than for IHC-negative patients (P = .022). Multivariable models, adjusted for patient, tumor, and treatment characteristics, and matched cohort analysis confirmed that ALK FISH positivity is a predictor of better overall survival (OS). CONCLUSION: In this large cohort of surgically resected lung adenocarcinomas, the prevalence of ALK positivity was 6.2% using IHC and at least 2.2% using FISH. A screening strategy based on IHC or H-score could be envisaged. ALK positivity (by either IHC or FISH) was related to better OS.
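The headline proportions in this abstract follow directly from the reported counts (80 IHC-positive of 1,281 patients, 28 of those FISH-positive). The sketch below rechecks that arithmetic; the abstract does not state which confidence-interval method the authors used, so the Wilson score interval shown here is only one illustrative choice and may differ slightly from the published 4.9% to 7.6%.

```python
import math

def wilson_ci(k, n, z=1.96):
    """95% Wilson score interval for a binomial proportion.
    Illustrative only: the study's own CI method is not specified in the abstract."""
    p = k / n
    centre = (p + z**2 / (2 * n)) / (1 + z**2 / n)
    half = (z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))) / (1 + z**2 / n)
    return centre - half, centre + half

n_total, ihc_pos, fish_pos = 1281, 80, 28  # counts reported in the abstract

print(round(ihc_pos / n_total * 100, 1))   # 6.2 -> IHC positivity prevalence
print(round(fish_pos / n_total * 100, 1))  # 2.2 -> lower bound for FISH positivity
print(round(fish_pos / ihc_pos * 100, 1))  # 35.0 -> FISH sensitivity among IHC-positive patients
lo, hi = wilson_ci(ihc_pos, n_total)
print(round(lo * 100, 1), round(hi * 100, 1))  # close to the reported 4.9%-7.6%
```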
Abstract:
INTRODUCTION: The phase III FLEX study (NCT00148798) in advanced non-small-cell lung cancer indicated that the survival benefit associated with the addition of cetuximab to cisplatin and vinorelbine was limited to patients whose tumors expressed high levels of epidermal growth factor receptor (EGFR) (immunohistochemistry score of ≥200; scale 0–300). We assessed whether the treatment effect was also modulated in FLEX study patients by tumor EGFR mutation status. METHODS: A tumor mutation screen of EGFR exons 18 to 21 included 971 of 1125 (86%) FLEX study patients. Treatment outcome in low and high EGFR expression groups was analyzed across efficacy endpoints according to tumor EGFR mutation status. RESULTS: Mutations in EGFR exons 18 to 21 were detected in 133 of 971 tumors (14%), 970 of which were also evaluable for EGFR expression level. The most common mutations were exon 19 deletions and L858R (124 of 133 patients; 93%). In the high EGFR expression group (immunohistochemistry score of ≥200), a survival benefit for the addition of cetuximab to chemotherapy was demonstrated in patients with EGFR wild-type (including T790M mutant) tumors. Although patient numbers were small, those in the high EGFR expression group whose tumors carried EGFR mutations may also have derived a survival benefit from the addition of cetuximab to chemotherapy. Response data suggested a cetuximab benefit in the high EGFR expression group regardless of EGFR mutation status. CONCLUSIONS: The survival benefit associated with the addition of cetuximab to first-line chemotherapy for advanced non-small-cell lung cancer expressing high levels of EGFR is not limited by EGFR mutation status.
Abstract:
Diarrhoea is a common complication observed in critically ill patients. Relationships between diarrhoea, enteral nutrition and aerobic intestinal microflora have previously been examined only in isolation in this patient cohort. This research used a two-study, observational design to examine these associations. Higher diarrhoea incidence rates were observed when patients received enteral tube feeding, had abnormal serum blood results, received multiple medications and had aerobic microflora dysbiosis. Further, significant aerobic intestinal microflora changes were observed over time in patients who experienced diarrhoea. These results establish a platform for further work to improve the intestinal health of critically ill patients.
Abstract:
Converging evidence from epidemiological, clinical and neuropsychological research suggests a link between cannabis use and increased risk of psychosis. Long-term cannabis use has also been related to deficit-like “negative” symptoms and cognitive impairment that resemble some of the clinical and cognitive features of schizophrenia. The current functional brain imaging study investigated the impact of a history of heavy cannabis use on impaired executive function in first-episode schizophrenia patients. While participants performed the Tower of London task in a magnetic resonance imaging scanner, event-related blood oxygenation level-dependent (BOLD) brain activation was compared between four age- and gender-matched groups: 12 first-episode schizophrenia patients; 17 long-term cannabis users; seven cannabis-using first-episode schizophrenia patients; and 17 healthy control subjects. BOLD activation was assessed as a function of increasing task difficulty within and between groups, as well as for the main effects of cannabis use and the diagnosis of schizophrenia. Cannabis users and non-drug-using first-episode schizophrenia patients exhibited equivalently reduced dorsolateral prefrontal activation in response to task difficulty. A trend towards additional prefrontal and left superior parietal cortical activation deficits was observed in cannabis-using first-episode schizophrenia patients, while a history of cannabis use accounted for increased activation in the visual cortex. Cannabis users and schizophrenia patients fail to adequately activate the dorsolateral prefrontal cortex, thus pointing to a common working memory impairment which is particularly evident in cannabis-using first-episode schizophrenia patients. A history of heavy cannabis use, on the other hand, accounted for increased primary visual processing, suggesting compensatory imagery processing of the task.
Abstract:
Background Multi-attribute utility instruments (MAUIs) are preference-based measures that comprise a health state classification system (HSCS) and a scoring algorithm that assigns a utility value to each health state in the HSCS. When developing a MAUI from a health-related quality of life (HRQOL) questionnaire, a HSCS must first be derived. This typically involves selecting a subset of domains and items because HRQOL questionnaires typically have too many items to be amenable to the valuation task required to develop the scoring algorithm for a MAUI. Currently, exploratory factor analysis (EFA) followed by Rasch analysis is recommended for deriving a MAUI from a HRQOL measure. Aim To determine whether confirmatory factor analysis (CFA) is more appropriate and efficient than EFA for deriving a HSCS from the European Organisation for Research and Treatment of Cancer's core HRQOL questionnaire, the Quality of Life Questionnaire (QLQ-C30), given its well-established domain structure. Methods QLQ-C30 (Version 3) data were collected from 356 patients receiving palliative radiotherapy for recurrent/metastatic cancer (various primary sites). The dimensional structure of the QLQ-C30 was tested with EFA and CFA, the latter informed by the established QLQ-C30 structure and the views of both patients and clinicians on which items are most relevant. Dimensions determined by EFA or CFA were then subjected to Rasch analysis. Results CFA results generally supported the proposed QLQ-C30 structure (comparative fit index = 0.99, Tucker–Lewis index = 0.99, root mean square error of approximation = 0.04). EFA revealed fewer factors, and some items cross-loaded on multiple factors. Further assessment of dimensionality with Rasch analysis allowed better alignment of the EFA dimensions with those detected by CFA. Conclusion CFA was more appropriate and efficient than EFA in producing clinically interpretable results for the HSCS for a proposed new cancer-specific MAUI.
Our findings suggest that CFA should be recommended generally when deriving a preference-based measure from a HRQOL measure that has an established domain structure.
Abstract:
Platelet-derived microparticles that are produced during platelet activation are capable of adhesion and aggregation. Endothelial trauma that occurs during percutaneous transluminal coronary angioplasty (PTCA) may support platelet-derived microparticle adhesion and contribute to the development of restenosis. We have previously reported an increase in platelet-derived microparticles in peripheral arterial blood with angioplasty. This finding raised concerns regarding the role of platelet-derived microparticles in restenosis, and therefore the aim of this study was to monitor levels in the coronary circulation. The study population consisted of 19 angioplasty patients. Paired coronary artery and sinus samples were obtained following heparinization, following contrast administration, and subsequent to all vessel manipulation. Platelet-derived microparticles were identified with an anti-CD61 (glycoprotein IIIa) fluorescence-conjugated antibody using flow cytometry. There was a significant decrease in arterial platelet-derived microparticles from heparinization to contrast administration (P 0.001), followed by a significant increase to the end of angioplasty (P 0.004). However, there was no significant change throughout the venous samples. These results indicate that the higher level of platelet-derived microparticles after angioplasty in arterial blood remained in the coronary circulation. Interestingly, levels of thrombin–antithrombin complexes did not rise during PTCA. This may have implications for the development of coronary restenosis post-PTCA, although this remains to be determined.
Abstract:
BACKGROUND: About 1–5% of cancer patients suffer from significant normal tissue reactions as a result of radiotherapy (RT). It is not possible at this time to predict how most patients' normal tissues will respond to RT. DNA repair dysfunction is implicated in sensitivity to RT, particularly in genes that mediate the repair of DNA double-strand breaks (DSBs). Phosphorylation of histone H2AX (phosphorylated molecules are known as γH2AX) occurs rapidly in response to DNA DSBs and, among its other roles, contributes to repair protein recruitment to these damaged sites. Mammalian cell lines have also been crucial in facilitating the successful cloning of many DNA DSB repair genes; yet, very few mutant cell lines exist for non-syndromic clinical radiosensitivity (RS). METHODS: Here, we survey DNA DSB induction and repair in whole cells from RS patients, as revealed by γH2AX foci assays, as potential predictive markers of clinical radiation response. RESULTS: With one exception, both DNA focus induction and repair in cell lines from RS patients were comparable with controls. Using γH2AX foci assays, we identified a RS cancer patient cell line with a novel ionising radiation-induced DNA DSB repair defect; these data were confirmed by an independent DNA DSB repair assay. CONCLUSION: γH2AX focus measurement has limited scope as a pre-RT predictive assay in lymphoblast cell lines from RT patients; however, the assay can successfully identify novel DNA DSB repair-defective patient cell lines, thus potentially facilitating the discovery of novel constitutional contributions to clinical RS.
Abstract:
Background/Aim: Cardiotoxicity resulting in heart failure is a devastating complication of cancer therapy. It is possible that a patient may survive cancer only to develop heart failure (HF), which is more deadly than cancer. The aim of this project was to profile the characteristics of patients at risk of cancer treatment-induced heart failure. Methods: A linked health data analysis was conducted of the Queensland Cancer Registry (QCR) from 1996-2009, the Death Registry, and hospital administration records for HF and chemotherapy admissions. The index heart failure admission must have occurred after the date of cancer registry entry. Results: A total of 15,987 patients were included in this analysis; 1,062 (6.6%) had a chemotherapy+HF admission (51.4% female) and 14,925 (93.4%) a chemotherapy/no-HF admission. Median age of chemotherapy+HF patients was 67 years (IQR 58 to 75) vs. 54 years (IQR 44 to 64) for chemotherapy/no-HF patients. Chemotherapy+HF patients had an increased risk of all-cause mortality (HR 2.79 [95% CI, 2.58 to 3.02], and 1.67 [95% CI, 1.54 to 1.81] after adjusting for age, sex, marital status, country of birth, cancer site and chemotherapy dose). The index HF admission occurred within one year of cancer diagnosis in 47% of HF patients, with 80% of patients having their index admission within 3 years. The number of chemotherapy cycles was not associated with a significant reduction in survival time in chemotherapy+HF patients. Mean survival for heart failure patients was 5.3 years (95% CI, 4.99 to 5.62) vs. 9.57 years (95% CI, 9.47 to 9.68) for chemotherapy/no-HF patients. Conclusion: All-cause mortality was 67% higher in patients diagnosed with HF following chemotherapy in the analysis adjusted for covariates. Methods to improve and better coordinate interdisciplinary care for cancer patients with HF, involving cardiologists and oncologists, are required, including evidence-based guidelines for the comprehensive assessment, monitoring and management of this cohort.
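The figures quoted in this abstract can be reconciled directly: the adjusted hazard ratio of 1.67 is what the conclusion's "67% higher" all-cause mortality refers to, and the survival gap follows from the two reported means. A minimal check, using only values stated in the abstract:

```python
# Reconciling the reported statistics (all values taken from the abstract).
adjusted_hr = 1.67  # hazard ratio after covariate adjustment
print(round((adjusted_hr - 1) * 100))  # 67 -> "67% higher" mortality in the conclusion

hf_cohort, total = 1_062, 15_987
print(round(hf_cohort / total * 100, 1))  # 6.6 -> share of patients with a chemotherapy+HF admission

mean_survival_hf = 5.3     # years, chemotherapy+HF patients
mean_survival_no_hf = 9.57  # years, chemotherapy without HF admission
print(round(mean_survival_no_hf - mean_survival_hf, 2))  # 4.27-year mean survival gap
```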
Abstract:
The evidence for nutritional support in COPD is based almost entirely on oral nutritional supplements (ONS), yet despite this, dietary counseling and food fortification (DA) are often used as the first-line treatment for malnutrition. This study aimed to investigate the effectiveness of ONS vs. DA in improving nutritional intake in malnourished outpatients with COPD. Seventy outpatients (BMI 18.4 SD 1.6 kg/m², age 73 SD 9 years, severe COPD) were randomised to receive a 12-week intervention of either ONS or DA (n = 33 ONS vs. n = 37 DA). Paired t-test analysis revealed that total energy intakes significantly increased with ONS at week 6 (+302 SD 537 kcal/d; p = 0.002), with a slight reduction at week 12 (+243 SD 718 kcal/d; p = 0.061), returning to baseline levels on stopping supplementation. DA resulted in small increases in energy intake that only reached significance 3 months post-intervention (week 6: +48 SD 623 kcal/d, p = 0.640; week 12: +157 SD 637 kcal/d, p = 0.139; week 26: +247 SD 592 kcal/d, p = 0.032). Protein intake was significantly higher in the ONS group at both weeks 6 and 12 (ONS: +19.0 SD 25.0 g/d vs. DA: +1.0 SD 13.0 g/d; p = 0.033, ANOVA), but no differences were found at week 26. Vitamin C, iron and zinc intakes significantly increased only in the ONS group. ONS significantly increased energy, protein and several micronutrient intakes in malnourished COPD patients, but only during the period of supplementation. Trials investigating the effects of combined nutritional interventions are required.
Abstract:
BACKGROUND: The evaluation of retinal image quality in cataract eyes has gained importance, and the clinical modulation transfer function (MTF) can be obtained with an aberrometer or a double-pass (DP) system. This study aimed to compare MTFs derived from a ray-tracing aberrometer and a DP system in early cataractous and normal eyes. METHODS: There were 128 subjects with 61 control eyes and 67 eyes with early cataract defined according to the Lens Opacities Classification System III. A laser ray-tracing wavefront aberrometer (iTrace) and a double-pass (DP) system (OQAS) assessed ocular MTF for 6.0 mm pupil diameters following dilation. Areas under the MTF (AUMTF) and their correlations were analyzed. Stepwise multiple regression analysis assessed factors affecting the differences between iTrace- and OQAS-derived AUMTF for the early cataract group. RESULTS: For both the early cataract and control groups, iTrace-derived MTFs were higher than OQAS-derived MTFs across a range of spatial frequencies (P < 0.01). No significant difference between the two groups occurred for iTrace-derived AUMTF, but the early cataract group had significantly smaller OQAS-derived AUMTF than did the control group (P < 0.01). AUMTF determined from both techniques demonstrated significant correlations with nuclear opacities, higher-order aberrations (HOAs), visual acuity, and contrast sensitivity functions, while the OQAS-derived AUMTF also demonstrated significant correlations with age and cortical opacity grade. The factors significantly affecting the difference between iTrace and OQAS AUMTF were root-mean-squared HOAs (standardized beta coefficient = -0.63, P < 0.01) and age (standardized beta coefficient = 0.26, P < 0.01). CONCLUSIONS: MTFs determined from an iTrace and a DP system (OQAS) differ significantly in early cataractous and normal subjects. Correlations with visual performance were higher for the DP system.
OQAS-derived MTF may be useful as an indicator of visual performance in early cataract eyes.
Abstract:
BACKGROUND: Postural instability is one of the major complications found in stroke survivors. Parameterising the functional reach test (FRT) could be useful in clinical practice and basic research. OBJECTIVES: To analyse the reliability, sensitivity, and specificity of FRT parameterisation using inertial sensors to record kinematic variables in patients who have suffered a stroke. DESIGN: Cross-sectional study. While performing the FRT, two inertial sensors were placed on the patient's back (lumbar and trunk). PARTICIPANTS: Five subjects over 65 years of age who had suffered a stroke. MEASUREMENTS: FRT measures, lumbosacral/thoracic maximum angular displacement, maximum time of lumbosacral/thoracic angular displacement, time to return to the initial position, and total time. Speed and acceleration of the movements were calculated indirectly. RESULTS: The FRT measure was 12.75 ± 2.06 cm. Intrasubject reliability values ranged from 0.829 (time to return to the initial position, lumbar sensor) to 0.891 (lumbosacral maximum angular displacement). Intersubject reliability values ranged from 0.821 (time to return to the initial position, lumbar sensor) to 0.883 (lumbosacral maximum angular displacement). The FRT's reliability was 0.987 (0.983-0.992) intersubject and 0.983 (0.979-0.989) intrasubject. CONCLUSION: The main conclusion is that inertial sensors are a tool with excellent reliability and validity for the parameterisation of the FRT in people who have had a stroke.
Abstract:
Objective. This study investigated cognitive functioning among older adults with physical debility not attributable to an acute injury or neurological condition who were receiving subacute inpatient physical rehabilitation. Design. A cohort investigation with assessments at admission and discharge. Setting. Three geriatric rehabilitation hospital wards. Participants. Consecutive rehabilitation admissions () following acute hospitalization (study criteria excluded orthopaedic, neurological, or amputation admissions). Intervention. Usual rehabilitation care. Measurements. The Functional Independence Measure (FIM) Cognitive and Motor items. Results. A total of 704 (86.5%) participants (mean age = 76.5 years) completed both assessments. Significant improvement in FIM Cognitive items (-score range 3.93–8.74, all ) and FIM Cognitive total score (-score = 9.12, ) occurred, in addition to improvement in FIM Motor performance. A moderate positive correlation existed between change in Motor and Cognitive scores (Spearman’s rho = 0.41). Generalized linear modelling indicated that better cognition at admission (coefficient = 0.398, ) and younger age (coefficient = −0.280, ) were predictive of improvement in Motor performance. Younger age (coefficient = −0.049, ) was predictive of improvement in FIM Cognitive score. Conclusions. Improvement in cognitive functioning was observed in addition to motor function improvement among this population. Causal links cannot be drawn without further research.
Abstract:
Background For family members waiting for patients undergoing surgery, a lack of information regarding the patient's status and the outcome of surgery can contribute to anxiety. Effective strategies for providing information to families are therefore required. Objectives To synthesize the best available evidence in relation to the most effective information-sharing interventions to reduce anxiety for families waiting for patients undergoing an elective surgical procedure. Inclusion criteria Types of participants All studies of family members over 18 years of age waiting for patients undergoing an elective surgical procedure were included, including those waiting for both adult and pediatric patients. Types of intervention All information-sharing interventions for families of patients undergoing an elective surgical procedure were eligible for inclusion in the review. Types of studies All randomized controlled trials (RCTs), quasi-experimental studies, and case-controlled and descriptive studies comparing one information-sharing intervention to another or to usual care were eligible for inclusion in the review. Types of outcomes Primary outcome: the level of anxiety amongst family members or close relatives whilst waiting for patients undergoing surgery, as measured by a validated instrument such as the S-Anxiety portion of the State-Trait Anxiety Inventory (STAI). Secondary outcomes: family satisfaction and other measurements that may be considered indicators of stress and anxiety, such as mean arterial pressure (MAP) and heart rate. Search strategy A comprehensive search, restricted to the English language, was undertaken of the following databases from 1990 to May 2013: Medline, CINAHL, EMBASE, ProQuest, Web of Science, PsycINFO, Scopus, Dissertation and Theses PQDT (via ProQuest), Current Contents, CENTRAL, Google Scholar, OpenGrey, Clinical Trials, Science.gov, Current Controlled Trials and National Institute for Clinical Studies (NHMRC).
Methodological quality Two independent reviewers critically appraised retrieved papers for methodological quality using the standardized critical appraisal instruments for randomized controlled trials and descriptive studies from the Joanna Briggs Institute Meta-Analysis of Statistics Assessment and Review Instruments (JBI-MAStARI). Data extraction Two independent reviewers extracted data from included papers using a customized data extraction form. Data synthesis Statistical pooling was not possible, mainly due to issues with data reporting in two of the studies; therefore, the results are presented in narrative form. Results Three studies with a total of 357 participants were included in the review. In-person reporting to family members was found to be effective in comparison with usual care, in which no reports were provided. Telephone reporting was also found to be effective at reducing anxiety in comparison with usual care, although not as effective as in-person reporting. The use of paging devices to keep family members informed was found to increase, rather than decrease, anxiety. Conclusions Due to the lack of high-quality research in this area, the strength of the conclusions is limited. It appears that in-person and telephone reporting to family members decreases anxiety, whereas the use of paging devices increases anxiety.
Abstract:
Background The frequency of prescribing potentially inappropriate medications (PIMs) in older patients remains high despite evidence of adverse outcomes from their use. Little is known about whether admission to hospital has any effect on the appropriateness of prescribing. Objectives This study aimed to identify the prevalence and nature of PIMs and to explore the association of risk factors with receiving a PIM. Methods This was a prospective study of 206 patients discharged to residential aged care facilities (RACFs) from acute care. All patients were aged at least 70 years and were admitted between July 2005 and May 2010; their admission and discharge medications were evaluated. Results Mean patient age was 84.8 ± 6.7 years; the majority (57%) were older than 85 years, and the mean (SD) Frailty Index was 0.42 (0.15). At least one PIM was identified in 112 (54.4%) patients on admission and 102 (49.5%) patients on discharge. Of all 1728 medications prescribed at admission, 10.8% were PIMs; at discharge, 9.6% of 1759 medications were PIMs. Of the 187 PIMs identified on admission, 56 (30%) were stopped and 131 were continued; 32 new PIMs were introduced. Of the potential risk factors considered, in-hospital cognitive decline and frailty status were the only significant predictors of PIMs. Conclusion Although admission to hospital is an opportunity to review the indications for specific medications, a high prevalence of inappropriate drug use was observed. The only associations with PIM use were frailty status and in-hospital cognitive decline. Additional studies are needed to further evaluate this association.