939 results for Time since diagnosis
Abstract:
Graduate program in Gynecology, Obstetrics and Mastology - FMB
Abstract:
Introduction: Parkinson’s Disease (PD) is characterized by a set of four motor symptoms: tremor, rigidity, bradykinesia and postural instability. These deficits may predispose individuals to limitations resulting from falls and their secondary consequences. Objective: To evaluate functional balance and quality of life (QoL) in individuals with PD and to determine whether performance on balance tests correlates with QoL. Method: The project was submitted to the Research Ethics Committee of Universidade Estadual Paulista “Julio de Mesquita Filho”, Campus de Marília, and approved under protocol number 1806/09. Individuals diagnosed with PD at stages one to four on the Hoehn and Yahr scale participated in the study. The subjects were evaluated for functional balance and QoL using, respectively, the Berg Balance Scale (EEFB), the Timed Up and Go test (TUG), and the Parkinson’s Disease Questionnaire-39 (PDQ-39). GraphPad Prism 5 was used for the statistical analysis. Before the correlation analysis, the variables were tested for normality with the Shapiro-Wilk test; since they did not present a normal distribution, Spearman’s non-parametric test was used. Statistical significance was set at p ≤ 0.05. Results: We studied 25 individuals aged between 54 and 85 years (71.20 ± 8.50), with time since diagnosis between one and 39 years (6.54 ± 7.71). A moderate correlation was found between the EEFB and QoL (r = -0.6) and between the TUG and QoL (r = 0.6836). Among the QoL domains, balance showed the highest correlations with “mobility” (TUG r = 0.6768; EEFB r = -0.6155) and “activities of daily living” (TUG r = 0.7357; EEFB r = -0.6521). Conclusion: Patients with Parkinson’s disease show deficits in balance and QoL. Balance impairments are highly correlated with each other and with aspects of QoL.
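A minimal sketch of the analysis pipeline this abstract describes (Shapiro-Wilk normality check, then Spearman rank correlation, as run in GraphPad Prism 5) is shown below in Python with SciPy; the scores are simulated for illustration and are not the study’s data.

```python
# Sketch of the reported pipeline: test normality with Shapiro-Wilk, then,
# as the data are not normally distributed, correlate balance with quality
# of life using Spearman's rank test. All values are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
berg = rng.integers(20, 57, size=25).astype(float)  # Berg scores (0-56 scale)
pdq39 = rng.uniform(10, 80, size=25)                # PDQ-39 summary index

w, p_norm = stats.shapiro(pdq39)            # p <= 0.05 would reject normality
rho, p_corr = stats.spearmanr(berg, pdq39)
print(f"Shapiro-Wilk p = {p_norm:.3f}; Spearman rho = {rho:.3f} (p = {p_corr:.3f})")
```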
Abstract:
Background: Oropharyngeal dysphagia is common in individuals after stroke. Taste and temperature are used in dysphagia rehabilitation, and the influence of stimuli such as taste and temperature on swallowing biomechanics has been investigated both in healthy individuals and in individuals with neurological disease. However, some questions remain unanswered, such as how the sequence of offered stimuli influences the pharyngeal response. The goal of the present study was to determine the influence of the sequence of stimuli, sour taste and cold temperature, on pharyngeal transit time during deglutition in individuals after stroke. Methods: The study included 60 individuals with unilateral ischemic stroke, 29 males and 31 females, aged 41–88 years (mean age: 66.2 years), examined 0–50 days after ictus (median: 6 days), with mild to moderate oropharyngeal dysphagia. Exclusion criteria were hemorrhagic stroke, decreased level of consciousness, and clinical instability, as confirmed by medical evaluation. The individuals were divided into two groups of 30 individuals each. Group 1 received a nonrandomized sequence of stimuli (i.e. natural, cold, sour, and sour-cold) and group 2 received a randomized sequence of the same four stimuli. A videofluoroscopic swallowing study was performed to analyze the pharyngeal transit time for each stimulus. The images were digitalized and specific software was used to measure the pharyngeal transit time. Since the values did not present a regular distribution and uniform variances, nonparametric tests were performed. Results: Individuals in group 1 presented a significantly shorter pharyngeal transit time with the sour-cold stimulus than with the other stimuli. Individuals in group 2 did not show a significant difference in pharyngeal transit time between stimuli. Conclusions: The results showed that the sequence of offered stimuli influences the pharyngeal transit time differently in individuals after stroke and suggest that, when the sour-cold stimulus is offered in a randomized sequence, it can influence the response to the other stimuli in stroke patients. Hence, the sour-cold stimulus could be used as a therapeutic aid in dysphagic stroke patients.
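The abstract does not name the specific nonparametric tests, so the within-subject Friedman test below is an assumption; this hedged Python sketch with simulated transit times illustrates how the four stimuli could be compared within one group.

```python
# Hypothetical within-group comparison of pharyngeal transit time across
# the four stimuli (natural, cold, sour, sour-cold) with the Friedman test;
# the test choice and all values are assumptions, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 30                                     # subjects in one group
natural   = rng.normal(1.00, 0.20, n)      # transit times in seconds
cold      = rng.normal(0.95, 0.20, n)
sour      = rng.normal(0.92, 0.20, n)
sour_cold = rng.normal(0.85, 0.20, n)      # fastest, per the reported finding

stat, p = stats.friedmanchisquare(natural, cold, sour, sour_cold)
print(f"Friedman chi2 = {stat:.2f}, p = {p:.4f}")
# A significant result would be followed by pairwise Wilcoxon signed-rank
# tests with a multiple-comparison correction.
```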
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Oncology has assumed an important role as a specialty in veterinary clinical practice in recent years. Mammary gland tumors are detected mainly in middle-aged and old bitches, whether sexually intact or spayed; the caudal abdominal and inguinal mammary glands are the most affected, and up to 75% of these tumors are malignant. The majority of dogs with mammary neoplasms are clinically healthy at the time of diagnosis, and the tumors may be identified by the owner or by a professional during a routine physical examination. Cytological examination of fine-needle aspirates can be performed; the procedure is easy and low cost, and criteria that may indicate malignancy are evaluated. However, a definitive diagnosis requires histopathology of the excised tissue or of a biopsy specimen. Regional lymph nodes are the first to receive lymphatic drainage from the neoplasm and are at the highest risk of regional metastasis, while the lung is the most common site of distant metastasis. Determining the clinical stage defines the extension of the tumor, which in turn allows a prognosis to be established and treatment to be planned. The choice of therapy remains controversial, since numerous treatment options have been described, but surgery is the treatment of choice. Surgery is not always effective for malignant tumors, however, and recurrences may occur; in these cases, adjuvant chemotherapy is used. The prognosis for animals with mammary tumors depends on several factors, such as size, stage, tumor cell type and clinical behavior of the tumor, age and medical condition of the animal, and presence of metastasis. For this reason, more detailed studies based on epidemiological surveys are needed to provide more information about risk factors, prevalence and follow-up after treatment of mammary... (Complete abstract available via the electronic access below)
Abstract:
Graduate program in Agricultural Microbiology - FCAV
Abstract:
We compared the effects of two anesthesia protocols on both immediate recovery time (IRT) and postoperative respiratory complications (PRCs) after laparotomy for bariatric surgery, and we determined the association between longer IRT and increased PRC incidence. We conducted the study in two stages: (i) in a randomized controlled trial (RCT), patients received either the intervention protocol (sevoflurane-remifentanil-rocuronium-ropivacaine) or the control protocol (isoflurane-sufentanil-atracurium-levobupivacaine). All patients received general anesthesia plus continuous epidural anesthesia and analgesia. Treatment was masked for all except the provider anesthesiologist. We defined IRT as the time from anesthetic discontinuation until tracheal extubation. Primary outcomes were IRT and PRC incidence within 15 days after surgery. We also analyzed post-anesthesia care unit (PACU) and hospital lengths of stay; (ii) after the end of the RCT, we used the available data in an extension cohort study to investigate IRT > 20 min as an exposure factor for PRCs. The control protocol (n = 152) resulted in a longer IRT (30.4 ± 7.9 vs 18.2 ± 9.6 min; p < 0.0001), a higher incidence of PRCs (6.58 vs 2.5%; p = 0.048), and longer PACU and hospital stays than the intervention protocol (n = 200); PRC relative risk (RR) = 2.6. Patients with IRT > 20 min (n = 190) presented a higher incidence of PRCs (7.37 vs 0.62%; p < 0.0001); RR = 12.06. The intervention protocol, with short-acting anesthetics, was more beneficial and safer than the control protocol, with long-acting drugs, in reducing IRT, PRCs, and PACU and hospital stays for laparotomy in bariatric patients. We identified a 4.5-fold increase in the relative risk of PRCs when morbidly obese patients were exposed to an IRT > 20 min.
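As a worked check of the relative risks quoted above, the ratios can be reproduced from the reported incidence proportions; the small difference from the published RR of 12.06 presumably reflects rounding of the percentages, since the underlying counts are not restated here.

```python
# Relative risk as the ratio of two incidence proportions, applied to the
# percentages reported in the abstract (rounded, hence the slight mismatch
# with the published RR of 12.06).
def relative_risk(p_exposed: float, p_unexposed: float) -> float:
    return p_exposed / p_unexposed

# Control vs intervention protocol: 6.58% vs 2.5% PRC incidence
print(round(relative_risk(0.0658, 0.025), 2))   # ~2.63, reported as 2.6

# IRT > 20 min vs IRT <= 20 min: 7.37% vs 0.62% PRC incidence
print(round(relative_risk(0.0737, 0.0062), 2))  # ~11.89, reported as 12.06
```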
Abstract:
Objective. To identify patient-related and health-service-related factors in delays in the diagnosis of tuberculosis. Methods. Epidemiological study in Foz do Iguacu, Parana, Brazil, 2009. The instrument used was the Primary Care Assessment Tool, adapted for appraising tuberculosis treatment. Descriptive statistical techniques were used, such as frequency distributions, measurements of central tendency and dispersion (median and interquartile intervals), and odds ratios. Results. Delays in seeking health services were greater for those aged 60 years and older, for females, for patients with low levels of education, and for patients with poor knowledge of the disease. Clinical variables (being a new case and HIV infection) and behavioral variables (tobacco use and alcohol consumption) were not linked with delays in diagnosis. The median delays before diagnosis attributable to patients and to the health services were 30 days and 10 days, respectively. Emergency 24-hour medical services and primary health care services were not effective in identifying suspected cases of tuberculosis and requesting tests to confirm the diagnosis, with a high percentage of referrals to the Tuberculosis Control Program clinic. Conclusions. Going to primary health care services for diagnosis increased the time before the diagnosis of the disease was reached. The Tuberculosis Control Program clinic was more effective in diagnosing tuberculosis, owing to the training of its staff and to an organized process for receiving patients, including the availability of tests to support the diagnosis.
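A minimal sketch of the descriptive statistics named in the methods (median and interquartile interval of the delay, and an odds ratio from a 2x2 table) follows; the delay values and counts are hypothetical, not the study’s data.

```python
# Median and interquartile interval of patient delay, plus a cross-product
# odds ratio for delayed diagnosis by age group; all numbers are invented
# for illustration.
import numpy as np

delays = np.array([5, 10, 15, 30, 30, 45, 60, 90])  # patient delay, days
q1, med, q3 = np.percentile(delays, [25, 50, 75])
print(f"median = {med:.0f} days, interquartile interval = {q1:.0f}-{q3:.0f}")

a, b = 20, 10   # age >= 60: delayed / not delayed (hypothetical)
c, d = 15, 30   # age < 60:  delayed / not delayed (hypothetical)
print(f"OR = {(a * d) / (b * c):.2f}")
```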
Abstract:
Background The aim of this study was to evaluate the late-onset repercussions of cardiac alterations in patients with systemic lupus erythematosus (SLE) after a 13-year follow-up. Methods A historical prospective study was carried out involving the analysis of data from the charts of patients with a confirmed diagnosis of lupus followed up since 1998. The 13-year evolution was systematically reviewed and tabulated to facilitate the interpretation of the data. Results Forty-eight patient charts were analyzed. Mean patient age was 34.5 ± 10.8 years at the time of diagnosis and 41.0 ± 10.3 years at the time of the study (45 women and 3 men). Eight deaths occurred in the follow-up period (two due to heart problems). Among the alterations found on the complementary exams, 46.2% of cases demonstrated worsening at reevaluation, and four patients required heart catheterization. In these cases, coronary angioplasty was performed because of the severity of the obstructions, and one case required a further catheterization, culminating in the need for surgical myocardial revascularization. Conclusion The analysis demonstrated progressive cardiac impairment, with high rates of alterations on conventional complementary exams, including the need for angioplasty or revascularization surgery in four patients. These findings indicate the need for rigorous cardiac follow-up in patients with systemic lupus erythematosus.
Abstract:
Prediction of long-term disability in patients with multiple sclerosis (MS) is essential. Magnetic resonance imaging (MRI) measurement of brain volume may be of predictive value, but sophisticated MRI techniques are often inaccessible in clinical practice. The corpus callosum index (CCI) is a normalized measurement that reflects changes in brain volume. We investigated medical records and 533 MRI scans at diagnosis and during clinical follow-up of 169 MS patients (mean age 42 ± 11 years, 86% relapsing-remitting MS, time since first relapse 11 ± 9 years). CCI at diagnosis was 0.345 ± 0.04 and correlated with duration of disease (r = -0.234; p = 0.002) and expanded disability status scale (EDSS) score at diagnosis (r = -0.428; p < 0.001). Linear regression analyses identified age, duration of disease, relapse rate and EDSS at diagnosis as independent predictors of disability after a mean of 7.1 years (Nagelkerke's R: 0.56). Annual CCI decrease was 0.01 ± 0.02 (annual tissue loss: 1.3%). In secondary progressive MS patients, the CCI decrease was twice that of relapsing-remitting MS patients (p = 0.04). There was a trend toward a greater CCI decrease in untreated patients compared with those who received disease-modifying drugs (p = 0.2). CCI is an easy-to-use MRI marker for estimating brain atrophy in patients with MS. Brain atrophy as measured with CCI was associated with disability progression, but it was not an independent predictor of long-term disability.
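A hedged sketch of the multivariable linear regression reported above (age, disease duration, relapse rate and baseline EDSS as predictors of follow-up disability), fitted by ordinary least squares on simulated data; the effect sizes used to generate the data are invented.

```python
# Ordinary-least-squares fit of follow-up EDSS on the four predictors the
# abstract names; the simulated effect sizes are assumptions.
import numpy as np

rng = np.random.default_rng(2)
n = 169
age = rng.normal(42, 11, n)
duration = rng.normal(11, 9, n).clip(0)      # years since first relapse
relapse_rate = rng.uniform(0, 2, n)
edss0 = rng.uniform(0, 6, n)                 # EDSS at diagnosis
edss_fu = (0.5 + 0.01 * age + 0.03 * duration + 0.4 * relapse_rate
           + 0.7 * edss0 + rng.normal(0, 0.8, n))

X = np.column_stack([np.ones(n), age, duration, relapse_rate, edss0])
coef, *_ = np.linalg.lstsq(X, edss_fu, rcond=None)
print(dict(zip(["intercept", "age", "duration", "relapse_rate", "edss0"],
               coef.round(3))))
```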
Abstract:
QUESTIONS UNDER STUDY/PRINCIPLES: After arterial ischemic stroke (AIS) an early diagnosis helps preserve treatment options that are no longer available later. Paediatric AIS is difficult to diagnose and often the time to diagnosis exceeds the time window of 6 hours defined for thrombolysis in adults. We investigated the delay from the onset of symptoms to AIS diagnosis in children and potential contributing factors.
Abstract:
BACKGROUND: After bovine spongiform encephalopathy (BSE) emerged in European cattle livestock in 1986, a fundamental question was whether the agent had also established itself in the small ruminant population. In Switzerland, transmissible spongiform encephalopathies (TSEs) in small ruminants have been monitored since 1990. While a BSE infection could be excluded in the most recent TSE cases, techniques to discriminate scrapie from BSE had not been available at the time of diagnosis of the historical cases, so their status remained unclear. We herein applied state-of-the-art techniques to retrospectively classify these animals and re-analyzed the affected flocks for secondary cases. These results were the basis for models simulating the course of TSEs over a period of 70 years. The aim was to arrive at a statistically based overall assessment of the TSE situation in the domestic small ruminant population in Switzerland. RESULTS: In total, 16 TSE cases were identified in small ruminants in Switzerland since 1981, of which eight were atypical scrapie and six were classical scrapie. In two animals, retrospective analysis did not allow any further classification because of the lack of appropriate tissue samples. We found no evidence of infection with the BSE agent in the cases under investigation. No secondary cases were identified in any of the affected flocks. A Bayesian prevalence calculation resulted in most likely estimates of one case of BSE, five cases of classical scrapie and 21 cases of atypical scrapie per 100'000 small ruminants. According to our models, none of the TSEs is expected to cause a broader epidemic in Switzerland. In a closed population, they are rather expected to fade out in the coming decades or, in the case of a sporadic origin, may remain at a very low level. CONCLUSIONS: In summary, these data indicate that, despite a significant BSE epidemic in cattle, there is no evidence that BSE established itself in the small ruminant population in Switzerland. Classical and atypical scrapie both occur at a very low level and are not expected to escalate into an epidemic. In this situation, the extent of TSE surveillance in small ruminants requires reevaluation based on cost-benefit analysis.
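The paper's Bayesian prevalence calculation is not specified in the abstract; as a loose illustration only, a Beta-Binomial model under a flat prior gives point and interval estimates of prevalence from surveillance counts. The counts below are hypothetical, and the actual model (which would account for test sensitivity and surveillance coverage) is certainly more elaborate.

```python
# Beta-Binomial prevalence sketch: with a uniform Beta(1, 1) prior, the
# posterior for prevalence after observing `cases` among `tested` animals
# is Beta(1 + cases, 1 + tested - cases). Counts are invented.
from scipy import stats

cases, tested = 6, 280_000
posterior = stats.beta(1 + cases, 1 + tested - cases)

mode = cases / tested                       # posterior mode under a flat prior
lo, hi = posterior.ppf([0.025, 0.975])
print(f"most likely prevalence ~ {mode * 1e5:.1f} per 100'000 "
      f"(95% credible interval {lo * 1e5:.1f}-{hi * 1e5:.1f})")
```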
Abstract:
BACKGROUND: Recently, several cases of symptomatic and/or electrically detectable intracardiac inside-out abrasions in silicone-coated Riata® and Riata® ST leads have been described. However, the prevalence in asymptomatic patients with unremarkable implantable cardioverter defibrillator (ICD) interrogation is unknown. The aim of this study was to determine the prevalence of asymptomatic and electrically undetectable intracardiac inside-out abrasion in silicone-coated Riata® and Riata® ST leads. METHODS: All 52 patients with an active silicone-coated Riata® or Riata® ST lead followed up in our outpatient clinic were scheduled for a premature ICD interrogation and a biplane chest radiograph. When an intracardiac inside-out abrasion was suspected, the finding was confirmed by fluoroscopy. RESULTS: Mean time since implantation was 71 ± 18 months. An intracardiac inside-out abrasion was confirmed by fluoroscopy in 6 patients (11.5%). Mean time from lead implantation to detection of the abrasion was 79 ± 14 months. In all patients with an intracardiac inside-out abrasion, ICD interrogation showed normal and stable electrical parameters. Retrospectively, in 4 of these 6 patients, a coronary angiography performed 25 ± 18 months before the diagnosis of intracardiac inside-out abrasion already showed the defect. Despite the undetected abrasion, 2 of these 4 patients experienced adequate antitachycardia pacing and ICD shocks. ICD leads were replaced in all 6 patients. CONCLUSIONS: The prevalence of asymptomatic intracardiac inside-out abrasion in silicone-coated Riata® and Riata® ST leads is higher than 10% when assessed by fluoroscopy, and most intracardiac inside-out abrasions are not detectable by ICD interrogation.
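As a worked check, the 11.5% prevalence is simply 6 of 52 leads; the exact (Clopper-Pearson) confidence interval in the sketch below is our addition and is not reported in the abstract.

```python
# Prevalence of inside-out abrasion with an exact binomial confidence
# interval (the interval is illustrative, not from the paper).
from scipy import stats

k, n = 6, 52
print(f"prevalence = {k / n:.1%}")                       # 11.5%
ci = stats.binomtest(k, n).proportion_ci(method="exact")
print(f"95% CI: {ci.low:.1%} to {ci.high:.1%}")
```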
Abstract:
OBJECTIVE: Posttraumatic stress disorder (PTSD) has been associated with an increased cardiovascular risk, though the pathophysiologic mechanisms involved are elusive. A hypercoagulable state before occurrence of coronary thrombosis contributes to atherosclerosis development. We investigated whether PTSD would be associated with increased coagulation activity. METHODS: We measured resting plasma levels of clotting factor VII activity (FVII:C), FVIII:C, FXII:C, fibrinogen, and D-dimer in 14 otherwise healthy patients with PTSD and in 14 age- and gender-matched, trauma-exposed non-PTSD controls. Categorical and dimensional diagnoses of PTSD were made using the Clinician-Administered PTSD Scale (CAPS) interview. We also investigated to what extent the relationship between PTSD and coagulation measures would be confounded by demographics, cardiovascular risk factors, lifestyle variables, time since trauma, and mood. RESULTS: Coagulation factor levels did not significantly differ between patients with a categorical diagnosis of PTSD and controls while controlling for covariates. In all subjects, FVIII:C was predicted by hyperarousal severity (beta = 0.46, p = .014) independent of covariates and by overall PTSD symptom severity (beta = 0.38, p = .045); the latter association was of borderline significance when separately controlling for gender, smoking, exercise, and anxiety (p values <.07). In patients, fibrinogen was predicted by hyperarousal severity (beta = 0.70, p = .005) and by overall PTSD symptom severity (beta = 0.61, p = .020), with mood partially affecting these associations. FVII:C, FXII:C, and D-dimer showed no independent association with PTSD symptoms. CONCLUSIONS: PTSD may elicit hypercoagulability, even at subthreshold levels, offering one psychobiological pathway by which posttraumatic stress might contribute to atherosclerosis progression and clinical cardiovascular disease.
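A loose illustration of how a standardized regression coefficient (beta) of the kind reported above can be obtained: z-score the predictor and the outcome, then fit a simple regression. The study's betas came from models with covariates; the simulated data and single-predictor model here are assumptions.

```python
# Standardized beta from a simple regression of factor VIII activity on
# hyperarousal severity; in the single-predictor case this equals
# Pearson's r. All values are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
hyperarousal = rng.normal(10, 4, 28)                  # CAPS hyperarousal score
fviii = 100 + 3 * hyperarousal + rng.normal(0, 15, 28)

z = lambda x: (x - x.mean()) / x.std(ddof=1)
res = stats.linregress(z(hyperarousal), z(fviii))
print(f"beta = {res.slope:.2f}, p = {res.pvalue:.3f}")
```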
Abstract:
Arterial hypertension and diabetes are potent independent risk factors for cardiovascular, cerebral, renal and peripheral (atherosclerotic) vascular disease. The prevalence of hypertension in diabetic individuals is approximately twice that in the non-diabetic population. Diabetic individuals with hypertension have a greater risk of macrovascular and microvascular disease than normotensive diabetic individuals. Hypertension is a major contributor to morbidity and mortality in diabetes and should be recognized and treated early. Type 2 diabetes and hypertension share certain risk factors such as overweight, visceral obesity, and possibly insulin resistance. Lifestyle modifications (weight reduction, exercise, limitation of daily alcohol intake, smoking cessation) are the foundation of hypertension and diabetes management, whether as definitive treatment or as an adjunct to pharmacological therapy. Additional pharmacological therapy should be initiated when lifestyle modifications are unsuccessful or hypertension is too severe at the time of diagnosis. All classes of antihypertensive drugs are effective in controlling blood pressure in diabetic patients. For single-agent therapy, ACE inhibitors, angiotensin receptor blockers, beta-blockers, and diuretics can be recommended. Because of concerns about the lower effectiveness of calcium channel blockers in decreasing coronary events and heart failure and in reducing the progression of renal disease in diabetes, it is recommended to use these agents as second-line drugs for patients who cannot tolerate the other preferred classes or who require additional agents to achieve the target blood pressure. The choice depends on the patient's specific treatment indications, since each of these drugs has potential advantages and disadvantages. In patients with microalbuminuria or clinical nephropathy, both ACE inhibitors and angiotensin receptor blockers are considered first-line therapy for preventing the development and progression of nephropathy. Since treatment is usually life-long, cost effectiveness should be included in the evaluation of treatment.