917 results for Cognitive and motor measures
Abstract:
OBJECTIVE: To determine the psychometric properties of an adapted version of the Falls Efficacy Scale (FES) in older rehabilitation patients. DESIGN: Cross-sectional survey. SETTING: Postacute rehabilitation facility in Switzerland. PARTICIPANTS: Seventy elderly persons aged 65 years and older receiving postacute, inpatient rehabilitation. INTERVENTIONS: Not applicable. MAIN OUTCOME MEASURES: FES questions asked about subjects' confidence (range, 0 [none]-10 [full]) in performing 12 activities of daily living (ADLs) without falling. Construct validity was assessed using correlation with measures of physical (basic ADLs [BADLs]), cognitive (Mini-Mental State Examination [MMSE]), affective (15-item Geriatric Depression Scale [GDS]), and mobility (Performance Oriented Mobility Assessment [POMA]) performance. Predictive validity was assessed using the length of rehabilitation stay as the outcome. To determine test-retest reliability, FES administration was repeated in a random subsample (n=20) within 72 hours. RESULTS: FES scores ranged from 10 to 120 (mean, 88.7+/-26.5). Internal consistency was optimal (Cronbach alpha=.90), and item-to-total correlations were all significant, ranging from .56 (toilet use) to .82 (reaching into closets). Test-retest reliability was high (intraclass correlation coefficient, .97; 95% confidence interval, .95-.99; P<.001). Subjects reporting a fall in the previous year had lower FES scores than nonfallers (85.0+/-25.2 vs 94.4+/-27.9, P=.054). The FES correlated with POMA (Spearman rho=.40, P<.001), MMSE (rho=.37, P=.001), BADL (rho=.43, P<.001), and GDS (rho=-.53, P<.001) scores. These relationships remained significant in multivariable analysis for BADLs and GDS, confirming FES construct validity. There was a significant inverse relationship between FES score and the length of rehabilitation stay, independent of sociodemographic, functional, cognitive, and fall status.
CONCLUSIONS: This adapted FES is reliable and valid in older patients undergoing postacute rehabilitation. The independent association between poor falls efficacy and increased length of stay has not been previously described and needs further investigation.
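The internal consistency reported above (Cronbach alpha=.90) follows from the standard variance decomposition over item scores: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch in Python, using hypothetical confidence ratings (the actual FES item data are not reproduced in the abstract):

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.

    items: list of k lists, each holding one item's scores across
    the same n subjects (population variances used throughout).
    """
    k = len(items)
    n = len(items[0])
    # Variance of each item across subjects
    item_vars = [pvariance(col) for col in items]
    # Variance of each subject's total score over all items
    totals = [sum(col[i] for col in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum(item_vars) / pvariance(totals))

# Hypothetical 0-10 confidence ratings: 3 items, 4 subjects
items = [[2, 5, 8, 10], [3, 5, 7, 9], [1, 4, 8, 10]]
alpha = cronbach_alpha(items)
```

When all items are perfectly correlated the formula yields 1; uncorrelated items drive it toward 0, which is why high alpha is read as items measuring a common construct.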
Abstract:
Recent advances in signal analysis have endowed EEG with the status of a true brain mapping and brain imaging method capable of providing spatio-temporal information regarding brain (dys)function. Because of the increasing interest in the temporal dynamics of brain networks, and because of the straightforward compatibility of EEG with other brain imaging techniques, EEG is increasingly used in the neuroimaging community. However, the full capability of EEG is highly underestimated. Many combined EEG-fMRI studies use the EEG only as a spike counter or an oscilloscope. Many cognitive and clinical EEG studies still use the EEG in its traditional way and analyze grapho-elements at certain electrodes and latencies. Here we show that this way of using the EEG is not only dangerous because it leads to misinterpretations, but also largely ignores the spatial aspects of the signals. In fact, EEG primarily measures the electric potential field at the scalp surface, in the same way as MEG measures the magnetic field. By properly sampling and correctly analyzing this electric field, EEG can provide reliable information about neuronal activity in the brain and the temporal dynamics of this activity in the millisecond range. This review explains some of these analysis methods and illustrates their potential in clinical and experimental applications.
Abstract:
Multi-resistant gram-negative rods are important pathogens in intensive care units (ICU), cause high rates of mortality, and require infection control measures to avoid spread to other patients. This study was undertaken prospectively with all of the patients hospitalized at the Anesthesiology ICU of the Hospital São Paulo, using the ICU component of the National Nosocomial Infection Surveillance System (NNIS) methodology, between March 1, 1997 and June 30, 1998. Hospital infections occurring during the first three months after the establishment of prevention and control measures (3/1/97 to 5/31/97) were compared to those of the last three months (3/1/98 to 5/31/98). In this period, 933 NNIS patients were studied, with 139 during the first period and 211 in the second period. The overall rates of infection by multi-resistant microorganisms in the first and second periods were, respectively: urinary tract infection, 3.28/1,000 patient-days vs 2.5/1,000 patient-days; pneumonia, 2.10/1,000 patient-days vs 5.0/1,000 patient-days; bloodstream infection, 1.09/1,000 patient-days vs 2.5/1,000 patient-days. A comparison between overall infection rates of both periods (Wilcoxon test) showed no statistical significance (p = 0.067). The use of intervention measures effectively decreased the hospital bloodstream infection rate (p < 0.001), which shows that control measures in the ICU can contribute to preventing hospital infections.
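The NNIS rates quoted above are incidence densities, i.e. infections per 1,000 patient-days at risk. A minimal sketch, with hypothetical counts (the abstract reports only the resulting rates, not the underlying numerators and denominators):

```python
def rate_per_1000_patient_days(infections, patient_days):
    """NNIS-style incidence density: infections per 1,000 patient-days."""
    return 1000 * infections / patient_days

# Hypothetical example: 5 infections over 2,000 patient-days
r = rate_per_1000_patient_days(5, 2000)  # 2.5 per 1,000 patient-days
```

Expressing infections per patient-day rather than per admission adjusts for differing lengths of stay, which is why NNIS surveillance uses this denominator.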
Abstract:
In 2008 a 4-year plan for HIV and AIDS Education and Prevention in Ireland was published. The plan aimed to contribute to a reduction in new infections of HIV and AIDS through education and prevention measures. It also aimed to guide and inform the development of policy and services in the statutory and non-statutory sectors with responsibility in this regard. This report is produced as a response to a letter from the Secretariat of the National AIDS Strategy Committee (NASC). The letter requested “feedback from the Education and Prevention Sub-Committee on prevention activities currently in place and on progress to date on the Education and Prevention Action Plan (2008-2012).” In addition, action 2 under Action Area 5: Monitoring and evaluation states that “a mid-term review of the implementation of this action plan should be published”. We note from the HPSC data that there has been a slight decrease in the overall number of new HIV infections; however, there has been great concern over the large increase in new diagnoses in men who have sex with men (MSM). Although we cannot provide evidence for the reason for this increase, there has been a large increase in the education and prevention programmes targeted at MSM, and the report will show the evidence of that increase (Action Area 3: Preventing new infections: population group MSM). There is a presumption that, because of increased awareness, access and confidence among MSM and improved treatment, more MSM are being tested, leading to more diagnoses. This report presents an update on the progress of the implementation of the actions in the HIV and AIDS Education and Prevention Plan 2008-2012. This resource was contributed by The National Documentation Centre on Drug Use.
Abstract:
A total of 296 Shigella spp. isolates were received from State Public Health Laboratories, during the period from 1999 to 2004, by the National Reference Laboratory for Cholera and Enteric Diseases (NRLCED) - IOC/Fiocruz, Rio de Janeiro, Brazil. The frequency of Shigella spp. was: S. flexneri (52.7%), S. sonnei (44.2%), S. boydii (2.3%), and S. dysenteriae (0.6%). The most frequent S. flexneri serovars were 2a and 1b. The highest incidence rates of Shigella isolation were observed in the Southeast (39%) and Northeast (34%) regions and the lowest rate in the South (3%) of Brazil. Strains were further analyzed for antimicrobial susceptibility by the disk diffusion method as part of a surveillance program on antimicrobial resistance. The highest rates of antimicrobial resistance were to trimethoprim-sulfamethoxazole (90%), tetracycline (88%), ampicillin (56%), and chloramphenicol (35%). The patterns of antimicrobial resistance among Shigella isolates pose a major difficulty in the determination of an appropriate drug for shigellosis treatment. Continuous monitoring of antimicrobial susceptibilities of Shigella spp. through a surveillance system is thus essential for effective therapy and control measures against shigellosis.
Abstract:
Malaria emerges from a disequilibrium of the system 'human-plasmodium-mosquito' (HPM). If the equilibrium is maintained, malaria does not ensue and the result is asymptomatic plasmodium infection. The relationships among the components of the system involve coadaptive linkages that lead to equilibrium. A vast body of evidence supports this assumption, including the strategies involved in the relationships between plasmodium and human and mosquito immune systems, and the emergence of resistance of plasmodia to antimalarial drugs and of mosquitoes to insecticides. Coadaptive strategies for malaria control are based on the following principles: (1) the system HPM is composed of three highly complex and dynamic components, whose interplay involves coadaptive linkages that tend to maintain the equilibrium of the system; (2) human and mosquito immune systems play a central role in the coadaptive interplay with plasmodium, and hence, in the maintenance of the system's equilibrium; the under- or overfunction of the human immune system may result in malaria and influence its severity; (3) coadaptation depends on genetic and epigenetic phenomena occurring at the interfaces of the components of the system, and may involve exchange of infectrons (genes or gene fragments) between the partners; (4) plasmodia and mosquitoes have been subjected to selective pressures, leading to adaptation, for an extremely long time and are, therefore, endowed with the capacity to circumvent both natural (immunity) and artificial (drugs, insecticides, vaccines) measures aiming at destroying them; (5) since malaria represents disequilibrium of the system HPM, its control should aim at maintaining or restoring this equilibrium; (6) the disequilibrium of integrated systems involves the disequilibrium of their components; therefore, the maintenance or restoration of the system's equilibrium depends on the adoption of integrated and coordinated measures acting on all components, that is, panadaptive strategies. Coadaptive strategies for malaria control should consider that: (1) the host immune response has to be induced, since without it, no coadaptation is attained; (2) the immune response has to be sustained and efficient enough to avoid plasmodium overgrowth; (3) the immune response should not destroy all parasites; (4) the immune response has to be well controlled in order not to harm the host. These conditions are mostly influenced by antimalarial drugs, and should also be taken into account for the development of coadaptive malaria vaccines.
Abstract:
OBJECTIVE: We examined cognitive performance in children after stroke to study the influence of age at stroke, seizures, lesion characteristics, neurologic impairment (NI), and functional outcome on cognitive outcome. METHODS: This was a prospectively designed study conducted in 99 children who sustained an arterial ischemic stroke (AIS) between the age of 1 month and 16 years. All children underwent cognitive and neurologic follow-up examination sessions 2 years after the insult. Cognitive development was assessed with age-appropriate instruments. RESULTS: Although mean cognitive performance was in the lower normative range, we found poorer results in subtests measuring visuoconstructive skills, short-term memory, and processing speed. Risk factors for negative cognitive outcome were young age at stroke, seizures, combined lesion location (cortical and subcortical), as well as marked NI. CONCLUSIONS: We recommend that all children with a history of AIS undergo regularly scheduled neuropsychological assessment to ensure implementation of appropriate interventions and environmental adjustments as early as possible.
Abstract:
Background: Atazanavir boosted with ritonavir (ATV/r) and efavirenz (EFV) are both recommended as first-line therapies for HIV-infected patients. We compared the 2 therapies for virologic efficacy and immune recovery. Methods: We included all treatment-naïve patients in the Swiss HIV Cohort Study starting therapy after May 2003 with either ATV/r or EFV and a backbone of tenofovir and either emtricitabine or lamivudine. We used Cox models to assess time to virologic failure and repeated measures models to assess the change in CD4 cell counts over time. All models were fit as marginal structural models using both point of treatment and censoring weights. Intent-to-treat and various as-treated analyses were carried out: in the latter, patients were censored at their last recorded measurement if they changed therapy or were no longer adherent to therapy. Results: Patients starting EFV (n = 1,097) and ATV/r (n = 384) were followed for a median of 35 and 37 months, respectively. During follow-up, 51% of patients on EFV and 33% of patients on ATV/r remained adherent and made no change to their first-line therapy. Although intent-to-treat analyses suggest virologic failure was more likely with ATV/r, there was no evidence for this disadvantage in patients who adhered to first-line therapy. Patients starting ATV/r had a greater increase in CD4 cell count during the first year of therapy, but this advantage disappeared after one year. Conclusions: In this observational study, there was no good evidence of any intrinsic advantage for one therapy over the other, consistent with earlier clinical trials. Differences between therapies may arise in a clinical setting because of differences in adherence to therapy.
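The study fits Cox marginal structural models; as a much simpler illustration of the underlying time-to-event idea (with right-censoring for patients who stop or switch therapy), a Kaplan-Meier product-limit estimator can be sketched in a few lines of Python. The data below are hypothetical, not from the cohort:

```python
def kaplan_meier(samples):
    """Product-limit survival estimates.

    samples: list of (time, event) pairs, with event=1 for an
    observed failure and event=0 for censoring.
    Returns [(t, S(t))] at each observed event time:
    S(t) is the running product of (1 - d_i / n_i), where d_i is
    the number of failures at t_i and n_i the number still at risk.
    """
    surv, curve = 1.0, []
    for t in sorted({t for t, e in samples if e == 1}):
        n_i = sum(1 for tt, _ in samples if tt >= t)   # at risk
        d_i = sum(1 for tt, e in samples if tt == t and e == 1)
        surv *= 1 - d_i / n_i
        curve.append((t, surv))
    return curve

# Hypothetical follow-up times (months) with failure/censoring flags
curve = kaplan_meier([(6, 1), (12, 0), (18, 1), (24, 0), (30, 1)])
```

Censored patients contribute to the risk set up to their last measurement but never count as failures, which is exactly the role censoring plays in the as-treated analyses described above.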
Abstract:
Background Maternal exposure to air pollution has been related to fetal growth in a number of recent scientific studies. The objective of this study was to assess the association between exposure to air pollution during pregnancy and anthropometric measures at birth in a cohort in Valencia, Spain. Methods Seven hundred and eighty-five pregnant women and their singleton newborns participated in the study. Exposure to ambient nitrogen dioxide (NO2) was estimated by means of land use regression. NO2 spatial estimations were adjusted to correspond to relevant pregnancy periods (whole pregnancy and trimesters) for each woman. Outcome variables were birth weight, length, and head circumference (HC), along with being small for gestational age (SGA). The association between exposure to residential outdoor NO2 and outcomes was assessed controlling for potential confounders and examining the shape of the relationship using generalized additive models (GAM). Results For continuous anthropometric measures, GAM indicated a change in slope at NO2 concentrations of around 40 μg/m3. NO2 exposure >40 μg/m3 during the first trimester was associated with a change in birth length of -0.27 cm (95% CI: -0.51 to -0.03) and with a change in birth weight of -40.3 grams (-96.3 to 15.6); the same exposure throughout the whole pregnancy was associated with a change in birth HC of -0.17 cm (-0.34 to -0.003). The shape of the relation was seen to be roughly linear for the risk of being SGA. A 10 μg/m3 increase in NO2 during the second trimester was associated with being SGA-weight, odds ratio (OR): 1.37 (1.01-1.85). For SGA-length the estimate for the same comparison was OR: 1.42 (0.89-2.25). Conclusions Prenatal exposure to traffic-related air pollution may reduce fetal growth. Findings from this study provide further evidence of the need for developing strategies to reduce air pollution in order to prevent risks to fetal health and development.
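The SGA odds ratios above come from regression models adjusted for confounders; a crude (unadjusted) odds ratio with a Wald 95% confidence interval can nonetheless be computed directly from a 2x2 exposure-outcome table. A sketch with hypothetical counts, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Wald CI from a 2x2 table:
        a = exposed cases,   b = exposed non-cases
        c = unexposed cases, d = unexposed non-cases
    CI: exp(ln(OR) +/- z * sqrt(1/a + 1/b + 1/c + 1/d)).
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 10/20 SGA among exposed, 5/40 among unexposed
or_, lo, hi = odds_ratio_ci(10, 20, 5, 40)
```

An interval excluding 1 (as for the SGA-weight OR of 1.37, CI 1.01-1.85) indicates a statistically significant association at the 5% level; the SGA-length interval (0.89-2.25) crosses 1 and is therefore inconclusive.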
Abstract:
Patients with Temporal Lobe Epilepsy (TLE) suffer from widespread subtle white matter abnormalities and abnormal functional connectivity extending beyond the affected lobe, as revealed by Diffusion Tensor MR Imaging, volumetric and functional MRI studies. Diffusion Spectrum Imaging (DSI) is a diffusion imaging technique with high angular resolution for improving the mapping of white matter pathways. In this study, we used DSI, connectivity matrices and topological measures to investigate how the alteration in structural connectivity influences whole-brain structural networks. Eleven patients with right-sided TLE and hippocampal sclerosis and 18 controls underwent our DSI protocol at 3T. The cortical and subcortical grey matter was parcellated into 86 regions of interest and the connectivity between every region pair was estimated using global tractography and a connectivity matrix (the adjacency matrix of the structural network). We then compared the networks of patients and controls using topological measures. In patients, we found a higher characteristic path length and a lower clustering coefficient compared to controls. Local measures of clustering and efficiency at the node level showed significant differences after multiple-comparison correction (Bonferroni). These significant nodes were located within as well as outside the temporal lobe, and the localisation of most of them was consistent with regions known to be part of epileptic networks in TLE. Our results show altered connectivity patterns that are concordant with the mapping of functional epileptic networks in patients with TLE. Further studies are needed to establish the relevance of these findings for the propagation of epileptic activity, cognitive deficits in medial TLE and the outcome of epilepsy surgery in individual patients.
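The two global metrics compared between groups, characteristic path length and clustering coefficient, are standard graph measures and can be computed from an adjacency structure. A minimal pure-Python sketch for an unweighted, undirected, connected network (the study used weighted connectivity matrices from tractography, so this is only illustrative):

```python
from collections import deque

def bfs_distances(adj, source):
    # Unweighted shortest-path distances from source via BFS
    dist = {source: 0}
    q = deque([source])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def characteristic_path_length(adj):
    # Mean shortest-path length over all ordered node pairs
    # (assumes a connected graph)
    nodes = list(adj)
    total = pairs = 0
    for s in nodes:
        d = bfs_distances(adj, s)
        for t in nodes:
            if t != s:
                total += d[t]
                pairs += 1
    return total / pairs

def average_clustering(adj):
    # Mean over nodes of: fraction of neighbour pairs that are linked
    coeffs = []
    for u in adj:
        nbrs = list(adj[u])
        k = len(nbrs)
        if k < 2:
            coeffs.append(0.0)
            continue
        links = sum(1 for i in range(k) for j in range(i + 1, k)
                    if nbrs[j] in adj[nbrs[i]])
        coeffs.append(2 * links / (k * (k - 1)))
    return sum(coeffs) / len(coeffs)

# Toy network: adjacency as {node: set of neighbours}
triangle = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
```

In this framework, the patients' higher path length and lower clustering correspond to a less integrated, less locally segregated network than in controls.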
Abstract:
Subthalamic nucleus deep brain stimulation (STN-DBS) is a recognized treatment for advanced and severe forms of Parkinson's disease. The procedure improves motor signs and often allows a reduction of medication. The impact of the procedure on cognitive and neuropsychiatric signs of the disease is more debated, and there is an international consensus on the need for a multidisciplinary evaluation of patients undergoing such programs, including a neuropsychiatric assessment. We present a review of the literature as well as the experience at our centre, focused on the short- and long-term outcome on mood following STN-DBS.
Abstract:
A 57-year-old male with no family history was diagnosed with semantic dementia. He also showed some unusual cognitive features such as episodic memory and executive dysfunctions, spatial disorientation, and dyscalculia. Rapidly progressive cognitive and physical decline occurred. About 1.5 years later, he developed clinical features of a corticobasal syndrome. He died at the age of 60. Brain autopsy revealed numerous 4R-tau-positive lesions in the frontal, parietal and temporal lobes, basal ganglia, and brainstem. Neuronal loss was severe in the temporal cortex. Such association of semantic dementia with tauopathy and corticobasal syndrome is highly unusual. These findings are discussed in the light of current knowledge about frontotemporal lobar degeneration.
Prevalence and genotyping of hepatitis C virus in blood donors in the state of Pará, Northern Brazil
Abstract:
Given the scarcity of epidemiological information on hepatitis C virus (HCV) infection in Northern Brazil, we determined the prevalence and genotypic frequency in blood donors in the state of Pará (PA). Blood samples from all of the blood donors at the Fundação HEMOPA (blood bank of PA) from 2004-2006 were screened for the presence of anti-HCV antibodies, and anti-HCV-seroreactive samples were further tested for HCV RNA using real-time PCR. In total, 116 HCV-RNA samples were genotyped, based on maximum likelihood phylogenetic analyses, using BioEdit, Modelgenerator, PHYML and FigTree software. The population consisted of 242,726 volunteers who donated blood from 2004-2006; the most common subgroup was males between the ages of 18-29 years old (37.30%). Within the whole group, 1,112 blood donors (0.46%) had indeterminate or positive serology; among these, 28.78% were males whose ages ranged from 18-29 years. A diagnosis of chronic HCV infection was confirmed for 304 donors (60.20% males; 66.45% were 30-49 years old), resulting in a prevalence of HCV RNA of 0.13% (304 of 242,726). HCV genotyping revealed a high frequency of genotype 1 (108/116) followed by genotype 3 (8/116). This study found HCV infection to be relatively infrequent in PA; genotype 1 was most commonly isolated. This information can help guide prevention and control policies aimed at efficient diagnosis and control measures.
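The 0.13% prevalence figure is simply the proportion of confirmed chronic infections among all donors, expressed as a percentage; the abstract's own numbers reproduce it:

```python
def prevalence_percent(cases, population, ndigits=2):
    """Point prevalence expressed as a percentage."""
    return round(100 * cases / population, ndigits)

# Figures from the abstract: 304 confirmed chronic HCV infections
# among 242,726 donors over 2004-2006
p = prevalence_percent(304, 242726)  # 0.13
```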
Abstract:
Biologicals have been used for decades in biopharmaceutical topical preparations. Because cellular therapies are routinely used in the clinic, they have gained significant attention. Different derivatives are possible from different cell and tissue sources, making the selection of cell types and the establishment of consistent cell banks crucial steps in the initial whole-cell bioprocessing. Various cell and tissue types have been used in the treatment of skin wounds, including autologous and allogenic skin cells, platelets, placenta and amniotic extracts from either human or animal sources. Experience with progenitor cells shows that they may provide an interesting cell choice due to the facility of out-scaling and known properties for wound healing without scar. Using defined animal cell lines to develop cell-free derivatives may provide initial starting material for pharmaceutical formulations that help in overall stability. Cell lines derived from ovine tissue (skin, muscle, connective tissue) can be developed in short periods of time, and the consistency of these cell lines was monitored by cellular life-span, protein concentrations, stability and activity. Each cell line had long culture periods of up to 37 - 41 passages, and protein measures for each cell line at passages 2 - 15 had only a 1.4-fold maximal difference. Growth stimulation activity towards two target skin cell lines (GM01717 and CRL-1221; 40-year-old human males) at concentrations ranging up to 6 μg/ml showed a 2-3-fold (single extracts) and 3-7-fold (co-cultured extracts) increase. Proteins from co-culture remained stable for up to 1 year in pharmaceutical preparations, as shown by separation on SDS-PAGE gels. Pharmaceutical cell-free preparations were used for veterinary and human wounds and burns. Cell lines and cell-free extracts can show remarkable consistency and stability for the preparation of biopharmaceutical creams, especially when cells are co-cultured, and have positive effects on tissue repair.
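The 1.4-fold maximal difference quoted for protein measures is the ratio of the largest to the smallest measurement across passages. A one-line check, with hypothetical concentration values (the actual passage-by-passage measurements are not given in the abstract):

```python
def max_fold_difference(values):
    """Maximal fold difference across repeated measurements:
    largest value divided by smallest value."""
    return max(values) / min(values)

# Hypothetical protein concentrations (e.g. mg/ml) at passages 2-15
fold = max_fold_difference([10.0, 11.5, 12.2, 14.0])  # 1.4
```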
Abstract:
Nutritional metabolic management, together with other treatment and support measures used, is one of the mainstays of the treatment of septic patients. Nutritional support should be started early, after initial life support measures, to avoid the consequences of malnutrition, to provide adequate nutritional intake and to prevent the development of secondary complications such as superinfection or multiorgan failure. As in other critically-ill patients, when the enteral route cannot be used to ensure calorie-protein requirements, the association of parenteral nutrition has been shown to be safe in this subgroup of patients. Studies evaluating the effect of specific pharmaconutrients in septic patients are scarce and are insufficient to allow recommendations to be made. To date, enteral diets with a mixture of substrates with distinct pharmaconutrient properties do not seem to be superior to standard diets in altering the course of sepsis, although equally there is no evidence that these diets are harmful. There is insufficient evidence to recommend the use of glutamine in septic patients receiving parenteral nutrition. However, given the good results and absence of glutamine-related adverse effects in the various studies performed in the general population of critically-ill patients, these patients could benefit from the use of this substance. Routine use of omega-3 fatty acids cannot be recommended until further evidence has been gathered, although the use of lipid emulsions with a high omega-6 fatty acid content should be avoided. Septic patients should receive an adequate supply of essential trace elements and vitamins. Further studies are required before the use of high-dose selenium can be recommended.