990 results for ELEMENTS DIAGNOSTIC UNIT
Abstract:
The use, manipulation and application of electrical currents as a controlled interference mechanism in the human body is currently a strong source of motivation for researchers in areas such as clinical practice, sports and neuroscience, among others. In electrical stimulation (ES), the current applied to tissue is traditionally controlled with respect to stimulation amplitude, frequency and pulse width. The main drawbacks of transcutaneous ES are rapid fatigue induction and the high discomfort caused by non-selective activation of nervous fibers. There are, however, electrophysiological parameters whose response, such as the response to different stimulation waveforms, polarity or personalized charge control, is still unknown. The study of the following questions is of great importance: What is the physiological effect of electric pulse parametrization with respect to charge, waveform and polarity? Does the effect change with the clinical condition of the subject? Can the influence of parametrization on muscle recruitment retard fatigue onset? Can parametrization enable fiber selectivity, optimizing the recruitment of motor fibers over nervous fibers and reducing contraction discomfort? Current hardware solutions lack flexibility at the level of stimulation control and physiological response assessment. To answer these questions, a miniaturized, portable and wirelessly controlled device with ES functions and full integration with a generic biosignal acquisition platform has been created. Hardware was also developed to provide complete freedom in controlling the applied current with respect to waveform, polarity, frequency, amplitude, pulse width and duration. The methodologies developed are successfully applied and evaluated in the contexts of fundamental electrophysiology, psycho-motor rehabilitation and the diagnosis of neuromuscular disorders.
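As a hypothetical illustration of the charge-based pulse control described above (the function names and the example values are assumptions for illustration, not taken from the thesis), a rectangular biphasic pulse can be parametrized so that the anodic phase recovers exactly the charge injected by the cathodic phase:

```python
# Hypothetical sketch: charge delivered by a rectangular biphasic pulse.
# The charge of each rectangular phase is amplitude (mA) x width (us);
# a charge-balanced pulse injects equal and opposite charge in its
# cathodic and anodic phases, a common safety constraint in ES.

def phase_charge_nC(amplitude_mA: float, pulse_width_us: float) -> float:
    """Charge of one rectangular phase in nanocoulombs (mA * us = nC)."""
    return amplitude_mA * pulse_width_us

def is_charge_balanced(cathodic_mA: float, cathodic_us: float,
                       anodic_mA: float, anodic_us: float,
                       tol_nC: float = 1e-9) -> bool:
    """True when the anodic phase recovers the cathodic charge."""
    return abs(phase_charge_nC(cathodic_mA, cathodic_us)
               - phase_charge_nC(anodic_mA, anodic_us)) < tol_nC

# A 10 mA, 300 us cathodic phase balanced by a 5 mA, 600 us anodic phase:
print(phase_charge_nC(10, 300))             # 3000.0 (nC per phase)
print(is_charge_balanced(10, 300, 5, 600))  # True
```

Keeping per-phase charge fixed while varying amplitude, width, waveform or polarity is one way the "personalized charge control" mentioned in the abstract could be parametrized.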
This PhD project was carried out in the Physics Department of the Faculty of Sciences and Technology (FCT-UNL), in close collaboration with the company PLUX - Wireless Biosignals S.A., and was co-funded by the Foundation for Science and Technology.
Abstract:
INTRODUCTION: Food security remains one of the world's biggest problems and is closely related to HIV/AIDS. The objective was to examine food insecurity in HIV/AIDS patients from Brasilia, Brazil. METHODS: The short version of the Food Security Scale was applied to patients with HIV/AIDS. RESULTS: A total of 103 patients participated (65 HIV+ and 38 with AIDS). Food insecurity was found in 33.8% of HIV+ patients and 36.8% of patients with AIDS. A relation between food insecurity and low educational and social levels was established. CONCLUSIONS: Food security should be an important component of HIV/AIDS treatment programs.
Abstract:
INTRODUCTION: Antimicrobial resistance is an increasing threat in hospitalized patients, and inappropriate empirical antimicrobial therapy is known to adversely affect outcomes in ventilator-associated pneumonia (VAP). The aim of this study was to evaluate antimicrobial usage, incidence, etiology, and antimicrobial resistance trends for prominent nosocomial pathogens causing ventilator-associated pneumonia in a clinical-surgical intensive care unit (ICU). METHODS: Gram-negative bacilli and Staphylococcus aureus causing VAP, as well as their antimicrobial resistance patterns and data on consumption (defined daily dose [DDD] per 1,000 patient-days) of glycopeptides, extended-spectrum cephalosporins, and carbapenems in the unit, were evaluated in two different periods (A and B). RESULTS: Antimicrobial use was high, mainly of broad-spectrum cephalosporins, with a significant increase in the consumption of glycopeptides (p < 0.0001) and carbapenems (p < 0.007) in period B. For Acinetobacter baumannii and members of the Enterobacteriaceae family, 5.27- and 3.06-fold increases in VAPs, respectively, were noted, and a significant increase in resistance rates was found for imipenem-resistant A. baumannii (p = 0.003) and third-generation cephalosporin-resistant Enterobacteriaceae (p = 0.01) isolates in this same period. CONCLUSIONS: Our results suggest a link between antibiotic usage at the institutional level and resistant bacteria. The use of carbapenems was related to the high rate of resistance in A. baumannii, and therefore high consumption of imipenem/meropenem could play a major role in the selective pressure exerted by antibiotics on A. baumannii strains.
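The consumption metric used in the study, defined daily doses (DDD) per 1,000 patient-days, is total grams dispensed divided by the WHO-assigned DDD for the drug, normalized to the unit's patient-days. A minimal sketch (the figures below are illustrative, not taken from the paper):

```python
# Sketch of the DDD per 1,000 patient-days metric: total grams consumed,
# converted to WHO defined daily doses, normalized per 1,000 patient-days.

def ddd_per_1000_patient_days(total_grams: float, who_ddd_grams: float,
                              patient_days: int) -> float:
    """Antimicrobial consumption normalized to 1,000 patient-days."""
    return (total_grams / who_ddd_grams) / patient_days * 1000

# e.g. 1,500 g of meropenem (WHO DDD = 3 g) over 10,000 patient-days:
print(round(ddd_per_1000_patient_days(1500, 3.0, 10_000), 1))  # 50.0
```

Comparing this normalized rate between periods A and B is what allows consumption trends to be tested independently of changes in unit occupancy.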
Abstract:
INTRODUCTION: While the incidence of HIV infection and AIDS is increasing in small Brazilian cities, epidemiological studies are often conducted in large urban centers. METHODS: Our group conducted a retrospective analysis of survival determinants among 358 patients who attended a reference unit in a small city. RESULTS: Death risk was lower among men who have sex with men, patients with an HIV-seropositive partner, and those admitted after highly active antiretroviral therapy (HAART) was available. CONCLUSIONS: The study documents the striking beneficial effect of HAART. The finding of other groups with improved survival may aid in the development of programmatic strategies.
Abstract:
INTRODUCTION: West Nile virus (WNV) is a flavivirus with a natural cycle involving mosquitoes and birds. Over the last 11 years, WNV has spread throughout the Americas, with the imminent risk of its introduction into Brazil. METHODS: Envelope protein domain III of WNV (rDIII) was bacterially expressed and purified. An enzyme-linked immunosorbent assay with the WNV rDIII antigen was standardized against mouse immune ascitic fluids (MIAFs) raised against different flaviviruses. RESULTS: WNV rDIII reacted strongly with St. Louis encephalitis virus (SLEV) MIAF but not with other flaviviruses. CONCLUSIONS: This antigen may be a potentially useful tool for serologic diagnosis and may contribute to future epidemiological surveillance of WNV infections in Brazil.
Abstract:
Theropods form a highly successful and morphologically diversified group of dinosaurs that gave rise to birds. They include most, if not all, carnivorous dinosaurs, yet many theropod clades were secondarily adapted to piscivory, omnivory and herbivory, and theropods show a large array of skull and dentition morphologies. This work aims to investigate aspects of the evolution of theropod dinosaurs by analyzing in detail both the anatomy and ontogeny of teeth and quadrates in non-avian theropods, and by studying embryonic and adult material of a new species of theropod. A standardized list of terms and notations for each anatomical entity of the tooth, quadrate, and maxilla is here proposed with the goal of facilitating descriptions of these important cranial and dental elements. The distribution of thirty dental characters among 113 theropod taxa is investigated, and a list of diagnostic dental characters is proposed. As an example, four isolated theropod teeth from the Lourinhã Formation (Kimmeridgian‒Tithonian) of Portugal are described and identified based on a cladistic analysis performed on a data matrix of 141 dentition-based characters coded in 60 taxa. Two shed teeth are referred to an abelisaurid, providing the first record of Abelisauridae in the Jurassic of Laurasia and one of the oldest records of this clade worldwide, suggesting a possible radiation of Abelisauridae in Europe well before the Upper Cretaceous. The consensus tree resulting from this phylogenetic analysis, the most extensive on theropod teeth, indicates that theropod teeth provide reliable data for identification at approximately family level, and this method will help identify theropod teeth with more confidence.
A detailed description of the dentition of Megalosauridae is also provided, and a discriminant analysis performed on a dataset of numerical data collected on the teeth of 62 theropod taxa reveals that megalosaurid teeth are hardly distinguishable from those of other theropod clades with ziphodont dentition. This study highlights the importance of detailed anatomical descriptions and additional morphometric data on teeth for the purpose of identifying isolated theropod teeth. In order to evaluate the phylogenetic potential and investigate the evolutionary transformations of the quadrate, a phylogenetic morphometric analysis as well as a cladistic analysis using 98 discrete quadrate-related characters were conducted. The quadrate morphology on its own provides a wealth of data with strong phylogenetic signal, and the phylogenetic morphometric analysis reveals two main morphotypes of the mandibular articulation of the quadrate linked to function. As an example, six isolated quadrates from the Kem Kem beds (Cenomanian) of Morocco are determined to be from juvenile and adult individuals of Spinosaurinae based on phylogenetic, morphometric, and phylogenetic morphometric analyses. Morphofunctional analysis of the spinosaurid mandibular articulation has shown that the posterior parts of the two mandibular rami moved laterally when the jaw was depressed, due to a mediolaterally oriented intercondylar sulcus of the quadrate. Such lateral movement of the mandibular rami was possible due to a movable mandibular symphysis in spinosaurids, allowing the pharynx to be widened. A new species of theropod from the Lourinhã Formation of Portugal, Torvosaurus gurneyi, is erected based on a right maxilla and an incomplete caudal centrum. This taxon supports a mechanism of vicariance in the Iberian Meseta during the Late Jurassic, when the proto-Atlantic was already well formed.
A theropod clutch containing several crushed eggs and embryonic material is also assigned to this new species of Torvosaurus. Investigation of maxillary ontogeny in basal tetanurans reveals that crown denticles, elongation of the anterior ramus, and fusion of the interdental plates appear at a post-hatchling stage. Maxillary pneumaticity, on the other hand, is already present at an embryonic stage in non-avian theropods.
Abstract:
Introduction Dengue is prevalent in many tropical and sub-tropical regions. The clinical diagnosis of dengue is still complex, and few data are available. This work aimed to assess the diagnostic accuracy of the tourniquet test in patients with suspected dengue infection and its positivity in the different classifications of this disease, as reported to the Information System for Notifiable Diseases in Belo Horizonte, State of Minas Gerais, Brazil, between 2001 and 2006. Methods Cross-sectional analysis of the diagnostic accuracy of the tourniquet test for dengue, using IgM-anti-DENV ELISA as the gold standard. Results We selected 9,836 suspected cases, of which 41.1% were confirmed to be dengue. Classic dengue was present in 95.8%, dengue with complications in 2.5% and dengue hemorrhagic fever in 1.7%. The tourniquet test was positive in 16.9% of classic dengue cases, 61.7% of dengue cases with complications and 82.9% of cases of dengue hemorrhagic fever. The sensitivity and specificity of the tourniquet test were 19.1% and 86.4%, respectively. Conclusions A positive tourniquet test can be a valuable tool to support the diagnosis of dengue where laboratory tests are not available. However, the absence of a positive test should not be read as the absence of infection. In addition, the tourniquet test was demonstrated to be an indicator of dengue severity.
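The accuracy measures reported above all derive from a 2x2 table of test result versus ELISA gold standard. A minimal sketch (the counts below are hypothetical, scaled only to reproduce the published sensitivity of 19.1% and specificity of 86.4%; the study does not report the raw cells):

```python
# Sketch of standard diagnostic accuracy measures from a 2x2 table:
# tp/fn split the gold-standard positives, tn/fp the negatives.

def diagnostic_accuracy(tp: int, fp: int, fn: int, tn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),  # positives correctly detected
        "specificity": tn / (tn + fp),  # negatives correctly ruled out
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical counts consistent with the reported percentages:
m = diagnostic_accuracy(tp=191, fp=136, fn=809, tn=864)
print(round(m["sensitivity"], 3), round(m["specificity"], 3))  # 0.191 0.864
```

The low sensitivity is why the abstract warns that a negative tourniquet test cannot rule out infection, while the higher specificity supports its use to back a positive clinical suspicion.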
Abstract:
Introduction Rapid diagnostic tests (RDTs) may improve the early detection of visceral leishmaniasis (VL), but their real-world performance requires additional study. Therefore, we evaluated the performance of an rK39-based RDT (Kalazar Detect™) for the detection of VL in an endemic, large urban area. Methods Data were collected from a registry of rK39 RDTs performed at 11 emergency care units in Belo Horizonte, Brazil, and from a national database of reportable communicable diseases, the Sistema de Informação de Agravos de Notificação (SINAN). Results The rapid rK39 test was performed in 476 patients, with 114 (23.9%) positive results. The analysis of rK39 RDT performance was based on 381 (80%) cases reported to the SINAN database, of which 145 (38.1%) were confirmed cases. Estimates for sensitivity and specificity were 72.4% (95% CI: 64.6-79%) and 99.6% (95% CI: 97.6-99.9%), respectively. Positive and negative predictive values were estimated at 99.1% (95% CI: 94.9-99.8%) and 85.5% (95% CI: 80.8-89.1%), respectively. In addition, close agreement between the rK39 RDT and indirect immunofluorescence was observed. Conclusions In summary, the rK39 RDT showed high specificity but only moderate sensitivity. In areas endemic for VL, treatment may be considered in cases with clinical manifestations and a positive rK39 RDT, but those with a negative test should be subjected to further investigation.
Abstract:
Introduction: Acute kidney injury (AKI) is a frequent and potentially fatal complication in infectious diseases. The aim of this study was to investigate the clinical aspects of AKI associated with infectious diseases and the factors associated with mortality. Methods: This retrospective study was conducted in patients with AKI who were admitted to the intensive care unit (ICU) of a tertiary infectious diseases hospital from January 2003 to January 2012. The major underlying diseases and clinical and laboratory findings were evaluated. Results: A total of 253 cases were included. The mean age was 46±16 years, and 72% of the patients were male. The main diseases were human immunodeficiency virus (HIV) infection, HIV/acquired immunodeficiency syndrome (AIDS) (30%), tuberculosis (12%), leptospirosis (11%) and dengue (4%). Dialysis was performed in 70 cases (27.6%). The patients were classified as risk (4.4%), injury (63.6%) or failure (32%). The time between AKI diagnosis and dialysis was 3.6±4.7 days. Oliguria was observed in 112 cases (45.7%). The Acute Physiology and Chronic Health Evaluation (APACHE) II scores were higher in patients with HIV/AIDS (57±20, p-value=0.01) and dengue (68±11, p-value=0.01). Death occurred in 159 cases (62.8%). Mortality was higher in patients with HIV/AIDS (76.6%, p-value=0.02). A multivariate analysis identified the following independent risk factors for death: oliguria, metabolic acidosis, sepsis, hypovolemia, the need for vasoactive drugs, the need for mechanical ventilation and the APACHE II score. Conclusions: AKI is a common complication in infectious diseases, with high mortality. Mortality was higher in patients with HIV/AIDS, most likely due to the severity of immunosuppression and opportunistic diseases.
Abstract:
Introduction In addition to the common alterations and diseases inherent in the aging process, elderly persons with a history of leprosy are particularly vulnerable to dependence because of disease-related impairments. Objective To determine whether physical impairment from leprosy is associated with dependence among the elderly. Methods An analytical cross-sectional study of elderly individuals with a history of leprosy and no signs of cognitive impairment was conducted using a database from a former leprosy colony-hospital. The patients were evaluated for dependence in basic activities of daily living (BADL) and instrumental activities of daily living (IADL) and subjected to standard leprosy physical disability grading. Subsequently, descriptive and univariate analyses were conducted, the latter using Pearson's chi-squared test. Results A total of 186 elderly persons were included in the study. Of these individuals, 53.8% were women, 49.5% were older than 75 years of age, 93% had four or fewer years of formal education, 24.2% lived in an institution for the long-term care of the elderly (ILTC), and 18.3% had lower limb amputations. Among those evaluated, 79.8% had visible physical impairments from leprosy (grade 2), 83.3% were independent in BADL, and 10.2% were independent in IADL. There was a higher impairment grade among those patients who were IADL dependent (p=0.038). Conclusions: The leprosy physical impairment grade is associated with dependence for IADL, creating the need for greater social support and systematic monitoring by a multidisciplinary team. The results highlight the importance of early diagnosis and treatment of leprosy to prevent physical impairment and dependence in later years.
Abstract:
ABSTRACT: Hypertension (HT) is a highly prevalent, although underdiagnosed, condition in patients with obstructive sleep apnea (OSA). These conditions are closely related, and 24-hour ambulatory blood pressure monitoring (ABPM) seems to be the most accurate method for diagnosing hypertension in OSA. However, this diagnostic tool is expensive and time-consuming and is therefore not routinely used. On the other hand, although continuous positive airway pressure (CPAP) is considered the gold-standard treatment for symptomatic OSA, its lowering effect on blood pressure (BP) seems to be modest and, therefore, concomitant antihypertensive therapy is still required. Data on antihypertensive drug regimens in patients with OSA are scarce, and specific therapeutic guidelines for the pharmacological treatment of hypertension in these patients remain absent.
The use of animal models of chronic intermittent hypoxia (CIH), which mimic the HT observed in patients with OSA, is extremely important, since it is imperative to identify preferred compounds for adequate BP control in this group of patients. However, studies investigating the effect of antihypertensive drugs in this animal model are insufficient, and most reports on CIH animal models in which drugs have been tested were not designed to address pharmacological questions. Moreover, when testing antihypertensive drugs (AHDs), it is crucial to select a non-invasive and stress-free method of drug delivery. Although gavage is an effective and widely used technique for daily dosing in laboratory rodents, it comprises a sequence of potentially stressful procedures that may bias experimental results. The overall goal of the present translational research was to contribute to the identification of more effective AHDs for the treatment of hypertension in patients with OSA and to investigate mechanisms underlying the systemic effects associated with OSA, as well as their modulation by AHDs. The specific aims were: first, to find new predictors based on anthropometric measures to identify patients who misclassify themselves as non-hypertensive, and thereby promote the selective use of ABPM; second, to investigate a hypothetical association between ongoing antihypertensive regimens and BP control rates in patients with OSA, before and after CPAP adaptation; third, to determine, in a rat model of CIH-induced hypertension, the efficacy of carvedilol (CVD), a nonselective beta-blocker with intrinsic anti-α1-adrenergic activity and antioxidant properties; fourth, to explore the effects of CIH on the pharmacokinetic profile of CVD; and fifth, to investigate an alternative to gavage for the chronic administration of AHDs to laboratory rats.
To this end, in the first phase of this project, we used a sizeable sample of patients with OSA (n=369) who attended a first visit at the Centro Hospitalar Lisboa Norte, EPE Sleep Unit, underwent overnight polysomnography and 24-h ABPM, and completed a questionnaire that registered the ongoing antihypertensive medication profile. In the second phase, a rat experimental model of HT induced by a paradigm of CIH that simulates OSA was used. The main findings of this work were: first, body mass index (BMI) and neck circumference (NC) were identified as independent predictors of hypertension misclassification in patients suspected of OSA; second, in patients with OSA, BP control is independent of both the antihypertensive regimen and the number of antihypertensive drugs, either before or after CPAP adaptation; third, although doses of 10, 30 and 50 mg/kg of CVD promoted a significant reduction in heart rate, no decrease in mean arterial pressure was observed; fourth, the S/(R+S) ratios of CVD enantiomers differed between rats exposed to CIH and those under normoxic conditions; and fifth, voluntary ingestion proved to be an effective method for controlled daily dose administration, with a defined timetable, that is independent of handling and restraint procedures. In conclusion, the clinical study showed that BP control in OSA patients is independent of both the antihypertensive regimen and the number of antihypertensive drugs. Additionally, our results highlight the lack of validity of self-reported hypertension and suggest that all patients suspected of OSA with undiagnosed hypertension and with a BMI above 27 kg/m² and an NC above 39 cm should be screened for hypertension through ABPM.
The results attained in the rat model of HT related to CIH suggest that blockade of the sympathetic nervous system, together with the putative pleiotropic effects of carvedilol, is not able to revert hypertension induced by CIH, and indicate that the pharmacokinetic changes induced by CIH on the S/(R+S) ratio do not appear to be responsible for the lack of efficacy of carvedilol in reversing this particular type of hypertension. Finally, the results presented here support the use of voluntary oral administration as a viable alternative to gavage for the chronic administration of a fixed dose of AHDs.
Abstract:
Introduction Surveillance of nosocomial infections (NIs) is an essential part of quality patient care; however, there are few reports of National Healthcare Safety Network (NHSN) surveillance in neonatal intensive care units (NICUs) and none in developing countries. The purpose of this study was to report the incidence of NIs, causative organisms, and antimicrobial susceptibility patterns in a large cohort of neonates admitted to the NICU during a 16-year period. Methods The patients were followed five times per week from birth to discharge or death, and epidemiological surveillance was conducted according to the NHSN. Results From January 1997 to December 2012, 4,615 neonates, representing 62,412 patient-days, were admitted to the NICU. The device-associated infection rates were as follows: 17.3 primary bloodstream infections per 1,000 central line-days and 3.2 pneumonia infections per 1,000 ventilator-days. A total of 1,182 microorganisms were isolated from sterile body site cultures in 902 neonates. Coagulase-negative staphylococci (CoNS) (34.3%) and Staphylococcus aureus (15.6%) were the most common etiologic agents isolated from cultures. The incidences of oxacillin resistance in CoNS and Staphylococcus aureus were 86.4% and 28.3%, respectively. Conclusions The most important NI remains bloodstream infection, with staphylococci as the predominant pathogens, observed at much higher rates than those reported in the literature. Multiresistant microorganisms, especially oxacillin-resistant staphylococci and cephalosporin-resistant gram-negative bacilli, were frequently found. Furthermore, the process of evaluating the causative organisms was itself valuable, as it promoted strict hygiene measures and meticulous care of infected infants.
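The NHSN device-associated rates quoted above are infections per 1,000 device-days. A minimal sketch (the device-day denominator below is hypothetical; only the resulting rate of 17.3 per 1,000 central line-days appears in the abstract):

```python
# Sketch of the NHSN device-associated infection rate:
# infections / device-days x 1,000. The denominator counts only days
# on which the device (central line, ventilator) was in place.

def infection_rate_per_1000(infections: int, device_days: int) -> float:
    return infections / device_days * 1000

# e.g. 173 bloodstream infections over 10,000 central line-days:
print(round(infection_rate_per_1000(173, 10_000), 1))  # 17.3
```

Normalizing by device-days rather than patient-days is what makes these rates comparable across units with different device utilization.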
Abstract:
Introduction Since the launch of the Global Programme to Eliminate Lymphatic Filariasis, more than 70% of the endemic countries have implemented mass drug administration (MDA) to interrupt disease transmission. The monitoring of filarial infection in sentinel populations, particularly schoolchildren, is recommended to assess the impact of MDA. A key issue is choosing the appropriate tools for these initial assessments (to define the best intervention) and for monitoring transmission. Methods This study compared the pre-MDA performance of five diagnostic methods, namely, thick film test, Knott's technique, filtration, Og4C3-ELISA, and the AD12-ICT card test, in schoolchildren from Brazil. Venous and capillary blood samples were collected between 11 pm and 1 am. The microfilarial loads were analyzed with a negative binomial regression, and the prevalence and associated 95% confidence intervals were estimated for all methods. The accuracies of the AD12-ICT card and Og4C3-ELISA tests were assessed against the combination of parasitological test results. Results A total of 805 schoolchildren were examined. The overall and stratified prevalence by age group and gender detected by Og4C3-ELISA and AD12-ICT were markedly higher than the prevalence estimated by the parasitological methods. The sensitivity of the AD12-ICT card and Og4C3-ELISA tests was approximately 100%, and the positive likelihood ratios were above 6. The specificity of the Og4C3-ELISA was higher than that of the AD12-ICT at different prevalence levels. Conclusions The ICT card test should be the recommended tool for monitoring school-age populations living in areas with ongoing or completed MDA.
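The positive likelihood ratio (LR+) used above to compare the antigen tests is sensitivity divided by the false-positive rate. A sketch with illustrative values chosen to match the reported pattern (sensitivity near 100%, LR+ above 6; these are not the study's exact figures):

```python
# Sketch of the positive likelihood ratio:
# LR+ = sensitivity / (1 - specificity). An LR+ above ~5-10 means a
# positive result substantially raises the odds of infection.

def positive_lr(sensitivity: float, specificity: float) -> float:
    return sensitivity / (1 - specificity)

# Illustrative: 99% sensitivity with 85% specificity gives LR+ = 6.6:
print(round(positive_lr(0.99, 0.85), 2))  # 6.6
```

Because LR+ depends only on sensitivity and specificity, it, unlike predictive values, does not shift with the background prevalence in the monitored population.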
Abstract:
The Free and Cued Selective Reminding Test (FCSRT) is a memory test that controls attention and acquisition by providing category cues in the learning process. Because it enables an assessment of memory that is not confounded by normal age-related changes in cognition and shows high accuracy in Alzheimer's disease (AD) evaluation, it has been recommended by the International Working Group on AD. Our aim was to assess the construct-related validity of the FCSRT in AD spectrum disorders.
Abstract:
INTRODUCTION: To evaluate predictive indices for candidemia in an adult intensive care unit (ICU) and to propose a new index. METHODS: A prospective cohort study was conducted between January 2011 and December 2012. This study was performed in an ICU in a tertiary care hospital at a public university and included 114 patients staying in the adult ICU for at least 48 hours. The association of patient variables with candidemia was analyzed. RESULTS: There were 18 (15.8%) proven cases of candidemia and 96 (84.2%) cases without candidemia. Univariate analysis revealed the following risk factors: parenteral nutrition, severe sepsis, surgical procedure, dialysis, pancreatitis, acute renal failure, and an APACHE II score higher than 20. For the Candida score index, the odds ratio was 8.50 (95% CI, 2.57 to 28.09); the sensitivity, specificity, positive predictive value, and negative predictive value were 0.78, 0.71, 0.33, and 0.94, respectively. With respect to the clinical predictor index, the odds ratio was 9.45 (95%CI, 2.06 to 43.39); the sensitivity, specificity, positive predictive value, and negative predictive value were 0.89, 0.54, 0.27, and 0.96, respectively. The proposed candidemia index cutoff was 8.5; the sensitivity, specificity, positive predictive value, and negative predictive value were 0.77, 0.70, 0.33, and 0.94, respectively. CONCLUSIONS: The Candida score and clinical predictor index excluded candidemia satisfactorily. The effectiveness of the candidemia index was comparable to that of the Candida score.
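The odds ratio reported for each predictive index above comes from a 2x2 table of index result versus candidemia. A sketch (the cell counts below are a hypothetical reconstruction chosen to be consistent with the reported 18 candidemia and 96 non-candidemia cases and the Candida score metrics; the paper does not print the raw table):

```python
# Sketch of the odds ratio from a 2x2 table: OR = (a*d) / (b*c).

def odds_ratio(a: int, b: int, c: int, d: int) -> float:
    """a: index+/candidemia+, b: index+/candidemia-,
    c: index-/candidemia+, d: index-/candidemia-."""
    return (a * d) / (b * c)

# Hypothetical counts (14+4 = 18 candidemia, 28+68 = 96 without):
print(odds_ratio(14, 28, 4, 68))  # 8.5
```

With these counts, sensitivity is 14/18 ≈ 0.78 and specificity 68/96 ≈ 0.71, matching the abstract's figures for the Candida score, which illustrates how the OR, sensitivity and specificity all derive from the same table.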