58 results for screening and isolation
Abstract:
The main objective of the present study was to evaluate the diagnostic value (clinical application) of brain measures and cognitive function. Alzheimer's disease and multi-infarct dementia patients (N = 30) and normal subjects over the age of 50 (N = 40) underwent medical, neurological and cognitive investigation. The cognitive tests applied were the Mini-Mental State Examination, word span, digit span, logical memory, spatial recognition span, Boston naming test, praxis, and calculation tests. The brain ratios calculated were the ventricle-brain, bifrontal, bicaudate, third ventricle, and suprasellar cistern measures. These data were obtained from brain computed tomography scans, and the cutoff values from receiver operating characteristic curves. We analyzed the diagnostic parameters provided by these ratios and compared them to those obtained by cognitive evaluation. The sensitivity and specificity of the cognitive tests were higher than those of the brain measures, although dementia patients presented both higher ratios and poorer cognitive performance than normal individuals. Normal controls over the age of 70 presented higher measures than younger groups, but similar cognitive performance. We found diffuse losses of central nervous system tissue, reflected in the distribution of cerebrospinal fluid, in dementia patients. The likelihood of case identification was higher with functional impairment than with structural changes of the central nervous system. Cognitive evaluation still seems to be the best method to screen individuals in the community, especially in developing countries, where the cost of brain imaging precludes its use for screening and initial assessment of dementia.
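The cutoff-based comparison described above can be sketched as follows. The scores below are hypothetical and only illustrate how sensitivity and specificity are computed at a given ROC-derived cutoff; they are not the study's data.

```python
# Illustrative only: hypothetical scores, not the study's data.
def sensitivity_specificity(scores_cases, scores_controls, cutoff):
    """Flag a subject as impaired when the test score falls below the cutoff."""
    tp = sum(s < cutoff for s in scores_cases)      # cases correctly flagged
    tn = sum(s >= cutoff for s in scores_controls)  # controls correctly cleared
    return tp / len(scores_cases), tn / len(scores_controls)

# Hypothetical cognitive-test scores (lower = worse performance)
cases = [14, 18, 20, 21, 23]
controls = [25, 27, 28, 29, 30]
print(sensitivity_specificity(cases, controls, 24))  # (1.0, 1.0) for this toy data
```

In practice the cutoff is chosen from the ROC curve to trade sensitivity against specificity over the observed score distributions.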
Abstract:
Chronic kidney disease (CKD) is a worldwide public health problem, with adverse outcomes of kidney failure, cardiovascular disease, and premature death. This has led to the hypothesis that earlier recognition of kidney disease and successful intervention may improve outcomes. The National Kidney Foundation, through its Kidney Disease Outcomes Quality Initiative (K/DOQI), and other national institutions recommend the glomerular filtration rate (GFR) for the definition, classification, screening, and monitoring of CKD. Creatinine clearance, the most widely used clinical marker of kidney function, is now recognized as an unreliable measure of GFR because serum creatinine is affected by age, weight, muscle mass, race, various medications, and extra-glomerular elimination. Cystatin C concentration is a new and promising marker of kidney dysfunction in both native and transplanted kidneys. Because of its low molecular weight, cystatin C is freely filtered at the glomerulus and is almost completely reabsorbed and catabolized, but not secreted, by tubular cells. Given these characteristics, cystatin C concentration may be superior to creatinine concentration in detecting chronic kidney disease. This review evaluates, from the recent literature, the clinical efficiency and relevance of these GFR markers for CKD screening.
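As an illustration of why creatinine-based estimates depend on age, weight and sex, the widely used Cockcroft-Gault formula (a general clinical formula, not one proposed by this review) can be sketched as:

```python
def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female=False):
    """Cockcroft-Gault estimate of creatinine clearance in mL/min."""
    crcl = ((140 - age_years) * weight_kg) / (72 * serum_creatinine_mg_dl)
    return crcl * 0.85 if female else crcl  # 0.85 correction factor for women

# Identical serum creatinine (1.0 mg/dL), very different estimated clearance:
print(round(cockcroft_gault(30, 80, 1.0)))  # 122 mL/min
print(round(cockcroft_gault(75, 55, 1.0)))  # 50 mL/min
```

The two hypothetical patients show how the same laboratory value can correspond to normal or substantially reduced kidney function, which is the core limitation the abstract attributes to creatinine.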
Abstract:
Polymorphisms of hormone receptor genes have been linked to modifications in reproductive factors and to an increased risk of breast cancer (BC). In the present study, we determined the allelic and genotypic frequencies of the ERα-397 PvuII C/T, ERα-351 XbaI A/G and PGR PROGINS polymorphisms and investigated their relationship with mammographic density, body mass index (BMI) and other risk factors for BC. A consecutive, unselected sample of 750 Brazilian BC-unaffected women enrolled in a mammography screening program was recruited. The distribution of PGR PROGINS genotypic frequencies was 72.5, 25.5 and 2.0% for A1A1, A1A2 and A2A2, respectively, equivalent to that found in other studies of healthy women. The distribution of ERα genotypes was: ERα-397 PvuII C/T: 32.3% TT, 47.5% TC, and 20.2% CC; ERα-351 XbaI A/G: 46.3% AA, 41.7% AG and 12.0% GG. ERα haplotypes were 53.5% PX, 14.3% Px, 0.3% pX, and 32.0% px, significantly different from most previously published reports worldwide (P < 0.05). Overall, the PGR PROGINS genotypes A2A2 and A1A2 were associated with fatty and moderately fatty breast tissue. The same genotypes were also associated with a high BMI in postmenopausal women. In addition, the ERα-351 XbaI GG genotype was associated with menarche ≥12 years (P = 0.02). ERα and PGR polymorphisms have a phenotypic effect and may play an important role in determining BC risk. Finally, if confirmed in BC patients, these associations could have important implications for mammographic screening strategies and may help identify women at higher risk for the disease.
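Allele frequencies such as those reported here follow directly from genotype counts, since each subject carries two alleles. A minimal sketch, using counts approximated from the reported PROGINS percentages (n = 750):

```python
def allele_freqs(n_AA, n_Aa, n_aa):
    """Allele frequencies from genotype counts (each subject carries two alleles)."""
    total_alleles = 2 * (n_AA + n_Aa + n_aa)
    p = (2 * n_AA + n_Aa) / total_alleles  # frequency of the major allele
    return p, 1 - p

# Counts approximated from the reported percentages (72.5%, 25.5%, 2.0% of 750):
p_A1, p_A2 = allele_freqs(544, 191, 15)
print(round(p_A1, 3), round(p_A2, 3))  # 0.853 0.147
```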
Abstract:
The increased marketing of olive oil in Brazil has intensified legal requirements to ensure regulation of this product. Measurement of the specific extinction at 270 nm (E 270) and of the stigmastadiene content can be used to assess the presence of refined oils in virgin olive oil. During vegetable oil refining, compounds with conjugated double bonds that absorb at 270 nm are generated from unsaturated fatty acids, and stigmastadienes such as stigmasta-3,5-diene are formed from sterols. To compare these parameters, seven samples of extra virgin olive oil and three samples of olive oil (a blend of virgin and refined) were analyzed. Four of the extra virgin samples had levels of stigmastadiene and E 270 higher than expected; two of them were adulterated with seed oil (rich in linoleic acid) and the other two with olive pomace oil. The results demonstrate the higher sensitivity of stigmastadiene for detecting refined oil in virgin olive oil, in good agreement with the E 270 determination. The latter is a simple, quick, and low-cost method that can be easily implemented in laboratories to assist in the screening and regulation of olive oils sold in Brazil.
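For illustration, specific extinction normalises an absorbance reading to a 1% (1 g/100 mL) solution in a 1 cm cell, following the Beer-Lambert law. A minimal sketch with a hypothetical reading (the absorbance value and the quoted ~0.22 extra-virgin limit are illustrative assumptions, not figures from this study):

```python
def specific_extinction(absorbance, conc_g_per_100ml, path_cm=1.0):
    """Specific extinction (e.g. E 270): absorbance normalised to a
    1 g/100 mL solution in a 1 cm cell (Beer-Lambert law)."""
    return absorbance / (conc_g_per_100ml * path_cm)

# Hypothetical reading: A = 0.25 for a 1 g/100 mL solution in a 1 cm cell.
# A result of 0.25 would exceed the commonly cited ~0.22 limit for extra virgin oil.
print(specific_extinction(0.25, 1.0))  # 0.25
```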
Abstract:
Testing problems in diagnosing human T-lymphotropic virus (HTLV) infection, mostly HTLV-II, have been documented in HIV/AIDS patients. Since December 1998, the Immunology Department of Instituto Adolfo Lutz (IAL) has offered HTLV-I/II serology to Public Health Units that attend HTLV high-risk individuals. A total of 2,312 serum samples, 1,393 from AIDS Reference Centers (Group I) and 919 from HTLV outpatient clinics (Group II), were sent to IAL for HTLV-I/II antibody detection. The majority were screened by two enzyme immunoassays (EIAs) and confirmed by Western blot (WB 2.4, Genelabs). Seven different EIA kits were employed during the period and, according to the WB results, the best performance was obtained by EIAs containing HTLV-I and HTLV-II viral lysates and rgp21 as antigens. Neither first- and second- nor third-generation EIA kits were 100% sensitive in detecting truly HTLV-I/II-reactive samples. HTLV-I and HTLV-II prevalence rates of 3.3% and 2.5% were detected in Group I, and of 9.6% and 3.6% in Group II, respectively. High percentages of WB-indeterminate sera were detected in both groups. The testing algorithm to be employed in the HTLV high-risk population of São Paulo, Brazil, requires two EIA kits of different formats and antigen compositions for screening and, because of the high rate of WB-indeterminate results, may require an additional confirmatory assay.
Abstract:
We detected Toxoplasma gondii oocysts in the feces of experimentally infected cats using a Kato-Katz approach with subsequent Kinyoun staining. Animals serologically negative for T. gondii were infected orally with 5 × 10² mouse brain cysts of the ME49 strain. Feces were collected daily from the 3rd to the 30th day after challenge. Oocysts were detected by qualitative sugar flotation and by the quantitative modified Kato-Katz technique stained by Kinyoun (KKK). In the experimentally infected cats, oocysts were detected from the 7th to the 15th day by the sugar flotation technique, but from the 6th to the 16th day by KKK, which was thus sensitive over a longer period and provides permanent documentation. The peak of oocyst excretion occurred between the 8th and 11th days after challenge, before any positive serological result. KKK could be used for the screening and quantification of oocyst excretion in the feces of suspected animals, with reduced handling of infective material, decreasing the possibility of environmental and operator contamination.
Abstract:
Parasitic infection is highly prevalent throughout the developing countries of the world. Food handlers are a potential source of infection for many intestinal parasites and other enteropathogenic infections as well. The aim of this study was to determine the prevalence of intestinal parasite carriers among food handlers attending the public health center laboratory in Sari, Northern Iran for their annual check-up. The study was performed from August 2011 through February 2012. Stool samples were collected from 1041 male and female food handlers of different jobs, aged 18 to 63 years, and were examined following standard procedures. Sociodemographic, environmental and behavioral data on the food handlers were recorded in a separate questionnaire. Intestinal parasites were found in 161 (15.5%) of the studied samples. Seven species of protozoan or helminth infections were detected. Most of the infected participants carried Giardia lamblia (53.9%), followed by Blastocystis hominis (18%), Entamoeba coli (15.5%), Entamoeba histolytica/dispar (5.5%), Cryptosporidium sp. (3.1%), Iodamoeba butschlii (3.1%) and Hymenolepis nana (1.9%), the only helminth infection. The findings emphasize that food handlers carrying pathogenic organisms may expose consumers to significant health risks. Routine screening and treatment of food handlers is a proper tool for preventing food-borne infections.
Abstract:
Oxacillin-resistant Staphylococcus aureus (ORSA) infection is an important cause of hospital morbidity and mortality. The objective of this study was to identify the main factors associated with death in patients colonized or infected with Staphylococcus aureus in a cancer center. A matched-pair case-control study enrolled all patients infected or colonized with ORSA (cases) admitted to the Hospital do Câncer in Rio de Janeiro from 01/01/1992 to 12/31/1994. A control was defined as a patient hospitalized during the same period as the case-patients and colonized or infected with oxacillin-susceptible Staphylococcus aureus (OSSA). The study enrolled 95 cases and 95 controls. Patient distribution was similar for the two groups (p ≥ 0.05) with respect to gender, underlying diseases, hospital transfer, prior infection, age, temperature, heart and respiratory rates, neutrophil count, and duration of hospitalization. Univariate analysis of putative risk factors associated with mortality showed the following significant variables: admission to the intensive care unit (ICU), presence of bacteremia, use of a central venous catheter (CVC), ORSA colonization or infection, pneumonia, use of a urinary catheter, primary lung infection, prior use of antibiotics, mucositis, and absence of cutaneous abscesses. Multivariate analysis showed a strong association between mortality and the following independent variables: admission to the ICU (OR [odds ratio] = 7.2), presence of Staphylococcus bacteremia (OR = 6.8), presence of a CVC (OR = 5.3), and isolation of ORSA (OR = 2.7). The study suggests a higher virulence of ORSA compared to OSSA in cancer patients.
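For reference, odds ratios of the kind reported above derive from 2×2 cross-tabulations of exposure against outcome. A minimal sketch with hypothetical counts (not the study's data):

```python
def odds_ratio(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    """Cross-product odds ratio from a 2x2 exposure-by-outcome table."""
    return (exposed_cases * unexposed_controls) / (exposed_controls * unexposed_cases)

# Hypothetical counts: 40 deceased / 15 surviving exposed patients,
# 55 deceased / 80 surviving unexposed patients.
print(round(odds_ratio(40, 15, 55, 80), 2))  # 3.88
```

Note that the ORs quoted in the abstract come from a multivariate model, which adjusts each estimate for the other variables; the raw cross-product above corresponds to the univariate case.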
Abstract:
Colorectal cancer (CRC) represents the third most common malignancy throughout the world. Little or no improvement in survival has been achieved in the last 50 years. Extensive epidemiological and genetic data make it possible to identify well-defined risk groups more precisely, so that screening and early diagnosis can be accomplished more frequently. CRC is best detected by colonoscopy, which allows sampling for histologic diagnosis. Colonoscopy is the gold standard for the detection of small and premalignant lesions, although it is not cost-effective for screening the average-risk population. Colonoscopic polypectomy and mucosal resection constitute curative treatment for selected cases of invasive CRC. Similarly, alternative trans-colonoscopic treatment can be offered for adequate palliation, thus avoiding surgery.
Abstract:
Experiences with population-based chemotherapy and other methods for the control of schistosomiasis mansoni in two sub-Saharan foci are described. In the forest area of Maniema (Zaire), intense transmission of Schistosoma mansoni, high prevalences and intensities of infection, and important morbidity have been documented. Taking into account the limited financial means and the poor logistic conditions, the control strategy has been based mainly on targeted chemotherapy of heavily infected people (>600 epg). After ten years of intervention, prevalences and intensities have hardly been affected, but the initial severe hepatosplenic morbidity has almost disappeared. In Burundi, a national research and control programme was initiated in 1982. Prevalences, intensities and morbidity were moderate, and transmission was focal and erratic in time and space. A more structural control strategy was developed, based on screening and selective therapy, health education, sanitation and domestic water supply. Prevalences and intensities have been considerably reduced, though the results show focal and unpredictable variations. Transmission and reinfection were not significantly affected by chemotherapy alone, and the eventual outcome of repeated selective treatment appears to be limited by the sensitivity of the screening method. Intestinal morbidity was strongly reduced by community-based selective treatment, but hepatosplenic enlargement was hardly affected; this is possibly due to the confounding impact of increasing malaria morbidity. These experiences show the importance of local structures and conditions for the development of an adapted control strategy. It is further concluded that population-based chemotherapy is a highly valid tool for the rapid control of morbidity, but in most operational conditions should not be considered a tool for transmission control. Integration of planning, execution and surveillance in regular health services...
Abstract:
This paper reviews three different approaches to modelling the cost-effectiveness of schistosomiasis control. Although these approaches vary in their assessment of costs, the major focus of the paper is on the evaluation of effectiveness. The first model presented is a static economic model which assesses effectiveness in terms of the proportion of cases cured. This model is important in highlighting that the optimal choice of chemotherapy regime depends critically on the level of budget constraint, the unit costs of screening and treatment, the rates of compliance with screening and chemotherapy, and the prevalence of infection. The limitation of this approach is that it models the cost-effectiveness of only one cycle of treatment, so effectiveness reflects only the immediate impact of treatment. The second model presented is a prevalence-based dynamic model which links prevalence rates from one year to the next and assesses effectiveness as the proportion of cases prevented. This model was important in introducing the concept of measuring the long-term impact of control by using a transmission model which can assess the reduction in infection through time, but it is limited to assessing the impact only on the prevalence of infection. The third approach presented is a theoretical framework which describes the dynamic relationships between infection and morbidity, and which assesses effectiveness in terms of case-years of infection and morbidity prevented. The use of this model in assessing the cost-effectiveness of age-targeted treatment in controlling Schistosoma mansoni is explored in detail, with respect to varying frequencies of treatment and the interaction between drug price and drug efficacy.
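The static model's trade-off can be sketched numerically: cost per case cured under mass treatment versus screen-and-treat depends on prevalence, compliance and the unit costs of screening and treatment. All parameters below are hypothetical, and the sketch assumes a perfectly accurate screening test.

```python
def cost_per_cure_mass(n, cost_treat, prevalence, compliance, cure_rate):
    """Treat everyone who complies; only the infected fraction can be cured."""
    cost = n * compliance * cost_treat
    cured = n * compliance * prevalence * cure_rate
    return cost / cured

def cost_per_cure_screen(n, cost_screen, cost_treat, prevalence,
                         compliance_screen, compliance_treat, cure_rate):
    """Screen everyone who complies; treat compliant screen-positives."""
    screened = n * compliance_screen
    treated = screened * prevalence * compliance_treat  # assumes a perfect test
    cost = screened * cost_screen + treated * cost_treat
    return cost / (treated * cure_rate)

# With these hypothetical unit costs, screening is cheaper per cure at low
# prevalence and dearer at high prevalence:
for prev in (0.05, 0.5):
    print(prev,
          round(cost_per_cure_mass(1000, 2.0, prev, 0.8, 0.9), 2),
          round(cost_per_cure_screen(1000, 1.5, 2.0, prev, 0.8, 0.9, 0.9), 2))
```

This reproduces the model's central point: the optimal regime flips with prevalence and with the ratio of screening to treatment costs.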
Abstract:
Poultry meat and its derivatives are among the foodstuffs considered by environmental health authorities to present the highest risks to the public. A total of 185 samples were collected in five monthly batches from different processing stages in a sausage plant that uses mechanically deboned chicken meat (MDCM), and tested for the presence of Salmonella. Enrichment was carried out in both Kauffmann's tetrathionate broth and Rappaport-Vassiliadis broth, with isolation on Salmonella-Shigella agar and brilliant-green agar. Live Salmonella bacteria were isolated from six samples of the raw meat and from the emulsion in batches three, four, and five, but not from any sample in batches one or two. The six isolated strains were all classified as Salmonella Albany, which has not previously been reported in MDCM. Of the two enrichment broths, Rappaport-Vassiliadis gave the better results. The pattern of contamination suggests a probable common source, given that a new supplier was used in the third, fourth, and fifth months. It was also shown that industrial cooking was effective in preventing Salmonella from surviving in the final product.
Abstract:
Monitoring the extent of and trends in multidrug-resistant tuberculosis (MDR-TB) is a priority of the Brazilian National Tuberculosis Control Programme. The current study aimed to estimate the incidence of MDR-TB, describe the profile of TB drug resistance in risk groups and examine whether screening for MDR-TB adhered to the recommended guidelines. A descriptive study examining diagnosed cases of pulmonary TB was conducted in the city of Santos, Brazil, between 2000 and 2004. Of the 2,176 pulmonary TB cases studied, 671 (30.8%) met the criteria for drug sensitivity testing and, of these, 31.7% (213/671) were tested. Among the tested cases, 9.4% were resistant to one anti-TB drug and 15% were MDR. MDR was observed in 11.6% of 86 new TB cases and 17.3% of 127 previously treated cases. The average annual incidence of MDR-TB was 1.9 per 100,000 inhabitant-years. The extent of known MDR-TB in the city of Santos is high, though likely underestimated. Our study therefore indicates inadequate adherence to the guidelines for MDR-TB screening and suggests the need for alternative strategies of MDR-TB surveillance.
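The average annual incidence quoted above is a simple person-time rate. A sketch with hypothetical figures chosen only to reproduce the reported order of magnitude (the abstract does not give the actual case count or population denominator):

```python
def incidence_per_100k(cases, population, years):
    """Average annual incidence per 100,000 inhabitant-years."""
    return cases / (population * years) * 100_000

# Hypothetical: 40 MDR-TB cases in a city of ~420,000 over a 5-year period.
print(round(incidence_per_100k(40, 420_000, 5), 1))  # 1.9
```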
Abstract:
In 2009, the World Health Organization (WHO) issued a new guideline that stratifies dengue-affected patients into severe dengue (SD) and non-severe dengue (NSD), the latter with or without warning signs. To evaluate the new recommendations, we completed a retrospective cross-sectional study of the dengue haemorrhagic fever (DHF) cases reported during an outbreak in 2011 in northeastern Brazil. We investigated 84 suspected DHF patients, including 45 (53.6%) males and 39 (46.4%) females. The ages of the patients ranged from 5 to 83 years, with a median age of 29. According to the DHF/dengue shock syndrome classification, 53 (63.1%) patients were classified as having dengue fever and 31 (36.9%) as having DHF. According to the 2009 WHO classification, 32 (38.1%) patients were grouped as having NSD [4 (4.8%) without warning signs and 28 (33.3%) with warning signs] and 52 (61.9%) as having SD. The better performance of the revised classification in detecting severe clinical manifestations allows improved identification of patients with SD and may reduce deaths. The revised classification will not only facilitate effective screening and patient management, but will also enable the collection of standardised surveillance data for future epidemiological and clinical studies.
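The 2009 WHO stratification is, at its core, a rule-based triage. A deliberately simplified sketch of that decision structure (the real criteria are clinical and far more detailed: severe plasma leakage, severe bleeding, severe organ involvement):

```python
def classify_dengue_2009(warning_signs, severe_features):
    """Simplified sketch of the 2009 WHO dengue stratification.

    warning_signs and severe_features are lists of observed findings;
    the actual guideline defines these clinically, not as free-text lists.
    """
    if severe_features:
        return "severe dengue (SD)"
    if warning_signs:
        return "non-severe dengue with warning signs"
    return "non-severe dengue without warning signs"

print(classify_dengue_2009(warning_signs=["abdominal pain"], severe_features=[]))
# non-severe dengue with warning signs
```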
Abstract:
Studies on the impact of Eucalyptus spp. on Brazilian soils have focused on soil chemical properties and on isolating microbial organisms of interest. Few studies have addressed microbial diversity and ecology in Brazil, owing to the limited coverage of traditional cultivation and isolation methods. Molecular microbial ecology methods based on PCR-amplified 16S rDNA have enriched knowledge of soil microbial biodiversity. The objective of this work was to compare and estimate the bacterial diversity of sympatric communities in soils from two areas, a native forest area (NFA) and a Eucalyptus arboretum area (EAA). PCR primers targeting 16S rDNA were used to amplify soil metagenomic DNA; the amplicons were cloned into pGEM-T and sequenced to determine bacterial diversity. From the NFA soil, 134 clones were analyzed, while 116 clones were analyzed from the EAA soil samples. The sequences were compared with those in GenBank. Phylogenetic analyses revealed differences between the soil types and high diversity in both communities. Soil from the Eucalyptus spp. arboretum was found to have a greater bacterial diversity than the soil investigated from the native forest area.
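Clone-library diversity comparisons of this kind are commonly summarised with the Shannon index. A sketch using hypothetical per-taxon counts (the abstract gives only the library sizes, 116 and 134 clones; the breakdowns below are invented to illustrate why an evenly spread library scores as more diverse):

```python
import math

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over taxon proportions."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical taxon counts for two clone libraries (not the study's data):
eaa = [30, 25, 20, 15, 10, 8, 5, 3]  # 116 clones, relatively even
nfa = [80, 30, 12, 6, 4, 2]          # 134 clones, dominated by one taxon
print(shannon_index(eaa) > shannon_index(nfa))  # True: evenness raises H'
```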