938 results for Serologic tests and antigen
Abstract:
Steel slag is a byproduct of iron and steel production by the metallurgical industries. Annually, 21 million tons of steel slag are produced in the United States. Most of the slag is landfilled, which represents a significant economic loss and a waste of valuable land space. Steel slag has great potential for the construction of highway embankments; however, its use has been limited by its high swelling potential and alkalinity. The swelling potential of steel slags may lead to deterioration of the structural stability of highways, and high alkalinity poses an environmental challenge because it affects the leaching behavior of trace metals. This study seeks a methodology that promotes the use of steel slag in highway embankments by minimizing these two main disadvantages. Accelerated swelling tests were conducted to evaluate the swelling behavior of pure steel slag and of steel slag treated with water treatment residual (WTR), an alum-rich by-product of drinking water treatment plants. Sequential batch tests and column leach tests, as well as two different numerical analyses, UMDSurf and WiscLEACH, were carried out to check the environmental suitability of the methods. Tests were conducted to study the effect of a common borrow fill material that encapsulated the slag in the embankment and the effects of two subgrade soils on the chemical properties of slag leachate. The results indicated that an increase in WTR content in the steel slag-WTR mixtures yields a decrease in pH and in most of the leached metal concentrations, except aluminum. The change in pH after passing through the encapsulation and subgrade depends on the natural pH of the materials.
Abstract:
African Americans are disproportionately affected by colorectal cancer (CRC) incidence and mortality. CRC early detection leads to better treatment outcomes and, depending on the screening test, can prevent the development of CRC. African Americans, however, are screened less often than Whites. Aspects of decision making (e.g., decisional conflict, decision self-efficacy) can impact decision making outcomes and may be influenced by social determinants of health, including health literacy. However, the relationship between social determinants of health and indicators of decision making in this population is not fully understood. Additionally, individuals have a choice between different CRC screening tests, and an individual's desire to use a particular screening test may be associated with social determinants of health such as health literacy. This study aimed to examine the relationship between social determinants of health and indicators of decision making for CRC screening among African Americans. A total of 111 participants completed a baseline and 14-month follow-up survey assessing decisional conflict, decision self-efficacy, decisional preference (shared versus informed decision making), and CRC test preference. Health literacy was negatively associated with decisional conflict and positively associated with decision self-efficacy (ps < .05). Individuals who were unemployed or working part-time had significantly greater decisional conflict than individuals working full-time (ps < .05). Individuals with a first-degree family history of CRC had significantly lower decision self-efficacy than individuals without a family history (p < .05). Women were significantly more likely than men to prefer making a shared decision rather than an informed decision (p < .05). Lastly, previous CRC screening behavior was significantly associated with CRC test preference (e.g., individuals previously screened using colonoscopy were significantly more likely to prefer colonoscopy for their next screening test; ps < .05). These findings begin to identify social determinants of health (e.g., health literacy, employment) that are related to indicators of decision making for CRC screening among African Americans. Furthermore, these findings suggest that further research is needed to better understand these relationships and to inform the future development and improvement of interventions targeting decision making outcomes for CRC screening in this population.
Abstract:
We present a case of a gravida 1 para 1 woman who presented with an 11-month history of amenorrhea after cesarean delivery. The patient was taking birth control pills at the time of presentation. She was observed to have a slight elevation of serum β-hCG level, an enlarged heterogeneous uterus, and hematometra. A biopsy was performed, and the patient was diagnosed with placental site trophoblastic tumor; the patient then underwent surgery. Placental site trophoblastic tumor is the rarest form of gestational trophoblastic disease, derived from intermediate trophoblast cells. It does not have a pathognomonic appearance; therefore, correlation with medical history, as well as with results of laboratory tests and pathological analysis, is mandatory. It is a relatively chemoresistant tumor, posing considerable therapeutic challenges; patients with localized disease are managed with surgery, and those with metastatic disease require additional chemotherapy. Herein, we review the main features of this entity and its top differential diagnoses, as the rarity of this tumor is associated with imaging and pathological pitfalls, reinforcing the need for further experience in this field.
Abstract:
Shiga toxin-producing Escherichia coli (STEC) and enteropathogenic E. coli (EPEC) strains may be responsible for food-borne infections in humans. Twenty-eight STEC and 75 EPEC strains previously isolated from French shellfish-harvesting areas and their watersheds and belonging to 68 distinguishable serotypes were characterized in this study. High-throughput real-time PCR was used to search for the presence of 75 E. coli virulence-associated gene targets, and genes encoding Shiga toxin (stx) and intimin (eae) were subtyped using PCR tests and DNA sequencing, respectively. The results showed a high level of diversity between strains, with 17 unique virulence gene profiles for STEC and 56 for EPEC. Seven STEC and 15 EPEC strains were found to display a large number or a particular combination of genetic markers of virulence and the presence of stx and/or eae variants, suggesting their potential pathogenicity for humans. Among these, an O26:H11 stx1a eae-β1 strain was associated with a large number of virulence-associated genes (n = 47), including genes carried on the locus of enterocyte effacement (LEE) or other pathogenicity islands, such as OI-122, OI-71, OI-43/48, OI-50, OI-57, and the high-pathogenicity island (HPI). One O91:H21 STEC strain containing 4 stx variants (stx1a, stx2a, stx2c, and stx2d) was found to possess genes associated with pathogenicity islands OI-122, OI-43/48, and OI-15. Among EPEC strains harboring a large number of virulence genes (n = 34 to 50), eight belonged to serotype O26:H11, O103:H2, O103:H25, O145:H28, O157:H7, or O153:H2.
Abstract:
A method has been developed for the direct determination of Cu, Cd, Ni and Pb in aquatic humic substances (AHS) by graphite furnace atomic absorption spectrometry. AHS were isolated from water samples rich in organic matter, collected in Brazilian ecological parks. All analytical curves presented good linear correlation coefficients. The limits of detection and quantification were in the ranges 2.5-16.7 μg g-1 and 8.5-50.0 μg g-1, respectively. The accuracy was determined using recovery tests, and for all analytes the recovery percentages ranged from 93% to 98%, with a relative standard deviation of less than 4%. The results indicated that the proposed method is a suitable alternative for the direct determination of metals in AHS.
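As an aside, the recovery percentages and relative standard deviation quoted above follow the usual spike-recovery definitions; the Python sketch below shows how such figures are typically computed, using entirely hypothetical replicate values rather than data from the paper.

# Illustrative only: common spike-recovery calculation, not taken from the paper.
import statistics

def recovery_percent(unspiked, spiked, added):
    """Recovery (%) = (analyte found in spiked sample - found in unspiked sample) / amount added * 100."""
    return (spiked - unspiked) / added * 100.0

# Hypothetical replicate measurements (same concentration basis) for one analyte
replicates = [recovery_percent(u, s, a) for u, s, a in
              [(1.0, 10.6, 10.0), (1.0, 10.4, 10.0), (1.0, 10.5, 10.0)]]
mean_rec = statistics.mean(replicates)
rsd = statistics.stdev(replicates) / mean_rec * 100.0  # relative standard deviation, %
print(f"mean recovery = {mean_rec:.1f} %, RSD = {rsd:.1f} %")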
Abstract:
Construction and demolition waste can contain considerable amounts of polyvinyl chloride (PVC). This paper describes a study of the recycling of PVC pipes collected from such waste materials. In a sorting facility for the specific disposal of construction and demolition waste, PVC was found to represent one-third of the plastics separated by workers. Pipes were sorted carefully to preclude any possible contamination by poly(ethylene terephthalate) (PET) found in the waste. The material was ground into two distinct particle sizes (final mesh of 12.7 and 8 mm), washed, dried and recycled. The average formulation of the pipes was determined based on ash content tests and used in the fabrication of a similar compound made mainly of virgin PVC. Samples of recycled pipes and of compound based on virgin material were subjected to tensile and impact tests and provided very similar results. These results are a good indication of the application potential of the recycled material and of the fact that longer grinding to obtain finer particles is not necessarily beneficial.
Abstract:
The health of people living with HIV and AIDS (PLWHA) is nutritionally challenged in many nations of the world. The scourge has reduced socio-economic progress globally, and more so in sub-Saharan Africa (SSA), where its impact has been compounded by poverty and food insecurity. Good nutrition with proper drug use improves the quality of life of those infected, but it is not known how PLWHA exposed to chronic malnutrition and food shortages in developing nations adjust their nutrition when using antiretroviral drugs (ARVs). This study assessed nutritional status, dietary practices, dietary management of common illnesses that hinder daily food intake by the patients, and use of ARVs with the food recommendations provided by the health care givers. A descriptive case study design was used to sample 120 HIV-infected patients using a systematic sampling procedure. These patients sought health care from an urban slum clinic, the Kibera AMREF clinic. Data were collected by anthropometric measurements, biochemical analysis, a semi-structured questionnaire, and secondary data. The Statistical Package for the Social Sciences (SPSS) and the Nutri-Survey software packages were used to analyze the data. Dietary intakes of micronutrients were inadequate for >70% of the patients when compared to the recommended daily requirements. When Body Mass Index (BMI) was used, only 6.7% of the respondents were underweight (BMI < 18.5 kg/m2) and 9.2% were overweight (BMI > 25 kg/m2), whereas serum albumin test results (mean 3.34 ± 0.06 g/dl) showed that 60.8% of the respondents were protein deficient, which was confirmed by low dietary protein intakes. BMI was not related to dietary nutrient intakes, serum albumin, or CD4 cell counts (p > 0.05). There was no significant difference in BMI across categories of CD4 cell count (p > 0.05), suggesting that the level of immunity did not affect weight gain with ARVs as observed in many studies from developed countries. Malnutrition was therefore evident among 60.8% of the cases, as identified by serum albumin tests, and food intake was not adequate (68%) for the patients, as they ate once a day due to lack of food. National food and nutrition policy should incorporate food-security-boosting guidelines for poor people infected with HIV and using ARVs.
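For reference, the underweight and overweight cut-offs quoted above (BMI < 18.5 kg/m2 and BMI > 25 kg/m2) follow the standard BMI calculation; the short Python sketch below illustrates that classification. The intermediate "normal" band and the example values are assumptions for illustration, not figures from the study.

def bmi(weight_kg, height_m):
    """Body Mass Index = weight (kg) / height (m) squared."""
    return weight_kg / height_m ** 2

def classify(b):
    # Thresholds as quoted in the abstract; treating 18.5-25 as "normal" is an assumption.
    if b < 18.5:
        return "underweight"
    if b > 25:
        return "overweight"
    return "normal"

# Hypothetical example: 50 kg at 1.70 m gives BMI of about 17.3 -> underweight
print(classify(bmi(50, 1.70)))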
Abstract:
Chronic Chagas disease diagnosis relies on laboratory tests due to its clinical characteristics. The aim of this research was to review the performance of commercial enzyme-linked immunosorbent assay (ELISA) and polymerase chain reaction (PCR) diagnostic tests. Studies on the performance of commercial ELISA or PCR for the diagnosis of chronic Chagas disease were systematically searched in PubMed, Scopus, Embase, ISI Web, and LILACS, through the bibliographies from 1980-2014, and by contact with the manufacturers. The risk of bias was assessed with QUADAS-2. Heterogeneity was estimated with the I2 statistic. Accuracies provided by the manufacturers usually overestimate the accuracy reported by academia. The risk of bias is high in most tests and in most QUADAS dimensions. Heterogeneity is high in sensitivity, specificity, or both. The evidence regarding commercial ELISA and ELISA-rec sensitivity and specificity indicates that there is overestimation. The current recommendation to use two simultaneous serological tests can be supported by the risk of bias analysis and the amount of heterogeneity, but not by the observed accuracies. The usefulness of PCR tests is debatable, and health care providers should not order them on a routine basis. PCR may be used in selected cases due to its potential to detect seronegative subjects.
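The heterogeneity mentioned above is estimated with the I2 statistic; the minimal Python sketch below shows the standard Higgins I2 computed from Cochran's Q, using hypothetical numbers rather than values from the review.

def i_squared(Q, df):
    """Higgins I^2 (%) from Cochran's Q and its degrees of freedom (number of studies - 1)."""
    if Q <= 0:
        return 0.0
    return max(0.0, (Q - df) / Q) * 100.0

# Hypothetical example: Q = 40 over 9 degrees of freedom (10 studies)
print(f"I^2 = {i_squared(40.0, 9):.1f} %")  # 77.5 %, conventionally regarded as high heterogeneity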
Abstract:
Background: Clinical features of Clostridium difficile infection (CDI) cases diagnosed by polymerase chain reaction (PCR) with negative toxin enzyme immunoassay (EIA) results have not been fully elucidated. The purpose of this study was to determine the proportion of CDI patients who had negative EIA toxin determinations but positive PCR tests, and their differences in clinical presentation. Methods: We performed a retrospective study comparing the clinical features of CDI cases detected by EIA (toxins A + B) with cases detected by PCR (toxin negative, PCR positive) over a 16-month period. Only patients with an initial Clostridium difficile infection episode that fulfilled a standardized definition were included. Results: During the study period, 107 episodes of CDI were detected. Seventy-four patients (69%) had positive glutamate dehydrogenase (GDH) antigen and EIA determinations (EIA positive patients). Thirty-three patients (31%) had a positive GDH result, a negative toxin EIA, and a positive PCR determination (PCR positive patients). PCR positive patients were younger, 57 (27) years (mean [SD]), than EIA positive patients, 71 (16) years (p < 0.001). Fewer PCR positive patients were receiving proton pump inhibitors (21 patients, 64%) than EIA positive patients (61 patients, 82%; p = 0.034). The clinical presentation was similar in both groups. In the multivariate analysis, lower age was identified as the only independent variable associated with PCR positive patients. Conclusions: One third of Clostridium difficile infection patients present negative toxin EIA and positive PCR tests. Performing PCR determination after a negative EIA test is more relevant in younger patients.
Abstract:
In this Clinical Practice Guideline we discuss the diagnostic and therapeutic approach to adult patients with constipation and abdominal complaints at the confluence of the irritable bowel syndrome spectrum and functional constipation. Both conditions are included among the functional bowel disorders and have a significant personal, healthcare, and social impact, affecting the quality of life of the patients who suffer from them. The first is the irritable bowel syndrome subtype in which constipation is the predominant complaint, in association with recurrent abdominal pain, bloating, and abdominal distension. Functional constipation is characterized by difficult or infrequent bowel movements, often accompanied by straining during defecation or a feeling of incomplete evacuation. Most cases have no underlying medical cause and are therefore considered a functional bowel disorder. There are many clinical and pathophysiological similarities between the two disorders, and both respond similarly to commonly used drugs, their primary difference being the presence or absence of pain, albeit not in an "all or nothing" manner. Severity depends not only upon bowel symptom intensity but also upon other biopsychosocial factors (association of gastrointestinal and extraintestinal symptoms, degree of involvement, and perception and behavior variants). Functional bowel disorders are diagnosed using the Rome criteria. This Clinical Practice Guideline has been made consistent with the Rome IV criteria, published in late May 2016, and discusses alarm criteria, diagnostic tests, and referral criteria between Primary Care and gastroenterology settings. Furthermore, all the available treatment options (exercise, fluid ingestion, diet with soluble fiber-rich foods, fiber supplementation, other dietary components, osmotic or stimulant laxatives, probiotics, antibiotics, spasmolytics, peppermint oil, prucalopride, linaclotide, lubiprostone, biofeedback, antidepressants, psychological therapy, acupuncture, enemas, sacral root neurostimulation, surgery) are discussed, and practical recommendations are made regarding each of them.
Abstract:
We assessed the genetic structure of populations of the widely distributed sea cucumber Holothuria (Holothuria) mammata Grube, 1840, and investigated the effects of marine barriers to gene flow and historical processes. Several potential genetic breaks were considered, which would separate the Atlantic and Mediterranean basins, the isolated Macaronesian Islands from the other locations analysed, and the Western Mediterranean from the Aegean Sea (Eastern Mediterranean). We analysed mitochondrial 16S and COI gene sequences from 177 individuals from four Atlantic locations and four Mediterranean locations. Haplotype diversity was high (H = 0.9307 for 16S and 0.9203 for COI), and the haplotypes were closely related (p = 0.0058 for 16S and 0.0071 for COI). The lowest genetic diversities were found in the Aegean Sea population. Our results showed that the COI gene was more variable and more useful for the detection of population structure than the 16S gene. The distribution of mtDNA haplotypes, the pairwise FST values, and the results of exact tests and AMOVA revealed: (i) a significant genetic break between the population in the Aegean Sea and those in the other locations, as supported by both mitochondrial genes, and (ii) weak differentiation of the Canary and Azores Islands from the other populations; however, the populations from the Macaronesian Islands, Algarve and Western Mediterranean could be considered a panmictic metapopulation. Isolation by distance was not identified in H. (H.) mammata. Historical events behind the observed findings, together with the current oceanographic patterns, were proposed and discussed as the main factors that determine the population structure and genetic signature of H. (H.) mammata.
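The haplotype diversity values quoted above (H = 0.9307 and 0.9203) are conventionally obtained with Nei's haplotype (gene) diversity formula; the Python sketch below illustrates that calculation on a hypothetical sample of haplotype labels, not on the study's sequences.

from collections import Counter

def haplotype_diversity(haplotypes):
    """Nei's haplotype diversity: H = n/(n-1) * (1 - sum(p_i^2)),
    where p_i are haplotype frequencies in a sample of n sequences."""
    n = len(haplotypes)
    freqs = [count / n for count in Counter(haplotypes).values()]
    return n / (n - 1) * (1 - sum(p * p for p in freqs))

# Hypothetical sample of haplotype labels from one locality
sample = ["h1", "h2", "h2", "h3", "h4", "h5", "h1", "h6"]
print(f"H = {haplotype_diversity(sample):.4f}")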
Abstract:
Natural stone has been a popular and reliable building material throughout history, appearing in many historic monuments and in more recent buildings. Research into the intrinsic properties of specific stones is important because it gives us a greater understanding of the factors that limit and act on them. This can help prevent serious problems from occurring in our buildings, bringing both esthetic benefits and financial savings. To this end, the main objective of this research has been to study the influence of the fabric and the mineral composition of two types of sandstone on their durability. The first is a red continental sandstone of Buntsandstein age called “Molinaza Roja”, which is quarried in Montoro (Cordoba). The second is quarried in Ronda (Malaga) and is sold under the trade name “Arenisca Ronda”. It is a light pink-whitish calcarenite deposited during the Late Tortonian to Late Messinian. We characterized their petrological and petrophysical properties by studying their rock fabrics, porous systems, and mechanical properties. To obtain a complete picture of the behavior of their rock fabrics, we also carried out two decay tests, the salt crystallization and freeze–thaw tests. We then measured the effects on the textures of the altered samples during and after the decay tests and evaluated the changes in the porous system. By comparing the results for intact and altered samples, we found that Arenisca Ronda is less durable because it has a high quantity of expandable clays (smectites) and a high percentage of pores in the 0.1–1 μm range, in which the pressure produced by salt crystallization is strongest. In Molinaza Roja the decay agents caused significant sanding due to loss of cohesion between the clasts, especially during the salt crystallization test. In both stones, the anisotropies (oriented textures) play an important role in their hydric and dynamic behavior and also affect their mechanical properties (especially compressive strength). No changes in color were detected.
Abstract:
Purpose: To assess the effects of oral glutamate intake on the acute motor effects of ethanol and on chronic ethanol intake in rodents. Methods: The acute effects of ethanol on motor function were studied in ICR mice by giving 2 or 6 g/kg of ethanol 2 h after distilled water or 2.5 g/kg glutamate per os. Thirty minutes after ethanol treatment, behavioral assays, including rotarod tests and footprint analysis, were performed. For chronic ethanol treatment, male Wistar rats were trained to consume an ethanol-sucrose solution during a 2-h period daily, starting with 2% ethanol/10% sucrose and gradually increasing to 10% ethanol/5% sucrose over 56 days. After the training sessions, the drug treatment phase was carried out for 10 days. The animals were force-fed 50 mg/kg/day topiramate or 2.5 g/kg/day glutamate 2 h before the ethanol treatment sessions. Each day, ethanol intake, water intake, food intake, and body weight were recorded. Results: Mice that received 2 or 6 g/kg of ethanol orally showed a significant reduction in time on the rod in the rotarod test and a significant increase in both forelimb and hindlimb stride lengths compared to controls. Oral treatment with 2.5 g/kg of glutamate reversed the acute motor effects of ethanol. In the chronic ethanol treatment, the intake of 10% ethanol/5% sucrose, accessible for 2 h, was significantly decreased in rats treated with either topiramate or glutamate. Conclusion: These results provide evidence that oral glutamate administration helps reduce the acute motor effects of ethanol in mice and ethanol intake in chronic ethanol-drinking rats.