985 results for CRITICAL MICELLE CONCENTRATION
Abstract:
INTRODUCTION: Zoonotic kala-azar, a lethal disease caused by protozoa of the genus Leishmania, is considered out of control in parts of the world, particularly in Brazil, where transmission has spread to cities throughout most of the territory and mortality shows an increasing trend. Although it is a highly debatable measure, the Brazilian government regularly culls seropositive dogs to control the disease. Since control is failing, a critical analysis of the actions focused on the canine reservoir was conducted. METHODS: In a review of the literature, a historical perspective is presented, focusing mainly on comparisons between the successful Chinese and Soviet strategies and the Brazilian approach. In addition, analyses of the principal studies regarding the role of dogs as a risk factor to humans and of the main intervention studies regarding the efficacy of the dog killing strategy were undertaken. The Brazilian political reaction to a recently published systematic review, which concluded that the dog culling program lacked efficiency, and its effect on public policy were also reviewed. RESULTS: No firm evidence of the risk conferred by the presence of dogs to humans was verified; on the contrary, a lack of scientific support for the policy of killing dogs was confirmed. A bias towards distorting scientific data so as to maintain the policy of culling animals was observed. CONCLUSIONS: Since there is no evidence that dog culling diminishes visceral leishmaniasis transmission, it should be abandoned as a control measure. Ethical considerations have been raised regarding the distortion of scientific results and the killing of animals despite minimal or absent scientific evidence.
Abstract:
The “CMS Safety Closing Sensors System” (SCSS, or CSS for brevity) is a remote monitoring system designed to control safety clearance and tight mechanical movements of parts of the CMS detector, especially during the CMS assembly phases. We present the different systems that make up the SCSS: its sensor technologies, the readout system, and the data acquisition and control software. We also report on calibration and installation details, which determine the resolution and limits of the system. We further present our experience from the operation of the system and the analysis of the data collected since 2008. Special emphasis is given to studying positioning reproducibility during detector assembly and to understanding how the magnetic field influences the detector structure.
Abstract:
Double Degree. A Work Project, presented as part of the requirements for the Award of a Master’s Degree in Finance from NOVA – School of Business and Economics and a Master’s Degree in Management from Louvain School of Management.
Abstract:
SUMMARY: The aim of this PhD Thesis was to study the value of C-Reactive Protein (CRP) as a marker of infection and sepsis. By definition, a marker of infection is not present if the patient is not infected, should appear concomitantly with or ideally precede the onset of infection, should disappear with the institution of adequate antimicrobial therapy, and should remain elevated if the infection is refractory to treatment. From a biological standpoint, CRP is the prototype of the acute phase proteins, showing a marked elevation of its serum concentration in response to various inflammatory stimuli, in particular bacterial infections. Its serum concentration depends only on the intensity of the stimulus and on the rate of hepatic synthesis, and it is not influenced by any factor or treatment unless that treatment acts directly on the triggering stimulus, which makes it a marker of infection with great potential. In this Thesis, CRP was compared with classical markers of infection, temperature and leukocyte count, in several clinical situations, analysing patients with documented infections and control patients without infection. Overall, the results of the studies in this Thesis show that CRP is a good marker of infection according to the definition presented above. Together with the remaining clinical and laboratory assessment, daily CRP monitoring in patients without infection proved useful as a sentinel of infection, that is, it shows low values in patients without infection and rises early in patients who develop an infection. In patients with documented infection, CRP proved to be a good marker of therapeutic response and clinical course, decreasing in those who improved and remaining elevated in those with a poor prognosis, as well as identifying different evolution profiles. In summary, daily CRP monitoring was useful throughout the entire Intensive Care Unit stay, both in the presence and in the absence of infection. Accordingly, daily CRP monitoring may allow a more rational and judicious use of antimicrobial therapy, thereby contributing to reduced toxicity and antibiotic pressure, a lower risk of emergence of resistance and, finally, lower costs. Since patients admitted to Intensive Care Units present the same diseases as the other patients admitted to the hospital, differing only in their greater severity, one may extrapolate that CRP is also potentially a good marker of infection in these patients. ----------------ABSTRACT: The aim of this PhD Thesis was to assess the value of C-Reactive Protein (CRP) as a marker of infection and sepsis. A marker of infection should be absent in a non-infected patient, should increase alongside or ideally precede the development of an infection, and finally should reflect the therapeutic response, that is to say, decrease or even disappear with adequate antimicrobial therapy or, on the contrary, remain elevated if the infection is refractory to the prescribed treatment. The biology of CRP makes it the prototype of acute phase proteins, with marked and sharp elevations of its serum concentration in response to several inflammatory stimuli, in particular bacterial infections. Moreover, the CRP level depends only on the intensity of the stimulus and the rate of hepatic synthesis. Its concentration is not modified by any therapy or intervention.
Only those interventions affecting the inflammatory process responsible for the acute phase reaction can change the CRP level. These properties make CRP a potentially good marker of infection. In this Thesis, the value of CRP was studied in comparison with traditional markers of infection, such as temperature and white cell count, in different clinical situations, analysing patients with documented infections and a control group without infection. The aggregated results of the analyses presented in this Thesis illustrate that CRP could be used as a marker of infection. In conjunction with other clinical and laboratory manifestations of sepsis, daily CRP measurement in patients without infection was useful in the prediction of infection, as its concentration remains low in patients without infection, whereas if an infection appears its levels rise markedly. In addition, in patients with documented infections CRP was useful as a marker of therapeutic response and follow-up, with marked decreases in patients with a good outcome and persistently elevated levels in those with a poor prognosis, as well as in the recognition of different patterns of evolution. In summary, daily CRP measurement was helpful in critically ill patients throughout the entire Intensive Care Unit stay, both in the presence and in the absence of infection. As a result, daily CRP measurement can ensure a better and more rational use of antibiotics and consequently contribute to a decrease in antibiotic toxicity and demand, reducing the risk of emergence of resistant strains as well as costs. Given that patients admitted to an Intensive Care Unit present the same clinical diagnoses as those admitted to the wards, only with higher severity, one can speculate that CRP is also a potentially good marker of infection in these patients.
Abstract:
American cutaneous leishmaniasis (ACL) is a complex disease with clinical and epidemiological features that may vary from region to region. In fact, at least seven different Leishmania species, including Leishmania (Viannia) braziliensis, Leishmania (Viannia) guyanensis, Leishmania (Viannia) lainsoni, Leishmania (Viannia) naiffi, Leishmania (Viannia) shawi, Leishmania (Viannia) lindenbergi, and Leishmania (Leishmania) amazonensis, have been implicated in the etiology of ACL in Brazil, and numerous phlebotomine sandfly species of the genus Lutzomyia have been regarded as putative or proven vectors. Because ACL is a focal disease, understanding the disease dynamics at the local level is essential for the implementation of more effective control measures. The present paper is a narrative review of ACL epidemiology in Pernambuco, northeastern Brazil. Furthermore, the need for more effective diagnosis, treatment, control and prevention strategies for the affected populations is highlighted. This paper will provide researchers with a critical appraisal of ACL in Pernambuco. Hopefully, it will also be helpful for public health authorities to improve current control strategies against ACL at the state and country levels.
Abstract:
The hegemonic definition of Modernism has been subjected to an intense critical revision process that began several decades ago. This process has contributed to the significant broadening of the modernist canon by challenging its primal essentialist assumptions and formalist interpretations in the fields of both the visual arts and architecture. This conference aims to further expand this revision, as it seeks to discuss the notion of “Southern Modernisms” by considering the hypothesis that regional appropriations, both in Southern Europe and the Southern hemisphere, entailed important critical stances that have remained unseen or poorly explored by art and architectural historians. In association with the Southern Modernisms research project (FCT – EXPL/CPC-HAT/0191/2013), we want to consider the entrenchment of southern modernisms in popular culture (folk art and vernacular architecture) as anticipating some of the premises of what would later become known as critical regionalism. It is therefore our purpose to explore a research path that runs parallel to key claims on modernism’s intertwinement with bourgeois society and mass culture, by questioning the idea that an aesthetically significant regionalism – one that resists the colonization of international styles and is supported by critical awareness – occurred only in the field of architecture and can only be represented as a postmodernist turn. (...)
Abstract:
INTRODUCTION: Exposure to subinhibitory concentrations (SICs) of antimicrobials may alter the bacterial transcriptome. METHODS: Here, we evaluated the expression of nine virulence-related genes in vancomycin-resistant enterococci (VRE) urinary tract infection isolates grown at SICs of vancomycin. RESULTS: Subinhibitory concentrations of vancomycin interfere with gene modulation but do not affect the phenotype of a VRE strain in vitro. CONCLUSIONS: Subinhibitory concentrations of vancomycin may regulate the expression of virulence factors in vivo or contribute to the selection of vancomycin-resistant strains.
Abstract:
The interest in chromium (Cr) arises from the widespread use of this heavy metal in various industrial processes that cause its release as liquid, solid and gaseous waste into the environment. The impact of Cr on the environment and living organisms primarily depends on its chemical form, since Cr(III) is an essential micronutrient for humans, other animals and plants, whereas Cr(VI) is highly toxic and a known human carcinogen. This study aimed to evaluate whether the electrodialytic (ED) process is an appropriate treatment for Cr removal, through a critical overview of Cr speciation before and after the ED experiments, to assess possible Cr(III)-Cr(VI) interconversions during the treatment. ED was applied to two types of matrices containing Cr: chromated copper arsenate (CCA) contaminated soil and municipal solid waste incineration (MSWI) fly ash. To study Cr remediation, three ED set-ups were used: a new set-up, the combined cell (2/3C or 3/2C), with three compartments, the current alternating between two anodes and different initial experimental conditions; a set-up with three compartments (3C cell); and a set-up with two compartments (2C cell). The Cr removal rates obtained in this study were 10-36% for the soil and 1-13% for the fly ash. The highest removal rates were achieved in the 26-day experiments: 36% for the soil and 13% for the fly ash. Among the 13-day experiments, the highest removal rates were attained with the 2/3C set-up: 24% for the soil and 5% for the fly ash. Cr(VI) was analysed before and after the ED experiments to evaluate possible changes in Cr speciation during the treatment. This analysis was conducted by two methods: USEPA Method 3060A for the extraction of Cr(VI), and Hach Company Method 8023 for the detection of Cr(VI). Despite the differences in total Cr concentration, both matrices presented a similar speciation, with Cr(III) being the main species found and Cr(VI) accounting for less than 3% of total Cr, both before and after the treatment. For the fly ash, Cr(VI) was initially below the detection limit of the method and remained so after the treatment. For the soil, Cr(VI) decreased after the treatment. Oxidation of Cr(III) to Cr(VI) did not occur during the ED process, since there was no increase in Cr(VI) in the matrices after the treatment. Hence, the results of this study indicate that ED is an appropriate technique to remediate matrices containing Cr, because it contributes to Cr removal without causing Cr(III)-Cr(VI) interconversions.
Abstract:
Fluid management and dosage regimens of drugs in preterm infants should be based on the glomerular filtration rate. The current methods to determine glomerular filtration rate are invasive, time-consuming, and expensive. In contrast, creatinine clearance can be easily obtained and quickly determined. The purpose of this study was to compare plasma creatinine on the third and seventh day of life in preterm newborn infants, to evaluate the influence of maternal creatinine, and to demonstrate that creatinine clearance can be used as a reliable indicator of glomerular filtration rate. We conducted a prospective study (1994) including 40 preterm newborns (gestational age < 37 weeks, average 34 weeks; average birth weight 1840 g) in the first week of life. Inclusion criteria consisted of: absence of renal and urinary tract anomalies; O2 saturation ≥ 92%; adequate urine output (>1 ml/kg/hr); normal blood pressure; absence of infections; and no sympathomimetic amines in use. A blood sample was collected to determine plasma creatinine (enzymatic method) on the third and seventh day of life, and creatinine clearance (CrCl) was obtained using the equation CrCl = k × length (cm) / plasma creatinine (mg/dl), with k = 0.33 in preterm infants. All plasma creatinine determinations showed normal values [third day: 0.78 mg/dl ± 0.24 (mean ± SD) and seventh day: 0.67 mg/dl ± 0.31; p>0.05]. All creatinine clearance values on the third and seventh day of life were also normal [third day: 19.5 ml/min ± 5.2 (mean ± SD) and seventh day: 23.8 ml/min ± 7.3; p>0.05]. All preterm infants developed adequate renal function for their respective gestational age. In summary, our results indicate that, for clinical practice, creatinine clearance estimated from newborn length can be used to estimate glomerular filtration rate in preterm newborn infants.
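For illustration, a minimal Python sketch of the length-based estimate described in the abstract above, assuming the standard Schwartz-type formula CrCl = k × length / plasma creatinine with k = 0.33 for preterm infants; the function name and the example values are hypothetical and not taken from the study.

```python
def estimate_crcl(length_cm: float, plasma_creatinine_mg_dl: float, k: float = 0.33) -> float:
    """Length-based creatinine clearance estimate (Schwartz-type formula).

    CrCl (ml/min per 1.73 m^2) = k * length (cm) / plasma creatinine (mg/dl),
    where k = 0.33 is the constant assumed here for preterm infants.
    """
    return k * length_cm / plasma_creatinine_mg_dl


# Hypothetical example: a 42 cm preterm infant with plasma creatinine of 0.78 mg/dl
print(round(estimate_crcl(42.0, 0.78), 1))  # -> 17.8
```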
Abstract:
Staphylococcus aureus is an important opportunistic pathogen that can cause a wide variety of diseases, from mild to life-threatening conditions. S. aureus can colonize many parts of the human body, but the anterior nares are its primary ecological niche. Its clinical importance is due to its ability to resist almost all classes of available antibiotics, together with its large number of virulence factors. MRSA (Methicillin-Resistant S. aureus) strains are particularly important in hospital settings, being the major cause of nosocomial infections worldwide. MRSA resistance to β-lactam antibiotics involves the acquisition of the exogenous mecA gene, part of the SCCmec cassette. Fast and reliable diagnostic techniques are needed to reduce the mortality and morbidity associated with MRSA infections through the early identification of MRSA strains. Current identification techniques are time-consuming, as they usually involve culturing steps and can take up to five days to determine the antibiotic resistance profile. Several amplification-based techniques have been developed to accelerate the diagnosis. The aim of this project was to develop an even faster methodology that bypasses the DNA amplification step. Gold nanoprobes were developed and used to detect the presence of the mecA gene, associated with resistance traits, in the S. aureus genome, in colorimetric assays based on the non-crosslinking method. Our results showed that the mecA and mecA_V2 gold nanoprobes were sensitive enough to discriminate the presence of the mecA gene in PCR products and genomic DNA (gDNA) samples for target concentrations of 10 ng/μL and 20 ng/μL, respectively. As our main objective was to avoid the amplification step, we concluded that the best strategy for the early identification of MRSA infection relies on colorimetric assays based on the non-crosslinking method with gDNA samples, which can be extracted directly from blood samples.
Abstract:
Evaluation of Cyclosporin A (CyA) blood concentration is imperative in solid organ transplantation in order to achieve maximal immunosuppression with the least side effects. We compared whole blood concentrations of CyA in 50 blood samples evaluated simultaneously by the fluorescent polarization immunoassay (TDx) and the enzymatic competitive immunoassay (EMIT 2000). There was a strong correlation between both kits over the entire range of CyA blood concentrations (R=0.99, p<0.001). The within-run and between-day coefficients of variation were less than 4% for both assays. The cost of each CyA measurement was 50% lower for the EMIT assay than for the TDx assay. We concluded that the EMIT is as accurate as the TDx in measuring CyA blood concentration and has the advantage of a lower cost, as well as the possibility of widespread access to the EMIT methodology in contrast to the TDx equipment, allowing the laboratory to perform several routines within a working day.
Abstract:
PURPOSE: Enteral alimentation is the preferred modality of support in critically ill patients who have acceptable digestive function and are unable to eat orally, but the advantages of continuous versus intermittent administration are surrounded by controversy. With the purpose of identifying the benefits and complications of each technique, a prospective controlled study with matched subjects was conducted. PATIENTS AND METHODS: Twenty-eight consecutive candidates for enteral feeding were divided into 2 groups (n = 14 each) that were matched for diagnosis and APACHE II score. A commercial immune-stimulating polymeric diet was administered via nasogastric tube by electronic pump at 25 kcal/kg/day, either as a 1-hour bolus every 3 hours (Group I), or continuously for 24 hours (Group II), over a 3-day period. Anthropometrics, biochemical measurements, recording of administered drugs and other therapies, thorax X-ray, measurement of abdominal circumference, monitoring of gastric residue, and clinical and nutritional assessments were performed at least once daily. The principal measured outcomes of this protocol were frequency of abdominal distention and pulmonary aspiration, and efficacy in supplying the desired amount of nutrients. RESULTS: Nearly half of the total population (46.4%) exhibited high gastric residues on at least 1 occasion, but only 1 confirmed episode of pulmonary aspiration occurred (3.6%). Both groups displayed a moderate number of complications, without differences. Food input during the first day was greater in Group II (approximately 20% difference), but by the third day, both groups displayed similarly small deficits in total furnished volume of about 10%, when compared with the prescribed diet. CONCLUSIONS: Both administration modalities permitted practical and effective administration of the diet, with frequently registered abnormalities but few clinically significant problems. The two groups were similar in this regard, without statistical differences, probably because of meticulous technique, careful monitoring, strict patient matching, and conservative amounts of diet employed in both situations. Further studies with additional populations, diagnostic groups, and dietetic prescriptions should be performed in order to elucidate the differences between these commonly used feeding modalities.
Abstract:
Liver transplantation is now the standard treatment for end-stage liver disease. Given the shortage of liver donors and the progressively higher number of patients waiting for transplantation, improvements in patient selection and optimization of timing for transplantation are needed. Several solutions have been suggested, including increasing the donor pool; a fair allocation policy that does not permit variables such as age, gender, race, or third-party payer status to play any role; and knowledge of the natural history of each liver disease for which transplantation is offered. To observe ethical rules and distributive justice (guaranteeing every citizen the same opportunity to get an organ), the "sickest first" policy must be used. Studies have demonstrated that death on the waiting list is related not to waiting time, but rather to the severity of liver disease at the time of inclusion. Thus, waiting time is no longer part of the United Network for Organ Sharing distribution criteria. Waiting time only differentiates between equally severely diseased patients. The authors have analyzed the waiting list mortality and 1-year survival for patients of the State of São Paulo, from July 1997 through January 2001. Only the chronological criterion was used. According to "Secretaria de Estado da Saúde de São Paulo" data, among all waiting list deaths, 82.2% occurred within the first year, and 37.6% within the first 3 months following inclusion. The allocation of livers based on waiting time is neither fair nor ethical, impairs distributive justice and human rights, and does not occur in any other part of the world.
Abstract:
Worldwide, the impact of meningococcal disease is substantial, and the potential for the introduction and spread of more virulent strains of N. meningitidis, or of strains with increased resistance to current antibiotics, causes concern, making prevention essential. OBJECTIVES: To review the indications for meningococcal disease vaccines, considering the epidemiological status in Brazil. METHODS: A critical literature review on this issue using the Medline and Lilacs databases. RESULTS: In Brazil, MenB and MenC were the most important serogroups identified in the 1990s. The polysaccharide vaccines available against those serogroups can offer only limited protection for infants, the group at highest risk for meningococcal disease. Additionally, polysaccharide vaccines may induce a hypo-responsive state to MenC. New meningococcal C conjugate vaccines could partially solve these problems, but it is unlikely that a vaccine against MenB capable of providing good protection against the multiple MenB strains responsible for endemic and epidemic disease will become available in the next few years. CONCLUSIONS: In order to make the best decision about recommendations on immunization practices, better quality surveillance data are required. In Brazil, MenC was responsible for about 2,000 cases per year during the last 10 years. New conjugate vaccines against MenC are very effective and immunogenic, and they should be recommended, especially for children less than 5 years old. Polysaccharide vaccines should be indicated only in epidemic situations and for high-risk groups. Until new vaccines against MenC and MenB are available for routine immunization programs, the most important measure for controlling meningococcal disease is early diagnosis of these infections in order to treat patients and to offer chemoprophylaxis to contacts.