977 results for Laboratory methods.
Abstract:
An investigation of behavioural patterns that form a basis for termite control in the Australasian region was undertaken using laboratory colonies of the subterranean termite Reticulitermes santonensis (Feytaud). The study attempted to build a picture of the behavioural elements of individuals in a colony and, on this basis, trophallaxis, aggression and cannibalism were investigated in detail. Preliminary study of food transmission showed that 'workers' played a major role in the distribution of food. It was found that, among the factors responsible for release of trophallactic behaviour, the presence of the 'right odour' between participants was important. It also appeared that the role taken by individuals depended on whether they were hungry or fully fed. Antennal palpation was shown by donors and acceptors alike and seemed to be excitatory in function. Introduction of aliens into nests elicited aggression, and these aliens were often killed. Factors eliciting aggression were investigated and colony odour was found to be important. Further investigation revealed that the development of colony odour was governed by genetic and environmental mechanisms. Termite response to injury and death was also governed by odour. In the case of injury, either the fresh haemolymph from the wound or some component of the haemolymph evoked cannibalism. Necrophagic behaviour was found to be released by fatty acids present in the corpses. Finally, the response of colonies to nestmates carrying arsenic trioxide was investigated. It was found that living and freshly dead arsenic-carrying nestmates were treated like normal nestmates, resulting in high initial mortality. However, poisoned cadavers soon became repellent and were buried, thus preventing further spread of the poison to the rest of the colony.
This suggested that arsenic trioxide is unlikely to achieve complete control of subterranean termites, especially in species capable of developing secondary reproductives from survivors and thus rebuilding the community.
Abstract:
Lipid peroxidation products such as malondialdehyde, 4-hydroxynonenal and F(2)-isoprostanes are widely used as markers of oxidative stress in vitro and in vivo. This study reports the results of a multi-laboratory validation study by COST Action B35 to assess inter-laboratory and intra-laboratory variation in the measurement of lipid peroxidation. Human plasma samples were exposed to UVA irradiation at different doses (0, 15 and 20 J), encoded and shipped to 15 laboratories, where analyses of malondialdehyde, 4-hydroxynonenal and isoprostanes were conducted. The results demonstrate low within-day variation and a good correlation of results observed on two different days. However, high coefficients of variation were observed between the laboratories. Malondialdehyde determined by HPLC was found to be the most sensitive and reproducible lipid peroxidation product in plasma upon UVA treatment. It is concluded that measurement of malondialdehyde by HPLC has good analytical validity for inter-laboratory studies on lipid peroxidation in human EDTA-plasma samples, although it is acknowledged that this may not translate to biological validity.
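The inter-laboratory spread discussed above is quantified by the coefficient of variation, i.e. the standard deviation expressed as a percentage of the mean. A minimal sketch in Python; the malondialdehyde readings below are hypothetical, not values from the study:

```python
import statistics

def coefficient_of_variation(values):
    """CV in percent: sample standard deviation relative to the mean."""
    return statistics.stdev(values) / statistics.mean(values) * 100

# Hypothetical malondialdehyde readings for one encoded plasma sample,
# as reported back by five different laboratories (arbitrary units).
lab_results = [0.82, 1.10, 0.95, 1.40, 0.77]
cv = coefficient_of_variation(lab_results)
print(f"inter-laboratory CV: {cv:.1f}%")
```

A high CV across laboratories combined with a low within-day CV inside each laboratory is exactly the pattern the study describes.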
Abstract:
Major food adulteration and contamination events occur with alarming regularity and are known to be episodic, the question being not if but when another large-scale food safety/integrity incident will occur. Indeed, the challenges of maintaining food security are now internationally recognised. The ever-increasing scale and complexity of food supply networks can leave them significantly more vulnerable to fraud and contamination, and potentially dysfunctional. This can make the task of deciding which analytical methods are most suitable to collect and analyse (bio)chemical data within complex food supply chains, at targeted points of vulnerability, that much more challenging. It is evident that those working within and associated with the food industry are seeking rapid, user-friendly methods to detect food fraud and contamination, and rapid/high-throughput screening methods for the analysis of food in general. In addition to being robust and reproducible, these methods should be portable, ideally handheld and/or remote sensor devices, that can be taken to or positioned on/at-line at points of vulnerability along complex food supply networks, and should require a minimum amount of background training to acquire information-rich data rapidly (ergo, point-and-shoot). Here we briefly discuss a range of spectrometry- and spectroscopy-based approaches, many of which are commercially available, as well as other methods currently under development. We offer a future perspective on how this range of detection methods in the growing sensor portfolio, along with developments in computational and information sciences such as predictive computing and the Internet of Things, will together form systems- and technology-based approaches that significantly reduce the areas of vulnerability to food crime within food supply chains. Food fraud is, after all, a problem of systems, and therefore requires systems-level solutions and thinking.
Abstract:
Technical evaluation of analytical data is of great relevance, considering that such data can be used for comparisons with environmental quality standards and for decision-making related to the management of disposal of dredged sediments and the evaluation of salt and brackish water quality in accordance with the CONAMA 357/05 Resolution. It is therefore essential that the project manager discuss the environmental agency's technical requirements with the contracted laboratory, both for the follow-up of the analysis underway and with a view to possible re-analysis when anomalous data are identified. The main technical requirements are: (1) method quantitation limits (QLs) should fall below environmental standards; (2) analyses should be carried out in laboratories whose analytical scope is accredited by the National Institute of Metrology (INMETRO) or qualified or accepted by a licensing agency; (3) chain of custody should be provided in order to ensure sample traceability; (4) control charts should be provided to prove method performance; (5) certified reference material analysis or, if that is not available, matrix spike analysis should be undertaken; and (6) chromatograms should be included in the analytical report. Within this context, and with a view to helping environmental managers in analytical report evaluation, the objectives of this work are to discuss the limitations of the application of SW-846 US EPA methods to marine samples and the consequences of reporting data based on method detection limits (MDL) rather than sample quantitation limits (SQL), and to present possible modifications of the principal methods applied by laboratories in order to comply with environmental quality standards.
Abstract:
On-line leak detection is a major concern for the safe operation of pipelines. Acoustic and mass-balance methods are the most important and most extensively applied technologies in field problems. The objective of this work is to compare these leak detection methods with respect to a given reference situation, i.e., the same pipeline and monitoring signals acquired at the inlet and outlet ends. Experimental tests were conducted in a 749 m long laboratory pipeline transporting water as the working fluid. The instrumentation included pressure transducers and electromagnetic flowmeters. Leaks were simulated by opening solenoid valves placed at known positions and previously calibrated to produce known average leak flow rates. The results clearly show the limitations and advantages of each method. It is also quite clear that the acoustic and mass-balance technologies are, in fact, complementary. In general, an acoustic leak detection system sends out an alarm more rapidly and locates the leak more precisely, provided that the rupture of the pipeline occurs abruptly enough. On the other hand, a mass-balance leak detection method is capable of quantifying the leak flow rate very accurately and of detecting progressive leaks.
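The mass-balance principle behind the second method can be sketched in a few lines: a leak is declared when the average inlet/outlet flow imbalance over a monitoring window exceeds a noise threshold, and the average imbalance itself estimates the leak flow rate. The function, threshold and flowmeter readings below are hypothetical, not taken from the experiments:

```python
def mass_balance_leak(inlet_flow, outlet_flow, threshold):
    """Flag a leak from paired inlet/outlet flow measurements.

    Returns (leak_detected, estimated_leak_rate): a leak is declared when
    the mean inlet-minus-outlet imbalance over the monitoring window
    exceeds a noise threshold, and that mean imbalance serves as the
    estimate of the leak flow rate (all values in the same flow units).
    """
    imbalances = [qi - qo for qi, qo in zip(inlet_flow, outlet_flow)]
    mean_imbalance = sum(imbalances) / len(imbalances)
    return mean_imbalance > threshold, mean_imbalance

# Hypothetical flowmeter readings (m3/h) over one monitoring window.
inlet = [100.2, 100.1, 100.3, 100.2, 100.2]
outlet = [99.8, 99.7, 99.9, 99.7, 99.8]
detected, rate = mass_balance_leak(inlet, outlet, threshold=0.2)
```

The averaging over a window is what makes the approach good at quantifying slow, progressive leaks but slower to alarm than an acoustic system, matching the trade-off described above.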
Abstract:
This paper deals with the use of simplified methods to predict methane generation in tropical landfills. Methane recovery data obtained on site as part of a research program being carried out at the Metropolitan Landfill, Salvador, Brazil, are analyzed and used to obtain field methane generation over time. Laboratory data from MSW samples of different ages are presented and discussed, and simplified procedures to estimate the methane generation potential, L(o), and the biodegradation rate constant, k, are applied. The first-order decay method is used to fit field and laboratory results. It is demonstrated that, despite the assumptions and the simplicity of the adopted laboratory procedures, the values of L(o) and k obtained are very close to those measured in the field, thus making this kind of analysis very attractive for first-approach purposes. (C) 2008 Elsevier Ltd. All rights reserved.
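The first-order decay model referred to above expresses cumulative methane generation per unit mass of waste as G(t) = L(o)(1 - e^(-kt)), with generation rate dG/dt = L(o)·k·e^(-kt). A minimal sketch, using illustrative values of L(o) and k rather than the paper's measured ones:

```python
import math

def methane_generated(L0, k, t):
    """Cumulative methane per unit mass of waste after t years under
    first-order decay: G(t) = L0 * (1 - exp(-k * t))."""
    return L0 * (1.0 - math.exp(-k * t))

def methane_rate(L0, k, t):
    """Instantaneous generation rate dG/dt = L0 * k * exp(-k * t)."""
    return L0 * k * math.exp(-k * t)

# Illustrative parameters (not the paper's fitted values):
# L0 in m3 CH4 per tonne of MSW, k in 1/year.
L0, k = 70.0, 0.2
print(methane_generated(L0, k, 5))  # methane produced in the first 5 years
```

Fitting the model to field or laboratory gas data then amounts to choosing L0 and k so that these curves match the measured recovery over time.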
Abstract:
A pool of five synthetic peptides was used as an antigenic base in an ELISA (ELISA-Pp) for laboratory diagnosis of Schistosoma mansoni infection. Serum samples were obtained from individuals with acute (n = 23) and chronic (n = 30) schistosomiasis, with other parasitoses (n = 39) or without parasitic infections (n = 100). ELISA-Pp was compared with other immunoenzymatic methods for detection of IgM (IgM-ELISA) or IgG (IgG-ELISA) as well as an immunofluorescence test for detection of IgM antibodies (IgM-IFT). The sensitivity and specificity of ELISA-Pp were 86.8% and 94.2% when tested on the schistosomiasis group and the non-schistosomiasis group, respectively. Comparison of ELISA-Pp with the other serological methods resulted in kappa concordance indices varying from 0.59 to 0.75. Evaluation of anti-peptide IgG antibodies showed higher levels in patients with acute compared with chronic schistosomiasis (P = 0.001). ELISA-Pp showed satisfactory sensitivity and high specificity and may constitute a potentially useful method for laboratory diagnosis of schistosomiasis mansoni. (c) 2007 Royal Society of Tropical Medicine and Hygiene. Published by Elsevier Ltd. All rights reserved.
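The sensitivity, specificity and kappa concordance figures above follow from simple contingency-table arithmetic. A sketch, assuming the 53 schistosomiasis patients split into 46 true positives and 7 false negatives (which reproduces the reported 86.8% sensitivity); the kappa example counts are entirely hypothetical:

```python
def sensitivity(true_pos, false_neg):
    """Fraction of confirmed cases that the test detects."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    """Fraction of non-cases that the test correctly rules out."""
    return true_neg / (true_neg + false_pos)

def cohen_kappa(both_pos, a_only, b_only, both_neg):
    """Chance-corrected concordance between two binary tests A and B."""
    n = both_pos + a_only + b_only + both_neg
    observed = (both_pos + both_neg) / n
    expected = (((both_pos + a_only) / n) * ((both_pos + b_only) / n)
                + ((b_only + both_neg) / n) * ((a_only + both_neg) / n))
    return (observed - expected) / (1 - expected)

# 46 of 53 schistosomiasis patients testing positive gives the reported
# 86.8% sensitivity; the kappa counts below are hypothetical.
print(round(sensitivity(46, 7), 3))
print(cohen_kappa(40, 10, 10, 40))
```

Kappa values of 0.59-0.75, as reported, correspond to moderate-to-substantial agreement between ELISA-Pp and the comparison methods.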
Abstract:
Computational models complement laboratory experimentation for efficient identification of MHC-binding peptides and T-cell epitopes. Methods for prediction of MHC-binding peptides include binding motifs, quantitative matrices, artificial neural networks, hidden Markov models, and molecular modelling. Models derived by these methods have been used successfully for prediction of T-cell epitopes in cancer, autoimmunity, infectious disease, and allergy. For maximum benefit, computational modelling must be treated as an experiment analogous to standard laboratory procedures and performed according to strict standards. This requires careful selection of data for model building, and adequate testing and validation. A range of web-based databases and MHC-binding prediction programs are available. Although some available prediction programs for particular MHC alleles have reasonable accuracy, there is no guarantee that all models produce good-quality predictions. In this article, we present and discuss a framework for modelling, testing, and applications of computational methods used in predictions of T-cell epitopes. (C) 2004 Elsevier Inc. All rights reserved.
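Of the prediction methods listed, the quantitative matrix is the simplest to illustrate: a peptide's binding score is the sum of per-position amino-acid weights, and a threshold on that score classifies predicted binders. The matrix values and threshold below are purely illustrative, not derived from any real MHC allele:

```python
# Toy quantitative matrix for a 3-residue motif: each position maps an
# amino acid to a binding weight (illustrative values only).
MATRIX = [
    {"L": 1.2, "V": 0.8, "A": 0.1},
    {"Y": 0.9, "F": 0.7, "A": 0.0},
    {"K": 1.1, "R": 0.6, "A": 0.2},
]

def score_peptide(peptide, matrix, default=-1.0):
    """Sum per-position weights; residues absent from a column are penalized."""
    return sum(column.get(aa, default) for column, aa in zip(matrix, peptide))

def predict_binder(peptide, matrix, threshold=1.5):
    """Classify a peptide as a predicted binder by thresholding its score."""
    return score_peptide(peptide, matrix) > threshold

print(score_peptide("LYK", MATRIX), predict_binder("LYK", MATRIX))
```

Real matrices cover all 20 amino acids over 8-11 positions and are trained on measured binding data; the testing and validation standards argued for in the abstract apply to exactly this training step.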
Abstract:
This special issue represents a further exploration of some issues raised at a symposium entitled “Functional magnetic resonance imaging: From methods to madness”, presented during the 15th annual Theoretical and Experimental Neuropsychology (TENNET XV) meeting in Montreal, Canada, in June 2004. The special issue’s theme is methods and learning in functional magnetic resonance imaging (fMRI), and it comprises six articles (three reviews and three empirical studies). The first (Amaro and Barker) provides a beginner’s guide to fMRI and the BOLD effect (perhaps an alternative title might have been “fMRI for dummies”). While fMRI is now commonplace, there are still researchers who have yet to employ it as an experimental method and need some basic questions answered before they venture into new territory. This article should serve them well. A key issue of interest at the symposium was how fMRI could be used to elucidate the cerebral mechanisms responsible for new learning. The next four articles address this directly, the first (Little and Thulborn) being an overview of data from fMRI studies of category learning, and the second, from the same laboratory (Little, Shin, Siscol, and Thulborn), an empirical investigation of changes in brain activity occurring across different stages of learning. While a role for medial temporal lobe (MTL) structures in episodic memory encoding has been acknowledged for some time, the different experimental tasks and stimuli employed across neuroimaging studies have, not surprisingly, produced conflicting data in terms of the precise subregion(s) involved. The next paper (Parsons, Haut, Lemieux, Moran, and Leach) addresses this by examining effects of stimulus modality during verbal memory encoding.
Typically, BOLD fMRI studies of learning are conducted over short time scales; however, the fourth paper in this series (Olson, Rao, Moore, Wang, Detre, and Aguirre) describes an empirical investigation of learning occurring over a longer than usual period, achieved by employing a relatively novel technique called perfusion fMRI. This technique shows considerable promise for future studies. The final article in this special issue (de Zubicaray) represents a departure from the more familiar cognitive neuroscience applications of fMRI, describing instead how neuroimaging studies might be conducted to both inform and constrain information-processing models of cognition.
Abstract:
Background: Tuberculous meningitis (TBM) is a growing problem in HIV-infected patients in developing countries, where data about this co-infection are scarce. Our objectives were to analyze the main features and outcomes of HIV-infected patients with TBM. Methods: This was a retrospective study of HIV-infected Brazilian patients admitted consecutively for TBM. All patients had Mycobacterium tuberculosis isolated from the cerebrospinal fluid (CSF). Clinical and laboratory features at presentation were studied. Multivariate analysis was used to identify variables associated with death during hospitalization and at 9 months after diagnosis. Survival was estimated using the Kaplan-Meier method. Results: We included 108 cases (median age 36 years, 72% male). Only 15% had fever, headache, and meningeal signs simultaneously. Forty-eight percent had extrameningeal tuberculosis. The median CD4+ cell count was 65 cells/µl. Among 90 cases, 7% had primary resistance to isoniazid and 9% presented multidrug-resistant strains. The overall mortality was 29% during hospitalization and 41% at 9 months. Tachycardia and prior highly active antiretroviral therapy (HAART) were associated with 9-month mortality. The 9-month survival rate was 22% (95% confidence interval 12-43%). Conclusions: Clinical and laboratory manifestations were nonspecific. Disseminated tuberculosis and severe immunosuppression were common. Mortality was high and the 9-month survival rate was low. Tachycardia and prior HAART were associated with death within 9 months of diagnosis. (C) 2009 International Society for Infectious Diseases. Published by Elsevier Ltd. All rights reserved.
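Survival estimation "using the Kaplan-Meier method" refers to the product-limit estimator: at each observed death, the running survival probability is multiplied by (n - d)/n, where n is the number still at risk, while censored patients leave the risk set without changing the estimate. A minimal sketch with hypothetical follow-up data (not the study's patients), assuming distinct event times:

```python
def kaplan_meier(follow_up):
    """Product-limit survival estimate.

    follow_up: list of (time, died) pairs, one per patient; died is True
    for a death and False for a censored observation. Assumes distinct
    event times. Returns (time, survival_probability) steps.
    """
    at_risk = len(follow_up)
    survival = 1.0
    curve = []
    for time, died in sorted(follow_up):
        if died:
            survival *= (at_risk - 1) / at_risk
            curve.append((time, survival))
        at_risk -= 1
    return curve

# Hypothetical follow-up in months (True = death, False = lost/censored).
print(kaplan_meier([(1, True), (2, False), (3, True), (5, True), (9, False)]))
```

Handling censoring this way, rather than dropping lost-to-follow-up patients, is what makes the estimator suitable for cohorts like the one above, where follow-up is incomplete.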
Abstract:
A laboratory-scale sequencing batch reactor (SBR) operating for enhanced biological phosphorus removal (EBPR) and fed with a mixture of volatile fatty acids (VFAs) showed stable and efficient EBPR capacity over a four-year period. Phosphorus (P), poly-beta-hydroxyalkanoate (PHA) and glycogen cycling consistent with classical anaerobic/aerobic EBPR were demonstrated, with the order of anaerobic VFA uptake being propionate, acetate, then butyrate. The SBR was operated without pH control and 63.67 ± 13.86 mg P l(-1) was released anaerobically. The P% of the sludge fluctuated between 6% and 10% over the operating period (average of 8.04 ± 1.31%). Four main morphological types of floc-forming bacteria were observed in the sludge during one year of intensive microscopic observation. Two of them were mainly responsible for anaerobic/aerobic P and PHA transformations. Fluorescence in situ hybridization (FISH) and post-FISH chemical staining for intracellular polyphosphate and PHA were used to determine that 'Candidatus Accumulibacter phosphatis' was the most abundant polyphosphate-accumulating organism (PAO), forming large clusters of coccobacilli (1.0-1.5 µm) and comprising 53% of the sludge bacteria. By the same methods, large coccobacillus-shaped gammaproteobacteria (2.5-3.5 µm) from a recently described novel cluster were identified as glycogen-accumulating organisms (GAOs) comprising 13% of the bacteria. Tetrad-forming organisms (TFOs) consistent with the 'G bacterium' morphotype were alphaproteobacteria, but not Amaricoccus spp., and comprised 25% of all bacteria. According to chemical staining, TFOs were occasionally able to store PHA anaerobically and utilize it aerobically.