941 results for Method of extraction
Abstract:
Exchangeable Al has been used as a criterion for the calculation of lime requirement in several Brazilian states. However, the laboratory method based on extraction with a 1 mol L-1 KCl solution followed by indirect alkaline titration is not accurate for some Brazilian soils, particularly soils with a high organic matter content. The objective of this study was therefore to evaluate the H+/Al3+ stoichiometry in KCl soil extracts. The results suggested that organically complexed Al is the main contributor to exchangeable acidity in soils enriched with organic matter. Liming recommendations for organic soils based exclusively on exchangeable Al determined by the NaOH titration method should therefore be revised.
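For context, the indirect titration on which these lime recommendations rest assumes a fixed stoichiometry between titratable acidity and exchanged Al3+. A minimal sketch of that assumption, written out from standard soil chemistry rather than taken from the abstract itself:

```latex
% Assumed hydrolysis underlying indirect alkaline titration:
% each mole of exchanged Al^{3+} consumes three moles of OH^-.
\[
\mathrm{Al^{3+} + 3\,OH^{-} \longrightarrow Al(OH)_{3}}
\]
% Exchangeable acidity is therefore read as
\[
\text{acidity} = c(\mathrm{H^{+}}) + 3\,c(\mathrm{Al^{3+}})
\]
% If organically complexed Al also consumes OH^-, this 3:1 ratio
% overestimates the truly exchangeable Al.
```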
Abstract:
In addition to the more reactive forms, metals can occur in the structure of minerals, and the sum of all these forms defines their total contents in different soil fractions. Isomorphic substitution by heavy metals, for example, alters the unit cell dimensions and the mineral size. This study proposed a method of chemical fractionation of heavy metals that uses more powerful extraction methods to remove the organic and the different mineral phases completely. Soil samples were taken from eight soil profiles (0-10, 10-20 and 20-40 cm) in a Pb mining and metallurgy area in Adrianópolis, Paraná, Brazil. The Pb and Zn concentrations were determined in the following fractions (with complete phase removal in each sequential extraction): exchangeable; carbonates; organic matter; amorphous and crystalline Fe oxides; Al oxide, amorphous aluminosilicates and kaolinite; and residual fractions. The complete removal of organic matter and mineral phases in the sequential extractions resulted in a low contribution of residual forms of Pb and Zn to the total concentrations of these metals in the soils: there was less association of the metals with primary and 2:1 minerals and refractory oxides. The powerful methods used here allow identification of the complete set of metal-mineral associations, such as the occurrence of Pb and Zn in the structure of the minerals. The higher incidence of Zn than of Pb in the structure of Fe oxides, due to isomorphic substitution, was attributed to the smaller difference between the ionic radii of Zn2+ and Fe3+.
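The ionic-radius argument can be made concrete with Shannon effective ionic radii (VI coordination; values from standard crystallographic tables, not from the abstract itself):

```latex
% Shannon effective ionic radii, VI coordination (standard tables):
%   r(Fe^{3+}, high spin) ~ 0.645 Å,  r(Zn^{2+}) ~ 0.74 Å,  r(Pb^{2+}) ~ 1.19 Å
\[
\Delta r_{\mathrm{Zn/Fe}} \approx 0.74 - 0.645 \approx 0.10~\text{\AA}
\qquad
\Delta r_{\mathrm{Pb/Fe}} \approx 1.19 - 0.645 \approx 0.55~\text{\AA}
\]
% The far smaller mismatch for Zn^{2+} makes its isomorphic
% substitution into Fe oxides much easier than that of Pb^{2+}.
```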
Abstract:
The physical disector is a method of choice for estimating unbiased neuron numbers; nevertheless, calibration is needed to evaluate each counting method. The validity of this method can be assessed by comparing the estimated cell number with the true number determined by direct counting in serial sections. We reconstructed one fifth of rat lumbar dorsal root ganglia taken from two experimental conditions. From each ganglion, images of 200 adjacent semi-thin sections were used to reconstruct a volumetric dataset (stack of voxels). On these stacks, the number of sensory neurons was estimated by the physical disector method and counted directly. In addition, using the coordinates of the nuclei from the direct counting, we simulated, with a Matlab program, disector pairs separated by increasing distances in a ganglion model. The comparison between the results of these approaches clearly demonstrates that the physical disector method provides a valid and reliable estimate of the number of sensory neurons only when the distance between consecutive disector pairs is 60 µm or smaller. Under these conditions, the error between the physical disector results and the direct counts does not exceed 6%. In contrast, when the distance between two pairs is larger than 60 µm (70-200 µm), the error increases rapidly, up to 27%. We conclude that the physical disector method provides a reliable estimate of the number of rat sensory neurons only when the separating distance between consecutive disector pairs is no larger than 60 µm.
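Why estimates degrade as disector pairs are spaced farther apart can be illustrated with a toy sampling model. The Python sketch below is a deliberate simplification (uniform slab sampling of invented nucleus coordinates), not the authors' Matlab program; all dimensions and counts are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ganglion: 10,000 nuclei along a 2,000 µm axis (invented numbers).
true_n = 10_000
z = rng.uniform(0.0, 2000.0, true_n)   # nucleus z-coordinates, µm
thickness = 2.0                        # counting slab thickness, µm (assumed)

def disector_estimate(spacing, length=2000.0):
    """Estimate the total count from slabs placed every `spacing` µm."""
    start = rng.uniform(0.0, spacing)  # random offset keeps sampling unbiased
    edges = np.arange(start, length, spacing)
    counted = sum(((z >= e) & (z < e + thickness)).sum() for e in edges)
    return counted * spacing / thickness   # scale by inverse sampling fraction

# Error of a single estimate grows with spacing (fewer slabs, more variance).
for spacing in (20, 60, 100, 200):
    errs = [abs(disector_estimate(spacing) - true_n) / true_n * 100
            for _ in range(200)]
    print(f"spacing {spacing:>3} µm: mean |error| {np.mean(errs):4.1f}%")
```

In this simplified model the growth in error comes purely from sampling variance; in real ganglia, clustering of neurons would amplify the effect.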
Abstract:
Although the molecular typing of Pseudomonas aeruginosa is important for understanding the local epidemiology of this opportunistic pathogen, it remains challenging. Our aim was to develop a simple typing method based on the sequencing of two highly variable loci. Single-strand sequencing of three highly variable loci (ms172, ms217, and oprD) was performed on a collection of 282 isolates recovered between 1994 and 2007 (from patients and the environment). As expected, the resolution of each locus alone [number of types (NT) = 35-64; index of discrimination (ID) = 0.816-0.964] was lower than that of a combination of two loci (NT = 78-97; ID = 0.966-0.971). As each pairwise combination of loci gave similar results, we selected the most robust combination, ms172 [reverse; R] and ms217 [R], to constitute the double-locus sequence typing (DLST) scheme for P. aeruginosa. This combination gave: (i) a complete genotype for 276/282 isolates (typability of 98%), (ii) 86 different types, and (iii) an ID of 0.968. Analysis of multiple isolates from the same patients or taps showed that DLST genotypes are generally stable over a period of several months. The high typability, discriminatory power, and ease of use of the proposed DLST scheme make it a method of choice for local epidemiological analyses of P. aeruginosa. Moreover, the possibility of defining types unambiguously allowed the development of an Internet database ( http://www.dlst.org ) accessible to all.
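The index of discrimination quoted above is conventionally the Hunter-Gaston discriminatory index, i.e., the probability that two isolates sampled at random belong to different types. A minimal sketch, assuming that formula (the genotype labels are invented):

```python
from collections import Counter

def hunter_gaston_di(types):
    """Probability that two randomly drawn isolates have different types."""
    n = len(types)
    counts = Counter(types).values()
    return 1.0 - sum(c * (c - 1) for c in counts) / (n * (n - 1))

# Toy usage with made-up DLST genotypes:
genotypes = ["A1-B3", "A1-B3", "A2-B7", "A4-B1", "A2-B7", "A5-B2"]
print(round(hunter_gaston_di(genotypes), 3))   # 0.867
```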
Abstract:
Because of the large variability in the pharmacokinetics of anti-HIV drugs, therapeutic drug monitoring in patients may contribute to optimizing the overall efficacy and safety of antiretroviral therapy. An LC-MS/MS method for the simultaneous assay in plasma of the novel antiretroviral agents rilpivirine (RPV) and elvitegravir (EVG) was developed to that end. Plasma samples (100 μL) are extracted by protein precipitation with acetonitrile, and the supernatant is subsequently diluted 1:1 with 20-mM ammonium acetate/MeOH 50:50. After reverse-phase chromatography, quantification of RPV and EVG, using matrix-matched calibration samples, is performed by electrospray ionization-triple quadrupole mass spectrometry with selected reaction monitoring detection in positive mode. The stable isotope-labeled compounds RPV-13C6 and EVG-D6 were used as internal standards. The method was validated according to FDA recommendations, including assessment of extraction yield, matrix effect variability (<6.4%), and the short- and long-term stability of EVG and RPV in plasma. Calibration curves were validated over the clinically relevant concentration ranges of 5 to 2500 ng/mL for RPV and 50 to 5000 ng/mL for EVG. The method is precise (inter-day CV: 3-6.3%) and accurate (3.8-7.2%). Plasma samples were found to be stable (<15%) under all conditions considered (RT/48 h, +4°C/48 h, -20°C/3 months and 60°C/1 h). Analysis of selected metabolite profiles in patients' samples revealed the presence of EVG glucuronide, which was well separated from the parent EVG, making it possible to exclude potential interference through in-source dissociation of the glucuronide to the parent drug. This new, rapid and robust LC-MS/MS assay for the simultaneous quantification of plasma concentrations of the two major new anti-HIV drugs EVG and RPV offers an efficient analytical tool for clinical pharmacokinetic studies and routine therapeutic drug monitoring. Copyright © 2013 John Wiley & Sons, Ltd.
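The matrix-matched calibration step can be illustrated with a short sketch of weighted linear quantification from analyte/internal-standard area ratios. All numbers are invented, and the 1/x weighting is a common bioanalytical convention rather than a detail stated in the abstract:

```python
import numpy as np

# Hypothetical RPV calibrators (ng/mL) and analyte/IS peak-area ratios.
conc  = np.array([5, 25, 100, 500, 1000, 2500], dtype=float)
ratio = np.array([0.011, 0.052, 0.21, 1.02, 2.05, 5.10])

# Weighted 1/x linear fit; np.polyfit squares the weights internally,
# so pass 1/sqrt(x) to obtain 1/x weighting.
slope, intercept = np.polyfit(conc, ratio, 1, w=1.0 / np.sqrt(conc))

def quantify(sample_ratio):
    """Back-calculate a concentration from an analyte/IS area ratio."""
    return (sample_ratio - intercept) / slope

print(f"{quantify(0.42):.0f} ng/mL")   # toy patient-sample ratio
```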
Abstract:
Due to the hazardous nature of chemical asphalt extraction agents, nuclear gauges have become an increasingly popular method of determining the asphalt content of a bituminous mix. This report details the results of comparisons made between intended, tank stick, extracted, and nuclear asphalt content determinations. A total of 315 sets of comparisons were made on samples that represented 110 individual mix designs and 99 paving projects. All samples were taken from 1987 construction projects. In addition to the comparisons made, seventeen asphalt cement samples were recovered for determination of penetration and viscosity. Results were compared to similar tests performed on the asphalt assurance samples in an attempt to determine the amount of asphalt hardening that can be expected due to the hot-mix process. The conclusions of the report are: 1. Compared to the reflux extraction procedure, nuclear asphalt content gauges determine the asphalt content of bituminous mixes with much greater accuracy and comparable precision. 2. As a means of determining asphalt content, the nuclear procedure should be used as an alternative to chemical extractions whenever possible. 3. Based on penetration and viscosity results, softer grade asphalts undergo a greater degree of hardening due to hot-mix processing than do harder grades, and asphalt viscosity changes caused by the mixing process are subject to much more variability than are changes in penetration. 4. Based on changes in penetration and viscosity, the Thin Film Oven Test provides a reasonable means of estimating how much asphalt hardening can be anticipated due to exposure to the hot-mix processing environment.
Abstract:
The objective of this work was to clarify whether the method used to extract nematodes from European soils is suitable for forest soils and litter in eastern Paraná state, Brazil, and whether nematode abundance differs between sites with different ecosystems and levels of human interference. The study sites were situated in the coastal area of the Serra do Mar, near the town of Antonina, in eastern Paraná, Brazil. Cobb's sieving and decanting method was more appropriate than the ISO method, since its extraction efficiency was higher and its intra-sample variability was significantly lower. In order to achieve an extraction efficiency higher than 90%, Cobb's method was modified. For the extraction of nematodes from litter, the Baermann funnel, with an extraction time of 48 hours, yielded an extraction efficiency higher than 90%. Nematode abundance was higher in litter than in soil. The mean number of individuals extracted from the litter increased with the age of the forest sites sampled, whereas there was no difference in the number of individuals in the soil of the four forest sites. Mean nematode abundance in the soil of banana plantations was about twice as high as in the banana-palmito mixed stands and the forest sites.
Abstract:
BACKGROUND: Health professionals and policymakers aspire to make healthcare decisions based on all of the relevant research evidence. This, however, can rarely be achieved because a considerable amount of research findings are not published, especially in the case of 'negative' results - a phenomenon widely recognized as publication bias. Different methods of detecting, quantifying and adjusting for publication bias in meta-analyses have been described in the literature, such as graphical approaches and formal statistical tests to detect publication bias, and statistical approaches to modify effect sizes to adjust a pooled estimate when the presence of publication bias is suspected. An up-to-date systematic review of the existing methods is lacking. METHODS/DESIGN: The objectives of this systematic review are as follows: • to systematically review methodological articles that focus on the non-publication of studies and to describe methods of detecting and/or quantifying and/or adjusting for publication bias in meta-analyses; • to appraise the strengths and weaknesses of these methods, the resources they require, and the conditions under which each method could be used, based on the findings of included studies. We will systematically search Web of Science, Medline, and the Cochrane Library for methodological articles that describe at least one method of detecting and/or quantifying and/or adjusting for publication bias in meta-analyses. A dedicated data extraction form has been developed and pilot-tested. Working in teams of two, we will independently extract relevant information from each eligible article. As this will be a qualitative systematic review, data reporting will involve a descriptive summary. DISCUSSION: Results are expected to be publicly available in mid-2013. This systematic review, together with the results of other systematic reviews of the OPEN project (To Overcome Failure to Publish Negative Findings), will serve as a basis for the development of future policies and guidelines regarding the assessment and handling of publication bias in meta-analyses.
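Among the formal statistical tests such a review typically covers is Egger's regression test for funnel-plot asymmetry: regress standardized effects on precision and examine whether the intercept differs from zero. A minimal sketch under that assumption, with invented study data:

```python
import numpy as np

# Toy meta-analysis: effect estimates and their standard errors.
effect = np.array([0.42, 0.55, 0.30, 0.61, 0.25, 0.70, 0.15, 0.80])
se     = np.array([0.10, 0.15, 0.08, 0.20, 0.07, 0.25, 0.05, 0.30])

# Egger's test: standardized effect ~ precision; a non-zero intercept
# suggests small-study effects consistent with publication bias.
y = effect / se                       # standardized effects
X = np.column_stack([np.ones_like(se), 1.0 / se])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
s2 = resid @ resid / (len(y) - 2)     # residual variance
cov = s2 * np.linalg.inv(X.T @ X)
t_intercept = beta[0] / np.sqrt(cov[0, 0])
print(f"Egger intercept {beta[0]:.2f}, t = {t_intercept:.2f}")
```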
Abstract:
This thesis is composed of three main parts. The first consists of a state of the art of the different notions that are significant for understanding the elements surrounding art authentication in general, and signatures in particular, and that the author deemed necessary to fully grasp the microcosm that makes up this particular market. Individuals with a solid knowledge of the art and expertise area who are particularly interested in the present study are advised to advance directly to the fourth chapter. The expertise of the signature, its reliability, and the factors impacting the expert's conclusions are brought forward. The final aim of the state of the art is to offer a general list of recommendations based on an exhaustive review of the current literature and given in light of all of the exposed issues. These guidelines are specifically formulated for the expertise of signatures on paintings, but can also be applied to wider themes in the area of signature examination. The second part of this thesis covers the experimental stages of the research. It consists of the method developed to authenticate painted signatures on works of art. This method is articulated around several main objectives: defining measurable features on painted signatures and defining their relevance in order to establish the separation capacities between groups of authentic and simulated signatures. For the first time, numerical analyses of painted signatures have been obtained and are used to attribute their authorship to given artists. An in-depth discussion of the developed method constitutes the third and final part of this study. It evaluates the opportunities and constraints of the method when applied by signature and handwriting experts in forensic science. The outlines presented below give a rapid overview of the study and summarize the aims and main themes addressed in each chapter. Part I - Theory. Chapter 1 exposes the legal aspects surrounding the authentication of works of art by art experts. The definition of what is legally authentic, the quality and types of experts who can express an opinion concerning the authorship of a specific painting, and standard deontological rules are addressed. The practices applied in Switzerland are specifically dealt with. Chapter 2 presents an overview of the different scientific analyses that can be carried out on paintings (from the canvas to the top coat). Scientific examinations of works of art have become more common as more and more museums equip themselves with laboratories, so an understanding of their role in the art authentication process is vital. The added value that a signature expertise can have in comparison to other scientific techniques is also addressed. Chapter 3 provides a historical overview of the signature on paintings throughout the ages, in order to offer the reader an understanding of the origin of the signature on works of art and its evolution through time. An explanation is given of the transitions that the signature went through from the 15th century onward and how it progressively took on its widely known modern form. Both this chapter and Chapter 2 are presented to show the reader the rich sources of information that can be used to describe a painting, and how the signature is one of these sources.
Chapter 4 focuses on the different hypotheses the forensic handwriting examiner (FHE) must keep in mind when examining a painted signature, since a number of scenarios can be encountered when dealing with signatures on works of art. The different forms of signatures, as well as the variables that may have an influence on painted signatures, are also presented. Finally, the current state of knowledge of the examination procedure of signatures in forensic science in general, and of painted signatures in particular, is exposed. The state of the art of the assessment of the authorship of signatures on paintings is established and discussed in light of the theoretical facets mentioned previously. Chapter 5 considers key elements that can have an impact on the FHE during his or her examinations. This includes a discussion of elements such as the skill, confidence and competence of an expert, as well as the potential bias effects he or she might encounter. A better understanding of the elements surrounding handwriting examinations, in order to better communicate results and conclusions to an audience, is also undertaken. Chapter 6 reviews the judicial acceptance of signature analysis in courts and closes the state-of-the-art section of this thesis. This chapter brings forward the current issues pertaining to the appreciation of this expertise by the non-forensic community, and discusses the increasing number of claims of the unscientific nature of signature authentication. The necessity of aiming for more scientific, comprehensive and transparent authentication methods is discussed. The theoretical part of this thesis is concluded by a series of general recommendations for forensic handwriting examiners, specifically for the expertise of signatures on paintings. These recommendations stem from the exhaustive review of the literature and the issues exposed by this review, and can also be applied to the traditional examination of signatures (on paper). Part II - Experimental part. Chapter 7 describes and defines the sampling, extraction and analysis phases of the research. The sampling of artists' signatures and their respective simulations is presented, followed by the steps that were undertaken to extract and determine sets of characteristics, specific to each artist, that describe their signatures. The method is based on a study of five artists and a group of individuals acting as forgers for the sake of this study. Finally, the procedure for analyzing these characteristics to assess the strength of evidence, based on a Bayesian reasoning process, is presented. Chapter 8 outlines the results concerning both the artist and simulation corpora after their optical observation, followed by the results of the analysis phase of the research. The feature selection process and the likelihood ratio evaluation are the main themes addressed. The discriminating power between the two corpora is illustrated through multivariate analysis. Part III - Discussion. Chapter 9 discusses the materials, the methods, and the obtained results of the research. The opportunities, but also the constraints and limits, of the developed method are exposed. Future work that can be carried out following the results of the study is also presented. Chapter 10, the last chapter of this thesis, proposes a strategy to incorporate the model developed in the preceding chapters into traditional signature expertise procedures.
Thus, the strength of this expertise is discussed in conjunction with the traditional conclusions reached by forensic handwriting examiners. Finally, this chapter summarizes and advocates a list of formal recommendations for good practice for handwriting examiners. In conclusion, the research highlights the interdisciplinary aspect of the examination of signatures on paintings. The current state of knowledge of the judicial quality of art experts, along with the scientific and historical analysis of paintings and signatures, is overviewed to give the reader a feel for the different factors that have an impact on this particular subject. The uneven acceptance of forensic signature analysis in court, also presented in the state of the art, explicitly demonstrates the necessity of a better recognition of signature expertise by courts of law. This general acceptance, however, can only be achieved by producing high-quality results through a well-defined examination process. This research offers an original approach to attributing a painted signature to a certain artist: for the first time, a probabilistic model used to measure the discriminative potential between authentic and simulated painted signatures is studied. The opportunities and limits of this method of scientifically establishing the authorship of signatures on works of art are thus presented. In addition, as the second key contribution of this work, a procedure is proposed to combine the developed method with that traditionally used by signature experts in forensic science. Such an implementation into holistic traditional signature examination casework is a large step toward providing the forensic, judicial and art communities with a solidly based reasoning framework for the examination of signatures on paintings. The framework and preliminary results associated with this research have been published (Montani, 2009a) and presented at international forensic science conferences (Montani, 2009b; Montani, 2012).
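The likelihood-ratio evaluation described in Part II can be illustrated with a toy sketch: one measurable signature feature, a Gaussian model per hypothesis, and the LR as the ratio of the two densities. This is a deliberately simplified stand-in for the thesis's multivariate Bayesian model; the feature and all values are invented:

```python
import numpy as np
from scipy.stats import norm

# Hypothetical feature (e.g., a normalized stroke-width measure).
authentic = np.array([1.02, 0.98, 1.05, 0.99, 1.01, 1.03, 0.97])
simulated = np.array([1.20, 1.15, 1.28, 1.18, 1.25, 1.22, 1.30])

# One Gaussian per hypothesis: H1 = authentic, H2 = simulated.
mu1, sd1 = authentic.mean(), authentic.std(ddof=1)
mu2, sd2 = simulated.mean(), simulated.std(ddof=1)

def likelihood_ratio(x):
    """LR = p(feature | authentic) / p(feature | simulated)."""
    return norm.pdf(x, mu1, sd1) / norm.pdf(x, mu2, sd2)

lr = likelihood_ratio(1.04)   # questioned signature's feature value
print(f"LR = {lr:.1f} ({'supports' if lr > 1 else 'weighs against'} authenticity)")
```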
Abstract:
Building on the instrumental model of group conflict (IMGC), the present experiment investigates support for discriminatory and meritocratic selection methods at university in a sample of local and immigrant students. Results showed that local students supported in a larger proportion a selection method that favored them over immigrants, compared with a method consisting of selecting the best applicants regardless of origin. Supporting the assumption of the IMGC, this effect was stronger for locals who perceived immigrants as competing for resources. Immigrant students supported the meritocratic selection method more strongly than the one that discriminated against them. However, contrasting with the assumption of the IMGC, this effect was only present in students who perceived immigrants as weakly competing for locals' resources. Results demonstrate that selection methods used at university can be perceived differently depending on students' origin. Further, they suggest that the mechanisms underlying the perception of discriminatory and meritocratic selection methods differ between local and immigrant students. Hence, the present experiment makes a theoretical contribution to the IMGC by delimiting its assumptions to the ingroup facing a competitive situation with a relevant outgroup. Practical implications for universities' recruitment policies are discussed.
Abstract:
The present work describes the development of a fast and robust analytical method for the determination of 53 antibiotic residues, covering various chemical groups and some of their metabolites, in environmental matrices that are considered important sources of antibiotic pollution, namely hospital and urban wastewaters, as well as in river waters. The method is based on automated off-line solid phase extraction (SPE) followed by ultra-high-performance liquid chromatography coupled to quadrupole linear ion trap tandem mass spectrometry (UHPLC–QqLIT). For unequivocal identification and confirmation, and in order to fulfill EU guidelines, two selected reaction monitoring (SRM) transitions per compound are monitored (the most intense one is used for quantification and the second one for confirmation). Quantification of the target antibiotics is performed by the internal standard approach, using one isotopically labeled compound for each chemical group, in order to correct for matrix effects. The main advantages of the method are the automation and speed-up of sample preparation through reduced extraction volumes for all matrices, the fast separation of a wide spectrum of antibiotics by ultra-high-performance liquid chromatography, its sensitivity (limits of detection in the low ng/L range), and its selectivity (due to the use of tandem mass spectrometry). The inclusion of β-lactam antibiotics (penicillins and cephalosporins), which are difficult to analyze in multi-residue methods due to their instability in water matrices, and of some antibiotic metabolites are other important benefits of the method developed. As part of the validation procedure, the method was applied to the analysis of antibiotic residues in hospital wastewaters, urban influent and effluent wastewaters, and river water samples.
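The internal-standard correction for matrix effects mentioned above is commonly assessed by comparing calibration slopes in matrix extract and in pure solvent (the Matuszewski approach); the abstract does not spell out the formula, so the following is the standard textbook form:

```latex
% Matrix effect from the ratio of calibration slopes; values near
% 100% indicate negligible ion suppression or enhancement.
\[
\mathrm{ME}(\%) = \frac{\text{slope in matrix-matched calibration}}
                       {\text{slope in pure solvent}} \times 100
\]
% A co-eluting isotopically labeled internal standard experiences the
% same suppression as its analyte, so the analyte/IS response ratio
% largely cancels the matrix term.
```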
Abstract:
The most suitable method for the estimation of size diversity is investigated. Size diversity is computed on the basis of the Shannon diversity expression adapted for continuous variables, such as size. It takes the form of an integral involving the probability density function (pdf) of the size of the individuals. Different approaches to the estimation of the pdf are compared: parametric methods, which assume that the data come from a particular family of pdfs, and nonparametric methods, in which the pdf is estimated by some kind of local evaluation. Exponential, generalized Pareto, normal, and log-normal distributions were used to generate simulated samples using parameters estimated from real samples. The nonparametric methods include discrete computation of data histograms based on size intervals and continuous kernel estimation of the pdf. The kernel approach gives an accurate estimation of size diversity, whilst parametric methods are only useful when the reference distribution has a shape similar to that of the real one. Special attention is given to data standardization. Division of the data by the sample geometric mean is proposed as the most suitable standardization method, which has additional advantages: the same size diversity value is obtained whether original sizes or log-transformed data are used, and size measurements of different dimensionality (lengths, areas, volumes or biomasses) may be compared directly with the simple addition of ln k, where k is the dimensionality (1, 2, or 3, respectively). Thus, kernel estimation, after data standardization by division by the sample geometric mean, emerges as the most reliable and generalizable method of size diversity evaluation.
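A minimal sketch of the favored procedure, standardization by the sample geometric mean followed by kernel estimation of the pdf and numerical integration of the Shannon expression H = -∫ p(x) ln p(x) dx, is given below (toy log-normal data; truncating the integral at the sample range is a simplification):

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
sizes = rng.lognormal(mean=1.0, sigma=0.6, size=500)   # toy body sizes

# Standardize by the sample geometric mean, as proposed above.
standardized = sizes / np.exp(np.mean(np.log(sizes)))

# Kernel estimate of the pdf, then H = -integral p(x) ln p(x) dx.
kde = gaussian_kde(standardized)
grid = np.linspace(standardized.min(), standardized.max(), 2000)
dx = grid[1] - grid[0]
p = kde(grid)                       # Gaussian kernels: p > 0 everywhere
H = -np.sum(p * np.log(p)) * dx     # trapezoid-style Riemann sum
print(f"size diversity H = {H:.3f}")
```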
Abstract:
Case-crossover is one of the most widely used designs for analyzing the health-related effects of air pollution. Nevertheless, no one has reviewed its application and methodology in this context. Objective: We conducted a systematic review of case-crossover (CCO) designs used to study the relationship between air pollution and morbidity and mortality, from the standpoint of methodology and application. Data sources and extraction: A search was made of the MEDLINE and EMBASE databases. Reports were classified as methodologic or applied. From the latter, the following information was extracted: author, study location, year, type of population (general or patients), dependent variable(s), independent variable(s), type of CCO design, and whether effect modification was analyzed for variables at the individual level. Data synthesis: The review covered 105 reports that fulfilled the inclusion criteria. Of these, 24 addressed methodological aspects, and the remainder involved the design's application. Among the methodological reports, the designs that yielded the best results in simulation were the symmetric bidirectional CCO and the time-stratified CCO. Furthermore, we observed an increase over time in the use of certain CCO designs, mainly the symmetric bidirectional and time-stratified CCO. The dependent variables most frequently analyzed were those relating to hospital morbidity; the pollutants most often studied were those linked to particulate matter. Among the CCO-application reports, 13.6% studied effect modification for variables at the individual level. Conclusions: The use of CCO designs has undergone considerable growth; the most widely used designs were those that yielded better results in simulation studies: the symmetric bidirectional and time-stratified CCO. However, the advantages of CCO as a method for analyzing variables at the individual level are put to little use.
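For readers unfamiliar with the design, the time-stratified CCO selects as referents all days in the same month that share the event day's day of week, controlling by design for trend, seasonality, and weekday. A minimal sketch of that selection rule (an illustration of the general design, not code from any reviewed study):

```python
from datetime import date, timedelta

def time_stratified_referents(event_day: date) -> list[date]:
    """All days in the event's month sharing its weekday (event excluded)."""
    d = event_day.replace(day=1)
    referents = []
    while d.month == event_day.month:
        if d.weekday() == event_day.weekday() and d != event_day:
            referents.append(d)
        d += timedelta(days=1)
    return referents

# Exposure on these referent days is contrasted with the event day's exposure.
print(time_stratified_referents(date(2007, 6, 13)))
# -> June 6, 20 and 27, 2007 (the other Wednesdays of that month)
```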
Abstract:
In the proposed method, carbon tetrachloride and ethanol were used as the extraction and disperser solvents. Several factors that may affect the extraction process, such as the extraction solvent, the disperser solvent, the volumes of the extraction and disperser solvents, the pH of the aqueous solution, and the extraction time, were optimized. Under the optimal conditions, linearity was maintained from 1.0 ng mL-1 to 1.5 mg mL-1 for zinc and from 1.0 ng mL-1 to 0.4 mg mL-1 for cadmium. The proposed method was applied to the determination of trace amounts of zinc and cadmium in standard and water samples with satisfactory results.
Abstract:
A method for the determination of trace amounts of palladium was developed using homogeneous liquid-liquid microextraction via flotation assistance (HLLME-FA) followed by graphite furnace atomic absorption spectrometry (GFAAS). Ammonium pyrrolidine dithiocarbamate (APDC) was used as the complexing agent. The method was applied to determine palladium in three types of water samples. In this study, a special extraction cell was designed to facilitate collection of the low-density extraction solvent. No centrifugation was required in this procedure. The water sample solution was added to the extraction cell, which contained an appropriate mixture of the extraction and homogeneous solvents. By means of air flotation, the organic solvent was collected in the conical part of the designed cell. Parameters affecting the extraction efficiency were investigated and optimized. Under the optimum conditions, the calibration graph was linear in the range of 1.0-200 µg L-1 with a limit of detection of 0.3 µg L-1. The performance of the method was evaluated for the extraction and determination of palladium in water samples, and satisfactory results were obtained. In order to verify the accuracy of the approach, the standard addition method was applied to the determination of palladium in spiked synthetic samples, and satisfactory results were obtained.
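The standard addition procedure mentioned at the end lends itself to a short worked sketch: spike equal aliquots with increasing Pd amounts, fit the signal linearly, and read the sample concentration from the x-intercept. All numbers below are invented:

```python
import numpy as np

# Added Pd (µg/L) in spiked aliquots and the GFAAS signals (toy values).
added  = np.array([0.0, 5.0, 10.0, 20.0])
signal = np.array([0.081, 0.142, 0.205, 0.330])

slope, intercept = np.polyfit(added, signal, 1)

# The unspiked concentration equals the magnitude of the x-intercept.
c_sample = intercept / slope
print(f"Pd in sample ≈ {c_sample:.1f} µg/L")
```

With these toy numbers the fit gives roughly 6.5 µg/L, comfortably inside the method's stated linear range.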