948 results for extraction and separation techniques
Abstract:
NASCIMENTO, H. G.; FERNANDES, L. C.; SOUSA, M. B. C. Avaliação da fidedignidade dos ensaios de esteróides fecais realizados no Laboratório de Medidas Hormonais do Departamento de Fisiologia da UFRN. Publica, v. 2, p. 39-48, 2006.
Abstract:
Concern about water pollution by pesticides has been growing, since the number of pesticide detections in water has increased. The lack of quality assessment of the water consumed by the population of rural areas without a public drinking-water supply must be considered, because these waters lie close to cultivated areas where pesticides are intensively applied. In these regions, water for households and for irrigation generally comes from wells. In this work, a method for the determination of the pesticides carbofuran, clomazone, 2,4-D and tebuconazole in groundwater was developed and validated. The method used Solid-Phase Extraction (SPE) with determination by High-Performance Liquid Chromatography with Diode Array Detection (HPLC-DAD) and confirmation by Liquid Chromatography tandem Mass Spectrometry (LC-MS/MS). For the SPE, 200 mg C18 cartridges were used, with elution with 1 mL of methanol. After optimization of the extraction and separation parameters, the method was validated by evaluating the analytical curve, linearity, limits of detection and quantification, precision (repeatability and intermediate precision) and accuracy (recovery). All analytical curves showed r values greater than 0.99. The method LOQs, considering the 250-fold preconcentration step, were 0.2 µg L-1 for all pesticides by HPLC-DAD and, by LC-MS/MS, 4.0 ng L-1 for clomazone, carbofuran and tebuconazole and 40.0 ng L-1 for 2,4-D. Recoveries ranged from 60.3 to 107.7% for repeatability and from 67.5 to 115.3% for intermediate precision, with RSDs of 0.8 to 20.7% for all compounds by HPLC-DAD. For LC-MS/MS, precision in terms of repeatability ranged from 0.97 to 20.7%, and recoveries from 67.0 to 108.9%. The method was applied to the determination of pesticides in groundwater samples over one year; pesticides were detected in the samples at µg L-1 levels. In line with the current trend in Analytical Chemistry toward faster methods that use less solvent and less sample and achieve high enrichment factors, an extraction method for the pesticides carbofuran, clomazone and tebuconazole was optimized using Dispersive Liquid-Liquid Microextraction (DLLME) with determination by LC-MS/MS. Parameters that influence the extraction process were optimized, such as the type and volume of the disperser and extraction solvents, extraction time, ionic strength and centrifugation speed. Under the optimized conditions, recoveries for concentration levels between 0.02 and 2.0 µg L-1 ranged from 62.7 to 120.0%, with RSD values between 1.9 and 9.1%. The method LOQ was 0.02 µg L-1 for all compounds. Compared with SPE, the method proved fast, simple and low-cost, besides requiring smaller sample volumes for the determination of pesticides in water. The method proved suitable for the analysis of these pesticides in groundwater, and all validation parameters obtained are within the limits suggested for the validation of chromatographic methods.
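To make the preconcentration arithmetic behind the reported LOQs explicit, here is a minimal sketch; the sample volume and instrument LOQ below are illustrative assumptions, not values stated in the abstract.

```python
# Minimal sketch of how a preconcentration step lowers the method LOQ.
# Sample volume and instrument LOQ are hypothetical; only the 1 mL elution
# and the 250-fold factor come from the abstract.
sample_volume_ml = 250.0   # assumed SPE sample volume
eluate_volume_ml = 1.0     # elution with 1 mL of methanol
preconcentration_factor = sample_volume_ml / eluate_volume_ml  # 250-fold

instrument_loq_ug_per_l = 50.0  # assumed HPLC-DAD LOQ for the final extract
method_loq_ug_per_l = instrument_loq_ug_per_l / preconcentration_factor

print(f"{preconcentration_factor:.0f}-fold preconcentration, "
      f"method LOQ = {method_loq_ug_per_l:.2f} ug/L")
```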
Abstract:
My doctoral research concerns the modelling of symbolism in the cultural heritage domain and the connection of artworks based on their symbolism through knowledge extraction and representation techniques. In particular, I participated in the design of two ontologies: one models the relationships between a symbol, its symbolic meaning, and the cultural context in which the symbol symbolizes that meaning; the second models artistic interpretations of a cultural heritage object from an iconographic and iconological (thus also symbolic) perspective. I also converted several sources of unstructured data (a dictionary of symbols and an encyclopaedia of symbolism) and semi-structured data (DBpedia and WordNet) to create HyperReal, the first knowledge graph dedicated to conventional cultural symbolism. By making use of HyperReal's content, I showed how linked open data about cultural symbolism can be used to initiate a series of quantitative studies that analyse (i) similarities between cultural contexts based on their symbologies, (ii) broad symbolic associations, and (iii) specific case studies of symbolism, such as the relationship between symbols, their colours, and their symbolic meanings. Moreover, I developed a system that can infer symbolic, cultural context-dependent interpretations from artworks according to what they depict, envisioning potential use cases for museum curation. I then re-engineered the iconographic and iconological statements of Wikidata, a widely used general-domain knowledge base, creating ICONdata, an iconographic and iconological knowledge graph, which was subsequently enriched with automatic symbolic interpretations. I then demonstrated the value of enhancing artwork information through alignment with linked open data related to symbolism, resulting in the discovery of novel connections between artworks. Finally, I contributed to the creation of a software application that leverages the established connections, allowing users to investigate the symbolic expression of a concept across different cultural contexts through the generation of a three-dimensional exhibition of artefacts symbolising the chosen concept.
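As a rough illustration of the kind of triple pattern such a symbolism ontology implies (a symbol linked to a meaning within a cultural context), the following sketch uses rdflib with entirely hypothetical class and property names; it does not reproduce the actual HyperReal or ICONdata schemas.

```python
from rdflib import Graph, Literal, Namespace, RDF

# Hypothetical namespace and property names; the real HyperReal schema differs.
SYM = Namespace("http://example.org/symbolism/")

g = Graph()
g.bind("sym", SYM)

# A symbol, a symbolic meaning, and the cultural context in which the symbol
# carries that meaning, modelled as a reified "symbolization" node.
g.add((SYM.owlSymbol, RDF.type, SYM.Symbol))
g.add((SYM.wisdom, RDF.type, SYM.SymbolicMeaning))
g.add((SYM.symbolization1, RDF.type, SYM.Symbolization))
g.add((SYM.symbolization1, SYM.hasSymbol, SYM.owlSymbol))
g.add((SYM.symbolization1, SYM.hasMeaning, SYM.wisdom))
g.add((SYM.symbolization1, SYM.inCulturalContext, Literal("Ancient Greece")))

# Retrieve every (symbol, meaning, context) combination via SPARQL.
query = """
SELECT ?symbol ?meaning ?context WHERE {
    ?s a sym:Symbolization ;
       sym:hasSymbol ?symbol ;
       sym:hasMeaning ?meaning ;
       sym:inCulturalContext ?context .
}
"""
for row in g.query(query, initNs={"sym": SYM}):
    print(row.symbol, row.meaning, row.context)
```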
Abstract:
The purpose of this paper is to study metal separation from a sample composed of a mixture of the main types of spent household batteries, using a hydrometallurgical route and comparing selective precipitation and liquid-liquid extraction as separation techniques. The preparation of the solution consisted of grinding the mixed battery waste, reduction and elimination of volatile metals in an electric furnace, and acid leaching. From this solution, two different routes were studied: selective precipitation with sodium hydroxide and liquid-liquid extraction using Cyanex 272 [bis(2,4,4-trimethylpentyl)phosphinic acid] as the extracting agent. The best results were obtained with liquid-liquid extraction, in which Zn had a 99% extraction rate at pH 2.5. More than 95% of the Fe was extracted at pH 7.0, the same pH at which more than 90% of the Ce was extracted; about 88% of the Mn, Cr and Co was also extracted at this pH. At pH 3.0, more than 85% of the Ni was extracted, and at pH 3.5 more than 80% of the Cd and La was extracted.
Analysis and evaluation of techniques for the extraction of classes in the ontology learning process
Abstract:
This paper analyzes and evaluates, in the context of ontology learning, techniques to identify and extract candidate terms for the classes of a taxonomy. In addition, it points out some inconsistencies that may occur in the preprocessing of the text corpus and proposes techniques to obtain good candidate terms for the classes of a taxonomy.
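As a toy illustration of frequency-based candidate term extraction from a preprocessed corpus (not the specific techniques evaluated in the paper), consider the following sketch; the corpus, stopword list and threshold are placeholders.

```python
import re
from collections import Counter

# Toy corpus and stopword list; real ontology-learning pipelines use a
# domain corpus, POS tagging and richer linguistic filters.
corpus = [
    "The extraction of candidate terms supports ontology learning.",
    "Ontology learning builds a taxonomy from candidate terms.",
]
stopwords = {"the", "of", "a", "from", "builds", "supports"}

def candidate_terms(docs, min_freq=2):
    """Return frequent non-stopword tokens as candidate class terms."""
    tokens = []
    for doc in docs:
        tokens += [t for t in re.findall(r"[a-z]+", doc.lower())
                   if t not in stopwords]
    counts = Counter(tokens)
    return [term for term, freq in counts.most_common() if freq >= min_freq]

print(candidate_terms(corpus))  # ['candidate', 'terms', 'ontology', 'learning']
```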
Abstract:
The development of high spatial resolution airborne and spaceborne sensors has improved the capability of ground-based data collection in the fields of agriculture, geography, geology, mineral identification, detection [2, 3], and classification [4–8]. The signal read by the sensor from a given spatial element of resolution and at a given spectral band is a mixing of components originated by the constituent substances, termed endmembers, located at that element of resolution. This chapter addresses hyperspectral unmixing, which is the decomposition of the pixel spectra into a collection of constituent spectra, or spectral signatures, and their corresponding fractional abundances indicating the proportion of each endmember present in the pixel [9, 10]. Depending on the mixing scales at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds when the mixing scale is macroscopic [13]. The nonlinear model holds when the mixing scale is microscopic (i.e., intimate mixtures) [14, 15]. The linear model assumes negligible interaction among distinct endmembers [16, 17]. The nonlinear model assumes that incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [18]. Under the linear mixing model and assuming that the number of endmembers and their spectral signatures are known, hyperspectral unmixing is a linear problem, which can be addressed, for example, under the maximum likelihood setup [19], the constrained least-squares approach [20], the spectral signature matching [21], the spectral angle mapper [22], and the subspace projection methods [20, 23, 24]. Orthogonal subspace projection [23] reduces the data dimensionality, suppresses undesired spectral signatures, and detects the presence of a spectral signature of interest. The basic concept is to project each pixel onto a subspace that is orthogonal to the undesired signatures. As shown in Settle [19], the orthogonal subspace projection technique is equivalent to the maximum likelihood estimator. This projection technique was extended by three unconstrained least-squares approaches [24] (signature space orthogonal projection, oblique subspace projection, target signature space orthogonal projection). Other works using the maximum a posteriori probability (MAP) framework [25] and projection pursuit [26, 27] have also been applied to hyperspectral data. In most cases the number of endmembers and their signatures are not known. Independent component analysis (ICA) is an unsupervised source separation process that has been applied with success to blind source separation, to feature extraction, and to unsupervised recognition [28, 29]. ICA consists of finding a linear decomposition of observed data yielding statistically independent components. Given that hyperspectral data are, under certain circumstances, linear mixtures, ICA comes to mind as a possible tool to unmix this class of data. In fact, the application of ICA to hyperspectral data has been proposed in reference 30, where endmember signatures are treated as sources and the mixing matrix is composed by the abundance fractions, and in references 9, 25, and 31–38, where sources are the abundance fractions of each endmember. In the first approach, we face two problems: (1) the number of samples is limited to the number of channels, and (2) the process of pixel selection, playing the role of mixed sources, is not straightforward.
In the second approach, ICA is based on the assumption of mutually independent sources, which is not the case of hyperspectral data, since the sum of the abundance fractions is constant, implying dependence among abundances. This dependence compromises ICA applicability to hyperspectral images. In addition, hyperspectral data are immersed in noise, which degrades the ICA performance. IFA [39] was introduced as a method for recovering independent hidden sources from their observed noisy mixtures. IFA implements two steps. First, source densities and noise covariance are estimated from the observed data by maximum likelihood. Second, sources are reconstructed by an optimal nonlinear estimator. Although IFA is a well-suited technique to unmix independent sources under noisy observations, the dependence among abundance fractions in hyperspectral imagery compromises, as in the ICA case, the IFA performance. Considering the linear mixing model, hyperspectral observations are in a simplex whose vertices correspond to the endmembers. Several approaches [40–43] have exploited this geometric feature of hyperspectral mixtures [42]. The minimum volume transform (MVT) algorithm [43] determines the simplex of minimum volume containing the data. The MVT-type approaches are complex from the computational point of view. Usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. Aiming at a lower computational complexity, some algorithms such as the vertex component analysis (VCA) [44], the pixel purity index (PPI) [42], and the N-FINDR [45] still find the minimum volume simplex containing the data cloud, but they assume the presence in the data of at least one pure pixel of each endmember. This is a strong requisite that may not hold in some data sets. In any case, these algorithms find the set of the purest pixels in the data. Hyperspectral sensors collect spatial images over many narrow contiguous bands, yielding large amounts of data. For this reason, very often, the processing of hyperspectral data, including unmixing, is preceded by a dimensionality reduction step to reduce computational complexity and to improve the signal-to-noise ratio (SNR). Principal component analysis (PCA) [46], maximum noise fraction (MNF) [47], and singular value decomposition (SVD) [48] are three well-known projection techniques widely used in remote sensing in general and in unmixing in particular. The newly introduced method [49] exploits the structure of hyperspectral mixtures, namely the fact that spectral vectors are nonnegative. The computational complexity associated with these techniques is an obstacle to real-time implementations. To overcome this problem, band selection [50] and non-statistical [51] algorithms have been introduced. This chapter addresses hyperspectral data source dependence and its impact on ICA and IFA performances. The study considers simulated and real data and is based on mutual information minimization. Hyperspectral observations are described by a generative model. This model takes into account the degradation mechanisms normally found in hyperspectral applications, namely signature variability [52–54], abundance constraints, topography modulation, and system noise. The computation of mutual information is based on fitting mixtures of Gaussians (MOG) to data. The MOG parameters (number of components, means, covariances, and weights) are inferred using the minimum description length (MDL) based algorithm [55].
We study the behavior of the mutual information as a function of the unmixing matrix. The conclusion is that the unmixing matrix minimizing the mutual information might be very far from the true one. Nevertheless, some abundance fractions might be well separated, mainly in the presence of strong signature variability, a large number of endmembers, and high SNR. We end this chapter by sketching a new methodology to blindly unmix hyperspectral data, where abundance fractions are modeled as a mixture of Dirichlet sources. This model enforces the positivity and constant-sum (full additivity) constraints on the sources. The mixing matrix is inferred by an expectation-maximization (EM)-type algorithm. This approach is in the vein of references 39 and 56, replacing independent sources represented by MOG with a mixture of Dirichlet sources. Compared with the geometric-based approaches, the advantage of this model is that there is no need to have pure pixels in the observations. The chapter is organized as follows. Section 6.2 presents a spectral radiance model and formulates the spectral unmixing as a linear problem accounting for abundance constraints, signature variability, topography modulation, and system noise. Section 6.3 presents a brief summary of the ICA and IFA algorithms. Section 6.4 illustrates the performance of IFA and of some well-known ICA algorithms with experimental data. Section 6.5 studies the ICA and IFA limitations in unmixing hyperspectral data. Section 6.6 presents results of ICA based on real data. Section 6.7 describes the new blind unmixing scheme and some illustrative examples. Section 6.8 concludes with some remarks.
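To make the linear mixing model and the constrained least-squares unmixing mentioned above concrete, here is a small synthetic sketch; it assumes known endmember signatures and approximates the sum-to-one constraint by system augmentation, which is one common implementation choice rather than the chapter's specific algorithm.

```python
import numpy as np
from scipy.optimize import nnls

# Synthetic example of the linear mixing model: each pixel spectrum is a
# nonnegative, sum-to-one combination of known endmember signatures plus noise.
rng = np.random.default_rng(0)
n_bands, n_endmembers = 50, 3
M = rng.uniform(0.1, 1.0, size=(n_bands, n_endmembers))    # endmember signatures
a_true = np.array([0.6, 0.3, 0.1])                         # abundance fractions
pixel = M @ a_true + 0.01 * rng.standard_normal(n_bands)   # observed spectrum

# Sum-to-one constraint enforced softly by augmenting the system,
# a standard trick for fully constrained least-squares unmixing.
delta = 10.0
M_aug = np.vstack([M, delta * np.ones((1, n_endmembers))])
pixel_aug = np.append(pixel, delta)

a_est, _ = nnls(M_aug, pixel_aug)   # nonnegativity enforced by NNLS
print("true:", a_true, "estimated:", np.round(a_est, 3))
```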
Abstract:
Phenolic acids are aromatic secondary plant metabolites, widely spread throughout the plant kingdom. Due to their biological and pharmacological properties, they have been playing an important role in phytotherapy, and consequently techniques for their separation and purification are needed. This thesis aims at exploring new sustainable separation processes based on ionic liquids (ILs) for the extraction of biologically active phenolic acids. For that purpose, three phenolic acids with similar chemical structures were selected: cinnamic acid, p-coumaric acid and caffeic acid. In recent years, it has been shown that ionic-liquid-based aqueous biphasic systems (ABSs) are valid alternatives for the extraction, recovery and purification of biomolecules when compared to conventional ABSs or extractions carried out with organic solvents. In particular, cholinium-based ILs represent a clear step towards greener chemistry, while providing means for the implementation of efficient techniques for the separation and purification of biomolecules. In this work, ABSs were implemented using cholinium carboxylate ILs with either a high-charge-density inorganic salt (K3PO4) or polyethylene glycol (PEG) to promote the phase separation of aqueous solutions containing the three different phenolic acids. These systems allow the evaluation of the effect of the anion's chemical structure on the extraction efficiency. Only one imidazolium-based IL was used, in order to establish the effect of the cation's chemical structure. The selective extraction of one single acid was also investigated. Overall, it was observed that phenolic acids display very complex behaviour in aqueous solutions: dimerization, polymerization and hetero-association are quite frequent phenomena, depending on the pH conditions. These phenomena greatly hinder the correct quantification of these acids in solution.
Abstract:
Simulated moving bed (SMB) chromatography is attracting more and more attention, since it is a powerful technique for complex separation tasks. Nowadays, more than 60% of preparative SMB units are installed in the pharmaceutical and food industries [SDI, Preparative and Process Liquid Chromatography: The Future of Process Separations, International Strategic Directions, Los Angeles, USA, 2002. http://www.strategicdirections.com]. Chromatography is the method of choice in these fields, because pharmaceuticals and fine chemicals often have physico-chemical properties which differ little from those of the by-products, and they may be thermally unstable. In these cases, standard separation techniques such as distillation and extraction are not applicable. The importance of preparative chromatography, particularly the SMB process, as a separation and purification process in the above-mentioned industries has been increasing, due to its flexibility, energy efficiency and higher product purity performance. Consequently, a new SMB paradigm is requested by the large number of potential small-scale applications of SMB technology, which exploit the flexibility and versatility of the technology. In this new SMB paradigm, a number of possibilities for improving SMB performance through variation of parameters during a switching interval are pushing the trend toward the use of units with a smaller number of columns, because less stationary phase is used and the setup is more economical. This is especially important for the pharmaceutical industry, where SMBs are seen as multipurpose units that can be applied to different separations in all stages of the drug-development cycle. In order to reduce the experimental effort, and accordingly the cost associated with the development of separation processes, simulation models are intensively used. One important aspect in this context refers to the determination of the adsorption isotherms in SMB chromatography, where separations are usually carried out under strongly nonlinear conditions in order to achieve higher productivities. The accurate determination of the competitive adsorption equilibrium of the enantiomeric species is thus of fundamental importance to allow computer-assisted optimization or process scale-up. Two major SMB operating problems are apparent at production scale: the assessment of product quality and the maintenance of long-term stable and controlled operation. Constraints regarding product purity, dictated by pharmaceutical and food regulatory organizations, have drastically increased the demand for product quality control. The strict regulations imposed are increasing the need for developing optically pure drugs. (...)
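The competitive adsorption equilibrium referred to above is frequently modeled with a competitive Langmuir isotherm; the sketch below evaluates that model for a binary (e.g., enantiomeric) mixture with made-up parameters, purely for illustration.

```python
import numpy as np

def competitive_langmuir(c, q_sat, b):
    """Competitive Langmuir isotherm:
    q_i = q_sat_i * b_i * c_i / (1 + sum_j b_j * c_j)

    c, q_sat and b are arrays over the components (here, two enantiomers).
    """
    c, q_sat, b = map(np.asarray, (c, q_sat, b))
    return q_sat * b * c / (1.0 + np.sum(b * c))

# Illustrative parameters (g/L and L/g), not fitted to any real system.
q_sat = np.array([40.0, 40.0])   # saturation capacities
b = np.array([0.05, 0.08])       # equilibrium constants
c = np.array([2.0, 2.0])         # liquid-phase concentrations

print(competitive_langmuir(c, q_sat, b))  # adsorbed-phase concentrations
```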
Abstract:
A simple and sensitive liquid chromatography-electrospray ionization mass spectrometry method was developed for the simultaneous quantification in human plasma of all selective serotonin reuptake inhibitors (citalopram, fluoxetine, fluvoxamine, paroxetine and sertraline) and their main active metabolites (desmethyl-citalopram and norfluoxetine). A stable isotope-labeled internal standard was used for each analyte to compensate for the global method variability, including extraction and ionization variations. After sample (250 μl) pre-treatment with acetonitrile (500 μl) to precipitate proteins, a fast solid-phase extraction procedure was performed using a mixed-mode Oasis MCX 96-well plate. Chromatographic separation was achieved in less than 9.0 min on an XBridge C18 column (2.1 × 100 mm; 3.5 μm) using a gradient of ammonium acetate (pH 8.1; 50 mM) and acetonitrile as mobile phase at a flow rate of 0.3 ml/min. The method was fully validated according to Société Française des Sciences et Techniques Pharmaceutiques protocols and the latest Food and Drug Administration guidelines. Six-point calibration curves were used to cover a large concentration range of 1-500 ng/ml for citalopram, desmethyl-citalopram, paroxetine and sertraline, 1-1000 ng/ml for fluoxetine and fluvoxamine, and 2-1000 ng/ml for norfluoxetine. Good quantitative performance was achieved in terms of trueness (84.2-109.6%), repeatability (0.9-14.6%) and intermediate precision (1.8-18.0%) over the entire assay range, including the lower limit of quantification. Internal standard-normalized matrix effects were lower than 13%. The accuracy profiles (total error) were mainly included within the acceptance limits of ±30% for biological samples. The method was successfully applied to the routine therapeutic drug monitoring of more than 1600 patient plasma samples over 9 months. The β-expectation tolerance intervals determined during the validation phase were consistent with the results of quality control samples analyzed during routine use. This method is therefore precise and suitable both for therapeutic drug monitoring and pharmacokinetic studies in most clinical laboratories.
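As a rough sketch of how trueness, repeatability and intermediate precision figures like those above are typically derived from replicate quality-control measurements (a rigorous calculation uses ANOVA variance components), consider the following; all values are invented.

```python
import numpy as np

# Invented QC results: 3 validation days x 4 replicates at a nominal
# concentration of 100 ng/ml (values in ng/ml).
nominal = 100.0
qc = np.array([
    [ 98.2, 101.5,  97.8, 102.1],
    [103.4,  99.0, 104.2, 100.8],
    [ 96.5,  98.9,  97.2,  99.5],
])

# Trueness: mean measured concentration relative to the nominal value, in %.
trueness = 100.0 * qc.mean() / nominal

# Repeatability approximated as the mean within-day RSD; intermediate
# precision approximated as the overall RSD across all days and replicates.
repeatability = 100.0 * qc.std(axis=1, ddof=1).mean() / qc.mean()
intermediate_precision = 100.0 * qc.std(ddof=1) / qc.mean()

print(f"trueness {trueness:.1f}%, repeatability {repeatability:.1f}%, "
      f"intermediate precision {intermediate_precision:.1f}%")
```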
Abstract:
A rapid biological method for the determination of the bioavailability of naphthalene was developed, and its value as an alternative to extraction-based chemical approaches was demonstrated. Genetically engineered whole-cell biosensors are used to determine bioavailable naphthalene, and their responses are compared with results from Tenax extraction and chemical analysis. Results show a 1:1 correlation between biosensor results and chemical analyses for naphthalene-contaminated model materials and sediments, but the biosensor assay is much faster. This work demonstrates that biosensor technology can perform as well as standard chemical methods, with some advantages including the inherent biological relevance of the response, rapid response time, and potential for field deployment. A survey of results from this work and the literature shows that bioavailability under non-equilibrium conditions nonetheless correlates well with Koc or Kd. A rationale is provided wherein chemical resistance is speculated to be operative.
Abstract:
A highly sensitive ultra-high performance liquid chromatography tandem mass spectrometry (UHPLC-MS/MS) method was developed for the quantification of buprenorphine and its major metabolite norbuprenorphine in human plasma. In order to speed up the process and decrease costs, sample preparation was performed by simple protein precipitation with acetonitrile. To the best of our knowledge, this is the first application of this extraction technique for the quantification of buprenorphine in plasma. Matrix effects were strongly reduced and selectivity increased by using an efficient chromatographic separation on a sub-2 μm column (Acquity UPLC BEH C18 1.7 μm, 2.1 × 50 mm) in 5 min, with a gradient of 20 mM ammonium formate (pH 3.05) and acetonitrile as mobile phase at a flow rate of 0.4 ml/min. Detection was performed using a tandem quadrupole mass spectrometer operating in positive electrospray ionization mode with multiple reaction monitoring. The procedure was fully validated according to the latest Food and Drug Administration guidelines and the Société Française des Sciences et Techniques Pharmaceutiques. Very good results were obtained by using a stable isotope-labeled internal standard for each analyte to compensate for the variability due to the extraction and ionization steps. The method was very sensitive, with lower limits of quantification of 0.1 ng/ml for buprenorphine and 0.25 ng/ml for norbuprenorphine. The upper limit of quantification was 250 ng/ml for both drugs. Trueness (98.4-113.7%), repeatability (1.9-7.7%), intermediate precision (2.6-7.9%) and internal standard-normalized matrix effects (94-101%) were in accordance with international recommendations. The procedure was successfully used to quantify plasma samples from patients included in a clinical pharmacogenetic study and can be transferred for routine therapeutic drug monitoring in clinical laboratories without further development.
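A brief sketch of how internal standard-normalized matrix effects such as those reported above are commonly assessed, by comparing a post-extraction spiked matrix sample to a neat solution; the peak areas below are invented.

```python
# Invented peak areas for a post-extraction spiked plasma sample and a
# neat standard solution at the same concentration.
analyte_matrix, analyte_neat = 85_400.0, 98_200.0
is_matrix, is_neat = 91_000.0, 99_500.0   # stable isotope-labeled internal standard

# Matrix effect of each compound: area in matrix / area in neat solution.
me_analyte = analyte_matrix / analyte_neat
me_is = is_matrix / is_neat

# IS-normalized matrix effect: values near 100% mean the labeled internal
# standard compensates for ion suppression or enhancement.
is_normalized_me = 100.0 * me_analyte / me_is
print(f"IS-normalized matrix effect: {is_normalized_me:.1f}%")
```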
Abstract:
In the proposed method, carbon tetrachloride and ethanol were used as the extraction and disperser solvents. Several factors that may affect the extraction process, such as the extraction solvent, the disperser solvent, the volumes of the extraction and disperser solvents, the pH of the aqueous solution and the extraction time, were optimized. Under the optimal conditions, linearity was maintained from 1.0 ng mL-1 to 1.5 mg mL-1 for zinc and from 1.0 ng mL-1 to 0.4 mg mL-1 for cadmium. The proposed method was applied to the determination of trace amounts of zinc and cadmium in standard and water samples with satisfactory results.
Abstract:
A dispersive liquid-liquid microextraction based on solidification of floating organic drop for simultaneous extraction of trace amounts of nickel, cobalt and copper followed by their determination with electrothermal atomic absorption spectrometry was developed. 300 µL of acetone and 1-undecanol was injected into an aqueous sample containing diethyldithiocarbamate complexes of metal ions. For a sample volume of 10 mL, enrichment factors of 277, 270 and 300 and detection limits of 1.2, 1.1 and 1 ng L-1 for nickel, cobalt and copper were obtained, respectively. The method was applied to the extraction and determination of these metals in different water samples.
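For context, a short sketch of how an enrichment factor and a 3-sigma detection limit of the kind reported above are typically calculated; all numbers below are hypothetical, not the values behind the reported figures.

```python
# Illustrative numbers only; not the values behind the reported figures.
v_sample_ml = 10.0   # aqueous sample volume
v_drop_ml = 0.03     # volume of the solidified organic drop (hypothetical)
recovery = 0.90      # extraction recovery (hypothetical)

# Enrichment factor expressed through the phase ratio and the recovery:
# EF = (V_sample / V_drop) * R, equivalently C_drop / C_sample.
ef = (v_sample_ml / v_drop_ml) * recovery

# Detection limit from the 3-sigma criterion on blank measurements.
blank_sd = 0.0004   # standard deviation of the blank signal (hypothetical)
slope = 1.0e-3      # calibration slope, signal per ng/L (hypothetical)
lod = 3.0 * blank_sd / slope

print(f"EF = {ef:.0f}, LOD = {lod:.1f} ng/L")
```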
Abstract:
A new solid-phase extraction (SPE) method has been developed for the selective separation and preconcentration of Cu(II) ions in food and water samples prior to their determination by flame atomic absorption spectrometry. The method is based on the adsorption of the Cu(II)-2-{[4-amino-3-(4-methylphenyl)-5-oxo-4,5-dihydro-1H-1,2,4-triazol-1-yl]acetyl}-N-phenylhydrazinecarbothioamide complex on Amberlite XAD-8 resin. The metal complex retained on the resin was eluted with 7.5 mL of 2.0 mol L-1 HCl in acetone. The optimum conditions for the SPE of Cu(II) ions were investigated, and the method was subsequently applied to sea water, stream water, rice, tea, and tobacco samples for the determination of Cu(II) levels.