953 results for extraction method
Abstract:
Hydroethanolic extracts of C. langsdorffii leaves have therapeutic potential. This work reports a validated chromatographic method for the quantification of polar compounds in the hydroethanolic extract of C. langsdorffii leaves. A reliable HPLC method was developed using two monolithic columns linked in series (100 × 4.6 mm, C-18), with nonlinear gradient elution and UV detection set at 257 nm. A procedure for the extraction of flavonols was also developed, which involved the use of 70% aqueous ethanol and the addition of benzophenone as the internal standard. The method gave a good detection response, with linearity between 10.3 and 1000 µg/mL and recoveries between 84.2 and 111.1%. The detection limit ranged from 0.02 to 1.70 µg/mL and the quantitation limit from 0.07 to 5.1 µg/mL, with a maximum RSD of 5.24%. Five compounds, rutin, quercetin-3-O-α-L-rhamnopyranoside, kaempferol-3-O-α-L-rhamnopyranoside, quercetin and kaempferol, were quantified. This method could, therefore, be used for the quality control of hydroethanolic extracts of Copaifera leaves and their cosmetic and pharmaceutical products.
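Detection and quantitation limits of this kind are commonly derived from the calibration regression using the ICH 3.3·σ/S and 10·σ/S convention. A minimal Python sketch of that calculation follows; the calibration points are invented for illustration and are not the paper's data.

```python
import numpy as np

# Illustrative calibration data (concentration in ug/mL vs. peak-area ratio
# analyte/internal standard); NOT the values from the paper.
conc = np.array([10.3, 50, 100, 250, 500, 1000], dtype=float)
area_ratio = np.array([0.21, 1.02, 2.05, 5.10, 10.3, 20.4])

# Least-squares calibration line: area_ratio = slope*conc + intercept
slope, intercept = np.polyfit(conc, area_ratio, 1)
residuals = area_ratio - (slope * conc + intercept)
sigma = residuals.std(ddof=2)      # residual standard deviation of the fit

lod = 3.3 * sigma / slope          # ICH-style detection limit
loq = 10.0 * sigma / slope         # ICH-style quantitation limit
print(f"LOD ~ {lod:.2f} ug/mL, LOQ ~ {loq:.2f} ug/mL")
```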
Abstract:
Insect cuticular hydrocarbons, including relatively non-volatile chemicals, play important roles in cuticle protection and chemical communication. The conventional procedures for extracting cuticular compounds from insects require toxic solvents, or non-destructive techniques that do not allow storage of subsequent samples, such as the use of SPME fibers. In this study, we describe and test a non-lethal process for extracting cuticular hydrocarbons with styrene-divinylbenzene copolymers, and illustrate the method with two species of bees and one species of beetle. The results demonstrate that these compounds can be efficiently trapped by Chromosorb® (SUPELCO) and that this method can be used as an alternative to existing methods.
Abstract:
Objective. To evaluate the feasibility and safety of a novel technique for uterine morcellation in patients scheduled for laparoscopic treatment of gynecologic malignancies. Background. The laparoscopic management of uterine malignancies is progressively gaining importance and popularity over laparotomy. Nevertheless, minimally invasive surgery is of limited use when patients have an enlarged uterus or a narrow vagina. In these cases, conventional uterine morcellation could be a solution but should not be recommended due to the risk of tumor dissemination. Methods. Prospective pilot study of women with endometrial cancer in whom uterus removal was a realistic concern due to both organ size and proportionality. Brief technique description: after completion of total laparoscopic hysterectomy and bilateral adnexectomy, a nylon-with-polyurethane Lapsac® is vaginally inserted into the abdomen; the specimen is placed inside the pouch, which is closed, rotated 180 degrees toward the vaginal vault and then pushed into the vaginal canal; in the transvaginal phase, the surgeon pulls the edges of the bag up to the vaginal introitus so that all vaginal walls are covered; inside the pouch, the operator performs a uterus bisection-morcellation. Results. In our series of 8 cases, we achieved successful completion in all patients, without conversion to laparotomy. Average operative time, blood loss and length of hospitalization were favorable. One patient presented with a vesicovaginal fistula. Conclusion. Vaginal morcellation following oncologic principles is a feasible method that permits rapid uterine extraction and may avoid a number of unnecessary laparotomies. Further studies are needed to confirm the oncological safety of the technique.
Abstract:
Background. Transformed cells of Escherichia coli DH5-α carrying pGFPuv, induced by IPTG (isopropyl-β-D-thiogalactopyranoside), express the green fluorescent protein (gfpuv) during growth phases. E. coli cells subjected to selective permeation by freezing/thawing/sonication cycles followed by the three-phase partitioning (TPP) extraction method were compared, with respect to the release of gfpuv from the over-expressing cells, to the direct application of TPP to the same culture. Material and Methods. Cultures (37°C/100 rpm/24 h; μ = 0.99 to 1.10 h⁻¹) of transformed (pGFPuv) Escherichia coli DH5-α, expressing the green fluorescent protein (gfpuv, absorbance at 394 nm and emission at 509 nm), were sonicated in successive intervals (25 vibrations/pulse) to determine the maximum amount of gfpuv released from the cells. For selective permeation, the previously frozen (-75°C) transformed cells were subjected to three freeze/thaw cycles (-20°C/0.83°C/min) interlaid by sonication (3 pulses/6 seconds/25 vibrations). The intracellular permeate containing gfpuv in extraction buffer (TE) solution (25 mM Tris-HCl, pH 8.0, 1 mM β-mercaptoethanol (β-ME), 0.1 mM PMSF) was subjected to the three-phase partitioning (TPP) method with t-butanol and 1.6 M ammonium sulfate. Sonication efficiency was also verified by applying it to cells previously treated by the TPP method. The intracellular releases were mixed and eluted through a methyl HIC column with a buffer solution (10 mM Tris-HCl, 10 mM EDTA, pH 8.0). Results. The maximum amount released by sonication was 327.67 μg gfpuv/mL (20.73 μg gfpuv/mg total proteins, BSA), after 9 min of treatment. Selective permeation by three repeated freezing/thawing/sonication cycles yielded a similar content of 241.19 μg gfpuv/mL (29.74 μg gfpuv/mg BSA). The specific mass of gfpuv released from the same cultures by the three-phase partitioning (TPP) method, relative to total proteins, was higher, between 107.28 μg/mg and 135.10 μg/mg. Conclusions. Selective permeation of gfpuv by freezing/thawing/sonication followed by TPP separation was equivalent to direct extraction from the cells by TPP, although the selective-permeation extracts showed better elution through the HIC column.
Abstract:
Background. The use of lignocellulosic constituents in biotechnological processes requires a selective separation of the main fractions (cellulose, hemicellulose and lignin). During dilute acid hydrolysis for hemicellulose extraction, several toxic compounds are formed by the degradation of sugars and lignin, which can inhibit microbial metabolism. Thus, a detoxification step is an important aspect to be considered for improving fermentation processes based on hydrolysates. In this paper, we evaluated the application of Advanced Oxidative Processes (AOPs) for the detoxification of rice straw hemicellulosic hydrolysate with the goal of improving ethanol bioproduction by the yeast Pichia stipitis. Aiming to reduce the toxicity of the hemicellulosic hydrolysate, different treatment conditions were analyzed. The treatments were carried out according to a Taguchi L16 orthogonal array to evaluate the influence of Fe2+, H2O2, UV, O3 and pH on the concentration of aromatic compounds and on the fermentative process. Results. The results showed that the AOPs were able to remove aromatic compounds (furan and phenolic compounds derived from lignin) without affecting the sugar concentration in the hydrolysate. Ozonation in alkaline medium (pH 8) in the presence of H2O2 (treatment A3) or UV radiation (treatment A5) was the most effective for hydrolysate detoxification and had a positive effect on the yeast fermentability of the rice straw hemicellulosic hydrolysate. Under these conditions, the highest removals of total phenols (above 40%), low-molecular-weight phenolic compounds (above 95%) and furans (above 52%) were observed. In addition, the ethanol volumetric productivity of P. stipitis approximately doubled relative to the untreated hydrolysate. Conclusion. These results demonstrate that AOPs are promising methods to reduce toxicity and improve the fermentability of lignocellulosic hydrolysates.
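A two-level Taguchi L16 screening design of the kind mentioned above can be built from four two-level basis columns; the Python sketch below generates such an array and assigns the five studied factors to five of its columns. The column allocation and the two-level coding are assumptions for illustration, not the authors' actual experimental plan.

```python
from itertools import product
import numpy as np

# Minimal sketch: build a two-level L16 orthogonal array from 4 basis columns
# and assign the 5 factors of the study to 5 of its columns.
base = np.array(list(product([0, 1], repeat=4)))   # 16 runs x 4 basis columns
col_fifth = base.sum(axis=1) % 2                   # XOR (parity) of the 4 basis columns
design = np.column_stack([base, col_fifth])        # 16 runs x 5 factor columns

factors = ["Fe2+", "H2O2", "UV", "O3", "pH"]
for run, levels in enumerate(design, start=1):
    print(run, dict(zip(factors, levels)))         # 0 = low level, 1 = high level
```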
Abstract:
Seaweeds are photosynthetic organisms important to their ecosystem and constitute a source of compounds with several different applications in the pharmaceutical, cosmetic and biotechnology industries, such as triacylglycerols, which can be converted to the fatty acid methyl esters that make up biodiesel, an alternative fuel source applied in economically important areas. This study evaluates the fatty acid profiles and concentrations of three Brazilian seaweed species, Hypnea musciformis (Wulfen) J.V. Lamouroux (Rhodophyta), Sargassum cymosum C. Agardh (Heterokontophyta), and Ulva lactuca L. (Chlorophyta), comparing three extraction methods (Bligh & Dyer - B&D; AOAC Official Methods - AOM; and extraction with methanol and ultrasound - EMU) and two transesterification methods (7% BF3 in methanol - BF3; and 5% HCl in methanol - HCl). The fatty acid contents of the three species of seaweeds were significantly different when extracted and transesterified by the different methods. Moreover, the best method for one species was not the same for the other species. The best extraction and transesterification methods for H. musciformis, S. cymosum and U. lactuca were, respectively, AOM-HCl, B&D-BF3 and B&D-BF3/B&D-HCl. These results point to a matrix effect, and the method used for the analysis of the fatty acid content of different organisms should therefore be selected carefully.
Abstract:
The hermit crab Clibanarius vittatus is a typical organism of intertidal regions and is considered a good bioindicator of tributyltin (TBT) presence in these environments. Thus, this study presents the analytical performance and validation of a method for TBT quantification in tissues of C. vittatus by gas chromatography with pulsed flame photometric detection (GC-PFPD) after extraction with an apolar solvent (toluene) and Grignard derivatization. The limits of detection of the method (LOD) were 2.0 and 2.8 ng g⁻¹ for TBT and DBT (dibutyltin), respectively, and its limits of quantification (LOQ) were 6.6 and 8.9 ng g⁻¹ for TBT and DBT, respectively. The method was applied to samples from the Santos Estuary, São Paulo State, Brazil. TBT and DBT concentrations ranged from 26.7 to 175.0 ng g⁻¹ and from 46.2 to 156.0 ng g⁻¹, respectively. These concentrations are worrisome, since toxic effects (such as endocrine disruption) have been reported for other organisms even at levels lower than those registered in this study.
Abstract:
A method for the simultaneous quantification of lycopene, β-carotene, retinol and α-tocopherol by high-performance liquid chromatography (HPLC) with Vis/fluorescence detection and isocratic elution was optimized and validated. The method consists of a rapid and simple liquid-liquid extraction procedure followed by quantification of the extracted supernatants by HPLC. Aliquots of plasma were stored at -20°C for three months for the stability study. The methodology was applied to samples from painters and from individuals not exposed to paints (n = 75). The assay was linear for all vitamins (r > 0.99). Intra- and inter-run precisions were obtained with coefficients of variation smaller than 5%. Accuracies ranged from 0.29 to -5.80% and recoveries between 92.73 and 101.97%. Plasma samples and extracted supernatants were stable for 60 days at -20°C. A significant decrease of lycopene, β-carotene and retinol concentrations in plasma from exposed individuals compared to non-exposed individuals (p < 0.05) was observed. The method is simple, reproducible, precise, accurate and sensitive, and can be routinely used in clinical laboratories.
Abstract:
Ontology design and population, core aspects of semantic technologies, have recently become fields of great interest due to the increasing need for domain-specific knowledge bases that can boost the use of the Semantic Web. For building such knowledge resources, the state-of-the-art tools for ontology design require a lot of human work. Producing meaningful schemas and populating them with domain-specific data is in fact a very difficult and time-consuming task, even more so if the task consists in modelling knowledge at web scale. The primary aim of this work is to investigate a novel and flexible methodology for automatically learning ontologies from textual data, lightening the human workload required for conceptualizing domain-specific knowledge and populating an extracted schema with real data, and speeding up the whole ontology production process. Here computational linguistics plays a fundamental role, from automatically identifying facts in natural language and extracting frames of relations among recognized entities, to producing linked data with which to extend existing knowledge bases or create new ones. In the state of the art, automatic ontology learning systems are mainly based on plain pipelined linguistic classifiers performing tasks such as Named Entity recognition, Entity resolution, Taxonomy and Relation extraction [11]. These approaches present some weaknesses, especially in capturing the structures through which the meaning of complex concepts is expressed [24]. Humans, in fact, tend to organize knowledge in well-defined patterns, which include participant entities and meaningful relations linking entities with each other. In the literature, these structures have been called Semantic Frames by Fillmore [20], or more recently Knowledge Patterns [23]. Some NLP studies have recently shown the possibility of performing more accurate deep parsing with the ability to logically understand the structure of discourse [7]. In this work, some of these technologies have been investigated and employed to produce accurate ontology schemas. The long-term goal is to collect large amounts of semantically structured information from the web of crowds, through an automated process, in order to identify and investigate the cognitive patterns used by humans to organize their knowledge.
Abstract:
An accurate and sensitive species-specific GC-ICP-IDMS (gas chromatography inductively coupled plasma isotope dilution mass spectrometry) method for the determination of trimethyllead and a multi-species-specific GC-ICP-IDMS method for the simultaneous determination of trimethyllead, methylmercury, and butyltins in biological and environmental samples were developed. They allow the determination of the corresponding elemental species down to the low ng g⁻¹ range. The developed synthesis scheme for the formation of isotopically labeled Me₃²⁰⁶Pb⁺ can be used for future production of this spike. The novel extraction technique, stir bar sorptive extraction (SBSE), was applied for the first time in connection with species-specific isotope dilution GC-ICP-MS for the determination of trimethyllead, methylmercury and butyltins. The results were compared with liquid-liquid extraction. The developed methods were validated by the analysis of certified reference materials. The liquid-liquid extraction GC-ICP-IDMS method was applied to seafood samples purchased from a supermarket. The methylated lead fraction in these samples, relative to total lead, varied over a broad range of 0.01-7.6%. In contrast, the fraction of methylmercury is much higher, normally in the range of 80-98%. The highest methylmercury content, of up to 12 µg g⁻¹, was determined in shark samples, an animal at the end of the marine food chain, whereas in other seafood samples a MeHg⁺ content of less than 0.2 µg g⁻¹ was found. Butyltin species could only be determined in samples where anthropogenic contamination must be assumed. This explains the observed broad variation of the butylated tin fraction, in the range of <0.3-49%, in different seafood samples. Because all isotope-labelled spike compounds, except trimethyllead, are commercially available, the developed multi-species-specific GC-ICP-IDMS method has high potential for future routine analysis.
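Quantification in isotope dilution rests on a single measured isotope ratio in the spiked sample. As a point of reference, a generic single-IDMS relation is sketched below in LaTeX; this is a standard textbook form, not necessarily the exact calibration model used in this work.

```latex
% Generic single isotope-dilution relation (sketch, not the paper's exact model):
%   N_x : amount of analyte element in the sample,  N_y : amount added with the spike
%   A^a, A^b : isotope abundances of the reference (a) and spike (b) isotopes
%   R_m : measured a/b isotope ratio in the sample-spike blend
N_x \;=\; N_y \,\frac{A_y^{a} - R_m\,A_y^{b}}{R_m\,A_x^{b} - A_x^{a}}
```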
Abstract:
The central objective of research in Information Retrieval (IR) is to discover new techniques to retrieve relevant information in order to satisfy an Information Need. The Information Need is satisfied when relevant information can be provided to the user. In IR, relevance is a fundamental concept which has changed over time, from popular to personal: what was considered relevant before was information for the whole population, whereas what is considered relevant now is specific information for each user. Hence, there is a need to connect the behavior of the system to the condition of a particular person and his social context; from this need an interdisciplinary field called Human-Centered Computing was born. For the modern search engine, the information extracted for the individual user is crucial. According to Personalized Search (PS), two different techniques are necessary to personalize a search: contextualization (interconnected conditions that occur in an activity) and individualization (characteristics that distinguish an individual). This shift of focus to the individual's need undermines the rigid linearity of the classical model, which has been overtaken by the "berry picking" model; the latter explains that search terms change thanks to the informational feedback received from the search activity, introducing the concept of evolution of search terms. The development of Information Foraging theory, which observed the correlations between animal foraging and human information foraging, also contributed to this transformation through attempts to optimize the cost-benefit ratio. This thesis arose from the need to satisfy human individuality when searching for information, and it develops a synergistic collaboration between the frontiers of technological innovation and recent advances in IR. The search method developed exploits what is relevant for the user by radically changing the way in which an Information Need is expressed, because it is now expressed through the generation of the query and its own context. In fact, the method was conceived to improve the quality of search by rewriting the query based on contexts automatically generated from a local knowledge base. Furthermore, the idea of optimizing each IR system led to developing it as a middleware of interaction between the user and the IR system. Thereby the system has just two possible actions: rewriting the query and reordering the results, as sketched below. Actions equivalent to this approach have been described in PS, which generally exploits information derived from the analysis of user behavior, while the proposed approach exploits knowledge provided by the user. The thesis then goes further, generating a novel assessment procedure, according to the "Cranfield paradigm", in order to evaluate this type of IR system. The results achieved are interesting considering both the effectiveness obtained and the innovative approach undertaken, together with the several applications inspired by the use of a local knowledge base.
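A minimal Python sketch of the two middleware actions described above (query rewriting from a local knowledge base and result reordering) follows; the knowledge base, scoring function and stub search engine are hypothetical placeholders, not the thesis' implementation.

```python
# Minimal sketch of an IR "middleware" with two actions:
# (1) rewrite the query using a local knowledge base, (2) reorder the results.
local_kb = {  # hypothetical user-provided context terms per concept
    "jaguar": ["animal", "felidae", "wildlife"],
}

def rewrite_query(query: str) -> str:
    terms = query.lower().split()
    expanded = []
    for t in terms:
        expanded.append(t)
        expanded.extend(local_kb.get(t, []))   # contextualization from the local KB
    return " ".join(expanded)

def reorder(results, context_terms):
    # Push up documents overlapping the user's context (individualization).
    def score(doc):
        return sum(term in doc["text"].lower() for term in context_terms)
    return sorted(results, key=score, reverse=True)

def search(q):   # stand-in for the underlying IR system
    return [{"text": "Jaguar cars review"}, {"text": "Jaguar felidae habitat"}]

q = rewrite_query("jaguar")
ranked = reorder(search(q), context_terms=local_kb["jaguar"])
print(q)
print([d["text"] for d in ranked])
```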
Abstract:
The evaluation of chronic activity of the hypothalamic-pituitary-adrenal (HPA) axis is critical for determining the impact of chronic stressful situations. The potential use of hair glucocorticoids as a non-invasive, retrospective biomarker of long-term HPA activity is of great interest, and it is gaining acceptance in humans and animals. However, there are still no studies in the literature examining hair cortisol concentration in pigs or corticosterone concentration in laboratory rodents. Therefore, we developed and validated, for the first time, a method for measuring hair glucocorticoid concentrations in commercial sows and in Sprague-Dawley rats. Our preliminary data demonstrated: 1) a validated and specific washing protocol and extraction assay method with good sensitivity in both species; 2) the effect of the reproductive phase, housing conditions and seasonality on hair cortisol concentration in sows; 3) similar hair corticosterone concentrations in male and female rats; 4) elevated hair corticosterone concentrations in response to chronic stress manipulations and chronic ACTH administration, demonstrating that hair provides a better direct index of HPA activity over long periods than other, indirect parameters such as adrenal or thymus weight. From these results we believe that this new non-invasive tool should be applied to better characterize the overall impact, in livestock animals and in laboratory rodents, of chronic stressful situations that negatively affect animal welfare. Nevertheless, further studies are needed to improve this methodology and perhaps to develop animal models for chronic stress of high interest and translational value in human medicine.
Abstract:
Schroeder's backward integration method is the most used method to extract the decay curve of an acoustic impulse response and to calculate the reverberation time from this curve. In the literature the limits and the possible improvements of this method are widely discussed. In this work a new method is proposed for the evaluation of the energy decay curve. The new method has been implemented in a Matlab toolbox. Its performance has been tested against the most accredited literature method. The values of EDT and reverberation time extracted from the energy decay curves calculated with both methods have been compared, in terms of the values themselves and in terms of their statistical representativeness. The main case study consists of nine Italian historical theatres in which acoustical measurements were performed. The comparison of the two extraction methods has also been applied to a critical case, i.e. the structural impulse responses of some building elements. The comparison shows that both methods return comparable values of T30. As the evaluation range decreases, they reveal increasing differences; in particular, the main differences are in the first part of the decay, where the EDT is evaluated. This is a consequence of the fact that the new method returns a "locally" defined energy decay curve, whereas Schroeder's method accumulates energy from the tail to the beginning of the impulse response. Another characteristic of the new method for energy decay curve extraction is its independence from the background noise estimation. Finally, a statistical analysis is performed on the T30 and EDT values calculated from the impulse response measurements in the Italian historical theatres. The aim of this evaluation is to determine whether a subset of measurements can be considered representative for a complete characterization of these opera houses.
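For reference, the classical Schroeder decay curve is a backward cumulative integration of the squared impulse response. The Python sketch below computes it for a synthetic, exponentially decaying response and estimates T30 from a linear fit of the decay between -5 dB and -35 dB; the thesis' "locally" defined alternative method is not reproduced here.

```python
import numpy as np

# Minimal sketch of Schroeder backward integration and a T30 estimate.
fs = 48000
t = np.arange(0, 2.0, 1 / fs)
rt_true = 1.2                                             # seconds, for the synthetic IR
h = np.random.randn(t.size) * 10 ** (-3 * t / rt_true)    # exponentially decaying noise

energy = h ** 2
edc = np.cumsum(energy[::-1])[::-1]            # backward (Schroeder) integration
edc_db = 10 * np.log10(edc / edc.max())        # normalized energy decay curve, dB

# T30: linear fit of the decay between -5 dB and -35 dB, extrapolated to -60 dB
i5 = np.argmax(edc_db <= -5)
i35 = np.argmax(edc_db <= -35)
slope, intercept = np.polyfit(t[i5:i35], edc_db[i5:i35], 1)   # dB per second
T30 = -60.0 / slope
print(f"Estimated T30 ~ {T30:.2f} s")
```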
Abstract:
This thesis aims at investigating methods and software architectures for discovering the typical and frequently occurring structures used for organizing knowledge on the Web. We identify these structures as Knowledge Patterns (KPs). KP discovery needs to address two main research problems: the heterogeneity of sources, formats and semantics on the Web (i.e., the knowledge soup problem) and the difficulty of drawing a relevant boundary around data that allows capturing the meaningful knowledge with respect to a certain context (i.e., the knowledge boundary problem). Hence, we introduce two methods that provide different solutions to these two problems by tackling KP discovery from two different perspectives: (i) the transformation of KP-like artifacts into KPs formalized as OWL2 ontologies; (ii) the bottom-up extraction of KPs by analyzing how data are organized in Linked Data. The two methods address the knowledge soup and boundary problems in different ways. The first method provides a solution based on a purely syntactic transformation step of the original source to RDF, followed by a refactoring step whose aim is to add semantics to the RDF by selecting meaningful RDF triples. The second method allows boundaries to be drawn around RDF in Linked Data by analyzing type paths. A type path is a possible route through an RDF graph that takes into account the types associated with the nodes of the path. Then we present K~ore, a software architecture conceived as the basis for developing KP discovery systems and designed according to two software architectural styles, i.e., Component-based and REST. Finally, we provide an example of KP reuse based on Aemoo, an exploratory search tool that exploits KPs for performing entity summarization.
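A toy illustration of the type-path idea follows: for each non-typing triple, the path records the subject's type, the property and the object's type, and frequent paths hint at a meaningful boundary. The data and vocabulary below are invented, and this is only a sketch of the counting step, not the thesis' system.

```python
from collections import Counter

# Minimal sketch of "type path" extraction over a small in-memory triple set.
RDF_TYPE = "rdf:type"
triples = [
    ("ex:Bologna", RDF_TYPE, "dbo:City"),
    ("ex:Italy",   RDF_TYPE, "dbo:Country"),
    ("ex:Rome",    RDF_TYPE, "dbo:City"),
    ("ex:Bologna", "dbo:country", "ex:Italy"),
    ("ex:Rome",    "dbo:country", "ex:Italy"),
]

types = {}
for s, p, o in triples:
    if p == RDF_TYPE:
        types.setdefault(s, set()).add(o)

paths = Counter()
for s, p, o in triples:
    if p == RDF_TYPE:
        continue
    for ts in types.get(s, ()):
        for to in types.get(o, ()):
            paths[(ts, p, to)] += 1          # one occurrence of this type path

print(paths.most_common())   # frequent type paths suggest a KP boundary
```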
Abstract:
Satellite image classification involves designing and developing efficient image classifiers. With satellite image data and image analysis methods multiplying rapidly, selecting the right mix of data sources and data analysis approaches has become critical to the generation of quality land-use maps. In this study, a new postprocessing information fusion algorithm for the extraction and representation of land-use information based on high-resolution satellite imagery is presented. This approach can produce land-use maps with sharp interregional boundaries and homogeneous regions. The proposed approach is conducted in five steps. First, a GIS layer (ATKIS data) was used to generate two coarse homogeneous regions, i.e. urban and rural areas. Second, a thematic (class) map was generated by a hybrid spectral classifier combining the Gaussian Maximum Likelihood algorithm (GML) and the ISODATA classifier. Third, a probabilistic relaxation algorithm was applied to the thematic map, resulting in a smoothed thematic map. Fourth, edge detection and edge thinning techniques were used to generate a contour map with pixel-width interclass boundaries. Fifth, the contour map was superimposed on the thematic map by a region-growing algorithm with the contour map and the smoothed thematic map as two constraints. For the operation of the proposed method, a software package was developed using the programming language C. This software package comprises the GML algorithm, a probabilistic relaxation algorithm, the TBL edge detector, an edge thresholding algorithm, a fast parallel thinning algorithm, and a region-growing information fusion algorithm. The county of Landau in the state of Rheinland-Pfalz, Germany, was selected as a test site. High-resolution IRS-1C imagery was used as the principal input data.
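To make the final fusion step concrete, the Python sketch below grows 4-connected regions that stop at contour pixels and relabels each region with the majority class of the (smoothed) thematic map. The tiny arrays and the majority-vote rule are illustrative assumptions, not the published implementation (which was written in C).

```python
import numpy as np
from collections import deque, Counter

# Toy thematic map (class labels) and contour map (True = detected edge pixel).
thematic = np.array([[1, 1, 2, 2],
                     [1, 1, 2, 2],
                     [3, 3, 2, 2],
                     [3, 3, 3, 2]])
contour = np.zeros_like(thematic, dtype=bool)
contour[1, 2] = contour[2, 1] = True

labels = -np.ones_like(thematic)
next_label = 0
for sr, sc in np.ndindex(thematic.shape):
    if labels[sr, sc] != -1 or contour[sr, sc]:
        continue
    # 4-connected region growing from the seed (sr, sc), blocked by contour pixels
    queue, members = deque([(sr, sc)]), []
    labels[sr, sc] = next_label
    while queue:
        r, c = queue.popleft()
        members.append(thematic[r, c])
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < thematic.shape[0] and 0 <= nc < thematic.shape[1]
                    and labels[nr, nc] == -1 and not contour[nr, nc]):
                labels[nr, nc] = next_label
                queue.append((nr, nc))
    majority = Counter(members).most_common(1)[0][0]
    thematic[labels == next_label] = majority     # homogeneous region, sharp boundary
    next_label += 1

print(thematic)
```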