996 results for approximate membership extraction
Abstract:
The main goal of this work was to evaluate thermodynamic parameters of the soybean oil extraction process using ethanol as solvent. The experimental treatments were as follows: aqueous solvents with water contents varying from 0 to 13% (mass basis) and extraction temperatures varying from 50 to 100 °C. The distribution coefficients of oil at equilibrium were used to calculate enthalpy, entropy and free energy changes. The results indicate that the oil extraction process with ethanol is feasible and spontaneous, mainly at higher temperatures. Also, the influences of water level in the solvent and of temperature were analysed using response surface methodology (RSM). It can be noted that the extraction yield was highly affected by both independent variables. A joint analysis of the thermodynamic and RSM results indicates the optimal level of solvent hydration and temperature at which to perform the extraction process.
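The thermodynamic treatment described above follows from the van't Hoff relation: plotting ln K against 1/T gives the enthalpy change from the slope and the entropy change from the intercept, and ΔG = ΔH − TΔS then signals spontaneity. A minimal sketch of that calculation (function names and the plain least-squares fit are illustrative assumptions, not the authors' procedure):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def vant_hoff(temps_K, K_dist):
    """Fit ln K vs 1/T (van't Hoff plot) by ordinary least squares and
    return (dH, dS). Illustrative sketch; the abstract does not give the
    authors' exact regression procedure."""
    x = [1.0 / T for T in temps_K]
    y = [math.log(K) for K in K_dist]
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
             / sum((xi - xbar) ** 2 for xi in x))
    intercept = ybar - slope * xbar
    return -R * slope, R * intercept   # dH in J/mol, dS in J/(mol*K)

def gibbs(dH, dS, T):
    """dG = dH - T*dS; a negative value indicates a spontaneous process."""
    return dH - T * dS
```

Feeding the fit with distribution coefficients measured at several temperatures recovers ΔH and ΔS, from which ΔG at any temperature of interest follows.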
Abstract:
Non-linear methods for estimating variability in time-series are currently in widespread use. Among such methods are approximate entropy (ApEn) and sample entropy (SampEn). The applicability of ApEn and SampEn in analyzing data is evident and their use is increasing. However, consistency is a point of concern in these tools, i.e., the classification of the temporal organization of a data set might indicate a relatively less ordered series in relation to another when the opposite is true. As highlighted by their proponents themselves, ApEn and SampEn might produce incorrect results due to this lack of consistency. In this study, we present a method which gains consistency by applying ApEn repeatedly over a wide range of combinations of window lengths and matching error tolerances. The tool is called volumetric approximate entropy, vApEn. We analyze nine artificially generated prototypical time-series with different degrees of temporal order (combinations of sine waves, logistic maps with different control parameter values, and random noise). While ApEn/SampEn clearly fail to consistently identify the temporal order of the sequences, vApEn does so correctly. In order to validate the tool we performed shuffled and surrogate data analysis. Statistical analysis confirmed the consistency of the method. (C) 2008 Elsevier Ltd. All rights reserved.
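The construction the abstract describes, evaluating ApEn over many (window length, tolerance) combinations and aggregating, can be sketched as follows. The `apen` function is the standard Pincus definition; the summation in `vapen` is an assumed aggregation, since the abstract does not specify how the grid of values is combined:

```python
import math

def apen(series, m, r):
    """Approximate entropy with window length m and tolerance r
    (standard Pincus definition; a toy illustration, not the authors' code)."""
    n = len(series)
    def phi(m):
        # For each length-m template, the fraction of templates within
        # Chebyshev distance r; phi is the average of the log-fractions.
        templates = [series[i:i + m] for i in range(n - m + 1)]
        fracs = []
        for t1 in templates:
            c = sum(1 for t2 in templates
                    if max(abs(u - v) for u, v in zip(t1, t2)) <= r)
            fracs.append(c / len(templates))
        return sum(math.log(f) for f in fracs) / len(templates)
    return phi(m) - phi(m + 1)

def vapen(series, ms, rs):
    """Volumetric ApEn as sketched in the abstract: accumulate ApEn over a
    grid of (m, r) combinations. Summation is an assumption."""
    return sum(apen(series, m, r) for m in ms for r in rs)
```

A strictly periodic series yields ApEn values near zero across the grid, while noisy series accumulate larger values, which is the intuition behind scanning the whole (m, r) volume instead of a single point.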
Abstract:
Due to idiosyncrasies in their syntax, semantics or frequency, Multiword Expressions (MWEs) have received special attention from the NLP community, as the methods and techniques developed for the treatment of simplex words are not necessarily suitable for them. This is certainly the case for the automatic acquisition of MWEs from corpora. A lot of effort has been directed to the task of automatically identifying them, with considerable success. In this paper, we propose an approach for the identification of MWEs in a multilingual context, as a by-product of a word alignment process, that not only deals with the identification of possible MWE candidates, but also associates some multiword expressions with semantics. The results obtained indicate the feasibility and low costs in terms of tools and resources demanded by this approach, which could, for example, facilitate and speed up lexicographic work.
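As a toy illustration of extracting MWE candidates as a by-product of word alignment, one simple heuristic is to treat contiguous source sequences whose words all align to the same target word as candidates. Both the heuristic and the names below are assumptions; the paper's actual alignment-based criterion is not reproduced here:

```python
def mwe_candidates(src_tokens, alignment):
    """Return contiguous source n-grams aligned many-to-one to a single
    target position, as multiword expression candidates.
    `alignment` is a list of (src_idx, tgt_idx) pairs."""
    tgt_to_src = {}
    for s, t in alignment:
        tgt_to_src.setdefault(t, []).append(s)
    candidates = []
    for t in sorted(tgt_to_src):
        srcs = sorted(tgt_to_src[t])
        # Keep only contiguous many-to-one links (an assumed filter).
        if len(srcs) > 1 and srcs == list(range(srcs[0], srcs[-1] + 1)):
            candidates.append(" ".join(src_tokens[i] for i in srcs))
    return candidates
```

For an idiomatic phrase aligned as a block to one target word, the whole phrase surfaces as a candidate, which mirrors the abstract's idea of getting identification and a coarse cross-lingual semantics from the same alignment.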
Abstract:
We introduce a problem called maximum common characters in blocks (MCCB), which arises in applications of approximate string comparison, particularly in the unification of possibly erroneous textual data coming from different sources. We show that this problem is NP-complete, but can nevertheless be solved satisfactorily using integer linear programming for instances of practical interest. Two integer linear formulations are proposed and compared in terms of their linear relaxations. We also compare the results of the approximate matching with other known measures such as the Levenshtein (edit) distance. (C) 2008 Elsevier B.V. All rights reserved.
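For reference, the Levenshtein (edit) distance the abstract compares against is the classic dynamic-programming measure; a compact two-row implementation:

```python
def levenshtein(a, b):
    """Minimum number of insertions, deletions and substitutions turning
    string a into string b (classic DP, kept to two rows)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]
```

The MCCB measure itself is formulated as an integer linear program in the paper and is not sketched here.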
Abstract:
Visual representations of isosurfaces are ubiquitous in the scientific and engineering literature. In this paper, we present techniques to assess the behavior of isosurface extraction codes. Where applicable, these techniques allow us to distinguish whether anomalies in isosurface features can be attributed to the underlying physical process or to artifacts from the extraction process. Such scientific scrutiny is at the heart of verifiable visualization - subjecting visualization algorithms to the same verification process that is used in other components of the scientific pipeline. More concretely, we derive formulas for the expected order of accuracy (or convergence rate) of several isosurface features, and compare them to experimentally observed results in the selected codes. This technique is practical: in two cases, it exposed actual problems in implementations. We provide the reader with the range of responses they can expect to encounter with isosurface techniques, both under "normal operating conditions" and also under adverse conditions. Armed with this information - the results of the verification process - practitioners can judiciously select the isosurface extraction technique appropriate for their problem of interest, and have confidence in its behavior.
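The expected-versus-observed order-of-accuracy comparison at the heart of this verification approach reduces, in its simplest form, to estimating the convergence rate from errors measured at successive grid resolutions, p ≈ log(e1/e2) / log(h1/h2). A sketch of that standard estimate (the paper's feature-specific formulas are not reproduced):

```python
import math

def observed_order(errors, hs):
    """Observed order of accuracy from errors at successive grid spacings:
    p = log(e1/e2) / log(h1/h2) for each consecutive pair of refinements."""
    pairs = zip(zip(errors, hs), zip(errors[1:], hs[1:]))
    return [math.log(e1 / e2) / math.log(h1 / h2)
            for (e1, h1), (e2, h2) in pairs]
```

If the observed orders fall short of the derived expected order, the discrepancy points at an implementation artifact rather than the underlying physics, which is exactly the diagnostic the abstract describes.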
Abstract:
The exchange energy of an arbitrary collinear-spin many-body system in an external magnetic field is a functional of the spin-resolved charge and current densities, E_x[n↑, n↓, j↑, j↓]. Within the framework of density-functional theory (DFT), we show that the dependence of this functional on the four densities can be fully reconstructed from either of two extreme limits: a fully polarized system or a completely unpolarized system. Reconstruction from the limit of an unpolarized system yields a generalization of the Oliver-Perdew spin-scaling relations from spin-DFT to current-DFT. Reconstruction from the limit of a fully polarized system is used to derive the high-field form of the local-spin-density approximation to current-DFT and to magnetic-field DFT.
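The Oliver-Perdew spin-scaling relation referred to above can be stated compactly; a plausible current-DFT analogue scales the paired current density along with the charge density, but the second line is an assumption here and the exact generalized statement is given in the paper itself:

```latex
% Oliver-Perdew spin scaling (spin-DFT), E_x[n] denoting the unpolarized functional:
E_x[n_\uparrow, n_\downarrow]
  = \tfrac{1}{2}\left(E_x[2n_\uparrow] + E_x[2n_\downarrow]\right)
% Hypothesized current-DFT analogue, scaling j together with n:
E_x[n_\uparrow, n_\downarrow, \mathbf{j}_\uparrow, \mathbf{j}_\downarrow]
  = \tfrac{1}{2}\left(E_x[2n_\uparrow, 2\mathbf{j}_\uparrow]
                    + E_x[2n_\downarrow, 2\mathbf{j}_\downarrow]\right)
```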
Abstract:
This work describes a novel methodology for automatic contour extraction from 2D images of 3D neurons (e.g. camera lucida images and other types of 2D microscopy). Most contour-based shape analysis methods cannot be used to characterize such cells because of overlaps between neuronal processes. The proposed framework is specifically aimed at the problem of contour following even in the presence of multiple overlaps. First, the input image is preprocessed in order to obtain an 8-connected skeleton with one-pixel-wide branches, as well as a set of critical regions (i.e., bifurcations and crossings). Next, for each subtree, the tracking stage iteratively labels all valid pixels of a branch, up to a critical region, where it determines the suitable direction in which to proceed. Finally, the labeled skeleton segments are followed in order to yield the parametric contour of the neuronal shape under analysis. The reported system was successfully tested on several images, and results from a set of three neuron images are presented here, each pertaining to a different class (alpha, delta and epsilon ganglion cells) and containing a total of 34 crossings. The algorithm successfully negotiated all of these overlaps. The method has also been found to be robust even for images with close parallel segments, and it may be implemented in an efficient manner. The introduction of this approach should pave the way for more systematic application of contour-based shape analysis methods in neuronal morphology. (C) 2008 Elsevier B.V. All rights reserved.
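The branch-following step can be pictured with a toy skeleton walker: from a starting pixel, keep moving to the unique unvisited 8-connected neighbor until a tip (no further neighbor) or a critical region (more than one neighbor) is reached. All names below are illustrative assumptions, not the authors' implementation:

```python
def neighbors8(px, py):
    """The 8-connected neighborhood of a pixel, as used for one-pixel-wide
    skeleton branches."""
    return [(px + dx, py + dy)
            for dx in (-1, 0, 1) for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)]

def follow_branch(skeleton, start):
    """Walk a one-pixel-wide branch of `skeleton` (a set of (x, y) pixels)
    from `start` until a tip (0 onward neighbors) or a critical region
    (bifurcation/crossing, >1 onward neighbors) is reached."""
    path = [start]
    prev, cur = None, start
    while True:
        nbrs = [p for p in neighbors8(*cur) if p in skeleton and p != prev]
        if len(nbrs) != 1:      # tip or critical region: stop here
            return path
        prev, cur = cur, nbrs[0]
        path.append(cur)
```

At a critical region the real system must additionally choose the direction that best continues the incoming branch across the overlap; that disambiguation step is the core of the paper and is not sketched here.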
Abstract:
In this work, we introduce a necessary sequential Approximate-Karush-Kuhn-Tucker (AKKT) condition for a point to be a solution of a continuous variational inequality, and we prove its relation with the Approximate Gradient Projection condition (AGP) of Garciga-Otero and Svaiter. We also prove that a slight variation of the AKKT condition is sufficient for a convex problem, either for variational inequalities or optimization. Sequential necessary conditions are more suitable to iterative methods than usual punctual conditions relying on constraint qualifications. The AKKT property holds at a solution independently of the fulfillment of a constraint qualification, but when a weak one holds, we can guarantee the validity of the KKT conditions.
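For reference, the sequential condition for a constrained problem with constraints g(x) ≤ 0 requires a primal-dual sequence with vanishing residuals; for a variational inequality the operator F plays the role that the gradient of the objective plays in optimization. This is the standard sequential AKKT form and may differ in detail from the paper's definition:

```latex
% AKKT at x^*: there exist x^k \to x^* and multipliers \lambda^k \ge 0 with
\lim_{k\to\infty}\Big\| F(x^k) + \sum_i \lambda_i^k \nabla g_i(x^k) \Big\| = 0,
\qquad
\lim_{k\to\infty} \min\{-g_i(x^k),\, \lambda_i^k\} = 0 \quad \text{for each } i.
```

The point of such conditions is that the sequence (x^k, λ^k) is exactly what an iterative method produces, so the condition can be checked along the iterates without assuming any constraint qualification.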
Abstract:
Coal mining and incineration of solid residues of health services (SRHS) generate several contaminants that are released into the environment, such as heavy metals and dioxins. These xenobiotics can lead to oxidative stress overgeneration in organisms and cause different kinds of pathologies, including cancer. In the present study we measured urinary concentrations of heavy metals (lead, copper, iron, manganese and zinc) and several enzymatic and non-enzymatic blood biomarkers of oxidative stress (contents of lipid peroxidation = TBARS, protein carbonyls = PC, protein thiols = PT, alpha-tocopherol = AT and reduced glutathione = GSH, and the activities of glutathione S-transferase = GST, glutathione reductase = GR, glutathione peroxidase = GPx, catalase = CAT and superoxide dismutase = SOD) in six groups (n = 20 each) of subjects exposed to airborne contamination related to coal mining or SRHS incineration, after supplementation with vitamin E (800 mg/day) and vitamin C (500 mg/day) for 6 months; the results were compared with those obtained before the antioxidant intervention (Avila et al., Ecotoxicology 18:1150-1157, 2009; Possamai et al., Ecotoxicology 18:1158-1164, 2009). Except for decreased manganese contents, heavy metal concentrations were elevated in all groups exposed to either source of airborne contamination when compared to controls. TBARS and PC concentrations, which were elevated before the antioxidant intervention, decreased after the supplementation. Similarly, the contents of PT, AT and GSH, which were decreased before the intervention, reached values near those found in controls; GPx activity was reestablished in underground miners, and SOD, CAT and GST activities were reestablished in all groups.
The results showed that the oxidative stress condition detected prior to the antioxidant supplementation, in subjects both directly and indirectly exposed to airborne contamination from coal dust and SRHS incineration, was attenuated after the antioxidant intervention.
Abstract:
In this work, cassava bagasse, a by-product of cassava starch industrialization, was investigated as a new raw material from which to extract cellulose whiskers. This by-product is basically constituted of cellulose fibers (17.5 wt%) and residual starch (82 wt%). The residue thus contains both natural fibers and a considerable quantity of starch, a composition that suggests the possibility of using cassava bagasse to prepare both starch nanocrystals and cellulose whiskers. Accordingly, the preparation of cellulose whiskers was investigated employing sulfuric acid hydrolysis conditions reported in the literature. The ensuing materials were characterized by transmission electron microscopy (TEM) and X-ray diffraction. The results showed that high-aspect-ratio cellulose whiskers were successfully obtained. The reinforcing capability of the cellulose whiskers extracted from cassava bagasse was investigated using natural rubber as the matrix, and a strong reinforcing effect was observed by dynamic mechanical analysis. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
Depolymerization of cellulose in homogeneous acidic medium is analyzed on the basis of an autocatalytic model of hydrolysis with a positive feedback of acid production from the degraded biopolymer. The normalized number of scissions per cellulose chain, S(t)/n(0) = 1 - C(t)/C(0), follows a sigmoid behavior with reaction time t, and the cellulose concentration C(t) decreases exponentially with a linear plus cubic time dependence, C(t) = C(0)·exp(-at - bt³), where a and b are model parameters readily determined from data analysis.
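The two quoted formulas are straightforward to evaluate; a minimal sketch (parameter values are arbitrary placeholders, not fitted data from the paper):

```python
import math

def cellulose_conc(t, C0, a, b):
    """Concentration decay from the quoted autocatalytic hydrolysis model:
    C(t) = C0 * exp(-a*t - b*t**3). The cubic term encodes the
    acid-production feedback that accelerates late-time degradation."""
    return C0 * math.exp(-a * t - b * t ** 3)

def scissions_fraction(t, C0, a, b):
    """Normalized scissions per chain, S(t)/n(0) = 1 - C(t)/C(0),
    which traces a sigmoid in time as the cubic term takes over."""
    return 1.0 - cellulose_conc(t, C0, a, b) / C0
```

In practice a and b would be obtained by fitting ln[C(t)/C(0)] = -at - bt³ to measured concentrations, as the abstract indicates.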
Abstract:
In this article, a novel polydimethylsiloxane/activated carbon (PDMS-ACB) material is proposed as a new polymeric phase for stir bar sorptive extraction (SBSE). The PDMS-ACB stir bar, assembled using a simple Teflon®/glass capillary mold, demonstrated remarkable stability and resistance to organic solvents over more than 150 extractions. The SBSE bar has a diameter of 2.36 mm and a length of 2.2 cm and contains 92 µL of polymer coating. This new PDMS-ACB bar was evaluated for its ability to quantify pesticides in sugarcane juice samples by performing liquid desorption (LD) in 200 µL of ethyl acetate and analyzing the solvent by gas chromatography coupled with mass spectrometry (GC-MS). A fractional factorial design was used to evaluate the main parameters involved in the extraction procedure. Then, a central composite design with a star configuration was used to optimize the significant extraction parameters. The method demonstrated limits of quantification (LOQ) of 0.5-40 µg/L, depending on the analyte; recoveries varied from 0.18 to 49.50%, and the intraday precision ranged from 0.072 to 8.40%. The method was applied to the analysis of real sugarcane juice samples commercially available in local markets.
Abstract:
This article presents a method employing stir bar sorptive extraction (SBSE) with in situ derivatization, in combination with either thermal or liquid desorption coupled on-line to gas chromatography-mass spectrometry, for the analysis of fluoxetine in plasma samples. Ethyl chloroformate was employed as the derivatizing agent, producing symmetrical peaks. Parameters such as solvent polarity, analyte desorption time, and extraction time were evaluated. During the validation process, the developed method showed specificity, linearity (R² > 0.99), precision (R.S.D. < 15%), and limits of quantification (LOQ) of 30 and 1.37 pg mL⁻¹ when liquid and thermal desorption were employed, respectively. This simple and highly sensitive method proved adequate for the measurement of fluoxetine at typical and trace concentration levels. (c) 2008 Elsevier B.V. All rights reserved.
Abstract:
A variety of substrates have been used for fabrication of microchips for DNA extraction, PCR amplification, and DNA fragment separation, including the more conventional glass and silicon as well as alternative polymer-based materials. Polyester represents one such polymer, and the laser-printing of toner onto polyester films has been shown to be effective for generating polyester-toner (PeT) microfluidic devices with channel depths on the order of tens of micrometers. Here, we describe a novel and simple process that allows for the production of multilayer, high-aspect-ratio PeT microdevices with substantially larger channel depths. This innovative process uses a CO₂ laser to create the microchannel in polyester sheets containing a uniform layer of printed toner, and multilayer devices can easily be constructed by sandwiching the channel layer between uncoated cover sheets of polyester containing precut access holes. The process allows the fabrication of deep channels, approximately 270 µm, and we demonstrate the effectiveness of multilayer PeT microchips for dynamic solid phase extraction (dSPE) and PCR amplification. With the former, we found that (i) more than 65% of DNA from 0.6 µL of blood was recovered, (ii) the resultant DNA was concentrated to greater than 3 ng/µL (better than other chip-based extraction methods), and (iii) the DNA recovered was compatible with downstream microchip-based PCR amplification. Illustrative of the compatibility of PeT microchips with the PCR process, the successful amplification of a 520 bp fragment of lambda-phage DNA in a conventional thermocycler is shown. The ability to handle the diverse chemistries associated with DNA purification and extraction is a testament to the potential utility of PeT microchips beyond separations and presents a promising new disposable platform for genetic analysis that is low cost and easy to fabricate.