910 results for 3D feature extraction
Abstract:
In music genre classification, most approaches rely on statistical characteristics of low-level features computed on short audio frames. These methods implicitly assume that frames carry equally relevant information and that either individual frames, or distributions thereof, somehow capture the specificities of each genre. In this paper we study the representation space defined by short-term audio features with respect to class boundaries, and compare different processing techniques to partition this space. These partitions are evaluated in terms of accuracy on two genre classification tasks, with several types of classifiers. Experiments show that a randomized and unsupervised partition of the space, used in conjunction with a Markov Model classifier, leads to accuracies comparable to the state of the art. We also show that unsupervised partitions of the space tend to create fewer hubs.
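As a rough sketch of the idea (our own toy code, not the paper's pipeline): frames can be mapped to discrete symbols by a random, unsupervised partition of the feature space (here, random hyperplanes), and each genre modelled as a first-order Markov chain over the symbol sequence; a test sequence is assigned to the genre whose model gives the highest log-likelihood.

```python
import math
import random

def random_partition(dim, n_planes, seed=0):
    """Unsupervised partition: each hyperplane halves the space, so a
    frame maps to one of 2**n_planes regions (its symbol)."""
    rng = random.Random(seed)
    return [[rng.uniform(-1.0, 1.0) for _ in range(dim)] for _ in range(n_planes)]

def quantize(frame, planes):
    """Map a feature vector to a region id (bit per hyperplane side)."""
    code = 0
    for i, w in enumerate(planes):
        if sum(a * b for a, b in zip(w, frame)) > 0:
            code |= 1 << i
    return code

def train_markov(symbol_seqs, n_states):
    """First-order transition matrix with add-one smoothing (one per genre)."""
    counts = [[1.0] * n_states for _ in range(n_states)]
    for seq in symbol_seqs:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1.0
    return [[c / sum(row) for c in row] for row in counts]

def loglik(seq, transitions):
    """Log-likelihood of a symbol sequence under one genre's Markov model."""
    return sum(math.log(transitions[a][b]) for a, b in zip(seq, seq[1:]))
```

With models trained per genre, classification reduces to an `argmax` of `loglik` over the genre models.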
Abstract:
With electricity market liberalization, distribution and retail companies are looking for better market strategies based on adequate information about the consumption patterns of their electricity customers. In this environment, all consumers are free to choose their electricity supplier, and a fair insight into customer behaviour will permit the definition of specific contract aspects based on the different consumption patterns. In this paper, Data Mining (DM) techniques are applied to electricity consumption data from a utility client database. To form the different customer classes, and to find a set of representative consumption patterns, we used the Two-Step algorithm, a hierarchical clustering algorithm. Each consumer class is represented by the load profile resulting from the clustering operation. Next, to characterize each consumer class, a classification model is constructed with the C5.0 classification algorithm.
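The clustering step can be illustrated with a stand-in: the paper uses the Two-Step hierarchical algorithm, but a plain k-means over made-up daily load curves shows how consumption patterns are grouped and each class represented by its centroid load profile.

```python
import math
import random

def dist(a, b):
    """Euclidean distance between two load curves."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def kmeans(curves, k, iters=20, init=None, seed=0):
    """Toy k-means: returns (centroid load profiles, cluster memberships)."""
    rng = random.Random(seed)
    centroids = [list(c) for c in (init if init else rng.sample(curves, k))]
    groups = [[] for _ in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for c in curves:
            nearest = min(range(k), key=lambda i: dist(c, centroids[i]))
            groups[nearest].append(c)
        for i, g in enumerate(groups):
            if g:  # keep the old centroid if a cluster empties out
                centroids[i] = [sum(col) / len(g) for col in zip(*g)]
    return centroids, groups
```

For example, 4-point curves with an evening peak versus a midday peak separate cleanly into two classes, and each centroid is the representative profile for its class.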
Abstract:
We discuss the most interesting results obtained in our laboratories during the supercritical CO2 extraction of bioactive compounds from microalgae and of volatile oils from aromatic plants. Concerning the microalgae, the studies on Botryococcus braunii and Chlorella vulgaris were selected. Hydrocarbons from the first microalga, which are mainly linear alkadienes (C23–C31) with an odd number of carbon atoms, were selectively extracted at 313 K by increasing the pressure up to 30.0 MPa. These hydrocarbons are easily extracted at this pressure, since they are located outside the cell walls. The extraction of carotenoids, mainly canthaxanthin and astaxanthin, from C. vulgaris is more difficult: the extraction yield of these components at 313 K and 35.0 MPa increased with the degree of crushing of the microalga, since they are not extracellular. On the other hand, for the extraction of volatile oils from aromatic plants, studies on Mentha pulegium and Satureja montana L. were chosen. For the first aromatic plant, the composition of the volatile and essential oils was similar, the main components being pulegone and menthone. However, this volatile oil contained small amounts of waxes, whose content decreased with decreasing particle size of the plant matrix. For S. montana L. it was also observed that both oils have a similar composition, the main components being carvacrol and thymol. The main difference is the relative amount of thymoquinone, whose content can be 15 times higher in the volatile oil. This oxygenated monoterpene has important biological activities. Moreover, experimental studies on the anticholinesterase activity of supercritical extracts of S. montana were also carried out. The supercritical non-volatile fraction, which presented the highest content of protocatechuic, vanillic and chlorogenic acids and (+)-catechin, is the most promising inhibitor of the enzyme butyrylcholinesterase. In contrast, the Soxhlet acetone extract did not affect the activity of this enzyme at the concentrations tested.
Abstract:
This work describes a methodology to extract symbolic rules from trained neural networks. In our approach, patterns in the network are codified using formulas of a Lukasiewicz logic. For this we take advantage of the fact that every connective in this multi-valued logic can be evaluated by a neuron in an artificial network having, as activation function, the identity truncated to zero and one. This fact simplifies symbolic rule extraction and allows the easy injection of formulas into a network architecture. We trained this type of neural network using a back-propagation algorithm based on the Levenberg–Marquardt algorithm, where in each learning iteration we restricted the knowledge dissemination in the network structure. This makes the descriptive power of the produced neural networks similar to that of the Lukasiewicz logic language, minimizing the information loss in the translation between connectionist and symbolic structures. To avoid redundancy in the generated networks, the method simplifies them in a pruning phase, using the "Optimal Brain Surgeon" algorithm. We tested this method on the task of finding the formula used to generate a given truth table. For real-data tests, we selected the Mushroom data set, available in the UCI Machine Learning Repository.
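The key fact the abstract relies on can be sketched in a few lines: each Lukasiewicz connective is an affine combination of its inputs passed through the identity truncated to [0, 1], i.e., a single neuron with that activation function (a minimal sketch; the function names are ours).

```python
def trunc(x):
    """Activation function: the identity truncated to the unit interval."""
    return max(0.0, min(1.0, x))

def luk_and(a, b):
    """Strong conjunction: max(0, a + b - 1) -- one neuron, weights (1, 1), bias -1."""
    return trunc(a + b - 1.0)

def luk_or(a, b):
    """Strong disjunction: min(1, a + b) -- one neuron, weights (1, 1), bias 0."""
    return trunc(a + b)

def luk_implies(a, b):
    """Implication: min(1, 1 - a + b) -- one neuron, weights (-1, 1), bias 1."""
    return trunc(1.0 - a + b)
```

Because every connective is one neuron, a formula maps directly onto a network topology, which is what makes injection and extraction of rules straightforward.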
Abstract:
Proteins are biochemical entities consisting of one or more blocks typically folded in a 3D pattern. Each block (a polypeptide) is a single linear sequence of amino acids that are biochemically bonded together. The amino acid sequence of a protein is defined by the sequence of a gene, or of several genes, encoded in the DNA-based genetic code. This genetic code typically uses twenty amino acids, but in certain organisms it can include two additional ones. After the amino acids are linked during protein synthesis, each becomes a residue in the protein, which may then be chemically modified, ultimately changing and defining the protein's function. In this study, the authors analyze amino acid sequences using alignment-free methods, aiming to identify structural patterns in sets of proteins and in the proteome without any prior assumptions. The paper starts by analyzing amino acid sequence data by means of histograms of fixed-length amino acid words (tuples). After creating the initial relative frequency histograms, these are transformed and processed to generate quantitative results for information extraction and graphical visualization. Selected samples from two reference datasets are used, and the results reveal that the proposed method generates relevant outputs consistent with current scientific knowledge in domains such as protein sequence and proteome analysis.
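The initial histogram step can be sketched directly (a minimal illustration; the function name and the toy sequence are ours, not the authors'): count overlapping fixed-length words and normalize to relative frequencies.

```python
from collections import Counter

def kmer_histogram(seq, k):
    """Relative frequency histogram of overlapping length-k words (tuples)
    in an amino acid sequence given as a string of one-letter codes."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {kmer: c / total for kmer, c in counts.items()}
```

These per-sequence histograms are what would then be transformed and compared across proteins or whole proteomes.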
Abstract:
In this work, a microwave-assisted extraction (MAE) methodology was compared with several conventional extraction methods (Soxhlet, Bligh & Dyer, modified Bligh & Dyer, Folch, modified Folch, Hara & Radin, Roese-Gottlieb) for quantification of the total lipid content of three fish species: horse mackerel (Trachurus trachurus), chub mackerel (Scomber japonicus), and sardine (Sardina pilchardus). The influence of species, extraction method and frozen storage time (from fresh up to 9 months of freezing) on total lipid content was analysed in detail. The MAE, Bligh & Dyer, Folch, modified Folch and Hara & Radin methods showed the highest efficiencies and, although these were not statistically different, they differed in variability, with MAE showing the highest repeatability (CV = 0.034). The Roese-Gottlieb, Soxhlet, and modified Bligh & Dyer methods performed very poorly in terms of both efficiency and repeatability (CV between 0.13 and 0.18).
Abstract:
This paper reports a novel application of microwave-assisted extraction (MAE) of polyphenols from brewer's spent grains (BSG). A 2⁴ orthogonal composite design was used to obtain the optimal MAE conditions. The influence of the MAE operational parameters (extraction time, temperature, solvent volume and stirring speed) on the extraction yield of ferulic acid was investigated through response surface methodology. The results showed that the optimal conditions were 15 min extraction time, 100 °C extraction temperature, 20 mL of solvent, and maximum stirring speed. Under these conditions, the yield of ferulic acid was 1.31 ± 0.04% (w/w), fivefold higher than that obtained with conventional solid–liquid extraction techniques. The new extraction method considerably reduces extraction time, energy and solvent consumption, while generating less waste. HPLC-DAD-MS analysis indicated that other hydroxycinnamic acids and several ferulic acid dehydrodimers, as well as one dehydrotrimer, were also present, confirming that BSG is a valuable source of antioxidant compounds.
Abstract:
This paper presents the study of the remediation of sandy soils containing six of the most common contaminants (benzene, toluene, ethylbenzene, xylene, trichloroethylene and perchloroethylene) using soil vapour extraction (SVE). The influence of soil water content on the process efficiency was evaluated considering the soil type and the contaminant. For artificially contaminated soils with negligible clay content and natural organic matter, it was concluded that: (i) all the remediation processes presented efficiencies above 92%; (ii) increasing the soil water content made remediation more time-consuming; (iii) longer remediation periods were observed for contaminants with lower vapour pressures and lower water solubilities, due to mass transfer limitations. Based on these results, an easy and relatively fast procedure was developed for predicting the remediation times of real soils; 83% of the remediation times were predicted with relative deviations below 14%.
Abstract:
Soil vapor extraction (SVE) is an efficient, well-known and widely applied soil remediation technology. However, under certain conditions it cannot achieve the defined cleanup goals, requiring further treatment, for example through bioremediation (BR). The sequential application of these technologies is a valid option but has not yet been fully studied. This work presents the study of the remediation of ethylbenzene (EB)-contaminated soils with different soil water and natural organic matter contents (NOMC), using sequential SVE and BR. The results obtained allow the following conclusions: (1) SVE was sufficient to reach the cleanup goals in 63% of the experiments (all the soils with NOMC below 4%); (2) higher NOMC led to longer SVE remediation times; (3) BR proved to be a feasible and cost-effective option when EB concentrations were below 335 mg per kg of soil; and (4) EB concentrations above 438 mg per kg of soil proved inhibitory to microbial activity.
Abstract:
An accurate and sensitive method for the determination of 18 polycyclic aromatic hydrocarbons (PAHs) (the 16 PAHs considered priority pollutants by USEPA, plus dibenzo[a,l]pyrene and benzo[j]fluoranthene) in fish samples was validated. Analysis was performed by microwave-assisted extraction and liquid chromatography with photodiode array and fluorescence detection. Response surface methodology was used to find the optimal extraction parameters. Validation of the overall methodology was performed by spiking assays at four levels and using SRM 2977. Quantification limits ranging from 0.15 to 27.16 ng/g wet weight were obtained. The established method was applied to the edible tissues of three commonly consumed and commercially valuable fish species (sardine, chub mackerel and horse mackerel) originating from the Atlantic Ocean. Variable levels of naphthalene (1.03–2.95 ng/g wet weight), fluorene (0.34–1.09 ng/g wet weight) and phenanthrene (0.34–3.54 ng/g wet weight) were detected in the analysed samples. None of the samples contained detectable amounts of benzo[a]pyrene, the marker used for evaluating the occurrence and carcinogenic effects of PAHs in food.
Abstract:
A QuEChERS method for the extraction of ochratoxin A (OTA) from bread samples was evaluated. A factorial design (2³) was used to find the optimal QuEChERS parameters (extraction time, extraction solvent volume and sample mass). Extracts were analysed by LC with fluorescence detection. The optimal extraction conditions were: 5 g of sample, 15 mL of acetonitrile and 3 min of agitation. The extraction procedure was validated by systematic recovery experiments at three levels. The recoveries obtained ranged from 94.8% (at 1.0 μg kg⁻¹) to 96.6% (at 3.0 μg kg⁻¹). The limit of quantification of the method was 0.05 μg kg⁻¹. The optimised procedure was applied to 20 samples of different bread types ("Carcaça", "Broa de Milho", and "Broa de Avintes") highly consumed in Portugal. None of the samples exceeded the established European legal limit of 3 μg kg⁻¹.
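For illustration, a two-level full factorial design such as the 2³ used here can be enumerated mechanically; the factor names and levels below are loosely based on the optimal conditions reported, not the paper's actual design table.

```python
from itertools import product

# Hypothetical low/high levels for the three QuEChERS factors.
factors = {
    "agitation_min": (1, 3),
    "solvent_mL": (10, 15),
    "sample_g": (2.5, 5),
}

def full_factorial(levels):
    """Enumerate every combination of factor levels: 2**k runs for k two-level factors."""
    names = list(levels)
    return [dict(zip(names, combo)) for combo in product(*levels.values())]

runs = full_factorial(factors)
```

Each of the 2³ = 8 runs fixes one combination of levels; the responses measured at these runs are what the recovery optimisation is fitted to.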
Abstract:
In this work, we present a neural network (NN) based method for 3D rigid-body registration of FMRI time series, which relies on a limited number of Fourier coefficients of the images to be aligned. These coefficients, contained in a small cubic neighborhood located at the first octant of the 3D Fourier space (including the DC component), are fed into six NNs during the learning stage. Each NN yields the estimate of one registration parameter. The proposed method was assessed for 3D rigid-body transformations, using DC neighborhoods of different sizes. The mean absolute registration errors are approximately 0.030 mm in translations and 0.030 deg in rotations, for the typical motion amplitudes encountered in FMRI studies. The construction of the training set and the learning stage are fast, requiring 90 s and 1 to 12 s respectively, depending on the number of input and hidden units of the NN. We believe that NN-based approaches to the problem of FMRI registration can be of great interest in the future. For instance, NNs relying on limited k-space data (possibly from navigator echoes) can be a valid solution to the problem of prospective (in-frame) FMRI registration.
Abstract:
Introduction: Image resizing is a standard feature of Nuclear Medicine digital imaging. Resizing is applied whenever there is a need to increase (or decrease) the total number of pixels; upsampling, in particular, is used by manufacturers to better fit the acquired images to the display screen. This paper aims to compare the "hqnx" and "nxSaI" magnification algorithms with two interpolation algorithms, "nearest neighbor" and "bicubic interpolation", in image upsampling operations. Material and Methods: Three distinct Nuclear Medicine images were enlarged 2 and 4 times with the different digital image resizing algorithms (nearest neighbor, bicubic interpolation, nxSaI and hqnx). To evaluate the pixel changes between the different output images, 3D whole-image plot profiles and surface plots were used in addition to visual inspection of the 4x upsampled images. Results: In the 2x enlarged images the visual differences were not noteworthy, although bicubic interpolation clearly presented the best results. In the 4x enlarged images the differences were significant, with the bicubic interpolated images again presenting the best results. Hqnx resized images presented better quality than the 4xSaI and nearest neighbor interpolated ones; however, the intense "halo effect" of hqnx greatly degrades the definition and boundaries of the image contents. Conclusion: The hqnx and nxSaI algorithms were designed for images with clear edges, so their use on Nuclear Medicine images is clearly inadequate. Of the algorithms studied, bicubic interpolation seems the most suitable, and its ever wider application suggests as much, as it performs efficiently across image types.
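Of the compared methods, nearest-neighbor upsampling is simple enough to sketch directly: every pixel is replicated into an n × n block (a minimal sketch of that one baseline; bicubic and the hqnx/nxSaI family require proper interpolation kernels and are not reproduced here).

```python
def nearest_neighbour_upsample(img, n):
    """img: 2D list of pixel values; returns the image enlarged n times
    by replicating each pixel into an n x n block."""
    out = []
    for row in img:
        stretched = [p for p in row for _ in range(n)]   # repeat each pixel n times
        out.extend([list(stretched) for _ in range(n)])  # repeat each row n times
    return out
```

The blocky output of this replication is exactly the artifact that the interpolating methods compared in the paper are meant to smooth away.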
Abstract:
MSc in Electrical and Computer Engineering