917 results for Wavelet Packet Decomposition
Abstract:
This thesis presents a new method for filtering errors out of multidimensional databases. The method does not require any a priori information about the nature of the errors. In particular, the errors need not be small, random, or exhibit zero mean; they are only required to be relatively uncorrelated with the clean information contained in the database. The method is based on an improved extension of a seminal iterative gappy reconstruction method (able to reconstruct lost information at known positions in a database) due to Everson and Sirovich (1995). The improved gappy reconstruction method is evolved into a two-step error filtering method: it first (a) identifies the error locations in the database and then (b) reconstructs the information at these locations by treating the associated data as gappy data. The resulting method filters out O(1) errors in an efficient fashion, both when these are random and when they are systematic, and both when they are concentrated and when they are spread throughout the database. The performance of the method is first illustrated using a two-dimensional toy-model database resulting from discretizing a transcendental function, and then tested on two CFD-calculated, three-dimensional aerodynamic databases containing the pressure coefficient on the surface of a wing for varying values of the angle of attack. A more general performance analysis of the method is presented with the intention of quantifying, first, the degree of randomness the method admits while maintaining correct performance and, second, the size of error the method can detect. Lastly, some improvements of the method are proposed, together with their respective verification.
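The gappy reconstruction step (b) can be sketched as a least-squares fit of POD coefficients on the known entries of a snapshot. The following is a minimal illustration on a synthetic rank-2 database, not the thesis's implementation (the thesis iterates this idea and pairs it with error detection):

```python
import numpy as np

# Synthetic "database": snapshots of an exactly rank-2 field (illustrative).
rng = np.random.default_rng(1)
n, m = 200, 30                      # grid points x snapshots
basis = np.stack([np.sin(np.linspace(0, np.pi, n)),
                  np.cos(np.linspace(0, 3 * np.pi, n))], axis=1)
coeffs = rng.normal(size=(2, m))
X = basis @ coeffs                  # clean database

# POD basis from the snapshot ensemble (Everson & Sirovich obtain it
# iteratively from the gappy data itself; here we take it as given).
U, s, Vt = np.linalg.svd(X, full_matrices=False)
Phi = U[:, :2]                      # retained POD modes

# One snapshot with a known gap: entries 50..79 are missing.
x = X[:, 0].copy()
mask = np.ones(n, dtype=bool)
mask[50:80] = False

# Least-squares fit of the POD coefficients using only the known entries,
# then reconstruction of the gap from the fitted modal expansion.
a, *_ = np.linalg.lstsq(Phi[mask], x[mask], rcond=None)
x_rec = Phi @ a

err = np.max(np.abs(x_rec[~mask] - X[50:80, 0]))
```

Because the toy database is exactly low-rank and the gap leaves the least-squares problem well conditioned, the gap is recovered to machine precision here; real databases only approximate this.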
Abstract:
The wavelet transform and the Lipschitz exponent perform well in detecting signal singularities. With the bridge crack damage modeled as rotational springs based on fracture mechanics, the deflection time history of the beam under a moving load is determined with a numerical method. The continuous wavelet transform (CWT) is applied to the deflection of the beam to identify the location of the damage, and the Lipschitz exponent is used to evaluate the damage degree. The influence of different damage degrees, multiple damage, different sensor locations, load velocity and load magnitude is studied. Besides, the feasibility of this method is verified by a model experiment.
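The localization step can be sketched with a single-scale Mexican-hat CWT of a toy deflection signal. All values below are illustrative assumptions; the paper's beam model and rotational-spring stiffness are not reproduced:

```python
import numpy as np

# Toy "deflection" signal: a smooth beam response plus a slope
# discontinuity (kink) at mid-span standing in for the crack effect.
n = 512
t = np.linspace(0.0, 1.0, n)
kink_at = 255                        # sample index of the damage
y = np.sin(np.pi * t) + 1.0 * np.abs(t - t[kink_at])

# Mexican-hat (second derivative of Gaussian) wavelet at one scale.
scale = 8
u = np.arange(-4 * scale, 4 * scale + 1) / scale
psi = (1.0 - u ** 2) * np.exp(-u ** 2 / 2.0)

# One row of the CWT via convolution. Edges are discarded because the
# implicit zero-padding creates spurious large coefficients there.
w = np.convolve(y, psi, mode="same") / np.sqrt(scale)
edge = 5 * scale
interior = np.abs(w[edge:n - edge])
loc = edge + int(np.argmax(interior))   # estimated damage location
```

In the paper's setting, the decay of the coefficient magnitude across scales at the detected singularity then gives the Lipschitz exponent used to grade the damage; only the localization is shown here.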
Abstract:
In a recent article [Khan, A. U., Kovacic, D., Kolbanovsky, A., Desai, M., Frenkel, K. & Geacintov, N. E. (2000) Proc. Natl. Acad. Sci. USA 97, 2984–2989], the authors claimed that ONOO⁻, after protonation to ONOOH, decomposes into ¹HNO and ¹O₂ according to a spin-conserved unimolecular mechanism. This claim was based partially on their observation that nitrosylhemoglobin is formed via the reaction of peroxynitrite with methemoglobin at neutral pH. However, thermochemical considerations show that the yields of ¹O₂ and ¹HNO are about 23 orders of magnitude lower than those of ⋅NO₂ and ⋅OH, which are formed via the homolysis of ONOOH. We also show that methemoglobin does not form any spectrally detectable product with peroxynitrite itself, but does so with the nitrite and H₂O₂ contaminations present in the peroxynitrite sample. Thus, there is no need to modify the present view of the mechanism of ONOOH decomposition, according to which initial homolysis into a radical pair, [ONO⋅ ⋅OH]cage, is followed by the diffusion of about 30% of the radicals out of the cage, while the rest recombines to nitric acid in the solvent cage.
Abstract:
We describe the use of singular value decomposition in transforming genome-wide expression data from genes × arrays space to reduced diagonalized “eigengenes” × “eigenarrays” space, where the eigengenes (or eigenarrays) are unique orthonormal superpositions of the genes (or arrays). Normalizing the data by filtering out the eigengenes (and eigenarrays) that are inferred to represent noise or experimental artifacts enables meaningful comparison of the expression of different genes across different arrays in different experiments. Sorting the data according to the eigengenes and eigenarrays gives a global picture of the dynamics of gene expression, in which individual genes and arrays appear to be classified into groups of similar regulation and function, or similar cellular state and biological phenotype, respectively. After normalization and sorting, the significant eigengenes and eigenarrays can be associated with observed genome-wide effects of regulators, or with measured samples, in which these regulators are overactive or underactive, respectively.
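A minimal sketch of this normalization on a toy genes × arrays matrix, assuming the leading eigengene is the one inferred to represent a steady-state artifact (sizes, patterns, and the choice of which eigengene to filter are illustrative):

```python
import numpy as np

# Toy expression matrix: an array-invariant (steady-state) background
# playing the role of an experimental artifact, plus a genuine
# sinusoidal expression pattern across arrays.
rng = np.random.default_rng(0)
n_genes, n_arrays = 100, 10
background = np.outer(rng.normal(5.0, 1.0, n_genes), np.ones(n_arrays))
signal = np.outer(rng.normal(0.0, 1.0, n_genes),
                  np.sin(np.linspace(0.0, 2.0 * np.pi, n_arrays)))
X = background + signal

# SVD: rows of Vt are the "eigengenes", columns of U the "eigenarrays".
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# The leading eigengene here is nearly constant across arrays, i.e. it
# is inferred to capture the steady-state artifact; filter it out.
s_filtered = s.copy()
s_filtered[0] = 0.0
X_normalized = (U * s_filtered) @ Vt
```

After filtering, the remaining dominant structure of `X_normalized` approximates the genuine expression pattern, which is what makes cross-array comparison meaningful.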
Abstract:
Patterns in sequences of amino acid hydrophobic free energies predict secondary structures in proteins. In protein folding, matches in hydrophobic free energy statistical wavelengths appear to contribute to selective aggregation of secondary structures in “hydrophobic zippers.” In a similar setting, the use of Fourier analysis to characterize the dominant statistical wavelengths of peptide ligands’ and receptor proteins’ hydrophobic modes to predict such matches has been limited by the aliasing and end effects of short peptide lengths, as well as the broad-band, mode multiplicity of many of their frequency (power) spectra. In addition, the sequence locations of the matching modes are lost in this transformation. We make new use of three techniques to address these difficulties: (i) eigenfunction construction from the linear decomposition of the lagged covariance matrices of the ligands and receptors as hydrophobic free energy sequences; (ii) maximum entropy, complex poles power spectra, which select the dominant modes of the hydrophobic free energy sequences or their eigenfunctions; and (iii) discrete, best bases, trigonometric wavelet transformations, which confirm the dominant spectral frequencies of the eigenfunctions and locate them as (absolute valued) moduli in the peptide or receptor sequence. The leading eigenfunction of the covariance matrix of a transmembrane receptor sequence locates the same transmembrane segments seen in n-block-averaged hydropathy plots while leaving the remaining hydrophobic modes unsmoothed and available for further analyses as secondary eigenfunctions. In these receptor eigenfunctions, we find a set of statistical wavelength matches between peptide ligands and their G-protein and tyrosine kinase coupled receptors, ranging across examples from 13.10 amino acids in acid fibroblast growth factor to 2.18 residues in corticotropin releasing factor. 
We find that the wavelet-located receptor modes in the extracellular loops are compatible with studies of receptor chimeric exchanges and point mutations. A nonbinding corticotropin-releasing factor receptor mutant is shown to have lost the signatory mode common to the normal receptor and its ligand. Hydrophobic free energy eigenfunctions and their transformations offer new quantitative physical homologies in database searches for peptide-receptor matches.
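Step (i) above can be sketched as a singular-spectrum-style eigenfunction construction on a synthetic hydrophobic free-energy sequence. The window length, period, and values are illustrative assumptions, not the paper's data:

```python
import numpy as np

# Synthetic hydrophobicity sequence with a 3.6-residue statistical
# wavelength (an alpha-helix-like amphipathic mode).
n, period = 60, 3.6
h = np.cos(2.0 * np.pi * np.arange(n) / period)

# Trajectory matrix of lagged windows and its lagged covariance matrix.
L = 20                                   # embedding (window) length
traj = np.stack([h[i:i + L] for i in range(n - L + 1)])
C = traj.T @ traj / traj.shape[0]

# Leading eigenvector = leading "eigenfunction" of the sequence.
vals, vecs = np.linalg.eigh(C)
lead = vecs[:, -1]

# Its dominant spectral frequency recovers the statistical wavelength.
spectrum = np.abs(np.fft.rfft(lead))
k = int(np.argmax(spectrum[1:])) + 1     # skip the DC bin
wavelength = L / k
```

Steps (ii) and (iii) of the paper would then refine this crude FFT readout with maximum-entropy spectra and locate the mode in the sequence with trigonometric wavelet moduli; neither is attempted here.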
Abstract:
The existence of the RNA world, in which RNA acted as a catalyst as well as an informational macromolecule, assumes a large prebiotic source of ribose or the existence of pre-RNA molecules with backbones different from ribose-phosphate. The generally accepted prebiotic synthesis of ribose, the formose reaction, yields numerous sugars without any selectivity. Even if there were a selective synthesis of ribose, there is still the problem of stability. Sugars are known to be unstable in strong acid or base, but there are few data for neutral solutions. Therefore, we have measured the rate of decomposition of ribose between pH 4 and pH 8, from 40 °C to 120 °C. The ribose half-lives are very short (73 min at pH 7.0 and 100 °C, and 44 years at pH 7.0 and 0 °C). The other aldopentoses and aldohexoses have half-lives within an order of magnitude of these values, as do 2-deoxyribose, ribose 5-phosphate, and ribose 2,4-bisphosphate. These results suggest that the backbone of the first genetic material could not have contained ribose or other sugars because of their instability.
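As a consistency check, the two quoted pH 7.0 half-lives imply an apparent activation energy via a two-point Arrhenius fit, assuming first-order decomposition kinetics (the fit and rounding below are ours, not the paper's):

```python
import math

R = 8.314                                   # gas constant, J mol^-1 K^-1
t1, T1 = 73.0, 373.15                       # half-life 73 min at 100 C
t2, T2 = 44.0 * 365.25 * 24 * 60, 273.15    # 44 years (in minutes) at 0 C

# First-order kinetics: k = ln(2) / t_half, so ln(2) cancels in the ratio.
# Two-point Arrhenius fit: ln(k1/k2) = (Ea/R) * (1/T2 - 1/T1).
Ea = R * math.log(t2 / t1) / (1.0 / T2 - 1.0 / T1)
# Ea comes out near 1.07e5 J/mol, i.e. roughly 107 kJ/mol.
```

An activation energy of this order is what makes the half-life stretch from about an hour at 100 °C to decades at 0 °C.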
Abstract:
For many years, humans and machines have shared the same physical space. To facilitate their interaction with humans and their social integration, and to obtain more rational behavior, robots have been expected to display human-like behavior. This requires understanding how human behavior is generated, determining which tasks are performed and how they relate to one another, and then implementing these mechanisms in robots. In this paper, we propose a model of competencies based on the human neuroregulator system for the analysis and decomposition of behavior into functional modules. Using this model allows us to separate and locate the tasks to be implemented in a robot that displays human-like behavior. As an example, we show the application of the model to autonomous movement in unfamiliar environments and its implementation in various simulated and real robots with different physical configurations and physical devices of different natures. The main result of this work is a model of competencies that is being used to build robotic systems capable of displaying behaviors similar to humans while taking into account the specific characteristics of robots.
Abstract:
Support for this work was provided by the Generalitat Valenciana (Spain) with projects PROMETEO/2009/043/FEDER, and by the Spanish MCT CTQ2008-05520.
Abstract:
The pyrolysis and combustion of corn stover were studied by dynamic thermogravimetry and derivative thermogravimetry (TG-DTG) at heating rates of 5, 10, 20 and 50 K min−1 at atmospheric pressure. For the simulation of the pyrolysis and combustion processes, a kinetic model based on a distribution of activation energies was used, with three pools of reactants (three pseudocomponents) because of the complexity of biomass samples of agricultural origin. The experimental thermogravimetric data of the pyrolysis and combustion processes were fitted simultaneously to determine a single set of kinetic parameters able to describe both processes at the different heating rates. The proposed model achieves a good correlation between the experimental and calculated curves, with an error of less than 4% when fitting four heating rates simultaneously. The experimental results and kinetic parameters may provide useful data for the design of thermal decomposition processing systems using corn stover as feedstock. In addition, the main compounds in the evolved gas are analysed by means of a micro gas chromatograph.
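A minimal single-pseudocomponent sketch of a distributed-activation-energy (DAEM) conversion curve at one heating rate. The kinetic parameters (`A`, `E0`, `sigma`) and the Gaussian distribution are illustrative placeholders, not the fitted three-pseudocomponent values of the study:

```python
import numpy as np

R = 8.314                      # gas constant, J mol^-1 K^-1
A = 1.0e13                     # pre-exponential factor, s^-1 (assumed)
E0, sigma = 200e3, 20e3        # Gaussian E-distribution, J/mol (assumed)
beta = 10.0 / 60.0             # heating rate: 10 K/min in K/s

T = np.linspace(400.0, 900.0, 1000)             # temperature program, K
E = np.linspace(E0 - 4 * sigma, E0 + 4 * sigma, 200)
f = np.exp(-(E - E0) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

# Temperature integral per activation energy:
# g(E, T) = (A / beta) * integral_{T0}^{T} exp(-E / (R T')) dT'
k = np.exp(-E[:, None] / (R * T[None, :]))      # shape (nE, nT)
dT = T[1] - T[0]
g = (A / beta) * np.cumsum(k, axis=1) * dT

# DAEM conversion: alpha(T) = 1 - integral f(E) * exp(-g(E, T)) dE
dE = E[1] - E[0]
alpha = 1.0 - (f[:, None] * np.exp(-g)).sum(axis=0) * dE
```

Fitting the actual study's model would repeat this for three pseudocomponents and four heating rates and optimize the parameters against the TG curves simultaneously.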
Abstract:
The pyrolysis of a sludge produced in the waste water treatment plant of an oil refinery was studied in a pilot plant reactor provided with a system for condensation of semivolatile matter. The study comprises experiments at 350, 400, 470 and 530 °C in a nitrogen atmosphere. Analyses of all the products obtained (gases, liquids and chars) are presented, together with a thermogravimetric study of the char produced and an analysis of the main components of the liquid. In the temperature range studied, the composition of the gas fraction does not vary appreciably. In the liquids, the light hydrocarbon yield increases with increasing temperature, whereas the aromatic compounds diminish. The decomposition of the solid fraction has been analysed, revealing a material that reacts rapidly with oxygen regardless of the conditions under which it is formed.
Abstract:
A nonempty set F is called Motzkin decomposable when it can be expressed as the Minkowski sum of a compact convex set C with a closed convex cone D. In that case, the sets C and D are called compact and conic components of F. This paper provides new characterizations of the Motzkin decomposable sets involving truncations of F (i.e., intersections of F with closed halfspaces), when F contains no lines, and truncations of the intersection F̂ of F with the orthogonal complement of the lineality of F, otherwise. In particular, it is shown that a nonempty closed convex set F is Motzkin decomposable if and only if there exists a hyperplane H parallel to the lineality of F such that one of the truncations of F̂ induced by H is compact whereas the other one is a union of closed halflines emanating from H. Thus, any Motzkin decomposable set F can be expressed as F=C+D, where the compact component C is a truncation of F̂. These Motzkin decompositions are said to be of type T when F contains no lines, i.e., when C is a truncation of F. The minimality of this type of decompositions is also discussed.
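A small worked example (ours, not from the paper) of a Motzkin decomposition of type T, with the compact component arising as a truncation of F:

```latex
% F is a closed convex set containing no lines:
F = \{(x,y)\in\mathbb{R}^2 : x\ge 0,\ y\ge 0,\ x+y\ge 1\} = C + D,
\qquad
C = \operatorname{conv}\{(1,0),(0,1)\},
\qquad
D = \mathbb{R}^2_{+}.
% Here C = F \cap \{x+y\le 1\} is the compact truncation of F induced
% by the hyperplane H = \{x+y=1\}, and D is the recession cone of F:
% for any (x,y)\in F, take c=(x,y)/(x+y)\in C and d=(x,y)-c\in D.
% By contrast, the epigraph \{(x,y): y\ge x^2\} is not Motzkin
% decomposable: its recession cone is \{0\}\times\mathbb{R}_{+}, so any
% sum C+D with C compact would be bounded in the x-direction.
```

The epigraph counterexample shows that closedness and convexity alone do not suffice, which is what the truncation characterization in the paper detects.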
Abstract:
Humans and machines have shared the same physical space for many years. To share that space, we want robots to behave like human beings: this facilitates their social integration and their interaction with humans, and produces more intelligent behavior. To achieve this goal, we need to understand how human behavior is generated, and to analyze the tasks our nervous system performs and how they relate to one another. Only then can we implement these mechanisms in robots. In this study, we propose a model of competencies based on the human neuroregulator system for the analysis and decomposition of behavior into functional modules. Using this model allows us to separate and locate the tasks to be implemented in a robot that displays human-like behavior. As an example, we show the application of the model to autonomous movement in unfamiliar environments and its implementation in various simulated and real robots with different physical configurations and physical devices of different natures. The main result of this study is a model of competencies that is being used to build robotic systems capable of displaying behaviors similar to humans while taking into account the specific characteristics of robots.
Abstract:
Combustion runs at 700 °C in a horizontal laboratory furnace were carried out on two different electric wires (PVC and halogen-free wire). Tests were performed in the presence and in the absence of the metal conductor of the wires. The analyses of the polycyclic aromatic hydrocarbons (PAHs), chlorobenzenes (CBzs), chlorophenols (CPhs), mono- to octa-chlorodibenzo-p-dioxins and dibenzofurans (PCDD/Fs), and dioxin-like PCBs are shown. Regarding semivolatile compounds, PAH production decreases in the presence of metal, while a higher amount of chlorinated compounds is emitted. With respect to the PCDD/Fs, the PVC wire in the presence of metal presents the highest emission, with a much higher emission of furans than dioxins. The maximum emission corresponds to PCDD/Fs with 2 or 3 chlorine atoms. PCB emission correlates with PCDD/F production and represents 3–4% of the total toxicity, determined by using WHO2005 factors.
Abstract:
Thermal decomposition of printed circuit boards (PCBs) is studied, using thermogravimetric analysis to compare the thermal behavior of PCBs from mobile phones before and after the removal of the metallic fraction by acid washing. Several dynamic and dynamic + isothermal runs have been carried out at different heating rates (5, 10 and 20 K min−1), from room temperature to more than 1100 K. Runs in the presence and in the absence of oxygen (combustion and pyrolysis runs) were also performed. Moreover, TG–MS experiments were carried out (in both inert and oxidizing atmospheres) in order to better understand the thermal decomposition of these wastes and to identify some compounds emitted during their controlled heating. Different reaction models are proposed, one for pyrolysis and one for combustion of the two kinds of wastes studied, which proved able to simulate the experimental results appropriately at all the heating rates simultaneously.