974 results for MULTIRESOLUTION APPROXIMATION


Relevance: 20.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 20.00%

Publisher:

Abstract:

A radial basis function network (RBFN) circuit for function approximation is presented. Simulation and experimental results show that the network has good approximation capabilities. The radial basis function was a squared hyperbolic secant with three adjustable parameters: amplitude, width and center. To test the network, sinusoidal and sine functions were approximated.
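The squared-hyperbolic-secant basis function described above is easy to sketch numerically. The following is a minimal illustration, not the paper's analog circuit: it fits only the amplitudes of a few fixed-width, fixed-center sech² units to a sine target by linear least squares; the grid, centers and width are assumptions for the example.

```python
import numpy as np

def sech2_rbf(x, amplitude, width, center):
    """Squared hyperbolic secant basis: A * sech((x - c) / w)**2."""
    return amplitude / np.cosh((x - center) / width) ** 2

# Approximate sin(x) on [0, 2*pi] with fixed centers and a fixed width,
# fitting only the amplitudes by linear least squares.
x = np.linspace(0.0, 2.0 * np.pi, 200)
centers = np.linspace(0.0, 2.0 * np.pi, 8)
width = 0.8
design = np.stack([sech2_rbf(x, 1.0, width, c) for c in centers], axis=1)
amplitudes, *_ = np.linalg.lstsq(design, np.sin(x), rcond=None)
max_error = np.max(np.abs(design @ amplitudes - np.sin(x)))
```

With eight overlapping units the least-squares fit already tracks the sine closely; in the paper the amplitude, width and center would instead be tuned as circuit parameters.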

Relevance: 20.00%

Publisher:

Abstract:

Function approximation is an important task in environments where computation must be based on extracting information from data samples of real-world processes, so the development of new mathematical models is essential to the evolution of the function approximation field. In this sense, we present the Polynomial Powers of Sigmoid (PPS) as a linear neural network. In this paper we introduce a series of practical results for the Polynomial Powers of Sigmoid, showing some advantages of using powers of sigmoid functions over traditional MLP-backpropagation and polynomials in function approximation problems.
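The idea of a linear-in-parameters expansion in powers of a sigmoid can be sketched as follows. This is a hedged illustration of the general approach, not the paper's exact PPS architecture: the degree, target function and fitting method are assumptions for the example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def pps_design(x, degree):
    """Columns are sigma(x)**k for k = 0..degree (k = 0 is the bias)."""
    s = sigmoid(x)
    return np.stack([s ** k for k in range(degree + 1)], axis=1)

x = np.linspace(-3.0, 3.0, 100)
target = np.tanh(x)                      # illustrative target function
A = pps_design(x, degree=5)
coeffs, *_ = np.linalg.lstsq(A, target, rcond=None)
fit_error = np.max(np.abs(A @ coeffs - target))
```

Because the model is linear in its coefficients, fitting reduces to ordinary least squares rather than backpropagation, which is one of the advantages claimed for this family of networks.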

Relevance: 20.00%

Publisher:

Abstract:

Multicommodity flow (MF) problems have a wide variety of applications in areas such as VLSI circuit design and network design, and are therefore very well studied. The fractional MF problems are polynomial time solvable, while the integer versions are NP-complete. However, exact algorithms for the fractional MF problems have high computational complexity, so approximation algorithms with lower computational complexity have been explored in the literature. Combining these approximation algorithms with the randomized rounding technique yields polynomial time approximation algorithms for the integer MF problems. In the design of high-speed networks, such as optical wavelength division multiplexing (WDM) networks, providing survivability carries great significance. Survivability is the ability of the network to recover from failures; it further increases the complexity of network design and presents network designers with more formidable challenges. In this work we formulate survivable versions of the MF problems. We build approximation algorithms for the survivable multicommodity flow (SMF) problems based on the framework of the approximation algorithms for the MF problems presented in [1] and [2]. We discuss applications of the SMF problems to survivable routing in capacitated networks.
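The randomized rounding step mentioned above can be sketched in a few lines. The toy fractional solution below is an illustrative assumption, not an instance from [1] or [2]: each commodity's demand is assigned to one candidate path with probability equal to its fractional flow, so in expectation each path carries exactly its fractional load.

```python
import random

# Fractional MF solution: commodity -> {path_id: fraction of demand}
# (toy values for illustration).
fractional = {
    "k1": {"p1": 0.6, "p2": 0.4},
    "k2": {"p3": 0.3, "p4": 0.7},
}

def round_solution(fractional, rng):
    """Pick one path per commodity, weighted by its fractional flow."""
    integral = {}
    for commodity, flows in fractional.items():
        paths = list(flows)
        weights = [flows[p] for p in paths]
        integral[commodity] = rng.choices(paths, weights=weights, k=1)[0]
    return integral

rng = random.Random(0)
rounded = round_solution(fractional, rng)
```

Standard concentration arguments (Chernoff bounds) then show that with high probability no edge load exceeds its fractional value by much, which is what makes the rounded solution a polynomial time approximation.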

Relevance: 20.00%

Publisher:

Abstract:

It has been shown that the vertical structure of the Brazil Current (BC)-Intermediate Western Boundary Current (IWBC) System is dominated by the first baroclinic mode at 22°S-23°S. In this work, we employed the Miami Isopycnic Coordinate Ocean Model to investigate whether the rich mesoscale activity of this current system, between 20°S and 28°S, is reproduced by a two-layer approximation of its vertical structure. The model results showed cyclonic and anticyclonic meanders propagating southwestward along the current axis, resembling the dynamical pattern of Rossby waves superposed on a mean flow. Analysis of the upper layer zonal velocity component, using a space-time diagram, revealed a dominant wavelength of about 450 km and a phase velocity of about 0.20 m s⁻¹ southwestward. The results also showed that the eddy-like structures slowly grew in amplitude as they moved downstream. Despite the simplified design of the numerical experiments conducted here, these results compared favorably with observations and seem to indicate that weakly unstable long baroclinic waves are responsible for most of the variability observed in the BC-IWBC system.

Relevance: 20.00%

Publisher:

Abstract:

We analytically study the input-output properties of a neuron whose active dendritic tree, modeled as a Cayley tree of excitable elements, is subjected to Poisson stimulus. Both single-site and two-site mean-field approximations incorrectly predict a nonequilibrium phase transition which is not allowed in the model. We propose an excitable-wave mean-field approximation which shows good agreement with previously published simulation results [Gollo et al., PLoS Comput. Biol. 5, e1000402 (2009)] and accounts for finite-size effects. We also discuss the relevance of our results to experiments in neuroscience, emphasizing the role of active dendrites in the enhancement of dynamic range and in gain control modulation.
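The dynamic-range enhancement mentioned at the end of the abstract is usually quantified as the interval of stimulus intensities (in dB) mapped to responses between 10% and 90% of the maximum. A minimal numerical illustration, using a toy saturating response curve rather than the model's actual response function:

```python
import numpy as np

h = np.logspace(-3, 3, 10001)            # stimulus rates (arbitrary units)
F = h / (1.0 + h)                        # toy saturating response curve
F10, F90 = 0.1 * F.max(), 0.9 * F.max()
h10 = h[np.argmin(np.abs(F - F10))]      # stimulus giving 10% response
h90 = h[np.argmin(np.abs(F - F90))]      # stimulus giving 90% response
dynamic_range_db = 10.0 * np.log10(h90 / h10)
```

For this simple saturating curve the result is close to the textbook value of 10·log10(81) ≈ 19 dB; the enhancement discussed above corresponds to an active dendritic tree broadening this interval.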

Relevance: 20.00%

Publisher:

Abstract:

This Letter reports an investigation of the optical properties of copper nanocubes as a function of size, as modeled by the discrete dipole approximation. In the far field, our results showed that the extinction resonances shifted from 595 to 670 nm as the size increased from 20 to 100 nm. Also, the highest optical efficiencies for absorption and scattering were obtained for nanocubes 60 and 100 nm in size, respectively. In the near field, the electric-field amplitudes were investigated for excitation wavelengths of 514, 633 and 785 nm. The E-fields increased with size and were highest at 633 nm.

Relevance: 20.00%

Publisher:

Abstract:

We apply the stochastic dynamics method to a differential equation model, proposed by Marc Lipsitch and collaborators (Proc. R. Soc. Lond. B 260, 321, 1995), in which the transmission of parasites occurs from a parent to its offspring (vertical transmission) and by contact with infected hosts (horizontal transmission). Herpes, hepatitis and AIDS are examples of diseases for which both horizontal and vertical transmission occur simultaneously during the spread of the virus. Understanding the role of each type of transmission in the infection prevalence of a susceptible host population may provide information about the factors that contribute to the eradication and/or control of those diseases. We present a pair mean-field approximation obtained from the master equation of the model. The pair approximation is formed by the differential equations of the susceptible and infected population densities and the differential equations of the pairs that contribute to the former. In terms of the model parameters, we obtain the conditions that lead to disease eradication, and set up the phase diagram based on a local stability analysis of the fixed points. We also perform Monte Carlo simulations of the model on complete graphs and Erdős-Rényi graphs in order to investigate the influence of population size and neighborhood on the previous mean-field results; in this way, we also expect to evaluate the contribution of vertical and horizontal transmission to the elimination of the parasite.
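A minimal single-site mean-field sketch of combined vertical and horizontal transmission conveys the flavor of such models. This is not the pair approximation derived in the work above, and the equations, the logistic crowding term and all parameter values are illustrative assumptions rather than the Lipsitch model's exact form:

```python
# Parameters (all illustrative): b birth rate, beta horizontal
# transmission rate, p probability of vertical transmission,
# d baseline death rate, alpha disease-induced extra death rate.
def step(x, y, dt, b=1.0, beta=2.0, p=0.5, d=0.5, alpha=0.5):
    """One Euler step for susceptible density x and infected density y."""
    crowding = 1.0 - (x + y)             # logistic competition (assumed)
    dx = b * (x + (1.0 - p) * y) * crowding - d * x - beta * x * y
    dy = b * p * y * crowding - (d + alpha) * y + beta * x * y
    return x + dt * dx, y + dt * dy

x, y = 0.9, 0.1                          # initial densities
for _ in range(20000):                   # integrate to t = 200
    x, y = step(x, y, dt=0.01)
endemic_fraction = y / (x + y)
```

For these parameter values the infection invades (the disease-free state is unstable) and the system settles into an endemic equilibrium; in the paper, conditions of this kind delimit the eradication region of the phase diagram.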

Relevance: 20.00%

Publisher:

Abstract:

Every seismic event produces seismic waves which travel throughout the Earth. Seismology is the science of interpreting measurements of these waves to derive information about the structure of the Earth. Seismic tomography is the most powerful tool for determining the 3D structure of the Earth's deep interior. Tomographic models obtained at global and regional scales are an essential tool for determining the geodynamical state of the Earth, showing evident correlation with other geophysical and geological characteristics. Global tomographic images of the Earth can be written as linear combinations of basis functions from a specifically chosen set, defining the model parameterization. A number of different parameterizations are commonly seen in the literature: seismic velocities in the Earth have been expressed, for example, as combinations of spherical harmonics or by means of the simpler characteristic functions of discrete cells. In this work we focus on this aspect, evaluating a new type of parameterization based on wavelet functions.

It is known from classical Fourier theory that a signal can be expressed as the sum of a, possibly infinite, series of sines and cosines, often referred to as a Fourier expansion. The big disadvantage of a Fourier expansion is that it has only frequency resolution and no time resolution. Wavelet analysis (or the wavelet transform) is probably the most recent solution to overcome the shortcomings of Fourier analysis. The fundamental idea behind this analysis is to study a signal according to scale. Wavelets are mathematical functions that cut up data into different frequency components and then study each component with a resolution matched to its scale, so they are especially useful in the analysis of non-stationary processes containing multi-scale features, discontinuities and sharp spikes.

Wavelets are used in essentially two ways in studies of geophysical processes or signals: (1) as a basis for the representation or characterization of a process; (2) as an integration kernel for analysis, to extract information about the process. Both types of application are objects of study in this work. We first use wavelets as a basis to represent and solve the tomographic inverse problem. After a brief introduction to seismic tomography theory, we assess the power of wavelet analysis in the representation of two different types of synthetic models; we then apply it to real data, obtaining surface wave phase velocity maps and evaluating its abilities by comparison with another type of parameterization (i.e., block parameterization). For the second type of wavelet application, we analyze the ability of the continuous wavelet transform in spectral analysis, starting again with synthetic tests to evaluate its sensitivity and capability, and then applying the same analysis to real data to obtain local correlation maps between different models at the same depth or between different profiles of the same model.
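The multiresolution idea behind a wavelet parameterization can be illustrated with one level of the Haar transform, the simplest wavelet: the signal splits into a coarse approximation plus detail coefficients, and the transform is exactly invertible. The test signal here is an illustrative assumption:

```python
import numpy as np

def haar_level(signal):
    """One level of the orthonormal Haar wavelet transform."""
    even, odd = signal[0::2], signal[1::2]
    approx = (even + odd) / np.sqrt(2.0)   # low-pass: coarse structure
    detail = (even - odd) / np.sqrt(2.0)   # high-pass: local features
    return approx, detail

def haar_level_inverse(approx, detail):
    """Exact inverse of one Haar level."""
    out = np.empty(2 * approx.size)
    out[0::2] = (approx + detail) / np.sqrt(2.0)
    out[1::2] = (approx - detail) / np.sqrt(2.0)
    return out

signal = np.sin(np.linspace(0.0, 4.0 * np.pi, 64)) + np.linspace(0.0, 1.0, 64)
approx, detail = haar_level(signal)
reconstructed = haar_level_inverse(approx, detail)
reconstruction_error = np.max(np.abs(reconstructed - signal))
```

Applying the split recursively to the approximation coefficients yields the multi-scale hierarchy that makes wavelets attractive as tomographic basis functions: smooth large-scale structure lives in the coarse levels, sharp local anomalies in the detail levels.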

Relevância:

20.00% 20.00%

Publicador:

Resumo:

Comparing latent constructs (loaded by reflective and congeneric measures) cross-culturally means studying how these unobserved variables vary, and/or covary with each other, after controlling for potentially disturbing cultural forces. This leads to the so-called 'measurement invariance' issue, which refers to the extent to which data collected by the same multi-item measurement instrument (i.e., a self-reported questionnaire of items underlying common latent constructs) are comparable across different cultural environments. It would be unthinkable to explore latent variable heterogeneity (e.g., latent means; latent variances, i.e., deviations from the means; latent covariances, i.e., shared variation around the respective means; the magnitude of structural path coefficients for causal relations among latent variables) across different populations without controlling for cultural bias in the underlying measures. Furthermore, it would be unrealistic to attempt this correction without a framework able to take all these potential cultural biases into account simultaneously, since the real world 'acts' simultaneously as well. As a researcher, I may therefore want to control for cultural forces by hypothesizing that they all act at the same time across the groups being compared, and examine whether they inflate or suppress the new estimates obtained under hierarchically nested constraints on the originally estimated parameters. Multi-sample structural equation modeling-based confirmatory factor analysis (MS-SEM-based CFA) still represents a dominant and flexible statistical framework for working out this potential cultural bias in a simultaneous way.

With this dissertation I attempt to introduce new viewpoints on measurement invariance handled under the covariance-based SEM framework, by means of a consumer behavior modeling application on functional food choices.
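The hierarchically nested constraints described above are commonly evaluated with chi-square difference tests between a constrained and a less constrained model. A minimal sketch, using made-up fit statistics rather than results from the dissertation (for even degrees of freedom the chi-square survival function has a closed form, so no statistics library is needed):

```python
import math

def chi2_sf_even_df(x, df):
    """Chi-square survival function for even df (closed-form series)."""
    m = df // 2
    term, total = 1.0, 1.0
    for i in range(1, m):
        term *= (x / 2.0) / i
        total += term
    return math.exp(-x / 2.0) * total

# Hypothetical fit statistics for a configural (unconstrained) and a
# metric-invariant (loadings constrained equal) multi-group model.
chi2_configural, df_configural = 210.4, 120
chi2_metric, df_metric = 225.1, 132

delta_chi2 = chi2_metric - chi2_configural
delta_df = df_metric - df_configural
p_value = chi2_sf_even_df(delta_chi2, delta_df)
invariance_holds = p_value > 0.05        # fail to reject invariance
```

A non-significant chi-square difference means the added equality constraints do not significantly worsen fit, supporting invariance at that level; significant deterioration would send the analyst back to partial-invariance models.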

Relevance: 20.00%

Publisher:

Abstract:

In technical design processes in the automotive industry, digital prototypes rapidly gain importance because they allow design errors to be detected in early development stages. The technical design process includes the computation of swept volumes for maintainability analysis and clearance checks. The swept volume is very useful, for example, to identify problem areas where a safety distance might not be kept. With an explicit construction of the swept volume, an engineer obtains evidence of how the shape of components that come too close has to be modified.

In this thesis a concept for the approximation of the outer boundary of a swept volume is developed. For safety reasons, it is essential that the approximation is conservative, i.e., that the swept volume is completely enclosed by the approximation. On the other hand, one wishes to approximate the swept volume as precisely as possible. In this work we show that the one-sided Hausdorff distance is the adequate error measure for the approximation when the intended uses are clearance checks, continuous collision detection and maintainability analysis in CAD. We present two implementations that apply the concept and generate a manifold triangle mesh approximating the outer boundary of a swept volume. Both algorithms are two-phased: a sweeping phase, which generates a conservative voxelization of the swept volume, and the actual mesh generation, which is based on restricted Delaunay refinement. This approach ensures a high precision of the approximation while respecting conservativeness.

The benchmarks for our tests include real-world scenarios from the automotive industry. Further, we introduce a method to relate parts of an already computed swept volume boundary to those triangles of the generator that come closest during the sweep. We use this both to verify and to colorize the meshes resulting from our implementations.
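The two key ingredients above, a conservative voxel cover of the swept volume and the one-sided Hausdorff distance, can be sketched in 2D. The disc radius, grid resolution and linear sweep path below are illustrative assumptions, not data from the thesis:

```python
import numpy as np

radius, cell = 1.0, 0.25
path = np.linspace([0.0, 0.0], [3.0, 0.0], 50)   # sampled sweep positions

# Conservative cover: mark a cell as occupied if its center lies within
# radius + half the cell diagonal of any sampled sweep position, so the
# swept disc can never poke out of the marked cells.
margin = radius + cell * np.sqrt(2.0) / 2.0
xs = np.arange(-2.0, 5.0, cell) + cell / 2.0     # cell-center coordinates
ys = np.arange(-2.0, 2.0, cell) + cell / 2.0
cx, cy = np.meshgrid(xs, ys)
centers = np.stack([cx.ravel(), cy.ravel()], axis=1)
dist_to_path = np.min(
    np.linalg.norm(centers[:, None, :] - path[None, :, :], axis=2), axis=1
)
covered = centers[dist_to_path <= margin]

# One-sided Hausdorff distance from the cover toward the exact swept
# volume (a capsule around the segment): how far the approximation
# overshoots the true boundary.
def dist_to_segment(p, a, b):
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return np.linalg.norm(p - (a + t * ab))

a, b = path[0], path[-1]
overshoot = max(
    max(dist_to_segment(p, a, b) - radius, 0.0) for p in covered
)
```

By construction the overshoot is bounded by half the cell diagonal, illustrating the trade-off discussed above: a finer grid tightens the one-sided Hausdorff error while conservativeness is preserved at every resolution.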