932 results for Non-linear dynamic analysis
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
We prove existence theorems for the Dirichlet problem for hypersurfaces of constant special Lagrangian curvature in Hadamard manifolds. The first results are obtained using the continuity method and approximation, and then refined using two iterations of the Perron method. The a priori estimates used in the continuity method are valid in any ambient manifold.
Gaussian estimates for the density of the non-linear stochastic heat equation in any space dimension
Abstract:
In this paper, we establish lower and upper Gaussian bounds for the probability density of the mild solution to the stochastic heat equation with multiplicative noise and in any space dimension. The driving perturbation is a Gaussian noise which is white in time with some spatially homogeneous covariance. These estimates are obtained using tools of the Malliavin calculus. The most challenging part is the lower bound, which is obtained by adapting a general method developed by Kohatsu-Higa to the underlying spatially homogeneous Gaussian setting. Both lower and upper estimates have the same form: a Gaussian density with a variance which is equal to that of the mild solution of the corresponding linear equation with additive noise.
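Read schematically (the centering term and the constants below are illustrative placeholders, not quantities taken from the paper), the two-sided bound says that for fixed (t, x) the density p_{t,x} of the mild solution is squeezed between two Gaussian profiles sharing the variance sigma_t^2 of the linear additive-noise solution:

    \[
      \frac{c_1}{\sigma_t}\exp\!\Big(-\frac{(y-m_{t,x})^2}{c_2\,\sigma_t^2}\Big)
      \;\le\; p_{t,x}(y) \;\le\;
      \frac{C_1}{\sigma_t}\exp\!\Big(-\frac{(y-m_{t,x})^2}{C_2\,\sigma_t^2}\Big),
    \]

where m_{t,x} is a deterministic centering (for instance, the contribution of the initial condition) and c_1, c_2, C_1, C_2 are positive constants.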
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending the corresponding approaches to the scale of a field site represents a major, and as yet largely unresolved, challenge. To address this problem, we have developed a downscaling procedure based on a non-linear Bayesian sequential simulation approach. The main objective of this algorithm is to estimate the value of the sparsely sampled hydraulic conductivity at non-sampled locations based on its relation to the electrical conductivity logged at collocated wells and to surface resistivity measurements, which are available throughout the studied site. The in situ relationship between the hydraulic and electrical conductivities is described through a non-parametric multivariate kernel density function. A stochastic integration of low-resolution, large-scale electrical resistivity tomography (ERT) data in combination with high-resolution, local-scale downhole measurements of the hydraulic and electrical conductivities is then applied. The overall viability of this downscaling approach is tested and validated by comparing flow and transport simulations through the original and the upscaled hydraulic conductivity fields. Our results indicate that the proposed procedure yields remarkably faithful estimates of the regional-scale hydraulic conductivity structure and correspondingly reliable predictions of the transport characteristics over relatively long distances.
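As a minimal illustration of the kernel-density step (the variable names, the synthetic data, and the grid-based conditional draw below are assumptions for the sketch; the full sequential simulation additionally conditions on previously simulated values, which is omitted here):

    # Hedged sketch: draw hydraulic conductivity (K) given electrical
    # conductivity (sigma) from a bivariate Gaussian kernel density fitted
    # to collocated well logs; not the authors' implementation.
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(0)

    # Synthetic "collocated logs": log10(K) loosely related to log10(sigma)
    log_sigma = rng.normal(-1.5, 0.3, size=500)
    log_k = 2.0 * log_sigma + rng.normal(0.0, 0.4, size=500)

    # Non-parametric joint density p(log_sigma, log_K)
    kde = gaussian_kde(np.vstack([log_sigma, log_k]))

    def sample_log_k(log_sigma_obs, n_grid=200):
        """Draw one log10(K) from p(log_K | log_sigma = log_sigma_obs)."""
        grid = np.linspace(log_k.min() - 1.0, log_k.max() + 1.0, n_grid)
        pts = np.vstack([np.full(n_grid, log_sigma_obs), grid])
        cond = kde(pts)              # unnormalised conditional density
        return rng.choice(grid, p=cond / cond.sum())

    # At a non-sampled location, sigma would come from the ERT model:
    print(sample_log_k(-1.4))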
Abstract:
In this work we develop a viscoelastic bar element that can handle multiple rheological laws with non-linear elastic and non-linear viscous material models. The bar element is built by joining in series an elastic and a viscous bar, constraining the middle node position to the bar axis with a reduction method, and statically condensing the internal degrees of freedom. We apply the methodology to the modelling of reversible softening with stiffness recovery both in 2D and 3D, a phenomenology also experimentally observed during stretching cycles on epithelial lung cell monolayers.
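For orientation, the series arrangement of an elastic and a viscous bar reduces, in its simplest linear one-dimensional form, to the classical Maxwell element; the notation below is generic, and the paper replaces the constant modulus E and viscosity eta by non-linear state-dependent laws:

    \[
      \varepsilon = \varepsilon_e + \varepsilon_v, \qquad
      \sigma = E\,\varepsilon_e = \eta\,\dot{\varepsilon}_v
      \;\;\Longrightarrow\;\;
      \dot{\varepsilon} = \frac{\dot{\sigma}}{E} + \frac{\sigma}{\eta}.
    \]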
Abstract:
This paper extends existing insurance results on the type of insurance contracts needed for insurance market efficiency to a dynamic setting. It introduces continuously open markets that allow for more efficient asset allocation. It also eliminates the role of preferences and endowments in the classification of risks, which is done primarily in terms of the actuarial properties of the underlying risk process. The paper further extends insurability to include correlated and catastrophic events. Under these very general conditions, the paper defines a condition that determines whether a small number of standard insurance contracts (together with aggregate assets) suffices to complete markets or whether one needs to introduce such assets as mutual insurance.
Abstract:
Climate science indicates that climate stabilization requires low GHG emissions. Is this consistent with non-decreasing human welfare? Our welfare or utility index emphasizes education, knowledge, and the environment. We construct and calibrate a multigenerational model with intertemporal links provided by education, physical capital, knowledge and the environment. We reject discounted utilitarianism and adopt, first, the Pure Sustainability Optimization (or Intergenerational Maximin) criterion and, second, the Sustainable Growth Optimization criterion, which maximizes the utility of the first generation subject to a given future rate of growth. We apply these criteria to our calibrated model via a novel algorithm inspired by the turnpike property. The computed paths yield levels of utility higher than the level at reference year 2000 for all generations. They require doubling the fraction of labor resources devoted to the creation of knowledge relative to the reference level, whereas the fractions of labor allocated to consumption and leisure are similar to the reference ones. On the other hand, higher growth rates require substantial increases in the fraction of labor devoted to education, together with moderate increases in the fractions of labor devoted to knowledge and to investment in physical capital.
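In symbols (the notation is generic, not taken from the paper), writing u_g for the utility of generation g and x for the vector of allocation decisions, the two criteria can be sketched as

    \[
      \text{Maximin:}\quad \max_{x}\; \min_{g\ge 1} u_g(x),
      \qquad
      \text{Sustainable Growth:}\quad \max_{x}\; u_1(x)
      \;\;\text{s.t.}\;\; u_{g+1}(x) \ge (1+\rho)\,u_g(x)\;\;\forall g\ge 1,
    \]

for a prescribed growth rate rho >= 0.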
Abstract:
Exact solutions to Fokker-Planck equations with nonlinear drift are considered. Applications of these exact solutions to concrete models are studied. We arrive at the conclusion that for certain drifts we obtain divergent moments (and an infinite relaxation time) if the diffusion process can be extended without any obstacle to the whole space. However, if we introduce a potential barrier that limits the diffusion process, the moments converge with a finite relaxation time.
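A standard one-dimensional example of the mechanism (the specific drift below is illustrative, not necessarily one of the models treated in the paper): for the Fokker-Planck equation and its zero-flux stationary density

    \[
      \partial_t p = -\partial_x\big[f(x)\,p\big] + D\,\partial_x^2 p,
      \qquad
      p_s(x) \propto \exp\!\Big(\tfrac{1}{D}\!\int^x f(y)\,dy\Big),
    \]

the nonlinear drift f(x) = -gamma x / (1 + x^2) gives the power-law density p_s(x) proportional to (1 + x^2)^{-gamma/(2D)} on the whole line, so moments of sufficiently high order diverge; confining the process with a potential barrier (for instance, reflecting walls) truncates these tails and restores finite moments and a finite relaxation time.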
Abstract:
The propagation of a pulse in a nonlinear array of oscillators is influenced by the nature of the array and by its coupling to a thermal environment. For example, in some arrays a pulse can be sped up while in others a pulse can be slowed down by raising the temperature. We begin by showing that an energy pulse (one dimension) or energy front (two dimensions) travels more rapidly and remains more localized over greater distances in an isolated (microcanonical) array of hard springs than in a harmonic array or in a soft-springed array. Increasing the pulse amplitude causes it to speed up in a hard chain, leaves the pulse speed unchanged in a harmonic system, and slows down the pulse in a soft chain. Connection of each site to a thermal environment (canonical) affects these results very differently in each type of array. In a hard chain the dissipative forces slow down the pulse while raising the temperature speeds it up. In a soft chain the opposite occurs: the dissipative forces actually speed up the pulse, while raising the temperature slows it down. In a harmonic chain neither dissipation nor temperature changes affect the pulse speed. These and other results are explained on the basis of the frequency versus energy relations in the various arrays.
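A minimal numerical sketch of the setting (the parameters, boundary treatment, and Euler-Maruyama step below are illustrative assumptions, not the paper's setup): a chain with harmonic plus quartic springs, coupled to a heat bath, with a kinetic-energy pulse launched at the centre.

    # Hedged sketch: Langevin dynamics of a 1D chain with hard (k4 > 0)
    # or harmonic (k4 = 0) interparticle springs, used to watch how a
    # localized energy pulse spreads along the chain.
    import numpy as np

    N, dt, steps = 200, 1e-3, 20000
    k2, k4 = 1.0, 1.0          # harmonic and quartic spring constants
    gamma, kT = 0.1, 0.05      # damping and bath temperature

    rng = np.random.default_rng(1)
    x = np.zeros(N)
    v = np.zeros(N)
    v[N // 2] = 5.0            # initial energy pulse: kick the middle site

    def spring_force(d):
        """Force from one spring stretched by d (hard for k4 > 0)."""
        return k2 * d + k4 * d**3

    for _ in range(steps):
        d = np.diff(x, prepend=x[0], append=x[-1])   # free-end extensions
        f = spring_force(d[1:]) - spring_force(d[:-1])
        noise = np.sqrt(2.0 * gamma * kT / dt) * rng.standard_normal(N)
        v += dt * (f - gamma * v + noise)
        x += dt * v

    energy = 0.5 * v**2        # kinetic-energy profile along the chain
    print("excited region starts at site",
          np.argmax(energy > 1e-3 * energy.max()))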
Abstract:
This paper deals with non-linear transformations for improving the performance of an entropy-based voice activity detector (VAD). The idea of using a non-linear transformation has already been applied in the field of speech linear prediction, or linear predictive coding (LPC), based on source separation techniques, where a score function is added to the classical equations in order to take into account the true distribution of the signal. We explore the possibility of estimating the entropy of frames after calculating their score function, instead of using the original frames. We observe that if the signal is clean, the estimated entropy is essentially the same; if the signal is noisy, however, the frames transformed using the score function may yield entropy values that differ between voiced and non-voiced frames. Experimental evidence is given to show that this enables voice activity detection under high noise, where the simple entropy method fails.
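One plausible reading of the transformation, shown as a sketch (the KDE-based score estimate and the histogram entropy below are illustrative choices; the paper's exact estimators may differ):

    # Hedged sketch: frame entropy computed on score-function-transformed
    # samples rather than on raw samples.
    import numpy as np
    from scipy.stats import gaussian_kde

    def score_transform(frame, eps=1e-12):
        """psi(x) = -d/dx log p(x), with p estimated by a Gaussian KDE."""
        kde = gaussian_kde(frame)
        h = 1e-3 * np.std(frame)
        p = np.maximum(kde(frame), eps)
        dp = (kde(frame + h) - kde(frame - h)) / (2.0 * h)  # numerical derivative
        return -dp / p

    def frame_entropy(values, bins=32):
        """Shannon entropy (nats) of a histogram of the given values."""
        hist, _ = np.histogram(values, bins=bins)
        q = hist / hist.sum()
        q = q[q > 0]
        return -np.sum(q * np.log(q))

    # Usage: compare the entropy of a raw frame and of its score transform
    rng = np.random.default_rng(2)
    frame = rng.normal(size=512)        # stand-in for one speech frame
    print(frame_entropy(frame), frame_entropy(score_transform(frame)))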
Abstract:
This special issue aims to cover some problems related to non-linear and nonconventional speech processing. The origin of this volume is in the ISCA Tutorial and Research Workshop on Non-Linear Speech Processing, NOLISP’09, held at the Universitat de Vic (Catalonia, Spain) on June 25–27, 2009. The series of NOLISP workshops started in 2003 has become a biannual event whose aim is to discuss alternative techniques for speech processing that, in a sense, do not fit into mainstream approaches. A selected choice of papers based on the presentations delivered at NOLISP’09 has given rise to this issue of Cognitive Computation.
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending the corresponding approaches to the regional scale represents a major, and as-of-yet largely unresolved, challenge. To address this problem, we have developed an upscaling procedure based on a Bayesian sequential simulation approach. This method is then applied to the stochastic integration of low-resolution, regional-scale electrical resistivity tomography (ERT) data in combination with high-resolution, local-scale downhole measurements of the hydraulic and electrical conductivities. Finally, the overall viability of this upscaling approach is tested and verified by performing and comparing flow and transport simulation through the original and the upscaled hydraulic conductivity fields. Our results indicate that the proposed procedure does indeed allow for obtaining remarkably faithful estimates of the regional-scale hydraulic conductivity structure and correspondingly reliable predictions of the transport characteristics over relatively long distances.
Abstract:
The study of human motion, which relies on mathematical and computational models in general, and on multibody dynamic biomechanical models in particular, has become the subject of much recent research. A human body model can be applied to different physical exercises, and many important quantities, such as muscle forces, which are difficult to measure in practical experiments, can be obtained easily. In this work, a human skeletal lower-limb model consisting of three bodies is built using the flexible multibody dynamics simulation approach. The floating frame of reference formulation is used to account for the flexibility of the bones in the human lower-limb model. The main reason for considering the flexibility of the human bones is to measure the strains in the bone resulting from different physical exercises. It has been observed that bone under strain becomes stronger in order to cope with the exercise; on the other hand, bone strength is considered an important factor in reducing bone fractures. The simulation approach and model developed in this work are used to measure the bone strains resulting from an exercise of raising the sole of the foot. The simulation results are compared with results available in the literature, and the comparison shows good agreement. This study sheds light on the importance of using the flexible multibody dynamic simulation approach to build human biomechanical models, which can be used in developing exercises to achieve optimal bone strength.
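For reference, the kinematic core of the floating frame of reference formulation describes the global position r of a material point on a flexible body as (standard textbook notation, not specific to this paper)

    \[
      \mathbf{r} = \mathbf{R} + \mathbf{A}(\boldsymbol{\theta})
      \big(\bar{\mathbf{u}}_0 + \mathbf{S}\,\mathbf{q}_f\big),
    \]

where R and A are the position and orientation of the body reference frame, \bar{u}_0 is the undeformed local position of the point, S is the shape-function matrix and q_f are the elastic coordinates; bone strains can then be recovered from the elastic coordinates q_f.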