974 results for Exponential Polynomials


Relevance: 10.00%

Abstract:

A poor representation of cloud structure in a general circulation model (GCM) is widely recognised as a potential source of error in the radiation budget. Here, we develop a new way of representing both horizontal and vertical cloud structure in a radiation scheme. This combines the ‘Tripleclouds’ parametrization, which introduces inhomogeneity by using two cloudy regions in each layer as opposed to one, each with different water content values, with ‘exponential-random’ overlap, in which clouds in adjacent layers are not overlapped maximally, but according to a vertical decorrelation scale. This paper, Part I of two, aims to parametrize the two effects such that they can be used in a GCM. To achieve this, we first review a number of studies for a globally applicable value of fractional standard deviation of water content for use in Tripleclouds. We obtain a value of 0.75 ± 0.18 from a variety of different types of observations, with no apparent dependence on cloud type or gridbox size. Then, through a second short review, we create a parametrization of decorrelation scale for use in exponential-random overlap, which varies the scale linearly with latitude from 2.9 km at the Equator to 0.4 km at the poles. When applied to radar data, both components are found to have radiative impacts capable of offsetting biases caused by cloud misrepresentation. Part II of this paper implements Tripleclouds and exponential-random overlap into a radiation code and examines both their individual and combined impacts on the global radiation budget using re-analysis data.
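
As a rough illustration of the two components, the sketch below combines the latitude-dependent decorrelation scale quoted in the abstract with the standard exponential-random blend of maximum and random overlap for a pair of cloudy layers. The blend formula and all numerical choices are illustrative assumptions, not taken from the paper itself.

```python
import numpy as np

def decorrelation_scale_km(lat_deg):
    """Decorrelation scale varying linearly with latitude, as in the
    abstract: 2.9 km at the Equator down to 0.4 km at the poles."""
    return 2.9 - (2.9 - 0.4) * np.abs(lat_deg) / 90.0

def pair_cloud_cover(c1, c2, dz_km, lat_deg):
    """Combined cover of two cloudy layers under exponential-random
    overlap: a blend of the maximum- and random-overlap limits weighted
    by alpha = exp(-dz / z0) (standard form, assumed here)."""
    z0 = decorrelation_scale_km(lat_deg)
    alpha = np.exp(-dz_km / z0)
    c_max = np.maximum(c1, c2)        # maximum-overlap limit
    c_rand = c1 + c2 - c1 * c2        # random-overlap limit
    return alpha * c_max + (1.0 - alpha) * c_rand

# Two half-cover layers 1 km apart: more overlapped in the tropics,
# closer to random overlap near the poles.
print(pair_cloud_cover(0.5, 0.5, 1.0, lat_deg=0.0))   # ~0.57
print(pair_cloud_cover(0.5, 0.5, 1.0, lat_deg=80.0))  # ~0.69
```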

Relevance: 10.00%

Abstract:

Reliably representing both horizontal cloud inhomogeneity and vertical cloud overlap is fundamentally important for the radiation budget of a general circulation model. Here, we build on the work of Part One of this two-part paper by applying a pair of parameterisations that account for horizontal inhomogeneity and vertical overlap to global re-analysis data. These are applied both together and separately in an attempt to quantify the effects of poor representation of the two components on the radiation budget. Horizontal inhomogeneity is accounted for using the “Tripleclouds” scheme, which uses two regions of cloud in each layer of a gridbox as opposed to one; vertical overlap is accounted for using “exponential-random” overlap, which aligns vertically continuous cloud according to a decorrelation height. These are applied to a sample of scenes from a year of ERA-40 data. The largest radiative effect of horizontal inhomogeneity is found to be in areas of marine stratocumulus; the effect of vertical overlap is found to be fairly uniform, but with larger individual short-wave and long-wave effects in areas of deep tropical convection. The combined effect of the two parameterisations is found to reduce the magnitude of the net top-of-atmosphere cloud radiative forcing (CRF) by 2.25 W m−2, with shifts of up to 10 W m−2 in areas of marine stratocumulus. The effect of the uncertainty in our parameterisations on the radiation budget is also investigated. It is found that the uncertainty in the impact of horizontal inhomogeneity is of order ±60%, while the uncertainty in the impact of vertical overlap is much smaller. This suggests an insensitivity of the radiation budget to the exact nature of the global decorrelation height distribution derived in Part One.

Relevance: 10.00%

Abstract:

In recent years nonpolynomial finite element methods have received increasing attention for the efficient solution of wave problems. As with their close cousin the method of particular solutions, high efficiency comes from using solutions to the Helmholtz equation as basis functions. We present and analyze such a method for the scattering of two-dimensional scalar waves from a polygonal domain that achieves exponential convergence purely by increasing the number of basis functions in each element. Key ingredients are the use of basis functions that capture the singularities at corners and the representation of the scattered field towards infinity by a combination of fundamental solutions. The solution is obtained by minimizing a least-squares functional, which we discretize in such a way that a matrix least-squares problem is obtained. We give computable exponential bounds on the rate of convergence of the least-squares functional that are in very good agreement with the observed numerical convergence. Challenging numerical examples, including a nonconvex polygon with several corner singularities, and a cavity domain, are solved to around 10 digits of accuracy with a few seconds of CPU time. The examples are implemented concisely with MPSpack, a MATLAB toolbox for wave computations with nonpolynomial basis functions, developed by the authors. A code example is included.
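
The paper's examples use MPSpack, a MATLAB toolbox; as a language-neutral illustration, the Python sketch below implements the bare fundamental-solutions/least-squares idea on a circle: represent the scattered field by point-source fundamental solutions placed inside the scatterer and choose coefficients by least squares so that the total field vanishes on the boundary. It omits the corner-adapted basis functions that give the paper its exponential convergence; geometry and parameters are illustrative assumptions.

```python
import numpy as np
from scipy.special import hankel1

k = 20.0                                  # wavenumber (illustrative)
d = np.array([1.0, 0.0])                  # incident plane-wave direction

# Toy scatterer: the unit circle (the paper treats polygons with
# corner-adapted bases; a circle keeps this sketch short).
t = np.linspace(0, 2 * np.pi, 400, endpoint=False)
x_bdry = np.column_stack([np.cos(t), np.sin(t)])

# Fundamental-solution sources on a smaller circle inside the scatterer.
ts = np.linspace(0, 2 * np.pi, 120, endpoint=False)
y_src = 0.7 * np.column_stack([np.cos(ts), np.sin(ts)])

# Least-squares system: sum_j c_j (i/4) H_0^(1)(k|x_i - y_j|) = -u_inc(x_i),
# enforcing the sound-soft condition u_scat + u_inc = 0 on the boundary.
dists = np.linalg.norm(x_bdry[:, None, :] - y_src[None, :, :], axis=2)
A = 0.25j * hankel1(0, k * dists)
rhs = -np.exp(1j * k * (x_bdry @ d))

c, *_ = np.linalg.lstsq(A, rhs, rcond=None)
print("relative boundary residual:",
      np.linalg.norm(A @ c - rhs) / np.linalg.norm(rhs))
```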

Relevance: 10.00%

Abstract:

We consider the classical coupled, combined-field integral equation formulations for time-harmonic acoustic scattering by a sound soft bounded obstacle. In recent work, we have proved lower and upper bounds on the $L^2$ condition numbers for these formulations, and also on the norms of the classical acoustic single- and double-layer potential operators. These bounds to some extent make explicit the dependence of condition numbers on the wave number $k$, the geometry of the scatterer, and the coupling parameter. For example, with the usual choice of coupling parameter they show that, while the condition number grows like $k^{1/3}$ as $k\to\infty$, when the scatterer is a circle or sphere, it can grow as fast as $k^{7/5}$ for a class of `trapping' obstacles. In this paper we prove further bounds, sharpening and extending our previous results. In particular we show that there exist trapping obstacles for which the condition numbers grow as fast as $\exp(\gamma k)$, for some $\gamma>0$, as $k\to\infty$ through some sequence. This result depends on exponential localisation bounds on Laplace eigenfunctions in an ellipse that we prove in the appendix. We also clarify the correct choice of coupling parameter in 2D for low $k$. In the second part of the paper we focus on the boundary element discretisation of these operators. We discuss the extent to which the bounds on the continuous operators are also satisfied by their discrete counterparts and, via numerical experiments, we provide supporting evidence for some of the theoretical results, both quantitative and asymptotic, indicating further which of the upper and lower bounds may be sharper.
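
On the unit circle the classical layer potentials diagonalise in the Fourier basis, which makes the circle case of these bounds easy to probe numerically. The sketch below estimates the $L^2$ norm of the single-layer operator $S_k$ from the classical eigenvalue formula $\lambda_n = (i\pi/2) J_n(k) H_n^{(1)}(k)$; sign and scaling conventions vary between references, so treat the constants as assumptions.

```python
import numpy as np
from scipy.special import jv, hankel1

def single_layer_norm(k, nmax=None):
    """L2 norm of the acoustic single-layer operator S_k on the unit
    circle via its Fourier diagonalisation,
        lambda_n = (i*pi/2) J_n(k) H_n^(1)(k)
    (classical formula; conventions vary between references)."""
    if nmax is None:
        nmax = int(1.5 * k) + 60      # resolve the transition region n ~ k
    n = np.arange(0, nmax + 1)
    lam = 0.5j * np.pi * jv(n, k) * hankel1(n, k)
    return np.abs(lam).max()

# For the circle (no trapping) the norm decays slowly with k; nothing
# like the exponential growth the paper constructs for trapping obstacles.
for k in [10.0, 40.0, 160.0, 640.0]:
    print(f"k = {k:6.0f}   ||S_k|| ~ {single_layer_norm(k):.4e}")
```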

Relevance: 10.00%

Abstract:

We consider scattering of a time harmonic incident plane wave by a convex polygon with piecewise constant impedance boundary conditions. Standard finite or boundary element methods require the number of degrees of freedom to grow at least linearly with respect to the frequency of the incident wave in order to maintain accuracy. Extending earlier work by Chandler-Wilde and Langdon for the sound soft problem, we propose a novel Galerkin boundary element method, with the approximation space consisting of the products of plane waves with piecewise polynomials supported on a graded mesh with smaller elements closer to the corners of the polygon. Theoretical analysis and numerical results suggest that the number of degrees of freedom required to achieve a prescribed level of accuracy grows only logarithmically with respect to the frequency of the incident wave.
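
The graded mesh is the key structural ingredient: element sizes shrink geometrically towards each corner so that the corner singularities are resolved with few degrees of freedom. The sketch below builds such a mesh on one polygon side; the grading ratio and layer count are illustrative choices, not the paper's prescription.

```python
import numpy as np

def graded_mesh(side_length, n_layers, sigma=0.15):
    """Mesh points on [0, side_length], geometrically graded towards the
    corner at 0; sigma is an illustrative grading ratio."""
    pts = side_length * sigma ** np.arange(n_layers, -1, -1.0)
    return np.concatenate([[0.0], pts])

mesh = graded_mesh(1.0, n_layers=8)
print(mesh)   # smallest element ~ sigma^8 near the corner, O(1) away from it
```

Loosely, only a few extra layers are needed near each corner as the frequency grows, which is where the logarithmic growth in degrees of freedom comes from.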

Relevance: 10.00%

Abstract:

We consider the scattering of a time-harmonic acoustic incident plane wave by a sound soft convex curvilinear polygon with Lipschitz boundary. For standard boundary or finite element methods, with a piecewise polynomial approximation space, the number of degrees of freedom required to achieve a prescribed level of accuracy grows at least linearly with respect to the frequency of the incident wave. Here we propose a novel Galerkin boundary element method with a hybrid approximation space, consisting of the products of plane wave basis functions with piecewise polynomials supported on several overlapping meshes: a uniform mesh on illuminated sides, and graded meshes refined towards the corners of the polygon on illuminated and shadow sides. Numerical experiments suggest that the number of degrees of freedom required to achieve a prescribed level of accuracy need only grow logarithmically as the frequency of the incident wave increases.
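
A minimal picture of the hybrid approximation space: each basis function is a piecewise polynomial on one mesh element multiplied by a plane-wave factor $e^{\pm iks}$ oscillating along the side. The sketch below evaluates one such function; the polynomial family, degree, and wave directions are illustrative assumptions.

```python
import numpy as np

def hybrid_basis(s, element, k, degree, direction=+1):
    """One hybrid basis function on the element [a, b]: a Legendre
    polynomial scaled to the element times the plane-wave factor
    exp(+/- i k s)."""
    a, b = element
    t = 2 * (s - a) / (b - a) - 1            # map [a, b] to [-1, 1]
    poly = np.polynomial.legendre.Legendre.basis(degree)(t)
    wave = np.exp(1j * direction * k * s)    # oscillation along the side
    return np.where((s >= a) & (s <= b), poly * wave, 0.0)

s = np.linspace(0.0, 1.0, 1000)
phi = hybrid_basis(s, (0.25, 0.5), k=50.0, degree=2)
```

The plane-wave factor carries the oscillation, so the polynomial part only has to track the slowly varying amplitude; this is why the element count can stay almost independent of frequency.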

Relevance: 10.00%

Abstract:

A simple parameter adaptive controller design methodology is introduced in which steady-state servo tracking properties provide the major control objective. This is achieved without cancellation of process zeros, and hence the underlying design can be applied to non-minimum phase systems. As with other self-tuning algorithms, the design (user-specified) polynomials of the proposed algorithm define the performance capabilities of the resulting controller. However, with the appropriate definition of these polynomials, the synthesis technique can be shown to admit different adaptive control strategies, e.g. self-tuning PID and self-tuning pole-placement controllers. The algorithm can therefore be thought of as an embodiment of other self-tuning design techniques. The performance of some of the resulting controllers is illustrated using simulation examples and an on-line application to an experimental apparatus.
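
The estimation half of such self-tuners is usually a recursive least-squares (RLS) identifier for the process polynomials, with the control design applied to the running estimates. Below is a generic RLS sketch with a forgetting factor; it illustrates the common estimator, not this paper's specific algorithm.

```python
import numpy as np

class RLS:
    """Recursive least squares with exponential forgetting, the usual
    parameter estimator inside self-tuning controllers (illustrative)."""

    def __init__(self, n_params, lam=0.98):
        self.theta = np.zeros(n_params)    # parameter estimates
        self.P = 1e4 * np.eye(n_params)    # covariance (large = uncertain)
        self.lam = lam                     # forgetting factor

    def update(self, phi, y):
        """phi: regressor, e.g. [-y(t-1), ..., u(t-1), ...]; y: new output."""
        Pphi = self.P @ phi
        gain = Pphi / (self.lam + phi @ Pphi)
        err = y - phi @ self.theta         # one-step prediction error
        self.theta += gain * err
        self.P = (self.P - np.outer(gain, Pphi)) / self.lam
        return self.theta
```

At each sample the updated estimates feed the chosen design polynomials (PID-like or pole-placement), which is what lets one synthesis framework host several strategies.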

Relevance: 10.00%

Abstract:

The problem of identifying a nonlinear dynamic system is considered. A two-layer neural network is used for the solution of the problem. Systems disturbed by unmeasurable noise are considered, where the disturbance is known to be a random piecewise-polynomial process. Absorption polynomials and nonquadratic loss functions are used to reduce the effect of this disturbance on the estimates of the optimal memory of the neural-network model.
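
The pairing of a two-layer network with a nonquadratic loss can be illustrated with a Huber-type loss, which is quadratic for small residuals and linear for large ones, so occasional large piecewise-polynomial disturbances distort the fit less than a least-squares criterion would. Everything below (loss choice, architecture, data) is an illustrative assumption, not the paper's construction.

```python
import numpy as np

def huber_grad(e, delta=1.0):
    """Gradient of the Huber loss: linear tails bound the influence of
    large disturbance-driven residuals."""
    return np.clip(e, -delta, delta)

rng = np.random.default_rng(0)
n_in, n_hidden = 4, 16
W1 = 0.5 * rng.standard_normal((n_hidden, n_in)); b1 = np.zeros(n_hidden)
w2 = 0.5 * rng.standard_normal(n_hidden); b2 = 0.0

def train_step(X, y, lr=1e-2, delta=1.0):
    """One gradient step for the two-layer model y = w2 . tanh(W1 x + b1) + b2."""
    global W1, b1, w2, b2
    H = np.tanh(X @ W1.T + b1)             # hidden layer, shape (N, n_hidden)
    e = H @ w2 + b2 - y                    # residuals
    g = huber_grad(e, delta) / len(y)      # robust error signal
    back = np.outer(g, w2) * (1 - H**2)    # backprop through tanh
    W1 -= lr * (back.T @ X); b1 -= lr * back.sum(axis=0)
    w2 -= lr * (H.T @ g);    b2 -= lr * g.sum()

# Identification data with a sparse, large disturbance on the output.
X = rng.standard_normal((256, n_in))
y = np.sin(X[:, 0]) + 0.3 * X[:, 1] + rng.choice([0.0, 3.0], 256, p=[0.9, 0.1])
for _ in range(2000):
    train_step(X, y)
```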

Relevance: 10.00%

Abstract:

The extinction of dinosaurs at the Cretaceous/Paleogene (K/Pg) boundary was the seminal event that opened the door for the subsequent diversification of terrestrial mammals. Our compilation of maximum body size at the ordinal level by sub-epoch shows a near-exponential increase after the K/Pg. On each continent, the maximum size of mammals leveled off after 40 million years ago and thereafter remained approximately constant. There was remarkable congruence in the rate, trajectory, and upper limit across continents, orders, and trophic guilds, despite differences in geological and climatic history, turnover of lineages, and ecological variation. Our analysis suggests that although the primary driver for the evolution of giant mammals was diversification to fill ecological niches, environmental temperature and land area may have ultimately constrained the maximum size achieved.

Relevance: 10.00%

Abstract:

Greater attention has been focused on the use of CDMA for future cellular mobile communications. A near-far resistant detector for asynchronous code-division multiple-access (CDMA) systems operating in additive white Gaussian noise (AWGN) channels is presented. The multiuser interference caused by K users transmitting simultaneously, each with a specific signature sequence, is completely removed at the receiver. The complexity of this detector grows only linearly with the number of users, in contrast to the optimum multiuser detector, whose complexity is exponential in the number of users. A modified algorithm based on time diversity is described. It performs detection on a bit-by-bit basis and avoids the complexity of a sequence detector. The performance of this detector is shown to be superior to that of the conventional receiver.
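
The core linear-algebra idea behind near-far resistant (decorrelating) detection is easiest to see in a synchronous toy model: the matched-filter outputs satisfy y = R A b + noise, so applying R⁻¹ removes the multiuser interference regardless of the users' power levels. The sketch below is that synchronous simplification, not the paper's asynchronous, time-diversity algorithm; all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
K, N = 4, 31                               # users, chips per bit

S = rng.choice([-1.0, 1.0], size=(K, N)) / np.sqrt(N)   # signature sequences
R = S @ S.T                                # cross-correlation matrix

b = rng.choice([-1.0, 1.0], size=K)        # transmitted bits
A = np.diag([1.0, 4.0, 4.0, 4.0])          # user 1 weak: a near-far scenario
r = S.T @ (A @ b) + 0.1 * rng.standard_normal(N)   # received chip vector

y = S @ r                                  # matched-filter outputs
b_conventional = np.sign(y)                # conventional single-user receiver
b_decorrelator = np.sign(np.linalg.solve(R, y))    # decorrelator: R^{-1} y

print("truth        :", b)
print("conventional :", b_conventional)
print("decorrelator :", b_decorrelator)
```

Solving with R keeps the per-bit work polynomial in K, in the same spirit as the paper's avoidance of the exponential-complexity optimum detector.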

Relevance: 10.00%

Abstract:

This paper presents the theoretical development of a nonlinear adaptive filter based on a concept of filtering by approximated densities (FAD). The most common procedures for nonlinear estimation apply the extended Kalman filter. In contrast to such conventional techniques, the proposed recursive algorithm does not require any linearisation. The prediction uses a maximum entropy principle subject to constraints; thus, the densities created are of an exponential type and depend on a finite number of parameters. The filtering yields recursive equations involving these parameters, and the update applies Bayes' theorem. Through simulation on a generic exponential model, the proposed nonlinear filter is implemented and its results prove superior to those of the extended Kalman filter and a class of nonlinear filters based on partitioning algorithms.
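
The finite-parameter recursion is the heart of the FAD idea: if the density stays in an exponential family, a Bayes update only moves its natural parameters. The simplest concrete case is Gaussian, where the update is literally addition of natural parameters; the sketch below shows that case as an illustration, not the paper's general constrained maximum-entropy construction.

```python
import numpy as np

# Exponential-type density p(x) ~ exp(eta1*x + eta2*x^2); for a Gaussian,
# eta1 = mu/var and eta2 = -1/(2*var).

def to_natural(mu, var):
    return np.array([mu / var, -0.5 / var])

def from_natural(eta):
    var = -0.5 / eta[1]
    return eta[0] * var, var

def bayes_update(eta_prior, z, meas_var):
    """Bayes' theorem: multiplying the prior by a Gaussian likelihood of z
    simply adds natural parameters."""
    return eta_prior + to_natural(z, meas_var)

eta = to_natural(0.0, 10.0)          # diffuse prior
for z in [1.2, 0.8, 1.1]:            # measurements
    eta = bayes_update(eta, z, meas_var=0.5)
print(from_natural(eta))             # posterior mean ~1.0, shrunken variance
```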

Relevance: 10.00%

Abstract:

Smooth trajectories are essential for safe interaction between a human and a haptic interface. Different methods and strategies have been introduced to create such smooth trajectories. This paper studies the creation of human-like movements in haptic interfaces, based on the study of human arm motion. These motions are intended to retrain the upper-limb movements of patients who have lost manipulation function following a stroke. We present a model that uses higher-degree polynomials to define a trajectory and control the robot arm to achieve minimum-jerk movements, and we study different methods, derived from these polynomials, for creating more realistic human-like movements for therapeutic purposes.
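
The benchmark such polynomial models build on is the classical minimum-jerk profile: the unique fifth-degree polynomial between rest states with zero velocity and acceleration at both ends. A minimal sketch (the paper's higher-degree variants are not reproduced here):

```python
import numpy as np

def minimum_jerk(x0, xf, T, t):
    """Classical minimum-jerk trajectory from x0 to xf in time T:
    x(t) = x0 + (xf - x0) * (10 tau^3 - 15 tau^4 + 6 tau^5)."""
    tau = np.clip(t / T, 0.0, 1.0)
    s = 10 * tau**3 - 15 * tau**4 + 6 * tau**5
    return x0 + (xf - x0) * s

t = np.linspace(0.0, 2.0, 201)
x = minimum_jerk(0.0, 0.3, 2.0, t)    # e.g. a 0.3 m reach over 2 s
```

Higher-degree polynomials add free coefficients beyond these boundary constraints, which is what allows the more human-like shaping the paper explores.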

Relevance: 10.00%

Abstract:

A One-Dimensional Time to Explosion (ODTX) apparatus has been used to study the times to explosion of a number of compositions based on RDX and HMX over a range of contact temperatures. The times to explosion at any given temperature tend to increase from RDX to HMX and with the proportion of HMX in the composition. Thermal ignition theory has been applied to the time-to-explosion data to calculate kinetic parameters. The apparent activation energy for all of the compositions lay between 127 kJ mol−1 and 146 kJ mol−1. There were, however, large differences in the pre-exponential factor, and it was this factor, rather than the activation energy, that controlled the time to explosion.
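
A standard way to connect such kinetic parameters to a time to explosion is the classical adiabatic induction-time estimate from thermal ignition theory, t = (c R T² / (Q A E)) exp(E/RT), in which the pre-exponential factor A enters inversely. The sketch below evaluates it with placeholder thermophysical values; only the activation-energy range comes from the abstract.

```python
import numpy as np

R = 8.314                                   # gas constant, J mol^-1 K^-1

def adiabatic_induction_time(T, E, A, c=1000.0, Q=2.0e6):
    """Classical adiabatic induction time t = (c R T^2 / (Q A E)) exp(E/RT).
    c [J kg^-1 K^-1] and Q [J kg^-1] are placeholder values, not data
    from the paper."""
    return (c * R * T**2) / (Q * A * E) * np.exp(E / (R * T))

T = 500.0                                   # contact temperature, K (illustrative)
for E_kJ, A in [(127.0, 1.0e12), (146.0, 1.0e14)]:
    t = adiabatic_induction_time(T, E_kJ * 1e3, A)
    print(f"E = {E_kJ} kJ/mol, A = {A:.0e} 1/s  ->  t = {t:.3g} s")
```

When pre-exponential factors differ by many orders of magnitude, that term dominates the predicted times, consistent with the abstract's conclusion.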

Relevance: 10.00%

Abstract:

The Routh-stability method is employed to reduce the order of discrete-time system transfer functions. It is shown that the Routh approximant is well suited to reducing both the denominator and the numerator polynomials, although alternative methods, such as Padé–Markov approximation, can also be used to fit the model numerator coefficients.

Relevance: 10.00%

Abstract:

This study was designed to determine the response of in vitro fermentation parameters to incremental levels of polyethylene glycol (PEG) when tanniniferous tree fruits (Dichrostachys cinerea, Acacia erioloba, A. erubiscens, A. nilotica and Piliostigma thonningii) were fermented using the Reading Pressure Technique. The trivalent ytterbium precipitable phenolics content of fruit substrates ranged from 175 g/kg DM in A. erubiscens to 607 g/kg DM in A. nilotica, while the soluble condensed tannin content ranged from 0.09 AU550nm/40 mg in A. erioloba to 0.52 AU550nm/40 mg in D. cinerea. The ADF was highest in P. thonningii fruits (402 g/kg DM) and lowest in A. nilotica fruits (165 g/kg DM). Increasing the level of PEG caused an exponential rise to a maximum (asymptote) for cumulative gas production, rate of gas production and nitrogen degradability in all substrates except P. thonningii fruits. Dry matter degradability for fruits containing higher levels of soluble condensed tannins (D. cinerea and P. thonningii) showed little response to incremental levels of PEG after incubation for 24 h. The minimum level of PEG required to maximize in vitro fermentation of tree fruits was found to be 200 mg PEG/g DM of sample for all tree species except A. erubiscens fruits, which required 100 mg PEG/g DM of sample. The study provides evidence that PEG levels lower than 1 g/g DM of sample can be used for in vitro tannin bioassays to reduce the cost of evaluating non-conventional tanniniferous feedstuffs used in developing countries in the tropics and subtropics. The use of in vitro nitrogen degradability in place of the favoured dry matter degradability improved the accuracy of PEG as a diagnostic tool for tannins in in vitro fermentation systems.
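
The "exponential rise to a maximum" response can be captured by the asymptotic model y = y0 + a(1 − exp(−b·PEG)), and fitting it is straightforward with standard tools. The sketch below fits synthetic, illustrative numbers (not the study's measurements) and reads off an approximate saturating dose.

```python
import numpy as np
from scipy.optimize import curve_fit

def rise_to_max(peg, y0, a, b):
    """Exponential rise to an asymptote: y = y0 + a * (1 - exp(-b * peg))."""
    return y0 + a * (1.0 - np.exp(-b * peg))

# Synthetic gas-production data (ml) vs PEG dose (mg/g DM) -- illustrative.
peg = np.array([0.0, 50.0, 100.0, 200.0, 400.0, 800.0])
gas = np.array([12.0, 25.0, 33.0, 39.0, 41.0, 41.5])

(y0, a, b), _ = curve_fit(rise_to_max, peg, gas, p0=[12.0, 30.0, 0.01])
print(f"asymptote = {y0 + a:.1f} ml; ~95% of the rise at {3.0 / b:.0f} mg/g DM")
```

Comparing such fitted saturation doses across substrates is one way to justify PEG levels well below the conventional 1 g/g DM.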