15 results for One-point Quadrature

in CentAUR: Central Archive University of Reading - UK


Relevance: 80.00%

Abstract:

We show that any invariant test for spatial autocorrelation in a spatial error or spatial lag model with an equal-weights matrix has power equal to size. This result holds under the assumption of an elliptical distribution. Under Gaussianity, we also show that any test whose power is larger than its size for at least one point in the parameter space must be biased.
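
A quick way to see this result empirically is a Monte Carlo check. The sketch below is a minimal illustration, not the paper's derivation; the sample size, regressor, significance level and value of the spatial parameter are arbitrary. It simulates a spatial error model with an equal-weights matrix and shows that the rejection rate of Moran's I computed on OLS residuals is the same whether or not spatial autocorrelation is present.

```python
# Minimal Monte Carlo sketch (not the authors' derivation) of the "power equals size"
# result for an equal-weights matrix in a spatial error model.
import numpy as np

rng = np.random.default_rng(0)
n, reps, alpha = 30, 5000, 0.05

# Equal-weights matrix: every unit is a neighbour of every other, row-standardised.
W = (np.ones((n, n)) - np.eye(n)) / (n - 1)
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # intercept + one regressor
M = np.eye(n) - X @ np.linalg.inv(X.T @ X) @ X.T        # residual-maker matrix

def moran_I(y):
    """Moran's I computed on OLS residuals (an invariant test statistic)."""
    e = M @ y
    return (e @ W @ e) / (e @ e)

def simulate(rho):
    stats = np.empty(reps)
    for r in range(reps):
        eps = rng.normal(size=n)
        u = np.linalg.solve(np.eye(n) - rho * W, eps)    # spatial error: u = rho*W*u + eps
        stats[r] = moran_I(X @ np.array([1.0, 2.0]) + u)
    return stats

crit = np.quantile(simulate(0.0), 1 - alpha)             # empirical critical value (size alpha)
power = np.mean(simulate(0.5) > crit)                    # rejection rate under rho = 0.5
print(f"size = {alpha:.3f}, power at rho = 0.5 = {power:.3f}")  # ~0.05: power equals size
```

With an intercept in the regression, the equal-weights matrix only rescales the residual vector uniformly, so any scale-invariant residual statistic has the same distribution for every value of the spatial parameter, which is what the output reflects.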

Relevance: 80.00%

Abstract:

Unless the benefits to society of measures to protect and improve the welfare of animals are made transparent by means of their valuation, they are likely to go unrecognised and cannot easily be weighed against the costs of such measures as required, for example, by policy-makers. A simple single-measure scoring system, based on the Welfare Quality® index, is used, together with a choice experiment economic valuation method, to estimate the value that people place on improvements to the welfare of different farm animal species measured on a continuous (0-100) scale. Results from using the method on a survey sample of some 300 people show that it is able to elicit apparently credible values. The survey found that 96% of respondents thought that we have a moral obligation to safeguard the welfare of animals and that over 72% were concerned about the way farm animals are treated. Estimated mean annual willingness to pay for meat from animals with improved welfare of just one point on the scale was £5.24 for beef cattle, £4.57 for pigs and £5.10 for meat chickens. Further development of the method is required to capture the total economic value of animal welfare benefits. Despite this, the method is considered a practical means for obtaining economic values that can be used in the cost-benefit appraisal of policy measures intended to improve the welfare of animals.
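
For readers unfamiliar with how such values are derived, the sketch below shows the standard marginal willingness-to-pay calculation used with choice-experiment conditional logit models, WTP = -beta_attribute / beta_price. The coefficients are purely illustrative and are not the estimates from this study.

```python
# Minimal sketch (illustrative coefficients, not the paper's estimates) of how a
# marginal willingness-to-pay per one-point welfare improvement is typically recovered
# from a choice-experiment conditional logit: WTP = -beta_attribute / beta_price.
beta_welfare = 0.083   # hypothetical utility weight per welfare-scale point
beta_price = -0.016    # hypothetical (negative) utility weight per GBP of price

wtp_per_point = -beta_welfare / beta_price
print(f"Implied WTP per one-point welfare improvement: GBP {wtp_per_point:.2f} per year")
```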

Relevance: 80.00%

Abstract:

With the introduction of new observing systems based on asynoptic observations, the analysis problem has changed in character. In the near future we may expect that a considerable part of meteorological observations will be unevenly distributed in four dimensions, i.e. three dimensions in space and one in time. The term analysis, or objective analysis in meteorology, means the process of interpolating meteorological observations from unevenly distributed locations to a network of regularly spaced grid points. Necessitated by the requirement of numerical weather prediction models to solve the governing finite difference equations on such a grid lattice, objective analysis is a three-dimensional (or mostly two-dimensional) interpolation technique. As a consequence of the structure of the conventional synoptic network, with its separated data-sparse and data-dense areas, four-dimensional analysis has in fact been used intensively for many years. Weather services have thus based their analyses not only on synoptic data at the time of the analysis and on climatology, but also on the fields predicted from the previous observation hour and valid at the time of the analysis. The inclusion of the time dimension in objective analysis will be called four-dimensional data assimilation. From one point of view it seems possible to apply the conventional technique to the new data sources by simply reducing the time interval in the analysis-forecasting cycle. This could in fact be justified for the conventional observations as well. We have fairly good coverage of surface observations 8 times a day, and several upper-air stations make radiosonde and radiowind observations 4 times a day. If we have a 3-hour step in the analysis-forecasting cycle instead of the 12 hours that is most often applied, we may, without any difficulty, treat all observations as synoptic. No observation would then be more than 90 minutes off time, and even during strong transient motion the observations would fall within a horizontal mesh of 500 km × 500 km.
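
The closing argument is a simple back-of-the-envelope calculation, reproduced below under an assumed wind speed of 45 m/s for "strong transient motion" (the abstract does not state a figure): with a 3-hour cycle the maximum time offset is 90 minutes, and the corresponding displacement stays within a 500 km mesh.

```python
# Back-of-the-envelope check (assumed wind speed) of the abstract's argument: with a
# 3-hour analysis-forecasting cycle no observation is more than 90 minutes off time,
# so even fast-moving features remain within a 500 km x 500 km horizontal mesh.
cycle_hours = 3.0
max_offset_s = cycle_hours / 2 * 3600            # 90 minutes, in seconds
wind_speed = 45.0                                 # assumed strong transient motion, m/s
displacement_km = wind_speed * max_offset_s / 1000.0
print(f"Max time offset: {max_offset_s / 60:.0f} min; "
      f"displacement at {wind_speed:.0f} m/s: {displacement_km:.0f} km (within a 500 km mesh)")
```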

Relevance: 30.00%

Abstract:

We use the point-source method (PSM) to reconstruct a scattered field from its associated far-field pattern. The reconstruction scheme is described and numerical results are presented for three-dimensional acoustic and electromagnetic scattering problems. We give new proofs of the algorithms, based on the Green and Stratton-Chu formulae, which are more general than the earlier proofs based on the reciprocity relation. This allows us to handle the case of limited-aperture data and arbitrary incident fields. For both 3D acoustics and electromagnetics, numerical reconstructions of the field for different settings and with noisy data are shown. For shape reconstruction in acoustics, we develop an appropriate strategy to identify areas with good reconstruction quality and combine different such regions into one joint function. Then, we show how shapes of unknown sound-soft scatterers are found as level curves of the total reconstructed field.
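
The computational core of the PSM is the approximation of a point source by a superposition of plane waves (a Herglotz wave function) on a domain that excludes the source point. The sketch below is a simplified, hedged illustration of that single ingredient for the 3-D acoustic case, using an arbitrary wavenumber, geometry and Tikhonov regularisation parameter; it is not the authors' reconstruction scheme.

```python
# Hedged sketch of the core PSM ingredient: approximating the acoustic point source
# Phi(x, z) = exp(ik|x-z|)/(4*pi*|x-z|) on an approximation domain by a Herglotz wave
# function (superposition of plane waves) via Tikhonov-regularised least squares.
# Geometry, wavenumber and regularisation parameter are illustrative choices.
import numpy as np

rng = np.random.default_rng(1)
k = 5.0                          # wavenumber (assumed)
z = np.array([0.0, 0.0, 2.0])    # source point, outside the approximation domain

# Plane-wave directions: random points on the unit sphere, equal quadrature weights.
d = rng.normal(size=(400, 3)); d /= np.linalg.norm(d, axis=1, keepdims=True)
w = 4 * np.pi / len(d)

# Matching points inside the approximation domain (a small ball around the origin).
x = rng.normal(size=(200, 3))
x *= 0.5 * rng.random((200, 1)) / np.linalg.norm(x, axis=1, keepdims=True)

A = w * np.exp(1j * k * x @ d.T)                 # discretised Herglotz operator
r = np.linalg.norm(x - z, axis=1)
b = np.exp(1j * k * r) / (4 * np.pi * r)         # point source at z, evaluated at x

alpha = 1e-8                                     # Tikhonov regularisation parameter
g = np.linalg.solve(A.conj().T @ A + alpha * np.eye(len(d)), A.conj().T @ b)

rel_err = np.linalg.norm(A @ g - b) / np.linalg.norm(b)
print(f"relative approximation error of the point source: {rel_err:.2e}")
```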

Relevance: 30.00%

Abstract:

A number of authors have proposed clinical trial designs involving the comparison of several experimental treatments with a control treatment in two or more stages. At the end of the first stage, the most promising experimental treatment is selected, and all other experimental treatments are dropped from the trial. Provided it is good enough, the selected experimental treatment is then compared with the control treatment in one or more subsequent stages. The analysis of data from such a trial is problematic because of the treatment selection and the possibility of stopping at interim analyses. These aspects lead to bias in the maximum-likelihood estimate of the advantage of the selected experimental treatment over the control and to inaccurate coverage for the associated confidence interval. In this paper, we evaluate the bias of the maximum-likelihood estimate and propose a bias-adjusted estimate. We also propose an approach to the construction of a confidence region for the vector of advantages of the experimental treatments over the control based on an ordering of the sample space. These regions are shown to have accurate coverage, although they are also shown to be necessarily unbounded. Confidence intervals for the advantage of the selected treatment are obtained from the confidence regions and are shown to have more accurate coverage than the standard confidence interval based upon the maximum-likelihood estimate and its asymptotic standard error.
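
The selection bias that motivates the paper is easy to reproduce in a toy setting. The Monte Carlo sketch below, a simple normal-means model with an arbitrary number of arms, sample size and effect size rather than the paper's trial design, shows that the naive estimate of the selected arm's advantage over control is biased upwards.

```python
# Minimal Monte Carlo sketch (toy normal-means setting, not the paper's design) of why
# selecting the most promising arm at an interim look biases the maximum-likelihood
# estimate of its advantage over control upwards.
import numpy as np

rng = np.random.default_rng(2)
K, n, true_adv, sigma, reps = 4, 50, 0.2, 1.0, 20000

bias_samples = []
for _ in range(reps):
    # Stage-1 sample means for K experimental arms and a control arm.
    exp_means = rng.normal(true_adv, sigma / np.sqrt(n), size=K)
    ctrl_mean = rng.normal(0.0, sigma / np.sqrt(n))
    best = np.argmax(exp_means)                   # select the most promising arm
    bias_samples.append(exp_means[best] - ctrl_mean - true_adv)

print(f"true advantage = {true_adv}, mean bias of the naive estimate = {np.mean(bias_samples):.3f}")
```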

Relevance: 30.00%

Abstract:

Most factorial experiments in industrial research form one stage in a sequence of experiments and so considerable prior knowledge is often available from earlier stages. A Bayesian A-optimality criterion is proposed for choosing designs, when each stage in experimentation consists of a small number of runs and the objective is to optimise a response. Simple formulae for the weights are developed, some examples of the use of the design criterion are given and general recommendations are made. (C) 2003 Elsevier B.V. All rights reserved.
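
As a rough illustration of how such a criterion might be evaluated, the sketch below scores two candidate two-factor designs with a Bayesian A-type criterion of the form trace(W (X'X + R)^{-1}); the prior precision matrix R, the weight matrix W and the candidate designs are invented for illustration and are not those proposed in the paper.

```python
# Hedged sketch of a Bayesian A-optimality comparison between two small candidate
# designs, using the criterion trace(W (X'X + R)^{-1}) with an assumed prior precision
# matrix R and weight matrix W (illustrative values only).
import numpy as np
from itertools import product

def model_matrix(design):
    """Intercept, main effects and two-factor interaction for a 2-factor design."""
    a, b = design[:, 0], design[:, 1]
    return np.column_stack([np.ones(len(design)), a, b, a * b])

def bayes_A(design, R, W):
    X = model_matrix(design)
    return np.trace(W @ np.linalg.inv(X.T @ X + R))

R = np.diag([1e-6, 0.5, 0.5, 2.0])   # stronger prior precision on the interaction (assumed)
W = np.diag([0.0, 1.0, 1.0, 1.0])    # interest in the effects, not the intercept

full = np.array(list(product([-1, 1], repeat=2)), dtype=float)   # 4-run full factorial
tiny = np.array([[-1, -1], [1, 1], [1, -1]], dtype=float)        # a 3-run candidate

for name, d in [("4-run full factorial", full), ("3-run candidate", tiny)]:
    print(f"{name}: Bayesian A-criterion = {bayes_A(d, R, W):.3f}")
```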

Relevance: 30.00%

Abstract:

The correlated k-distribution (CKD) method is widely used in the radiative transfer schemes of atmospheric models and involves dividing the spectrum into a number of bands and then reordering the gaseous absorption coefficients within each one. The fluxes and heating rates for each band may then be computed by discretizing the reordered spectrum into on the order of 10 quadrature points per major gas and performing a monochromatic radiation calculation for each point. In this presentation it is shown that for clear-sky longwave calculations, sufficient accuracy for most applications can be achieved without the need for bands: reordering may be performed on the entire longwave spectrum. The resulting full-spectrum correlated k (FSCK) method requires significantly fewer monochromatic calculations than standard CKD to achieve a given accuracy. The concept is first demonstrated by comparing with line-by-line calculations for an atmosphere containing only water vapor, in which it is shown that the accuracy of heating-rate calculations improves approximately in proportion to the square of the number of quadrature points. For more than around 20 points, the root-mean-squared error flattens out at around 0.015 K/day due to the imperfect rank correlation of absorption spectra at different pressures in the profile. The spectral overlap of m different gases is treated by considering an m-dimensional hypercube where each axis corresponds to the reordered spectrum of one of the gases. This hypercube is then divided up into a number of volumes, each approximated by a single quadrature point, such that the total number of quadrature points is slightly fewer than the sum of the number that would be required to treat each of the gases separately. The gaseous absorptions for each quadrature point are optimized such that they minimize a cost function expressing the deviation of the heating rates and fluxes calculated by the FSCK method from line-by-line calculations for a number of training profiles. This approach is validated for atmospheres containing water vapor, carbon dioxide, and ozone, in which it is found that in the troposphere and most of the stratosphere, heating-rate errors of less than 0.2 K/day can be achieved using a total of 23 quadrature points, decreasing to less than 0.1 K/day for 32 quadrature points. It would be relatively straightforward to extend the method to include other gases.
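
The reordering step that both CKD and FSCK rely on can be illustrated with a toy spectrum. The sketch below uses synthetic absorption lines and an arbitrary absorber amount, not a line-by-line database: it reorders the spectrum into a monotone k(g) and shows that a handful of quadrature points in g-space reproduces the spectrally averaged transmittance that would otherwise require thousands of monochromatic evaluations.

```python
# Hedged toy illustration of the k-distribution idea that FSCK generalises: reorder a
# synthetic spiky absorption spectrum into a smooth cumulative distribution k(g), then
# approximate the spectrally averaged transmittance with a few quadrature points.
import numpy as np

rng = np.random.default_rng(3)
nu = np.linspace(0, 1, 20000)                      # normalised spectral coordinate
# Synthetic spiky spectrum: a sum of narrow Lorentzian "lines" (not real line data).
centres, strengths = rng.random(200), rng.lognormal(0.0, 1.5, 200)
k = sum(s * 1e-4 / ((nu - c) ** 2 + 1e-4) for c, s in zip(centres, strengths))

u = 0.3                                            # absorber amount (arbitrary units)
exact = np.mean(np.exp(-k * u))                    # spectrally averaged transmittance

k_sorted = np.sort(k)                              # reordered spectrum k(g), g in [0, 1]
g_axis = np.linspace(0, 1, len(k_sorted))
for n_quad in (4, 8, 16):
    g = (np.arange(n_quad) + 0.5) / n_quad         # simple midpoint quadrature in g-space
    approx = np.mean(np.exp(-np.interp(g, g_axis, k_sorted) * u))
    print(f"{n_quad:2d} quadrature points: transmittance {approx:.4f} (exact {exact:.4f})")
```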

Relevance: 30.00%

Abstract:

The correlated k-distribution (CKD) method is widely used in the radiative transfer schemes of atmospheric models, and involves dividing the spectrum into a number of bands and then reordering the gaseous absorption coefficients within each one. The fluxes and heating rates for each band may then be computed by discretizing the reordered spectrum into on the order of 10 quadrature points per major gas, and performing a pseudo-monochromatic radiation calculation for each point. In this paper it is first argued that for clear-sky longwave calculations, sufficient accuracy for most applications can be achieved without the need for bands: reordering may be performed on the entire longwave spectrum. The resulting full-spectrum correlated k (FSCK) method requires significantly fewer pseudo-monochromatic calculations than standard CKD to achieve a given accuracy. The concept is first demonstrated by comparing with line-by-line calculations for an atmosphere containing only water vapor, in which it is shown that the accuracy of heating-rate calculations improves approximately in proportion to the square of the number of quadrature points. For more than around 20 points, the root-mean-squared error flattens out at around 0.015 K d⁻¹ due to the imperfect rank correlation of absorption spectra at different pressures in the profile. The spectral overlap of m different gases is treated by considering an m-dimensional hypercube where each axis corresponds to the reordered spectrum of one of the gases. This hypercube is then divided up into a number of volumes, each approximated by a single quadrature point, such that the total number of quadrature points is slightly fewer than the sum of the number that would be required to treat each of the gases separately. The gaseous absorptions for each quadrature point are optimized such that they minimize a cost function expressing the deviation of the heating rates and fluxes calculated by the FSCK method from line-by-line calculations for a number of training profiles. This approach is validated for atmospheres containing water vapor, carbon dioxide and ozone, in which it is found that in the troposphere and most of the stratosphere, heating-rate errors of less than 0.2 K d⁻¹ can be achieved using a total of 23 quadrature points, decreasing to less than 0.1 K d⁻¹ for 32 quadrature points. It would be relatively straightforward to extend the method to include other gases.
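
As a toy counterpart to the hypercube treatment of overlap, the sketch below places the reordered spectra of two synthetic gases on the axes of a 2-D unit square and evaluates the band-mean transmittance with a small tensor grid of quadrature points, assuming the two spectra are uncorrelated within the band. The FSCK approach described above goes further by partitioning the hypercube into a smaller number of optimized volumes rather than using a full tensor grid.

```python
# Hedged toy sketch of spectral overlap of two gases on a 2-D "hypercube": each axis is
# the reordered spectrum of one gas; assuming the spectra are uncorrelated, the band-mean
# transmittance is a double integral over the unit square. Spectra and amounts are artificial.
import numpy as np

rng = np.random.default_rng(4)
g_axis = np.linspace(0, 1, 5000)
k1 = np.sort(rng.lognormal(-2.0, 2.0, g_axis.size))   # reordered (monotone) spectrum, gas 1
k2 = np.sort(rng.lognormal(-3.0, 1.5, g_axis.size))   # reordered (monotone) spectrum, gas 2
u1, u2 = 0.5, 1.0                                     # absorber amounts (arbitrary)

# Random-overlap reference: product of the two band-mean transmittances.
exact = np.mean(np.exp(-k1 * u1)) * np.mean(np.exp(-k2 * u2))

for n in (3, 6):
    g = (np.arange(n) + 0.5) / n
    t1 = np.exp(-np.interp(g, g_axis, k1) * u1)
    t2 = np.exp(-np.interp(g, g_axis, k2) * u2)
    approx = np.outer(t1, t2).mean()                  # n*n quadrature points on the square
    print(f"{n * n:2d} quadrature points: {approx:.4f} (reference {exact:.4f})")
```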

Relevance: 30.00%

Abstract:

A new incremental four-dimensional variational (4D-Var) data assimilation algorithm is introduced. The algorithm does not require the computationally expensive integrations with the nonlinear model in the outer loops. Nonlinearity is accounted for by modifying the linearization trajectory of the observation operator based on integrations with the tangent linear (TL) model. This allows us to update the linearization trajectory of the observation operator in the inner loops at negligible computational cost. As a result the distinction between inner and outer loops is no longer necessary. The key idea on which the proposed 4D-Var method is based is that by using Gaussian quadrature it is possible to get an exact correspondence between the nonlinear time evolution of perturbations and the time evolution in the TL model. It is shown that J-point Gaussian quadrature can be used to derive the exact adjoint-based observation impact equations and furthermore that it is straightforward to account for the effect of multiple outer loops in these equations if the proposed 4D-Var method is used. The method is illustrated using a three-level quasi-geostrophic model and the Lorenz (1996) model.
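
The quadrature property the method exploits is the standard one: J-point Gauss-Legendre quadrature integrates polynomials of degree up to 2J-1 exactly. The snippet below demonstrates only this generic property; it is not an implementation of the proposed 4D-Var algorithm.

```python
# Generic illustration (not the proposed 4D-Var algorithm) of the Gauss-Legendre
# quadrature property the method relies on: J points integrate degree-(2J-1)
# polynomials exactly, so a low-order polynomial-in-time representation of
# perturbation evolution can be matched exactly from a few quadrature-point evaluations.
import numpy as np

J = 3
nodes, weights = np.polynomial.legendre.leggauss(J)   # J nodes and weights on [-1, 1]

coeffs = np.array([2.0, -1.0, 0.5, 3.0, 0.0, 1.0])    # arbitrary degree-5 (= 2J-1) polynomial
p = np.polynomial.polynomial.Polynomial(coeffs)

quad = np.sum(weights * p(nodes))                     # J-point quadrature
exact = p.integ()(1.0) - p.integ()(-1.0)              # exact integral over [-1, 1]
print(f"3-point Gauss-Legendre: {quad:.12f}, exact integral: {exact:.12f}")
```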

Relevance: 30.00%

Abstract:

Data augmentation is a powerful technique for estimating models with latent or missing data, but applications in agricultural economics have thus far been few. This paper showcases the technique in an application to data on milk market participation in the Ethiopian highlands. There, a key impediment to economic development is an apparently low rate of market participation. Consequently, economic interest centers on the “locations” of nonparticipants in relation to the market and their “reservation values” across covariates. These quantities are of policy interest because they provide measures of the additional inputs necessary in order for nonparticipants to enter the market. One quantity of primary interest is the minimum amount of surplus milk (the “minimum efficient scale of operations”) that the household must acquire before market participation becomes feasible. We estimate this quantity through routine application of data augmentation and Gibbs sampling applied to a random-censored Tobit regression. Incorporating random censoring markedly affects the marketable-surplus requirements of the household, but only slightly affects the covariate requirement estimates and, generally, leads to more plausible policy estimates than those obtained from the zero-censored formulation.
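
For readers unfamiliar with the technique, the sketch below shows data augmentation and Gibbs sampling for the simpler zero-censored Tobit formulation that the paper compares against, with flat priors and simulated data rather than the Ethiopian survey data; extending it to random censoring would require an additional augmentation step for the censoring thresholds.

```python
# Hedged sketch of data augmentation + Gibbs sampling for a zero-censored Tobit
# regression (the baseline formulation the paper extends). Flat priors for brevity;
# data are simulated, not the survey data used in the paper.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n, beta_true, sigma_true = 500, np.array([-0.5, 1.0]), 1.0
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y_star = X @ beta_true + rng.normal(0, sigma_true, n)
y = np.maximum(y_star, 0.0)                      # observed outcome, censored at zero
cens = y == 0.0

XtX_inv = np.linalg.inv(X.T @ X)
beta, sigma2 = np.zeros(2), 1.0
draws = []
for it in range(3000):
    # 1. Data augmentation: impute latent y* for censored observations from a
    #    normal distribution truncated above at 0.
    mu_c = X[cens] @ beta
    b = (0.0 - mu_c) / np.sqrt(sigma2)
    z = y.copy()
    z[cens] = stats.truncnorm.rvs(-np.inf, b, loc=mu_c, scale=np.sqrt(sigma2), random_state=rng)
    # 2. beta | z, sigma2  ~  N((X'X)^-1 X'z, sigma2 (X'X)^-1)   (flat prior)
    beta_hat = XtX_inv @ X.T @ z
    beta = rng.multivariate_normal(beta_hat, sigma2 * XtX_inv)
    # 3. sigma2 | z, beta  ~  Inverse-Gamma(n/2, RSS/2)
    rss = np.sum((z - X @ beta) ** 2)
    sigma2 = 1.0 / rng.gamma(n / 2.0, 2.0 / rss)
    if it >= 1000:                                # discard burn-in
        draws.append(beta)

print("posterior mean of beta:", np.mean(draws, axis=0), "(true:", beta_true, ")")
```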

Relevance: 30.00%

Abstract:

Khartoum, like many cities in the least developed countries (LDCs), still witnesses a huge influx of people. Accommodating the newcomers leads to encroachment on cultivated land and to the sprawling expansion of Greater Khartoum. The city expanded in diameter from 16.8 km in 1955 to 802.5 km in 1998. Most of this horizontal expansion was residential. In 2008 Khartoum accommodated 29% of the urban population of Sudan. Today Khartoum is considered one of 43 major cities in Africa that accommodate more than 1 million inhabitants. Most of the newcomers live on the outskirts of the city, e.g. the Dar El-Salam and Mayo neighbourhoods. The majority of these newcomers built their houses, especially the walls, from mud, wood, straw and sacks. The selection of building materials usually depends on price, regardless of environmental impact, quality, thermal performance and the life of the material. This often increases costs and has varying impacts on the environment over the life of the building. Therefore, consideration of the environmental, social and economic impacts is crucial in the selection of any building material. Decreasing such impacts could lead to more sustainable housing. The sustainability of the wall building materials available for low-cost housing in Khartoum is compared using the life cycle assessment (LCA) technique. The purpose of this paper is to compare, from a sustainability point of view, the local wall building materials most available to the urban poor of Khartoum, considering the manufacture of the materials, their use, and their disposal at the end of their life. Findings reveal that traditional red bricks cannot be considered a sustainable wall building material for the future of low-cost housing in Greater Khartoum. On the other hand, the results of the comparison draw attention to the wide range of soil techniques and to their potential as a promising sustainable wall material for urban low-cost housing in Khartoum.

Relevance: 30.00%

Abstract:

Older computer users often exhibit poorer performance in point-and-click tasks on a computer than younger adults. This paper reports on the first phase of research that examines whether a visual illusion that makes an object appear to be larger (Delboeuf's Illusion) can help to improve point-and-click performance for older computer users. In this first phase, we look at the effect sizes for different configurations of the Delboeuf illusion. The study finds that the target size is overestimated by 8% for both older and younger adults in one configuration, and 12% for older adults in another configuration. The results will inform the design of a second phase, in which the configurations which demonstrate the largest effects will be investigated using a Fitts'-style study of pointing performance.
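
If the perceived enlargement carried over one-for-one into effective target width, its effect on pointing difficulty could be gauged with the Shannon formulation of Fitts' law, ID = log2(D/W + 1). The calculation below uses assumed distance and width values and is only a way of framing the hypothesis the planned second phase will test.

```python
# Illustrative calculation (assumed distance and width, Shannon formulation of Fitts'
# law) of how an 8% or 12% apparent target enlargement would change the index of
# difficulty, if perceived size translated directly into effective target width.
import math

D = 300.0            # pointing distance in pixels (assumed)
W = 24.0             # actual target width in pixels (assumed)

for overestimate in (0.00, 0.08, 0.12):
    W_eff = W * (1 + overestimate)
    ID = math.log2(D / W_eff + 1)          # Shannon formulation: ID = log2(D/W + 1)
    print(f"overestimate {overestimate:>4.0%}: effective width {W_eff:5.1f} px, ID = {ID:.3f} bits")
```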

Relevance: 30.00%

Abstract:

The endocannabinoid system (ECS) was only 'discovered' in the 1990s. Since then, many new ligands have been identified, as well as many new intracellular targets--ranging from the PPARs, to mitochondria, to lipid rafts. It was thought that blocking the CB-1 receptor might reverse obesity and the metabolic syndrome. This was based on the idea that the ECS was dysfunctional in these conditions. This has met with limited success. The reason may be that the ECS is a homeostatic system, which integrates energy seeking and storage behaviour with resistance to oxidative stress. It could be viewed as having thrifty actions. Thriftiness is an innate property of life, which is programmed to a set point by both environment and genetics, resulting in an epigenotype perfectly adapted to its environment. This thrifty set point can be modulated by hormetic stimuli, such as exercise, cold and plant micronutrients. We have proposed that the physiological and protective insulin resistance that underlies thriftiness encapsulates something called 'redox thriftiness', whereby insulin resistance is determined by the ability to resist oxidative stress. Modern man has removed most hormetic stimuli and replaced them with a calorific sedentary lifestyle, leading to increased risk of metabolic inflexibility. We suggest that there is a tipping point where lipotoxicity in adipose and hepatic cells induces mild inflammation, which switches thrifty insulin resistance to inflammation-driven insulin resistance. To understand this, we propose that the metabolic syndrome could be seen from the viewpoint of the ECS, the mitochondrion and the FOXO group of transcription factors. FOXO has many thrifty actions, including increasing insulin resistance and appetite, suppressing oxidative stress and shifting the organism towards using fatty acids. In concert with factors such as PGC-1, they also modify mitochondrial function and biogenesis. Hence, the ECS and FOXO may interact at many points; one of which may be via intracellular redox signalling. As cannabinoids have been shown to modulate reactive oxygen species production, it is possible that they can upregulate anti-oxidant defences. This suggests they may have an 'endohormetic' signalling function. The tipping point into the metabolic syndrome may be the result of a chronic lack of hormetic stimuli (in particular, physical activity), and thus, stimulus for PGC-1, with a resultant reduction in mitochondrial function and a reduced lipid capacitance. This, in the context of a positive calorie environment, will result in increased visceral adipose tissue volume, abnormal ectopic fat content and systemic inflammation. This would worsen the inflammatory-driven pathological insulin resistance and inability to deal with lipids. The resultant oxidative stress may therefore drive a compensatory anti-oxidative response epitomised by the ECS and FOXO. Thus, although blocking the ECS (e.g. via rimonabant) may induce temporary weight loss, it may compromise long-term stress resistance. Clues about how to modulate the system more safely are emerging from observations that some polyphenols, such as resveratrol and possibly, some phytocannabinoids, can modulate mitochondrial function and might improve resistance to a modern lifestyle.

Relevance: 30.00%

Abstract:

IEEE 754 floating-point arithmetic is widely used in modern, general-purpose computers. It is based on real arithmetic and is made total by adding both a positive and a negative infinity, a negative zero, and many Not-a-Number (NaN) states. Transreal arithmetic is total. It also has a positive and a negative infinity but no negative zero, and it has a single, unordered number, nullity. Modifying the IEEE arithmetic so that it uses transreal arithmetic has a number of advantages. It removes one redundant binade from IEEE floating-point objects, doubling the numerical precision of the arithmetic. It removes eight redundant relational floating-point operations and removes the redundant total-order operation. It replaces the non-reflexive floating-point equality operator with a reflexive equality operator, and it indicates that some of the exceptions may be removed as redundant, subject to issues of backward compatibility and transient future compatibility as programmers migrate to the transreal paradigm.
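
Two of the IEEE 754 behaviours the abstract refers to, the non-reflexive equality of NaN and the signed zero, can be demonstrated directly; the "nullity" class below is only a toy, non-authoritative sketch of a reflexive, unordered transreal-style value, not an implementation of transreal arithmetic.

```python
# Demonstration of two IEEE 754 behaviours mentioned in the abstract, plus a toy,
# illustrative stand-in for the single, unordered transreal value nullity.
nan = float("nan")
print(nan == nan)          # False: IEEE floating-point equality is not reflexive on NaN
print(-0.0 == 0.0)         # True: two distinct bit patterns (signed zeros) compare equal

class Nullity:
    """Toy stand-in for the transreal value nullity (illustrative only)."""
    def __eq__(self, other):
        return isinstance(other, Nullity)      # reflexive: nullity == nullity
    def __lt__(self, other):
        return False                           # nullity is unordered with every number
    def __repr__(self):
        return "nullity"

PHI = Nullity()
print(PHI == PHI)          # True: a reflexive equality operator
print(PHI < 1.0)           # False: no order relation holds for nullity
```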

Relevance: 30.00%

Abstract:

Field observations of new particle formation and the subsequent particle growth are typically only possible at a fixed measurement location, and hence do not follow the temporal evolution of an air parcel in a Lagrangian sense. Standard analysis for determining formation and growth rates requires that the time-dependent formation rate and growth rate of the particles are spatially invariant; air parcel advection means that the observed temporal evolution of the particle size distribution at a fixed measurement location may not represent the true evolution if there are spatial variations in the formation and growth rates. Here we present a zero-dimensional aerosol box model coupled with one-dimensional atmospheric flow to describe the impact of advection on the evolution of simulated new particle formation events. Wind speed, particle formation rates and growth rates are input parameters that can vary as a function of time and location, using wind speed to connect location to time. The output simulates measurements at a fixed location; formation and growth rates of the particle mode can then be calculated from the simulated observations at a stationary point for different scenarios and be compared with the ‘true’ input parameters. Hence, we can investigate how spatial variations in the formation and growth rates of new particles would appear in observations of particle number size distributions at a fixed measurement site. We show that the particle size distribution and growth rate at a fixed location is dependent on the formation and growth parameters upwind, even if local conditions do not vary. We also show that different input parameters used may result in very similar simulated measurements. Erroneous interpretation of observations in terms of particle formation and growth rates, and the time span and areal extent of new particle formation, is possible if the spatial effects are not accounted for.
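
A drastically simplified toy model (constant wind speed, particles assumed to form everywhere at t = 0, a step change in growth rate 20 km upwind, all values invented) illustrates the central point: the growth rate apparent at a fixed site reflects conditions along the upwind path, not the local growth rate.

```python
# Hedged toy sketch, far simpler than the paper's coupled box model: particles form
# everywhere at t = 0 and each air parcel grows at the local growth rate while
# advecting at constant wind speed; a fixed observer then infers an "apparent" growth
# rate from consecutive observations.
import numpy as np

u = 5.0 * 3.6                                    # wind speed in km/h (assumed 5 m/s)
GR = lambda x_km: 2.0 + 3.0 * (x_km < -20.0)     # growth rate, nm/h: 5 upwind of -20 km, 2 locally

x_obs = 0.0                                      # fixed measurement site
times = np.linspace(0.0, 6.0, 7)                 # observation times at the site, hours

diam = []
for t in times:
    s = np.linspace(0.0, t, 400) if t > 0 else np.zeros(1)   # time since formation along the path
    x_path = (x_obs - u * t) + u * s                          # parcel position (km) at time s
    diam.append(float(np.mean(GR(x_path))) * t)               # diameter = path-averaged GR * time

apparent_gr = np.diff(diam) / np.diff(times)
print("apparent growth rate at the fixed site (nm/h):", np.round(apparent_gr, 2))
print("local growth rate at the site (nm/h):", GR(x_obs))
```

In this toy, the apparent growth rate settles at the upwind value of 5 nm/h even though the local rate at the site is 2 nm/h, the kind of misinterpretation the abstract warns about when spatial effects are ignored.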