64 results for best estimate method
Abstract:
In this paper we consider the problem of time-harmonic acoustic scattering in two dimensions by convex polygons. Standard boundary or finite element methods for acoustic scattering problems have a computational cost that grows at least linearly as a function of the frequency of the incident wave. Here we present a novel Galerkin boundary element method, which uses an approximation space consisting of the products of plane waves with piecewise polynomials supported on a graded mesh, with smaller elements closer to the corners of the polygon. We prove that, for the best approximation from the approximation space, the number of degrees of freedom required to achieve a prescribed level of accuracy grows only logarithmically as a function of the frequency. Numerical results demonstrate the same logarithmic dependence on the frequency for the Galerkin method solution. Our boundary element method is a discretization of a well-known second-kind combined layer-potential integral equation. We provide a proof that this equation and its adjoint are well-posed and equivalent to the boundary value problem in a Sobolev space setting for general Lipschitz domains.
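As a rough illustration of the approximation space described in this abstract (not code from the paper), the sketch below builds a mesh graded geometrically towards a corner and evaluates one basis function of the plane-wave-times-polynomial type; the grading ratio, polynomial degree and wavenumber are assumed values.

```python
# Illustrative sketch only: geometric grading towards a corner at s = 0 and a
# hybrid basis function (Legendre polynomial on one element times exp(i k s)).
import numpy as np

def graded_mesh(L, n, sigma=0.15):
    """Mesh points on [0, L] clustering at 0: x_j = L * sigma**(n - j), with x_0 = 0."""
    return np.array([0.0] + [L * sigma ** (n - j) for j in range(1, n + 1)])

def hybrid_basis(s, a, b, k, p):
    """Degree-p Legendre polynomial on the element [a, b], modulated by exp(i k s)."""
    t = 2 * (s - a) / (b - a) - 1                    # map the element to [-1, 1]
    return np.polynomial.legendre.Legendre.basis(p)(t) * np.exp(1j * k * s)

k = 100.0                                            # wavenumber (assumed)
mesh = graded_mesh(L=1.0, n=8)                       # elements shrink towards the corner
s = np.linspace(mesh[3], mesh[4], 200)
phi = hybrid_basis(s, mesh[3], mesh[4], k, p=2)
print(mesh)
```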
Abstract:
Planning a project with proper consideration of all necessary factors, and managing a project to ensure its successful implementation, face many challenges. The initial stage of planning a project for bidding is costly and time consuming, and usually yields poor accuracy in cost and effort predictions. On the other hand, detailed information from previous projects may be buried in piles of archived documents, making it increasingly difficult to learn from previous experience. Project portfolio management has been brought into this field with the aim of improving information sharing and management among different projects. However, the amount of information that can be shared is still limited to generic information. In this paper we report a recently developed software system, COBRA, which automatically generates a project plan with effort estimates of time and cost based on data collected from previously completed projects. To maximise data sharing and management among different projects, we propose a method based on product-based planning from the PRINCE2 methodology. (Automated Project Information Sharing and Management System - COBRA) Keywords: project management, product based planning, best practice, PRINCE2
Abstract:
Sensible and latent heat fluxes are often calculated from bulk transfer equations combined with the energy balance. For spatial estimates of these fluxes, a combination of remotely sensed and standard meteorological data from weather stations is used. The success of this approach depends on the accuracy of the input data and on the accuracy of two variables in particular: aerodynamic and surface conductance. This paper presents a Bayesian approach to improve estimates of sensible and latent heat fluxes by using a priori estimates of aerodynamic and surface conductance alongside remote measurements of surface temperature. The method is validated for time series of half-hourly measurements in a fully grown maize field, a vineyard and a forest. It is shown that the Bayesian approach yields more accurate estimates of sensible and latent heat flux than traditional methods.
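A minimal sketch (not the paper's implementation) of how such a Bayesian combination can work: Gaussian priors on aerodynamic and surface conductance are updated with an observed radiometric surface temperature through a simple Penman-Monteith forward model, and the posterior means of H and LE are read off. All constants, inputs and prior parameters below are illustrative assumptions.

```python
import numpy as np

# Illustrative constants and meteorological inputs (assumed, not from the paper)
rho_cp  = 1200.0          # volumetric heat capacity of air, J m-3 K-1
gamma   = 66.0            # psychrometric constant, Pa K-1
delta   = 145.0           # slope of saturation vapour pressure curve near 20 C, Pa K-1
A, D, Ta = 400.0, 1000.0, 20.0   # available energy (W m-2), VPD (Pa), air temperature (C)
Ts_obs, sigma_T = 23.5, 0.5      # remotely sensed surface temperature and its error (C)

def forward(ga, gs):
    """Penman-Monteith latent heat, then H = A - LE and Ts from the bulk transfer equation."""
    LE = (delta * A + rho_cp * D * ga) / (delta + gamma * (1.0 + ga / gs))
    H  = A - LE
    Ts = Ta + H / (rho_cp * ga)
    return H, LE, Ts

# Gaussian priors on the two conductances (means and spreads are assumptions)
ga_grid = np.linspace(0.01, 0.12, 80)
gs_grid = np.linspace(0.002, 0.03, 80)
GA, GS = np.meshgrid(ga_grid, gs_grid)
prior = np.exp(-0.5 * ((GA - 0.05) / 0.02) ** 2) * np.exp(-0.5 * ((GS - 0.01) / 0.005) ** 2)

H, LE, Ts = forward(GA, GS)
likelihood = np.exp(-0.5 * ((Ts_obs - Ts) / sigma_T) ** 2)
post = prior * likelihood
post /= post.sum()

print("posterior mean H  =", (post * H).sum(), "W m-2")
print("posterior mean LE =", (post * LE).sum(), "W m-2")
```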
Abstract:
The basic premise of transaction-cost theory is that the decision to outsource, rather than to undertake work in-house, is determined by the relative costs incurred in each of these forms of economic organization. In construction the "make or buy" decision invariably leads to a contract. Reducing the costs of entering into a contractual relationship (transaction costs) raises the value of production and is therefore desirable. Commonly applied methods of contractor selection may not minimise the costs of contracting. Research evidence suggests that although competitive tendering typically results in the lowest bidder winning the contract this may not represent the lowest project cost after completion. Multi-parameter and quantitative models for contractor selection have been developed to identify the best (or least risky) among bidders. A major area in which research is still needed is in investigating the impact of different methods of contractor selection on the costs of entering into a contract and the decision to outsource.
Abstract:
In this paper we consider the impedance boundary value problem for the Helmholtz equation in a half-plane with piecewise constant boundary data, a problem which models, for example, outdoor sound propagation over inhomogeneous flat terrain. To achieve good approximation at high frequencies with a relatively low number of degrees of freedom, we propose a novel Galerkin boundary element method, using a graded mesh with smaller elements adjacent to discontinuities in impedance and a special set of basis functions so that, on each element, the approximation space contains polynomials (of degree ≤ ν) multiplied by traces of plane waves on the boundary. We prove stability and convergence and show that the error in computing the total acoustic field is O(N^{-(ν+1)} log^{1/2} N), where the number of degrees of freedom is proportional to N log N. This error estimate is independent of the wavenumber, and thus the number of degrees of freedom required to achieve a prescribed level of accuracy does not increase as the wavenumber tends to infinity.
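Written out, the error estimate quoted in this abstract has the following schematic form (ν is the local polynomial degree and M the total number of degrees of freedom; the norm and the notation u, u_N are placeholders, not taken from the paper):

```latex
\| u - u_N \| \;=\; O\!\left( N^{-(\nu+1)} \log^{1/2} N \right),
\qquad M \;\propto\; N \log N ,
```

so the bound does not involve the wavenumber, which is the sense in which the method is wavenumber independent.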
Abstract:
Wavenumber-frequency spectral analysis and linear wave theory are combined in a novel method to quantitatively estimate equatorial wave activity in the tropical lower stratosphere. The method requires temperature and velocity observations that are regularly spaced in latitude, longitude and time; it is therefore applied to the ECMWF 15-year re-analysis dataset (ERA-15). Signals consistent with idealized Kelvin and Rossby-gravity waves are found at wavenumbers and frequencies in agreement with previous studies. When averaged over 1981-93, the Kelvin wave explains approximately 1 K² of temperature variance on the equator at 100 hPa, while the Rossby-gravity wave explains approximately 1 m² s⁻² of meridional wind variance. Some inertio-gravity wave and equatorial Rossby wave signals are also found; however the resolution of ERA-15 is not sufficient for the method to provide an accurate climatology of waves with high meridional structure.
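As an illustration of the first ingredient of such an analysis (not the paper's code), the sketch below computes a raw wavenumber-frequency power spectrum from a longitude-time section of equatorial temperature anomalies; the input array, grid and time step are assumptions.

```python
# Sketch: space-time power spectrum of an equatorial field, the basic ingredient of
# wavenumber-frequency analysis. `temp` is assumed to be a (ntime, nlon) array of
# equatorial temperature anomalies on a regular longitude grid.
import numpy as np

def wavenumber_frequency_spectrum(temp, dt_days=0.25):
    nt, nlon = temp.shape
    temp = temp - temp.mean(axis=0)                  # remove the time mean at each longitude
    coeff = np.fft.fft2(temp) / (nt * nlon)          # FFT in time (axis 0) and longitude (axis 1)
    power = np.abs(coeff) ** 2
    freq = np.fft.fftfreq(nt, d=dt_days)             # cycles per day
    wavenum = np.fft.fftfreq(nlon, d=1.0 / nlon)     # zonal wavenumber (cycles around the globe)
    return wavenum, freq, power

# Eastward-propagating signals (e.g. Kelvin waves) and westward-propagating signals
# (e.g. Rossby-gravity waves) appear in opposite wavenumber-frequency quadrants of `power`.
```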
Abstract:
This paper presents a first attempt to estimate mixing parameters from sea level observations using a particle method based on importance sampling. The method is applied to an ensemble of 128 members of model simulations with a global ocean general circulation model of high complexity. Idealized twin experiments demonstrate that the method is able to accurately reconstruct mixing parameters from an observed mean sea level field when mixing is assumed to be spatially homogeneous. An experiment with inhomogeneous eddy coefficients fails because of the limited ensemble size. This is overcome by the introduction of local weighting, which is able to capture spatial variations in mixing qualitatively. As the sensitivity of sea level for variations in mixing is higher for low values of mixing coefficients, the method works relatively well in regions of low eddy activity.
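A minimal sketch of the importance-sampling step described here (not the actual assimilation system): each ensemble member's sea level field is weighted by a Gaussian likelihood of its misfit to the observed field, and the mixing parameters are then averaged with those weights. Array shapes, the observation error and the commented usage are assumptions.

```python
import numpy as np

def importance_weights(ensemble_ssh, observed_ssh, obs_error=0.05):
    """Gaussian likelihood weights per member; ensemble_ssh is (n_members, ny, nx)."""
    misfit = ((ensemble_ssh - observed_ssh) ** 2).sum(axis=(1, 2)) / (2 * obs_error ** 2)
    logw = -misfit
    w = np.exp(logw - logw.max())          # stabilise before normalising
    return w / w.sum()

# Hypothetical usage with 128 members and mixing coefficients kappa[i]:
#   weights = importance_weights(ssh_members, ssh_obs)
#   kappa_estimate = (weights * kappa).sum()     # weighted-mean mixing coefficient
# A local version restricts the misfit sum to grid points near each location, which is
# the 'local weighting' idea used to recover spatial variations in mixing.
```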
Abstract:
This paper presents the method and findings of a contingent valuation (CV) study that aimed to elicit United Kingdom citizens' willingness to pay to support legislation to phase out the use of battery cages for egg production in the European Union (EU). The method takes account of various biases associated with the CV technique, including 'warm glow', 'part-whole' and sample response biases. Estimated mean willingness to pay to support the legislation is used to estimate the annual benefit of the legislation to UK citizens. This is compared with the estimated annual costs of the legislation over a 12-year period, which allows for readjustment by the UK egg industry. The analysis shows that the estimated benefits of the legislation outweigh the costs. The study demonstrates that CV is a potentially useful technique for assessing the likely benefits associated with proposed legislation. However, estimates of CV studies must be treated with caution. It is important that they are derived from carefully designed surveys and that the willingness to pay estimation method allows for various biases. (C) 2003 Elsevier Science B.V. All rights reserved.
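The benefit-cost comparison described here reduces to simple arithmetic; the sketch below shows its shape with purely hypothetical numbers (none of the figures are from the study).

```python
# Purely illustrative arithmetic (all numbers hypothetical, not the study's results):
# annual benefit = mean willingness to pay per household * number of households,
# compared with an annualised cost of the legislation over the 12-year adjustment period.
mean_wtp_per_household = 10.0        # GBP per year (hypothetical)
n_households = 25e6                  # UK households (hypothetical round figure)
annual_benefit = mean_wtp_per_household * n_households

total_cost = 1.5e9                   # cost of re-equipping the industry (hypothetical)
annual_cost = total_cost / 12        # spread over the 12-year readjustment period

print(annual_benefit > annual_cost)  # the study's comparison, with made-up numbers
```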
Abstract:
Subsidised energy prices in pre-transition Hungary had led to excessive energy intensity in the agricultural sector. Transition has resulted in steep input price increases. In this study, Allen and Morishima elasticities of substitution are estimated to study the effects of these price changes on energy use, chemical input use, capital formation and employment. Panel data methods, Generalised Method of Moments (GMM) and instrument exogeneity tests are used to specify and estimate technology and substitution elasticities. Results indicate that indirect price policy may be effective in controlling energy consumption. The sustained increases in energy and chemical input prices have worked together to restrict energy and chemical input use, and the substitutability between energy, capital and labour has prevented the capital shrinkage and agricultural unemployment situations from being worse. The Hungarian push towards lower energy intensity may be best pursued through sustained energy price increases rather than capital subsidies. (C) 2003 Elsevier B.V. All rights reserved.
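For reference, the two substitution measures named in this abstract are conventionally defined from the cost function C(w, y) as follows (standard textbook definitions, not restated in the paper's abstract; C_i denotes ∂C/∂w_i and s_j the cost share of input j):

```latex
\sigma^{A}_{ij} \;=\; \frac{C\,C_{ij}}{C_i\,C_j},
\qquad
M_{ij} \;=\; s_j\!\left(\sigma^{A}_{ij} - \sigma^{A}_{jj}\right),
```

so the Morishima measure compares the cross-price response of input i with the own-price response of input j, which is why the two elasticities can rank substitutability between energy, capital and labour differently.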
Abstract:
1. Population growth rate (PGR) is central to the theory of population ecology and is crucial for projecting population trends in conservation biology, pest management and wildlife harvesting. Furthermore, PGR is increasingly used to assess the effects of stressors. Image analysis that can automatically count and measure photographed individuals offers a potential methodology for estimating PGR. 2. This study evaluated two ways in which the PGR of Daphnia magna, exposed to different stressors, can be estimated using an image analysis system. The first method estimated PGR as the ratio of counts of individuals obtained at two different times, while the second method estimated PGR as the ratio of population sizes at two different times, where size is measured by the sum of the individuals' surface areas, i.e. total population surface area. This method is attractive if surface area is correlated with reproductive value (RV), as it is for D. magna, because of the theoretical result that PGR is the rate at which the population RV increases. 3. The image analysis system proved reliable and reproducible in counting populations of up to 440 individuals in 5 L of water. Image counts correlated well with manual counts but with a systematic underestimate of about 30%. This does not affect accuracy when estimating PGR as the ratio of two counts. Area estimates of PGR correlated well with count estimates, but were systematically higher, possibly reflecting their greater accuracy in the study situation. 4. Analysis of relevant scenarios suggested the correlation between RV and body size will generally be good for organisms in which fecundity correlates with body size. In these circumstances, area estimation of PGR is theoretically better than count estimation. 5. Synthesis and applications. There are both theoretical and practical advantages to area estimation of population growth rate when individuals' reproductive values are consistently well correlated with their surface areas. Because stressors may affect both the number and quality of individuals, area estimation of population growth rate should improve the accuracy of predicting stress impacts at the population level.
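A small sketch (not the authors' code) of the two estimators compared in this study, with hypothetical image-analysis outputs; it also shows why a constant proportional undercount cancels in the count-based estimate.

```python
import numpy as np

def pgr_from_counts(n1, n2, dt_days):
    """PGR as the (log) ratio of population counts taken at two times."""
    return np.log(n2 / n1) / dt_days

def pgr_from_areas(area1_mm2, area2_mm2, dt_days):
    """PGR as the (log) ratio of total surface area, a proxy for total reproductive value."""
    return np.log(area2_mm2 / area1_mm2) / dt_days

# A constant proportional undercount (the ~30% bias mentioned above) cancels in the
# ratio, which is why it does not affect the count-based PGR estimate:
#   log((0.7 * N2) / (0.7 * N1)) == log(N2 / N1)
print(pgr_from_counts(100, 180, 7.0), pgr_from_areas(2500.0, 5200.0, 7.0))
```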
Abstract:
Quality control of fruit requires reliable methods, able to assess with reasonable accuracy, and possibly in a non-destructive way, their physical and chemical characteristics. More specifically, a decreased firmness indicates the presence of damage or defects in the fruit, or else that the fruit has exceeded its “best before date” and become unsuitable for consumption. In high-value exotic fruits, such as mangoes, where firmness cannot be easily assessed from a simple observation of texture, colour changes and unevenness of the fruit surface, the use of non-destructive techniques is highly desirable. In particular, laser vibrometry, a non-contact technique based on the Doppler effect and sensitive to displacement differences smaller than a nanometre, appears ideal for possible on-line control of food. Previous results indicated that a phase shift can be repeatably associated with the presence of damage on the fruit, whilst a decreased firmness results in significant differences in the displacement of the fruit under the same excitation signal. In this work, frequency ranges for quality control via the application of a sound chirp are suggested, based on the measurement of the signal coherence. The variation of the average vibration spectrum of a grid of points, or of the point-by-point signal velocity, allows the go/no-go recognition of “firm” and “over-ripe” fruits, with notable success in the particular case of mangoes. The future exploitation of this work will include the application of this method to on-line control during conveyor-belt distribution of fruit.
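As a rough sketch of the coherence-based band selection mentioned above (not the authors' code), the example below computes the magnitude-squared coherence between a synthetic chirp excitation and a stand-in vibrometer response; the sampling rate, synthetic signals and coherence threshold are assumptions.

```python
import numpy as np
from scipy.signal import chirp, coherence

fs = 44100.0                                          # sampling rate, Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)
excitation = chirp(t, f0=50, f1=2000, t1=1.0)         # the sound chirp
# Stand-in for the measured vibrometer velocity: delayed, attenuated chirp plus noise
response = np.roll(excitation, 50) * 0.3 + 0.05 * np.random.randn(t.size)

f, Cxy = coherence(excitation, response, fs=fs, nperseg=4096)
usable = Cxy > 0.9                                    # keep bands where coherence is high
if usable.any():
    print("usable band: %.0f-%.0f Hz" % (f[usable].min(), f[usable].max()))
# Averaging the vibration spectra of a grid of points over such a band, and comparing
# against 'firm' and 'over-ripe' references, gives the go/no-go decision described above.
```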
Abstract:
Background Screening instruments for autistic-spectrum disorders have not been compared in the same sample. Aims To compare the Social Communication Questionnaire (SCQ), the Social Responsiveness Scale (SRS) and the Children's Communication Checklist (CCC). Method Screen and diagnostic assessments on 119 children between 9 and 13 years of age with special educational needs with and without autistic-spectrum disorders were weighted to estimate screen characteristics for a realistic target population. Results The SCQ performed best (area under receiver operating characteristic curve (AUC)=0.90; sensitivity 0.86; specificity 0.78). The SRS had a lower AUC (0.77) with high sensitivity (0.78) and moderate specificity (0.67). The CCC had a high sensitivity but lower specificity (AUC=0.79; sensitivity 0.93; specificity 0.46). The AUC of the SRS and CCC was lower for children with IQ < 70. Behaviour problems reduced specificity for all three instruments. Conclusions The SCQ, SRS and CCC showed strong to moderate ability to identify autistic-spectrum disorder in this at-risk sample of school-age children with special educational needs.
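For readers unfamiliar with the screen metrics, the sketch below (entirely synthetic data, not the study's) shows how AUC, sensitivity and specificity are obtained from screening scores and diagnostic outcomes; the cut-off is an arbitrary assumption.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
diagnosis = rng.integers(0, 2, size=119)                 # 1 = autistic-spectrum disorder (synthetic)
scores = diagnosis * 1.5 + rng.normal(size=119)          # hypothetical screening scores

auc = roc_auc_score(diagnosis, scores)                   # area under the ROC curve
cutoff = 0.8                                             # assumed screen cut-off
pred = scores >= cutoff
sensitivity = (pred & (diagnosis == 1)).sum() / (diagnosis == 1).sum()
specificity = (~pred & (diagnosis == 0)).sum() / (diagnosis == 0).sum()
print(auc, sensitivity, specificity)
# The weighting step in the study (to mimic a realistic target population) would amount
# to weighting these counts by sampling weights before forming the ratios.
```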
Abstract:
A method of estimating dissipation rates from a vertically pointing Doppler lidar with high temporal and spatial resolution has been evaluated by comparison with independent measurements derived from a balloon-borne sonic anemometer. This method utilizes the variance of the mean Doppler velocity from a number of sequential samples and requires an estimate of the horizontal wind speed. The noise contribution to the variance can be estimated from the observed signal-to-noise ratio and removed where appropriate. The relative size of the noise variance to the observed variance provides a measure of the confidence in the retrieval. Comparison with in situ dissipation rates derived from the balloon-borne sonic anemometer reveal that this particular Doppler lidar is capable of retrieving dissipation rates over a range of at least three orders of magnitude. This method is most suitable for retrieval of dissipation rates within the convective well-mixed boundary layer where the scales of motion that the Doppler lidar probes remain well within the inertial subrange. Caution must be applied when estimating dissipation rates in more quiescent conditions. For the particular Doppler lidar described here, the selection of suitably short integration times will permit this method to be applicable in such situations but at the expense of accuracy in the Doppler velocity estimates. The two case studies presented here suggest that, with profiles every 4 s, reliable estimates of ϵ can be derived to within at least an order of magnitude throughout almost all of the lowest 2 km and, in the convective boundary layer, to within 50%. Increasing the integration time for individual profiles to 30 s can improve the accuracy substantially but potentially confines retrievals to within the convective boundary layer. Therefore, optimization of certain instrument parameters may be required for specific implementations.
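A schematic version of the variance method described above (not the instrument's processing code): the variance of the mean Doppler velocity over N sequential profiles, with the SNR-derived noise variance removed, is converted to a dissipation rate through the usual inertial-subrange relation between velocity variance and the length of air sampled. The Kolmogorov constant and the sample numbers are assumptions.

```python
import numpy as np

def dissipation_rate(w, noise_var, wind_speed, dt, a_kolmogorov=0.55):
    """w: mean Doppler velocities from N sequential profiles (m s-1);
    noise_var: noise contribution to the variance, estimated from SNR (m2 s-2);
    wind_speed: horizontal wind speed advecting eddies through the beam (m s-1);
    dt: integration time per profile (s)."""
    n = len(w)
    var = np.var(w, ddof=1) - noise_var            # remove the noise contribution
    if var <= 0:
        return np.nan                              # retrieval not trusted here
    length = wind_speed * n * dt                   # length of air sampled over the n profiles
    # Inertial-subrange relation: eps = (2/(3a))^(3/2) * sigma^3 * 2*pi / L
    return (2.0 / (3.0 * a_kolmogorov)) ** 1.5 * var ** 1.5 * 2.0 * np.pi / length

# Hypothetical numbers: ten 4-s profiles, 8 m s-1 wind, 0.01 m2 s-2 noise variance
w = np.array([0.3, 0.5, 0.1, -0.2, 0.4, 0.6, 0.0, -0.1, 0.2, 0.5])
print(dissipation_rate(w, 0.01, 8.0, 4.0), "m2 s-3")
```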
Abstract:
A revised Bayesian algorithm for estimating surface rain rate, convective rain proportion, and latent heating profiles from satellite-borne passive microwave radiometer observations over ocean backgrounds is described. The algorithm searches a large database of cloud-radiative model simulations to find cloud profiles that are radiatively consistent with a given set of microwave radiance measurements. The properties of these radiatively consistent profiles are then composited to obtain best estimates of the observed properties. The revised algorithm is supported by an expanded and more physically consistent database of cloud-radiative model simulations. The algorithm also features a better quantification of the convective and nonconvective contributions to total rainfall, a new geographic database, and an improved representation of background radiances in rain-free regions. Bias and random error estimates are derived from applications of the algorithm to synthetic radiance data, based upon a subset of cloud-resolving model simulations, and from the Bayesian formulation itself. Synthetic rain-rate and latent heating estimates exhibit a trend of high (low) bias for low (high) retrieved values. The Bayesian estimates of random error are propagated to represent errors at coarser time and space resolutions, based upon applications of the algorithm to TRMM Microwave Imager (TMI) data. Errors in TMI instantaneous rain-rate estimates at 0.5°-resolution range from approximately 50% at 1 mm h−1 to 20% at 14 mm h−1. Errors in collocated spaceborne radar rain-rate estimates are roughly 50%–80% of the TMI errors at this resolution. The estimated algorithm random error in TMI rain rates at monthly, 2.5° resolution is relatively small (less than 6% at 5 mm day−1) in comparison with the random error resulting from infrequent satellite temporal sampling (8%–35% at the same rain rate). Percentage errors resulting from sampling decrease with increasing rain rate, and sampling errors in latent heating rates follow the same trend. Averaging over 3 months reduces sampling errors in rain rates to 6%–15% at 5 mm day−1, with proportionate reductions in latent heating sampling errors.
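The Bayesian compositing step described here can be sketched as follows (not the operational algorithm; array names and the brightness-temperature error covariance are assumptions): database profiles are weighted by their radiative consistency with the observed radiances, and the retrieved quantity is the weighted mean, with the weighted spread serving as a random-error proxy.

```python
import numpy as np

def bayesian_retrieval(tb_obs, tb_sim, rain_sim, obs_error_cov):
    """tb_obs: observed brightness temperatures, shape (nchan,);
    tb_sim: simulated brightness temperatures per database profile, (nprofiles, nchan);
    rain_sim: surface rain rate of each database profile, (nprofiles,)."""
    inv_cov = np.linalg.inv(obs_error_cov)
    diff = tb_sim - tb_obs
    # exp(-0.5 * d^T S^-1 d) for each database profile
    logw = -0.5 * np.einsum('ij,jk,ik->i', diff, inv_cov, diff)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    best_estimate = (w * rain_sim).sum()                  # composited rain-rate estimate
    posterior_var = (w * (rain_sim - best_estimate) ** 2).sum()
    return best_estimate, np.sqrt(posterior_var)          # estimate and random-error proxy
```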
Abstract:
Evolutionary synthesis methods, as originally described by Dobrowolski, have been shown in previous literature to be an effective method of obtaining anti-reflection coating designs. To make this method even more effective, the combination of a good starting design, the best suited thin-film materials, a realistic optimization target function and a non-gradient optimization method are used in an algorithm written for a PC. Several broadband anti-reflection designs obtained by this new design method are given as examples of its usefulness.
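A toy version of the refinement stage of such a design procedure (not Dobrowolski's or the authors' code): the reflectance of a three-layer stack is computed with the standard characteristic-matrix method and the layer thicknesses are refined with a derivative-free (Nelder-Mead) search against a zero-reflectance target over a visible band. Materials, starting thicknesses and wavelength band are assumed; a full evolutionary synthesis would additionally add and remove layers.

```python
import numpy as np
from scipy.optimize import minimize

n_air, n_sub = 1.0, 1.52                 # incident medium and glass substrate
layer_indices = [1.38, 2.35, 1.38]       # e.g. MgF2 / TiO2 / MgF2 (assumed materials)
wavelengths = np.linspace(420e-9, 680e-9, 60)

def reflectance(thicknesses, wl):
    """Intensity reflectance of the stack at one wavelength, normal incidence."""
    M = np.eye(2, dtype=complex)
    for n, d in zip(layer_indices, thicknesses):
        delta = 2 * np.pi * n * d / wl
        layer = np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
        M = M @ layer
    B, C = M @ np.array([1.0, n_sub])
    r = (n_air * B - C) / (n_air * B + C)
    return np.abs(r) ** 2

def merit(thicknesses):
    """Realistic target: mean reflectance over the band (zero-reflectance target)."""
    if np.any(thicknesses < 0):
        return 1.0                       # penalise unphysical negative thicknesses
    return np.mean([reflectance(thicknesses, wl) for wl in wavelengths])

start = np.array([100e-9, 120e-9, 90e-9])              # a plausible starting design
result = minimize(merit, start, method='Nelder-Mead')  # non-gradient refinement
print(result.x, merit(result.x))
```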