21 results for travel cost method

in CentAUR: Central Archive at the University of Reading - UK


Relevance: 30.00%

Abstract:

We consider the problem of scattering of a time-harmonic acoustic incident plane wave by a sound soft convex polygon. For standard boundary or finite element methods, with a piecewise polynomial approximation space, the computational cost required to achieve a prescribed level of accuracy grows linearly with respect to the frequency of the incident wave. Recently Chandler–Wilde and Langdon proposed a novel Galerkin boundary element method for this problem for which, by incorporating the products of plane wave basis functions with piecewise polynomials supported on a graded mesh into the approximation space, they were able to demonstrate that the number of degrees of freedom required to achieve a prescribed level of accuracy grows only logarithmically with respect to the frequency. Here we propose a related collocation method, using the same approximation space, for which we demonstrate via numerical experiments a convergence rate identical to that achieved with the Galerkin scheme, but with a substantially reduced computational cost.

Relevance: 30.00%

Abstract:

In this paper we consider the problem of time-harmonic acoustic scattering in two dimensions by convex polygons. Standard boundary or finite element methods for acoustic scattering problems have a computational cost that grows at least linearly as a function of the frequency of the incident wave. Here we present a novel Galerkin boundary element method, which uses an approximation space consisting of the products of plane waves with piecewise polynomials supported on a graded mesh, with smaller elements closer to the corners of the polygon. We prove that the best approximation from the approximation space requires a number of degrees of freedom to achieve a prescribed level of accuracy that grows only logarithmically as a function of the frequency. Numerical results demonstrate the same logarithmic dependence on the frequency for the Galerkin method solution. Our boundary element method is a discretization of a well-known second kind combined-layer-potential integral equation. We provide a proof that this equation and its adjoint are well-posed and equivalent to the boundary value problem in a Sobolev space setting for general Lipschitz domains.
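The hybrid approximation space described above rests on a mesh graded toward the corners of the polygon, where the solution is least smooth. A minimal sketch of such a polynomially graded mesh (the grading exponent and element count here are illustrative, not the wavenumber-dependent values analysed in the paper):

```python
def graded_mesh(n, q=3):
    """Return n+1 mesh points on [0, 1], polynomially graded toward x = 0.

    Elements cluster near x = 0 (a corner of the polygon); q is the
    grading exponent, with q = 1 giving a uniform mesh.
    """
    return [(j / n) ** q for j in range(n + 1)]

mesh = graded_mesh(8, q=3)
sizes = [b - a for a, b in zip(mesh, mesh[1:])]
# The element touching the corner is far smaller than the one furthest away.
assert sizes[0] < sizes[-1] / 100
```

Placing small elements only where they are needed is what keeps the total number of degrees of freedom growing so slowly with frequency.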

Relevance: 30.00%

Abstract:

The basic premise of transaction-cost theory is that the decision to outsource, rather than to undertake work in-house, is determined by the relative costs incurred in each of these forms of economic organization. In construction the "make or buy" decision invariably leads to a contract. Reducing the costs of entering into a contractual relationship (transaction costs) raises the value of production and is therefore desirable. Commonly applied methods of contractor selection may not minimise the costs of contracting. Research evidence suggests that although competitive tendering typically results in the lowest bidder winning the contract this may not represent the lowest project cost after completion. Multi-parameter and quantitative models for contractor selection have been developed to identify the best (or least risky) among bidders. A major area in which research is still needed is in investigating the impact of different methods of contractor selection on the costs of entering into a contract and the decision to outsource.

Relevance: 30.00%

Abstract:

1. The feeding rates of many predators and parasitoids exhibit type II functional responses, with a decelerating rate of increase to reach an asymptotic value as the density of their prey or hosts increases. Holling's disc equation describes such relationships and predicts that the asymptotic feeding rate at high prey densities is set by handling time, while the rate at which feeding rate increases with increased prey density is determined by searching efficiency. Searching efficiency and handling time are also parameters in other models which describe the functional response. Models which incorporate functional responses in order to make predictions of the effects of food shortage thus rely upon a clear understanding and accurate quantification of searching efficiency and handling time. 2. Blackbirds (Turdus merula) exhibit a type II functional response and use pause-travel foraging, a foraging technique in which animals search for prey while stationary and then move to capture prey. Pause-travel foraging allows accurate direct measurement of feeding rate and of both searching efficiency and handling time. We use blackbirds as a model species to: (i) compare observed measures of both searching efficiency and handling time with those estimated by statistically fitting the disc equation to the observed functional response; and (ii) investigate alternative measures of searching efficiency derived by the established method, in which the search area is assumed to be circular, and by a new method that we propose, in which it is not. 3. We find that the disc equation can adequately explain the functional response of blackbirds feeding on artificial prey. However, this depends critically upon how searching efficiency is measured. Two variations on the previous method of measuring search area (a component of searching efficiency) overestimated searching efficiency, and hence predicted feeding rates higher than those observed.
Two variations of our alternative approach produced lower estimates of searching efficiency, closer to that estimated by fitting the disc equation, and hence more accurately predicted feeding rate. Our study shows the limitations of the previous method of measuring searching efficiency, and describes a new method for measuring searching efficiency more accurately.
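Holling's disc equation discussed above has a compact closed form. The sketch below (with illustrative parameter values, not the blackbird estimates) shows the two roles described in the abstract: searching efficiency a governs how quickly the feeding rate rises with prey density, while handling time h sets the asymptote 1/h:

```python
def holling_type_ii(N, a, h):
    """Holling's disc equation: feeding rate as a function of prey density N.

    a is searching efficiency (area searched per unit time);
    h is handling time per prey item.
    """
    return a * N / (1.0 + a * h * N)

# Feeding rate decelerates and approaches the asymptote 1/h at high density.
a, h = 0.5, 2.0
rates = [holling_type_ii(N, a, h) for N in (1, 10, 100, 10000)]
assert all(r1 < r2 for r1, r2 in zip(rates, rates[1:]))   # increasing
assert abs(rates[-1] - 1.0 / h) < 0.01                    # near 1/h = 0.5
```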

Relevance: 30.00%

Abstract:

A perennial issue for land use policy is the evaluation of landscape biodiversity and the associated cost-effectiveness of any biodiversity conservation policy actions. Based on the cost-utility analysis (CUA) methodology as applied to species conservation, this paper develops a methodology for evaluating the impact on habitats of alternative landscape management scenarios. The method incorporates three dimensions of habitats: quantity change, quality change and relative scarcity. It is illustrated in relation to the alternative landscape management scenarios for the Scottish Highlands (Cairngorms) study area of the BioScene project. The results demonstrate the value of the method for evaluating biodiversity conservation policies through their impact on habitats.

Relevance: 30.00%

Abstract:

Details about the parameters of kinetic systems are crucial for progress in both medical and industrial research, including drug development, clinical diagnosis and biotechnology applications. Such details must be collected by a series of kinetic experiments and investigations. The correct design of the experiment is essential to collecting data suitable for analysis, modelling and deriving the correct information. We have developed a systematic and iterative Bayesian method and sets of rules for the design of enzyme kinetic experiments. Our method selects the optimum design to collect data suitable for accurate modelling and analysis, and minimises the error in the estimated parameters. The rules select features of the design such as the substrate range and the number of measurements. We show here that this method can be directly applied to the study of other important kinetic systems, including drug transport, receptor binding, microbial culture and cell transport kinetics. It is possible to reduce the errors in the estimated parameters and, most importantly, to increase efficiency and cost-effectiveness by reducing the number of experiments and data points required. (C) 2003 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.
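The central idea, choosing experimental designs that minimise the error in the estimated parameters, can be illustrated with a much-simplified, non-Bayesian sketch: compare candidate substrate designs for the Michaelis-Menten model by the determinant of their Fisher information matrix (larger is better). The function names and all numerical values here are hypothetical, not taken from the paper:

```python
def mm_sensitivities(S, Vmax, Km):
    """Gradient of the Michaelis-Menten rate v = Vmax*S/(Km+S)
    with respect to the parameters (Vmax, Km)."""
    dV = S / (Km + S)
    dK = -Vmax * S / (Km + S) ** 2
    return dV, dK

def design_det(substrates, Vmax, Km):
    """Determinant of the 2x2 Fisher information matrix for a set of
    substrate concentrations (larger det => tighter parameter estimates)."""
    m11 = m12 = m22 = 0.0
    for S in substrates:
        dV, dK = mm_sensitivities(S, Vmax, Km)
        m11 += dV * dV
        m12 += dV * dK
        m22 += dK * dK
    return m11 * m22 - m12 * m12

# Compare two candidate designs around a prior guess Km = 2.0: a design
# spanning the substrate range is more informative than one that only
# uses saturating concentrations.
Vmax, Km = 1.0, 2.0
spread = [0.5, 2.0, 20.0]    # below, near, and well above Km
narrow = [18.0, 19.0, 20.0]  # saturating concentrations only
assert design_det(spread, Vmax, Km) > design_det(narrow, Vmax, Km)
```

The paper's actual method is Bayesian and iterative; this static comparison only conveys why the choice of substrate range matters.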

Relevance: 30.00%

Abstract:

Collaborative working methods offer the hope of reduced waste, lower tendering costs and improved outputs. The costs of tendering may be influenced by the introduction of different working methods. Transaction cost economics appears to offer an analytical framework for studying the costs of tendering, but it is more concerned with providing explanations at the institutional/industry level than at the level of individual projects. Surveys and interviews were carried out with small samples in the UK. The data show that while tendering costs are not necessarily higher in collaborative working arrangements, there is no correlation between the costs of tendering and the way the work is organized. Practitioners perceive that the benefits of working in collaborative procurement routes far outweigh the costs. Tendering practices can be improved to avoid waste, and the suggested improvements include restricting selective tendering lists to 2-3 bidders, letting bidders know who they are competing with, reimbursing tendering costs for aborted projects and ensuring that timely and comprehensive information is provided to bidders.

Relevance: 30.00%

Abstract:

Objectives: This study reports the cost-effectiveness of a preventive intervention, consisting of counseling and specific support for the mother-infant relationship, targeted at women at high risk of developing postnatal depression. Methods: A prospective economic evaluation was conducted alongside a pragmatic randomized controlled trial in which women considered at high risk of developing postnatal depression were allocated randomly to the preventive intervention (n = 74) or to routine primary care (n = 77). The primary outcome measure was the duration of postnatal depression experienced during the first 18 months postpartum. Data on health and social care use by women and their infants up to 18 months postpartum were collected, using a combination of prospective diaries and face-to-face interviews, and then were combined with unit costs (£, year 2000 prices) to obtain a net cost per mother-infant dyad. The nonparametric bootstrap method was used to present cost-effectiveness acceptability curves and net benefit statistics at alternative willingness to pay thresholds held by decision makers for preventing 1 month of postnatal depression. Results: Women in the preventive intervention group were depressed for an average of 2.21 months (9.57 weeks) during the study period, whereas women in the routine primary care group were depressed for an average of 2.70 months (11.71 weeks). The mean health and social care costs were estimated at £2,396.9 per mother-infant dyad in the preventive intervention group and £2,277.5 per mother-infant dyad in the routine primary care group, providing a mean cost difference of £119.5 (bootstrap 95 percent confidence interval [CI], -£535.4 to £784.9). At a willingness to pay threshold of £1,000 per month of postnatal depression avoided, the probability that the preventive intervention is cost-effective is .71 and the mean net benefit is £383.4 (bootstrap 95 percent CI, -£863.3 to £1,581.5).
Conclusions: The preventive intervention is likely to be cost-effective even at relatively low willingness to pay thresholds for preventing 1 month of postnatal depression during the first 18 months postpartum. Given the negative impact of postnatal depression on later child development, further research is required that investigates the longer-term cost-effectiveness of the preventive intervention in high risk women.
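The bootstrap net-benefit calculation described in the Methods can be sketched generically as follows. The data below are synthetic and the function name is hypothetical; this is not the trial dataset:

```python
import random

def mean(xs):
    return sum(xs) / len(xs)

def prob_cost_effective(interv, control, wtp, n_boot=1000, seed=42):
    """Bootstrap the probability that an intervention is cost-effective.

    interv / control: lists of (cost, months_depressed) per participant.
    wtp: willingness to pay per month of depression avoided.
    Net benefit for one bootstrap resample:
        NB = wtp * (mean months control - mean months intervention)
             - (mean cost intervention - mean cost control)
    Returns the fraction of resamples with NB > 0, i.e. one point on a
    cost-effectiveness acceptability curve (CEAC).
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_boot):
        bi = [rng.choice(interv) for _ in interv]
        bc = [rng.choice(control) for _ in control]
        effect_gain = mean([m for _, m in bc]) - mean([m for _, m in bi])
        extra_cost = mean([c for c, _ in bi]) - mean([c for c, _ in bc])
        if wtp * effect_gain - extra_cost > 0:
            hits += 1
    return hits / n_boot

# Synthetic example (NOT the trial data): the intervention costs a little
# more but shortens depression, so it looks cost-effective at £1,000/month.
interv = [(2400, 2.0), (2500, 2.5), (2300, 1.8), (2450, 2.3)] * 20
control = [(2280, 2.7), (2250, 2.9), (2300, 2.6), (2270, 2.8)] * 20
assert prob_cost_effective(interv, control, wtp=1000) > 0.5
```

Sweeping `wtp` over a range of thresholds and plotting the returned probability traces out the acceptability curve reported in the Results.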

Relevance: 30.00%

Abstract:

This paper deals with the key issues encountered in testing during the development of high-speed networking hardware systems by documenting a practical method for "real-life like" testing. The proposed method is empowered by modern and commonly available Field Programmable Gate Array (FPGA) technology. Innovative application of standard FPGA blocks, in combination with reconfigurability, is used as the backbone of the method. A detailed elaboration of the method is given so as to serve as a general reference. The method is fully characterised and compared to alternatives through a case study, which shows it to be the most efficient and effective option at a reasonable cost.

Relevance: 30.00%

Abstract:

In this paper we consider bilinear forms of matrix polynomials and show that these polynomials can be used to construct solutions for the problems of solving systems of linear algebraic equations, matrix inversion and finding extremal eigenvalues. An Almost Optimal Monte Carlo (MAO) algorithm for computing bilinear forms of matrix polynomials is presented. Results for the computational cost of a balanced algorithm for computing the bilinear form of a matrix power are presented, i.e., an algorithm for which the probability and systematic errors are of the same order, and this cost is compared with that of a corresponding deterministic method.
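As a simplified illustration of Monte Carlo evaluation of a bilinear form of a matrix power (a plain random-walk estimator, not the authors' MAO algorithm with its balancing of probability and systematic errors):

```python
import random

def mc_bilinear_form(v, A, h, power, n_samples=5000, seed=0):
    """Monte Carlo estimate of the bilinear form v^T (A^power) h.

    Start at index i with probability |v_i| / ||v||_1, take `power`
    random-walk steps with transition probabilities proportional to
    |a_ij|, and carry a weight that corrects for the sampling law.
    """
    rng = random.Random(seed)
    v_norm = sum(abs(x) for x in v)
    row_norms = [sum(abs(a) for a in row) for row in A]

    def pick(weights, total):
        # Sample an index with probability |weights[j]| / total.
        r = rng.random() * total
        for j, w in enumerate(weights):
            r -= abs(w)
            if r <= 0:
                return j
        return len(weights) - 1

    est = 0.0
    for _ in range(n_samples):
        i = pick(v, v_norm)
        w = v_norm if v[i] >= 0 else -v_norm
        for _ in range(power):
            j = pick(A[i], row_norms[i])
            w *= row_norms[i] if A[i][j] >= 0 else -row_norms[i]
            i = j
        est += w * h[i]
    return est / n_samples

# Compare against the exact value of v^T A^2 h on a tiny example:
A = [[0.5, 0.25], [0.25, 0.5]]
v = [1.0, 2.0]
h = [1.0, 1.0]
Ah = [sum(A[i][j] * h[j] for j in range(2)) for i in range(2)]
AAh = [sum(A[i][j] * Ah[j] for j in range(2)) for i in range(2)]
exact = sum(v[i] * AAh[i] for i in range(2))
assert abs(mc_bilinear_form(v, A, h, power=2) - exact) < 0.05 * exact
```

Each sample costs only `power` transitions, which is what makes such estimators attractive when the matrix is large and only the bilinear form, not the full product, is needed.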

Relevance: 30.00%

Abstract:

We study the numerical efficiency of solving the self-consistent field theory (SCFT) for periodic block-copolymer morphologies by combining the spectral method with Anderson mixing. Using AB diblock-copolymer melts as an example, we demonstrate that this approach can be orders of magnitude faster than competing methods, permitting precise calculations with relatively little computational cost. Moreover, our results raise significant doubts that the gyroid (G) phase extends to infinite $\chi N$. With the increased precision, we are also able to resolve subtle free-energy differences, allowing us to investigate the layer stacking in the perforated-lamellar (PL) phase and the lattice arrangement of the close-packed spherical (S$_{cp}$) phase. Furthermore, our study sheds light on the existence of the newly discovered Fddd (O$^{70}$) morphology, showing that conformational asymmetry has a significant effect on its stability.
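Anderson mixing accelerates a fixed-point iteration by combining recent iterates so as to minimise the residual. As a minimal sketch (a depth-1 scalar version applied to x = cos x, far simpler than the vector-valued SCFT equations it accelerates in the paper):

```python
import math

def anderson_depth1(g, x0, tol=1e-10, max_iter=100):
    """Fixed-point iteration x = g(x) accelerated by depth-1 Anderson mixing.

    Each new iterate is the linear combination of the two most recent
    g-evaluations that minimises the residual f(x) = g(x) - x.
    """
    x_prev = x0
    f_prev = g(x_prev) - x_prev
    x = g(x_prev)                      # one plain Picard step for history
    for k in range(max_iter):
        f = g(x) - x
        if abs(f) < tol:
            return x, k
        denom = f - f_prev
        theta = f / denom if denom != 0 else 0.0
        # Mix g(x) = x + f and g(x_prev) = x_prev + f_prev.
        x, x_prev, f_prev = (1 - theta) * (x + f) + theta * (x_prev + f_prev), x, f
    return x, max_iter

# Solve x = cos(x); the accelerated iteration converges in a few steps.
root, iters = anderson_depth1(math.cos, 0.5)
assert abs(root - math.cos(root)) < 1e-10
assert iters < 20
```

In SCFT the same idea is applied with a deeper history to the vector of spectral field coefficients, which is where the reported orders-of-magnitude speedup comes from.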

Relevance: 30.00%

Abstract:

The correlated k-distribution (CKD) method is widely used in the radiative transfer schemes of atmospheric models and involves dividing the spectrum into a number of bands and then reordering the gaseous absorption coefficients within each one. The fluxes and heating rates for each band may then be computed by discretizing the reordered spectrum into of order 10 quadrature points per major gas and performing a monochromatic radiation calculation for each point. In this presentation it is shown that for clear-sky longwave calculations, sufficient accuracy for most applications can be achieved without the need for bands: reordering may be performed on the entire longwave spectrum. The resulting full-spectrum correlated k (FSCK) method requires significantly fewer monochromatic calculations than standard CKD to achieve a given accuracy. The concept is first demonstrated by comparing with line-by-line calculations for an atmosphere containing only water vapor, in which it is shown that the accuracy of heating-rate calculations improves approximately in proportion to the square of the number of quadrature points. For more than around 20 points, the root-mean-squared error flattens out at around 0.015 K/day due to the imperfect rank correlation of absorption spectra at different pressures in the profile. The spectral overlap of m different gases is treated by considering an m-dimensional hypercube where each axis corresponds to the reordered spectrum of one of the gases. This hypercube is then divided up into a number of volumes, each approximated by a single quadrature point, such that the total number of quadrature points is slightly fewer than the sum of the number that would be required to treat each of the gases separately. 
The gaseous absorptions for each quadrature point are optimized such that they minimize a cost function expressing the deviation of the heating rates and fluxes calculated by the FSCK method from line-by-line calculations for a number of training profiles. This approach is validated for atmospheres containing water vapor, carbon dioxide, and ozone, in which it is found that in the troposphere and most of the stratosphere, heating-rate errors of less than 0.2 K/day can be achieved using a total of 23 quadrature points, decreasing to less than 0.1 K/day for 32 quadrature points. It would be relatively straightforward to extend the method to include other gases.
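The reordering idea at the heart of the CKD and FSCK methods can be sketched as follows: sort the absorption coefficients, split the reordered spectrum into a small number of intervals, and perform one monochromatic calculation per interval. The spectrum, path length, and function name below are synthetic illustrations, not data from the study:

```python
import math

def kdist_transmittance(k_spectrum, path, n_quad):
    """Mean transmittance over a spectrum via a k-distribution.

    Sort (reorder) the absorption coefficients, split the reordered
    spectrum into n_quad intervals, and represent each interval by the
    mean of its coefficients: one monochromatic calculation per
    quadrature point instead of one per spectral sample.
    """
    ks = sorted(k_spectrum)
    n = len(ks)
    total = 0.0
    for q in range(n_quad):
        chunk = ks[q * n // n_quad:(q + 1) * n // n_quad]
        k_rep = sum(chunk) / len(chunk)
        total += len(chunk) * math.exp(-k_rep * path)
    return total / n

# Synthetic "spectrum" whose absorption varies over orders of magnitude:
spectrum = [10 ** (3 * math.sin(0.01 * i) - 1) for i in range(1000)]
exact = sum(math.exp(-k * 2.0) for k in spectrum) / len(spectrum)
approx = kdist_transmittance(spectrum, path=2.0, n_quad=10)
# Ten quadrature points recover the line-by-line mean closely.
assert abs(approx - exact) < 0.05
```

The reordering works because transmittance depends only on the distribution of absorption coefficients, not on where they sit in the spectrum; the FSCK refinement applies this to the whole longwave spectrum at once and then optimizes the representative absorptions against line-by-line training calculations.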

Relevance: 30.00%

Abstract:

The correlated k-distribution (CKD) method is widely used in the radiative transfer schemes of atmospheric models, and involves dividing the spectrum into a number of bands and then reordering the gaseous absorption coefficients within each one. The fluxes and heating rates for each band may then be computed by discretizing the reordered spectrum into of order 10 quadrature points per major gas, and performing a pseudo-monochromatic radiation calculation for each point. In this paper it is first argued that for clear-sky longwave calculations, sufficient accuracy for most applications can be achieved without the need for bands: reordering may be performed on the entire longwave spectrum. The resulting full-spectrum correlated k (FSCK) method requires significantly fewer pseudo-monochromatic calculations than standard CKD to achieve a given accuracy. The concept is first demonstrated by comparing with line-by-line calculations for an atmosphere containing only water vapor, in which it is shown that the accuracy of heating-rate calculations improves approximately in proportion to the square of the number of quadrature points. For more than around 20 points, the root-mean-squared error flattens out at around 0.015 K d−1 due to the imperfect rank correlation of absorption spectra at different pressures in the profile. The spectral overlap of m different gases is treated by considering an m-dimensional hypercube where each axis corresponds to the reordered spectrum of one of the gases. This hypercube is then divided up into a number of volumes, each approximated by a single quadrature point, such that the total number of quadrature points is slightly fewer than the sum of the number that would be required to treat each of the gases separately. 
The gaseous absorptions for each quadrature point are optimized such that they minimize a cost function expressing the deviation of the heating rates and fluxes calculated by the FSCK method from line-by-line calculations for a number of training profiles. This approach is validated for atmospheres containing water vapor, carbon dioxide and ozone, in which it is found that in the troposphere and most of the stratosphere, heating-rate errors of less than 0.2 K d−1 can be achieved using a total of 23 quadrature points, decreasing to less than 0.1 K d−1 for 32 quadrature points. It would be relatively straightforward to extend the method to include other gases.

Relevance: 30.00%

Abstract:

Unless the benefits to society of measures to protect and improve the welfare of animals are made transparent by means of their valuation they are likely to go unrecognised and cannot easily be weighed against the costs of such measures as required, for example, by policy-makers. A simple single measure scoring system, based on the Welfare Quality® index, is used, together with a choice experiment economic valuation method, to estimate the value that people place on improvements to the welfare of different farm animal species measured on a continuous (0-100) scale. Results from using the method on a survey sample of some 300 people show that it is able to elicit apparently credible values. The survey found that 96% of respondents thought that we have a moral obligation to safeguard the welfare of animals and that over 72% were concerned about the way farm animals are treated. Estimated mean annual willingness to pay for meat from animals with improved welfare of just one point on the scale was £5.24 for beef cattle, £4.57 for pigs and £5.10 for meat chickens. Further development of the method is required to capture the total economic value of animal welfare benefits. Despite this, the method is considered a practical means for obtaining economic values that can be used in the cost-benefit appraisal of policy measures intended to improve the welfare of animals.

Relevance: 30.00%

Abstract:

Background: Medication errors are common in primary care and are associated with considerable risk of patient harm. We tested whether a pharmacist-led, information technology-based intervention was more effective than simple feedback in reducing the number of patients at risk from hazardous prescribing and inadequate blood-test monitoring of medicines 6 months after the intervention. Methods: In this pragmatic, cluster randomised trial, general practices in the UK were stratified by research site and list size, and randomly assigned by a web-based randomisation service in block sizes of two or four to one of two groups. The practices were allocated to either computer-generated simple feedback for at-risk patients (control) or a pharmacist-led information technology intervention (PINCER), composed of feedback, educational outreach, and dedicated support. The allocation was masked to general practices, patients, pharmacists, researchers, and statisticians. Primary outcomes were the proportions of patients at 6 months after the intervention who had had any of three clinically important errors: non-selective non-steroidal anti-inflammatory drugs (NSAIDs) prescribed to those with a history of peptic ulcer without co-prescription of a proton-pump inhibitor; β blockers prescribed to those with a history of asthma; long-term prescription of angiotensin converting enzyme (ACE) inhibitor or loop diuretics to those 75 years or older without assessment of urea and electrolytes in the preceding 15 months. The cost per error avoided was estimated by incremental cost-effectiveness analysis. This study is registered with Controlled-Trials.com, number ISRCTN21785299. Findings: 72 general practices with a combined list size of 480 942 patients were randomised.
At 6 months' follow-up, patients in the PINCER group were significantly less likely to have been prescribed a non-selective NSAID if they had a history of peptic ulcer without gastroprotection (OR 0.58, 95% CI 0.38–0.89); a β blocker if they had asthma (0.73, 0.58–0.91); or an ACE inhibitor or loop diuretic without appropriate monitoring (0.51, 0.34–0.78). PINCER has a 95% probability of being cost effective if the decision-maker's ceiling willingness to pay reaches £75 per error avoided at 6 months. Interpretation: The PINCER intervention is an effective method for reducing a range of medication errors in general practices with computerised clinical records. Funding: Patient Safety Research Portfolio, Department of Health, England.