890 results for problem of mediation
Abstract:
We consider the problem of determining the pressure and velocity fields for a weakly compressible fluid flowing in a two-dimensional reservoir in an inhomogeneous, anisotropic porous medium, with vertical side walls and variable upper and lower boundaries, in the presence of vertical wells injecting or extracting fluid. Numerical solution of this problem may be expensive, particularly when the depth scale of the layer, h, is small compared with the horizontal length scale, l. This situation occurs frequently in applications to oil reservoir recovery. Under the assumption that epsilon = h/l << 1, we show that the pressure field varies only in the horizontal direction away from the wells (the outer region). We construct two-term asymptotic expansions in epsilon in both the inner (near the wells) and outer regions and use the asymptotic matching principle to derive analytical expressions for all significant process quantities. This approach, via the method of matched asymptotic expansions, takes advantage of the small aspect ratio of the reservoir, epsilon, at precisely the stage where full numerical computations become stiff, and also reveals the detailed structure of the dynamics of the flow, both in the neighborhood of wells and away from wells.
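A minimal illustration of the thin-layer scaling argument summarised above, written for a generic Darcy-flow pressure equation with isotropic permeability and no-flux upper and lower boundaries; this is a sketch of the general technique, not the paper's actual (anisotropic, variable-boundary) formulation:

```latex
% Sketch only: generic Darcy pressure equation in a thin layer; isotropic permeability
% and no-flux top/bottom boundaries are assumed purely for illustration.
\begin{align*}
&\nabla\cdot\Bigl(\tfrac{k}{\mu}\nabla p\Bigr)=0,\qquad \epsilon=\tfrac{h}{l}\ll 1,\\
&\text{rescaling } x = l\hat{x},\; z = h\hat{z}:\qquad
 \epsilon^{2}\,\partial_{\hat{x}}\!\Bigl(\tfrac{k}{\mu}\,\partial_{\hat{x}}p\Bigr)
 +\partial_{\hat{z}}\!\Bigl(\tfrac{k}{\mu}\,\partial_{\hat{z}}p\Bigr)=0,\\
&p = p_{0}+\epsilon^{2}p_{1}+\cdots
 \;\Longrightarrow\; \partial_{\hat{z}}\Bigl(\tfrac{k}{\mu}\,\partial_{\hat{z}}p_{0}\Bigr)=0
 \;\Longrightarrow\; \partial_{\hat{z}}p_{0}=0 .
\end{align*}
```

At leading order the outer pressure is independent of the vertical coordinate, which is the abstract's statement that away from the wells the pressure varies only horizontally; near a well a separate inner expansion is constructed and matched to this outer solution.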
Abstract:
Ice clouds are an important yet largely unvalidated component of weather forecasting and climate models, but radar offers the potential to provide the necessary data to evaluate them. First, coordinated aircraft in situ measurements and scans by a 3-GHz radar are presented, demonstrating that, for stratiform midlatitude ice clouds, radar reflectivity in the Rayleigh-scattering regime may be reliably calculated from aircraft size spectra if the "Brown and Francis" mass-size relationship is used. The comparisons spanned radar reflectivity values from -15 to +20 dBZ, ice water contents (IWCs) from 0.01 to 0.4 g m^-3, and median volumetric diameters between 0.2 and 3 mm. In mixed-phase conditions the agreement is much poorer because of the higher-density ice particles present. A large midlatitude aircraft dataset is then used to derive expressions that relate radar reflectivity and temperature to ice water content and visible extinction coefficient. The analysis is an advance over previous work in several ways: the retrievals vary smoothly with both input parameters, different relationships are derived for the common radar frequencies of 3, 35, and 94 GHz, and the problem of retrieving the long-term mean and the horizontal variance of ice cloud parameters is considered separately. It is shown that the dependence on temperature arises because of the temperature dependence of the number concentration "intercept parameter" rather than mean particle size. A comparison is presented of ice water content derived from scanning 3-GHz radar with the values held in the Met Office mesoscale forecast model, for eight precipitating cases spanning 39 h over Southern England. It is found that the model predicted mean IWC to within 10% of the observations at temperatures between -30 and -10 degrees C but tended to underestimate it by around a factor of 2 at colder temperatures.
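As a rough illustration of the forward calculation described above (size spectrum to Rayleigh reflectivity and ice water content), the sketch below uses a generic power-law mass-size relation and the common solid-ice-equivalent-sphere Rayleigh approximation. The coefficients, the dielectric-factor ratio and the toy spectrum are placeholders, not the Brown and Francis values or the relationships fitted in the paper:

```python
import numpy as np

# Sketch only: Rayleigh-regime reflectivity and ice water content (IWC) from a binned
# particle size distribution N(D), using a generic power-law mass-size relation
# m(D) = a * D**b.  All coefficients are hypothetical placeholders.
A_MASS, B_MASS = 0.01, 2.0            # hypothetical mass-size coefficients (SI units)
RHO_ICE = 917.0                       # density of solid ice, kg m^-3
K_RATIO = 0.174 / 0.93                # typical |K_ice|^2 / |K_water|^2 for expressing Z in dBZ

def iwc_and_dbz(d_m, n_per_m4):
    """d_m: bin diameters (m); n_per_m4: concentration density N(D) (m^-3 per m of size)."""
    mass = A_MASS * d_m**B_MASS                               # particle mass, kg
    iwc = np.trapz(n_per_m4 * mass, d_m)                      # kg m^-3
    d_ice = (6.0 * mass / (np.pi * RHO_ICE))**(1.0 / 3.0)     # solid-ice-equivalent diameter, m
    z_lin = K_RATIO * np.trapz(n_per_m4 * (d_ice * 1e3)**6, d_m)   # mm^6 m^-3
    return iwc * 1e3, 10.0 * np.log10(z_lin)                  # g m^-3, dBZ

# Hypothetical exponential spectrum, 0.1-5 mm, purely to exercise the function.
D = np.linspace(1e-4, 5e-3, 200)
N = 1e7 * np.exp(-1500.0 * D)
print(iwc_and_dbz(D, N))
```

The retrieval problem treated in the paper, recovering IWC from measured reflectivity and temperature, amounts to fitting and inverting relationships of this kind over a large aircraft dataset.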
Abstract:
An analytical model is developed for the initial stage of surface wave generation at an air-water interface by a turbulent shear flow in either the air or the water. The model treats the problem of wave growth starting from a flat interface and is relevant for small waves whose forcing is dominated by turbulent pressure fluctuations. The wave growth is predicted using the linearised and inviscid equations of motion, essentially following Phillips [Phillips, O.M., 1957. On the generation of waves by turbulent wind. J. Fluid Mech. 2, 417-445], but the pressure fluctuations that generate the waves are treated as unsteady and related to the turbulent velocity field using the rapid-distortion treatment of Durbin [Durbin, P.A., 1978. Rapid distortion theory of turbulent flows. PhD thesis, University of Cambridge]. This model, which assumes a constant mean shear rate Gamma, can be viewed as the simplest representation of an oceanic or atmospheric boundary layer. For turbulent flows in the air and in the water producing pressure fluctuations of similar magnitude, the waves generated by turbulence in the water are found to be considerably steeper than those generated by turbulence in the air. For resonant waves, this is shown to be due to the shorter decorrelation time of turbulent pressure in the air (estimated as proportional to 1/Gamma), because of the higher shear rate existing in the air flow, and due to the smaller length scale of the turbulence in the water. Non-resonant waves generated by turbulence in the water, although somewhat gentler, are still steeper than resonant waves generated by turbulence in the air. Hence, it is suggested that turbulence in the water may have a more important role than previously thought in the initiation of the surface waves that are subsequently amplified by feedback instability mechanisms.
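For readers unfamiliar with the Phillips-type resonance mechanism the abstract builds on, the sketch below gives the linearised, inviscid evolution of a single Fourier mode of the surface elevation forced by interfacial pressure fluctuations in deep water; the paper's model additionally carries the mean shear and the rapid-distortion statistics of the pressure, which are not reproduced here:

```latex
% Sketch only: Phillips (1957)-type forcing of one surface-elevation mode in deep water.
\begin{equation*}
\frac{\partial^{2}\hat{\eta}(\mathbf{k},t)}{\partial t^{2}}
 +\omega^{2}(k)\,\hat{\eta}(\mathbf{k},t)
 = -\,\frac{k}{\rho_{w}}\,\hat{p}(\mathbf{k},t),
\qquad
\omega^{2}(k)=gk+\frac{\sigma k^{3}}{\rho_{w}} .
\end{equation*}
```

Growth is strongest when the forcing pressure pattern stays correlated with a free wave of frequency omega(k) (resonance), which is why the decorrelation time of the pressure, set by the shear rate and the turbulence length scale, controls how steep the generated waves become.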
Abstract:
We previously demonstrated that a dry, room temperature stable formulation of a live bacterial vaccine was highly susceptible to bile, and suggested that this would lead to significant loss of viability of any live bacterial formulation released into the intestine using an enteric coating or capsule. We found that bile and acid tolerance is very rapidly recovered after rehydration with buffer or water, raising the possibility that rehydration in the absence of bile prior to release into the intestine might solve the problem of bile toxicity to dried cells. We describe here a novel formulation that combines extensively studied bile acid adsorbent resins with the dried bacteria, to temporarily adsorb bile acids and allow rehydration and recovery of bile resistance of bacteria in the intestine before release. Tablets containing the bile acid adsorbent cholestyramine release 250-fold more live bacteria when dissolved in a bile solution, compared to control tablets without cholestyramine or with a control resin that does not bind bile acids. We propose that a simple enteric coated oral dosage form containing bile acid adsorbent resins will allow improved live bacterial delivery to the intestine via the oral route, a major step towards room temperature stable, easily administered and distributed vaccine pills and other bacterial therapeutics.
Abstract:
This note reports on the results of a choice experiment survey of 400 people in England and Wales, conducted to estimate the value that society places on changes to the size of the badger population. The study was undertaken in the context of the possible need to reduce the badger population by culling to help control bovine tuberculosis in cattle. The study found that people were concerned about the problem of bovine tuberculosis in cattle, which was reflected in their willingness to pay to control the disease, and gave a relatively low value to changes in the size of the badger population (within limits). However, people did not like the idea of a policy that intentionally killed large numbers of badgers and had a very high willingness to pay not to have such a policy.
Abstract:
In Central Brazil, the long-term sustainability of beef cattle systems is under threat over vast tracts of farming areas, as more than half of the 50 million hectares of sown pastures are suffering from degradation. Overgrazing practised to maintain high stocking rates is regarded as one of the main causes. High stocking rates are deliberate and crucial decisions taken by the farmers, which appear paradoxical, even irrational, given the state of knowledge regarding the consequences of overgrazing. The phenomenon, however, appears inextricably linked with the objectives that farmers hold. In this research, those objectives were elicited first and, from their ranking, two of them, 'asset value of cattle' (representing cattle ownership) and 'present value of economic returns', were chosen to develop an original bi-criteria Compromise Programming model to test various hypotheses postulated to explain the overgrazing behaviour. As part of the model, a pasture productivity index is derived to estimate the pasture recovery cost. Different scenarios based on farmers' attitudes towards overgrazing, pasture costs and capital availability were analysed. The results of the model runs show that benefits from holding more cattle can outweigh the increased pasture recovery and maintenance costs. This result undermines the hypothesis that farmers practise overgrazing because they are unaware of, or uncaring about, overgrazing costs. An appropriate approach to the problem of pasture degradation requires information on the economics, and its interplay with farmers' objectives, for a wide range of pasture recovery and maintenance methods. Seen within the context of farmers' objectives, some level of overgrazing appears rational. Advocacy of the simple 'no overgrazing' rule is an insufficient strategy to maintain the long-term sustainability of the beef production systems in Central Brazil.
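A minimal sketch of what a bi-criteria Compromise Programming choice of stocking rate can look like: two objectives (asset value of cattle held, and net returns after a pasture recovery cost) are normalised between their ideal and anti-ideal values and a weighted distance to the ideal point is minimised. Every functional form and number below is a hypothetical placeholder, not the paper's model or data:

```python
import numpy as np

# Sketch only: bi-criteria Compromise Programming over a candidate stocking rate.
def asset_value(rate):                        # objective 1: value of cattle held (higher is better)
    return 400.0 * rate                       # hypothetical $/ha, grows with herd size

def net_returns(rate):                        # objective 2: returns net of pasture recovery cost
    gross = 300.0 * rate * np.exp(-0.4 * rate)                 # diminishing returns
    recovery = 60.0 * np.maximum(rate - 1.5, 0.0)**2           # hypothetical overgrazing cost
    return gross - recovery

rates = np.linspace(0.2, 4.0, 500)             # candidate stocking rates, head/ha
f1, f2 = asset_value(rates), net_returns(rates)

def compromise(weights, p=2):
    """Minimise the weighted L_p distance to the ideal point over the candidate rates."""
    dist = np.zeros_like(rates)
    for w, f in zip(weights, (f1, f2)):
        ideal, anti = f.max(), f.min()
        dist += (w * (ideal - f) / (ideal - anti))**p          # normalised regret per objective
    return rates[np.argmin(dist)]

# Weighting cattle ownership heavily pushes the compromise stocking rate upward,
# mirroring the finding that some overgrazing can be rational given farmers' objectives.
print(compromise((0.8, 0.2)), compromise((0.2, 0.8)))
```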
Abstract:
The terminator gene can render seeds sterile, so forcing farmers to purchase fresh seed every year. It is a technological solution to the problem of market failure that could increase the appropriability of R&D investment more effectively than intellectual property rights legislation or patents. This paper shows that appropriability would be more than tripled and that this leads to greater private R&D investment, which may be expected to double or triple. This would bring open-pollinating varieties into line with F1 hybrids, for which seed cannot be saved. In turn, the increased investment should raise yield increases to levels similar to those for hybrid crops. Thus, there are benefits to set against the possible ecological and environmental costs and the clear distributional and social consequences. The paper discusses the way the seed market is developing, the possible impacts, especially from a developing country viewpoint, and considers the policy changes that are needed.
Abstract:
The endemic pink pigeon has recovered from less than 20 birds in the mid-1970s to 355 free-living individuals in 2003. A major concern for the species' recovery has been the potential genetic problem of inbreeding. Captive pink pigeons bred for reintroduction were managed to maximise founder representation and minimise inbreeding. In this paper, we quantify the effect of inbreeding on survival and reproductive parameters in captive and wild populations and quantify DNA sequence variation in the mitochondrial d-loop region for pink pigeon founders. Inbreeding affected egg fertility, squab, juvenile and adult survival, but effects were strongest in highly inbred birds (F≥0.25). Inbreeding depression was more apparent in free-living birds where even moderate levels of inbreeding affected survival, although highly inbred birds were equally compromised in both captive and wild populations. Mitochondrial DNA haplotypic diversity in pink pigeon founders is low, suggesting that background inbreeding is contributing to low fertility and depressed productivity in this species, as well as comparable survival of some groups of non-inbred and nominally inbred birds. Management of wild populations has boosted population growth and may be required long-term to offset the negative effects of inbreeding depression and enhance the species' survival.
Abstract:
The problem of estimating the individual probabilities of a discrete distribution is considered. The true distribution of the independent observations is a mixture of a family of power series distributions. First, we ensure identifiability of the mixing distribution assuming mild conditions. Next, the mixing distribution is estimated by non-parametric maximum likelihood and an estimator for individual probabilities is obtained from the corresponding marginal mixture density. We establish asymptotic normality for the estimator of individual probabilities by showing that, under certain conditions, the difference between this estimator and the empirical proportions is asymptotically negligible. Our framework includes Poisson, negative binomial and logarithmic series as well as binomial mixture models. Simulations highlight the benefit in achieving normality when using the proposed marginal mixture density approach instead of the empirical one, especially for small sample sizes and/or when interest is in the tail areas. A real data example is given to illustrate the use of the methodology.
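A minimal sketch of the approach described, assuming a Poisson mixture: the mixing distribution is estimated by EM on a fixed grid of support points (a simple stand-in for full non-parametric maximum likelihood), and individual probabilities are then read off the fitted marginal mixture density and compared with the raw empirical proportions:

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)

# Sketch only: estimate P(X = k) for count data assumed to follow a Poisson mixture,
# via an EM approximation to the non-parametric MLE of the mixing distribution.
x = rng.poisson(rng.gamma(2.0, 2.0, size=500))     # synthetic counts: Poisson mixed over a gamma

grid = np.linspace(0.1, 20.0, 80)                  # fixed candidate support points
weights = np.full(grid.size, 1.0 / grid.size)      # initial mixing weights

for _ in range(300):                               # EM iterations
    lik = poisson.pmf(x[:, None], grid[None, :])   # n x m component likelihoods
    post = lik * weights
    post /= post.sum(axis=1, keepdims=True)        # responsibilities
    weights = post.mean(axis=0)                    # updated mixing weights

def prob(k):
    """Estimated P(X = k) from the fitted marginal mixture density."""
    return float(poisson.pmf(k, grid) @ weights)

for k in range(8):
    print(k, round(prob(k), 4), round(float(np.mean(x == k)), 4))   # mixture vs empirical
```

The smoothing provided by the marginal mixture density is what drives the small-sample and tail-area gains reported in the abstract.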
Abstract:
This paper considers the problem of estimation when one of a number of populations, assumed normal with known common variance, is selected on the basis of it having the largest observed mean. Conditional on selection of the population, the observed mean is a biased estimate of the true mean. This problem arises in the analysis of clinical trials in which selection is made between a number of experimental treatments that are compared with each other either with or without an additional control treatment. Attempts to obtain approximately unbiased estimates in this setting have been proposed by Shen [2001. An improved method of evaluating drug effect in a multiple dose clinical trial. Statist. Medicine 20, 1913–1929] and Stallard and Todd [2005. Point estimates and confidence regions for sequential trials involving selection. J. Statist. Plann. Inference 135, 402–419]. This paper explores the problem in the simple setting in which two experimental treatments are compared in a single analysis. It is shown that in this case the estimate of Stallard and Todd is the maximum-likelihood estimate (m.l.e.), and this is compared with the estimate proposed by Shen. In particular, it is shown that the m.l.e. has infinite expectation whatever the true value of the mean being estimated. We show that there is no conditionally unbiased estimator, and propose a new family of approximately conditionally unbiased estimators, comparing these with the estimators suggested by Shen.
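The selection bias at issue can be seen in a few lines of simulation: with two arms having the same true mean, the observed mean of whichever arm looks better is biased upward. This is purely illustrative (unit variance, equal true means) and does not implement the estimators of Shen or Stallard and Todd:

```python
import numpy as np

rng = np.random.default_rng(1)

# Sketch only: conditional bias of the observed mean of the selected (apparently best)
# treatment when two arms share the SAME true mean.
true_mean, sigma, n_per_arm, n_trials = 0.0, 1.0, 30, 100_000

xbar = rng.normal(true_mean, sigma / np.sqrt(n_per_arm), size=(n_trials, 2))
selected = xbar.max(axis=1)        # observed mean of the arm with the larger sample mean

print("true mean:               ", true_mean)
print("mean of selected arm:    ", round(float(selected.mean()), 4))
# For two arms with equal means, E[max] = mu + sigma_xbar / sqrt(pi).
print("theoretical expectation: ", round(true_mean + (sigma / np.sqrt(n_per_arm)) / np.sqrt(np.pi), 4))
```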
Abstract:
This article introduces a new general method for genealogical inference that samples independent genealogical histories using importance sampling (IS) and then samples other parameters with Markov chain Monte Carlo (MCMC). It is then possible to more easily utilize the advantages of importance sampling in a fully Bayesian framework. The method is applied to the problem of estimating recent changes in effective population size from temporally spaced gene frequency data. The method gives the posterior distribution of effective population size at the time of the oldest sample and at the time of the most recent sample, assuming a model of exponential growth or decline during the interval. The effect of changes in number of alleles, number of loci, and sample size on the accuracy of the method is described using test simulations, and it is concluded that these have an approximately equivalent effect. The method is used on three example data sets and problems in interpreting the posterior densities are highlighted and discussed.
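A minimal sketch of the general structure described (importance sampling of a latent quantity to estimate the likelihood, wrapped in MCMC over the parameter), applied to a deliberately toy latent-variable model rather than genealogies; everything here is a stand-in for the paper's method:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

# Sketch only: importance sampling (IS) inside Metropolis-Hastings, for the toy model
# z ~ N(theta, 1), y | z ~ N(z, 1); the exact marginal is y | theta ~ N(theta, 2).
y_obs, n_is = 1.5, 200

def is_log_likelihood(theta):
    """IS estimate of log p(y_obs | theta), integrating out the latent z."""
    z = rng.normal(y_obs, 1.0, size=n_is)                    # proposal centred on the data
    log_w = (norm.logpdf(y_obs, z, 1.0) + norm.logpdf(z, theta, 1.0)
             - norm.logpdf(z, y_obs, 1.0))
    m = log_w.max()
    return m + np.log(np.exp(log_w - m).mean())

theta, loglik, samples = 0.0, is_log_likelihood(0.0), []
for _ in range(5000):                                        # pseudo-marginal MH, flat prior
    prop = theta + rng.normal(0.0, 0.5)
    loglik_prop = is_log_likelihood(prop)
    if np.log(rng.uniform()) < loglik_prop - loglik:
        theta, loglik = prop, loglik_prop
    samples.append(theta)

print(np.mean(samples[1000:]), np.std(samples[1000:]))       # posterior mean should be near y_obs
```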
Abstract:
Liquid chromatography-mass spectrometry (LC-MS) datasets can be compared or combined following chromatographic alignment. Here we describe a simple solution to the specific problem of aligning one LC-MS dataset and one LC-MS/MS dataset, acquired on separate instruments from an enzymatic digest of a protein mixture, using feature extraction and a genetic algorithm. First, the LC-MS dataset is searched within a few ppm of the calculated theoretical masses of peptides confidently identified by LC-MS/MS. A piecewise linear function is then fitted to these matched peptides using a genetic algorithm with a fitness function that is insensitive to incorrect matches but sufficiently flexible to adapt to the discrete shifts common when comparing LC datasets. We demonstrate the utility of this method by aligning ion trap LC-MS/MS data with accurate LC-MS data from an FTICR mass spectrometer and show how hybrid datasets can improve peptide and protein identification by combining the speed of the ion trap with the mass accuracy of the FTICR, similar to using a hybrid ion trap-FTICR instrument. We also show that the high resolving power of FTICR can improve precision and linear dynamic range in quantitative proteomics. The alignment software, msalign, is freely available as open source.
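A rough sketch of the alignment idea: features are matched across runs (here, pre-matched synthetic retention times with a fraction of deliberately wrong matches), and a piecewise-linear retention-time mapping is fitted by a small genetic algorithm whose fitness gives each match a bounded contribution, so incorrect matches barely influence the fit. This is not the msalign implementation; data, knots, loss and GA settings are placeholders:

```python
import numpy as np

rng = np.random.default_rng(3)

# Sketch only: GA fit of a piecewise-linear retention-time mapping, robust to bad matches.
rt_a = rng.uniform(5.0, 55.0, 300)                           # retention times in run A
rt_b = rt_a * 1.05 + 2.0 + rng.normal(0.0, 0.15, rt_a.size)  # run B: shifted/stretched + noise
bad = rng.random(rt_a.size) < 0.2
rt_b[bad] = rng.uniform(5.0, 60.0, int(bad.sum()))           # 20% spurious matches

knots_x = np.linspace(rt_a.min(), rt_a.max(), 6)             # fixed knot positions (run A time)

def fitness(knots_y):
    """Bounded per-match contribution, so outlying matches barely count."""
    resid = rt_b - np.interp(rt_a, knots_x, knots_y)
    return np.sum(np.exp(-(resid / 0.5)**2))

pop = knots_x + rng.normal(0.0, 5.0, size=(60, knots_x.size))    # initial population
for _ in range(200):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-20:]]                      # truncation selection
    mates = parents[rng.integers(0, 20, size=(60, 2))]
    mask = rng.random((60, knots_x.size)) < 0.5                  # uniform crossover
    pop = np.where(mask, mates[:, 0], mates[:, 1]) + rng.normal(0.0, 0.2, (60, knots_x.size))

best = pop[np.argmax([fitness(ind) for ind in pop])]
print(np.round(best, 2))        # should approximate knots_x * 1.05 + 2.0 despite the bad matches
```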
Abstract:
We present the symbolic resonance analysis (SRA) as a viable method for addressing the problem of enhancing a weakly dominant mode in a mixture of impulse responses obtained from a nonlinear dynamical system. We demonstrate this using results from a numerical simulation with Duffing oscillators in different domains of their parameter space, and by analyzing event-related brain potentials (ERPs) from a language processing experiment in German as a representative application. In this paradigm, the averaged ERPs exhibit an N400 followed by a sentence final negativity. Contemporary sentence processing models predict a late positivity (P600) as well. We show that the SRA is able to unveil the P600 evoked by the critical stimuli as a weakly dominant mode from the covering sentence final negativity. (c) 2007 American Institute of Physics.
Abstract:
This paper addresses the crucial problem of wayfinding assistance in Virtual Environments (VEs). A number of navigation aids such as maps, agents, trails and acoustic landmarks are available to support the user for navigation in VEs; however, most of these aids are visually dominated. This work-in-progress describes a sound-based approach that intends to assist the task of 'route decision' during navigation in a VE using music. Furthermore, with the use of musical sounds, it aims to reduce the cognitive load associated with other visually as well as physically dominated tasks. To achieve these goals, the approach exploits the benefits provided by music to ease and enhance the task of wayfinding, whilst making the user experience in the VE smooth and enjoyable.