90 results for Armington Assumption

in CentAUR: Central Archive University of Reading - UK


Relevance: 20.00%

Abstract:

The behaviour of stationary, non-passive plumes can be simulated in a reasonably simple and accurate way by integral models. One of the key requirements of these models, but also one of their less well-founded aspects, is the entrainment assumption, which parameterizes turbulent mixing between the plume and the environment. The entrainment assumption developed by Schatzmann and adjusted to a set of experimental results requires four constants and an ad hoc hypothesis to eliminate undesirable terms. With this assumption, Schatzmann’s model exhibits numerical instability for certain cases of plumes with small velocity excesses, due to very fast radius growth. The purpose of this paper is to present an alternative entrainment assumption based on a first-order turbulence closure, which only requires two adjustable constants and seems to solve this problem. The asymptotic behaviour of the new formulation is studied and compared to previous ones. The validation tests presented by Schatzmann are repeated and it is found that the new formulation not only eliminates numerical instability but also predicts more plausible growth rates for jets in co-flowing streams.
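For orientation only, the classical Morton-Taylor-Turner form of the entrainment assumption for a plume in a co-flowing stream can be written as below. This is a generic illustration of this class of closure, with notation chosen here; it is not the formulation of Schatzmann or the new formulation proposed in the paper.

```latex
% Generic Morton–Taylor–Turner-type entrainment closure (illustrative only):
% the growth of plume mass flux equals an entrainment velocity, taken
% proportional to the local velocity excess, acting over the plume perimeter.
\[
\frac{\mathrm{d}}{\mathrm{d}s}\!\left(\pi b^{2}\rho\,u\right)
  = 2\pi b\,\rho_{a}\,\alpha\,\lvert u - u_{a}\rvert
\]
```

Here s is the distance along the plume axis, b the plume radius, u and u_a the plume and ambient velocities, ρ and ρ_a the corresponding densities, and α an adjustable entrainment constant; the new assumption described in the abstract is instead of first-order-closure form and requires two adjustable constants.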

Relevance: 20.00%

Abstract:

Whilst common sense knowledge has been well researched in terms of intelligence and (in particular) artificial intelligence, specific, factual knowledge also plays a critical part in practice. When it comes to testing for intelligence, testing for factual knowledge is, in everyday life, frequently used as a front-line tool. This paper presents new results which were the outcome of a series of practical Turing tests held on 23rd June 2012 at Bletchley Park, England. The focus of this paper is on the employment of specific knowledge testing by interrogators. Of interest are the prejudiced assumptions made by interrogators as to what they believe should be widely known, and the conclusions subsequently drawn if an entity does or does not appear to know a particular fact known to the interrogator. The paper is not at all about the performance of machines or hidden humans, but rather about the strategies, based on such assumptions, adopted by Turing test interrogators. Full, unedited transcripts from the tests are shown for the reader as working examples. As a result, it might be possible to draw critical conclusions about human concepts of intelligence, in terms of the role played by specific, factual knowledge in our understanding of intelligence, whether exhibited by a human or a machine. This is specifically intended as a position paper: firstly, it claims that putting Turing's test into practice is a useful exercise that throws light on how we humans think; secondly, it takes a potentially controversial stance, because some interrogators adopt a solipsist style of questioning hidden entities, taking the view that an entity is a thinking, intelligent human only if it thinks like them and knows what they know. The paper is aimed at opening discussion with regard to the different aspects considered.

Relevance: 20.00%

Abstract:

European researchers across heterogeneous disciplines are voicing concerns and arguing for new paths towards a brighter future for scientific and knowledge creation and communication. Recently, concerns have been expressed in the biological and natural sciences that major threats are being intentionally ignored. These threats challenge Europe's future ability to create knowledge that effectively deals with emerging social, environmental, health, and economic problems of planetary scope. Within social science circles, however, the root cause of the above challenges has been linked with macro-level forces: neo-liberal ways of valuing, and the associated rules in academia and beyond, which we take for granted. The concerns raised by these heterogeneous scholars in the natural and applied social sciences bear on the ethics of today's research and on academic integrity. Applying Bourdieu's sociology may not offer an optimistic lens on whether change is possible. Rather than attributing the replication of the neo-liberal habitus to intentional choices by agents and institutions, Bourdieu's work raises the importance of thoughtlessly internalised habits in human and social action. Accordingly, most action within a given paradigm (in this case, neo-liberalism) is understood as habituated, i.e. as unconsciously reproducing external social fields, even ill-defined ways of valuing. This essay analyses these ideas and how they may help us critically examine the current habitus surrounding research and knowledge production, evaluation, and communication, and related aspects of academic freedom. Although it is acknowledged that transformation is not easy, the essay presents arguments and recent theoretical paths to suggest that change may nevertheless be a realistic hope once certain action logics are encouraged.

Relevance: 20.00%

Abstract:

RATIONALE: An altered gastric emptying (GE) rate has been implicated in the aetiology of obesity. The ¹³C-octanoic acid breath test (OBT) is frequently used to measure GE, and the cumulative percentage of ¹³C recovered (cPDR) is a common outcome measure. However, true cPDR in breath is dependent on accurate measurement of the carbon dioxide production rate (VCO₂). The current study aimed to quantify differences in the ¹³C OBT results obtained using directly measured VCO₂ (VCO₂DM) compared with (i) VCO₂ predicted from resting values (VCO₂PR) and (ii) VCO₂ predicted from body surface area (VCO₂BSA). METHODS: The GE rate of a high-fat test meal was assessed in 27 lean subjects using the OBT. Breath samples were gathered in the fasted state and at regular intervals throughout the 6-h postprandial period for determination of ¹³C-isotopic enrichment by continuous-flow isotope-ratio mass spectrometry. VCO₂ was measured directly from exhaled air samples and the PDR calculated by the three methods. The bias and the limits of agreement were calculated using Bland-Altman plots. RESULTS: Compared with VCO₂DM, the cPDR was underestimated by VCO₂PR (4.8%; p = 0.0001) and VCO₂BSA (2.7%; p = 0.02). The GE T½ was underestimated by VCO₂PR (13 min; p = 0.0001) and VCO₂BSA (10 min; p = 0.01), compared with VCO₂DM. CONCLUSIONS: The findings highlight the importance of directly measuring VCO₂ production rates throughout the ¹³C OBT and could partly explain the conflicting evidence regarding the effect of obesity on GE rates.
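The bias and limits of agreement mentioned here follow the standard Bland-Altman construction. The Python sketch below, using hypothetical example values rather than the study's data, shows how such a comparison between two VCO₂-based cPDR estimates is typically computed.

```python
import numpy as np

def bland_altman(method_a, method_b):
    """Bias and 95% limits of agreement between two paired measurement methods."""
    a = np.asarray(method_a, dtype=float)
    b = np.asarray(method_b, dtype=float)
    diff = a - b                       # per-subject differences, e.g. cPDR_DM - cPDR_PR
    bias = diff.mean()                 # mean difference = systematic bias
    half_width = 1.96 * diff.std(ddof=1)
    return bias, bias - half_width, bias + half_width

# Hypothetical paired cPDR values (%) from directly measured vs. predicted VCO2
cpdr_dm = [42.1, 38.5, 45.0, 40.2, 36.8]
cpdr_pr = [38.0, 35.1, 41.2, 36.9, 33.5]
print(bland_altman(cpdr_dm, cpdr_pr))  # (bias, lower limit, upper limit)
```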

Relevance: 20.00%

Abstract:

The theory of evolution by sexual selection for sexual size dimorphism (SSD) postulates that SSD primarily reflects the adaptation of males and females to their different reproductive roles. For example, competition among males for access to females increases male body size because larger males are better able to maintain dominant status than smaller males. Larger dominant males sire most offspring while smaller subordinate males are unsuccessful, leading to skew in reproductive success. Therefore, species with male-biased SSD are predicted to have greater variance in male reproductive success than those in which both sexes are similar in size. We tested this prediction among the Pinnipedia, a mammalian group with a great variation in SSD. From a literature review, we identified genetic estimates of male reproductive success for 10 pinniped taxa (eight unique species and two subspecies of a ninth species) that range from seals with similarly sized males and females to species in which males are more than four times as large as females. We found no support for a positive relationship between variance in reproductive success and SSD among pinnipeds after excluding the elephant seals Mirounga leonina and Mirounga angustirostris, which we discuss as distinctive cases. Several explanations for these results are presented, including the revival of one of Darwin's original ideas. Darwin proposed that natural selection may explain SSD based on differences in energetic requirements between sexes and the potential for sexual niche segregation. Males may develop larger bodies to exploit resources that remain unavailable to females due to the energetic constraints imposed on female mammals by gestation and lactation. The importance of this alternative explanation remains to be tested.

Relevance: 10.00%

Abstract:

A parametrization for ice supersaturation is introduced into the ECMWF Integrated Forecast System (IFS), compatible with the cloud scheme that allows partial cloud coverage. It is based on the simple, but often justifiable, diagnostic assumption that the ice nucleation and subsequent depositional growth time-scales are short compared to the model time step, thus supersaturation is only permitted in the clear-sky portion of the grid cell. Results from model integrations using the new scheme are presented, which is demonstrated to increase upper-tropospheric humidity, decrease high-level cloud cover and, to a much lesser extent, cloud ice amounts, all as expected from simple arguments. Evaluation of the relative distribution of supersaturated humidity amounts shows good agreement with the observed climatology derived from in situ aircraft observations. With the new scheme, the global distribution of frequency of occurrence of supersaturated regions compares well with remotely sensed microwave limb sounder (MLS) data, with the most marked errors of underprediction occurring in regions where the model is known to underpredict deep convection. Finally, it is also demonstrated that the new scheme leads to improved predictions of permanent contrail cloud over southern England, which indirectly implies upper-tropospheric humidity fields are better represented for this region.
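The diagnostic assumption can be pictured schematically as below. This is an illustrative partition with notation chosen here, not the exact IFS formulation: the grid-mean specific humidity is split by cloud fraction, with the cloudy part held at ice saturation and only the clear-sky part allowed to exceed it, up to a nucleation threshold.

```latex
% Schematic clear-sky supersaturation partition (illustrative notation only)
\[
\bar{q} \;=\; C\,q_{s,\mathrm{ice}} \;+\; (1 - C)\,q_{\mathrm{clear}},
\qquad q_{\mathrm{clear}} \;\le\; q_{\mathrm{nuc}},
\]
```

where C is the cloud fraction, q_{s,ice} the saturation specific humidity with respect to ice, and q_nuc the humidity threshold at which ice nucleation is assumed to occur; the grid box is supersaturated whenever q_clear exceeds q_{s,ice}.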

Relevance: 10.00%

Abstract:

Data assimilation provides techniques for combining observations and prior model forecasts to create initial conditions for numerical weather prediction (NWP). The relative weighting assigned to each observation in the analysis is determined by its associated error. Remote sensing data usually has correlated errors, but the correlations are typically ignored in NWP. Here, we describe three approaches to the treatment of observation error correlations. For an idealized data set, the information content under each simplified assumption is compared with that under correct correlation specification. Treating the errors as uncorrelated results in a significant loss of information. However, retention of an approximated correlation gives clear benefits.
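As a hedged illustration of how such an information-content comparison might be set up, the sketch below uses the Shannon information content, 0.5 ln(|B|/|A|), a common (though here assumed) choice of metric, for an idealised system with a prescribed observation-error correlation matrix versus its diagonal approximation.

```python
import numpy as np

def shannon_info_content(B, H, R):
    """Shannon information content 0.5*ln(|B|/|A|), where the analysis error
    covariance is A = (B^-1 + H^T R^-1 H)^-1 for a linear observation operator H."""
    A_inv = np.linalg.inv(B) + H.T @ np.linalg.inv(R) @ H
    _, logdet_A_inv = np.linalg.slogdet(A_inv)
    _, logdet_B = np.linalg.slogdet(B)
    return 0.5 * (logdet_B + logdet_A_inv)   # ln|A| = -ln|A^-1|

# Idealised setup: correlated observation errors versus a diagonal approximation
n = 20
B = np.eye(n)                                # background error covariance
H = np.eye(n)                                # direct observation of each state element
rho = 0.5 ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
R_true = 0.25 * rho                          # correlated observation-error covariance
R_diag = np.diag(np.diag(R_true))            # correlations ignored

print(shannon_info_content(B, H, R_true))    # information with correct correlations
print(shannon_info_content(B, H, R_diag))    # information when errors treated as uncorrelated
```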

Relevance: 10.00%

Abstract:

We investigate the question of how many facets are needed to represent the energy balance of an urban area by developing simplified 3-, 2- and 1-facet versions of a 4-facet energy balance model of two-dimensional streets and buildings. The 3-facet model simplifies the 4-facet model by averaging over the canyon orientation, which results in similar net shortwave and longwave balances for both wall facets, but maintains the asymmetry in the heat fluxes within the street canyon. For the 2-facet model, on the assumption that the wall and road temperatures are equal, the road and wall facets can be combined mathematically into a single street-canyon facet with effective values of the heat transfer coefficient, albedo, emissivity and thermodynamic properties, without further approximation. The 1-facet model requires the additional assumption that the roof temperature is also equal to the road and wall temperatures. Idealised simulations show that the geometry and material properties of the walls and road lead to a large heat capacity of the combined street canyon, whereas the roof behaves like a flat surface with low heat capacity. This means that the magnitudes of the diurnal temperature variations of the street-canyon facets are broadly similar and much smaller than the diurnal temperature variation of the roof facet. Consequently, the approximation that the street-canyon facets have similar temperatures is sound, and the road and walls can be combined into a single facet. The roof behaves very differently and a separate roof facet is required. As a result, the 2-facet model performs similarly to the 4-facet model, while the 1-facet model does not. The models are compared with previously published observations collected in Mexico City. Although the 3- and 2-facet models perform better than the 1-facet model, the present models are unable to represent the phase of the sensible heat flux. This result is consistent with previous model comparisons, and we argue that this feature of the data cannot be produced by a single-column model. We conclude that a 2-facet model is necessary, and for numerical weather prediction sufficient, to model an urban surface, and that this conclusion is robust and therefore applicable to more general geometries.
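One way to picture the combination step (illustrative only; the abstract does not give the paper's effective-parameter definitions, and the notation below is chosen here) is as area-weighted averaging of the road and wall properties under the equal-temperature assumption:

```latex
% Illustrative area-weighted effective parameters for a single combined
% street-canyon facet, assuming T_road = T_wall (notation chosen here)
\[
\alpha_{\mathrm{can}} = \frac{A_{r}\,\alpha_{r} + A_{w}\,\alpha_{w}}{A_{r} + A_{w}},
\qquad
(\rho c\,d)_{\mathrm{can}} = \frac{A_{r}\,(\rho c\,d)_{r} + A_{w}\,(\rho c\,d)_{w}}{A_{r} + A_{w}},
\]
```

with analogous expressions for the emissivity and the heat transfer coefficient, where A_r and A_w are the road and wall areas per unit plan area.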

Relevance: 10.00%

Abstract:

The soil microflora is very heterogeneous in its spatial distribution. The origins of this heterogeneity and its significance for soil function are not well understood. A problem for better understanding spatial variation is the assumption of statistical stationarity that is made in most of the statistical methods used to assess it. These assumptions are made explicit in the geostatistical methods that have been increasingly used by soil biologists in recent years. Geostatistical methods are powerful, particularly for local prediction, but they require the assumption that the variability of a property of interest is spatially uniform, which is not always plausible given what is known about the complexity of the soil microflora and the soil environment. We have used the wavelet transform, a relatively recent development in mathematical analysis, to investigate the spatial variation of the abundance of Azotobacter in the soil of a typical agricultural landscape. The wavelet transform entails no assumptions of stationarity and is well suited to the analysis of variables that show intermittent or transient features at different spatial scales. In this study, we computed cross-variograms of Azotobacter abundance with the pH, water content and loss on ignition of the soil. These revealed scale-dependent covariation in all cases. The wavelet transform also showed that the correlation of Azotobacter abundance with all three soil properties depended on spatial scale; the correlation generally increased with spatial scale and was only significantly different from zero at some scales. However, the wavelet analysis also allowed us to show how the correlation changed across the landscape. For example, at one scale Azotobacter abundance was strongly correlated with pH in part of the transect, and not with soil water content, but this was reversed elsewhere on the transect. The results show how scale-dependent variation of potentially limiting environmental factors can induce a complex spatial pattern of abundance in a soil organism. The geostatistical methods that we used here make assumptions that are not consistent with the spatial changes in the covariation of these properties that our wavelet analysis has shown. This suggests that the wavelet transform is a powerful tool for future investigation of the spatial structure and function of soil biota.
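A minimal sketch of a scale-by-scale correlation analysis is given below, using the discrete wavelet transform from the PyWavelets library and hypothetical transect data; this is only a rough analogue of the wavelet correlation analysis described in the abstract, not the authors' actual procedure.

```python
import numpy as np
import pywt  # PyWavelets

def scalewise_correlation(x, y, wavelet="haar", level=4):
    """Correlate the detail coefficients of two equal-length transects scale by
    scale, as a rough analogue of a wavelet correlation analysis."""
    cx = pywt.wavedec(np.asarray(x, dtype=float), wavelet, level=level)
    cy = pywt.wavedec(np.asarray(y, dtype=float), wavelet, level=level)
    # index 0 holds approximation coefficients; indices 1..level hold details
    return [float(np.corrcoef(dx, dy)[0, 1]) for dx, dy in zip(cx[1:], cy[1:])]

# Hypothetical transect data, e.g. Azotobacter abundance and soil pH at 256 points
rng = np.random.default_rng(0)
abundance = rng.normal(size=256).cumsum()
ph = 0.5 * abundance + rng.normal(size=256).cumsum()
print(scalewise_correlation(abundance, ph))  # one correlation per scale (coarse to fine)
```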

Relevance: 10.00%

Abstract:

Pulses of potassium (K+) applied to columns of repacked calcium (Ca2+)-saturated soil were leached with distilled water or calcium chloride (CaCl2) solutions of various concentrations at a rate of 12 mm h⁻¹. With increased Ca2+ concentration, the rate of movement of K+ increased, as did the concentration of K+ in the displaced pulse, which was less dispersed. The movement of K+ in calcite-amended soil leached with water was at a similar rate to that of the untreated soil leached with 1 mM CaCl2, and in soil containing gypsum, movement was similar to that leached with 15 mM CaCl2. The Ca2+ concentrations in the leachates were about 0.4 and 15 mM respectively, the expected values for the dissolution of the two amendments. Soil containing native K+ was leached with distilled water or CaCl2 solutions. The amount of K+ leached increased as the Ca2+ concentration increased, with up to 34% of the exchangeable K+ being removed in five pore volumes of 15 mM CaCl2. Soil amended with calcite and leached with water lost K+ at a rate between that for leaching the unamended soil with 1 mM CaCl2 and that with water. Soil containing gypsum and leached with water lost K+ at a similar rate to unamended soil leached with 15 mM CaCl2. The presence of Ca2+ in irrigation water and of soil minerals able to release Ca2+ is important in determining the amounts of K+ leached from soils. The LEACHM model approximately predicted the displacement of K+, and was more accurate with higher concentrations of displacing solution. The shortcomings of this model are its inability to account for rate-controlled processes and its assumption that K+:Ca2+ exchange during leaching can be described using a constant adsorption coefficient. As a result, the pulse is predicted to appear a little earlier, and the trailing edge has less of a tail, than that measured. In practical agriculture, the model will be more useful in soils containing gypsum or leached with saline water than in either calcareous or non-calcareous soils leached with rainwater.
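The "constant adsorption coefficient" assumption criticised above can be written schematically as a constant retardation factor in the convection-dispersion equation; this is a generic linear-exchange transport model with notation chosen here, not the LEACHM equations themselves.

```latex
% Generic convection–dispersion transport with a constant linear adsorption
% coefficient K_d (illustrative; not the LEACHM formulation)
\[
R\,\frac{\partial c}{\partial t}
  = D\,\frac{\partial^{2} c}{\partial z^{2}} - v\,\frac{\partial c}{\partial z},
\qquad
R = 1 + \frac{\rho_{b}K_{d}}{\theta},
\]
```

where c is the solution concentration of K+, v the pore-water velocity, D the dispersion coefficient, ρ_b the soil bulk density and θ the volumetric water content. Holding K_d constant amounts to assuming that K+:Ca2+ exchange does not change with the composition of the displacing solution, which contributes to the predicted pulse appearing a little earlier and with less tailing than measured.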

Relevance: 10.00%

Abstract:

The past 15 years have witnessed the rise of post-development theory as a means of understanding the development discourse since the 1940s. Post-development argues that intentional development (as distinct from immanent development - what people are doing anyway) is a construct of Western hegemony. Sustainable development, they argue, is no different and indeed is perhaps worse, given that most global environmental degradation has been driven by consumerism and industrialization in the West. Critics of post-development counter by stating that it provides only destruction, tearing apart what is currently practised in 'development' without providing an alternative. When post-developmentalists do offer an alternative, it typically amounts to little more than a call for more grassroots involvement in development and disengagement from a Western agenda. Post-sustainable development analysis and counter-analysis has received remarkably little attention within the sustainable development literature, yet this paper argues that it can make a positive contribution by calling for an analysis of discourse rather than a hiding of power differentials and an assumption that consensus must exist within a community. A case is made for a post-sustainable development that acknowledges that diversity will exist and consensus may not be achievable, but that at the same time participation can help with learning. The role of the expert within sustainable development is also discussed.

Relevance: 10.00%

Abstract:

Development geography has long sought to understand why inequalities exist and the best ways to address them. Dependency theory sets out an historical rationale for underdevelopment based on colonialism and a legacy of developed core and under-developed periphery. Race is relevant in this theory only insofar as Europeans are white and the places they colonised were occupied by people with darker skin colour. There are no innate biological reasons why it happened in that order. However, a new theory of national inequalities proposed by Lynn and Vanhanen in a series of publications makes the case that poorer countries have that status because of a poorer genetic stock rather than an accident of history. They argue that IQ has a genetic basis and that IQ is linked to ability. Thus races with a poorer IQ have less ability, and national IQ can therefore be positively correlated with performance as measured by an indicator like GDP/capita. Their thesis is one of despair, as little can be done to improve genetic stock significantly other than a programme of eugenics. This paper summarises and critiques the Lynn and Vanhanen hypothesis and the assumptions upon which it is based, and uses this analysis to show how a human desire to simplify in order to manage can be dangerous in development geography. While attention may naturally focus on the 'national IQ' variables as a proxy measure of 'innate ability', the assumption of GDP per capita as an indicator of 'success' and 'achievement' is far more readily accepted without criticism. The paper makes the case that the current vogue for indicators, indices and cause-effect can be tyrannical.

Relevance: 10.00%

Abstract:

The yam minisett technique (YMT) has been promoted throughout West Africa since the 1980s as a sustainable means of producing clean yam planting material, but adoption of the technique is often reported as being patchy at best. While there has been much research on the factors that influence adoption of the technique, there have been no attempts to assess its economic viability under 'farmer-managed' as distinct from 'on-station' conditions. The present paper describes the results of farmer-managed trials employing the YMT (white yam: Dioscorea rotundata) at two villages in Igalaland, Kogi State, Nigeria. One of the villages (Edeke) is on the banks of the River Niger and represents a specialist yam environment, whereas the other village (Ekwuloko) is inland, where farmers employ a more general cropping system. Four farmers were selected in each of the two villages and asked to plant a trial comprising two varieties of yam, their popular local variety as well as another variety grown in other parts of Igalaland, and to treat yam setts (80-100 g) with either woodash or an insecticide/nematicide + fungicide mix (chemical treatment). Results suggest that while chemical sett treatment increased yield and hence gross margin compared with woodash, if household labour is costed then the YMT is not economically viable. However, the specialist yam growers of Edeke were far more positive about the use of the YMT, as they tended to keep the yam seed tubers for planting rather than sell them. Thus, great care needs to be taken when planning adoption surveys on the assumption that all farmers should adopt a technology.
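The viability comparison described above rests on a standard gross-margin calculation; in schematic form, with notation chosen here for illustration rather than taken from the paper:

```latex
% Schematic gross margin per plot (illustrative notation, not from the paper)
\[
\mathrm{GM} \;=\; p_{y}\,Y \;-\;
\bigl(C_{\mathrm{setts}} + C_{\mathrm{treatment}} + C_{\mathrm{hired}}
      \;[\,+\, w\,L_{\mathrm{household}}\,]\bigr)
\]
```

where p_y is the yam price, Y the yield, and the bracketed term is the imputed cost of household labour (hours L_household valued at a wage w); it is the inclusion of that imputed cost that renders the YMT economically unviable in these trials.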

Relevance: 10.00%

Abstract:

We consider the problem of determining the pressure and velocity fields for a weakly compressible fluid flowing in a two-dimensional reservoir in an inhomogeneous, anisotropic porous medium, with vertical side walls and variable upper and lower boundaries, in the presence of vertical wells injecting or extracting fluid. Numerical solution of this problem may be expensive, particularly in the case that the depth scale of the layer h is small compared to the horizontal length scale l. This is a situation which occurs frequently in the application to oil reservoir recovery. Under the assumption that epsilon=h/l<<1, we show that the pressure field varies only in the horizontal direction away from the wells (the outer region). We construct two-term asymptotic expansions in epsilon in both the inner (near the wells) and outer regions and use the asymptotic matching principle to derive analytical expressions for all significant process quantities. This approach, via the method of matched asymptotic expansions, takes advantage of the small aspect ratio of the reservoir, epsilon, at precisely the stage where full numerical computations become stiff, and also reveals the detailed structure of the dynamics of the flow, both in the neighborhood of wells and away from wells.
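The small-aspect-ratio structure can be sketched as follows (a schematic steady Darcy-flow balance with notation chosen here, not the paper's exact equations). Scaling the horizontal coordinate with l and the vertical coordinate with h, the dimensionless pressure equation away from the wells becomes

```latex
% Schematic leading-order balance for flow in a thin reservoir (illustrative)
\[
\epsilon^{2}\,\frac{\partial}{\partial x}\!\left(k_{x}\,\frac{\partial p}{\partial x}\right)
+ \frac{\partial}{\partial z}\!\left(k_{z}\,\frac{\partial p}{\partial z}\right) = 0,
\qquad \epsilon = \frac{h}{l} \ll 1,
\]
```

so at leading order ∂/∂z(k_z ∂p_0/∂z) = 0, and with impermeable upper and lower boundaries the leading-order pressure p_0 depends on the horizontal coordinate alone, consistent with the outer solution described in the abstract; the inner (near-well) expansion restores the vertical structure and is matched to this outer field.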