992 results for consistent value
Abstract:
Status signals function in a number of species to communicate competitive ability to conspecific rivals during competition for resources. In the paper wasp Polistes dominulus, variable black clypeal patterns are thought to be important in mediating competition among females. Results of previous behavioral experiments in the lab indicate that P. dominulus clypeal patterns provide information about an individual's competitive ability to rivals during agonistic interactions. To date, however, there has been no detailed examination of the adaptive value of clypeal patterns in the wild. To address this, we looked for correlations between clypeal patterning and various fitness measures, including reproductive success, hierarchical rank, and survival, in a large, free-living population of P. dominulus in southern Spain. Reproductive success over the nesting season was not correlated with clypeal patterning. Furthermore, there was no relationship between a female's clypeal patterning and the rank she achieved within the hierarchy or her survival during nest founding. Overall, we found no evidence that P. dominulus clypeal patterns are related to competitive ability or other aspects of quality in our population. This result is consistent with geographical variation in the adaptive value of clypeal patterns between P. dominulus populations; however, data on the relationship between patterning and fitness from other populations are required to test this hypothesis.
Abstract:
Current research agendas are increasingly encouraging the construction industry to operate on the basis of 'added value'. Such debates echo the established concept of 'high value manufacturing' and associated trends towards servitization. Within construction, the so-called 'value agenda' draws heavily from the notion of integrated solutions. This is held to be especially appropriate in the context of PFI projects. Also relevant is the concept of service-led projects whereby the project rationale is driven by the client's objectives for delivering an enhanced service to its own customers. Such ideas are contextualized by a consideration of broader trends of privatization and outsourcing within and across the construction industry's client base. The current emphasis on integrated solutions reflects long-term trends within privatized client organizations towards the outsourcing of asset management capabilities. However, such trends are by no means uniform or consistent. An in-depth case study of three operating divisions within a major construction company illustrates that firms are unlikely to reorientate their business in response to the 'value agenda'. In the case of PFI, the tendency has been to establish specialist units for the purposes of winning work. Meanwhile, institutionally embedded operating routines within the rest of the business remain broadly unaffected.
Abstract:
The effect of variety, agronomic and environmental factors on the chemical composition and energy value for ruminants and non-ruminants of husked and naked oat grain was studied. Winter oats were grown as experimental plots in each of 2 years on three sites in England. At each site two conventional husked oat cultivars (Gerald and Image) and two naked cultivars (Kynon and Pendragon) were grown. At each site, crops were sown on two dates and all crops were grown with the application of either zero or optimum fertiliser nitrogen. Variety and factors contained within the site + year effect had the greatest influence on the chemical composition and nutritive value of oats, followed by nitrogen fertiliser treatment. For example, compared with zero nitrogen, the optimum nitrogen fertiliser treatment resulted in a consistent and significant (P < 0.001) increase in crude protein for all varieties at all sites from an average of 95 to 118 g kg(-1) DM, increased the potassium concentration in all varieties from an average of 4.9 to 5.1 g kg(-1) DM (P < 0.01) and reduced total lipid by a small but significant (P < 0.001) amount. Optimum nitrogen increased (P < 0.001) the NDF concentration in the two husked varieties and in the naked variety Pendragon. Naked cultivars were lower in fibre and considerably higher in energy, total lipid, linoleic acid, protein, starch and essential amino acids than the husked cultivars. Thus nutritionists need to be selective in their choice of naked or husked oats depending on the intended dietary use. (C) 2004 Elsevier B.V. All rights reserved.
Abstract:
The Boltzmann equation in the presence of boundary and initial conditions, which describes the general case of carrier transport in microelectronic devices, is analysed in terms of Monte Carlo theory. The classical Ensemble Monte Carlo algorithm, previously devised from merely phenomenological considerations of the initial and boundary carrier contributions, is now derived in a formal way. The approach allows us to suggest a set of event-biasing algorithms for statistical enhancement as an alternative to the population control technique, which is virtually the only algorithm currently used in particle simulators. The scheme of the self-consistent coupling of the Boltzmann and Poisson equations is considered for the case of weighted particles. It is shown that the particles survive the successive iteration steps.
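To make the idea of event biasing with statistical weights concrete, here is a minimal Python sketch under assumed parameters. It is not the carrier-transport estimator analysed in the paper; it only illustrates the generic principle such biasing algorithms rely on: sample a biased event distribution so that rare outcomes occur more often, and compensate each sample with a weight equal to the ratio of the natural to the biased probability density.

```python
# Minimal sketch of event biasing with statistical weights (illustrative only):
# estimate a rare tail probability of an exponential "free-flight" time by sampling
# from a longer-tailed (biased) exponential and weighting by the density ratio.
import numpy as np

rng = np.random.default_rng(1)
rate_nat, rate_bias, threshold, n = 1.0, 0.2, 8.0, 100_000

# plain estimator: fraction of naturally sampled flight times above the threshold
t_nat = rng.exponential(1.0 / rate_nat, n)
p_plain = np.mean(t_nat > threshold)

# event-biased estimator: oversample long flights, then carry compensating weights
t_bias = rng.exponential(1.0 / rate_bias, n)
weights = (rate_nat * np.exp(-rate_nat * t_bias)) / (rate_bias * np.exp(-rate_bias * t_bias))
p_biased = np.mean(weights * (t_bias > threshold))

print(f"exact tail probability : {np.exp(-rate_nat * threshold):.3e}")
print(f"plain Monte Carlo      : {p_plain:.3e}")
print(f"event-biased estimate  : {p_biased:.3e}")
```

For the same sample size, the weighted estimator typically has a much smaller variance, which is the kind of statistical enhancement such algorithms aim for.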
Abstract:
Numerous studies have documented the failure of the static and conditional capital asset pricing models to explain the difference in returns between value and growth stocks. This paper examines the post-1963 value premium by employing a model that captures the time-varying total risk of the value-minus-growth portfolios. Our results show that the time series of value premia is strongly and positively correlated with its volatility. This conclusion is robust to the criterion used to sort stocks into value and growth portfolios and to the countries under review (the US and the UK). Our paper is consistent with evidence on the possible role of idiosyncratic risk in explaining equity returns, and also with a separate strand of literature concerning the relative lack of reversibility of value firms' investment decisions.
Abstract:
This paper discusses concepts of value from the point of view of the user of the space and the counter view of the provider of the same. Land and property are factors of production. The value of the land flows from the use to which it is put, and that in turn is dependent upon the demand (and supply) for the product or service that is produced/provided from that space. If there is a high demand for the product (at a fixed level of supply), the price will increase and the economic rent for the land/property will increase accordingly. This is the underlying paradigm of Ricardian rent theory, where the supply of land is fixed and a single good is produced. In such a case the rent of land is wholly an economic rent. Economic theory generally distinguishes between two kinds of price: price of production or "value in use" (as determined by the labour theory of value), and market price or "value in exchange" (as determined by supply and demand). It is based on a coherent and consistent theory of value and price. Effectively the distinction is between what space is 'worth' to an individual and that space's price of exchange in the marketplace. In a perfect market, where any individual has access to the same information as all others in the market, price and worth should coincide. However, in a market where access to information is not uniform, and where different uses compete for the same space, it is more likely that the two figures will diverge. This paper argues that the traditional reliance of valuers on methods of comparison to determine "price" has led to an artificial divergence of "value in use" and "value in exchange"; now that such comparisons are becoming more difficult due to the diversity of lettings in the marketplace, there will be a requirement to return to fundamentals and pay heed to the thought process of the user in assessing the worth of the space to be let.
Abstract:
Using monthly time-series data 1999-2013, the paper shows that markets for agricultural commodities provide a yardstick for real purchasing power, and thus a reference point for the real value of fiat currencies. The daily need for each adult to consume about 2800 food calories is universal; data from FAO food balance sheets confirm that the world basket of food consumed daily is non-volatile in comparison to the volatility of currency exchange rates, and so the replacement cost of food consumed provides a consistent indicator of economic value. Food commodities are storable for short periods, but ultimately perishable, and this exerts continual pressure for markets to clear in the short term; moreover, food calories can be obtained from a very large range of foodstuffs, and so most households are able to use arbitrage to select a near optimal weighting of quantities purchased. The paper proposes an original method to enable a standard of value to be established, definable in physical units on the basis of actual worldwide consumption of food goods, with an illustration of the method.
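As a purely illustrative sketch of the yardstick idea, and not the paper's actual method or data, the following Python snippet computes the local-currency replacement cost of a fixed 2800-kcal daily basket and the implied purchasing power per currency unit; the basket composition, calorie contents and prices are made-up assumptions.

```python
# Toy illustration: replacement cost of a fixed daily food-energy basket as a
# purchasing-power yardstick. All basket shares and prices are invented numbers.
DAILY_KCAL = 2800  # approximate daily energy need per adult

# kcal per kg, local price per kg, and share of daily calories for each item (assumed)
basket = {
    "wheat flour":   {"kcal_per_kg": 3640, "price_per_kg": 0.90, "share": 0.45},
    "rice":          {"kcal_per_kg": 3600, "price_per_kg": 1.10, "share": 0.30},
    "vegetable oil": {"kcal_per_kg": 8840, "price_per_kg": 2.40, "share": 0.15},
    "pulses":        {"kcal_per_kg": 3400, "price_per_kg": 1.80, "share": 0.10},
}

def daily_basket_cost(basket):
    """Local-currency cost of one day's calories under the given basket weighting."""
    cost = 0.0
    for item in basket.values():
        kcal_needed = DAILY_KCAL * item["share"]
        kg_needed = kcal_needed / item["kcal_per_kg"]
        cost += kg_needed * item["price_per_kg"]
    return cost

cost = daily_basket_cost(basket)
print(f"daily basket cost : {cost:.2f} local currency units")
print(f"purchasing power  : {DAILY_KCAL / cost:.0f} kcal per currency unit")
```

Tracking this cost over time, or across currencies at market exchange rates, is the kind of comparison a food-calorie standard of value would make possible.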
Abstract:
Simple predator–prey models with a prey-dependent functional response predict that enrichment (increased carrying capacity) destabilizes community dynamics: this is the 'paradox of enrichment'. However, the energy value of prey is very important in this context. The intraspecific chemical composition of prey species determines its energy value as a food for the potential predator. Theoretical and experimental studies establish that variable chemical composition of prey affects the predator–prey dynamics. Recently, experimental and theoretical attempts have been made to incorporate explicitly the stoichiometric heterogeneity of simple predator–prey systems. Following the results of these experimental and theoretical advances, in this article we propose a simple phenomenological formulation of the variation of energy value at increased levels of carrying capacity. Results of our study demonstrate that coupling the parameters representing the phenomenological energy value and carrying capacity in a realistic way may avoid destabilization of community dynamics following enrichment. Additionally, under such coupling the producer–grazer system persists only for an intermediate zone of production, a result consistent with recent studies. We suggest that, while addressing the issue of enrichment in a general predator–prey model, the phenomenological relationship that we propose here might be applicable to avoid Rosenzweig's paradox.
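As a minimal numerical sketch of such a coupling, under assumed functional forms rather than the paper's own formulation, the following Python snippet integrates a Rosenzweig-MacArthur predator–prey model in which the conversion efficiency (the prey's energy value) declines as the carrying capacity K is raised; all parameter values are illustrative.

```python
# Illustrative sketch: Rosenzweig-MacArthur model with the prey's energy value
# (conversion efficiency e) coupled to the carrying capacity K. The exponential
# decline of e with K is an assumed phenomenological form, not the paper's.
import numpy as np
from scipy.integrate import solve_ivp

r, a, h, m = 1.0, 1.0, 0.5, 0.3   # prey growth rate, attack rate, handling time, predator mortality
e_max, k = 0.8, 0.05              # maximal conversion efficiency and its decay with K (assumed)

def rhs(t, y, K):
    prey, pred = y
    e = e_max * np.exp(-k * K)               # energy value falls as enrichment (K) rises
    feeding = a * prey / (1 + a * h * prey)  # Holling type-II functional response
    dprey = r * prey * (1 - prey / K) - feeding * pred
    dpred = e * feeding * pred - m * pred
    return [dprey, dpred]

t_eval = np.linspace(0, 500, 2001)
for K in (0.4, 10.0, 50.0):
    sol = solve_ivp(rhs, (0, 500), [0.2, 0.1], args=(K,), t_eval=t_eval, rtol=1e-8)
    late = t_eval > 250                      # average over the second half of the run
    print(f"K = {K:5.1f}  mean prey = {sol.y[0, late].mean():.3f}  mean grazer = {sol.y[1, late].mean():.3f}")
```

With these assumed numbers the grazer maintains a positive average density only at the intermediate carrying capacity, echoing the intermediate zone of production described above.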
Abstract:
Liquid Chromatography Mass Spectrometry (LC-MS) was used to obtain glucosinolate and flavonol content for 35 rocket accessions and commercial varieties. Thirteen glucosinolates and 11 flavonol compounds were identified. Semi-quantitative methods were used to estimate concentrations of both groups of compounds. Minor glucosinolate composition differed between accessions, and concentrations varied significantly. Flavonols showed differentiation between genera, with Diplotaxis accumulating quercetin glucosides and Eruca accumulating kaempferol glucosides. Several compounds were detected in each genus that have only previously been reported in the other. We highlight how knowledge of phytochemical content and concentration can be used to breed new, nutritionally superior varieties. We also demonstrate the effects of controlled environment conditions on the accumulation of glucosinolates and flavonols and explore the reasons for differences with previous studies. We stress the importance of consistent experimental design between research groups to effectively compare and contrast results.
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
A self-consistent equilibrium calculation, valid for arbitrary aspect ratio tokamaks, is obtained through a direct variational technique that reduces the equilibrium solution, in general obtained from the 2D Grad-Shafranov equation, to a 1D problem in the radial flux coordinate rho. The plasma current profile is assumed to have contributions from the diamagnetic, Pfirsch-Schlüter, and the neoclassical ohmic and bootstrap currents. An iterative procedure is introduced into our code until the flux surface averaged toroidal current density ⟨J_T⟩ converges to within a specified tolerance for a given pressure profile and prescribed boundary conditions. The convergence criterion is applied between the ⟨J_T⟩ profile used to calculate the equilibrium through the variational procedure and the one that results from the equilibrium, given by the sum of all current components. The ohmic contribution is calculated from the neoclassical conductivity and from the self-consistently determined loop voltage in order to give the prescribed value of the total plasma current. The bootstrap current is estimated through the full matrix Hirshman-Sigmar model with the viscosity coefficients as proposed by Shaing, which are valid in all plasma collisionality regimes and arbitrary aspect ratios. The results of the self-consistent calculation are presented for the low aspect ratio tokamak Experimento Tokamak Esferico. A comparison among different models for the bootstrap current estimate is also performed and their possible limitations in the self-consistent calculation are analysed.
Abstract:
We derive the equation of state of nuclear matter for the quark-meson coupling model, taking into account quantum fluctuations of the σ meson as well as vacuum polarization effects for the nucleons. This model incorporates explicitly quark degrees of freedom, with quarks coupled to the scalar and vector mesons. Quantum fluctuations lead to a softer equation of state for nuclear matter, giving a lower value of the incompressibility than would be reached without quantum effects. The in-medium nucleon and σ-meson masses are also calculated in a self-consistent manner. The spectral function of the σ meson is calculated, and at high densities the σ mass is increased with respect to the purely classical approximation.
Abstract:
Big game can damage crops and compete with livestock for valuable forage. Ranchers have reported that their tolerance for big game would increase if the animals could be prevented from using key areas critical for spring livestock use. Likewise, some farmers have high-value areas that must be protected. Fences provide the most consistent long-term control compared to other deterrent methods, but are costly to erect. Many designs of woven wire and electric fences are currently used. Costs of erecting deer-proof fencing could be greatly reduced if an existing fence could be modified instead of being replaced entirely. This study investigates the possibility of modifying existing fences to prohibit deer and elk crossings. Preliminary results indicate effective modifications can be made to existing fences for $1,300 to $3,500 per mile for materials. Traditional complete construction of game fences costs more than $10,000 per mile. These fences may be used in lieu of compensation programs for ranchers. Also, if farmers and ranchers can keep big game out of important foraging areas, their tolerance for these animals on the rest of their property may greatly increase.
Abstract:
Purpose - The purpose of this paper is to develop an efficient numerical algorithm for the self-consistent solution of the Schrodinger and Poisson equations in one-dimensional systems. The goal is to compute the charge-control and capacitance-voltage characteristics of quantum wire transistors. Design/methodology/approach - The paper presents a numerical formulation employing a non-uniform finite difference discretization scheme, in which the wavefunctions and electronic energy levels are obtained by solving the Schrodinger equation through the split-operator method, while a relaxation method in the FTCS ("Forward Time Centered Space") scheme is used to solve the two-dimensional Poisson equation. Findings - The numerical model is validated against previously published results as a benchmark and then applied to yield the charge-control characteristics and the capacitance-voltage relationship for a split-gate quantum wire device. Originality/value - The paper helps to fulfill the need for C-V models of quantum wire devices. To do so, the authors implemented a straightforward calculation method for the two-dimensional electronic carrier density n(x,y). The formulation reduces the computational procedure to a much simpler problem, similar to the one-dimensional quantization case, significantly diminishing running time.
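For illustration only, the following heavily simplified sketch shows the outer self-consistent loop in one dimension, using a dense finite-difference eigensolver, a direct Poisson solve, and potential mixing; the paper's actual scheme uses the split-operator method for the Schrodinger equation and an FTCS relaxation for the two-dimensional Poisson equation, and the material parameters and single-subband closure below are assumptions.

```python
# Toy 1D self-consistent Schrodinger-Poisson loop (illustrative assumptions throughout;
# not the split-operator / FTCS implementation described in the paper).
import numpy as np

hbar = 1.054e-34                      # J s
m_e = 9.11e-31 * 0.067                # effective mass, GaAs-like (assumed)
q, eps = 1.602e-19, 8.854e-12 * 12.9  # elementary charge, permittivity (assumed)

N, L = 200, 40e-9                     # grid points, well width
x = np.linspace(0.0, L, N)
dx = x[1] - x[0]
n_dope = 1e23 * np.ones(N)            # uniform donor density in m^-3 (assumed)

def solve_schrodinger(U):
    """Eigenstates of -(hbar^2/2m) d2/dx2 + U with hard-wall boundaries."""
    main = hbar**2 / (m_e * dx**2) + U
    off = -hbar**2 / (2 * m_e * dx**2) * np.ones(N - 1)
    H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    E, psi = np.linalg.eigh(H)
    return E, psi / np.sqrt(dx)       # normalise so that sum |psi|^2 dx = 1

def solve_poisson(rho):
    """Direct solve of d2V/dx2 = -rho/eps with V = 0 at both contacts."""
    A = (np.diag(-2.0 * np.ones(N)) + np.diag(np.ones(N - 1), 1) + np.diag(np.ones(N - 1), -1)) / dx**2
    return np.linalg.solve(A, -rho / eps)

V = np.zeros(N)
for it in range(200):
    E, psi = solve_schrodinger(-q * V)              # electron potential energy is -qV
    n = (n_dope.sum() * dx) * np.abs(psi[:, 0])**2  # all carriers in the ground subband (toy closure)
    V_new = solve_poisson(q * (n_dope - n))
    err = np.max(np.abs(V_new - V))
    V = 0.7 * V + 0.3 * V_new                       # under-relaxation (mixing) for stability
    if err < 1e-6:
        break
print(f"stopped after {it + 1} iterations, potential residual {err:.2e} V")
```

A production scheme such as the one in the paper replaces these pieces with a split-operator propagator and a 2D FTCS relaxation, but the self-consistent outer iteration is the common ingredient.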
Abstract:
This thesis is motivated by biological questions concerning the behaviour of membrane potentials in neurons. A widely studied model for spiking neurons is the following. Between spikes, the membrane potential behaves like a diffusion process X given by the SDE dX_t = beta(X_t) dt + sigma(X_t) dB_t, where (B_t) denotes a standard Brownian motion. Spikes are explained as follows: as soon as the potential X exceeds a certain excitation threshold S, a spike occurs, after which the potential is reset to a fixed value x_0. In applications it is sometimes possible to observe a diffusion process X between the spikes and to estimate the coefficients beta() and sigma() of the SDE. Nevertheless, the thresholds x_0 and S must still be determined in order to specify the model. One way to approach this problem is to treat x_0 and S as parameters of a statistical model and to estimate them. This thesis discusses four different cases, in which the membrane potential X between spikes is assumed to be a Brownian motion with drift, a geometric Brownian motion, an Ornstein-Uhlenbeck process, or a Cox-Ingersoll-Ross process. In addition, we observe the times between consecutive spikes, which we interpret as iid hitting times of the threshold S by X started at x_0. The first two cases are very similar, and in each of them the maximum likelihood estimator can be given explicitly; using LAN theory, the optimality of these estimators is also shown. In the OU and CIR cases we choose a minimum distance method based on comparing the empirical and the true Laplace transform with respect to a Hilbert space norm. We prove that all estimators are strongly consistent and asymptotically normally distributed. In the final chapter the efficiency of the minimum distance estimators is examined on simulated data; furthermore, applications to real data sets and their results are discussed in detail.
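As a hedged illustration of the estimation problem in the simplest of the four cases, suppose the sub-threshold potential is a Brownian motion with known drift b > 0 and volatility sigma; the hitting time of S from x_0 then follows an inverse Gaussian law that depends only on the distance d = S - x_0, so in this reduced setting only d is identifiable from the inter-spike intervals. The following Python sketch simulates iid intervals and recovers d by maximum likelihood; the parameter values are arbitrary and this is not the thesis's implementation.

```python
# Illustrative sketch: ML estimation of the threshold distance d = S - x_0 from iid
# inter-spike intervals, assuming Brownian motion with drift between spikes. The
# hitting time is inverse Gaussian with mean d/b and shape d^2/sigma^2.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import invgauss

b, sigma, d_true = 0.5, 1.0, 3.0        # drift, volatility, true threshold distance (assumed)
n = 500                                 # number of observed inter-spike intervals
rng = np.random.default_rng(0)

# simulate hitting times; scipy's invgauss(mu, scale=lam) is IG with mean mu*lam and shape lam
mean, lam = d_true / b, d_true**2 / sigma**2
intervals = invgauss.rvs(mean / lam, scale=lam, size=n, random_state=rng)

def neg_log_lik(d):
    mu, lam = d / b, d**2 / sigma**2
    t = intervals
    return -np.sum(0.5 * np.log(lam / (2 * np.pi * t**3))
                   - lam * (t - mu)**2 / (2 * mu**2 * t))

d_hat = minimize_scalar(neg_log_lik, bounds=(1e-3, 20.0), method="bounded").x
print(f"true distance d = {d_true},  ML estimate = {d_hat:.3f}")
```

For the Ornstein-Uhlenbeck and Cox-Ingersoll-Ross cases the hitting-time law has no comparably simple closed form, which is why the thesis turns to a minimum distance criterion based on Laplace transforms.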