34 results for distribution (probability theory)

in Deakin Research Online - Australia


Relevance:

100.00%

Publisher:

Abstract:

Any attempt to model an economy requires foundational assumptions about the relations between prices, values and the distribution of wealth. These assumptions exert a profound influence over the results of any model. Unfortunately, there are few areas in economics as vexed as the theory of value. I argue in this paper that the fundamental problem with past theories of value is that it is simply not possible to model the determination of value, the formation of prices and the distribution of income in a real economy with analytic mathematical models. All such attempts leave out crucial processes or make unrealistic assumptions which significantly affect the results. There have been two primary approaches to the theory of value. The first, associated with classical economists such as Ricardo and Marx, comprises substance theories of value, which view value as a substance inherent in an object and conserved in exchange. For Marxists, the value of a commodity derives solely from the value of the labour power used to produce it - and therefore any profit is due to the exploitation of the workers. The labour theory of value has been discredited because of its assumption that labour was the only ‘factor’ that contributed to the creation of value, and because of its fundamentally circular argument. Neoclassical theorists argued that price was identical with value and was determined purely by the interaction of supply and demand. Value, then, was completely subjective. Returns to labour (wages) and capital (profits) were determined solely by their marginal contribution to production, so that each factor received its just reward by definition.
Problems with the neoclassical approach include assumptions concerning representative agents, perfect competition, perfect and costless information and contract enforcement, complete markets for credit and risk, aggregate production functions and infinite, smooth substitution between factors, distribution according to marginal products, firms always on the production possibility frontier and firms’ pricing decisions, ignoring money and credit, and perfectly rational agents with infinite computational capacity. Two critical areas stand out. The first is the underappreciated Sonnenschein-Mantel-Debreu results, which showed that the foundational assumptions of the Walrasian general-equilibrium model imply arbitrary excess demand functions and therefore arbitrary equilibrium price sets. The second is that in real economies there is no equilibrium, only continuous change. Equilibrium is never reached because of constant changes in preferences and tastes; technological and organisational innovations; discoveries of new resources and new markets; inaccurate and evolving expectations of businesses, consumers, governments and speculators; changing demand for credit; the entry and exit of firms; the birth, learning, and death of citizens; changes in laws and government policies; imperfect information; generalized increasing returns to scale; random acts of impulse; weather and climate events; changes in disease patterns, and so on. The problem is not the use of mathematical modelling, but the kind of mathematical modelling used. Agent-based models (ABMs), object-oriented programming and greatly increased computer power, however, are opening up a new frontier. Here a dynamic bargaining ABM is outlined as a basis for an alternative theory of value. A large but finite number of heterogeneous commodities and agents with differing degrees of market power are set in a spatial network.
Returns to buyers and sellers are decided at each step in the value chain, and in each factor market, through the process of bargaining. Market power and its potential abuse against the poor and vulnerable are fundamental to how the bargaining dynamics play out. Ethics therefore lie at the very heart of economic analysis, the determination of prices and the distribution of wealth. The neoclassicals are right then that price is the enumeration of value at a particular time and place, but wrong to downplay the critical roles of bargaining, power and ethics in determining those same prices.
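The bargaining step described above can be sketched in a few lines. The split-by-relative-power rule, the function names and the numbers below are illustrative assumptions, not the paper's actual model specification:

```python
# Hypothetical sketch of one bargaining step in such an ABM: the agreed price
# splits the surplus between seller and buyer in proportion to their market
# power. A more powerful seller pushes the price toward the buyer's valuation.

def bargain(seller_cost, buyer_value, seller_power, buyer_power):
    """Return the agreed price, or None if no mutually beneficial trade exists."""
    if buyer_value <= seller_cost:
        return None  # no surplus to split
    share = seller_power / (seller_power + buyer_power)
    return seller_cost + share * (buyer_value - seller_cost)

# A powerful seller captures most of the surplus from a weak buyer, and vice versa.
p_strong = bargain(seller_cost=10.0, buyer_value=20.0, seller_power=0.9, buyer_power=0.1)
p_weak = bargain(seller_cost=10.0, buyer_value=20.0, seller_power=0.1, buyer_power=0.9)
print(p_strong, p_weak)  # price near 19 vs. near 11 for the same good
```

In a full spatial-network model this step would be repeated at every link of the value chain and in every factor market, which is where the ethical questions about market power against the poor and vulnerable enter.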

Relevance:

90.00%

Publisher:

Abstract:

This study proposes a simple analytical framework for finding the probability distributions of the number of children and of maternal age at births of various orders, using data on age-specific fertility rates by birth order. The proposed framework is applicable to both period and cohort fertility schedules. Its most appealing feature is that it does not require stringent assumptions. The framework has been applied to the cohort birth order-specific fertility schedules of India and its different regions, and to period birth order-specific fertility schedules of several countries, including the United States of America, Russia, and the Netherlands, to demonstrate its usefulness.
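One way such a parity distribution can be recovered is sketched below. The key identity assumed here (not necessarily the paper's exact construction) is that the cumulated order-k fertility rate approximates the probability of ever having a k-th birth, so P(parity = k) is the difference between successive order-specific rates. The rates are made-up illustrative numbers:

```python
# Sketch: probability distribution of the number of children from cumulated
# birth-order-specific fertility rates. order_tfr[k-1] is assumed to be the
# probability of ever having a k-th order birth.

def parity_distribution(order_tfr):
    """Return [P(0 children), P(1), ..., P(max parity)]."""
    probs = [1.0] + list(order_tfr) + [0.0]
    return [probs[k] - probs[k + 1] for k in range(len(probs) - 1)]

# Illustrative rates: 90% ever have a first birth, 70% a second, 30% a third.
dist = parity_distribution([0.90, 0.70, 0.30])
print(dist)  # probabilities of 0, 1, 2 and 3 children; sums to 1
```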

Relevance:

80.00%

Publisher:

Abstract:

The issue of information sharing and exchanging is one of the most important issues in the areas of artificial intelligence and knowledge-based systems (KBSs), or even in the broader areas of computer and information technology. This paper deals with a special case of this issue by carrying out a case study of information sharing between two well-known heterogeneous uncertain reasoning models: the certainty factor model and the subjective Bayesian method. More precisely, this paper discovers a family of exactly isomorphic transformations between these two uncertain reasoning models. More interestingly, different isomorphic transformation functions in this family can accommodate different degrees to which a domain expert is positive or negative when performing such a transformation task. The investigation is directly motivated by a practical consideration. In the past, expert systems exploited mainly these two models to deal with uncertainties. In other words, a lot of stand-alone expert systems which use the two uncertain reasoning models are available. If there is a reasonable transformation mechanism between these two uncertain reasoning models, we can use the Internet to couple these pre-existing expert systems together so that the integrated systems are able to exchange and share useful information with each other, thereby improving their performance through cooperation. Also, the issue of transformation between heterogeneous uncertain reasoning models is significant in the research area of multi-agent systems, because different agents in a multi-agent system could employ different expert systems with heterogeneous uncertain reasoning models for their action selections, and information sharing and exchanging between different agents is unavoidable. In addition, we make clear the relationship between the certainty factor model and probability theory.
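To illustrate the kind of mapping being discussed, here is one well-known bridge between certainty factors and probabilities (the Heckerman-style probabilistic reading of MYCIN certainty factors). This is NOT the family of isomorphic transformations the paper derives; it is only a minimal sketch of a CF-probability translation:

```python
# Sketch: a standard probabilistic interpretation of certainty factors.
# A positive CF measures how far the evidence moves the prior toward 1;
# a negative CF, how far it moves the prior toward 0.

def cf_from_probs(prior, posterior):
    """Certainty factor implied by updating a prior to a posterior."""
    if posterior >= prior:
        return (posterior - prior) / (1.0 - prior)
    return (posterior - prior) / prior

def posterior_from_cf(prior, cf):
    """Invert the mapping above to recover the posterior probability."""
    if cf >= 0:
        return prior + cf * (1.0 - prior)
    return prior * (1.0 + cf)

cf = cf_from_probs(prior=0.2, posterior=0.6)  # confirming evidence -> CF = 0.5
back = posterior_from_cf(prior=0.2, cf=cf)    # round-trips to the posterior
print(cf, back)
```

An exact round-trip like this is the essence of what "isomorphic" means in the abstract: no information is lost when translating a value from one uncertain reasoning model to the other and back.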

Relevance:

80.00%

Publisher:

Abstract:

This paper makes use of the idea of prediction intervals (PIs) to capture the uncertainty associated with wind power generation in power systems. Since the forecasting errors cannot be appropriately modeled using probability distribution functions, here we employ a powerful nonparametric approach called the lower upper bound estimation (LUBE) method to construct the PIs. The proposed method uses a new framework based on a combination of PIs to overcome the performance instability of the neural networks (NNs) used in LUBE. Also, a new fuzzy-based cost function is proposed with the purpose of having more freedom and flexibility in adjusting the NN parameters used for construction of PIs. In comparison with the other cost functions in the literature, this new formulation allows decision-makers to apply their preferences for satisfying the PI coverage probability and PI normalized average width individually. As the optimization tool, a modified bat algorithm is introduced to solve the problem. The feasibility and satisfying performance of the proposed method are examined using datasets taken from different wind farms in Australia.
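The two interval-quality measures the cost function trades off have standard definitions in the LUBE literature, sketched below. The toy targets and bounds are illustrative, not wind-farm data:

```python
# Sketch: PI coverage probability (PICP) and PI normalized average width (PINAW).
# PICP should be high (targets fall inside their intervals); PINAW should be
# low (intervals are narrow relative to the target range).

def picp(targets, lower, upper):
    """Fraction of targets falling inside their prediction intervals."""
    hits = sum(1 for t, lo, up in zip(targets, lower, upper) if lo <= t <= up)
    return hits / len(targets)

def pinaw(targets, lower, upper):
    """Average interval width, normalized by the observed target range."""
    rng = max(targets) - min(targets)
    return sum(up - lo for lo, up in zip(lower, upper)) / (len(targets) * rng)

y = [1.0, 2.0, 3.0, 10.0]
lo = [0.5, 1.5, 3.5, 9.0]
up = [1.5, 2.5, 4.5, 11.0]
print(picp(y, lo, up))   # 0.75: the third target misses its interval
print(pinaw(y, lo, up))  # about 0.14: narrow intervals relative to the range
```

The fuzzy cost function described in the abstract lets a decision-maker weight violations of a PICP target separately from increases in PINAW, rather than collapsing both into one fixed score.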

Relevance:

40.00%

Publisher:

Abstract:

A dichotomy in female extrapair copulation (EPC) behavior, with some females seeking EPC and others not, is inferred if the observed distribution of extrapair young (EPY) over broods differs from a random process on the level of individual offspring (binomial, hypergeometric, or Poisson). A review of the literature shows that such null models are virtually always rejected, often with large effect sizes. We formulate an alternative null model, which assumes that 1) the number of EPC has a random (Poisson) distribution across females (broods) and that 2) the probability for an offspring to be of extrapair origin is zero without any EPC and increases with the number of EPC. Our brood-level model can accommodate the bimodality of both zero and medium rates of EPY typically found in empirical data, and fitting our model to EPY production of 7 passerine bird species shows evidence of a nonrandom distribution of EPY in only 2 species. We therefore argue that 1) dichotomy in extrapair mate choice cannot be inferred only from a significant deviation in the observed distribution of EPY from a random process on the level of offspring and that 2) additional empirical work on testing the contrasting critical predictions from the classic and our alternative null models is required.
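The brood-level null model can be simulated directly. The Poisson mean, brood size and per-EPC success rate below are illustrative assumptions, not fitted values from the paper; the point is only that a large zero-EPY class emerges even though every female follows the same random process:

```python
import math
import random

random.seed(42)

def poisson(lam):
    """Knuth's algorithm for a Poisson draw using only the stdlib."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def simulate_brood(lam=1.0, brood_size=5, q=0.3):
    """EPY count in one brood: EPC ~ Poisson; per-offspring extrapair
    probability is 0 without EPC and rises with the EPC count."""
    c = poisson(lam)
    p_epy = 0.0 if c == 0 else 1.0 - (1.0 - q) ** c
    return sum(1 for _ in range(brood_size) if random.random() < p_epy)

counts = [simulate_brood() for _ in range(10000)]
zero_frac = counts.count(0) / len(counts)
print(zero_frac)  # a large zero class, with no dichotomy among females
```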

Relevance:

40.00%

Publisher:

Abstract:

This paper investigates the effect of cash flow and free cash flow on corporate failure in an emerging market, in particular Jordan, using two samples: a matched sample and a cross-sectional time-series (panel data) sample of 167 Jordanian companies over 1989-2003. LOGIT models are used to outline the relationship between firms’ financial health and the probability of default. Our results show that a firm’s free cash flow increases the probability of corporate failure, while its cash flow decreases it. Firms’ capital structures are fundamental in predicting default. Capital structure is seen as the main factor affecting the probability of default as it affects a firm’s ability to access external sources of funds. Jordanian firms depend on short-term debt for both short- and long-term financing.
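The shape of such a LOGIT model can be sketched as follows. The coefficients are invented purely to illustrate the signs the findings imply (free cash flow raising, cash flow lowering, the default probability); they are not the paper's estimates:

```python
import math

# Illustrative logit: probability of default as a function of free cash flow,
# operating cash flow and leverage. Coefficient signs follow the reported
# findings; their magnitudes are assumptions for illustration only.

def default_probability(free_cash_flow, cash_flow, leverage):
    # logit(p) = b0 + b1*FCF + b2*CF + b3*leverage
    z = -2.0 + 1.5 * free_cash_flow - 2.0 * cash_flow + 1.0 * leverage
    return 1.0 / (1.0 + math.exp(-z))

low_risk = default_probability(free_cash_flow=0.1, cash_flow=0.5, leverage=0.3)
high_risk = default_probability(free_cash_flow=0.6, cash_flow=0.0, leverage=0.9)
print(low_risk < high_risk)  # higher FCF and leverage, lower CF -> riskier firm
```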

Relevance:

30.00%

Publisher:

Abstract:

This study investigated the distribution, habitat and population dynamics of the swamp antechinus (Antechinus minimus maritimus) in the eastern Otway Ranges. The species has a restricted, disjunct distribution and has been recorded at 25 sites between 1969 and 1999. All sites were located within 7 km of the coast, occurred at altitudes up to 80 m above sea level and within 10 m of a gully. Analysis of landscape site variables identified sun index as being significant in determination of the probability of occurrence of A. minimus. The presence of A. minimus is negatively associated with sun index, occurring at sites that have a southerly aspect and gentle slope. A. minimus was located in a range of structural vegetation including Open Forest, Low Woodland, Shrubland and Hummock Grassland and a number of floristic groups, some characterised by high frequencies of sclerophyll shrubs, others by high frequencies of Pteridium esculentum, hummock grasses and herbaceous species. A. minimus occurs in fragmented, small populations with maximum population densities of 1.1–18 ha⁻¹. Populations at inland sites became extinct after the 1983 wildfire which burnt 41 000 ha. These sites have not been recolonised since, while on the coast the species did not re-establish until 1993–97. One population that is restricted to a narrow coastal strip of habitat is characterised by high levels of transient animals. The species is subject to extinction in the region due to habitat fragmentation, coastal developments and fire. Management actions to secure the present populations and ensure long-term survival of the species in the area are required and include implementation of appropriate fire regimes, prevention of habitat fragmentation, revegetation of habitat, and establishment of corridor habitat.

Relevance:

30.00%

Publisher:

Abstract:

Although the development of geographic information system (GIS) technology and digital data manipulation techniques has enabled practitioners in the geographical and geophysical sciences to make more efficient use of resource information, many of the methods used in forming spatial prediction models are still inherently based on traditional techniques of map stacking in which layers of data are combined under the guidance of a theoretical domain model. This paper describes a data-driven approach by which Artificial Neural Networks (ANNs) can be trained to represent a function characterising the probability that an instance of a discrete event, such as the presence of a mineral deposit or the sighting of an endangered animal species, will occur over some grid element of the spatial area under consideration. A case study describes the application of the technique to the task of mineral prospectivity mapping in the Castlemaine region of Victoria using a range of geological, geophysical and geochemical input variables. Comparison of the maps produced using neural networks with maps produced using a density estimation-based technique demonstrates that the maps can reliably be interpreted as representing probabilities. However, while the neural network model and the density estimation-based model yield similar results under an appropriate choice of values for the respective parameters, the neural network approach has several advantages, especially in high dimensional input spaces.
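The data-driven idea can be sketched with a single logistic unit, the smallest possible "network", trained so its output reads as the probability that a discrete event occurs in a grid cell given that cell's layer values. The two toy input layers and the synthetic rule generating the labels are assumptions; a real prospectivity model would use many geological, geophysical and geochemical layers and a larger network:

```python
import math
import random

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Synthetic training grid: the event is more likely where layer x1 is high.
cells = [(random.random(), random.random()) for _ in range(400)]
labels = [1 if random.random() < 0.15 + 0.7 * x1 else 0 for x1, _ in cells]

# One logistic unit trained by stochastic gradient descent on log-loss.
w1 = w2 = b = 0.0
for _ in range(300):
    for (x1, x2), y in zip(cells, labels):
        p = sigmoid(w1 * x1 + w2 * x2 + b)
        g = p - y  # gradient of log-loss w.r.t. the pre-activation
        w1 -= 0.05 * g * x1
        w2 -= 0.05 * g * x2
        b -= 0.05 * g

p_low = sigmoid(w1 * 0.1 + w2 * 0.5 + b)
p_high = sigmoid(w1 * 0.9 + w2 * 0.5 + b)
print(p_low < p_high)  # the mapped value rises with the informative layer
```

Evaluating the trained function over every grid element yields exactly the kind of probability map the abstract describes comparing against density-estimation-based maps.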

Relevance:

30.00%

Publisher:

Abstract:

Accurate assessment of the fate of salts, nutrients, and pollutants in natural, heterogeneous soils requires a proper quantification of both spatial and temporal solute spreading during solute movement. The number of experiments with multisampler devices that measure solute leaching as a function of space and time is increasing. The breakthrough curve (BTC) can characterize the temporal aspect of solute leaching, and recently the spatial solute distribution curve (SSDC) was introduced to describe the spatial solute distribution. We combined and extended both concepts to develop a tool for the comprehensive analysis of the full spatio-temporal behavior of solute leaching. The sampling locations are ranked in order of descending amount of total leaching (defined as the cumulative leaching from an individual compartment at the end of the experiment), thus collapsing both spatial axes of the sampling plane into one. The leaching process can then be described by a curved surface that is a function of the single spatial coordinate and time. This leaching surface is scaled to integrate to unity and termed S. S can efficiently represent data from multisampler solute transport experiments or simulation results from multidimensional solute transport models. The mathematical relationships between the scaled leaching surface S, the BTC, and the SSDC are established. Any desired characteristic of the leaching process can be derived from S. The analysis was applied to a chloride leaching experiment on a lysimeter with 300 drainage compartments of 25 cm2 each. The sandy soil monolith in the lysimeter exhibited fingered flow in the water-repellent top layer. The observed S demonstrated the absence of a sharp separation between fingers and dry areas, owing to diverging flow in the wettable soil below the fingers. Times-to-peak, maximum solute fluxes, and total leaching varied more in high-leaching than in low-leaching compartments.
This suggests a stochastic–convective transport process in the high-flow streamtubes, while convection–dispersion is predominant in the low-flow areas. S can be viewed as a bivariate probability density function. Its marginal distributions are the BTC of all sampling locations combined, and the SSDC of cumulative solute leaching at the end of the experiment. The observed S cannot be represented by assuming complete independence between its marginal distributions, indicating that S contains information about the leaching process that cannot be derived from the combination of the BTC and the SSDC.
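The construction of S can be sketched from a flux matrix: rank the compartments by descending total leaching, scale the whole surface to sum to one, and read off the two marginals. The 4-compartment, 3-time-step data below are invented for illustration (the real lysimeter had 300 compartments):

```python
# Sketch: assembling the scaled leaching surface S from multisampler data.
# fluxes[i][t] is the solute flux from compartment i at time step t.
fluxes = [
    [0.0, 1.0, 0.5],
    [2.0, 3.0, 1.0],
    [0.1, 0.2, 0.2],
    [0.5, 0.5, 0.0],
]

# Rank compartments by descending total leaching: both spatial axes of the
# sampling plane collapse into one ranked spatial coordinate.
ranked = sorted(fluxes, key=sum, reverse=True)

# Scale the surface so it integrates (sums) to unity.
total = sum(sum(row) for row in ranked)
S = [[v / total for v in row] for row in ranked]

# Marginal over space: the BTC of all sampling locations combined.
btc = [sum(S[i][t] for i in range(len(S))) for t in range(len(S[0]))]
# Marginal over time: the SSDC of cumulative leaching per ranked location.
ssdc = [sum(row) for row in S]
print(btc, ssdc)
```

Treating S as a bivariate probability density, as the abstract does, is exactly this normalization: the BTC and SSDC are its two marginal distributions, and any dependence between them is information the marginals alone cannot recover.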

Relevance:

30.00%

Publisher:

Abstract:

Resource management decisions influence not only the output of the economy but also the distribution of utility between groups within the community. The theory of Benefit Cost Analysis provides a means of incorporating this distributional change through the application of distributional or welfare weights. This paper reports the results of research designed to estimate distributional weights suitable for inclusion in a Benefit Cost Analysis framework. The findings of a choice modelling experiment estimating community preferences with respect to intergenerational utility distribution are presented to illustrate this innovative application of a stated preference technique.

Relevance:

30.00%

Publisher:

Abstract:

Wildlife managers are often faced with the difficult task of determining the distribution of species, and their preferred habitats, at large spatial scales. This task is even more challenging when the species of concern is in low abundance and/or the terrain is largely inaccessible. Spatially explicit distribution models, derived from multivariate statistical analyses and implemented in a geographic information system (GIS), can be used to predict the distributions of species and their habitats, thus making them a useful conservation tool. We present two such models: one for a dasyurid, the Swamp Antechinus (Antechinus minimus), and the other for a ground-dwelling bird, the Rufous Bristlebird (Dasyornis broadbenti), both of which are rare species occurring in the coastal heathlands of south-western Victoria. Models were generated using generalized linear modelling (GLM) techniques with species presence or absence as the independent variable and a series of landscape variables derived from GIS layers and high-resolution imagery as the predictors. The most parsimonious model for each species, selected using the Akaike Information Criterion, was then extrapolated spatially in a GIS. Probability of species presence was used as an index of habitat suitability. Because habitat fragmentation is thought to be one of the major threats to these species, an assessment of the spatial distribution of suitable habitat across the landscape is vital in prescribing management actions to prevent further habitat fragmentation.
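The model-selection step can be sketched as computing the Akaike Information Criterion, AIC = 2k - 2·log-likelihood, for competing presence/absence models and keeping the smallest. The toy observations and the two fixed candidate models below are illustrative assumptions, not the fitted GLMs from the paper:

```python
import math

# Sketch: AIC comparison of two candidate presence/absence models.
# Each model is represented by its predicted presence probabilities.

def log_likelihood(probs, presences):
    """Bernoulli log-likelihood of the observations under the model."""
    return sum(math.log(p) if y else math.log(1.0 - p)
               for p, y in zip(probs, presences))

def aic(probs, presences, n_params):
    return 2 * n_params - 2 * log_likelihood(probs, presences)

presence = [1, 1, 0, 0, 1, 0]
model_a = [0.8, 0.7, 0.2, 0.3, 0.9, 0.1]  # 2-parameter model, good fit
model_b = [0.5, 0.5, 0.5, 0.5, 0.5, 0.5]  # intercept-only (1-parameter) model
print(aic(model_a, presence, 2) < aic(model_b, presence, 1))  # True
```

AIC penalizes each extra parameter by 2, so a richer model wins only if its fit improves enough to pay for its complexity; here the informative model does.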

Relevance:

30.00%

Publisher:

Abstract:

The future global distribution of the political regimes of countries, just like that of their economic incomes, displays a surprising tendency for polarization into only two clubs of convergence at the extrema. This, in itself, is a persuasive reason to analyze afresh the logical validity of an endogenous theory for political and economic development inherent in modernization theory. I suggest how adopting a simple evolutionary game theoretic view on the subject allows an explanation for these parallel clubs of convergence in political regimes and economic income within the framework of existing research in democratization theory. I also suggest how instrumental action can be methodically introduced into such a setup using learning strategies adopted by political actors. These strategies, based on the first principles of political competition, are motivated by introducing the theoretical concept of a Credible Polity.

Relevance:

30.00%

Publisher:

Abstract:

Resource management decisions influence not only the output of the economy but also the distribution of utility between groups within the community. The theory of cost benefit analysis provides a means of incorporating distributional changes into the decision making calculus through the application of distributional or welfare weights. However, this practice has not been widely adopted in part due to difficulties in the estimation of distributional weights. This paper addresses this problem by using the stated preference method of choice modelling to estimate distributional weights suitable for inclusion in a cost benefit analysis framework. The findings of a choice modelling experiment designed to estimate community preferences with respect to intergenerational utility distribution illustrate the potential of this method in addressing distributional issues.
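How distributional weights change a cost-benefit ranking can be sketched in a few lines. The group names, weights and net benefits below are illustrative assumptions, not the estimates from the choice modelling experiment:

```python
# Sketch: applying distributional (welfare) weights in a cost-benefit
# framework. Each group's net benefit is multiplied by its weight before
# aggregation, so weighting can reverse an unweighted ranking.

def weighted_net_benefit(net_benefits, weights):
    return sum(b * w for b, w in zip(net_benefits, weights))

weights = {"current_generation": 1.0, "future_generation": 1.4}
project_a = {"current_generation": 100.0, "future_generation": 10.0}
project_b = {"current_generation": 40.0, "future_generation": 60.0}

groups = list(weights)
score_a = weighted_net_benefit([project_a[g] for g in groups],
                               [weights[g] for g in groups])
score_b = weighted_net_benefit([project_b[g] for g in groups],
                               [weights[g] for g in groups])
print(score_a, score_b)  # project A wins unweighted (110 vs 100), B wins weighted
```

Estimating defensible values for those weights is precisely the gap the choice modelling experiment addresses.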

Relevance:

30.00%

Publisher:

Abstract:

DDoS is a spy-on-spy game between attackers and detectors. Attackers mimic network traffic patterns to disable detection algorithms based on these features, and discriminating mimicking DDoS attacks from massive legitimate network access remains an open problem. We observed that zombies use controlled functions to pump attack packets to the victim; the attack flows to the victim therefore always share some properties, e.g. packet distribution behaviours, which are not possessed by legitimate flows within a short time period. Based on this observation, once suspicious flows to a server appear, we calculate the distance of the packet distribution behaviour among the suspicious flows. If the distance is less than a given threshold, it is a DDoS attack; otherwise, it is legitimate access. Our analysis and preliminary experiments indicate that the proposed method can discriminate mimicking flooding attacks from legitimate access efficiently and effectively.
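The detection idea can be sketched as follows: build each flow's packet-length distribution and measure the distance between suspicious flows; near-identical distributions (small distance) suggest zombies driven by the same pump function. Total variation distance and the threshold value are assumptions made for this sketch; the paper's exact distance metric may differ:

```python
from collections import Counter

# Sketch: distance between packet-length distributions of suspicious flows.

def length_distribution(packet_lengths):
    """Empirical distribution of packet lengths for one flow."""
    n = len(packet_lengths)
    return {size: c / n for size, c in Counter(packet_lengths).items()}

def total_variation(p, q):
    """Total variation distance between two discrete distributions."""
    keys = set(p) | set(q)
    return 0.5 * sum(abs(p.get(k, 0.0) - q.get(k, 0.0)) for k in keys)

THRESHOLD = 0.2  # illustrative decision boundary

zombie_a = length_distribution([64, 64, 64, 128, 64])   # pump-generated
zombie_b = length_distribution([64, 64, 128, 64, 64])   # same pump function
legit = length_distribution([64, 512, 1500, 300, 1500]) # varied legitimate flow

print(total_variation(zombie_a, zombie_b) < THRESHOLD)  # True: flagged as attack
print(total_variation(zombie_a, legit) < THRESHOLD)     # False: legitimate
```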

Relevance:

30.00%

Publisher:

Abstract:

Staple fibre yarns vary quite markedly in linear density (tex) along their length, and the degree to which twist redistributes from thick to thin places will affect the strength, torque and extension behaviour of the yarn. Theory suggests that twist along worsted yarns should vary as 1/(tex)^2 if fibres were locked in the structure, whereas the mean torque of worsted yarns reported in the literature implies that twist should be proportional to 1/tex. This article examines twist distribution in ring-spun marl yarns, down to 5 mm resolution, as a function of linear density measured using a high-resolution capacitive sensor. It is found for moderate twist-level worsted yarns that twist is approximately proportional to 1/(tex)^1.6. The results and theory provide a guide as to the effect the observed large variations in linear density will have on yarn properties such as tenacity and torque.
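An exponent like the reported 1.6 can be recovered by least-squares on log(twist) versus log(tex), since twist ∝ 1/(tex)^n implies log(twist) = log(k) - n·log(tex). The synthetic measurements below are generated noise-free with n = 1.6 purely to illustrate the fit; they are not the article's data:

```python
import math

# Sketch: fit the twist-redistribution exponent n in twist = k / tex^n
# by simple linear regression in log-log space.

def fit_exponent(tex_values, twist_values):
    xs = [math.log(t) for t in tex_values]
    ys = [math.log(tw) for tw in twist_values]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope  # log(twist) = log(k) - n*log(tex), so n is minus the slope

tex = [20.0, 25.0, 30.0, 35.0, 40.0]            # illustrative linear densities
twist = [100.0 / t ** 1.6 for t in tex]          # synthetic, exponent 1.6
print(round(fit_exponent(tex, twist), 3))        # recovers 1.6
```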