639 results for Money market -- Australia -- Problems, exercises, etc.
Abstract:
We illustrate the potential of higher order critical points for a deeper understanding of several interesting problems in condensed matter science, e.g. critical adsorption, finite-size effects, the morphology of critical fluctuations, reversible aggregation of colloids, the dynamics of the ordering process, etc.
Abstract:
Synthetic aperture radar (SAR) is a powerful tool for mapping and remote sensing, and its theory and operation have seen a period of intense activity in recent years. This paper reviews some of the more advanced topics studied in connection with modern SAR systems based on digital processing. Following a brief review of the principles involved in the operation of SAR, attention is focussed on special topics such as advanced SAR modelling and focussing techniques, in particular clutterlock and autofocus; Doppler centroid (DC) estimation methods involving the seismic migration technique; moving target imaging; bistatic radar imaging; effects of system nonlinearities, etc.
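To make the DC estimation step concrete, here is a minimal sketch of one classical approach, the correlation (average-phase) estimator, offered as a generic illustration rather than the method of the paper above; the signal and parameter values are assumed for the demonstration.

```python
import numpy as np

def doppler_centroid_estimate(az_samples: np.ndarray, prf: float) -> float:
    """Estimate the Doppler centroid from the phase of the lag-one
    autocorrelation of the azimuth signal, scaled by PRF / (2*pi).
    az_samples: complex 1-D array of azimuth samples from one range gate.
    prf: pulse repetition frequency in Hz."""
    r1 = np.sum(az_samples[1:] * np.conj(az_samples[:-1]))  # lag-1 autocorrelation
    return prf * np.angle(r1) / (2.0 * np.pi)

# Synthetic check: a complex exponential at 120 Hz sampled at PRF = 1700 Hz (assumed values).
prf = 1700.0
n = np.arange(4096)
signal = np.exp(2j * np.pi * 120.0 * n / prf)
print(doppler_centroid_estimate(signal, prf))  # approximately 120.0
```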
Abstract:
We address the problem of allocating a single divisible good to a number of agents. The agents have concave valuation functions parameterized by a scalar type, and report only the type. The goal is to find allocatively efficient, strategy-proof, nearly budget-balanced mechanisms within the Groves class. Near budget balance is attained by returning as much of the received payments as possible to the agents as rebates. Two performance criteria are of interest within the class of linear rebate functions: the maximum ratio of budget surplus to efficient surplus, and the expected budget surplus; the goal is to minimize each. Assuming that the valuation functions are known, we show that both problems reduce to convex optimization problems in which the convex constraint sets are characterized by a continuum of half-plane constraints parameterized by the vector of reported types. We then propose a randomized relaxation of these problems by sampling constraints; the relaxed problem is a linear program (LP). We identify the number of samples needed for "near-feasibility" of the relaxed constraint set and, under some conditions on the valuation function, show that the value of the approximate LP is close to the optimal value. Simulation results show significant improvements of our proposed method over the Vickrey-Clarke-Groves (VCG) mechanism without rebates. In the special case of indivisible goods, the mechanisms in this paper reduce to those proposed by Moulin, by Guo and Conitzer, and by Gujar and Narahari, without any need for randomization. Extensions of the proposed mechanisms to situations in which the valuation functions are not known to the central planner are also discussed.

Note to Practitioners: Our results will be useful in all resource allocation problems that involve gathering information privately held by strategic users, where the utilities are any concave function of the allocations and where the resource planner is interested not in maximizing revenue but in efficient sharing of the resource. Such situations arise quite often in fair sharing of internet resources, fair sharing of funds across departments within the same parent organization, auctioning of public goods, etc. We study methods to achieve near budget balance by first collecting payments according to the celebrated VCG mechanism and then returning as much of the collected money as possible as rebates. Our focus on linear rebate functions allows for easy implementation. The resulting convex optimization problem is solved via relaxation to a randomized linear programming problem, for which several efficient solvers exist. This relaxation is enabled by constraint sampling. With practitioners in mind, we identify the number of samples that assures a desired level of "near-feasibility" with the desired confidence level. Our methodology will occasionally require a subsidy from outside the system; however, we demonstrate via simulation that, if the mechanism is repeated several times over independent instances, past surplus can support the subsidy requirements. We also extend our results to situations where the strategic users' utility functions are not known to the allocating entity, a common situation in the context of internet users and other problems.
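To make the constraint-sampling idea concrete, the following is a minimal, self-contained sketch, not the paper's actual mechanism: a feasible set defined by a continuum of half-plane constraints (here, tangents to the unit disk, a toy assumption) is relaxed by sampling finitely many constraints, and the resulting LP is solved with an off-the-shelf solver. The sample size is illustrative; bounds of the kind identified in the paper dictate how many samples guarantee near-feasibility at a given confidence level.

```python
import numpy as np
from scipy.optimize import linprog

# Continuum of half-plane constraints a(theta) . x <= b(theta), theta in [0, 2*pi):
# with a(theta) = (cos theta, sin theta) and b(theta) = 1, the intersection is the
# unit disk. We relax to a finite LP by sampling constraints at random.
rng = np.random.default_rng(0)
n_samples = 500                      # illustrative; more samples -> tighter near-feasibility
theta = rng.uniform(0.0, 2.0 * np.pi, size=n_samples)
A = np.column_stack([np.cos(theta), np.sin(theta)])  # one sampled half-plane per row
b = np.ones(n_samples)

c = np.array([-1.0, -1.0])           # maximize x1 + x2, i.e. minimize -(x1 + x2)
res = linprog(c, A_ub=A, b_ub=b, bounds=[(None, None)] * 2)
print(res.x)                         # close to (0.707, 0.707), the optimum over the disk
```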
Abstract:
This paper was presented at the 11th Annual Conference of the European Society for the History of Economic Thought (ESHET).
Abstract:
The increasing interest in eco-innovation, or environmental innovation, not only as a strategy for addressing serious global environmental problems but also as a source of competitive advantage for companies and of new business areas, leads us to identify the factors that determine its development and adoption at the micro level.
Abstract:
This paper deals with the economics of gasification facilities in general and IGCC power plants in particular. Regarding the prospects of these systems, passing the technological test is one thing; passing the economic test can be quite another. Traditional valuations assume constant input and/or output prices; since this is hardly realistic, we allow for uncertainty in prices. We naturally look at the markets where many of the products involved are regularly traded. Futures markets on commodities are particularly useful for valuing uncertain future cash flows, so revenues and variable costs can be assessed by means of sound financial concepts and actual market data. On the other hand, these complex systems provide a number of flexibility options (e.g., to choose among several inputs, outputs, or modes of operation). Typically, flexibility contributes significantly to the overall value of real assets; indeed, maximizing the asset value requires the optimal exercise of any flexibility option available. Yet the economic value of flexibility is elusive, the more so under price uncertainty, and the right choice of input fuels and/or output products is a major concern for facility managers. As a particular application, we deal with the valuation of input flexibility, following the Real Options approach. In addition to economic variables, we also address technical and environmental issues such as energy efficiency, utility performance characteristics, and emissions (note that carbon constraints are looming). Lastly, a brief introduction to some stochastic processes suitable for valuation purposes is provided.
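As a toy illustration of input flexibility, the Monte Carlo sketch below compares a plant that may burn, each period, whichever of two fuels yields the higher margin against a plant committed to a single fuel, under assumed geometric Brownian motion price dynamics. All prices, heat rates, volatilities, and the correlation are hypothetical, and discounting and switching costs are ignored.

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, n_periods, dt = 20_000, 12, 1.0 / 12.0
power_price = 60.0                    # assumed flat electricity price, $/MWh
heat_rates = np.array([7.0, 9.5])     # MMBtu/MWh for fuels A and B (assumed)
s0 = np.array([6.0, 4.0])             # initial fuel prices, $/MMBtu (assumed)
sigma = np.array([0.35, 0.25])        # annual volatilities (assumed)
corr = 0.3
chol = np.linalg.cholesky(np.array([[1.0, corr], [corr, 1.0]]))

value_flex, value_fixed = 0.0, 0.0
for _ in range(n_paths):
    s = s0.copy()
    for _ in range(n_periods):
        z = chol @ rng.standard_normal(2)
        s = s * np.exp(-0.5 * sigma**2 * dt + sigma * np.sqrt(dt) * z)
        margins = power_price - heat_rates * s   # $/MWh margin for each fuel
        value_flex += max(margins.max(), 0.0)    # switch to the better fuel, idle if unprofitable
        value_fixed += max(margins[0], 0.0)      # committed to fuel A only
value_flex /= n_paths
value_fixed /= n_paths
print(f"flexibility premium ($/MWh, summed over 12 monthly decisions): "
      f"{value_flex - value_fixed:.2f}")
```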
Abstract:
This dissertation applies maximum entropy regularization to the inverse problem of option pricing, as suggested by the work of Neri and Schneider in 2012. They observed that the probability density that solves this problem, in the case of data coming from call options and digital options, can be described by exponentials on the different intervals of the positive half-line, intervals that are bounded by the strike prices. The maximum entropy criterion is a powerful tool for regularizing this ill-posed problem. The exponential family of the solution set is computed using the Newton-Raphson algorithm, with specific bounds for the digital options; these bounds follow from the no-arbitrage principle. The methodology was applied to data on the São Paulo Stock Exchange stock index and its call option prices at different strikes. A parametric analysis of the entropy as a function of the prices of synthetic digital options (constructed from bounds respecting no-arbitrage) revealed values at which the digitals maximized the entropy. The example extracted from IBOVESPA data of January 24, 2013 showed a deviation from the no-arbitrage principle for in-the-money call options; this principle is a necessary condition for applying maximum entropy regularization to obtain the density and the prices. Our results showed that, once the no-arbitrage convexity condition is fulfilled, a smile shape can appear in the volatility curve, with prices computed from the model's exponential density, making the model consistent with market data. From a computational standpoint, this dissertation implemented a pricing model based on the maximum entropy principle. Three classical algorithms were used: first standard bisection, and then a combination of bisection and Newton-Raphson, to find the implied volatility from market data; the one-dimensional Newton-Raphson method to compute the coefficients of the exponential densities, which is the objective of the study; and finally Simpson's rule to integrate the cumulative distributions as well as the model prices obtained as mathematical expectations.
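The implied-volatility step described above, bisection to bracket the root followed by Newton-Raphson to polish it, can be sketched as follows for a Black-Scholes call. This illustrates only the root-finding combination, not the dissertation's maximum entropy densities, and the sample inputs are illustrative.

```python
import numpy as np
from scipy.stats import norm

def bs_call(s, k, t, r, vol):
    """Black-Scholes price of a European call."""
    d1 = (np.log(s / k) + (r + 0.5 * vol**2) * t) / (vol * np.sqrt(t))
    d2 = d1 - vol * np.sqrt(t)
    return s * norm.cdf(d1) - k * np.exp(-r * t) * norm.cdf(d2)

def implied_vol(price, s, k, t, r, n_bisect=20, n_newton=10):
    """Bracket the implied volatility by bisection, then refine with
    Newton-Raphson using the vega as the derivative."""
    lo, hi = 1e-4, 5.0
    for _ in range(n_bisect):
        mid = 0.5 * (lo + hi)
        if bs_call(s, k, t, r, mid) < price:
            lo = mid
        else:
            hi = mid
    vol = 0.5 * (lo + hi)
    for _ in range(n_newton):
        d1 = (np.log(s / k) + (r + 0.5 * vol**2) * t) / (vol * np.sqrt(t))
        vega = s * norm.pdf(d1) * np.sqrt(t)
        vol -= (bs_call(s, k, t, r, vol) - price) / vega
    return vol

# Round trip on illustrative inputs: recovers vol = 0.3.
print(implied_vol(bs_call(100, 95, 0.5, 0.1, 0.3), 100, 95, 0.5, 0.1))
```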
Abstract:
Fishery-independent estimates of spawning biomass (BSP) of the Pacific sardine (Sardinops sagax) on the south and lower west coasts of Western Australia (WA) were obtained periodically between 1991 and 1999 by using the daily egg production method (DEPM). Ichthyoplankton data collected during these surveys, specifically the presence or absence of S. sagax eggs, were used to investigate trends in the spawning area of S. sagax within each of four regions. The expectation was that trends in BSP and spawning area would be positively related: under the DEPM model, estimates of BSP change proportionally with spawning area if all other variables remain constant. The proportion of positive stations (PPS), i.e., stations with nonzero egg counts, is an objective estimator of spawning area. PPS was high for all south coast regions during the early 1990s (a period when the estimated BSP was also high) and then decreased from the mid-1990s to 1999. The particularly low estimates in 1999 followed a severe epidemic mass mortality of S. sagax throughout their range across southern Australia. Deviations from the expected relationship between BSP and PPS were used to identify uncertainty around estimates of BSP. Because estimation of spawning area is subject to less sampling bias than estimation of BSP, the deviation in the relation between the two provides an objective basis for adjusting some estimates of the latter. Such an approach is particularly useful for fisheries management when sampling problems are suspected. Analysis of PPS from the same set of samples from which the DEPM estimate is derived will help provide information for stock assessments and for the management of purse-seine fisheries.
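Computing PPS from a survey is direct: the share of stations whose egg count is nonzero. A minimal sketch with hypothetical counts:

```python
import numpy as np

# Hypothetical egg counts per plankton station in one survey region.
egg_counts = np.array([0, 3, 0, 12, 7, 0, 0, 1, 0, 5])
pps = np.mean(egg_counts > 0)   # proportion of positive stations
print(f"PPS = {pps:.2f}")       # 0.50: half the stations contained eggs
```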
Abstract:
Various problems associated with the quality of fishery products, such as spoilage, discolouration, and microbiological problems, are outlined, and their causes and remedial measures are discussed. The importance of proper handling, processing, and hygiene is stressed.
An empirical examination of risk equalisation in a regulated community rated health insurance market
Abstract:
Despite universal access entitlements to the public healthcare system in Ireland, over half the population is covered by voluntary private health insurance. The market operates on the basis of community rating, open enrolment and lifetime cover. A set of minimum benefits also exists, and two risk equalisation schemes have been introduced, but neither was implemented. These schemes have proved highly controversial. To date, the debate has primarily consisted of qualitative arguments. This study adds a quantitative element by analysing a number of pertinent issues. A model of a community rated insurance market is developed, which shows that community rating can only be maintained in a competitive market if all insurers in the market have the same risk profile as the market overall. This has relevance to the Irish market in the aftermath of a Supreme Court decision to set aside risk equalisation. Two reasons why insurers’ risk profiles might differ are adverse selection and risk selection. Evidence is found of the existence of both forms of selection in the Irish market. A move from single rate community rating to lifetime community rating in Australia had significant consequences for take-up rates and the age profile of the insured population. A similar move has been proposed in Ireland. It is found that, although this might improve the stability of community rating in the short term, it would not negate the need for risk equalisation. If community rating were to collapse then risk rating might result. A comparison of the Irish, Australian and UK health insurance markets suggests that community rating encourages higher take-up among older consumers than risk rating. Analysis of Irish hospital discharge figures suggests that this yields significant savings for the Irish public healthcare system. This thesis has implications for government policy towards private health insurance in Ireland.
Abstract:
To maintain a strict balance between demand and supply in US power systems, Independent System Operators (ISOs) schedule power plants and determine electricity prices using a market clearing model. For each time period and power plant, this model determines startup and shutdown times, the amount of power produced, and the provision of spinning and non-spinning generation reserves. This deterministic optimization model takes as input the characteristics of all generating units, such as installed capacity, ramp rates, minimum up and down time requirements, and marginal production costs, as well as forecasts of intermittent energy such as wind and solar, along with the minimum reserve requirement of the whole system. The reserve requirement is determined from the likelihood of outages on the supply side and from forecast errors in demand and intermittent generation. With increased installed capacity of intermittent renewable energy, determining the appropriate level of reserve requirements has become harder. Stochastic market clearing models have been proposed as an alternative: rather than taking fixed reserve targets as an input, they consider different scenarios of wind power and produce a reserve schedule as an output. Using a scaled version of the power generation system of PJM, a regional transmission organization (RTO) that coordinates the movement of wholesale electricity in all or parts of 13 states and the District of Columbia, and wind scenarios generated from Bonneville Power Administration (BPA) data, this paper compares the performance of stochastic and deterministic market clearing models. The two models are compared in their ability to contribute to the affordability, reliability, and sustainability of the electricity system, measured in terms of total operational costs, load shedding and air emissions. Building and testing the models indicates that a fair comparison is difficult to obtain, owing to the multi-dimensional performance metrics considered here and to the difficulty of setting the models' parameters in a way that does not advantage or disadvantage either modeling framework. Along these lines, this study explores the effect that model assumptions such as reserve requirements, value of lost load (VOLL) and wind spillage costs have on the comparison of the performance of stochastic vs. deterministic market clearing models.
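For intuition, a single-period deterministic clearing with a fixed reserve target can be written as a small LP. The sketch below uses a hypothetical three-unit system (costs, capacities, demand, and the reserve requirement are all assumed) and omits the startup decisions, ramping, and multi-period coupling that a full market clearing model would include.

```python
import numpy as np
from scipy.optimize import linprog

# Minimize generation cost subject to demand balance, a fixed reserve
# requirement, and unit capacity shared between energy and reserve.
cost = np.array([20.0, 35.0, 50.0])       # marginal costs, $/MWh (assumed)
cap = np.array([400.0, 300.0, 200.0])     # unit capacities, MW (assumed)
demand, reserve_req = 600.0, 90.0         # MW (assumed)

# Decision vector x = [g1, g2, g3, r1, r2, r3]; reserves are costless here.
c = np.concatenate([cost, np.zeros(3)])
A_eq = np.array([[1.0, 1.0, 1.0, 0.0, 0.0, 0.0]])     # sum(g) == demand
b_eq = np.array([demand])
A_ub = np.vstack([
    np.array([[0.0, 0.0, 0.0, -1.0, -1.0, -1.0]]),    # sum(r) >= reserve_req
    np.hstack([np.eye(3), np.eye(3)]),                # g_i + r_i <= cap_i
])
b_ub = np.concatenate([[-reserve_req], cap])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 6)
print("dispatch (MW):", res.x[:3])
print("reserves (MW):", res.x[3:])
```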
Abstract:
Market failures associated with environmental pollution interact with market failures associated with the innovation and diffusion of new technologies. These combined market failures provide a strong rationale for a portfolio of public policies that foster emissions reduction as well as the development and adoption of environmentally beneficial technology. Both theory and empirical evidence suggest that the rate and direction of technological advance is influenced by market and regulatory incentives, and can be cost-effectively harnessed through the use of economic-incentive based policy. In the presence of weak or nonexistent environmental policies, investments in the development and diffusion of new environmentally beneficial technologies are very likely to be less than would be socially desirable. Positive knowledge and adoption spillovers and information problems can further weaken innovation incentives. While environmental technology policy is fraught with difficulties, a long-term view suggests a strategy of experimenting with policy approaches and systematically evaluating their success.
Abstract:
Whilst child welfare systems in the United Kingdom, Australia and the United States may share a number of common goals, they are not designed to identify families with multiple problems. Where system output measures have been utilised as proxy measures to detect such families, they indicate the presence of such families in the population served by child and family social work. In interviews with practitioners and managers working within contrasting welfare systems, we explore how families with multiple problems are identified, what responses they currently receive and how their needs might be better met.