93 results for Precautionary Principle


Relevance:

10.00%

Publisher:

Abstract:

We show how to build full-diversity product codes under both iterative encoding and decoding over non-ergodic channels, in the presence of block erasure and block fading. The concept of a rootcheck, or root subcode, is introduced by generalizing the principle recently invented for low-density parity-check codes. We also describe some channel-related graphical properties of the new family of product codes, a family referred to as root product codes.

Relevance:

10.00%

Publisher:

Abstract:

The academic debate about the secession of a territory that is part of a liberal-democratic state displays an initial contrast. On the one hand, secessionist movements usually legitimize their position with nationalist arguments linked to the principle of national self-determination. On the other hand, we find few defenders in academia of a normative principle of national self-determination. Philosophers, political scientists and jurists usually defend the status quo, and even when they do not, most of them tend to leave the question of secession unresolved or confused. Regarding this issue, liberal-democratic theories tend to be "conservative" about political borders, regardless of the historical and empirical processes by which current states were created. This feature is probably not unrelated to the fact that, from its beginnings, political liberalism has never been a theory of the nation, but a theory of the state.

Relevance:

10.00%

Publisher:

Abstract:

From a managerial point of view, the more efficient, simple, and parameter-free (ESP) an algorithm is, the more likely it will be used in practice for solving real-life problems. Following this principle, an ESP algorithm for solving the Permutation Flowshop Sequencing Problem (PFSP) is proposed in this article. Using an Iterated Local Search (ILS) framework, the so-called ILS-ESP algorithm is able to compete in performance with other well-known ILS-based approaches, which are considered among the most efficient algorithms for the PFSP. However, while other similar approaches still employ several parameters that can affect their performance if not properly chosen, our algorithm does not require any particular fine-tuning process, since it uses basic "common sense" rules for the local search, perturbation, and acceptance criterion stages of the ILS metaheuristic. Our approach defines a new operator for the ILS perturbation process, a new acceptance criterion based on extremely simple and transparent rules, and a biased randomization of the initial solution, obtained by applying a biased randomization process to a classical PFSP heuristic, which randomly generates different alternative starting solutions of similar quality. This diversification of the initial solution aims at avoiding poorly designed starting points and thus allows the methodology to take advantage of current trends in parallel and distributed computing. A set of extensive tests, based on literature benchmarks, has been carried out in order to validate our algorithm and compare it against other approaches. These tests show that our parameter-free algorithm is able to compete with state-of-the-art metaheuristics for the PFSP. The experiments also show that, when using parallel computing, it is possible to improve on the top ILS-based metaheuristic just by incorporating into it our biased randomization process with a high-quality pseudo-random number generator.
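The ESP recipe described above (a biased-randomized greedy start plus a perturb-and-accept loop) can be illustrated with a toy sketch. This is not the paper's ILS-ESP: the makespan recursion is standard, but the bias parameter, the relocate-one-job perturbation, and the non-worsening acceptance rule below are simplified assumptions.

```python
import math
import random

def makespan(seq, p):
    """Completion time of the last job on the last machine.
    p[j][k] = processing time of job j on machine k."""
    m = len(p[0])
    c = [0.0] * m
    for j in seq:
        c[0] += p[j][0]
        for k in range(1, m):
            c[k] = max(c[k], c[k - 1]) + p[j][k]
    return c[-1]

def biased_start(p, beta=0.3):
    """Greedy order by total processing time, but each pick is drawn with a
    geometric bias, so repeated calls give different good starting sequences."""
    pool = sorted(range(len(p)), key=lambda j: -sum(p[j]))
    seq = []
    while pool:
        idx = int(math.log(1.0 - random.random()) / math.log(1.0 - beta)) % len(pool)
        seq.append(pool.pop(idx))
    return seq

def ils_esp(p, iters=500):
    """Bare-bones ILS: relocate one job, accept any non-worsening move."""
    best = biased_start(p)
    best_cost = makespan(best, p)
    cur, cur_cost = best[:], best_cost
    for _ in range(iters):
        cand = cur[:]
        j = cand.pop(random.randrange(len(cand)))   # perturbation: relocate a job
        cand.insert(random.randrange(len(cand) + 1), j)
        cost = makespan(cand, p)
        if cost <= cur_cost:                        # simple, parameter-free acceptance
            cur, cur_cost = cand, cost
            if cost < best_cost:
                best, best_cost = cand[:], cost
    return best, best_cost
```

Because the geometric draw favours the front of the greedy list, independent runs (e.g. on parallel workers) start from distinct but similar-quality sequences, which is the diversification idea the abstract mentions.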

Relevance:

10.00%

Publisher:

Abstract:

We formulate performance assessment as a problem of causal analysis and outline an approach based on the missing data principle for its solution. It is particularly relevant in the context of so-called league tables for educational, health-care and other public-service institutions. The proposed solution avoids comparisons of institutions that have substantially different clientele (intake).

Relevance:

10.00%

Publisher:

Abstract:

In principle, a country cannot endure negative genuine savings for long periods of time without experiencing declining consumption. Nevertheless, theoreticians envisage two alternatives to explain how an exporter of non-renewable natural resources could experience permanently negative genuine savings and still ensure sustainability. The first alleges that the capital gains arising from the expected improvement in the terms of trade would suffice to compensate for the negative savings of the resource exporter. The second points at technological change as a way to avoid economic collapse. This paper uses data from Venezuela and Mexico to empirically test the first of these two hypotheses. The results presented here show that the terms of trade do not suffice to compensate for the depletion of oil reserves in these two open economies.
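The accounting behind the hypothesis being tested can be shown with made-up numbers; every figure below is purely illustrative, not data from Venezuela or Mexico.

```python
# Illustrative magnitudes, expressed as shares of GDP (assumed numbers).
net_saving         = 0.02   # saving net of capital depreciation
resource_depletion = 0.08   # value of oil extracted, net of extraction cost
tot_capital_gains  = 0.03   # capitalised expected terms-of-trade improvement

# Genuine saving subtracts resource depletion from net saving.
genuine_saving = net_saving - resource_depletion       # negative here

# The first hypothesis adds expected terms-of-trade capital gains.
adjusted_saving = genuine_saving + tot_capital_gains   # still negative here
```

If `adjusted_saving` stays below zero, as the paper finds for these two economies, terms-of-trade gains do not compensate for oil depletion.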

Relevance:

10.00%

Publisher:

Abstract:

We compare two methods for visualising contingency tables and develop a method called the ratio map which combines the good properties of both. The first is a biplot based on the logratio approach to compositional data analysis. This approach is founded on the principle of subcompositional coherence, which assures that results are invariant to considering subsets of the composition. The second approach, correspondence analysis, is based on the chi-square approach to contingency table analysis. A cornerstone of correspondence analysis is the principle of distributional equivalence, which assures invariance in the results when rows or columns with identical conditional proportions are merged. Both methods may be described as singular value decompositions of appropriately transformed matrices. Correspondence analysis includes a weighting of the rows and columns proportional to the margins of the table. If this idea of row and column weights is introduced into the logratio biplot, we obtain a method which obeys both principles of subcompositional coherence and distributional equivalence.

Relevance:

10.00%

Publisher:

Abstract:

In this paper we offer the first large-sample evidence on the availability and usage of credit lines in U.S. public corporations and use it to re-examine the existing findings on corporate liquidity. We show that the availability of credit lines is widespread and that average undrawn credit is of the same order of magnitude as cash holdings. We test the trade-off theory of liquidity, according to which firms target an optimum level of liquidity, computed as the sum of cash and undrawn credit lines. We provide support for the existence of a liquidity target, but also show that the reasons why firms hold cash and credit lines are very different. While the precautionary motive explains cash holdings well, the optimum level of credit lines appears to be driven by the restrictions imposed by the credit line itself, in terms of stated purpose and covenants. In support of these findings, credit line drawdowns are associated with capital expenditures, acquisitions, and working capital.

Relevance:

10.00%

Publisher:

Abstract:

A new algorithm called the parameterized expectations approach (PEA) for solving dynamic stochastic models under rational expectations is developed, and its advantages and disadvantages are discussed. This algorithm can, in principle, approximate the true equilibrium arbitrarily well. Also, this algorithm works from the Euler equations, so the equilibrium does not have to be cast in the form of a planner's problem. Monte Carlo integration and the absence of grids on the state variables mean that computation costs do not go up exponentially when the number of state variables or exogenous shocks in the economy increases. As an application we analyze an asset pricing model with endogenous production, and its implications for the time dependence of the volatility of stock returns and for the term structure of interest rates. We argue that this model can generate hump-shaped term structures.
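A minimal PEA loop can be sketched on a model whose exact solution is known: stochastic growth with log utility and full depreciation, where the true policy is c_t = (1 - alpha*beta)*y_t. The log-linear parameterization of the conditional expectation, the parameter values, and the damping factor below are all assumptions for this toy, not the paper's implementation.

```python
import numpy as np

alpha, beta, rho, sigma = 0.33, 0.95, 0.90, 0.02
T = 2000
rng = np.random.default_rng(0)
ln_th = np.zeros(T)
for t in range(1, T):                     # AR(1) log-productivity path
    ln_th[t] = rho * ln_th[t - 1] + sigma * rng.standard_normal()
theta = np.exp(ln_th)

def simulate(gamma):
    """Simulate when E_t[...] in the Euler equation is approximated by
    psi = exp(gamma . [1, ln k_t, ln theta_t]), so that c_t = 1/(beta*psi)."""
    k = np.empty(T + 1)
    k[0] = (alpha * beta) ** (1.0 / (1.0 - alpha))   # exact-solution steady state
    c = np.empty(T)
    for t in range(T):
        psi = np.exp(gamma @ np.array([1.0, np.log(k[t]), ln_th[t]]))
        y = theta[t] * k[t] ** alpha
        c[t] = min(1.0 / (beta * psi), 0.99 * y)     # keep capital positive
        k[t + 1] = y - c[t]
    return k, c

gamma = np.array([np.log(2.0), -alpha, -1.0])        # rough initial guess
for _ in range(100):
    k, c = simulate(gamma)
    # realized value inside the Euler expectation:
    # (1/c_{t+1}) * alpha * theta_{t+1} * k_{t+1}^(alpha-1)
    e = alpha * theta[1:] * k[1:T] ** (alpha - 1.0) / c[1:]
    X = np.column_stack([np.ones(T - 1), np.log(k[:T - 1]), ln_th[:T - 1]])
    gamma_new, *_ = np.linalg.lstsq(X, np.log(e), rcond=None)
    if np.max(np.abs(gamma_new - gamma)) < 1e-9:
        break
    gamma = 0.5 * gamma + 0.5 * gamma_new            # damped update
```

The simulate-regress-update loop is the essence of the PEA: no grid on the state variables is ever built, and the expectation is fitted only where the simulated economy actually travels.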

Relevance:

10.00%

Publisher:

Abstract:

In this paper we explore the accumulation of capital in the presence of limited insurance against idiosyncratic shocks, borrowing constraints and endogenous labor supply. As in the exogenous labor supply case (e.g. Aiyagari 1994, Huggett 1997), we find that steady states are characterized by an interest rate smaller than the rate of time preference. However, we also find that when labor supply is endogenous, the presence of uncertainty and a borrowing limit are not enough to give rise to aggregate precautionary savings.

Relevance:

10.00%

Publisher:

Abstract:

The organisation of inpatient care provision has undergone significant reform in many southern European countries. Overall across Europe, public management is moving towards the introduction of more flexibility and autonomy. In this setting, the promotion of the further decentralisation of health care provision stands out as a salient policy option in all countries that have hitherto had a traditionally centralised structure. Yet the success of the incentives that decentralised structures create relies on the institutional design at the organisational level, especially in respect of achieving efficiency and promoting policy innovation without harming the essential principle of equal access for equal need that grounds National Health Systems (NHS). This paper explores some of the specific organisational developments of decentralisation structures drawing on the Spanish experience, and particularly that of Catalonia. This experience provides some evidence of the extent to which organisational decentralisation structures that expand levels of autonomy and flexibility lead to organisational innovation while promoting activity and efficiency. In addition to this pure managerial decentralisation process, Spain is of particular interest as a result of the specific regional NHS decentralisation that started in the early 1980s and was completed in 2002, when all seventeen autonomous communities that make up the country gained responsibility for health care services. Already there is some evidence to suggest that this process of decentralisation has been accompanied by a degree of policy innovation and informal regional cooperation. Indeed, the Spanish experience is relevant because both institutional changes took place together: managerial decentralisation leading to higher flexibility and autonomy, alongside increasing political decentralisation at the regional level. The coincidence of both processes could potentially explain why organisational and policy innovation resulting from policy experimentation at the regional level might be an additional feature to take into account when examining the benefits of decentralisation.

Relevance:

10.00%

Publisher:

Abstract:

We study the standard economic model of unilateral accidents, in its simplest form, assuming that injurers have limited assets. We identify a second-best optimal rule that selects as due care the minimum of first-best care and a level of care that takes into account the wealth of the injurer. We show that such a rule in fact maximizes the precautionary effort by a potential injurer. The idea is counterintuitive: being softer on an injurer, in terms of the required level of care, actually improves the incentives to take care when he is potentially insolvent. We extend the basic result to an entire population of potentially insolvent injurers, and find that the optimal general standards of care do depend on wealth and the distribution of income. We also give conditions under which higher income levels in a given society call for higher standards of care for accidents.
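The insolvency argument can be made concrete with a toy negligence-rule calculation. The accident-probability function p(x) = 1/(1+x), the harm of 100 and the assets of 20 below are invented for illustration and are not the paper's specification.

```python
import numpy as np

H = 100.0   # harm from an accident
W = 20.0    # injurer's assets, W < H: the injurer is judgment-proof

def p(x):
    """Assumed accident probability, decreasing in care x."""
    return 1.0 / (1.0 + x)

def chosen_care(due, wealth, grid=np.linspace(0.0, 15.0, 3001)):
    """Injurer's cost-minimizing care under a negligence rule: expected
    liability (capped at wealth) is owed only if care falls below `due`."""
    cost = grid + np.where(grid < due, p(grid) * min(H, wealth), 0.0)
    return float(grid[np.argmin(cost)])

first_best = 9.0                      # argmin of x + p(x)*H is sqrt(H) - 1 = 9
strict = chosen_care(first_best, W)   # shirks to about sqrt(W) - 1, i.e. ~3.47
soft = chosen_care(6.0, W)            # complies with the softer standard of 6
```

Facing the first-best standard of 9, complying costs 9 while shirking costs only about 7.94 (care ~3.47 plus capped expected liability), so the judgment-proof injurer shirks; a softer standard of 6 is cheap enough to comply with, so chosen care rises. That is the counterintuitive effect in the abstract.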

Relevance:

10.00%

Publisher:

Abstract:

We consider two fundamental properties in the analysis of two-way tables of positive data: the principle of distributional equivalence, one of the cornerstones of correspondence analysis of contingency tables, and the principle of subcompositional coherence, which forms the basis of compositional data analysis. For an analysis to be subcompositionally coherent, it suffices to analyse the ratios of the data values. The usual approach to dimension reduction in compositional data analysis is to perform principal component analysis on the logarithms of ratios, but this method does not obey the principle of distributional equivalence. We show that by introducing weights for the rows and columns, the method achieves this desirable property. This weighted log-ratio analysis is theoretically equivalent to spectral mapping, a multivariate method developed almost 30 years ago for displaying ratio-scale data from biological activity spectra. The close relationship between spectral mapping and correspondence analysis is also explained, as well as their connection with association modelling. The weighted log-ratio methodology is applied here to frequency data in linguistics and to chemical compositional data in archaeology.
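The weighted log-ratio idea can be sketched as an SVD of a margin-weighted, doubly centred log matrix. This is a simplified reading of the method (margin-proportional weights, one common centering and scaling convention), not the authors' code; normalization details may differ from the paper.

```python
import numpy as np

def weighted_logratio(N):
    """Weighted log-ratio analysis of a positive two-way table N.
    Rows and columns are weighted by the table margins; the weighted,
    double-centred log matrix is then decomposed by SVD."""
    P = N / N.sum()
    r = P.sum(axis=1)                      # row masses (weights)
    c = P.sum(axis=0)                      # column masses (weights)
    L = np.log(P)
    # double-centre: subtract weighted row and column means
    L = L - (L @ c)[:, None] - (r @ L)[None, :] + (r @ L @ c)
    S = np.sqrt(r)[:, None] * L * np.sqrt(c)[None, :]
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    row_coords = (U / np.sqrt(r)[:, None]) * sv   # principal row coordinates
    col_coords = Vt.T / np.sqrt(c)[:, None]       # standard column coordinates
    return row_coords, col_coords, sv
```

Working on logarithms makes the map depend only on ratios (subcompositional coherence), while the margin weights give merged proportional rows or columns the same representation (distributional equivalence).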

Relevance:

10.00%

Publisher:

Abstract:

One of the principal aims of the Working Families' Tax Credit in the UK was to increase the participation of single mothers. The literature to date concludes that there was approximately a five-percentage-point increase in the employment of single mothers. The differences-in-differences methodology that is typically used compares single mothers with single women without children. However, the characteristics of these groups are very different, and changes over time in relative covariates are likely to violate the identifying assumption. We find that when we control for differential trends between women with and without children, the employment effect of the policy falls significantly. Moreover, the effect is borne solely by those working full-time (30 hours or more), with no effect on inducing people into the labor market from inactivity. Looking closely at important covariates over time, we can see sizeable changes in the relative returns to employment between the treatment and control groups.
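The trend-adjustment point can be made with a three-period toy calculation; the employment rates below are invented for illustration and are not the paper's estimates.

```python
def diff_in_diff(treat, ctrl):
    """Classic DiD: post-minus-pre change for treated minus the same for controls."""
    return (treat[-1] - treat[-2]) - (ctrl[-1] - ctrl[-2])

def trend_adjusted_did(treat, ctrl):
    """Subtract the pre-policy differential trend, using one extra pre period."""
    pre_gap = (treat[-2] - treat[-3]) - (ctrl[-2] - ctrl[-3])
    return diff_in_diff(treat, ctrl) - pre_gap

# hypothetical employment rates over [pre-pre, pre, post] (made-up numbers)
mothers   = [0.42, 0.45, 0.50]   # single mothers (treated)
childless = [0.69, 0.70, 0.71]   # single women without children (controls)

naive    = diff_in_diff(mothers, childless)        # 4 percentage points
adjusted = trend_adjusted_did(mothers, childless)  # halves once trends differ
```

When the treated group was already on a steeper pre-policy trend, the naive DiD attributes that trend to the policy; the adjusted figure removes it, which is the mechanism behind the smaller effect the abstract reports.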

Relevance:

10.00%

Publisher:

Abstract:

We construct and calibrate a general equilibrium business cycle model with unemployment and precautionary saving. We compute the cost of business cycles and locate the optimum in a set of simple cyclical fiscal policies. Our economy exhibits productivity shocks, giving firms an incentive to hire more when productivity is high. However, business cycles make workers' income riskier, both by increasing the unconditional probability of unusually long unemployment spells, and by making wages more variable, and therefore they decrease social welfare by around one-fourth or one-third of 1% of consumption. Optimal fiscal policy offsets the cycle, holding unemployment benefits constant but varying the tax rate procyclically to smooth hiring. By running a deficit of 4% to 5% of output in recessions, the government eliminates half the variation in the unemployment rate, most of the variation in workers' aggregate consumption, and most of the welfare cost of business cycles.