841 results for Coordination scheme
Abstract:
This paper develops a new "confidence channel" of fiscal policy and characterizes the optimal policy once this channel is taken into account. To this end, we use a static model with (i) monopolistic competition, (ii) fixed adjustment costs of investment, (iii) strategic complementarity due to imperfect information about aggregate productivity, and (iv) public goods as imperfect substitutes for private goods. This framework accommodates the possibility of coordination failures in investment, yet features a unique equilibrium. We show that fiscal policy has important effects on coordination. An increase in government spending leads to higher demand for private goods. More importantly, it also affects higher-order expectations about the demand faced by other firms, which amplifies the effects of the initial demand increase due to the strategic complementarity in investment decisions. Since other firms face higher demand, they are expected to invest more, which in turn raises each firm's individual demand and strengthens its incentives to invest. We call this the "confidence channel" of fiscal policy. Under the threat of coordination failures, optimal fiscal policy prescribes producing beyond the point where the marginal benefit from the consumption of public goods equals their marginal cost. This additional benefit comes from the fact that fiscal policy can enhance investment coordination.
Abstract:
This paper explains why workers lack motivation near bankruptcy, why they tend to leave companies in financial distress, and why those who remain require higher compensation. These indirect costs of financial distress arise because the optimal combination of debt and incentive schemes, designed to minimize agency costs, ends up underpaying managers when there is a bankruptcy threat. The paper also provides new empirical implications on the interaction between financial restructuring and changes in managerial compensation. These predictions are supported by the findings of Gilson and Vetsuypens (1992).
Abstract:
There are plenty of economic studies pointing out requirements, such as the absence of fiscal dominance, for an inflation targeting framework to be implemented in a successful (credible) way. Essays on how public targets could be used in the absence of such requirements are unusual. In this paper we appraise how central banks could use inflation targeting before sound economic fundamentals have been achieved. First, based on a concise framework where confidence crises and imperfect information are neglected, we conclude that a less ambitious (higher) inflation target increases the credibility of the precommitment. The optimal target is higher than the one obtained using the Cukierman-Liviatan [7] model, where this credibility-increasing effect is not considered. Second, extending the model to make confidence crises possible, multiple equilibria become possible too. In this case, setting higher inflation targets may stimulate confidence crises and reduce the policymaker's credibility; on the other hand, multiple (bad) equilibria may be avoided. The optimal target depends on the likelihood of each equilibrium being selected. Finally, when common knowledge is perturbed, uniqueness is restored even considering confidence crises, as in Morris-Shin [14]. The first result, i.e. that a less ambitious inflation target increases the credibility of the precommitment, is also recovered. Adding a precise public signal, coordinated self-fulfilling actions and equilibrium multiplicity may still exist for some lack of common knowledge (as in Angeletos and Werning [1]). In this case, setting higher inflation targets may again stimulate confidence crises, reducing the policymaker's credibility; from another angle, multiple (bad) equilibria may be avoided. Optimal policy prescriptions depend on the likelihood of each equilibrium being selected.
Results also indicate that more precise public information may open the door to a bad equilibrium, contrary to the conventional wisdom that more central bank transparency is always good when considering an inflation targeting framework.
Abstract:
We study a dynamic model of coordination with timing frictions and payoff heterogeneity. There is a unique equilibrium, characterized by thresholds that determine the choices of each type of agent. We characterize equilibrium for the limiting cases of vanishing timing frictions and vanishing shocks to fundamentals. A lot of conformity emerges: despite payoff heterogeneity, agents' equilibrium thresholds partially coincide as long as there exists a set of beliefs that would make this coincidence possible, though they never fully coincide. In the case of vanishing frictions, the economy behaves almost as if all agents were equal to an average type. Conformity is not inefficient: the efficient solution would have agents following others even more often and giving less importance to the fundamental.
Abstract:
The essentiality of money is commonly justified on efficiency grounds. In this paper, we propose an alternative view on the essentiality of money. We consider an economy with limited monitoring where agents have to coordinate on the use of two alternative technologies of exchange, money and credit. We show that although credit strictly dominates money from an efficiency perspective, money is essential for coordination reasons. If agents are patient, the region of parameters where they coordinate on the use of money strictly contains the region of parameters where they coordinate on the use of credit.
Abstract:
Supply chain coordination (SCC) can be a challenge for many organizations, as different firms in the same chain have different expectations and interdependencies (Arshinder & Deshmukh, 2008). Lack of SCC can result in the bullwhip effect and poor performance for a firm and its partners. By investigating the phenomenon in the Brazilian pharmaceutical supply chain through qualitative research, this paper aims to understand the main issues that prevent a better-integrated chain. Results of 21 interviews suggested that the lack of coordination in this environment was influenced by the network design and the history of the sector in Brazil, as well as by scarce resources.
Abstract:
We estimate the impact of the main unconditional federal grant (Fundo de Participação dos Municípios - FPM) to Brazilian municipalities, as well as its spillover from neighboring cities, on local health outcomes. We consider data from 2002 to 2007 (Brollo et al., 2013) and explore the FPM distribution rule, defined over population brackets, to apply a fuzzy Regression Discontinuity Design (RDD) using cities near the thresholds. In elasticity terms, we find a reduction in the infant mortality rate (-0.18) and in the morbidity rate (-0.41), except in the largest cities of our sample. We also find an increase in access to the main program of visits to vulnerable families, the Family Health Program (Programa Saúde da Família - PSF). The effects are stronger for the smallest cities of our sample, where we find increases: (i) in the percentage of residents enrolled in the program (0.36), (ii) in the per capita number of PSF visits (1.59), and (iii) in the per capita number of PSF visits with a doctor (1.8) and a nurse (2). After we control for the FPM spillover using neighboring cities near different thresholds, our results show that the reduction in morbidity and mortality is largely due to the spillover effect, but there are negative spillovers on preventive actions, such as PSF doctor visits and vaccination. Finally, the negative spillover effect on health resources may be due to free riding or political coordination problems, as in the case of the number of hospital beds, but also to competition for health professionals, as in the case of the number of doctors (-0.35 and -0.87, respectively), especially general practitioners and surgeons (-1.84 and -2.45).
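The fuzzy RDD logic this abstract describes can be sketched as a local Wald estimator: compare outcome and grant jumps across a population cutoff within a bandwidth. This is a minimal sketch on synthetic data, not the paper's estimation; the cutoff of 10,188 inhabitants is one commonly cited FPM bracket, and all other numbers and names are illustrative assumptions.

```python
import numpy as np

def fuzzy_rdd_wald(pop, grant, outcome, cutoff, bandwidth):
    """Local Wald estimator for a fuzzy RDD.

    pop     : running variable (population)
    grant   : treatment intensity actually received (imperfect compliance)
    outcome : health outcome of interest
    Returns the local average treatment effect at the cutoff.
    """
    window = np.abs(pop - cutoff) <= bandwidth
    above = window & (pop >= cutoff)
    below = window & (pop < cutoff)
    jump_y = outcome[above].mean() - outcome[below].mean()   # outcome jump
    jump_d = grant[above].mean() - grant[below].mean()       # first-stage jump
    return jump_y / jump_d

# synthetic illustration: a grant jump at the bracket, true effect -0.18
rng = np.random.default_rng(0)
pop = rng.uniform(8000, 12000, 5000)
grant = 10 + 5 * (pop >= 10188) + rng.normal(0, 1, 5000)
mortality = 20 - 0.18 * grant + rng.normal(0, 1, 5000)
print(fuzzy_rdd_wald(pop, grant, mortality, cutoff=10188, bandwidth=1000))
```

The estimator recovers something close to the true -0.18 because cities just below and just above the bracket are comparable except for the grant jump.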
Abstract:
This project describes an authentication technique that is shoulder-surfing resistant. Shoulder surfing is an attack in which an attacker can get access to private information by observing the user’s interaction with a terminal, or by using recording tools to record the user interaction and study the obtained data, with the objective of obtaining unauthorized access to a target user’s personal information. The technique described here relies on gestural analysis coupled with a secondary channel of authentication that uses button pressing. The thesis presents and evaluates multiple alternative algorithms for gesture analysis, and furthermore assesses the effectiveness of the technique.
Abstract:
Spontaneous volunteers always emerge in emergency scenarios and are vital to a successful community response, yet some uncertainty persists around their role and its acceptance by official entities. In our research we have identified that most spontaneous volunteers have little or no support from official entities, so they end up facing critical problems such as situational awareness, safety instructions and guidance, motivation, and group organization. We argue that official entities still play a crucial role and should change some of their behaviors regarding spontaneous volunteerism. With this thesis we aim to design a software architecture and a framework to implement a solution supporting spontaneous volunteerism in emergency scenarios, along with a set of guidelines for the design of open information management systems. Through collaboration with both citizens and emergency professionals we have attained several important contributions, such as the clear identification of the roles taken by spontaneous volunteers and professionals, the importance of volunteerism in the overall community response, and the role that open collaborative information management systems play in community volunteering efforts. These conclusions have directly supported the design guidelines of our software solution proposal. Regarding methodology, we first review the literature on technological support for emergencies and how spontaneous volunteers challenge these systems. Next, we performed field research in which we observed that emerging spontaneous volunteer efforts impose new requirements on the design of such systems, which led to a cluster of design guidelines that supported our software solution proposal addressing the volunteers' requirements.
Finally, we architected and developed an online open information management tool, which has been evaluated via usability engineering methods, usability user tests, and heuristic evaluations.
Abstract:
The aim of this work was to develop a quality index method (QIM) scheme for whole ice-boxed refrigerated blackspot seabream and to perform shelf-life evaluations, using sensory analysis, GR Torrymeter measurements and bacterial counts of specific spoilage organisms (SSO) during chilled storage. A QIM scheme based on a total of 30 demerit points was developed. Sensory, physical and microbiological data were integrated and used to determine the rejection point. Results indicated that the shelf-life of blackspot seabream is around 12-13 days. (C) 2011 Elsevier Ltd. All rights reserved.
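The demerit-point logic of a QIM scheme can be sketched as a bounded sum over sensory attributes. The attribute names and point ranges below are hypothetical placeholders, not the published 30-point scheme for blackspot seabream.

```python
# Hypothetical QIM attributes and demerit ranges (illustrative only).
QIM_SCHEME = {
    "skin_colour": (0, 2),
    "eye_clarity": (0, 2),
    "gill_odour": (0, 3),
    "flesh_elasticity": (0, 2),
}

def quality_index(scores, scheme=QIM_SCHEME):
    """Sum demerit points across attributes; 0 means perfectly fresh.
    Raises if a score falls outside its attribute's allowed range."""
    total = 0
    for attr, (lo, hi) in scheme.items():
        s = scores[attr]
        if not lo <= s <= hi:
            raise ValueError(f"{attr}: score {s} outside [{lo}, {hi}]")
        total += s
    return total

# a partly spoiled sample scored by an assessor
sample = {"skin_colour": 1, "eye_clarity": 2, "gill_odour": 2, "flesh_elasticity": 1}
print(quality_index(sample))  # 6
```

In practice the quality index grows roughly linearly with days on ice, which is what lets a rejection point (here, shelf-life around 12-13 days) be read off from calibrated data.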
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
The scheme is based on Ami Harten's ideas (Harten, 1994), with the main tools coming from wavelet theory, in the framework of multiresolution analysis for cell averages. But instead of evolving cell averages on the finest uniform level, we propose to evolve just the cell averages on the grid determined by the significant wavelet coefficients. Typically, there are few cells in each time step: big cells in smooth regions, and smaller ones close to irregularities of the solution. For the numerical flux, we use a simple uniform central finite difference scheme, adapted to the size of each cell. If any of the required neighboring cell averages is not present, it is interpolated from coarser scales. But we switch to an ENO scheme in the finest part of the grids. To show the feasibility and efficiency of the method, it is applied to a system arising in polymer flooding of an oil reservoir. In terms of CPU time and memory requirements, it outperforms Harten's multiresolution algorithm.

The proposed method applies to systems of conservation laws in 1D,

$\partial_t u(x,t) + \partial_x f(u(x,t)) = 0, \quad u(x,t) \in \mathbb{R}^m.$ (1)

In the spirit of finite volume methods, we shall consider the explicit scheme

$v_\mu^{n+1} = v_\mu^n - \frac{\Delta t}{h_\mu}\left(\bar f_\mu - \bar f_{\mu^-}\right) = [D v^n]_\mu,$ (2)

where $\mu$ is a point of an irregular grid $\Gamma$, $\mu^-$ is the left neighbor of $\mu$ in $\Gamma$, $v_\mu^n \approx \frac{1}{\mu - \mu^-}\int_{\mu^-}^{\mu} u(x, t_n)\,dx$ are approximate cell averages of the solution, $\bar f_\mu = \bar f_\mu(v^n)$ are the numerical fluxes, and $D$ is the numerical evolution operator of the scheme.

Depending on the definition of $\bar f_\mu$, several schemes of this type have been proposed and successfully applied (LeVeque, 1990); Godunov, Lax-Wendroff, and ENO are some of the popular names. The Godunov scheme resolves shocks well, but its first-order accuracy is poor in smooth regions. Lax-Wendroff is of second order, but produces dangerous oscillations close to shocks. ENO schemes are good alternatives, with high order and without serious oscillations, but the price is a high computational cost.

Ami Harten proposed in (Harten, 1994) a simple strategy to save expensive ENO flux calculations. The basic tools come from multiresolution analysis for cell averages on uniform grids, and the principle is that wavelet coefficients can be used to characterize local smoothness. Typically, only a few wavelet coefficients are significant. At the finest level, they indicate discontinuity points, where ENO numerical fluxes are computed exactly. Elsewhere, cheaper fluxes can be safely used, or just interpolated from coarser scales. Different applications of this principle have been explored by several authors; see for example (G-Muller and Muller, 1998).

Our scheme also uses Ami Harten's ideas. But instead of evolving the cell averages on the finest uniform level, we propose to evolve the cell averages on sparse grids associated with the significant wavelet coefficients. This means that the total number of cells is small, with big cells in smooth regions and smaller ones close to irregularities. This task requires improved new tools, which are described next.
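The flagging principle described above (small detail coefficients mean the solution is locally smooth; large ones mark irregularities worth an expensive flux) can be illustrated with Haar details of cell averages. This is a minimal sketch under assumed names and threshold, not the paper's multiresolution machinery:

```python
import numpy as np

def significant_cells(v, eps):
    """Flag finest-level cells whose Haar detail coefficient exceeds eps.

    The detail of each cell pair is the deviation of the fine averages from
    their common parent average; it vanishes where the data is locally flat
    and is large across a discontinuity.
    """
    d = 0.5 * (v[0::2] - v[1::2])      # Haar detail, one per cell pair
    flags = np.zeros(v.size, dtype=bool)
    pairs = np.abs(d) > eps
    flags[0::2] = pairs                 # both cells of a significant pair
    flags[1::2] = pairs                 # are marked for expensive fluxes
    return flags

# cell averages: smooth sine profile with one jump of height 2 at x = 0.49
x = np.linspace(0, 1, 128, endpoint=False)
v = np.where(x < 0.49, np.sin(2 * np.pi * x), 2.0 + np.sin(2 * np.pi * x))
flags = significant_cells(v, eps=0.05)
print(flags.sum(), "of", flags.size, "cells flagged")
```

Only the pair straddling the jump is flagged; everywhere else the details stay below the threshold, which is exactly what makes the sparse-grid evolution cheap.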
Abstract:
In the context of the normalized variable formulation (NVF) of Leonard and the total variation diminishing (TVD) constraints of Harten, this paper presents an extension of a previous work by the authors for solving unsteady incompressible flow problems. The main contributions of the paper are threefold. First, it presents the results of the development and implementation of a bounded high-order upwind adaptive QUICKEST scheme in the robust 3D code (Freeflow), for the numerical solution of the full incompressible Navier-Stokes equations. Second, it reports numerical simulation results for the 1D shock tube problem, a 2D impinging jet, and 2D/3D broken dam flows, and compares these results with existing analytical and experimental data. Third, it presents the application of the numerical method to 3D free surface flow problems. (C) 2007 IMACS. Published by Elsevier B.V. All rights reserved.
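The NVF/TVD boundedness idea behind schemes of this family can be sketched as a limited face interpolation: interpolate with a QUICK-type profile where the normalized cell value is monotonic, and fall back to upwinding elsewhere. This is a generic illustrative limiter (the function name and the specific bounds are assumptions), not the paper's exact adaptive QUICKEST scheme.

```python
def bounded_quick_face(phi_U, phi_C, phi_D):
    """Convective face value from a QUICK-type interpolation, limited to
    satisfy the convection boundedness criterion in Leonard's normalized
    variables (U = upstream, C = center, D = downstream nodes)."""
    denom = phi_D - phi_U
    if abs(denom) < 1e-12:
        return phi_C                     # locally uniform field: pure upwind
    c = (phi_C - phi_U) / denom          # normalized cell value
    if c <= 0.0 or c >= 1.0:
        f = c                            # non-monotonic range: upwind value
    else:
        # QUICK profile 3/8 + 3/4*c, capped by TVD-style bounds 2c and 1
        f = max(min(0.375 + 0.75 * c, 2.0 * c, 1.0), c)
    return phi_U + f * denom             # back to unnormalized variables

print(bounded_quick_face(0.0, 0.5, 1.0))  # plain QUICK value 0.75, unlimited here
```

Away from extrema the scheme keeps QUICK's high-order face value; near sharp gradients the bounds clip it, which is what suppresses the unbounded oscillations of the raw scheme.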