966 results for CONSISTENCY


Relevance: 10.00%

Abstract:

Recent attempts to incorporate optimal fiscal policy into New Keynesian models subject to nominal inertia have tended to assume that policy makers are benevolent and have access to a commitment technology. A separate literature, on the New Political Economy, has focused on real economies where there is strategic use of policy instruments in a world of political conflict. In this paper we combine these literatures and assume that policy is set in a New Keynesian economy by one of two policy makers facing electoral uncertainty (in terms of infrequent elections and an endogenous voting mechanism). The policy makers generally share the social welfare function but differ in their preferences over fiscal expenditure (in its size and/or composition). Given the environment, policy is realistically constrained to be time-consistent. In a sticky-price economy, such heterogeneity gives rise to the possibility of one policy maker utilising (nominal) debt strategically to tie the hands of the other party and influence the outcome of future elections. This can give rise to a deficit bias, implying a sub-optimally high level of steady-state debt, and can also imply a sub-optimal response to shocks. The steady-state distortions and inflation bias this generates, combined with the volatility induced by the electoral cycle in a sticky-price environment, can significantly…

Relevance: 10.00%

Abstract:

If choices depend on the decision maker's mood, is the attempt to derive any consistency in choice doomed? In this paper we argue that, even with full unpredictability of mood, the way choices from a menu relate to choices from another menu exhibits some structure. We present two alternative models of 'moody choice' and show that, in either of them, not all choice patterns are possible. Indeed, we characterise both models in terms of consistency requirements of the observed choice data.
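As a concrete benchmark for the kind of consistency requirement at stake, the sketch below tests observed choice data against the classical Weak Axiom of Revealed Preference. The paper's own moody-choice axioms are not reproduced here; this is a standard condition, shown only to illustrate how cross-menu choice patterns can be checked mechanically.

```python
from itertools import combinations

def violates_warp(choices):
    """Check the Weak Axiom of Revealed Preference on observed choices.

    `choices` maps each menu (a frozenset of alternatives) to the single
    alternative chosen from it. WARP fails when x is chosen over y from
    one menu while y is chosen over x (with x available) from another.
    """
    for (menu_a, pick_a), (menu_b, pick_b) in combinations(choices.items(), 2):
        if pick_a != pick_b and pick_a in menu_b and pick_b in menu_a:
            return True  # x revealed preferred to y and y to x: inconsistent
    return False

# Example: picking a from {a, b} but b from {a, b, c} violates WARP.
data = {frozenset({"a", "b"}): "a", frozenset({"a", "b", "c"}): "b"}
print(violates_warp(data))  # True
```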

Relevance: 10.00%

Abstract:

Study carried out during a stay at the Physics Department of New York University, United States, between 2006 and 2008. One of the most influential observations in modern cosmology has been the empirical determination that the Universe is currently in a phase of Accelerated Expansion (AE). This phenomenon implies that either the Universe is dominated by a new matter/energy sector, or General Relativity ceases to be valid at cosmological scales. The first possibility comprises the Dark Energy (DE) models, whose main problem is that DE must have properties so special that they are difficult to justify theoretically. The second possibility requires the construction of consistent theories of Gravity Modified at Large Distances (GMLD), which generalise massive-gravity models. Phenomenological interest in these theories also resurged with the appearance of the first examples of GMLD models, such as the Dvali-Gabadadze-Porrati (DGP) model, which consists of a type of brane in an extra dimension. Unfortunately, however, this model cannot consistently explain the AE of the Universe. One of the goals of this project has been to establish the internal and phenomenological viability of GMLD models. From the phenomenological point of view, we have focused on the most important question in practice: finding observational signatures that allow GMLD models to be distinguished from DE models. At a more theoretical level, we have also investigated the meaning of the instabilities of the DGP model. The other major goal we set ourselves was the construction of new GMLD theories. In the second part of this project, we developed and demonstrated the consistency of the "Cascading DGP" model, which generalises the DGP model to more extra dimensions and represents the second known consistent, Lorentz-invariant model in flat space. The existence of GMLD models beyond DGP is of great interest, since it could make it possible to obtain the AE of the Universe in a purely geometric way.
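For reference, the DGP model mentioned above modifies the Friedmann equation through a crossover scale r_c; the standard form from the DGP literature (not reproduced from the abstract itself) is:

```latex
% DGP modified Friedmann equation; the +/- signs label the normal and
% self-accelerating branches, and r_c is the crossover scale beyond
% which gravity weakens.
H^2 \pm \frac{H}{r_c} = \frac{8\pi G}{3}\,\rho ,
\qquad
r_c = \frac{M_{\mathrm{Pl}}^2}{2 M_5^3}.
```

On the self-accelerating branch, H tends to 1/r_c as the energy density vanishes, which is why such models were hoped to yield accelerated expansion purely geometrically.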

Relevance: 10.00%

Abstract:

This paper revisits the argument that the stabilisation bias that arises under discretionary monetary policy can be reduced if policy is delegated to a policymaker with redesigned objectives. We study four delegation schemes: price level targeting, interest rate smoothing, speed limits and straight conservatism. These can all increase social welfare in models with a unique discretionary equilibrium. We investigate how these schemes perform in a model with capital accumulation where uniqueness does not necessarily apply. We discuss how multiplicity arises and demonstrate that no delegation scheme is able to eliminate all potential bad equilibria. Price level targeting has two interesting features. It can create a new equilibrium that is welfare dominated, but it can also alter equilibrium stability properties and make coordination on the best equilibrium more likely.
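For concreteness, delegation works by replacing society's period loss with a redesigned one. A textbook illustration of the price-level-targeting case (notation assumed here, not taken from the paper):

```latex
% Society's period loss in inflation pi_t and the output gap x_t:
L^{S}_t = \pi_t^2 + \lambda x_t^2 .
% Delegated price-level-targeting loss: penalise deviations of the log
% price level p_t from a target path p_t^{*} rather than inflation
% itself, where pi_t = p_t - p_{t-1}:
L^{PLT}_t = (p_t - p_t^{*})^2 + \lambda x_t^2 .
```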

Relevance: 10.00%

Abstract:

This paper studies the behavior of a central bank that seeks to conduct policy optimally while having imperfect credibility and harboring doubts about its model. Taking the Smets-Wouters model as the central bank's approximating model, the paper's main findings are as follows. First, a central bank's credibility can have large consequences for how policy responds to shocks. Second, central banks that have low credibility can benefit from a desire for robustness, because this desire motivates the central bank to follow through on policy announcements that would otherwise not be time-consistent. Third, even relatively small departures from perfect credibility can produce important declines in policy performance. Finally, as a technical contribution, the paper develops a numerical procedure to solve the decision problem facing an imperfectly credible policymaker who seeks robustness.
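The "desire for robustness" is conventionally formalised as a multiplier (min-max) problem in the style of Hansen and Sargent, in which a fictitious adversary perturbs the approximating model. A generic linear-quadratic statement (assumed notation, not the paper's exact setup) is:

```latex
% The policymaker minimises loss against a worst-case perturbation
% w_{t+1} to the approximating model; a smaller theta means a larger
% feared misspecification.
\min_{\{u_t\}} \max_{\{w_{t+1}\}}
\mathbb{E}_0 \sum_{t=0}^{\infty} \beta^t
\left[ r(x_t, u_t) - \beta\,\theta\, w_{t+1}' w_{t+1} \right],
\qquad
x_{t+1} = A x_t + B u_t + C\,(\varepsilon_{t+1} + w_{t+1}).
```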

Relevance: 10.00%

Abstract:

This paper presents an axiomatic characterization of difference-form group contests, that is, contests fought among groups where the probability of victory depends on the difference of their effective efforts. This axiomatization rests on the property of Equalizing Consistency, stating that the difference between winning probabilities in the grand contest and in the smaller contest should be identical across all participants in the smaller contest. This property overcomes some of the drawbacks of the widely used ratio-form contest success functions. Our characterization shows that the criticisms commonly held against difference-form contest success functions, such as lack of scale invariance and zero elasticity of augmentation, are unfounded. By clarifying the properties of this family of contest success functions, this axiomatization can help researchers find the functional form best suited to their application of interest.
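A representative difference-form contest success function for group contests, taken from the broader literature rather than from the paper itself, is:

```latex
% Winning probability of group i among n groups: X_i is group i's
% effective effort and alpha > 0 scales how effort differences map
% into probabilities (values are truncated to [0,1] where needed).
p_i = \frac{1}{n} + \alpha \left( X_i - \frac{1}{n}\sum_{j=1}^{n} X_j \right).
```

Since the deviations from mean effort sum to zero, the winning probabilities sum to one by construction.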

Relevance: 10.00%

Abstract:

In the context of the two-stage threshold model of decision making, with the agent's choices determined by the interaction of three "structural variables," we study the restrictions on behavior that arise when one or more variables are exogenously known. Our results supply necessary and sufficient conditions for consistency with the model for all possible states of partial knowledge, and for both single- and multivalued choice functions.
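A minimal sketch of a generic two-stage threshold procedure may help fix ideas; the names `salience`, `utility` and `theta` below stand in for the three structural variables and are illustrative assumptions, not the paper's notation:

```python
def two_stage_threshold_choice(menu, salience, utility, theta):
    """Generic two-stage threshold choice (illustrative structure only).

    Stage 1: shortlist alternatives whose salience meets the threshold.
    Stage 2: pick the utility-maximising alternative on the shortlist.
    """
    shortlist = [x for x in menu if salience[x] >= theta]
    if not shortlist:            # if nothing passes, fall back to the menu
        shortlist = list(menu)
    return max(shortlist, key=lambda x: utility[x])

menu = {"a", "b", "c"}
salience = {"a": 3, "b": 1, "c": 2}
utility = {"a": 1, "b": 5, "c": 4}
print(two_stage_threshold_choice(menu, salience, utility, theta=2))  # 'c'
```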

Relevance: 10.00%

Abstract:

Time-inconsistency is an essential feature of many policy problems (Kydland and Prescott, 1977). This paper presents and compares three methods for computing Markov-perfect optimal policies in stochastic nonlinear business cycle models: value function iteration, generalized Euler equations, and parameterized shadow prices. In the context of a business cycle model in which a fiscal authority chooses government spending and income taxation optimally while lacking the ability to commit, we show that the solutions obtained using value function iteration and generalized Euler equations are somewhat more accurate than those obtained using parameterized shadow prices. Among the three methods, value function iteration can be applied most easily, even to environments that include a risk-sensitive fiscal authority and/or inequality constraints on government spending. We show that a risk-sensitive fiscal authority lowers government spending and income taxation, reducing the disincentive households face to accumulate wealth.
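Of the three methods, value function iteration is the easiest to sketch. The snippet below applies it to a textbook deterministic growth model (log utility, Cobb-Douglas output, full depreciation), an assumed stand-in for illustration rather than the paper's fiscal-policy environment:

```python
import numpy as np

# Value function iteration on a textbook growth model:
# V(k) = max_{k'} { log(k**alpha - k') + beta * V(k') }, solved on a grid.
alpha, beta = 0.36, 0.95
grid = np.linspace(0.05, 0.5, 200)                      # capital grid
V = np.zeros(len(grid))

for _ in range(1000):
    c = grid[:, None] ** alpha - grid[None, :]          # c for each (k, k')
    u = np.where(c > 0, np.log(np.maximum(c, 1e-12)), -np.inf)
    V_new = np.max(u + beta * V[None, :], axis=1)       # Bellman update
    if np.max(np.abs(V_new - V)) < 1e-8:                # sup-norm convergence
        V = V_new
        break
    V = V_new

policy = grid[np.argmax(u + beta * V[None, :], axis=1)]  # optimal k'(k)
print(policy[:3])  # close to the closed form k' = alpha * beta * k**alpha
```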

Relevance: 10.00%

Abstract:

OBJECTIVES: Advances in biopsychosocial science have underlined the importance of taking social history and a life-course perspective into consideration in primary care. For both clinical and research purposes, this study aims to develop and validate a standardised instrument measuring both material and social deprivation at an individual level. METHODS: We identified relevant potential questions regarding deprivation using a systematic review, structured interviews, focus group interviews and a think-aloud approach. Item response theory analysis was then used to reduce the length of the 38-item questionnaire and derive the Deprivation in Primary Care Questionnaire (DiPCare-Q) index, using data obtained from a random sample of 200 patients during their planned visits to an ambulatory general internal medicine clinic. Patients completed the questionnaire a second time over the phone 3 days later to enable us to assess reliability. Content validity of the DiPCare-Q was then assessed by 17 general practitioners. Psychometric properties and validity of the final instrument were investigated in a second set of patients: the DiPCare-Q was administered to a random sample of 1898 patients attending one of 47 different private primary care practices in western Switzerland, along with questions on subjective social status, education, source of income, welfare status and subjective poverty. RESULTS: Deprivation was defined in three distinct dimensions: material (eight items), social (five items) and health deprivation (three items). Item consistency was high in both the derivation set (Kuder-Richardson Formula 20, KR20 = 0.827) and the validation set (KR20 = 0.778). The DiPCare-Q index was reliable (intraclass correlation coefficient = 0.847) and was correlated with subjective social status (rs = -0.539). CONCLUSION: The DiPCare-Q is a rapid, reliable and validated instrument that may prove useful for measuring both material and social deprivation in primary care.
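The KR20 statistic used above to report item consistency can be computed directly from binary item responses; a minimal sketch with hypothetical data (the real DiPCare-Q responses are of course not reproduced here):

```python
import numpy as np

def kr20(responses):
    """Kuder-Richardson Formula 20 for binary (0/1) item responses.

    responses: array of shape (n_respondents, n_items).
    KR20 = k/(k-1) * (1 - sum(p_j * q_j) / var(total score)).
    """
    X = np.asarray(responses, dtype=float)
    k = X.shape[1]
    p = X.mean(axis=0)                  # share answering 1 on each item
    item_var = (p * (1 - p)).sum()      # sum of item variances p_j * q_j
    total_var = X.sum(axis=1).var()     # variance of total scores
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical responses (rows: patients, columns: deprivation items):
demo = np.array([[1, 1, 0, 1], [0, 1, 0, 0], [1, 1, 1, 1], [0, 0, 0, 1]])
print(round(kr20(demo), 3))
```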

Relevance: 10.00%

Abstract:

This paper develops a new test of true versus spurious long memory, based on log-periodogram estimation of the long memory parameter using skip-sampled data. A correction factor is derived to overcome the bias in this estimator due to aliasing. The procedure is designed to be used in the context of a conventional test of significance of the long memory parameter, and a composite test procedure is described that has the properties of known asymptotic size and consistency. The test is implemented using the bootstrap, with the distribution under the null hypothesis approximated using a dependent-sample bootstrap technique that captures the short-run dependence remaining after fractional differencing. The properties of the test are investigated in a set of Monte Carlo experiments, and the procedure is illustrated by applications to exchange rate volatility and dividend growth series.
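The log-periodogram estimator at the core of the procedure is the standard Geweke-Porter-Hudak (GPH) regression. A minimal sketch of the baseline estimator follows; the paper's skip-sampling and aliasing correction are not reproduced here:

```python
import numpy as np

def gph_estimate(x, m=None):
    """Baseline GPH log-periodogram estimate of the memory parameter d.

    Regresses the log periodogram at the first m Fourier frequencies
    on -log(4 sin^2(lambda/2)); the slope estimates d.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    m = m or int(n ** 0.5)                        # a common bandwidth choice
    lam = 2 * np.pi * np.arange(1, m + 1) / n     # Fourier frequencies
    I = np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2 / (2 * np.pi * n)
    regressor = -np.log(4 * np.sin(lam / 2) ** 2)
    slope, _ = np.polyfit(regressor, np.log(I), 1)
    return slope                                   # estimate of d

rng = np.random.default_rng(0)
print(gph_estimate(rng.standard_normal(1024)))    # near 0 for white noise
```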

Relevance: 10.00%

Abstract:

AIMS - To pilot the implementation of brief motivational intervention (BMI) among conscripts, and to test the effectiveness of BMI in young men voluntarily showing up for a single face-to-face alcohol BMI session. Participants were conscripts attending the army recruitment process in Lausanne; this process is mandatory for all Swiss males at age 19, and Lausanne serves all francophone Swiss men. METHODS - Of 3227 young men seen during the army recruitment procedures, 445 voluntarily showed up for a BMI and 367 were included in the study (exclusions were random and unsystematic, related to organizational aspects in the recruitment center). After an initial assessment, subjects were randomized into two groups: an immediate BMI and a 6-month delayed BMI (waiting-list design). A 6-month follow-up assessment was conducted in both groups. The BMI was a 20-minute face-to-face counseling session with a psychologist trained in motivational interviewing at baseline, plus a telephone session for the control group at follow-up. BMI strategies included the exploration and evocation of a possible behavior change, the importance of future change, readiness to change, and commitment to change. A filmed example of such an intervention is available in French at www.alcoologie.ch. RESULTS - All procedures are now fully implemented and working, and the provision of preventive efforts met with general approval from the army. Of the 3227 men eligible for BMI, 445 (13.8%) showed up to receive one; 367 were included in the study, 181 in the BMI group and 186 in the control group. More than 86% of those included were reached at follow-up. With one exception, all findings on alcohol use went in the expected direction, i.e. a stronger decrease in alcohol use (or a smaller increase, as for usual weekly drinking amount) in the BMI group. The proportion of at-risk users for risky single occasion drinking (RSOD) decreased from 57% at baseline to 50.6%, i.e. a 6.4 percentage-point decrease in the BMI group, while there was only a 0.6 percentage-point decrease (from 57.5% to 56.9%) in the control group. Moreover, the study suggested possible crossover effects for other substances such as tobacco smoking and cannabis use. Despite these encouraging and consistent positive findings, none reached significance at conventional levels (p < 0.05). DISCUSSION - The data suggest a beneficial impact of BMI on alcohol use outcomes and a potential effect on other substance use in 19-year-old men attending army recruitment and showing up voluntarily for BMI. As the main aim was to implement and test the feasibility of conducting BMI in this setting, the study was not powered to detect effects, and none of our findings reached statistical significance. The consistency of findings across measures and substances, however, raises hope that non-significance in the present study does not mean no effect, but mainly reflects the insufficient power of this pilot study. [Authors]

Relevance: 10.00%

Abstract:

Satellite remote sensing imagery is used for forestry, conservation and environmental applications, but insufficient spatial resolution and, in particular, the unavailability of images at the precise timing required for a given application often prevent reaching a fully operational stage. Airborne remote sensing has the advantage of custom-tuned sensors, resolution and timing, but its cost prevents its use as a routine technique in these fields. Some unmanned aerial vehicles might provide a "third way": low-cost techniques for acquiring remotely sensed information under close control of the end-user, albeit at the expense of lower-quality instrumentation and greater instability. This report evaluates a light remote sensing system based on a remotely controlled mini-UAV (ATMOS-3) equipped with a color infra-red camera (VEGCAM-1), designed and operated by CATUAV. We conducted a testing mission over a Mediterranean landscape dominated by an evergreen woodland of Aleppo pine (Pinus halepensis) and holm oak (Quercus ilex) in the Montseny National Park (Catalonia, NE Spain). We took advantage of state-of-the-art ortho-rectified digital aerial imagery (acquired by the Institut Cartogràfic de Catalunya over the area during the previous year) and used it as a quality reference. In particular, we paid attention to: 1) operationality of flight and image acquisition according to a previously defined plan; 2) radiometric and geometric quality of the images; and 3) operational use of the images in the context of applications. We conclude that the system has achieved an operational stage regarding flight activities, although with meteorological limits set by wind speed and turbulence. Appropriate landing areas can sometimes also be limiting, but the system is able to land on small and relatively rough terrain such as patches of grassland or short matorral, and we have operated the UAV as far as 7 km from the control unit. Radiometric quality is sufficient for interactive analysis, but probably insufficient for automated processing; a forthcoming camera is expected to greatly improve radiometric quality and consistency. Conventional GPS positioning through time synchronization provides coarse orientation of the images, with no roll information.

Relevance: 10.00%

Abstract:

This project studied the relationship between changes in the surface temperatures of the Atlantic Ocean and changes in atmospheric circulation during the 20th century. Two study periods were analysed: the first from 1940 to 1960 and the second from 1980 to 2000. Special attention was paid to sea surface temperature anomalies in the tropical region of the Atlantic Ocean and their possible connection with observed and predicted climate changes. To carry out the study, a series of experiments was performed using the climate model developed at UCLA (the UCLA-AGCM model). The results obtained were analysed as maps and figures for each study variable, and were also compared with results from other published work on the same research topic. The results are very broad and admit several interpretations. Nevertheless, some of the conclusions reached are as follows: the most significant differences between the two study periods in the variables examined, based on the model results, occur in the winter months and in the tropics, specifically over parts of northern South America and parts of northern Africa. Significant changes in precipitation patterns were also found over these same zones. A northward shift of the Intertropical Convergence Zone was also observed, possibly due to the anomalous gradient found in the equatorial sea surface temperatures. Even so, a broader and deeper study would be needed for a definitive discussion and conclusions about the results of the experiments.

Relevance: 10.00%

Abstract:

As part of a project to use the long-lived (T1/2 = 1200 a) ¹⁶⁶ᵐHo as a reference source in its reference ionisation chamber, IRA standardised a commercially acquired solution of this nuclide using the 4πβ-γ coincidence and 4πγ (NaI) methods. The ¹⁶⁶ᵐHo solution supplied by Isotope Product Laboratories was measured to have about 5% europium impurities (3% ¹⁵⁴Eu, 0.94% ¹⁵²Eu and 0.9% ¹⁵⁵Eu). Holmium therefore had to be separated from europium, which was carried out by means of ion-exchange chromatography. The holmium fractions were collected without europium contamination: 162-h-long HPGe gamma measurements indicated no europium impurity (detection limits of 0.01% for ¹⁵²Eu and ¹⁵⁴Eu, and 0.03% for ¹⁵⁵Eu). The primary measurement of the purified ¹⁶⁶ᵐHo solution with the 4π (PC) β-γ coincidence technique was carried out at three gamma energy settings: a window around the 184.4 keV peak and gamma thresholds at 121.8 and 637.3 keV. The results show very good self-consistency, and the activity concentration of the solution was evaluated to be 45.640 ± 0.098 kBq/g (0.21% with k = 1). The activity concentration of this solution was also measured by integral counting with a well-type 5″ × 5″ NaI(Tl) detector, with efficiencies computed by Monte Carlo simulations using the GEANT code. These measurements were mutually consistent, and the resulting weighted average of the 4π NaI(Tl) method was found to agree within 0.15% with the result of the 4πβ-γ coincidence technique. An ampoule of this solution and the measured value of the concentration were submitted to the BIPM as a contribution to the Système International de Référence.
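For reference, the 4πβ-γ coincidence technique rests on the classical coincidence relation: in the idealised case (neglecting dead time, angular correlation and decay-scheme corrections), the activity follows from the three observed rates alone:

```latex
% With activity N_0, beta-channel efficiency eps_beta and gamma-channel
% efficiency eps_gamma, the singles and coincidence rates are
N_\beta = N_0\,\varepsilon_\beta,\qquad
N_\gamma = N_0\,\varepsilon_\gamma,\qquad
N_c = N_0\,\varepsilon_\beta\,\varepsilon_\gamma ,
% so the efficiencies cancel in the estimator
N_0 = \frac{N_\beta\,N_\gamma}{N_c}.
```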

Relevance: 10.00%

Abstract:

Cultured primary fetal cells from a single organ donation could meet the exacting and stringent technical requirements for the development of therapeutic products. These cell types have fewer technological limitations in terms of proliferation capacity (simple culture conditions) and maintenance of a differentiated phenotype, and they also carry a low probability of transmitting communicable diseases. Master and Working Cell Banks (MCB, WCB) can be obtained from one fetal organ donation, permitting multiple tissues (skin, bone, cartilage, muscle and intervertebral disc) to be processed in short periods of time with identical methods, assuring stringent tracing of the processes for the production of standardized therapeutic agents. Clinical use of biologics from embryonic and fetal tissues is relatively new, and current legislation and ethical rules still differ between countries. In addition, specific cell delivery systems can be adapted to the clinical application for each tissue type. Since the intention is for banked primary fetal cells to enable the prospective treatment of hundreds of thousands of patients from only one organ donation, it is imperative to show consistency, traceability and safety of the processes, including donor tissue selection, cell banking, cell testing and the scaling-up of cell growth for the preparation of bio-engineered products for clinical application.