Abstract:
To study the macroeconomic effects of unconventional monetary policy across the different countries of the eurozone, I develop an identification scheme to disentangle conventional from non-conventional policy shocks, using futures contracts on overnight interest rates and the size of the European Central Bank balance sheet. Setting these shocks as endogenous variables in a structural vector autoregressive (SVAR) model, along with the CPI and the employment rate, I study the estimated impulse response functions of the macroeconomic variables to policy shocks. I find that unconventional policy shocks generated mixed effects on inflation but had a positive impact on employment, with the exception of Portugal, Spain, Greece and Italy, where the employment response is close to zero or negative. The heterogeneity that characterizes the responses shows that the monetary policy measures taken in recent years were not sufficient to stabilize the economies of the eurozone countries under more severe economic conditions.
Abstract:
Master's dissertation in Biophysics and Bionanosystems.
Abstract:
The main object of the present paper consists in giving formulas and methods which enable us to determine the minimum number of repetitions or of individuals necessary to guarantee, to some extent, the success of an experiment. The theoretical basis of all processes consists essentially in the following. Knowing the frequency of the desired events p and of the non-desired events q, we may calculate the frequency of all possible combinations to be expected in n repetitions by expanding the binomial (p + q)^n. Determining which of these combinations we want to avoid, we calculate their total frequency, selecting the value of the exponent n of the binomial in such a way that this total frequency is equal to or smaller than the accepted limit of precision:

n! p^n [ (1/n!)(q/p)^n + (1/(1!(n-1)!))(q/p)^(n-1) + (1/(2!(n-2)!))(q/p)^(n-2) + (1/(3!(n-3)!))(q/p)^(n-3) + ... ] <= P(lim) ----------(1b)

There does not exist an absolute limit of precision, since its value depends not only upon psychological factors in our judgement but is at the same time a function of the number of repetitions. For this reason I have proposed (1, 56) two relative values, one equal to 1-5n as the lowest value of probability and the other equal to 1-10n as the highest value of improbability, leaving between them what may be called the "region of doubt". However, these formulas cannot be applied in our case, since the number n is just the unknown quantity. Thus we have to use, instead of the more exact values of these two formulas, the conventional limits of P(lim) equal to 0.05 (precision 5%), to 0.01 (precision 1%), and to 0.001 (precision 0.1%). The binomial formula as explained above (cf. formula 1, pg. 85), however, is of rather limited applicability owing to the excessive calculation necessary, and we thus have to procure approximations as substitutes.
We may use, without loss of precision, the following approximations: a) the normal or Gaussian distribution, when the expected frequency p has any value between 0.1 and 0.9 and when n is at least superior to ten; b) the Poisson distribution, when the expected frequency p is smaller than 0.1. Tables V to VII show for some special cases that these approximations are very satisfactory. The practical solution of the following problems, stated in the introduction, can now be given:

A) What is the minimum number of repetitions necessary in order to avoid that any one of a treatments, varieties, etc. may be accidentally always the best, or the best and second best, or the first, second and third best, or finally one of the m best treatments, varieties, etc.? Using the first term of the binomial, we have the following equation for n:

n = log P(lim) / log (m/a) = log P(lim) / (log m - log a) ----------(5)

B) What is the minimum number of individuals necessary in order that a certain type, expected with the frequency p, may appear in at least one, two, three or a = m + 1 individuals?

1) For p between 0.1 and 0.9, and using the Gaussian approximation, we have:

np - δ √(np(1-p)) = a - 1 = m, with b = δ √((1-p)/p) and c = m/p ----------(7)

√n = [b + √(b² + 4c)] / 2; n' = 1/p; n(cor) = n + n' ----------(8)

We have to use the correction n' when p has a value between 0.25 and 0.75. The Greek letter delta represents in the present case the unilateral limits of the Gaussian distribution for the three conventional limits of precision: 1.64, 2.33 and 3.09, respectively. If we are only interested in having at least one individual, and m becomes equal to zero, the formula reduces to:

c = 0 for a = 1; √n = b, i.e. n = b² = δ² (1-p)/p; n' = 1/p; n(cor) = n + n' ----------(9)

2) If p is smaller than 0.1 we may use table 1 in order to find the mean m of a Poisson distribution and determine n = m/p.

C) Which is the minimum number of individuals necessary for distinguishing two frequencies p1 and p2?
1) When p1 and p2 are values between 0.1 and 0.9 we have:

n = δ² [√(p1(1-p1)) + √(p2(1-p2))]² / (p1 - p2)²; n' = 1/(p1 - p2); n(cor) = n + n' ----------(13)

We have again to use the unilateral limits of the Gaussian distribution. The correction n' should be used if at least one of the values p1 or p2 has a value between 0.25 and 0.75. A more complicated formula may be used in cases where we want to increase the precision:

n(p1 - p2) - δ [√(p1(1-p1)) + √(p2(1-p2))] √n = m, with b = δ [√(p1(1-p1)) + √(p2(1-p2))] / (p1 - p2) and c = m/(p1 - p2); √n = [b + √(b² + 4c)] / 2; n' = 1/(p1 - p2) ----------(14)

2) When both p1 and p2 are smaller than 0.1 we determine the quotient (p1 : p2) and procure the corresponding number m2 of a Poisson distribution in table 2. The value n is found by the equation:

n = m2 / p2 ----------(15)

D) What is the minimum number necessary for distinguishing three or more frequencies p1, p2, p3? 1) If the frequencies p1, p2, p3 are values between 0.1 and 0.9, we have to solve the individual equations and use the highest value of n thus determined:

n(1.2) = δ² [√(p1(1-p1)) + √(p2(1-p2))]² / (p1 - p2)² ----------(16)

Delta represents now the bilateral limits of the Gaussian distribution: 1.96, 2.58, 3.29. 2) No table was prepared for the relatively rare cases of a comparison of three or more frequencies below 0.1, and in such cases extremely high numbers would be required.

E) A process is given which serves to solve two problems of informatory nature: a) if a special type appears in n individuals with a frequency p(obs), what may be the corresponding ideal value of p(esp); or b) if we study samples of n individuals and expect a certain type with a frequency p(esp), what may be the extreme limits of p(obs) in individual families? 1) If we are dealing with values between 0.1 and 0.9 we may use table 3.
To solve the first question we select the respective horizontal line for p(obs), determine which column corresponds to our value of n, and find the respective value of p(esp) by interpolating between columns. In order to solve the second problem we start with the respective column for p(esp) and find the horizontal line for the given value of n, either directly or by approximation and interpolation. 2) For frequencies smaller than 0.1 we have to use table 4 and transform the fractions p(esp) and p(obs) into numbers of the Poisson series by multiplication with n. In order to solve the first problem, we verify in which line the lower Poisson limit is equal to m(obs) and transform the corresponding value of m into the frequency p(esp) by dividing through n. The observed frequency may thus be a chance deviate of any value between 0.0... and the value given by dividing the value of m in the table by n. In the second case we transform first the expectation p(esp) into a value of m and procure, in the horizontal line corresponding to m(esp), the extreme values of m, which then must be transformed, by dividing through n, into values of p(obs).

F) Partial and progressive tests may be recommended in all cases where there is lack of material, or where the loss of time is less important than the cost of large-scale experiments, since in many cases the minimum number necessary to guarantee the results within the limits of precision is rather large. One should not forget that the minimum number really represents at the same time a maximum number, necessary only if one takes into consideration essentially the unfavorable variations; smaller numbers may frequently already give satisfactory results. For instance, by definition, we know that a frequency of p means that we expect one individual in every total of 1/p. If there were no chance variations, this number 1/p would be sufficient, and if there were favorable variations a smaller number still may yield one individual of the desired type.
Thus, trusting to luck, one may start the experiment with numbers smaller than the minimum calculated according to the formulas given above, and increase the total until the desired result is obtained, which may well be before the "minimum number" is reached. Some concrete examples of this partial or progressive procedure are given from our genetical experiments with maize.
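The calculations behind formulas (5) and (7)-(9) can be sketched in a few lines of Python. This is a minimal illustrative reconstruction under the readings of the formulas given above; the function names and default arguments are invented for the example.

```python
from math import ceil, log, sqrt

# One-sided normal deviates for the three conventional limits of precision.
DELTA = {0.05: 1.64, 0.01: 2.33, 0.001: 3.09}

def min_repetitions(a, m, p_lim=0.05):
    """Problem A (formula 5): smallest n such that the chance that one of
    `a` treatments accidentally ranks among the `m` best in all n
    repetitions, (m/a)**n, stays below p_lim."""
    return ceil(log(p_lim) / log(m / a))

def min_individuals(p, a=1, p_lim=0.05):
    """Problem B (formulas 7-9): smallest n such that a type expected with
    frequency p appears in at least `a` individuals, using the Gaussian
    approximation (valid for 0.1 < p < 0.9)."""
    delta = DELTA[p_lim]
    m = a - 1
    b = delta * sqrt((1 - p) / p)
    c = m / p
    n = ((b + sqrt(b * b + 4 * c)) / 2) ** 2   # solves n - b*sqrt(n) - c = 0
    if 0.25 <= p <= 0.75:                      # correction n' = 1/p
        n += 1 / p
    return ceil(n)

print(min_repetitions(2, 1))   # → 5  (since 0.5**5 ≈ 0.031 < 0.05)
print(min_individuals(0.25))   # → 13
```

The quadratic in √n comes directly from dividing formula (7) by p, which is why b and c appear as defined above.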
Abstract:
We review several results concerning the long-time asymptotics of nonlinear diffusion models based on entropy and mass transport methods. Semidiscretizations of these nonlinear diffusion models are proposed and their numerical properties analysed. We demonstrate the long-time asymptotic results by numerical simulation and we discuss several open problems based on these numerical results. We show that for general nonlinear diffusion equations the long-time asymptotics can be characterized in terms of fixed points of certain maps which are contractions for the Euclidean Wasserstein distance. In fact, we propose a new scaling for which we can prove that this family of fixed points converges to the Barenblatt solution for perturbations of homogeneous nonlinearities for values close to zero.
Abstract:
This paper examines competition in a spatial model of two-candidate elections, where one candidate enjoys a quality advantage over the other candidate. The candidates care about winning and also have policy preferences. There is two-dimensional private information. Candidate ideal points as well as their tradeoffs between policy preferences and winning are private information. The distribution of this two-dimensional type is common knowledge. The location of the median voter's ideal point is uncertain, with a distribution that is commonly known by both candidates. Pure strategy equilibria always exist in this model. We characterize the effects of increased uncertainty about the median voter, the effect of candidate policy preferences, and the effects of changes in the distribution of private information. We prove that the distribution of candidate policies approaches the mixed equilibrium of Aragones and Palfrey (2002a), when both candidates' weights on policy preferences go to zero.
Abstract:
The determination of blood lipase has been proposed by SEABRA as a method for detecting predisposition to initial or subsequent stages of tuberculosis; normal subjects having high titers (8-12 units), tuberculous patients low ones (5-7), falling to zero in advanced stages of the disease. An assay of the method has been made by the authors in sera of 238 non-tuberculous subjects (419 tests) and 207 tuberculous ones (456 tests), following the technical procedures described by SEABRA. All of them had their roentgenograms taken on the same day as blood collection. Factors interfering with blood lipase values in tuberculosis are discussed. A relationship between the course of the disease and the serum lipase could not be confirmed. High and low values were found in initial as well as in advanced cases. Our results are in agreement with those recorded in the literature (Figs. I and II). It seems that the general condition, rather than the pulmonary lesions, is responsible for the blood lipase values. There was no direct relationship between blood lipase titer and severity of pulmonary tuberculosis; the data presented in this paper thus do not agree with such a correlation as stated by SEABRA, FERNANDES and VICENTE.
Abstract:
The presence of subcentres cannot be captured by an exponential function. Cubic spline functions seem more appropriate to depict the polycentricity pattern of modern urban systems. Using data from the Barcelona Metropolitan Region, two possible population subcentre delimitation procedures are discussed: one takes an estimated derivative equal to zero, the other a density gradient equal to zero. It is argued that, in using a cubic spline function, a delimitation strategy based on derivatives is more appropriate than one based on gradients, because the estimated density can be negative in sections with very low densities and few observations, leading to sudden changes in estimated gradients. It is also argued that using as a criterion for subcentre delimitation a second derivative with value zero allows us to capture a more restricted subcentre area than using a first derivative equal to zero. This methodology can also be used for intermediate ring delimitation.
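The derivative-based delimitation criterion can be illustrated with a small sketch. The density profile below is hypothetical (the Barcelona data are not reproduced here), and a single least-squares cubic segment stands in for one piece of the cubic spline used in the paper.

```python
import numpy as np

# Hypothetical density-by-distance profile: an exponential CBD decline
# plus a secondary bump, loosely mimicking a polycentric pattern.
dist = np.linspace(0, 30, 31)                       # km from the CBD
dens = 8000 * np.exp(-0.15 * dist) + 2500 * np.exp(-0.5 * (dist - 18) ** 2)

# Fit one cubic segment over the zone around the suspected subcentre.
seg = (dist >= 12) & (dist <= 24)
coefs = np.polynomial.polynomial.polyfit(dist[seg], dens[seg], 3)
poly = np.polynomial.Polynomial(coefs)

# Derivative-based delimitation: a subcentre candidate is a point where
# the first derivative is zero and the second derivative is negative.
roots = poly.deriv(1).roots()
peaks = [r.real for r in roots
         if abs(r.imag) < 1e-9 and 12 <= r.real <= 24
         and poly.deriv(2)(r.real) < 0]
print(peaks)   # one candidate peak inside the fitted segment
```

The same root-finding applied to the second derivative would delimit the narrower subcentre area the abstract refers to.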
Abstract:
The paper sets out a one sector growth model with a neoclassical production function in land and a capital-labour aggregate. Capital accumulates through capitalist saving, the labour supply is infinitely elastic at a subsistence wage and all factors may experience factor augmenting technical progress. The main result is that, if the elasticity of substitution between land and the capital-labour aggregate is less than one and if the rate of capital augmenting technical progress is strictly positive, then the rate of profit will fall to zero. The surprise is that this result holds regardless of the rate of land augmenting technical progress; that is, no amount of technical advance in agriculture can stop the fall in the rate of profit. The paper also discusses the relation of this result to the classical and Marxist literature and sets out the path of the relative price of land.
Abstract:
Proponents of proportional electoral rules often argue that majority rule depresses turnout and may lower welfare due to the 'tyranny of the majority' problem. The present paper studies the impact of electoral rules on turnout and social welfare. We analyze a model of instrumental voting where citizens have private information over their individual cost of voting and over the alternative they prefer. The electoral rule used to select the winning alternative is a combination of majority rule and proportional rule. Results show that the above arguments against majority rule do not hold in this setup. Social welfare and turnout increase with the weight that the electoral rule gives to majority rule when the electorate is expected to be split, and they are independent of the electoral rule employed when the expected size of the minority group tends to zero. However, more proportional rules can increase turnout within the minority group. This effect is stronger the smaller the minority group. We then conclude that majority rule fosters overall turnout and increases social welfare, whereas proportional rule fosters the participation of minorities.
Abstract:
In a market in which sellers compete by posting mechanisms, we study how the properties of the meeting technology affect the mechanism that sellers select. In general, sellers have incentive to use mechanisms that are socially efficient. In our environment, sellers achieve this by posting an auction with a reserve price equal to their own valuation, along with a transfer that is paid by (or to) all buyers with whom the seller meets. However, we define a novel condition on meeting technologies, which we call invariance, and show that the transfer is equal to zero if and only if the meeting technology satisfies this condition.
Abstract:
In this paper, we forecast EU-area inflation with many predictors using time-varying parameter models. Time-varying parameter models are parameter-rich, and the time span of our data is relatively short; both facts motivate a desire for shrinkage. In constant coefficient regression models, the Bayesian Lasso is gaining increasing popularity as an effective tool for achieving such shrinkage. In this paper, we develop econometric methods for using the Bayesian Lasso with time-varying parameter models. Our approach allows the coefficient on each predictor to be: i) time varying, ii) constant over time or iii) shrunk to zero. The econometric methodology decides automatically which category each coefficient belongs in. Our empirical results indicate the benefits of such an approach.
Abstract:
This paper evaluates the effects of policy interventions on sectoral labour markets and the aggregate economy in a business cycle model with search and matching frictions. We extend the canonical model by including capital-skill complementarity in production, labour markets with skilled and unskilled workers and on-the-job learning (OJL) within and across skill types. We first find that the model does a good job of matching the cyclical properties of sectoral employment and the wage-skill premium. We next find that vacancy subsidies for skilled and unskilled jobs lead to output multipliers which are greater than unity with OJL and less than unity without OJL. In contrast, the positive output effects from cutting skilled and unskilled income taxes are close to zero. Finally, we find that the sectoral and aggregate effects of vacancy subsidies do not depend on whether they are financed via public debt or distorting taxes.
Abstract:
A large bibliographic survey provided data on Trypanosoma cruzi serology covering the period 1948-1984. Epidemiological-demographic methods provided an estimate of 11% for the prevalence of positive serology in Brazil by 1984. Significant temporal trends were observed for most of the Brazilian geographical regions, as well as for Brazil as a whole. The parabolic curve that fits best for the entire country indicates that by 1991 the incidence of new positive serology would be close to zero. This conclusion needs further fine adjustment, since the forecast point is somewhat distant from the measured period.
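The parabolic extrapolation can be illustrated as follows. The series below is synthetic, constructed purely for illustration so that the fitted parabola reaches zero in 1991; it does not reproduce the survey's actual serology data.

```python
import numpy as np

# Synthetic positive-serology series (illustrative only): a quadratic
# decline whose zero crossing is placed at 1991 by construction.
t = np.array([0, 5, 10, 15, 20, 25, 30, 34])        # years since 1950
prev = 0.0066 * (41.0 ** 2 - t ** 2)                # % positive serology

# Fit the degree-2 (parabolic) trend and extrapolate to its zero crossing,
# the same kind of forecast described in the abstract.
coefs = np.polyfit(t, prev, deg=2)
roots = np.roots(coefs)
zero_year = 1950 + max(r.real for r in roots if abs(r.imag) < 1e-9)
print(round(zero_year))   # → 1991
```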
Abstract:
We investigate the transition to synchronization in the Kuramoto model with bimodal distributions of the natural frequencies. Previous studies have concluded that the model exhibits a hysteretic phase transition if the bimodal distribution is close to a unimodal one, due to the shallowness of the central dip. Here we show that proximity to the unimodal-bimodal border does not necessarily imply hysteresis when the width, but not the depth, of the central dip tends to zero. We draw this conclusion from a detailed study of the Kuramoto model with a suitable family of bimodal distributions.
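A minimal numerical sketch of the setting, assuming a simple two-Gaussian bimodal frequency distribution rather than the specific family studied in the paper:

```python
import numpy as np

# Kuramoto model with a bimodal natural-frequency distribution:
# N oscillators, all-to-all coupling of strength K, Euler integration of
#   d(theta_i)/dt = omega_i + K*r*sin(psi - theta_i)
# (the mean-field form), where r*exp(i*psi) is the order parameter.
rng = np.random.default_rng(0)
N, K, dt, steps = 2000, 4.0, 0.01, 3000

omega = np.concatenate([rng.normal(-1.0, 0.3, N // 2),   # left peak
                        rng.normal(+1.0, 0.3, N // 2)])  # right peak
theta = rng.uniform(0.0, 2.0 * np.pi, N)

for _ in range(steps):
    z = np.exp(1j * theta).mean()        # complex order parameter
    theta += dt * (omega + K * np.abs(z) * np.sin(np.angle(z) - theta))

r = np.abs(np.exp(1j * theta).mean())
print(round(r, 2))   # large r (near 1) indicates synchronization at this K
```

Sweeping K up and down and recording r is the standard way to expose the hysteresis loop discussed in the abstract.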
Abstract:
Heavy domestic and peridomestic infestations of Triatoma infestans were controlled in two villages in southern Bolivia by the application of deltamethrin SC25 (2.5% suspension concentrate) at a target dose of 25 mg a.i./m². Actual applied dose was monitored by HPLC analysis of filter papers placed at various heights on the house walls, and was shown to range from 0 to 59.6 about a mean of 28.5 mg a.i./m². Wall bioassays showed high mortality of T. infestans during the first month after the application of deltamethrin. Mortality declined to zero as summer temperatures increased, but reappeared with the onset of the following winter. In contrast, knockdown was apparent throughout the trial, showing no discernible temperature dependence. House infestation rates, measured by manual sampling and use of paper sheets to collect bug faeces, declined from 79% at the beginning of the trial to zero at the 6 month evaluation. All but one of the houses were still free of T. infestans at the final evaluation 12 months after spraying, although a small number of bugs were found at this time in 5 of 355 peridomestic dependencies. Comparative cost studies endorse the recommendation of large-scale application of deltamethrin, or a pyrethroid of similar cost-effectiveness, as a means to eliminate domestic T. infestans populations in order to interrupt transmission of Chagas disease.