894 results for probability of error


Relevance:

90.00%

Publisher:

Abstract:

Previous research on schadenfreude has focused on the factors that elicit pleasure in another's misfortune. The present research investigates the impact of schadenfreude on decision making. Two studies (one in the laboratory and one in the field) address the impact of schadenfreude on past and future decisions involving sporting events. The first study contrasts feelings of pride in a victory by the favorite team with schadenfreude feelings about a loss by a rival team. The results showed that people preferred to send news about the favorite team's victory (pride) rather than the rival team's loss (schadenfreude) when the score difference in the match was small (for example: favorite team 1 x 0 other, versus rival team 0 x 1 favorite). However, people were more likely to make the schadenfreude choice (for example, choosing to send a news item about a rival team's defeat) when the score difference was large (for example, favorite team 5 x 0 rival, versus rival team 0 x 5 favorite). The second study, conducted in the field, examines how schadenfreude influences the willingness to bet against a rival team. To address this question, the participant's team preference is assessed (participants who support the target team versus those who support the rival). A praise manipulation is added, such that consumers either do or do not see praise for the target team while they are placing a bet on the outcome of the match. The results show that fans of the target team were not influenced by the praise manipulation. However, fans of the rival team increased their probability of betting against the target team (that is, they displayed schadenfreude-related behavior) when that team was praised before the match.

Relevance:

90.00%

Publisher:

Abstract:

A central question in political economy is how to incentivize elected officials to allocate resources to those who need them most. Research has shown that, while electoral incentives lead central governments to transfer fewer funds to non-aligned constituencies, media presence is instrumental in promoting a better allocation of resources. This study evaluates how these two phenomena interact by analyzing the role of media in compensating political biases. In particular, we analyze how media presence, connectivity and ownership affect the distribution of federal drought relief transfers to Brazilian municipalities. We find that municipalities that are not aligned with the federal government have a lower probability of receiving funds conditional on experiencing low precipitation. However, we show that the presence of radio stations compensates for this bias. This effect is driven by municipalities that have radio stations connected to a regional network rather than by the presence of local radio stations. In addition, the effect of network-connected radio stations increases with their network coverage. These findings suggest that the connection of a radio station to a network is important because it increases the salience of disasters, making it harder for the federal government to ignore non-allies. We show that our findings are not explained by the ownership and manipulation of media by politicians.

Relevance:

90.00%

Publisher:

Abstract:

A model of external crisis is developed focusing on the interaction between liquidity creation by financial intermediaries and foreign exchange collapses. The intermediaries' role of transforming maturities is shown to result in larger movements of capital and a higher probability of crisis. This resembles the observed cycle in capital flows: large inflows, crisis and abrupt outflows. The model highlights how adverse productivity and international interest rate shocks can be magnified by the behavior of individual foreign investors linked together through their deposits in the intermediaries. An eventual collapse of the exchange rate can link investors' behavior even further. The basic model is then extended, quite naturally, to study the effects of capital flow contagion between countries.

Relevance:

90.00%

Publisher:

Abstract:

A model of overlapping generations in continuous time is developed. Individuals pass through two distinct periods during their lifetimes. During the first, they work, save and have a death probability equal to zero. During the second, starting T periods after birth, their probability of death changes to p and they retire. The capital stock and stationary-state income are calculated for two situations: in the first, people live off their accumulated capital after retirement; in the second, they live off a state transfer payment financed through an income tax. To simplify matters, in this preliminary version, it is supposed that there is no population growth and that the instantaneous elasticity of substitution of consumption is unitary.
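A minimal LaTeX sketch of the mortality structure described in the abstract (zero death hazard before age T, constant hazard p afterwards); the survival function S(a) is inferred from that description, not a formula taken from the paper:

```latex
% Survival probability to age a, assuming the hazard is 0 before T and p after T:
S(a) =
\begin{cases}
  1, & 0 \le a \le T, \\
  e^{-p\,(a - T)}, & a > T,
\end{cases}
\qquad
\text{so the expected remaining lifetime at } a = T \text{ is } \tfrac{1}{p}.
```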

Relevance:

90.00%

Publisher:

Abstract:

This paper studies the political viability of free trade agreements (FTAs). The key element of the analysis is the “rent dissipation” that these arrangements induce: by eliminating intra-bloc trade barriers, an FTA reduces the incentives of the local firms to lobby for higher external tariffs, thereby causing a reduction of the rents created in the lobbying process. The prospect of rent dissipation moderates the governments’ willingness to participate in FTAs; they will support only arrangements that are “substantially” welfare improving, and no FTA that reduces welfare. Rent dissipation also implies that the prospects of political turnover may create strategic reasons for the formation of FTAs. Specifically, a government facing a high enough probability of losing power may want to form a trade bloc simply to “tie the hands” of its successor. An FTA can affect the likelihood of political turnover as well. If the incumbent party has a known bias toward special interests, it may want to commit to less distortionary policies in order to reduce its electoral disadvantage; the rent dissipation effect ensures that an FTA can serve as the vehicle for such a commitment. In nascent/unstable democracies, the incumbent government can use a free trade agreement also to reduce the likelihood of a dictatorial takeover and to “consolidate” democracy – a finding that is consistent with the timing of numerous accessions to and formations of preferential arrangements.

Relevance:

90.00%

Publisher:

Abstract:

We use a Regression Discontinuity Design (RDD) to estimate the causal effect of the Municipal Participation Fund (Fundo de Participação dos Municípios, FPM) received by a municipality on the characteristics of neighboring municipalities, covering a variety of topics: public finance, education, health and electoral outcomes. We exploit the rule that generates exogenous variation in the transfer for municipalities close to the discontinuities in the fund's allocation across population brackets. Our main contribution is to estimate, separately and jointly, the spillover effect and the direct effect of the FPM, considering cases where both neighboring municipalities, or only one of them, are close to a bracket change. In this way, we can better understand the interaction between neighboring municipalities when there is a correlation in the probability of receiving a federal transfer. We show that the estimated direct effect of the FPM on local spending falls by about 20% when we control for the neighbor's spillover, which is generally positive, with the exception of spending on health and sanitation. We estimate a positive effect of the transfer on Prova Brasil test scores and school pass rates in neighboring municipalities and in the state-run primary education network. On the other hand, the receipt of FPM by small-population neighboring municipalities reduces the provision of health goods and services in nearby, larger cities, which may occur due to a reduction in the demand for health services. The worsening of some aggregate health indicators suggests, however, that there may be coordination problems leading mayors to hold back their health spending. Indeed, when we control for the margin of victory in municipal elections and consider only neighboring cities with mayors from different parties, the spillover effect is larger in magnitude, indicating that political incentives are important in explaining the underprovision of health services, on the one hand, and the increased provision of education goods, on the other. We also find a positive effect of the FPM on votes for the federal government's party in municipal and national elections, and a large part of this effect is explained by the FPM spillover from neighboring cities, showing that cities that are economically dependent on the federal government become the basis of that government's political support. Finally, we find an ambiguous effect of the revenue increase due to the FPM on electoral competition in municipal elections, with a fall in the winner's margin of victory and a reduction in the number of candidates, which may be explained by an increase in the fixed cost of local campaigns.
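A minimal sketch of the kind of local-linear RD estimation implied above, assuming a hypothetical pandas data frame with columns pop (population), cutoff (the relevant FPM bracket threshold) and outcome; the column names, bandwidth and specification are illustrative, not the thesis's actual estimator:

```python
import pandas as pd
import statsmodels.formula.api as smf

def rd_effect(df: pd.DataFrame, bandwidth: float) -> float:
    """Local-linear RD estimate of the jump in `outcome` at the FPM
    population cutoff, using a symmetric bandwidth around the threshold."""
    d = df.assign(dist=df["pop"] - df["cutoff"])      # centered running variable
    d = d[d["dist"].abs() <= bandwidth].copy()        # keep observations near the cutoff
    d["above"] = (d["dist"] >= 0).astype(int)         # indicator: above the bracket change
    # Separate slopes on each side of the cutoff; the coefficient on `above`
    # is the estimated discontinuity (direct or spillover effect, depending on
    # whether `outcome` refers to the municipality itself or to its neighbor).
    fit = smf.ols("outcome ~ above + dist + above:dist", data=d).fit(cov_type="HC1")
    return fit.params["above"]
```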

Relevance:

90.00%

Publisher:

Abstract:

This paper employs mechanism design to study the effects of imperfect legal enforcement on the optimal scale of projects, borrowing interest rates and the probability of default. The analysis starts from an environment that combines asymmetric information about cash flows and limited commitment by borrowers. The incentive for repayment comes from the possibility of liquidation of projects by a court, but courts are costly and may fail to liquidate. The value of liquidated assets can be used as collateral: it is transferred to the lender when courts liquidate. Examples reveal that costly use of courts may be optimal, which contrasts with results from most limited-commitment models, where punishments are just threats, never applied in optimal arrangements. I show that when voluntary liquidation is allowed, both asymmetric information and uncertainty about courts are necessary conditions for legal punishments ever to be applied. Numerical solutions for several parametric specifications are presented, allowing for heterogeneity in initial wealth and in the variability of project returns. In all such solutions, wealthier individuals borrow at lower interest rates and run larger-scale enterprises, which is consistent with stylized facts. The reliability of courts has a consistently positive effect on the scale of projects. However, its effect on interest rates is subtler and depends essentially on the degree of curvature of the production function. Numerical results also show that the possibility of collateral seizure allows for comovement of the interest rate and the probability of repayment.

Relevance:

90.00%

Publisher:

Abstract:

The synthetic control (SC) method has been recently proposed as an alternative method to estimate treatment effects in comparative case studies. Abadie et al. [2010] and Abadie et al. [2015] argue that one of the advantages of the SC method is that it imposes a data-driven process to select the comparison units, providing more transparency and less discretionary power to the researcher. However, an important limitation of the SC method is that it does not provide clear guidance on the choice of predictor variables used to estimate the SC weights. We show that this lack of specific guidance provides significant opportunities for the researcher to search for specifications with statistically significant results, undermining one of the main advantages of the method. Considering six alternative specifications commonly used in SC applications, we calculate in Monte Carlo simulations the probability of finding a statistically significant result at 5% in at least one specification. We find that this probability can be as high as 13% (23% for a 10% significance test) when there are 12 pre-intervention periods, and that it decays slowly with the number of pre-intervention periods. With 230 pre-intervention periods, this probability is still around 10% (18% for a 10% significance test). We show that the specification that uses the average pre-treatment outcome values to estimate the weights performed particularly badly in our simulations. However, the specification-searching problem remains relevant even when we do not consider this specification. We also show that this specification-searching problem is relevant in simulations with real datasets looking at placebo interventions in the Current Population Survey (CPS). In order to mitigate this problem, we propose a criterion to select among different SC specifications based on the prediction error of each specification in placebo estimations.
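A minimal sketch of the specification-search calculation described above, assuming independent placebo p-values under the null; placebo_pvalue is a placeholder, not the SC estimator used in the paper (whose specifications are correlated across each other, hence the lower 13% figure reported there):

```python
import numpy as np

rng = np.random.default_rng(0)

def placebo_pvalue(spec: int, rng) -> float:
    """Placeholder for one placebo SC estimation under specification `spec`.
    P-values are drawn uniformly, i.e. the null is true by construction; a real
    exercise would run the SC estimator on simulated or CPS placebo data."""
    return rng.uniform()

def prob_any_significant(n_specs: int = 6, alpha: float = 0.05,
                         n_sims: int = 10_000) -> float:
    """Monte Carlo probability of finding at least one specification
    significant at level `alpha` when `n_specs` specifications are tried."""
    hits = 0
    for _ in range(n_sims):
        pvals = [placebo_pvalue(s, rng) for s in range(n_specs)]
        hits += any(p < alpha for p in pvals)
    return hits / n_sims

print(prob_any_significant())  # about 0.26 with six independent tests
```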

Relevance:

90.00%

Publisher:

Abstract:

This thesis explores the possibility of directly detecting blackbody emission from Primordial Black Holes (PBHs). A PBH might form when a cosmological density fluctuation with wavenumber k, that was once stretched to scales much larger than the Hubble radius during inflation, reenters the Hubble radius at some later epoch. By modeling these fluctuations with a running-tilt power-law spectrum (n(k) = n0 + a1(k)n1 + a2(k)n2 + a3(k)n3, with n0 = 0.951, n1 = -0.055, and n2 and n3 unknown), each pair (n2, n3) gives a different n(k) curve with a maximum value (n+) located at some instant (t+). The (n+, t+) parameter space [(1.20, 10^-23 s) to (2.00, 10^9 s)] has t+ = 10^-23 s to 10^9 s and n+ = 1.20 to 2.00 in order to encompass the formation of PBHs in the mass range 10^15 g to 10^10 M☉ (from the ones exploding at present to the most massive known). It was evenly sampled: n+ every 0.02; t+ every order of magnitude. We thus have 41 × 33 = 1353 different cases. However, 820 of these (≈61%) are excluded (because they would provide a PBH population large enough to close the Universe) and we are left with 533 cases for further study. Although only sub-stellar PBHs (≲ 1 M☉) are hot enough to be detected at large distances, we studied PBHs with 10^15 g to 10^10 M☉ and determined how many might have formed and still exist in the Universe. Thus, for each of the 533 (n+, t+) pairs we determined the fraction of the Universe going into PBHs at each epoch, the PBH density parameter (Ω_PBH), the PBH number density (n_PBH), the total number of PBHs in the Universe (N), and the distance to the nearest one (d). As a first result, 14% of these (72 cases) give at least one PBH within the observable Universe, one-third being sub-stellar and the remainder evenly splitting into stellar, intermediate-mass and supermassive. Secondly, we found that the nearest stellar-mass PBH might be at 32 pc, while the nearest intermediate-mass and supermassive PBHs might be 100 and 1000 times farther, respectively. Finally, for 6% of the cases (four in 72) we might have sub-stellar-mass PBHs within 1 pc. One of these cases implies a population of 10^5 PBHs, with a mass of 10^18 g (similar to Halley's comet), within the Oort cloud, which means that the nearest PBH might be as close as 10^3 AU. Such a PBH could be directly detected with a probability of 10^-21 (cf. 10^-32 for low-energy neutrinos). We speculate on this possibility.
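A small sketch reproducing the grid count quoted above (41 values of n+ by 33 values of t+, giving 1353 cases); purely illustrative of the stated sampling, not code from the thesis:

```python
import numpy as np

# n+ from 1.20 to 2.00 in steps of 0.02; t+ from 1e-23 s to 1e9 s,
# one point per order of magnitude, as described in the abstract.
n_plus = np.round(np.arange(1.20, 2.00 + 1e-9, 0.02), 2)   # 41 values
t_plus = 10.0 ** np.arange(-23, 10)                         # 33 values (seconds)
print(len(n_plus), len(t_plus), len(n_plus) * len(t_plus))  # 41 33 1353
```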

Relevance:

90.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

90.00%

Publisher:

Abstract:

Problems of combinatorial optimization have engaged a large number of researchers in the search for approximate solutions, since it is generally accepted that they are unsolvable in polynomial time. Initially, these solutions were focused on heuristics. Currently, metaheuristics are used more for this task, especially those based on evolutionary algorithms. The two main contributions of this work are: the creation of what is called an "Operon" heuristic for the construction of the information chains necessary for the implementation of transgenetic (evolutionary) algorithms, mainly using statistical methodology - Cluster Analysis and Principal Component Analysis; and the use of statistical analyses that are adequate for evaluating the performance of the algorithms developed to solve these problems. The aim of the Operon is to construct good-quality dynamic information chains to promote an "intelligent" search in the space of solutions. The Traveling Salesman Problem (TSP) is used as the application, based on a transgenetic algorithm known as ProtoG. A strategy is also proposed for the renewal of part of the chromosome population, triggered by adopting a minimum limit on the coefficient of variation of the individuals' fitness function, calculated over the population. Statistical methodology is used for the evaluation of the performance of four algorithms: the proposed ProtoG, two memetic algorithms and a Simulated Annealing algorithm. Three performance analyses of these algorithms are proposed. The first is accomplished through logistic regression, based on the probability of the algorithm under test finding an optimal solution for a TSP instance. The second is accomplished through survival analysis, based on the probability associated with the execution time observed until an optimal solution is achieved. The third is accomplished by means of a non-parametric analysis of variance, considering the Percent Error of the Solution (PES), defined as the percentage by which the solution found exceeds the best solution available in the literature. Six experiments were conducted on sixty-one instances of the Euclidean TSP with sizes of up to 1,655 cities. The first two experiments deal with the adjustment of four parameters used in the ProtoG algorithm in an attempt to improve its performance. The last four were undertaken to evaluate the performance of ProtoG in comparison with the three algorithms adopted. For these sixty-one instances, it was concluded on the grounds of statistical tests that there is evidence that ProtoG performs better than these three algorithms in fifty instances. In addition, for the thirty-six instances considered in the last three trials, in which the performance of the algorithms was evaluated through the PES, it was observed that the average PES obtained with ProtoG was less than 1% in almost half of these instances, reaching its largest average, equal to 3.52%, for one instance of 1,173 cities. Therefore, ProtoG can be considered a competitive algorithm for solving the TSP, since it is not rare in the literature for average PES values greater than 10% to be reported for instances of this size.
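A minimal sketch of the PES metric described above; tour_length and best_known in the usage example are illustrative inputs, not values taken from the thesis:

```python
def percent_error_of_solution(tour_length: float, best_known: float) -> float:
    """Percent Error of the Solution (PES): percentage by which the tour
    found exceeds the best solution reported in the literature."""
    return 100.0 * (tour_length - best_known) / best_known

# Hypothetical example: an instance whose best-known tour length is 56,892
# and a returned tour of 58,894 gives a PES of about 3.52%.
print(percent_error_of_solution(58_894, 56_892))
```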

Relevance:

90.00%

Publisher:

Abstract:

In this work, a performance analysis of transmission schemes employing turbo trellis-coded modulation is presented. In general, the performance analysis of such schemes is guided by evaluating their error probability. The exact evaluation of this probability is very complex and inefficient from the computational point of view; a widely used alternative is the union bound on the error probability, because it is easy to implement and computationally produces bounds that converge quickly. Since it is a union bound, some elements of the distance spectrum should be expurgated to obtain a tight bound. The main contribution of this work is that the proposed enumeration is carried out from puncturing at the symbol level rather than at the bit level, as in most works in the literature. The main reason for using symbol-level puncturing lies in the fact that the enumerating function of the turbo scheme is obtained directly from the complex signal sequences through the trellis, and not indirectly from binary sequences that would require a further binary-to-complex mapping, as proposed in previous works. Thus, matrix algorithms can be applied to the adjacency matrix, which is obtained by calculating the distances of the complex sequences of the trellis. This work also presents two matrix algorithms, for state reduction and for the evaluation of the corresponding transfer function. The results, comparing the bounds obtained with the proposed technique against some turbo codes from the literature, corroborate the claim of this work that the expurgated bounds are quite tight and that the matrix algorithms are easily implemented in any programming language.
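A minimal sketch of the kind of union bound the abstract refers to, assuming an AWGN channel and a given squared-Euclidean distance spectrum; the pairwise error term Q(sqrt(d^2/(2*N0))) is the standard textbook form for maximum-likelihood detection, not necessarily the exact expression used in the work, and the example spectrum is illustrative:

```python
import math

def q_function(x: float) -> float:
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def union_bound(distance_spectrum: dict, n0: float) -> float:
    """Union bound on the error-event probability over an AWGN channel:
    sum, over the (possibly expurgated) distance spectrum, of the multiplicity
    times the pairwise error probability. `distance_spectrum` maps squared
    Euclidean distance d2 to its multiplicity."""
    return sum(mult * q_function(math.sqrt(d2 / (2.0 * n0)))
               for d2, mult in distance_spectrum.items())

# Illustrative spectrum (not from the thesis): three nearest error events.
spectrum = {4.0: 1, 6.0: 4, 8.0: 12}
print(union_bound(spectrum, n0=0.5))
```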

Relevance:

90.00%

Publisher:

Abstract:

In the absence of selective availability, which was turned off on May 1, 2000, the ionosphere can be the largest source of error in GPS positioning and navigation. Its effects on the GPS observables cause code delays and phase advances. The magnitude of this error is affected by the local time of day, season, solar cycle, geographical location of the receiver and the Earth's magnetic field. As is well known, the ionosphere is the main drawback for high-accuracy positioning with single-frequency receivers, whether for point positioning or for relative positioning of medium and long baselines. The ionospheric effects were investigated for point positioning and relative positioning using single-frequency data. A model represented by a Fourier series was implemented and its parameters were estimated from data collected at the active stations of the RBMC (Brazilian Network for Continuous Monitoring of GPS satellites). The input data were the pseudorange observables filtered by the carrier phase. Quality control was implemented in order to analyse the adjustment and to validate the significance of the estimated parameters. Experiments were carried out in the equatorial region, using data collected from dual-frequency receivers. In order to validate the model, the estimated values were compared with ground truth. For point positioning and for relative positioning of baselines of approximately 100 km, the discrepancies indicated error reductions better than 80% and 50%, respectively, compared with processing without the ionospheric model. These results indicate that more research needs to be done in order to support L1 GPS users in the equatorial region.
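A minimal sketch of fitting a Fourier-series-type model of ionospheric delay as a function of local time by least squares; the number of harmonics, the 24-hour period and the input arrays are assumptions for illustration, not the parameterization actually estimated from the RBMC data:

```python
import numpy as np

def fourier_design_matrix(local_time_h: np.ndarray, n_harmonics: int = 4) -> np.ndarray:
    """Design matrix with a constant term plus sine/cosine harmonics of a
    24-hour period, evaluated at the given local times (in hours)."""
    omega = 2.0 * np.pi / 24.0
    cols = [np.ones_like(local_time_h)]
    for k in range(1, n_harmonics + 1):
        cols.append(np.cos(k * omega * local_time_h))
        cols.append(np.sin(k * omega * local_time_h))
    return np.column_stack(cols)

def fit_ionospheric_delay(local_time_h: np.ndarray, delay_m: np.ndarray,
                          n_harmonics: int = 4) -> np.ndarray:
    """Least-squares estimate of the Fourier coefficients from observed
    ionospheric delays (e.g. derived from carrier-phase-filtered pseudoranges)."""
    a = fourier_design_matrix(local_time_h, n_harmonics)
    coeffs, *_ = np.linalg.lstsq(a, delay_m, rcond=None)
    return coeffs
```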

Relevance:

90.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)