988 results for Gaussian assumption
Abstract:
Due to population aging, by 2030 Switzerland may face a demand of 24 million family practitioner visits, an increase of 13 percent over the 2005 level. This result is based on the assumption that per capita demand for doctor visits remains at the levels observed in 2005 by age group and sex. Over the same period, the total number of practitioners may decrease by 14 percent, whereas the proportion of female practitioners may double. These changes may cause a 33 percent decrease in the supply of physician visits, which would reach only 14 million. Comparing the demand for and supply of family doctor visits reveals that by 2030, 10 million visits may go unmet, which represents 40 percent of the demand. On the supply side, full-scale implementation of task delegation may partially reduce that gap (minus 2 million). On the demand side, improved health status may bring about a larger decrease in the need for visits (minus 4 million).
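To make the headline arithmetic explicit, here is a minimal sketch that reproduces the projected gap from the figures quoted in the abstract; the variable names are illustrative only.

```python
# Minimal sketch of the headline arithmetic reported in the abstract.
# All figures (in millions of visits) come directly from the text;
# variable names are illustrative only.

demand_2030 = 24.0   # projected demand for family practitioner visits
supply_2030 = 14.0   # projected supply after the 33% decline

gap = demand_2030 - supply_2030   # unmet visits
gap_share = gap / demand_2030     # share of demand left unmet

print(f"Unmet visits: {gap:.0f} million ({gap_share:.0%} of demand)")
# -> Unmet visits: 10 million (42% of demand), i.e. roughly the 40 percent quoted
```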
Abstract:
Empirical studies on the determinants of industrial location typically use variables measured at the available administrative level (municipalities, counties, etc.). However, this amounts to assuming that the effects these determinants may have on the location process do not extend beyond the geographical limits of the selected site. We address the validity of this assumption by comparing results from standard count data models with those obtained by calculating the geographical scope of the spatially varying explanatory variables using a wide range of distances and alternative spatial autocorrelation measures. Our results reject the usual practice of using administrative records as covariates without making some kind of spatial correction. Keywords: industrial location, count data models, spatial statistics. JEL classification: C25, C52, R11, R30.
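As a hedged illustration of the kind of comparison described above (not the authors' code), the sketch below fits a Poisson count-data model while re-aggregating a spatially varying covariate over increasingly wide distance bands; all data, distance cut-offs, and variable names are synthetic.

```python
# Hedged sketch: a Poisson count-data model in which a spatially varying covariate
# is re-aggregated over increasing distance bands, so its geographical scope is no
# longer confined to the administrative unit. Data and cut-offs are illustrative.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200                                    # municipalities
coords = rng.uniform(0, 100, size=(n, 2))  # illustrative coordinates (km)
x_local = rng.gamma(2.0, 1.0, size=n)      # covariate measured at the municipality
counts = rng.poisson(np.exp(0.2 + 0.3 * x_local))  # synthetic location counts

dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)

for cutoff in (0, 10, 25, 50):             # 0 km = standard "administrative" model
    weights = (dist <= cutoff).astype(float)
    np.fill_diagonal(weights, 1.0)
    x_scope = weights @ x_local / weights.sum(axis=1)  # average within the band
    X = sm.add_constant(x_scope)
    fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
    print(f"cutoff {cutoff:>2} km: beta = {fit.params[1]: .3f}, AIC = {fit.aic:.1f}")
```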
Abstract:
We prove rigidity and vanishing theorems for several holomorphic Euler characteristics on complex contact manifolds admitting holomorphic circle actions preserving the contact structure. Such vanishings are reminiscent of those of LeBrun and Salamon on Fano contact manifolds but under a symmetry assumption instead of a curvature condition.
Abstract:
When actuaries face the problem of pricing an insurance contract that contains different types of coverage, such as a motor insurance or homeowner's insurance policy, they usually assume that the types of claim are independent. However, this assumption may not be realistic: several studies have shown that there is a positive correlation between types of claim. Here we introduce different regression models in order to relax the independence assumption, including zero-inflated models to account for excess zeros and overdispersion. Multivariate Poisson models have been largely ignored to date, mainly because of their computational difficulties. Bayesian inference based on MCMC helps to solve this problem (and also lets us derive, for several quantities of interest, posterior summaries that account for uncertainty). Finally, these models are applied to an automobile insurance claims database with three different types of claims. We analyse the consequences for pure and loaded premiums when the independence assumption is relaxed by using different multivariate Poisson regression models and their zero-inflated versions.
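As an illustrative sketch of why the independence assumption matters, the following simulates three claim types through a common-shock ("multivariate Poisson") construction with an optional zero-inflation layer. All rates are hypothetical, and this is not the paper's Bayesian MCMC estimation.

```python
# Common-shock construction for three claim types: each count shares a Poisson
# component, which induces the positive correlation that the independence
# assumption ignores. Rates and the zero-inflation probability are made up.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
lam1, lam2, lam3, lam0 = 0.08, 0.05, 0.03, 0.02   # hypothetical claim frequencies

shock = rng.poisson(lam0, n)                      # shared component
y1 = rng.poisson(lam1, n) + shock                 # e.g. third-party liability
y2 = rng.poisson(lam2, n) + shock                 # e.g. own damage
y3 = rng.poisson(lam3, n) + shock                 # e.g. bodily injury

# Optional zero-inflation layer: a share of "risk-free" policies reports no claims.
p_zero = 0.15
mask = rng.random(n) < p_zero
y1[mask] = y2[mask] = y3[mask] = 0

# Under independence this correlation would be approximately zero.
print("corr(y1, y2) =", np.corrcoef(y1, y2)[0, 1])
print("mean claims per policy:", y1.mean() + y2.mean() + y3.mean())
```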
Abstract:
The assumption that climatic niche requirements of invasive species are conserved between their native and invaded ranges is key to predicting the risk of invasion. However, this assumption has been challenged recently by evidence of niche shifts in some species. Here, we report the first large-scale test of niche conservatism for 50 terrestrial plant invaders between Eurasia, North America, and Australia. We show that when analog climates are compared between regions, fewer than 15% of species have more than 10% of their invaded distribution outside their native climatic niche. These findings reveal that substantial niche shifts are rare in terrestrial plant invaders, providing support for an appropriate use of ecological niche models for the prediction of both biological invasions and responses to climate change.
Abstract:
The thesis examines the impact of collective war victimization on individuals' readiness to accept or assign collective guilt for past war atrocities. As a complement to previous studies, its aim is to articulate an integrated approach to collective victimization, which distinguishes between individual-, communal-, and societal-level consequences of warfare. Building on a social representations approach, it is guided by the assumption that individuals form beliefs about a conflict through their personal experiences of victimization, the communal experiences of warfare that occur in their proximal surroundings, and the mass-mediatised narratives that circulate in a society's public sphere. Four empirical studies test the hypothesis that individuals' beliefs about the conflict depend on the level and type of war experiences to which they have been exposed, that is, on the informative and normative micro and macro contexts in which they are embedded. The studies were conducted in the context of the wars that attended the breakup of Yugoslavia, a series of wars fought between 1991 and 2001 during which numerous war atrocities were perpetrated, causing massive victimisation of the population. To examine the content and impact of war experiences at each level of analysis, the empirical studies employed various methodological strategies, from quantitative analyses of a representative public opinion survey to qualitative analyses of media content and political speeches. Study 1 examines the impact of individual- and communal-level war experiences on individuals' acceptance and assignment of collective guilt. It further examines the impact of the type of communal-level victimization: exposure to symmetric violence (i.e., violence that similarly affects members of different ethnic groups, including adversaries) and asymmetric violence. The main goal of Study 2 is to examine the structural and political circumstances that enhance collective guilt assignment. While the previous studies emphasize the role of past victimisation, Study 2 tests the assumption that the political demobilisation strategy employed by elites facing public discontent under collectively system-threatening circumstances can fuel out-group blame. Studies 3 and 4 were conducted predominantly in the context of Croatia and examine the rhetorical construction of the dominant politicized narrative of war in the public sphere (Study 3) and its maintenance through the public delegitimization of alternative (critical) representations (Study 4). Study 4 further examines the likelihood that highly identified group members adhere to publicly delegitimized critical stances on the war.
Abstract:
This paper explores the impact of citizens' motivation to vote on the pattern of fiscal federalism. If the only concern of instrumental citizens were the outcome, they would have little incentive to vote, because the probability that a single vote might change an electoral outcome is usually minuscule. If voters turn out in large numbers to derive intrinsic value from the act of voting, how will these voters choose when considering the role local jurisdictions should play? The first section of the paper assesses the weight that expressive voters attach to an instrumental evaluation of alternative outcomes. Predictions are tested with reference to a case study analysis of the way Swiss voters assessed the role their local jurisdiction should play. The relevance of this analysis is also assessed with reference to the choices that voters express when considering other local issues. Textbook analysis of fiscal federalism is premised on the assumption that voters register choice just as 'consumers' reveal demand for services in a market, but how robust is this analogy?
Abstract:
This note develops a flexible methodology for splicing economic time series that avoids the extreme assumptions implicit in the procedures most commonly used in the literature. It allows the user to split the required correction to the older of the two series being linked between its levels and its growth rates, on the basis of what he or she knows or conjectures about the persistence of the factors that account for the discrepancy between the two series that emerges at their linking point. The time profile of the correction is derived from the assumption that the error in the older series reflects the inadequate coverage of emerging sectors or activities that grow faster than the aggregate.
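A hedged sketch of such a flexible splicing rule is given below: a single parameter governs how much of the discrepancy at the link point is treated as a level shift and how much is phased into the growth rates of the older series. Both the rule and the data are illustrative, not the note's exact formula.

```python
# Flexible splicing sketch: the discrepancy D between the two series at the link
# point is split between a level shift and growth-rate adjustments via theta.
# theta = 1 re-scales the whole older series (pure level correction);
# theta = 0 phases the correction in gradually, so it is absorbed entirely by the
# growth rates of the older series. Data and theta values are illustrative.
import numpy as np

old = np.array([100.0, 104.0, 109.0, 113.0, 118.0])  # older series, ends at the link year
new_at_link = 125.0                                   # newer series value at the link year
D = new_at_link / old[-1]                             # discrepancy at the link point

def splice(old, D, theta):
    t = np.arange(len(old))
    weight = theta + (1.0 - theta) * t / (len(old) - 1)  # rises from theta to 1
    return old * D ** weight

for theta in (1.0, 0.5, 0.0):
    print(f"theta={theta}:", np.round(splice(old, D, theta), 1))
```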
Abstract:
The objective of this paper is to re-examine risk and effort attitudes in the context of strategic dynamic interactions stated as a discrete-time finite-horizon Nash game. The analysis is based on the assumption that players are endogenously risk- and effort-averse. Each player is characterized by distinct risk- and effort-aversion types that are unknown to his opponent. The goal of the game is the optimal risk- and effort-sharing between the players. It generally depends on the individual strategies adopted and, implicitly, on the players' types or characteristics.
Abstract:
This paper develops a simple model that can be used to estimate the effectiveness of Cohesion expenditure relative to similar but unsubsidized projects, thereby making it possible to explicitly test an important assumption that is often implicit in estimates of the impact of Cohesion policies. Some preliminary results are reported for the case of infrastructure investment in the Spanish regions.
Abstract:
Multiplier analysis based upon the information contained in Leontief's inverse is undoubtedly part of the core of the input-output methodology, and numerous applications and extensions have been developed that exploit its informational content. Nonetheless, there are some implicit theoretical assumptions whose implications have perhaps not been fully assessed. This is the case of the 'excess capacity' assumption: because of this assumption, resources are available as needed to adjust production to new equilibrium states. In real-world applications, however, new resources are scarce and costly. Supply constraints kick in, and resource allocation needs to take them into account to properly assess the effect of government policies. Using a closed general equilibrium model that incorporates supply constraints, we perform some simple numerical exercises and proceed to derive a 'constrained' multiplier matrix that can be compared with the standard 'unrestricted' multiplier matrix. Results show that the effectiveness of expenditure policies hinges critically on whether or not supply constraints are considered.
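For concreteness, the sketch below computes the standard 'unrestricted' Leontief multiplier matrix for a made-up three-sector economy and contrasts the unconstrained output response to a demand shock with a crudely capped response. The capping rule is only an illustration of how binding supply constraints blunt multipliers, not the paper's general-equilibrium derivation.

```python
# Unrestricted Leontief multipliers vs a crude capacity cap. The multiplier matrix
# (I - A)^{-1} is textbook input-output analysis; the "capped" calculation is only
# a rough illustration of binding supply limits. The 3-sector matrix A is made up.
import numpy as np

A = np.array([[0.20, 0.10, 0.05],
              [0.15, 0.25, 0.10],
              [0.05, 0.10, 0.15]])       # hypothetical technical coefficients
I = np.eye(3)
M = np.linalg.inv(I - A)                 # unrestricted multiplier matrix

f0 = np.array([100.0, 80.0, 60.0])       # baseline final demand
df = np.array([10.0, 0.0, 0.0])          # expenditure shock on sector 1

x0 = M @ f0                              # baseline gross output
x1 = M @ (f0 + df)                       # unconstrained response

cap = x0 * np.array([1.02, 1.00, 1.10])  # sector 2 has no spare capacity
x1_capped = np.minimum(x1, cap)          # crude rationing of constrained output

print("unrestricted output change:", np.round(x1 - x0, 2))
print("capped output change:      ", np.round(x1_capped - x0, 2))
```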
Abstract:
To describe the collective behavior of large ensembles of neurons in a neuronal network, a kinetic theory description was developed in [13, 12], where a macroscopic representation of the network dynamics was derived directly from the microscopic dynamics of individual neurons, which are modeled as conductance-based, linear, integrate-and-fire point neurons. A diffusion approximation then led to a nonlinear Fokker-Planck equation for the probability density function of neuronal membrane potentials and synaptic conductances. In this work, we propose a deterministic numerical scheme for a Fokker-Planck model of an excitatory-only network. Our numerical solver allows us to obtain the time evolution of the probability distribution function and, thus, the evolution of all possible macroscopic quantities that are given by suitable moments of the probability density function. We show that this deterministic scheme is capable of capturing the bistability of stationary states observed in Monte Carlo simulations. Moreover, the transient behavior of the firing rates computed from the Fokker-Planck equation is analyzed in this bistable situation, where a bifurcation scenario (asynchronous convergence towards stationary states, periodic synchronous solutions, or damped oscillatory convergence towards stationary states) can be uncovered by increasing the strength of the excitatory coupling. Finally, the computation of moments of the probability distribution allows us to validate the applicability of a moment closure assumption used in [13] to further simplify the kinetic theory.
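As a generic illustration of such a deterministic approach, the sketch below integrates a one-dimensional drift-diffusion (Fokker-Planck-type) equation with an explicit finite-difference scheme and reports a moment of the density. It is not the two-dimensional, nonlinear conductance-voltage model of the cited work, and all parameters are invented.

```python
# Deterministic sketch: explicit finite-difference scheme for the 1-D equation
#   dp/dt = d/dv[ (v - mu)/tau * p ] + D * d^2 p / dv^2 ,
# tracking the density and its first moment. A generic illustration only; the
# boundary treatment is simplified and all parameters are made up.
import numpy as np

N, vmin, vmax = 200, 0.0, 1.0
v = np.linspace(vmin, vmax, N)
dv = v[1] - v[0]
mu, tau, D = 0.6, 1.0, 0.01
dt = 0.2 * min(dv**2 / (2 * D), dv * tau)      # explicit-scheme stability margin

p = np.exp(-(v - 0.2) ** 2 / (2 * 0.05**2))    # initial density away from mu
p /= p.sum() * dv                              # normalise

a = (v - mu) / tau                             # drift coefficient
for step in range(5000):
    dflux = np.gradient(a * p, dv)             # central difference of the drift flux
    lap = np.zeros_like(p)
    lap[1:-1] = (p[2:] - 2 * p[1:-1] + p[:-2]) / dv**2
    p = p + dt * (dflux + D * lap)
    p[0] = p[-1] = 0.0                         # absorbing-type boundary for simplicity
    p = np.clip(p, 0.0, None)

p /= p.sum() * dv                              # re-normalise before reporting moments
print("mean membrane variable:", (v * p).sum() * dv)   # drifts toward mu over time
```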
Abstract:
This paper challenges the prevailing view of the neutrality of the labour income share to labour demand and investigates its impact on the evolution of employment. Whilst maintaining the assumption of a unitary long-run elasticity of wages with respect to productivity, we demonstrate that productivity growth affects the labour share in the long run due to frictional growth (that is, the interplay of wage dynamics and productivity growth). In the light of this result, we consider a stylised labour demand equation and show that the labour share is a driving force of employment. We substantiate our analytical exposition by providing empirical models of wage setting and employment equations for France, Germany, Italy, Japan, Spain, the UK, and the US over the 1960-2008 period. Our findings show that the time-varying labour share of these countries has significantly influenced their employment trajectories across decades. This indicates that the evolution of the labour income share (or, equivalently, the wage-productivity gap) deserves the attention of policy makers.
Abstract:
The dynamical analysis of large biological regulatory networks requires the development of scalable methods for mathematical modeling. Following the approach initially introduced by Thomas, we formalize the interactions between the components of a network in terms of discrete variables, functions, and parameters. Model simulations result in directed graphs, called state transition graphs. We are particularly interested in reachability properties and asymptotic behaviors, which correspond to terminal strongly connected components (or "attractors") in the state transition graph. A well-known problem is the exponential increase of the size of state transition graphs with the number of network components, in particular when using the biologically realistic asynchronous updating assumption. To address this problem, we have developed several complementary methods enabling the analysis of the behavior of large and complex logical models: (i) the definition of transition priority classes to simplify the dynamics; (ii) a model reduction method preserving essential dynamical properties; and (iii) a novel algorithm to compact state transition graphs and directly generate compressed representations, emphasizing relevant transient and asymptotic dynamical properties. The power of an approach combining these different methods is demonstrated by applying them to a recent multilevel logical model of the network controlling CD4+ T helper cell response to antigen presentation and to a dozen cytokines. This model accounts for the differentiation of canonical Th1 and Th2 lymphocytes, as well as of inflammatory Th17 and regulatory T cells, along with many hybrid subtypes. All these methods have been implemented in the software GINsim, which enables the definition, analysis, and simulation of logical regulatory graphs.
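To illustrate the asynchronous updating assumption and the notion of attractors as terminal strongly connected components, the sketch below builds the asynchronous state transition graph of a toy two-gene mutual-inhibition switch. It is a generic Boolean example, not the CD4+ T-cell model or GINsim itself.

```python
# Asynchronous state transition graph (STG) of a two-gene mutual-inhibition switch.
# Asynchronous updating: from each state, every gene whose target value differs from
# its current value yields one successor state in which only that gene has changed.
# Attractors are the terminal strongly connected components of the STG.
import itertools
import networkx as nx

rules = {                                    # Boolean update functions (toy network)
    "x1": lambda s: int(not s["x2"]),
    "x2": lambda s: int(not s["x1"]),
}
genes = list(rules)

stg = nx.DiGraph()
for values in itertools.product([0, 1], repeat=len(genes)):
    state = dict(zip(genes, values))
    stg.add_node(values)
    for i, g in enumerate(genes):            # one gene changes at a time
        target = rules[g](state)
        if target != state[g]:
            succ = list(values)
            succ[i] = target
            stg.add_edge(values, tuple(succ))

# Terminal SCCs = nodes of the condensation with no outgoing edges.
cond = nx.condensation(stg)
attractors = [cond.nodes[n]["members"] for n in cond if cond.out_degree(n) == 0]
print("attractors:", attractors)             # expected: the stable states (1, 0) and (0, 1)
```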