902 results for Deterministic imputation


Relevance:

10.00%

Publisher:

Abstract:

We analyze the effects of uncertainty and private information on horizontal mergers. Firms face uncertain demands or costs and receive private signals. They may decide to merge, sharing their private information. If the uncertainty parameters are independent and the signals are perfect, uncertainty confers an informational advantage only on the merging firms, increasing merger incentives and reducing free-riding effects. Mergers therefore become more profitable and stable. These results generalize to the case of correlated parameters if the correlation is not too strong, and to the case of perfect correlation if the firms receive noisy signals. From a normative point of view, mergers are socially less harmful than in deterministic markets and may even be welfare enhancing. If the signals are instead publicly observed, uncertainty does not necessarily give more incentives to merge, and mergers are not always less socially harmful.

Relevance:

10.00%

Publisher:

Abstract:

In this paper we prove that the solution of a backward stochastic differential equation, which involves a subdifferential operator and is associated with a family of reflecting diffusion processes, converges to the solution of a deterministic backward equation and satisfies a large deviation principle.
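
For orientation, a backward stochastic differential equation involving a subdifferential operator (a backward stochastic variational inequality) is typically written in the following generic form; this is standard notation, not quoted from the paper:

    -dY_t \in f(t, Y_t, Z_t)\,dt - \partial\varphi(Y_t)\,dt - Z_t\,dB_t, \qquad t \in [0, T], \qquad Y_T = \xi,

where \varphi is a convex lower-semicontinuous function, \partial\varphi its subdifferential, and (Y, Z) the pair of adapted solution processes.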

Relevance:

10.00%

Publisher:

Abstract:

We analyze (non-deterministic) contests with anonymous contest success functions. There is no restriction on the number of contestants or on their valuations for the prize. We provide intuitive and easily verifiable conditions for the existence of an equilibrium with properties similar to those of the (deterministic) all-pay auction. Since these conditions are fulfilled in a wide array of situations, the predictions of this equilibrium are very robust to the specific details of the contest. An application of this result helps to fill a gap in the analysis of the popular Tullock rent-seeking game, because it characterizes properties of an equilibrium for increasing returns to scale larger than two, for any number of contestants, and in contests with or without a common value. Keywords: (non-)deterministic contest, all-pay auction, contest success functions. JEL Classification Numbers: C72 (Noncooperative Games), D72 (Economic Models of Political Processes: Rent-Seeking, Elections), D44 (Auctions).
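
For context, the Tullock contest success function referred to above is usually written as follows (standard form, not quoted from the abstract):

    p_i(x_1, \dots, x_n) = \frac{x_i^r}{\sum_{j=1}^{n} x_j^r},

where x_i is contestant i's expenditure and r is the returns-to-scale parameter; the equilibrium characterization mentioned above concerns the range r > 2.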

Relevance:

10.00%

Publisher:

Abstract:

This project presents an evaluation of the different routing alternatives for a NoC with a 2D mesh topology. To illustrate these alternatives, the design of a router implementing the deterministic XY algorithm was studied and then adapted to support the partially adaptive algorithms West First, North Last and Negative First. With the routers implemented, the algorithms were studied and exercised with the same stimuli, producing a comparison between them that facilitates an a priori choice.
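
As a rough illustration of the deterministic baseline mentioned above, the following sketch shows dimension-ordered XY routing on a coordinate-addressed 2D mesh; it is a minimal stand-in, not the router implemented in the project:

    def xy_route(src, dst):
        """Hop sequence of deterministic XY routing on a 2D mesh:
        travel along X until the destination column is reached, then along Y."""
        x, y = src
        dx, dy = dst
        path = [(x, y)]
        while x != dx:                      # X dimension first
            x += 1 if dx > x else -1
            path.append((x, y))
        while y != dy:                      # then Y dimension
            y += 1 if dy > y else -1
            path.append((x, y))
        return path

The partially adaptive algorithms (West First, North Last, Negative First) relax this fixed order by forbidding only a subset of turns, which is what the comparison in the project evaluates.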

Relevance:

10.00%

Publisher:

Abstract:

This study examines the inter-industry wage structure of the organised manufacturing sector in India for the period 1973-74 to 2003-04 by estimating the growth of average real wages for production workers by industry. To estimate the growth rates, the study adopts a methodological framework that differs from other studies in that the time-series properties of the variables concerned are considered closely in order to obtain meaningful estimates of growth that are unbiased and (asymptotically) efficient. Using wage data on 51 manufacturing industries at the three-digit level of the National Industrial Classification 1998 (India), our estimation procedure obtains estimates of the growth of real wages per worker that are deterministic in nature by accounting for any potential structural break(s). Our findings show that the inter-industry wage structure in India changed considerably over the period 1973-74 to 2003-04, and they provide some evidence that inter-industry wage differences became more pronounced in the post-reforms period. This paper thus provides new evidence from India on the need to consider the hypothesis that industry affiliation is potentially an important determinant of wages when studying any relationship between reforms and wages.
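
A minimal sketch of the kind of log-linear trend regression with a structural break that such an approach relies on is shown below; the break handling and variable names are illustrative assumptions, not the study's exact specification:

    import numpy as np
    import statsmodels.api as sm

    def trend_growth(log_wage, break_index=None):
        """OLS regression of log real wages on a time trend; the trend slope
        approximates the growth rate. If break_index is given, the intercept
        and slope are allowed to shift after that observation."""
        t = np.arange(len(log_wage), dtype=float)
        cols = [np.ones_like(t), t]
        if break_index is not None:
            post = (t >= break_index).astype(float)
            cols += [post, post * (t - break_index)]   # level and slope shifts
        X = np.column_stack(cols)
        res = sm.OLS(np.asarray(log_wage, dtype=float), X).fit()
        return res.params, res.bse

With a break, the pre-break growth rate is params[1] and the post-break growth rate is params[1] + params[3].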

Relevance:

10.00%

Publisher:

Abstract:

This study addresses the implications of the presence of a unit root for growth rate estimation by the least-squares approach. We argue that when the log of a variable contains a unit root, i.e., it is not stationary, then the growth rate estimate from the log-linear trend model is not a valid representation of the actual growth of the series. In fact, under such a situation, we show that the growth of the series is the cumulative impact of a stochastic process. The growth estimate from such a model is therefore just a spurious representation of the actual growth of the series, which we refer to as a "pseudo growth rate", and should be interpreted with caution. On the other hand, we highlight that the statistical representation of a series as containing a unit root is not easy to separate from an alternative description which represents the series as fundamentally deterministic (no unit root) but containing a structural break. In search of a way around this, our study presents a survey of both the theoretical and empirical literature on unit root tests that take into account possible structural breaks. We show that when a series is trend-stationary with breaks, it is possible to use the log-linear trend model to obtain well-defined estimates of growth rates for sub-periods which are valid representations of the actual growth of the series. Finally, to highlight the above issues, we carry out an empirical application in which we estimate meaningful growth rates of real wages per worker for 51 industries from the organised manufacturing sector in India for the period 1973-2003, which are not only unbiased but also asymptotically efficient. We use these growth rate estimates to highlight the evolving inter-industry wage structure in India.
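
In standard notation (not quoted from the paper), the two competing representations discussed above are:

    \text{trend-stationary:}\quad \ln Y_t = \alpha + \beta t + \varepsilon_t, \qquad \text{growth rate} \approx \beta;

    \text{unit root:}\quad \ln Y_t = \mu + \ln Y_{t-1} + \varepsilon_t \;\Longrightarrow\; \ln Y_t = \ln Y_0 + \mu t + \sum_{s=1}^{t} \varepsilon_s.

In the second case the fitted trend slope mixes the drift \mu with the accumulated shocks, which is why the resulting estimate is the "pseudo growth rate" described above.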

Relevance:

10.00%

Publisher:

Abstract:

We study the incentive to invest to improve marriage prospects, in a frictionless marriage market with non-transferable utility. Stochastic returns to investment eliminate the multiplicity of equilibria in models with deterministic returns, and a unique equilibrium exists under reasonable conditions. Equilibrium investment is efficient when the sexes are symmetric. However, when there is any asymmetry, including an unbalanced sex ratio, investments are generically excessive. For example, if there is an excess of boys, then there is parental over-investment in boys and under-investment in girls, and total investment will be excessive.

Relevance:

10.00%

Publisher:

Abstract:

This paper is a contribution to the growing literature on constrained inefficiencies in economies with financial frictions. The purpose is to present two simple examples, inspired by the stochastic models in Gersbach-Rochet (2012) and Lorenzoni (2008), of deterministic environments in which such inefficiencies arise through credit constraints. Common to both examples is a pecuniary externality, which operates through an asset price. In the second example, a simple transfer between two groups of agents can bring about a Pareto improvement. In a first-best economy, there are no pecuniary externalities because marginal productivities are equalised. But when agents face credit constraints, there is a wedge between their marginal productivities and those of the non-credit-constrained agents. The wedge is the source of the pecuniary externality: economies with these kinds of imperfections in credit markets are not second-best efficient. This is akin to the constrained inefficiency of an economy with incomplete markets, as in Geanakoplos and Polemarchakis (1986).

Relevance:

10.00%

Publisher:

Abstract:

Empirical researchers interested in how governance shapes various aspects of economic development frequently use the Worldwide Governance Indicators (WGI). These variables come in the form of an estimate along with a standard error reflecting the uncertainty of this estimate. Existing empirical work simply uses the estimates as an explanatory variable and discards the information provided by the standard errors. In this paper, we argue that the appropriate practice is to take into account the uncertainty around the WGI estimates through the use of multiple imputation. We investigate the importance of our proposed approach by revisiting, in three applications, the results of recently published studies. These applications cover the impact of governance on (i) capital flows; (ii) international trade; and (iii) income levels around the world. We generally find that the estimated effects of governance are highly sensitive to the use of multiple imputation. We also show that model misspecification is a concern for the results of our reference studies. We conclude that the effects of governance are hard to establish once we take into account uncertainty around both the WGI estimates and the correct model specification.
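
A minimal sketch of the multiple-imputation idea advocated above is given below; it draws plausible governance scores from a normal distribution centred on the WGI estimate with the reported standard error, refits the regression on each draw, and pools the results with Rubin's rules. The regression specification is a placeholder, not any of the reference studies' models:

    import numpy as np
    import statsmodels.api as sm

    def mi_governance_ols(y, controls, gov_est, gov_se, m=50, seed=0):
        """Multiple-imputation OLS for a governance regressor measured with error."""
        rng = np.random.default_rng(seed)
        coefs, variances = [], []
        for _ in range(m):
            gov_draw = rng.normal(gov_est, gov_se)         # one imputed governance series
            X = sm.add_constant(np.column_stack([gov_draw, controls]))
            res = sm.OLS(y, X).fit()
            coefs.append(res.params[1])                    # coefficient on governance
            variances.append(res.bse[1] ** 2)
        coefs, variances = np.array(coefs), np.array(variances)
        within = variances.mean()                          # average within-imputation variance
        between = coefs.var(ddof=1)                        # between-imputation variance
        total_var = within + (1.0 + 1.0 / m) * between     # Rubin's rules
        return coefs.mean(), np.sqrt(total_var)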

Relevance:

10.00%

Publisher:

Abstract:

There are far-reaching conceptual similarities between bi-static surface georadar and post-stack, "zero-offset" seismic reflection data, which are expressed in largely identical processing flows. One important difference is, however, that standard deconvolution algorithms routinely used to enhance the vertical resolution of seismic data are notoriously problematic or even detrimental to the overall signal quality when applied to surface georadar data. We have explored various options for alleviating this problem and have tested them on a geologically well-constrained surface georadar dataset. Standard stochastic and direct deterministic deconvolution approaches proved to be largely unsatisfactory. While least-squares-type deterministic deconvolution showed some promise, the inherent uncertainties involved in estimating the source wavelet introduced some artificial "ringiness". In contrast, we found spectral balancing approaches to be effective, practical and robust means of enhancing the vertical resolution of surface georadar data, particularly, but not exclusively, in the uppermost part of the georadar section, which is notoriously plagued by the interference of the direct air- and groundwaves. For the data considered in this study, it can be argued that band-limited spectral blueing may provide somewhat better results than standard band-limited spectral whitening, particularly in the uppermost part of the section affected by the interference of the air- and groundwaves. Interestingly, this finding is consistent with the fact that the amplitude spectrum resulting from least-squares-type deterministic deconvolution is characterized by a systematic enhancement of higher frequencies at the expense of lower frequencies and hence is blue rather than white. It is also consistent with increasing evidence that spectral "blueness" is a seemingly universal, albeit enigmatic, property of the distribution of reflection coefficients in the Earth. Our results therefore indicate that spectral balancing techniques in general, and spectral blueing in particular, represent simple yet effective means of enhancing the vertical resolution of surface georadar data and, in many cases, could turn out to be a preferable alternative to standard deconvolution approaches.
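
As a rough illustration of the spectral balancing idea discussed above, the sketch below performs band-limited spectral whitening of a single trace (flattening the amplitude spectrum inside a passband while keeping the phase); it is a simplified stand-in, not the processing flow used in the study, and a blueing variant would instead shape the passband amplitudes to rise with frequency:

    import numpy as np

    def spectral_whiten(trace, dt, f_lo, f_hi, eps=1e-6):
        """Flatten the amplitude spectrum between f_lo and f_hi (Hz), keep the
        phase, zero everything outside the band, and transform back."""
        n = len(trace)
        spec = np.fft.rfft(trace)
        freqs = np.fft.rfftfreq(n, dt)
        band = (freqs >= f_lo) & (freqs <= f_hi)
        out = np.zeros_like(spec)
        out[band] = spec[band] / (np.abs(spec[band]) + eps)   # unit amplitude, original phase
        return np.fft.irfft(out, n)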

Relevance:

10.00%

Publisher:

Abstract:

To describe the collective behavior of large ensembles of neurons in a neuronal network, a kinetic theory description was developed in [13, 12], where a macroscopic representation of the network dynamics was derived directly from the microscopic dynamics of individual neurons, which are modeled as conductance-based, linear, integrate-and-fire point neurons. A diffusion approximation then led to a nonlinear Fokker-Planck equation for the probability density function of neuronal membrane potentials and synaptic conductances. In this work, we propose a deterministic numerical scheme for a Fokker-Planck model of an excitatory-only network. Our numerical solver allows us to obtain the time evolution of the probability distribution functions and, thus, the evolution of all possible macroscopic quantities that are given by suitable moments of the probability density function. We show that this deterministic scheme is capable of capturing the bistability of stationary states observed in Monte Carlo simulations. Moreover, the transient behavior of the firing rates computed from the Fokker-Planck equation is analyzed in this bistable situation, where, by increasing the strength of the excitatory coupling, a bifurcation scenario can be uncovered: asynchronous convergence towards stationary states, periodic synchronous solutions, or damped oscillatory convergence towards stationary states. Finally, the computation of moments of the probability distribution allows us to validate the applicability of a moment closure assumption used in [13] to further simplify the kinetic theory.
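
To make the idea of a deterministic (grid-based) solver concrete, the sketch below advances a one-dimensional drift-diffusion Fokker-Planck equation with a simple explicit finite-difference step; the drift and diffusion terms are generic placeholders, not the conductance-based model or the scheme developed in the paper:

    import numpy as np

    def fokker_planck_step(rho, v, drift, D, dt):
        """One explicit step of  d(rho)/dt = -d(drift*rho)/dv + D * d2(rho)/dv2.
        rho and drift are arrays sampled on the grid v; dt must be small enough
        for the explicit scheme to remain stable."""
        dv = v[1] - v[0]
        flux = drift * rho                                  # advective flux a(v)*rho
        d_flux = np.gradient(flux, dv)
        d2_rho = np.gradient(np.gradient(rho, dv), dv)
        rho_new = rho + dt * (-d_flux + D * d2_rho)
        return rho_new / (rho_new.sum() * dv)               # renormalize the density

Iterating this step and taking moments of rho over the grid gives the macroscopic quantities referred to above.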

Relevance:

10.00%

Publisher:

Abstract:

Breast cancer is one of the most common cancers, affecting one in eight women during their lives. Survival rates have increased steadily thanks to early diagnosis with mammography screening and more efficient treatment strategies. Post-operative radiation therapy is a standard of care in the management of breast cancer and has been shown to reduce efficiently both the local recurrence rate and breast cancer mortality. Radiation therapy is, however, associated with some late effects for long-term survivors. Radiation-induced secondary cancer is a relatively rare but severe late effect of radiation therapy. Currently, radiotherapy plans are essentially optimized to maximize tumor control and minimize late deterministic effects (tissue reactions), which are mainly associated with high doses (≫ 1 Gy). With improved cure rates and new radiation therapy technologies, it is also important to evaluate and minimize secondary cancer risks for different treatment techniques. This is a particularly challenging task due to the large uncertainties in the dose-response relationship.

In contrast with late deterministic effects, secondary cancers may be associated with much lower doses, and therefore out-of-field doses (also called peripheral doses), typically below 1 Gy, need to be determined accurately. Out-of-field doses result from patient scatter and head scatter from the treatment unit. These doses are particularly challenging to compute, and we characterized them by Monte Carlo (MC) calculation. A detailed MC model of the Siemens Primus linear accelerator has been thoroughly validated with measurements. We investigated the accuracy of such a model for retrospective dosimetry in epidemiological studies on secondary cancers. Considering that patients in such large studies could be treated on a variety of machines, we assessed the uncertainty in the reconstructed peripheral dose due to the variability of peripheral dose among various linac geometries. For large open fields (> 10x10 cm2), the uncertainty would be less than 50%, but for small fields and wedged fields the uncertainty in the reconstructed dose could rise up to a factor of 10. It was concluded that such a model could be used for conventional treatments using large open fields only.

The MC model of the Siemens Primus linac was then used to compare out-of-field doses for different treatment techniques in a female whole-body CT-based phantom. Current techniques such as conformal wedge-based radiotherapy and hybrid IMRT were investigated and compared to older two-dimensional radiotherapy techniques. MC doses were also compared to those of a commercial Treatment Planning System (TPS). While the TPS is routinely used to determine the dose to the contralateral breast and the ipsilateral lung, which are mostly out of the treatment fields, we have shown that these doses may be highly inaccurate depending on the treatment technique investigated. MC shows that hybrid IMRT is dosimetrically similar to three-dimensional wedge-based radiotherapy within the field, but offers substantially reduced doses to out-of-field healthy organs.

Finally, many different approaches to risk estimation extracted from the literature were applied to the calculated MC dose distribution. Absolute risks varied substantially, as did the ratio of risk between two treatment techniques, reflecting the large uncertainties involved with current risk models. Despite all these uncertainties, the hybrid IMRT technique investigated resulted in systematically lower cancer risks than any of the other treatment techniques. More epidemiological studies with accurate dosimetry are required in the future to construct robust risk models. In the meantime, any treatment strategy that reduces out-of-field doses to healthy organs should be investigated. Electron radiotherapy might offer interesting possibilities in this regard.

Relevance:

10.00%

Publisher:

Abstract:

The 22q11.2 deletion syndrome (22q11DS) is a widely recognized genetic model allowing the study of neuroanatomical biomarkers that underlie the risk for developing schizophrenia. Recent advances in magnetic resonance image analyses enable the examination of structural connectivity integrity, scarcely used in the 22q11DS field. This framework potentially provides evidence for the disconnectivity hypothesis of schizophrenia in this high-risk population. In the present study, we quantify whole-brain white matter connections in 22q11DS using deterministic tractography. Diffusion Tensor Imaging data were acquired in 30 affected patients and 30 age- and gender-matched healthy participants. The Human Connectome technique was applied to register white matter streamlines with cortical anatomy. The number of fibers (streamlines) was used as a measure of connectivity for comparison between groups at the global, lobar and regional level. All statistics were corrected for age and gender. Results showed a 10% reduction in the total number of fibers in patients compared to controls. After correcting for this global reduction, preserved connectivity was found within the right frontal and right parietal lobes. The relative increase in the number of fibers was located mainly in the right hemisphere. Conversely, an excessive reduction of connectivity was observed within and between limbic structures. Finally, a disproportionate reduction was shown at the level of fibers connecting the left fronto-temporal regions. We can therefore speculate that the disruption to fronto-temporal connectivity observed in individuals at risk of schizophrenia implies that fronto-temporal disconnectivity, frequently implicated in the pathogenesis of schizophrenia, could precede the onset of symptoms and, as such, constitutes a biomarker of the vulnerability to develop psychosis. By contrast, connectivity alterations in the limbic lobe play a role in a wide range of psychiatric disorders and therefore seem to be less specific in defining schizophrenia.
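
A toy version of the streamline-count connectivity measure described above is sketched below; the endpoint labelling and the global normalization are illustrative assumptions, not the study's processing pipeline:

    import numpy as np

    def connectivity_matrix(streamline_endpoints, n_regions):
        """Count streamlines between each pair of cortical regions, given a list
        of (region_a, region_b) endpoint labels for every reconstructed fiber."""
        C = np.zeros((n_regions, n_regions), dtype=int)
        for a, b in streamline_endpoints:
            C[a, b] += 1
            C[b, a] += 1
        return C

    def normalize_global(C):
        """Express each connection as a fraction of the total fiber count, so that
        group comparisons are corrected for a global reduction in fiber numbers."""
        return C / C.sum()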

Relevance:

10.00%

Publisher:

Abstract:

A collection of spherical obstacles in the unit ball in Euclidean space is said to be avoidable for Brownian motion if there is a positive probability that Brownian motion diffusing from some point in the ball will avoid all the obstacles and reach the boundary of the ball. The centres of the spherical obstacles are generated according to a Poisson point process, while the radius of an obstacle is a deterministic function. If avoidable configurations are generated with positive probability, Lundh calls this percolation diffusion. An integral condition for percolation diffusion is derived in terms of the intensity of the point process and the function that determines the radii of the obstacles.
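
Purely as an illustration of the setup (not of the integral condition derived in the paper), the sketch below generates a Poisson configuration of spherical obstacles in the unit ball, with a placeholder radius function, and tests whether a single simulated Brownian path escapes to the boundary without hitting any obstacle:

    import numpy as np

    def sample_obstacles(mean_count, radius_fn, dim=3, seed=0):
        """Poisson number of obstacle centres, uniform in the unit ball; each
        radius is a deterministic function of the centre (placeholder choice).
        mean_count is the expected number of centres in the ball."""
        rng = np.random.default_rng(seed)
        n = rng.poisson(mean_count)
        dirs = rng.normal(size=(n, dim))
        dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
        dist = rng.uniform(0.0, 1.0, size=(n, 1)) ** (1.0 / dim)   # uniform in the ball
        centres = dirs * dist
        return [(c, radius_fn(c)) for c in centres]

    def brownian_escapes(obstacles, dt=1e-4, seed=1):
        """Simulate one Brownian path from the origin; return True if it reaches
        the boundary of the unit ball before entering any obstacle."""
        rng = np.random.default_rng(seed)
        x = np.zeros(3)
        while np.linalg.norm(x) < 1.0:
            x = x + np.sqrt(dt) * rng.normal(size=3)
            if any(np.linalg.norm(x - c) < r for c, r in obstacles):
                return False
        return True

Repeating brownian_escapes over many paths estimates the escape probability for a given configuration; the radius function passed to sample_obstacles is entirely hypothetical.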

Relevance:

10.00%

Publisher:

Abstract:

In a seminal paper [10], Weitz gave a deterministic fully polynomial approximation scheme for counting exponentially weighted independent sets (which is the same as approximating the partition function of the hard-core model from statistical physics) in graphs of degree at most d, up to the critical activity for the uniqueness of the Gibbs measure on the infinite d-regular tree. More recently, Sly [8] (see also [1]) showed that this is optimal in the sense that if there is an FPRAS for the hard-core partition function on graphs of maximum degree d for activities larger than the critical activity on the infinite d-regular tree then NP = RP. In this paper we extend Weitz's approach to derive a deterministic fully polynomial approximation scheme for the partition function of general two-state anti-ferromagnetic spin systems on graphs of maximum degree d, up to the corresponding critical point on the d-regular tree. The main ingredient of our result is a proof that for two-state anti-ferromagnetic spin systems on the d-regular tree, weak spatial mixing implies strong spatial mixing. This in turn uses a message-decay argument which extends a similar approach proposed recently for the hard-core model by Restrepo et al. [7] to the case of general two-state anti-ferromagnetic spin systems.
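
As a pointer to the kind of tree recursion underlying these correlation-decay arguments, the sketch below computes the hard-core occupation probability at the root of the infinite d-regular tree by truncating the standard recursion at a finite depth; Weitz's actual algorithm runs such a recursion on the self-avoiding-walk tree of the input graph, which is not reproduced here:

    def subtree_ratio(d, lam, depth):
        """Ratio R = P(occupied) / P(unoccupied) at the root of a depth-limited
        (d-1)-ary subtree under the hard-core model with activity lam, using the
        standard recursion R = lam / (1 + R_child)**(d-1) with free leaves."""
        if depth == 0:
            return lam
        r = subtree_ratio(d, lam, depth - 1)
        return lam / (1.0 + r) ** (d - 1)

    def root_occupation(d, lam, depth=30):
        """Approximate P(root occupied) on the infinite d-regular tree, whose root
        has d subtrees, each contributing a factor 1 / (1 + R)."""
        r_sub = subtree_ratio(d, lam, depth - 1)
        r_root = lam / (1.0 + r_sub) ** d
        return r_root / (1.0 + r_root)

Below the uniqueness threshold lam_c(d) = (d-1)^(d-1) / (d-2)^d these truncated values converge rapidly in the depth, which is the decay of correlations that makes a deterministic approximation scheme possible.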