973 results for Roundoff errors.
Abstract:
Spatial econometrics has been criticized by some economists because some model specifications have been driven by data-analytic considerations rather than having a firm foundation in economic theory. In particular, this applies to the so-called W matrix, which is integral to the structure of endogenous and exogenous spatial lags and to spatial error processes, and which is almost the sine qua non of spatial econometrics. Moreover, it has been suggested that the significance of a spatially lagged dependent variable involving W may be misleading, since it may simply be picking up the effects of omitted spatially dependent variables, incorrectly suggesting the existence of a spillover mechanism. In this paper we review the theoretical and empirical rationale for network dependence and spatial externalities as embodied in spatially lagged variables, arguing that failing to acknowledge their presence at the least biases inference, can cause inconsistent estimation, and leads to an incorrect understanding of the true causal processes.
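The spatially lagged variable discussed here is Wy, where W is a spatial weight matrix, typically row-normalized so that each region's lag is the average outcome of its neighbours. A minimal sketch, with a hypothetical four-region contiguity chain (all values illustrative):

```python
import numpy as np

# Hypothetical 4-region contiguity chain: adjacency[i, j] == 1 when
# regions i and j are neighbours (illustrative values only).
adjacency = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

# Row-normalize so each row of W sums to 1: the spatial lag (W @ y)[i]
# is then the average outcome of region i's neighbours.
W = adjacency / adjacency.sum(axis=1, keepdims=True)

y = np.array([10.0, 20.0, 30.0, 40.0])  # outcome per region
spatial_lag = W @ y                     # the spatially lagged variable Wy
```

A spatial-lag model then regresses y on Wy and covariates; the abstract's point is that a significant coefficient on Wy may instead reflect omitted spatially dependent regressors.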
Abstract:
‘Modern’ Phillips curve theories predict that inflation is an integrated, or near-integrated, process. However, inflation appears bounded above and below in developed economies and so cannot be ‘truly’ integrated; it is more likely stationary around a shifting mean. If agents believe inflation is integrated, as in the ‘modern’ theories, then they are making systematic errors concerning the statistical process of inflation. An alternative theory of the Phillips curve is developed that is consistent with the ‘true’ statistical process of inflation. It is demonstrated that United States inflation data are consistent with the alternative theory but not with the existing ‘modern’ theories.
Abstract:
In this paper we investigate the ability of a number of different ordered probit models to predict ratings based on firm-specific data on business and financial risks. We investigate models based on momentum, drift and ageing and compare them against alternatives that take into account the initial rating of the firm and its previous actual rating. Using data on US bond-issuing firms rated by Fitch over the years 2000 to 2007, we compare the performance of these models in predicting the rating in-sample and out-of-sample using root mean squared errors, Diebold-Mariano tests of forecast performance and contingency tables. We conclude that initial and previous states have a substantial influence on rating prediction.
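The in-sample and out-of-sample comparison rests on root mean squared error; a minimal sketch, with hypothetical numeric rating codes and predictions (illustrative, not the paper's data):

```python
import numpy as np

def rmse(actual, predicted):
    """Root mean squared error of a set of rating predictions."""
    a = np.asarray(actual, dtype=float)
    p = np.asarray(predicted, dtype=float)
    return float(np.sqrt(np.mean((a - p) ** 2)))

# Hypothetical numeric rating codes (e.g. AAA=1, AA=2, ...) for six firms.
actual        = [1, 2, 2, 3, 4, 3]
model_drift   = [1, 2, 3, 3, 3, 3]  # a model ignoring initial/previous state
model_initial = [1, 2, 2, 3, 4, 4]  # a model using initial/previous state

# Lower RMSE means better rating prediction.
rmse_drift = rmse(actual, model_drift)
rmse_initial = rmse(actual, model_initial)
```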
Abstract:
Employing an endogenous growth model with human capital, this paper explores how productivity shocks in the goods and human capital producing sectors contribute to explaining aggregate fluctuations in output, consumption, investment and hours. Given the importance of accounting for both the dynamics and the trends in the data not captured by the theoretical growth model, we introduce a vector error correction model (VECM) of the measurement errors and estimate the model’s posterior density function using Bayesian methods. To contextualize our findings with those in the literature, we also assess whether the endogenous growth model or the standard real business cycle model better explains the observed variation in these aggregates. In addressing these issues we contribute to both the methods of analysis and the ongoing debate regarding the effects of innovations to productivity on macroeconomic activity.
Abstract:
Accurate chromosome segregation during mitosis is temporally and spatially coordinated by fidelity-monitoring checkpoint systems. Deficiencies in these checkpoint systems can lead to chromosome segregation errors and aneuploidy, and promote tumorigenesis. Here, we report that the TRAF-interacting protein (TRAIP), a ubiquitously expressed nucleolar E3 ubiquitin ligase important for cellular proliferation, is localized close to mitotic chromosomes. Its knockdown in HeLa cells by RNA interference (RNAi) decreased the time of early mitosis progression from nuclear envelope breakdown (NEB) to anaphase onset and increased the percentages of chromosome alignment defects in metaphase and lagging chromosomes in anaphase compared with those of control cells. The decrease in progression time was corrected by the expression of wild-type but not a ubiquitin-ligase-deficient form of TRAIP. TRAIP-depleted cells bypassed taxol-induced mitotic arrest and displayed significantly reduced kinetochore levels of MAD2 (also known as MAD2L1) but not of other spindle checkpoint proteins in the presence of nocodazole. These results imply that TRAIP regulates the spindle assembly checkpoint, MAD2 abundance at kinetochores and the accurate cellular distribution of chromosomes. The TRAIP ubiquitin ligase activity is functionally required for the spindle assembly checkpoint control.
Abstract:
We study the impact of anticipated fiscal policy changes in a Ramsey economy where agents form long-horizon expectations using adaptive learning. We extend the existing framework by introducing distortionary taxes as well as elastic labour supply, which makes agents' decisions non-predetermined but more realistic. We find that the dynamic responses to anticipated tax changes under learning exhibit oscillatory behaviour that can be interpreted as self-fulfilling waves of optimism and pessimism emerging from systematic forecast errors. Moreover, we demonstrate that these waves can have important implications for the welfare consequences of fiscal reforms. (JEL: E32, E62, D84)
Abstract:
Cognitive and affective deficits and biases have been a growing source of interest in the neuroscience of mental disorders. This project, which began in 2004 and concluded at the end of 2008, studied the following mental disorders: pathological gambling (PG), eating disorders (ED) and depressive disorders. In this report we focus on summarizing part of the results obtained in a study of PG and decision-making (article under review and pending acceptance) and another on executive functioning in PG and bulimia nervosa (BN) (article in press). Summarizing the first study, pathological gamblers (N=32) show a decision-making process biased towards reward seeking, in the form of elevated risk-taking compared with healthy controls (HC). Deficits in cognitive flexibility, but not in inhibitory control, are also observed between PG and HC. The results rule out behavioural myopia in decision-making in PG, but point to a cognitive-affective bias, in which impulse control would play a relevant role, in the form of an illusion of control, in decision-making processes with immediate reward but delayed punishment, as measured by a decision-making task (IGT ABCD). In the second study, building on the shared vulnerabilities described between PG and BN, the executive functioning of women with PG and BN was compared. After administration of the WCST and Stroop tests, and adjusting the analysis for age and education, the PG group showed greater impairment, specifically a higher percentage of perseverative errors, a lower level of conceptual responses and a greater number of trials administered, whereas the BN group showed a higher percentage of non-perseverative errors. Both PG and BN women showed executive dysfunction relative to HC, but with different specific correlates.
Abstract:
Empirical researchers interested in how governance shapes various aspects of economic development frequently use the Worldwide Governance Indicators (WGI). These variables come in the form of an estimate along with a standard error reflecting the uncertainty of this estimate. Existing empirical work simply uses the estimates as an explanatory variable and discards the information provided by the standard errors. In this paper, we argue that the appropriate practice is to take into account the uncertainty around the WGI estimates through the use of multiple imputation. We investigate the importance of our proposed approach by revisiting the results of recently published studies in three applications. These applications cover the impact of governance on (i) capital flows; (ii) international trade; and (iii) income levels around the world. We generally find that the estimated effects of governance are highly sensitive to the use of multiple imputation. We also show that model misspecification is a concern for the results of our reference studies. We conclude that the effects of governance are hard to establish once we take into account uncertainty around both the WGI estimates and the correct model specification.
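The multiple-imputation idea can be sketched by drawing completed governance scores from a normal distribution centred on each WGI point estimate with its reported standard error; the estimates, standard errors and imputation count below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def impute_governance(estimates, std_errors, n_imputations):
    """Draw plausible governance scores from N(estimate, se^2); each row
    is one completed data set for the subsequent regression."""
    return rng.normal(estimates, std_errors, size=(n_imputations, len(estimates)))

# Hypothetical WGI point estimates and standard errors for three countries.
estimates = np.array([0.5, -0.2, 1.1])
std_errors = np.array([0.15, 0.20, 0.10])

draws = impute_governance(estimates, std_errors, n_imputations=5000)
# In practice, the model of interest is estimated on every completed data
# set and the coefficients are pooled (e.g. with Rubin's rules).
```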
Abstract:
Diffusion MRI is a well-established imaging modality providing a powerful way to probe the structure of the white matter non-invasively. Despite its potential, the intrinsically long scan times of these sequences have hampered their use in clinical practice. For this reason, a large variety of methods have recently been proposed to shorten the acquisition times. Among them, spherical deconvolution approaches have gained a lot of interest for their ability to reliably recover the intra-voxel fiber configuration with a relatively small number of data samples. To overcome the intrinsic instabilities of deconvolution, these methods use regularization schemes generally based on the assumption that the fiber orientation distribution (FOD) to be recovered in each voxel is sparse. The well-known Constrained Spherical Deconvolution (CSD) approach resorts to Tikhonov regularization, based on an ℓ2-norm prior, which promotes a weak version of sparsity. Also, in the last few years compressed sensing has been advocated to further accelerate the acquisitions, and ℓ1-norm minimization is generally employed as a means to promote sparsity in the recovered FODs. In this paper, we provide evidence that the use of an ℓ1-norm prior to regularize this class of problems is somewhat inconsistent with the fact that the fiber compartments all sum up to unity. To overcome this ℓ1 inconsistency while simultaneously exploiting sparsity more optimally than through an ℓ2 prior, we reformulate the reconstruction problem as a constrained formulation between a data term and a sparsity prior consisting of an explicit bound on the ℓ0-norm of the FOD, i.e. on the number of fibers. The method has been tested both on synthetic and real data. Experimental results show that the proposed ℓ0 formulation significantly reduces modeling errors compared with the state-of-the-art ℓ2 and ℓ1 regularization approaches.
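An explicit ℓ0 bound on the FOD can be illustrated with a greedy hard-thresholding step that keeps the k largest non-negative coefficients and renormalizes so the fiber compartments sum to unity. This is a sketch of the constraint, not the paper's actual solver, and the FOD values are made up:

```python
import numpy as np

def l0_sparsify(fod, k):
    """Keep the k largest non-negative FOD coefficients, zero the rest,
    and renormalize so the fiber compartments sum to unity. A greedy
    sketch of an explicit l0 bound, not the paper's reconstruction method."""
    x = np.maximum(np.asarray(fod, dtype=float), 0.0)
    out = np.zeros_like(x)
    keep = np.argsort(x)[-k:]      # indices of the k largest entries
    out[keep] = x[keep]
    total = out.sum()
    return out / total if total > 0 else out

# Hypothetical FOD amplitudes over five orientations.
fod = [0.05, 0.60, 0.02, 0.30, 0.03]
sparse_fod = l0_sparsify(fod, k=2)  # at most two fiber compartments
```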
Abstract:
In this paper we make three contributions to the literature on optimal Competition Law enforcement procedures. The first (which is of general interest beyond competition policy) is to clarify the concept of “legal uncertainty”, relating it to ideas in the literature on Law and Economics, but formalising the concept through various information structures which specify the probability that each firm attaches – at the time it takes an action – to the possibility of its being deemed anti-competitive were it to be investigated by a Competition Authority. We show that the existence of Type I and Type II decision errors by competition authorities is neither necessary nor sufficient for the existence of legal uncertainty, and that information structures with legal uncertainty can generate higher welfare than information structures with legal certainty – a result echoing a similar finding obtained in a completely different context and under different assumptions in earlier Law and Economics literature (Kaplow and Shavell, 1992). Our second contribution is to revisit and significantly generalise the analysis in our previous paper, Katsoulacos and Ulph (2009), involving a welfare comparison of Per Se and Effects-Based legal standards. In that analysis we considered just a single information structure under an Effects-Based standard and also penalties were exogenously fixed. Here we allow for (a) different information structures under an Effects-Based standard and (b) endogenous penalties. We obtain two main results: (i) considering all information structures, a Per Se standard is never better than an Effects-Based standard; (ii) optimal penalties may be higher when there is legal uncertainty than when there is no legal uncertainty.
Abstract:
This paper develops a methodology to estimate entire population distributions from bin-aggregated sample data. We do this through the estimation of the parameters of mixtures of distributions that allow for maximal parametric flexibility. The statistical approach we develop enables comparisons of the full distributions of height data from potential army conscripts across France's 88 departments for most of the nineteenth century. These comparisons are made by testing for differences of means and for stochastic dominance. Corrections for possible measurement errors are also devised by taking advantage of the richness of the data sets. Our methodology is of interest to researchers working on historical as well as contemporary bin-aggregated or histogram-type data, something that is still widely done since much of the publicly available information comes in that form, often owing to political sensitivity and/or confidentiality concerns.
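The core of estimating a distribution from bin-aggregated data is that each bin's likelihood contribution is the CDF mass between its edges. A sketch with a single normal (the paper uses flexible mixtures) fitted by a crude grid search to hypothetical height bins:

```python
import math
import numpy as np

def norm_cdf(x, mu, sigma):
    """CDF of N(mu, sigma^2) via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def binned_loglik(mu, sigma, edges, counts):
    """Log-likelihood of bin counts under N(mu, sigma^2): each bin
    contributes count * log(CDF mass between its edges)."""
    ll = 0.0
    for lo, hi, n in zip(edges[:-1], edges[1:], counts):
        p = norm_cdf(hi, mu, sigma) - norm_cdf(lo, mu, sigma)
        ll += n * math.log(max(p, 1e-300))  # guard against log(0)
    return ll

# Hypothetical height bins (cm) and conscript counts, illustrative only.
edges = [150, 160, 165, 170, 175, 190]
counts = [8, 25, 40, 20, 7]

# Crude grid search for the maximum-likelihood mean and spread.
grid = [(mu, s) for mu in np.arange(160.0, 176.0, 0.5)
                for s in np.arange(3.0, 12.0, 0.5)]
mu_hat, sigma_hat = max(grid, key=lambda p: binned_loglik(p[0], p[1], edges, counts))
```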
Abstract:
A study on lead pollution was carried out on a sample of ca. 300 city children. This paper presents the errors producing bias in the sample. It is emphasized that, in Switzerland, the difference between the Swiss and the migrant population (the latter being mainly Italian and Spanish) must be taken into account.
Abstract:
BACKGROUND: Little information is available on the validity of simple and indirect body-composition methods in non-Western populations. Equations for predicting body composition are population-specific, and body composition differs between blacks and whites. OBJECTIVE: We tested the hypothesis that the validity of equations for predicting total body water (TBW) from bioelectrical impedance analysis measurements is likely to depend on the racial background of the group from which the equations were derived. DESIGN: The hypothesis was tested by comparing, in 36 African women, TBW values measured by deuterium dilution with those predicted by 23 equations developed in white, African American, or African subjects. These cross-validations in our African sample were also compared, whenever possible, with results from other studies in black subjects. RESULTS: Errors in predicting TBW showed acceptable values (1.3-1.9 kg) in all cases, whereas a large range of bias (0.2-6.1 kg) was observed independently of the ethnic origin of the sample from which the equations were derived. Three equations (2 from whites and 1 from blacks) showed nonsignificant bias and could be used in Africans. In all other cases, we observed either an overestimation or underestimation of TBW with variable bias values, regardless of racial background, yielding no clear trend for validity as a function of ethnic origin. CONCLUSIONS: The findings of this cross-validation study emphasize the need for further fundamental research to explore the causes of the poor validity of TBW prediction equations across populations rather than the need to develop new prediction equations for use in Africa.
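The validity criteria used above, bias and prediction error, can be sketched as the mean and the root mean square of the prediction-measurement differences; the TBW values below are hypothetical:

```python
import numpy as np

def bias_and_error(measured, predicted):
    """Mean difference (bias) and root mean square of the differences
    (prediction error) between predicted and measured TBW."""
    d = np.asarray(predicted, dtype=float) - np.asarray(measured, dtype=float)
    return float(d.mean()), float(np.sqrt(np.mean(d ** 2)))

# Hypothetical TBW values (kg): deuterium dilution vs a prediction equation.
measured  = [28.1, 30.4, 26.7, 31.2, 29.0]
predicted = [29.0, 31.5, 27.2, 32.0, 30.1]

bias, error = bias_and_error(measured, predicted)  # positive bias = overestimation
```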
Abstract:
Current American Academy of Neurology (AAN) guidelines for outcome prediction in comatose survivors of cardiac arrest (CA) were validated before the therapeutic hypothermia (TH) era. We undertook this study to verify the prognostic value of clinical and electrophysiological variables in the TH setting. A total of 111 consecutive comatose survivors of CA treated with TH were prospectively studied over a 3-year period. Neurological examination, electroencephalography (EEG), and somatosensory evoked potentials (SSEP) were performed immediately after TH, at normothermia and off sedation. Neurological recovery was assessed at 3 to 6 months, using Cerebral Performance Categories (CPC). Three clinical variables, assessed within 72 hours after CA, showed higher false-positive mortality predictions as compared with the AAN guidelines: incomplete brainstem reflexes recovery (4% vs 0%), myoclonus (7% vs 0%), and absent motor response to pain (24% vs 0%). Furthermore, unreactive EEG background was incompatible with good long-term neurological recovery (CPC 1-2) and strongly associated with in-hospital mortality (adjusted odds ratio for death, 15.4; 95% confidence interval, 3.3-71.9). The presence of at least 2 independent predictors out of 4 (incomplete brainstem reflexes, myoclonus, unreactive EEG, and absent cortical SSEP) accurately predicted poor long-term neurological recovery (positive predictive value = 1.00); EEG reactivity significantly improved the prognostication. Our data show that TH may modify outcome prediction after CA, implying that some clinical features should be interpreted with more caution in this setting as compared with the AAN guidelines. EEG background reactivity is useful in determining the prognosis after CA treated with TH.
Abstract:
As computer chip implementation technologies evolve to obtain more performance, chips are built from smaller components, with a higher density of transistors, and operate at lower supply voltages. All these factors make chips less robust and increase the probability of transient faults. A transient fault may occur once and never recur in the same way during a computer system's lifetime. There are distinct consequences when a transient fault occurs: the operating system might abort the execution if the change produced by the fault is detected through the application's bad behavior, but the biggest risk is that the fault produces an undetected data corruption that modifies the application's final result without warning (for example, a bit flip in some crucial data). With the objective of researching transient faults in a computer system's processor registers and memory, we have developed an extension of HP and AMD's joint full-system simulation environment, named COTSon. This extension allows the injection of faults that change a single bit in the processor registers and memory of the simulated computer. The developed fault-injection system makes it possible to evaluate the effects of single-bit-flip transient faults on an application, analyze an application's robustness against single-bit-flip transient faults, and validate fault-detection mechanisms and strategies.
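The single-bit-flip fault model can be sketched by flipping one bit of a 64-bit word; flipping a mantissa bit of an IEEE-754 double shows how such a fault silently corrupts application data. This is an illustration of the fault model only, not COTSon's actual injection code:

```python
import struct

def flip_bit(word, bit):
    """Flip a single bit of a 64-bit word: the single-bit-flip transient
    fault model (illustrative sketch, not COTSon's implementation)."""
    return word ^ (1 << bit)

def inject_into_double(x, bit):
    """Flip one bit of the IEEE-754 representation of x, mimicking an
    undetected data corruption in application data."""
    (word,) = struct.unpack("<Q", struct.pack("<d", x))
    (corrupted,) = struct.unpack("<d", struct.pack("<Q", flip_bit(word, bit)))
    return corrupted

value = 1.0
corrupted = inject_into_double(value, 51)  # flip the top mantissa bit
# The application now sees 1.5 instead of 1.0, with no warning at all.
```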