988 results for Gaussian assumption
Abstract:
Waveform tomographic imaging of crosshole georadar data is a powerful method for investigating the shallow subsurface because of its ability to provide images of pertinent petrophysical parameters with extremely high spatial resolution. All current crosshole georadar waveform inversion strategies are based on the assumption of frequency-independent electromagnetic constitutive parameters. In reality, however, these parameters are known to be frequency-dependent and complex, and thus recorded georadar data may show significant dispersive behavior. In this paper, we use synthetic data to evaluate the reconstruction limits of a recently published crosshole georadar waveform inversion scheme in the presence of varying degrees of dielectric dispersion. Our results indicate that, when combined with a source wavelet estimation procedure that provides a means of partially accounting for the frequency-dependent effects through an "effective" wavelet, the inversion algorithm performs remarkably well in weakly to moderately dispersive environments and has the ability to provide adequate tomographic reconstructions.
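The dispersive behavior at issue can be made concrete with a standard single-pole Debye model of complex, frequency-dependent permittivity. The sketch below is illustrative only, with hypothetical parameter values, and is not the dispersion model used in the study:

```python
import numpy as np

# Single-pole Debye model: eps(w) = eps_inf + (eps_s - eps_inf) / (1 + j*w*tau).
# All parameter values are hypothetical, chosen only to show dispersion over a
# typical georadar frequency band.
eps_s, eps_inf, tau = 25.0, 5.0, 1e-9   # static / high-freq. relative permittivity, relaxation time [s]

freq = np.logspace(7, 9, 200)           # 10 MHz to 1 GHz
omega = 2 * np.pi * freq
eps = eps_inf + (eps_s - eps_inf) / (1 + 1j * omega * tau)

# A frequency-independent inversion treats eps.real as constant and ignores
# eps.imag (attenuation); the spread printed below is what an "effective"
# source wavelet must partially absorb.
print(eps.real.min(), eps.real.max(), (-eps.imag).max())
```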
Abstract:
In previous work we applied the environmental multi-region input-output (MRIO) method proposed by Turner et al (2007) to examine the ‘CO2 trade balance’ between Scotland and the rest of the UK. In McGregor et al (2008) we construct an interregional economy-environment input-output (IO) and social accounting matrix (SAM) framework that allows us to investigate methods of attributing responsibility for pollution generation in the UK at the regional level. This facilitates analysis of the nature and significance of environmental spillovers and of the existence of an environmental ‘trade balance’ between regions. While significant data problems mean that the quantitative results of this study should be regarded as provisional, we argue that such a framework allows us to begin to consider questions such as the extent to which a devolved authority like the Scottish Parliament can and should be responsible for contributing to national emissions-reduction targets (e.g. the UK commitment under the Kyoto Protocol) when it is limited in how it can control emissions, particularly with respect to changes in demand elsewhere in the UK. However, while such analysis is useful for accounting for pollution flows in the single time period to which the accounts relate, it is limited when the focus is on modelling the impacts of any marginal change in activity. This is because a conventional demand-driven IO model assumes an entirely passive supply side in the economy (i.e. all supply is infinitely elastic) and is further restricted by the assumption of universal Leontief (fixed proportions) technology implied by the use of the A and multiplier matrices. In this paper we argue that where analysis of marginal changes in activity is required, a more flexible interregional computable general equilibrium approach that models behavioural relationships in a more realistic and theory-consistent manner is more appropriate and informative. To illustrate our analysis, we compare the results of introducing a positive demand stimulus into the UK economy using both IO and CGE interregional models of Scotland and the rest of the UK. In the case of the latter, we demonstrate how more theory-consistent modelling of both demand- and supply-side behaviour at the regional and national levels affects model results, including the impact on the interregional CO2 ‘trade balance’.
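The restriction referred to here can be stated compactly. In the standard demand-driven IO model (generic notation, not specific to the paper), gross output \(x\) responds to final demand \(f\) through

\[
x = (I - A)^{-1} f,
\]

where \(A\) is the matrix of fixed Leontief technical coefficients and \((I - A)^{-1}\) is the multiplier (Leontief inverse) matrix. Any change \(\Delta f\) is met one-for-one by \(\Delta x = (I - A)^{-1}\,\Delta f\), with no price or supply-side response; this is precisely the passive, infinitely elastic supply side that the CGE treatment relaxes.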
Abstract:
The application of multi-region environmental input-output (IO) analysis to the problem of accounting for emissions generation (and/or resource use) under different accounting principles has become increasingly common, particularly in the ecological and environmental economics literature, with applications at the international and the interregional (subnational) levels. However, while environmental IO analysis is invaluable for accounting for pollution flows in the single time period to which the accounts relate, it is limited when the focus is on modelling the impacts of any marginal change in activity. This is because a conventional demand-driven IO model assumes an entirely passive supply side in the economy (i.e. all supply is infinitely elastic) and is further restricted by the assumption of universal Leontief (fixed proportions) technology implied by the use of the A and multiplier matrices. Where analysis of marginal changes in activity is required, extension from an IO accounting framework to a more flexible interregional computable general equilibrium (CGE) approach, in which behavioural relationships can be modelled in a more realistic and theory-consistent manner, is appropriate. Our argument is illustrated by comparing the results of introducing a positive demand stimulus into the UK economy using IO and CGE interregional models of Scotland and the rest of the UK. In the case of the latter, we demonstrate how more theory-consistent modelling of both demand- and supply-side behaviour at the regional and national levels affects model results, including the impact on the interregional CO2 ‘trade balance’.
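As a purely illustrative rendering of the IO side of this comparison, the sketch below attributes emissions to Scottish and rest-of-UK final demand in a two-region Leontief model. All coefficients are made up, and the CGE side has no such closed form:

```python
import numpy as np

# Two-region (Scotland "S", rest-of-UK "R") demand-driven IO sketch with
# hypothetical coefficients; illustrates emissions attribution and the
# interregional CO2 'trade balance', not the authors' actual data.
A = np.array([[0.20, 0.05],    # interregional technical coefficients,
              [0.10, 0.25]])   # rows/cols ordered [S, R]
e = np.array([0.30, 0.40])     # CO2 emitted per unit of gross output

f_S = np.array([100.0, 0.0])   # final demand attributed to Scottish consumers
f_R = np.array([0.0, 500.0])   # final demand attributed to RUK consumers

L = np.linalg.inv(np.eye(2) - A)   # Leontief inverse (passive supply side)

emb_S = e @ L @ f_S   # UK-wide emissions embodied in Scottish final demand
emb_R = e @ L @ f_R   # UK-wide emissions embodied in RUK final demand

# CO2 'trade balance' for Scotland: emissions generated in Scotland to serve
# RUK demand minus emissions generated in RUK to serve Scottish demand.
gen_in_S_for_R = e[0] * (L @ f_R)[0]
gen_in_R_for_S = e[1] * (L @ f_S)[1]
print(emb_S, emb_R, gen_in_S_for_R - gen_in_R_for_S)
```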
Abstract:
This paper evaluates, from an Allyn Youngian perspective, the neoclassical Solow model of growth and the associated empirical estimates of the sources of growth based on it. It attempts to clarify Young’s particular concept of generalised or macroeconomic “increasing returns” to show the limitations of a model of growth based on an assumption that the aggregate production function is characterised by constant returns to scale but “augmented” by exogenous technical progress. Young’s concept of endogenous, self-sustaining growth is also shown to differ in important respects (including in its policy implications) from modern endogenous growth theory.
Abstract:
We develop tests of the proportional hazards assumption, with respect to a continuous covariate, in the presence of unobserved heterogeneity with unknown distribution at the individual observation level. The proposed tests are especially powerful against ordered alternatives useful for modeling non-proportional hazards situations. In contrast to the case where the heterogeneity distribution is known up to finite-dimensional parameters, the null hypothesis for the current problem is similar to a test for absence of covariate dependence. However, the two testing problems differ in the nature of the relevant alternative hypotheses. We develop tests for both problems against ordered alternatives. Small-sample performance and an application to real data highlight the usefulness of the framework and methodology.
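A generic statement of the setting (the notation is ours, not necessarily the authors') is the mixed proportional hazards model

\[
\lambda(t \mid x, u) = u \, \lambda_0(t) \, e^{\beta x},
\]

where \(x\) is the continuous covariate, \(\lambda_0\) a baseline hazard, and \(u\) an unobserved individual-level frailty with unknown distribution. Proportionality with respect to \(x\) means the hazard ratio between two covariate values does not vary with \(t\); with the distribution of \(u\) left unrestricted, the null of proportionality behaves like the null \(\beta = 0\) of no covariate dependence, as noted in the abstract.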
Abstract:
Until recently, much effort was devoted to the estimation of panel data regression models without adequate attention being paid to the drivers of diffusion and interaction across cross-section and spatial units. We discuss some new methodologies in this emerging area and demonstrate their use in measurement and inference on cross-section and spatial interactions. Specifically, we highlight the important distinction between spatial dependence driven by unobserved common factors and that based on a spatial weights matrix. We argue that purely factor-driven models of spatial dependence may be somewhat inadequate because of their connection with the exchangeability assumption. Limitations and potential enhancements of the existing methods are discussed, and several directions for new research are highlighted.
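The distinction emphasised here can be written generically (our notation, not the authors'). Factor-driven dependence takes the form

\[
y_{it} = \gamma_i' f_t + \varepsilon_{it},
\]

with a small number of unobserved common factors \(f_t\) loading on all units, whereas spatial-weights dependence takes the form

\[
y_{it} = \rho \sum_{j} w_{ij}\, y_{jt} + \varepsilon_{it},
\]

with a prespecified \(N \times N\) weights matrix \(W = (w_{ij})\) linking each unit to its neighbours. In the first specification, units are symmetric up to their loadings \(\gamma_i\), which is one way to see the connection to the exchangeability assumption criticised above.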
Abstract:
This work develops a methodology for using the chemical characteristics of tire traces to help answer the following question: "Is the offending tire at the origin of the trace found at the crime scene?". The methodology extends from sampling the trace on the road to the statistical analysis of its chemical characteristics. A study of the composition and manufacture of tire treads, together with a review of instrumental techniques used for the analysis of polymeric materials, led to the selection of pyrolysis coupled to a gas chromatograph with a mass spectrometry detector (Py-GC/MS) as the analytical technique for this research. An analytical method was developed and optimized to obtain the lowest variability between replicates of the same sample. The within-tread variability was evaluated across the width and circumference of the tread using several samples taken from twelve tires of different brands and/or models. The variability within each tread (within-variability) and between treads (between-variability) could thus be quantified. Different statistical methods showed that the within-variability is lower than the between-variability, which made it possible to differentiate these tires. Ten tire traces were produced in braking tests with tires of different brands and/or models. These traces were adequately sampled using sheets of gelatine, and particles of each trace were analysed with the same methodology as the tires at their origin. The general chemical profile of a trace or of a tire was characterized by eighty-six compounds. A statistical comparison of the resulting chemical profiles showed that a tire trace is not differentiable from the tire at its origin but is generally differentiable from tires that are not at its origin. Thereafter, a sample of sixty tires was analysed to assess the discrimination potential of the developed methodology. The statistical results showed that most tires of different brands and models are differentiable. However, tires of the same brand and model with identical characteristics, such as country of manufacture, size and DOT number, are not differentiable. A model based on a likelihood ratio approach was chosen to evaluate the results of the comparisons between the chemical profiles of the traces and the tires. The methodology was finally blind-tested on three simulated scenarios, each involving a trace from an unknown tire and two tires possibly at its origin. The correct results for all three scenarios validated the developed methodology. Together, these steps provided the information required to test and validate the underlying assumption that it is possible to help determine whether an offending tire is or is not at the origin of a trace by means of a statistical comparison of their chemical profiles. This aid was formalized as a measure of the probative value of the evidence, represented here by the chemical profile of the tire trace.
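The abstract does not specify the form of the likelihood ratio model, so the sketch below is a generic score-based likelihood ratio with made-up Gaussian score distributions, included only to make the comparison logic concrete:

```python
import numpy as np
from scipy.stats import norm

# Score-based likelihood-ratio sketch for a trace-vs-tire comparison.
# Hypothetical setup: the score is a distance between mean chemical profiles
# (e.g. 86 Py-GC/MS compound measurements), modelled as Gaussian under each
# hypothesis. All parameters below are invented for illustration.
mu_same, sd_same = 1.0, 0.4   # scores when trace and tire share a source
mu_diff, sd_diff = 4.0, 1.2   # scores when they come from different sources

def likelihood_ratio(score: float) -> float:
    """LR = p(score | same source) / p(score | different source)."""
    return norm.pdf(score, mu_same, sd_same) / norm.pdf(score, mu_diff, sd_diff)

rng = np.random.default_rng(0)
trace = rng.normal(size=86)        # stand-in chemical profile of the trace
tire = trace + 0.1                 # stand-in profile of the questioned tire
score = np.linalg.norm(trace - tire) / np.sqrt(86)
print(f"score={score:.2f}, LR={likelihood_ratio(score):.1f}")
```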
Abstract:
The paper uses a regional input-output (IO) framework and data derived on waste generation by industry to examine regional accountability for waste generation. In addition to estimating a series of industry output-waste coefficients, the paper considers two methods of waste attribution. It focuses first on a trade-endogenised linear attribution system (TELAS), which permits a greater focus on private and public final consumption as the main exogenous driver of waste generation. Second, the paper uses a domestic technology assumption (DTA) to consider a regional ‘waste footprint’ in which local consumption requirements are assumed to be met through domestic production.
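In generic notation (ours, not necessarily the paper's), the accounting runs as follows: with \(\omega\) a vector of industry output-waste coefficients, the waste attributed to final consumption \(f\) is

\[
w = \omega' (I - A)^{-1} f,
\]

and the DTA-based ‘waste footprint’ evaluates this expression as if imported inputs were produced with the region's own technology, applying the domestic coefficient matrix to total (domestic plus imported) requirements.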
Abstract:
We consider a general equilibrium model à la Bhaskar (Review of Economic Studies 2002): there are complementarities across sectors, each of which comprises (many) heterogeneous monopolistically competitive firms. Bhaskar's model is extended in two directions: production requires capital, and labour markets are segmented. Labour market segmentation models the difficulties of labour migrating across international barriers (in a trade context) or from a poor region to a richer one (in a regional context), whilst the assumption of a single capital market means that capital flows freely between countries or regions. The model is solved analytically and a closed-form solution is provided. Adding labour market segmentation to Bhaskar's two-tier industrial structure allows us to study, inter alia, the impact of competition regulations on wages and financial flows in both the regional and the international context, and the output, welfare and financial implications of relaxing immigration laws. The analytical approach adopted allows us not only to sign the effect of policies, but also to quantify their effects. Introducing capital as a factor of production improves the realism of the model and refines its empirically testable implications.
Abstract:
This paper studies the implications for monetary policy of heterogeneous expectations in a New Keynesian model. The assumption of rational expectations is replaced with parsimonious forecasting models in which agents select between predictors that are underparameterized. In a Misspecification Equilibrium, agents select only the best-performing statistical models. We demonstrate that, even when monetary policy rules satisfy the Taylor principle by adjusting nominal interest rates more than one-for-one with inflation, there may exist equilibria with Intrinsic Heterogeneity. Under certain conditions, multiple misspecification equilibria may exist. We show that these findings have important implications for business cycle dynamics and for the design of monetary policy.
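For reference, the Taylor principle invoked here concerns interest-rate rules of the generic form

\[
i_t = \bar{\imath} + \phi_\pi \pi_t + \phi_x x_t, \qquad \phi_\pi > 1,
\]

so that the nominal rate moves more than one-for-one with inflation \(\pi_t\). An underparameterized predictor omits relevant state variables: for instance, one group might forecast inflation from lagged inflation alone while another uses lagged output alone (these specific predictor sets are our illustration; the abstract does not spell them out). A Misspecification Equilibrium obtains when, given the aggregate dynamics that the agents' choices induce, no group can reduce its forecast error by switching to another available predictor.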
Abstract:
This paper reviews the evidence on the effects of recessions on potential output. In contrast to the assumption in mainstream macroeconomic models that economic fluctuations do not change potential output paths, the evidence is that they do in the case of recessions. A model is proposed to explain this phenomenon, based on an analogy with water flows in porous media. Because of the discrete adjustments made by heterogeneous economic agents in such a world, potential output displays hysteresis with regard to aggregate demand shocks, and thus retains a memory of the shocks associated with recessions.
Abstract:
When speech is degraded, word report is higher for semantically coherent sentences (e.g., her new skirt was made of denim) than for anomalous sentences (e.g., her good slope was done in carrot). Such increased intelligibility is often described as resulting from "top-down" processes, reflecting an assumption that higher-level (semantic) neural processes support lower-level (perceptual) mechanisms. We used time-resolved sparse fMRI to test for top-down neural mechanisms, measuring activity while participants heard coherent and anomalous sentences presented in speech envelope/spectrum noise at varying signal-to-noise ratios (SNR). The timing of BOLD responses to more intelligible speech provides evidence of hierarchical organization, with earlier responses in peri-auditory regions of the posterior superior temporal gyrus than in more distant temporal and frontal regions. Despite Sentence content × SNR interactions in the superior temporal gyrus, prefrontal regions respond after auditory/perceptual regions. Although we cannot rule out top-down effects, this pattern is more compatible with a purely feedforward or bottom-up account, in which the results of lower-level perceptual processing are passed to inferior frontal regions. Behavioral and neural evidence that sentence content influences perception of degraded speech does not necessarily imply "top-down" neural processes.
Abstract:
We examine the proof of a classical localization theorem of Bousfield and Friedlander and remove the assumption that the underlying model category be right proper. The key to the argument is a lemma about factoring morphisms in the arrow category of a model category.
Abstract:
The project aims to achieve two objectives. First, we are analysing the labour market implications of the assumption that firms cannot pay similarly qualified employees differently according to when they joined the firm. For example, if the general situation for workers improves, a firm that seeks to hire new workers may feel it has to pay more to new hires. However, if the firm must pay the same wage to new hires and incumbents due to equal treatment, it would either have to raise the wage of the incumbents or offer new workers a lower wage than it otherwise would. This is very different from the standard assumption in economic analysis that firms are free to treat newly hired workers independently of existing hires. Second, we will use detailed data on individual wages to try to gauge whether (and to what extent) such equity is a feature of actual labour markets. To investigate this, we are using two matched employer-employee panel datasets, one from Portugal and the other from Brazil. These unique datasets provide objective records on millions of workers and their firms over a long period of time, so that we can identify which firms employ which workers at each point in time. The datasets also include a large number of firm and worker variables.
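The wage-setting tension described above can be made concrete with a back-of-the-envelope calculation; all numbers below are hypothetical:

```python
# Hypothetical arithmetic for the equal-treatment constraint described above.
incumbents, new_hires = 100, 10
w_incumbent, w_market = 10.0, 12.0   # current internal wage vs. outside wage for new hires

# Without equal treatment: only the new hires receive the higher market wage.
cost_discriminating = incumbents * w_incumbent + new_hires * w_market

# With equal treatment: matching the market for new hires forces a raise for
# everyone (the alternative is to offer new workers less than the market wage).
cost_equal_treatment = (incumbents + new_hires) * w_market

print(cost_discriminating, cost_equal_treatment)   # 1120.0 vs 1320.0
```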
Abstract:
Official calculations of automatic stabilizers are seriously flawed, since they rest on the assumption that the only element of social spending that reacts automatically to the cycle is unemployment compensation. This calls into question many estimates of discretionary fiscal policy. In response, we propose a simultaneous estimate of automatic and discretionary fiscal policy. This leads us, quite naturally, to a tripartite decomposition of the budget balance between revenues, social spending and other spending as a bare minimum. Our headline result, for a panel of 20 OECD countries over 1981-2003, is automatic stabilization of 0.59 percentage points of the primary surplus balance. All of this stabilization remains following discretionary responses during contractions, but arguably only about 3/5 of it remains in expansions, where discretionary behavior cancels the rest. We pay particular attention to the impact of the Maastricht Treaty and the SGP on the EU members of our sample, and to real-time data.
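In notation suggested by the abstract (ours), the tripartite decomposition of the budget balance \(b\) is

\[
b = r - s - g,
\]

with revenues \(r\), social spending \(s\) and other spending \(g\) each allowed its own cyclical response, rather than confining the automatic component of \(s\) to unemployment compensation. On the headline numbers, roughly \(\tfrac{3}{5} \times 0.59 \approx 0.35\) percentage points of the estimated automatic stabilization survives discretionary offsetting in expansions, while the full 0.59 survives in contractions.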