922 results for Empirical Flow Models
Abstract:
Correlations between the elliptic or triangular flow coefficients vm (m = 2 or 3) and other flow harmonics vn (n = 2 to 5) are measured using √sNN = 2.76 TeV Pb+Pb collision data collected in 2010 by the ATLAS experiment at the LHC, corresponding to an integrated luminosity of 7 μb^-1. The vm-vn correlations are measured at midrapidity as a function of centrality and, for events within the same centrality interval, as a function of the event ellipticity or triangularity defined in a forward rapidity region. For events within the same centrality interval, v3 is found to be anticorrelated with v2, and this anticorrelation is consistent with similar anticorrelations between the corresponding eccentricities ϵ2 and ϵ3. On the other hand, v4 is observed to increase strongly with v2, and v5 to increase strongly with both v2 and v3. The trend and strength of the vm-vn correlations for n = 4 and 5 are found to disagree with the ϵm-ϵn correlations predicted by initial-geometry models. Instead, these correlations are found to be consistent with the combined effects of a linear contribution to vn and a nonlinear term that is a function of v2^2 or of v2v3, as predicted by hydrodynamic models. A simple two-component fit is used to separate these two contributions. The extracted linear and nonlinear contributions to v4 and v5 are found to be consistent with previously measured event-plane correlations.
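As a schematic illustration of the two-component fit referred to above (a generic sketch of the nonlinear-response decomposition used in the hydrodynamics literature; the linear terms v4^L, v5^L and the response coefficients χ422, χ523 are standard notation from that literature, not quantities quoted from this measurement), the higher harmonics can be written as a linear term plus a nonlinear response to the lower harmonics:

    V_4 \approx V_4^{\mathrm{L}} + \chi_{422}\,(V_2)^2, \qquad
    V_5 \approx V_5^{\mathrm{L}} + \chi_{523}\,V_2 V_3, \qquad
    V_n \equiv v_n e^{i n \Psi_n},

so that the magnitude of v4 grows with v2^2 and that of v5 with v2v3, which is the behaviour the two-component fit separates into linear and nonlinear contributions.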
Abstract:
Integrated master's dissertation in Mechanical Engineering
Abstract:
"A workshop within the 19th International Conference on Applications and Theory of Petri Nets - ICATPN’1998"
Abstract:
This paper assesses empirically the importance of size discrimination and disaggregate data for deciding where to locate a start-up concern. We compare three econometric specifications using Catalan data: a multinomial logit with 4 and 41 alternatives (provinces and comarques, respectively) in which firm size is the main covariate; a conditional logit with 4 and 41 alternatives including attributes of the sites as well as size-site interactions; and a Poisson model on the comarques and the full spatial choice set (942 municipalities) with site-specific variables. Our results suggest that if these two issues are ignored, conclusions may be misleading. We provide evidence that large and small firms behave differently and conclude that Catalan firms tend to choose between comarques rather than between municipalities. Moreover, labour-intensive firms seem more likely to be located in the city of Barcelona. Keywords: Catalonia, industrial location, multinomial response model. JEL: C250, E30, R00, R12
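As a minimal sketch of the third specification mentioned above (a Poisson count model of location choices with site-specific covariates; the variable names and data below are hypothetical, not the paper's Catalan dataset):

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical site-level data: one row per municipality, with the count of
    # start-ups locating there and illustrative site attributes.
    sites = pd.DataFrame({
        "startups":   [12, 3, 0, 7, 25, 1],
        "log_pop":    [10.2, 8.1, 7.3, 9.0, 11.5, 7.9],
        "wage_index": [1.05, 0.92, 0.88, 0.97, 1.12, 0.90],
        "dist_bcn":   [5.0, 60.0, 120.0, 35.0, 0.0, 150.0],  # km to Barcelona
    })

    # Poisson regression of location counts on site-specific covariates.
    model = smf.poisson("startups ~ log_pop + wage_index + dist_bcn", data=sites).fit()
    print(model.summary())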
Abstract:
This paper provides empirical evidence that continuous time models with one factor of volatility, in some conditions, are able to fit the main characteristics of financial data. It also reports the importance of the feedback factor in capturing the strong volatility clustering of data, caused by a possible change in the pattern of volatility in the last part of the sample. We use the Efficient Method of Moments (EMM) by Gallant and Tauchen (1996) to estimate logarithmic models with one and two stochastic volatility factors (with and without feedback) and to select among them.
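A minimal simulation sketch of the kind of one-factor logarithmic stochastic volatility model referred to above (parameter values are illustrative and the feedback term is omitted; this is not the paper's EMM estimation):

    import numpy as np

    # Log-SV model: r_t = exp(h_t / 2) * eps_t,  h_t = mu + phi*(h_{t-1} - mu) + sigma_h * eta_t.
    rng = np.random.default_rng(0)
    T, mu, phi, sigma_h = 1000, -1.0, 0.97, 0.15

    h = np.empty(T)
    h[0] = mu
    for t in range(1, T):
        h[t] = mu + phi * (h[t - 1] - mu) + sigma_h * rng.standard_normal()

    # Volatility clustering emerges from the persistence of the latent factor h_t.
    returns = np.exp(h / 2) * rng.standard_normal(T)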
Abstract:
The main purpose of this paper is to build a research model that integrates the socioeconomic concept of social capital within intentional models of new firm creation. At the same time, some researchers have found cultural differences between countries and regions to have an effect on economic development. Therefore, a second objective of this study is to explore whether those cultural differences affect entrepreneurial cognitions. Research design and methodology: Two samples of final-year university students from Spain and Taiwan are studied through an Entrepreneurial Intention Questionnaire (EIQ). Structural equation models (Partial Least Squares) are used to test the hypotheses. The possible existence of differences between the two sub-samples is also empirically explored through a multigroup analysis. Main outcomes and results: The proposed model explains 54.5% of the variance in entrepreneurial intention. In addition, there are some significant differences between the two sub-samples that could be attributed to cultural diversity. Conclusions: This paper has shown the relevance of cognitive social capital in shaping individuals' entrepreneurial intentions across different countries. Furthermore, it suggests that national culture may shape entrepreneurial perceptions, but not cognitive social capital. Therefore, cognitive social capital and culture (made up essentially of values and beliefs) may act together to reinforce entrepreneurial intention.
Abstract:
Block factor methods offer an attractive approach to forecasting with many predictors. These methods extract the information in the predictors into factors reflecting different blocks of variables (e.g. a price block, a housing block, a financial block, etc.). However, a forecasting model which simply includes all blocks as predictors risks being over-parameterized. Thus, it is desirable to use a methodology which allows for different parsimonious forecasting models to hold at different points in time. In this paper, we use dynamic model averaging and dynamic model selection to achieve this goal. These methods automatically alter the weights attached to different forecasting models as evidence comes in about which has forecast well in the recent past. In an empirical study involving forecasting output growth and inflation using 139 UK monthly time series variables, we find that the set of predictors changes substantially over time. Furthermore, our results show that dynamic model averaging and model selection can greatly improve forecast performance relative to traditional forecasting methods.
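A minimal sketch of the forgetting-factor recursion at the heart of dynamic model averaging (the forgetting factor, the number of candidate models and the predictive densities below are illustrative assumptions, not the paper's settings):

    import numpy as np

    def dma_update(weights, pred_densities, alpha=0.99):
        """One step of dynamic model averaging over K candidate forecasting models.

        weights        : current model probabilities, shape (K,)
        pred_densities : predictive density of the new observation under each model, shape (K,)
        alpha          : forgetting factor; values below 1 let weights adapt over time
        """
        predicted = weights ** alpha
        predicted /= predicted.sum()          # prediction step: flatten weights via forgetting
        updated = predicted * pred_densities  # update step: reward models that forecast well recently
        return updated / updated.sum()

    # Example: three candidate models, the second fits the latest observation best.
    w = np.array([1 / 3, 1 / 3, 1 / 3])
    w = dma_update(w, pred_densities=np.array([0.1, 0.6, 0.3]))
    print(w)  # weight shifts toward model 2; dynamic model selection would pick the argmax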
Abstract:
Report for the scientific sojourn carried out at the University of New South Wales from February to June 2007. Two different biogeochemical models are coupled to a three-dimensional configuration of the Princeton Ocean Model (POM) for the Northwestern Mediterranean Sea (Ahumada and Cruzado, 2007). The first biogeochemical model (BLANES) is the three-dimensional version of the model described by Bahamon and Cruzado (2003) and computes the nitrogen fluxes through six compartments using semi-empirical descriptions of biological processes. The second biogeochemical model (BIOMEC) is the biomechanical NPZD model described in Baird et al. (2004), which uses a combination of physiological and physical descriptions to quantify the rates of planktonic interactions. Physical descriptions include, for example, the diffusion of nutrients to phytoplankton cells and the encounter rate of predators and prey. The link between physical and biogeochemical processes in both models is expressed by the advection-diffusion of the non-conservative tracers. The similarities in the mathematical formulation of the biogeochemical processes in the two models are exploited to determine the parameter set for the biomechanical model that best fits the parameter set used in the first model. Three years of integration have been carried out for each model to reach the so-called perpetual year run for biogeochemical conditions. Outputs from both models are averaged monthly and then compared with chlorophyll remote-sensing images obtained from the MERIS sensor.
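In generic form, the advection-diffusion link between the physical and biogeochemical processes mentioned above can be written as (the specific source and sink terms of BLANES and BIOMEC are not reproduced here):

    \frac{\partial C}{\partial t} + \mathbf{u}\cdot\nabla C
      = \nabla\cdot\left(K\,\nabla C\right) + S_{\mathrm{bio}}(C),

where C is a non-conservative tracer (e.g. one of the nitrogen compartments), u the velocity field from POM, K the eddy diffusivity, and S_bio(C) the net biogeochemical source or sink for that compartment.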
Abstract:
This paper investigates the role of institutions in determining per capita income levels and growth. It contributes to the empirical literature by using different variables as proxies for institutions and by developing a deeper analysis of the issues arising from the use of weak instruments and of too many instruments in per capita income and growth regressions. The cross-section estimation suggests that institutions seem to matter, regardless of whether they are the only explanatory variable or are combined with geographical and integration variables, although most models suffer from the problem of weak instruments. The growth models provide some interesting results: there is mixed evidence on the role of institutions, and such evidence is more likely to be associated with law and order and investment profile; government spending is an important policy variable; and collapsing the number of instruments results in fewer significant coefficients for institutions.
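As a minimal sketch of the instrumental-variables approach behind such per capita income regressions (two-stage least squares written out by hand; the variable names and data are hypothetical, and the second-stage OLS standard errors are not the proper 2SLS ones):

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical cross-country data; variable names are illustrative, not the paper's.
    df = pd.DataFrame({
        "log_gdp_pc":   [9.1, 7.8, 10.2, 8.5, 9.6, 7.2],
        "institutions": [6.5, 4.0, 8.2, 5.1, 7.0, 3.5],   # endogenous regressor
        "latitude":     [0.45, 0.10, 0.55, 0.20, 0.40, 0.05],
        "instrument":   [1.2, 3.5, 0.8, 2.9, 1.5, 4.1],   # excluded instrument
    })

    # First stage: project the endogenous institutions variable on the instrument and exogenous controls.
    first = smf.ols("institutions ~ instrument + latitude", data=df).fit()
    print("first-stage F:", first.fvalue)  # a low F signals the weak-instrument problem discussed above

    # Second stage: regress income on the fitted (instrumented) institutions.
    df["inst_hat"] = first.fittedvalues
    second = smf.ols("log_gdp_pc ~ inst_hat + latitude", data=df).fit()
    print(second.params)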
Abstract:
This paper compares the forecasting performance of different models which have been proposed for forecasting in the presence of structural breaks. These models differ in their treatment of the break process, the parameters defining the model which applies in each regime and the out-of-sample probability of a break occurring. In an extensive empirical evaluation involving many important macroeconomic time series, we demonstrate the presence of structural breaks and their importance for forecasting in the vast majority of cases. However, we find no single forecasting model consistently works best in the presence of structural breaks. In many cases, the formal modeling of the break process is important in achieving good forecast performance. However, there are also many cases where simple, rolling OLS forecasts perform well.
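A minimal sketch of the "simple, rolling OLS forecasts" benchmark mentioned above (the window length and the single lagged predictor are illustrative assumptions):

    import numpy as np

    def rolling_ols_forecasts(y, x, window=40):
        """One-step-ahead forecasts of y[t] from the lagged predictor x[t-1],
        with OLS coefficients re-estimated on a rolling window ending at t-1."""
        forecasts = {}
        for t in range(window + 1, len(y)):
            ys = y[t - window:t]             # y_{t-window}, ..., y_{t-1}
            xs = x[t - window - 1:t - 1]     # matching lagged predictor values
            X = np.column_stack([np.ones(window), xs])
            beta, *_ = np.linalg.lstsq(X, ys, rcond=None)
            forecasts[t] = beta[0] + beta[1] * x[t - 1]
        return forecasts

    # Example with simulated data: y depends on the first lag of x.
    rng = np.random.default_rng(1)
    x = rng.standard_normal(200)
    y = np.empty(200)
    y[0] = 0.0
    y[1:] = 0.5 * x[:-1] + 0.3 * rng.standard_normal(199)
    fc = rolling_ols_forecasts(y, x, window=40)

Because the coefficients are re-estimated on a moving window, the forecasts adapt gradually after a structural break without any explicit model of the break process.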
Abstract:
When the Bank of England (and the Federal Reserve Board) introduced their quantitative easing (QE) operations, they emphasised the effects on money and credit, but much of their empirical research on the effects of QE focuses on long-term interest rates. We use a flow of funds matrix with an independent central bank to show the implications of QE and other monetary developments, and argue that the financial crisis, the fiscal expansion and QE are likely to have constituted major exogenous shocks to money and credit in the UK which could not be digested immediately by the usual adjustment mechanisms. We present regressions of a reduced-form model in which the growth of nominal spending is determined by the growth of nominal money and other variables. These results suggest that money was not important during the Great Moderation but has had a much larger role in the period of the crisis and QE. We then use these estimates to illustrate the effects of the financial crisis and QE. We conclude that it would be useful to incorporate money and/or credit in wider macroeconometric models of the UK economy.
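Schematically, the reduced-form model described above relates nominal spending growth to nominal money growth and other variables, along the lines of (the exact variable set z_t and lag structure used in the paper are not reproduced here):

    \Delta \ln(\text{nominal spending})_t
      = \beta_0 + \beta_1\,\Delta \ln(\text{nominal money})_t + \gamma' z_t + \varepsilon_t ,

with the role of money reflected in the size and significance of β1 across sub-periods.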
Abstract:
Time-varying parameter (TVP) models have enjoyed increasing popularity in empirical macroeconomics. However, TVP models are parameter-rich and risk over-fitting unless the dimension of the model is small. Motivated by this concern, this paper proposes several time-varying dimension (TVD) models in which the dimension of the model can change over time, allowing the model to automatically choose a more parsimonious TVP representation, or to switch between different parsimonious representations. Our TVD models all fall into the category of dynamic mixture models. We discuss the properties of these models and present methods for Bayesian inference. An application involving US inflation forecasting illustrates and compares the different TVD models. We find that our TVD approaches exhibit better forecasting performance than several standard benchmarks and shrink towards parsimonious specifications.
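A minimal sketch of the time-varying parameter building block that TVD models extend (one Kalman-filter step for a regression with random-walk coefficients; the dynamic-mixture machinery that lets the model dimension change over time is not reproduced, and the noise variances are illustrative):

    import numpy as np

    def tvp_kalman_step(beta, P, x, y, q=0.01, r=1.0):
        """One Kalman-filter update for a TVP regression:
        y_t = x_t' beta_t + e_t,  beta_t = beta_{t-1} + u_t.

        beta : current coefficient estimate, shape (k,)
        P    : coefficient covariance, shape (k, k)
        x, y : regressors and observation at time t
        q, r : illustrative state- and observation-noise variances
        """
        P_pred = P + q * np.eye(len(beta))   # predict: random-walk drift in the coefficients
        e = y - x @ beta                     # one-step-ahead forecast error
        S = x @ P_pred @ x + r               # forecast error variance
        K = P_pred @ x / S                   # Kalman gain
        beta_new = beta + K * e
        P_new = P_pred - np.outer(K, x) @ P_pred
        return beta_new, P_new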
Abstract:
This paper uses data on the world's copper mining industry to measure the impact on efficiency of the adoption of the ISO 14001 environmental standard. Anecdotal and case-study literature suggests that firms are motivated to adopt this standard so as to achieve greater efficiency through changes in operating procedures and processes. Using plant-level panel data for 1992-2007 on most of the world's industrial copper mines, the study uses stochastic frontier methods to investigate the effects of ISO adoption. The various models used in this study find that adoption either tends to improve efficiency or has no impact on efficiency, and no evidence is found that ISO adoption decreases efficiency.
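In generic form, the stochastic frontier approach mentioned above models log output as a deterministic frontier plus symmetric noise minus a one-sided inefficiency term (the study's specific functional forms and the way ISO 14001 adoption enters are not reproduced here):

    \ln y_{it} = \beta' x_{it} + v_{it} - u_{it}, \qquad
    v_{it} \sim N(0, \sigma_v^2), \quad u_{it} \ge 0,

where y_it is plant output, x_it the inputs, v_it ordinary noise and u_it the inefficiency term, whose distribution can be allowed to shift with an ISO 14001 adoption indicator.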