980 results for empirical correlation


Relevance: 20.00%

Abstract:

This thesis studies quantile residuals and uses different methodologies to develop test statistics that are applicable in evaluating linear and nonlinear time series models based on continuous distributions. Models based on mixtures of distributions are of special interest because it turns out that for those models traditional residuals, often referred to as Pearson's residuals, are not appropriate. As such models have become more and more popular in practice, especially with financial time series data, there is a need for reliable diagnostic tools that can be used to evaluate them. The aim of the thesis is to show how such diagnostic tools can be obtained and used in model evaluation. The quantile residuals considered here are defined in such a way that, when the model is correctly specified and its parameters are consistently estimated, they are approximately independent with a standard normal distribution. All the tests derived in the thesis are pure significance type tests and are theoretically sound in that they properly take the uncertainty caused by parameter estimation into account. In Chapter 2 a general framework based on the likelihood function and smooth functions of univariate quantile residuals is derived that can be used to obtain misspecification tests for various purposes. Three easy-to-use tests aimed at detecting non-normality, autocorrelation, and conditional heteroscedasticity in quantile residuals are formulated. It also turns out that these tests can be interpreted as Lagrange Multiplier or score tests, so that they are asymptotically optimal against local alternatives. Chapter 3 extends the concept of quantile residuals to multivariate models. The framework of Chapter 2 is generalized, and tests aimed at detecting non-normality, serial correlation, and conditional heteroscedasticity in multivariate quantile residuals are derived based on it. Score test interpretations are obtained for the serial correlation and conditional heteroscedasticity tests and, in a rather restricted special case, for the normality test. In Chapter 4 the tests are constructed using the empirical distribution function of quantile residuals. The so-called Khmaladze martingale transformation is applied in order to eliminate the uncertainty caused by parameter estimation. Various test statistics are considered so that critical bounds for histogram-type plots as well as Quantile-Quantile and Probability-Probability type plots of quantile residuals are obtained. Chapters 2, 3, and 4 contain simulations and empirical examples which illustrate the finite-sample size and power properties of the derived tests and also how the tests and related graphical tools based on residuals are applied in practice.
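As a rough illustration of the construction described above, the following Python sketch (not taken from the thesis) computes quantile residuals for a Gaussian AR(1) model by mapping each observation through its conditional distribution function and then through the inverse standard normal distribution function, and then inspects the residuals for non-normality, autocorrelation and conditional heteroscedasticity. The model choice, parameter values and simulated data are illustrative assumptions; the thesis derives formal test statistics that also account for parameter-estimation uncertainty, which this sketch does not.

# A minimal sketch (not the thesis code): quantile residuals for a Gaussian AR(1).
import numpy as np
from scipy import stats

def quantile_residuals(y, phi, mu, sigma):
    """Map each observation through its conditional CDF (the PIT), then through
    the inverse standard normal CDF. Under a correctly specified model the
    result is approximately iid N(0, 1)."""
    cond_mean = mu + phi * (y[:-1] - mu)                   # E[y_t | y_{t-1}]
    u = stats.norm.cdf(y[1:], loc=cond_mean, scale=sigma)  # PIT values in (0, 1)
    return stats.norm.ppf(u)                               # quantile residuals

# Illustrative data and parameter values (assumed, not estimated here)
rng = np.random.default_rng(0)
y = rng.normal(size=500).cumsum() * 0.1 + rng.normal(size=500)
r = quantile_residuals(y, phi=0.3, mu=y.mean(), sigma=y.std())

# Crude diagnostics in the spirit of the three tests: normality, autocorrelation,
# and conditional heteroscedasticity of the quantile residuals.
print("skewness, excess kurtosis:", stats.skew(r), stats.kurtosis(r))
print("lag-1 autocorrelation of residuals:", np.corrcoef(r[:-1], r[1:])[0, 1])
print("lag-1 autocorrelation of squared residuals:",
      np.corrcoef(r[:-1] ** 2, r[1:] ** 2)[0, 1])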

Relevance: 20.00%

Abstract:

This thesis studies binary time series models and their applications in empirical macroeconomics and finance. In addition to previously suggested models, new dynamic extensions are proposed to the static probit model commonly used in the previous literature. In particular, we are interested in probit models with an autoregressive model structure. In Chapter 2, the main objective is to compare the predictive performance of the static and dynamic probit models in forecasting the U.S. and German business cycle recession periods. Financial variables, such as interest rates and stock market returns, are used as predictive variables. The empirical results suggest that the recession periods are predictable and that dynamic probit models, especially models with the autoregressive structure, outperform the static model. Chapter 3 proposes a Lagrange Multiplier (LM) test for the usefulness of the autoregressive structure of the probit model. The finite sample properties of the LM test are considered with simulation experiments. Results indicate that the two alternative LM test statistics have reasonable size and power in large samples. In small samples, a parametric bootstrap method is suggested to obtain approximately correct size. In Chapter 4, the predictive power of dynamic probit models in predicting the direction of stock market returns is examined. The novel idea is to use the recession forecast (see Chapter 2) as a predictor of the stock return sign. The evidence suggests that the signs of the U.S. excess stock returns over the risk-free return are predictable both in and out of sample. The new "error correction" probit model yields the best forecasts and it also outperforms other predictive models, such as ARMAX models, in terms of statistical and economic goodness-of-fit measures. Chapter 5 generalizes the analysis of univariate models considered in Chapters 2-4 to the case of a bivariate model. A new bivariate autoregressive probit model is applied to predict the current state of the U.S. business cycle and growth rate cycle periods. Evidence of predictability of both cycle indicators is obtained and the bivariate model is found to outperform the univariate models in terms of predictive power.
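For concreteness, the following Python sketch (an illustration in the spirit of the autoregressive probit structure, not the thesis's exact specification) writes the latent index as pi_t = omega + alpha * pi_{t-1} + beta * x_{t-1}, with P(y_t = 1) = Phi(pi_t), and estimates the parameters by maximum likelihood. The single predictor, the simulated data and the optimizer settings are assumptions made for this sketch.

# A minimal sketch of an autoregressive probit model estimated by maximum likelihood.
import numpy as np
from scipy import stats
from scipy.optimize import minimize

def ar_probit_negloglik(params, y, x):
    omega, alpha, beta = params
    pi = 0.0
    ll = 0.0
    for t in range(1, len(y)):
        pi = omega + alpha * pi + beta * x[t - 1]            # autoregressive index
        p = np.clip(stats.norm.cdf(pi), 1e-10, 1 - 1e-10)    # P(y_t = 1)
        ll += y[t] * np.log(p) + (1 - y[t]) * np.log(1 - p)
    return -ll

# Simulated stand-in data: x plays the role of a financial predictor
# (e.g. an interest-rate spread), y a binary recession indicator.
rng = np.random.default_rng(1)
x = rng.normal(size=300)
y = (rng.normal(size=300) + 0.8 * x < 0).astype(int)

res = minimize(ar_probit_negloglik, x0=[0.0, 0.5, -0.5], args=(y, x), method="BFGS")
print("estimated (omega, alpha, beta):", res.x)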

Relevance: 20.00%

Abstract:

Residual viscosity is a unique function of density for pure Freon-12 and Freon-22 vapors. Also, a plot of residual viscosity against density for Freon-12 and Freon-22 vapors exhibits a regular trend. These phenomena form the basis for predicting the viscosity of mixtures of Freon-12 and Freon-22 vapors.
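The following Python sketch illustrates (with hypothetical numbers throughout) the prediction idea described above: if residual viscosity is a single function of density shared by Freon-12 and Freon-22 vapors, a mixture viscosity can be estimated by evaluating that function at the mixture density and adding a dilute-gas contribution. The tabulated points and the simple linear mixing rule for the dilute-gas term are illustrative assumptions, not the paper's data or method.

# A minimal sketch of predicting mixture viscosity from a shared residual-viscosity curve.
import numpy as np

# Hypothetical residual-viscosity curve: density (kg/m^3) -> mu - mu0 (uPa.s)
rho_tab = np.array([5.0, 20.0, 50.0, 100.0, 150.0])
resid_tab = np.array([0.1, 0.5, 1.6, 4.0, 7.5])

def mixture_viscosity(rho_mix, x1, mu0_1, mu0_2):
    """x1: mole fraction of Freon-12; mu0_i: dilute-gas viscosities (uPa.s)."""
    mu0_mix = x1 * mu0_1 + (1.0 - x1) * mu0_2          # illustrative mixing rule
    residual = np.interp(rho_mix, rho_tab, resid_tab)  # shared residual curve
    return mu0_mix + residual

# Illustrative dilute-gas viscosities and mixture density
print(mixture_viscosity(rho_mix=80.0, x1=0.4, mu0_1=11.8, mu0_2=12.6))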

Relevance: 20.00%

Abstract:

A correlation for entropy and enthalpy based on the two-parameter law of corresponding states and the reference substance concept has been obtained. For 199 points tested, the overall average absolute and maximum deviations of the calculated entropy values from the available data are 0.74 and 7.20%, respectively. The corresponding deviations for enthalpy are 1.86 and 15.0%, respectively. A compressibility chart for chloromethanes has been prepared and shown to be superior to existing charts. For 102 points tested, the average absolute and maximum deviations in the compressibilities were 1.80 and 19.5%, respectively.
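The deviation statistics quoted above are the average absolute percent deviation and the maximum percent deviation of correlated values from reference data; the following short Python sketch shows that arithmetic on made-up numbers (the values are not the paper's data).

# A minimal sketch of average-absolute and maximum percent deviation statistics.
import numpy as np

def deviation_stats(calculated, reference):
    pct_dev = 100.0 * np.abs(calculated - reference) / np.abs(reference)
    return pct_dev.mean(), pct_dev.max()

# Hypothetical entropy values for a handful of test points
reference = np.array([1.02, 1.10, 1.25, 1.40, 1.55])
calculated = np.array([1.03, 1.09, 1.26, 1.42, 1.54])
avg_abs, max_dev = deviation_stats(calculated, reference)
print(f"average absolute deviation {avg_abs:.2f}%, maximum {max_dev:.2f}%")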

Relevance: 20.00%

Abstract:

Hydrothermal treatment of a slurry of badly crystalline (βbc) nickel hydroxide at different temperatures (65-170 degrees C) results in the progressive ordering of the structure by the step-wise elimination of disorders. Interstratification is eliminated at 140 degrees C, while cation vacancies are eliminated at 170 degrees C. A small percentage of stacking faults continues to persist even in 'crystalline' samples. Electrochemical investigations show that the crystalline nickel hydroxide has a very low (0.4 e/Ni) reversible charge storage capacity. An incidence of at least 15% stacking faults combined with cation vacancies is essential for nickel hydroxide to perform close to its theoretical (1 e/Ni) discharge capacity. (c) 2005 The Electrochemical Society.

Relevance: 20.00%

Abstract:

A new equation for predicting the thermal conductivities of organic liquids, derived using dimensionless analysis, is given. The equation (not reproduced in this abstract) correlates 51 different liquids tested within an 11% average error and a 17% standard deviation. A comparison of the proposed equation with the available correlations, and its application to some industrially important liquids, shows that the equation can be safely used to calculate the thermal conductivities of organic liquids of known molecular weight at 20°C and 1 atm pressure. Cp and ΔHv, the only two parameters for which experimental values must be known in order to use the equation, can be calculated using other well-known correlations. The proposed equation is not applicable to inorganic liquids.

Relevance: 20.00%

Abstract:

A careful comparison of the distribution in the (R, θ)-plane of all NH ... O hydrogen bonds with that for bonds between neutral NH and neutral C=O groups indicated that the latter has a larger mean R and a wider range of θ, and that its distribution is also broader than for the average case. Therefore, the potential function developed earlier for an average NH ... O hydrogen bond was modified to suit the peptide case. A three-parameter expression in Δ and θ (not reproduced in this abstract), with Δ = R - Rmin, was found to be satisfactory. By comparing the theoretically expected distribution in R and θ with the observed data (although limited), the best values were found to be p1 = 25, p3 = -2 and q1 = 1 × 10⁻³, with Rmin = 2·95 Å and Vmin = -4·5 kcal/mole. The procedure for obtaining a smooth transition from Vhb to the non-bonded potential Vnb for large R and θ is described, along with a flow chart useful for programming the formulae. Calculated values of ΔH, the enthalpy of formation of the hydrogen bond, using this function are in reasonable agreement with observation. When the atoms involved in the hydrogen bond occur in a five-membered ring, as in the sequence shown in the full text, a different formula for the potential function is needed, of the form Vhb = Vmin + p1Δ² + q1x², where x = θ - 50° for θ ≥ 50°, with p1 = 15, q1 = 0·002, Rmin = 2· Å and Vmin = -2·5 kcal/mole. © 1971 Indian Academy of Sciences.
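The five-membered-ring formula quoted above is specified closely enough to sketch in Python. In the sketch below, Rmin is left as an argument because its value is garbled in this abstract, and the behaviour for θ below 50° (taken here as x = 0) is an assumption not stated in the abstract.

# A minimal sketch of V_hb = V_min + p1*Delta^2 + q1*x^2 for the five-ring case.
def v_hb_five_ring(R, theta_deg, R_min, V_min=-2.5, p1=15.0, q1=0.002):
    """Hydrogen-bond energy in kcal/mole; R in angstroms, theta in degrees."""
    delta = R - R_min                                    # Delta = R - R_min
    x = theta_deg - 50.0 if theta_deg >= 50.0 else 0.0   # x = theta - 50 deg
    return V_min + p1 * delta ** 2 + q1 * x ** 2

# Example: a bond 0.05 A longer than R_min at theta = 60 deg
# (the R_min value here is an illustrative placeholder, not the paper's value)
print(v_hb_five_ring(R=2.65, theta_deg=60.0, R_min=2.60))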

Relevance: 20.00%

Abstract:

The significant correlation coefficient between the terrestrial heat flow and thermal conductivity computed from the continental heat flow data by Horai and Nur [1] may be explained as a natural consequence of terrestrial heat flow through a random medium. The theory predicts a value of 0.40 for the correlation coefficient. A simple statistical test shows that the majority of the computed coefficients belong to the statistical population whose mean is equal to the theoretical correlation coefficient. There are, however, a few observations of unusually high correlation coefficients which cannot be explained by the above hypothesis.
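One standard way to check whether an observed correlation coefficient is consistent with the theoretical value of 0.40 is a Fisher z-transformation test; the Python sketch below shows that calculation. The paper's exact test is not specified in this abstract, and the sample numbers used here are illustrative.

# A minimal sketch of a Fisher z test of a sample correlation against a theoretical value.
import math
from scipy import stats

def fisher_z_test(r_obs, r_theory, n):
    """Two-sided p-value for H0: population correlation equals r_theory."""
    z_obs = math.atanh(r_obs)          # Fisher z-transform of the sample r
    z_theory = math.atanh(r_theory)
    se = 1.0 / math.sqrt(n - 3)        # approximate standard error of z
    z_stat = (z_obs - z_theory) / se
    return 2.0 * (1.0 - stats.norm.cdf(abs(z_stat)))

# Illustrative numbers: a computed coefficient of 0.55 from 40 heat-flow sites
print(fisher_z_test(r_obs=0.55, r_theory=0.40, n=40))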

Relevance: 20.00%

Abstract:

This report derives from the EU-funded research project "Key Factors Influencing Economic Relationships and Communication in European Food Chains" (FOODCOMM). The research consortium consisted of the following organisations: University of Bonn (UNI BONN), Department of Agricultural and Food Marketing Research (overall project co-ordination); Institute of Agricultural Development in Central and Eastern Europe (IAMO), Department for Agricultural Markets, Marketing and World Agricultural Trade, Halle (Saale), Germany; University of Helsinki, Ruralia Institute Seinäjoki Unit, Finland; Scottish Agricultural College (SAC), Food Marketing Research Team - Land Economy Research Group, Edinburgh and Aberdeen; Ashtown Food Research Centre (AFRC), Teagasc, Food Marketing Unit, Dublin; Institute of Agricultural & Food Economics (IAFE), Department of Market Analysis and Food Processing, Warsaw; and Government of Aragon, Center for Agro-Food Research and Technology (CITA), Zaragoza, Spain. The aim of the FOODCOMM project was to examine the role (prevalence, necessity and significance) of economic relationships in selected European food chains and to identify the economic, social and cultural factors which influence co-ordination within these chains. The research project considered meat and cereal commodities in six different European countries (Finland, Germany, Ireland, Poland, Spain, UK/Scotland) and was commissioned against a background of changing European food markets. The research project as a whole consisted of seven different work packages. This report presents the results of qualitative research conducted for work package 5 (WP5) in the pig meat and rye bread chains in Finland. The Ruralia Institute would like to give special thanks to all the individuals and companies that kindly gave up their time to take part in the study. Their input has been invaluable to the project. The contribution of research assistant Sanna-Helena Rantala was significant in the data gathering. The FOODCOMM project was coordinated by the University of Bonn, Department of Agricultural and Food Market Research. Special thanks to Professor Monika Hartmann for acting as the project leader of FOODCOMM.

Relevance: 20.00%

Abstract:

Governance has been one of the most popular buzzwords in recent political science. As with any term shared by numerous fields of research, as well as everyday language, governance is encumbered by a jungle of definitions and applications. This work elaborates on the concept of network governance. Network governance refers to complex policy-making situations in which a variety of public and private actors collaborate in order to produce and define policy. Governance consists of processes of autonomous, self-organizing networks of organizations that exchange information and deliberate. Network governance is a theoretical concept that corresponds to an empirical phenomenon, and that phenomenon is often used to describe a historical development: changes in the political processes of Western societies since the 1980s. In this work, empirical governance networks are used as an organizing framework, and the concepts of autonomy, self-organization and network structure are developed as tools for empirical analysis of any complex decision-making process. This work develops this framework and explores the governance networks in the case of environmental policy-making in the City of Helsinki, Finland. The crafting of a local ecological sustainability programme required support and knowledge from all sectors of administration, a number of entrepreneurs and companies, and the inhabitants of Helsinki. The policy process relied explicitly on networking, with public and private actors collaborating to design policy instruments. Communication between individual organizations led to the development of network structures and patterns. This research analyses these patterns and their effects on policy choice by applying the methods of social network analysis. A variety of social network analysis methods are used to uncover different features of the networked process. Links between individual network positions, network subgroup structures and macro-level network patterns are compared to the types of organizations involved and the final policy instruments chosen. By using governance concepts to depict a policy process, the work aims to assess whether they contribute to models of policy-making. The conclusion is that the governance literature sheds light on events that would otherwise go unnoticed, or whose conceptualization would remain atheoretical. The framework of network governance should be in the toolkit of the policy analyst.
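As a small illustration of the kinds of social network analysis measures mentioned above (individual positions via centrality, subgroup structure via community detection, macro-level patterns via density), the Python sketch below uses the networkx library on an invented toy network; it is not the dissertation's data or code, and the organisation names are made up.

# A minimal sketch of centrality, subgroup and density measures on a toy network.
import networkx as nx
from networkx.algorithms import community

G = nx.Graph()
G.add_edges_from([
    ("environment_office", "energy_utility"),
    ("environment_office", "residents_assoc"),
    ("energy_utility", "transport_dept"),
    ("transport_dept", "residents_assoc"),
    ("residents_assoc", "local_business"),
])

print("degree centrality:", nx.degree_centrality(G))
print("betweenness centrality:", nx.betweenness_centrality(G))
print("subgroups:", list(community.greedy_modularity_communities(G)))
print("density:", nx.density(G))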

Relevance: 20.00%

Abstract:

The open access (OA) model for journals is compared to the open source principle for computer software. Since the early 1990s, nearly 1,000 OA scientific journals have emerged – mostly as voluntary community efforts, although recently some professionally operating publishers have used author charges or institutional membership. This study of OA journals without author charges shows that their impact is still relatively small, but awareness of them is increasing. The average number of research articles per year is lower than for major scientific journals, but the publication times are shorter.

Relevance: 20.00%

Abstract:

The current mainstream scientific-publication process has so far been only marginally affected by the possibilities offered by the Internet, despite some pioneering attempts with free electronic-only journals and electronic preprint archives. Additional electronic versions of traditional paper journals for which one needs a subscription are not a solution. A clear trend, for young researchers in particular, is to go around subscription barriers (both for paper and electronic material) and rely almost exclusively on what they can find free on the Internet, which often includes working versions posted on the home pages of the authors. A survey of how scientists retrieve publications was conducted in February 2000, aimed at measuring to what extent the opportunities offered by the Internet are already changing scientific information exchange and how researchers feel about this. This paper presents the results based on 236 replies to an extensive Web-based questionnaire, which was announced to around 3,000 researchers in the domains of construction information technology and construction management. The questions dealt with how researchers find, access, and read different sources; how many and what publications they read; how often and to which conferences they travel; how much they publish; and criteria for where they eventually decide to publish. Some of the questions compared traditional and electronic publishing, with one final section dedicated to opinions about electronic publishing. According to the survey, researchers already download half of the material that they read digitally from the Web. The most popular method for retrieving an interesting publication is downloading it for free from the author's or publisher's Web site. Researchers are not particularly willing to pay for electronic scientific publications. There is much support for a scenario of electronic journals available freely in their entirety on the Web, where the costs could be covered by, for instance, professional societies or the publishing university.

Relevance: 20.00%

Abstract:

Using Thomé's procedure, the asymptotic solutions of the Frieman and Book equation for the two-particle correlation in a plasma have been obtained in complete form. The solution is interpreted in terms of the Lorentz distance. The exact expressions for the internal energy and pressure are evaluated and are found to be a generalization of results obtained earlier by others.

Relevance: 20.00%

Abstract:

Mikael Juselius’ doctoral dissertation covers a range of significant issues in modern macroeconomics by empirically testing a number of important theoretical hypotheses. The first essay presents indirect evidence, within the framework of the cointegrated VAR model, on the elasticity of substitution between capital and labor by using Finnish manufacturing data. Instead of estimating the elasticity of substitution from the first-order conditions, he develops a new approach that utilizes a CES production function in a model with a three-stage decision process: investment in the long run, wage bargaining in the medium run, and price and employment decisions in the short run. He estimates the elasticity of substitution to be below one. The second essay tests the restrictions implied by the core equations of the New Keynesian Model (NKM) in a vector autoregressive (VAR) model, using both Euro area and U.S. data. Both the New Keynesian Phillips curve and the aggregate demand curve are estimated and tested. The restrictions implied by the core equations of the NKM are rejected on both U.S. and Euro area data. These results are important for further research. The third essay is methodologically similar to the second, but it concentrates on Finnish macro data by adopting a theoretical framework for an open economy. Juselius’ results suggest that the open-economy NKM framework is too stylized to provide an adequate explanation for Finnish inflation. The final essay provides a macroeconometric model of Finnish inflation and associated explanatory variables, and it estimates the relative importance of different inflation theories. His main finding is that Finnish inflation is primarily determined by excess demand in the product market and by changes in the long-term interest rate. This study is part of the research agenda carried out by the Research Unit of Economic Structure and Growth (RUESG). The aim of RUESG is to conduct theoretical and empirical research on important issues in industrial economics, real option theory, game theory, organization theory and the theory of financial systems, as well as to study problems in labor markets, macroeconomics, natural resources, taxation and time series econometrics. RUESG was established at the beginning of 1995 and is one of the National Centers of Excellence in research selected by the Academy of Finland. It is financed jointly by the Academy of Finland, the University of Helsinki, the Yrjö Jahnsson Foundation, the Bank of Finland and the Nokia Group. This support is gratefully acknowledged.
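To make the cointegrated VAR framework mentioned in the first essay more concrete, the Python sketch below runs a Johansen cointegration-rank test and fits a vector error correction model with statsmodels on simulated data. The variable names, the simulated series and the chosen cointegration rank are illustrative assumptions, not the dissertation's specification.

# A minimal sketch of the cointegrated VAR framework using statsmodels.
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM

rng = np.random.default_rng(2)
n = 300
common_trend = rng.normal(size=n).cumsum()            # shared stochastic trend
data = pd.DataFrame({
    "log_output": common_trend + rng.normal(scale=0.5, size=n),
    "log_wages": 0.8 * common_trend + rng.normal(scale=0.5, size=n),
    "log_capital": 1.2 * common_trend + rng.normal(scale=0.5, size=n),
})

# Johansen trace test for the cointegration rank
jres = coint_johansen(data, det_order=0, k_ar_diff=1)
print("trace statistics:", jres.lr1)
print("5% critical values:", jres.cvt[:, 1])

# Fit a VECM with one cointegrating relation (rank chosen here for illustration)
vecm = VECM(data, k_ar_diff=1, coint_rank=1, deterministic="co").fit()
print(vecm.beta)   # estimated cointegrating vector(s)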