81 results for Correlation matching techniques
Abstract:
The contributions of the correlated and uncorrelated components of the electron-pair density to atomic and molecular intracule I(r) and extracule E(R) densities and their Laplacian functions ∇²I(r) and ∇²E(R) are analyzed at the Hartree-Fock (HF) and configuration interaction (CI) levels of theory. The topologies of the uncorrelated components of these functions can be rationalized in terms of the corresponding one-electron densities. In contrast, by analyzing the correlated components of I(r) and E(R), namely IC(r) and EC(R), the effect of electron Fermi and Coulomb correlation can be assessed at the HF and CI levels of theory. Moreover, the contribution of Coulomb correlation can be isolated by means of difference maps between IC(r) and EC(R) distributions calculated at the two levels of theory. As application examples, the He, Ne, and Ar atomic series, the C2^2-, N2, O2^2+ molecular series, and the C2H4 molecule have been investigated. For these atoms and molecules, it is found that Fermi correlation accounts for the main characteristics of IC(r) and EC(R), with Coulomb correlation slightly increasing the locality of these functions at the CI level of theory. Furthermore, IC(r), EC(R), and the associated Laplacian functions reveal the short-ranged nature and high isotropy of Fermi and Coulomb correlation in atoms and molecules.
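For readers unfamiliar with these quantities, the intracule and extracule densities are conventionally defined as integrals of the two-electron density Γ(r1, r2) over the interelectronic and centre-of-mass coordinates; the definitions below are the standard ones from the pair-density literature and are not reproduced from this abstract.

% Intracule density I(r) and extracule density E(R); their spherical averages
% and Laplacians are the functions analyzed in the abstract above.
I(\mathbf{r}) = \int \Gamma(\mathbf{r}_1, \mathbf{r}_2)\,
  \delta\bigl(\mathbf{r}_1 - \mathbf{r}_2 - \mathbf{r}\bigr)\, d\mathbf{r}_1\, d\mathbf{r}_2,
\qquad
E(\mathbf{R}) = \int \Gamma(\mathbf{r}_1, \mathbf{r}_2)\,
  \delta\!\left(\frac{\mathbf{r}_1 + \mathbf{r}_2}{2} - \mathbf{R}\right) d\mathbf{r}_1\, d\mathbf{r}_2.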
Abstract:
The level of ab initio theory necessary to compute reliable values for the static and dynamic (hyper)polarizabilities of three medium-size π-conjugated organic nonlinear optical (NLO) molecules is investigated. The calculations were made feasible by employing field-induced coordinates in combination with a finite-field procedure. It is shown that, to obtain reasonable values for the various individual contributions to the (hyper)polarizability, it is necessary to include electron correlation. Based on the results, the convergence of the usual perturbation treatment of vibrational anharmonicity is examined.
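To illustrate the finite-field idea in its simplest form (a generic numerical-differentiation sketch, not the authors' field-induced-coordinate implementation; the coefficients in the check are hypothetical), the static polarizability and first hyperpolarizability along one field direction can be estimated from central differences of field-dependent energies E(F):

# Generic finite-field estimate of alpha and beta from energies
# E(F) = E(0) - mu*F - (1/2)*alpha*F**2 - (1/6)*beta*F**3 - ...
def finite_field_alpha_beta(energy, F=0.001):
    # energy: callable returning the electronic energy at field strength F (a.u.)
    e0 = energy(0.0)
    ep1, em1 = energy(F), energy(-F)
    ep2, em2 = energy(2 * F), energy(-2 * F)
    alpha = -(ep1 - 2.0 * e0 + em1) / F**2                       # -d2E/dF2 at F = 0
    beta = -(ep2 - 2.0 * ep1 + 2.0 * em1 - em2) / (2.0 * F**3)   # -d3E/dF3 at F = 0
    return alpha, beta

# Check against a model expansion with hypothetical coefficients mu=0.5, alpha=10, beta=200:
model = lambda F: 1.0 - 0.5 * F - 0.5 * 10.0 * F**2 - (1.0 / 6.0) * 200.0 * F**3
print(finite_field_alpha_beta(model))  # approximately (10.0, 200.0)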
Analysis and evaluation of techniques for the extraction of classes in the ontology learning process
Abstract:
This paper analyzes and evaluates, in the context of ontology learning, some techniques to identify and extract candidate terms for the classes of a taxonomy. In addition, this work points out some inconsistencies that may occur in the preprocessing of the text corpus and proposes techniques to obtain good candidate terms for the classes of a taxonomy.
Abstract:
This paper applies random matrix theory to obtain analytical characterizations of the capacity of correlated multiantenna channels. The analysis is not restricted to the popular separable correlation model, but rather it embraces a more general representation that subsumes most of the channel models that have been treated in the literature. For arbitrary signal-to-noise ratios (SNR), the characterization is conducted in the regime of large numbers of antennas. For the low- and high-SNR regions, in turn, we uncover compact capacity expansions that are valid for arbitrary numbers of antennas and that shed light on how antenna correlation impacts the tradeoffs between power, bandwidth, and rate.
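As a point of reference, a minimal Monte Carlo sketch of the quantity being characterized is shown below; it uses the separable (Kronecker) correlation model, which is only a special case of the more general representation treated in the paper, and the exponential correlation coefficient rho = 0.7 is purely illustrative.

import numpy as np

# Monte Carlo estimate of the ergodic capacity E[log2 det(I + (SNR/nt) H H^H)]
# for correlated Rayleigh fading with H = Rr^{1/2} W Rt^{1/2} (Kronecker model).
def ergodic_capacity(Rt, Rr, snr, trials=2000, seed=0):
    rng = np.random.default_rng(seed)
    nt, nr = Rt.shape[0], Rr.shape[0]
    Lt = np.linalg.cholesky(Rt)   # Rt = Lt Lt^H
    Lr = np.linalg.cholesky(Rr)   # Rr = Lr Lr^H
    bits = 0.0
    for _ in range(trials):
        W = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2.0)
        H = Lr @ W @ Lt.conj().T
        M = np.eye(nr) + (snr / nt) * (H @ H.conj().T)
        bits += np.linalg.slogdet(M)[1] / np.log(2.0)
    return bits / trials

# Exponential correlation profiles at transmitter and receiver (hypothetical rho):
nt = nr = 4
idx = np.arange(nt)
Rt = 0.7 ** np.abs(idx[:, None] - idx[None, :])
Rr = 0.7 ** np.abs(idx[:, None] - idx[None, :])
print(ergodic_capacity(Rt, Rr, snr=10.0))   # bits/s/Hz at a linear SNR of 10 (10 dB)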
Abstract:
This paper points out an empirical puzzle that arises when an RBC economy with a job matching function is used to model unemployment. The standard model can generate sufficiently large cyclical fluctuations in unemployment, or a sufficiently small response of unemployment to labor market policies, but it cannot do both. Variable search and separation, finite UI benefit duration, efficiency wages, and capital all fail to resolve this puzzle. However, both sticky wages and match-specific productivity shocks help the model reproduce the stylized facts: both make the firm's flow of surplus more procyclical, thus making hiring more procyclical too.
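For context, the sketch below shows a textbook Cobb-Douglas matching function of the kind such models use, together with the implied job-finding rate and steady-state unemployment; the parameter values are purely illustrative and are not the paper's calibration.

# Cobb-Douglas matching function M(u, v) = mu * u**alpha * v**(1-alpha);
# the job-finding rate depends on tightness theta = v/u, and steady-state
# unemployment balances separations against hires.
def job_finding_rate(theta, mu=0.6, alpha=0.5):
    return mu * theta ** (1.0 - alpha)          # f(theta) = M/u

def steady_state_unemployment(theta, s=0.035, mu=0.6, alpha=0.5):
    f = job_finding_rate(theta, mu, alpha)
    return s / (s + f)                          # s*(1-u) = f*u  =>  u = s/(s+f)

for theta in (0.5, 1.0, 1.5):
    print(theta, round(steady_state_unemployment(theta), 4))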
Abstract:
A skill-biased change in technology can account at once for the changes observed in a number of important variables of the US labour market between 1970 and 1990. These include the increasing inequality in wages, both between and within education groups, and the increase in unemployment at all levels of education. In the previous literature, by contrast, this type of technology shock cannot account for all of these changes. The paper uses a matching model with a segmented labour market, an imperfect correlation between individual ability and education, and a fixed cost of setting up a job. The endogenous increase in overeducation is key to understanding the response of unemployment to the technology shock.
Abstract:
It is proved that Jennrich's (1970) asymptotic $X^2$ test for equality of correlation matrices is algebraically equal to a Wald test statistic derived from Neudecker and Wesselman's (1990) expression of the asymptotic variance matrix of the sample correlation matrix.
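For reference, Jennrich's statistic is usually stated as follows; this is quoted from the standard literature rather than from the abstract, so the notation should be checked against Jennrich (1970). With sample correlation matrices $R_1$, $R_2$ of order $p$ from samples of sizes $n_1$, $n_2$,

% Jennrich's asymptotic chi-square statistic for H_0: P_1 = P_2
\bar{R} = \frac{n_1 R_1 + n_2 R_2}{n_1 + n_2}, \qquad
Z = \sqrt{\tfrac{n_1 n_2}{n_1 + n_2}}\;\bar{R}^{-1}(R_1 - R_2), \qquad
S = I + \bar{R} \circ \bar{R}^{-1},

X^2 = \tfrac{1}{2}\operatorname{tr}\!\left(Z^2\right)
      - \operatorname{dg}(Z)^{\top} S^{-1} \operatorname{dg}(Z)
      \;\sim\; \chi^2_{p(p-1)/2},

where dg(Z) is the vector of diagonal elements of Z and ∘ denotes the Hadamard (elementwise) product.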
Abstract:
According to Ljungqvist and Sargent (1998), high European unemployment since the 1980s can be explained by a rise in economic turbulence, leading to greater numbers of unemployed workers with obsolete skills. These workers refuse new jobs due to high unemployment benefits. In this paper we reassess the turbulence-unemployment relationship using a matching model with endogenous job destruction. In our model, higher turbulence reduces the incentives of employed workers to leave their jobs. If turbulence has only a tiny effect on the skills of workers experiencing endogenous separation, then the results of Ljungqvist and Sargent (1998, 2004) are reversed, and higher turbulence leads to a reduction in unemployment. Thus, changes in turbulence cannot provide an explanation for European unemployment that reconciles the incentives of both unemployed and employed workers.
Abstract:
Many workers believe that personal contacts are crucial for obtaining jobs in high-wage sectors. On the other hand, firms in high-wage sectors report using employee referrals because they help provide screening and monitoring of new employees. This paper develops a matching model that can explain the link between inter-industry wage differentials and use of employee referrals. Referrals lower monitoring costs because high-effort referees can exert peer pressure on co-workers, allowing firms to pay lower efficiency wages. On the other hand, informal search provides fewer job and applicant contacts than formal methods (e.g., newspaper ads). In equilibrium, the matching process generates segmentation in the labor market because of heterogeneity in the size of referral networks. Referrals match good high-paying jobs to well-connected workers, while formal methods match less attractive jobs to less-connected workers. Industry-level data show a positive correlation between industry wage premia and use of employee referrals. Moreover, evidence using the NLSY shows similar positive and significant OLS and fixed-effects estimates of the returns to employee referrals, but insignificant effects once sector of employment is controlled for. This evidence suggests referred workers earn higher wages not because of higher unobserved ability or better matches but rather because they are hired in high-wage sectors.
Illusory correlation in the remuneration of chief executive officers: It pays to play golf, and well
Abstract:
Illusory correlation refers to the use of information in decisions that is uncorrelated with the relevant criterion. We document illusory correlation in CEO compensation decisions by demonstrating that information that is uncorrelated with corporate performance is related to CEO compensation. We use publicly available data from the USA for the years 1998, 2000, 2002, and 2004 to examine the relations between golf handicaps of CEOs and corporate performance, on the one hand, and CEO compensation and golf handicaps, on the other hand. Although we find no relation between handicap and corporate performance, we do find a relation between handicap and CEO compensation. In short, golfers earn more than non-golfers, and pay increases with golfing ability. We relate these findings to the difficulties of judging compensation for CEOs. To overcome this and possibly other illusory correlations in these kinds of decisions, we recommend the use of explicit, mechanical decision rules.
Abstract:
This paper generalizes the original random matching model of money by Kiyotaki and Wright (1989) (KW) in two aspects: first, the economy is characterized by an arbitrary distribution of agents who specialize in producing a particular consumption good; and second, these agents have preferences such that they want to consume any good with some probability. The results depend crucially on the size of the fraction of producers of each good and the probability with which different agents want to consume each good. KW and other related models are shown to be parameterizations of this more general one.
Abstract:
This paper analyzes the problem of matching heterogeneous agents in a Bayesian learning model. One agent gives a noisy signal to another agent, who is responsible for learning. If production has a strong informational component, a phase of cross-matching occurs, so that agents of low knowledge catch up with those of higher knowledge. It is shown that: (i) a greater informational component in production makes cross-matching more likely; (ii) as the new technology is mastered, production becomes relatively more physical and less informational; (iii) a greater dispersion of the ability to learn and transfer information makes self-matching more likely; and (iv) self-matching leads to more self-matching, whereas cross-matching can make less productive agents overtake more productive ones.
Abstract:
A method is offered that makes it possible to apply generalized canonical correlation analysis (CANCOR) to two or more matrices of different row and column order. The new method optimizes the generalized canonical correlation analysis objective by considering only the observed values. This is achieved by employing selection matrices. We present and discuss fit measures to assess the quality of the solutions. In a simulation study we assess the performance of our new method and compare it to an existing procedure called GENCOM, proposed by Green and Carroll. We find that our new method outperforms the GENCOM algorithm both with respect to model fit and recovery of the true structure. Moreover, as our new method does not require any type of iteration, it is easier to implement and requires less computation. We illustrate the method by means of an example concerning the relative positions of the political parties in the Netherlands based on provincial data.
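The estimator itself is not spelled out in the abstract; the sketch below only illustrates the selection-matrix device, i.e., how an objective over common object scores Z can be restricted to the rows that each data block actually observes. Function and variable names are hypothetical, and the paper's own non-iterative solution is not reproduced here.

import numpy as np

# Selection matrix S_k picks, out of the n common objects, the rows observed by
# block k, so each block's term in the loss involves observed values only.
def selection_matrix(observed_rows, n):
    S = np.zeros((len(observed_rows), n))
    S[np.arange(len(observed_rows)), observed_rows] = 1.0
    return S

def gcca_style_loss(Z, blocks):
    # blocks: list of (X_k, observed_rows_k); B_k is the least-squares fit of S_k Z on X_k.
    n = Z.shape[0]
    total = 0.0
    for X, rows in blocks:
        S = selection_matrix(rows, n)
        B = np.linalg.lstsq(X, S @ Z, rcond=None)[0]
        total += np.linalg.norm(S @ Z - X @ B) ** 2
    return total

# Tiny hypothetical example: 6 common objects, two blocks of different row and column order.
rng = np.random.default_rng(1)
Z = np.linalg.qr(rng.standard_normal((6, 2)))[0]          # orthonormal common scores
blocks = [(rng.standard_normal((4, 3)), [0, 1, 2, 3]),
          (rng.standard_normal((5, 2)), [1, 2, 3, 4, 5])]
print(gcca_style_loss(Z, blocks))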
Abstract:
We present an exact test for whether two random variables that have known bounds on their support are negatively correlated. The alternative hypothesis is that they are not negatively correlated. No assumptions are made on the underlying distributions. We show by example that the Spearman rank correlation test, the competing exact test of correlation in nonparametric settings, rests on an additional assumption on the data generating process without which it is not valid as a test for correlation. We then show how to test for the significance of the slope in a linear regression analysis that involves a single independent variable and where outcomes of the dependent variable belong to a known bounded set.
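The paper's own exact test cannot be reconstructed from the abstract. For comparison only, the sketch below implements a one-sided permutation version of the Spearman rank test for negative association, i.e., the competing procedure discussed above; the data are made up for illustration.

import numpy as np

# One-sided permutation test based on the Spearman rank correlation (H1: negative
# association). This is not the paper's exact test for bounded random variables.
def spearman_rho(x, y):
    rx = np.argsort(np.argsort(x))   # ranks (no ties assumed)
    ry = np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

def permutation_pvalue(x, y, n_perm=10000, seed=0):
    rng = np.random.default_rng(seed)
    observed = spearman_rho(x, y)
    hits = sum(spearman_rho(x, rng.permutation(y)) <= observed for _ in range(n_perm))
    return (hits + 1) / (n_perm + 1)

x = np.array([0.10, 0.35, 0.50, 0.70, 0.90])
y = np.array([0.80, 0.75, 0.55, 0.30, 0.10])
print(permutation_pvalue(x, y))      # small p-value: evidence of negative association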