100 results for INFRARED FILTER SET
Abstract:
We evaluate conditional predictive densities for U.S. output growth and inflation using a number of commonly used forecasting models that rely on a large number of macroeconomic predictors. More specifically, we evaluate how well conditional predictive densities based on the commonly used normality assumption fit actual realizations out-of-sample. Our focus on predictive densities acknowledges the possibility that, although some predictors can improve or worsen point forecasts, they might have the opposite effect on higher moments. We find that normality is rejected for most models in some dimension according to at least one of the tests we use. Interestingly, however, combinations of predictive densities appear to be correctly approximated by a normal density: the simple, equal average when predicting output growth and the Bayesian model average when predicting inflation.
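As an illustration of the kind of density-evaluation exercise described above (the abstract does not name its specific tests, so the use of the probability integral transform and a Kolmogorov-Smirnov check here is an assumption of this sketch, not the authors' procedure), a Gaussian predictive density can be confronted with realizations as follows:

```python
# Illustrative sketch (not the paper's exact tests): checking whether Gaussian
# predictive densities fit realizations out-of-sample via the probability
# integral transform (PIT). If the normal predictive density is correctly
# specified, the PITs should be approximately i.i.d. uniform on [0, 1].
import numpy as np
from scipy import stats

def gaussian_pits(realizations, means, stds):
    """PITs of realized values under N(mean, std^2) predictive densities."""
    return stats.norm.cdf(realizations, loc=means, scale=stds)

rng = np.random.default_rng(0)
y = rng.normal(0.5, 1.0, size=200)      # hypothetical realized output growth
mu_hat = np.full(200, 0.4)              # hypothetical predictive means
sigma_hat = np.full(200, 1.1)           # hypothetical predictive std devs

pits = gaussian_pits(y, mu_hat, sigma_hat)
ks_stat, p_value = stats.kstest(pits, "uniform")  # uniformity check on the PITs
print(f"KS statistic = {ks_stat:.3f}, p-value = {p_value:.3f}")
```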
Abstract:
This paper estimates a translog stochastic frontier production function for a panel of 150 mixed Catalan farms over the period 1989-1993, in order to measure and explain variation in technical inefficiency scores with a one-stage approach. The model uses gross value added as the aggregate output measure. Total employment, fixed capital, current assets, specific costs and overhead costs are introduced into the model as inputs. Stochastic frontier estimates are compared with those obtained using a linear programming method with a two-stage approach. The translog stochastic frontier specification appears to be an appropriate representation of the data, technical change was rejected, and the technical inefficiency effects were statistically significant. Mean technical efficiency in the period analyzed was estimated at 64.0%. Farm inefficiency levels were found to be significantly (at the 5% level) and positively correlated with the number of economic size units.
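For reference, a translog stochastic frontier of the kind described above can be written in generic notation (the paper's exact specification, including any time-trend terms and the distributional assumptions on the error components, is not reproduced here) as

\[
\ln y_{it} = \beta_0 + \sum_{j} \beta_j \ln x_{jit}
  + \tfrac{1}{2} \sum_{j}\sum_{k} \beta_{jk} \ln x_{jit} \ln x_{kit}
  + v_{it} - u_{it},
\qquad \mathrm{TE}_{it} = \exp(-u_{it}),
\]

where \(y_{it}\) is gross value added of farm \(i\) in year \(t\), the \(x_{jit}\) are the five inputs (total employment, fixed capital, current assets, specific costs, overhead costs), \(v_{it}\) is symmetric noise, and \(u_{it} \ge 0\) is the technical-inefficiency term.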
Abstract:
This note describes how the Kalman filter can be modified to allow for the vector of observables to be a function of lagged variables without increasing the dimension of the state vector in the filter. This is useful in applications where it is desirable to keep the dimension of the state vector low. The modified filter and accompanying code (which nests the standard filter) can be used to compute (i) the steady state Kalman filter, (ii) the log likelihood of a parameterized state space model conditional on a history of observables, (iii) a smoothed estimate of latent state variables, and (iv) a draw from the distribution of latent states conditional on a history of observables.
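For context, below is a minimal sketch of the standard linear-Gaussian Kalman filter that the note's modified filter is said to nest; the modification for observables that depend on lagged variables is not reproduced, and the state-space notation (matrices A, C, Q, R) is an assumption of this sketch.

```python
# Minimal sketch of the *standard* Kalman filter for the linear-Gaussian model
#   x_t = A x_{t-1} + w_t,  w_t ~ N(0, Q)
#   y_t = C x_t + v_t,      v_t ~ N(0, R)
# Returns filtered state means and the log likelihood of the observations.
import numpy as np

def kalman_filter(y, A, C, Q, R, x0, P0):
    x, P, loglik = x0, P0, 0.0
    filtered = []
    for yt in y:                                  # y has shape (T, ny)
        # Prediction step
        x = A @ x
        P = A @ P @ A.T + Q
        # Update step
        S = C @ P @ C.T + R                       # innovation covariance
        K = P @ C.T @ np.linalg.inv(S)            # Kalman gain
        innov = yt - C @ x
        x = x + K @ innov
        P = P - K @ C @ P
        loglik += -0.5 * (len(yt) * np.log(2 * np.pi)
                          + np.log(np.linalg.det(S))
                          + innov @ np.linalg.inv(S) @ innov)
        filtered.append(x.copy())
    return np.array(filtered), loglik
```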
Abstract:
The set covering problem is an NP-hard combinatorial optimization problem that arises in applications ranging from crew scheduling in airlines to driver scheduling in public mass transport. In this paper we analyze search space characteristics of a widely used set of benchmark instances through an analysis of the fitness-distance correlation. This analysis shows that there exist several classes of set covering instances that have largely different behavior. For instances with high fitness-distance correlation, we propose new ways of generating core problems and analyze the performance of algorithms exploiting these core problems.
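As a brief illustration of the fitness-distance analysis mentioned above (the benchmark instances and the local-search procedure used to sample solutions are not reproduced here), the fitness-distance correlation is simply the Pearson correlation between solution costs and their distances to the best known solution:

```python
# Hedged sketch: fitness-distance correlation (FDC) for a sample of solutions
# of a set covering instance. `fitnesses` are solution costs and `distances`
# are distances (e.g. Hamming) to the best known solution; both arrays are
# assumed to come from some sampling procedure not shown here.
import numpy as np

def fitness_distance_correlation(fitnesses, distances):
    """Pearson correlation between fitness values and distances to the optimum."""
    f = np.asarray(fitnesses, dtype=float)
    d = np.asarray(distances, dtype=float)
    return np.corrcoef(f, d)[0, 1]

# Toy numbers, purely illustrative:
print(fitness_distance_correlation([10, 12, 15, 20], [1, 2, 4, 7]))
```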
Abstract:
It is shown how correspondence analysis may be applied to a subset of response categories from a questionnaire survey, for example the subset of undecided responses or the subset of responses for a particular category. The idea is to maintain the original relative frequencies of the categories and not re-express them relative to totals within the subset, as would normally be done in a regular correspondence analysis of the subset. Furthermore, the masses and chi-square metric assigned to the data subset are the same as those in the correspondence analysis of the whole data set. This variant of the method, called Subset Correspondence Analysis, is illustrated on data from the ISSP survey on Family and Changing Gender Roles.
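One way to write the idea, in our notation and following the usual presentation of subset correspondence analysis rather than the paper's own formulas: with \(p_{ij}\) the relative frequencies of the complete table and \(r_i\), \(c_j\) its row and column masses, the subset analysis decomposes only the standardized residuals of the retained categories \(j \in J\), so masses and the chi-square metric are inherited from the full data set:

\[
s_{ij} = \frac{p_{ij} - r_i c_j}{\sqrt{r_i c_j}}, \qquad j \in J,
\]

followed by a singular value decomposition of the submatrix \((s_{ij})_{i,\,j \in J}\).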
Abstract:
The influence of basis set size and correlation energy on the static electrical properties of the CO molecule is assessed. In particular, we have studied both the nuclear relaxation and the vibrational contributions to the static molecular electrical properties, the vibrational Stark effect (VSE) and the vibrational intensity effect (VIE). From a mathematical point of view, when a static and uniform electric field is applied to a molecule, the energy of this system can be expressed as a double power series with respect to the bond length and the field strength. From the power series expansion of the potential energy, field-dependent expressions for the equilibrium geometry, the potential energy and the force constant are obtained. The nuclear relaxation and vibrational contributions to the molecular electrical properties are analyzed in terms of the derivatives of the electronic molecular properties. In general, the results presented show that accurate inclusion of the correlation energy and large basis sets are needed to calculate the molecular electrical properties and their derivatives with respect to nuclear displacements and/or field strength. With respect to experimental data, the calculated power series coefficients are overestimated by the SCF, CISD, and QCISD methods; in contrast, perturbation methods (MP2 and MP4) tend to underestimate them. On average, for the CO molecule and using the 6-311+G(3df) basis set, the nuclear relaxation and vibrational contributions to the molecular electrical properties amount to 11.7%, 3.3%, and 69.7% of the purely electronic μ, α, and β values, respectively.
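The double power series mentioned above is commonly written as (notation ours, not necessarily the paper's):

\[
E(R, F) = \sum_{n \ge 0}\sum_{m \ge 0} \frac{1}{n!\,m!}\, a_{nm}\, (R - R_e)^{n} F^{m},
\qquad
a_{nm} = \left.\frac{\partial^{\,n+m} E}{\partial R^{n}\, \partial F^{m}}\right|_{R = R_e,\; F = 0},
\]

where \(R\) is the bond length, \(R_e\) the field-free equilibrium distance and \(F\) the field strength; minimizing \(E(R, F)\) with respect to \(R\) at fixed \(F\) yields the field-dependent equilibrium geometry, potential energy and force constant referred to in the abstract.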
Abstract:
In earlier work, the present authors have shown that hardness profiles are less dependent on the level of calculation than energy profiles for potential energy surfaces (PESs) exhibiting pathological behavior. In contrast to energy profiles, hardness profiles always show the correct number of stationary points. This characteristic has been used to indicate the existence of spurious stationary points on PESs. In the present work, we apply this methodology to the hydrogen fluoride dimer, a classic difficult case for density functional theory methods.
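For readers unfamiliar with the quantity being profiled: in conceptual density functional theory the global hardness is usually taken as the second derivative of the energy with respect to the number of electrons at fixed external potential, often approximated from the vertical ionization potential \(I\) and electron affinity \(A\) (some conventions include a factor of \(1/2\)):

\[
\eta = \left(\frac{\partial^{2} E}{\partial N^{2}}\right)_{v(\mathbf{r})} \approx I - A,
\]

and a hardness profile traces \(\eta\) along the same coordinate used for the energy profile.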
Abstract:
We describe the motivation, design, and implementation of the CORNISH survey, an arcsecond-resolution radio continuum survey of the inner Galactic plane at 5 GHz using the Very Large Array (VLA). It is a blind survey coordinated with the northern Spitzer GLIMPSE I region covering 10°
Abstract:
Study of the publication models and the means of accessing scientific literature in the current environment of digital communication and the web. The text introduces the concept of the journal article as a well-defined and stable unit within the publishing world, and as a nucleus on which professional and scholarly communication has been based since its beginnings in the 17th century. The transformation of scientific communication enabled by the digital world is analysed. Descriptions are provided of some of the practices undertaken by authors, research organisations, publishers and library-related institutions in response to the new possibilities opening up for articles, both as products and in their creation and distribution processes. These transformations affect the very nature of the article as a minimal unit, both unique and stable, of scientific communication. The article concludes by noting that, under varying documentary forms of publisher aggregation and bibliographic control (sometimes simultaneous and even apparently contradictory), a more pluralistic type of scientific communication is flourishing. This pluralism offers: more possibilities for communication among authors; fewer levels of intermediaries, such as agents that intervene in and add value to the products; greater availability for users, both economically and in terms of access; and greater interaction and richness of content, thanks to the new hypertext and multimedia possibilities.
Abstract:
A method of making a multiple matched filter that allows the recognition of different characters in successive planes under simple conditions is proposed. The filter is generated by recording on the same plate the Fourier transforms of the different patterns to be recognized, each affected by a different spherical phase factor because the patterns are placed at different distances from the lens. This is demonstrated experimentally with a triple filter that allows satisfactory recognition of three characters.
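In standard Fourier-optics notation (a generic sketch, not the paper's derivation), placing pattern \(t_k\) at distance \(d_k\) in front of a transforming lens of focal length \(f\) multiplies its Fourier transform at the filter plane by a quadratic ("spherical") phase factor, which is what separates the correlation outputs into distinct planes:

\[
H_k(u,v) \;\propto\; \exp\!\left[\frac{ik}{2f}\left(1 - \frac{d_k}{f}\right)\left(u^{2}+v^{2}\right)\right]
\, \mathcal{F}\{t_k\}\!\left(\tfrac{u}{\lambda f}, \tfrac{v}{\lambda f}\right),
\]

with \(k = 2\pi/\lambda\); the multiple filter records the superposition of the \(H_k\) on a single plate.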
Abstract:
In this paper we show that the orthorhombic phase of FeSi2 (stable at room temperature) displays a sizable anisotropy in the infrared spectra, with minor effects in the Raman data as well. This is far from trivial, since the crystal structure corresponds to a moderate distortion of the fluorite symmetry. Our analysis is carried out on small single crystals grown by flux transport, through polarization-resolved far-infrared reflectivity and Raman measurements. The measured spectra are interpreted by comparison with spectra simulated using tight-binding molecular dynamics.
Abstract:
A comparative static study of set-valued solutions for cooperative TU games is carried out. The analysis focuses on the compatibility between two classical and reasonable properties introduced by Young (1985) in the context of single-valued solutions, namely core-selection and coalitional monotonicity. As the main result, it is shown that coalitional monotonicity is incompatible not only with the core-selection property but also with the bargaining-selection property. This new impossibility result reinforces the tradeoff between these kinds of interesting and intuitive economic properties. Positive compatibility results between desirable economic properties are given when the core-selection requirement is replaced by the core-extension property.
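In the usual notation (not the paper's), the core of a TU game \((N, v)\) is

\[
C(v) = \Bigl\{\, x \in \mathbb{R}^{N} : \sum_{i \in N} x_i = v(N), \ \ \sum_{i \in S} x_i \ge v(S) \ \ \forall S \subseteq N \,\Bigr\};
\]

a solution is a core-selection if it always chooses payoff vectors in \(C(v)\) whenever \(C(v) \neq \emptyset\), and a single-valued solution is coalitionally monotonic if, whenever two games \(v\) and \(w\) coincide except that \(w(S) \ge v(S)\) for a single coalition \(S\), every member of \(S\) receives at least as much in the solution of \(w\) as in that of \(v\).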
Abstract:
This paper provides an axiomatic framework to compare the D-core (the set of undominated imputations) and the core of a cooperative game with transferable utility. Theorem 1 states that the D-core is the only solution satisfying projection consistency, reasonableness (from above), (*)-antimonotonicity, and modularity. Theorem 2 characterizes the core replacing (*)-antimonotonicity by antimonotonicity. Moreover, these axioms also characterize the core on the domain of convex games, totally balanced games, balanced games, and superadditive games.
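For reference, using standard definitions rather than the paper's axioms: an imputation \(y\) dominates an imputation \(x\) via a coalition \(S\) if \(y_i > x_i\) for all \(i \in S\) and \(\sum_{i \in S} y_i \le v(S)\); the D-core is the set of undominated imputations,

\[
DC(v) = \bigl\{\, x \in I(v) : \nexists\, y \in I(v),\ S \subseteq N \ \text{with}\ y_i > x_i \ \forall i \in S \ \text{and}\ \textstyle\sum_{i \in S} y_i \le v(S) \,\bigr\},
\]

where \(I(v)\) denotes the set of imputations (efficient, individually rational payoff vectors). The core is always contained in the D-core.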