26 results for C53 - Forecasting and Other Model Applications

in Consorci de Serveis Universitaris de Catalunya (CSUC), Spain


Relevance:

100.00%

Publisher:

Abstract:

The II Workshop on the use of computed tomography (CT) in pig carcass classification and other CT applications (live animals and meat technology) was held in Monells. The first day was devoted entirely to the use of CT for classifying pig carcasses, and the second day was opened to other CT applications, whether in live animals or in different aspects of the quality of meat and meat products. The workshop was attended by 45 people from 12 EU countries.

Relevance:

100.00%

Publisher:

Abstract:

We provide methods for forecasting variables and predicting turning points in panel Bayesian VARs. We specify a flexible model which accounts for both interdependencies in the cross section and time variations in the parameters. Posterior distributions for the parameters are obtained for a particular type of diffuse prior, for Minnesota-type priors and for hierarchical priors. Formulas for multistep, multiunit point and average forecasts are provided. An application to the problem of forecasting the growth rate of output and of predicting turning points in the G-7 illustrates the approach. A comparison with alternative forecasting methods is also provided.
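
A minimal sketch of how a Minnesota-type prior for the VAR lag coefficients can be parameterised (the paper also covers diffuse and hierarchical priors and the panel structure, which this illustration does not attempt). The function name, the hyperparameters lam1, lam2, lam3 and the random-walk prior mean are illustrative conventions, not the authors' exact specification.

    import numpy as np

    def minnesota_prior(sigma, n_lags, lam1=0.2, lam2=0.5, lam3=1.0):
        """Prior mean and variance of VAR lag coefficients under a
        Minnesota-type prior. sigma holds the residual standard deviations
        of univariate autoregressions, used to scale cross-variable
        shrinkage. Entry [i, j, l] refers to the coefficient on lag l+1 of
        variable j in equation i."""
        n = len(sigma)
        mean = np.zeros((n, n, n_lags))
        var = np.zeros((n, n, n_lags))
        for lag in range(1, n_lags + 1):
            for i in range(n):
                for j in range(n):
                    if i == j:
                        # own lags: shrinkage tightens as the lag grows
                        var[i, j, lag - 1] = (lam1 / lag**lam3) ** 2
                    else:
                        # other variables' lags: extra shrinkage via lam2,
                        # rescaled by the relative residual scales
                        var[i, j, lag - 1] = (lam1 * lam2 * sigma[i]
                                              / (lag**lam3 * sigma[j])) ** 2
        # random-walk prior mean on the first own lag
        # (often set to 0 instead when the data are growth rates)
        mean[np.arange(n), np.arange(n), 0] = 1.0
        return mean, var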

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this paper is twofold. First, we construct a DSGE model which spells out explicitly the instrumentation of monetary policy. The interest rate is determined every period depending on the supply and demand for reserves which in turn are affected by fundamental shocks: unforeseeable changes in cash withdrawal, autonomous factors, technology and government spending. Unexpected changes in the monetary conditions of the economy are interpreted as monetary shocks. We show that these monetary shocks have the usual effects on economic activity without the need of imposing additional frictions as limited participation in asset markets or sticky prices. Second, we show that this view of monetary policy may have important consequences for empirical research. In the model, the contemporaneous correlations between interest rates, prices and output are due to the simultaneous effect of all fundamental shocks. We provide an example where these contemporaneous correlations may be misinterpreted as a Taylor rule. In addition, we use the sign of the impact responses of all shocks on output, prices and interest rates derived from the model to identify the sources of shocks in the data.
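
The identification step mentioned at the end (using the signs of impact responses to label shocks) is commonly implemented by rotating the Cholesky factor of the reduced-form residual covariance with random orthogonal matrices and keeping the draws that satisfy the restrictions. The sketch below shows that generic procedure only; it is an assumption that the paper's implementation follows it, and sigma, sign_pattern and the function name are placeholders.

    import numpy as np

    def sign_identified_impacts(sigma, sign_pattern, n_draws=1000, seed=0):
        """Draw candidate impact matrices B = chol(Sigma) @ Q with Q a random
        orthogonal matrix, and keep those whose columns (structural shocks)
        satisfy the required signs on impact (+1 / -1, 0 = unrestricted).
        A column is also accepted if its sign flip satisfies the restrictions."""
        rng = np.random.default_rng(seed)
        sign_pattern = np.asarray(sign_pattern)
        chol = np.linalg.cholesky(sigma)
        accepted = []
        for _ in range(n_draws):
            q, r = np.linalg.qr(rng.standard_normal(sigma.shape))
            q = q @ np.diag(np.sign(np.diag(r)))   # Haar-uniform rotation
            b = chol @ q
            ok = True
            for j in range(b.shape[1]):
                col, want = b[:, j], sign_pattern[:, j]
                free = want == 0
                if np.all(free | (np.sign(col) == want)):
                    continue
                if np.all(free | (np.sign(-col) == want)):
                    b[:, j] = -col                 # flip the shock's sign
                    continue
                ok = False
                break
            if ok:
                accepted.append(b)
        return accepted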

Relevance:

100.00%

Publisher:

Abstract:

The theory of compositional data analysis is often focused on the composition only. However, in practical applications we often deal with a composition together with covariables measured on some other scale. This contribution systematically gathers and develops statistical tools for this situation. For instance, for the graphical display of the dependence of a composition on a categorical variable, a colored set of ternary diagrams might be a good idea for a first look at the data, but it will quickly hide important aspects if the composition has many parts or takes extreme values. On the other hand, colored scatterplots of ilr components may not be very instructive for the analyst if the conventional, black-box ilr is used. Thinking in terms of the Euclidean structure of the simplex, we suggest setting up appropriate projections which, on the one hand, show the compositional geometry and, on the other, remain comprehensible to a non-expert analyst and readable for all locations and scales of the data. This is done, for example, by defining special balance displays with carefully selected axes. Following this idea, we need to ask systematically how to display, explore, describe, and test the relation of a composition to complementary or explanatory data of categorical, real, ratio or again compositional scales. This contribution shows that a few basic concepts and very few advanced tools from multivariate statistics (principal covariances, multivariate linear models, trellis or parallel plots, etc.) are sufficient to build appropriate procedures for all these combinations of scales. This has fundamental implications for their software implementation, and for how they might be taught to analysts who are not already experts in multivariate analysis.
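
As a concrete illustration of the balance displays mentioned above, the sketch below computes a single balance coordinate between two groups of parts using the standard formula sqrt(r*s/(r+s)) * ln(g_R/g_S), where g denotes a geometric mean; the function name and indexing convention are illustrative.

    import numpy as np

    def balance(x, num_idx, den_idx):
        """Balance coordinate contrasting two disjoint groups of parts of a
        composition x: sqrt(r*s/(r+s)) * ln(geometric mean of numerator group
        / geometric mean of denominator group)."""
        x = np.asarray(x, dtype=float)
        r, s = len(num_idx), len(den_idx)
        g_num = np.exp(np.mean(np.log(x[num_idx])))
        g_den = np.exp(np.mean(np.log(x[den_idx])))
        return np.sqrt(r * s / (r + s)) * np.log(g_num / g_den)

    # e.g. contrast part 0 against parts 1 and 2 of a 3-part composition:
    # balance([0.2, 0.5, 0.3], num_idx=[0], den_idx=[1, 2])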

Relevance:

100.00%

Publisher:

Abstract:

The effectiveness of decision rules depends on characteristics of both rules and environments. A theoretical analysis of environments specifies the relative predictive accuracies of the lexicographic rule 'take-the-best' (TTB) and other simple strategies for binary choice. We identify three factors: how the environment weights variables; characteristics of choice sets; and error. For cases involving from three to five binary cues, TTB is effective across many environments. However, hybrids of equal weights (EW) and TTB models are more effective as environments become more compensatory. In the presence of error, TTB and similar models do not predict much better than a naïve model that exploits dominance. We emphasize psychological implications and the need for more complete theories of the environment that include the role of error.
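
A minimal sketch of the two strategies being compared, assuming cues are binary (0/1) and are listed from most to least valid; function and variable names are illustrative.

    def take_the_best(cues_a, cues_b):
        """Lexicographic choice: inspect cues in (given) order of validity and
        pick the option favoured by the first cue that discriminates.
        Returns 'A', 'B', or 'guess'."""
        for ca, cb in zip(cues_a, cues_b):
            if ca != cb:
                return 'A' if ca > cb else 'B'
        return 'guess'

    def equal_weights(cues_a, cues_b):
        """Tally all cues with equal weights and pick the higher sum."""
        sa, sb = sum(cues_a), sum(cues_b)
        if sa == sb:
            return 'guess'
        return 'A' if sa > sb else 'B'

    # take_the_best([1, 0, 1], [1, 1, 0]) -> 'B' (the second cue decides)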

Relevance:

100.00%

Publisher:

Abstract:

Introducing bounded rationality in a standard consumption-based asset pricing model with time-separable preferences strongly improves empirical performance. Learning causes momentum and mean reversion of returns and thereby excess volatility, persistence of price-dividend ratios, long-horizon return predictability and a risk premium, as in the habit model of Campbell and Cochrane (1999), but for lower risk aversion. This is obtained even though our learning scheme introduces just one free parameter and we consider only learning schemes that imply small deviations from full rationality. The findings are robust to the learning rule used and to other model features. What is key is that agents forecast future stock prices using past information on prices.
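
The key mechanism (agents forecasting prices from past price information with a one-parameter learning scheme) can be illustrated with a generic constant-gain updating rule. This is a sketch of that general idea, not the paper's exact belief specification; the gain value and function name are placeholders.

    def update_belief(beta_prev, price_prev, price_curr, gain=0.02):
        """Constant-gain update of the believed (gross) price growth rate:
        move the belief a fraction `gain` toward the latest observed growth."""
        observed_growth = price_curr / price_prev
        return beta_prev + gain * (observed_growth - beta_prev)

    # agents would then forecast next period's price as beta_t * price_curr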

Relevance:

100.00%

Publisher:

Abstract:

The general expansion of operators is defined as a linear combination of projectors, and its generalized application to the calculation of molecular integrals is presented. As a numerical example, it is applied to the calculation of electron repulsion integrals between four s-type functions centred at different points; both the results of the calculation and the definition of a scaling with respect to a reference value are shown, the latter of which will facilitate the optimization of the expansion for arbitrary parameters. Results fitted to the exact value are given.
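
For context, the exact (ss|ss) electron repulsion integral over four primitive Gaussians has a closed form (the standard textbook expression) against which such an expansion can be benchmarked. The sketch below implements only that reference value, not the projector-expansion method described in the abstract; names are illustrative.

    import numpy as np
    from math import erf, exp, pi, sqrt

    def boys_f0(t):
        """Zeroth-order Boys function F0(t)."""
        if t < 1e-12:
            return 1.0
        return 0.5 * sqrt(pi / t) * erf(sqrt(t))

    def eri_ssss(a, A, b, B, c, C, d, D):
        """Exact (ss|ss) electron repulsion integral over four unnormalised
        primitive Gaussians exp(-a|r-A|^2), etc., centred at A, B, C, D."""
        A, B, C, D = map(np.asarray, (A, B, C, D))
        p, q = a + b, c + d
        P = (a * A + b * B) / p
        Q = (c * C + d * D) / q
        prefactor = 2 * pi**2.5 / (p * q * sqrt(p + q))
        K = exp(-a * b / p * np.dot(A - B, A - B)
                - c * d / q * np.dot(C - D, C - D))
        t = p * q / (p + q) * np.dot(P - Q, P - Q)
        return prefactor * K * boys_f0(t)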

Relevance:

100.00%

Publisher:

Abstract:

The participle and other related phenomena in Old Spanish and Old Catalan

Relevance:

100.00%

Publisher:

Abstract:

There is hardly a case in exploration geology where the studied data do not include below-detection-limit and/or zero values, and since most geological data follow lognormal distributions, these “zero data” represent a mathematical challenge for the interpretation. We need to start by recognizing that there are zero values in geology. For example, the amount of quartz in a foyaite (nepheline syenite) is zero, since quartz cannot coexist with nepheline. Another common essential zero is a North azimuth, although we can always replace that zero by the value 360°. These are known as “essential zeros”; but what can we do with “rounded zeros”, which result from values below the detection limit of the equipment?

Amalgamation, e.g. adding Na2O and K2O as total alkalis, is one solution, but sometimes we need to differentiate between a sodic and a potassic alteration. Pre-classification into groups requires good knowledge of the distribution of the data and of the geochemical characteristics of the groups, which is not always available. Setting the zero values equal to the detection limit of the equipment used will generate spurious distributions, especially in ternary diagrams. The same happens if we replace the zero values by a small amount using non-parametric or parametric techniques (imputation).

The method we propose takes into consideration the well-known relationships between some elements. For example, in copper porphyry deposits there is always a good direct correlation between the copper values and the molybdenum values, but while copper will always be above the detection limit, many of the molybdenum values will be “rounded zeros”. We therefore take the lower quartile of the real molybdenum values, establish a regression equation with copper, and then estimate the “rounded” zero values of molybdenum from their corresponding copper values. The method can be applied to any type of data, provided we first establish their correlation dependency. One of the main advantages of this method is that we do not obtain a fixed value for the “rounded zeros”, but one that depends on the value of the other variable.

Key words: compositional data analysis, treatment of zeros, essential zeros, rounded zeros, correlation dependency
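
A minimal sketch of the lower-quartile regression idea described above, under the additional assumption (not stated in the abstract) that the Cu-Mo relation is linear on a log-log scale, consistent with lognormally distributed grades; array names and the coding of censored values are illustrative.

    import numpy as np

    def impute_rounded_zeros(cu, mo, detection_limit):
        """Replace Mo values below the detection limit with regression
        estimates from Cu, fitting a log-log linear relation only on the
        lower quartile of the detected Mo values, so the fit is driven by
        low-grade samples close to the censored range."""
        cu, mo = np.asarray(cu, float), np.asarray(mo, float)
        detected = mo >= detection_limit
        q1 = np.quantile(mo[detected], 0.25)
        fit_mask = detected & (mo <= q1)
        # log-log regression of Mo on Cu over the low-grade detected samples
        slope, intercept = np.polyfit(np.log(cu[fit_mask]),
                                      np.log(mo[fit_mask]), 1)
        mo_out = mo.copy()
        censored = ~detected
        mo_out[censored] = np.exp(intercept + slope * np.log(cu[censored]))
        return mo_out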

Relevance:

100.00%

Publisher:

Abstract:

The surprising result has recently been reported that ab initio calculations on benzene and other planar arenes at correlated MP2, MP3, configuration interaction with singles and doubles (CISD), and coupled cluster with singles and doubles levels of theory using standard Pople basis sets yield nonplanar minima. The planar optimized structures turn out to be transition states presenting one or more large imaginary frequencies, whereas single-determinant-based methods lead to the expected planar minima and no imaginary frequencies. It has been suggested that such anomalous behavior may originate from the two-electron basis set incompleteness error. In this work, we show that the reported pitfalls can be interpreted in terms of intramolecular basis set superposition error (BSSE) effects, mostly between the C–H moieties constituting the arenes. We have carried out counterpoise-corrected optimizations and frequency calculations at the Hartree–Fock, B3LYP, MP2, and CISD levels of theory with several basis sets for a number of arenes. In all cases, correcting for intramolecular BSSE fixes the anomalous behavior of the correlated methods, whereas no significant differences are observed in the single-determinant case. Consequently, all systems studied are planar at all levels of theory. The effect of different intramolecular fragment definitions and the particular case of charged species, namely the cyclopentadienyl and indenyl anions, are also discussed.
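
For reference, the counterpoise scheme alluded to is usually written as follows for a molecule partitioned into fragments (here the C–H moieties), each fragment kept at its in-molecule geometry and the "full" basis including ghost functions on the remaining atoms. This is the standard Boys-Bernardi form generalised to intramolecular fragments, stated as a reminder rather than as the authors' exact working equations:

    % counterpoise-corrected energy for fragments F_1, ..., F_n
    E^{\mathrm{CP}} = E_{\mathrm{mol}}^{\{\mathrm{full}\}}
        + \sum_{k=1}^{n} \left[ E_{F_k}^{\{F_k\}} - E_{F_k}^{\{\mathrm{full}\}} \right]

Each fragment energy in the full basis is variationally lower than in its own basis, so the summed correction is non-negative and removes the artificial stabilisation responsible for the spurious nonplanar minima.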

Relevance:

100.00%

Publisher:

Abstract:

Contamination of weather radar echoes by anomalous propagation (anaprop) mechanisms remains a serious issue in quality control of radar precipitation estimates. Although significant progress has been made identifying clutter due to anaprop, there is no unique method that solves the question of data reliability without removing genuine data. The work described here relates to the development of a software application that uses a numerical weather prediction (NWP) model to obtain the temperature, humidity and pressure fields needed to calculate the three-dimensional structure of the atmospheric refractive index, from which a physically based prediction of the incidence of clutter can be made. This technique can be used in conjunction with existing methods for clutter removal by modifying parameters of detectors or filters according to the physical evidence for anomalous propagation conditions. The parabolic equation method (PEM) is a well-established technique for solving the equations for beam propagation in a non-uniformly stratified atmosphere but, although intrinsically very efficient, it is not sufficiently fast to be practicable for near real-time modelling of clutter over the entire area observed by a typical weather radar. We demonstrate a fast hybrid PEM technique that is capable of providing acceptable results in conjunction with a high-resolution terrain elevation model, using a standard desktop personal computer. We discuss the performance of the method and approaches for the improvement of the model profiles in the lowest levels of the troposphere.
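
The refractive-index fields mentioned above are typically built from the NWP temperature, pressure and humidity via the standard radio refractivity expression; a minimal sketch is given below (the PEM propagation step itself is beyond a short example). The coefficients are the usual Smith-Weintraub values; function names are illustrative.

    def radio_refractivity(p_hpa, t_kelvin, e_hpa):
        """Radio refractivity N = (n - 1) * 1e6 from total pressure (hPa),
        temperature (K) and water vapour partial pressure (hPa),
        using the standard Smith-Weintraub coefficients."""
        return 77.6 * p_hpa / t_kelvin + 3.73e5 * e_hpa / t_kelvin**2

    def modified_refractivity(n_units, height_m, earth_radius_m=6.371e6):
        """Modified refractivity M = N + 1e6 * z / a; ducting layers that
        favour anomalous propagation show up where dM/dz < 0."""
        return n_units + 1e6 * height_m / earth_radius_m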

Relevance:

100.00%

Publisher:

Abstract:

Floods are the natural hazard that produces the highest number of casualties and the most material damage in the Western Mediterranean. An improvement in flood risk assessment and a study of a possible increase in flooding occurrence are therefore needed. To carry out these tasks it is important to have at our disposal extensive knowledge on historical floods and to find an efficient way to manage this geographical data. In this paper we present a complete flood database spanning the 20th century for the whole of Catalonia (NE Spain), which includes documentary information (affected areas and damage) and instrumental information (meteorological and hydrological records). This geodatabase, named Inungama, has been implemented on a GIS (Geographical Information System) in order to display all the information within a given geographical scenario, as well as to analyse it using queries, overlays and calculations. Following a description of the type and amount of information stored in the database and the structure of the information system, the first applications of Inungama are presented. The geographical distribution of floods shows the localities which are more likely to be flooded, confirming that the most affected municipalities are the most densely populated ones in coastal areas. Regarding the existence of an increase in flooding occurrence, a temporal analysis has been carried out, showing a steady increase over the last 30 years.

Relevance:

100.00%

Publisher:

Abstract:

In dealing with systems as complex as the cytoskeleton, we need organizing principles or, short of that, an empirical framework into which these systems fit. We report here unexpected invariants of cytoskeletal behavior that comprise such an empirical framework. We measured elastic and frictional moduli of a variety of cell types over a wide range of time scales and using a variety of biological interventions. In all instances elastic stresses dominated at frequencies below 300 Hz, increased only weakly with frequency, and followed a power law; no characteristic time scale was evident. Frictional stresses paralleled the elastic behavior at frequencies below 10 Hz but approached a Newtonian viscous behavior at higher frequencies. Surprisingly, all data could be collapsed onto master curves, the existence of which implies that elastic and frictional stresses share a common underlying mechanism. Taken together, these findings define an unanticipated integrative framework for studying protein interactions within the complex microenvironment of the cell body, and appear to set limits on what can be predicted about integrated mechanical behavior of the matrix based solely on cytoskeletal constituents considered in isolation. Moreover, these observations are consistent with the hypothesis that the cytoskeleton of the living cell behaves as a soft glassy material, wherein cytoskeletal proteins modulate cell mechanical properties mainly by changing an effective temperature of the cytoskeletal matrix. If so, then the effective temperature becomes an easily quantified determinant of the ability of the cytoskeleton to deform, flow, and reorganize.
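
One common empirical form consistent with the behaviour described (a weak power law in frequency, frictional stresses that parallel the elastic ones at low frequency, and an additional Newtonian term at high frequency) is, up to an overall normalisation, the following; G_0, \Phi_0, x and \mu are fit parameters, and this exact parameterisation is an assumption rather than something taken from the text:

    G^{*}(\omega) = G_0 \left( \frac{\omega}{\Phi_0} \right)^{x-1}
        \left[ 1 + i \tan\frac{\pi (x-1)}{2} \right] + i \omega \mu

Here x - 1 is the weak power-law exponent, the bracketed imaginary part gives frictional stresses proportional to the elastic ones, and in the soft glassy picture x plays the role of the effective temperature mentioned above.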

Relevance:

100.00%

Publisher:

Abstract:

DnaSP is a software package for the analysis of DNA polymorphism data. The present version introduces several new modules and features which, among other options, allow: (1) handling large data sets (~5 Mb per sequence); (2) conducting a large number of coalescent-based tests by Monte Carlo computer simulations; (3) extensive analyses of the genetic differentiation and gene flow among populations; (4) analysing the evolutionary pattern of preferred and unpreferred codons; (5) generating graphical outputs for an easy visualization of results. Availability: the software package, including complete documentation and examples, is freely available to academic users from http://www.ub.es/dnasp