907 results for Unobserved-component model


Relevance: 100.00%

Abstract:

Using a Markov-switching unobserved-component model, we decompose the term premium of the North American CDX index into a permanent and a stationary component. We establish that the inversion of the CDX term premium is induced by sudden changes in the unobserved stationary component, which represents the evolution of the fundamentals underpinning the probability of default in the economy. We find evidence that the monetary policy response of the Fed during the crisis period was effective in reducing the volatility of the term premium. We also show that equity returns make a substantial contribution to the term premium over the entire sample period.
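As a rough illustration of the decomposition described above, the sketch below fits a plain unobserved-components model (random-walk level plus stationary AR(1)) with statsmodels. The paper's Markov-switching extension and the CDX data are not reproduced; the simulated `premium` series is purely hypothetical.

```python
# Minimal sketch: permanent/stationary decomposition with statsmodels.
# The Markov-switching variant used in the paper is not shown; the
# simulated `premium` series stands in for the CDX term premium.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
premium = np.cumsum(rng.normal(0, 0.1, 500)) + rng.normal(0, 0.5, 500)

model = sm.tsa.UnobservedComponents(
    premium,
    level='local level',   # permanent component: random walk
    autoregressive=1,      # stationary component: AR(1)
)
res = model.fit(disp=False)
permanent = res.level.smoothed             # smoothed permanent component
stationary = res.autoregressive.smoothed   # smoothed stationary component
```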

Relevance: 100.00%

Abstract:

This paper presents a rigorous treatment of the refractive scintillation of pulsar PSR B0833-45 caused by a two-component interstellar scattering medium. It is assumed that the interstellar scattering medium is composed of a thin-screen ISM and an extended interstellar medium. We consider that the scattering of the thin screen is concentrated in a thin layer, represented by a delta-function distribution, and that the scattering density of the extended irregular medium follows a Gaussian distribution. We develop equations for the flux-density structure function corresponding to this two-component ISM geometry and compare our result with observations of the Vela pulsar. We conclude that refractive scintillation caused by this two-component ISM gives a more satisfactory explanation for the observed flux-density variation of the Vela pulsar than does the single extended-medium model. The level of refractive scintillation is strongly sensitive to the distribution of scattering material along the line of sight. The logarithmic slope of the structure function is sensitive to the thin-screen location and relatively insensitive to the scattering strength of the thin-screen medium. The proposed model can therefore be applied to interpret the structure function of the flux density observed for PSR B0833-45. The result suggests that the medium consists of a discontinuous distribution of plasma turbulence embedded in the Vela supernova remnant. Our work thus provides some insight into the distribution of the scattering along the line of sight to the Vela pulsar.
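For reference, studies of this kind typically characterize flux variability through the flux-density structure function; a standard definition (notation assumed here, not taken from the paper) is:

```latex
% Flux-density structure function at time lag \tau, normalized by the
% mean flux; its logarithmic slope \beta is the quantity discussed above.
D_F(\tau) = \frac{\left\langle \left[ F(t+\tau) - F(t) \right]^{2} \right\rangle}{\langle F \rangle^{2}},
\qquad
\beta = \frac{d \log D_F(\tau)}{d \log \tau}
```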

Relevance: 100.00%

Abstract:

Providing a method of transparent communication and interoperation between distributed software is a requirement for many organisations, and several standard and non-standard infrastructures exist for this purpose. Component models do more than provide a plumbing mechanism for distributed applications; they provide more controlled interoperation between components. Very few component models, however, support advanced dynamic reconfigurability. This paper describes a hierarchical component model that provides controlled and constrained transparent communication and interoperation between components, while also supporting advanced run-time reconfigurability. The process and benefits of designing a system using the presented model are discussed, and a way in which reflective techniques and component frameworks can work together to produce dynamic, adaptable systems is explained.
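The sketch below is a hypothetical rendering of the two ideas named in the abstract, hierarchical containment and run-time reconfiguration; it is not the paper's actual model or API.

```python
# Hypothetical sketch of a hierarchical component model with run-time
# reconfiguration by subcomponent replacement (not the paper's API).
class Component:
    def __init__(self, name: str):
        self.name = name
        self.children: dict[str, "Component"] = {}  # hierarchical containment

    def add(self, child: "Component") -> None:
        self.children[child.name] = child

    def reconfigure(self, name: str, replacement: "Component") -> None:
        """Swap a subcomponent at run time, keeping its place in the hierarchy."""
        if name not in self.children:
            raise KeyError(f"no subcomponent named {name!r}")
        self.children[name] = replacement

root = Component("app")
root.add(Component("logger"))
root.reconfigure("logger", Component("remote-logger"))  # dynamic adaptation
```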

Relevance: 100.00%

Abstract:

Phytoplankton size structure is an important indicator of the state of the pelagic ecosystem. Stimulated by the paucity of in situ observations of size structure, and by the sampling advantages of autonomous remote platforms, new efforts are being made to infer the size structure of the phytoplankton from oceanographic variables that can be measured at high temporal and spatial resolution, such as total chlorophyll concentration. Large-scale analysis of in situ data has revealed coherent relationships between size-fractionated chlorophyll and total chlorophyll that can be quantified using the three-component model of Brewin et al. (2010). However, there are variations around these general relationships. In this paper, we first revise the three-component model using a global dataset of surface phytoplankton pigment measurements. Then, using estimates of the average irradiance in the mixed layer, we investigate the influence of ambient light on the parameters of the three-component model. We observe significant relationships between model parameters and the average irradiance in the mixed layer, consistent with ecological knowledge. These relationships are incorporated explicitly into the three-component model to illustrate variations in the relationship between size structure and total chlorophyll arising from variations in light availability. The new model may be used as a tool to investigate modifications in size structure in the context of a changing climate.
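For context, the three-component model referred to above is commonly written as follows (symbols are the usual ones from the literature and may differ from the paper's):

```latex
% C = total chlorophyll; C^m = asymptotic maximum; S = initial slope.
% Subscripts: p = pico-, n = nano-, m = micro-phytoplankton.
C_{p,n} = C^{m}_{p,n}\left[1 - \exp\!\left(-\tfrac{S_{p,n}}{C^{m}_{p,n}}\,C\right)\right], \quad
C_{p}   = C^{m}_{p}\left[1 - \exp\!\left(-\tfrac{S_{p}}{C^{m}_{p}}\,C\right)\right],
\quad
C_{n} = C_{p,n} - C_{p}, \quad C_{m} = C - C_{p,n}
```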

Relevance: 100.00%

Abstract:

Component-based development (CBD) has become an important emerging topic in the software engineering field. It promises long-sought-after benefits such as increased software reuse, reduced development time to market and, hence, reduced software production cost. Despite the huge potential, the lack of reasoning support and development environment of component modeling and verification may hinder its development. Methods and tools that can support component model analysis are highly appreciated by industry. Such a tool support should be fully automated as well as efficient. At the same time, the reasoning tool should scale up well as it may need to handle hundreds or even thousands of components that a modern software system may have. Furthermore, a distributed environment that can effectively manage and compose components is also desirable. In this paper, we present an approach to the modeling and verification of a newly proposed component model using Semantic Web languages and their reasoning tools. We use the Web Ontology Language and the Semantic Web Rule Language to precisely capture the inter-relationships and constraints among the entities in a component model. Semantic Web reasoning tools are deployed to perform automated analysis support of the component models. Moreover, we also proposed a service-oriented architecture (SOA)-based semantic web environment for CBD. The adoption of Semantic Web services and SOA make our component environment more reusable, scalable, dynamic and adaptive.
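As a minimal sketch of the general idea (encoding component-model entities and relationships as Semantic Web triples), the following uses rdflib with an entirely hypothetical namespace and vocabulary; the paper's full OWL/SWRL modeling and reasoner-based verification are not reproduced here.

```python
# Minimal sketch (hypothetical ontology, not the paper's): encoding a
# component-model relationship in RDF with rdflib. Full OWL/SWRL
# reasoning as described in the paper would use a dedicated reasoner.
from rdflib import Graph, Namespace, RDF, RDFS

CM = Namespace("http://example.org/component-model#")  # hypothetical namespace
g = Graph()
g.bind("cm", CM)

g.add((CM.Component, RDF.type, RDFS.Class))
g.add((CM.requires, RDFS.domain, CM.Component))
g.add((CM.requires, RDFS.range, CM.Interface))

g.add((CM.Logger, RDF.type, CM.Component))
g.add((CM.Logger, CM.requires, CM.IStorage))

# Simple query: which interfaces does each component require?
for comp, iface in g.subject_objects(CM.requires):
    print(comp, "requires", iface)
```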

Relevance: 100.00%

Abstract:

Report published in the Proceedings of the National Conference on "Education in the Information Society", Plovdiv, May 2013.

Relevance: 100.00%

Abstract:

Financial integration has been pursued aggressively across the globe in the last fifty years; however, there is no conclusive evidence on the diversification gains (or losses) of such efforts. These gains (or losses) are related to the degree of comovements and synchronization among increasingly integrated global markets. We quantify the degree of comovements within the integrated Latin American market (MILA). We use dynamic correlation models to quantify comovements across securities, as well as a direct integration measure. Our results show an increase in comovements when we look at the country indexes; however, the increase in the trend of correlation predates the institutional efforts to establish an integrated market in the region. On the other hand, when we look at sector indexes and an integration measure, we find a decrease in comovements among a representative sample of securities from the integrated market.
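A simple way to get a time-varying correlation of the kind discussed above is an exponentially weighted (EWMA) correlation; the sketch below uses pandas. The paper's dynamic correlation models are more elaborate (e.g. DCC-type specifications), and the file and column names here are hypothetical.

```python
# Sketch: EWMA dynamic correlations between MILA country index returns.
# File and column names (CHL, COL, PER) are hypothetical placeholders.
import pandas as pd

returns = pd.read_csv("mila_index_returns.csv",
                      index_col=0, parse_dates=True)

# Exponentially weighted pairwise correlations: one matrix per date,
# stacked into a (date, column) MultiIndex.
ewm_corr = returns.ewm(span=60).corr()

# Time-varying correlation between the CHL and COL indexes
chl_col = ewm_corr.xs("CHL", level=1)["COL"]
```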

Relevance: 100.00%

Abstract:

The main objective of this paper is to propose a novel setup that allows estimating separately the welfare costs of the uncertainty stemming from business-cycle fluctuations and from economic-growth variation, when the two types of shocks associated with them (respectively, transitory and permanent shocks) hit consumption simultaneously. Separating these welfare costs requires dealing with degenerate bivariate distributions. Lévy's Continuity Theorem and the Disintegration Theorem allow us to adequately define the one-dimensional limiting marginal distributions. Under Normality, we show that the parameters of the original marginal distributions are not affected, providing the means for calculating separately the welfare costs of business-cycle fluctuations and of economic-growth variation. Our empirical results show that, if we consider only transitory shocks, the welfare cost of business cycles is much smaller than previously thought; indeed, we found it to be negative, at -0.03% of per-capita consumption. On the other hand, we found that the welfare cost of economic-growth variation is relatively large. Our estimate for reasonable preference-parameter values shows that it is 0.71% of consumption (US$208.98 per person, per year).
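For orientation, the benchmark against which such estimates are usually read is the Lucas (1987) welfare-cost formula for purely transitory fluctuations under CRRA utility; the paper's bivariate permanent/transitory setup generalizes this one-shock case.

```latex
% Lucas (1987) approximation for the welfare cost \lambda of transitory
% consumption fluctuations under CRRA utility; \gamma = relative risk
% aversion, \sigma^2 = variance of the transitory consumption component.
\lambda \approx \tfrac{1}{2}\,\gamma\,\sigma^{2}
```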

Relevance: 90.00%

Abstract:

Existing literature has failed to find robust relationships between individual differences and the ability to fake psychological tests, possibly due to limitations in how successful faking is operationalised. In order to fake, individuals must alter their original profile to create a particular impression. Currently, successful faking is operationalised through statistical definitions, informant ratings, known-groups comparisons, the use of archival and baseline data, and breaches of validity indexes. However, there are many methodological limitations to these approaches. This research proposed a three-component model of successful faking to address this, where an original response is manipulated into a strategic response, which must match a criterion target. Further, by operationalising successful faking in this manner, this research takes into account the fact that individuals may have been successful in reaching their implicitly created profile, even though this profile may not have matched the criterion they were instructed to fake.

Participants (N=48; 22 students and 26 non-students) completed the BDI-II honestly. Participants then faked the BDI-II as if they had no, mild, moderate and severe depression, as well as completing a checklist revealing which symptoms they thought indicated each level of depression. Findings were consistent with a three-component model of successful faking, where individuals effectively changed their profile to what they believed was required; however, this profile differed from the criterion defined by the psychometric norms of the test.

One of the foremost issues for research in this area is the inconsistent manner in which successful faking is operationalised. This research allowed successful faking to be operationalised in an objective, quantifiable manner. Using this model as a template may allow researchers a better understanding of the processes involved in faking, including the role of strategies and abilities in determining the outcome of test dissimulation.

Relevance: 90.00%

Abstract:

Language models (LMs) are often constructed by building multiple individual component models that are combined using context-independent interpolation weights. By tuning these weights, using either perplexity or discriminative approaches, it is possible to adapt LMs to a particular task. This paper investigates the use of context-dependent weighting in both interpolation and test-time adaptation of language models. Depending on the previous word contexts, a discrete history-weighting function is used to adjust the contribution from each component model. As this dramatically increases the number of parameters to estimate, robust weight-estimation schemes are required. Several approaches are described in this paper. The first approach is based on MAP estimation, where interpolation weights of lower-order contexts are used as smoothing priors. The second approach uses training data to ensure robust estimation of LM interpolation weights; this can also serve as a smoothing prior for MAP adaptation. A normalized perplexity metric is proposed to handle the bias of the standard perplexity criterion towards corpus size. A range of schemes to combine weight information obtained from training data and test-data hypotheses is also proposed to improve robustness during context-dependent LM adaptation. In addition, a minimum Bayes' risk (MBR) based discriminative training scheme is proposed, along with an efficient weighted finite-state transducer (WFST) decoding algorithm for context-dependent interpolation. The proposed techniques were evaluated on a state-of-the-art Mandarin Chinese broadcast speech transcription task. Character error rate (CER) reductions of up to 7.3% relative were obtained, as well as consistent perplexity improvements. © 2012 Elsevier Ltd. All rights reserved.
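The core combination rule described above is P(w | h) = Σᵢ λᵢ(h) Pᵢ(w | h), with the weights λᵢ now depending on the history h. A minimal sketch of that rule follows; names are illustrative, not the paper's implementation, and in practice the weights would be estimated robustly (e.g. via MAP) rather than supplied directly.

```python
# Sketch of context-dependent linear interpolation of component LMs:
# P(w | h) = sum_i lambda_i(h) * P_i(w | h). Illustrative only.
from typing import Callable, List, Tuple

History = Tuple[str, ...]
ComponentLM = Callable[[str, History], float]   # (word, history) -> probability

def interpolate(w: str,
                h: History,
                components: List[ComponentLM],
                weights: Callable[[History], List[float]]) -> float:
    """Context-dependent interpolated probability of word w after history h."""
    lam = weights(h)  # one weight per component; should sum to 1 for each h
    return sum(l * p(w, h) for l, p in zip(lam, components))
```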

Relevance: 90.00%

Abstract:

Size-fractionated filtration (SFF) is a direct method for estimating pigment concentration in various size classes. It is also common practice to infer the size structure of phytoplankton communities from diagnostic pigments estimated by high-performance liquid chromatography (HPLC). In this paper, the three-component model of Brewin et al. (2010) was fitted to coincident data from HPLC and from SFF collected along Atlantic Meridional Transect cruises. The model accounted for the variability in each data set, but the fitted model parameters differed between the two data sets. Both HPLC and SFF data supported the conceptual framework of the three-component model, which assumes that the chlorophyll concentration in small cells increases to an asymptotic maximum, beyond which further increase in chlorophyll is achieved by the addition of larger-celled phytoplankton. The three-component model was extended to a multicomponent model of size structure using observed relationships between model parameters and assuming that the asymptotic concentration that can be reached by cells increases linearly with the upper bound on the cell size. The multicomponent model was verified using independent SFF data for a variety of size fractions and found to perform well (0.628 ≤ r ≤ 0.989), lending support to the underlying assumptions. An advantage of the multicomponent model over the three-component model is that, for the same number of parameters, it can be applied to any size range in a continuous fashion. The multicomponent model provides a useful tool for studying the distribution of phytoplankton size structure at large scales.
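To make the three-component partition concrete, the sketch below computes pico-, nano- and micro-phytoplankton chlorophyll from total chlorophyll using the model's usual exponential form. The parameter values are placeholders, not the fitted values reported in the paper.

```python
# Sketch of the Brewin et al. (2010) three-component partition of total
# chlorophyll C. Parameter defaults are placeholders, not the paper's fits.
import numpy as np

def three_component(C, Cm_pn=1.0, S_pn=0.9, Cm_p=0.1, S_p=0.8):
    """Return (pico, nano, micro) chlorophyll for total chlorophyll C.

    Cm_* are asymptotic maxima; S_* are initial slopes.
    """
    C = np.asarray(C, dtype=float)
    C_pn = Cm_pn * (1.0 - np.exp(-(S_pn / Cm_pn) * C))  # pico + nano
    C_p  = Cm_p  * (1.0 - np.exp(-(S_p  / Cm_p)  * C))  # pico
    return C_p, C_pn - C_p, C - C_pn                    # pico, nano, micro
```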

Relevance: 90.00%

Abstract:

A Work Project, presented as part of the requirements for the award of a Master's Degree in Finance from the NOVA – School of Business and Economics.

Relevance: 90.00%

Abstract:

To study the behaviour of composite beam-to-column connections, more sophisticated finite element models are required, since the component model has some severe limitations. In this research, a generic finite element model for composite beam-to-column joints with welded connections is developed using current state-of-the-art local modelling. Applying a mechanically consistent scaling method, it can provide the constitutive relationship for a plane rectangular macro element with beam-type boundaries. This macro element, which preserves local behaviour and allows the transfer of five independent states between local and global models, can then be implemented in high-accuracy frame analysis with the possibility of limit-state checks. So that the macro element for the scaling method can be used in a practical manner, a generic geometry program, proposed as a new idea in this study, is also developed for this finite element model. With generic programming, a set of global geometric variables can be input to generate a specific instance of the connection without much effort. The proposed finite element model generated by this generic programming is validated against test results from the University of Kaiserslautern. Finally, two illustrative examples of applying this macro-element approach are presented. The first example demonstrates how to obtain the constitutive relationships of the macro element: with certain assumptions for a typical composite frame, the constitutive relationships can be represented by bilinear laws for the macro bending and shear states, which are then coupled by a two-dimensional surface law with yield and failure surfaces. The second example presents a scaling concept that combines sophisticated local models with a frame analysis using a macro-element approach, as a practical application of this numerical model.
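As a minimal illustration of the bilinear laws mentioned above, the sketch below evaluates a generic bilinear moment-rotation relationship (elastic branch up to yield, then a hardening branch). The function and its inputs are hypothetical, not values or code from the paper.

```python
# Sketch of a bilinear constitutive law of the kind used for the macro
# element's bending state. Inputs are hypothetical, not the paper's values.
def bilinear_moment(rotation, k_elastic, m_yield, k_hardening):
    """Moment for a given rotation under a bilinear moment-rotation law."""
    phi_y = m_yield / k_elastic                  # yield rotation
    if abs(rotation) <= phi_y:
        return k_elastic * rotation              # elastic branch
    sign = 1.0 if rotation > 0 else -1.0
    return sign * (m_yield + k_hardening * (abs(rotation) - phi_y))
```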

Relevance: 90.00%

Abstract:

An international standard, ISO/DP 9459-4, has been proposed to establish a uniform standard of quality for small, factory-made solar heating systems. In this proposal, system components are tested separately and total system performance is calculated using system simulations based on component-model parameter values validated against the results of the component tests. Another approach is to test the whole system in operation under representative conditions, where the results can be used as a measure of general system performance. The advantage of system testing of this form is that it does not depend on simulations and the possible inaccuracies of the models. Its disadvantage is that it is restricted to the boundary conditions of the test. Component testing with system simulation is flexible, but requires an accurate and reliable simulation model.

The heat store is a key component concerning system performance. Thus, this work focuses on the storage system consisting of the store, electrical auxiliary heater, heat exchangers and tempering valve. Four different storage-system configurations with a volume of 750 litres were tested in an indoor system test using a six-day test sequence. A store component test and system simulation were carried out on one of the four configurations, applying the proposed standard for stores, ISO/DP 9459-4A. Three newly developed test sequences for internal load-side heat exchangers, not in the proposed ISO standard, were also carried out. The MULTIPORT store model was used for this work. This paper discusses the results of the indoor system test, the store component test, the validation of the store-model parameter values and the system simulations.