924 results for Specification
Abstract:
Vietnam launched its first-ever stock market, the Ho Chi Minh City Securities Trading Center (HSTC), on July 20, 2000. This is one of the pioneering works on the HSTC, and it finds empirical evidence for the following: anomalies in HSTC stock returns appearing as clusters of limit-hits and limit-hit sequences; a strong herd effect toward extreme positive returns of the market portfolio; and an ARMA-GARCH specification that captures serial correlation and fat tails fairly well for the stabilized period. Using additional information and policy dummy variables, we show that policy decisions on the technicalities of trading can materially shift the market's risk level through the conditional variance behavior of HSTC stock returns. Policies on trading and disclosure practices have had profound impacts on the Vietnam Stock Market (VSM). Overuse of policy tools can harm the market and investor sentiment, and price limits become increasingly irrelevant while preventing the market from self-adjusting to equilibrium. These results on the VSM have not been reported before in the literature on Vietnam's financial markets. Given the policy implications, we suggest that the Vietnamese authorities rethink the use of price limits and give more freedom to market participants.
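To illustrate the kind of specification described above, here is a minimal sketch (Python, with simulated placeholder returns and a hypothetical policy dummy rather than the HSTC data) of an AR(1)-GARCH(1,1) model whose conditional variance is augmented by a policy dummy and estimated by maximum likelihood:

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
T = 1000
r = rng.standard_normal(T) * 0.01             # placeholder daily returns
policy = (np.arange(T) > 500).astype(float)   # hypothetical policy dummy variable

def neg_loglik(theta, r, d):
    mu, phi, omega, alpha, beta, gamma = theta
    eps = np.zeros_like(r)
    h = np.full_like(r, np.var(r))
    ll = 0.0
    for t in range(1, len(r)):
        eps[t] = r[t] - mu - phi * r[t - 1]                      # AR(1) mean equation
        h[t] = max(omega + alpha * eps[t - 1] ** 2 + beta * h[t - 1]
                   + gamma * d[t], 1e-12)                        # GARCH(1,1) + policy dummy
        ll += -0.5 * (np.log(2 * np.pi) + np.log(h[t]) + eps[t] ** 2 / h[t])
    return -ll

x0 = [0.0, 0.1, 1e-5, 0.05, 0.90, 0.0]
bounds = [(-1, 1), (-0.99, 0.99), (1e-8, None), (0, 1), (0, 1), (None, None)]
fit = minimize(neg_loglik, x0, args=(r, policy), bounds=bounds)
print(fit.x)   # mu, phi, omega, alpha, beta, gamma (dummy effect on the conditional variance)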
Abstract:
This paper analyzes the robustness of the estimated effect of a positive productivity shock on hours to the presence of a possible unit root in hours. Estimation in levels and estimation in first differences lead to opposite conclusions. We rely on an agnostic procedure in which the researcher does not have to choose between a specification in levels and one in first differences. We find that a positive productivity shock has a negative impact effect on hours, but the effect is much shorter lived and disappears after two quarters. The effect becomes positive at business-cycle frequencies, although it is not significant.
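As a rough illustration of why the levels-versus-differences choice matters, the sketch below uses simulated data, hypothetical variable names, and a simple Cholesky ordering in place of the paper's identification scheme; it compares the impulse response of hours to a productivity shock from a VAR in levels and a VAR in first differences:

import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(1)
T = 300
prod = np.cumsum(0.3 + rng.standard_normal(T))    # trending productivity series
hours = 0.5 * prod + rng.standard_normal(T)       # hours loosely tied to productivity
levels = pd.DataFrame({"productivity": prod, "hours": hours})

irf_levels = VAR(levels).fit(4).irf(10)
irf_diffs = VAR(levels.diff().dropna()).fit(4).irf(10)

# response of hours (variable 1) to a productivity shock (shock 0), Cholesky-orthogonalized
print("levels:     ", np.round(irf_levels.orth_irfs[:, 1, 0], 3))
print("differences:", np.round(irf_diffs.orth_irfs[:, 1, 0], 3))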
Abstract:
We discuss a general approach to dynamic sparsity modeling in multivariate time series analysis. Time-varying parameters are linked to latent processes that are thresholded to induce zero values adaptively, providing natural mechanisms for dynamic variable inclusion/selection. We discuss Bayesian model specification, analysis and prediction in dynamic regressions, time-varying vector autoregressions, and multivariate volatility models using latent thresholding. Application to a topical macroeconomic time series problem illustrates some of the benefits of the approach in terms of statistical and economic interpretations as well as improved predictions. Supplementary materials for this article are available online.
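A minimal sketch of the latent-thresholding mechanism itself (not the full Bayesian analysis in the paper): a time-varying regression coefficient follows a latent AR(1) path and is set exactly to zero whenever its magnitude falls below a threshold, which is what provides dynamic variable inclusion:

import numpy as np

rng = np.random.default_rng(2)
T, threshold = 200, 0.3

# latent AR(1) path for a regression coefficient
b = np.zeros(T)
for t in range(1, T):
    b[t] = 0.98 * b[t - 1] + 0.05 * rng.standard_normal()

# latent thresholding: the coefficient is included only when |b_t| clears the threshold
beta = np.where(np.abs(b) >= threshold, b, 0.0)

x = rng.standard_normal(T)
y = beta * x + 0.1 * rng.standard_normal(T)   # observations with a sparse, dynamic effect
print("fraction of periods with the predictor excluded:", np.mean(beta == 0.0))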
Abstract:
Gaussian factor models have proven widely useful for parsimoniously characterizing dependence in multivariate data. There is a rich literature on their extension to mixed categorical and continuous variables, using latent Gaussian variables or through generalized latent trait models accommodating measurements in the exponential family. However, when generalizing to non-Gaussian measured variables, the latent variables typically influence both the dependence structure and the form of the marginal distributions, complicating interpretation and introducing artifacts. To address this problem, we propose a novel class of Bayesian Gaussian copula factor models which decouple the latent factors from the marginal distributions. A semiparametric specification for the marginals based on the extended rank likelihood yields straightforward implementation and substantial computational gains. We provide new theoretical and empirical justifications for using this likelihood in Bayesian inference. We propose new default priors for the factor loadings and develop efficient parameter-expanded Gibbs sampling for posterior computation. The methods are evaluated through simulations and applied to a dataset in political science. The models in this paper are implemented in the R package bfa.
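A rough, non-Bayesian sketch of the copula idea with hypothetical mixed-scale data: margins are mapped to latent Gaussian scores through their ranks (a crude stand-in for the extended rank likelihood), and a factor model is then fit on the Gaussian scale, so the factors are decoupled from the marginal distributions. The paper's actual models are implemented in the R package bfa; this is only an illustration:

import numpy as np
from scipy.stats import rankdata, norm
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(3)
n = 500
latent = rng.standard_normal(n)
data = np.column_stack([
    np.exp(latent + 0.5 * rng.standard_normal(n)),         # skewed continuous margin
    (latent + rng.standard_normal(n) > 0).astype(float),    # binary margin
    rng.poisson(np.exp(0.5 * latent)),                      # count margin
])

# rank-based transform of each margin to latent Gaussian scores
z = norm.ppf(rankdata(data, axis=0) / (n + 1))

loadings = FactorAnalysis(n_components=1).fit(z).components_
print(loadings)   # factor loadings estimated on the latent Gaussian scale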
Abstract:
Motivated by recent findings in the field of consumer science, this paper evaluates the causal effect of debit cards on household consumption using population-based data from the Italian Survey on Household Income and Wealth (SHIW). Within the Rubin Causal Model, we focus on the estimand of the population average treatment effect for the treated (PATT). We consider three existing estimators, based on regression, on mixed matching and regression, and on propensity score weighting, and propose a new doubly-robust estimator. A semiparametric specification based on power series is adopted for the potential outcomes and the propensity score, and cross-validation is used to select the order of the power series. We conduct a simulation study to compare the performance of the estimators. The key assumptions, overlap and unconfoundedness, are systematically assessed and validated in the application. Our empirical results suggest statistically significant positive effects of debit cards on monthly household spending in Italy.
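A simplified sketch of a doubly-robust estimator of the effect on the treated, using simulated data and plain logistic and linear working models rather than the paper's power-series specifications with cross-validated order:

import numpy as np
from sklearn.linear_model import LogisticRegression, LinearRegression

rng = np.random.default_rng(4)
n = 2000
x = rng.standard_normal((n, 2))
p = 1 / (1 + np.exp(-(0.5 * x[:, 0] - 0.5 * x[:, 1])))
t = rng.binomial(1, p)                                       # treatment: card ownership (simulated)
y = 100 + 20 * t + 10 * x[:, 0] + rng.standard_normal(n)     # monthly spending (simulated)

e = LogisticRegression().fit(x, t).predict_proba(x)[:, 1]       # propensity score model
mu0 = LinearRegression().fit(x[t == 0], y[t == 0]).predict(x)   # control outcome model

# doubly-robust estimator of the average effect on the treated
w = t - (1 - t) * e / (1 - e)
patt = np.sum(w * (y - mu0)) / np.sum(t)
print("estimated PATT:", round(patt, 2))   # should be near the simulated effect of 20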
Abstract:
T cell activation leads to dramatic shifts in cell metabolism to protect against pathogens and to orchestrate the action of other immune cells. Quiescent T cells require predominantly ATP-generating processes, whereas proliferating effector T cells require high metabolic flux through growth-promoting pathways. Further, functionally distinct T cell subsets require distinct energetic and biosynthetic pathways to support their specific functional needs. Pathways that control immune cell function and metabolism are intimately linked, and changes in cell metabolism at both the cell and system levels have been shown to enhance or suppress specific T cell functions. As a result of these findings, cell metabolism is now appreciated as a key regulator of T cell function specification and fate. This review discusses the role of cellular metabolism in T cell development, activation, differentiation, and function to highlight the clinical relevance and opportunities for therapeutic interventions that may be used to disrupt immune pathogenesis.
Abstract:
We consider the problem of verification of software implementations of linear time-invariant controllers. Commonly, different implementations use different representations of the controller's state, for example due to optimizations in a third-party code generator. To accommodate this variation, we exploit the input-output controller specification captured by the controller's transfer function and show how to automatically verify the correctness of C code controller implementations using a Frama-C/Why3/Z3 toolchain. Scalability of the approach is evaluated using randomly generated controller specifications of realistic size.
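The key fact exploited above, that different state representations can implement the same input-output behaviour, can be illustrated with a small sketch (plain Python with hypothetical coefficients, not the Frama-C/Why3/Z3 toolchain): two discrete-time state-space realizations related by a similarity transform produce identical outputs for any input:

import numpy as np

# a discrete-time LTI controller (hypothetical coefficients)
A = np.array([[0.5, 0.1], [0.0, 0.8]])
B = np.array([[1.0], [0.5]])
C = np.array([[1.0, -0.2]])
D = np.array([[0.1]])

# an alternative realization of the same transfer function (change of state basis)
T = np.array([[2.0, 1.0], [0.0, 1.0]])
Ti = np.linalg.inv(T)
A2, B2, C2, D2 = T @ A @ Ti, T @ B, C @ Ti, D

def simulate(A, B, C, D, u):
    x = np.zeros((A.shape[0], 1))
    ys = []
    for uk in u:
        ys.append((C @ x + D * uk).item())
        x = A @ x + B * uk
    return np.array(ys)

u = np.sin(np.linspace(0, 10, 100))
assert np.allclose(simulate(A, B, C, D, u), simulate(A2, B2, C2, D2, u))
print("both realizations satisfy the same input-output specification")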
Abstract:
Use of structuring mechanisms (such as modularisation) is widely believed to be one of the key ways to improve software quality. Structuring is considered to be at least as important for specification documents as for source code, since it is assumed to improve comprehensibility. Yet, as with most widely held assumptions in software engineering, there is little empirical evidence to support this hypothesis. Also, even if structuring can be shown to be a good thing, we do not know how much structuring is optimal. One of the more popular formal specification languages, Z, encourages structuring through its schema calculus. A controlled experiment is described in which two hypotheses about the effects of structure on the comprehensibility of Z specifications are tested. Evidence was found that structuring a specification into schemas of about 20 lines long significantly improved comprehensibility over a monolithic specification. However, there seems to be no perceived advantage in breaking down the schemas into much smaller components. The experiment can be fully replicated.
Abstract:
SMARTFIRE is a fire field model based on an open architecture integrated CFD code and knowledge-based system. It makes use of an expert system to assist the user in setting up the problem specification, and of new computational techniques such as Group Solvers to reduce the computational effort involved in solving the equations. This paper concentrates on recent research into the use of artificial intelligence techniques to assist in dynamic solution control of fire scenarios being simulated using fire field modelling techniques. This is designed to improve the convergence capabilities of the software while further decreasing the computational overheads. The technique automatically controls solver relaxations using an integrated production rule engine with a blackboard to monitor and implement the required control changes during solution processing. Initial results for a two-dimensional fire simulation are presented that demonstrate the potential for considerable savings in simulation run-times when compared with control sets from various sources. Furthermore, the results demonstrate enhanced solution reliability due to obtaining acceptable convergence within each time step, unlike some of the comparison simulations.
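A toy sketch of the kind of rule-based solution control described above (illustrative only, not SMARTFIRE's production rule engine): a damped iterative solver whose relaxation factor is adjusted by simple rules that watch the residual history kept on a shared "blackboard":

import numpy as np

rng = np.random.default_rng(5)
A = np.diag(np.arange(1.0, 11.0)) + 0.1 * rng.standard_normal((10, 10))
b = rng.standard_normal(10)

x = np.zeros(10)
relax = 0.5
blackboard = []                               # residual history monitored by the rules

for it in range(200):
    x_new = x + relax * np.linalg.solve(np.diag(np.diag(A)), b - A @ x)  # damped Jacobi step
    res = np.linalg.norm(b - A @ x_new)
    blackboard.append(res)
    if len(blackboard) > 1 and blackboard[-1] > blackboard[-2]:
        relax = max(0.1, relax * 0.7)          # rule 1: residual growing -> cut relaxation
    elif len(blackboard) > 3 and blackboard[-1] < 0.5 * blackboard[-4]:
        relax = min(1.0, relax * 1.1)          # rule 2: steady convergence -> relax more
    x = x_new
    if res < 1e-8:
        break

print(f"converged in {it + 1} iterations, final relaxation {relax:.2f}")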
Abstract:
Three paradigms for distributed-memory parallel computation that free the application programmer from the details of message passing are compared for an archetypal structured scientific computation -- a nonlinear, structured-grid partial differential equation boundary value problem -- using the same algorithm on the same hardware. All of the paradigms -- parallel languages represented by the Portland Group's HPF, (semi-)automated serial-to-parallel source-to-source translation represented by CAPTools from the University of Greenwich, and parallel libraries represented by Argonne's PETSc -- are found to be easy to use for this problem class, and all are reasonably effective in exploiting concurrency after a short learning curve. The level of involvement required by the application programmer under any paradigm includes specification of the data partitioning, corresponding to a geometrically simple decomposition of the domain of the PDE. Programming in SPMD style for the PETSc library requires writing only the routines that discretize the PDE and its Jacobian, managing subdomain-to-processor mappings (affine global-to-local index mappings), and interfacing to library solver routines. Programming for HPF requires a complete sequential implementation of the same algorithm as a starting point, introduction of concurrency through subdomain blocking (a task similar to the index mapping), and modest experimentation with rewriting loops to elucidate to the compiler the latent concurrency. Programming with CAPTools involves feeding the same sequential implementation to the CAPTools interactive parallelization system, and guiding the source-to-source code transformation by responding to various queries about quantities knowable only at runtime. Results representative of "the state of the practice" for a scaled sequence of structured grid problems are given on three of the most important contemporary high-performance platforms: the IBM SP, the SGI Origin 2000, and the Cray T3E.
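The subdomain-to-processor mapping mentioned above amounts to an affine map between global and local grid indices; a small sketch with hypothetical helper names, reduced to a 1D block decomposition for brevity, is:

def block_bounds(n_global, n_procs, rank):
    """Contiguous block decomposition of n_global grid points across n_procs ranks."""
    base, extra = divmod(n_global, n_procs)
    start = rank * base + min(rank, extra)
    size = base + (1 if rank < extra else 0)
    return start, size

def global_to_local(i_global, start, size, halo=1):
    """Affine map from a global grid index to this rank's local index (with halo cells)."""
    i_local = i_global - start + halo
    if not (0 <= i_local < size + 2 * halo):
        raise IndexError("index is not owned by (or adjacent to) this subdomain")
    return i_local

# example: 100 grid points on 4 processes; rank 2 owns global indices 50..74
start, size = block_bounds(100, 4, rank=2)
print(start, size, global_to_local(60, start, size))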
Abstract:
SMARTFIRE, an open architecture integrated CFD code and knowledge-based system, attempts to make fire field modelling accessible to non-experts in Computational Fluid Dynamics (CFD) such as fire fighters, architects and fire safety engineers. This is achieved by embedding expert knowledge into the CFD software. This enables the 'black art' associated with CFD analysis, such as the selection of solvers, relaxation parameters, convergence criteria, time steps, and grid and boundary condition specification, to be guided by expert advice from the software. The user is, however, given the option of overriding these decisions, thus retaining ultimate control. SMARTFIRE also makes use of recent developments in CFD technology, such as unstructured meshes and group solvers, in order to make the CFD analysis more efficient. This paper describes the incorporation within SMARTFIRE of the expert fire modelling knowledge required for automatic problem setup and mesh generation, as well as the concept and use of group solvers for automatic and manual dynamic control of the CFD code.
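An entirely hypothetical sketch of the embedded-expert-knowledge idea: a few rules propose solver and meshing defaults from a high-level scenario description, and the user may override any of them:

def suggest_setup(scenario):
    """Toy rule base mapping a fire scenario description to CFD setup defaults."""
    setup = {"solver": "SIMPLE", "time_step_s": 1.0, "cells_per_metre": 10}
    if scenario.get("fast_growing_fire"):
        setup["time_step_s"] = 0.25            # resolve rapid transients
    if scenario.get("room_height_m", 0) > 5:
        setup["cells_per_metre"] = 15          # refine tall spaces to capture the plume
    if scenario.get("forced_ventilation"):
        setup["solver"] = "PISO"               # stronger pressure-velocity coupling
    return setup

defaults = suggest_setup({"fast_growing_fire": True, "room_height_m": 8})
defaults["time_step_s"] = 0.5                  # the user retains ultimate control
print(defaults)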
Abstract:
This paper describes progress on a project to utilise case-based reasoning methods in the design and manufacture of furniture products. The novel feature of this research is that cases are represented as structures in a relational database of products, components and materials. The paper proposes a method for extending the usual "weighted sum" over attribute similarities for a single table to encompass relational structures over several tables. The capabilities of the system are discussed, particularly with respect to differing user objectives, such as cost estimation, CAD, cutting scheme re-use, and initial design. It is shown that specification of a target case as a relational structure combined with suitable weights can fulfil several user functions. However, it is also shown that some user functions cannot satisfactorily be specified via a single target case. For these functions it is proposed to allow the specification of a set of target cases. A derived similarity measure between individuals and sets of cases is proposed.
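A small sketch of extending the weighted-sum similarity from flat attributes to a relational case structure, with hypothetical attribute names and weights: component records are compared by best-match pairing and folded into the product-level score:

def attr_sim(a, b):
    """Similarity of two attribute values in [0, 1]."""
    if isinstance(a, (int, float)) and isinstance(b, (int, float)):
        return 1.0 - abs(a - b) / (abs(a) + abs(b) + 1e-9)
    return 1.0 if a == b else 0.0

def case_sim(target, candidate, weights, component_weight=0.4):
    flat = sum(w * attr_sim(target[k], candidate[k]) for k, w in weights.items())
    flat /= sum(weights.values())
    # relational part: each target component is matched to its most similar candidate component
    comp = 0.0
    if target.get("components"):
        comp = sum(
            max(attr_sim(tc["material"], cc["material"]) for cc in candidate["components"])
            for tc in target["components"]
        ) / len(target["components"])
    return (1 - component_weight) * flat + component_weight * comp

target = {"type": "wardrobe", "width_mm": 1200,
          "components": [{"material": "oak"}, {"material": "mdf"}]}
candidate = {"type": "wardrobe", "width_mm": 1000,
             "components": [{"material": "oak"}, {"material": "pine"}]}
print(case_sim(target, candidate, {"type": 1.0, "width_mm": 0.5}))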
Abstract:
The so-called dividing instant (DI) problem is an ancient historical puzzle encountered when attempting to represent what happens at the boundary instant which divides two successive states. The specification of such a problem requires a thorough exploration of the primitives of the temporal ontology and the corresponding time structure, as well as the conditions that the resulting temporal models must satisfy. The problem is closely related to the question of how to characterize the relationship between time periods with positive duration and time instants with no duration. It involves the characterization of the ‘closed’ and ‘open’ nature of time intervals, i.e. whether time intervals include their ending points or not. In the domain of artificial intelligence, the DI problem may be treated as an issue of how to represent different assumptions (or hypotheses) about the DI in a consistent way. In this paper, we shall examine various temporal models including those based solely on points, those based solely on intervals and those based on both points and intervals, and point out the corresponding DI problem with regard to each of these temporal models. We shall propose a classification of assumptions about the DI and provide a solution to the corresponding problem.
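One common way to keep a model of the dividing instant consistent, sketched below with hypothetical class and state names, is to let every state hold over a half-open interval [start, end): at the boundary instant exactly one of two successive states holds, so there is neither overlap nor gap:

from dataclasses import dataclass

@dataclass
class State:
    name: str
    start: float   # inclusive
    end: float     # exclusive: the state does not hold at its own ending instant

    def holds_at(self, t: float) -> bool:
        return self.start <= t < self.end

light_off = State("off", 0.0, 5.0)
light_on = State("on", 5.0, 10.0)

t = 5.0  # the dividing instant between the two successive states
assert not light_off.holds_at(t) and light_on.holds_at(t)
print("exactly one state holds at the dividing instant: no overlap, no gap")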
Abstract:
This paper discusses an optimisation-based decision support system and methodology for electronic packaging and product design and development, which is capable of addressing specified environmental, reliability and cost requirements in an efficient manner. A study focusing on the design of a flip-chip package is presented. Different alternatives for the design of the flip-chip package are considered, based on existing options for the applied underfill and the volume of solder material used to form the interconnects. Variations in these design input parameters simultaneously affect package aspects such as cost, environmental impact and reliability. A decision system for the design of the flip-chip that uses a numerical optimisation approach is used to identify the optimal package specification satisfying the imposed requirements. The reliability aspect of interest is the fatigue of solder joints under thermal cycling. Transient nonlinear finite element analysis (FEA) is used to simulate the thermal fatigue damage in solder joints subject to thermal cycling. Simulation results are manipulated within a design-of-experiments and response-surface modelling framework to provide a numerical model for reliability that can be used to quantify the package reliability. Assessment of the environmental impact of the package materials is performed using the so-called Toxic Index (TI); in this paper we demonstrate the evaluation of the environmental impact only for the underfill and lead-free solder materials, based on the amount of material per flip-chip package. Cost is the dominant factor in the contemporary flip-chip packaging industry, so the cost of materials, which varies with the design parameters, is also considered in the optimisation-based decision support system for the design of the flip-chip package.
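A compact sketch of the response-surface-plus-optimisation step, with entirely hypothetical DoE points and cost coefficients standing in for the FEA-derived models: a quadratic surrogate for solder-joint fatigue life is fitted to the DoE results and material cost is minimised subject to a reliability constraint:

import numpy as np
from scipy.optimize import minimize
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# hypothetical DoE points: underfill modulus (GPa), solder volume (fraction of nominal)
doe = np.array([[6, 0.8], [6, 1.2], [10, 0.8], [10, 1.2], [8, 1.0]], dtype=float)
cycles_to_failure = np.array([1800.0, 2400.0, 2100.0, 3000.0, 2500.0])  # hypothetical results

reliability = make_pipeline(PolynomialFeatures(2), LinearRegression()).fit(doe, cycles_to_failure)

def cost(x):
    underfill_modulus, solder_volume = x
    return 0.02 * underfill_modulus + 0.5 * solder_volume    # hypothetical material cost model

constraints = [{"type": "ineq",                              # predicted life >= 2500 cycles
                "fun": lambda x: reliability.predict([x])[0] - 2500.0}]

result = minimize(cost, x0=[8.0, 1.0], method="SLSQP",
                  bounds=[(6, 10), (0.8, 1.2)], constraints=constraints)
print("optimal design:", result.x,
      "predicted cycles:", reliability.predict([result.x])[0])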
Abstract:
This paper discusses a reliability-based optimisation modelling approach, demonstrated for the design of a SiP structure integrated by stacking dies one upon the other. In this investigation the focus is on the strategy for handling the uncertainties in the package design inputs and its implementation in the design optimisation modelling framework. The analysis of thermo-mechanical behaviour of the package is utilised to predict the fatigue lifetime of the lead-free board-level solder interconnects and the warpage of the package under thermal cycling. The SiP characterisation is obtained through the exploitation of Reduced Order Models (ROM) constructed using high-fidelity analysis and Design of Experiments (DoE) methods. The design task is to identify the optimal SiP design specification by varying several package input parameters so that a specified target reliability of the solder joints is achieved while, at the same time, design requirements and package performance criteria are met.
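A small sketch of the uncertainty-handling step, with a purely hypothetical reduced-order model standing in for the FEA-based surrogate: uncertain package inputs are sampled and propagated through the surrogate to estimate the probability that the solder-joint fatigue life meets its target:

import numpy as np

rng = np.random.default_rng(6)

def fatigue_life_rom(die_thickness_um, solder_height_um):
    """Hypothetical reduced-order model: thermal-cycling fatigue life in cycles."""
    return 1500 + 8.0 * solder_height_um - 3.0 * (die_thickness_um - 100) ** 2 / 100

n_samples, target_cycles = 100_000, 2100

# uncertain design inputs (assumed manufacturing tolerances)
die_thickness = rng.normal(100, 5, n_samples)   # um
solder_height = rng.normal(80, 6, n_samples)    # um

life = fatigue_life_rom(die_thickness, solder_height)
reliability = np.mean(life >= target_cycles)
print(f"P(life >= {target_cycles} cycles) = {reliability:.3f}")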