947 results for Specifications
Abstract:
This study empirically examines the inflation dynamics of the euro area, focusing on the role of expectations in the inflation process. In six articles we relax the rationality assumption and proxy expectations directly using OECD forecasts or Consensus Economics survey data. In the first four articles we estimate alternative Phillips curve specifications and find evidence that inflation cannot instantaneously adjust to changes in expectations. A possible departure of expectations from rationality does not appear powerful enough to fully explain the persistence of euro area inflation in the New Keynesian framework. When expectations are measured directly, the purely forward-looking New Keynesian Phillips curve is outperformed by the hybrid Phillips curve, which adds a lagged inflation term, and by the New Classical Phillips curve, which adds a lagged expectations term. The results suggest that the euro area inflation process has become more forward-looking in recent years of low and stable inflation; in low-inflation countries, inflation dynamics have been more forward-looking since as early as the late 1970s. We find evidence of substantial heterogeneity in inflation dynamics across euro area countries. Real-time data analysis suggests that, in the euro area, real-time information matters most in the expectations term of the Phillips curve, and that the balance of expectations formation is more forward- than backward-looking. The last two articles estimate vector autoregressive (VAR) models of actual inflation, inflation expectations, and the output gap. The VAR analysis indicates that inflation expectations, which are relatively persistent, have a significant effect on output. However, expectations seem to react to changes in both output and actual inflation, especially in the medium term. Overall, this study suggests that expectations play a central role in inflation dynamics, and this should be taken into account in the conduct of monetary policy.
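For context, a textbook formulation of the hybrid New Keynesian Phillips curve referred to above (the symbols are standard conventions, not notation taken from the articles themselves) is

$$\pi_t = \gamma_f E_t[\pi_{t+1}] + \gamma_b \pi_{t-1} + \lambda x_t + \varepsilon_t,$$

where $\pi_t$ is inflation, $x_t$ is a driving variable such as the output gap or real marginal cost, and $\gamma_f$ and $\gamma_b$ weight the forward- and backward-looking components. In the approach summarized above, the rational-expectations term $E_t[\pi_{t+1}]$ is replaced by directly measured survey expectations; the purely forward-looking case corresponds to $\gamma_b = 0$.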
Abstract:
Detect and Avoid (DAA) technology is widely acknowledged as a critical enabler for unsegregated Remotely Piloted Aircraft (RPA) operations, particularly Beyond Visual Line of Sight (BVLOS). Image-based DAA in the visible spectrum is a promising technological option for addressing the challenges DAA presents. Two impediments to progress for this approach are the scarcity of video footage available to train and test algorithms, and the lack of testing regimes and specifications that facilitate repeatable, statistically valid performance assessment. This paper makes three key contributions towards addressing these impediments. First, we detail our progress towards the creation of a large hybrid database of collision and near-collision encounters. Second, we explore the suitability of techniques employed by the biometric research community (Speaker Verification and Language Identification) for DAA performance optimisation and assessment; these techniques include Detection Error Trade-off (DET) curves, Equal Error Rates (EER), and the Detection Cost Function (DCF). Finally, the hybrid database and the speech-based techniques are combined and employed in the assessment of a contemporary image-based DAA system comprising stabilisation, morphological filtering, and a Hidden Markov Model (HMM) temporal filter.
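To make the borrowed biometric methodology concrete, the Equal Error Rate is the operating point at which the miss rate equals the false-alarm rate. A minimal sketch in Python (hypothetical scores and function names, not the paper's implementation):

```python
import numpy as np

def equal_error_rate(target_scores, nontarget_scores):
    """Find the operating point where the miss rate (targets scored below
    threshold) equals the false-alarm rate (non-targets scored at or above)."""
    thresholds = np.sort(np.concatenate([target_scores, nontarget_scores]))
    miss = np.array([(target_scores < t).mean() for t in thresholds])
    false_alarm = np.array([(nontarget_scores >= t).mean() for t in thresholds])
    idx = np.argmin(np.abs(miss - false_alarm))
    return 0.5 * (miss[idx] + false_alarm[idx])

# Hypothetical detector scores: genuine aircraft tracks vs. clutter
rng = np.random.default_rng(0)
eer = equal_error_rate(rng.normal(2.0, 1.0, 1000), rng.normal(0.0, 1.0, 1000))
```

Sweeping the threshold in the same way traces out the DET curve, and weighting misses and false alarms by application-dependent costs yields the Detection Cost Function.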
Abstract:
We consider systems composed of a base system with multiple “features” or “controllers”, each of which independently advises the system on how to react to input events so as to conform to its individual specification. We propose a methodology for developing such systems in a way that guarantees the “maximal” use of each feature. The methodology is based on the notion of “conflict-tolerant” features, which are designed to continue offering advice even when their advice has been overridden in the past. We give a simple priority-based composition scheme for such features, which ensures that each feature is maximally utilized. We also provide a formal framework for specifying, verifying, and synthesizing such features. In particular, we obtain a compositional technique for verifying systems developed in this framework.
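As a loose illustration only (a toy Python analogy, not the authors' formal framework), priority-based composition of conflict-tolerant features can be pictured as features that return a set of acceptable actions on every event, with lower-priority advice consulted only where it is consistent with higher-priority advice:

```python
def compose_by_priority(features, event):
    """features: callables ordered from highest to lowest priority; each maps
    an input event to a set of acceptable actions (its 'advice').  A
    conflict-tolerant feature keeps advising on every event, even if its
    advice was overridden earlier, so lower-priority features can still
    narrow the choice whenever they do not conflict with higher priority."""
    allowed = None
    for feature in features:
        advice = set(feature(event))
        if allowed is None:
            allowed = advice
        elif allowed & advice:       # consult lower-priority advice only if
            allowed &= advice        # it is consistent with choices so far
    if not allowed:
        raise ValueError("the base system must offer at least one action")
    return sorted(allowed)[0]        # deterministic tie-break for the sketch

# Hypothetical features for a call-handling system
forwarding = lambda ev: {"forward"} if ev == "incoming" else {"idle"}
screening  = lambda ev: {"reject", "forward"} if ev == "incoming" else {"idle"}
print(compose_by_priority([screening, forwarding], "incoming"))  # forward
```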
Abstract:
This paper addresses the problem of detecting and resolving conflicts due to timing constraints imposed by features in real-time systems. We consider systems composed of a base system with multiple features or controllers, each of which independently advises the system on how to react to input events so as to conform to its individual specification. We propose a methodology for developing such systems in a modular manner, based on the notion of conflict-tolerant features that are designed to continue offering advice even when their advice has been overridden in the past. We give a simple priority-based scheme for composing such features, which guarantees the maximal use of each feature. We provide a formal framework for specifying such features, and a compositional technique for verifying systems developed in this framework.
Abstract:
This paper addresses the problem of detecting and resolving conflicts due to timing constraints imposed by features in real-time and hybrid systems. We consider systems composed of a base system with multiple features or controllers, each of which independently advises the system on how to react to input events so as to conform to its individual specification. We propose a methodology for developing such systems in a modular manner, based on the notion of conflict-tolerant features that are designed to continue offering advice even when their advice has been overridden in the past. We give a simple priority-based scheme for composing such features, which guarantees the maximal use of each feature. We provide a formal framework for specifying such features, and a compositional technique for verifying systems developed in this framework.
Abstract:
The problem of admission control of packets in communication networks is studied in a continuous-time queueing framework with different classes of service and delayed information feedback. We develop and use a variant of a simulation-based, two-timescale simultaneous perturbation stochastic approximation (SPSA) algorithm for finding an optimal feedback policy within the class of threshold-type policies. Even though SPSA was originally designed for continuous parameter optimization, its variant for the discrete parameter case is seen to work well. We give a proof of the hypotheses needed to show convergence of the algorithm in our setting, along with a sketch of the convergence analysis. Extensive numerical experiments with the algorithm are presented for different parameter specifications. In particular, we study the effect of feedback delays on system performance.
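For flavour, a minimal SPSA sketch for integer-valued thresholds (the paper's algorithm is a two-timescale, simulation-based variant with delayed feedback; the cost function, gain sequence, and names below are illustrative assumptions):

```python
import numpy as np

def spsa_discrete(cost, theta0, n_iter=500, seed=0):
    """Basic SPSA adapted to integer thresholds: perturb all components
    simultaneously by random +/-1 directions, estimate the gradient from two
    noisy cost evaluations, and round iterates onto the integer lattice."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    for k in range(1, n_iter + 1):
        a_k = 0.5 / k                              # decreasing step size
        delta = rng.choice([-1.0, 1.0], size=theta.shape)
        y_plus = cost(np.rint(theta + delta))
        y_minus = cost(np.rint(theta - delta))
        # elementwise division is valid because delta is +/-1 per coordinate
        theta -= a_k * (y_plus - y_minus) / (2.0 * delta)
    return np.rint(theta).astype(int)

# Hypothetical noisy cost for two admission thresholds, minimized at (5, 12)
rng = np.random.default_rng(1)
noisy_cost = lambda th: (th[0] - 5) ** 2 + (th[1] - 12) ** 2 + rng.normal(0, 1)
print(spsa_discrete(noisy_cost, [0, 0]))
```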
Abstract:
In this thesis we deal with the concept of risk. The objective is to bring together and draw conclusions from some normative information regarding quantitative portfolio management and risk assessment. The first essay concentrates on return dependency. We propose an algorithm for classifying markets into rising and falling, and from it derive a statistic, the Trend Switch Probability, for detecting long-term return dependency in the first moment. The empirical results suggest that the Trend Switch Probability is robust over various volatility specifications. Serial dependency behaves differently in bull and bear markets, however: it is strongly positive in rising markets, whereas in bear markets returns are closer to a random walk. Realized volatility, a technique for estimating volatility from high-frequency data, is investigated in essays two and three. In the second essay we find, when measuring realized variance on a set of German stocks, that the second-moment dependency structure is highly unstable and changes randomly. The results also suggest that volatility is non-stationary from time to time. In the third essay we examine the impact of market microstructure on the error between estimated realized volatility and the volatility of the underlying process. With simulation-based techniques we show that autocorrelation in returns leads to biased variance estimates, and that lower sampling frequency and non-constant volatility increase the error variation between the estimated variance and the variance of the underlying process. From these essays we conclude that volatility is not easily estimated, even from high-frequency data: it is not very well behaved in terms of either stability or dependency over time. Based on these observations, we recommend the use of simple, transparent methods that are likely to be more robust over differing volatility regimes than models with a complex parameter universe. In analyzing long-term return dependency in the first moment we find that the Trend Switch Probability is a robust estimator. This is an interesting area for further research, with important implications for active asset allocation.
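To make the realized-volatility estimator concrete, a minimal Python sketch on synthetic data (the essays' actual data are high-frequency German stock prices; all names here are hypothetical):

```python
import numpy as np

def realized_variance(intraday_prices):
    """Realized variance: the sum of squared intraday log returns.  Under
    ideal conditions it converges to the integrated variance of the
    underlying process as the sampling frequency increases."""
    log_returns = np.diff(np.log(intraday_prices))
    return np.sum(log_returns ** 2)

# Hypothetical 5-minute price path for one trading day
rng = np.random.default_rng(1)
prices = 100.0 * np.exp(np.cumsum(rng.normal(0.0, 0.001, 96)))
rv = realized_variance(prices)
ann_vol = np.sqrt(252 * rv)   # crude annualized volatility from one day
```

Microstructure effects such as autocorrelated returns bias exactly this sum of squared returns, which is the estimation error studied in the third essay.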
Abstract:
The use of different time units in option pricing may lead to inconsistent estimates of time decay and spurious jumps in implied volatilities. Different time units in the pricing model lead to different implied volatilities even though the option price itself is the same. The chosen time unit should make it necessary to adjust the volatility parameter only when there are fundamental reasons for doing so, and not because of a mis-specified model. This paper examines the effects of pricing options under different time hypotheses and empirically investigates which time frame the option markets in Germany employ over weekdays, aiming specifically to build a picture of how the market prices options. The results suggest that the German market deviates from the most traditional time units in option pricing, calendar and trading days. The study also shows that implied volatility on Thursdays was somewhat higher, differing from the pattern of the other days of the week. Further investigation with a GARCH model showed that although traditional tests, such as analysis of variance, indicated a negative return for Thursdays during the same period as the implied volatilities used, this was not supported by the GARCH model.
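The mechanism is easy to reproduce: with the option price held fixed, implied variance times maturity is pinned down, so measuring the same maturity in calendar versus trading days yields different implied volatilities. A minimal Black-Scholes sketch with illustrative numbers:

```python
import math

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

def implied_vol(price, S, K, T, r):
    """Invert the pricing formula for sigma by bisection (price is
    monotonically increasing in sigma)."""
    lo, hi = 1e-4, 5.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if bs_call(S, K, T, r, mid) > price:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# One market quote, two time conventions for the same ten calendar days
quote = bs_call(100.0, 100.0, 10 / 365, 0.02, 0.25)
iv_calendar = implied_vol(quote, 100.0, 100.0, 10 / 365, 0.02)  # ~0.250
iv_trading  = implied_vol(quote, 100.0, 100.0, 7 / 252, 0.02)   # ~0.248
```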
Abstract:
This paper investigates to what extent the volatility of Finnish stock portfolios is transmitted through the "world volatility". We operationalize the volatility processes of Finnish leverage, industry, and size portfolio returns with asymmetric GARCH specifications following Glosten et al. (1993), using daily return data from January 2, 1987 to December 30, 1998. We find that the world shock enters the domestic models significantly, and that its impact has increased over time; the same applies to the variance ratios and the correlations with the world. The larger the firm, the larger the world impact. The conditional variance is higher during recessions. The asymmetry parameter is, surprisingly, non-significant, and the leverage hypothesis cannot be verified. The return-generating process of the domestic portfolio returns does not usually include the world information set, indicating that the returns are generated by a segmented conditional asset pricing model.
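For reference, the asymmetric conditional variance specification of Glosten et al. (1993) is

$$h_t = \omega + \alpha\,\varepsilon_{t-1}^2 + \gamma\,\varepsilon_{t-1}^2\,\mathbf{1}\{\varepsilon_{t-1} < 0\} + \beta\,h_{t-1},$$

where $\gamma$ is the asymmetry (leverage) parameter reported above as non-significant. The paper's full models additionally incorporate the world shock and information set, whose exact form is specific to the study.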
Abstract:
We present an interactive map-based technique for designing single-input single-output compliant mechanisms that meet the requirements of practical applications. Our map juxtaposes user specifications with the attributes of real compliant mechanisms stored in a database, so that not only can the practical feasibility of the specifications be discerned quickly, but existing compliant mechanisms can also be modified interactively. The practical utility of the method exceeds that of shape and size optimization because it accounts for manufacturing considerations, stress limits, and material selection. The premise for the method is the spring-leverage (SL) model, which characterizes the kinematic and elastostatic behavior of compliant mechanisms with only three SL constants. The user specifications are met interactively using beam-based 2D models of compliant mechanisms by changing attributes such as: (i) overall size in two planar orthogonal directions, separately and together; (ii) uniform resizing of the in-plane widths of all the beam elements; (iii) uniform resizing of the out-of-plane thicknesses of the beam elements; and (iv) the material. We present a design software program with a graphical user interface for interactive design. A case study that describes the design procedure in detail is also presented, while additional case studies are posted on a website. DOI: 10.1115/1.4001877.
Abstract:
This paper describes a technique for the artificial generation of learning and test sample sets suitable for character recognition research. Sample sets of English (Latin), Malayalam, Kannada, and Tamil characters are generated easily from their prototype specifications, given by endpoint coordinates, the nature of the segments, and their connectivity.
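A minimal sketch of the idea (a hypothetical prototype encoding; the paper's specifications also exploit the nature of segments and their connectivity, which this endpoint-jitter sketch omits):

```python
import numpy as np

def generate_samples(prototype, n_samples=10, jitter=0.02, seed=0):
    """Generate artificial character samples from a prototype given as a
    list of straight segments, each specified by its two endpoint
    coordinates.  Random endpoint jitter yields distinct but recognisable
    variants for learning and test sets."""
    rng = np.random.default_rng(seed)
    samples = []
    for _ in range(n_samples):
        samples.append([np.asarray(seg) + rng.normal(0.0, jitter, (2, 2))
                        for seg in prototype])
    return samples

# Hypothetical prototype for 'L': two connected straight segments
L_prototype = [((0.1, 0.9), (0.1, 0.1)),   # vertical stroke
               ((0.1, 0.1), (0.6, 0.1))]   # horizontal stroke
variants = generate_samples(L_prototype)
```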
Abstract:
Higher-order LCL filters are essential for meeting interconnection standard requirements for grid-connected voltage source converters. LCL filters offer better harmonic attenuation and better efficiency at a smaller size than traditional L filters. The focus of this paper is to analyze the LCL filter design procedure from the point of view of power loss and efficiency. The IEEE 1547-2008 specifications for high-frequency current ripple are used as a major constraint early in the design to ensure that all subsequent optimizations remain compliant with the standard. Power loss in each individual filter component is calculated on a per-phase basis. The total per-unit inductance of the LCL filter is varied, and the LCL parameter values that give the highest efficiency while meeting the stringent standard requirements are identified. The power loss and harmonic output spectrum of the grid-connected LCL filter are experimentally verified, and measurements confirm the predicted trends.
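For orientation, an LCL filter with converter-side inductance $L_1$, grid-side inductance $L_2$, and filter capacitance $C_f$ resonates at

$$f_{\mathrm{res}} = \frac{1}{2\pi}\sqrt{\frac{L_1 + L_2}{L_1 L_2 C_f}},$$

and a common rule of thumb keeps this resonance well above the line frequency and below half the switching frequency. Any sweep of the total per-unit inductance, as described above, must respect this constraint alongside the ripple limits.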
Abstract:
This paper introduces CSP-like communication mechanisms into Backus’ Functional Programming (FP) systems extended by nondeterministic constructs. Several new functionals are used to describe nondeterminism and communication in programs. The functionals union and restriction are introduced into FP systems to develop a simple algebra of programs with nondeterminism. The behaviour of the other functionals proposed in this paper is characterized by the properties of union and restriction. The axiomatic semantics of the communication constructs are presented. Examples show that it is possible to reason about a communicating program by first transforming it into a non-communicating program using the axioms of communication, and then reasoning about the resulting non-communicating version. It is also shown that communicating programs can be developed from non-communicating programs given as specifications, using a transformational approach.
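As a loose analogy only (set-valued Python functions standing in for nondeterministic FP programs; this is not the paper's notation or semantics), union and restriction can be pictured as:

```python
def union(f, g):
    """Nondeterministic union: the program may behave as f or as g, so its
    possible results are the set union of both programs' results."""
    return lambda x: f(x) | g(x)

def restriction(f, pred):
    """Restriction: keep only the results of f satisfying the predicate,
    cutting down nondeterminism."""
    return lambda x: {y for y in f(x) if pred(y)}

# Hypothetical set-valued 'programs' on numbers
double = lambda x: {2 * x}
negate = lambda x: {-x}
either = union(double, negate)                    # may double or negate
positive = restriction(either, lambda y: y > 0)   # discard negative outcomes
print(either(3), positive(3))                     # {6, -3} {6}
```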
Abstract:
A review of the research work that has been carried out thus far relating the casting and heat treatment variables to the structure and mechanical properties of Al–7Si–Mg (wt-%) is presented here. Although specifications recommend a wide range of magnesium contents and a fairly high content of iron, a narrow range of magnesium contents, closer to either the upper or lower specified limits depending on the properties desired, and a low iron content will have to be maintained to obtain optimum and consistent mechanical properties. A few studies have revealed that the modification of eutectic silicon slightly increases ductility and fracture toughness and also that the effect of modification is predominant at low iron content. Generally, higher solidification rates give superior mechanical properties. Delayed aging (the time elapsed between quenching and artificial aging during precipitation hardening) severely affects the strength of the alloy. The mechanism of delayed aging can be explained on the basis of Pashley's kinetic model. It has been reported that certain trace additions (cadmium, indium, tin, etc.) neutralise the detrimental effect of delayed aging. In particular, it should be noted that delayed aging is not mentioned in any of the specifications. With reference to the mechanism by which trace additions neutralise the detrimental effect of delayed aging, various hypotheses have been postulated, of which impurity–vacancy interaction appears to be the most widely accepted.
Abstract:
Optimizing a shell-and-tube heat exchanger for a given duty is an important and relatively difficult task, and there is a need for a simple, general, and reliable method for accomplishing it. The authors present one such method for optimizing single-phase shell-and-tube heat exchangers under given geometric and thermohydraulic constraints. They discuss the problem in detail, then introduce a basic algorithm for optimizing the exchanger, based on data from an earlier study of a large collection of feasible designs generated for different process specifications. The algorithm ensures a near-optimal design satisfying the given heat duty and geometric constraints. The authors also provide several sub-algorithms to satisfy imposed velocity limitations, and illustrate their usefulness with several examples in which the exchanger weight is minimized.
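The structure of the task can be sketched as a feasibility filter over pre-generated candidate designs (a hypothetical illustration of the problem setup, not the authors' algorithm; the standard relation Q = U·A·ΔT_lm sizes the required area):

```python
def required_area(Q, U, dT_lm):
    """Heat-transfer area needed to meet the duty, from Q = U * A * dT_lm."""
    return Q / (U * dT_lm)

def near_optimal_design(candidates, Q, U, dT_lm, v_limits):
    """Pick the lightest candidate that meets the heat duty and stays within
    the imposed tube-side velocity limits."""
    a_min = required_area(Q, U, dT_lm)
    feasible = [c for c in candidates
                if c["area"] >= a_min
                and v_limits[0] <= c["tube_velocity"] <= v_limits[1]]
    return min(feasible, key=lambda c: c["weight"]) if feasible else None

# Hypothetical candidate designs (area in m^2, velocity in m/s, weight in kg)
candidates = [
    {"area": 120.0, "tube_velocity": 1.2, "weight": 2400.0},
    {"area": 150.0, "tube_velocity": 2.1, "weight": 2900.0},
]
best = near_optimal_design(candidates, Q=2.0e6, U=800.0, dT_lm=25.0,
                           v_limits=(0.9, 2.5))   # picks the 2400 kg design
```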