43 results for Time-varying system


Relevance: 90.00%

Abstract:

A simulation model has been constructed of a valve manufacturing plant with the aim of assessing capacity requirements in response to a forecast increase in demand. The plant provides a weekly cycle of valves of varying types, based on a yearly production plan. Production control is provided by a just-in-time type system to minimise inventory. The simulation model investigates the effect on production lead time of a range of valve sequences into the plant. The study required the collection of information from a variety of sources, and a model that reflected the true capabilities of the production system. The simulation results convinced management that substantial changes were needed in order to meet demand. The case highlights the use of simulation in enabling a manager to quantify operational scenarios and thus provide a rational basis on which to take decisions on meeting performance criteria.

Relevance: 80.00%

Abstract:

This article examines whether UK portfolio returns are time varying, so that expected returns follow an AR(1) process as proposed by Conrad and Kaul for the USA. It explores this hypothesis for four portfolios formed on the basis of market capitalization. The portfolio returns are modelled using a Kalman filter signal-extraction model in which the unobservable expected return is the state variable and is allowed to evolve as a stationary first-order autoregressive process. It finds that this model is a good representation of returns and can account for most of the autocorrelation present in observed portfolio returns. The study concludes that UK portfolio returns are time varying and that the nature of the time variation introduces a substantial amount of autocorrelation into portfolio returns. Like Conrad and Kaul, it finds a link between the extent to which portfolio returns are time varying and the size of the firms within a portfolio, but not the monotonic one found for the USA.
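The signal-extraction model described above can be sketched in a few lines: the observed return is a latent AR(1) expected return plus noise, filtered with a scalar Kalman recursion. This is a minimal illustrative implementation, not the authors' code; the parameter values (`phi`, state-noise variance `q`, observation-noise variance `r`) are arbitrary choices for the demonstration.

```python
import random

def kalman_ar1(returns, phi, q, r, mu0=0.0, p0=1.0):
    """Scalar Kalman filter: observed return = latent expected return + noise,
    where the latent expected return follows a stationary AR(1) with
    coefficient phi. q = state-noise variance, r = observation-noise variance."""
    mu, p = mu0, p0
    filtered = []
    for y in returns:
        # Predict the state one step ahead
        mu_pred = phi * mu
        p_pred = phi * phi * p + q
        # Update with the observed return
        k = p_pred / (p_pred + r)          # Kalman gain
        mu = mu_pred + k * (y - mu_pred)
        p = (1.0 - k) * p_pred
        filtered.append(mu)
    return filtered

# Simulate returns whose expected component is AR(1), then filter them.
random.seed(1)
phi, q, r = 0.9, 0.01, 0.04
mu_true, rets = 0.0, []
for _ in range(500):
    mu_true = phi * mu_true + random.gauss(0.0, q ** 0.5)
    rets.append(mu_true + random.gauss(0.0, r ** 0.5))
est = kalman_ar1(rets, phi, q, r)
```

The filtered series `est` is the model's running estimate of the unobservable expected return; its autocorrelation is what the AR(1) hypothesis attributes to observed returns.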

Relevance: 80.00%

Abstract:

On 20 October 1997 the London Stock Exchange introduced a new trading system called SETS. This system was to replace the dealer system SEAQ, which had been in operation since 1986. Using the iterated cumulative sums of squares (ICSS) test introduced by Inclan and Tiao (1994), we investigate whether there was a change in the unconditional variance of opening and closing returns at the time SETS was introduced. We show that for the FTSE-100 stocks traded on SETS, on the days following its introduction, there was a widespread increase in the volatility of both opening and closing returns. However, no synchronous volatility changes were found to be associated with the FTSE-100 index or FTSE-250 stocks. We conclude, therefore, that the introduction of the SETS trading mechanism caused an increase in noise at the time the system was introduced.
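The centred cumulative-sum-of-squares statistic at the heart of the Inclan-Tiao test is easy to sketch. The version below detects a single variance break only (the full procedure iterates the search over sub-segments); the simulated series and the 5% asymptotic critical value 1.358 follow standard presentations of the test, not this paper's data.

```python
import random

def icss_break(returns):
    """Inclan-Tiao centred cumulative-sum-of-squares statistic.
    Returns (max_stat, k_hat): the normalised max |D_k| and its location.
    A variance break is signalled when max_stat exceeds ~1.358 (5% level)."""
    sq = [x * x for x in returns]
    total = sum(sq)
    T = len(sq)
    c, best, k_hat = 0.0, 0.0, 0
    for k, s in enumerate(sq, start=1):
        c += s
        d = abs(c / total - k / T)   # D_k = C_k / C_T - k / T
        if d > best:
            best, k_hat = d, k
    return (T / 2.0) ** 0.5 * best, k_hat

# Series whose unconditional variance jumps at t = 100 (sd 1 -> sd 3).
random.seed(7)
series = [random.gauss(0, 1) for _ in range(100)] + \
         [random.gauss(0, 3) for _ in range(100)]
stat, k_hat = icss_break(series)
```

`k_hat` estimates the break date, here close to the true change point at t = 100.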

Relevance: 80.00%

Abstract:

A two-factor no-arbitrage model is used to provide a theoretical link between stock and bond market volatility. While this model suggests that short-term interest rate volatility may, at least in part, drive both stock and bond market volatility, the empirical evidence suggests that past bond market volatility affects both markets and feeds back into short-term yield volatility. The empirical modelling goes on to examine the (time-varying) correlation structure between volatility in the stock and bond markets and finds that the sign of this correlation has reversed over the last 20 years. This has important implications for portfolio selection in financial markets. © 2005 Elsevier B.V. All rights reserved.
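The kind of sign reversal described above is typically exhibited with a rolling-window correlation estimator. The snippet below is a generic sketch on stylised synthetic data, not the paper's empirical model; the window length and the constructed series are arbitrary.

```python
import math

def rolling_corr(x, y, window):
    """Rolling Pearson correlation over a moving window."""
    out = []
    for i in range(window, len(x) + 1):
        xs, ys = x[i - window:i], y[i - window:i]
        mx, my = sum(xs) / window, sum(ys) / window
        cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
        vx = sum((a - mx) ** 2 for a in xs)
        vy = sum((b - my) ** 2 for b in ys)
        out.append(cov / math.sqrt(vx * vy))
    return out

# Stylised data: 'bond volatility' tracks 'stock volatility' early on,
# then moves opposite to it -- a sign reversal like the one reported.
stock = [math.sin(0.7 * t) for t in range(200)]
bond = [s if t < 100 else -s for t, s in enumerate(stock)]
corr = rolling_corr(stock, bond, window=30)
```

Early windows show a correlation near +1, late windows near -1; real data would of course flip far less cleanly.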

Relevance: 80.00%

Abstract:

The effects of attentional modulation on activity within the human visual cortex were investigated using magnetoencephalography. Chromatic sinusoidal stimuli were used to evoke activity from the occipital cortex, with attention directed either toward or away from the stimulus using a bar-orientation judgment task. For five observers, global magnetic field power was plotted as a function of time from stimulus onset. The major peak of each function occurred at about 120 ms latency and was well modeled by a current dipole near the calcarine sulcus. Independent component analysis (ICA) on the non-averaged data for each observer also revealed one component of calcarine origin, the location of which matched that of the dipolar source determined from the averaged data. For two observers, ICA revealed a second component near the parieto-occipital sulcus. Although no effects of attention were evident using standard averaging procedures, time-varying spectral analyses of single trials revealed that the main effect of attention was to alter the level of oscillatory activity. Most notably, a sustained increase in alpha-band (7-12 Hz) activity of both calcarine and parieto-occipital origin was evident. In addition, calcarine activity in the range of 13-21 Hz was enhanced, while calcarine activity in the range of 5-6 Hz was reduced. Our results are consistent with the hypothesis that attentional modulation affects neural processing within the calcarine and parieto-occipital cortex by altering the amplitude of alpha-band activity and other natural brain rhythms. © 2003 Elsevier Inc. All rights reserved.
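A time-varying spectral analysis of a single trial reduces, in its simplest form, to computing band power in successive windows. The sketch below is illustrative only: a plain DFT over two one-second windows of a synthetic "trial", with the 7-12 Hz alpha band and a 128 Hz sampling rate chosen for the example (the study's MEG pipeline is not reproduced).

```python
import math

def band_power(signal, fs, f_lo, f_hi):
    """Power in [f_lo, f_hi] Hz from a plain DFT of the segment."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if f_lo <= f <= f_hi:
            re = sum(s * math.cos(2 * math.pi * k * t / n)
                     for t, s in enumerate(signal))
            im = sum(-s * math.sin(2 * math.pi * k * t / n)
                     for t, s in enumerate(signal))
            power += (re * re + im * im) / n
    return power

# Single synthetic 'trial': no alpha activity in the first second,
# a sustained 10 Hz oscillation in the second second (fs = 128 Hz).
fs = 128
trial = [0.1 * math.sin(2 * math.pi * 3 * t / fs) for t in range(fs)] + \
        [math.sin(2 * math.pi * 10 * t / fs) for t in range(fs)]
early = band_power(trial[:fs], fs, 7, 12)
late = band_power(trial[fs:], fs, 7, 12)
```

Comparing `early` and `late` alpha-band power across windows is the single-trial logic by which a sustained increase in oscillatory activity can be detected even when it averages out across trials.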

Relevance: 80.00%

Abstract:

Speech comprises dynamic and heterogeneous acoustic elements, yet it is heard as a single perceptual stream even when accompanied by other sounds. The relative contributions of grouping “primitives” and of speech-specific grouping factors to the perceptual coherence of speech are unclear, and the acoustical correlates of the latter remain unspecified. The parametric manipulations possible with simplified speech signals, such as sine-wave analogues, make them attractive stimuli to explore these issues. Given that the factors governing perceptual organization are generally revealed only where competition operates, the second-formant competitor (F2C) paradigm was used, in which the listener must resist competition to optimize recognition [Remez et al., Psychol. Rev. 101, 129-156 (1994)]. Three-formant (F1+F2+F3) sine-wave analogues were derived from natural sentences and presented dichotically (one ear = F1+F2C+F3; opposite ear = F2). Different versions of F2C were derived from F2 using separate manipulations of its amplitude and frequency contours. F2Cs with time-varying frequency contours were highly effective competitors, regardless of their amplitude characteristics. In contrast, F2Cs with constant frequency contours were completely ineffective. Competitor efficacy was not due to energetic masking of F3 by F2C. These findings indicate that modulation of the frequency, but not the amplitude, contour is critical for across-formant grouping.
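A sine-wave analogue of a single formant can be synthesised by integrating a frequency contour into phase sample by sample. The snippet below is a generic sketch of that idea, not the stimulus-generation code from the study; the glide endpoints, sampling rate, and flat amplitude contour are arbitrary illustrative values.

```python
import math

def sine_analogue(freq_contour, amp_contour, fs):
    """Synthesise one sine-wave 'formant' by accumulating the instantaneous
    frequency into phase and applying the amplitude contour."""
    phase, out = 0.0, []
    for f, a in zip(freq_contour, amp_contour):
        phase += 2 * math.pi * f / fs
        out.append(a * math.sin(phase))
    return out

fs, n = 8000, 8000
# Time-varying contour: an F2C-like glide from 1100 Hz to 1600 Hz.
glide = [1100 + 500 * t / n for t in range(n)]
# Constant contour: fixed at the glide's mean frequency.
constant = [1350.0] * n
amp = [1.0] * n
varying_f2c = sine_analogue(glide, amp, fs)
constant_f2c = sine_analogue(constant, amp, fs)
```

The study's contrast between effective and ineffective competitors corresponds to the two contours here: same amplitude contour, differing only in whether the frequency contour varies over time.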

Relevance: 80.00%

Abstract:

Experimental observation of autosoliton propagation in a nonlinear switch-guided, dispersion-managed system operating at 80 Gbit/s is reported for the first time. The system is based on a strong dispersion map and supports autosoliton propagation over 3000 km.

Relevance: 80.00%

Abstract:

Corporate restructuring is perceived as a challenge to research. Prior studies do not provide conclusive evidence regarding the effects of restructuring. Given these inconclusive findings, this research examines the effects of restructuring events among UK listed firms. The sample firms are listed on the LSE and the London AIM stock exchange. Only completed restructuring transactions are included in the study. The time horizon extends from 1999 to 2003, and a three-year floating window is assigned to examine the sample firms. The key enquiry is to scrutinise the ex post effects of restructuring on performance and value measures of firms, in contrast to a matched-criteria non-restructured sample. A cross-sectional study employing logit estimates is undertaken to examine the firm characteristics of the restructuring samples. Further, additional parameters, i.e. Conditional Volatility and Asymmetry, are generated under the GJR-GARCH estimate and reiterated in the logit models to capture time-varying heteroscedasticity in the samples. This research incorporates most forms of restructuring, whereas prior studies have examined only certain forms and have made limited attempts to examine different restructuring events simultaneously. In addition to the logit analysis, an event study is adopted to evaluate the announcement effect of restructuring under both the OLS and GJR-GARCH estimates, supplementing the prior results. By engaging a composite empirical framework, the estimation method permits a full appreciation of the restructuring effect. The study provides evidence that restructurings have a non-trivial, significant positive effect. There is some evidence that the response differs with the type of restructuring, particularly when the event study is applied. The results establish that the performance measures, i.e. Operating Profit Margin, Return on Equity, Return on Assets, Growth, Size, Profit Margin and Shareholders' Ownership, show a consistent and significant increase, while Leverage and Asset Turnover suggest a reasonable influence of restructuring across the sample period. Similarly, the value measures, i.e. Abnormal Returns, Return on Equity and Cash Flow Margin, suggest sizeable improvement. A notable characteristic seen coherently throughout the analysis is the decreasing proportion of Systematic Risk; consistent with this, Conditional Volatility and Asymmetry exhibit a similar trend. The event study analysis suggests that, on average, the market perceives restructuring favourably and shareholders experience significant and systematic positive gains.
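The GJR-GARCH specification used to generate the conditional volatility and asymmetry parameters follows a simple one-lag recursion, sketched below. The coefficients (`omega`, `alpha`, `gamma`, `beta`) are arbitrary illustrative values, not estimates from this study.

```python
def gjr_garch_variance(residuals, omega, alpha, gamma, beta, var0):
    """GJR-GARCH(1,1) conditional-variance recursion (Glosten et al., 1993):
    sigma2_t = omega + (alpha + gamma * I[e_{t-1} < 0]) * e_{t-1}**2
               + beta * sigma2_{t-1},
    so a negative shock raises next-period variance by the extra gamma term."""
    var = var0
    path = [var]
    for e in residuals:
        leverage = gamma if e < 0 else 0.0
        var = omega + (alpha + leverage) * e * e + beta * var
        path.append(var)
    return path

# Same-sized positive vs negative shock: the asymmetry in action.
params = dict(omega=0.01, alpha=0.05, gamma=0.10, beta=0.90, var0=0.5)
after_pos = gjr_garch_variance([+1.0], **params)[-1]
after_neg = gjr_garch_variance([-1.0], **params)[-1]
```

With these numbers the negative shock leaves next-period variance exactly `gamma * e**2 = 0.10` higher than the positive one, which is the asymmetry the study feeds back into its logit models.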

Relevance: 80.00%

Abstract:

This Thesis reports on the principles and usefulness of Performance Rating as developed by the writer over a number of years. In Part one a brief analysis is made of the Quality scene and its development up to the present. The need is exposed for Performance Rating as a tool for all areas of management.* At the same time a system of Quality Control is described which the writer has further developed under the title of 'Operator Control'. This system is based on the integration of all Quality control functions with the creative functions required for Quality achievement. The discussions are mainly focussed on the general philosophy of Quality, its creation and control, and that part of Operator Control which affects Performance Rating. Whereas it is shown that the combination of Operator Control and Performance Rating is both economically and technically advantageous, Performance Rating can also usefully be applied under inspection control conditions. Part two describes the principles of Area Performance Rating. From this a summation expression is derived which gives the key for the grouping of areas with similar Performance Rating (P). A model is devised on which the theory is demonstrated. Relevant case studies, carried out in practice in factories, are quoted in Part two, Chapter 4, one written by the Quality manager of that particular factory. Particular stress is laid in the final conclusions on management's function in the Quality field and on how greatly this function is eased and improved through the introduction of Area Performance Rating. *The need for, and the advantages of, Performance Rating are particularly demonstrated in Case study No. 1.

Relevance: 80.00%

Abstract:

This thesis documents the design, implementation and testing of a smart sensing platform that is able to discriminate between differences or small changes in a person's walking. The distributive tactile sensing method is used to monitor the deflection of the platform surface using just a small number of sensors and, through the use of neural networks, to infer the characteristics of the object in contact with the surface. The thesis first describes the development of a mathematical model which uses a novel method to track the position of a moving load as it passes over the smart sensing surface. Experimental methods are then described for using the platform to track the position of a swinging pendulum in three dimensions. It is demonstrated that the method can be extended to real-time measurement of the balance and sway of a person during quiet standing. Current classification methods are then investigated for use in the classification of different gait patterns, in particular to identify individuals by their unique gait pattern. Based on these observations, a novel algorithm is developed that is able to discriminate between normal and affected gait. This algorithm, using the distributive tactile sensing method, was found to have greater accuracy than the other methods investigated and was designed to cope with any type of gait variation. The system developed in this thesis has applications in the area of medical diagnostics, either as an initial screening tool for detecting walking disorders or as a means of automatically detecting changes in gait over time. The system could also be used as a discreet biometric identification method, for example identifying office workers as they pass over the surface.
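The full system infers contact characteristics from surface deflections using neural networks. As a deliberately simplified illustration of the underlying idea, a point load on an idealised simply supported platform can be located from just two support reactions by a static moment balance; the function names and numbers below are hypothetical, not taken from the thesis.

```python
def load_position(r1, r2, length):
    """Point load W = r1 + r2 on an idealised simply supported platform:
    moment balance about the left support gives x = r2 * length / W."""
    w = r1 + r2
    return r2 * length / w, w

def reactions(x, w, length):
    """Inverse model: support reactions produced by load w at position x."""
    r2 = w * x / length
    return w - r2, r2

# A 70 kg 'walker' standing 0.75 m along a 2 m platform.
r1, r2 = reactions(0.75, 70.0 * 9.81, 2.0)
x_hat, w_hat = load_position(r1, r2, 2.0)
```

Recovering both the position and the weight from two sensor readings is the statics analogue of what the distributive tactile sensing method does, with neural networks replacing the closed-form inverse for a real, non-ideal surface.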

Relevance: 80.00%

Abstract:

This thesis presents research within empirical financial economics with a focus on liquidity and portfolio optimisation in the stock market. The discussion of liquidity centres on measurement issues, including TAQ data processing and the measurement of systematic liquidity factors; a framework for treating the two topics in combination is also provided. The liquidity part of the thesis gives a conceptual background to liquidity and discusses several different approaches to liquidity measurement. It contributes to liquidity measurement by providing detailed guidelines on the data processing needed to apply TAQ data to liquidity research. The main focus, however, is the derivation of systematic liquidity factors. The principal component approach to systematic liquidity measurement is refined by the introduction of moving and expanding estimation windows, allowing for time-varying liquidity covariances between stocks. Under several liquidity specifications, this improves the ability to explain stock liquidity and returns compared with static-window PCA and market-average approximations of systematic liquidity. The highest ability to explain stock returns is obtained when using inventory cost as the liquidity measure and a moving-window PCA as the systematic liquidity derivation technique. Systematic factors derived in this setting also have a strong ability to explain cross-sectional liquidity variation. Portfolio optimisation in the full-scale optimisation (FSO) framework is tested in two empirical studies. These contribute to the assessment of FSO by expanding its applicability to stock indexes and individual stocks, by considering a wide selection of utility function specifications, and by showing explicitly how the full-scale optimum can be identified using either grid search or the heuristic search algorithm of differential evolution.
The studies show that relative to mean-variance portfolios, FSO performs well in these settings and that the computational expense can be mitigated dramatically by application of differential evolution.
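The moving-window principal-component idea can be sketched as follows: re-estimate the leading eigenvector of the covariance matrix over each window, so the implied liquidity covariances between stocks are allowed to drift over time. This is a generic illustration on synthetic two-series data (the window length and the series are arbitrary), not the thesis's estimation code.

```python
import math

def first_pc(window_rows):
    """Leading principal component of the columns of `window_rows`
    (observations x variables) via power iteration on the covariance matrix."""
    n, m = len(window_rows), len(window_rows[0])
    means = [sum(row[j] for row in window_rows) / n for j in range(m)]
    cov = [[sum((row[i] - means[i]) * (row[j] - means[j])
                for row in window_rows) / (n - 1)
            for j in range(m)] for i in range(m)]
    v = [1.0] * m
    for _ in range(200):                       # power iteration
        w = [sum(cov[i][j] * v[j] for j in range(m)) for i in range(m)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

def moving_window_pcs(data, window):
    """Re-estimate the leading PC over a moving window, allowing the
    systematic factor loadings to vary in time."""
    return [first_pc(data[i - window:i]) for i in range(window, len(data) + 1)]

# Two co-moving 'stock liquidity' series: PC1 loads on both with the same sign.
data = [[math.sin(0.3 * t), 0.9 * math.sin(0.3 * t) + 0.1 * math.cos(t)]
        for t in range(60)]
pcs = moving_window_pcs(data, window=20)
```

Each entry of `pcs` is that window's systematic-factor loading vector; tracking how it evolves across windows is what distinguishes the moving-window refinement from a single static-window PCA.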

Relevance: 80.00%

Abstract:

Distributed network utility maximization (NUM) is receiving increasing interest for cross-layer optimization problems in multihop wireless networks. Traditional distributed NUM algorithms rely heavily on feedback information between different network elements, such as traffic sources and routers. Because of the distinct features of multihop wireless networks, such as time-varying channels and dynamic network topology, the feedback information is usually inaccurate, which represents a major obstacle to applying distributed NUM to wireless networks. The questions to be answered include whether a distributed NUM algorithm can converge with inaccurate feedback and how to design an effective distributed NUM algorithm for wireless networks. In this paper, we first use the infinitesimal perturbation analysis technique to provide an unbiased gradient estimate of the aggregate rate of traffic sources at the routers based on locally available information. On this basis, we propose a stochastic approximation algorithm to solve the distributed NUM problem with inaccurate feedback. We then prove that the proposed algorithm can converge to the optimum solution of distributed NUM with perfect feedback under certain conditions. The proposed algorithm is applied to the joint rate and medium access control problem for wireless networks. Numerical results demonstrate the convergence of the proposed algorithm. © 2013 John Wiley & Sons, Ltd.
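The flavour of stochastic approximation under noisy feedback can be conveyed with a one-source toy problem: drive the marginal log-utility toward a link price that is only ever observed with noise, using the classic Robbins-Monro decaying step size. This is a pedagogical sketch, not the paper's algorithm; the price, noise level, and step-size rule are illustrative assumptions.

```python
import random

def stochastic_approx_rate(price, noise_sd, iters, seed=0):
    """Robbins-Monro style source-rate update for one log-utility source:
    drive U'(x) - p = 1/x - p to zero using only noisy price feedback,
    with the usual decaying step size a_n = 1 / n."""
    random.seed(seed)
    x = 0.1
    for n in range(1, iters + 1):
        noisy_price = price + random.gauss(0.0, noise_sd)  # inaccurate feedback
        x += (1.0 / n) * (1.0 / x - noisy_price)
        x = max(x, 1e-3)                                   # keep the rate positive
    return x

# The optimum of max log(x) - p*x is x* = 1/p = 0.5 for p = 2.
rate = stochastic_approx_rate(price=2.0, noise_sd=0.5, iters=5000)
```

Despite every single feedback observation being wrong, the decaying step size averages the noise out and the rate settles near the optimum computed under perfect feedback, which is the convergence property the paper proves for its NUM setting.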

Relevance: 80.00%

Abstract:

This study examines the selectivity and timing performance of 218 UK investment trusts over the period July 1981 to June 2009. We estimate the Treynor and Mazuy (1966) and Henriksson and Merton (1981) models augmented with the size, value, and momentum factors, either under the OLS method adjusted with the Newey-West procedure or under the GARCH(1,1)-in-mean method following the specification of Glosten et al. (1993; hereafter GJR-GARCH-M). We find that the OLS method provides little evidence in favour of selectivity and timing ability, consistent with previous studies. Interestingly, the GJR-GARCH-M method reverses this result, showing relatively strong evidence of favourable selectivity ability, particularly for international funds, as well as favourable timing ability, particularly for domestic funds. We conclude that the GJR-GARCH-M method performs better in evaluating fund performance than the OLS method and the non-parametric approach, as it accounts for the time-varying characteristics of factor loadings and hence obtains more reliable results, particularly when high-frequency data, such as daily returns, are used in the analysis. Our results are robust to various in-sample and out-of-sample tests and have valuable implications for practitioners making asset allocation decisions across different fund styles. © 2012 Elsevier B.V.
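The Treynor-Mazuy regression underlying the timing tests fits a quadratic market term, with a positive quadratic coefficient read as market-timing ability. Below is a self-contained OLS sketch on synthetic data; the factor-augmented and GJR-GARCH-M variants used in the study are not reproduced, and all numbers are illustrative.

```python
import math

def solve3(a, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with pivoting."""
    m = [row[:] + [rhs] for row, rhs in zip(a, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(3):
            if r != col:
                f = m[r][col] / m[col][col]
                m[r] = [x - f * y for x, y in zip(m[r], m[col])]
    return [m[i][3] / m[i][i] for i in range(3)]

def treynor_mazuy(fund, market):
    """OLS fit of r_fund = alpha + beta * r_m + gamma * r_m**2 via the
    normal equations; a positive gamma indicates market-timing ability."""
    x = [[1.0, rm, rm * rm] for rm in market]
    xtx = [[sum(r[i] * r[j] for r in x) for j in range(3)] for i in range(3)]
    xty = [sum(r[i] * y for r, y in zip(x, fund)) for i in range(3)]
    return solve3(xtx, xty)  # [alpha, beta, gamma]

# Synthetic fund with gamma = 2 built in by construction.
market = [0.02 * math.sin(1.3 * t) for t in range(120)]
fund = [0.001 + 0.8 * rm + 2.0 * rm * rm for rm in market]
alpha, beta, gamma = treynor_mazuy(fund, market)
```

Because the synthetic fund is an exact quadratic function of the market return, OLS recovers the planted coefficients; on real data, the study's point is that the inference around gamma depends heavily on how the error variance is modelled.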

Relevance: 80.00%

Abstract:

How speech is separated perceptually from other speech remains poorly understood. In a series of experiments, perceptual organisation was probed by presenting three-formant (F1+F2+F3) analogues of target sentences dichotically, together with a competitor for F2 (F2C), or for F2+F3, which listeners must reject to optimise recognition. To control for energetic masking, the competitor was always presented in the opposite ear to the corresponding target formant(s). Sine-wave speech was used initially, and different versions of F2C were derived from F2 using separate manipulations of its amplitude and frequency contours. F2Cs with time-varying frequency contours were highly effective competitors, whatever their amplitude characteristics, whereas constant-frequency F2Cs were ineffective. Subsequent studies used synthetic-formant speech to explore the effects of manipulating the rate and depth of formant-frequency change in the competitor. Competitor efficacy was not tuned to the rate of formant-frequency variation in the target sentences; rather, the reduction in intelligibility increased with competitor rate relative to the rate for the target sentences. Therefore, differences in speech rate may not be a useful cue for separating the speech of concurrent talkers. Effects of competitors whose depth of formant-frequency variation was scaled by a range of factors were explored using competitors derived either by inverting the frequency contour of F2 about its geometric mean (plausibly speech-like pattern) or by using a regular and arbitrary frequency contour (triangle wave, not plausibly speech-like) matched to the average rate and depth of variation for the inverted F2C. Competitor efficacy depended on the overall depth of frequency variation, not depth relative to that for the other formants. Furthermore, the triangle-wave competitors were as effective as their more speech-like counterparts. 
Overall, the results suggest that formant-frequency variation is critical for the across-frequency grouping of formants but that this grouping does not depend on speech-specific constraints.

Relevance: 80.00%

Abstract:

Limited energy is a major challenge for large-scale wireless sensor networks (WSNs). Previous research shows that modulation scaling is an efficient technique for reducing energy consumption. However, the impact of modulation scaling on packet delivery latency and loss is usually not considered, which may have adverse effects on application quality. In this paper, we study this problem and propose control schemes to minimize energy consumption while ensuring application quality. We first analyze the relationships between modulation scaling and energy consumption, end-to-end delivery latency and packet loss ratio. With this analytical model, we develop a centralized control scheme that adaptively adjusts the modulation levels in order to minimize energy consumption while ensuring application quality. To improve the scalability of the centralized control scheme, we also propose a distributed control scheme, in which the sink sends the differences between the required and measured application qualities to the sensors, and the sensors update their modulation levels using local information and the feedback from the sink. Experimental results show the effectiveness of the control schemes in saving energy and guaranteeing QoS, and show that the schemes adapt efficiently to time-varying requirements on application quality. Copyright © 2005 The Institute of Electronics, Information and Communication Engineers.
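The energy-latency trade-off behind modulation scaling can be sketched with a stylised model: at a fixed symbol rate, raising the modulation level b (bits per symbol) shortens the packet but raises energy per symbol roughly like 2^b - 1 for QAM at a fixed error rate. The cost model and numbers below are illustrative assumptions, not the paper's.

```python
def tx_metrics(b, packet_bits, symbol_rate, c_energy=1.0):
    """Per-packet latency and a stylised energy cost for modulation level b
    (bits per symbol). Energy per symbol is modelled as growing like
    2**b - 1, while latency shrinks as 1/b."""
    symbols = packet_bits / b
    latency = symbols / symbol_rate
    energy = c_energy * symbols * (2 ** b - 1)
    return latency, energy

def min_energy_level(packet_bits, symbol_rate, max_latency, levels=range(2, 9)):
    """Pick the lowest-energy modulation level whose latency still meets
    the application's delay requirement."""
    feasible = [(tx_metrics(b, packet_bits, symbol_rate)[1], b)
                for b in levels
                if tx_metrics(b, packet_bits, symbol_rate)[0] <= max_latency]
    return min(feasible)[1] if feasible else None

# 1024-bit packets at 250 ksymbol/s under a 1.5 ms latency requirement.
best = min_energy_level(1024, 250_000, 1.5e-3)
```

Because per-packet energy is increasing in b under this model, the scheme always settles on the smallest modulation level that still meets the deadline, which is the essence of trading latency slack for energy savings.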