Abstract:
Criteria for the L2-stability of linear and nonlinear time-varying feedback systems are given. These are conditions in the time domain involving the solution of certain associated matrix Riccati equations and permitting the use of a very general class of L2-operators as multipliers.
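As a minimal numerical illustration of the kind of object such criteria involve (restricted to the time-invariant special case; the matrices below are arbitrary choices, not from the paper), an associated algebraic Riccati equation can be solved as follows:

```python
# Sketch: solving a continuous-time algebraic Riccati equation of the kind
# such stability criteria involve. A, B, Q, R are illustrative choices only.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])   # hypothetical state matrix
B = np.array([[0.0],
              [1.0]])          # hypothetical input matrix
Q = np.eye(2)                  # state weighting (positive semidefinite)
R = np.array([[1.0]])          # input weighting (positive definite)

# Solve A'P + PA - P B R^{-1} B' P + Q = 0 for the stabilizing solution P.
P = solve_continuous_are(A, B, Q, R)

# A symmetric positive definite P yields a quadratic Lyapunov function
# V(x) = x'Px for the associated problem.
print("P =\n", P)
print("P positive definite:", np.all(np.linalg.eigvalsh(P) > 0))
```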
Abstract:
Continuous common-mode feedback (CMFB) circuits having high input impedance and low distortion are proposed. The proposed circuits are characterized for a 0.18 µm CMOS process with a 1.8 V supply. Simulation results indicate that the proposed common-mode detector consumes no standby power and that the CMFB circuit consumes 27-34% less power than previous high-swing CMFB circuits.
Abstract:
In this study, we derive a fast, novel time-domain algorithm to compute the nth-order moment of the power spectral density of the photoelectric current as measured in laser-Doppler flowmetry (LDF). It is well established in the LDF literature that these moments are closely related to fundamental physiological parameters, i.e. the concentration of moving erythrocytes and blood flow. In particular, we take advantage of the link between moments in the Fourier domain and fractional derivatives in the temporal domain. Using Parseval's theorem, we establish an exact analytical equivalence between the time-domain expression and the conventional frequency-domain counterpart. Moreover, we demonstrate the appropriateness of estimating the zeroth-, first- and second-order moments using Monte Carlo simulations. Finally, we briefly discuss the feasibility of implementing the proposed algorithm in hardware.
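The frequency/time equivalence the abstract builds on can be checked on a toy signal. The sketch below (not the paper's algorithm; the test signal and sampling setup are arbitrary choices) computes spectral moments in the frequency domain and verifies, via Parseval's theorem, that M0 matches the signal power and M2 matches the scaled power of the first derivative; M1 is the case that requires a half-order fractional derivative on the time side:

```python
# Frequency-domain spectral moments vs. time-domain counterparts (Parseval).
import numpy as np

fs, N = 8192.0, 8192                     # 1 s of data; bins land on integer Hz
t = np.arange(N) / fs
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

X = np.fft.rfft(x)
f = np.fft.rfftfreq(N, d=1.0 / fs)
psd = np.abs(X) ** 2 / (N * fs)          # one-sided periodogram
psd[1:-1] *= 2.0                         # fold in negative frequencies
df = fs / N

def moment(n):
    """nth-order moment of the PSD, computed in the frequency domain."""
    return np.sum(f ** n * psd) * df

# Time-domain counterparts via Parseval's theorem:
m0_time = np.mean(x ** 2)                      # signal power      ~ M0
dx = np.gradient(x, 1.0 / fs)                  # first derivative
m2_time = np.mean(dx ** 2) / (2 * np.pi) ** 2  # derivative power  ~ M2

print(f"M0: freq {moment(0):.4f}, time {m0_time:.4f}")   # both ~0.625
print(f"M1: freq {moment(1):.4f} (time side needs a half-order derivative)")
print(f"M2: freq {moment(2):.1f}, time {m2_time:.1f}")   # both ~3050
```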
Abstract:
In this article, the problem of two Unmanned Aerial Vehicles (UAVs) cooperatively searching an unknown region is addressed. The search region is discretized into hexagonal cells and each cell is assumed to possess an uncertainty value. The UAVs have to cooperatively search these cells taking limited endurance, sensor and communication range constraints into account. Due to limited endurance, the UAVs need to return to the base station for refuelling and also need to select a base station when multiple base stations are present. This article proposes a route planning algorithm that takes endurance time constraints into account and uses game-theoretical strategies to reduce the uncertainty. The route planning algorithm selects only those cells that ensure the agent will return to any one of the available bases. A set of paths is formed using these cells, from which the game-theoretical strategies select a path that yields maximum uncertainty reduction. We explore non-cooperative Nash, cooperative and security strategies from game theory to enhance the search effectiveness. Monte Carlo simulations are carried out which show the superiority of the game-theoretical strategies over a greedy strategy for different look-ahead step length paths. Within the game-theoretical strategies, the non-cooperative Nash and cooperative strategies perform similarly in an ideal case, but the Nash strategy performs better than the cooperative strategy when the perceived information is different. We also propose a heuristic based on partitioning of the search space into sectors to reduce computational overhead without performance degradation.
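A toy sketch of the endurance-constrained path selection idea (not the authors' algorithm; grid size, endurance, look-ahead and uncertainty values are all arbitrary choices): an agent on a hexagonal grid in axial coordinates enumerates fixed-length look-ahead paths, keeps only those from which it can still reach the base within its remaining endurance, and picks the path with the largest total uncertainty reduction:

```python
import itertools
import random

HEX_DIRS = [(1, 0), (-1, 0), (0, 1), (0, -1), (1, -1), (-1, 1)]

def hex_dist(a, b):
    """Hex distance between two cells in axial coordinates."""
    dq, dr = a[0] - b[0], a[1] - b[1]
    return (abs(dq) + abs(dr) + abs(dq + dr)) // 2

def paths(start, steps):
    """Enumerate all look-ahead paths of a given number of steps."""
    for moves in itertools.product(HEX_DIRS, repeat=steps):
        path, cell = [], start
        for dq, dr in moves:
            cell = (cell[0] + dq, cell[1] + dr)
            path.append(cell)
        yield path

random.seed(1)
uncertainty = {(q, r): random.random() for q in range(-5, 6) for r in range(-5, 6)}
base, agent, endurance, lookahead = (0, 0), (2, -1), 8, 3

best = max(
    (p for p in paths(agent, lookahead)
     # feasibility: path stays on the grid, and enough endurance remains
     # to fly the path and still return to the base
     if all(c in uncertainty for c in p)
     and lookahead + hex_dist(p[-1], base) <= endurance),
    key=lambda p: sum(uncertainty[c] for c in set(p)),
)
print("chosen path:", best)
```

The game-theoretical strategies in the paper replace the single-agent `max` above with joint path selection across the two UAVs.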
Abstract:
The purpose of this thesis is to examine the role of trade durations in price discovery. The motivation to use trade durations in the study of price discovery is that durations are robust to many microstructure effects that introduce a bias into the measurement of returns volatility. Another motivation is that it is difficult to think of economic variables that are genuinely useful in determining the source of volatility at arbitrarily high frequencies. The dissertation contains three essays. In the first essay, the role of trade durations in price discovery is examined with respect to the volatility pattern of stock returns. The theory on volatility is associated with the theory on the information content of trade, central to market microstructure theory. The first essay documents that volatility per transaction is related to the intensity of trade, and that the stochastic process of trade durations is strongly related to trading variables. In the second essay, the role of trade durations in price discovery is examined with respect to the quantification of risk due to a trading volume of a certain size. The theory on volume is intrinsically associated with the stock volatility pattern. The essay documents that volatility increases, in general, when traders choose to trade with large transactions. In the third essay, the role of trade durations in price discovery is examined with respect to the information content of a trade. The theory on the information content of a trade is associated with the theory on the rate of price revisions in the market. The essay documents that short durations are associated with information. Thus, traders are compensated for responding quickly to information.
Abstract:
Motivated by developments in spacecraft dynamics, the asymptotic behaviour and boundedness of solutions of a special class of time-varying systems, in which each term appears as the sum of a constant and a time-varying part, are analysed in this paper. It is not possible to apply standard textbook results to such systems, which are originally of second order. Some of the existing results are reformulated. Four theorems are developed which explore the relations between the asymptotic behaviour/boundedness of the constant-coefficient system, obtained by equating the time-varying terms to zero, and the corresponding behaviour of the time-varying system. The results show the behaviour of the two systems to be intimately related, provided the solutions of the constant-coefficient system approach zero or are bounded for large values of time, and the time-varying terms are suitably restrained. Two problems are tackled using these theorems.
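A small numerical illustration (not from the paper; all coefficient values are arbitrary choices) of the comparison the theorems formalize: a second-order system whose coefficients are each a constant plus a decaying time-varying part, simulated next to the frozen constant-coefficient system obtained by setting the time-varying terms to zero:

```python
import numpy as np
from scipy.integrate import solve_ivp

a0, b0 = 0.5, 4.0                        # constant parts (damping, stiffness)
eps = lambda t: 0.3 * np.exp(-0.2 * t)   # "suitably restrained" varying part

def time_varying(t, y):
    # x'' + (a0 + eps(t)) x' + (b0 + eps(t)) x = 0
    x, v = y
    return [v, -(a0 + eps(t)) * v - (b0 + eps(t)) * x]

def frozen(t, y):
    # x'' + a0 x' + b0 x = 0
    x, v = y
    return [v, -a0 * v - b0 * x]

t_span, y0 = (0.0, 40.0), [1.0, 0.0]
t_eval = np.linspace(*t_span, 400)
tv = solve_ivp(time_varying, t_span, y0, t_eval=t_eval)
cc = solve_ivp(frozen, t_span, y0, t_eval=t_eval)

# Both solutions decay and their late-time behaviour stays close, consistent
# with the relation between the two systems that the theorems establish.
print("final |x|, time-varying :", abs(tv.y[0, -1]))
print("final |x|, frozen       :", abs(cc.y[0, -1]))
```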
Abstract:
This paper presents real-time simulation models of electrical machines on an FPGA platform. Implementation of real-time numerical integration methods with digital logic elements is discussed, and several numerical integration methods are presented. A real-time simulation of a DC machine is carried out on this FPGA platform and important transient results are presented. These results are compared to simulation results obtained with commercial off-line simulation software.
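A minimal sketch of the kind of fixed-step model such platforms implement (the machine parameters below are illustrative, not from the paper): a separately excited DC machine integrated with forward Euler, the simplest of the numerical integration methods in question:

```python
R, L = 1.0, 0.02        # armature resistance [ohm], inductance [H]
J, B = 0.05, 0.002      # rotor inertia [kg m^2], viscous friction [N m s]
K = 0.5                 # torque / back-EMF constant
V, T_LOAD = 120.0, 5.0  # supply voltage [V], load torque [N m]
DT = 50e-6              # fixed time step, as a hardware loop would use

def step(i, w):
    """One forward-Euler step of the DC machine state (current, speed)."""
    di = (V - R * i - K * w) / L        # electrical equation
    dw = (K * i - B * w - T_LOAD) / J   # mechanical equation
    return i + DT * di, w + DT * dw

i = w = 0.0
for n in range(int(1.0 / DT)):          # simulate 1 s of machine time
    i, w = step(i, w)

print(f"steady state: current = {i:.2f} A, speed = {w:.2f} rad/s")
```

A fixed step and a multiply-accumulate update per state are what make such schemes natural to map onto digital logic elements.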
Abstract:
Since the emergence of service marketing, the focus of service research has evolved. Currently the focus of research is shifting towards value co-created by the customer. Consequently, value creation is increasingly less fixed to a specific time or location controlled by the service provider. However, present service management models, although acknowledging customer participation and accessibility, have not considered the role of the empowered customer who may perform the service at various locations and time frames. The present study expands this scope and provides a framework for exploring customer perceived value from a temporal and spatial perspective. The framework is used to understand and analyse customer perceived value and to explore customer value profiles. It is proposed that customer perceived value can be conceptualised as a function of technical, functional, temporal and spatial value dimensions. These dimensions are suggested to have value-increasing and value-decreasing facets. This conceptualisation is empirically explored in an online banking context and it is shown that time and location are more important value dimensions relative to the technical and functional dimensions. The findings demonstrate that time and location are important not only in terms of having the possibility to choose when and where the service is performed. Customers also value an efficient and optimised use of time and a private and customised service location. The study demonstrates that time and location are not external elements that form the service context, but service value dimensions, in addition to the technical and functional dimensions. This thesis contributes to existing service management research through its framework for understanding temporal and spatial dimensions of perceived value. Practical implications of the study are that time and location need to be considered as service design elements in order to differentiate the service from other services and create additional value for customers. Also, because of increased customer control and the importance of time and location, it is increasingly relevant for service providers to provide a facilitating arena for customers to create value, rather than trying to control the value creation process. Kristina Heinonen is associated with CERS, the Center for Relationship Marketing and Service Management at the Swedish School of Economics and Business Administration.
Abstract:
This study examined the effects of time units on the Greeks of options and on the trading results of delta hedging strategies, using three different time units, or option-pricing models: calendar time, trading time and continuous time using discrete approximation (CTDA) time. The CTDA time model is a pricing model that accounts for, among other things, intraday and weekend patterns in volatility. For the CTDA time model, some additional theta measures, believed to be usable in trading, were developed. The study appears to verify that the Greeks differ across time units, and that these differences influence the delta hedging of options or portfolios. Although it is difficult to say which of the time models is the most usable, as this depends largely on the trader's view of the passing of time, on market conditions and on the portfolio, the CTDA time model can be viewed as an attractive alternative.
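As an illustration of the basic effect (not the thesis's CTDA model; the option parameters are arbitrary examples), the sketch below computes Black-Scholes delta and theta for a call under two of the time units compared in the study, calendar time (365 days/year) and trading time (252 days/year); the same number of remaining days yields different Greeks under each convention:

```python
from math import exp, log, sqrt
from scipy.stats import norm

def bs_greeks(S, K, r, sigma, tau):
    """Black-Scholes call delta and theta for time to expiry tau (years)."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * tau) / (sigma * sqrt(tau))
    d2 = d1 - sigma * sqrt(tau)
    delta = norm.cdf(d1)
    theta = (-S * norm.pdf(d1) * sigma / (2 * sqrt(tau))
             - r * K * exp(-r * tau) * norm.cdf(d2))
    return delta, theta

S, K, r, sigma, days = 100.0, 100.0, 0.03, 0.25, 30

# The same 30 remaining days give a different tau, hence different Greeks,
# depending on the time unit used to measure the passing of time:
for name, days_per_year in [("calendar", 365.0), ("trading", 252.0)]:
    delta, theta = bs_greeks(S, K, r, sigma, days / days_per_year)
    print(f"{name:8s} time: delta = {delta:.4f}, theta = {theta:.2f}/year")
```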
Abstract:
The likelihood ratio test of cointegration rank is the most widely used test for cointegration. Many studies have shown that its finite-sample distribution is not well approximated by the limiting distribution. This article introduces bootstrap and fast double bootstrap (FDB) algorithms for the likelihood ratio test and evaluates them by Monte Carlo simulation experiments. It finds that the performance of the bootstrap test is very good. The more sophisticated FDB produces a further improvement in cases where the performance of the asymptotic test is very unsatisfactory and the ordinary bootstrap does not work as well as it might. Furthermore, the Monte Carlo simulations provide a number of guidelines on when the bootstrap and FDB tests can be expected to work well. Finally, the tests are applied to US interest rate and international stock price series. It is found that the asymptotic test tends to overestimate the cointegration rank, while the bootstrap and FDB tests choose the correct cointegration rank.
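A sketch of the ordinary residual bootstrap for the Johansen trace test under the null of cointegration rank zero, the simplest case (the article's algorithms, including the FDB, are more general; the simulated bivariate random-walk data are purely illustrative):

```python
import numpy as np
from statsmodels.tsa.vector_ar.vecm import coint_johansen

rng = np.random.default_rng(42)
T, B = 300, 199
y = np.cumsum(rng.standard_normal((T, 2)), axis=0)  # two independent I(1) series

def trace_stat(data):
    """Johansen trace statistic for the null of rank 0."""
    return coint_johansen(data, det_order=0, k_ar_diff=1).lr1[0]

stat_obs = trace_stat(y)

# Under rank 0 the differences follow a stationary VAR; fit a VAR(1) by OLS.
dy = np.diff(y, axis=0)
Xlag = np.hstack([np.ones((len(dy) - 1, 1)), dy[:-1]])
coef, *_ = np.linalg.lstsq(Xlag, dy[1:], rcond=None)
resid = dy[1:] - Xlag @ coef

count = 0
for _ in range(B):
    e = resid[rng.integers(0, len(resid), size=len(resid))]  # resample rows
    dy_b = np.zeros_like(dy)
    dy_b[0] = dy[0]
    for t in range(1, len(dy)):
        dy_b[t] = coef[0] + dy_b[t - 1] @ coef[1:] + e[t - 1]
    y_b = np.vstack([y[0], y[0] + np.cumsum(dy_b, axis=0)])  # rebuild levels
    count += trace_stat(y_b) >= stat_obs

print(f"trace stat = {stat_obs:.2f}, bootstrap p-value = {(1 + count) / (B + 1):.3f}")
```

The FDB refines this by generating a second bootstrap level to correct the p-value of the first.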
Abstract:
The low predictive power of implied volatility in forecasting subsequently realized volatility is a well-documented empirical puzzle. As suggested by e.g. Feinstein (1989), Jackwerth and Rubinstein (1996), and Bates (1997), we test whether unrealized expectations of jumps in volatility could explain this phenomenon. Our findings show that expectations of infrequently occurring jumps in volatility are indeed priced in implied volatility. This has two important consequences. First, implied volatility is expected to exceed realized volatility over long periods of time, only to fall far below realized volatility during infrequently occurring periods of very high volatility. Second, the slope coefficient in the classic forecasting regression of realized volatility on implied volatility is very sensitive to the discrepancy between ex ante expected and ex post realized jump frequencies. If the in-sample frequency of positive volatility jumps is lower than the market assessed ex ante, the classic regression test tends to reject the hypothesis of informational efficiency even if markets are informationally efficient.
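For reference, the classic forecasting regression referred to here is, in standard notation (ours, not the paper's):

```latex
% Classic informational-efficiency regression of realized on implied
% volatility; unbiasedness corresponds to a = 0 and b = 1.
\sigma^{\mathrm{realized}}_{t} = a + b\,\sigma^{\mathrm{implied}}_{t} + \varepsilon_{t}
```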
Abstract:
This paper investigates persistence patterns on the Helsinki Exchanges, analysed using both a time and a price approach. It is hypothesized that arrival times are related to movements in prices. Thus, the arrival times are treated as durations and formulated as an Autoregressive Conditional Duration (ACD) model, as in Engle and Russell (1998). The prices are defined as price changes and formulated as a GARCH process including duration measures. The research question follows from market microstructure predictions about price intensities, defined as the time between price changes. Microstructure theory states that long transaction durations may be associated with both no news and bad news. Accordingly, short durations would be related to high volatility and long durations to low volatility. As a result, the spread will tend to be larger during intensive trading. The main findings of this study are that 1) arrival times are positively autocorrelated and 2) long durations are associated with low volatility in the market.
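A minimal simulation sketch of the Engle-Russell ACD(1,1) model used here (parameter values are illustrative, not estimates from the paper): expected duration psi_i = omega + alpha * x_{i-1} + beta * psi_{i-1}, with x_i = psi_i * eps_i and unit-mean exponential innovations; the positive duration autocorrelation of finding 1 shows up in the simulated series:

```python
import numpy as np

rng = np.random.default_rng(3)
omega, alpha, beta = 0.1, 0.1, 0.8    # persistence: alpha + beta < 1
n = 5000

x = np.empty(n)
psi = omega / (1 - alpha - beta)      # start at the unconditional mean
for i in range(n):
    x[i] = psi * rng.exponential(1.0)           # duration i
    psi = omega + alpha * x[i] + beta * psi     # update conditional mean

# First-order autocorrelation of durations:
rho1 = np.corrcoef(x[:-1], x[1:])[0, 1]
print(f"mean duration = {x.mean():.3f}, lag-1 autocorrelation = {rho1:.3f}")
```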
Abstract:
In this two-part series of papers, a generalized non-orthogonal amplify and forward (GNAF) protocol, which generalizes several known cooperative diversity protocols, is proposed. Transmission in the GNAF protocol comprises two phases: the broadcast phase and the cooperation phase. In the broadcast phase, the source broadcasts its information to the relays as well as to the destination. In the cooperation phase, the source and the relays together transmit a space-time code in a distributed fashion. The GNAF protocol relaxes the constraints imposed by the protocol of Jing and Hassibi on the code structure. In Part I of this paper, code design criteria are obtained and it is shown that the GNAF protocol is both delay efficient and coding-gain efficient. Moreover, the GNAF protocol enables the use of sphere decoders at the destination with non-exponential maximum-likelihood (ML) decoding complexity. In Part II, several low-decoding-complexity code constructions are studied and a lower bound on the diversity-multiplexing gain tradeoff of the GNAF protocol is obtained.
Abstract:
We address the problem of distributed space-time coding with reduced decoding complexity for wireless relay networks. The transmission protocol follows a two-hop model wherein the source transmits a vector in the first hop and, in the second hop, the relays transmit a vector that is a transformation of the received vector by a relay-specific unitary transformation. Design criteria are derived for this system model and codes are proposed that achieve full diversity. For a fixed number of relay nodes, the general system model considered in this paper admits code constructions with lower decoding complexity compared to codes based on some earlier system models.
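A toy sketch of the two-hop system model described above (not the paper's code designs; vector length, relay count, fading model and noise levels are arbitrary choices): the source sends a vector, each relay applies its own unitary matrix to what it receives, and the destination observes the superposition:

```python
import numpy as np

rng = np.random.default_rng(0)
T, R = 4, 3                          # symbols per vector, number of relays

def rand_unitary(n):
    """Random unitary matrix via QR decomposition of a complex Gaussian."""
    q, r = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
    return q * (np.diag(r) / np.abs(np.diag(r)))  # fix column phases

s = (rng.integers(0, 2, T) * 2 - 1).astype(complex)   # BPSK source vector
A = [rand_unitary(T) for _ in range(R)]               # relay-specific unitaries

# Hop 1: each relay receives a faded, noisy copy of the source vector.
h1 = (rng.standard_normal(R) + 1j * rng.standard_normal(R)) / np.sqrt(2)
received = [h1[r] * s + 0.1 * (rng.standard_normal(T) + 1j * rng.standard_normal(T))
            for r in range(R)]

# Hop 2: relays forward unitary transformations of their received vectors;
# the destination observes the superposition plus noise.
h2 = (rng.standard_normal(R) + 1j * rng.standard_normal(R)) / np.sqrt(2)
y = sum(h2[r] * (A[r] @ received[r]) for r in range(R))
y += 0.1 * (rng.standard_normal(T) + 1j * rng.standard_normal(T))
print("destination observation:", np.round(y, 2))
```

The design problem the paper addresses is choosing the source codebook and the unitaries so that the effective code seen at the destination achieves full diversity while keeping decoding complexity low.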