913 results for Timing
Abstract:
Oxygen isotope records of stalagmites from China and Oman reveal a weak summer monsoon event, with a double-plunging structure, that started 8.21 +/- 0.02 kyr B.P. An identical but antiphased pattern is also evident in two stalagmite records from eastern Brazil, indicating that the South American Summer Monsoon was intensified during the 8.2 kyr B.P. event. These records demonstrate that the event was of global extent and synchronous within dating errors of <50 years. In comparison with recent model simulations, it is plausible that the 8.2 kyr B.P. event can be tied to changes in the Atlantic Meridional Overturning Circulation triggered by a glacial lake draining event. This, in turn, affected North Atlantic climate and the latitudinal position of the Intertropical Convergence Zone, resulting in the observed low-latitude monsoonal precipitation patterns.
Abstract:
BACKGROUND AND OBJECTIVE: A large proportion of people who have suffered a stroke report unmet needs for rehabilitation. The purpose of this study was to explore aspects of rehabilitation provision that potentially contribute to self-reported met needs for rehabilitation 12 months after stroke, with consideration also of stroke severity. METHODS: The participants (n = 173) received care at the stroke units at the Karolinska University Hospital, Sweden. Using a questionnaire, the dependent variable, self-reported met needs for rehabilitation, was collected at 12 months after stroke. The independent variables were based on data retrieved from registers and structured according to four aspects of rehabilitation provision: amount of rehabilitation, service level (day care rehabilitation, primary care rehabilitation and home-based rehabilitation), operator level (physiotherapist, occupational therapist, speech therapist) and time after stroke onset. Multivariate logistic regression analyses regarding the aspects of rehabilitation were performed for the participants, who were divided into three groups based on stroke severity at onset. RESULTS: Participants with moderate/severe stroke who had seen a physiotherapist at least once during each of the 1st, 2nd and 3rd-4th quarters of the first year (OR 8.36, CI 1.40-49.88, P = 0.020) were more likely to report met rehabilitation needs. CONCLUSION: For people with moderate/severe stroke, continuity in rehabilitation (preferably physiotherapy) during the first year after stroke seems to be associated with self-reported met needs for rehabilitation.
Abstract:
OBJECTIVES: To develop a method for objective assessment of fine motor timing variability in Parkinson’s disease (PD) patients, using digital spiral data gathered by a touch screen device. BACKGROUND: A retrospective analysis was conducted on data from 105 subjects: 65 patients with advanced PD (group A), 15 intermediate patients experiencing motor fluctuations (group I), 15 early-stage patients (group S), and 10 healthy elderly subjects (HE). The subjects were asked to perform repeated upper limb motor tasks by tracing a pre-drawn Archimedes spiral as shown on the screen of the device. The spiral tracing test was performed with an ergonomic pen stylus, using the dominant hand. The test was repeated three times per test occasion and the subjects were instructed to complete it within 10 seconds. Digital spiral data including stylus position (x-y coordinates) and timestamps (milliseconds) were collected and used in subsequent analysis. The total numbers of observations with the test battery were as follows: Swedish group (n=10079), Italian I group (n=822), Italian S group (n=811), and HE (n=299). METHODS: The raw spiral data were processed with three data processing methods. To quantify motor timing variability during the spiral drawing tasks, the Approximate Entropy (APEN) method was applied to the digitized spiral data. APEN is designed to capture the amount of irregularity or complexity in a time series. APEN requires the determination of two parameters, namely the window size and the similarity measure. In our work, after experimentation, the window size was set to 4 and the similarity measure to 0.2 (20% of the standard deviation of the time series). The final score obtained by APEN was normalized by the total drawing completion time and used in subsequent analysis. The score generated by this method is henceforth denoted APEN. In addition, two more methods were applied to the digital spiral data and their scores were used in subsequent analysis. The first method was based on Digital Wavelet Transform and Principal Component Analysis and generated a score representing spiral drawing impairment; this score is henceforth denoted WAV. The second method was based on the standard deviation of the frequency-filtered drawing velocity; this score is henceforth denoted SDDV. Linear mixed-effects (LME) models were used to evaluate mean differences of the spiral scores of the three methods across the four subject groups. Test-retest reliability of the three scores was assessed by taking the mean of the three possible correlations (Spearman’s rank coefficients) between the three test trials. Internal consistency of the methods was assessed by calculating correlations between their scores. RESULTS: When comparing mean spiral scores between the four subject groups, the APEN scores differed between HE subjects and the three patient groups (P=0.626 for the S group with a 9.9% mean value difference, P=0.089 for the I group with 30.2%, and P=0.0019 for the A group with 44.1%). However, there were no significant differences in the mean scores of the other two methods, except for WAV between the HE and A groups (P<0.001). WAV and SDDV were highly and significantly correlated with each other, with a coefficient of 0.69. However, APEN was correlated with neither WAV nor SDDV, with coefficients of 0.11 and 0.12, respectively. Test-retest reliability coefficients of the three scores were as follows: APEN (0.9), WAV (0.83) and SDDV (0.55).
CONCLUSIONS: The results show that the digital spiral analysis-based objective APEN measure is able to significantly differentiate healthy subjects from patients at an advanced stage. In contrast to the other two methods (WAV and SDDV), which are designed to quantify dyskinesias (over-medication), this method can be useful for characterizing Off symptoms in PD. The APEN score was not correlated with either of the other two methods, indicating that it measures a different construct of upper limb motor function in PD patients than WAV and SDDV. APEN also had better test-retest reliability, indicating that it is more stable and consistent over time than WAV and SDDV.
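For readers unfamiliar with APEN, the sketch below is a minimal illustration of the standard approximate-entropy statistic with the parameter choices reported above (window size 4, similarity measure 0.2 of the series' standard deviation). The input series, the synthetic data, and the normalization by completion time are illustrative assumptions only; the study's actual preprocessing of the spiral data is not reproduced here.

```python
import numpy as np

def approximate_entropy(u, m=4, r_factor=0.2):
    """Classic approximate entropy of a 1-D series u with window size m and
    tolerance r = r_factor * std(u)."""
    u = np.asarray(u, dtype=float)
    n = len(u)
    r = r_factor * np.std(u)

    def phi(m):
        # Embed the series into overlapping windows of length m.
        x = np.array([u[i:i + m] for i in range(n - m + 1)])
        # Chebyshev (max-coordinate) distance between every pair of windows.
        d = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=2)
        # Fraction of windows within tolerance r of each window (self-matches included).
        c = np.mean(d <= r, axis=1)
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

# Hypothetical usage on per-sample drawing times derived from the stylus timestamps;
# normalizing by total completion time mirrors (but does not reproduce) the study's scoring.
timestamps_ms = np.cumsum(np.random.randint(5, 25, size=300))  # fake 300-sample trace
intervals = np.diff(timestamps_ms)                             # time per sample
apen_score = approximate_entropy(intervals, m=4, r_factor=0.2) / (timestamps_ms[-1] / 1000.0)
print(round(apen_score, 4))
```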
Abstract:
The recent advances in CMOS technology have allowed for the fabrication of transistors with submicronic dimensions, making possible the integration of tens of millions of devices in a single chip that can be used to build very complex electronic systems. Such an increase in design complexity has created a need for more efficient verification tools that can incorporate more appropriate physical and computational models. Timing verification aims at determining whether the timing constraints imposed on the design can be satisfied. It can be performed by circuit simulation or by timing analysis. Although simulation tends to furnish the most accurate estimates, it presents the drawback of being stimulus-dependent. Hence, in order to ensure that the critical situation is taken into account, one must exercise all possible input patterns. Obviously, this cannot be accomplished due to the high complexity of current designs. To circumvent this problem, designers must rely on timing analysis. Timing analysis is an input-independent verification approach that models each combinational block of a circuit as a directed acyclic graph, which is used to estimate the critical delay. The first timing analysis tools used only circuit topology information to estimate circuit delay, and are thus referred to as topological timing analyzers. However, such a method may result in overly pessimistic delay estimates, since the longest paths in the graph may not be able to propagate a transition, that is, they may be false. Functional timing analysis, in turn, considers not only circuit topology but also the temporal and functional relations between circuit elements. Functional timing analysis tools may differ in three aspects: the set of sensitization conditions necessary to declare a path sensitizable (i.e., the so-called path sensitization criterion), the number of paths handled simultaneously, and the method used to determine whether the sensitization conditions are satisfiable. Currently, the two most efficient approaches test the sensitizability of entire sets of paths at a time: one is based on automatic test pattern generation (ATPG) techniques and the other translates the timing analysis problem into a satisfiability (SAT) problem. Although timing analysis has been exhaustively studied in the last fifteen years, some specific topics have not yet received the required attention. One such topic is the applicability of functional timing analysis to circuits containing complex gates. This is the basic concern of this thesis. In addition, and as a necessary step to establish the context, a detailed and systematic study of functional timing analysis is also presented.
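As a minimal sketch of the topological approach described above (and of the pessimism it introduces by ignoring false paths), the code below computes the critical delay as the longest path in a directed acyclic graph. The netlist format, gate names, and delays are invented for illustration and are not taken from the thesis.

```python
def topological_critical_delay(gates, primary_inputs):
    """Minimal topological timing analysis: the critical delay is the length of the
    longest path in the combinational DAG, obtained by propagating arrival times.
    `gates` maps a gate name to (gate_delay, list_of_fanin_names)."""
    arrival = {pi: 0.0 for pi in primary_inputs}  # primary inputs arrive at t = 0

    def arrival_time(node):
        # Memoized recursion: arrival at a gate output = gate delay + latest fanin arrival.
        if node not in arrival:
            delay, fanins = gates[node]
            arrival[node] = delay + max(arrival_time(f) for f in fanins)
        return arrival[node]

    return max(arrival_time(g) for g in gates)

# Hypothetical two-level netlist with gate delays in nanoseconds.
gates = {
    "g1": (1.0, ["a", "b"]),
    "g2": (2.0, ["b", "c"]),
    "out": (1.5, ["g1", "g2"]),
}
print(topological_critical_delay(gates, ["a", "b", "c"]))  # 3.5, via the path b -> g2 -> out
```

Note that this topological estimate reports 3.5 ns even if the path b -> g2 -> out can never propagate a transition; deciding that question is exactly what the functional (ATPG- or SAT-based) approaches address.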
Abstract:
The market timing performance of mutual funds is usually evaluated with linear models with dummy variables that allow the CAPM beta coefficient to vary across two regimes: bullish and bearish market excess returns. Managers, however, use their predictions of the state of nature, rather than the observed one, to define whether to carry low- or high-beta portfolios. Our approach here is to take this into account and model market timing as a switching regime, in a way similar to Hamilton's Markov-switching GNP model. We then build a measure of market timing success and apply it to simulated and real-world data.
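The following is a toy simulation of the setting, not the paper's model: it generates a two-state Markov chain of market regimes, lets a manager with an assumed level of predictive skill switch between a low and a high beta, and computes a simple covariance-based measure of timing success. The transition probabilities, skill level, betas, return parameters, and the measure itself are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-state Markov chain for the market regime: 0 = bearish, 1 = bullish.
P = np.array([[0.8, 0.2],   # assumed transition probabilities
              [0.3, 0.7]])
T = 1000
state = np.zeros(T, dtype=int)
for t in range(1, T):
    state[t] = rng.choice(2, p=P[state[t - 1]])

# Market excess returns: negative mean in bear regimes, positive in bull regimes.
mkt = np.where(state == 1, 0.01, -0.01) + 0.03 * rng.standard_normal(T)

# A manager predicts the coming period's regime with some skill and switches beta accordingly.
skill = 0.7                                # assumed probability the prediction is correct
predicted = np.where(rng.random(T) < skill, state, 1 - state)
beta = np.where(predicted == 1, 1.5, 0.5)  # high beta when a bull regime is predicted

# A simple timing-success measure: covariance between the chosen beta and the realized
# market excess return (positive when beta tends to be high in high-return periods).
timing_measure = np.cov(beta, mkt)[0, 1]
print(f"timing success (cov of beta with market return): {timing_measure:.5f}")
```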
Abstract:
This work aims to verify the existence of market timing in the Brazilian stock market. Our study is divided into three distinct analyses. First, we verify the presence of market timing in IPOs and then extend the analysis to seasoned equity offerings (OSAs). Finally, we examine the persistence of the effects of market timing on firms' capital structure. The results show that Brazilian firms tend to issue more equity when the market is hot. These issues take place through IPOs and OSAs and alter the capital structure of these firms. Over time, this change in capital structure tends to diminish.
Abstract:
This work aims to study and evaluate techniques for accelerating functional timing analysis (FTA) algorithms based on automatic test pattern generation (ATPG). To this end, three well-known algorithms are covered: the D-algorithm, PODEM, and FAN. After analyzing these algorithms and studying some acceleration techniques, the DETA algorithm (Delay Enumeration-Based Timing Analysis) is proposed, which determines the critical delay of circuits containing complex gates. DETA is defined as an ATPG-based algorithm with concurrent path sensitization. In the implementation of the algorithm, it was possible to validate the delay computation model for circuits containing complex gates using the implicit macro-expansion approach. In addition, some partial results demonstrate that, for some circuits, DETA shows only a small dependence on the number of inputs when compared with the dependence observed in the simulation procedure. In this way, it is possible to avoid an extensive search before finding a test and thus to succeed in applying acceleration methods to the algorithm.
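A minimal, hypothetical illustration of the macro-expansion idea mentioned above: a complex AOI21 gate is expanded into primitive gates only for delay computation, giving each input pin a pin-to-output delay. The primitive delays and the particular expansion below are assumptions for illustration, not the thesis's delay model.

```python
# Toy macro-expansion of an AOI21 gate, y = NOT((a AND b) OR c): inputs a and b feed
# an AND gate whose output feeds a NOR gate together with c. Primitive delays are assumed.
AND_DELAY, NOR_DELAY = 1.0, 1.2  # hypothetical primitive-gate delays in nanoseconds

# Pin-to-output delay of the complex gate = sum of primitive delays along the expansion path.
pin_to_output_delay = {
    "a": AND_DELAY + NOR_DELAY,  # a -> AND -> NOR -> y
    "b": AND_DELAY + NOR_DELAY,  # b -> AND -> NOR -> y
    "c": NOR_DELAY,              # c -> NOR -> y
}

def complex_gate_output_arrival(input_arrival_times):
    """Arrival time at the AOI21 output given arrival times at its input pins."""
    return max(t + pin_to_output_delay[pin] for pin, t in input_arrival_times.items())

print(complex_gate_output_arrival({"a": 0.0, "b": 0.5, "c": 1.0}))  # 2.7 ns, via pin b
```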
Abstract:
I examine the effects of uncertainty about the timing of deals (i.e., temporary price cuts or sales) on consumer behavior in a dynamic inventory model of consumer choice. I derive implications for purchase behavior and test them empirically, using two years of scanner data for soft drinks. I find that loyal consumers' decisions, both about the allocation of their purchases over time and about the quantity to be purchased in a particular deal, are affected by the uncertainty about the timing of the deal for the product. Loyal consumers buy a higher fraction of their overall purchases during deals as the uncertainty decreases. This effect increases with an increase in the product's share of a given consumer's purchases in the same category, or if the consumer stockpiles (i.e., is a shopper). During a particular deal, loyal shoppers increase the quantity they purchase the more time has passed since the previous deal and the higher the uncertainty about the deals' timing. For non-loyal consumers these effects are not significant. These results hold for products that are frequently purchased, like soft drinks and yogurt, but do not hold for less frequently purchased products, such as laundry detergents. The findings suggest that manufacturers and retailers should incorporate the effects of deals' timing on consumers' purchase decisions when deriving optimal pricing strategies.
Abstract:
This paper examines the relevance of market timing as a motive for initial public offerings (IPOs) by comparing IPOs of firms that are members of Japanese keiretsu industrial groups with IPOs of independent Japanese firms. We argue that Japanese keiretsu-linked IPOs form a favorable sample for finding evidence of the market timing motive. Instead, the data provide strong evidence for a restructuring motive and little evidence for market timing. We find that long-run returns to keiretsu and independent IPOs are not negative, contrary to U.S. evidence, and are indistinguishable from each other; that initial returns to keiretsu-linked IPOs are significantly higher than those to independent firms; and that a significant number of keiretsu IPO firms adjust their linkages with the group following the IPO, with both increases and decreases.
Abstract:
This paper proposes a simple macroeconomic model with staggered investment decisions. The model captures the dynamic coordination problem arising from demand externalities and fixed costs of investment. In times of low economic activity, a firm faces low demand and hence has less incentive to invest, which reinforces firms' expectations of low demand. In the unique equilibrium of the model, demand expectations are pinned down by fundamentals and history. Owing to the beliefs that arise in equilibrium, there is no special reason for stimulus at times of low economic activity.