85 results for Inflation Indexed Swaps
Abstract:
Background. With diffusion-tensor imaging (DTI) it is possible to estimate the structural characteristics of fiber bundles in vivo. This study used DTI to infer damage to the corticospinal tract (CST) and related this parameter to (a) the level of residual motor ability at least 1 year poststroke and (b) the outcome of intensive motor rehabilitation with constraint-induced movement therapy (CIMT). Objective. To explore the role of CST damage in recovery and CIMT efficacy. Methods. Ten patients with low-functioning hemiparesis were scanned and tested at baseline, before and after CIMT. Lesion overlap with the CST was indexed as reduced anisotropy compared with a CST variability map derived from 26 controls. Residual motor ability was measured through the Wolf Motor Function Test (WMFT) and the Motor Activity Log (MAL) acquired at baseline. CIMT benefit was assessed through the pre- to post-treatment comparison of WMFT and MAL performance. Results. Lesion overlap with the CST correlated with residual motor ability at baseline, with greater deficits observed in patients with more extended CST damage. Infarct volume showed no systematic association with residual motor ability. CIMT led to significant improvements in motor function, but outcome was not associated with the extent of CST damage or infarct volume. Conclusion. The study gives in vivo support for the proposition that structural CST damage, not infarct volume, is a major predictor of residual functional ability in the chronic state. The results provide initial evidence for positive effects of CIMT in patients with varying, including more severe, CST damage.
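The central baseline analysis can be illustrated with a minimal sketch: a rank correlation between CST lesion overlap and baseline WMFT scores. The values and variable names below are hypothetical placeholders, not the study's data.

```python
# Minimal sketch: relate CST lesion overlap to residual motor ability at baseline.
# All values are hypothetical placeholders, not the study's data.
import numpy as np
from scipy.stats import spearmanr

cst_overlap = np.array([0.05, 0.12, 0.20, 0.33, 0.41, 0.48, 0.55, 0.61, 0.70, 0.82])  # fraction of CST voxels with reduced anisotropy
wmft_score = np.array([4.1, 3.8, 3.5, 3.1, 2.9, 2.6, 2.2, 2.0, 1.7, 1.4])             # baseline WMFT functional ability

rho, p = spearmanr(cst_overlap, wmft_score)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")  # greater CST damage, lower residual ability
```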
Abstract:
This paper presents evidence for several features of the population of chess players, and the distribution of their performances measured in terms of Elo ratings and by computer analysis of moves. Evidence that ratings have remained stable since the inception of the Elo system in the 1970s is given in several forms: by showing that the population of strong players fits a simple logistic-curve model without inflation, by plotting players’ average error against the FIDE category of tournaments over time, and by showing that the skill parameters of a model that employs computer analysis keep a nearly constant relation to Elo rating across that time. The distribution of the model’s Intrinsic Performance Ratings can hence be used to compare populations that have limited interaction, such as between players in a national chess federation and FIDE, and to ascertain relative drift in their respective rating systems.
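As a rough illustration of the logistic-curve argument, the sketch below fits a three-parameter logistic growth model to counts of strong rated players over time. The counts, years and rating threshold are invented placeholders, not the paper's data.

```python
# Minimal sketch: fit a logistic-growth curve to the count of strong rated players over time.
# Numbers below are illustrative placeholders, not the paper's data.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Logistic curve: carrying capacity K, growth rate r, midpoint year t0."""
    return K / (1.0 + np.exp(-r * (t - t0)))

years = np.array([1975, 1980, 1985, 1990, 1995, 2000, 2005, 2010], dtype=float)
players = np.array([80, 180, 400, 800, 1400, 2100, 2700, 3100], dtype=float)  # e.g. players rated 2500+

(K, r, t0), _ = curve_fit(logistic, years, players, p0=(4000, 0.2, 1995))
print(f"K = {K:.0f}, r = {r:.3f}, t0 = {t0:.1f}")
```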
Abstract:
Given a nonlinear model, a probabilistic forecast may be obtained by Monte Carlo simulations. At a given forecast horizon, Monte Carlo simulations yield sets of discrete forecasts, which can be converted to density forecasts. The resulting density forecasts will inevitably be degraded by model mis-specification. In order to enhance the quality of the density forecasts, one can mix them with the unconditional density. This paper examines the value of combining conditional density forecasts with the unconditional density. The findings have positive implications for issuing early warnings in different disciplines, including economics and meteorology; UK inflation forecasts are considered as an example.
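The mixing idea can be sketched as a two-component density combination evaluated by its log score at the realised outcome. The Gaussian densities, mixing weights and the inflation outcome below are illustrative assumptions, not the paper's specification.

```python
# Minimal sketch of combining a conditional density forecast with the unconditional density.
# Densities, weights and the outcome are illustrative assumptions.
import numpy as np
from scipy.stats import norm

def mixed_density(y, cond_mean, cond_sd, uncond_mean, uncond_sd, w):
    """Mixture: w * conditional density + (1 - w) * unconditional density."""
    return w * norm.pdf(y, cond_mean, cond_sd) + (1 - w) * norm.pdf(y, uncond_mean, uncond_sd)

outcome = 2.7          # hypothetical realised inflation, % per year
cond = (2.0, 0.4)      # model-based (possibly mis-specified) forecast: mean, sd
uncond = (2.5, 1.2)    # long-run unconditional density: mean, sd

for w in (1.0, 0.8, 0.5):
    score = np.log(mixed_density(outcome, *cond, *uncond, w))
    print(f"weight on conditional = {w:.1f}: log score = {score:.3f}")
```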
Abstract:
The rapid expansion of the TMT sector in the late 1990s and more recent growing regulatory and corporate focus on business continuity and security have raised the profile of data centres. Data centres offer a unique blend of occupational, physical and technological characteristics compared to conventional real estate assets. Limited trading and the heterogeneity of data centres also cause higher levels of appraisal uncertainty. In practice, the application of conventional discounted cash flow approaches requires information about a wide range of inputs that is difficult to derive from limited market signals or to estimate analytically. This paper proposes an approach that uses pricing signals from similar traded cash flows. Based upon ‘the law of one price’, the method draws upon the premise that two identical future cash flows must have the same value now. Given the difficulties of estimating exit values, an alternative is that the expected cash flows of a data centre are analysed over the life cycle of the building, with corporate bond yields used to provide a proxy for the appropriate discount rates for lease income. Since liabilities are quite diverse, a number of proxies are suggested as discount and capitalisation rates, including index-linked, fixed interest and zero-coupon bonds. Although there are rarely assets that have identical cash flows and some approximation is necessary, the level of appraiser subjectivity is dramatically reduced.
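In the spirit of the 'law of one price' argument, the sketch below discounts a fixed stream of lease income at a corporate bond yield of comparable term and credit quality. All figures are assumptions for illustration, not appraisal inputs from the paper.

```python
# Minimal sketch: value a data-centre lease income stream by discounting at a corporate
# bond yield as a proxy discount rate. Cash flows and yields are assumptions.
rent = 1_000_000          # annual lease income
term = 15                 # remaining lease term, years
bond_yield = 0.055        # yield on a corporate bond of comparable credit risk and maturity

present_value = sum(rent / (1 + bond_yield) ** t for t in range(1, term + 1))
print(f"PV of lease income: {present_value:,.0f}")
```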
Abstract:
This paper analyses the appraisal of a specialized form of real estate - data centres - that has a unique blend of locational, physical and technological characteristics that differentiates it from conventional real estate assets. Market immaturity, limited trading and a lack of pricing signals increase levels of appraisal uncertainty and disagreement relative to conventional real estate assets. Given the problems of applying standard discounted cash flow, an approach to appraisal is proposed that uses pricing signals from traded cash flows that are similar to the cash flows generated by data centres. Based upon ‘the law of one price’, it is assumed that two assets that are expected to generate identical cash flows in the future must have the same value now. It is suggested that the expected cash flow of such assets should be analysed over the life cycle of the building. Corporate bond yields are used to provide a proxy for the appropriate discount rates for lease income. Since liabilities are quite diverse, a number of proxies are suggested as discount and capitalisation rates, including index-linked, fixed interest and zero-coupon bonds.
Abstract:
Despite growing evidence on the neural bases of emotion regulation, little is known about the mechanisms underlying individual differences in cognitive regulation of negative emotion, and few studies have used objective measures to quantify regulatory success. Using a trait-like psychophysiological measure of emotion regulation, corrugator electromyography, we obtained an objective index of the ability to cognitively reappraise negative emotion in 56 healthy men (session 1), who returned 1.3 years later to perform the same regulation task using fMRI (session 2). Results indicated that the corrugator measure of regulatory skill predicted amygdala-prefrontal functional connectivity. Individuals with greater ability to down-regulate negative emotion as indexed by corrugator at session 1 showed not only greater amygdala attenuation but also greater inverse connectivity between the amygdala and several sectors of the prefrontal cortex while down-regulating negative emotion at session 2. Our results demonstrate that individual differences in emotion regulation are stable over time and underscore the important role of amygdala-prefrontal coupling for successful regulation of negative emotion.
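A minimal sketch of the kind of analysis described: functional connectivity taken as the correlation between amygdala and prefrontal time series for each subject, then related across subjects to an EMG-derived regulation index. All arrays below are simulated placeholders, not the study's data or pipeline.

```python
# Minimal sketch: amygdala-prefrontal functional connectivity as the correlation between
# region time series, related across subjects to a corrugator-based regulation index.
# All arrays are simulated placeholders.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_subjects, n_timepoints = 56, 200
amygdala = rng.standard_normal((n_subjects, n_timepoints))      # placeholder ROI time series
prefrontal = rng.standard_normal((n_subjects, n_timepoints))    # placeholder ROI time series
regulation_index = rng.standard_normal(n_subjects)              # corrugator-based skill (session 1)

connectivity = np.array([pearsonr(a, p)[0] for a, p in zip(amygdala, prefrontal)])
r, pval = pearsonr(regulation_index, connectivity)
print(f"r = {r:.2f}, p = {pval:.3f}")
```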
Abstract:
Ensemble clustering (EC) can arise in data assimilation with ensemble square root filters (EnSRFs) using non-linear models: an M-member ensemble splits into a single outlier and a cluster of M−1 members. The stochastic Ensemble Kalman Filter does not present this problem. Modifications of the EnSRFs by periodic resampling of the ensemble through random rotations have been proposed to address it. We introduce a metric to quantify the presence of EC and present evidence to dispel the notion that EC leads to filter failure. Starting from a univariate model, we show that EC is not a permanent but a transient phenomenon; it occurs intermittently in non-linear models. We perform a series of data assimilation experiments using a standard EnSRF and an EnSRF modified by resampling through random rotations. The modified EnSRF alleviates the issues associated with EC, but at the cost of the traceability of individual ensemble trajectories, and it cannot use some of the algorithms that enhance the performance of the standard EnSRF. In the non-linear regimes of low-dimensional models, the analysis root mean square error of the standard EnSRF slowly grows with ensemble size if the size is larger than the dimension of the model state. However, we do not observe this problem in a more complex model that uses an ensemble size much smaller than the dimension of the model state, along with inflation and localisation. Overall, we find that transient EC does not handicap the performance of the standard EnSRF.
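One simple way to quantify ensemble clustering for a scalar state variable (not necessarily the metric introduced in the paper) is to compare the largest gap in the sorted ensemble with the spread of the larger group on either side of that gap, as in the sketch below. The example ensembles are invented.

```python
# Minimal sketch of an illustrative ensemble-clustering (EC) diagnostic: the largest gap
# between sorted members relative to the spread of the larger remaining group.
import numpy as np

def ec_index(ensemble):
    """Return (largest inter-member gap) / (spread of the larger group) for a scalar ensemble."""
    x = np.sort(np.asarray(ensemble, dtype=float))
    gaps = np.diff(x)
    i = int(np.argmax(gaps))                       # largest gap splits the ensemble in two
    left, right = x[:i + 1], x[i + 1:]
    cluster = left if left.size >= right.size else right
    spread = cluster.std() if cluster.std() > 0 else 1e-12
    return gaps[i] / spread

clustered = [0.01, 0.02, 0.03, 0.02, 0.01, 0.02, 0.03, 0.02, 0.01, 2.5]  # one outlier + tight cluster
dispersed = np.linspace(-1.0, 1.0, 10)                                   # evenly spread members
print(f"clustered: {ec_index(clustered):.1f}, dispersed: {ec_index(dispersed):.2f}")
```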
Abstract:
PURPOSE: Since its introduction in 2006, messages posted to the microblogging system Twitter have provided a rich dataset for researchers, leading to the publication of over a thousand academic papers. This paper aims to identify this published work and to classify it in order to understand Twitter-based research. DESIGN/METHODOLOGY/APPROACH: Firstly, the papers on Twitter were identified. Secondly, following a review of the literature, a classification of the dimensions of microblogging research was established. Thirdly, papers were qualitatively classified using open-coded content analysis, based on each paper’s title and abstract, in order to analyze method, subject, and approach. FINDINGS: The majority of published work relating to Twitter concentrates on aspects of the messages sent and details of the users. A variety of methodological approaches are used across a range of identified domains. RESEARCH LIMITATIONS/IMPLICATIONS: This work reviewed the abstracts of all papers available via database search on the term “Twitter”, and this has two major implications: 1) the full papers are not considered, so works may be misclassified if their abstract is not clear; 2) publications not indexed by the databases, such as book chapters, are not included. ORIGINALITY/VALUE: To date there has not been an overarching study to look at the methods and purposes of those using Twitter as a research subject. Our major contribution is to scope out the papers published on Twitter until the close of 2011. The classification derived here will provide a framework within which researchers studying Twitter-related topics will be able to position and ground their work.
Abstract:
The present study addresses three methodological questions that have been ignored in previous research on EEG indices of the human mirror neuron system (hMNS), particularly in regard to autistic individuals. The first question regards how to elicit the EEG-indexed hMNS response during movement observation: Is hMNS activation best elicited using long stimulus presentations or multiple short repetitions? The second question regards which EEG sensorimotor frequency bands reflect sensorimotor reactivity during hand movement observation. The third question regards how widespread the EEG reactivity over the sensorimotor cortex is during movement observation. The present study explored sensorimotor alpha and low beta reactivity during observation of hand movement versus a static hand or bouncing balls, and compared two experimental protocols (long exposure vs. multiple repetitions) in the same participants. Results using the multiple repetitions protocol indicated greater low beta desynchronisation over the sensorimotor cortex during hand movement observation compared to static hand and bouncing balls observation. This result was not obtained using the long exposure protocol. Therefore, the present study suggests that the multiple repetitions protocol is the more robust protocol to use when exploring the sensorimotor reactivity induced by hand action observation. In addition, sensorimotor low beta desynchronisation was differentially modulated during hand movement, static hand and bouncing balls (non-biological motion) observation, whereas this was not the case for sensorimotor alpha, suggesting that low beta may be a more sensitive index of hMNS activation during biological motion observation. In conclusion, the present study indicates that low beta sensorimotor reactivity during hand movement observation is more widespread over the sensorimotor cortex than previously thought.
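The band-power contrast behind such reactivity measures can be sketched as event-related desynchronisation (ERD): the percentage power change from baseline in the alpha and low beta bands. The simulated signals, sampling rate and band edges below are illustrative assumptions.

```python
# Minimal sketch: event-related desynchronisation (ERD) as percentage power change from
# baseline in the sensorimotor alpha and low beta bands. Signals are simulated placeholders.
import numpy as np
from scipy.signal import welch

fs = 250
rng = np.random.default_rng(1)
baseline = rng.standard_normal(fs * 2)     # 2 s of rest EEG (placeholder)
observation = rng.standard_normal(fs * 2)  # 2 s during hand-movement observation (placeholder)

def band_power(signal, lo, hi):
    f, pxx = welch(signal, fs=fs, nperseg=fs)
    return pxx[(f >= lo) & (f <= hi)].mean()

for name, lo, hi in [("alpha", 8, 13), ("low beta", 13, 20)]:
    base, obs = band_power(baseline, lo, hi), band_power(observation, lo, hi)
    erd = 100 * (obs - base) / base        # negative values indicate desynchronisation
    print(f"{name}: ERD = {erd:.1f} %")
```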
Abstract:
Using simultaneous electroencephalography as a measure of ongoing activity and functional magnetic resonance imaging (fMRI) as a measure of the stimulus-driven neural response, we examined whether the amplitude and phase of occipital alpha oscillations at the onset of a brief visual stimulus affects the amplitude of the visually evoked fMRI response. When accounting for intrinsic coupling of alpha amplitude and occipital fMRI signal by modeling and subtracting pseudo-trials, no significant effect of prestimulus alpha amplitude on the evoked fMRI response could be demonstrated. Regarding the effect of alpha phase, we found that stimuli arriving at the peak of the alpha cycle yielded a lower blood oxygenation level-dependent (BOLD) fMRI response in early visual cortex (V1/V2) than stimuli presented at the trough of the cycle. Our results therefore show that phase of occipital alpha oscillations impacts the overall strength of a visually evoked response, as indexed by the BOLD signal. This observation complements existing evidence that alpha oscillations reflect periodic variations in cortical excitability and suggests that the phase of oscillations in postsynaptic potentials can serve as a mechanism of gain control for incoming neural activity. Finally, our findings provide a putative neural basis for observations of alpha phase dependence of visual perceptual performance.
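The phase-sorting step can be sketched as follows: band-pass the prestimulus EEG around the alpha band, take the analytic phase via the Hilbert transform at stimulus onset, and bin trials near the peak versus the trough of the cycle. The epoch length, band edges and bin widths are assumptions for illustration, and the data are simulated.

```python
# Minimal sketch: estimate occipital alpha phase at stimulus onset with a band-pass filter
# and the Hilbert transform, then sort trials into peak vs. trough bins. Data are simulated.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 500
rng = np.random.default_rng(2)
n_trials, n_samples = 100, fs              # 1 s epochs ending at stimulus onset
eeg = rng.standard_normal((n_trials, n_samples))

b, a = butter(4, [8 / (fs / 2), 12 / (fs / 2)], btype="band")
alpha = filtfilt(b, a, eeg, axis=1)
phase_at_onset = np.angle(hilbert(alpha, axis=1))[:, -1]

peak_trials = np.abs(phase_at_onset) < np.pi / 4                      # near 0 rad: alpha peak
trough_trials = np.abs(np.abs(phase_at_onset) - np.pi) < np.pi / 4    # near +/- pi: alpha trough
print(peak_trials.sum(), "peak trials,", trough_trials.sum(), "trough trials")
```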
Abstract:
Property ownership can tie up large amounts of capital and management energy that businesses could employ more productively elsewhere. Competitive pressures, accounting changes and increasingly sophisticated occupier requirements are building demand for new and innovative ways to satisfy corporate occupation needs. The investment climate is also changing. Falling interest rates and falling inflation can be expected to undermine returns from the traditional FRI lease. In future, investment returns will be more dependent on active and innovative management geared to the needs of the occupiers on whom income depends. Occupier and investor interests, therefore, look set to coincide, but unlocking the potential for both parties will depend on developing new finance and investment vehicles that align their respective needs. In the UK, examples include PFI in the public sector and off-balance-sheet financing in the private sector. In the USA, “synthetic lease” structures have also become popular. Growing investment market experience in assessing risks and returns suggests scope for further innovative arrangements in the corporate sector. But how can such arrangements be structured? What are the risks, drivers and barriers?
Abstract:
Background: Association mapping, initially developed in human disease genetics, is now being applied to plant species. The model species Arabidopsis provided some of the first examples of association mapping in plants, identifying previously cloned flowering time genes, despite high population sub-structure. More recently, association genetics has been applied to barley, where breeding activity has resulted in a high degree of population sub-structure. A major genotypic division within barley is that between winter- and spring-sown varieties, which differ in their requirement for vernalization to promote subsequent flowering. To date, all attempts to validate association genetics in barley by identifying major flowering time loci that control vernalization requirement (VRN-H1 and VRN-H2) have failed. Here, we validate the use of association genetics in barley by identifying VRN-H1 and VRN-H2, despite their prominent role in determining population sub-structure. Results: By taking barley as a typical inbreeding crop, and seasonal growth habit as a major partitioning phenotype, we develop an association mapping approach which successfully identifies VRN-H1 and VRN-H2, the underlying loci largely responsible for this agronomic division. We find that a combination of Structured Association followed by Genomic Control, applied to correct for population structure and inflation of the test statistic, resolved significant associations only with VRN-H1 and the VRN-H2 candidate genes, as well as with two genes closely linked to VRN-H1 (HvCSFs1 and HvPHYC). Conclusion: We show that, after employing appropriate statistical methods to correct for population sub-structure, the genome-wide partitioning effect of allelic status at VRN-H1 and VRN-H2 does not result in the high levels of spurious association expected to occur in highly structured samples. Furthermore, we demonstrate that both VRN-H1 and the candidate VRN-H2 genes can be identified using association mapping. Discrimination between intragenic VRN-H1 markers was achieved, indicating that candidate causative polymorphisms may be discerned and prioritised within a larger set of positive associations. This proof-of-concept study demonstrates the feasibility of association mapping in barley, even within highly structured populations. A major advantage of this method is that it does not require large numbers of genome-wide markers, and it is therefore suitable for fine mapping and candidate gene evaluation, especially in species for which large numbers of genetic markers are either unavailable or too costly.
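The Genomic Control step mentioned above can be sketched as estimating an inflation factor lambda from the median of the association test statistics and deflating them before computing p-values. The chi-square statistics below are simulated placeholders, not the barley data.

```python
# Minimal sketch of Genomic Control: estimate the inflation factor lambda from the median
# association chi-square statistic and deflate the test statistics accordingly.
# The statistics below are simulated placeholders.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(3)
chi2_stats = rng.chisquare(df=1, size=5000) * 1.3   # simulate inflation from population sub-structure

lam = np.median(chi2_stats) / chi2.ppf(0.5, df=1)   # chi2.ppf(0.5, 1) is approximately 0.456
corrected = chi2_stats / max(lam, 1.0)              # never inflate statistics upward
corrected_p = chi2.sf(corrected, df=1)
print(f"lambda = {lam:.2f}, smallest corrected p = {corrected_p.min():.2e}")
```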
Abstract:
This paper reviews the development of Greater Amman, Jordan, noting that the vast urban expansion that has occurred over the last fifty years has led to the desertification of rare fertile lands, following the fragmented and scattered territorial expansion of the city. The future scenario for planning in Greater Amman is analysed with respect to proposals outlined in the Metropolitan Growth Plan of 2008, which assumes rapid population growth from 2,200,000 persons in 2006 to approximately 6,500,000 by 2025. The concentration of more than 39 per cent of the national population of Jordan in Greater Amman threatens to transform the former distinct settlement pattern into a continuous urban zone, aggravating problems of infrastructural provision, water needs and agricultural land, and leaving unresolved problems of land inflation, poor urban standards and housing shortages. In conclusion, the environmental implications of the Amman Metropolitan Growth Plan are analysed, and it is suggested that an alternative approach is needed, based on clear principles of sustainable urban development.
Abstract:
The validity of the linguistic relativity principle continues to stimulate vigorous debate and research. The debate has recently shifted from the behavioural investigation arena to a more biologically grounded field, in which tangible physiological evidence for language effects on perception can be obtained. Using brain potentials in a colour oddball detection task with Greek and English speakers, a recent study suggests that language effects may exist at early stages of perceptual integration [Thierry, G., Athanasopoulos, P., Wiggett, A., Dering, B., & Kuipers, J. (2009). Unconscious effects of language-specific terminology on pre-attentive colour perception. Proceedings of the National Academy of Sciences, 106, 4567–4570]. In this paper, we test whether, in Greek speakers, exposure to a new cultural environment (the UK) whose colour terminology contrasts with that of their native language affects early perceptual processing, as indexed by an electrophysiological correlate of visual detection of colour luminance. We also report semantic mapping of native colour terms and colour similarity judgements. Results reveal convergence of linguistic descriptions, cognitive processing, and early perception of colour in bilinguals. This result demonstrates for the first time substantial plasticity in early, pre-attentive colour perception and has important implications for the mechanisms that are involved in perceptual changes during the processes of language learning and acculturation.
Abstract:
Background: Jargon aphasia is one of the most intractable forms of aphasia, with limited recommendations for ameliorating the associated naming difficulties and neologisms. The few naming therapy studies that exist in jargon aphasia have utilized either semantic or phonological approaches, but the results have been equivocal. Moreover, the effect of therapy on the characteristics of neologisms is less explored. Aims: This study investigates the effectiveness of a phonological naming therapy (i.e., phonological component analysis, PCA) on picture naming abilities and on quantitative and qualitative changes in neologisms for an individual with jargon aphasia (FF). Methods: FF showed evidence of jargon aphasia with severe naming difficulties and produced a very high proportion of neologisms. A single-subject multiple probe design across behaviors was employed to evaluate the effects of PCA therapy on naming accuracy for three sets of words. In therapy, a phonological components analysis chart was used to identify five phonological components (i.e., rhymes, first sound, first sound associate, final sound, number of syllables) for each target word. Generalization effects (change in percent accuracy and error pattern) were examined by comparing pre- and post-therapy responses on the Philadelphia Naming Test, and these responses were analyzed to explore the characteristics of the neologisms. The quantitative change in neologisms was measured by the change in the proportion of neologisms from pre- to post-therapy, and the qualitative change was indexed by the phonological overlap between target and neologism. Results: As a consequence of PCA therapy, FF showed a significant improvement in his ability to name the treated items. His performance in the maintenance and follow-up phases remained comparable to his performance during the therapy phases. Generalization to other naming tasks did not show a change in accuracy, but distinct differences in error pattern (an increase in the proportion of real word responses and a decrease in the proportion of neologisms) were observed. Notably, the decrease in neologisms occurred with a corresponding trend for an increase in the phonological similarity between the neologisms and the targets. Conclusions: This study demonstrated the effectiveness of a phonological therapy for improving naming abilities and reducing the amount of neologisms in an individual with severe jargon aphasia. The positive outcome of this research is encouraging, as it provides evidence for effective therapies for jargon aphasia and also emphasizes that the quality and quantity of errors may provide a sensitive outcome measure for determining therapy effectiveness, in particular for client groups who are difficult to treat.
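The qualitative measure described (phonological overlap between target and neologism) could be approximated as the proportion of target segments that also appear in the response, as in the minimal sketch below. The transcriptions and the scoring convention are illustrative assumptions, not the study's exact method.

```python
# Minimal sketch: phonological overlap as the proportion of target phonemes also produced
# in the response (order-insensitive). Transcriptions are illustrative placeholders.
def overlap(target, response):
    """Proportion of target segments that also occur in the response."""
    shared = sum(min(target.count(p), response.count(p)) for p in set(target))
    return shared / len(target)

# e.g. target 'camera' /kamera/ vs. hypothetical neologism /kanela/
print(f"overlap = {overlap(list('kamera'), list('kanela')):.2f}")
```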