131 results for Stochastic convergence
Convergence or divergence of contingent employment practices? Evidence of the role of MNCs in Europe
Abstract:
Purpose. This study considered whether vergence drives accommodation or accommodation drives vergence during the control of distance exotropia for near fixation. High accommodative convergence to accommodation (AC/A) ratios are often used to explain this control, but the role of convergence in driving accommodation (the CA/C relationship) is rarely considered. Atypical CA/C characteristics could equally, or better, explain common clinical findings. Methods. Nineteen distance exotropes aged 4-11 years, tested while controlling their deviation, were compared with 27 non-exotropic controls aged 5-9 years. Simultaneous vergence and accommodation responses were measured to a range of targets incorporating different combinations of blur, disparity and looming cues at four fixation distances between 2 m and 33 cm. Stimulus and response AC/A and CA/C ratios were calculated. Results. Accommodation responses for near targets (p=0.017) and response gains (p=0.026) were greater in the exotropes than in the controls. Despite higher clinical stimulus AC/A ratios, the distance exotropes showed lower laboratory response AC/A ratios (p=0.02), but significantly higher CA/C ratios (p=0.02). All the exotropes, whether the angle changed most with lenses (“controlled by accommodation”) or on occlusion (“controlled by fusion”), used binocular disparity, not blur, as their main cue to target distance. Conclusions. Increased vergence demand to control intermittent distance exotropia for near fixation also drives significantly more accommodation. Minus lens therapy is more likely to act by correcting over-accommodation driven by controlling convergence than by inducing blur-driven vergence. The use of convergence as a major drive to accommodation explains many clinical characteristics of distance exotropia, including apparently high near stimulus AC/A ratios.
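The response AC/A and CA/C ratios described above are, in essence, least-squares slopes of one measured response against the other across the four fixation distances. A minimal sketch of that computation, using entirely hypothetical response values (the real data are not reported per subject in the abstract):

```python
import numpy as np

def response_ratio(driven, driver):
    """Least-squares slope of the driven response against the driving one."""
    slope, _intercept = np.polyfit(driver, driven, 1)
    return slope

# Hypothetical responses at four fixation distances (2 m .. 33 cm):
accommodation = np.array([0.4, 0.9, 1.8, 2.7])  # dioptres (D)
vergence = np.array([0.5, 1.0, 2.0, 3.0])       # metre angles (MA)

resp_ac_a = response_ratio(vergence, accommodation)  # response AC/A, MA per D
resp_ca_c = response_ratio(accommodation, vergence)  # response CA/C, D per MA
```

With real data the interesting clinical pattern is the one the study reports: exotropes showing a comparatively low response AC/A but a high CA/C.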
Abstract:
European economic and political integration have been recognised as having implications for patterns of performance in national real estate and capital markets and have generated a wide body of research and commentary. In 1999, progress towards monetary integration within the European Union culminated in the introduction of a common currency and monetary policy. This paper investigates the effects of this ‘event’ on the behaviour of stock returns in European real estate companies. A range of statistical tests is applied to the performance of European property companies to test for changes in segmentation, co-movement and causality. The results suggest that, relative to the wider equity markets, the dispersion of performance is higher, correlations are lower, a common contemporaneous factor has much lower explanatory power whilst lead-lag relationships are stronger. Consequently, the evidence of transmission of monetary integration to real estate securities is less noticeable than to general securities. Less and slower integration is attributed to the relatively small size of the real estate securities market and the local and national nature of the majority of the companies’ portfolios.
Abstract:
Many numerical models for weather prediction and climate studies are run at resolutions that are too coarse to resolve convection explicitly, but too fine to justify the local equilibrium assumed by conventional convective parameterizations. The Plant-Craig (PC) stochastic convective parameterization scheme, developed in this paper, solves this problem by removing the assumption that a given grid-scale situation must always produce the same sub-grid-scale convective response. Instead, for each timestep and gridpoint, one of the many possible convective responses consistent with the large-scale situation is randomly selected. The scheme requires as input the large-scale state as opposed to the instantaneous grid-scale state, but must nonetheless be able to account for genuine variations in the large-scale situation. Here we investigate the behaviour of the PC scheme in three-dimensional simulations of radiative-convective equilibrium, demonstrating in particular that the necessary space-time averaging required to produce a good representation of the input large-scale state is not in conflict with the requirement to capture large-scale variations. The resulting equilibrium profiles agree well with those obtained from established deterministic schemes, and with corresponding cloud-resolving model simulations. Unlike the conventional schemes the statistics for mass flux and rainfall variability from the PC scheme also agree well with relevant theory and vary appropriately with spatial scale. The scheme is further shown to adapt automatically to changes in grid length and in forcing strength.
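The random selection at the heart of such a scheme can be sketched as a compound draw: the number of convective plumes fluctuates about the value implied by the large-scale closure, and each plume's cloud-base mass flux is itself random. A minimal illustration in that spirit (the Poisson/exponential choices follow the Plant-Craig construction, but the numerical values here are purely illustrative, not the scheme's actual parameters):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_convective_response(mean_total_flux, mean_plume_flux=2e7, rng=rng):
    """One realisation of the sub-grid convective ensemble.

    The plume count is Poisson-distributed about the value implied by the
    large-scale closure, and each plume's cloud-base mass flux is drawn
    from an exponential distribution with a prescribed mean.
    """
    expected_n = mean_total_flux / mean_plume_flux
    n = rng.poisson(expected_n)
    return rng.exponential(mean_plume_flux, size=n)

# Two calls with the same large-scale state give different
# grid-scale responses; only the ensemble mean is constrained.
a = sample_convective_response(1e9)
b = sample_convective_response(1e9)
```

Averaged over many realisations the total mass flux recovers the closure value, while individual timesteps retain the variability that deterministic schemes suppress.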
Abstract:
An approach to incorporate spatial dependence into stochastic frontier analysis is developed and applied to a sample of 215 dairy farms in England and Wales. A number of alternative specifications for the spatial weight matrix are used to analyse the effect of these on the estimation of spatial dependence. Estimation is conducted using a Bayesian approach and results indicate that spatial dependence is present when explaining technical inefficiency.
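A spatial weight matrix of the kind varied in such specifications can be built in a few lines. A minimal sketch, assuming an inverse-distance weighting with an optional cutoff; the function name and the row-standardisation convention are our choices for illustration, not the paper's exact specification:

```python
import numpy as np

def inverse_distance_weights(coords, cutoff=None):
    """Row-standardised inverse-distance spatial weight matrix W.

    coords: (n, 2) array of locations (e.g. farms). Pairs farther apart
    than `cutoff` (if given) get zero weight before row standardisation.
    """
    coords = np.asarray(coords, float)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    w = np.zeros_like(d)
    mask = d > 0                      # zero diagonal: no self-neighbours
    if cutoff is not None:
        mask &= d <= cutoff
    w[mask] = 1.0 / d[mask]
    row_sums = w.sum(axis=1, keepdims=True)
    # Rows with no neighbours (isolated units) are left as all zeros.
    return np.divide(w, row_sums, out=np.zeros_like(w), where=row_sums > 0)
```

Changing the cutoff, or swapping inverse distance for contiguity, yields the alternative specifications whose effect on estimated spatial dependence the study examines.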
Abstract:
Our group considered the desirability of including representations of uncertainty in the development of parameterizations. (By ‘uncertainty’ here we mean the deviation of sub-grid scale fluxes or tendencies in any given model grid box from truth.) We unanimously agreed that the ECMWF should attempt to provide a more physical basis for uncertainty estimates than the very effective but ad hoc methods being used at present. Our discussions identified several issues that will arise.
Abstract:
In 2005, the ECMWF held a workshop on stochastic parameterisation, at which convection was seen as a key issue. That much is clear from the working group reports and particularly the statement from working group 1 that “it is clear that a stochastic convection scheme is desirable”. The present note aims to consider our current status in comparison with some of the issues raised and hopes expressed in that working group report.
Abstract:
A remote haploscopic video refractor was used to assess vergence and accommodation responses in a group of 32 emmetropic, orthophoric, symptom free, young adults naïve to vision experiments in a minimally instructed setting. Picture targets were presented at four positions between 2 m and 33 cm. Blur, disparity and looming cues were presented in combination or separately to assess their contributions to the total near response in a within-subjects design. Response gain for both vergence and accommodation reduced markedly whenever disparity was excluded, with much smaller effects when blur and proximity were excluded. Despite the clinical homogeneity of the participant group there were also some individual differences.
Abstract:
This paper discusses concepts of value from the point of view of the user of the space and the counter view of the provider of the same. Land and property are factors of production. The value of the land flows from the use to which it is put, and that in turn is dependent upon the demand (and supply) for the product or service that is produced/provided from that space. If there is a high demand for the product (at a fixed level of supply), the price will increase and the economic rent for the land/property will increase accordingly. This is the underlying paradigm of Ricardian rent theory, where the supply of land is fixed and a single good is produced. In such a case the rent of land is wholly an economic rent. Economic theory generally distinguishes between two kinds of price: price of production or “value in use” (as determined by the labour theory of value), and market price or “value in exchange” (as determined by supply and demand). It is based on a coherent and consistent theory of value and price. Effectively the distinction is between what space is ‘worth’ to an individual and that space’s price of exchange in the market place. In a perfect market, where any individual has access to the same information as all others in the market, price and worth should coincide. However, in a market where access to information is not uniform, and where different uses compete for the same space, it is more likely that the two figures will diverge. This paper argues that valuers’ traditional reliance on methods of comparison to determine “price” has led to an artificial divergence of “value in use” and “value in exchange”; as such comparisons become more difficult owing to the diversity of lettings in the market place, there will be a requirement to return to fundamentals and pay heed to the thought process of the user in assessing the worth of the space to be let.
Abstract:
In this paper we perform an analytical and numerical study of Extreme Value distributions in discrete dynamical systems. In this setting, recent works have shown how to obtain statistics of extremes in agreement with the classical Extreme Value Theory. We pursue these investigations by giving analytical expressions of Extreme Value distribution parameters for maps that have an absolutely continuous invariant measure. We compare these analytical results with numerical experiments in which we study the convergence to limiting distributions using the so-called block-maxima approach, pointing out in which cases we obtain robust estimation of parameters. In regular maps for which mixing properties do not hold, we show that the fitting procedure to the classical Extreme Value Distribution fails, as expected. However, we obtain an empirical distribution that can be explained starting from a different observable function for which Nicolis et al. (Phys. Rev. Lett. 97(21): 210602, 2006) have found analytical results.
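The block-maxima procedure for a chaotic map can be sketched directly. A minimal example, assuming the fully chaotic logistic map with the standard distance observable g(x) = -log|x - x0| (for a mixing map with absolutely continuous invariant measure this observable has an exponential tail, so the limiting law is Gumbel, i.e. GEV shape parameter zero); the reference point and block sizes are arbitrary choices for illustration:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(1)
x = rng.random()                 # random initial condition in (0, 1)
x0 = 0.3                         # reference point (arbitrary)
n_blocks, block_len = 500, 2000

maxima = []
for _ in range(n_blocks):
    m = -np.inf
    for _ in range(block_len):
        x = 4.0 * x * (1.0 - x)                     # logistic map, r = 4
        m = max(m, -np.log(abs(x - x0) + 1e-300))   # observable g(x)
    maxima.append(m)

# Fit the Generalized Extreme Value distribution to the block maxima.
# Note scipy's shape convention c = -xi: a Gumbel limit means c near 0.
shape, loc, scale = genextreme.fit(maxima)
```

For a regular (non-mixing) map the same fit degrades, which is the failure mode the paper analyses.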
Abstract:
Using the formalism of the Ruelle response theory, we study how the invariant measure of an Axiom A dynamical system changes as a result of adding noise, and describe how the stochastic perturbation can be used to explore the properties of the underlying deterministic dynamics. We first find the expression for the change in the expectation value of a general observable when a white noise forcing is introduced in the system, both in the additive and in the multiplicative case. We also show that the difference between the expectation value of the power spectrum of an observable in the stochastically perturbed case and of the same observable in the unperturbed case is equal to the variance of the noise times the square of the modulus of the linear susceptibility describing the frequency-dependent response of the system to perturbations with the same spatial patterns as the considered stochastic forcing. This provides a conceptual bridge between the change in the fluctuation properties of the system due to the presence of noise and the response of the unperturbed system to deterministic forcings. Using Kramers-Kronig theory, it is then possible to derive the real and imaginary part of the susceptibility and thus deduce the Green function of the system for any desired observable. We then extend our results to rather general patterns of random forcing, from the case of several white noise forcings, to noise terms with memory, up to the case of a space-time random field. Explicit formulas are provided for each relevant case analysed. As a general result, we find, using an argument of positive-definiteness, that the power spectrum of the stochastically perturbed system is larger at all frequencies than the power spectrum of the unperturbed system. We provide an example of application of our results by considering the spatially extended chaotic Lorenz 96 model. 
These results clarify the property of stochastic stability of SRB measures in Axiom A flows, provide tools for analysing stochastic parameterisations and related closure ansätze to be implemented in modelling studies, and introduce new ways to study the response of a system to external perturbations. Taking into account the chaotic hypothesis, we expect that our results have practical relevance for a more general class of systems than those belonging to Axiom A.
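The central spectral result, that the perturbed power spectrum exceeds the unperturbed one at every frequency by the noise variance times |χ(ω)|², can be illustrated numerically on a toy linear surrogate rather than an Axiom A system. A minimal sketch, assuming an AR(1) process standing in for the susceptibility (all names and parameter values here are our own choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
a, n = 0.8, 200_000
base = rng.standard_normal(n)           # intrinsic stochastic forcing
extra = 0.5 * rng.standard_normal(n)    # added white-noise perturbation

def ar1(forcing):
    """Integrate x[t] = a*x[t-1] + forcing[t], a linear stand-in for chi."""
    x = np.zeros(len(forcing))
    for t in range(1, len(forcing)):
        x[t] = a * x[t - 1] + forcing[t]
    return x

def spectrum(x, nseg=100):
    """Periodogram averaged over nseg equal-length segments."""
    segs = x.reshape(nseg, -1)
    return (np.abs(np.fft.rfft(segs, axis=1)) ** 2).mean(axis=0) / segs.shape[1]

s_unpert = spectrum(ar1(base))
s_pert = spectrum(ar1(base + extra))
# For this linear system chi(omega) = 1/(1 - a*exp(-i*omega)), so the added
# noise lifts the spectrum by sigma^2 |chi|^2, a uniform factor here.
```

In the linear case the lift is exactly the variance-times-susceptibility term; for the chaotic systems treated in the paper the same positive-definiteness argument guarantees the inequality without the factor being uniform.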