913 results for just in time


Relevance: 90.00%

Abstract:

Geographic diversity is a fundamental tenet of portfolio management. Yet there is evidence from the US that institutional investors prefer to concentrate their real estate investments in specific, favoured areas as the primary locations for the properties in their portfolios. The little work done in the UK draws similar conclusions, but has so far focused only on the office sector; no work has examined this issue for the retail sector. This paper therefore examines the extent of real estate investment concentration in institutional retail portfolios in the UK at two points in time, 1998 and 2003, and presents some comparisons with equivalent concentrations in the office sector. The findings indicate that, when measured against employment, retail investment correlates more closely with the UK urban hierarchy than office investment does, and is focused on urban areas with large populations and high population densities, which have larger numbers of retail units in which to invest.

Relevance: 90.00%

Abstract:

Executive summary

Nature of the problem (science/management/policy)
• Freshwater ecosystems play a key role in the European nitrogen (N) cycle, both as a reactive agent that transfers, stores and processes N loadings from the atmosphere and terrestrial ecosystems, and as a natural environment severely impacted by the increase of these loadings.

Approaches
• This chapter is a review of major processes and factors controlling N transport and transformations in running waters, standing waters, groundwaters and riparian wetlands.

Key findings/state of knowledge
• The major factor controlling N processes in freshwater ecosystems is the residence time of water, which varies widely both in space and in time, and which is sensitive to changes in climate, land use and management.
• The effects of increased N loadings to European freshwaters include acidification in semi-natural environments and eutrophication in more disturbed ecosystems, with associated loss of biodiversity in both cases.
• An important part of the nitrogen transferred by surface waters is in the form of organic N, as dissolved organic N (DON) and particulate organic N (PON). This part is dominant in semi-natural catchments throughout Europe and remains a significant component of the total N load even in nitrate-enriched rivers.
• In eutrophicated standing freshwaters N can be a factor limiting or co-limiting biological production, and control of both N and phosphorus (P) loading is often needed in impacted areas if ecological quality is to be restored.

Major uncertainties/challenges
• The importance of storage and denitrification in aquifers is a major uncertainty in the global N cycle, and controls in part the response of catchments to land use or management changes. In some aquifers, the increase of N concentrations will continue for decades even if efficient mitigation measures are implemented now.
• Nitrate retention by riparian wetlands has often been highlighted. However, their use for mitigation must be treated with caution, since their effectiveness is difficult to predict, and side effects include increased DON emissions to adjacent open waters, N2O emissions to the atmosphere, and loss of biodiversity.
• In fact, the character and specific spatial origins of DON are not fully understood, and similarly the quantitative importance of indirect N2O emissions from freshwater ecosystems as a result of N leaching losses from agricultural soils is still poorly known at the regional scale.
• These major uncertainties remain due to the lack of adequate monitoring (all forms of N at a relevant frequency), especially – but not only – in the southern and eastern EU countries.

Recommendations (research/policy)
• The great variability of transfer pathways, buffering capacity and sensitivity of the catchments and of the freshwater ecosystems calls for site-specific mitigation measures rather than standard ones applied at regional to national scale.
• The spatial and temporal variations of the N forms, and the processes controlling the transport and transformation of N within freshwaters, require further investigation if the role of N in influencing freshwater ecosystem health is to be better understood, underpinning the implementation of the EU Water Framework Directive for European freshwaters.

Relevance: 90.00%

Abstract:

This article presents three ethnographic tales of interactions with living room media to help recreate the experience of significant moments in time, of affective encounters at the interface in which there is a collision or confusion of situated and virtual worlds. It draws on a year-long video ethnography of the practice and performance of everyday interactions with living room media. By studying situated activity and the lived practice of (new) media, rather than taking an exclusive focus on the virtual as a detached space, this ethnographic work demonstrates how the situated and mediated clash, or are crafted into complex emotional encounters during everyday living room life.

Relevance: 90.00%

Abstract:

New ways of combining observations with numerical models are discussed in which the size of the state space can be very large and the model can be highly nonlinear. The observations of the system can also be related to the model variables in highly nonlinear ways, making this data-assimilation (or inverse) problem highly nonlinear. First we discuss the connection between data assimilation and inverse problems, including regularization. We explore the choice of proposal density in a Particle Filter and show how the 'curse of dimensionality' might be beaten. In the standard Particle Filter, ensembles of model runs are propagated forward in time until observations are encountered, rendering it a pure Monte Carlo method. In large-dimensional systems this is very inefficient, and very large numbers of model runs are needed to solve the data-assimilation problem realistically. In our approach we steer all model runs towards the observations, resulting in a much more efficient method. By further 'ensuring almost equal weight' we avoid performing model runs that turn out to be useless. Results are shown for the 40- and 1000-dimensional Lorenz 1995 model.
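The idea of steering model runs towards the observations can be illustrated with a small sketch. The following minimal particle filter for a toy linear-Gaussian system samples each particle from an observation-informed ("optimal") proposal density and corrects the importance weights accordingly; it is only an illustration of the general technique, not the equivalent-weights scheme described above, and all dimensions, operators and noise levels are assumptions.

```python
# Minimal sketch of steering an ensemble towards the observations: a particle
# filter for a toy linear-Gaussian system that samples from an
# observation-informed ("optimal") proposal density and corrects the weights.
# All dimensions, operators and noise levels here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

d, n_obs, n_part = 10, 5, 200           # state dimension, obs dimension, particles
M = 0.95 * np.eye(d)                    # toy linear model operator
H = np.eye(n_obs, d)                    # observe the first n_obs state components
Q = 0.1 * np.eye(d)                     # model-error covariance
R = 0.5 * np.eye(n_obs)                 # observation-error covariance

def pf_step(particles, weights, y):
    """One assimilation step: each particle is drawn from q(x | x_prev, y),
    which pulls the forecast towards the observation y."""
    xbar = particles @ M.T                                   # deterministic forecasts
    S = H @ Q @ H.T + R                                      # innovation covariance
    K = Q @ H.T @ np.linalg.inv(S)                           # gain steering states to y
    P = Q - K @ H @ Q                                        # proposal covariance
    P = 0.5 * (P + P.T)                                      # enforce symmetry
    innov = y - xbar @ H.T                                   # per-particle innovations
    particles = xbar + innov @ K.T + rng.multivariate_normal(np.zeros(d), P, size=len(xbar))
    # With this proposal the weight update only involves the previous state:
    # w_i *= p(y | x_prev_i) = N(y; H M x_prev_i, S).
    maha = np.einsum('ij,jk,ik->i', innov, np.linalg.inv(S), innov)
    weights = weights * np.exp(-0.5 * maha)
    weights /= weights.sum()
    if 1.0 / np.sum(weights**2) < 0.5 * len(weights):        # resample if weights degenerate
        idx = rng.choice(len(weights), size=len(weights), p=weights)
        particles, weights = particles[idx], np.full(len(weights), 1.0 / len(weights))
    return particles, weights

# Usage: assimilate synthetic observations every step for a short window.
truth = rng.normal(size=d)
particles = rng.normal(size=(n_part, d))
weights = np.full(n_part, 1.0 / n_part)
for _ in range(10):
    truth = M @ truth + rng.multivariate_normal(np.zeros(d), Q)
    y = H @ truth + rng.multivariate_normal(np.zeros(n_obs), R)
    particles, weights = pf_step(particles, weights, y)
print("posterior-mean error:", np.linalg.norm(weights @ particles - truth))
```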

Relevance: 90.00%

Abstract:

The present study sets out to examine the motivation of Chinese research students to learn English in an informal setting. Data were collected, using semi-structured interviews, from four research students at two points in time during their first year in the UK. The main findings are: they believed that learning English was important; their main goal orientations were instrumental and extrinsic; they set learning goals and persisted in attaining them; they valued their current learning environment in general and saw it as supportive of their goals; they held both positive and negative attitudes towards the British, which had differential effects on their motivation; their self-perceived support seemed to have a positive impact on their motivation and the development of self-confidence; and they tended to attribute their success to stable causes such as the environment, and their failures to unstable but controllable causes such as effort. It is concluded that qualitative data of this kind may complement insights from quantitative research. Implications for target-country institutions in the provision of support are discussed.

Relevance: 90.00%

Abstract:

For forecasting and economic analysis, many variables are used in logarithms (logs). In time series analysis, this transformation is often applied to stabilize the variance of a series. We investigate under which conditions taking logs is beneficial for forecasting. Forecasts based on the original series are compared to forecasts based on logs. For a range of economic variables, substantial forecasting improvements from taking logs are found if the log transformation actually stabilizes the variance of the underlying series. Using logs can be damaging to forecast precision if a stable variance is not achieved.
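As a rough illustration of this kind of comparison (under assumptions of my own: an AR(1) forecasting model and a simulated positive series whose variance grows with its level, so that logs stabilize it), the sketch below computes rolling one-step forecasts in levels and in logs and compares their root mean squared errors on the original scale.

```python
# Sketch (not the authors' exact setup): compare one-step-ahead forecasts of a
# positive series from an AR(1) fitted in levels versus in logs, where the log
# forecast is transformed back to the original scale before evaluation.
import numpy as np

rng = np.random.default_rng(1)

# Simulate a series whose variance grows with its level, so that taking logs
# stabilizes the variance (the case where logs are expected to help).
n = 400
y = np.empty(n)
y[0] = 100.0
for t in range(1, n):
    y[t] = y[t - 1] * np.exp(0.002 + 0.02 * rng.standard_normal())

def ar1_forecasts(x):
    """Rolling one-step forecasts from an AR(1) with intercept, fitted by OLS."""
    preds = []
    for t in range(200, len(x)):
        X = np.column_stack([np.ones(t - 1), x[: t - 1]])
        beta, *_ = np.linalg.lstsq(X, x[1:t], rcond=None)
        preds.append(beta[0] + beta[1] * x[t - 1])
    return np.array(preds)

actual = y[200:]
rmse_level = np.sqrt(np.mean((ar1_forecasts(y) - actual) ** 2))
rmse_log = np.sqrt(np.mean((np.exp(ar1_forecasts(np.log(y))) - actual) ** 2))
print(f"RMSE in levels: {rmse_level:.3f}, RMSE via logs: {rmse_log:.3f}")
```

Note that the naive back-transform by exponentiation ignores the retransformation bias; whether to correct for it is itself one of the modelling choices a fuller comparison would examine.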

Relevance: 90.00%

Abstract:

In two recent papers, Byrne and Lee (2006, 2007) examined the geographical concentration of institutional office and retail investment in England and Wales at two points in time, 1998 and 2003. The findings indicate that commercial office portfolios are concentrated in a very few urban areas, whereas retail holdings correlate more closely with the urban hierarchy of England and Wales and consequently are essentially ubiquitous. Research into the industrial sector is much less developed, and this paper therefore makes a significant contribution to understanding the structure of industrial property investment in the UK. It shows that the concentration of industrial investment lies between that of retail and offices, and is focused on local authorities (LAs) with high proportions of manual workers in areas with smaller industrial units. It also shows that during the period studied the structure of the sector changed, with greater emphasis on the distributional element, for which location is a principal consideration.

Relevance: 90.00%

Abstract:

We present molecular dynamics (MD) and slip-springs model simulations of the chain segmental dynamics in entangled linear polymer melts. The time-dependent behavior of the segmental orientation autocorrelation functions and mean-square segmental displacements is analyzed for both flexible and semiflexible chains, with particular attention paid to the scaling relations among these dynamic quantities. Effective combination of the two simulation methods at different coarse-graining levels allows us to explore the chain dynamics for chain lengths ranging from Z ≈ 2 to 90 entanglements. For a given chain length of Z ≈ 15, the time scales accessed span more than 10 decades, covering all of the interesting relaxation regimes. The obtained time dependence of the monomer mean-square displacement, g1(t), is in good agreement with the tube theory predictions. Results on the first- and second-order segmental orientation autocorrelation functions, C1(t) and C2(t), demonstrate a clear power-law relationship C2(t) ∝ C1(t)^m, with m = 3, 2, and 1 in the initial, free Rouse, and entangled (constrained Rouse) regimes, respectively. The return-to-origin hypothesis, which leads to inverse proportionality between the segmental orientation autocorrelation functions and g1(t) in the entangled regime, is convincingly verified by the simulation result C1(t) ∝ g1(t)^(-1) ∝ t^(-1/4) in the constrained Rouse regime, where for well-entangled chains both C1(t) and g1(t) are rather insensitive to constraint release effects. However, the second-order correlation function, C2(t), shows much stronger sensitivity to constraint release and experiences a protracted crossover from the free Rouse to the entangled regime. This crossover region extends for at least one decade in time longer than that of C1(t). The predicted time scaling C2(t) ∝ t^(-1/4) is observed in slip-springs simulations only at a chain length of 90 entanglements, whereas shorter chains show higher scaling exponents. The reported simulation work can be applied to understand the observations of NMR experiments.
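For readers wanting to reproduce these quantities from their own trajectories, the sketch below shows one straightforward way to compute g1(t), C1(t) and C2(t) from monomer positions. It is a generic illustration with hypothetical inputs, not the analysis code used for the MD and slip-springs simulations.

```python
# Minimal sketch (hypothetical inputs, not the actual MD/slip-springs pipeline):
# given unwrapped monomer positions r[t, n, 3] from some trajectory, compute the
# monomer mean-square displacement g1(t) and the first- and second-order
# segmental orientation autocorrelation functions C1(t) and C2(t) of the bonds.
import numpy as np

def segmental_dynamics(r):
    """r: array of shape (n_frames, n_monomers, 3) of unwrapped positions."""
    n_frames = r.shape[0]
    bonds = r[:, 1:, :] - r[:, :-1, :]                         # segment (bond) vectors
    u = bonds / np.linalg.norm(bonds, axis=-1, keepdims=True)  # unit orientations
    g1, c1, c2 = np.empty(n_frames), np.empty(n_frames), np.empty(n_frames)
    for dt in range(n_frames):
        disp = r[dt:] - r[: n_frames - dt]                     # displacements over lag dt
        g1[dt] = np.mean(np.sum(disp**2, axis=-1))             # g1(t) = <|r(t) - r(0)|^2>
        cos = np.sum(u[dt:] * u[: n_frames - dt], axis=-1)     # u(t).u(0) per segment
        c1[dt] = np.mean(cos)                                  # C1(t) = <P1(cos)>
        c2[dt] = np.mean(1.5 * cos**2 - 0.5)                   # C2(t) = <P2(cos)>
    return g1, c1, c2

# Usage with a synthetic random-walk "trajectory" just to exercise the function.
rng = np.random.default_rng(2)
traj = np.cumsum(rng.standard_normal((200, 50, 3)), axis=0)
g1, c1, c2 = segmental_dynamics(traj)
print(g1[:3], c1[:3], c2[:3])
```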

Relevance: 90.00%

Abstract:

Background: A whole-genome genotyping array has previously been developed for Malus using SNP data from 28 Malus genotypes. This array offers the prospect of high-throughput genotyping and linkage map development for any given Malus progeny. To test the applicability of the array for mapping in diverse Malus genotypes, we applied the array to the construction of a SNP-based linkage map of an apple rootstock progeny.

Results: Of the 7,867 Malus SNP markers on the array, 1,823 (23.2%) were heterozygous in one of the two parents of the progeny and 1,007 (12.8%) were heterozygous in both parental genotypes, whilst just 2.8% of the 921 Pyrus SNPs were heterozygous. A linkage map spanning 1,282.2 cM was produced, comprising 2,272 SNP markers, 306 SSR markers and the S-locus. The length of the M432 linkage map was increased by 52.7 cM with the addition of the SNP markers, whilst marker density increased from 3.8 cM/marker to 0.5 cM/marker. Just three regions in excess of 10 cM remain where no markers were mapped. We compared the positions of the mapped SNP markers on the M432 map with their predicted positions on the ‘Golden Delicious’ genome sequence. A total of 311 markers (13.7% of all mapped markers) mapped to positions that conflicted with their predicted positions on the ‘Golden Delicious’ pseudo-chromosomes, indicating the presence of paralogous genomic regions or misassignments of genome sequence contigs during the assembly and anchoring of the genome sequence.

Conclusions: We incorporated data for the 2,272 SNP markers onto the map of the M432 progeny and have presented the most complete and saturated map of the full 17 linkage groups of M. pumila to date. The data were generated rapidly in a high-throughput, semi-automated pipeline, permitting significant savings in time and cost over linkage map construction using microsatellites. The application of the array will permit linkage maps to be developed for QTL analyses in a cost-effective manner, and the identification of SNPs that have been assigned erroneous positions on the ‘Golden Delicious’ reference sequence will assist in the continued improvement of the genome sequence assembly for that variety.

Relevance: 90.00%

Abstract:

Southern Hemisphere (SH) polar mesospheric clouds (PMCs), also known as noctilucent clouds, have been observed to be more variable and, in general, dimmer than their Northern Hemisphere (NH) counterparts. The precise cause of these hemispheric differences is not well understood. This paper focuses on one aspect of the hemispheric differences: the timing of the PMC season onset. Observations from the Aeronomy of Ice in the Mesosphere satellite indicate that in recent years the date on which the PMC season begins varies much more in the SH than in the NH. Using the Canadian Middle Atmosphere Model, we show that the generation of sufficiently low temperatures necessary for cloud formation in the SH summer polar mesosphere is perturbed by year‐to‐year variations in the timing of the late‐spring breakdown of the SH stratospheric polar vortex. These stratospheric variations, which persist until the end of December, influence the propagation of gravity waves up to the mesosphere. This adds a stratospheric control to the temperatures in the polar mesopause region during early summer, which causes the onset of PMCs to vary from one year to another. This effect is much stronger in the SH than in the NH because the breakdown of the polar vortex occurs much later in the SH, closer in time to the PMC season.

Relevance: 90.00%

Abstract:

Geophysical time series sometimes exhibit serial correlations that are stronger than can be captured by the commonly used first‐order autoregressive model. In this study we demonstrate that a power law statistical model serves as a useful upper bound for the persistence of total ozone anomalies on monthly to interannual timescales. Such a model is usually characterized by the Hurst exponent. We show that the estimation of the Hurst exponent in time series of total ozone is sensitive to various choices made in the statistical analysis, especially whether and how the deterministic (including periodic) signals are filtered from the time series, and the frequency range over which the estimation is made. In particular, care must be taken to ensure that the estimate of the Hurst exponent accurately represents the low‐frequency limit of the spectrum, which is the part that is relevant to long‐term correlations and the uncertainty of estimated trends. Otherwise, spurious results can be obtained. Based on this analysis, and using an updated equivalent effective stratospheric chlorine (EESC) function, we predict that an increase in total ozone attributable to EESC should be detectable at the 95% confidence level by 2015 at the latest in southern midlatitudes, and by 2020–2025 at the latest over 30°–45°N, with the time to detection increasing rapidly with latitude north of this range.
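As an illustration of the sensitivities discussed above, the sketch below estimates a Hurst exponent from the low-frequency slope of the log-log periodogram after removing a linear trend and an annual harmonic. The detrending choices, the fGn-type spectral convention and the fitted frequency range are assumptions for illustration, not the procedure used in the study; changing the frequency range changes the estimate noticeably, which is precisely the point made in the abstract.

```python
# Sketch (not the paper's exact procedure): estimate the Hurst exponent of a
# monthly anomaly series from the slope of the log-log periodogram at low
# frequencies, after removing a seasonal cycle and a linear trend -- the two
# choices the text highlights as critical.
import numpy as np

def hurst_spectral(x, months_per_year=12, low_freq_fraction=0.1):
    t = np.arange(len(x))
    # Remove deterministic parts: linear trend plus a 12-month harmonic.
    design = np.column_stack([np.ones_like(t), t,
                              np.sin(2 * np.pi * t / months_per_year),
                              np.cos(2 * np.pi * t / months_per_year)])
    beta, *_ = np.linalg.lstsq(design, x, rcond=None)
    resid = x - design @ beta
    # Periodogram of the residuals (positive frequencies only).
    freqs = np.fft.rfftfreq(len(resid))[1:]
    power = np.abs(np.fft.rfft(resid))[1:] ** 2
    # Fit the power law S(f) ~ f^slope over the lowest frequencies only,
    # since only the low-frequency limit matters for long-term correlations.
    keep = freqs <= low_freq_fraction * freqs.max()
    slope = np.polyfit(np.log(freqs[keep]), np.log(power[keep]), 1)[0]
    return (1.0 - slope) / 2.0        # for fGn-like noise, S(f) ~ f^(1 - 2H)

# Usage on a synthetic AR(1) series: the estimate depends visibly on the
# frequency range used, illustrating the sensitivity discussed in the abstract.
rng = np.random.default_rng(3)
x = np.zeros(600)
for t in range(1, 600):
    x[t] = 0.8 * x[t - 1] + rng.standard_normal()
print(hurst_spectral(x, low_freq_fraction=0.05), hurst_spectral(x, low_freq_fraction=0.3))
```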

Relevance: 90.00%

Abstract:

The internal variability and coupling between the stratosphere and troposphere in CCMVal‐2 chemistry‐climate models are evaluated through analysis of the annular mode patterns of variability. Computation of the annular modes in long data sets with secular trends requires refinement of the standard definition of the annular mode, and a more robust procedure that allows for slowly varying trends is established and verified. The spatial and temporal structure of the models’ annular modes is then compared with that of reanalyses. As a whole, the models capture the key features of observed intraseasonal variability, including the sharp vertical gradients in structure between stratosphere and troposphere, the asymmetries in the seasonal cycle between the Northern and Southern hemispheres, and the coupling between the polar stratospheric vortices and tropospheric midlatitude jets. It is also found that the annular mode variability changes little in time throughout simulations of the 21st century. There are, however, both common biases and significant differences in performance in the models. In the troposphere, the annular mode in models is generally too persistent, particularly in the Southern Hemisphere summer, a bias similar to that found in CMIP3 coupled climate models. In the stratosphere, the periods of peak variance and coupling with the troposphere are delayed by about a month in both hemispheres. The relationship between increased variability of the stratosphere and increased persistence in the troposphere suggests that some tropospheric biases may be related to stratospheric biases and that a well‐simulated stratosphere can improve simulation of tropospheric intraseasonal variability.
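One simple way to compute an annular-mode index that allows for a slowly varying trend is sketched below: a long running-mean background is removed from zonal-mean geopotential anomalies before the leading EOF is taken. This is only an illustration of the general idea, not the refined procedure established in the paper; the window length, latitude range and synthetic data are assumptions.

```python
# Sketch of one way (not necessarily the paper's refined procedure) to compute
# an annular-mode index from zonal-mean geopotential height: remove a slowly
# varying background with a long running mean instead of a fixed climatology,
# then take the leading EOF of the residual anomalies.
import numpy as np

def annular_mode_index(z, lat, window=1825):
    """z: array (n_days, n_lats) of zonal-mean height at one pressure level.
    lat: latitudes in degrees. window: length in days of the running mean used
    as the slowly varying background (about 5 years here)."""
    kernel = np.ones(window) / window
    slow = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode='same'), 0, z)
    anom = z - slow                                       # deviations from slow background
    w = np.sqrt(np.cos(np.deg2rad(lat)))                  # area weighting
    cov = (anom * w).T @ (anom * w) / len(anom)           # weighted covariance matrix
    vals, vecs = np.linalg.eigh(cov)
    eof1 = vecs[:, -1]                                    # leading EOF (largest eigenvalue)
    pc1 = anom @ (eof1 * w)                               # project anomalies onto EOF1
    return (pc1 - pc1.mean()) / pc1.std(), eof1

# Usage with synthetic data standing in for reanalysis or model output.
rng = np.random.default_rng(4)
lat = np.linspace(20, 90, 36)
z = rng.standard_normal((7300, lat.size)).cumsum(axis=0) * 0.01 + \
    rng.standard_normal((7300, lat.size))
index, pattern = annular_mode_index(z, lat)
print(index[:5])
```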

Relevance: 90.00%

Abstract:

A system for continuous data assimilation is presented and discussed. To simulate the dynamical development, a channel version of a balanced barotropic model is used, and geopotential (height) data are assimilated into the model's computations as they become available. In the first experiment the updating is performed every 24, 12 and 6 hours with a given network. The stations are distributed at random in 4 groups in order to simulate 4 areas with different densities of stations. Optimum interpolation is performed for the difference between the forecast and the valid observations. The RMS error of the analyses is reduced over time, and the error is smaller the more frequently the updating is performed. Updating every 6 hours yields an analysis error smaller than the RMS error of the observations. In a second experiment the updating is performed with data from a moving satellite with a side-scan capability of about 15°. If the satellite data are analysed at every time step before they are introduced into the system, the error of the analysis is reduced to a value below the RMS error of the observations already after 24 hours, and this yields on the whole a better result than updating from a fixed network. If the satellite data are introduced without any modification, the error of the analysis is reduced much more slowly, and it takes about 4 days to reach a result comparable to the one where the data have been analysed.
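A single optimum-interpolation update of the general kind described here can be sketched as follows. The Gaussian correlation model, the error variances and the toy grid are assumptions for illustration rather than the configuration of the assimilation system in these experiments, and the observations are assumed to sit on grid points for simplicity.

```python
# Minimal sketch of one optimum-interpolation (OI) update, assuming the
# observations sit on a subset of the grid points. The Gaussian correlation
# model and error variances are illustrative assumptions only.
import numpy as np

def oi_update(xb, grid_xy, obs, obs_idx, sigma_b=30.0, sigma_o=10.0, L=500.0):
    """xb: background (forecast) height field on the grid, shape (n_grid,).
    grid_xy: grid coordinates in km, shape (n_grid, 2).
    obs: station observations, shape (n_obs,); obs_idx: their grid indices."""
    d2 = ((grid_xy[:, None, :] - grid_xy[None, :, :]) ** 2).sum(-1)
    B = sigma_b**2 * np.exp(-0.5 * d2 / L**2)      # background error covariance
    H = np.zeros((len(obs), len(xb)))
    H[np.arange(len(obs)), obs_idx] = 1.0          # observation operator (selection)
    R = sigma_o**2 * np.eye(len(obs))              # observation error covariance
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # OI weights
    return xb + K @ (obs - H @ xb)                 # analysis = forecast + weighted innovations

# Usage: a 20 x 20 grid with 500 km spacing and 30 randomly placed stations.
rng = np.random.default_rng(5)
gx, gy = np.meshgrid(np.arange(20) * 500.0, np.arange(20) * 500.0)
grid_xy = np.column_stack([gx.ravel(), gy.ravel()])
xb = 5500.0 + 0.02 * grid_xy[:, 0]                 # toy forecast field
obs_idx = rng.choice(len(xb), size=30, replace=False)
obs = xb[obs_idx] + rng.normal(0.0, 10.0, size=30) # synthetic observations
xa = oi_update(xb, grid_xy, obs, obs_idx)
print("mean |analysis - forecast|:", np.abs(xa - xb).mean())
```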

Relevance: 90.00%

Abstract:

With the introduction of new observing systems based on asynoptic observations, the analysis problem has changed in character. In the near future we may expect that a considerable part of meteorological observations will be unevenly distributed in four dimensions, i.e. three dimensions in space and one in time. The term analysis, or objective analysis in meteorology, means the process of interpolating meteorological observations from unevenly distributed locations to a network of regularly spaced grid points. Necessitated by the requirement of numerical weather prediction models to solve the governing finite difference equations on such a grid lattice, objective analysis is a three-dimensional (or mostly two-dimensional) interpolation technique. As a consequence of the structure of the conventional synoptic network, with separated data-sparse and data-dense areas, four-dimensional analysis has in fact been used intensively for many years. Weather services have thus based their analyses not only on synoptic data at the time of the analysis and climatology, but also on the fields predicted from the previous observation hour and valid at the time of the analysis. The inclusion of the time dimension in objective analysis will be called four-dimensional data assimilation.

From one point of view it seems possible to apply the conventional technique to the new data sources by simply reducing the time interval in the analysis-forecasting cycle. This could in fact be justified also for the conventional observations. We have fairly good coverage of surface observations 8 times a day, and several upper-air stations are making radiosonde and radiowind observations 4 times a day. If we have a 3-hour step in the analysis-forecasting cycle instead of the 12 hours that is most often applied, we may without difficulty treat all observations as synoptic. No observation would then be more than 90 minutes off time, and the observations, even during strong transient motion, would fall within a horizontal mesh of 500 km × 500 km.
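The timing argument in the last sentences is easy to check numerically. The toy sketch below (with hypothetical, randomly timed observations) assigns each asynoptic observation to the nearest 3-hour analysis time and confirms that none ends up more than 90 minutes off time.

```python
# Tiny illustration of the timing argument above: with a 3-hour
# analysis-forecasting cycle, every asynoptic observation assigned to the
# nearest analysis time is at most 90 minutes "off time".
import numpy as np

rng = np.random.default_rng(6)
cycle_minutes = 180                                        # 3-hour analysis cycle
obs_times = np.sort(rng.uniform(0, 24 * 60, size=1000))    # asynoptic obs over one day

nearest_analysis = np.round(obs_times / cycle_minutes) * cycle_minutes
offsets = np.abs(obs_times - nearest_analysis)
print("maximum offset (minutes):", offsets.max())          # never exceeds 90
```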

Relevance: 90.00%

Abstract:

We explore the influence of the choice of attenuation factor on Katz centrality indices for evolving communication networks. For given snapshots of a network observed over a period of time, recently developed communicability indices aim to identify the best broadcasters and listeners in the network. In this article we look into the sensitivity of communicability indices to the attenuation factor constraint, in relation to the spectral radius (the largest eigenvalue) of the network at any point in time, and to its computation in the case of large networks. We propose relaxed communicability measures in which the spectral radius bound on the attenuation factor is relaxed and the adjacency matrix is normalised in order to maintain the convergence of the measure. Using a vitality-based measure of both the standard and relaxed communicability indices, we look at ways of establishing the most important individuals for broadcasting and receiving messages in relation to community bridging roles. We illustrate our findings with two examples of real-life networks: the MIT Reality Mining data set of daily communications between 106 individuals during one year, and the UK Twitter mentions network of direct messages on Twitter between 12.4k individuals during one week.
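The role of the attenuation factor and of the spectral-radius bound can be sketched for a single snapshot as follows. This simplified Katz-style calculation and its "relaxed" normalised variant are stand-ins for the dynamic communicability indices of the article, not their exact definitions, and the random network is hypothetical.

```python
# Sketch of the attenuation-factor issue for a single network snapshot: the
# standard Katz resolvent requires alpha < 1 / spectral_radius(A), while a
# "relaxed" variant normalizes A by its spectral radius so that any alpha in
# (0, 1) converges.
import numpy as np

def katz_scores(A, alpha):
    """Row sums of (I - alpha*A)^-1 - I: walks leaving each node ("broadcast" role)."""
    n = A.shape[0]
    radius = np.max(np.abs(np.linalg.eigvals(A)))
    if alpha >= 1.0 / radius:
        raise ValueError(f"alpha must be below 1/spectral radius = {1.0 / radius:.4f}")
    Q = np.linalg.inv(np.eye(n) - alpha * A) - np.eye(n)
    return Q.sum(axis=1)

def relaxed_katz_scores(A, alpha):
    """Same quantity on A normalised by its spectral radius, so the series
    converges for any 0 < alpha < 1 (the relaxed measure sketched here)."""
    radius = np.max(np.abs(np.linalg.eigvals(A)))
    return katz_scores(A / radius, alpha)

# Usage on a small random directed snapshot.
rng = np.random.default_rng(7)
A = (rng.random((20, 20)) < 0.1).astype(float)
np.fill_diagonal(A, 0.0)
alpha_std = 0.9 / np.max(np.abs(np.linalg.eigvals(A)))     # 90% of the standard bound
print("best broadcaster (standard):", np.argmax(katz_scores(A, alpha_std)))
print("best broadcaster (relaxed): ", np.argmax(relaxed_katz_scores(A, 0.9)))
```

Column sums of the same resolvent would give the corresponding "listener" scores; extending this to sequences of snapshots is where the dynamic communicability indices of the article come in.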