36 results for Spent Working Time
Abstract:
A near real-time flood detection algorithm giving a synoptic overview of the extent of flooding in both urban and rural areas, and capable of working during night-time and day-time even if cloud is present, could be a useful tool for operational flood relief management. The paper describes an automatic algorithm using high-resolution Synthetic Aperture Radar (SAR) satellite data that builds on existing approaches, including the use of image segmentation techniques prior to object classification to cope with the very large number of pixels in these scenes. Flood detection in urban areas is guided by the flood extent derived in adjacent rural areas. The algorithm assumes that high-resolution topographic height data are available for at least the urban areas of the scene, so that a SAR simulator may be used to estimate areas of radar shadow and layover. The algorithm proved capable of detecting flooding in rural areas using TerraSAR-X with good accuracy, classifying 89% of flooded pixels correctly, with an associated false positive rate of 6%. Of the urban water pixels visible to TerraSAR-X, 75% were correctly detected, with a false positive rate of 24%. If all urban water pixels were considered, including those in shadow and layover regions, these figures fell to 57% and 18% respectively. The reduced accuracy in urban areas is thus partly a consequence of TerraSAR-X's restricted visibility of the ground surface due to radar shadow and layover.
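As an illustrative aside, the sketch below shows how such pixel-level figures can be computed, assuming binary flood masks and a visibility mask of the kind a SAR simulator would produce; the array names, sizes, and random inputs are invented for demonstration and are not the paper's implementation.

```python
import numpy as np

def flood_detection_scores(predicted, truth, visible):
    """Pixel-level detection accuracy and false positive rate.

    predicted, truth, visible: boolean arrays of the same shape.
    `visible` marks pixels NOT lost to radar shadow/layover
    (e.g. as flagged by a SAR simulator run over a height model).
    """
    p, t = predicted[visible], truth[visible]
    detection_rate = (p & t).sum() / t.sum()       # flooded pixels found
    false_pos_rate = (p & ~t).sum() / (~t).sum()   # dry pixels mislabelled
    return detection_rate, false_pos_rate

# Illustrative use with random masks (real inputs would be a
# segmented/classified SAR scene and a validated flood map).
rng = np.random.default_rng(0)
truth = rng.random((512, 512)) < 0.3
predicted = truth ^ (rng.random((512, 512)) < 0.05)  # noisy prediction
visible = rng.random((512, 512)) < 0.9               # shadow/layover mask
print(flood_detection_scores(predicted, truth, visible))
```

Evaluating only over `visible`, then over all pixels, mirrors the two sets of urban figures quoted above.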
Abstract:
Decision theory is the study of models of judgement involved in, and leading to, deliberate and (usually) rational choice. In real estate investment there are normative models for the allocation of assets. These asset allocation models suggest an optimum allocation between the respective asset classes based on the investors' judgements of performance and risk. Real estate is selected, like other assets, on the basis of some criterion, commonly its marginal contribution to the production of a mean-variance efficient multi-asset portfolio, subject to the investor's objectives and capital rationing constraints. However, decisions are made relative to current expectations and current business constraints. Whilst a decision maker may believe in the optimum exposure levels dictated by an asset allocation model, the final decision may, and often will, be influenced by factors outside the parameters of the mathematical model. This paper discusses investors' perceptions of and attitudes toward real estate and highlights the important difference between theoretical exposure levels and pragmatic business considerations. It develops a model to identify the “soft” parameters in decision making that influence the optimal allocation for the asset class. This “soft” information may relate to behavioural issues such as the tendency to mirror competitors, a desire to meet weight-of-money objectives, a desire to retain the status quo, and many other non-financial considerations. The paper aims to establish the place of property in multi-asset portfolios in the UK and to examine the asset allocation process in practice, with a view to understanding the decision-making process; it examines investors' perceptions through an historic analysis of market expectations, a comparison with historic data, and an analysis of actual performance.
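As an illustrative aside, a minimal mean-variance sketch showing how a “soft” business constraint (here a hypothetical cap on the property weight) shifts the allocation away from the model's optimum; all return, covariance, and cap figures are invented.

```python
import numpy as np
from scipy.optimize import minimize

# Invented annual inputs for equities, bonds, property.
mu = np.array([0.08, 0.04, 0.07])
sigma = np.array([[0.0400, 0.0020, 0.0080],
                  [0.0020, 0.0025, 0.0010],
                  [0.0080, 0.0010, 0.0100]])
risk_aversion = 4.0
property_cap = 0.10   # hypothetical "soft" business constraint

def neg_utility(w):
    # Mean-variance utility: E[r] - (lambda/2) * portfolio variance
    return -(w @ mu - 0.5 * risk_aversion * w @ sigma @ w)

cons = [{"type": "eq", "fun": lambda w: w.sum() - 1.0}]  # fully invested
bounds_free = [(0.0, 1.0)] * 3
bounds_capped = [(0.0, 1.0), (0.0, 1.0), (0.0, property_cap)]

w0 = np.full(3, 1 / 3)
unconstrained = minimize(neg_utility, w0, bounds=bounds_free, constraints=cons)
constrained = minimize(neg_utility, w0, bounds=bounds_capped, constraints=cons)
print("model optimum:     ", unconstrained.x.round(3))
print("with property cap: ", constrained.x.round(3))
```

The gap between the two weight vectors is exactly the kind of divergence between theoretical exposure and pragmatic constraint the paper describes.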
Abstract:
Construction planning plays a fundamental role in construction project management, and it requires teamwork among planners from a diverse range of disciplines who are often geographically dispersed. Model-based four-dimensional (4D) computer-aided design (CAD) groupware, though considered a possible approach to supporting collaborative planning, still lacks effective collaborative mechanisms for teamwork owing to methodological, technological and social challenges. Targeting this problem, this paper proposes a model-based groupware solution that enables a group of multidisciplinary planners to perform real-time collaborative 4D planning across the Internet. In the light of an interactive definition method and its computer-supported collaborative work (CSCW) design analysis, the paper discusses the realization of interactive collaborative mechanisms in terms of software architecture, application mode, and data exchange protocol. These mechanisms were integrated into a groupware solution, which was validated by a planning team under genuinely geographically dispersed conditions. Analysis of the validation results revealed that the proposed solution is feasible for real-time collaborative 4D planning, yielding a robust construction plan through collaborative teamwork. The realization of this solution prompts further consideration of its enhancement for wider groupware applications.
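As a purely hypothetical illustration of the kind of data-exchange message such groupware might broadcast (the paper's own protocol is not reproduced here), consider a small JSON payload describing one planner's edit to a scheduled task; every field name below is an invented assumption.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class TaskUpdate:
    """One planner's edit to a scheduled activity, to be relayed
    to all connected clients (fields are hypothetical)."""
    session_id: str
    author: str
    task_id: str
    start: str           # ISO-8601 planned start
    finish: str          # ISO-8601 planned finish
    linked_objects: list  # IDs of 3D model elements driven by this task

update = TaskUpdate(
    session_id="plan-review-42",
    author="structural.planner",
    task_id="T-117",
    start="2024-03-04",
    finish="2024-03-18",
    linked_objects=["wall-0031", "slab-0007"],
)
message = json.dumps({"type": "task_update", "payload": asdict(update)})
# A relay server would forward `message` to every other planner's client,
# which re-runs the 4D simulation locally to reflect the change.
print(message)
```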
Abstract:
We examine the strategies interwar working-class British households used to “smooth” consumption over time and guard against negative contingencies such as illness, unemployment, and death. Newly discovered returns from the U.K. Ministry of Labour's 1937/38 Household Expenditure Survey are used to fully categorize expenditure smoothing via nineteen credit/savings vehicles. We find that households made extensive use of expenditure-smoothing devices. Families' reliance on expenditure-smoothing is shown to be inversely related to household income, while households also used these mechanisms more intensively during expenditure crisis phases of the family life cycle, especially the years immediately after new household formation.
Abstract:
Expectations of future market conditions are generally acknowledged to be crucial for the development decision and hence for shaping the built environment. This empirical study of the Central London office market from 1987 to 2009 tests for evidence of adaptive and naïve expectations. Applying VAR models and a recursive OLS regression with one-step forecasts, we find evidence of adaptive and naïve, rather than rational, expectations among developers. Although the magnitude of the errors and the length of the time lags vary over time and across development cycles, the results confirm that developers' decisions are explained to a large extent by contemporaneous and past conditions in both London submarkets. The corollary of this finding is that developers may be able to generate excess profits by exploiting market inefficiencies, but this may be hindered in practice by the long periods necessary for planning and constructing the asset. More generally, the results of this study suggest that real estate cycles are largely generated endogenously rather than being the result of unexpected exogenous shocks.
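As an illustrative aside, a minimal sketch of a recursive OLS exercise with one-step-ahead forecasts on synthetic data; the lag structure and variable names are assumptions for demonstration, not the study's specification.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 120
rents = np.cumsum(rng.normal(0.0, 1.0, T))              # synthetic market series
starts = 0.6 * np.roll(rents, 1) + rng.normal(0, 1, T)  # development starts
                                                        # (roll wraps index 0;
                                                        # harmless on fake data)

# Expanding-window OLS: refit on data up to t, forecast t+1.
# Systematic forecast errors would point to adaptive/naive expectations.
errors = []
for t in range(40, T - 1):
    X = np.column_stack([np.ones(t), rents[:t]])   # lagged market conditions
    y = starts[1:t + 1]                            # starts respond next period
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    forecast = beta[0] + beta[1] * rents[t]
    errors.append(starts[t + 1] - forecast)

print("mean one-step forecast error:", np.mean(errors).round(3))
```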
Abstract:
The “case for property” in the mixed-asset portfolio is a topic of continuing interest to practitioners and academics. Such an analysis is typically performed over a fixed period of time, and the optimum allocation to property is inferred from the weight assigned to property through the use of mean-variance analysis. It is well known, however, that the parameters used in the portfolio analysis problem are unstable through time. Thus, the weight proposed for property in one period is unlikely to be that found in another. Consequently, in order to assess the case for property more thoroughly, the impact of property in the mixed-asset portfolio is evaluated on a rolling basis over a long period of time. In this way we test whether the inclusion of property significantly improves the performance of an existing equity/bond portfolio all of the time. The main findings are that the inclusion of direct property in an existing equity/bond portfolio leads to increases or decreases in return, depending on the relative performance of property compared with the other asset classes. However, including property in the mixed-asset portfolio always leads to reductions in portfolio risk. Consequently, adding property to an equity/bond portfolio can lead to significant increases in risk-adjusted performance. Thus, if the decision to include direct property in the mixed-asset portfolio is based upon its diversification benefits, the answer is yes, there is a “case for property” all the time!
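As an illustrative aside, a stylised sketch of a rolling evaluation comparing an equity/bond portfolio with and without a property allocation; the weights, window length, and synthetic return series are all invented for demonstration.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
n = 240  # 20 years of synthetic monthly returns
r = pd.DataFrame({
    "equity":   rng.normal(0.007, 0.045, n),
    "bond":     rng.normal(0.004, 0.015, n),
    "property": rng.normal(0.006, 0.025, n),
})
base = 0.6 * r["equity"] + 0.4 * r["bond"]
mixed = 0.5 * r["equity"] + 0.35 * r["bond"] + 0.15 * r["property"]

window = 60  # 5-year rolling evaluation
for name, port in [("equity/bond", base), ("with property", mixed)]:
    # Annualised rolling Sharpe ratio (risk-free rate omitted for brevity)
    sharpe = (port.rolling(window).mean() / port.rolling(window).std()
              * np.sqrt(12))
    print(name, "median rolling Sharpe:", round(sharpe.median(), 2))
```

Repeating the comparison window by window, rather than once over the full sample, is what lets the test ask whether the benefit holds all of the time.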
Abstract:
This paper investigates the degree of return volatility persistence and the time-varying behaviour of systematic risk (beta) for 31 market segments in the UK real estate market. The findings suggest that different property types exhibit differences in volatility persistence and time variability. There is also evidence that the volatility persistence of each market segment and its systematic risk are significantly positively related. Thus, the systematic risks of different property types tend to move in different directions during periods of increased market volatility. Finally, the market segments with systematic risks less than one tend to show negative time variability, while market segments with systematic risk greater than one generally show positive time variability, indicating a positive relationship between the volatility of the market and the systematic risk of individual market segments. Consequently, safer and riskier market segments are affected differently by increases in market volatility.
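As an illustrative aside, a minimal sketch of the two quantities discussed: volatility persistence from a GARCH(1,1) fit (using the `arch` package) and a time-varying beta from rolling moments, on synthetic data. This is an assumed analogue of the paper's method, not its implementation.

```python
import numpy as np
import pandas as pd
from arch import arch_model

rng = np.random.default_rng(3)
market = pd.Series(rng.normal(0.005, 0.02, 300))
segment = 1.2 * market + pd.Series(rng.normal(0.0, 0.01, 300))

# Volatility persistence: alpha + beta from a GARCH(1,1) fit
# (values near 1 mean shocks to volatility die out slowly).
res = arch_model(100 * segment, vol="GARCH", p=1, q=1).fit(disp="off")
persistence = res.params["alpha[1]"] + res.params["beta[1]"]

# Time-varying systematic risk: rolling covariance with the market
# divided by rolling market variance (36-period window, invented).
rolling_beta = segment.rolling(36).cov(market) / market.rolling(36).var()

print("volatility persistence:", round(persistence, 3))
print("rolling beta, last value:", round(rolling_beta.iloc[-1], 3))
```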
Abstract:
In estimating the inputs to the Modern Portfolio Theory (MPT) portfolio optimisation problem, it is usual to use equally weighted historic data. Equal weighting of the data, however, does not take account of the current state of the market, so this approach is unlikely to perform well in any subsequent period, as the data still reflect market conditions that are no longer valid. A return-weighting scheme that gives greater weight to the most recent data would therefore seem desirable. This study accordingly uses returns data weighted to give greater weight to the most recent observations, to see whether such a weighting scheme can offer improved ex-ante performance over that based on unweighted data.
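As an illustrative aside, a minimal sketch of one such scheme: exponential decay toward older observations, producing the weighted mean vector and covariance matrix that would feed the optimiser. The half-life and data are invented assumptions, not the study's calibration.

```python
import numpy as np

def exp_weights(n_obs, halflife):
    """Weights halving every `halflife` periods; newest row weighted most."""
    age = np.arange(n_obs)[::-1]          # 0 = newest observation
    w = 0.5 ** (age / halflife)
    return w / w.sum()                    # normalise to sum to 1

def weighted_inputs(returns, halflife):
    """Weighted mean vector and covariance matrix for the MPT optimiser.

    returns: (T, N) array of asset returns, oldest row first.
    """
    w = exp_weights(len(returns), halflife)
    mu = w @ returns
    dev = returns - mu
    cov = (dev * w[:, None]).T @ dev      # no small-sample correction
    return mu, cov

# Illustrative use on random monthly returns for three assets.
rng = np.random.default_rng(5)
R = rng.normal(0.005, 0.03, size=(120, 3))
mu, cov = weighted_inputs(R, halflife=24)
print(mu.round(4), np.diag(cov).round(5))
```

Setting the half-life very large recovers the equally weighted estimates, so the scheme nests the conventional approach as a limiting case.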
Abstract:
User interaction within a virtual environment may take various forms: a teleconferencing application will require users to speak to each other (Geak, 1993); in computer-supported co-operative working, an engineer may wish to pass an object to another user for examination; in a battlefield simulation (McDonough, 1992), users might exchange fire. In all cases it is necessary for the actions of one user to be presented to the others sufficiently quickly to allow realistic interaction. In this paper we take a fresh look at virtual reality operating systems by tackling the underlying issues of creating real-time multi-user environments.
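As a purely hypothetical illustration of the latency constraint, the sketch below broadcasts a user's state as small UDP datagrams, the sort of low-overhead transport a real-time multi-user environment might favour over reliable streams; the addresses and message layout are invented.

```python
import json
import socket
import time

# Hypothetical peer addresses for a three-user session.
PEERS = [("10.0.0.2", 9999), ("10.0.0.3", 9999)]

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def broadcast_state(user_id, position, orientation):
    """Send an unreliable, low-latency state update to every peer.

    A stale packet is simply superseded by a newer one, which suits
    fast-changing avatar state better than a reliable stream would.
    """
    msg = json.dumps({
        "user": user_id,
        "pos": position,
        "orient": orientation,
        "t": time.time(),   # lets receivers discard out-of-order packets
    }).encode()
    for addr in PEERS:
        sock.sendto(msg, addr)

broadcast_state("engineer-1", [1.0, 0.5, 2.0], [0.0, 90.0, 0.0])
```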
Abstract:
Dorsolateral prefrontal cortex (DLPFC) is recruited during visual working memory (WM) when relevant information must be maintained in the presence of distracting information. The mechanism by which DLPFC might ensure successful maintenance of the contents of WM is, however, unclear; it might enhance neural maintenance of memory targets or suppress processing of distracters. To adjudicate between these possibilities, we applied time-locked transcranial magnetic stimulation (TMS) during functional MRI, an approach that permits causal assessment of a stimulated brain region's influence on connected brain regions, and evaluated how this influence may change under different task conditions. Participants performed a visual WM task requiring retention of visual stimuli (faces or houses) across a delay during which visual distracters could be present or absent. When distracters were present, they were always from the opposite stimulus category, so that targets and distracters were represented in distinct posterior cortical areas. We then measured whether DLPFC-TMS, administered in the delay at the time point when distracters could appear, would modulate posterior regions representing memory targets or distracters. We found that DLPFC-TMS influenced posterior areas only when distracters were present and, critically, that this influence consisted of increased activity in regions representing the current memory targets. DLPFC-TMS did not affect regions representing current distracters. These results provide a new line of causal evidence for a top-down DLPFC-based control mechanism that promotes successful maintenance of relevant information in WM in the presence of distraction.
Abstract:
Left inferior frontal gyrus (IFG) is a critical neural substrate for the resolution of proactive interference (PI) in working memory. We hypothesized that left IFG achieves this by controlling the influence of familiarity- versus recollection-based information about memory probes. Consistent with this idea, we observed evidence for an early (200 msec)-peaking signal corresponding to memory probe familiarity and a late (500 msec)-resolving signal corresponding to full accrual of trial-related contextual ("recollection-based") information. Next, we applied brief trains of repetitive transcranial magnetic stimulation (rTMS) time locked to these mnemonic signals, to left IFG and to a control region. Only early rTMS of left IFG produced a modulation of the false alarm rate for high-PI probes. Additionally, the magnitude of this effect was predicted by individual differences in susceptibility to PI. These results suggest that left IFG-based control may bias the influence of familiarity- and recollection-based signals on recognition decisions.
Abstract:
This article considers cinematic time in James Benning’s film, casting a glance (2007), in relation to its subject, Robert Smithson’s 1970 earthwork Spiral Jetty, and his film of the same name. The radicalism of Smithson’s thinking on time has been widely acknowledged, and his influence continues to pervade contemporary artistic practice. The relationship of Benning’s films with this legacy may appear somewhat oblique, given their apparent phenomenological rendition of ‘real time’. However, closer examination of Benning’s formal strategies reveals a more complex temporal construction, characterized by uncertain intervals that interrupt the folding of cinematic time into the flow of consciousness. Smithson’s film uses cinematic analogy to gesture towards vast reaches of geological time; Benning’s film creates a simulated timescale to evoke the short history of the earthwork itself. Smithson’s embrace of the entropic was a counter-cultural stance at the end of the 1960s, but under the shadow of ecological disaster, this orientation has come to appear melancholy and romantic rather than radical. Benning’s film returns the jetty to anthropic time, but raises questions about the ways we inhabit time. His practice of working with ‘borrowed time’ is particularly suited to the cultural and historical moment of his later work.
Abstract:
Emerging evidence suggests that items held in working memory (WM) might not all be in the same representational state. One item might be privileged over others, making it more accessible and thereby recalled with greater precision. Here, using transcranial magnetic stimulation (TMS), we provide causal evidence in human participants that items in WM are differentially susceptible to disruptive TMS, depending on their state, determined either by task relevance or serial position. Across two experiments, we applied TMS to area MT+ during the WM retention of two motion directions. In Experiment 1, we used an “incidental cue” to bring one of the two targets into a privileged state. In Experiment 2, we presented the targets sequentially so that the last item was in a privileged state by virtue of recency. In both experiments, recall precision of motion direction was differentially affected by TMS, depending on the state of the memory target at the time of disruption. Privileged items were recalled with less precision, whereas nonprivileged items were recalled with higher precision. Thus, only the privileged item was susceptible to disruptive TMS over MT+. By contrast, precision of the nonprivileged item improved, either directly because of facilitation by TMS or indirectly through reduced interference from the privileged item. Our results provide a unique line of evidence, as revealed by TMS over a posterior sensory brain region, for at least two different states of item representation in WM.