131 results for Time Use


Relevance: 30.00%

Abstract:

The use of Bayesian inference in the inference of time-frequency representations has, thus far, been limited to offline analysis of signals, using a smoothing spline based model of the time-frequency plane. In this paper we introduce a new framework that allows the routine use of Bayesian inference for online estimation of the time-varying spectral density of a locally stationary Gaussian process. The core of our approach is the use of a likelihood inspired by a local Whittle approximation. This choice, along with the use of a recursive algorithm for non-parametric estimation of the local spectral density, permits the use of a particle filter for estimating the time-varying spectral density online. We provide demonstrations of the algorithm through tracking chirps and the analysis of musical data.
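The recursive non-parametric estimator the abstract describes can be illustrated, very loosely, with a much simpler stand-in: an exponentially weighted running periodogram, updated segment by segment as data arrive. This is not the paper's particle-filter algorithm; the segment length and smoothing weight below are arbitrary illustrative choices.

```python
import math, cmath, random

def periodogram(segment):
    """Raw periodogram of a short data segment via a direct DFT."""
    n = len(segment)
    spec = []
    for k in range(n // 2 + 1):
        z = sum(x * cmath.exp(-2j * math.pi * k * t / n)
                for t, x in enumerate(segment))
        spec.append(abs(z) ** 2 / n)
    return spec

def online_spectrum(stream, seg_len=32, alpha=0.2):
    """Recursive online update of a local spectral estimate:
    S_new = (1 - alpha) * S_old + alpha * periodogram(latest segment)."""
    estimate = None
    for start in range(0, len(stream) - seg_len + 1, seg_len):
        p = periodogram(stream[start:start + seg_len])
        if estimate is None:
            estimate = p
        else:
            estimate = [(1 - alpha) * s + alpha * q
                        for s, q in zip(estimate, p)]
    return estimate

random.seed(1)
# Sinusoid at 4 cycles per 32-sample segment, plus weak noise.
signal = [math.sin(2 * math.pi * 4 * t / 32) + 0.1 * random.gauss(0, 1)
          for t in range(320)]
spec = online_spectrum(signal)
peak_bin = max(range(len(spec)), key=spec.__getitem__)
print(peak_bin)  # expect the spectral peak near frequency bin 4
```

The recursion needs only the latest segment and the previous estimate, which is what makes online operation possible; the paper replaces this crude smoother with a local-Whittle likelihood and a particle filter.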

Relevance: 30.00%

Abstract:

Geomagnetic activity has long been known to exhibit approximately 27 day periodicity, resulting from solar wind structures repeating each solar rotation. Thus a very simple near-Earth solar wind forecast is 27 day persistence, wherein the near-Earth solar wind conditions today are assumed to be identical to those 27 days previously. Effective use of such a persistence model as a forecast tool, however, requires the performance and uncertainty to be fully characterized. The first half of this study determines which solar wind parameters can be reliably forecast by persistence and how the forecast skill varies with the solar cycle. The second half of the study shows how persistence can provide a useful benchmark for more sophisticated forecast schemes, namely physics-based numerical models. Point-by-point assessment methods, such as correlation and mean-square error, find persistence skill comparable to numerical models during solar minimum, despite the 27 day lead time of persistence forecasts, versus 2–5 days for numerical schemes. At solar maximum, however, the dynamic nature of the corona means 27 day persistence is no longer a good approximation and skill scores suggest persistence is out-performed by numerical models for almost all solar wind parameters. But point-by-point assessment techniques are not always a reliable indicator of usefulness as a forecast tool. An event-based assessment method, which focuses on key solar wind structures, finds persistence to be the most valuable forecast throughout the solar cycle. This reiterates the fact that the means of assessing the “best” forecast model must be specifically tailored to its intended use.
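A 27 day persistence forecast is simple enough to sketch directly. The example below uses a synthetic series with a built-in 27 day recurrence (not real solar wind data) and scores persistence with an MSE-based skill score against a climatological-mean forecast; the paper's assessment metrics are more elaborate.

```python
import random

def persistence_forecast(series, lag=27):
    """27 day persistence: forecast today's value as the value lag days ago."""
    return series[:-lag]

def mse(pred, obs):
    """Mean-square error between paired forecasts and observations."""
    return sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(pred)

random.seed(0)
# Toy solar wind speed series with a strong 27 day recurrence plus noise.
base = [400 + 50 * ((d % 27) / 27) for d in range(270)]
obs = [b + random.gauss(0, 5) for b in base]

pred = persistence_forecast(obs, lag=27)
target = obs[27:]                       # align: pred[i] forecasts obs[i + 27]
clim = [sum(target) / len(target)] * len(target)
skill = 1 - mse(pred, target) / mse(clim, target)
print(round(skill, 2))  # near 1 when the 27 day recurrence dominates the noise
```

When the recurrence breaks down (as at solar maximum), the persistence MSE grows toward the climatological MSE and the skill score drops toward zero, which is the behaviour the study quantifies.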

Relevance: 30.00%

Abstract:

Purpose – Commercial real estate is a highly specific asset: heterogeneous, indivisible and with less information transparency than most other commonly held investment assets. These attributes encourage the use of intermediaries during asset acquisition and disposal. However, there are few attempts to explain the use of different brokerage models (with differing costs) in different markets. This study aims to address this gap.

Design/methodology/approach – The study analyses 9,338 real estate transactions in London and New York City from 2001 to 2011. Data are provided by Real Capital Analytics and cover over $450 billion of investments in this period. Brokerage trends in the two cities are compared and probit regressions are used to test whether the decision to transact with broker representation varies with investor or asset characteristics.

Findings – Results indicate greater use of brokerage in London, especially by purchasers. This persists when data are disaggregated by sector, time or investor type, pointing to the role of local market culture and institutions in shaping brokerage models and transaction costs. Within each city, the nature of the investors involved seems to be a more significant influence on broker use than the characteristics of the assets being traded.

Originality/value – Brokerage costs are the single largest non-tax charge to an investor when trading commercial real estate, yet there is little research in this area. This study examines the role of brokers and provides empirical evidence on factors that influence the use and mode of brokerage in two major investment destinations.

Relevance: 30.00%

Abstract:

Modern age samples from various depositional environments were examined for signal resetting. For 19 modern aeolian/beach samples all De values obtained were […], with ∼70% having […]. For 21 fluvial/colluvial samples, all De values were […] with ∼80% being […]. De as a function of illumination (OSL measurement) time (De(t)) plots were examined for all samples. Based on previous laboratory experiments, increases in De(t) were expected for partially reset samples, and constant De(t) for fully reset samples. All aeolian samples, both modern age and additional ‘young’ samples (<1000 years), showed constant (flat) De(t), while all modern, non-zero De, fluvial/colluvial samples showed increasing De(t). ‘Replacement plots’, where a regenerated signal is substituted for the natural, yielded constant (flat) De(t). These findings strongly support the use of De(t) as a method of identifying incomplete resetting in fluvial samples. Potential complicating factors, such as illumination (bleaching) spectrum, thermal instability and component composition, are discussed, and a series of internal checks on the applicability of De(t) at the individual aliquot/grain level are outlined.
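The flat-versus-rising De(t) diagnostic can be sketched as a simple slope test: fit a least-squares slope to the De(t) series and flag the aliquot when the slope is clearly positive. The tolerance and the numbers below are illustrative, not the paper's protocol or data.

```python
def slope(xs, ys):
    """Ordinary least-squares slope of y on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

def flag_partial_reset(times, de_values, tol=0.01):
    """Flag an aliquot when De rises with illumination time (slope > tol),
    the behaviour expected for incompletely bleached fluvial samples."""
    return slope(times, de_values) > tol

t = [10, 20, 30, 40, 50]                  # cumulative OSL illumination time (s)
aeolian = [1.0, 1.02, 0.99, 1.01, 1.0]    # flat De(t): fully reset
fluvial = [1.0, 1.4, 1.9, 2.3, 2.8]       # rising De(t): partial reset
print(flag_partial_reset(t, aeolian), flag_partial_reset(t, fluvial))
```

In practice the decision would also weigh the De uncertainties at each illumination time rather than a bare slope threshold, which is part of what the internal checks in the paper address.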

Relevance: 30.00%

Abstract:

The anthropogenic heat emissions generated by human activities in London are analysed in detail for 2005–2008 and considered in the context of long-term past and future trends (1970–2025). Emissions from buildings, road traffic and human metabolism are finely resolved in space (200 × 200 m²) and time (30 min). Software to compute and visualize the results is provided. The annual mean anthropogenic heat flux for Greater London is 10.9 W m⁻² for 2005–2008, with the highest peaks in the central activities zone (CAZ) associated with extensive service industry activities. Towards the outskirts of the city, emissions from the domestic sector and road traffic dominate. Anthropogenic heat is mostly emitted as sensible heat, with a latent heat fraction of 7.3% and a heat-to-wastewater fraction of 12%; the implications related to the use of evaporative cooling towers are briefly addressed. Projections indicate a further increase of heat emissions within the CAZ in the next two decades, related to further intensification of activities within this area.

Relevance: 30.00%

Abstract:

The UK has adopted legally binding carbon reduction targets of 34% by 2020 and 80% by 2050 (measured against the 1990 baseline). Buildings are estimated to be responsible for more than 50% of greenhouse gas (GHG) emissions in the UK. These consist of both operational carbon (Oc), produced during use, and embodied carbon (Ec), produced during the manufacture of materials and components, and during construction, refurbishments and demolition. A brief assessment suggests that it is unlikely that UK emission reduction targets can be met without substantial reductions in both Oc and Ec. Oc occurs over the lifetime of a building, whereas the bulk of Ec occurs at the start of a building’s life. A time value for emissions could influence the decision-making process when it comes to comparing mitigation measures whose benefits occur at different times. An example might be the choice between construction using low-Ec materials versus construction using high-Ec materials but with lower Oc, although the use of high-Ec materials does not necessarily imply a lower Oc. Particular time-related issues examined here are: the urgency of the need to achieve large emissions reductions during the next 10 to 20 years; the fact that the earlier effective action is taken, the less costly it will be; future reductions in the carbon intensity of energy supply; and the carbon cycle and the relationship between the release of GHGs and their subsequent concentrations in the atmosphere. An equation is proposed which weights emissions according to when they occur during the building life cycle, and which effectively increases Ec as a proportion of the total, suggesting that reducing Ec is likely to be more beneficial, in terms of climate change, for most new buildings. Thus, giving higher priority to Ec reductions is likely to result in a bigger positive impact on climate change and mitigation costs.
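The abstract does not reproduce the proposed weighting equation, but its effect can be illustrated with a generic stand-in: weight emissions in year t by (1 + r)⁻ᵗ so that early emissions count for more. The rate r, the 60-year lifetime and the Ec/Oc figures below are all hypothetical illustration values, not numbers from the paper.

```python
def weighted_emissions(annual_emissions, r=0.03):
    """Illustrative time weighting: emissions in year t are weighted by
    (1 + r)**-t relative to year 0, so early (embodied) emissions count
    for more than the same mass emitted later in the building's life.
    The rate r is a hypothetical parameter, not a value from the paper."""
    return sum(e * (1 + r) ** -t for t, e in enumerate(annual_emissions))

lifetime = 60  # years (illustrative)
Ec = 300.0     # embodied carbon, released in year 0 (tCO2e, illustrative)
Oc = 10.0      # operational carbon per year (tCO2e, illustrative)

profile = [Ec + Oc] + [Oc] * (lifetime - 1)
raw_total = sum(profile)
weighted_total = weighted_emissions(profile)
share_raw = Ec / raw_total            # Ec share with no time weighting
share_weighted = Ec / weighted_total  # Ec share once emissions are weighted
print(round(share_raw, 2), round(share_weighted, 2))
```

Because Ec falls entirely in year 0 where the weight is largest, any weighting of this shape raises Ec as a proportion of the lifetime total, which is the qualitative conclusion the abstract draws.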

Relevance: 30.00%

Abstract:

The use of pulse compression techniques to improve the sensitivity of meteorological radars has become increasingly common in recent years. An unavoidable side-effect of such techniques is the formation of ‘range sidelobes’ which lead to spreading of information across several range gates. These artefacts are particularly troublesome in regions where there is a sharp gradient in the power backscattered to the antenna as a function of range. In this article we present a simple method for identifying and correcting range sidelobe artefacts. We make use of the fact that meteorological targets produce an echo which fluctuates at random, and that this echo, like a fingerprint, is unique to each range gate. By cross-correlating the echo time series from pairs of gates, therefore, we can identify whether information from one gate has spread into another, and hence flag regions of contamination. In addition we show that the correlation coefficients contain quantitative information about the fraction of power leaked from one range gate to another, and we propose a simple algorithm to correct the corrupted reflectivity profile.
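The fingerprint idea can be demonstrated with synthetic echo series: two independent gates are uncorrelated, while a gate contaminated by sidelobe leakage from a strongly reflecting gate shows a clear cross-correlation. The mixing coefficients below are arbitrary illustration values, not the article's data.

```python
import math, random

def corrcoef(a, b):
    """Pearson correlation between two echo time series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

random.seed(2)
n = 2000
strong = [random.gauss(0, 1) for _ in range(n)]  # echo in a strongly reflecting gate
clean = [random.gauss(0, 1) for _ in range(n)]   # independent echo: no leakage
# A contaminated gate: weak local echo plus power leaked from the strong gate.
contaminated = [0.3 * random.gauss(0, 1) + 0.5 * s for s in strong]

r_clean = corrcoef(strong, clean)
r_leaky = corrcoef(strong, contaminated)
print(abs(r_clean) < 0.1, r_leaky > 0.5)
```

The squared correlation coefficient approximates the fraction of the contaminated gate's power that was leaked from the strong gate, which is the quantitative handle the article uses to correct the reflectivity profile.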

Relevance: 30.00%

Abstract:

The calculation of interval forecasts for highly persistent autoregressive (AR) time series based on the bootstrap is considered. Three methods are considered for countering the small-sample bias of least-squares estimation for processes which have roots close to the unit circle: a bootstrap bias-corrected OLS estimator; the use of the Roy–Fuller estimator in place of OLS; and the use of the Andrews–Chen estimator in place of OLS. All three methods of bias correction yield superior results to the bootstrap in the absence of bias correction. Of the three correction methods, the bootstrap prediction intervals based on the Roy–Fuller estimator are generally superior to the other two. The small-sample performance of bootstrap prediction intervals based on the Roy–Fuller estimator is investigated when the order of the AR model is unknown, and has to be determined using an information criterion.
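The first of the three methods, bootstrap bias correction of OLS, can be sketched for a zero-mean AR(1): estimate phi by OLS, simulate bootstrap series from the fitted model, and shift the estimate by the bias the bootstrap reveals. This is a minimal sketch of the general idea (no intercept, Gaussian errors, AR(1) only), not the paper's full procedure for prediction intervals.

```python
import random

def ols_ar1(x):
    """OLS estimate of phi in x_t = phi * x_{t-1} + e_t (no intercept)."""
    num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
    den = sum(x[t - 1] ** 2 for t in range(1, len(x)))
    return num / den

def simulate_ar1(phi, n, rng):
    """Simulate an AR(1) series with unit-variance Gaussian innovations."""
    x = [rng.gauss(0, 1)]
    for _ in range(n - 1):
        x.append(phi * x[-1] + rng.gauss(0, 1))
    return x

def bias_corrected_ar1(x, B=100, rng=None):
    """Bootstrap bias correction: phi_bc = 2*phi_hat - mean of OLS
    estimates from B series simulated under phi_hat."""
    rng = rng or random.Random(0)
    phi_hat = ols_ar1(x)
    boot = [ols_ar1(simulate_ar1(phi_hat, len(x), rng)) for _ in range(B)]
    return 2 * phi_hat - sum(boot) / B

rng = random.Random(42)
phi_true = 0.95  # near-unit-root process, where OLS bias is most severe
raw, corrected = [], []
for _ in range(50):
    x = simulate_ar1(phi_true, 60, rng)
    raw.append(ols_ar1(x))
    corrected.append(bias_corrected_ar1(x, B=100, rng=rng))
bias_raw = sum(raw) / len(raw) - phi_true
bias_bc = sum(corrected) / len(corrected) - phi_true
print(round(bias_raw, 3), round(bias_bc, 3))
```

OLS underestimates phi in short samples; the corrected estimator recovers much of that shortfall, and the paper builds its bootstrap prediction intervals on top of such corrected estimates (preferring Roy–Fuller to this bootstrap variant).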

Relevance: 30.00%

Abstract:

This note describes a simple procedure for removing unphysical temporal discontinuities in ERA-Interim upper stratospheric global mean temperatures in March 1985 and August 1998 that have arisen due to changes in satellite radiance data used in the assimilation. The derived temperature adjustments (offsets) are suitable for use in stratosphere-resolving chemistry-climate models that are nudged (relaxed) to ERA-Interim winds and temperatures. Simulations using a nudged version of the Canadian Middle Atmosphere Model (CMAM) show that the inclusion of the temperature adjustments produces temperature time series that are devoid of the large jumps in 1985 and 1998. Due to its strong temperature dependence, the simulated upper stratospheric ozone is also shown to vary smoothly in time, unlike in a nudged simulation without the adjustments where abrupt changes in ozone occur at the times of the temperature jumps. While the adjustments to the ERA-Interim temperatures remove significant artefacts in the nudged CMAM simulation, spurious transient effects that arise due to water vapour and persist for about 5 yr after the 1979 switch to ERA-Interim data are identified, underlining the need for caution when analysing trends in runs nudged to reanalyses.
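The adjustment procedure amounts to subtracting a constant offset from all values at or after each discontinuity date, with offsets accumulating once several adjustment dates have passed. The sketch below uses hypothetical dates and values, not the offsets derived for ERA-Interim.

```python
from datetime import date

def apply_offsets(times, temps, adjustments):
    """Subtract a constant offset from every value at or after each
    adjustment date, removing step discontinuities in the series.
    Offsets accumulate if several adjustment dates have passed."""
    adjusted = []
    for t, v in zip(times, temps):
        shift = sum(off for d, off in adjustments if t >= d)
        adjusted.append(v - shift)
    return adjusted

# Hypothetical monthly series with an artificial jump (not ERA-Interim values).
times = [date(1985, m, 1) for m in range(1, 7)]
temps = [250.0, 250.1, 251.6, 251.5, 251.7, 251.6]  # +1.5 K jump in March
adjustments = [(date(1985, 3, 1), 1.5)]

fixed = apply_offsets(times, temps, adjustments)
print([round(v, 1) for v in fixed])
```

In the nudged CMAM simulations the same idea is applied with two offsets (March 1985 and August 1998), so that the nudging temperatures, and hence the temperature-dependent ozone, vary smoothly across the satellite transitions.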

Relevance: 30.00%

Abstract:

A wide range of applications would benefit from dense networks of air temperature observations, but cost, existing siting guidelines and the risk of damage to sensors limit deployment; new methods are therefore required to gain a high-resolution understanding of the spatio-temporal patterns of urban meteorological phenomena such as the urban heat island, or to meet precision-farming needs. With the launch of a new generation of low-cost sensors it is possible to deploy a network to monitor air temperature at finer spatial resolutions. Here we investigate the Aginova Sentinel Micro (ASM) sensor with a bespoke radiation shield (together < US$150), which can provide secure near-real-time air temperature data to a server utilising existing (or user-deployed) Wireless Fidelity (Wi-Fi) networks. This makes it ideally suited for deployment where wireless communications readily exist, notably urban areas. Assessment of the performance of the ASM relative to traceable standards in a water bath and an atmospheric chamber shows it to have good measurement accuracy, with mean errors < ±0.22 °C between −25 and 30 °C and a time constant in ambient air of 110 ± 15 s. Subsequent field tests of the sensor within the bespoke shield also showed excellent performance (root-mean-square error = 0.13 °C) over a range of meteorological conditions relative to a traceable operational UK Met Office platinum resistance thermometer. These results indicate that the ASM and bespoke shield are more than fit for purpose for dense network deployment in urban areas at relatively low cost compared with existing observation techniques.
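A sensor time constant like the 110 s quoted above is conventionally estimated from the response to a step change in temperature, assuming first-order behaviour T(t) = ambient + (initial − ambient)·exp(−t/τ). The sketch below fits τ by log-linear least squares to a synthetic step response; the numbers are illustrative, not the paper's measurements.

```python
import math

def time_constant(times, response, ambient, initial):
    """Estimate a first-order time constant tau from a step change:
    T(t) = ambient + (initial - ambient) * exp(-t / tau).  A log-linear
    least-squares fit of the normalised response gives -1/tau as slope."""
    xs, ys = [], []
    for t, v in zip(times, response):
        frac = (v - ambient) / (initial - ambient)
        if frac > 1e-6:                 # keep only the decaying part
            xs.append(t)
            ys.append(math.log(frac))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -1 / slope

# Synthetic step response of a sensor with tau = 110 s (illustrative values).
tau_true, t_init, t_amb = 110.0, 25.0, 15.0   # tau (s), initial / ambient (degC)
times = list(range(0, 400, 20))
resp = [t_amb + (t_init - t_amb) * math.exp(-t / tau_true) for t in times]
print(round(time_constant(times, resp, t_amb, t_init), 1))
```

With real data the fit would be restricted to the early, well-resolved part of the decay and repeated over several step changes, which is presumably how a quoted uncertainty such as ±15 s arises.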

Relevance: 30.00%

Abstract:

Runoff generation processes and pathways vary widely between catchments. Credible simulations of solute and pollutant transport in surface waters are dependent on models which facilitate appropriate, catchment-specific representations of perceptual models of the runoff generation process. Here, we present a flexible, semi-distributed landscape-scale rainfall-runoff modelling toolkit suitable for simulating a broad range of user-specified perceptual models of runoff generation and stream flow occurring in different climatic regions and landscape types. PERSiST (the Precipitation, Evapotranspiration and Runoff Simulator for Solute Transport) is designed for simulating present-day hydrology; projecting possible future effects of climate or land use change on runoff and catchment water storage; and generating hydrologic inputs for the Integrated Catchments (INCA) family of models. PERSiST has limited data requirements and is calibrated using observed time series of precipitation, air temperature and runoff at one or more points in a river network. Here, we apply PERSiST to the river Thames in the UK and describe a Monte Carlo tool for model calibration, sensitivity and uncertainty analysis.
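Monte Carlo calibration of a rainfall-runoff model can be sketched in a few lines: sample parameter sets at random, run the model, and keep the set that best reproduces observed runoff under a goodness-of-fit score such as the Nash–Sutcliffe efficiency. The one-parameter linear store below is a toy model invented for illustration; it is not PERSiST, and the calibration tool in the paper is considerably richer.

```python
import random

def nash_sutcliffe(sim, obs):
    """Nash-Sutcliffe efficiency: 1 minus the ratio of model error variance
    to observed variance (1 = perfect, 0 = no better than the obs mean)."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((s - o) ** 2 for s, o in zip(sim, obs))
    sst = sum((o - mean_obs) ** 2 for o in obs)
    return 1 - sse / sst

def toy_runoff(precip, k):
    """One-parameter linear store: runoff each step is a fraction k of storage."""
    store, q = 0.0, []
    for p in precip:
        store += p
        q.append(k * store)
        store -= k * store
    return q

rng = random.Random(7)
precip = [rng.uniform(0, 10) for _ in range(200)]
obs = toy_runoff(precip, 0.3)   # synthetic "observations" generated with k = 0.3

# Monte Carlo calibration: sample k at random, keep the best-scoring value.
best_k, best_nse = None, -float("inf")
for _ in range(500):
    k = rng.uniform(0.01, 0.99)
    score = nash_sutcliffe(toy_runoff(precip, k), obs)
    if score > best_nse:
        best_k, best_nse = k, score
print(round(best_k, 2), round(best_nse, 3))
```

Retaining all sampled parameter sets with their scores, rather than only the best one, is what turns the same loop into a sensitivity and uncertainty analysis.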

Relevance: 30.00%

Abstract:

Decadal climate predictions exhibit large biases, which are often subtracted and forgotten. However, understanding the causes of bias is essential to guide efforts to improve prediction systems, and may offer additional benefits. Here the origins of biases in decadal predictions are investigated, including whether analysis of these biases might provide useful information. The focus is especially on the lead-time-dependent bias tendency. A “toy” model of a prediction system is initially developed and used to show that there are several distinct contributions to bias tendency. Contributions from sampling of internal variability and a start-time-dependent forcing bias can be estimated and removed to obtain a much improved estimate of the true bias tendency, which can provide information about errors in the underlying model and/or errors in the specification of forcings. It is argued that the true bias tendency, not the total bias tendency, should be used to adjust decadal forecasts. The methods developed are applied to decadal hindcasts of global mean temperature made using the Hadley Centre Coupled Model, version 3 (HadCM3), climate model, and it is found that this model exhibits a small positive bias tendency in the ensemble mean. When considering different model versions, it is shown that the true bias tendency is very highly correlated with both the transient climate response (TCR) and non–greenhouse gas forcing trends, and can therefore be used to obtain observationally constrained estimates of these relevant physical quantities.
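The lead-time-dependent bias at the centre of this study is, at its simplest, the mean forecast-minus-observation difference at each lead, averaged over many start dates; its change with lead is the bias tendency. The toy example below builds hindcasts with a known linear drift and recovers it; it is a bare illustration of the diagnostic, not the paper's toy prediction-system model.

```python
import random

def bias_by_lead(forecasts, observations):
    """Mean (forecast - observation) at each lead time, averaged over
    start dates; the change of this bias with lead is the bias tendency."""
    n_leads = len(forecasts[0])
    return [sum(f[l] - o[l] for f, o in zip(forecasts, observations))
            / len(forecasts) for l in range(n_leads)]

rng = random.Random(3)
n_starts, n_leads = 40, 10
drift = 0.05   # hypothetical model drift per lead unit (the true bias tendency)
observations = [[rng.gauss(0, 0.5) for _ in range(n_leads)]
                for _ in range(n_starts)]
# Forecasts: observations plus a drift growing with lead, plus noise.
forecasts = [[o + drift * l + rng.gauss(0, 0.2) for l, o in enumerate(row)]
             for row in observations]

b = bias_by_lead(forecasts, observations)
tendency = (b[-1] - b[0]) / (n_leads - 1)
print(round(tendency, 2))
```

The study's point is that this raw (total) tendency mixes the true model drift with sampling of internal variability and start-time-dependent forcing bias, and that only the true component should be used for forecast adjustment or for constraining quantities such as the TCR.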

Relevance: 30.00%

Abstract:

Taste and smell detection threshold measurements are frequently time-consuming, especially when the method involves reversing the concentrations presented in order to replicate and improve the accuracy of results. These multiple replications are likely to cause sensory and cognitive fatigue, which may be more pronounced in elderly populations. A new rapid detection threshold methodology was developed that quickly located the likely position of each individual's sensory detection threshold and then refined this by presenting multiple concentrations around this point. This study evaluates the reliability and validity of this method. Findings indicate that this new rapid methodology was appropriate for identifying differences in sensory detection thresholds between different populations and has the benefit of providing a shorter assessment of detection thresholds. The results indicated that the method is appropriate for determining individual as well as group detection thresholds.
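The coarse-then-refine idea can be sketched as a two-stage search over an ascending concentration ladder: a coarse pass locates the first detected level, then the levels just below it are tested to pin down the threshold. This is an illustrative sketch with a deterministic toy observer; the published method, and real detection behaviour, differ in detail.

```python
def detects(concentration, true_threshold):
    """Deterministic toy observer: detects at or above its threshold."""
    return concentration >= true_threshold

def rapid_threshold(levels, true_threshold):
    """Two-stage rapid estimate: a coarse ascending pass (every second
    level) locates the first detected level, then the levels just below
    it are tested to refine the estimate."""
    # Stage 1: coarse ascending series, every second level.
    coarse = None
    for lvl in levels[::2]:
        if detects(lvl, true_threshold):
            coarse = lvl
            break
    if coarse is None:
        return None            # nothing detected even at the top level
    # Stage 2: test the levels just below the coarse hit, lowest first.
    i = levels.index(coarse)
    for lvl in levels[max(0, i - 2):i + 1]:
        if detects(lvl, true_threshold):
            return lvl
    return coarse

levels = [1, 2, 4, 8, 16, 32, 64]   # ascending concentration steps
print(rapid_threshold(levels, true_threshold=7))
```

The saving over a full ascending-descending staircase is in the number of presentations: far fewer stimuli are tasted or smelled, which is exactly what reduces sensory and cognitive fatigue in the populations the study targets.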

Relevance: 30.00%

Abstract:

What is the relationship between magnitude judgments relying on directly available characteristics versus probabilistic cues? Question frame was manipulated in a comparative judgment task previously assumed to involve inference across a probabilistic mental model (e.g., “which city is largest” – the “larger” question – versus “which city is smallest” – the “smaller” question). Participants identified either the largest or smallest city (Experiments 1a, 2) or the richest or poorest person (Experiment 1b) in a three-alternative forced choice (3-AFC) task (Experiment 1) or 2-AFC task (Experiment 2). Response times revealed an interaction between question frame and the number of options recognized. When asked the smaller question, response times were shorter when none of the options were recognized. The opposite pattern was found when asked the larger question: response time was shorter when all options were recognized. These task-stimuli congruity results in judgment under uncertainty are consistent with, and predicted by, theories of magnitude comparison which make use of deductive inferences from declarative knowledge.

Relevance: 30.00%

Abstract:

Previous research has shown that listening to stories supports vocabulary growth in preschool and school-aged children and that lexical entries for even very difficult or rare words can be established if these are defined when they are first introduced. However, little is known about the nature of the lexical representations children form for the words they encounter while listening to stories, or whether these are sufficiently robust to support the child’s own use of such ‘high-level’ vocabulary. This study explored these questions by administering multiple assessments of children’s knowledge about a set of newly-acquired vocabulary. Four- and 6-year-old children were introduced to nine difficult new words (including nouns, verbs and adjectives) through three exposures to a story read by their class teacher. The story included a definition of each new word at its first encounter. Learning of the target vocabulary was assessed by means of two tests of semantic understanding – a forced choice picture-selection task and a definition production task – and a grammaticality judgment task, which asked children to choose between a syntactically-appropriate and syntactically-inappropriate usage of the word. Children in both age groups selected the correct pictorial representation and provided an appropriate definition for the target words in all three word classes significantly more often than they did for a matched set of non-exposed control words. However, only the older group was able to identify the syntactically-appropriate sentence frames in the grammaticality judgment task. Further analyses elucidate some of the components of the lexical representations children lay down when they hear difficult new vocabulary in stories and how different tests of word knowledge might overlap in their assessment of these components.