994 results for deterministic fractals


Relevance:

10.00%

Publisher:

Abstract:

Several methods are examined that produce forecasts for time series in the form of probability assignments. The necessary concepts are presented, addressing questions such as how to assess the performance of a probabilistic forecast. One class of models, cluster weighted models (CWMs), receives particular attention. CWMs, originally proposed for deterministic forecasts, can be employed for probabilistic forecasting with little modification. Two examples are presented. The first involves estimating the state of (numerically simulated) dynamical systems from noise-corrupted measurements, a problem also known as filtering. There is an optimal solution to this problem, called the optimal filter, to which the considered time series models are compared. (The optimal filter requires the dynamical equations to be known.) In the second example, we aim to forecast the chaotic oscillations of an experimental bronze spring system. Both examples demonstrate that the considered time series models, and especially the CWMs, provide useful probabilistic information about the underlying dynamical relations. In particular, they provide more than just an approximation to the conditional mean.
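For illustration, the sketch below evaluates the predictive density of a Gaussian cluster weighted model in one common form (clusters Gaussian in the input, linear-Gaussian in the output); it is not necessarily the authors' exact formulation, and all parameter names are illustrative.

```python
import numpy as np

def cwm_predictive_density(x, y_grid, weights, mu_x, var_x, coef, intercept, var_y):
    """Evaluate p(y | x) on y_grid for a 1-D Gaussian cluster weighted model.

    Each cluster k has a Gaussian in x (mu_x[k], var_x[k]) and a linear-Gaussian
    predictive term in y with mean coef[k]*x + intercept[k] and variance var_y[k];
    weights[k] are the cluster priors. All names here are illustrative.
    """
    # responsibility of each cluster for the conditioning value x
    px = weights * np.exp(-0.5 * (x - mu_x) ** 2 / var_x) / np.sqrt(2 * np.pi * var_x)
    resp = px / px.sum()
    # mixture of the per-cluster predictive Gaussians
    mean_y = coef * x + intercept
    dens = np.zeros_like(y_grid, dtype=float)
    for r, m, v in zip(resp, mean_y, var_y):
        dens += r * np.exp(-0.5 * (y_grid - m) ** 2 / v) / np.sqrt(2 * np.pi * v)
    return dens  # a full probabilistic forecast, not only a conditional mean

# example: two clusters, forecast density at x = 0.3
# p = cwm_predictive_density(0.3, np.linspace(-3, 3, 200),
#                            weights=np.array([0.5, 0.5]),
#                            mu_x=np.array([-1.0, 1.0]), var_x=np.array([0.5, 0.5]),
#                            coef=np.array([0.8, -0.6]), intercept=np.array([0.0, 0.2]),
#                            var_y=np.array([0.1, 0.3]))
```

The conditional mean would simply be resp @ mean_y; returning the whole density is what makes the forecast probabilistic.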

Relevance:

10.00%

Publisher:

Abstract:

Design summer years representing near-extreme hot summers have been used in the United Kingdom for the evaluation of thermal comfort and overheating risk. The years have been selected from measured weather data essentially representative of an assumed stationary climate. Recent developments have made available ‘morphed’ equivalents of these years, produced by shifting and stretching the measured variables using change factors from the UKCIP02 climate projections. The release of the latest, probabilistic, climate projections of UKCP09, together with the availability of a weather generator that can produce plausible daily or hourly sequences of weather variables, has opened up the opportunity to generate new design summer years for use in risk-based decision-making. There are many possible methods for producing design summer years from UKCP09 output: in this article, the original concept of the design summer year is largely retained, but a number of alternative methodologies for generating the years are explored. An alternative, more robust measure of warmth (weighted cooling degree hours) is also employed. It is demonstrated that the UKCP09 weather generator is capable of producing years for the baseline period that are comparable with those in current use. Four methodologies for the generation of future years are described, and their output is related to the future (deterministic) years that are currently available. It is concluded that, in general, years produced from the UKCP09 projections are warmer than those generated previously. Practical applications: The methodologies described in this article will enable designers who have access to the output of the UKCP09 weather generator (WG) to generate Design Summer Year hourly files tailored to their needs. The files produced will differ according to the methodology selected, in addition to location, emissions scenario and timeslice.
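The abstract names weighted cooling degree hours as the warmth measure but does not specify the weighting; the sketch below shows one illustrative version in which exceedances above an assumed base temperature are weighted by a power, so hotter hours count disproportionately.

```python
import numpy as np

def weighted_cooling_degree_hours(hourly_temp_c, base_c=22.0, power=2.0):
    """Illustrative weighted cooling degree hours for one summer of hourly data.

    Exceedances of the base temperature are raised to `power` so that hot hours
    count disproportionately; power=1 gives plain cooling degree hours.
    The base temperature and weighting are assumptions, not the paper's values.
    """
    exceed = np.maximum(np.asarray(hourly_temp_c, dtype=float) - base_c, 0.0)
    return float(np.sum(exceed ** power))

# candidate years (e.g. weather-generator output) could then be ranked by warmth:
# scores = {year: weighted_cooling_degree_hours(temps) for year, temps in years.items()}
```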

Relevance:

10.00%

Publisher:

Abstract:

The Chartered Institution of Building Services Engineers (CIBSE) produced a technical memorandum (TM36) presenting research on the impact of future climate on building energy use and thermal comfort. One climate projection for each of four CO2 emissions scenarios was used in TM36, providing a deterministic outlook. As part of the UK Climate Impacts Programme (UKCIP), probabilistic climate projections are being studied in relation to building energy simulation techniques. Including uncertainty in climate projections is considered an important advance in climate impacts modelling and is included in the latest UKCIP data (UKCP09). Incorporating the stochastic nature of these new climate projections in building energy modelling requires a significant increase in data handling and careful statistical interpretation of the results to provide meaningful conclusions. This paper compares the results from building energy simulations when applying deterministic and probabilistic climate data, based on two case study buildings: (i) a mixed-mode office building with exposed thermal mass and (ii) a mechanically ventilated, lightweight office building. Building (i) represents an energy-efficient design that provides passive and active measures to maintain thermal comfort. Building (ii) relies entirely on mechanical means for heating and cooling, and its lightweight construction raises concern over increased cooling loads in a warmer climate. Devising an effective probabilistic approach highlighted greater uncertainty in predicting building performance, depending on the type of building modelled and the performance factors under consideration. Results indicate that the range of calculated quantities depends not only on the building type but also, strongly, on the performance parameters of interest. Uncertainty is likely to be particularly marked with regard to thermal comfort in naturally ventilated buildings.
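A toy illustration of the deterministic-versus-probabilistic contrast described here: a single projection gives one number, whereas sampling a projected temperature distribution gives a spread of outcomes. The stand-in "building model" and the sampled distribution below are assumptions for illustration only, not the paper's simulations or the UKCP09 data.

```python
import numpy as np

rng = np.random.default_rng(0)

def annual_cooling_demand(mean_summer_temp_c):
    """Stand-in building model: cooling demand as a simple function of summer
    mean temperature (an illustrative proxy, not a dynamic thermal simulation)."""
    return 50.0 * max(mean_summer_temp_c - 19.0, 0.0)   # kWh/m2, illustrative

# deterministic outlook: one climate projection, one answer
deterministic = annual_cooling_demand(21.5)

# probabilistic outlook: propagate an assumed spread of projected temperatures
samples = 21.5 + rng.normal(loc=0.0, scale=0.8, size=1000)
demand = np.array([annual_cooling_demand(t) for t in samples])
print(deterministic, np.percentile(demand, [10, 50, 90]))  # a range, not a point
```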

Relevance:

10.00%

Publisher:

Abstract:

We investigate the error dynamics for cycled data assimilation systems, in which the inverse problem of state determination is solved at times t_k, k = 1, 2, 3, ..., with a first guess given by the state propagated via a dynamical system model from time t_(k−1) to time t_k. In particular, for nonlinear dynamical systems that are Lipschitz continuous with respect to their initial states, we provide deterministic estimates for the development of the error ||e_k|| := ||x_k^(a) − x_k^(t)|| between the estimated state x_k^(a) and the true state x_k^(t) over time. Clearly, an observation error of size δ > 0 leads to an estimation error in every assimilation step. These errors can accumulate if they are not (a) controlled in the reconstruction and (b) damped by the dynamical system under consideration. A data assimilation method is called stable if the error in the estimate is bounded in time by some constant C. The key task of this work is to provide estimates for the error ||e_k||, depending on the size δ of the observation error, the reconstruction operator R_α, the observation operator H, and the Lipschitz constants K^(1) and K^(2) on the lower and higher modes, which control the damping behaviour of the dynamics. We show that systems can be stabilized by choosing α sufficiently small, but the bound C will then depend on the data error δ in the form c||R_α||δ for some constant c. Since ||R_α|| → ∞ as α → 0, the bound might be large. Numerical examples of this behaviour in the nonlinear case are provided using a (low-dimensional) Lorenz '63 system.
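A toy numerical illustration, under assumed settings, of the kind of cycled scheme analysed here: a Lorenz '63 truth, noisy observations of the first two components, and a Tikhonov-type reconstruction R_α = (αI + HᵀH)⁻¹Hᵀ applied in every assimilation step. The observation operator, α, δ and window length are illustrative choices, not the paper's.

```python
import numpy as np

def lorenz63(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_step(state, dt):
    k1 = lorenz63(state)
    k2 = lorenz63(state + 0.5 * dt * k1)
    k3 = lorenz63(state + 0.5 * dt * k2)
    k4 = lorenz63(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def propagate(state, dt=0.01, steps=25):
    for _ in range(steps):
        state = rk4_step(state, dt)
    return state

rng = np.random.default_rng(1)
H = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])     # observe x and y only
alpha, delta = 0.1, 0.5                               # regularisation, obs error size
R_alpha = np.linalg.solve(alpha * np.eye(3) + H.T @ H, H.T)  # Tikhonov reconstruction

truth = np.array([1.0, 1.0, 25.0])
analysis = truth + np.array([2.0, -2.0, 2.0])         # poor initial guess
for k in range(200):
    truth = propagate(truth)
    background = propagate(analysis)                  # first guess from the model
    obs = H @ truth + delta * rng.standard_normal(2)
    analysis = background + R_alpha @ (obs - H @ background)
    # the error ||e_k|| stays bounded when alpha is chosen suitably small,
    # with the bound scaling roughly like ||R_alpha|| * delta
print(np.linalg.norm(analysis - truth))
```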

Relevance:

10.00%

Publisher:

Abstract:

Let θ denote the level of quality inherent in a food product that is delivered to some terminal market. In this paper, I characterize allocations over θ and provide an economic rationale for regulating safety and quality standards in the food system. Zusman and Bockstael investigate the theoretical foundations for imposing standards and stress the importance of providing a tractable conceptual foundation. Despite a wealth of contributions that are mainly empirical (for reviews of these works see, respectively, Caswell and Antle), there have been relatively few attempts to model formally the linkages between farm and food markets when food quality and consumer safety are at issue. Here, I attempt to provide such a framework, building on key contributions in the theoretical literature and linking them in a simple model of quality determination in a vertically related marketing channel. The food-marketing model is due to Gardner. Spence provides a foundation for Pareto-improving intervention in a deterministic model of quality provision, and Leland, building on the classic paper by Akerlof, investigates licensing and minimum standards when the information structure is incomplete. Linking these ideas in a satisfactory model of the food markets is the main objective of the paper.

Relevance:

10.00%

Publisher:

Abstract:

Price movements in many commodity markets exhibit significant seasonal patterns. However, given an observed futures price, a deterministic seasonal component in the price level is not relevant for the pricing of commodity options. In contrast, the same is not true for the seasonal pattern observed in the volatility of the commodity price. Analyzing an extensive sample of soybean, corn, heating oil and natural gas options, we find that seasonality in volatility is an important aspect to consider when valuing these contracts. The inclusion of an appropriate seasonality adjustment significantly reduces pricing errors in these markets and yields more improvement in valuation accuracy than increasing the number of stochastic factors.
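A minimal sketch of why the volatility seasonality matters for option values: with a deterministic, time-varying volatility, a Black-76 price depends on the integrated variance over the option's life, so the same futures price gives different option values depending on which part of the seasonal cycle that life covers. The sinusoidal volatility pattern and all parameter values are illustrative, not the paper's model.

```python
import numpy as np
from math import log, sqrt, exp
from scipy.stats import norm

def black76_call(F, K, T, r, sigma_avg):
    """Black-76 futures call using an 'effective' volatility over [0, T]."""
    d1 = (log(F / K) + 0.5 * sigma_avg ** 2 * T) / (sigma_avg * sqrt(T))
    d2 = d1 - sigma_avg * sqrt(T)
    return exp(-r * T) * (F * norm.cdf(d1) - K * norm.cdf(d2))

def seasonal_effective_vol(t0, T, sigma0=0.30, amp=0.10, peak=0.55, n=1000):
    """Root-mean-square volatility over the option's life when instantaneous
    volatility follows an illustrative seasonal pattern
    sigma(t) = sigma0 + amp*cos(2*pi*(t - peak)), with t in years."""
    t = np.linspace(t0, t0 + T, n)
    sigma_t = sigma0 + amp * np.cos(2 * np.pi * (t - peak))
    return sqrt(np.mean(sigma_t ** 2))   # sqrt of the average variance

F, K, r, T = 100.0, 100.0, 0.03, 0.5
flat = black76_call(F, K, T, r, 0.30)
seasonal = black76_call(F, K, T, r, seasonal_effective_vol(t0=0.25, T=T))
print(flat, seasonal)   # prices differ because the integrated variance differs
```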

Relevance:

10.00%

Publisher:

Abstract:

Purpose – This paper seeks to critically review the conceptual frameworks that have been developed for assessing the impact of information and communications technology (ICT) on real estate. Design/methodology/approach – The research is based on a critical review of the existing literature and draws on examples of previous empirical research in the field. Findings – The paper suggests that a “socio-technical framework” is more appropriate for examining ICT impact in real estate than other “deterministic” frameworks: ICT is an important part of the new economy, but it must be seen in the context of a number of other social and economic factors. Research limitations/implications – The research is based on a qualitative assessment of existing frameworks and, using examples from commercial real estate, assesses the extent to which a “socio-technical” framework can aid understanding of ICT impact. Practical implications – The paper highlights a number of the main issues in conceptualising ICT impact in real estate and critically examines the emergence of a new economy in the information society within the general context of real estate. The paper also highlights research gaps in the field. Originality/value – The paper deconstructs the myths of the “death of real estate” and “productivity increase means job loss” in relation to office real estate. Finally, it examines some of the ways in which ICT is impacting on real estate and suggests the most important components of a future research agenda in the field of ICT and real estate impact; it will be of value to property investors, facilities managers, developers, financiers and others.

Relevance:

10.00%

Publisher:

Abstract:

Geophysical time series sometimes exhibit serial correlations that are stronger than can be captured by the commonly used first‐order autoregressive model. In this study we demonstrate that a power law statistical model serves as a useful upper bound for the persistence of total ozone anomalies on monthly to interannual timescales. Such a model is usually characterized by the Hurst exponent. We show that the estimation of the Hurst exponent in time series of total ozone is sensitive to various choices made in the statistical analysis, especially whether and how the deterministic (including periodic) signals are filtered from the time series, and the frequency range over which the estimation is made. In particular, care must be taken to ensure that the estimate of the Hurst exponent accurately represents the low‐frequency limit of the spectrum, which is the part that is relevant to long‐term correlations and the uncertainty of estimated trends. Otherwise, spurious results can be obtained. Based on this analysis, and using an updated equivalent effective stratospheric chlorine (EESC) function, we predict that an increase in total ozone attributable to EESC should be detectable at the 95% confidence level by 2015 at the latest in southern midlatitudes, and by 2020–2025 at the latest over 30°–45°N, with the time to detection increasing rapidly with latitude north of this range.
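The sketch below illustrates the sensitivity described in this abstract: the estimated Hurst exponent depends on whether deterministic (e.g. annual) signals are removed first and on the frequency range used for the fit. The harmonic-regression deseasonalisation and the spectral-slope estimator with its frequency cutoff are illustrative choices, not the paper's exact procedure.

```python
import numpy as np

def remove_annual_cycle(x, period=12):
    """Remove a fitted mean, linear trend and annual harmonic before estimating H;
    leaving such deterministic signals in the series biases the estimate."""
    x = np.asarray(x, dtype=float)
    t = np.arange(len(x), dtype=float)
    design = np.column_stack([np.ones_like(t), t,
                              np.sin(2 * np.pi * t / period),
                              np.cos(2 * np.pi * t / period)])
    coef, *_ = np.linalg.lstsq(design, x, rcond=None)
    return x - design @ coef

def hurst_from_spectrum(x, dt=1.0, fmax=None):
    """Estimate the Hurst exponent from the low-frequency periodogram slope,
    assuming S(f) ~ f**(-beta) with beta = 2H - 1 (fractional-Gaussian-noise-like)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    freqs = np.fft.rfftfreq(len(x), dt)[1:]            # drop f = 0
    power = np.abs(np.fft.rfft(x))[1:] ** 2
    if fmax is not None:                               # restrict to low frequencies
        keep = freqs <= fmax
        freqs, power = freqs[keep], power[keep]
    slope, _ = np.polyfit(np.log(freqs), np.log(power), 1)   # slope = -beta
    return (1.0 - slope) / 2.0

# comparing hurst_from_spectrum(series) with
# hurst_from_spectrum(remove_annual_cycle(series), fmax=0.05)
# shows how filtering and frequency range change the estimate.
```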

Relevance:

10.00%

Publisher:

Abstract:

Dynamics affects the distribution and abundance of stratospheric ozone directly through transport of ozone itself and indirectly through its effect on ozone chemistry via temperature and transport of other chemical species. Dynamical processes must be considered in order to understand past ozone changes, especially in the northern hemisphere where there appears to be significant low-frequency variability which can look “trend-like” on decadal time scales. A major challenge is to quantify the predictable, or deterministic, component of past ozone changes. Over the coming century, changes in climate will affect the expected recovery of ozone. For policy reasons it is important to be able to distinguish and separately attribute the effects of ozone-depleting substances and greenhouse gases on both ozone and climate. While the radiative-chemical effects can be relatively easily identified, this is not so evident for dynamics — yet dynamical changes (e.g., changes in the Brewer-Dobson circulation) could have a first-order effect on ozone over particular regions. Understanding the predictability and robustness of such dynamical changes represents another major challenge. Chemistry-climate models have recently emerged as useful tools for addressing these questions, as they provide a self-consistent representation of dynamical aspects of climate and their coupling to ozone chemistry. We can expect such models to play an increasingly central role in the study of ozone and climate in the future, analogous to the central role of global climate models in the study of tropospheric climate change.

Relevance:

10.00%

Publisher:

Abstract:

Global communication requirements and load imbalance of some parallel data mining algorithms are the major obstacles to exploiting the computational power of large-scale systems. This work investigates how non-uniform data distributions can be exploited to remove the global communication requirement and to reduce the communication cost in parallel data mining algorithms and, in particular, in the k-means algorithm for cluster analysis. In the straightforward parallel formulation of the k-means algorithm, data and computation loads are uniformly distributed over the processing nodes. This approach has excellent load balancing characteristics that may suggest it could scale up to large and extreme-scale parallel computing systems. However, at each iteration step the algorithm requires a global reduction operation, which hinders the scalability of the approach. This work studies a different parallel formulation of the algorithm in which the requirement for global communication is removed, while maintaining the same deterministic nature of the centralised algorithm. The proposed approach exploits a non-uniform data distribution which can either be found in real-world distributed applications or be induced by means of multi-dimensional binary search trees. The approach can also be extended to accommodate an approximation error, which allows a further reduction of the communication costs. The effectiveness of the exact and approximate methods has been tested in a parallel computing system with 64 processors and in simulations with 1024 processing elements.
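For context, a sketch of the straightforward parallel k-means formulation whose global reduction the abstract identifies as the bottleneck (not the proposed communication-free formulation): each node computes local per-cluster sums and counts, and a reduction combines them into the new centroids. The per-node data layout is simulated with a list of arrays rather than real MPI.

```python
import numpy as np

def kmeans_iteration(node_chunks, centroids):
    """One iteration of the straightforward parallel k-means.

    node_chunks simulates the data held by each processing node; in a real
    distributed run the two .sum(axis=0) accumulations below would be a global
    reduction (e.g. an MPI Allreduce), which is the step that limits scalability.
    """
    k, d = centroids.shape
    local_sums = np.zeros((len(node_chunks), k, d))
    local_counts = np.zeros((len(node_chunks), k))
    for n, data in enumerate(node_chunks):            # purely local work
        labels = np.argmin(((data[:, None, :] - centroids) ** 2).sum(-1), axis=1)
        for j in range(k):
            members = data[labels == j]
            local_sums[n, j] = members.sum(axis=0)
            local_counts[n, j] = len(members)
    total_sums = local_sums.sum(axis=0)               # <- global reduction
    total_counts = local_counts.sum(axis=0)           # <- global reduction
    nonempty = total_counts > 0
    new_centroids = centroids.copy()
    new_centroids[nonempty] = total_sums[nonempty] / total_counts[nonempty, None]
    return new_centroids

# rng = np.random.default_rng(0)
# chunks = [rng.normal(size=(500, 2)) + c for c in ([0, 0], [5, 5], [0, 5])]
# centroids = kmeans_iteration(chunks, np.array([[0.0, 0.0], [4.0, 4.0], [1.0, 4.0]]))
```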

Relevance:

10.00%

Publisher:

Abstract:

This work proposes a unified neurofuzzy modelling scheme. To begin with, the initial fuzzy rule base is constructed by fuzzy clustering, utilising a Gaussian mixture model (GMM) combined with the analysis of variance (ANOVA) decomposition in order to obtain more compact univariate and bivariate membership functions over the subspaces of the input features. The means and covariances of the Gaussian membership functions are found by the expectation-maximisation (EM) algorithm, which has the merit of revealing the underlying density distribution of the system inputs. The resultant set of membership functions forms the basis of the generalised fuzzy model (GFM) inference engine. The model structure and parameters of this neurofuzzy model are identified via supervised subspace orthogonal least squares (OLS) learning. Finally, instead of providing a deterministic class label as the model output, as is conventional, a logistic regression model is applied to produce the classifier’s output, in which the sigmoid-type logistic transfer function scales the outputs of the neurofuzzy model to class probabilities. Experimental validation results are presented to demonstrate the effectiveness of the proposed neurofuzzy modelling scheme.
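A compact sketch of two of the ingredients described above, using scikit-learn as a stand-in: Gaussian membership functions obtained from a GMM fitted by EM, and a logistic stage that maps the fuzzy-basis outputs to class probabilities. This is illustrative only and omits the ANOVA decomposition and the GFM/OLS structure identification.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)          # toy labels

# EM-fitted GMM: each component supplies a Gaussian membership function
gmm = GaussianMixture(n_components=4, covariance_type="diag", random_state=0).fit(X)
memberships = gmm.predict_proba(X)                     # degree of membership per component

# logistic stage: map the fuzzy-basis outputs to class probabilities
clf = LogisticRegression().fit(memberships, y)
print(clf.predict_proba(memberships[:3]))              # probabilities, not hard labels
```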

Relevance:

10.00%

Publisher:

Abstract:

Despite many decades of investigation of scalp-recordable 8–13-Hz (alpha) electroencephalographic activity, no consensus has yet emerged regarding its physiological origins or its functional role in cognition. Here we outline a detailed, physiologically meaningful theory for the genesis of this rhythm that may provide important clues to its functional role. In particular, we find that electroencephalographically plausible model dynamics, obtained with physiologically admissible parameterisations, reveal a cortex perched on the brink of stability, which when perturbed gives rise to a range of unanticipated complex dynamics that include 40-Hz (gamma) activity. Preliminary experimental evidence, involving the detection of weak nonlinearity in resting EEG using an extension of the well-known surrogate data method, suggests that nonlinear (deterministic) dynamics are more likely to be associated with weakly damped alpha activity. Thus, rather than the “alpha rhythm” being an idling rhythm, it may be more profitable to conceive of it as a readiness rhythm.
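A minimal version of the surrogate data method mentioned above: phase-randomised surrogates preserve the linear (spectral) structure of the signal, and a nonlinear statistic computed on the original is compared with its distribution over the surrogates. The time-reversal asymmetry statistic is an illustrative choice, not the specific extension used in the paper.

```python
import numpy as np

def phase_randomised_surrogate(x, rng):
    """Surrogate with the same amplitude spectrum but randomised Fourier phases,
    i.e. consistent with a linear stochastic process."""
    spec = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, size=spec.size)
    phases[0] = 0.0                                     # keep the mean term real
    if len(x) % 2 == 0:
        phases[-1] = 0.0                                # keep the Nyquist term real
    return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=len(x))

def time_reversal_asymmetry(x, lag=1):
    """A simple nonlinear statistic; it is ~0 for time-reversible (linear Gaussian) series."""
    d = x[lag:] - x[:-lag]
    return np.mean(d ** 3) / np.mean(d ** 2) ** 1.5

def surrogate_test(x, n_surrogates=99, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    t0 = abs(time_reversal_asymmetry(x))
    null = [abs(time_reversal_asymmetry(phase_randomised_surrogate(x, rng)))
            for _ in range(n_surrogates)]
    # rank-based p-value: small values suggest nonlinear (possibly deterministic) structure
    return (1 + sum(t >= t0 for t in null)) / (n_surrogates + 1)
```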

Relevance:

10.00%

Publisher:

Abstract:

A recently proposed mean-field theory of mammalian cortex rhythmogenesis describes the salient features of electrical activity in the cerebral macrocolumn, with the use of inhibitory and excitatory neuronal populations (Liley et al 2002). This model is capable of producing a range of important human EEG (electroencephalogram) features such as the alpha rhythm, the 40 Hz activity thought to be associated with conscious awareness (Bojak & Liley 2007) and the changes in EEG spectral power associated with general anesthetic effect (Bojak & Liley 2005). From the point of view of nonlinear dynamics, the model entails a vast parameter space within which multistability, pseudoperiodic regimes, various routes to chaos, fat fractals and rich bifurcation scenarios occur for physiologically relevant parameter values (van Veen & Liley 2006). The origin and the character of this complex behaviour, and its relevance for EEG activity will be illustrated. The existence of short-lived unstable brain states will also be discussed in terms of the available theoretical and experimental results. A perspective on future analysis will conclude the presentation.

Relevance:

10.00%

Publisher:

Abstract:

Ensemble-based data assimilation is rapidly proving itself as a computationally efficient and skilful assimilation method for numerical weather prediction, and can provide a viable alternative to more established variational assimilation techniques. However, a fundamental shortcoming of ensemble techniques is that the resulting analysis increments can only span a limited subspace of the state space, whose dimension is less than the ensemble size. This limits the amount of observational information that can effectively constrain the analysis. In this paper, a data selection strategy is presented that aims to assimilate only the observational components that matter most and that can be used with both stochastic and deterministic ensemble filters. This avoids unnecessary computations, reduces round-off errors and minimizes the risk of importing observation bias into the analysis. When an ensemble-based assimilation technique is used to assimilate high-density observations, the data selection procedure allows the use of larger localization domains, which may lead to a more balanced analysis. Results from the use of this data selection technique with a two-dimensional linear advection model and a nonlinear advection model, using both in situ and remote sounding observations, are discussed.
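A small stochastic (perturbed-observation) EnKF analysis step illustrating the limitation the abstract starts from, not the proposed data-selection strategy itself: the increments are linear combinations of the ensemble perturbations, so they lie in a subspace of dimension at most one less than the ensemble size, however many observations are assimilated. The state size, observation operator and error statistics are illustrative.

```python
import numpy as np

def enkf_analysis(ensemble, obs, H, obs_err_std, rng):
    """Stochastic EnKF analysis step (perturbed observations).

    ensemble: (n_state, n_members); obs: (n_obs,); H: (n_obs, n_state).
    Each member's increment is K @ innovation, with K built from the ensemble
    covariance, so it lies in the span of the ensemble perturbations.
    """
    n_state, n_members = ensemble.shape
    mean = ensemble.mean(axis=1, keepdims=True)
    Xp = (ensemble - mean) / np.sqrt(n_members - 1)      # perturbation matrix
    Yp = H @ Xp
    R = (obs_err_std ** 2) * np.eye(len(obs))
    K = Xp @ Yp.T @ np.linalg.inv(Yp @ Yp.T + R)         # ensemble Kalman gain
    perturbed_obs = obs[:, None] + obs_err_std * rng.standard_normal((len(obs), n_members))
    return ensemble + K @ (perturbed_obs - H @ ensemble)

rng = np.random.default_rng(0)
n_state, n_members, n_obs = 40, 10, 20
ensemble = rng.standard_normal((n_state, n_members))
H = np.eye(n_obs, n_state)                               # observe the first 20 variables
obs = rng.standard_normal(n_obs)
analysis = enkf_analysis(ensemble, obs, H, obs_err_std=0.5, rng=rng)
# the rank of the increment is at most n_members - 1
print(np.linalg.matrix_rank(analysis - ensemble))
```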

Relevance:

10.00%

Publisher:

Abstract:

This paper argues for the use of ‘fractals’ in theorising sociospatial relations. From a realist position, a nonmathematical but nonmetaphoric and descriptive view of ‘fractals’ is advanced. Insights from the natural sciences are combined with insights on the position of the observer from Luhmann and notions of assemblages and repetitions from Deleuze. It is argued that the notion of ‘fractals’ can augment current understanding of sociospatialities in three ways. First, it can pose questions about the scalar position of the observer or the grain of observation; second, as a signifier of particular attributes, it prompts observation and description of particular structuring processes; and third, the epistemic access afforded by the concept can open up possibilities for transformative interventions and thereby inform the same. The theoretical usefulness of the concept is demonstrated by discussing the territory, place, scale, and networks (TPSN) model for theorising sociospatial relations advanced by B Jessop, N Brenner, and M Jones in their 2008 paper “Theorizing sociospatial relations”, published in this journal (volume 26, pages 389–401). It is suggested that a heuristic arising from a ‘fractal’ ontology can contribute to a polymorphous, as opposed to polyvalent, understanding of sociospatial relations.