77 results for Statistical Foundations


Relevance:

20.00%

Publisher:

Abstract:

The development of a combined engineering and statistical Artificial Neural Network model of UK domestic appliance load profiles is presented. The model uses diary-style appliance-use data and a survey questionnaire collected from 51 suburban households and 46 rural households during the summers of 2010 and 2011, respectively. It also incorporates measured energy data and is sensitive to socio-economic, physical dwelling and temperature variables. A prototype model is constructed in MATLAB using a two-layer feed-forward network with back-propagation training and a 12:10:24 architecture. Model outputs include appliance load profiles that can be applied to the fields of energy planning (micro-renewables and smart grids), building simulation tools and energy policy.
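As a rough illustration of the 12:10:24 architecture described above (not the authors' trained MATLAB model — the weights here are random placeholders and the variable names are hypothetical), one forward pass of such a network can be sketched as:

```python
import numpy as np

# 12 inputs (e.g. socio-economic, dwelling and temperature variables),
# 10 hidden units, 24 outputs (one value per hour of the day).
rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.1, size=(10, 12))   # input -> hidden weights
b1 = np.zeros(10)
W2 = rng.normal(scale=0.1, size=(24, 10))   # hidden -> output weights
b2 = np.zeros(24)

def forward(x):
    """One forward pass: tanh hidden layer, linear output layer."""
    h = np.tanh(W1 @ x + b1)
    return W2 @ h + b2

x = rng.normal(size=12)          # one household's 12 predictor variables
profile = forward(x)             # 24-hour appliance load profile
print(profile.shape)             # (24,)
```

In the paper the weights would be fitted by back-propagation against the measured load data; this sketch only shows the shape of the mapping.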


We consider the general response theory recently proposed by Ruelle for describing the impact of small perturbations to the non-equilibrium steady states resulting from Axiom A dynamical systems. We show that the causality of the response functions entails the possibility of writing a set of Kramers-Kronig (K-K) relations for the corresponding susceptibilities at all orders of nonlinearity. Nonetheless, only a special class of directly observable susceptibilities obeys K-K relations. Specific results are provided for the case of arbitrary-order harmonic response, which allows for a very comprehensive K-K analysis and the establishment of sum rules connecting the asymptotic behavior of the harmonic generation susceptibility to the short-time response of the perturbed system. These results place previous findings obtained for optical systems and simple mechanical models in a more general theoretical framework, and shed light on the very general impact of considering the principle of causality for testing self-consistency: the described dispersion relations constitute unavoidable benchmarks that any experimental and model-generated dataset must obey. The theory exposed in the present paper is dual to the time-dependent theory of perturbations to equilibrium states and to non-equilibrium steady states, and has, in principle, a similar range of applicability and limitations. In order to connect the equilibrium and the non-equilibrium steady state cases, we show how to rewrite the classical response theory by Kubo so that response functions formally identical to those proposed by Ruelle, apart from the measure involved in the phase space integration, are obtained. These results, taking into account the chaotic hypothesis by Gallavotti and Cohen, might be relevant in several fields, including climate research.
In particular, whereas the fluctuation-dissipation theorem does not work for non-equilibrium systems, because of the non-equivalence between internal and external fluctuations, K-K relations might be robust tools for the definition of a self-consistent theory of climate change.
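For reference, the linear (first-order) form of the K-K relations invoked above can be stated as follows; this is the textbook causal case, which the paper generalizes to susceptibilities of arbitrary order:

```latex
\operatorname{Re}\chi(\omega) = \frac{1}{\pi}\,\mathrm{P}\!\int_{-\infty}^{+\infty}
  \frac{\operatorname{Im}\chi(\omega')}{\omega' - \omega}\,d\omega' ,
\qquad
\operatorname{Im}\chi(\omega) = -\frac{1}{\pi}\,\mathrm{P}\!\int_{-\infty}^{+\infty}
  \frac{\operatorname{Re}\chi(\omega')}{\omega' - \omega}\,d\omega' .
```

Here \(\chi(\omega)\) is the susceptibility (the Fourier transform of the causal response function) and \(\mathrm{P}\) denotes the Cauchy principal value; causality of the response in the time domain is what forces the real and imaginary parts to be linked in this way.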


Planning is highly conditioned by the relationships between the market, the state and politics. This becomes particularly clear in looking at the changes taking place in the countries of the former Communist bloc as they attempt to establish a new set of relationships. The old power structures have been dislodged and old laws discarded. This paper examines the situation in Bulgaria and explores the preconditions for setting up a new planning system there. The first section outlines the political changes since 1989 and shows how political instability has affected the pace of change. The establishment of a market in land and property is a second precondition for a new planning system, and moves in this direction are presented, including restitution policies. Finally, the issues raised by the early attempts towards a new planning system are discussed. This paper is the first of a series looking at the countries of Eastern Europe, and the author would welcome comments from others working in this field.


The question as to whether it is better to diversify a real estate portfolio within a property type across the regions or within a region across the property types is one of continuing interest for academics and practitioners alike. The current study, however, differs from the usual sector/regional analysis in taking account of the fact that holdings in the UK real estate market are heavily concentrated in a single region, London. As a result, this study is designed to investigate whether a real estate fund manager can obtain a statistically significant improvement in risk/return performance by extending out of a London-based portfolio, first into the rest of the South East of England and then into the remainder of the UK, or whether the manager would be better off staying within London and diversifying across the various property types. The results indicate that staying within London and diversifying across the various property types may offer performance comparable with regional diversification, although this conclusion largely depends on the time period and on the fund manager's ability to diversify efficiently.
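The risk/return comparison at the heart of the study can be illustrated with a minimal sketch (synthetic returns, not the study's dataset; the numbers and asset groupings below are hypothetical): compute the mean and volatility of an equally weighted within-London cross-sector portfolio versus a cross-region portfolio.

```python
import numpy as np

rng = np.random.default_rng(42)
n_periods = 120  # e.g. 10 years of monthly returns

# Hypothetical return series: rows are periods, columns are assets.
london_sectors = rng.normal(0.006, 0.02, size=(n_periods, 3))  # e.g. retail/office/industrial
regions = rng.normal(0.006, 0.02, size=(n_periods, 3))         # e.g. London/South East/rest of UK

def risk_return(returns):
    """Mean and standard deviation of an equally weighted portfolio."""
    port = returns.mean(axis=1)
    return port.mean(), port.std(ddof=1)

mu_s, sd_s = risk_return(london_sectors)
mu_r, sd_r = risk_return(regions)
print(f"within-London sectors: return {mu_s:.4f}, risk {sd_s:.4f}")
print(f"across regions:        return {mu_r:.4f}, risk {sd_r:.4f}")
```

With real data the interesting part is whether the difference in risk-adjusted performance is statistically significant, which is what the study tests.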


Weekly monitoring of profiles of student performance on formative and summative coursework throughout the year can be used to quickly identify those who need additional help, possibly due to acute and sudden-onset problems. Such an early-warning system can not only help retention but also assist students in overcoming problems early on, thus helping them fulfil their potential in the long run. We have developed a simple approach for the automatic monitoring of student mark profiles for individual modules, which we intend to trial in the near future. Its ease of implementation means that it can be used for very large cohorts with little additional effort when marks are already collected and recorded on a spreadsheet.
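A minimal sketch of the kind of spreadsheet-based early-warning check described above (our own illustration, not the authors' exact rule): flag students whose mean mark so far falls more than one standard deviation below the cohort mean.

```python
import numpy as np

marks = {                       # hypothetical coursework marks out of 100
    "student_a": [65, 70, 68],
    "student_b": [40, 35, 30],
    "student_c": [55, 60, 58],
}

means = {s: np.mean(m) for s, m in marks.items()}
cohort_mean = np.mean(list(means.values()))
cohort_sd = np.std(list(means.values()), ddof=1)

# Flag anyone more than one SD below the cohort mean.
flagged = [s for s, m in means.items() if m < cohort_mean - cohort_sd]
print(flagged)                  # → ['student_b']
```

Because the rule only needs running means, it can be recomputed each week as new marks are appended to the spreadsheet.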


A precipitation downscaling method is presented using precipitation from a general circulation model (GCM) as predictor. The method extends a previous method from monthly to daily temporal resolution. The simplest form of the method corrects for biases in wet-day frequency and intensity. A more sophisticated variant also takes account of flow-dependent biases in the GCM. The method is flexible and simple to implement. It is proposed here as a correction of GCM output for applications where sophisticated methods are not available, or as a benchmark for the evaluation of other downscaling methods. Applied to output from reanalyses (ECMWF, NCEP) in the region of the European Alps, the method is capable of reducing large biases in the precipitation frequency distribution, even for high quantiles. The two variants exhibit similar performances, but the ideal choice of method can depend on the GCM/reanalysis and it is recommended to test the methods in each case. Limitations of the method are found in small areas with unresolved topographic detail that influence higher-order statistics (e.g. high quantiles). When used as benchmark for three regional climate models (RCMs), the corrected reanalysis and the RCMs perform similarly in many regions, but the added value of the latter is evident for high quantiles in some small regions.
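The "simplest form" of the correction described above can be sketched as follows (synthetic data; a generic wet-day frequency and intensity correction, often called local intensity scaling — not the authors' exact implementation):

```python
import numpy as np

rng = np.random.default_rng(1)
obs = rng.gamma(0.6, 4.0, size=3650)          # observed daily precip (mm)
obs[rng.random(3650) < 0.5] = 0.0             # ~50% dry days
gcm = rng.gamma(0.9, 2.0, size=3650)          # GCM "drizzles" too often

WET = 0.1  # mm: observed wet-day threshold

# 1) Frequency correction: choose a GCM threshold so that the GCM has the
#    same fraction of wet days as the observations.
obs_wet_frac = np.mean(obs >= WET)
gcm_thresh = np.quantile(gcm, 1.0 - obs_wet_frac)
gcm_wet = np.where(gcm >= gcm_thresh, gcm, 0.0)

# 2) Intensity correction: rescale wet-day amounts so the mean wet-day
#    intensity matches the observed one.
scale = obs[obs >= WET].mean() / gcm_wet[gcm_wet > 0].mean()
corrected = gcm_wet * scale

print(np.mean(corrected > 0), obs_wet_frac)   # wet-day fractions now agree
```

The more sophisticated variant in the paper additionally conditions the correction on the large-scale flow, which this sketch omits.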


A systematic evaluation of agricultural factors affecting the adaptation of the tropical oil plant Jatropha curcas L. to the semi-arid subtropical climate in Northeastern Mexico has been conducted. The factors studied include plant density and topology, as well as fungi and virus abundances. A multiple regression analysis shows that total fruit production can be well predicted by the area per plant and the total presence of fungi. Four common herbicides and a mechanical weed control measure were established at a dedicated test array and their impact on plant productivity was assessed.
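The multiple regression mentioned above can be illustrated with a short sketch (synthetic data; the coefficients and variable names below are hypothetical, chosen only to mimic the two predictors the abstract reports as informative):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 60
area = rng.uniform(1.0, 9.0, size=n)          # area per plant (m^2)
fungi = rng.uniform(0.0, 1.0, size=n)         # fraction of plants with fungi
yield_ = 2.0 + 1.5 * area - 3.0 * fungi + rng.normal(0, 0.5, size=n)

# Ordinary least squares fit of yield on both predictors.
X = np.column_stack([np.ones(n), area, fungi])
beta, *_ = np.linalg.lstsq(X, yield_, rcond=None)
print(beta)  # intercept, area coefficient (~1.5), fungi coefficient (~-3.0)
```

In the study the analogous fit is what allows total fruit production to be predicted from area per plant and the total presence of fungi.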


Accurate decadal climate predictions could be used to inform adaptation actions to a changing climate. The skill of such predictions from initialised dynamical global climate models (GCMs) may be assessed by comparing them with predictions from statistical models which are based solely on historical observations. This paper presents two benchmark statistical models for predicting both the radiatively forced trend and the internal variability of annual mean sea surface temperatures (SSTs) on a decadal timescale, based on the gridded observation data set HadISST. For both statistical models, the trend related to radiative forcing is modelled using a linear regression of the SST time series at each grid box on the time series of equivalent global mean atmospheric CO2 concentration. The residual internal variability is then modelled by (1) a first-order autoregressive model (AR1) and (2) a constructed analogue model (CA). From the verification of 46 retrospective forecasts with start years from 1960 to 2005, the correlation coefficient for anomaly forecasts using trend with AR1 is greater than 0.7 over parts of the extra-tropical North Atlantic, the Indian Ocean and the western Pacific. This is primarily related to the prediction of the forced trend. More importantly, both CA and AR1 give skillful predictions of the internal variability of SSTs in the subpolar gyre region over the far North Atlantic for lead times of 2 to 5 years, with correlation coefficients greater than 0.5. For the subpolar gyre and parts of the South Atlantic, CA is superior to AR1 for lead times of 6 to 9 years. These statistical forecasts are also compared with ensemble mean retrospective forecasts by DePreSys, an initialised GCM.
DePreSys is found to outperform the statistical models over large parts of the North Atlantic for lead times of 2 to 5 years and 6 to 9 years; however, trend with AR1 is generally superior to DePreSys in the North Atlantic Current region, while trend with CA is superior to DePreSys in parts of the South Atlantic for lead times of 6 to 9 years. These findings encourage further development of benchmark statistical decadal prediction models, and of methods to combine different predictions.
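The "trend with AR1" benchmark can be sketched in a few lines (synthetic stand-ins for the HadISST and CO2 series; the coefficients are illustrative): regress SST on CO2 to estimate the forced trend, fit an AR1 model to the residuals, then forecast trend plus exponentially damped anomaly.

```python
import numpy as np

rng = np.random.default_rng(3)
years = np.arange(1960, 2006)
co2 = 315 + 1.5 * (years - 1960)                  # toy CO2 series (ppm)
noise = np.zeros(len(years))
for t in range(1, len(years)):                    # AR1 internal variability
    noise[t] = 0.7 * noise[t - 1] + rng.normal(0, 0.1)
sst = 0.01 * co2 + noise                          # toy SST series (one grid box)

# 1) Forced trend: linear regression of SST on CO2.
a, b = np.polyfit(co2, sst, 1)
resid = sst - (a * co2 + b)

# 2) AR1 model for the residual internal variability.
phi = np.sum(resid[1:] * resid[:-1]) / np.sum(resid[:-1] ** 2)

# Forecast: trend from projected CO2 plus exponentially damped last anomaly.
co2_future = 315 + 1.5 * (np.arange(2006, 2011) - 1960)
forecast = a * co2_future + b + resid[-1] * phi ** np.arange(1, 6)
print(forecast)
```

The CA variant replaces step (2) by constructing an analogue of the current anomaly field from linear combinations of past fields, which this per-grid-box sketch does not attempt.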


We address the problem of automatically identifying and restoring damaged and contaminated images. We suggest a novel approach based on a semi-parametric model. This has two components, a parametric component describing known physical characteristics and a more flexible non-parametric component. The latter avoids the need for a detailed model for the sensor, which is often costly to produce and lacking in robustness. We assess our approach using an analysis of electroencephalographic images contaminated by eye-blink artefacts and highly damaged photographs contaminated by non-uniform lighting. These experiments show that our approach provides an effective solution to problems of this type.


The author contends that many of the conventions of Italian film studies derive from the conflicts and the critical vocabulary that shaped the Italian reception of neorealism in the first decade after the Second World War. Those conflicts, and that critical vocabulary, which lie at the foundation of what has been called the ‘institution of neorealism,’ established an irreconcilable binary: Cronaca and Narrativa. For the neorealists and their critics, Cronaca stood for the effort to record data faithfully, while Narrativa represented the effort to employ the shaping force of human invention in the representation of information. This essay’s first section analyzes the earliest reviews of Rossellini’s Roma città aperta alongside the contemporaneous literary debates over Cronaca and Narrativa. The second section reconsiders the reception of Pratolini’s Metello and Visconti’s Senso, which similarly centered upon the conflict between Cronaca and Narrativa. The third section proposes that the concepts which have often been employed to unify neorealism are destabilized by the Cronaca/Narrativa binary. In search of a solution to neorealism’s conceptual instability, this essay proposes more critical and purposeful appropriations of the movement’s problematic genealogy.


Geophysical time series sometimes exhibit serial correlations that are stronger than can be captured by the commonly used first-order autoregressive model. In this study we demonstrate that a power-law statistical model serves as a useful upper bound for the persistence of total ozone anomalies on monthly to interannual timescales. Such a model is usually characterized by the Hurst exponent. We show that the estimation of the Hurst exponent in time series of total ozone is sensitive to various choices made in the statistical analysis, especially whether and how the deterministic (including periodic) signals are filtered from the time series, and the frequency range over which the estimation is made. In particular, care must be taken to ensure that the estimate of the Hurst exponent accurately represents the low-frequency limit of the spectrum, which is the part that is relevant to long-term correlations and the uncertainty of estimated trends. Otherwise, spurious results can be obtained. Based on this analysis, and using an updated equivalent effective stratospheric chlorine (EESC) function, we predict that an increase in total ozone attributable to EESC should be detectable at the 95% confidence level by 2015 at the latest in southern midlatitudes, and by 2020–2025 at the latest over 30°–45°N, with the time to detection increasing rapidly with latitude north of this range.
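One common way to estimate the Hurst exponent from the low-frequency spectrum, as discussed above, can be sketched as follows (our illustration, not the authors' exact procedure). For a stationary power-law process with spectrum S(f) ~ f^(-beta), the Hurst exponent is H = (beta + 1) / 2, so a log-log fit of the low-frequency periodogram yields an estimate of H; white noise (beta = 0, H = 0.5) serves as the synthetic test series, and any deterministic or periodic signal should be removed first.

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(size=4096)                 # deseasonalised anomaly series
x = x - x.mean()

spec = np.abs(np.fft.rfft(x)) ** 2        # raw periodogram
freqs = np.fft.rfftfreq(len(x))

# Fit only the low-frequency part, which governs long-term correlations.
low = (freqs > 0) & (freqs < 0.05)
slope, _ = np.polyfit(np.log(freqs[low]), np.log(spec[low]), 1)
H = (1.0 - slope) / 2.0                   # slope = -beta, H = (beta + 1) / 2
print(round(H, 2))                        # ~0.5 for white noise
```

As the abstract stresses, the result is sensitive to the frequency range chosen for the fit: extending `low` toward high frequencies, or leaving periodic signals in `x`, biases the estimate.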