927 results for "Distribution system"


Relevance:

30.00%

Publisher:

Abstract:

Ensemble forecasting of nonlinear systems involves the use of a model to run forward a discrete ensemble (or set) of initial states. Data assimilation techniques tend to focus on estimating the true state of the system, even though model error limits the value of such efforts. This paper argues for choosing the initial ensemble in order to optimise forecasting performance rather than estimate the true state of the system. Density forecasting and choosing the initial ensemble are treated as one problem. Forecasting performance can be quantified by some scoring rule. In the case of the logarithmic scoring rule, theoretical arguments and empirical results are presented. It turns out that, if the underlying noise dominates model error, we can diagnose the noise spread.
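The logarithmic score the paper builds on is easy to state concretely. Below is a minimal sketch, assuming a Gaussian-kernel density estimate over the ensemble with an arbitrary fixed kernel width sigma (the paper's actual forecast densities and parameter choices are not reproduced here): of two candidate initial ensembles, the one with the lower score is the better choice under this criterion.

```python
import numpy as np

def log_score(ensemble, verification, sigma=0.5):
    """Ignorance (negative log-likelihood) of a verification under a
    Gaussian-kernel density estimate built from the ensemble; sigma is
    an assumed fixed kernel width, not the paper's parameterisation."""
    kernels = np.exp(-0.5 * ((verification - ensemble) / sigma) ** 2)
    density = kernels.sum() / (len(ensemble) * sigma * np.sqrt(2 * np.pi))
    return -np.log(density)

# Two candidate initial ensembles for the same verification.
rng = np.random.default_rng(0)
truth = 1.0
ens_a = truth + rng.normal(0.0, 0.3, size=32)   # tight around the truth
ens_b = truth + rng.normal(0.5, 1.0, size=32)   # biased and more spread
print(log_score(ens_a, truth), log_score(ens_b, truth))
```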

Relevance:

30.00%

Publisher:

Abstract:

Air distribution systems are among the major consumers of electrical energy in air-conditioned commercial buildings, maintaining a comfortable indoor thermal environment and air quality by supplying specified amounts of treated air to different zones. The sizes of the air distribution lines affect the energy efficiency of the distribution system. Equal friction and static regain are two well-known approaches for sizing the air distribution lines. To address the life-cycle cost of air distribution systems, the T-method and IPS methods were developed. Hitherto, all of these methods have been based on static design conditions, so the dynamic performance of the system has not yet been addressed, even though air distribution systems operate mostly under dynamic rather than static conditions. Moreover, none of the existing methods considers thermal comfort or environmental impacts. This study investigates the existing methods for sizing air distribution systems and proposes a dynamic approach for size optimisation of the air distribution lines that takes into account economic aspects, environmental impacts and technical performance. These criteria are addressed through whole-life costing analysis, life-cycle assessment and the deviation from the set-point temperature of each zone, respectively. Integrating these criteria into the TRNSYS software produces a novel dynamic optimisation approach for duct sizing. Because the criteria are integrated into well-known performance-evaluation software, the approach can be easily adopted by designers under the time pressures of practice. Comparison of this integrated approach with the existing methods reveals that, under the defined criteria, system performance improves by up to 15%. The approach is a significant step towards net-zero-emission buildings.
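For context on the static methods the study benchmarks against, here is a minimal sketch of equal-friction sizing for a round duct. The fixed Darcy friction factor, air density and the 1 Pa/m target friction rate are assumptions for illustration; real sizing iterates the friction factor with the Reynolds number.

```python
import math

def equal_friction_diameter(flow, friction_rate=1.0, rho=1.2, f=0.02):
    """Round-duct diameter giving a target pressure drop per metre.

    Rearranges Darcy-Weisbach, dp/L = f * (rho * v**2 / 2) / D with
    v = 4*Q/(pi*D**2), for D at a fixed friction factor f (an assumed
    constant here, for illustration only).
    """
    return (8.0 * f * rho * flow**2
            / (math.pi**2 * friction_rate)) ** 0.2

# Size every section of a run to the same friction rate, which is the
# defining idea of the equal-friction method.
for q in (0.5, 1.0, 2.0):  # airflow, m^3/s
    print(f"Q = {q:.1f} m^3/s -> D = {equal_friction_diameter(q):.3f} m")
```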

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we study a model economy to examine the optimal intraday interest rate. Freeman (1996) shows that the efficient allocation can be implemented by a policy in which the intraday rate is zero. We modify the production set and show that such a model economy can account for the non-uniform distribution of settlements within a day. In addition, by modifying both the consumption set and the production set, we show that the central bank may be able to implement the planner's allocation with a positive intraday interest rate.

Relevance:

30.00%

Publisher:

Abstract:

The British system of development control is time-consuming and uncertain in outcome. Moreover, it is becoming increasingly overloaded as it has gradually switched away from a traditional 'is it an appropriate land use?' approach to one based on multi-faceted inspections of projects and negotiations over the distribution of the potential financial gains arising from them. Recent policy developments have centred on improving the operation of development control. This paper argues that more fundamental issues may be at stake as well. Important market changes have increased workloads. Furthermore, the UK planning system's institutional framework encourages change to move in specific directions, which is not always helpful. If expectations of increased long-term housing supply are to be met, more substantial changes to development control may be essential, yet hard to achieve.

Relevance:

30.00%

Publisher:

Abstract:

One goal in the development of distributed virtual environments (DVEs) is to create a system such that users are unaware of the distribution; that is, the distribution should be transparent. The paper begins by discussing the general issues in DVEs that might make this possible, and a system that allows some level of distribution transparency is described. The system described suffers from effects of inconsistency, which in turn cause undesirable visual effects. The causal surface is introduced as a solution that removes these visual effects. The paper then introduces two determining factors of distribution transparency, relating to user perception and performance. With regard to these factors, two hypotheses are stated relating to the causal surface. A user trial with forty-five subjects is used to validate the hypotheses. A discussion of the results of the trial concludes that the causal surface solution significantly improves the distribution transparency in a DVE.

Relevance:

30.00%

Publisher:

Abstract:

The problem of calculating the probability of error in a DS/SSMA system has been extensively studied for more than two decades. When random sequences are employed, some conditioning must be done before the application of the central limit theorem is attempted, leading to a Gaussian distribution. The authors seek to characterise the multiple-access interference as a random walk with a random number of steps, for random and deterministic sequences. Using results from random-walk theory, they model the interference as a K-distributed random variable and use it to calculate the probability of error in the form of a series, for a DS/SSMA system with a coherent correlation receiver and BPSK modulation under Gaussian noise. The asymptotic properties of the proposed distribution agree with other analyses. This is, to the best of the authors' knowledge, the first attempt to propose a non-Gaussian distribution for the interference. The modelling can be extended to consider multipath fading and general modulation.
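One standard way to realise a K-distributed interference variable is as a compound Gaussian whose power is a gamma-distributed "texture". The sketch below uses that construction (the shape parameter and the construction itself are illustrative assumptions, not necessarily the authors' exact parameterisation) to show the heavier tail relative to a variance-matched Gaussian, which is what changes the computed probability of error.

```python
import numpy as np

rng = np.random.default_rng(1)

def k_distributed(n, nu=1.5):
    """Samples with a K-distributed amplitude: a Gaussian modulated by a
    gamma-distributed power "texture" with shape parameter nu. This is
    one standard compound construction, assumed here for illustration."""
    texture = rng.gamma(shape=nu, scale=1.0 / nu, size=n)  # unit-mean power
    return rng.normal(0.0, 1.0, size=n) * np.sqrt(texture)

# Tail comparison against a variance-matched Gaussian.
mai_k = k_distributed(1_000_000)
mai_g = rng.normal(0.0, mai_k.std(), size=1_000_000)
threshold = 4.0
print("P(|X| > 4):  K =", np.mean(np.abs(mai_k) > threshold),
      "  Gaussian =", np.mean(np.abs(mai_g) > threshold))
```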

Relevance:

30.00%

Publisher:

Abstract:

To understand the resilience of aquatic ecosystems to environmental change, it is important to determine how multiple, related environmental factors, such as near-surface air temperature and river flow, will change during the next century. This study develops a novel methodology that combines statistical downscaling and fish species distribution modeling to enhance the understanding of how global climate changes (modeled by coarse-resolution global climate models) may affect local riverine fish diversity. The novelty of this work is the downscaling framework developed to provide suitable future projections of fish habitat descriptors, focusing particularly on hydrology, which has rarely been considered in previous studies. The proposed modeling framework was developed and tested in a major European system, the Adour-Garonne river basin (SW France, 116,000 km²), which covers distinct hydrological and thermal regions from the Pyrenees to the Atlantic coast. The simulations suggest that, by 2100, the mean annual stream flow will decrease by approximately 15% and temperature will increase by approximately 1.2 °C, on average. As a consequence, the majority of cool- and warm-water fish species are projected to expand their geographical ranges within the basin, while the few cold-water species will experience a reduction in their distributions. The limitations and potential benefits of the proposed modeling approach are discussed.
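As a hedged illustration of the species-distribution-modelling step only, the sketch below fits a logistic model of presence/absence against two habitat descriptors and projects it under the mean changes quoted above (+1.2 °C, -15% flow). The data, the logistic form and all variable names are hypothetical, not the paper's actual model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)

# Hypothetical training data: presence/absence of a cold-water species
# against two downscaled habitat descriptors.
temp = rng.uniform(4.0, 22.0, size=500)    # water temperature, deg C
flow = rng.uniform(0.2, 3.0, size=500)     # mean annual flow, scaled
present = (rng.random(500) < 1.0 / (1.0 + np.exp(0.8 * (temp - 12.0))))

model = LogisticRegression().fit(np.column_stack([temp, flow]),
                                 present.astype(int))

# Project under the mean changes quoted above: +1.2 deg C, -15% flow.
X_now = np.column_stack([temp, flow])
X_2100 = np.column_stack([temp + 1.2, flow * 0.85])
print("mean P(presence), present-day:", model.predict_proba(X_now)[:, 1].mean())
print("mean P(presence), 2100:      ", model.predict_proba(X_2100)[:, 1].mean())
```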

Relevance:

30.00%

Publisher:

Abstract:

In language contact studies, specific features of the contact languages are often seen to be the result of transfer (interference), but it remains difficult to disentangle the role of intra-systemic and inter-systemic factors. We propose to unravel these factors in the analysis of a feature of Brussels French which many researchers attribute to transfer from (Brussels) Dutch: the adverbial use of une fois. We compare the use of this particle in Brussels French with its occurrence in corpora of other varieties of French, including several that have not been influenced by a Germanic substrate or adstrate. A detailed analysis of the frequency of occurrence, the functions and the distribution of the particle over different syntactic positions shows that some uses of une fois can be traced back to sixteenth-century French, but that there is also ample evidence for overt and covert transfer (Mougeon and Beniak, 1991) from Brussels Dutch.

Relevance:

30.00%

Publisher:

Abstract:

Let θ denote the level of quality inherent in a food product that is delivered to some terminal market. In this paper, I characterize allocations over θ and provide an economic rationale for regulating safety and quality standards in the food system. Zusman and Bockstael investigate the theoretical foundations for imposing standards and stress the importance of providing a tractable conceptual foundation. Despite a wealth of contributions that are mainly empirical (for reviews of these works see, respectively, Caswell and Antle), there have been relatively few attempts to model formally the linkages between farm and food markets when food quality and consumer safety are at issue. Here, I attempt to provide such a framework, building on key contributions in the theoretical literature and linking them in a simple model of quality determination in a vertically related marketing channel. The food-marketing model is due to Gardner. Spence provides a foundation for Pareto-improving intervention in a deterministic model of quality provision, and Leland, building on the classic paper by Akerlof, investigates licensing and minimum standards when the information structure is incomplete. Linking these ideas in a satisfactory model of the food markets is the main objective of the paper.

Relevance:

30.00%

Publisher:

Abstract:

The translation of an ensemble of model runs into a probability distribution is a common task in model-based prediction. Common methods for such ensemble interpretations proceed as if verification and ensemble were draws from the same underlying distribution, an assumption not viable for most, if any, real-world ensembles. An alternative is to consider an ensemble merely as a source of information rather than as the possible scenarios of reality. This approach, which looks for maps between ensembles and probability distributions, is investigated and extended. Common methods are revisited, and an improvement to standard kernel dressing, called 'affine kernel dressing' (AKD), is introduced. AKD assumes an affine mapping between ensemble and verification, typically acting not on individual ensemble members but on the entire ensemble as a whole; the parameters of this mapping are determined in parallel with the other dressing parameters, including a weight assigned to the unconditioned (climatological) distribution. These amendments to standard kernel dressing, albeit simple, can improve performance significantly and are shown to be appropriate for both overdispersive and underdispersive ensembles, unlike standard kernel dressing, which exacerbates overdispersion. Studies are presented using operational numerical weather predictions for two locations and data from the Lorenz63 system, demonstrating both effectiveness under operational constraints and statistical significance given a large sample.
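A minimal sketch of an AKD-style forecast density, under the assumptions just described: the ensemble is mapped affinely, each mapped member is dressed with a Gaussian kernel, and the mixture is blended with a (here Gaussian) climatological distribution. The parameter values below are placeholders; in practice they would be fitted jointly, e.g. by minimising the ignorance score on training data.

```python
import numpy as np

def akd_density(y, ensemble, a, b, s, w, clim_mean, clim_std):
    """Affine kernel dressing, sketched: map the ensemble affinely
    (z = a + b*x), dress each mapped member with a Gaussian kernel of
    width s, and blend the mixture with a climatological distribution
    (assumed Gaussian here) using weight w."""
    z = a + b * np.asarray(ensemble)
    kern = np.exp(-0.5 * ((y - z) / s) ** 2) / (s * np.sqrt(2 * np.pi))
    clim = (np.exp(-0.5 * ((y - clim_mean) / clim_std) ** 2)
            / (clim_std * np.sqrt(2 * np.pi)))
    return w * clim + (1 - w) * kern.mean()

# Placeholder parameters, for illustration only.
ens = np.array([0.8, 1.1, 1.3, 0.9, 1.6])
print(akd_density(1.0, ens, a=0.1, b=0.9, s=0.4, w=0.1,
                  clim_mean=0.0, clim_std=2.0))
```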

Relevance:

30.00%

Publisher:

Abstract:

An investigation is presented of a quasi-stationary convective system (QSCS) which occurred over the UK Southwest Peninsula on 21 July 2010. This system was remarkably similar in its location and structure to one which caused devastating flash flooding in the coastal village of Boscastle, Cornwall on 16 August 2004. However, in the 2010 case rainfall accumulations were around four times smaller and no flooding was recorded. The more extreme nature of the Boscastle case is shown to be related to three factors: (1) higher rain rates, associated with a warmer and moister tropospheric column and deeper convective clouds; (2) a more stationary system, due to slower evolution of the large-scale flow; and (3) distribution of the heaviest precipitation over fewer river catchments. Overall, however, the synoptic setting of the two events was broadly similar, suggesting that such conditions favour the development of QSCSs over the Southwest Peninsula. A numerical simulation of the July 2010 event was performed using a 1.5-km grid length configuration of the Met Office Unified Model. This reveals that convection was repeatedly initiated through lifting of low-level air parcels along a quasi-stationary coastal convergence line. Sensitivity tests are used to show that this convergence line was a sea breeze front which temporarily stalled along the coastline due to the retarding influence of an offshore-directed background wind component. Several deficiencies are noted in the 1.5-km model's representation of the storm system, including delayed convective initiation; however, significant improvements are observed when the grid length is reduced to 500 m. These result in part from an improved representation of the convergence line, which enhances the associated low-level ascent, allowing air parcels to more readily reach their level of free convection. The implications of this finding for forecasting convective precipitation are discussed.

Relevance:

30.00%

Publisher:

Abstract:

The climate of the Earth, like planetary climates in general, is broadly controlled by solar irradiation, planetary albedo and emissivity, as well as its rotation rate and the distribution of land (with its orography) and oceans. However, the majority of climate fluctuations that affect mankind are internal modes of the general circulation of the atmosphere and the oceans. Some of these modes, such as the El Niño-Southern Oscillation (ENSO), are quasi-regular and have some longer-term predictive skill; others, like the Arctic and Antarctic Oscillations, are chaotic and generally unpredictable beyond a few weeks. Studies using general circulation models indicate that internal processes dominate the regional climate and that some, like ENSO events, even have distinct global signatures. This is one of the reasons why it is so difficult to separate internal climate processes from external ones caused, for example, by changes in greenhouse gases and solar irradiation. However, the accumulation of the warmest seasons during the latest two decades lends strong support to greenhouse-gas forcing. As models become more comprehensive, they show a gradually broader range of internal processes, including those on longer time scales, further challenging the interpretation of the causes of past and present climate events.

Relevance:

30.00%

Publisher:

Abstract:

The Normal Quantile Transform (NQT) has been used in many hydrological and meteorological applications in order to make the Cumulative Distribution Function (CDF) of observed, simulated and forecast river discharge, water level or precipitation data Gaussian. It is also at the heart of the meta-Gaussian model for assessing the total predictive uncertainty of the Hydrological Uncertainty Processor (HUP) developed by Krzysztofowicz. In the field of geostatistics this transformation is better known as the Normal-Score Transform. In this paper, some possible problems caused by small sample sizes when applying the NQT in flood forecasting systems are discussed, and a novel way to solve them, combining extreme value analysis and non-parametric regression methods, is outlined. The method is illustrated by examples of hydrological stream-flow forecasts.
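The forward transform itself is short; a minimal sketch follows, assuming Weibull plotting positions i/(n+1) (other plotting positions are also in use). The small-sample problem the paper addresses arises at the inverse transform, where values beyond the largest observed quantile need an extreme-value tail model.

```python
import numpy as np
from scipy.stats import norm

def nqt(x):
    """Normal Quantile Transform: map each value to the standard-normal
    quantile of its empirical plotting position. Weibull positions
    i/(n+1) are an assumed choice here."""
    x = np.asarray(x, dtype=float)
    ranks = x.argsort().argsort() + 1          # ranks 1..n by value
    return norm.ppf(ranks / (len(x) + 1.0))

flows = np.array([12.0, 30.0, 7.5, 55.0, 21.0, 18.0, 90.0, 14.0])
print(nqt(flows))
# The back transform is where small samples bite: values beyond the
# largest observed quantile have no empirical counterpart and call for
# an extreme-value tail model, as the paper proposes.
```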

Relevance:

30.00%

Publisher:

Abstract:

Global communication requirements and load imbalance of some parallel data mining algorithms are the major obstacles to exploiting the computational power of large-scale systems. This work investigates how non-uniform data distributions can be exploited to remove the global communication requirement and to reduce the communication cost in parallel data mining algorithms, in particular in the k-means algorithm for cluster analysis. In the straightforward parallel formulation of the k-means algorithm, data and computation loads are uniformly distributed over the processing nodes. This approach has excellent load-balancing characteristics that may suggest it could scale up to large and extreme-scale parallel computing systems. However, at each iteration step the algorithm requires a global reduction operation, which hinders the scalability of the approach. This work studies a different parallel formulation of the algorithm in which the requirement of global communication is removed, while the deterministic nature of the centralised algorithm is maintained. The proposed approach exploits a non-uniform data distribution which can either be found in real-world distributed applications or be induced by means of multi-dimensional binary search trees. The approach can also be extended to accommodate an approximation error, which allows a further reduction of the communication costs. The effectiveness of the exact and approximate methods has been tested in a parallel computing system with 64 processors and in simulations with 1024 processing elements.
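A minimal sketch of the straightforward formulation's bottleneck: each node assigns its local points and forms partial sums and counts, and a global reduction combines them every iteration. The sums over the node axis below stand in for the MPI-style allreduce whose cost the paper's formulation removes; data layout and sizes are illustrative.

```python
import numpy as np

def kmeans_step(local_chunks, centroids):
    """One iteration of the straightforward parallel k-means: per-node
    assignment and partial statistics, followed by a global reduction
    (here simulated by summing over the node axis)."""
    k, d = centroids.shape
    partial_sums = np.zeros((len(local_chunks), k, d))
    partial_counts = np.zeros((len(local_chunks), k))
    for node, pts in enumerate(local_chunks):
        dists = ((pts[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
        labels = dists.argmin(axis=1)
        for j in range(k):
            partial_sums[node, j] = pts[labels == j].sum(axis=0)
            partial_counts[node, j] = (labels == j).sum()
    sums = partial_sums.sum(axis=0)       # <- global reduction
    counts = partial_counts.sum(axis=0)   # <- global reduction
    return sums / np.maximum(counts, 1)[:, None]  # guard empty clusters

rng = np.random.default_rng(3)
chunks = [rng.normal(c, 0.5, size=(100, 2)) for c in (0.0, 3.0, 6.0)]
centroids = np.array([[0.0, 0.0], [3.0, 3.0], [6.0, 6.0]])
print(kmeans_step(chunks, centroids))
```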

Relevance:

30.00%

Publisher:

Abstract:

This work investigated personal exposure to indoor particulate matter using the intake fraction metric and provided a possible way to trace the particles inhaled from an indoor particle source. A turbulence model validated against particle measurements in a room with an underfloor air distribution (UFAD) system was used to predict indoor particle concentrations. The inhalation intake fraction of indoor particles was defined and evaluated in two rooms equipped with UFAD: the experimental room and a small office. Based on the exposure characteristics and a typical respiratory rate, the intake fraction was determined in the two rooms for a continuous and an episodic (human cough) particle source, respectively. The findings showed that the well-mixed assumption of indoor air fails to give an accurate estimation of inhalation exposure, and that the average concentration at the return outlet, or within the overall room, could not relate the intake fraction well to the amount of particles emitted from an indoor source.
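The intake-fraction metric itself reduces to a short calculation: the mass inhaled at the breathing zone divided by the mass emitted. The sketch below assumes a rectangle-rule time integral, an exponentially decaying breathing-zone concentration and a nominal breathing rate; all numbers are illustrative, not the paper's measurements or CFD results.

```python
import numpy as np

def intake_fraction(conc_breathing, breathing_rate, mass_emitted, dt):
    """Inhalation intake fraction: mass inhaled over mass emitted,
    iF = Q_b * integral(C_breathing dt) / M_emitted, with the time
    integral approximated by a rectangle rule. In the study the
    concentration series would come from the validated CFD model at
    the breathing zone; here it is synthetic."""
    inhaled = breathing_rate * float(np.sum(conc_breathing) * dt)
    return inhaled / mass_emitted

# Hypothetical decaying breathing-zone concentration after an episodic
# (cough-like) release of 1 mg of particles; all values illustrative.
t = np.arange(0.0, 1800.0, 1.0)          # time, s
conc = 2e-8 * np.exp(-t / 300.0)         # kg/m^3 at the breathing zone
q_b = 1.0e-4                             # m^3/s (~6 L/min respiratory rate)
print(f"iF = {intake_fraction(conc, q_b, 1e-6, 1.0):.2e}")
```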