57 results for Underground distribution system
Abstract:
Geographic distributions of pathogens are the outcome of dynamic processes involving host availability, susceptibility and abundance, suitability of climate conditions, and historical contingency including evolutionary change. Distributions have changed rapidly in the past and are changing rapidly now in response to many factors, including climatic change. The response time of arable agriculture is intrinsically fast, but perennial crops and especially forests are unlikely to adapt easily. Predictions of many of the variables needed to anticipate changes in pathogen range are still rather uncertain, and their effects will be profoundly modified by changes elsewhere in the agricultural system, including both economic changes affecting growing systems and hosts and evolutionary changes in pathogens and hosts. Tools to predict changes based on environmental correlations depend on good primary data, which are often absent, and need to be checked against the historical record, which remains very poor for almost all pathogens. We argue that at present the uncertainty in predictions of change is so great that the important adaptive response is to monitor changes and to retain the capacity to innovate, both through access to economic capital with reasonably long-term rates of return and by retaining wide scientific expertise, including currently less fashionable specialisms.
Conditioning of incremental variational data assimilation, with application to the Met Office system
Abstract:
Implementations of incremental variational data assimilation require the iterative minimization of a series of linear least-squares cost functions. The accuracy and speed with which these linear minimization problems can be solved is determined by the condition number of the Hessian of the problem. In this study, we examine how different components of the assimilation system influence this condition number. Theoretical bounds on the condition number for a single parameter system are presented and used to predict how the condition number is affected by the observation distribution and accuracy and by the specified lengthscales in the background error covariance matrix. The theoretical results are verified in the Met Office variational data assimilation system, using both pseudo-observations and real data.
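The dependence on observation distribution, observation accuracy and background-error lengthscales can be made concrete with a small numerical sketch. Assuming a linear cost function J(x) = 0.5*(x - xb)' B^{-1} (x - xb) + 0.5*(y - Hx)' R^{-1} (y - Hx), the Hessian is S = B^{-1} + H' R^{-1} H and its condition number can be computed directly; the grid size, observation spacing, correlation lengthscale and error variances below are illustrative choices, not the Met Office configuration.

```python
import numpy as np

# Illustrative sketch: condition number of the Hessian
#   S = B^{-1} + H' R^{-1} H
# of a linear variational cost function. All parameter values are invented.
n, L, sigma_b, sigma_o = 100, 5.0, 1.0, 0.5
idx = np.arange(n)

# Background error covariance with exponential (Markov) correlations
# of lengthscale L on a 1-D grid.
B = sigma_b**2 * np.exp(-np.abs(idx[:, None] - idx[None, :]) / L)

# Observe every 10th grid point with uncorrelated errors of variance sigma_o^2.
obs = idx[::10]
H = np.zeros((obs.size, n))
H[np.arange(obs.size), obs] = 1.0
R_inv = np.eye(obs.size) / sigma_o**2

S = np.linalg.inv(B) + H.T @ R_inv @ H
print("condition number of the Hessian:", np.linalg.cond(S))
```

Re-running the sketch with denser observations, smaller sigma_o or a longer lengthscale L shows the condition number growing, and hence the linear minimisation converging more slowly, which is the behaviour the theoretical bounds in the study predict.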
Abstract:
Increased penetration of distributed generation and decentralised control are considered a feasible and effective solution for reducing the cost and emissions, and hence improving the efficiency, of power generation and distribution. Distributed generation in combination with multi-agent technology is a strong candidate for this solution. The pro-active and autonomous nature of multi-agent systems can provide an effective platform for decentralised control whilst improving the reliability and flexibility of the grid.
Abstract:
Ensemble forecasting of nonlinear systems involves the use of a model to run forward a discrete ensemble (or set) of initial states. Data assimilation techniques tend to focus on estimating the true state of the system, even though model error limits the value of such efforts. This paper argues for choosing the initial ensemble in order to optimise forecasting performance rather than estimate the true state of the system. Density forecasting and choosing the initial ensemble are treated as one problem. Forecasting performance can be quantified by some scoring rule. In the case of the logarithmic scoring rule, theoretical arguments and empirical results are presented. It turns out that, if the underlying noise dominates model error, we can diagnose the noise spread.
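As a concrete illustration of treating density forecasting and ensemble interpretation together, the sketch below converts an ensemble into a forecast density by Gaussian kernel dressing and evaluates the logarithmic (ignorance) score at the verification; the data and kernel width are ad hoc assumptions for illustration, not the paper's construction.

```python
import numpy as np
from scipy.stats import norm

# Hedged sketch: logarithmic score of an ensemble-derived density forecast.
rng = np.random.default_rng(0)
ensemble = rng.normal(loc=0.3, scale=1.0, size=32)  # synthetic forecast ensemble
verification = 0.1                                   # synthetic observed outcome

# Dress each member with a Gaussian kernel; the width is an ad hoc choice.
sigma = 0.5 * ensemble.std(ddof=1)
density = norm.pdf(verification, loc=ensemble, scale=sigma).mean()

log_score = -np.log(density)  # lower is better
print("ignorance (log) score:", log_score)
```

Choosing the initial ensemble to minimise the expected value of such a score, rather than to estimate the true state, is the reframing the abstract argues for.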
Abstract:
Air distribution systems are among the major consumers of electrical energy in air-conditioned commercial buildings; they maintain a comfortable indoor thermal environment and air quality by supplying specified amounts of treated air to different zones. The sizes of the air distribution lines affect the energy efficiency of the distribution system. Equal friction and static regain are two well-known approaches for sizing air distribution lines, and the T-method and IPS methods were later developed to address the life cycle cost of these systems. Hitherto, all of these methods have been based on static design conditions, so the dynamic performance of the system has not yet been addressed, even though air distribution systems mostly operate under dynamic rather than static conditions. Moreover, none of the existing methods considers thermal comfort or environmental impacts. This study reviews the existing methods for sizing air distribution systems and proposes a dynamic approach for size optimisation of the air distribution lines that takes economic aspects, environmental impacts and technical performance into account as optimisation criteria. These criteria are addressed through whole-life costing analysis, life cycle assessment, and deviation from the set-point temperatures of the different zones, respectively. Integrating these criteria into the TRNSYS software produces a novel dynamic optimisation approach for duct sizing; because the criteria are embedded in well-known performance evaluation software, the approach can be readily adopted by designers under the time pressures of practice. Comparison of this integrated approach with the existing methods reveals that, under the defined criteria, system performance improves by up to 15%. The approach is a significant step towards net zero emission buildings.
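To make the multi-criteria optimisation concrete, the sketch below minimises a weighted objective over a single duct diameter. The cost, emissions and comfort models and the weights are invented for illustration; the study itself evaluates these criteria through whole-life costing, life cycle assessment and TRNSYS simulation rather than closed-form expressions.

```python
from scipy.optimize import minimize_scalar

# Illustrative multi-criteria duct-sizing objective (all models invented).
def objective(d, w=(0.5, 0.3, 0.2)):
    # d: duct diameter [m], for a fixed design airflow.
    capital = 120.0 * d            # material cost grows with duct size
    fan_energy = 4.0 / d**5        # pressure drop scales ~1/d^5 at fixed flow
    emissions = 0.4 * fan_energy   # operational CO2 tied to fan energy
    comfort = 2.0 / d              # undersized ducts penalise zone control
    return w[0] * (capital + fan_energy) + w[1] * emissions + w[2] * comfort

res = minimize_scalar(objective, bounds=(0.1, 1.0), method="bounded")
print(f"optimal diameter under these assumptions: {res.x:.2f} m")
```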
Abstract:
In this paper, we study a model economy in order to examine the optimal intraday interest rate. Freeman (1996) shows that the efficient allocation can be implemented by adopting a policy in which the intraday rate is zero. We modify the production set and show that such a model economy can account for the non-uniform distribution of settlements within a day. In addition, by modifying both the consumption set and the production set, we show that the central bank may be able to implement the planner's allocation with a positive intraday interest rate.
Abstract:
The British system of development control is time-consuming and uncertain in outcome. Moreover, it is becoming increasingly overloaded as it has gradually switched away from a traditional 'is it an appropriate land use?' approach towards multi-faceted inspections of projects and negotiations over the distribution of the potential financial gains arising from them. Recent policy developments have centred on improving the operation of development control. This paper argues that more fundamental issues may be at stake as well. Important market changes have increased workloads. Furthermore, the UK planning system's institutional framework encourages change to move in specific directions, which is not always helpful. If expectations of increased long-term housing supply are to be met, more substantial changes to development control may be essential, though hard to achieve.
Abstract:
One goal in the development of distributed virtual environments (DVEs) is to create a system such that users are unaware of the distribution; that is, the distribution should be transparent. The paper begins by discussing the general issues in DVEs that might make this possible, and a system that allows some level of distribution transparency is described. The system described suffers from effects of inconsistency, which in turn cause undesirable visual effects. The causal surface is introduced as a solution that removes these visual effects. The paper then introduces two determining factors of distribution transparency, relating to user perception and performance. With regard to these factors, two hypotheses are stated relating to the causal surface. A user trial with forty-five subjects is used to validate the hypotheses. A discussion of the results of the trial concludes that the causal surface solution significantly improves the distribution transparency in a DVE.
Abstract:
The problem of calculating the probability of error in a DS/SSMA system has been extensively studied for more than two decades. When random sequences are employed, some conditioning must be done before the application of the central limit theorem is attempted, leading to a Gaussian distribution. The authors seek to characterise the multiple-access interference as a random walk with a random number of steps, for random and deterministic sequences. Using results from random-walk theory, they model the interference as a K-distributed random variable and use it to calculate the probability of error, in the form of a series, for a DS/SSMA system with a coherent correlation receiver and BPSK modulation under Gaussian noise. The asymptotic properties of the proposed distribution agree with other analyses. This is, to the best of the authors' knowledge, the first attempt to propose a non-Gaussian distribution for the interference. The modelling can be extended to consider multipath fading and general modulation.
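The compound (gamma-mixed Rayleigh) representation of the K distribution makes a quick Monte Carlo check of the error probability straightforward; the sketch below simulates BPSK decisions under K-distributed interference plus Gaussian noise, with all parameter values assumed for illustration rather than taken from the paper's series solution.

```python
import numpy as np

# Hedged Monte Carlo sketch: BER of BPSK with K-distributed multiple-access
# interference plus Gaussian noise. All parameters are illustrative.
rng = np.random.default_rng(1)
n, nu, Eb, sigma_n = 1_000_000, 1.5, 1.0, 0.4

bits = rng.choice([-1.0, 1.0], size=n)

# K-distributed amplitude via its compound representation: a Rayleigh
# variate whose mean power (texture) is gamma-distributed; a random sign
# makes the interference symmetric about zero.
texture = rng.gamma(shape=nu, scale=1.0 / nu, size=n)
interference = rng.rayleigh(scale=0.3, size=n) * np.sqrt(texture)
interference *= rng.choice([-1.0, 1.0], size=n)

noise = rng.normal(0.0, sigma_n, size=n)
decisions = np.sign(np.sqrt(Eb) * bits + interference + noise)
print("estimated BER:", np.mean(decisions != bits))
```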
Abstract:
To understand the resilience of aquatic ecosystems to environmental change, it is important to determine how multiple, related environmental factors, such as near-surface air temperature and river flow, will change during the next century. This study develops a novel methodology that combines statistical downscaling and fish species distribution modeling to enhance the understanding of how global climate changes (modeled by global climate models at coarse resolution) may affect local riverine fish diversity. The novelty of this work is the downscaling framework developed to provide suitable future projections of fish habitat descriptors, focusing particularly on hydrology, which has rarely been considered in previous studies. The proposed modeling framework was developed and tested in a major European system, the Adour-Garonne river basin (SW France, 116,000 km²), which covers distinct hydrological and thermal regions from the Pyrenees to the Atlantic coast. The simulations suggest that, by 2100, the mean annual stream flow is projected to decrease by approximately 15% and temperature to increase by approximately 1.2 °C, on average. As a consequence, the majority of cool- and warm-water fish species are projected to expand their geographical ranges within the basin, while the few cold-water species will experience a reduction in their distribution. The limitations and potential benefits of the proposed modeling approach are discussed.
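A minimal sketch of the species-distribution step follows: a logistic model links synthetic presence/absence records to two downscaled habitat descriptors (flow and temperature), and the basin-average changes quoted above (a 15% flow decrease and 1.2 °C warming) are applied to compare present and future habitat suitability. The data, model form and coefficients are illustrative assumptions, not the study's fitted models.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hedged sketch: a logistic species-distribution model on synthetic data.
rng = np.random.default_rng(2)
n = 500
flow = rng.lognormal(mean=3.0, sigma=0.5, size=n)  # mean annual flow [m^3/s]
temp = rng.normal(loc=12.0, scale=3.0, size=n)     # temperature [deg C]

# Synthetic presence/absence with an assumed preference for warmer water.
logit = 0.3 * (temp - 12.0) - 0.01 * (flow - 20.0)
presence = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

model = LogisticRegression().fit(np.column_stack([flow, temp]), presence)

# Apply the basin-average projections quoted in the abstract:
# -15% flow, +1.2 deg C.
X_now = np.column_stack([flow, temp])
X_2100 = np.column_stack([flow * 0.85, temp + 1.2])
print("mean suitability, present:", model.predict_proba(X_now)[:, 1].mean())
print("mean suitability, 2100:   ", model.predict_proba(X_2100)[:, 1].mean())
```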
Abstract:
In language contact studies, specific features of the contact languages are often seen to be the result of transfer (interference), but it remains difficult to disentangle the role of intra-systemic and inter-systemic factors. We propose to unravel these factors in the analysis of a feature of Brussels French which many researchers attribute to transfer from (Brussels) Dutch: the adverbial use of une fois. We compare the use of this particle in Brussels French with its occurrence in corpora of other varieties of French, including several that have not been influenced by a Germanic substrate or adstrate. A detailed analysis of the frequency of occurrence, the functions and the distribution of the particle over different syntactic positions shows that some uses of une fois can be traced back to sixteenth-century French, but that there is also ample evidence for overt and covert transfer (Mougeon and Beniak, 1991) from Brussels Dutch.
Abstract:
Let θ denote the level of quality inherent in a food product that is delivered to some terminal market. In this paper, I characterize allocations over θ and provide an economic rationale for regulating safety and quality standards in the food system. Zusman and Bockstael investigate the theoretical foundations for imposing standards and stress the importance of providing a tractable conceptual foundation. Despite a wealth of contributions that are mainly empirical (for reviews of these works see, respectively, Caswell and Antle), there have been relatively few attempts to model formally the linkages between farm and food markets when food quality and consumer safety are at issue. Here, I attempt to provide such a framework, building on key contributions in the theoretical literature and linking them in a simple model of quality determination in a vertically related marketing channel. The food-marketing model is due to Gardner. Spence provides a foundation for Pareto-improving intervention in a deterministic model of quality provision, and Leland, building on the classic paper by Akerlof, investigates licensing and minimum standards when the information structure is incomplete. Linking these ideas in a satisfactory model of the food markets is the main objective of the paper.
Abstract:
The translation of an ensemble of model runs into a probability distribution is a common task in model-based prediction. Common methods for such ensemble interpretations proceed as if verification and ensemble were draws from the same underlying distribution, an assumption that is not viable for most, if any, real-world ensembles. An alternative is to consider an ensemble merely as a source of information rather than as the possible scenarios of reality. This approach, which looks for maps between ensembles and probability distributions, is investigated and extended. Common methods are revisited, and an improvement to standard kernel dressing, called 'affine kernel dressing' (AKD), is introduced. AKD assumes an affine mapping between ensemble and verification, typically acting not on individual ensemble members but on the entire ensemble as a whole. The parameters of this mapping are determined in parallel with the other dressing parameters, including a weight assigned to the unconditioned (climatological) distribution. These amendments to standard kernel dressing, albeit simple, can improve performance significantly and are shown to be appropriate for both overdispersive and underdispersive ensembles, unlike standard kernel dressing, which exacerbates overdispersion. Studies are presented using operational numerical weather predictions for two locations and data from the Lorenz63 system, demonstrating both effectiveness given operational constraints and statistical significance given a large sample.
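A hedged sketch of the construction: the ensemble is transformed as a whole by an affine map x -> a + b*x, each transformed member is dressed with a Gaussian kernel, and a weight w mixes in the climatological distribution. In practice a, b, sigma and w would be fitted together, for example by minimising the logarithmic score on training data; the fixed values below are illustrative only.

```python
import numpy as np
from scipy.stats import norm

# Hedged sketch of affine kernel dressing (AKD); parameter values are
# illustrative, not fitted as they would be in practice.
def akd_density(y, ensemble, a, b, sigma, w, clim_mean, clim_std):
    # Affine map applied to the ensemble as a whole, then kernel dressing.
    dressed = norm.pdf(y, loc=a + b * ensemble, scale=sigma).mean()
    # Mixture with the unconditioned (climatological) distribution.
    clim = norm.pdf(y, loc=clim_mean, scale=clim_std)
    return (1.0 - w) * dressed + w * clim

rng = np.random.default_rng(3)
ensemble = rng.normal(0.5, 0.8, size=24)  # synthetic 24-member ensemble
print(akd_density(y=0.2, ensemble=ensemble, a=0.0, b=0.9, sigma=0.4,
                  w=0.1, clim_mean=0.0, clim_std=1.0))
```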
Abstract:
An investigation is presented of a quasi-stationary convective system (QSCS) which occurred over the UK Southwest Peninsula on 21 July 2010. This system was remarkably similar in its location and structure to one which caused devastating flash flooding in the coastal village of Boscastle, Cornwall on 16 August 2004. However, in the 2010 case rainfall accumulations were around four times smaller and no flooding was recorded. The more extreme nature of the Boscastle case is shown to be related to three factors: (1) higher rain rates, associated with a warmer and moister tropospheric column and deeper convective clouds; (2) a more stationary system, due to slower evolution of the large-scale flow; and (3) distribution of the heaviest precipitation over fewer river catchments. Overall, however, the synoptic setting of the two events was broadly similar, suggesting that such conditions favour the development of QSCSs over the Southwest Peninsula. A numerical simulation of the July 2010 event was performed using a 1.5-km grid length configuration of the Met Office Unified Model. This reveals that convection was repeatedly initiated through lifting of low-level air parcels along a quasi-stationary coastal convergence line. Sensitivity tests are used to show that this convergence line was a sea breeze front which temporarily stalled along the coastline due to the retarding influence of an offshore-directed background wind component. Several deficiencies are noted in the 1.5-km model’s representation of the storm system, including delayed convective initiation; however, significant improvements are observed when the grid length is reduced to 500 m. These result in part from an improved representation of the convergence line, which enhances the associated low-level ascent allowing air parcels to more readily reach their level of free convection. The implications of this finding for forecasting convective precipitation are discussed.
Abstract:
Much of mainstream economic analysis assumes that markets adjust smoothly, through prices, to changes in economic conditions. However, this is not necessarily the case for local housing markets, whose spatial structures may exhibit persistence, so that conditions may not be those most suited to the requirements of modern-day living. Persistence can arise from the existence of transaction costs. The paper tests the proposition that housing markets in Inner London exhibit a degree of path dependence, through the construction of a three-equation model, and examines the impact of variables constructed for the 19th and early 20th centuries on modern house prices. These include 19th-century social structures, slum clearance programmes and the 1908 underground network. Each is found to be significant. The tests require the construction of novel historical datasets, which are also described in the paper.