939 results for Pseudorandom generator
Abstract:
This paper describes a new method for the assessment of palaeohydrology through the Holocene. A palaeoclimate model was linked with a hydrological model, using a weather generator to correct bias in the rainfall estimates, to simulate the changes in the flood frequency and the groundwater response through the late Pleistocene and Holocene for the Wadi Faynan in southern Jordan, a site considered internationally important due to its rich archaeological heritage spanning the Pleistocene and Holocene. This is the first study to describe the hydrological functioning of the Wadi Faynan, a meso-scale (241 km²) semi-arid catchment, setting this description within the framework of contemporary archaeological investigations. Historic meteorological records were collated and supplemented with new hydrological and water quality data. The modelled outcomes indicate that environmental changes, such as deforestation, had a major impact on the local water cycle and this amplified the effect of the prevailing climate on the flow regime. The results also show that increased rainfall alone does not necessarily imply better conditions for farming and highlight the importance of groundwater. The discussion focuses on the utility of the method and the importance of the local hydrology to the sustained settlement of the Wadi Faynan through prehistory and history.
Abstract:
Harmonic analysis on configuration spaces is used in order to extend explicit expressions for the images of creation, annihilation, and second quantization operators in L2-spaces with respect to Poisson point processes to a set of functions larger than the space obtained by directly using chaos expansion. This permits, in particular, the derivation of an explicit expression for the generator of the second quantization of a sub-Markovian contraction semigroup on a set of functions which forms a core of the generator.
Abstract:
Almost all the electricity currently produced in the UK is generated as part of a centralised power system designed around large fossil fuel or nuclear power stations. This power system is robust and reliable, but the efficiency of power generation is low, resulting in large quantities of waste heat. The principal aim of this paper is to investigate an alternative concept: energy production by small-scale generators in close proximity to the energy users, integrated into microgrids. Microgrids (de-centralised electricity generation combined with on-site production of heat) bear the promise of substantial environmental benefits, brought about by higher energy efficiency and by facilitating the integration of renewable sources such as photovoltaic arrays or wind turbines. By virtue of the good match between generation and load, microgrids have a low impact on the electricity network, despite a potentially significant level of generation by intermittent energy sources. The paper discusses the technical and economic issues associated with this novel concept, giving an overview of the generator technologies, the current regulatory framework in the UK, and the barriers that have to be overcome if microgrids are to make a major contribution to the UK energy supply. The focus of this study is a microgrid of domestic users powered by small Combined Heat and Power generators and photovoltaics. Focusing on the energy balance between generation and load, it is found that the optimum combination of generators in the microgrid (around 1.4 kWp of PV array per household and 45% household ownership of micro-CHP generators) will maintain energy balance on a yearly basis if supplemented by energy storage of 2.7 kWh per household. We find that there is no fundamental technological reason why microgrids cannot supply an appreciable part of the UK energy demand.
Indeed, an estimate of cost indicates that the microgrids considered in this study would supply electricity at a cost comparable with the present electricity supply if the current support mechanisms for photovoltaics were maintained. Combining photovoltaics, micro-CHP and a small battery gives a microgrid that is independent of the national electricity network. In the short term, this has particular benefits for remote communities, but more wide-ranging possibilities open up in the medium to long term. Microgrids could meet the need to replace current-generation nuclear and coal-fired power stations, greatly reducing the demand on the transmission and distribution network.
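The yearly energy-balance bookkeeping behind the figures above can be sketched as follows. Only the three headline numbers (1.4 kWp of PV, 45% micro-CHP ownership, 2.7 kWh of storage per household) come from the abstract; the load and generation profiles are invented placeholders, not data from the paper.

```python
# Toy per-household microgrid energy balance over one year.
# Profiles are illustrative shapes only; see lead-in for what is assumed.
import math

HOURS = 8760
PV_KWP = 1.4            # PV capacity per household (from the abstract)
CHP_SHARE = 0.45        # fraction of households with micro-CHP (abstract)
STORAGE_KWH = 2.7       # battery per household (abstract)

def pv_output(hour):
    """Crude diurnal/seasonal PV profile in kW (placeholder shape)."""
    day = hour // 24
    seasonal = 0.5 + 0.5 * math.cos(2 * math.pi * (day - 172) / 365)
    diurnal = max(0.0, math.sin(math.pi * (hour % 24 - 6) / 12))
    return PV_KWP * 0.2 * seasonal * diurnal

def chp_output(hour):
    """Micro-CHP runs mainly in the heating season, scaled by ownership."""
    day = hour // 24
    heating = 0.5 + 0.5 * math.cos(2 * math.pi * day / 365)  # peaks in winter
    return CHP_SHARE * 1.0 * heating  # assumed 1 kWe unit

def load(hour):
    """Flat average domestic load with an evening bump (placeholder)."""
    evening = 0.3 if 17 <= hour % 24 <= 21 else 0.0
    return 0.45 + evening

soc = STORAGE_KWH / 2           # battery state of charge, start half full
imported = exported = 0.0       # energy the microgrid fails to balance
for h in range(HOURS):
    net = pv_output(h) + chp_output(h) - load(h)   # kWh over one hour
    soc += net
    if soc > STORAGE_KWH:       # battery full: surplus is exported
        exported += soc - STORAGE_KWH
        soc = STORAGE_KWH
    elif soc < 0:               # battery empty: deficit is imported
        imported += -soc
        soc = 0.0

print(f"imported {imported:.0f} kWh, exported {exported:.0f} kWh per household-year")
```

The printed import/export totals show how far the assumed profiles are from the self-sufficiency the paper reports; with the paper's real profiles the residual would be near zero.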
Abstract:
This paper extends and clarifies results of Steinsaltz and Evans [Trans. Amer. Math. Soc. 359 (2007) 1285–1324], which found conditions for convergence of a killed one-dimensional diffusion conditioned on survival to a quasistationary distribution whose density is given by the principal eigenfunction of the generator. Under the assumption that the limit of the killing at infinity differs from the principal eigenvalue, we prove that convergence to quasistationarity occurs if and only if the principal eigenfunction is integrable. When the killing at ∞ is larger than the principal eigenvalue, the eigenfunction is always integrable. When the killing at ∞ is smaller, the eigenfunction is integrable only when the unkilled process is recurrent; otherwise, the density of the process conditioned on survival converges to 0 on any bounded interval.
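In symbols, the dichotomy stated above can be restated as follows. This is a hedged paraphrase, with notation assumed rather than taken from the paper: φ denotes the principal eigenfunction of the generator and τ the killing time.

```latex
% Hedged restatement: convergence to the quasistationary distribution with
% density proportional to \varphi holds exactly when \varphi is integrable.
\lim_{t\to\infty} \mathbb{P}_x\bigl(X_t \in A \mid t < \tau\bigr)
  \;=\; \frac{\int_A \varphi(y)\,\mathrm{d}y}{\int \varphi(y)\,\mathrm{d}y}
\qquad\text{if and only if}\qquad
\int \varphi(y)\,\mathrm{d}y \;<\; \infty .
```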
Abstract:
Design summer years representing near-extreme hot summers have been used in the United Kingdom for the evaluation of thermal comfort and overheating risk. The years have been selected from measured weather data, taken as representative of an assumed stationary climate. Recent developments have made available ‘morphed’ equivalents of these years by shifting and stretching the measured variables using change factors produced by the UKCIP02 climate projections. The release of the latest, probabilistic, climate projections of UKCP09, together with the availability of a weather generator that can produce plausible daily or hourly sequences of weather variables, has opened up the opportunity for generating new design summer years which can be used in risk-based decision-making. There are many possible methods for the production of design summer years from UKCP09 output: in this article, the original concept of the design summer year is largely retained, but a number of alternative methodologies for generating the years are explored. An alternative, more robust measure of warmth (weighted cooling degree hours) is also employed. It is demonstrated that the UKCP09 weather generator is capable of producing years for the baseline period which are comparable with those in current use. Four methodologies for the generation of future years are described, and their output related to the future (deterministic) years that are currently available. It is concluded that, in general, years produced from the UKCP09 projections are warmer than those generated previously. Practical applications: The methodologies described in this article will facilitate designers who have access to the output of the UKCP09 weather generator (WG) to generate Design Summer Year hourly files tailored to their needs. The files produced will differ according to the methodology selected, in addition to location, emissions scenario and timeslice.
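A weighted cooling degree hours metric of the kind mentioned above can be sketched as follows. The base temperature (22 °C) and the squared-exceedance weighting are illustrative choices, not necessarily the weighting the article uses; the idea is simply that hot hours count disproportionately more than mildly warm ones.

```python
# Hedged sketch of a weighted cooling degree hours (WCDH) measure of warmth:
# each hour above the base temperature contributes its exceedance raised to a
# power, so near-extreme hours dominate the total.

def weighted_cdh(hourly_temps, base=22.0, power=2):
    """Sum of (T - base)**power over hours with T above base."""
    return sum((t - base) ** power for t in hourly_temps if t > base)

# A cool day contributes nothing; a short hot spell dominates the total.
cool_day = [18.0] * 24
hot_spell = [20, 23, 26, 29, 27, 24, 21, 19]
print(weighted_cdh(cool_day))    # → 0
print(weighted_cdh(hot_spell))   # → 95.0  (1 + 16 + 49 + 25 + 4)
```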
Abstract:
The global atmospheric electric circuit is driven by thunderstorms and electrified rain/shower clouds and is also influenced by energetic charged particles from space. The global circuit maintains the ionosphere as an equipotential at ∼+250 kV with respect to the good conducting Earth (both land and oceans). Its “load” is the fair weather atmosphere and semi-fair weather atmosphere at large distances from the disturbed weather “generator” regions. The main solar-terrestrial (or space weather) influence on the global circuit arises from spatially and temporally varying fluxes of galactic cosmic rays (GCRs) and energetic electrons precipitating from the magnetosphere. All components of the circuit exhibit much variability in both space and time. Global circuit variations between solar maximum and solar minimum are considered together with Forbush decrease and solar flare effects. The variability in ion concentration and vertical current flow are considered in terms of radiative effects in the troposphere, through infra-red absorption, and cloud effects, in particular possible cloud microphysical effects from charging at layer cloud edges. The paper identifies future research areas in relation to Task Group 4 of the Climate and Weather of the Sun-Earth System (CAWSES-II) programme.
Abstract:
For a Lévy process ξ=(ξ_t)_{t≥0} drifting to −∞, we define the so-called exponential functional I_ξ = ∫_0^∞ e^{ξ_t} dt. Under mild conditions on ξ, we show that the following factorization of exponential functionals holds: I_ξ =(d) I_{H^-} × I_Y, where × stands for the product of independent random variables, H^- is the descending ladder height process of ξ and Y is a spectrally positive Lévy process with a negative mean constructed from its ascending ladder height process. As a by-product, we derive an integral or power series representation for the law of I_ξ for a large class of Lévy processes with two-sided jumps and also derive some new distributional properties. The proof of our main result relies on a fine Markovian study of a class of generalized Ornstein–Uhlenbeck processes, which is itself of independent interest. We use and refine an alternative approach of studying the stationary measure of a Markov process which avoids some technicalities and difficulties that appear in the classical method of employing the generator of the dual Markov process.
Abstract:
This article reports the results of an experiment that examined how demand aggregators can discipline vertically-integrated firms (generator and distributor-retailer holdings) that have a high share in a wholesale electricity market with a uniform price double auction (UPDA). We initially develop a treatment where holding members redistribute the profit based on the imposition of supra-competitive prices, in equal proportions (50%-50%). Subsequently, we introduce a vertical disintegration (unbundling) treatment with information sharing between the holding's members, where profits are distributed according to market outcomes. Finally, a third treatment introduces two active demand aggregators with flexible interruptible loads in real time. We found that the introduction of responsive demand aggregators neutralizes market power and increases market efficiency, even beyond what is achieved through vertical disintegration.
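The uniform price double auction named above can be sketched as follows. This is a generic clearing rule over unit-quantity offers, with midpoint pricing as one common convention; the experiment's exact market rules may differ.

```python
# Minimal uniform-price double-auction (UPDA) clearing sketch: match the
# highest bids with the cheapest asks, and settle every trade at one price.

def clear_upda(bids, asks):
    """Clear a double auction over unit-quantity offers.

    bids: limit prices buyers will pay; asks: limit prices sellers accept.
    Returns (uniform_price, traded_quantity).
    """
    bids = sorted(bids, reverse=True)   # highest willingness-to-pay first
    asks = sorted(asks)                 # cheapest generation first
    qty = 0
    while qty < min(len(bids), len(asks)) and bids[qty] >= asks[qty]:
        qty += 1
    if qty == 0:
        return None, 0
    # One common uniform-price rule: midpoint of the marginal bid and ask.
    return (bids[qty - 1] + asks[qty - 1]) / 2, qty

price, qty = clear_upda([50, 40, 30, 20], [10, 25, 35, 45])
print(price, qty)   # → 32.5 2
```

Supra-competitive bidding by a holding raises the marginal ask and thus the uniform price; flexible demand that withdraws at high prices pushes it back down, which is the disciplining effect the experiment measures.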
Abstract:
Regional climate downscaling has arrived at an important juncture. Some in the research community favour continued refinement and evaluation of downscaling techniques within a broader framework of uncertainty characterisation and reduction. Others are calling for smarter use of downscaling tools, accepting that conventional, scenario-led strategies for adaptation planning have limited utility in practice. This paper sets out the rationale and new functionality of the Decision Centric (DC) version of the Statistical DownScaling Model (SDSM-DC). This tool enables synthesis of plausible daily weather series, exotic variables (such as tidal surge), and climate change scenarios guided, not determined, by climate model output. Two worked examples are presented. The first shows how SDSM-DC can be used to reconstruct and in-fill missing records based on calibrated predictor-predictand relationships. Daily temperature and precipitation series from sites in Africa, Asia and North America are deliberately degraded to show that SDSM-DC can reconstitute lost data. The second demonstrates the application of the new scenario generator for stress testing a specific adaptation decision. SDSM-DC is used to generate daily precipitation scenarios to simulate winter flooding in the Boyne catchment, Ireland. This sensitivity analysis reveals the conditions under which existing precautionary allowances for climate change might be insufficient. We conclude by discussing the wider implications of the proposed approach and research opportunities presented by the new tool.
Abstract:
Catastrophe risk models used by the insurance industry are likely subject to significant uncertainty, but due to their proprietary nature and strict licensing conditions they are not available for experimentation. In addition, even if such experiments were conducted, these would not be repeatable by other researchers because commercial confidentiality issues prevent the details of proprietary catastrophe model structures from being described in public domain documents. However, such experimentation is urgently required to improve decision making in both insurance and reinsurance markets. In this paper we therefore construct our own catastrophe risk model for flooding in Dublin, Ireland, in order to assess the impact of typical precipitation data uncertainty on loss predictions. As we consider only a city region rather than a whole territory and have access to detailed data and computing resources typically unavailable to industry modellers, our model is significantly more detailed than most commercial products. The model consists of four components: a stochastic rainfall module, a hydrological and hydraulic flood hazard module, a vulnerability module, and a financial loss module. Using these we undertake a series of simulations to test the impact of driving the stochastic event generator with four different rainfall data sets: ground gauge data, gauge-corrected rainfall radar, meteorological reanalysis data (European Centre for Medium-Range Weather Forecasts Reanalysis-Interim; ERA-Interim) and a satellite rainfall product (the Climate Prediction Center morphing method; CMORPH). Catastrophe models are unusual because they use the upper three components of the modelling chain to generate a large synthetic database of unobserved and severe loss-driving events for which estimated losses are calculated.
We find the loss estimates to be more sensitive to uncertainties propagated from the driving precipitation data sets than to other uncertainties in the hazard and vulnerability modules, suggesting that the range of uncertainty within catastrophe model structures may be greater than commonly believed.
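The four-module chain described above can be sketched end to end. Every function and number here is a placeholder chosen to show the structure of the modelling chain only; none of it reproduces the Dublin model.

```python
# Toy catastrophe-model chain:
# stochastic rainfall -> flood hazard -> vulnerability -> financial loss.
import random

random.seed(42)

def rainfall_event():
    """Stochastic module: draw a storm depth in mm (heavy-tailed, illustrative)."""
    return 20 / random.random() ** 0.5

def flood_depth(rain_mm):
    """Hazard module: crude rainfall-to-inundation-depth transform (m)."""
    return max(0.0, (rain_mm - 50) / 100)

def damage_fraction(depth_m):
    """Vulnerability module: depth-damage curve, saturating at total loss."""
    return min(1.0, depth_m / 2)

def loss(rain_mm, property_value=250_000):
    """Financial module: damage fraction times exposed value."""
    return damage_fraction(flood_depth(rain_mm)) * property_value

# Synthetic event set of unobserved severe events, as catastrophe models use.
events = [rainfall_event() for _ in range(10_000)]
losses = sorted(loss(r) for r in events)
mean_loss = sum(losses) / len(losses)
pml_99 = losses[int(0.99 * len(losses))]      # 99th-percentile event loss
print(f"mean loss {mean_loss:.0f}, 99th percentile {pml_99:.0f}")
```

Swapping the `rainfall_event` draw for samples calibrated to each of the four precipitation data sets, while holding the other modules fixed, is the shape of the sensitivity experiment the paper performs.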
Abstract:
This paper demonstrates the oscillatory characteristics of electrical signals acquired from two ornamental plant types (Epipremnum pinnatum and Philodendron scandens, family Araceae) using a noninvasive acquisition system. The electrical signal was recorded using Ag/AgCl superficial electrodes inside a Faraday cage. The presence of an oscillatory electric generator was shown using a classical power spectral density estimate. The Lempel and Ziv complexity measure showed that the plant signal was not noise, despite its nonlinear behavior. The oscillatory characteristics of the signal were explained using a simulated electrical model, which establishes that for the frequency range from 5 to 15 Hz the oscillatory characteristic is stronger than for other frequency ranges. All results show that noninvasive electrical plant signals can be acquired with an improved signal-to-noise ratio using a Faraday cage, and that a simple electrical model is able to explain the electrical signal being generated.
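A Lempel–Ziv-style complexity measure of the kind used above to argue the signal is not noise can be illustrated as follows. This is the simple LZ78-style distinct-phrase count, which may differ in detail from the paper's exact measure; the point is that regular sequences parse into few phrases while noise-like ones parse into many.

```python
# LZ78-style phrase count as a simple Lempel-Ziv complexity measure on a
# binarized signal: more distinct phrases = more complex / noise-like.
import random

def lz_complexity(s):
    """Number of distinct phrases in an LZ78-style incremental parse of s."""
    phrases = set()
    phrase = ""
    for ch in s:
        phrase += ch
        if phrase not in phrases:
            phrases.add(phrase)
            phrase = ""
    return len(phrases) + (1 if phrase else 0)

periodic = "01" * 32                      # highly regular signal
random.seed(0)
noisy = "".join(random.choice("01") for _ in range(64))
# The noisy string typically yields the larger count.
print(lz_complexity(periodic), lz_complexity(noisy))
```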
Abstract:
We quantified gait and stride characteristics (velocity, frequency, stride length, stance and swing duration, and duty factor) in the bursts of locomotion of two small, intermittently moving, closely related South American gymnophthalmid lizards: Vanzosaura rubricauda and Procellosaurinus tetradactylus. They occur in different environments: V. rubricauda is widely distributed in open areas with various habitats and substrates, while P. tetradactylus is endemic to dunes in the semi-arid Brazilian Caatinga. Both use trot or walking trot characterised by a lateral sequence. For various substrates in a gradient of roughness (perspex, cardboard, sand, gravel), both species have low relative velocities in comparison with those reported for larger continuously moving lizards. To generate velocity, these animals increase stride frequency but decrease relative stride length. For these parameters, P. tetradactylus showed lower values than V. rubricauda. In their relative range of velocities, no significant differences in stride length and frequency were recorded for gravel. However, the slopes of a correlation between velocity and its components were lower in P. tetradactylus on cardboard, whereas on sand this was only observed for velocity and stride length. The data showed that the difference in rhythmic parameters between both species increased with the smoothness of the substrates. Moreover, P. tetradactylus shows a highly specialised locomotor strategy involving lower stride length and frequency for generating lower velocities than in V. rubricauda. This suggests the evolution of a central motor pattern generator to control slower limb movements and to produce fewer and longer pauses in intermittent locomotion.
Abstract:
Mutation testing has been used to assess the quality of test case suites by analyzing their ability to distinguish the artifact under testing from a set of alternative artifacts, the so-called mutants. The mutants are generated from the artifact under testing by applying a set of mutant operators, which produce artifacts with simple syntactical differences. The mutant operators are usually based on typical errors that occur during software development and can be related to a fault model. In this paper, we propose a language, named MuDeL (MUtant DEfinition Language), for the definition of mutant operators, aiming not only at automating mutant generation, but also at providing precision and formality to the operator definitions. The proposed language is based on concepts from the transformational and logical programming paradigms, as well as from context-free grammar theory. The denotational semantics formal framework is employed to define the semantics of the MuDeL language. We also describe a system, named mudelgen, developed to support the use of this language. An executable representation of the denotational semantics of the language is used to check the correctness of the implementation of mudelgen. At the very end, a mutant generator module is produced, which can be incorporated into a specific mutant tool/environment.
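What a mutant operator does can be illustrated with a small sketch (this is not MuDeL itself, just a generic AST-based operator in Python): apply one simple syntactic change to the artifact under test, producing a mutant that a good test suite should kill.

```python
# A mutant operator as an AST transformation: replace binary '+' with '-'.
import ast

class AddToSub(ast.NodeTransformer):
    """Mutant operator: arithmetic operator replacement (+ becomes -)."""
    def visit_BinOp(self, node):
        self.generic_visit(node)
        if isinstance(node.op, ast.Add):
            node.op = ast.Sub()
        return node

source = "def total(a, b):\n    return a + b\n"
tree = AddToSub().visit(ast.parse(source))
ast.fix_missing_locations(tree)
mutant_src = ast.unparse(tree)          # the mutant's source code

scope = {}
exec(compile(tree, "<mutant>", "exec"), scope)
mutant_total = scope["total"]

# A test asserting total(2, 3) == 5 kills this mutant: it returns -1 instead.
print(mutant_src)
print(mutant_total(2, 3))   # → -1
```

A mutant-definition language like MuDeL lifts this kind of transformation out of host-language code into declarative, formally specified operator descriptions.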
Abstract:
A novel cryptography method based on the Lorenz attractor chaotic system is presented. The proposed algorithm is secure and fast, making it practical for general use. We introduce the chaotic operation mode, which provides an interaction among the password, the message and a chaotic system. It ensures that the algorithm yields a secure codification, even if the nature of the chaotic system is known. The algorithm has been implemented in two versions: one sequential and slow, and the other parallel and fast. Our algorithm assures the integrity of the ciphertext (we know if it has been altered, which is not assured by traditional algorithms) and consequently its authenticity. Numerical experiments are presented and discussed, and show the behavior of the method in terms of security and performance. The fast version of the algorithm has performance comparable to AES, a cryptographic standard in widespread commercial use, but it is more secure, which makes it immediately suitable for general-purpose cryptography applications. An internet page has been set up, which enables readers to test the algorithm and also to try to break the cipher.
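The general idea of chaos-based encryption can be sketched as follows. This is a generic illustration, not the paper's algorithm: the password seeds the initial condition of the Lorenz system, whose orbit is sampled to form a keystream XORed with the message. A real design needs far more care (including the integrity protection the paper adds).

```python
# Toy Lorenz-keystream stream cipher (illustrative only, not secure).
import hashlib

def lorenz_keystream(password, n, dt=0.005, burn=1000):
    """Derive n pseudo-random bytes from a Lorenz orbit seeded by the password."""
    h = hashlib.sha256(password.encode()).digest()
    # Map the password hash to a small perturbation of the initial condition.
    x = 1.0 + h[0] / 256
    y = 1.0 + h[1] / 256
    z = 20.0 + h[2] / 256
    sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0   # classic Lorenz parameters
    out = []
    for i in range(burn + n):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz   # Euler step
        if i >= burn:
            out.append(int(abs(x) * 1e6) % 256)   # sample one byte per step
    return bytes(out)

def crypt(data, password):
    """XOR with the keystream; the same call encrypts and decrypts."""
    ks = lorenz_keystream(password, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

msg = b"attack at dawn"
ct = crypt(msg, "s3cret")
print(crypt(ct, "s3cret") == msg)   # → True
# A wrong password diverges chaotically from the right orbit, so decryption
# with it yields garbage rather than the message.
```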
Abstract:
This paper describes a visual stimulus generator (VSImG) capable of displaying a gray-scale, 256 x 256 x 8 bitmap image with a frame rate of 500 Hz using a boustrophedonic scanning technique. It is designed for experiments with motion-sensitive neurons of the fly's visual system, where the flicker fusion frequency of the photoreceptors can reach up to 500 Hz. Devices with such a high frame rate are not commercially available, but are required if sensory systems with high flicker fusion frequency are to be studied. The implemented hardware approach gives us complete real-time control of the displacement sequence and provides all the signals needed to drive an electrostatic deflection display. With the use of analog signals, very small, high-resolution displacements, not limited by the image's pixel size, can be obtained. Very slow image displacements with visually imperceptible steps can also be generated. This can be of interest for other vision research experiments. Two different stimulus files can be used simultaneously, allowing the system to generate X-Y displacements on one display or independent movements on two displays, as long as they share the same bitmap image.
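The boustrophedonic (ox-turning) scan order mentioned above can be sketched in a few lines: each raster line is traversed in the direction opposite to the previous one, so the beam never has to fly back across the screen between lines. The function name and the toy 3x2 example are illustrative, not from the paper.

```python
# Serpentine pixel ordering as used in boustrophedonic scanning.

def boustrophedon_order(width, height):
    """Yield (x, y) pixel coordinates in serpentine (boustrophedon) order."""
    for y in range(height):
        # Even rows go left-to-right, odd rows right-to-left.
        xs = range(width) if y % 2 == 0 else range(width - 1, -1, -1)
        for x in xs:
            yield (x, y)

order = list(boustrophedon_order(3, 2))
print(order)   # → [(0, 0), (1, 0), (2, 0), (2, 1), (1, 1), (0, 1)]
```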