916 results for deterministic bispectrum
Abstract:
Electricity market players operating in a liberalized environment require access to an adequate decision support tool that allows them to consider all business opportunities and take strategic decisions. Ancillary services represent a good negotiation opportunity that must be considered by market players. For this, the decision support tool must include ancillary services market simulation. This paper proposes two different methods (Linear Programming and Genetic Algorithm approaches) for ancillary services dispatch. The methodologies are implemented in MASCEM, a multi-agent based electricity market simulator. A test case based on California Independent System Operator (CAISO) data concerning the dispatch of Regulation Down, Regulation Up, Spinning Reserve and Non-Spinning Reserve services is included in this paper.
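[Editor's note: the paper's exact formulation is not reproduced in the abstract. As a minimal illustrative sketch, an LP dispatch of the four CAISO reserve products with invented bids could be set up with scipy.optimize.linprog as follows; all units, capacities, prices and requirements are hypothetical, and cross-product capacity coupling between reserves is omitted.]

# Minimal LP sketch of ancillary services dispatch (hypothetical bids).
from scipy.optimize import linprog

# (unit, product, offered capacity in MW, price in $/MW) -- invented numbers
bids = [
    ("U1", "reg_up",   50, 12.0), ("U2", "reg_up",   40, 15.0),
    ("U1", "reg_down", 30, 10.0), ("U3", "reg_down", 60,  8.0),
    ("U2", "spin",     80, 20.0), ("U3", "spin",     50, 18.0),
    ("U4", "non_spin", 90,  6.0), ("U2", "non_spin", 40,  7.5),
]
req = {"reg_down": 70, "reg_up": 70, "spin": 100, "non_spin": 100}

c = [price for *_, price in bids]                 # cost per accepted MW
bounds = [(0, cap) for _, _, cap, _ in bids]      # 0 <= x_i <= offered MW
A_eq = [[1 if prod == p else 0 for _, prod, _, _ in bids] for p in req]
b_eq = list(req.values())                         # procure exactly the requirement

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
for (unit, prod, cap, price), mw in zip(bids, res.x):
    if mw > 1e-6:
        print(f"{unit} {prod}: {mw:.1f} MW @ ${price}/MW")
print(f"total procurement cost: ${res.fun:.2f}")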
Abstract:
One of the most important problems in the theory of cellular automata (CA) is determining the proportion of cells in a specific state after a given number of time iterations. We approach this problem using patterns in preimage sets - that is, the set of blocks which iterate to the desired output. This allows us to construct a response curve - a relationship between the proportion of cells in state 1 after n iterations as a function of the initial proportion. We derive response curve formulae for many two-dimensional deterministic CA rules with L-neighbourhood. For all remaining rules, we find experimental response curves. We also use preimage sets to classify surjective rules. In the last part of the thesis, we consider a special class of one-dimensional probabilistic CA rules. We find response surface formulae for these rules and experimental response surfaces for all remaining rules.
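[Editor's note: as an illustration of how an experimental response curve can be obtained (here for a one-dimensional elementary rule rather than the thesis's two-dimensional L-neighbourhood setting), one can sample random initial configurations at each density and measure the density of 1s after n iterations.]

import numpy as np

def step(config, rule=18):
    # One synchronous update of an elementary (radius-1) CA, periodic boundaries.
    left, right = np.roll(config, 1), np.roll(config, -1)
    idx = 4 * left + 2 * config + right        # neighbourhood as a 3-bit number
    return (rule >> np.arange(8) & 1)[idx]     # look up the rule's truth table

def response_curve(rule=18, n_iter=100, size=2000, trials=10):
    # Estimated density of 1s after n_iter steps, vs. initial density p0.
    curve = []
    for p0 in np.linspace(0.0, 1.0, 21):
        final = []
        for _ in range(trials):
            c = (np.random.random(size) < p0).astype(int)
            for _ in range(n_iter):
                c = step(c, rule)
            final.append(c.mean())
        curve.append((p0, np.mean(final)))
    return curve

for p0, pn in response_curve():
    print(f"p0 = {p0:.2f}  ->  p_n = {pn:.3f}")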
Abstract:
Nature is full of phenomena which we call "chaotic", the weather being a prime example. What we mean by this is that we cannot predict it to any significant accuracy, either because the system is inherently complex, or because some of the governing factors are not deterministic. However, during recent years it has become clear that random behaviour can occur even in very simple systems with very few degrees of freedom, without any need for complexity or indeterminacy. The discovery that chaos can be generated even with the help of systems having completely deterministic rules - often models of natural phenomena - has stimulated a lot of research interest recently. Not that this chaos has no underlying order, but it is of a subtle kind, one that has taken a great deal of ingenuity to unravel. In the present thesis, the author introduces a new nonlinear model, a 'modulated' logistic map, and analyses it from the viewpoint of 'deterministic chaos'.
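[Editor's note: the thesis's exact modulation scheme is not given in the abstract. One common construction, assumed here purely for illustration, lets the control parameter of the logistic map be driven by a second logistic map.]

import numpy as np

def modulated_logistic(n, x0=0.3, y0=0.4, mu=4.0, lo=3.5, hi=4.0):
    # x_{k+1} = r_k * x_k * (1 - x_k), with the parameter r_k modulated by an
    # auxiliary logistic map y_{k+1} = mu * y_k * (1 - y_k) rescaled into
    # [lo, hi].  (Illustrative choice, not necessarily the thesis's model.)
    x, y = np.empty(n), np.empty(n)
    x[0], y[0] = x0, y0
    for k in range(n - 1):
        y[k + 1] = mu * y[k] * (1 - y[k])
        r = lo + (hi - lo) * y[k + 1]
        x[k + 1] = r * x[k] * (1 - x[k])
    return x

traj = modulated_logistic(2000)
print("last iterates:", np.round(traj[-5:], 4))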
Abstract:
We study cooperating distributed systems (CD-systems) of restarting automata that are very restricted: they are deterministic, they cannot rewrite, but only delete symbols, they restart immediately after performing a delete operation, they are stateless, and they have a read/write window of size 1 only; that is, these are stateless deterministic R(1)-automata. We study the expressive power of these systems by relating the class of languages that they accept by mode = 1 computations to other well-studied language classes, showing in particular that this class only contains semi-linear languages, and that it includes all rational trace languages. In addition, we investigate the closure and non-closure properties of this class of languages and some of its algorithmic properties.
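[Editor's note: a toy simulator conveys the flavour of such systems. The encoding below (a per-component transition table, an accept-on-empty-tape flag, and nondeterministically chosen successors explored by backtracking) is an illustrative simplification, not the paper's formal definitions.]

def run_cycle(delta, word):
    # One mode = 1 cycle: scan left to right with a window of size 1 and
    # delete at the first symbol mapped to "DEL"; if the head reaches the
    # right end without deleting, the cycle fails.
    for i, a in enumerate(word):
        if delta.get(a, "MVR") == "DEL":
            return word[:i] + word[i + 1:]
    return None

def cd_accepts(components, successors, initial, word):
    # components: name -> (delta, accepts_empty_tape).  Successor components
    # are chosen nondeterministically, so all choices are searched.
    seen = set()
    def search(comp, w):
        if (comp, w) in seen:
            return False
        seen.add((comp, w))
        delta, accepts_empty = components[comp]
        if w == "":
            return accepts_empty
        w2 = run_cycle(delta, w)
        return w2 is not None and any(search(s, w2) for s in successors[comp])
    return any(search(c, word) for c in initial)

# Two components that alternately delete one 'a' and one 'b' accept the
# rational trace language of all words with equally many a's and b's:
components = {"A": ({"a": "DEL", "b": "MVR"}, True),
              "B": ({"b": "DEL", "a": "MVR"}, False)}
successors = {"A": ["B"], "B": ["A"]}
print(cd_accepts(components, successors, ["A"], "abba"))  # True
print(cd_accepts(components, successors, ["A"], "aab"))   # False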
Abstract:
We study cooperating distributed systems (CD-systems) of stateless deterministic restarting automata with window size 1 that are governed by an external pushdown store. In this way we obtain an automata-theoretical characterization for the class of context-free trace languages.
Abstract:
It is known that cooperating distributed systems (CD-systems) of stateless deterministic restarting automata with window size 1 accept a class of semi-linear languages that properly includes all rational trace languages. Although the component automata of such a CD-system are all deterministic, in general the CD-system itself is not, as in each of its computations, the initial component and the successor components are still chosen nondeterministically. Here we study CD-systems of stateless deterministic restarting automata with window size 1 that are themselves completely deterministic. In fact, we consider two such types of CD-systems, the strictly deterministic systems and the globally deterministic systems.
Abstract:
This dissertation has as its goal the quantitative evaluation of the application of coupled hydrodynamic, ecological and clarity models to address the deterministic prediction of water clarity in lakes and reservoirs. Prediction of water clarity is somewhat unique, insofar as it represents the integrated and coupled effects of a broad range of individual water quality components. These include biological components such as phytoplankton, together with the associated cycles of nutrients that are needed to sustain their populations, and abiotic components such as suspended particles that may be introduced by streams, atmospheric deposition or sediment resuspension. Changes in clarity induced by either component will feed back on the phytoplankton dynamics, as incident light also affects biological growth. Thus, the ability to successfully model changes in clarity will by necessity require correctly modeling these other water quality parameters. Water clarity is also unique in that it may be one of the earliest and most easily detected warnings of the acceleration of the process of eutrophication in a water body.
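[Editor's note: a minimal box model sketches the feedback loop the dissertation describes (clarity affects light, light affects phytoplankton growth, phytoplankton affect clarity). The Beer-Lambert depth-averaged light and the Poole-Atkins Secchi-depth conversion are standard relations, but the parameter values are invented and none of this comes from the dissertation itself.]

import numpy as np

k_w, k_chl = 0.2, 0.016      # background / chlorophyll-specific extinction (1/m)
mu_max, loss = 1.2, 0.3      # maximum growth and loss rates (1/day)
I0, I_half = 400.0, 100.0    # surface irradiance and half-saturation (W/m^2)
z_mix = 5.0                  # mixed-layer depth (m)

P, dt = 5.0, 0.1             # chlorophyll (mg/m^3), time step (days)
for _ in range(int(100 / dt)):
    k = k_w + k_chl * P                                   # clarity degrades as P grows
    I_avg = I0 * (1 - np.exp(-k * z_mix)) / (k * z_mix)   # Beer-Lambert depth average
    growth = mu_max * I_avg / (I_half + I_avg)            # light-limited growth
    P += dt * (growth - loss) * P

print(f"chlorophyll {P:.1f} mg/m^3, extinction {k:.2f} 1/m, "
      f"Secchi depth ~ {1.7 / k:.1f} m")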
Abstract:
This paper discusses the dangers inherent in attempting to simplify something as complex as development. It does this by exploring the Lynn and Vanhanen theory of deterministic development, which asserts that the varying levels of economic development seen between countries can be explained by differences in 'national intelligence' (national IQ). Assuming that intelligence is genetically determined, and as different races have been shown to have different IQs, they argue that economic development (measured as GDP/capita) is largely a function of race and that interventions to address imbalances can only have a limited impact. The paper presents the Lynn and Vanhanen case and critically discusses the data and analyses (linear regression) upon which it is based. It also extends the cause-effect basis of Lynn and Vanhanen's theory for economic development into human development by using the Human Development Index (HDI). It is argued that while there is nothing mathematically incorrect with their calculations, there are concerns over the data they employ. Even more fundamentally, it is argued that statistically significant correlations between the various components of the HDI and national IQ can occur via a host of cause-effect pathways, and hence the genetic determinism theory is far from proven. The paper ends by discussing the dangers involved in the use of over-simplistic measures of development as a means of exploring cause-effect relationships. While the creators of development indices such as the HDI have good intentions, simplistic indices can encourage simplistic explanations of under-development. (c) 2005 Elsevier B.V. All rights reserved.
Abstract:
One of the enablers for new consumer electronics based products to be accepted into the market is the availability of inexpensive, flexible and multi-standard chipsets and services. DVB-T, the principal standard for terrestrial broadcast of digital video in Europe, has been extremely successful, leading governments to reconsider their targets for analogue television broadcast switch-off. To enable one further small step in creating increasingly cost-effective chipsets, the OFDM deterministic equalizer has been presented before with its application to DVB-T. This paper discusses the test set-up of a DVB-T compliant baseband simulation that includes the deterministic equalizer and DVB-T standard propagation channels. This is then followed by a presentation of the measured inner and outer Bit Error Rate (BER) results using various modulation levels, coding rates and propagation channels, in order to ascertain the actual performance of the deterministic equalizer.
Abstract:
Recent developments in the UK concerning the reception of Digital Terrestrial Television (DTT) have indicated that, as it currently stands, DVB-T receivers may not be sufficient to deliver adequate quality of digital picture information to the consumer. There are many possible reasons why such large errors, leading to reception failure, are being introduced into the system. It has been suggested that one possibility is that the assumptions concerning the immunity to multipath that Coded Orthogonal Frequency Division Multiplex (COFDM) is expected to have may not be entirely accurate. Previous research has shown that multipath can indeed have an impact on DVB-T receiver performance. In the UK, proposals have been made to change the modulation from 64-QAM to 16-QAM to improve the immunity to multipath, but this paper demonstrates that the 16-QAM performance may again not be sufficient. To this end, this paper presents a deterministic approach to equalization such that a 64-QAM receiver with the simple equalizer presented here has the same order of MPEG-2 BER performance as a 16-QAM receiver without equalization, alleviating the need for broadcasters to migrate from 64-QAM to 16-QAM. Of course, adding the equalizer to a 16-QAM receiver further improves the BER as well, creating one more step towards satisfying the consumer.
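[Editor's note: the structure of the paper's equalizer is not detailed in the abstract. As a generic point of reference only, a one-tap-per-carrier zero-forcing equalizer for an OFDM symbol over a two-ray multipath channel, with the channel response assumed known at the receiver, can be sketched as follows.]

import numpy as np

rng = np.random.default_rng(0)
N, cp = 2048, 64                      # carriers (2K mode) and guard length

# uncoded 64-QAM symbols on each carrier (illustrative only)
levels = np.array([-7., -5., -3., -1., 1., 3., 5., 7.])
X = rng.choice(levels, N) + 1j * rng.choice(levels, N)

# OFDM modulation with cyclic prefix
x = np.fft.ifft(X)
tx = np.concatenate([x[-cp:], x])

# two-ray multipath channel plus noise
h = np.zeros(cp, complex)
h[0], h[12] = 1.0, 0.4 * np.exp(1j * 0.7)
rx = np.convolve(tx, h)[:len(tx)]
rx += 0.002 * (rng.standard_normal(len(rx)) + 1j * rng.standard_normal(len(rx)))

# demodulate; equalize with one complex tap per carrier (zero forcing)
Y = np.fft.fft(rx[cp:cp + N])
H = np.fft.fft(h, N)
X_eq = Y / H

print(f"MSE before equalization: {np.mean(np.abs(Y - X)**2):.3f}")
print(f"MSE after equalization:  {np.mean(np.abs(X_eq - X)**2):.5f}")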
Abstract:
In this paper we consider bilinear forms of matrix polynomials and show that these polynomials can be used to construct solutions for the problems of solving systems of linear algebraic equations, matrix inversion and finding extremal eigenvalues. An Almost Optimal Monte Carlo (MAO) algorithm for computing bilinear forms of matrix polynomials is presented. Results for the computational cost of a balanced algorithm for computing the bilinear form of a matrix power are presented, i.e., an algorithm for which probability and systematic errors are of the same order, and this is compared with the computational cost of a corresponding deterministic method.
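[Editor's note: as a sketch of the idea, in the spirit of MAO, the bilinear form v^T A^k h of a matrix power can be estimated by random index walks whose transition probabilities follow the magnitudes of the matrix entries. The concrete probability choices and the balancing analysis are the paper's and are not reproduced here.]

import numpy as np

def mc_bilinear_form(v, A, h, k, n_samples=20000, seed=0):
    # Unbiased Monte Carlo estimate of v^T A^k h: start an index walk from
    # p0 ~ |v|, step with P[i, j] ~ |A[i, j]|, and carry the importance
    # weight w = v[i0]/p0[i0] * prod A[i, j]/P[i, j] along the walk.
    rng = np.random.default_rng(seed)
    n = len(v)
    p0 = np.abs(v) / np.abs(v).sum()
    P = np.abs(A) / np.abs(A).sum(axis=1, keepdims=True)
    total = 0.0
    for _ in range(n_samples):
        i = rng.choice(n, p=p0)
        w = v[i] / p0[i]
        for _ in range(k):
            j = rng.choice(n, p=P[i])
            w *= A[i, j] / P[i, j]
            i = j
        total += w * h[i]
    return total / n_samples

A = np.array([[0.4, 0.2, 0.1], [0.1, 0.3, 0.2], [0.2, 0.1, 0.4]])
v = np.array([1.0, 2.0, 0.5])
h = np.array([0.5, 1.0, 1.5])
print("Monte Carlo:", mc_bilinear_form(v, A, h, k=3))
print("exact:      ", v @ np.linalg.matrix_power(A, 3) @ h)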
Abstract:
It is known that certain video deghoster systems cannot fully process the induced signal derived from the quadrature carrier forming nature of the VSB filter under a multipath condition. A new deterministic IIR deghoster filter structure is given which is capable of deghosting terrestrial video for any relative ghost carrier phase.
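[Editor's note: as a point of reference for the recursive structure, and not the filter proposed in the paper, a single complex-valued ghost y[n] = x[n] + g*x[n-d] can be removed exactly by the IIR recursion x_hat[n] = y[n] - g*x_hat[n-d], stable whenever |g| < 1; the complex coefficient g accommodates an arbitrary relative ghost carrier phase.]

import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(5000)               # stand-in for a baseband video line

d = 37                                      # ghost delay (samples)
g = 0.5 * np.exp(1j * 2.1)                  # complex ghost amplitude and phase
y = x.astype(complex)
y[d:] += g * x[:-d]                         # multipath: y[n] = x[n] + g*x[n-d]

# IIR deghoster: x_hat[n] = y[n] - g * x_hat[n-d]  (illustrative structure)
x_hat = np.zeros_like(y)
for n in range(len(y)):
    x_hat[n] = y[n] - (g * x_hat[n - d] if n >= d else 0.0)

print("residual ghost power:", np.mean(np.abs(x_hat - x) ** 2))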
Abstract:
Our group considered the desirability of including representations of uncertainty in the development of parameterizations. (By ‘uncertainty’ here we mean the deviation of sub-grid scale fluxes or tendencies in any given model grid box from truth.) We unanimously agreed that the ECMWF should attempt to provide a more physical basis for uncertainty estimates than the very effective but ad hoc methods being used at present. Our discussions identified several issues that will arise.