920 results for deterministic bispectrum
Abstract:
The need for integration in supply chain management leads us to consider the coordination of two logistic planning functions: transportation and inventory. The coordination of these activities can be an extremely important source of competitive advantage in supply chain management. The battle for cost reduction can pass through balancing transportation costs against inventory management costs. In this work, we study the specific case of an inventory-routing problem over a one-week planning period with different types of demand. A heuristic methodology, based on Iterated Local Search, is proposed to solve the Multi-Period Inventory Routing Problem with stochastic and deterministic demand.
Abstract:
In the scope of the European project Hydroptimet, INTERREG IIIB-MEDOCC programme, a limited area model (LAM) intercomparison of intense events that caused severe damage to people and territory is performed. As the comparison is limited to single case studies, the work is not meant to provide a measure of the different models' skill, but to identify the key model factors for producing a good forecast of this kind of meteorological phenomenon. This work focuses on the Spanish flash-flood event also known as the "Montserrat-2000" event. The study is performed using forecast data from seven operational LAMs, placed at the partners' disposal via the Hydroptimet ftp site, and observed data from the Catalonia rain gauge network. To improve the event analysis, satellite rainfall estimates have also been considered. For the statistical evaluation of quantitative precipitation forecasts (QPFs), several non-parametric skill scores based on contingency tables have been used. Furthermore, for each model run it has been possible to identify the Catalonia regions affected by misses and false alarms using the contingency table elements. Moreover, the standard "eyeball" analysis of forecast and observed precipitation fields has been supported by the use of a state-of-the-art diagnostic method, the contiguous rain area (CRA) analysis. This method makes it possible to quantify the spatial shift of the forecast error and to identify the error sources that affected each model's forecast. High-resolution modelling and domain size seem to play a key role in providing a skillful forecast. Further work is needed to support this statement, including verification using a wider observational data set.
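The non-parametric skill scores mentioned are standard 2x2 contingency-table statistics. As an illustration (the paper's exact score set is not listed in this abstract), the sketch below builds the table from threshold exceedances and computes three common scores; names and the threshold value are illustrative.

```python
def contingency_table(forecast, observed, threshold):
    """Count hits, misses, false alarms and correct negatives for one threshold."""
    hits = misses = false_alarms = correct_neg = 0
    for f, o in zip(forecast, observed):
        fe, oe = f >= threshold, o >= threshold
        if fe and oe:
            hits += 1
        elif not fe and oe:
            misses += 1
        elif fe and not oe:
            false_alarms += 1
        else:
            correct_neg += 1
    return hits, misses, false_alarms, correct_neg

def skill_scores(h, m, fa, cn):
    """Probability of detection, false alarm ratio and critical success index."""
    pod = h / (h + m) if h + m else float("nan")
    far = fa / (h + fa) if h + fa else float("nan")
    csi = h / (h + m + fa) if h + m + fa else float("nan")
    return {"POD": pod, "FAR": far, "CSI": csi}
```

Applied per rain-gauge station and per threshold, the misses and false-alarm counts can then be mapped back onto the regions that produced them, as the abstract describes.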
Abstract:
We have investigated hysteresis and the return-point memory (RPM) property in deterministic cellular automata with avalanche dynamics. The RPM property reflects a partial ordering of metastable states that is preserved by the dynamics. Recently, Sethna et al. [Phys. Rev. Lett. 70, 3347 (1993)] proved this behavior for a homogeneously driven system with static disorder. This Letter shows that the partial ordering and the RPM can also be displayed by heterogeneously driven systems, as a result of their own evolution dynamics. In particular, we prove the RPM property for a deterministic 2D sandpile automaton driven at a central site.
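As a sketch of the kind of system described, the code below implements a standard deterministic sandpile with toppling threshold 4 on a bounded grid, driven at the central site; the paper's exact automaton and disorder may differ.

```python
def relax(grid, threshold=4):
    """Topple every cell holding >= threshold grains until the grid is stable.
    Grains toppled over the boundary are lost (open boundary conditions)."""
    n = len(grid)
    unstable = True
    while unstable:
        unstable = False
        for i in range(n):
            for j in range(n):
                if grid[i][j] >= threshold:
                    grid[i][j] -= threshold
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < n and 0 <= nj < n:
                            grid[ni][nj] += 1
                    unstable = True
    return grid

def drive_at_center(n, grains):
    """Heterogeneous driving: add grains one at a time at the central site,
    relaxing fully after each addition. Entirely deterministic."""
    grid = [[0] * n for _ in range(n)]
    c = n // 2
    for _ in range(grains):
        grid[c][c] += 1
        relax(grid)
    return grid
```

Because the update rule is deterministic, repeating the same driving sequence always reproduces the same metastable state, which is the setting in which the partial ordering behind RPM can be studied.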
Abstract:
The study investigates the possibility of incorporating fracture intensity and block geometry as spatially continuous parameters in GIS-based systems. For this purpose, a deterministic method has been implemented to estimate block size (Bloc3D) and joint frequency (COLTOP). In addition to measuring the block size, the Bloc3D method provides a 3D representation of the shape of individual blocks. These two methods were applied using field measurements (joint set orientation and spacing) performed over a large field area in the Swiss Alps. This area is characterized by a complex geology, a number of different rock masses and varying degrees of metamorphism. The spatial variability of the parameters was evaluated with regard to lithology and major faults. A model incorporating these measurements and observations into a GIS system to assess the risk associated with rock falls is proposed. The analysis concludes with a discussion of the feasibility of such an application in regularly and irregularly jointed rock masses, with persistent and impersistent discontinuities.
Abstract:
Membrane bioreactors (MBRs) are a combination of activated sludge bioreactors and membrane filtration, enabling high quality effluent with a small footprint. However, they can be beset by fouling, which causes an increase in transmembrane pressure (TMP). Modelling and simulation of changes in TMP could be useful to describe fouling through the identification of the most relevant operating conditions. Using experimental data from an MBR pilot plant operated for 462 days, two different models were developed: a deterministic model using the activated sludge model No. 2d (ASM2d) for the biological component and a resistance-in-series model for the filtration component, as well as a data-driven model based on multivariable regressions. Once validated, these models were used to describe membrane fouling (as changes in TMP over time) under different operating conditions. The deterministic model performed better at higher temperatures (>20°C), constant operating conditions (DO set-point, membrane air-flow, pH and ORP), high mixed liquor suspended solids (>6.9 g L-1) and high flux changes. At low pH (<7) or during periods with larger pH changes, the data-driven model was more accurate. Changes in the DO set-point of the aerobic reactor that affected the TMP were also better described by the data-driven model. By combining the use of both models, a better description of fouling can be achieved under different operating conditions.
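The filtration component mentioned above can be sketched with the generic resistance-in-series relation TMP = viscosity x flux x (R_membrane + R_fouling). The sketch below uses a simple linear cake build-up for the fouling resistance and made-up parameter values; it is not the calibrated pilot-plant model.

```python
def tmp_series(flux, viscosity, r_membrane, fouling_rate, steps, dt):
    """Resistance-in-series model: TMP = viscosity * flux * (R_m + R_f(t)),
    with R_f growing linearly with filtered volume (a simplifying assumption)."""
    tmps = []
    r_fouling = 0.0
    for _ in range(steps):
        tmps.append(viscosity * flux * (r_membrane + r_fouling))
        r_fouling += fouling_rate * flux * dt  # cake build-up proportional to flux
    return tmps
```

In the real model the fouling term would be driven by the biological state (e.g. ASM2d outputs) rather than a constant rate, which is precisely the coupling the paper calibrates against 462 days of plant data.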
Abstract:
This article discusses, from the standpoint of cellular biology, the deterministic and indeterministic theories of androgenesis. The role of the vacuole and of various types of stress in the deviation of the microspore from normal development, and the point where androgenetic competence is acquired, are examined. Based on an extensive literature review and data on wheat studies from our laboratory, a model for the androgenetic capacity of the pollen grain is proposed. A two-point deterministic model for in vitro androgenesis is our proposal for the acquisition of androgenetic potential by the pollen grain: the first switch point would be early meiosis and the second switch point the uninucleate pollen stage, because the elimination of cytoplasmic sporophytic determinants takes place at those two strategic moments. Any abnormality in this process that allows the maintenance of sporophytic informational molecules prevents the establishment of a gametophytic program, allowing the reactivation of the embryogenic process.
Abstract:
One of the most important problems in the theory of cellular automata (CA) is determining the proportion of cells in a specific state after a given number of time iterations. We approach this problem using patterns in preimage sets - that is, the sets of blocks which iterate to the desired output. This allows us to construct a response curve - the proportion of cells in state 1 after n iterations as a function of the initial proportion. We derive response curve formulae for many two-dimensional deterministic CA rules with L-neighbourhood. For all remaining rules, we find experimental response curves. We also use preimage sets to classify surjective rules. In the last part of the thesis, we consider a special class of one-dimensional probabilistic CA rules. We find response surface formulae for these rules and experimental response surfaces for all remaining rules.
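The thesis treats two-dimensional rules, but the idea of an experimental response curve is easy to sketch in one dimension: iterate an elementary CA from a random configuration of density p and measure the final density of 1s. Rule numbers and parameters below are illustrative, not the rules studied in the thesis.

```python
import random

def eca_step(cells, rule):
    """One synchronous update of an elementary CA with periodic boundaries.
    The rule number's bit at index (left*4 + center*2 + right) gives the output."""
    n = len(cells)
    return [(rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
            for i in range(n)]

def response_point(rule, p, n_cells=1000, n_iter=10, seed=0):
    """Density of 1s after n_iter steps, from a random density-p configuration.
    Sweeping p over [0, 1] traces out an experimental response curve."""
    rng = random.Random(seed)
    cells = [1 if rng.random() < p else 0 for _ in range(n_cells)]
    for _ in range(n_iter):
        cells = eca_step(cells, rule)
    return sum(cells) / n_cells
```

Deriving the exact curve instead of estimating it is where the preimage sets come in: counting the blocks that map to 1 after n steps gives the response as a polynomial in the initial density.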
Abstract:
Nature is full of phenomena which we call "chaotic", the weather being a prime example. What we mean by this is that we cannot predict it to any significant accuracy, either because the system is inherently complex, or because some of the governing factors are not deterministic. However, during recent years it has become clear that random behaviour can occur even in very simple systems with very few degrees of freedom, without any need for complexity or indeterminacy. The discovery that chaos can be generated even by systems having completely deterministic rules - often models of natural phenomena - has stimulated a lot of research interest recently. Not that this chaos has no underlying order, but it is of a subtle kind that has taken a great deal of ingenuity to unravel. In the present thesis, the author introduces a new nonlinear model, a 'modulated' logistic map, and analyses it from the viewpoint of 'deterministic chaos'.
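The abstract does not define the 'modulated' logistic map, so the sketch below shows only the plain logistic map x(n+1) = r * x(n) * (1 - x(n)) it builds on, and the sensitive dependence on initial conditions at r = 4 that is the operational meaning of deterministic chaos; the modulation scheme itself is the author's and is not reproduced here.

```python
def logistic(x, r=4.0):
    """One iteration of the logistic map, a canonical deterministic-chaos model."""
    return r * x * (1.0 - x)

def orbit(x0, n, r=4.0):
    """Iterate the map n times, returning the full trajectory [x0, x1, ..., xn]."""
    xs = [x0]
    for _ in range(n):
        xs.append(logistic(xs[-1], r))
    return xs
```

Two orbits started a billionth apart decorrelate within a few dozen iterations: the rule is completely deterministic, yet long-range prediction is impossible in practice.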
Abstract:
We study cooperating distributed systems (CD-systems) of restarting automata that are very restricted: they are deterministic, they cannot rewrite, but only delete symbols, they restart immediately after performing a delete operation, they are stateless, and they have a read/write window of size 1 only, that is, these are stateless deterministic R(1)-automata. We study the expressive power of these systems by relating the class of languages that they accept by mode = 1 computations to other well-studied language classes, showing in particular that this class only contains semi-linear languages, and that it includes all rational trace languages. In addition, we investigate the closure and non-closure properties of this class of languages and some of its algorithmic properties.
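As a loose illustration of the single-automaton mechanics only (not the paper's formal definition, and ignoring the CD-system coordination entirely): a stateless window-1 delete-only automaton looks at one symbol at a time and either moves right or deletes it and restarts. The acceptance condition below (accept iff the tape empties) is a deliberate simplification.

```python
def run_stateless_r1(word, delta):
    """Run a stateless R(1)-style automaton (simplified semantics).
    delta maps a symbol to "mvr" (move right) or "del" (delete and restart);
    being stateless with window 1, the action depends on the symbol alone."""
    tape = list(word)
    restarted = True
    while restarted:
        restarted = False
        for i, sym in enumerate(tape):
            action = delta.get(sym)
            if action == "del":
                del tape[i]          # delete and restart immediately
                restarted = True
                break
            if action != "mvr":
                return False         # no applicable action: reject
    return not tape                  # simplified acceptance: empty tape
```

With delta = {"a": "del", "b": "mvr"} this single automaton accepts only a*; the expressive power described in the abstract (semi-linear languages including all rational trace languages) comes from letting several such components cooperate.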
Abstract:
We study cooperating distributed systems (CD-systems) of stateless deterministic restarting automata with window size 1 that are governed by an external pushdown store. In this way we obtain an automata-theoretical characterization for the class of context-free trace languages.
Abstract:
It is known that cooperating distributed systems (CD-systems) of stateless deterministic restarting automata with window size 1 accept a class of semi-linear languages that properly includes all rational trace languages. Although the component automata of such a CD-system are all deterministic, in general the CD-system itself is not, as in each of its computations, the initial component and the successor components are still chosen nondeterministically. Here we study CD-systems of stateless deterministic restarting automata with window size 1 that are themselves completely deterministic. In fact, we consider two such types of CD-systems, the strictly deterministic systems and the globally deterministic systems.
Abstract:
This dissertation has as its goal the quantitative evaluation of the application of coupled hydrodynamic, ecological and clarity models, to address the deterministic prediction of water clarity in lakes and reservoirs. Prediction of water clarity is somewhat unique, insofar as it represents the integrated and coupled effects of a broad range of individual water quality components. These include biological components such as phytoplankton, together with the associated cycles of nutrients that are needed to sustain their populations, and abiotic components such as suspended particles that may be introduced by streams, atmospheric deposition or sediment resuspension. Changes in clarity induced by either component will feed back on the phytoplankton dynamics, as incident light also affects biological growth. Thus, the ability to successfully model changes in clarity necessarily requires correctly modeling these other water quality parameters. Water clarity is also unique in that it may be one of the earliest and most easily detected warnings of the acceleration of the process of eutrophication in a water body.
Abstract:
This paper discusses the dangers inherent in attempting to simplify something as complex as development. It does this by exploring the Lynn and Vanhanen theory of deterministic development, which asserts that the varying levels of economic development seen between countries can be explained by differences in 'national intelligence' (national IQ). Assuming that intelligence is genetically determined, and as different races have been shown to have different IQ, they argue that economic development (measured as GDP/capita) is largely a function of race and that interventions to address imbalances can only have a limited impact. The paper presents the Lynn and Vanhanen case and critically discusses the data and analyses (linear regression) upon which it is based. It also extends the cause-effect basis of Lynn and Vanhanen's theory for economic development into human development by using the Human Development Index (HDI). It is argued that while there is nothing mathematically incorrect with their calculations, there are concerns over the data they employ. Even more fundamentally, it is argued that statistically significant correlations between the various components of the HDI and national IQ can occur via a host of cause-effect pathways, and hence the genetic determinism theory is far from proven. The paper ends by discussing the dangers involved in the use of over-simplistic measures of development as a means of exploring cause-effect relationships. While the creators of development indices such as the HDI have good intentions, simplistic indices can encourage simplistic explanations of under-development.
Abstract:
One of the enablers for new consumer-electronics-based products to be accepted into the market is the availability of inexpensive, flexible and multi-standard chipsets and services. DVB-T, the principal standard for terrestrial broadcast of digital video in Europe, has been extremely successful, leading governments to reconsider their targets for analogue television broadcast switch-off. To enable one further small step in creating increasingly cost-effective chipsets, the OFDM deterministic equalizer has been presented before with its application to DVB-T. This paper discusses the test set-up of a DVB-T compliant baseband simulation that includes the deterministic equalizer and DVB-T standard propagation channels. This is followed by a presentation of the inner and outer Bit Error Rate (BER) results obtained using various modulation levels, coding rates and propagation channels, in order to ascertain the actual performance of the deterministic equalizer.
Abstract:
Recent developments in the UK concerning the reception of Digital Terrestrial Television (DTT) have indicated that, as it currently stands, DVB-T receivers may not be sufficient to maintain adequate quality of digital picture information to the consumer. There are many possible reasons why such large errors are being introduced into the system, causing reception failure. It has been suggested that one possibility is that the assumptions concerning the immunity to multipath that Coded Orthogonal Frequency Division Multiplex (COFDM) is expected to have may not be entirely accurate. Previous research has shown that multipath can indeed have an impact on DVB-T receiver performance. In the UK, proposals have been made to change the modulation from 64-QAM to 16-QAM to improve the immunity to multipath, but this paper demonstrates that the 16-QAM performance may again not be sufficient. To this end, this paper presents a deterministic approach to equalization such that a 64-QAM receiver with the simple equalizer presented in this paper has the same order of MPEG-2 BER performance as a 16-QAM receiver without equalization. This alleviates the need for broadcasters to migrate from 64-QAM to 16-QAM. Of course, adding the equalizer to a 16-QAM receiver further improves the BER as well, one more step towards satisfying consumers.
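The deterministic equalizer itself is not specified in these abstracts; as background only, the sketch below shows the generic one-tap per-subcarrier (zero-forcing) equalization that any OFDM/COFDM equalizer refines: after the FFT, each subcarrier sees a single complex gain H[k], so dividing by an estimate of H[k] undoes the multipath distortion on that carrier. All symbol and channel values here are synthetic.

```python
def one_tap_equalize(received, channel_gains):
    """Zero-forcing per subcarrier: X_hat[k] = Y[k] / H[k].
    Assumes the channel estimate channel_gains[k] is available and nonzero."""
    return [y / h for y, h in zip(received, channel_gains)]

# synthetic example: QPSK-like symbols distorted by per-carrier complex gains
symbols = [1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j]
gains = [0.9 + 0.1j, 1.1 - 0.2j, 0.8 + 0.3j, 1.0 + 0.0j]
received = [h * x for h, x in zip(gains, symbols)]
recovered = one_tap_equalize(received, gains)
```

The difficulty in practice is obtaining H[k] cheaply and accurately in a fading multipath channel; that estimation step is where a deterministic scheme like the one in these papers differs from pilot-based approaches.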