934 results for dynamic systems theory
Abstract:
A model-based approach for fault diagnosis is proposed, where the fault detection is based on checking the consistency of the Analytical Redundancy Relations (ARRs) using an interval tool. The tool takes into account the uncertainty in the parameters and the measurements using intervals. Faults are explicitly included in the model, which allows for the exploitation of additional information. This information is obtained from partial derivatives computed from the ARRs. The signs of the residuals are used to prune the candidate space when performing the fault diagnosis task. The method is illustrated using a two-tank example, in which these aspects are shown to have an impact on the diagnosis and fault discrimination, since the proposed method goes beyond structural methods.
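A rough sketch of the consistency test and sign-based pruning described above, written for a hypothetical single-tank ARR; the interval arithmetic is hand-rolled and the measurements, parameters and fault-sign table are all illustrative, not taken from the paper:

    # Hypothetical ARR for one tank: r = q_in - q_out - A*dh/dt, evaluated over
    # intervals that encode measurement and parameter uncertainty.
    def iv_sub(a, b):
        """Interval subtraction [a] - [b]."""
        return (a[0] - b[1], a[1] - b[0])

    def iv_scale(k, a):
        """Scale interval [a] by a positive constant k."""
        return (k * a[0], k * a[1])

    def residual_sign(r):
        """+1 or -1 if the residual interval excludes zero, else 0 (consistent)."""
        if r[0] > 0:
            return 1
        if r[1] < 0:
            return -1
        return 0

    q_in  = (1.00, 1.10)   # inflow  [m^3/s], interval-valued measurement
    q_out = (0.60, 0.70)   # outflow [m^3/s]
    dh_dt = (0.05, 0.06)   # level derivative [m/s]
    A     = 2.0            # tank cross-section [m^2], assumed exactly known

    r = iv_sub(iv_sub(q_in, q_out), iv_scale(A, dh_dt))
    s = residual_sign(r)

    # Hypothetical fault signature: expected residual sign under each fault
    # (in the paper such signs come from partial derivatives of the ARRs).
    expected_sign = {"tank_leak": 1, "q_in_sensor_reads_low": -1, "q_out_sensor_reads_high": -1}
    candidates = [f for f, sg in expected_sign.items() if s != 0 and sg == s]
    print("residual:", r, "sign:", s, "candidates:", candidates)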
Abstract:
The vibrational configuration interaction method used to obtain static vibrational (hyper)polarizabilities is extended to dynamic nonlinear optical properties in the infinite optical frequency approximation. Illustrative calculations are carried out on H2O and NH3. The former molecule is weakly anharmonic while the latter contains a strongly anharmonic umbrella mode. The effects on the vibrational (hyper)polarizabilities due to various truncations of the potential energy and property surfaces involved in the calculation are examined.
Abstract:
We report experimental and numerical results showing how certain N-dimensional dynamical systems are able to exhibit complex time evolutions based on the nonlinear combination of N-1 oscillation modes. The experiments have been done with a family of thermo-optical systems of effective dynamical dimension varying from 1 to 6. The corresponding mathematical model is an N-dimensional vector field based on a scalar-valued nonlinear function of a single variable that is a linear combination of all the dynamic variables. We show how the complex evolutions appear associated with the occurrence of successive Hopf bifurcations in a saddle-node pair of fixed points, until their instability capabilities in N dimensions are exhausted. For this reason the observed phenomenon is denoted as the full instability behavior of the dynamical system. The process through which the attractor responsible for the observed time evolution is formed may be rather complex and difficult to characterize. Nevertheless, the well-organized structure of the time signals suggests some generic mechanism of nonlinear mode mixing that we associate with the cluster of invariant sets emerging from the pair of fixed points and with the influence of the neighboring saddle sets on the flow near the attractor. The generation of invariant tori is likely during the development of the full instability, and the global process may be considered as a generalized Landau scenario for the emergence of irregular and complex behavior through the nonlinear superposition of oscillatory motions.
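A minimal numerical sketch of the model class described above (an N-dimensional flow driven by a scalar nonlinear function of a linear combination of all the state variables); the particular nonlinearity, weights and relaxation rates below are hypothetical and only illustrate the structure, not the paper's model:

    import numpy as np
    from scipy.integrate import solve_ivp

    N = 4
    rng = np.random.default_rng(0)
    gamma = np.linspace(1.0, 2.5, N)   # per-variable relaxation rates (hypothetical)
    b = rng.normal(size=N)             # feedback weights of the scalar nonlinearity
    c = rng.normal(size=N)             # weights of the linear combination

    def vector_field(t, x):
        s = c @ x                                    # single scalar argument
        return -gamma * x + b * np.sin(3.0 * s)      # hypothetical nonlinearity

    sol = solve_ivp(vector_field, (0.0, 200.0), 0.1 * np.ones(N),
                    t_eval=np.linspace(0.0, 200.0, 5000))
    print(sol.y.shape)   # (N, 5000) trajectory, to be inspected for mode mixing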
Abstract:
A radiative equation of the Cattaneo–Vernotte type is derived from information theory and the radiative transfer equation. The equation thus derived is a radiative analog of the equation that is used for the description of hyperbolic heat conduction. It is shown, without recourse to any phenomenological assumption, that radiative transfer may be included in a natural way in the framework of extended irreversible thermodynamics.
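For reference, the Cattaneo–Vernotte law used to describe hyperbolic heat conduction, whose radiative analog is derived above, reads in standard notation (tau the relaxation time, q the heat flux, lambda the thermal conductivity); combined with the energy balance \rho c\,\partial T/\partial t = -\nabla\cdot\mathbf{q} it yields a hyperbolic (telegraph-type) equation for the temperature:

    \tau\,\frac{\partial \mathbf{q}}{\partial t} + \mathbf{q} = -\lambda\,\nabla T,
    \qquad
    \tau\,\frac{\partial^{2} T}{\partial t^{2}} + \frac{\partial T}{\partial t}
      = \frac{\lambda}{\rho c}\,\nabla^{2} T .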
Abstract:
Background: To enhance our understanding of complex biological systems like diseases, we need to put all of the available data into context and use this to detect relations, patterns and rules which allow predictive hypotheses to be defined. Life science has become a data-rich science, with information about the behaviour of millions of entities like genes, chemical compounds, diseases, cell types and organs, which are organised in many different databases and/or spread throughout the literature. Existing knowledge such as genotype-phenotype relations or signal transduction pathways must be semantically integrated and dynamically organised into structured networks that are connected with clinical and experimental data. Different approaches to this challenge exist, but so far none has proven entirely satisfactory. Results: To address this challenge we previously developed a generic knowledge management framework, BioXM™, which allows the dynamic, graphic generation of domain-specific knowledge representation models based on specific objects and their relations, supporting annotations and ontologies. Here we demonstrate the utility of BioXM for knowledge management in systems biology as part of the EU FP6 BioBridge project on translational approaches to chronic diseases. From clinical and experimental data, text-mining results and public databases we generate a chronic obstructive pulmonary disease (COPD) knowledge base and demonstrate its use by mining specific molecular networks together with integrated clinical and experimental data. Conclusions: We generate the first semantically integrated COPD-specific public knowledge base and find that, for the integration of clinical and experimental data with pre-existing knowledge, the configuration-based set-up enabled by BioXM reduced implementation time and effort for the knowledge base compared to similar systems implemented as classical software development projects. The knowledge base enables the retrieval of sub-networks including protein-protein interaction, pathway, gene-disease and gene-compound data, which are used for subsequent data analysis, modelling and simulation. Pre-structured queries and reports enhance usability; establishing their use in everyday clinical settings requires further simplification with a browser-based interface, which is currently under development.
Abstract:
The problem of jointly estimating the number, the identities, and the data of active users in a time-varying multiuser environment was examined in a companion paper (IEEE Trans. Information Theory, vol. 53, no. 9, September 2007), at whose core was the use of the theory of finite random sets on countable spaces. Here we extend that theory to encompass the more general problem of estimating unknown continuous parameters of the active-user signals. This problem is solved here by applying the theory of random finite sets constructed on hybrid spaces. We do so by deriving Bayesian recursions that describe the evolution with time of a posteriori densities of the unknown parameters and data. Unlike in the above-cited paper, wherein one could evaluate the exact multiuser set posterior density, here the continuous-parameter Bayesian recursions do not admit closed-form expressions. To circumvent this difficulty, we develop numerical approximations for the receivers that are based on Sequential Monte Carlo (SMC) methods (“particle filtering”). Simulation results, referring to a code-division multiple-access (CDMA) system, are presented to illustrate the theory.
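To make the SMC machinery mentioned above concrete, here is a generic bootstrap particle filter (sequential importance resampling) applied to a toy scalar state-space model; it sketches the particle-filtering idea only and is not the authors' random-finite-set receiver:

    import numpy as np

    rng = np.random.default_rng(1)
    P, T = 500, 50                          # particles, time steps
    x_true, ys = 0.0, []
    for _ in range(T):                      # simulate toy AR(1) data with noisy observations
        x_true = 0.9 * x_true + rng.normal(scale=0.5)
        ys.append(x_true + rng.normal(scale=0.3))

    particles = rng.normal(scale=1.0, size=P)
    estimates = []
    for y in ys:
        particles = 0.9 * particles + rng.normal(scale=0.5, size=P)   # propagate
        w = np.exp(-0.5 * ((y - particles) / 0.3) ** 2)               # likelihood weights
        w /= w.sum()
        particles = particles[rng.choice(P, size=P, p=w)]             # resample
        estimates.append(particles.mean())                            # posterior-mean estimate
    print(estimates[-3:])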
Abstract:
In this paper, we introduce a pilot-aided multipath channel estimator for Multiple-Input Multiple-Output (MIMO) Orthogonal Frequency Division Multiplexing (OFDM) systems. Typical estimation algorithms assume the number of multipath components and delays to be known and constant, while their amplitudes may vary in time. In this work, we focus on the more realistic assumption that the number of channel taps is also unknown and time-varying. The estimation problem arising from this assumption is solved using Random Set Theory (RST), which is a probability theory of finite sets. Due to the lack of a closed form of the optimal filter, a Rao-Blackwellized Particle Filter (RBPF) implementation of the channel estimator is derived. Simulation results demonstrate the estimator's effectiveness.
Abstract:
Secondary accident statistics can be useful for studying the impact of traffic incident management strategies. An easy-to-implement methodology is presented for classifying secondary accidents using data fusion of a police accident database with intranet incident reports. A current method for classifying secondary accidents uses a static threshold that represents the spatial and temporal region of influence of the primary accident, such as two miles and one hour. An accident is considered secondary if it occurs upstream from the primary accident and is within the duration and queue of the primary accident. However, using the static threshold may result in both false positives and negatives because accident queues are constantly varying. The methodology presented in this report seeks to improve upon this existing method by making the threshold dynamic. An incident progression curve is used to mark the end of the queue throughout the entire incident. Four steps in the development of incident progression curves are described. Step one is the processing of intranet incident reports. Step two is the filling in of incomplete incident reports. Step three is the nonlinear regression of incident progression curves. Step four is the merging of individual incident progression curves into one master curve. To illustrate this methodology, 5,514 accidents from Missouri freeways were analyzed. The results show that secondary accidents identified by dynamic versus static thresholds can differ by more than 30%.
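A sketch of the dynamic-threshold classification idea, using a hypothetical master incident progression curve; the field names, curve shape and numbers are illustrative only:

    def queue_length_miles(minutes_since_primary):
        """Hypothetical progression curve: the queue grows for 45 min, then clears by 90 min."""
        if minutes_since_primary < 0 or minutes_since_primary > 90:
            return 0.0
        if minutes_since_primary <= 45:
            return 3.0 * minutes_since_primary / 45.0        # growth phase
        return 3.0 * (90 - minutes_since_primary) / 45.0     # clearance phase

    def is_secondary(candidate, primary):
        """candidate/primary: dicts with 'time_min' and 'milepost', same direction of travel."""
        dt = candidate["time_min"] - primary["time_min"]
        upstream = primary["milepost"] - candidate["milepost"]   # assumes mileposts increase downstream
        return dt > 0 and 0 < upstream <= queue_length_miles(dt)

    primary   = {"time_min": 0,  "milepost": 112.0}
    candidate = {"time_min": 30, "milepost": 110.5}   # 1.5 mi upstream, 30 min later
    print(is_secondary(candidate, primary))           # True: inside the time-varying queue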
Abstract:
We survey the population genetic basis of social evolution, using a logically consistent set of arguments to cover a wide range of biological scenarios. We start by reconsidering Hamilton's (Hamilton 1964 J. Theoret. Biol. 7, 1-16 (doi:10.1016/0022-5193(64)90038-4)) results for selection on a social trait under the assumptions of additive gene action, weak selection and constant environment and demography. This yields a prediction for the direction of allele frequency change in terms of phenotypic costs and benefits and genealogical concepts of relatedness, which holds for any frequency of the trait in the population, and provides the foundation for further developments and extensions. We then allow for any type of gene interaction within and between individuals, strong selection and fluctuating environments and demography, which may depend on the evolving trait itself. We reach three conclusions pertaining to selection on social behaviours under broad conditions. (i) Selection can be understood by focusing on a one-generation change in mean allele frequency, a computation which underpins the utility of reproductive value weights; (ii) in large populations under the assumptions of additive gene action and weak selection, this change is of constant sign for any allele frequency and is predicted by a phenotypic selection gradient; (iii) under the assumptions of trait substitution sequences, such phenotypic selection gradients suffice to characterize long-term multi-dimensional stochastic evolution, with almost no knowledge about the genetic details underlying the coevolving traits. Having such simple results about the effect of selection regardless of population structure and type of social interactions can help to delineate the common features of distinct biological processes. Finally, we clarify some persistent divergences within social evolution theory, with respect to exactness, synergies, maximization, dynamic sufficiency and the role of genetic arguments.
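In the additive, weak-selection setting summarised in (ii), the predicted direction of allele-frequency change is consistent with the standard statement of Hamilton's (1964) rule, written here with b the phenotypic benefit, c the cost and r the relatedness:

    \Delta \bar{p} \;>\; 0 \quad\text{when}\quad -\,c + r\,b \;>\; 0 .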
Abstract:
This paper presents a dynamic choice model in the attribute space considering rational consumers that discount the future. In light of the evidence of several state-dependence patterns, the model is further extended by considering a utility function that allows for the different types of behavior described in the literature: pure inertia, pure variety seeking and hybrid. The model presents a stationary consumption pattern that can be inertial, where the consumer only buys one product, or a variety-seeking one, where the consumer buys several products simultaneously. Under the inverted-U marginal utility assumption, the consumer behaves inertially among the existing brands for several periods, and eventually, once the stationary levels are approached, the consumer turns to a variety-seeking behavior. An empirical analysis is run using a scanner database for fabric softener, and significant evidence of hybrid behavior for most attributes is found, which supports the functional form considered in the theory.
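One hypothetical functional form consistent with the inverted-U marginal-utility assumption (illustrative only, not the paper's specification) lets the marginal utility of an attribute's consumption stock s rise and then fall:

    u'(s) = \alpha\, s\,(\theta - s), \qquad 0 \le s \le \theta,\ \alpha > 0,

so marginal utility peaks at s = \theta/2: while the stock of the attribute currently bought is below the peak the consumer keeps buying it (inertia), and as the stationary level near \theta is approached the falling marginal utility pushes consumption towards other attributes (variety seeking).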
Abstract:
Minkowski's ?(x) function can be seen as the confrontation of two number systems: regular continued fractions and the alternated dyadic system. This way of looking at it permits us to prove that its derivative, as it also happens for many other non-decreasing singular functions from [0,1] to [0,1], when it exists can only attain two values: zero and infinity. It is also proved that if the average of the partial quotients in the continued fraction expansion of x is greater than k* = 5.31972, and ?'(x) exists, then ?'(x) = 0. In the same way, if the same average is less than k** = 2 log2(F), where F is the golden ratio, then ?'(x) = infinity. Finally some results are presented concerning metric properties of continued fraction and alternated dyadic expansions.
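A small numerical sketch of the quantities involved: the regular continued-fraction partial quotients of x in (0,1), their average (the quantity compared with k* and k** above), and ?(x) computed from the standard dyadic series ?([0; a1, a2, ...]) = 2 * sum_k (-1)^(k+1) * 2^(-(a1+...+ak)); the example value x = 3/10 is arbitrary:

    from fractions import Fraction

    def partial_quotients(x, n_terms=30):
        """Regular continued fraction [0; a1, a2, ...] of x in (0, 1); floats are rounded to a nearby rational."""
        x = Fraction(x).limit_denominator(10**9)
        a = []
        while x != 0 and len(a) < n_terms:
            x = 1 / x
            q = x.numerator // x.denominator
            a.append(q)
            x -= q
        return a

    def minkowski_q(x, n_terms=30):
        """?(x) via the dyadic series over the partial quotients."""
        total, sign, expo = 0.0, 1.0, 0
        for ak in partial_quotients(x, n_terms):
            expo += ak
            total += sign * 2.0 ** (-expo)
            sign = -sign
        return 2.0 * total

    x = Fraction(3, 10)
    a = partial_quotients(x)
    print(a, sum(a) / len(a), minkowski_q(x))   # [3, 3], average 3.0, ?(3/10) = 0.21875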
Abstract:
Customer choice behavior, such as 'buy-up' and 'buy-down', is an important phenomenon in a wide range of industries. Yet there are few models or methodologies available to exploit this phenomenon within yield management systems. We make some progress on filling this void. Specifically, we develop a model of yield management in which the buyers' behavior is modeled explicitly using a multinomial logit model of demand. The control problem is to decide which subset of fare classes to offer at each point in time. The set of open fare classes then affects the purchase probabilities for each class. We formulate a dynamic program to determine the optimal control policy and show that it reduces to a dynamic nested allocation policy. Thus, the optimal choice-based policy can easily be implemented in reservation systems that use nested allocation controls. We also develop an estimation procedure for our model based on the expectation-maximization (EM) method that jointly estimates arrival rates and choice model parameters when no-purchase outcomes are unobservable. Numerical results show that this combined optimization-estimation approach may significantly improve revenue performance relative to traditional leg-based models that do not account for choice behavior.
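A standard multinomial-logit purchase-probability computation for an offered set of fare classes, with the no-purchase alternative's utility normalised to zero; the fare classes and utility values below are hypothetical and only illustrate how the open set drives purchase probabilities:

    import math

    utilities = {"Y": 1.2, "M": 0.8, "K": 0.1}        # deterministic utilities v_j (hypothetical)

    def purchase_probs(offered):
        """P(j | offered set S) = exp(v_j) / (1 + sum_{k in S} exp(v_k))."""
        weights = {j: math.exp(utilities[j]) for j in offered}
        denom = 1.0 + sum(weights.values())           # the 1.0 is the no-purchase term
        probs = {j: w / denom for j, w in weights.items()}
        probs["no_purchase"] = 1.0 / denom
        return probs

    print(purchase_probs({"Y", "M", "K"}))
    print(purchase_probs({"Y", "M"}))   # closing K shifts demand to Y, M and no-purchase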
Abstract:
Game theory is a branch of applied mathematics used to analyze situations where two or more agents are interacting. Originally it was developed as a model for conflicts and collaborations between rational and intelligent individuals. Now it finds applications in social sciences, economics, biology (particularly evolutionary biology and ecology), engineering, political science, international relations, computer science, and philosophy. Networks are an abstract representation of interactions, dependencies or relationships. Networks are extensively used in all the fields mentioned above and in many more. Much useful information about a system can be discovered by analyzing the current state of a network representation of such a system. In this work we will apply some of the methods of game theory to populations of agents that are interconnected. A population is in fact represented by a network of players where one can only interact with another if there is a connection between them. In the first part of this work we will show that the structure of the underlying network has a strong influence on the strategies that the players will decide to adopt to maximize their utility. We will then introduce a supplementary degree of freedom by allowing the structure of the population to be modified along the simulations. This modification allows the players to modify the structure of their environment to optimize the utility that they can obtain.
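As a toy illustration of a game played on a network, the sketch below runs a prisoner's dilemma on a random graph in which each player repeatedly imitates its best-scoring neighbour; the payoff matrix, graph model and update rule are illustrative choices rather than the exact setup of this work:

    import random
    import networkx as nx

    random.seed(0)
    G = nx.erdos_renyi_graph(50, 0.1, seed=0)
    payoff = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}
    strategy = {v: random.choice("CD") for v in G.nodes}

    for _ in range(30):                                  # rounds of play and imitation
        score = {v: sum(payoff[(strategy[v], strategy[u])] for u in G.neighbors(v))
                 for v in G.nodes}
        strategy = {v: strategy[max(list(G.neighbors(v)) + [v], key=score.get)]
                    for v in G.nodes}                    # imitate the best neighbour (or keep own)
    print(sum(s == "C" for s in strategy.values()), "cooperators out of", G.number_of_nodes())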
Abstract:
This paper presents a tractable dynamic general equilibrium model that can explain cross-country empirical regularities in geographical mobility, unemployment and labor market institutions. Rational agents vote over unemployment insurance (UI), taking the dynamic distortionary effects of insurance on the performance of the labor market into consideration. Agents with higher cost of moving, i.e., more attached to their current location, prefer more generous UI. The key assumption is that an agent's attachment to a location increases the longer she has resided there. UI reduces the incentive for labor mobility and increases, therefore, the fraction of attached agents and the political support for UI. The main result is that this self-reinforcing mechanism can give rise to multiple steady-states: one 'European' steady-state featuring high unemployment, low geographical mobility and high unemployment insurance, and one 'American' steady-state featuring low unemployment, high mobility and low unemployment insurance.
Abstract:
We investigate dynamics of public perceptions of the 2009 H1N1 influenza pandemic to understand changing patterns of sense-making and blame regarding the outbreak of emerging infectious diseases. We draw on social representation theory combined with a dramaturgical perspective to identify changes in how various collectives are depicted over the course of the pandemic, according to three roles: heroes, villains and victims. Quantitative results based on content analysis of three cross-sectional waves of interviews show a shift from mentions of distant collectives (e.g., far-flung countries) at Wave 1 to local collectives (e.g., risk groups) as the pandemic became of more immediate concern (Wave 2) and declined (Wave 3). Semi-automated content analysis of media coverage shows similar results. Thematic analyses of the discourse associated with collectives revealed that many were consistently perceived as heroes, villains and victims.