62 results for Observability Gramian


Relevance:

10.00%

Publisher:

Abstract:

Modal filters may be obtained by a properly designed weighted sum of the output signals of an array of sensors distributed on the host structure. Although several research groups have been interested in techniques for designing and implementing modal filters based on a given array of sensors, the effect of the array topology on the effectiveness of the modal filter has received much less attention. In particular, it is known that parameters such as the size, shape and location of a sensor are very important in determining the observability of a vibration mode. Hence, this paper presents a methodology for the topological optimization of an array of sensors that maximizes the effectiveness of a set of selected modal filters. This is done using a genetic algorithm to select, from an array of 36 piezoceramic sensors regularly distributed on an aluminum plate, the 12 sensors that maximize the filtering performance, over a given frequency range, of a set of modal filters, each aiming to isolate one of the first vibration modes. The vectors of weighting coefficients for each modal filter are evaluated using a QR decomposition of the complex frequency response function matrix. Results show that the array topology is not very important at lower frequencies but greatly affects the filter effectiveness at higher frequencies. It is therefore possible to improve the effectiveness and frequency range of a set of modal filters by optimizing the topology of the sensor array: with 12 properly located piezoceramic sensors bonded on an aluminum plate, the frequency range of a set of modal filters may be enlarged by 25-50%.
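The weighting-coefficient step above can be sketched numerically. The snippet below is a minimal illustration, not the paper's implementation: `H` is a stand-in for the complex frequency response function (FRF) matrix (row i holds each sensor's response at the i-th modal frequency), and the sizes (4 modes, 12 sensors) are assumptions. A QR decomposition of `H^H` yields minimum-norm weight vectors `w_k` with `H w_k = e_k`, so each filter passes "its" mode and rejects the others.

```python
import numpy as np

rng = np.random.default_rng(1)
n_modes, n_sensors = 4, 12          # hypothetical sizes
# Stand-in complex FRF matrix (the real one comes from measured responses).
H = (rng.standard_normal((n_modes, n_sensors))
     + 1j * rng.standard_normal((n_modes, n_sensors)))

# H^H = Q R (reduced QR); the minimum-norm solutions of H w_k = e_k are
# the columns of W = Q R^{-H}, since H W = R^H Q^H Q R^{-H} = I.
Q, R = np.linalg.qr(H.conj().T)
W = Q @ np.linalg.inv(R.conj().T)   # column k = weights of modal filter k

print(np.allclose(H @ W, np.eye(n_modes)))   # each filter isolates one mode
```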

Relevance:

10.00%

Publisher:

Abstract:

This paper addresses robust model-order reduction of a high dimensional nonlinear partial differential equation (PDE) model of a complex biological process. Based on a nonlinear, distributed-parameter model of the same process, validated against experimental data from an existing pilot-scale BNR activated sludge plant, we developed a state-space model with 154 state variables in this work. A general algorithm for robustly reducing the nonlinear PDE model is presented, and based on an investigation of five state-of-the-art model-order reduction techniques, we are able to reduce the original model to one with only 30 states without incurring pronounced modelling errors. The singular perturbation approximation balanced truncation technique is found to give the lowest modelling errors in low frequency ranges and hence is deemed most suitable for controller design and other real-time applications. (C) 2002 Elsevier Science Ltd. All rights reserved.
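The balanced family of reduction methods compared above can be sketched on a toy system. The following is plain square-root balanced truncation (the singular perturbation variant replaces the final truncation with a steady-state residualization of the discarded states); the random stable 8-state system and reduced order 3 are illustrative, not the 154-state plant model.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

rng = np.random.default_rng(0)
n, r = 8, 3                                       # full and reduced orders
S = rng.standard_normal((n, n))
A = S - S.T - np.diag(rng.uniform(1.0, 2.0, n))   # A + A^T < 0 => stable
B = rng.standard_normal((n, 2))
C = rng.standard_normal((2, n))

# Controllability and observability Gramians from Lyapunov equations.
Wc = solve_continuous_lyapunov(A, -B @ B.T)
Wo = solve_continuous_lyapunov(A.T, -C.T @ C)

# Balancing transformation from Cholesky factors and an SVD; the singular
# values are the system's Hankel singular values.
Lc = cholesky(Wc, lower=True)
Lo = cholesky(Wo, lower=True)
U, hsv, Vt = svd(Lo.T @ Lc)
T = Lc @ Vt.T / np.sqrt(hsv)                      # balancing transformation
Tinv = (U / np.sqrt(hsv)).T @ Lo.T

# Truncate the balanced realization to the r strongest Hankel directions.
Ar = (Tinv @ A @ T)[:r, :r]
Br = (Tinv @ B)[:r, :]
Cr = (C @ T)[:, :r]
```

In the balanced coordinates both Gramians equal `diag(hsv)`, so the discarded states are those that are simultaneously hard to reach and hard to observe.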

Relevance:

10.00%

Publisher:

Abstract:

On-chip debug (OCD) features are frequently available in modern microprocessors. Their contribution to shortening time-to-market justifies the industry investment in this area, where a number of competing or complementary proposals are available or under development, e.g. NEXUS, CJTAG, IJTAG. The controllability and observability features provided by OCD infrastructures form a valuable toolbox that can be used well beyond the debugging arena, improving the return on investment by diluting its cost across a wider spectrum of application areas. This paper discusses the use of OCD features for validating fault tolerant architectures, and in particular the efficiency of various fault injection methods provided by enhanced OCD infrastructures. The reference data for our comparative study was captured on a workbench comprising the 32-bit Freescale MPC-565 microprocessor, an iSYSTEM IC3000 debugger (iTracePro version) and the Winidea 2005 debugging package. All enhanced OCD infrastructures were implemented in VHDL and the results were obtained by simulation within the same fault injection environment. The focus of this paper is on the comparative analysis of the experimental results obtained for various OCD configurations and debugging scenarios.

Relevance:

10.00%

Publisher:

Abstract:

The rapid increase in the use of microprocessor-based systems in critical areas, where failures imply risks to human lives, the environment or expensive equipment, has significantly increased the need for dependable systems, able to detect, tolerate and eventually correct faults. The verification and validation of such systems is frequently performed via fault injection, using various forms and techniques. However, as electronic devices get smaller and more complex, controllability and observability issues, and sometimes real-time constraints, make it harder to apply most conventional fault injection techniques. This paper proposes a fault injection environment and a scalable methodology to assist the execution of real-time fault injection campaigns, providing enhanced performance and capabilities. Our proposed solutions are based on the use of common and customized on-chip debug (OCD) mechanisms, present in many modern electronic devices, with the main objective of enabling the insertion of faults in microprocessor memory elements with minimum delay and intrusiveness. Different configurations were implemented starting from basic Components Off-The-Shelf (COTS) microprocessors, equipped with real-time OCD infrastructures, to improved solutions based on modified interfaces, and dedicated OCD circuitry that enhance fault injection capabilities and performance. All methodologies and configurations were evaluated and compared concerning performance gain and silicon overhead.
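The core fault-injection idea, flipping a bit in a memory element and checking whether the corruption propagates to an observable output, can be illustrated at software level. This toy sketch is only an analogue: real campaigns like those above inject through OCD hardware into actual registers, and the `workload` function here is a made-up stand-in.

```python
# Toy single-event-upset model: flip one bit of a "memory element"
# (a Python int standing in for a 32-bit register).
def inject_bit_flip(word, bit, width=32):
    """Return `word` with bit `bit` flipped, masked to `width` bits."""
    return (word ^ (1 << bit)) & ((1 << width) - 1)

def workload(x):
    # Stand-in computation under test (hypothetical, for illustration).
    return x * 2 + 1

golden = workload(0x1234)                         # fault-free reference run
faulty = workload(inject_bit_flip(0x1234, bit=7)) # run with injected fault
print(golden != faulty)   # True: the fault propagated to the output
```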

Relevance:

10.00%

Publisher:

Abstract:

Coevolution between two antagonistic species has been widely studied theoretically for both ecologically and genetically driven Red Queen dynamics. A typical outcome of these systems is an oscillatory behavior causing an endless series of adaptations by one species and counter-adaptations by the other. More recently, a mathematical model combining a three-species food chain system with an adaptive dynamics approach revealed genetically driven chaotic Red Queen coevolution. In the present article, we analyze this mathematical model, focusing mainly on the impact of the species' rates of evolution (mutation rates) on the dynamics. First, we analytically prove the boundedness of the trajectories of the chaotic attractor. The complexity of the coupling between the dynamical variables is quantified using observability indices. Using symbolic dynamics theory, we quantify the complexity of genetically driven Red Queen chaos by computing the topological entropy of the existing one-dimensional iterated maps using Markov partitions. Codimension-two bifurcation diagrams are also built from the period ordering of the orbits of the maps. We then study the predictability of the Red Queen chaos, found in narrow regions of mutation rates. To extend the previous analyses, we also computed the likelihood of finding chaos in a given region of the parameter space while varying other model parameters simultaneously. These analyses allowed us to compute a mean predictability measure for the system in the explored region of the parameter space. We found that genetically driven Red Queen chaos, although restricted to small regions of the analyzed parameter space, might be highly unpredictable.
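The topological-entropy computation via Markov partitions has a simple textbook form: for a one-dimensional map with a Markov partition, the entropy is the log of the spectral radius of the partition's transition matrix. The matrix below is for the full-height tent map (each of the two partition intervals maps onto both), an illustrative stand-in for the iterated maps extracted from the food-chain model.

```python
import numpy as np

# T[i, j] = 1 if the image of partition interval I_i covers I_j.
# For the full-height tent map with the two-interval partition:
T = np.array([[1, 1],
              [1, 1]], dtype=float)

# Topological entropy = log of the spectral radius (Perron eigenvalue).
h_top = np.log(np.max(np.abs(np.linalg.eigvals(T))))
print(h_top)   # log 2 ≈ 0.6931: the map doubles distinguishable orbits each step
```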

Relevance:

10.00%

Publisher:

Abstract:

Master's degree in Electrical and Computer Engineering - Area of Specialization in Automation and Systems

Relevance:

10.00%

Publisher:

Abstract:

Target tracking with bearing-only sensors is a challenging problem when the target moves dynamically in complex scenarios. Besides the partial observability of such sensors, they have limited fields of view, occlusions can occur, etc. In those cases, cooperative approaches with multiple tracking robots are interesting, but the different sources of uncertain information need to be considered appropriately in order to achieve better estimates. Even though there exist probabilistic filters that can estimate the position of a target under uncertainty, bearing-only measurements usually bring additional problems with initialization and data association. In this paper, we propose a multi-robot triangulation method with a dynamic baseline that can triangulate bearing-only measurements in a probabilistic manner to produce 3D observations. This method is combined with a decentralized stochastic filter and used to tackle those initialization and data association issues. The approach is validated with simulations and field experiments where a team of aerial and ground robots with cameras tracks a dynamic target.
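The geometric core of multi-robot triangulation can be sketched deterministically (the paper's method is probabilistic and 3D; this 2D, noise-free version with made-up positions and bearings only shows the ray-intersection step). Each bearing from a known robot position defines a ray `p_i + t_i * d_i`; intersecting two rays is a 2x2 linear solve.

```python
import numpy as np

def triangulate(p1, theta1, p2, theta2):
    """Intersect two bearing rays (global-frame angles) from known positions."""
    d1 = np.array([np.cos(theta1), np.sin(theta1)])
    d2 = np.array([np.cos(theta2), np.sin(theta2)])
    # Solve p1 + t1*d1 = p2 + t2*d2 for the ray parameters (t1, t2).
    t = np.linalg.solve(np.column_stack([d1, -d2]), p2 - p1)
    return p1 + t[0] * d1

p1, p2 = np.array([0.0, 0.0]), np.array([6.0, 0.0])   # two robot positions
target = np.array([3.0, 4.0])
b1 = np.arctan2(*(target - p1)[::-1])   # true bearing from robot 1
b2 = np.arctan2(*(target - p2)[::-1])   # true bearing from robot 2
print(triangulate(p1, b1, p2, b2))      # recovers [3. 4.]
```

Near-parallel bearings (a short baseline) make the 2x2 system ill-conditioned, which is one reason a dynamic baseline and a probabilistic treatment matter in practice.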

Relevance:

10.00%

Publisher:

Abstract:

We analyze a model where firms choose a production technology which, together with some random event, determines the final emission level. We consider the coexistence of two alternative technologies: a "clean" technology and a "dirty" technology. Environmental regulation is based on taxes on reported emissions and penalties on unreported emissions. We show that the optimal inspection policy is a cut-off strategy under several scenarios concerning the observability of the adoption of the clean technology and the cost of adopting it. We also show that the optimal inspection policy induces the firm to adopt the clean technology if the adoption cost is not too high, but the cost levels at which the firm adopts it depend on the scenario.

Relevance:

10.00%

Publisher:

Abstract:

Expectations about the future are central to the determination of current macroeconomic outcomes and the formulation of monetary policy. Recent literature has explored ways of supplementing the benchmark of rational expectations with explicit models of expectations formation that rely on econometric learning. Some apparently natural policy rules turn out to imply expectational instability of private agents' learning. We use the standard New Keynesian model to illustrate this problem and survey the key results on interest-rate rules that deliver both uniqueness and stability of equilibrium under econometric learning. We then consider practical concerns such as measurement errors in private expectations, observability of variables, and learning of the structural parameters required for policy. We also discuss recent applications, including policy design under perpetual learning, estimated models with learning, recurrent hyperinflations, and macroeconomic policy to combat liquidity traps and deflation.
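The econometric-learning idea can be illustrated in its simplest form. The model below is a generic self-referential stand-in, not the New Keynesian system: the outcome depends on agents' forecast, `y_t = mu + alpha * E_{t-1}y_t + e_t`, agents estimate the mean of `y` by recursive least squares and use it as their forecast, and learning converges to the rational-expectations value `mu / (1 - alpha)` precisely when the expectational-stability condition (`alpha < 1` here) holds. All parameter values are made up.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, alpha, T = 1.0, 0.5, 20000   # illustrative parameters (alpha < 1: E-stable)
a = 0.0                          # agents' current forecast of y

for t in range(1, T + 1):
    y = mu + alpha * a + 0.1 * rng.standard_normal()  # actual law of motion
    a += (y - a) / t             # recursive least-squares (decreasing-gain) update

print(a)   # ≈ mu / (1 - alpha) = 2.0, the rational-expectations equilibrium
```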

Relevance:

10.00%

Publisher:

Abstract:

This paper examines properties of optimal poverty assistance programs under different informational environments using an income maintenance framework. To that end, we make both the income generating ability and the disutility of labor of individuals unobservable, and compare the resulting benefit schedules with those of programs found in the United States since Welfare Reform (1996). We find that optimal programs closely resemble a Negative Income Tax with a Benefit Reduction rate that depends on the distribution of population characteristics. A policy of workfare (unpaid public sector work) is inefficient when disutility of labor is unobservable, but minimum work requirements (for paid work) may be used in that same environment. The distortions to work incentives and the presence of minimum work requirements depend on the observability and relative importance of the population's characteristics.
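A stylized Negative Income Tax schedule of the kind the optimal programs are said to resemble is easy to write down: a guarantee `G` phased out at a constant benefit-reduction rate `t` (the paper's rate depends on population characteristics; the constant rate and the dollar amounts below are purely illustrative).

```python
def nit_benefit(earnings, guarantee=8000.0, reduction_rate=0.5):
    """Benefit under a stylized NIT: guarantee G minus t per dollar earned,
    floored at zero (parameters are illustrative, not from the paper)."""
    return max(0.0, guarantee - reduction_rate * earnings)

def net_income(earnings, **kw):
    return earnings + nit_benefit(earnings, **kw)

print(nit_benefit(0.0))       # 8000.0 -- full guarantee at zero earnings
print(nit_benefit(16000.0))   # 0.0 -- break-even point at G / t
print(net_income(4000.0))     # 10000.0 -- each earned dollar keeps 1 - t
```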

Relevance:

10.00%

Publisher:

Abstract:

We examine the conditions under which competitive equilibria can be obtained as the limit, when the number of strategic traders gets large, of Nash equilibria in economies with asymmetric information on agents' effort and possibly imperfect observability of agents' trades. Convergence always occurs when either effort is publicly observed (no matter what information is available to intermediaries on agents' trades); or effort is private information but agents' trades are perfectly observed; or no information at all is available on agents' trades. On the other hand, when each intermediary can observe its trades with an agent, but not the agent's trades with other intermediaries, the (Nash) equilibria with strategic intermediaries do not converge to any of the competitive equilibria, for an open set of economies. The source of the difficulties for convergence is the combination of asymmetric information and the restrictions on the observability of trades, which prevent the formation of exclusive contractual relationships and generate barriers to entry in the markets for contracts.

Relevance:

10.00%

Publisher:

Abstract:

As indicated by Grenon (1989), the data of the present series of reports on the UBVRI photometry of late-type stars in the Hipparcos Input Catalog are to be employed in computations of Hipparcos observing time, as well as in evaluating the observability of faint stars by the satellite. Attention is here given to late-type stars in the V = 8-12 range, including distant red giants in the Galactic plane (Hipparcos proposal 189), as well as high proper motion stars included in the G, LTT, LP, and MCC catalogs.

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: There is an emerging knowledge base on the effectiveness of strategies to close the knowledge-practice gap. However, less is known about how attributes of an innovation and other contextual and situational factors facilitate and impede an innovation's adoption. The Healthy Heart Kit (HHK) is a risk management and patient education resource for the prevention of cardiovascular disease (CVD) and promotion of cardiovascular health. Although previous studies have demonstrated the HHK's content validity and practical utility, no published study has examined physicians' uptake of the HHK and the factors that shape its adoption. OBJECTIVES: Conceptually informed by Rogers' Diffusion of Innovations theory and the Theory of Planned Behaviour, this study had two objectives: (1) to determine whether specific attributes of the HHK, as well as contextual and situational factors, are associated with physicians' intention to use and actual usage of the HHK; and (2) to determine whether any contextual and situational factors are associated with individual or environmental barriers that prevent uptake of the HHK among those physicians who do not plan to use the kit. METHODS: A sample of 153 physicians who responded to an invitation letter sent to all family physicians in the province of Alberta, Canada, was recruited for the study. Participating physicians were sent a HHK, and two months later a study questionnaire assessed primary factors: the physicians' clinical practice, attributes of the HHK (relative advantage, compatibility, complexity, trialability, observability), confidence and control using the HHK, barriers to use, and individual attributes. All measures were used in a path analysis employing a causal model based on Rogers' Diffusion of Innovations theory and the Theory of Planned Behaviour. RESULTS: 115 physicians (a follow-up rate of 75%) completed the questionnaire. Use of the HHK was associated with intention to use the HHK, relative advantage, and years of experience. Relative advantage and the observability of the HHK's benefits were also significantly associated with physicians' intention to use the HHK. Physicians working in solo medical practices reported experiencing more individual and environmental barriers to using the HHK. CONCLUSION: The results of this study suggest that future information innovations must demonstrate an advantage over current resources, and the research evidence supporting the innovation must be clearly visible. Findings also suggest that the innovation adoption process has a social element, and collegial interactions and discussions may facilitate that process. These results could be valuable to knowledge translation researchers and health promotion developers in future innovation adoption planning.

Relevance:

10.00%

Publisher:

Abstract:

In this thesis, I studied the partial identification of treatment effects in various discrete choice models with endogenous treatments. Treatment effect models aim to measure the impact of certain interventions on certain outcome variables. The type of treatment and the outcome variable can be defined in a general way so as to apply to many different contexts. There are many examples of treatments in labor economics, health, education, or industrial organization, such as job training programs, medical techniques, investment in research and development, or union membership. The decision to be treated or not is generally not random but is based on individual choices and preferences. In such a context, measuring the treatment effect becomes problematic because selection bias must be taken into account. Several parametric versions of these models have been studied extensively in the literature; however, in models with discrete variation, the parametrization is an important source of identification. It is therefore difficult to know whether the empirical results obtained are driven by the data or by the parametrization imposed on the model. Given that the parametric forms proposed for these types of models generally have no economic foundation, in this thesis I propose to consider the nonparametric version of these models, which allows more robust economic policies to be proposed. The main difficulty in the nonparametric identification of structural functions is that the proposed structure does not identify a unique data-generating process, either because of the presence of multiple equilibria or because of constraints on the observables.
In such situations, traditional identification methods become inapplicable, hence the recent development of the literature on identification in incomplete models. This literature pays particular attention to identifying the set of structural functions of interest that are compatible with the true distribution of the data; this set is called the identified set. Accordingly, in the first chapter of the thesis, I characterize the identified set for treatment effects in the binary triangular model. In the second chapter, I consider the discrete Roy model and characterize the identified set for treatment effects in a sector-choice model when the outcome variable is discrete. The sector-selection assumptions include simple, extended, and generalized Roy selection. In the last chapter, I consider a binary dependent variable model with several dimensions of heterogeneity, such as entry or participation games, and characterize the identified set for firms' profit functions in a two-firm game with complete information. In all chapters, the identified sets of the functions of interest are written in the form of bounds and are simple enough to be estimated with existing inference methods.
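The idea of an identified set "written in the form of bounds" has a classic, minimal instance that can be computed directly: the worst-case (Manski) bounds on an average treatment effect for a binary outcome, with no assumption on selection. These are not the thesis's triangular- or Roy-model bounds, just the simplest member of the same partial-identification family, evaluated here on simulated data with made-up parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
d = rng.integers(0, 2, n)                  # observed treatment indicator
# Simulated binary outcomes: P(Y=1) is 0.7 under treatment, 0.4 without,
# so the true ATE is 0.3 (illustrative numbers only).
y = np.where(d == 1, rng.random(n) < 0.7, rng.random(n) < 0.4).astype(float)

p = d.mean()
y1, y0 = y[d == 1].mean(), y[d == 0].mean()

# With Y in [0, 1]: E[Y(1)] lies in [y1*p, y1*p + (1-p)] and
# E[Y(0)] lies in [y0*(1-p), y0*(1-p) + p]; differencing gives ATE bounds.
ate_lo = y1 * p - (y0 * (1 - p) + p)
ate_hi = (y1 * p + (1 - p)) - y0 * (1 - p)
print(ate_lo, ate_hi)   # an interval of width 1 containing the true ATE
```

The interval always has width one, which is exactly why the thesis's additional structure (triangular or Roy-type selection) is valuable: it shrinks the identified set.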

Relevance:

10.00%

Publisher:

Abstract:

The extent to which the four-dimensional variational data assimilation (4DVAR) is able to use information about the time evolution of the atmosphere to infer the vertical spatial structure of baroclinic weather systems is investigated. The singular value decomposition (SVD) of the 4DVAR observability matrix is introduced as a novel technique to examine the spatial structure of analysis increments. Specific results are illustrated using 4DVAR analyses and SVD within an idealized 2D Eady model setting. Three different aspects are investigated. The first aspect considers correcting errors that result in normal-mode growth or decay. The results show that 4DVAR performs well at correcting growing errors but not decaying errors. Although it is possible for 4DVAR to correct decaying errors, the assimilation of observations can be detrimental to a forecast because 4DVAR is likely to add growing errors instead of correcting decaying errors. The second aspect shows that the singular values of the observability matrix are a useful tool to identify the optimal spatial and temporal locations for the observations. The results show that the ability to extract the time-evolution information can be maximized by placing the observations far apart in time. The third aspect considers correcting errors that result in nonmodal rapid growth. 4DVAR is able to use the model dynamics to infer some of the vertical structure. However, the specification of the case-dependent background error variances plays a crucial role.
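The SVD diagnostic described above can be sketched on a toy linear system rather than the Eady model: stack the observation operator propagated through the tangent-linear model over the assimilation window to form the observability matrix, then inspect its singular values and vectors. The 2-state propagator `M`, observation operator `H`, and observation times below are all hypothetical.

```python
import numpy as np

M = np.array([[1.1, 0.1],
              [0.0, 0.9]])     # hypothetical tangent-linear propagator
H = np.array([[1.0, 0.0]])     # observe only the first state component

# Observability matrix over the window: rows H M^k at each observation time.
obs_times = [0, 1, 2, 3]
O = np.vstack([H @ np.linalg.matrix_power(M, k) for k in obs_times])

U, s, Vt = np.linalg.svd(O, full_matrices=False)
# Rows of Vt are initial-condition structures, ordered by how strongly the
# observation sequence constrains them (small s[i] => poorly observed).
print(s)
```

Recomputing with `obs_times` spread further apart in time increases the smaller singular value, which mirrors the paper's finding that well-separated observations extract more time-evolution information.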