933 results for Set of Weak Stationary Dynamic Actions


Relevance:

100.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

100.00%

Publisher:

Abstract:

The present work employs a set of complementary techniques to investigate the influence of outlying Ru(II) groups on the ground- and excited-state photophysical properties of free-base tetrapyridyl porphyrin (H(2)TPyP). Single-pulse and pulse-train Z-scan techniques, used in association with laser flash photolysis, absorbance and fluorescence spectroscopy, and fluorescence decay measurements, allowed us to conclude that the presence of outlying Ru(II) groups causes significant changes in both the electronic structure and the vibrational properties of the porphyrin. These modifications take place mainly through the activation of nonradiative decay channels responsible for the emission quenching, as well as by favoring certain vibrational modes in the light-absorption process. It is also observed that, differently from what happens when Ru(II) is placed at the center of the macrocycle, the peripheral groups cause an increase in the intersystem crossing processes, probably due to a structural distortion of the ring that affects the spin-orbit coupling responsible for the intersystem crossing mechanism.

Relevance:

100.00%

Publisher:

Abstract:

The concept of epidemiological intelligence, as a construction of information societies, goes beyond monitoring a list of diseases and the ability to elicit rapid responses. The concept should consider the complexity of the definition of epidemiology in the identification of this object of study without being limited to a set of actions in a single government sector. The activities of epidemiological intelligence include risk assessment, strategies for prevention and protection, subsystems of information, crisis management rooms, geographical analysis, etc. This concept contributes to the understanding of policies in health, in multisectorial and geopolitical dimensions, as regards the organization of services around public health emergencies, primary healthcare, as well as disasters. The activities of epidemiological intelligence should not be restricted to scientific research, but the researchers must beware of threats to public health. Lalonde's model enabled consideration of epidemiological intelligence as a way to restructure policies and share resources by creating communities of intelligence, whose purpose is primarily to deal with public health emergencies and disasters.

Relevance:

100.00%

Publisher:

Abstract:

Paulo CA, Roschel H, Ugrinowitsch C, Kobal R, and Tricoli V. Influence of different resistance exercise loading schemes on mechanical power output in work to rest ratio-equated and -nonequated conditions. J Strength Cond Res 26(5): 1308-1312, 2012. It is well known that most sports are characterized by the performance of intermittent high-intensity actions, requiring high muscle power production within different intervals. In fact, the manipulation of the exercise-to-rest ratio in muscle power training programs may constitute an interesting strategy when considering the specific performance demand of a given sport modality. Thus, the aim of this study was to evaluate the influence of different schemes of rest intervals and number of repetitions per set on muscle power production in the squat exercise between exercise-to-rest-ratio-equated and -nonequated conditions. Nineteen young males (age: 25.7 ± 4.4 years; weight: 81.3 ± 13.7 kg; height: 178.1 ± 5.5 cm) were randomly submitted to 3 different resistance exercise loading schemes, as follows: short-set short-interval condition (SSSI; 12 sets of 3 repetitions with a 27.3-second interval between sets); short-set long-interval condition (SSLI; 12 sets of 3 repetitions with a 60-second interval between sets); and long-set long-interval condition (LSLI; 6 sets of 6 repetitions with a 60-second rest interval between sets). The main finding of the present study is that the lower exercise-to-rest ratio protocol (SSLI) resulted in greater average power production (601.88 ± 142.48 W) when compared with both SSSI and LSLI (581.86 ± 113.18 W and 578 ± 138.78 W, respectively). Additionally, both exercise-to-rest-ratio-equated conditions presented similar performance and metabolic results.
In summary, these findings suggest that shorter rest intervals may fully restore the individual's ability to produce muscle power if a smaller exercise volume per set is performed, and that lower exercise-to-rest ratio protocols result in greater average power production when compared with higher-ratio ones.
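The arithmetic behind the equated and non-equated conditions can be checked directly. The set counts, repetitions, and intervals below are taken from the abstract; the bookkeeping itself is a sketch of mine, assuming rest occurs only between sets:

```python
# Set/repetition/interval figures from the abstract; the totals are simple
# arithmetic, not data from the study itself.

protocols = {
    # name: (sets, reps_per_set, rest_between_sets_s)
    "SSSI": (12, 3, 27.3),
    "SSLI": (12, 3, 60.0),
    "LSLI": (6, 6, 60.0),
}

def totals(sets, reps, rest):
    """Total repetitions and total rest time (rest occurs between sets only)."""
    return sets * reps, (sets - 1) * rest

summary = {name: totals(*p) for name, p in protocols.items()}
for name, (reps, rest) in summary.items():
    print(f"{name}: {reps} reps, {rest:.1f} s total rest")
```

All three schemes perform 36 repetitions, but SSSI and LSLI accumulate nearly identical total rest (about 300 s), which is why they are the ratio-equated pair, while SSLI's 660 s of rest yields the lower exercise-to-rest ratio.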

Relevance:

100.00%

Publisher:

Abstract:

In this analysis, a 3.5-year data set of aerosol and precipitation chemistry, obtained at a remote site in Central Amazonia (Balbina, 1°55′ S, 59°29′ W, 174 m a.s.l., about 200 km north of Manaus), is discussed. Aerosols were sampled using stacked filter units (SFU), which separate fine-mode (d < 2.5 µm) and coarse-mode (2.5 µm < d < 10.0 µm) aerosol particles. Filters were analyzed for particulate mass (PM), equivalent black carbon (BCE), and elemental composition by Particle Induced X-Ray Emission (PIXE). Rainwater samples were collected using a wet-only sampler and analyzed for pH and ionic composition, which was determined using ion chromatography (IC). Natural sources dominated the aerosol mass during the wet season, when it was predominantly of natural biogenic origin, mostly in the coarse mode, which comprised up to 81% of PM10. Biogenic aerosol from both primary emissions and secondary organic aerosol dominates the fine mode in the wet season, with very low concentrations (average 2.2 µg m⁻³). Soil dust was responsible for a minor fraction of the aerosol mass (less than 17%). Sudden increases in the concentrations of elements such as Al, Ti, and Fe were also observed, in both the fine and coarse modes (mostly during the April-May months), which we attribute to episodes of Saharan dust transport. During the dry periods, a significant contribution to the fine aerosol loading was observed, due to the large-scale transport of smoke from biomass burning in other portions of the Amazon basin. This contribution is associated with an enhancement of the concentrations of S, K, Zn, and BCE. Chlorine, which is commonly associated with sea salt and also with biomass burning emissions, presented higher concentrations not only during the dry season but also in the April-June months, due to the establishment of meteorological conditions more favorable to the transport of Atlantic air masses to Central Amazonia.
The chemical composition of rainwater was similar to that observed at other remote sites in tropical forests. The volume-weighted mean (VWM) pH was 4.90. The most important contribution to acidity was from weak organic acids. The organic acidity was predominantly associated with the presence of acetic acid rather than formic acid, which is more often observed in pristine tropical areas. Wet deposition rates for major species did not differ significantly between the dry and wet seasons, except for NH4+, citrate, and acetate, which had smaller deposition rates during the dry season. While biomass burning emissions were clearly identified in the aerosol component, they did not present a clear signature in rainwater. The biogenic component and the long-range transport of sea salt were observed in both the aerosol and rainwater composition. The results shown here indicate that in Central Amazonia it is still possible to observe quite pristine atmospheric conditions, relatively free of anthropogenic influences.

Relevance:

100.00%

Publisher:

Abstract:

The periodic spectroscopic events in η Carinae are now well established and occur near the periastron passage of two massive stars in a very eccentric orbit. Several mechanisms have been proposed to explain the variations of different spectral features, such as an eclipse by the wind-wind collision (WWC) boundary, a shell ejection from the primary star, or accretion of its wind onto the secondary. All of them have problems explaining all the observed phenomena. To better understand the nature of the cyclic events, we performed a dense monitoring of η Carinae with five Southern telescopes during the 2009 low-excitation event, resulting in a set of data of unprecedented quality and sampling. The intrinsic luminosity of the He II λ4686 emission line (L ≈ 310 L☉) just before periastron reveals the presence of a very luminous transient source of extreme-UV radiation emitted in the WWC region. Clumps in the primary's wind probably explain the flare-like behavior of both the X-ray and He II λ4686 light curves. After a short-lived minimum, the He II λ4686 emission rises again to a new maximum, when X-rays are still absent or very weak. We interpret this as a collapse of the WWC onto the "surface" of the secondary star, switching off the hard X-ray source and diminishing the WWC shock cone. The recovery from this state is controlled by the momentum balance between the secondary's wind and the clumps in the primary's wind.

Relevance:

100.00%

Publisher:

Abstract:

We present two new constraint qualifications (CQs) that are weaker than the recently introduced relaxed constant positive linear dependence (RCPLD) CQ. RCPLD is based on the assumption that many subsets of the gradients of the active constraints preserve positive linear dependence locally. A major open question was to identify the exact set of gradients whose properties had to be preserved locally and that would still work as a CQ. This is done in the first new CQ, which we call the constant rank of the subspace component (CRSC) CQ. This new CQ also preserves many of the good properties of RCPLD, such as local stability and the validity of an error bound. We also introduce an even weaker CQ, called the constant positive generator (CPG), which can replace RCPLD in the analysis of the global convergence of algorithms. We close this work by extending convergence results of algorithms belonging to all the main classes of nonlinear optimization methods: sequential quadratic programming, augmented Lagrangians, interior point algorithms, and inexact restoration.

Relevance:

100.00%

Publisher:

Abstract:

Motoneuron (MN) dendrites may be changed from a passive to an active state by increasing the levels of spinal cord neuromodulators, which activate persistent inward currents (PICs). These exert a powerful influence on MN behavior and modify motor control in both normal and pathological conditions. Motoneuronal PICs are believed to induce nonlinear phenomena such as the genesis of extra torque and torque hysteresis in response to percutaneous electrical stimulation or tendon vibration in humans. An existing large-scale neuromuscular simulator was expanded to include MN models that can change their dynamic behaviors depending on the neuromodulation level. The simulation results indicated that the variability (standard deviation) of a maintained force depended on the level of neuromodulatory activity. A force with lower variability was obtained when the motoneuronal network was under a strong influence of PICs, suggesting a functional role in postural and precision tasks. In an additional set of simulations in which PICs were active in the dendrites of the MN models, the results successfully reproduced experimental findings reported in humans. Extra torque was evoked by the self-sustained discharge of spinal MNs, whereas differences in the recruitment and de-recruitment levels of the MNs were the main reason behind torque and electromyogram (EMG) hysteresis. Finally, simulations were also used to study the influence of inhibitory inputs on an MN pool that was under the effect of PICs. The results showed that inhibition was of great importance in the production of a phasic force, requiring a reduced co-contraction of agonist and antagonist muscles. These results show the richness of functionally relevant behaviors that can arise from an MN pool under the action of PICs.

Relevance:

100.00%

Publisher:

Abstract:

A systematic approach to modeling nonlinear systems using norm-bounded linear differential inclusions (NLDIs) is proposed in this paper. The resulting NLDI model is suitable for the application of linear control design techniques; therefore, it is possible to fulfill certain specifications for the underlying nonlinear system, within an operating region of interest in the state space, using a linear controller designed for this NLDI model. Hence, a procedure to design a dynamic output-feedback controller for the NLDI model is also proposed in this paper. One of the main contributions of the proposed modeling and control approach is the use of the mean-value theorem to represent the nonlinear system by a linear parameter-varying model, which is then mapped into a polytopic linear differential inclusion (PLDI) within the region of interest. To avoid the combinatorial problem that is inherent to polytopic models for medium- and large-sized systems, the PLDI is transformed into an NLDI, and the whole process is carried out ensuring that all trajectories of the underlying nonlinear system are also trajectories of the resulting NLDI within the operating region of interest. Furthermore, it is also possible to choose a particular structure for the NLDI parameters to reduce the conservatism in the representation of the nonlinear system by the NLDI model, and this feature is also one important contribution of this paper. Once the NLDI representation of the nonlinear system is obtained, the paper proposes the application of a linear control design method to this representation. The design is based on quadratic Lyapunov functions and formulated as a search problem over a set of bilinear matrix inequalities (BMIs), which is solved using a two-step separation procedure that maps the BMIs into a set of corresponding linear matrix inequalities. Two numerical examples are given to demonstrate the effectiveness of the proposed approach.
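The mean-value-theorem step can be illustrated on a toy system. The pendulum-like dynamics, operating region, and vertex construction below are my own illustration of the general polytopic-embedding idea, not the paper's procedure in detail:

```python
import numpy as np

# Toy example: embed x1' = x2, x2' = -sin(x1) - x2 in a polytopic LDI over
# the region |x1| <= theta_max. By the mean-value theorem the only uncertain
# Jacobian entry is d(-sin x1)/dx1 = -cos(x1), which stays in [a_lo, a_hi].

theta_max = np.pi / 3
a_lo, a_hi = -1.0, -np.cos(theta_max)   # range of -cos(x1) on the region

# Vertex matrices of the inclusion x' = A(t) x, with A(t) in conv{A1, A2}
A1 = np.array([[0.0, 1.0], [a_lo, -1.0]])
A2 = np.array([[0.0, 1.0], [a_hi, -1.0]])

# Any state in the region yields a system matrix inside the polytope:
x1 = 0.5
a = -np.cos(x1)
lam = (a - a_lo) / (a_hi - a_lo)        # convex coefficient recovering A(x)
A_x = (1 - lam) * A1 + lam * A2
assert np.allclose(A_x, [[0.0, 1.0], [a, -1.0]])
```

The NLDI step described in the abstract would then over-bound this two-vertex polytope by a single norm-bounded uncertainty, trading some conservatism for a representation whose size does not grow with the number of vertices.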

Relevance:

100.00%

Publisher:

Abstract:

The use of geoid models to estimate the mean dynamic topography (MDT) was stimulated by the launch of the GRACE satellite system, since its models present unprecedented precision and space-time resolution. In the present study, besides the DNSC08 mean sea level model, the following geoid models were used with the objective of computing MDTs: EGM96, EIGEN-5C, and EGM2008. In the method adopted, geostrophic currents for the South Atlantic were computed based on the MDTs. It was found that the degree and order of the geoid models directly affect the determination of the MDT and the currents. The presence of noise in the MDT requires the use of efficient filtering techniques, such as the filter based on Singular Spectrum Analysis, which presents significant advantages in relation to conventional filters. Geostrophic currents resulting from the geoid models were compared with the HYCOM hydrodynamic numerical model. In conclusion, the results show that the MDTs and respective geostrophic currents calculated with the EIGEN-5C and EGM2008 models are similar to the results of the numerical model, especially regarding the main large-scale features such as boundary currents and the retroflection at the Brazil-Malvinas Confluence.
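The computation chain described above (MDT as the difference between a mean sea surface and a geoid, then geostrophic velocities from the MDT gradient) can be sketched as follows. The grid spacing and field values are invented for illustration; only the standard geostrophic relations are assumed, not the study's actual data:

```python
import numpy as np

g = 9.81                      # gravity, m/s^2
omega = 7.292115e-5           # Earth rotation rate, rad/s
lat = -30.0                   # a mid-latitude point in the South Atlantic
f = 2 * omega * np.sin(np.radians(lat))   # Coriolis parameter (negative in the S. hemisphere)

# Toy 1-D meridional section, sampled every 100 km: MDT = MSS - geoid (meters)
dy = 100e3
mss   = np.array([1.20, 1.15, 1.05, 0.90])   # invented mean sea surface heights
geoid = np.array([0.60, 0.60, 0.60, 0.60])   # invented geoid heights
mdt = mss - geoid

# Zonal geostrophic velocity from the meridional MDT slope: u = -(g/f) d(MDT)/dy
u = -(g / f) * np.gradient(mdt, dy)
```

In practice the MDT field is noisy at the short wavelengths where the geoid error grows, which is why the filtering step (e.g. the Singular Spectrum Analysis filter mentioned above) is applied before differentiating.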

Relevance:

100.00%

Publisher:

Abstract:

We prove a uniqueness result related to the Germain-Lagrange dynamic plate differential equation. We consider the equation
$$\frac{\partial^2 u}{\partial t^2} + \Delta^2 u = g \otimes f \quad \text{in } ]0,+\infty)\times\mathbb{R}^2, \qquad u(0)=0, \quad \frac{\partial u}{\partial t}(0)=0,$$
where $u$ stands for the transverse displacement, $f$ is a distribution compactly supported in space, and $g \in L^1_{\mathrm{loc}}([0,+\infty))$ is a function of time such that $g(0) \neq 0$ and there is a $T_0 > 0$ such that $g \in C^1[0,T_0[$. We prove that the knowledge of $u$ over an arbitrary open set of the plate for any interval of time $]0,T[$, 0

Relevance:

100.00%

Publisher:

Abstract:

Doctoral program: Intelligent Systems and Numerical Applications in Engineering (Instituto Universitario SIANI)

Relevance:

100.00%

Publisher:

Abstract:

Human reasoning is a fascinating and complex cognitive process that can be applied in different research areas such as philosophy, psychology, law, and finance. Unfortunately, developing supporting software (for those different areas) able to cope with such complex reasoning is difficult and requires a suitable abstract logical formalism. In this thesis we aim to develop a program whose job is to evaluate a theory (a set of rules) with respect to a goal and provide results such as "the goal is derivable from the KB (of the theory)". In order to achieve this goal we need to analyse different logics and choose the one that best meets our needs. In logic we usually try to determine whether a given conclusion is logically implied by a set of assumptions T (the theory). However, when we deal with logic programming we need an efficient algorithm to find such implications. In this work we use a logic rather similar to human reasoning. Indeed, human reasoning requires an extension of first-order logic able to reach a conclusion from not-definitely-true premises belonging to an incomplete set of knowledge. Thus, we implemented a defeasible logic framework able to manipulate defeasible rules. Defeasible logic is a non-monotonic logic designed by Nute for efficient defeasible reasoning (see Chapter 2). Applications of this kind are useful in the area of law, especially if they offer an implementation of an argumentation framework that provides a formal model of a game. Roughly speaking, if the theory is the set of laws, a key claim is the conclusion that one of the parties wants to prove (and the other wants to defeat), and we add dynamic assertion of rules, namely facts put forward by the parties, then we can play an argumentative challenge between two players and decide whether the conclusion is provable or not, depending on the different strategies performed by the players.
Implementing a game model requires one more meta-interpreter able to evaluate the defeasible logic framework; indeed, according to Gödel's theorem (see page 127), we cannot evaluate the meaning of a language using the tools provided by the language itself, but need a meta-language able to manipulate the object language. Thus, rather than a simple meta-interpreter, we propose a meta-level containing different meta-evaluators. The first has been explained above, the second is needed to implement the game model, and the last will be used to change the game execution and tree derivation strategies.
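As a rough illustration of defeasible derivability, here is a heavily simplified sketch, not the thesis' meta-interpreter: it omits Nute's superiority relation, so conflicting rules simply block each other, and the rules themselves are a made-up example:

```python
# Toy defeasible derivability: a literal is defeasibly provable if some rule
# for it fires (all premises provable) and no rule for the opposite literal
# fires. Negation is written by prefixing '~'; facts are rules with no premises.

rules = [
    ((), "bird"),                 # fact: it is a bird
    ((), "penguin"),              # fact: it is a penguin
    (("bird",), "flies"),         # birds (defeasibly) fly
    (("penguin",), "~flies"),     # penguins (defeasibly) do not fly
]

def neg(lit):
    """Complement of a literal."""
    return lit[1:] if lit.startswith("~") else "~" + lit

def provable(goal, depth=8):
    """Naive backward-chaining check with a depth bound to avoid loops."""
    if depth == 0:
        return False
    fires = lambda head: any(
        c == head and all(provable(p, depth - 1) for p in ps)
        for ps, c in rules
    )
    return fires(goal) and not fires(neg(goal))
```

With these rules both "flies" and "~flies" have a firing rule, so neither is derivable; resolving such conflicts in favour of the more specific rule is exactly what the superiority relation (and the argumentation game built on top of it) adds.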

Relevance:

100.00%

Publisher:

Abstract:

Recently, an ever-increasing degree of automation has been observed in most industrial processes. This increase is motivated by the demand for systems with high performance in terms of the quality of the products/services generated, productivity, efficiency, and low costs in design, realization, and maintenance. This trend in the growth of complex automation systems is rapidly spreading over automated manufacturing systems (AMS), where the integration of mechanical and electronic technology, typical of mechatronics, is merging with other technologies such as informatics and communication networks. An AMS is a very complex system that can be thought of as constituted by a set of flexible working stations and one or more transportation systems. To understand how important these machines are in our society, consider that every day most of us use bottles of water or soda, buy boxed products such as food or cigarettes, and so on. Another indication of their complexity derives from the fact that the consortium of machine producers has estimated around 350 types of manufacturing machines. A large number of manufacturing machine industries are present in Italy, notably the packaging machine industry; in particular, a great concentration of this kind of industry is located in the Bologna area, which for this reason is called the "packaging valley". Usually the various parts of an AMS interact in a concurrent and asynchronous way, and coordinating the parts of the machine to obtain a desired overall behaviour is a hard task. Often this is the case in large-scale systems, organized in a modular and distributed manner.
Even if the success of a modern AMS, from a functional and behavioural point of view, is still to be attributed to the design choices made in the definition of the mechanical structure and the electrical/electronic architecture, the system that governs the control of the plant is becoming crucial because of the large number of duties associated with it. Apart from the activity inherent to the automation of the machine cycles, the supervisory system is called on to perform other main functions, such as: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and a crucial functional flexibility; dynamically adapting the control strategies according to the different productive needs and operational scenarios; obtaining a high quality of the final product through verification of the correctness of the processing; directing the operator assigned to the machine to promptly and carefully take the actions needed to establish or restore the optimal operating conditions; and managing in real time information on diagnostics, as a support to the maintenance operations of the machine. The kinds of facilities that designers can directly find on the market, in terms of software component libraries, provide adequate support as regards the implementation of either top-level or bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices. What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, focusing on the cross-cutting functionalities characterizing the automation domain, may help designers in the process of modelling and structuring their applications according to specific needs.
Historically, the design and verification process for complex automated industrial systems has been performed in an empirical way, without a clear distinction between functional and technological-implementation concepts and without a systematic method to deal organically with the complete system. Traditionally, in the field of analog and digital control, design and verification through formal and simulation tools have been adopted for a long time, at least for multivariable and/or nonlinear controllers for complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives, and complex power electronics equipment. Moving to the field of logic control, typical of industrial manufacturing automation, the design and verification process is approached in a completely different way, usually very "unstructured". No clear distinction between functions and implementations, or between functional architectures and technological architectures and platforms, is considered. Probably this difference is due to the different "dynamical framework" of logic control with respect to analog/digital control. As a matter of fact, in logic control discrete-event dynamics replace time-driven dynamics; hence most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to bring out the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly connected to the adopted implementation technology (relays in the past, software nowadays), leading again to a deep confusion between the functional view and the technological view. In industrial automation software engineering, concepts such as modularity, encapsulation, composability, and reusability are strongly emphasized and profitably realized in the so-called object-oriented methodologies.
Industrial automation has lately been receiving this approach, as testified by the IEC 61131-3 and IEC 61499 standards, which have been considered in commercial products only recently. On the other hand, in the scientific and technical literature many contributions have already been proposed to establish a suitable modelling framework for industrial automation. During the last years it has been possible to note a considerable growth in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems. As far as logic control design is concerned, Model-Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability, and safety for technological systems. In other words, the control system should not only deal with the nominal behaviour, but should also deal with other important duties, such as diagnosis and fault isolation, recovery, and safety management. Indeed, together with high performance, fault occurrences increase in complex systems. This is a consequence of the fact that, as typically occurs in reliable mechatronic systems, in complex systems such as AMS, together with reliable mechanical elements, an increasing number of electronic devices are also present, which are more vulnerable by their own nature. The diagnosis and fault isolation problem in a generic dynamical system consists in the design of an elaboration unit that, appropriately processing the inputs and outputs of the dynamical system, is capable of detecting incipient faults in the plant devices and of reconfiguring the control system so as to guarantee satisfactory performance.
The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function while guaranteeing the desired level of reliability and safety; the next step is that of preventing faults and eventually reconfiguring the control system so that faults are tolerated. On this topic, an important improvement to formal verification of logic control, fault diagnosis, and fault-tolerant control derives from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture to help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics and a description of industrial automated systems in Chapter 1. In Chapter 2, a survey on the state of the software engineering paradigm applied to industrial automation is discussed. Chapter 3 presents an architecture for industrial automated systems based on the new concept of Generalized Actuator, showing its benefits, while in Chapter 4 this architecture is refined using a novel entity, the Generalized Device, in order to achieve better reusability and modularity of the control logic. In Chapter 5, a new approach based on Discrete Event Systems is presented for the problem of software formal verification, together with an active fault-tolerant control architecture using online diagnostics. Finally, conclusive remarks and some ideas on new directions to explore are given. In Appendix A, some concepts and results about Discrete Event Systems are briefly reported, which should help the reader in understanding some crucial points in Chapter 5, while in Appendix B an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna, used to validate the approaches presented in Chapters 3, 4, and 5, is reported. In Appendix C, some component models used in Chapter 5 for formal verification are reported.

Relevance:

100.00%

Publisher:

Abstract:

Protocol-based medicine represents an interdisciplinary focal point of computer science. As a particular segment of the medical subdomains, it allows the relatively formal specification of processes in the three areas of prevention, diagnosis, and therapy. The latter has always received special attention and has long served, in the context of clinical trials, as a projection surface for information-technology concepts. The euphoria of the early years, however, gives way to sobriety at every stocktaking: only very few of the countless projects have found their way into routine everyday practice. Most undertakings failed on the illusion that medical workflows are fully computable. The traditional view of clinical practice rests on a block-oriented conception of the therapy execution process. It arises from decomposing the process into individual therapy branches, which are assembled from predefined blocks. These can be executed sequentially or in parallel and are themselves composed of a set of elements that represent the activities of the lowest level. The block-oriented structural model is complemented by a rule-oriented execution model: a complex set of rules specifies conditions for the temporal and logical dependencies of the blocks, whose arrangement is formed by the execution process. Modelling therapy execution first faces the fundamental question of how far the traditional view is suitable as an internal representation. The overarching goal is the integration of the different levels of the therapy specification; this includes not only the structural component but above all the execution component. A suitable rule model is required that meets the specific needs of therapy monitoring. The central task is to bring these different levels together.
A sensible alternative to the traditional view is provided by the state-oriented model of the therapy execution process. It rests on the view that the entire therapy execution process ultimately describes a linear sequence of states, where each state transition is initiated by an event, tied to certain conditions, and can trigger certain actions. The parallelism of the block-oriented model recedes into the background, since the measures to be carried out are merely properties of the states and not structural elements of the execution specification. At every point in time exactly one state is active, and it represents one of finitely many clinical situations, with all of its specific activities and execution rules. The advantages of the state-oriented model lie in its integration: the basic structure combines the static representation of the possible phase arrangements with the dynamic execution of active rules. The original contents of the block-oriented model are reproduced as ordinary properties of the states and thus constitute only a special case of the state-based view. Further possibilities for enriching the states with additional details are as conceivable as they are sensible, yet the basic structure remains the same under every extension. The result is a reusable skeleton, a common denominator for fulfilling the monitoring task.
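The state-oriented model described above (exactly one active state, with event-triggered transitions that are tied to conditions and can fire actions) can be illustrated by a minimal state machine. The state names, events, guards, and actions below are invented examples, not taken from any clinical protocol:

```python
# Minimal sketch of a state-oriented therapy execution process: a transition
# table keyed by (state, event), where each entry carries a guard condition
# and an action. All names and thresholds here are illustrative only.

log = []

transitions = {
    # (state, event): (guard, next_state, action)
    ("induction", "labs_ok"): (lambda ctx: ctx["wbc"] > 1.0, "consolidation",
                               lambda ctx: log.append("start consolidation block")),
    ("consolidation", "cycle_done"): (lambda ctx: True, "maintenance",
                               lambda ctx: log.append("switch to maintenance")),
}

def step(state, event, ctx):
    """Fire a transition if its guard holds; otherwise remain in the state."""
    guard, nxt, action = transitions.get((state, event), (None, state, None))
    if guard and guard(ctx):
        action(ctx)
        return nxt
    return state

state = "induction"
state = step(state, "labs_ok", {"wbc": 2.3})
state = step(state, "cycle_done", {"wbc": 2.3})
```

The block-oriented content reappears here only as per-state properties (the actions and guards), which is precisely the point of the state-based view: extending a state with more detail does not change the basic transition structure.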