901 results for Formal Methods. Component-Based Development. Competition. Model Checking


Relevance:

100.00%

Publisher:

Abstract:

Purpose. An integrated ionic mobility-pore model for epidermal iontophoresis is developed from theoretical considerations using both the free volume and pore restriction forms of the model for a range of solute radii (r(j)) approaching the pore radii (r(p)), as well as an approximation of the pore restriction form for r(j)/r(p) < 0.4. In this model, we defined the determinants for iontophoresis as solute size (defined by MV, MW or radius), solute mobility, solute shape, solute charge, the Debye layer thickness, total current applied, solute concentration, fraction ionized, presence of extraneous ions (defined by solvent conductivity), epidermal permselectivity, partitioning rates to account for interaction of unionized and ionized lipophilic solutes with the wall of the pore, and electroosmosis. Methods. The ionic mobility-pore model was developed from theoretical considerations to include each of the determinants of iontophoretic transport. The model was then used to reexamine iontophoretic flux-conductivity and iontophoretic flux-fraction ionized literature data on the determinants of iontophoretic flux. Results. The ionic mobility-pore model was found to be consistent with existing experimental data and the determinants defining iontophoretic transport. However, the predicted effects of solute size on iontophoresis are more consistent with the pore-restriction than the free volume form of the model. A reanalysis of iontophoretic flux-conductivity data confirmed the model's prediction that, in the absence of significant electroosmosis, the reciprocal of flux is linearly related to either donor or receptor solution conductivity. Significant interaction with the pore walls, as described by the model, accounted for the reported pH dependence of the iontophoretic transport of a range of ionizable solutes. Conclusions. The ionic mobility-pore iontophoretic model developed enables a range of determinants of iontophoresis to be described in a single unifying equation.
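The reported linear reciprocal-flux relation can be written compactly as below; this is a hedged sketch of the relation stated in the abstract, with a and b as illustrative lumped constants and kappa the donor or receptor solution conductivity (symbols are not taken from the paper).

    % Hedged sketch of the reciprocal-flux relation reported in the abstract
    % (valid in the absence of significant electroosmosis); a and b are
    % illustrative lumped constants, \kappa the solution conductivity.
    \[
      \frac{1}{J} \;=\; a + b\,\kappa
    \]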

Relevance:

100.00%

Publisher:

Abstract:

The rise of component-based software development has created an urgent need for effective application program interface (API) documentation. Experience has shown that it is hard to create precise and readable documentation. Prose documentation can provide a good overview but lacks precision. Formal methods offer precision, but the resulting documentation is expensive to develop; worse, few developers have the skill or inclination to read formal documentation. We present a pragmatic solution to the problem of API documentation: we augment the prose documentation with executable test cases, including expected outputs, and use the prose plus the test cases as the documentation. With appropriate tool support, the test cases are easy to develop and read. Such test cases constitute a completely formal, albeit partial, specification of input/output behavior. Equally important, consistency between code and documentation is demonstrated by running the test cases. This approach provides an attractive bridge between formal and informal documentation. We also present a tool that supports compact and readable test cases as well as the generation of test drivers and documentation, and we illustrate the approach with detailed case studies. (C) 2002 Elsevier Science Inc. All rights reserved.
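A minimal sketch of the "prose plus executable test cases" idea, in Python's doctest style; the normalize function and its behaviour are hypothetical and only illustrate the documentation technique, not the paper's own tool.

    def normalize(path: str) -> str:
        """Collapse duplicate slashes in a URL path (hypothetical example).

        The prose gives the overview; the examples below are executable test
        cases with expected outputs (run with `python -m doctest file.py`)
        and form a partial, checkable specification of input/output behaviour.

        >>> normalize("/a//b///c")
        '/a/b/c'
        >>> normalize("")
        '/'
        """
        parts = [p for p in path.split("/") if p]
        return "/" + "/".join(parts)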

Relevance:

100.00%

Publisher:

Abstract:

This paper is concerned with methods for the refinement of specifications written using a combination of Object-Z and CSP. Such a combination has proved to be a suitable vehicle for specifying complex systems that involve both state and behaviour, and several proposals exist for integrating these two languages. The basis of the integration in this paper is a semantics of Object-Z classes identical to that of CSP processes. This allows classes specified in Object-Z to be combined using CSP operators. It has been shown that this semantic model allows state-based refinement relations to be used on the Object-Z components in an integrated Object-Z/CSP specification. However, the current refinement methodology does not allow the structure of a specification to be changed in a refinement, whereas a full methodology would, for example, allow concurrency to be introduced during the development life-cycle. In this paper, we tackle these concerns and discuss refinements of specifications written using Object-Z and CSP in which the structure of the specification is changed when performing the refinement. In particular, we develop a set of structural simulation rules which allow single components to be refined to more complex specifications involving CSP operators. The soundness of these rules is verified against the common semantic model and they are illustrated via a number of examples.
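As general CSP background for the notion of refinement discussed here, the standard failures-divergences order can be sketched as follows; the paper's own structural simulation rules are not reproduced.

    % Standard CSP failures--divergences refinement (background sketch only):
    \[
      P \sqsubseteq_{FD} Q \;\iff\;
      \mathit{failures}(Q) \subseteq \mathit{failures}(P)
      \;\wedge\;
      \mathit{divergences}(Q) \subseteq \mathit{divergences}(P)
    \]
    % A structural step of the kind discussed would, for example, replace a
    % single component $P$ by a parallel composition $Q_1 \parallel Q_2$ once
    % $P \sqsubseteq_{FD} (Q_1 \parallel Q_2)$ has been established.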

Relevance:

100.00%

Publisher:

Abstract:

Management system standards (MSSs) have developed in an unprecedented manner in the last few years. These MSSs cover a wide array of different disciplines, aims and activities of organisations. Organisations are also populated with an enormous diversity of independent management systems (MSs). An integrated management system (IMS) tends to integrate some or all components of the business. Maximising their integration in one coherent and efficient MS is increasingly a strategic priority and constitutes an opportunity for businesses to become more competitive and, consequently, to promote their sustainable success. Those organisations that are quicker and more efficient in their integration and continuous improvement will have a competitive advantage in obtaining sustainable value in our global and competitive business world. Several scholars have proposed various theoretical approaches regarding the integration of management sub-systems, leading to the conclusion that there is no practice common to all organisations, as they have different characteristics. Another author shows that the integration of the individual standardised MSs yields several tangible and intangible gains for organisations as well as for their internal and external stakeholders. The purpose of this work was to conceive a flexible, integrating and lean model for IMSs, based on ISO 9001 for quality, ISO 14001 for environment and OHSAS 18001 for occupational health and safety (IMS–QES), that can be adapted to and progressively assimilate other MSs, such as SA 8000/ISO 26000 for social accountability, ISO 31000 for risk management and ISO/IEC 27001 for information security management, among others. The IMS–QES model was designed in the real environment of a Portuguese industrial small and medium-sized enterprise that over the years has gradually adopted, in whole or in part, individual MSSs. The developed model is based on a preliminary investigation conducted through a questionnaire; the research strategy and methods follow a case-study approach. Among the main findings of the survey we highlight: the creation of added value for the business through the elimination of several organisational wastes; the integrated management of the sustainability components; the elimination of conflicts between independent MSs; dialogue with the main stakeholders and commitment to their ongoing satisfaction, with an increased contribution to the company's competitiveness; and greater valorisation and motivation of employees as a result of the expansion of their skill base, actions and responsibilities, with their consequent empowerment. A set of key performance indicators (KPIs) supports, from a business-excellence perspective, the follow-up of the organisation's progress towards the vision and the achievement of the objectives defined for each component of the IMS model. The conceived model went through several phases, and the one presented in this work is the last one required for the integration of quality, environment, safety and other individual standardised MSs. Overall, the investigation results justified and prioritised not only the conception of an IMS–QES model to be implemented at the company where the investigation was conducted, but also a generic IMS model that should be as flexible, integrating and lean as possible, enhancing efficiency and added value both in the present and, fundamentally, in the future.

Relevance:

100.00%

Publisher:

Abstract:

This paper reports on the development of specific slicing techniques for functional programs and their use for the identification of possible coherent components from monolithic code. An associated tool is also introduced. This piece of research is part of a broader project on program understanding and re-engineering of legacy code supported by formal methods.
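A hedged sketch of the underlying idea, a backward slice over a dependency graph used to isolate a candidate component; the graph and the slicing criterion are hypothetical, and this is not the paper's slicer for functional programs.

    def backward_slice(deps, criterion):
        """Return every definition the slicing criterion transitively depends on."""
        seen, stack = set(), [criterion]
        while stack:
            node = stack.pop()
            if node in seen:
                continue
            seen.add(node)
            stack.extend(deps.get(node, ()))
        return seen

    # Hypothetical dependency graph of a small functional program.
    deps = {
        "report": ["format", "totals"],
        "totals": ["sum_items"],
        "format": [],
        "sum_items": [],
        "unused_helper": ["format"],
    }
    print(backward_slice(deps, "totals"))  # {'totals', 'sum_items'} -- a candidate component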

Relevance:

100.00%

Publisher:

Abstract:

Work presented in the context of the European Master's program in Computational Logic, as a partial requirement for obtaining the Master of Science degree in Computational Logic.

Relevance:

100.00%

Publisher:

Abstract:

We analyze recent contributions to growth theory based on the expanding-variety model of Romer (1990). In the first part, we present different versions of the benchmark linear model with imperfect competition. These include the lab-equipment model, the labor-for-intermediates model and directed technical change. We review applications of the expanding-variety framework to the analysis of international technology diffusion, trade, cross-country productivity differences, financial development and fluctuations. In many such applications, a key role is played by complementarities in the process of innovation.
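For orientation, the canonical expanding-variety production structure underlying these models can be sketched as below; the notation is illustrative and need not match that of the surveyed papers.

    % Hedged sketch of the canonical expanding-variety technology:
    % labour L is combined with a continuum of intermediate varieties
    % x(\nu), \nu \in [0, N]; growth is driven by the expansion of N.
    \[
      Y \;=\; L^{1-\alpha} \int_{0}^{N} x(\nu)^{\alpha}\, d\nu,
      \qquad 0 < \alpha < 1 .
    \]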

Relevance:

100.00%

Publisher:

Abstract:

Two concentration methods for fast and routine determination of caffeine (using HPLC-UV detection) in surface water and wastewater are evaluated. Both methods are based on solid-phase extraction (SPE) concentration with octadecyl silica sorbents. A common "off-line" SPE procedure shows that quantitative recovery of caffeine is obtained with 2 mL of a methanol-water elution mixture containing at least 60% methanol. The method detection limit is 0.1 μg L−1 when percolating 1 L samples through the cartridge. The development of an "on-line" SPE method, based on a mini-SPE column containing 100 mg of the same sorbent directly connected to the HPLC system, allows the method detection limit to be decreased to 10 ng L−1 with a sample volume of 100 mL. The "off-line" SPE method is applied to the analysis of caffeine in wastewater samples, whereas the "on-line" method is used for analysis in natural waters from streams receiving significant water intakes from local wastewater treatment plants.
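As a hedged arithmetic sketch (not the authors' own calculation), the nominal enrichment implied by the reported off-line volumes is:

    # Hedged arithmetic sketch: nominal pre-concentration factor implied by
    # the volumes reported in the abstract, assuming quantitative recovery.
    v_sample_ml = 1000.0   # 1 L sample percolated through the cartridge
    v_eluate_ml = 2.0      # eluted with 2 mL methanol-water (>= 60% methanol)
    print(f"nominal enrichment factor: {v_sample_ml / v_eluate_ml:.0f}x")  # 500x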

Relevance:

100.00%

Publisher:

Abstract:

The diffusion of mobile telephony began in 1971 in Finland, when the first car phones, called ARP, were taken into use. Technologies changed from ARP to NMT and later to GSM. The main application of the technology, however, was voice transfer. The birth of the Internet created an open public data network and easy access to other types of computer-based services over networks. Telephones had been used as modems, but the development of cellular technologies enabled automatic access from mobile phones to the Internet. Other wireless technologies, for instance wireless LANs, were also introduced. Telephony had developed from analog to digital in fixed networks, which allowed easy integration of fixed and mobile networks. This development opened completely new functionality to computers and mobile phones. It also initiated the merger of the information technology (IT) and telecommunication (TC) industries. Despite the arising opportunity for new competition among firms, applications based on the new functionality were rare. Furthermore, technology development combined with innovation can be disruptive to industries. This research focuses on the new technology's impact on competition in the ICT industry through understanding the strategic needs and alternative futures of the industry's customers. The speed of change in the ICT industry is high, and it was therefore valuable to integrate the dynamic capability view of the firm into this research. Dynamic capabilities are an application of the resource-based view (RBV) of the firm. As stated in the literature, strategic positioning complements the RBV. This theoretical framework leads the research to focus on three areas: customer strategic innovation and business model development, external future analysis, and process development combining these two. The theoretical contribution of the research lies in the development of a methodology integrating theories of the RBV, dynamic capabilities and strategic positioning. The research approach has been constructive, owing to the actual managerial problems that initiated the study. The requirement for iterative and innovative progress in the research supported the chosen research approach. The study applies known methods in product development, for instance the innovation process in the Group Decision Support Systems (GDSS) laboratory and Quality Function Deployment (QFD), and combines them with known strategy analysis tools such as industry analysis and the scenario method. As the main result, the thesis presents the strategic innovation process, in which new business concepts are used to describe alternative resource configurations and scenarios as alternative competitive environments; this can be a new way for firms to achieve competitive advantage in high-velocity markets. In addition to the strategic innovation process, the study has also produced approximately 250 new innovations for the participating firms, reduced technology uncertainty, supported strategic infrastructural decisions in the firms, and produced a knowledge bank including data from 43 ICT and 19 paper-industry firms between 1999 and 2004. The methods presented in this research are also applicable to other industries.

Relevance:

100.00%

Publisher:

Abstract:

The development of correct programs is a core problem in computer science. Although formal verification methods for establishing correctness with mathematical rigor are available, programmers often find these difficult to put into practice. One hurdle is deriving the loop invariants and proving that the code maintains them. So-called correct-by-construction methods aim to alleviate this issue by integrating verification into the programming workflow. Invariant-based programming is a practical correct-by-construction method in which the programmer first establishes the invariant structure, and then incrementally extends the program in steps of adding code and proving after each addition that the code is consistent with the invariants. In this way, the program is kept internally consistent throughout its development, and the construction of the correctness arguments (proofs) becomes an integral part of the programming workflow. A characteristic of the approach is that programs are described as invariant diagrams, a graphical notation similar to the state charts familiar to programmers. Invariant-based programming is a new method that has not been evaluated in large-scale studies yet. The most important prerequisite for feasibility on a larger scale is a high degree of automation. The goal of the Socos project has been to build tools to assist the construction and verification of programs using the method. This thesis describes the implementation and evaluation of a prototype tool in the context of the Socos project. The tool supports the drawing of the diagrams, automatic derivation and discharging of verification conditions, and interactive proofs. It is used to develop programs that are correct by construction. The tool consists of a diagrammatic environment connected to a verification condition generator and an existing state-of-the-art theorem prover. Its core is a semantics for translating diagrams into verification conditions, which are sent to the underlying theorem prover. We describe a concrete method for 1) deriving sufficient conditions for total correctness of an invariant diagram; 2) sending the conditions to the theorem prover for simplification; and 3) reporting the results of the simplification to the programmer in a way that is consistent with the invariant-based programming workflow and that allows errors in the program specification to be efficiently detected. The tool uses an efficient automatic proof strategy to prove as many conditions as possible automatically and lets the remaining conditions be proved interactively. The tool is based on the verification system PVS and uses the SMT (Satisfiability Modulo Theories) solver Yices as a catch-all decision procedure. Conditions that were not discharged automatically may be proved interactively using the PVS proof assistant. The programming workflow is very similar to the process by which a mathematical theory is developed inside a computer-supported theorem prover environment such as PVS. The programmer reduces a large verification problem with the aid of the tool into a set of smaller problems (lemmas), and he can substantially improve the degree of proof automation by developing specialized background theories and proof strategies to support the specification and verification of a specific class of programs. We demonstrate this workflow by describing in detail the construction of a verified sorting algorithm. Tool-supported verification often has little to no presence in computer science (CS) curricula.
Furthermore, program verification is frequently introduced as an advanced and purely theoretical topic that is not connected to the workflow taught in the early and practically oriented programming courses. Our hypothesis is that verification could be introduced early in the CS education, and that verification tools could be used in the classroom to support the teaching of formal methods. A prototype of Socos has been used in a course at Åbo Akademi University targeted at first and second year undergraduate students. We evaluate the use of Socos in the course as part of a case study carried out in 2007.
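A hedged illustration of the discipline, using plain Python assertions rather than Socos or invariant diagrams, and runtime checks rather than proofs: the invariant is stated first and re-checked after every extension of the loop body. The sorting theme only echoes the verified sorting algorithm mentioned above.

    def sorted_prefix(a, k):
        """Invariant predicate: a[0:k] is sorted (checked at runtime here;
        in invariant-based programming it would be proved, not tested)."""
        return all(a[i] <= a[i + 1] for i in range(k - 1))

    def insertion_sort(a):
        assert sorted_prefix(a, 1)            # invariant established first
        for k in range(1, len(a)):
            x, i = a[k], k
            while i > 0 and a[i - 1] > x:
                a[i] = a[i - 1]
                i -= 1
            a[i] = x
            assert sorted_prefix(a, k + 1)    # invariant re-established after each step
        return a

    print(insertion_sort([3, 1, 2]))  # [1, 2, 3]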

Relevance:

100.00%

Publisher:

Abstract:

A web service is a software system that provides a machine-processable interface to other machines over the network using different Internet protocols. Web services are increasingly used in industry in order to automate different tasks and offer services to a wider audience. The REST architectural style aims at producing scalable and extensible web services using technologies that play well with the existing tools and infrastructure of the web. It provides a uniform set of operations that can be used to invoke a CRUD interface (create, retrieve, update and delete) of a web service. The stateless behavior of the service interface requires that every request to a resource is independent of the previous ones, which facilitates scalability. Automated systems, e.g., hotel reservation systems, provide advanced scenarios for stateful services that require a certain sequence of requests to be followed in order to fulfill the service goals. Designing and developing such services for advanced scenarios under REST constraints requires rigorous approaches that are capable of creating web services that can be trusted for their behavior. Systems that can be trusted for their behavior can be termed dependable systems. This thesis presents an integrated design, analysis and validation approach that helps the service developer create dependable and stateful REST web services. The main contribution of this thesis is a novel model-driven methodology to design behavioral REST web service interfaces and their compositions. The behavioral interfaces provide information on what methods can be invoked on a service and the pre- and post-conditions of these methods. The methodology uses the Unified Modeling Language (UML) as the modeling language, which has a wide user base and mature tools that are continuously evolving. We have used UML class diagrams and UML state machine diagrams with additional design constraints to provide resource and behavioral models, respectively, for designing REST web service interfaces. These service design models serve as a specification document, and the information presented in them has manifold applications. The service design models also contain information about the time and domain requirements of the service, which can help in requirement traceability, an important part of our approach. Requirement traceability helps in capturing faults in the design models and in other elements of the software development environment by tracing the unfulfilled requirements of the service back and forth. Information about service actors is also included in the design models; it is required for authenticating service requests by authorized actors, since not all types of users have access to all the resources. In addition, following our design approach, the service developer can ensure that the designed web service interfaces are REST compliant. The second contribution of this thesis is the consistency analysis of the behavioral REST interfaces. To overcome the inconsistency problem and design errors in our service models, we have used semantic technologies. The REST interfaces are represented in the Web Ontology Language, OWL 2, and can thus be part of the semantic web. These interfaces are used with OWL 2 reasoners to check for unsatisfiable concepts, which result in implementations that fail. This work is fully automated thanks to the implemented translation tool and the existing OWL 2 reasoners.
The third contribution of this thesis is the verification and validation of REST web services. We have used model checking techniques with the UPPAAL model checker for this purpose. Timed automata are generated from the UML-based service design models with our transformation tool and are verified for basic characteristics such as deadlock freedom, liveness, reachability and safety. The implementation of a web service is tested using a black-box testing approach. Test cases are generated from the UPPAAL timed automata and, using the online testing tool UPPAAL TRON, the service implementation is validated at runtime against its specifications. Requirement traceability is also addressed in our validation approach, with which we can see which service goals are met and trace unfulfilled service goals back to faults in the design models. A final contribution of the thesis is an implementation of behavioral REST interfaces and service monitors from the service design models. The partial code generation tool creates code skeletons of REST web services with method pre- and post-conditions. The preconditions of methods constrain the user to invoke the stateful REST service under the right conditions, and the postconditions constrain the service developer to implement the right functionality. The details of the methods can be manually inserted by the developer as required. We do not target complete automation because we focus only on the interface aspects of the web service. The applicability of the approach is demonstrated with a pedagogical example of a hotel room booking service and a relatively complex worked example of a holiday booking service taken from an industrial context. The former example presents a simple explanation of the approach, and the latter shows how stateful and timed web services offering complex scenarios and involving other web services can be constructed using our approach.
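A hedged Python sketch of a stateful REST-style resource guarded by pre- and post-conditions, echoing the hotel-booking example; the resource, states and transitions are invented for illustration, and the code is not generated by the thesis's tool chain.

    class BookingResource:
        """Hypothetical booking resource with pre-/post-condition-guarded methods."""

        def __init__(self):
            self.state = "created"

        def put_confirm(self):
            # Precondition: confirmation is only allowed from the "created" state.
            assert self.state == "created", "412 Precondition Failed"
            self.state = "confirmed"
            assert self.state == "confirmed"   # postcondition: transition happened
            return {"status": self.state}

        def delete(self):
            # Precondition: only confirmed bookings may be cancelled here.
            assert self.state == "confirmed", "412 Precondition Failed"
            self.state = "cancelled"
            assert self.state == "cancelled"   # postcondition
            return {"status": self.state}

    booking = BookingResource()
    print(booking.put_confirm())  # {'status': 'confirmed'}
    print(booking.delete())       # {'status': 'cancelled'}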

Relevance:

100.00%

Publisher:

Abstract:

This thesis studies the possibility of using lean tools and methods in a quotation process carried out in an office environment. The aim of the study was to identify and test the relevant lean tools and methods that can help to balance and standardize the quotation process and reduce the variance in quotation lead times and quality. Seminal works, research and guide books related to the topic were used as the basis for the theory development. Based on the literature review and the case company's own lean experience, the applicable lean tools and methods were selected to be tested by a sales support team. Production leveling, through product categorization and value stream mapping, was the key method used to balance the quotation process. The 5S method was started concurrently to standardize the work. The results of the testing period showed that lean tools and methods are applicable in an office process and that the selected tools and methods helped to balance and standardize the quotation process. The case company's sales support team decided to implement the new lean-based quotation process model.

Relevance:

100.00%

Publisher:

Abstract:

The objective of the present study was to compare the effect of electroacupuncture (EA) and carprofen (CP) on postoperative incisional pain using the plantar incision (PI) model in rats. A 1-cm longitudinal incision was made through skin, fascia and muscles of a hind paw of male Wistar rats and the development of mechanical and thermal hypersensitivity was determined over 4 days using the von Frey and Hargreaves methods, respectively. Based on the experimental treatments received on the third postoperative day, the animals were divided into the following groups: PI+CP (CP, 2 mg/kg, po); PI+EAST36 (100-Hz EA applied bilaterally at the Zusanli point (ST36)); PI+EANP (EA applied to a non-acupoint region); PI+IMMO (immobilization only); PI (vehicle). In the von Frey test, the PI+EAST36 group had higher withdrawal force thresholds in response to mechanical stimuli than the PI, PI+IMMO and PI+EANP groups at several times studied. Furthermore, the PI+EAST36 group showed paw withdrawal thresholds in response to mechanical stimuli that were similar to those of the PI+CP group. In the Hargreaves test, all groups had latencies higher than those observed with PI. The PI+EAST36 group was similar to the PI+IMMO, PI+EANP and PI+CP groups. We conclude that 100-Hz EA at the ST36 point, but not at non-acupoints, can reduce mechanical nociception in the rat model of incisional pain, and its effectiveness is comparable to that of carprofen.

Relevance:

100.00%

Publisher:

Abstract:

Embedded systems are usually designed for a single task or a specified set of tasks. This specificity means that the system design, as well as its hardware/software development, can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints and rapid development. This necessitates the adoption of static machine-code analysis tools, running on a host machine, for the validation and optimization of embedded system code; such tools can help meet all of these goals, could significantly augment software quality, and remain a challenging field. This dissertation contributes an architecture-oriented code validation, error localization and optimization technique that assists the embedded system designer in software debugging, making it more effective at the early detection of software bugs that are otherwise hard to detect, using static analysis of machine code. The focus of this work is to develop methods that automatically localize faults as well as optimize the code, and thus improve both the debugging process and the quality of the code. Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing the computational and integrated peripheral functions. The stipulated rules are encoded in propositional logic formulae and their compliance is tested individually in all possible execution paths of the application programs.
Incorrect sequences of machine-code patterns are identified using slicing techniques on the control-flow graph generated from the machine code. An algorithm is proposed to assist the compiler in eliminating redundant bank-switching code and deciding on the optimum data allocation to banked memory, resulting in a minimum number of bank-switching instructions in the embedded software. A relation matrix and a state transition diagram, formed for the active-memory-bank state transitions corresponding to each bank-selection instruction, are used for the detection of redundant code. Instances of code redundancy based on the stipulated rules for the target processor are identified. This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of compiler and assembler and applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly from machine-code patterns, which drastically reduces state-space creation and contributes to improved state-of-the-art model checking. Though the technique described is general, the implementation is architecture-oriented, and hence the feasibility study is conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices towards the correct use of difficult microcontroller features when developing embedded systems.
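A hedged sketch of one of the ideas described, detecting redundant bank-selection instructions along a single execution path by tracking the active-bank state; the instruction encoding is invented for illustration and is not the PIC16F87X rule set of the dissertation.

    def redundant_bank_selects(path, initial_bank=0):
        """Indices of bank-selection instructions that reselect the active bank."""
        bank, redundant = initial_bank, []
        for idx, instr in enumerate(path):
            if instr.startswith("BSEL "):          # hypothetical "select bank n" mnemonic
                target = int(instr.split()[1])
                if target == bank:
                    redundant.append(idx)          # no state change: candidate for removal
                bank = target
        return redundant

    path = ["BSEL 1", "MOVWF 0x20", "BSEL 1", "ADDWF 0x21", "BSEL 0"]
    print(redundant_bank_selects(path))  # [2]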

Relevance:

100.00%

Publisher:

Abstract:

Active magnetic bearings make it possible to support rotating bodies contact-free by means of magnetic fields. Due to the nature of the system, essential signals are available in actively magnetically levitated machines for diagnostic tasks without additional measurement equipment. This thesis develops a concept that uses these system-inherent signals to diagnose magnetically levitated rotating machines, thereby allowing a rapid assessment of the plant condition in addition to continuous plant monitoring. Faults can be detected early and traced to their cause, in both type and magnitude, and appropriate countermeasures can be initiated. Features are extracted from the acquired signals using signal-based and model-based methods. For the magnetic bearing control loop, the use of model-based parameter identification methods is investigated, and their applicability is demonstrated in the diagnosis of the controller and the power amplifier. Using simulation models and experiments on test rigs, the feature trajectories are recorded in the normal reference state and in the presence of faults, and the results are stored in a knowledge base. This knowledge base serves as the foundation for defining limit values and rules for monitoring the system and for creating knowledge-based diagnostic models. During monitoring, the feature values are checked against limit values, information about detected faults and operating states is generated, and alarms are issued where necessary. Slowly developing faults can be detected by computing feature trends using regression analysis. Going beyond the limit-value monitoring that has so far been customary for active magnetic bearings, the fault diagnosis combines the extracted features to identify and localize occurring faults. The diagnosis is performed using rule-based fuzzy logic, which allows linguistic statements in the form of expert knowledge to be incorporated and uncertainties to be taken into account, thus enabling the diagnosis of complex systems. Diagnostic models are created and verified for actuator, sensor and controller faults in the magnetic bearing control loop, as well as for faults caused by external forces and unbalance. It is demonstrated that the developed diagnostic concept delivers correct diagnostic statements with manageable computational effort. By cascading fuzzy-logic modules, the transparency of the rule base is preserved and the processing of the rules is optimized. The end result is a novel hybrid diagnostic concept that combines signal-based and model-based feature extraction with knowledge-based fault diagnosis methods. The developed diagnostic concept is designed to be adaptable to different requirements and applications for rotating machines.
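A hedged sketch of the kind of rule-based fuzzy evaluation described, with one toy rule combining two features; the features, membership functions and rule are invented for illustration and are not taken from the thesis.

    def tri(x, a, b, c):
        """Triangular fuzzy membership function (illustrative)."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def diagnose_unbalance(vibration, coil_current):
        """Toy rule: IF vibration is high AND coil current is high THEN suspect unbalance."""
        vib_high = tri(vibration, 2.0, 5.0, 8.0)
        cur_high = tri(coil_current, 1.0, 3.0, 5.0)
        return min(vib_high, cur_high)             # fuzzy AND as minimum

    print(f"degree of 'unbalance' diagnosis: {diagnose_unbalance(4.0, 2.5):.2f}")  # ~0.67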