984 results for Minimal-model


Relevance: 30.00%

Publisher:

Abstract:

Electrical stimulation is a new way to treat digestive disorders such as constipation. Colonic propulsive activity can be triggered by battery-operated devices. This study aimed to demonstrate the effect of direct electrical colonic stimulation on mean transit time in a chronic porcine model. The impact of stimulation and implanted material on the colonic wall was also assessed. Three pairs of electrodes were implanted into the caecal wall of 12 anaesthetized pigs. Reference colonic transit time was determined by radiopaque markers for each pig before implantation. It was repeated 4 weeks after implantation with sham stimulation and 5 weeks after implantation with electrical stimulation. Aboral sequential trains of 1-ms pulse width (10 V; 120 Hz) were applied twice daily for 6 days, using an external battery-operated stimulator. For each course of markers, a mean value was computed from the transit times obtained from each individual pig. Microscopic examination of the caecum was routinely performed after animal sacrifice. A reduction of mean transit time was observed after electrical stimulation (19 +/- 13 h; mean +/- SD) when compared to the reference (34 +/- 7 h; P = 0.045) and to the mean transit time after sham stimulation (36 +/- 9 h; P = 0.035). Histological examination revealed minimal chronic inflammation around the electrodes. Colonic transit time measured in a chronic porcine model is reduced by direct sequential electrical stimulation. Minimal tissue lesion is elicited by stimulation or implanted material. Electrical colonic stimulation could be a promising approach to treat specific disorders of the large bowel.


The interaction of atomic hydrogen with C4H9, Si4H9, and Ge4H9 model clusters has been studied using all-electron and pseudopotential ab initio Hartree-Fock computations with basis sets of increasing flexibility. The results show that the effect of polarization functions is important in order to reproduce the experimental findings, but their inclusion only for the atoms directly involved in the chemisorption bond is usually sufficient. For the systems H-C4H9 and H-Si4H9, all-electron and pseudopotential results are in excellent agreement when basis sets of comparable quality are used. In addition, semiempirical modified-neglect-of-differential-overlap computations provide quite reliable results both for diamond and silicon and have been used to investigate larger model clusters. The results confirm the local nature of chemisorption and further justify the use of minimal X4H9 model clusters.


Propionibacterium acnes is an important cause of orthopedic-implant-associated infections, for which the optimal treatment has not yet been determined. We investigated the activity of rifampin, alone and in combination, against planktonic and biofilm P. acnes in vitro and in a foreign-body infection model. The MIC and the minimal bactericidal concentration (MBC) were 0.007 and 4 μg/ml for rifampin, 1 and 4 μg/ml for daptomycin, 1 and 8 μg/ml for vancomycin, 1 and 2 μg/ml for levofloxacin, 0.03 and 16 μg/ml for penicillin G, 0.125 and 512 μg/ml for clindamycin, and 0.25 and 32 μg/ml for ceftriaxone. The P. acnes minimal biofilm eradication concentration (MBEC) was 16 μg/ml for rifampin; 32 μg/ml for penicillin G; 64 μg/ml for daptomycin and ceftriaxone; and ≥128 μg/ml for levofloxacin, vancomycin, and clindamycin. In the animal model, implants were infected by injection of 10⁹ CFU of P. acnes into cages. Antimicrobial activity against P. acnes was investigated in the cage fluid (planktonic form) and on explanted cages (biofilm form). The cure rates were 4% for daptomycin, 17% for vancomycin, 0% for levofloxacin, and 36% for rifampin. In combination, rifampin cured 63% of the infected cages with daptomycin, 46% with vancomycin, and 25% with levofloxacin. While all tested antimicrobials showed good activity against planktonic P. acnes, rifampin was needed for the eradication of biofilms. In combination with rifampin, daptomycin showed higher cure rates than vancomycin in this foreign-body infection model.


Limited antimicrobial agents are available for the treatment of implant-associated infections caused by fluoroquinolone-resistant Gram-negative bacilli. We compared the activities of fosfomycin, tigecycline, colistin, and gentamicin (alone and in combination) against a CTX-M15-producing strain of Escherichia coli (Bj HDE-1) in vitro and in a foreign-body infection model. The MIC and the minimal bactericidal concentration in logarithmic phase (MBC(log)) and stationary phase (MBC(stat)) were 0.12, 0.12, and 8 μg/ml for fosfomycin, 0.25, 32, and 32 μg/ml for tigecycline, 0.25, 0.5, and 2 μg/ml for colistin, and 2, 8, and 16 μg/ml for gentamicin, respectively. In time-kill studies, colistin showed concentration-dependent activity, but regrowth occurred after 24 h. Fosfomycin demonstrated rapid bactericidal activity at the MIC, and no regrowth occurred. Synergistic activity between fosfomycin and colistin in vitro was observed, with no detectable bacterial counts after 6 h. In animal studies, fosfomycin reduced planktonic counts by 4 log(10) CFU/ml, whereas in combination with colistin, tigecycline, or gentamicin, it reduced counts by >6 log(10) CFU/ml. Fosfomycin was the only single agent which was able to eradicate E. coli biofilms (cure rate, 17% of implanted, infected cages). In combination, colistin plus tigecycline (50%) and fosfomycin plus gentamicin (42%) cured significantly more infected cages than colistin plus gentamicin (33%) or fosfomycin plus tigecycline (25%) (P < 0.05). The combination of fosfomycin plus colistin showed the highest cure rate (67%), which was significantly better than that of fosfomycin alone (P < 0.05). In conclusion, the combination of fosfomycin plus colistin is a promising treatment option for implant-associated infections caused by fluoroquinolone-resistant Gram-negative bacilli.
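Synergy between two agents, such as the fosfomycin-colistin combination above, is conventionally quantified with the fractional inhibitory concentration (FIC) index from a checkerboard assay. The sketch below shows the standard calculation; the combination MIC values used are hypothetical illustrations, not results from this study:

```python
def fic_index(mic_a_alone, mic_a_combo, mic_b_alone, mic_b_combo):
    """Fractional inhibitory concentration index:
    FICI = MIC(A in combination)/MIC(A alone) + MIC(B in combination)/MIC(B alone).
    FICI <= 0.5 is conventionally read as synergy, > 4.0 as antagonism."""
    return mic_a_combo / mic_a_alone + mic_b_combo / mic_b_alone

def interpret(fici):
    if fici <= 0.5:
        return "synergy"
    if fici > 4.0:
        return "antagonism"
    return "no interaction"

# Hypothetical checkerboard values for fosfomycin (A) + colistin (B);
# only the standalone MICs (0.12 and 0.25 ug/ml) come from the abstract.
fici = fic_index(mic_a_alone=0.12, mic_a_combo=0.015,
                 mic_b_alone=0.25, mic_b_combo=0.06)
print(round(fici, 3), interpret(fici))
```

With these illustrative values the index is 0.365, which the standard cut-offs would read as synergy, consistent with the time-kill observation reported above.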


The available virus-like particle (VLP)-based prophylactic vaccines against specific human papillomavirus (HPV) types afford close to 100% protection against the type-associated lesions and disease. Based on papillomavirus animal models, it is likely that protection against genital lesions in humans is mediated by HPV type-restricted neutralizing antibodies that transudate or exudate at the sites of genital infection. However, a correlate of protection was not established in the clinical trials because few disease cases occurred, and true incident infection could not be reliably distinguished from the emergence or reactivation of prevalent infection. In addition, the current assays for measuring vaccine-induced antibodies, even the gold standard HPV pseudovirion (PsV) in vitro neutralization assay, may not be sensitive enough to measure the minimum level of antibodies needed for protection. Here, we characterize the recently developed model of genital challenge with HPV PsV and determine the minimal amounts of VLP-induced neutralizing antibodies that can afford protection from genital infection in vivo after transfer into recipient mice. Our data show that serum antibody levels >100-fold lower than those detectable by in vitro PsV neutralization assays are sufficient to confer protection against an HPV PsV genital infection in this model. The results clearly demonstrate that, remarkably, the in vivo assay is substantially more sensitive than in vitro PsV neutralization and thus may be better suited for studies to establish correlates of protection.


Several methods and approaches for measuring parameters to determine fecal sources of pollution in water have been developed in recent years. No single microbial or chemical parameter has proved sufficient to determine the source of fecal pollution. Combinations of parameters involving at least one discriminating indicator and one universal fecal indicator offer the most promising solutions for qualitative and quantitative analyses. The universal (nondiscriminating) fecal indicator provides quantitative information regarding the fecal load. The discriminating indicator contributes to the identification of a specific source. The relative values of the parameters derived from both kinds of indicators could provide information regarding the contribution to the total fecal load from each origin. It is also essential that both parameters characteristically persist in the environment for similar periods. Numerical analysis, such as inductive learning methods, could be used to select the most suitable and the lowest number of parameters to develop predictive models. These combinations of parameters provide information on factors affecting the models, such as dilution, specific types of animal source, persistence of microbial tracers, and complex mixtures from different sources. The combined use of the enumeration of somatic coliphages and the enumeration of Bacteroides-phages using different host specific strains (one from humans and another from pigs), both selected using the suggested approach, provides a feasible model for quantitative and qualitative analyses of fecal source identification.
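The two-indicator idea above can be sketched numerically: a universal indicator fixes the total fecal load, and host-specific discriminating counts, scaled by assumed per-host ratios, apportion it among sources. All counts, ratios, and names below are hypothetical illustrations of the approach, not values from the cited work:

```python
def apportion_fecal_load(discriminating_counts, ratios):
    """Estimate the share of the total fecal load from each source.

    discriminating_counts -- host-specific Bacteroides-phage counts per source
    ratios                -- assumed discriminating/universal ratio typical of
                             pure feces of each host (would be calibrated
                             empirically, e.g. with inductive learning methods)
    Returns the estimated fraction of the fecal load attributable to each
    source (a sketch of the qualitative/quantitative combination).
    """
    loads = {src: discriminating_counts[src] / ratios[src]
             for src in discriminating_counts}
    total = sum(loads.values())
    return {src: load / total for src, load in loads.items()}

# Hypothetical water sample with human- and pig-specific phages detected:
shares = apportion_fecal_load(
    discriminating_counts={"human": 200.0, "pig": 50.0},
    ratios={"human": 0.02, "pig": 0.01},  # assumed per-host ratios
)
print(shares)  # human source dominates in this illustrative sample
```

In this toy example the human source accounts for two thirds of the load; the somatic coliphage count (the universal indicator) would then convert these fractions into absolute loads.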



BACKGROUND: Left atrial (LA) dilatation is associated with a large variety of cardiac diseases. Current cardiovascular magnetic resonance (CMR) strategies to measure LA volumes are based on multi-breath-hold multi-slice acquisitions, which are time-consuming and susceptible to misregistration. AIM: To develop a time-efficient single-breath-hold 3D CMR acquisition and reconstruction method to precisely measure LA volumes and function. METHODS: A highly accelerated compressed-sensing multi-slice cine sequence (CS-cineCMR) was combined with a non-model-based 3D reconstruction method to measure LA volumes with high temporal and spatial resolution during a single breath-hold. This approach was validated in LA phantoms of different shapes and applied in 3 patients. In addition, the influence of slice orientation on accuracy was evaluated in the LA phantoms for the new approach in comparison with a conventional model-based biplane area-length reconstruction. As a reference in patients, a self-navigated high-resolution whole-heart 3D dataset (3D-HR-CMR) was acquired during mid-diastole to yield accurate LA volumes. RESULTS: Phantom studies: LA volumes were accurately measured by CS-cineCMR with a mean difference of -4.73 ± 1.75 ml (-8.67 ± 3.54%, r2 = 0.94). For the new method, the calculated volumes were not significantly different when different orientations of the CS-cineCMR slices were applied to cover the LA phantoms. Long-axis "aligned" vs "not aligned" with the phantom long-axis yielded similar differences vs the reference volume (-4.87 ± 1.73 ml vs. -4.45 ± 1.97 ml, p = 0.67), as did short-axis "perpendicular" vs. "not perpendicular" to the LA long-axis (-4.72 ± 1.66 ml vs. -4.75 ± 2.13 ml; p = 0.98). The conventional biplane area-length method was susceptible to slice orientation (p = 0.0085 for the interaction of "slice orientation" and "reconstruction technique", 2-way ANOVA for repeated measures). 
To use the 3D-HR-CMR as the reference for LA volumes in patients, it was validated in the LA phantoms (mean difference: -1.37 ± 1.35 ml, -2.38 ± 2.44%, r2 = 0.97). Patient study: The CS-cineCMR LA volumes of the mid-diastolic frame matched closely with the reference LA volume (measured by 3D-HR-CMR), with a difference of -2.66 ± 6.5 ml (3.0% underestimation; true LA volumes: 63 ml, 62 ml, and 395 ml). Finally, high intra- and inter-observer agreement for maximal and minimal LA volume measurements was also shown. CONCLUSIONS: The proposed method combines a highly accelerated single-breath-hold compressed-sensing multi-slice CMR technique with a non-model-based 3D reconstruction to accurately and reproducibly measure LA volumes and function.
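For reference, the conventional biplane area-length reconstruction mentioned above computes chamber volume from two orthogonal long-axis areas and the chamber length using the standard formula V = 8·A1·A2/(3π·L). The sketch below shows the calculation with illustrative numbers, not study data:

```python
import math

def biplane_area_length_volume(area_2ch, area_4ch, length):
    """Biplane area-length chamber volume (standard formula):
        V = 8 * A1 * A2 / (3 * pi * L)
    area_2ch, area_4ch -- LA areas in two orthogonal long-axis views (cm^2)
    length             -- LA long-axis length, conventionally the shorter of
                          the two views (cm)
    Returns the volume in ml (cm^3)."""
    return 8.0 * area_2ch * area_4ch / (3.0 * math.pi * length)

# Illustrative values only (not measurements from the study):
v = biplane_area_length_volume(area_2ch=20.0, area_4ch=22.0, length=5.5)
print(f"{v:.1f} ml")
```

Because both areas and the length come from specific slice planes, a misoriented slice biases the result, which is the orientation sensitivity the phantom experiments above demonstrate for this method.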


Alpine tree-line ecotones are characterized by marked changes at small spatial scales that may result in a variety of physiognomies. A set of alternative individual-based models was tested with data from four contrasting Pinus uncinata ecotones in the central Spanish Pyrenees to reveal the minimal subset of processes required for tree-line formation. A Bayesian approach combined with Markov chain Monte Carlo methods was employed to obtain the posterior distribution of model parameters, allowing the use of model selection procedures. The main features of real tree lines emerged only in models considering nonlinear responses in individual rates of growth or mortality with respect to the altitudinal gradient. Variation in tree-line physiognomy reflected mainly changes in the relative importance of these nonlinear responses, while other processes, such as dispersal limitation and facilitation, played a secondary role. Different nonlinear responses also determined the presence or absence of krummholz, in agreement with recent findings highlighting a different response of diffuse and abrupt or krummholz tree lines to climate change. The method presented here can be widely applied in individual-based simulation models and will turn model selection and evaluation in this type of model into a more transparent, effective, and efficient exercise.
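The Bayesian machinery described above, obtaining posterior distributions of model parameters by Markov chain Monte Carlo, can be sketched with a minimal random-walk Metropolis sampler. The target below is a toy stand-in (a standard normal log-posterior), not the actual individual-based tree-line likelihood:

```python
import math
import random

def metropolis(log_post, x0, n_steps=5000, step=1.0, seed=1):
    """Random-walk Metropolis sampler: propose a Gaussian perturbation,
    accept with probability min(1, post(prop)/post(current)).
    Returns the chain of samples (a sketch of the MCMC machinery used
    to obtain posterior parameter distributions)."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_steps):
        prop = x + rng.gauss(0.0, step)
        lp_prop = log_post(prop)
        if math.log(rng.random()) < lp_prop - lp:  # accept/reject in log space
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Toy target: standard normal posterior for a single parameter.
samples = metropolis(lambda x: -0.5 * x * x, x0=0.0)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(round(mean, 2), round(var, 2))  # mean near 0, variance near 1
```

With the chains in hand, model selection (e.g. via posterior predictive checks or information criteria) compares alternative process subsets, which is how the study isolates the nonlinear growth and mortality responses.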


The kinematics of the anatomical shoulder are analysed and modelled as a parallel mechanism similar to a Stewart platform. A new method is proposed to describe the shoulder kinematics with minimal coordinates and solve the indeterminacy. The minimal coordinates are defined from bony landmarks and the scapulothoracic kinematic constraints. Independent of one another, they uniquely characterise the shoulder motion. A humanoid mechanism is then proposed with identical kinematic properties. It is then shown how minimal coordinates can be obtained for this mechanism and how the coordinates simplify both the motion-planning task and trajectory-tracking control. Lastly, the coordinates are also shown to have an application in the field of biomechanics, where they can be used to model the scapulohumeral rhythm.


Repair of segmental defects in load-bearing long bones is a challenging task because of the diversity of the loads affecting the area: axial, bending, shearing, and torsional forces all come together to test the stability and integrity of the bone. The natural biomechanical requirements for bone restorative materials include strength to withstand heavy loads and adaptivity to conform to a biological environment without disturbing or damaging it. Fiber-reinforced composite (FRC) materials have shown promise, as metals and ceramics have been too rigid, and polymers alone lack the strength needed for restoration. The versatility of fiber-reinforced composites also allows tailoring of the composite to meet the multitude of bone properties in the skeleton. The attachment and incorporation of a bone substitute to bone has been advanced by different surface modification methods. Most often this is achieved by the creation of surface texture, which allows bone growth onto the substitute, creating a mechanical interlocking. Another method is to alter the chemical properties of the surface to create bonding with the bone, for example with a hydroxyapatite (HA) or bioactive glass (BG) coating. A novel fiber-reinforced composite implant material with a porous surface was developed for bone substitution purposes in load-bearing applications. The material's biomechanical properties were tailored with unidirectional fiber reinforcement to match the strength of cortical bone. To advance bone growth onto the material, an optimal surface porosity was created by a dissolution process, and the addition of bioactive glass to the material was explored. The effects of dissolution and of the orientation of the fiber reinforcement were also evaluated for bone-bonding purposes. The biological response to the implant material was evaluated in a cell culture study to assure the safety of the materials combined. 
To test the material's properties in a clinical setting, an animal model was used. A critical-size bone defect in a rabbit's tibia was used to test the material in a load-bearing application, with short- and long-term follow-up and a histological evaluation of the incorporation into the host bone. The biomechanical results of the study showed that the material is durable and that the tailoring of its properties can be reproduced reliably. The biological response ex vivo to the created surface structure favours the attachment and growth of bone cells, with the additional benefit of bioactive glass appearing on the surface. No toxic reactions to possible agents leaching from the material could be detected in the cell culture study when compared to a nontoxic control material. The mechanical interlocking was, as expected, enhanced by the porosity, whereas the reinforcing fibers protruding from the surface of the implant gave additional strength when tested in a bone-bonding model. Animal experiments verified that the material is capable of withstanding load-bearing conditions in prolonged use without breaking or creating stress-shielding effects in the host bone. A histological examination verified the enhanced incorporation into the host bone, with an abundance of bone growth onto and over the material. This was achieved with minimal tissue reactions to a foreign body. An FRC implant with surface porosity displays potential in the field of reconstructive surgery, especially regarding large bone defects with high demands on strength and shape retention in load-bearing areas, or flat bones such as facial and cranial bones. The benefits of modifying the strength of the material and adjusting the surface properties with fiber reinforcement and bone-bonding additives to meet the requirements of different bone qualities are still to be fully discovered.


Recently, Small Modular Reactors (SMRs) have attracted increased public discussion. While large nuclear power plant new build projects are facing challenges, the focus of attention is turning to small modular reactors. One particular project challenge arises in the area of nuclear licensing, which plays a significant role in new build projects, affecting their quality as well as their costs and schedules. This dissertation, positioned in the field of nuclear engineering but also with a significant section in the field of systems engineering, examines nuclear licensing processes and their suitability for the characteristics of SMRs. The study investigates the licensing processes in selected countries, as well as in other safety-critical industry fields. Viewing the licensing processes and their separate licensing steps in terms of SMRs, the study adopts two different analysis theories for review and comparison. The primary data consist of a literature review, semi-structured interviews, and questionnaire responses concerning licensing processes and practices. The result of the study is a recommendation for a new, optimized licensing process for SMRs. The most important SMR-specific feature, in terms of licensing, is the modularity of the design. Here, modularity refers to multi-module SMR designs, which create new challenges in the licensing process. As this study focuses on Finland, the main features of the new licensing process are adapted to the current Finnish licensing process, aiming to achieve the main benefits with minimal modifications to the current process. The application of the new licensing process is developed using Systems Engineering, Requirements Management, and Project Management practices and tools. Nuclear licensing involves a large amount of data and documentation, which needs to be managed in a suitable manner throughout the new build project and then during the whole life cycle of the nuclear power plant. 
To enable a smooth licensing process and therefore ensure the success of the new build nuclear power plant project, management processes and practices play a significant role. This study contributes to the theoretical understanding of how licensing processes are structured and how they are put into action in practice. The findings clarify the suitability of different licensing processes and their selected licensing steps for SMR licensing. The results combine the most suitable licensing steps into a new licensing process for SMRs. The results are also extended to the concept of licensing management practices and tools.


With the shift towards many-core computer architectures, dataflow programming has been proposed as one potential solution for producing software that scales to a varying number of processor cores. Programming for parallel architectures is considered difficult, as the current popular programming languages are inherently sequential and introducing parallelism is typically up to the programmer. Dataflow, however, is inherently parallel, describing an application as a directed graph, where nodes represent calculations and edges represent data dependencies in the form of queues. These queues are the only allowed communication between the nodes, making the dependencies between the nodes explicit and thereby also the parallelism. Once a node has sufficient inputs available, it can, independently of any other node, perform calculations, consume inputs, and produce outputs. Dataflow models have existed for several decades and have become popular for describing signal processing applications, as the graph representation is a very natural representation within this field: digital filters are typically described with boxes and arrows, also in textbooks. Dataflow is also becoming more interesting in other domains, and in principle, any application working on an information stream fits the dataflow paradigm. Such applications include, among others, network protocols, cryptography, and multimedia applications. As an example, the MPEG group standardized a dataflow language called RVC-CAL to be used within reconfigurable video coding. Describing a video coder as a dataflow network instead of with conventional programming languages makes the coder more readable, as it describes how the video data flows through the different coding tools. While dataflow provides an intuitive representation for many applications, it also introduces some new problems that need to be solved in order for dataflow to be more widely used. 
The explicit parallelism of a dataflow program is descriptive and enables improved utilization of the available processing units; however, the independent nodes also imply that some kind of scheduling is required. The need for efficient scheduling becomes even more evident when the number of nodes is larger than the number of processing units and several nodes are running concurrently on one processor core. There exist several dataflow models of computation, with different trade-offs between expressiveness and analyzability. These vary from rather restricted but statically schedulable, with minimal scheduling overhead, to dynamic, where each firing requires a firing rule to be evaluated. The model used in this work, namely RVC-CAL, is a very expressive language, and in the general case it requires dynamic scheduling; however, the strong encapsulation of dataflow nodes enables analysis, and the scheduling overhead can be reduced by using quasi-static, or piecewise static, scheduling techniques. The scheduling problem is concerned with finding the few scheduling decisions that must be made at run time, while most decisions are pre-calculated. The result is then a set of static schedules, as small as possible, that are dynamically scheduled. To identify these dynamic decisions and to find the concrete schedules, this thesis shows how quasi-static scheduling can be represented as a model checking problem. This involves identifying the relevant information needed to generate a minimal but complete model to be used for model checking. The model must describe everything that may affect scheduling of the application while omitting everything else in order to avoid state space explosion. This kind of simplification is necessary to make the state space analysis feasible. For the model checker to find the actual schedules, a set of scheduling strategies is defined which is able to produce quasi-static schedulers for a wide range of applications. 
The results of this work show that actor composition with quasi-static scheduling can be used to transform dataflow programs to fit many different computer architectures with different types and numbers of cores. This, in turn, enables dataflow to provide a more platform-independent representation, as one application can be fitted to a specific processor architecture without changing the actual program representation. Instead, the program representation is optimized by the development tools, in the context of design space exploration, to fit the target platform. This work focuses on representing the dataflow scheduling problem as a model checking problem and is implemented as part of a compiler infrastructure. The thesis also presents experimental results as evidence of the usefulness of the approach.
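The firing behaviour described above, where a node fires once sufficient tokens are queued on its inputs, consuming them and producing outputs independently of other nodes, can be sketched as follows. This is an illustrative Python toy, not RVC-CAL syntax:

```python
from collections import deque

class Actor:
    """Minimal dataflow node: fires when every input queue holds at least
    `rate` tokens, consuming them and appending one result to its output
    queue. The queues are the only communication between nodes, so the
    data dependencies (and the parallelism) are explicit."""
    def __init__(self, fn, n_inputs, rate=1):
        self.fn = fn
        self.inputs = [deque() for _ in range(n_inputs)]
        self.output = deque()
        self.rate = rate

    def can_fire(self):
        # The firing rule: enough tokens on every input.
        return all(len(q) >= self.rate for q in self.inputs)

    def fire(self):
        # Consume `rate` tokens per input, produce one output token.
        args = [[q.popleft() for _ in range(self.rate)] for q in self.inputs]
        self.output.append(self.fn(*args))

# A two-input adder actor; a quasi-static scheduler would pre-calculate
# how often such a node can fire for a given token pattern.
add = Actor(lambda a, b: a[0] + b[0], n_inputs=2)
add.inputs[0].extend([1, 2])
add.inputs[1].extend([10, 20])
while add.can_fire():
    add.fire()
print(list(add.output))  # [11, 22]
```

The `while add.can_fire()` loop is exactly the dynamic-scheduling overhead the thesis targets: when the token rates are statically known, the firing-rule checks can be replaced by a pre-calculated static schedule.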


Electroacupuncture has been proposed as a low-cost and practical method that allows effective pain management with minimal side effects. In this study we examined the effect of electroacupuncture on the hyperalgesia developed in a model of post-incisional pain in rats. A 1-cm longitudinal incision was made through the skin and fascia of the plantar region of the animal's hind paw. Mechanical hyperalgesia at the incision was evaluated 135 min after the surgery with von Frey filaments. The tension threshold was reduced from 75 g (upper limit of the test) to 1.36 ± 0.36 g (mean ± SEM) in control rats. It is shown that a 15-min period of electroacupuncture applied 120 min after surgery to the Zusanli (ST36) and Sanyinjiao (SP6) points, but not to non-acupoints, produces a significant and long-lasting reduction of the mechanical hyperalgesia induced by the surgical incision of the plantar surface of the ipsilateral hind paw. The tension threshold was reduced from 75 to 27.6 ± 4.2 g in animals soon after the end of electroacupuncture. The mechanical threshold in this group was about 64% less than in control. Electroacupuncture was ineffective in rats treated 10 min earlier with naloxone (1 mg/kg, ip), thus confirming the involvement of opioid mechanisms in the antinociceptive effects of this procedure. The results indicate that post-incisional pain is a useful model for studying the anti-hyperalgesic properties of electroacupuncture in laboratory animals.


Software is a key component in many of the devices and products that we use every day. Most customers demand not only that their devices function as expected but also that the software be of high quality: reliable, fault tolerant, efficient, etc. In short, it is not enough that a calculator gives the correct result of a calculation; we want the result instantly, in the right form, with minimal use of battery, etc. One of the key aspects of succeeding in today's industry is delivering high quality. In most software development projects, high-quality software is achieved by rigorous testing and good quality assurance practices. However, today, customers are asking for these high-quality software products at an ever-increasing pace. This leaves companies with less time for development. Software testing is an expensive activity, because it requires much manual work. Testing, debugging, and verification are estimated to consume 50 to 75 per cent of the total development cost of complex software projects. Further, the most expensive software defects are those which have to be fixed after the product is released. One of the main challenges in software development is reducing the associated cost and time of software testing without sacrificing the quality of the developed software. It is often not enough to only demonstrate that a piece of software is functioning correctly; usually, many other aspects of the software, such as performance, security, scalability, and usability, need also to be verified. Testing these aspects of the software is traditionally referred to as non-functional testing. One of the major challenges with non-functional testing is that it is usually carried out at the end of the software development process, when most of the functionality is implemented. This is due to the fact that non-functional aspects, such as performance or security, apply to the software as a whole. In this thesis, we study the use of model-based testing. 
We present approaches to automatically generate tests from behavioral models to address some of these challenges. We show that model-based testing is applicable not only to functional testing but also to non-functional testing. In its simplest form, performance testing is performed by executing multiple test sequences at once while observing the software in terms of responsiveness and stability rather than its output. The main contribution of the thesis is a coherent model-based testing approach for testing functional and performance-related issues in software systems. We show how we go from system models, expressed in the Unified Modeling Language, to test cases and back to models again. The system requirements are traced throughout the entire testing process. Requirements traceability facilitates finding faults in the design and implementation of the software. In the research field of model-based testing, many newly proposed approaches suffer from poor tool support or the lack of it. Therefore, the second contribution of this thesis is proper tool support for the proposed approach, integrated with leading industry tools. We offer independent tools, tools that are integrated with other industry-leading tools, and complete tool chains when necessary. Many model-based testing approaches proposed by the research community suffer from poor empirical validation in an industrial context. In order to demonstrate the applicability of our proposed approach, we apply our research to several systems, including industrial ones.
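The core idea of generating tests from a behavioral model can be sketched as a systematic walk over a state machine: every event sequence reachable from the initial state up to a bounded depth becomes a test case. The session model and function below are hypothetical illustrations, not the UML-based toolchain of the thesis:

```python
def generate_tests(transitions, start, max_depth=3):
    """Breadth-first enumeration of event sequences from a state-machine
    model -- the simplest form of model-based test generation.
    transitions: {state: {event: next_state}}
    Returns every event sequence of length 1..max_depth, each of which
    can be executed against the system under test."""
    tests, frontier = [], [(start, [])]
    for _ in range(max_depth):
        next_frontier = []
        for state, path in frontier:
            for event, nxt in sorted(transitions.get(state, {}).items()):
                seq = path + [event]
                tests.append(seq)
                next_frontier.append((nxt, seq))
        frontier = next_frontier
    return tests

# Hypothetical model of a login session: idle <-> active.
model = {
    "idle":   {"login": "active"},
    "active": {"logout": "idle", "ping": "active"},
}
tests = generate_tests(model, "idle", max_depth=2)
print(tests)  # [['login'], ['login', 'logout'], ['login', 'ping']]
```

For performance testing in the sense described above, the same generated sequences would simply be executed concurrently while the system's responsiveness and stability, rather than its functional output, are observed.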