944 results for mixed verification methods
Abstract:
The use of digital communication systems is increasing rapidly, owing to their lower implementation cost compared with analogue transmission and to the ease with which several types of source (data, digitised speech, video, etc.) can be mixed. The emergence of packet broadcast techniques as an efficient form of multiplexing, especially with the use of contention random multiple access protocols, has led to widespread application of these distributed access protocols in local area networks (LANs) and to their further extension to radio and mobile radio communications. This research proposes a modified version of the distributed access contention protocol that uses the packet broadcast switching technique. Carrier sense multiple access with collision avoidance (CSMA/CA) is found to be the most appropriate protocol, with the ability to satisfy equally the operational requirements of local area networks and of radio and mobile radio applications. The suggested version of the protocol is designed so that all desirable features of its precedents are maintained, while their shortcomings are eliminated and additional features are added to strengthen its ability to work with radio and mobile radio channels. The operational performance of the protocol has been evaluated for the non-persistent and slotted non-persistent variants through mathematical and simulation modelling. The agreement between the two modelling procedures validates the accuracy of both methods, and the protocol compares favourably with its precedent, CSMA/CD (with collision detection). A further extension of the protocol to multichannel operation has been suggested, and two multichannel systems based on the CSMA/CA protocol for medium access are therefore proposed.
These are the dynamic multichannel system, which is based on two types of channel selection, random choice (RC) and idle choice (IC), and the sequential multichannel system. The latter has been proposed in order to suppress the effect of the hidden terminal, which always represents a major problem when contention random multiple access protocols are used over radio and mobile radio channels. Performance verification has been carried out using mathematical modelling for the dynamic system and simulation modelling for the sequential system. Both systems are found to improve system operation and fault tolerance when compared with single-channel operation.
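The thesis derives its own throughput expressions for CSMA/CA; as a point of reference only, the classical Kleinrock–Tobagi throughput formulas for non-persistent and slotted non-persistent CSMA (the two variants evaluated above) can be sketched as follows, where G is the offered load and a the normalised propagation delay:

```python
import math

def nonpersistent_csma_throughput(G, a):
    """Classical (Kleinrock-Tobagi) throughput of unslotted non-persistent
    CSMA: S = G*exp(-a*G) / (G*(1 + 2a) + exp(-a*G))."""
    return G * math.exp(-a * G) / (G * (1 + 2 * a) + math.exp(-a * G))

def slotted_nonpersistent_csma_throughput(G, a):
    """Slotted non-persistent variant:
    S = a*G*exp(-a*G) / (1 - exp(-a*G) + a)."""
    return a * G * math.exp(-a * G) / (1 - math.exp(-a * G) + a)
```

These baselines (not the thesis's modified protocol) illustrate how throughput degrades as the propagation delay parameter a grows, which is the regime where collision avoidance pays off.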
Abstract:
There is an increasing emphasis on the use of software to control safety critical plants for a wide area of applications. The importance of ensuring the correct operation of such potentially hazardous systems points to an emphasis on the verification of the system relative to a suitably secure specification. However, the process of verification is often made more complex by the concurrency and real-time considerations which are inherent in many applications. A response to this is the use of formal methods for the specification and verification of safety critical control systems. These provide a mathematical representation of a system which permits reasoning about its properties. This thesis investigates the use of the formal method Communicating Sequential Processes (CSP) for the verification of a safety critical control application. CSP is a discrete event based process algebra which has a compositional axiomatic semantics that supports verification by formal proof. The application is an industrial case study which concerns the concurrent control of a real-time high speed mechanism. It is seen from the case study that the axiomatic verification method employed is complex. It requires the user to have a relatively comprehensive understanding of the nature of the proof system and the application. By making a series of observations the thesis notes that CSP possesses the scope to support a more procedural approach to verification in the form of testing. This thesis investigates the technique of testing and proposes the method of Ideal Test Sets. By exploiting the underlying structure of the CSP semantic model it is shown that for certain processes and specifications the obligation of verification can be reduced to that of testing the specification over a finite subset of the behaviours of the process.
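The core idea of reducing verification to testing a specification over a finite subset of a process's behaviours can be illustrated with a minimal sketch. The toy transition system, events, and trace predicate below are hypothetical, and CSP's trace semantics are only loosely imitated (trace sets here are prefix-closed, as in CSP, but nothing deeper is modelled):

```python
def traces(step, state, depth):
    """Enumerate the (prefix-closed) set of traces of a transition system
    up to a bounded depth. step(state) -> iterable of (event, next_state)."""
    if depth == 0:
        return [()]
    result = [()]
    for event, nxt in step(state):
        result += [(event,) + t for t in traces(step, nxt, depth - 1)]
    return result

# Toy process: strictly alternates 'req' and 'ack' events.
def step(state):
    return [('ack', 0)] if state else [('req', 1)]

# Trace specification: in every prefix, acks never outnumber requests.
def spec(trace):
    return all(trace[:i + 1].count('req') >= trace[:i + 1].count('ack')
               for i in range(len(trace)))
```

Checking `spec` over `traces(step, 0, n)` for a suitable finite bound n is the testing obligation; the method of Ideal Test Sets concerns when such a finite check actually discharges the full verification obligation.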
Abstract:
The adsorption and diffusion of mixed hydrocarbon components in silicalite have been studied using molecular dynamics simulation methods. We have investigated the effect of molecular loading and temperature on the diffusional behavior of both pure and mixed alkane components. For binary mixtures with components of similar sizes, molecular diffusional behavior in the channels was observed to reverse as loading is increased. This behavior was noticeably absent for components of different sizes in the mixture. Methane molecules in the methane/propane mixture have the highest diffusion coefficients across the entire loading range. Binary mixtures containing ethane molecules prove more difficult to separate compared to other binary components. In the ternary mixture, however, ethane molecules diffuse much faster at 400 K in the channel, with a tendency to separate out quickly from other components. © 2005 Elsevier Inc. All rights reserved.
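Diffusion coefficients in such studies are conventionally extracted from the mean-squared displacement via the Einstein relation, MSD(t) ≈ 6Dt in three dimensions. A minimal single-origin sketch (real MD analysis averages over time origins and molecules, which this toy omits):

```python
def diffusion_coefficient(positions, dt):
    """Estimate a self-diffusion coefficient from one 3-D trajectory via the
    Einstein relation MSD(t) ~ 6*D*t, by a least-squares fit of MSD against
    time through the origin. positions: list of (x, y, z) sampled every dt."""
    x0, y0, z0 = positions[0]
    num = den = 0.0
    for i, (x, y, z) in enumerate(positions[1:], start=1):
        t = i * dt
        msd = (x - x0) ** 2 + (y - y0) ** 2 + (z - z0) ** 2
        num += msd * t          # normal equations for msd = (6*D) * t
        den += t * t
    return num / (6.0 * den)
```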
Abstract:
The verification task of proving the equivalence of two descriptions of the same device is examined for the case in which one of the descriptions is partially defined. In this case, the task reduces to checking whether the logical descriptions are equivalent on the domain of the incompletely defined one. A simulation-based approach to solving this task for different vector forms of description representation is proposed. The proposed methods rest on fast Boolean computations over Boolean and ternary vectors of large size.
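The bit-parallel flavour of this check can be sketched as follows: pack each function's truth table into a machine word (a Boolean vector), encode the defined domain of the partial description as a care mask (the role a ternary vector plays), and compare only where the mask is set. This is a minimal illustration, not the paper's algorithm:

```python
def truth_table(f, n):
    """Pack an n-input Boolean function into an integer bit vector:
    bit k holds f evaluated on the binary expansion of k."""
    tt = 0
    for k in range(1 << n):
        bits = [(k >> i) & 1 for i in range(n)]
        if f(*bits):
            tt |= 1 << k
    return tt

def equivalent_on_care_set(tt_a, tt_b, care_mask):
    """The descriptions are equivalent iff they agree on every input
    assignment where the incompletely defined one is defined."""
    return (tt_a ^ tt_b) & care_mask == 0
```

One XOR and one AND over whole vectors decide equivalence for up to word-width assignments at once, which is what makes the simulation-based approach fast.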
Abstract:
This article describes an approach that allows information systems to be developed without regard to the details of physical storage in the relational model or the type of database management system. Described in terms of a graph model, the approach supports the construction of several algorithms, for example for verification of the application domain. The theory has been put into trial operation as part of the METAS CASE system.
Abstract:
The paper describes a learning-oriented interactive method for solving linear mixed integer problems of multicriteria optimization. The method increases the possibilities of the decision maker (DM) to describe his/her local preferences and at the same time it overcomes some computational difficulties, especially in problems of large dimension. The method is realized in an experimental decision support system for finding the solution of linear mixed integer multicriteria optimization problems.
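One standard way an interactive method turns a decision maker's local preferences into a solution choice is scalarization: the DM's weights steer which point of the integer-feasible set is selected. The weighted Chebyshev rule below, over a toy enumerated bi-objective feasible set, is a generic illustration of this building block, not the paper's specific method:

```python
def chebyshev_best(points, weights, ideal):
    """Among feasible objective vectors (all to be minimised), pick the one
    minimising the weighted Chebyshev distance to the ideal point. The
    weights encode the decision maker's local preferences."""
    def dist(p):
        return max(w * abs(v - z) for w, v, z in zip(weights, p, ideal))
    return min(points, key=dist)
```

Re-running the selection with updated weights after each dialogue step is what makes such methods "learning-oriented": the DM explores trade-offs without re-solving the full mixed integer model from scratch.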
Abstract:
This paper advances a philosophically informed rationale for the broader, reflexive and practical application of arts-based methods to benefit research, practice and pedagogy. It addresses the complexity and diversity of learning and knowing, foregrounding a cohabitative position and recognition of a plurality of research approaches, tailored and responsive to context. Appreciation of art and aesthetic experience is situated in the everyday, underpinned by multi-layered exemplars of pragmatic visual-arts narrative inquiry undertaken in the third, creative and communications sectors. Discussion considers semi-guided use of arts-based methods as a conduit for topic engagement, reflection and intersubjective agreement; alongside observation and interpretation of organically employed approaches used by participants within daily norms. Techniques span handcrafted (drawing), digital (photography), hybrid (cartooning), performance dimensions (improvised installations) and music (metaphor and structure). The process of creation, the artefact/outcome produced and experiences of consummation are all significant, with specific reflexivity impacts. Exploring methodology and epistemology, both the "doing" and its interpretation are explicated to inform method selection, replication, utility, evaluation and development of cross-media skills literacy. Approaches are found engaging, accessible and empowering, with nuanced capabilities to alter relationships with phenomena, experiences and people. By building a discursive space that reduces barriers; emancipation, interaction, polyphony, letting-go and the progressive unfolding of thoughts are supported, benefiting ways of knowing, narrative (re)construction, sensory perception and capacities to act. This can also present underexplored researcher risks in respect to emotion work, self-disclosure, identity and agenda. 
The paper therefore elucidates complex, intricate relationships between form and content, the represented and the representation or performance, researcher and participant, and the self and other. This benefits understanding of phenomena including personal experience, sensitive issues, empowerment, identity, transition and liminality. Observations are relevant to qualitative and mixed methods researchers and a multidisciplinary audience, with explicit identification of challenges, opportunities and implications.
Abstract:
Reliability modelling and verification is indispensable in modern manufacturing, especially for product development risk reduction. Based on a discussion of the deficiencies of traditional reliability modelling methods for process reliability, a novel modelling method is presented herein that draws upon a knowledge network of process scenarios based on the analytic network process (ANP). An integration framework for manufacturing process reliability and product quality is presented together with a product development and reliability verification process. According to their roles in manufacturing processes, key characteristics (KCs) are organised into four clusters, namely product KCs, material KCs, operation KCs and equipment KCs, which together represent the process knowledge network of manufacturing processes. A mathematical model and algorithm are developed for calculating the reliability requirements of KCs with respect to different manufacturing process scenarios. A case study on valve-sleeve component manufacturing demonstrates the application of the new reliability modelling and verification procedure to the management and deployment of production resources.
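In ANP, cluster-level priorities are conventionally obtained by raising a column-stochastic supermatrix to its limit; a power-iteration sketch of that standard step (the supermatrix values below are made up for illustration, not taken from the case study) is:

```python
def anp_priorities(W, iters=500):
    """Limiting priorities of a column-stochastic ANP supermatrix W
    (list of rows, each column summing to 1), via power iteration.
    The result is the stationary weight of each element/cluster."""
    n = len(W)
    v = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(W[i][j] * v[j] for j in range(n)) for i in range(n)]
        s = sum(v)                      # guard against numerical drift
        v = [x / s for x in v]
    return v
```

With the KC clusters as nodes, such limiting priorities are one way dependence between product, material, operation and equipment KCs can be folded into a single reliability weighting.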
Abstract:
This paper describes work carried out to develop methods of verifying that machine tools are capable of machining parts to within specification, immediately before carrying out critical material removal operations, and with negligible impact on process times. A review of machine tool calibration and verification technologies identified that current techniques were not suitable because they demand significant time and skilled human intervention. A 'solution toolkit' is presented consisting of a selection of circular tests and artefact probing, which are able to rapidly verify the kinematic errors, and in some cases also the dynamic errors, of different types of machine tool, as well as supplementary methods for tool and spindle error detection. A novel artefact probing process is introduced which simplifies data processing so that the process can be readily automated using only the native machine tool controller. Laboratory testing and industrial case studies are described which demonstrate the effectiveness of this approach.
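A basic data-processing step shared by circular tests and artefact probing is fitting a reference circle to probed points, so that deviations from it expose geometric error. A least-squares (Kåsa) circle fit, shown here as a generic sketch rather than the paper's controller-resident algorithm, reduces to one 3×3 linear solve:

```python
def solve3(A, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(3):
        p = max(range(c, 3), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, 3):
            f = M[r][c] / M[c][c]
            for k in range(c, 4):
                M[r][k] -= f * M[c][k]
    x = [0.0] * 3
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][k] * x[k] for k in range(r + 1, 3))) / M[r][r]
    return x

def fit_circle(points):
    """Kasa least-squares circle fit: minimise the residual of
    x^2 + y^2 + D*x + E*y + F = 0 and return (cx, cy, radius)."""
    A = [[0.0] * 3 for _ in range(3)]
    b = [0.0] * 3
    for x, y in points:
        row = (x, y, 1.0)
        rhs = -(x * x + y * y)
        for i in range(3):
            for j in range(3):
                A[i][j] += row[i] * row[j]
            b[i] += row[i] * rhs
    D, E, F = solve3(A, b)
    cx, cy = -D / 2.0, -E / 2.0
    return cx, cy, (cx * cx + cy * cy - F) ** 0.5
```

The residuals of the probed points against the fitted circle, rather than the fit itself, carry the verification information (out-of-roundness, servo mismatch, squareness).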
Abstract:
Design verification in the digital domain, using model-based principles, is a key research objective to address the industrial requirement for reduced physical testing and prototyping. For complex assemblies, the verification of design and the associated production methods is currently fragmented, prolonged and sub-optimal, as it uses digital and physical verification stages that are deployed in a sequential manner using multiple systems. This paper describes a novel, hybrid design verification methodology that integrates model-based variability analysis with measurement data of assemblies, in order to reduce simulation uncertainty and allow early design verification from the perspective of satisfying key assembly criteria.
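The hybrid idea of feeding measurement data back into model-based variability analysis can be sketched with a toy linear stack-up: a Monte Carlo simulation of an assembly gap, where replacing a component's assumed distribution with statistics from measured parts reduces the simulated uncertainty. The component names and numbers below are illustrative assumptions, not from the paper:

```python
import random

def assembly_gap_samples(dists, n=10000, seed=1):
    """Monte Carlo variability analysis of a linear stack-up
    gap = A - B - C. dists maps component name -> (mean, std-dev);
    substituting measured statistics for a component's design assumption
    updates the predicted gap distribution."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        a = rng.gauss(*dists['A'])
        b = rng.gauss(*dists['B'])
        c = rng.gauss(*dists['C'])
        samples.append(a - b - c)
    return samples
```

Tightening one component's distribution from measurement (e.g. B below) narrows the predicted gap spread, which is the sense in which measurement data reduces simulation uncertainty.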
Abstract:
The verification and validation of engineering designs are of primary importance as they directly influence production performance and ultimately define product functionality and customer perception. Research in aspects of verification and validation is widespread, ranging from tools employed during the digital design phase to methods deployed for prototype verification and validation. This paper reviews the standard definitions of verification and validation in the context of engineering design and progresses to provide a coherent analysis and classification of these activities from preliminary design, to design in the digital domain and the physical verification and validation of products and processes. The scope of the paper includes aspects of system design and demonstrates how complex products are validated in the context of their lifecycle. Industrial requirements are highlighted and research trends and priorities identified. © 2010 CIRP.
Abstract:
Our modular approach to data hiding is an innovative concept in the data hiding research field. It enables the creation of modular digital watermarking methods that have extendable features and are designed for use in web applications. The methods consist of two types of modules – a basic module and an application-specific module. The basic module mainly provides features which are connected with the specific image format. As JPEG is a preferred image format on the Internet, we have put a focus on the achievement of a robust and error-free embedding and retrieval of the embedded data in JPEG images. The application-specific modules are adaptable to user requirements in the concrete web application. The experimental results of the modular data watermarking are very promising. They indicate excellent image quality, satisfactory size of the embedded data and perfect robustness against JPEG transformations with prespecified compression ratios. ACM Computing Classification System (1998): C.2.0.
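To illustrate the embedding idea only (this is a deliberately simplified toy, not the paper's modular or JPEG-robust method): one classical approach hides payload bits in the parity of quantised DCT coefficients, skipping magnitudes 0 and ±1 so that the carrier positions survive the modification:

```python
def embed_bits(coeffs, bits):
    """Toy parity embedding: force |coefficient| parity at eligible
    positions (|c| >= 2, so eligibility is unchanged by embedding)
    to carry one payload bit each."""
    out = list(coeffs)
    positions = [i for i, c in enumerate(out) if c not in (0, 1, -1)]
    for pos, bit in zip(positions, bits):
        if abs(out[pos]) % 2 != bit:
            out[pos] += 1 if out[pos] > 0 else -1  # move away from zero
    return out

def extract_bits(coeffs, n):
    """Recover the first n payload bits from coefficient parities."""
    positions = [i for i, c in enumerate(coeffs) if c not in (0, 1, -1)]
    return [abs(coeffs[i]) % 2 for i in positions[:n]]
```

A production scheme such as the one described would additionally need format-aware handling of JPEG entropy coding and robustness to requantisation at the prespecified compression ratios.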
Abstract:
Heuristics, simulation, artificial intelligence techniques and combinations thereof have all been employed in the attempt to make computer systems adaptive, context-aware, reconfigurable and self-managing. This paper complements such efforts by exploring the possibility to achieve runtime adaptiveness using mathematically-based techniques from the area of formal methods. It is argued that formal methods @ runtime represents a feasible approach, and promising preliminary results are summarised to support this viewpoint. The survey of existing approaches to employing formal methods at runtime is accompanied by a discussion of their challenges and of the future research required to overcome them. © 2011 Springer-Verlag.
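A common concrete instance of formal methods at runtime is runtime verification: a safety property, given as a finite automaton, is checked against the event stream of the executing system. The property, states and events below are hypothetical examples:

```python
class RuntimeMonitor:
    """Minimal runtime-verification sketch: step a property automaton on
    each observed event; reaching the designated error state flags a
    violation the moment it occurs."""
    def __init__(self, transitions, start, error):
        self.transitions = transitions  # (state, event) -> next state
        self.state = start
        self.error = error

    def observe(self, event):
        # Undeclared transitions are treated as violations.
        self.state = self.transitions.get((self.state, event), self.error)
        return self.state != self.error
```

Coupling such monitors to reconfiguration logic (switch strategy, degrade gracefully) is one route from detection to the runtime adaptiveness the paper targets.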
Abstract:
The objective was to identify evidence to support use of specific harms for the development of a children and young people's safety thermometer (CYPST). We searched PubMed, Web of Knowledge, and Cochrane Library post-1999 for studies in pediatric settings about pain, skin integrity, extravasation injury, and use of pediatric early warning scores (PEWS). Following screening, nine relevant articles were included. Convergent synthesis methods were used drawing on thematic analysis to combine findings from studies using a range of methods (qualitative, quantitative, and mixed methods). A review of PEWS was identified so other studies on this issue were excluded. No relevant studies about extravasation injury were identified. The synthesized results therefore focused on pain and skin integrity. Measurement and perception of pain were complex and not always carried out according to best practice. Skin abrasions were common and mostly associated with device related injuries. The findings demonstrate a need for further work on perceptions of pain and effective communication of concerns about pain between parents and nursing staff. Strategies for reducing device-related injuries warrant further research focusing on prevention. Together with the review of PEWS, these synthesized findings support the inclusion of pain, skin integrity, and PEWS in the CYPST.
Abstract:
OBJECTIVE: The aim of this meta-analysis was to compare the efficacy and safety of infliximab-biosimilar and other available biologicals for the treatment of rheumatoid arthritis (RA), namely abatacept, adalimumab, certolizumab pegol, etanercept, golimumab, infliximab, rituximab and tocilizumab. METHODS: A systematic literature review of MEDLINE database until August 2013 was carried out to identify relevant randomized controlled trials (RCTs). Bayesian mixed treatment comparison method was applied for the pairwise comparison of treatments. Improvement rates by the American College of Rheumatology criteria (ACR20 and ACR50) at week 24 were used as efficacy endpoints, and the occurrence of serious adverse events was considered to assess the safety of the biologicals. RESULTS: Thirty-six RCTs were included in the meta-analysis. All the biological agents proved to be superior to placebo. For ACR20 response, certolizumab pegol showed the highest odds ratio (OR) compared to placebo, OR 7.69 [95 % CI 3.69-14.26], followed by abatacept OR 3.7 [95 % CI 2.17-6.06], tocilizumab OR 3.69 [95 % CI 1.87-6.62] and infliximab-biosimilar OR 3.47 [95 % CI 0.85-9.7]. For ACR50 response, certolizumab pegol showed the highest OR compared to placebo OR 8.46 [3.74-16.82], followed by tocilizumab OR 5.57 [95 % CI 2.77-10.09], and infliximab-biosimilar OR 4.06 [95 % CI 1.01-11.54]. Regarding the occurrence of serious adverse events, the results show no statistically significant difference between infliximab-biosimilar and placebo, OR 1.87 [95 % CI 0.74-3.84]. No significant difference regarding efficacy and safety was found between infliximab-biosimilar and the other biological treatments. CONCLUSION: This is the first indirect meta-analysis in RA that compares the efficacy and safety of biosimilar-infliximab to the other biologicals indicated in RA. We found no significant difference between infliximab-biosimilar and other biological agents in terms of clinical efficacy and safety.
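The paper uses a Bayesian mixed treatment comparison; its simplest frequentist building block, the Bucher adjusted indirect comparison, shows how two treatments are compared through a common comparator when no head-to-head trial exists. The numbers in the test are illustrative, not the paper's data:

```python
import math

def indirect_or(or_ab, ci_ab, or_cb, ci_cb):
    """Bucher adjusted indirect comparison on the log-odds-ratio scale:
    estimate OR of A vs C from trials of A vs B and C vs B that share
    comparator B. ci_* are 95% confidence intervals (lo, hi).
    Returns (point estimate, 95% CI)."""
    def log_se(ci):
        lo, hi = ci
        return (math.log(hi) - math.log(lo)) / (2 * 1.96)
    d = math.log(or_ab) - math.log(or_cb)          # log ORs subtract
    se = (log_se(ci_ab) ** 2 + log_se(ci_cb) ** 2) ** 0.5  # variances add
    return math.exp(d), (math.exp(d - 1.96 * se), math.exp(d + 1.96 * se))
```

Because the variances of the two trial estimates add, indirect comparisons are inherently less precise than direct ones, which is why wide intervals (as for infliximab-biosimilar above) often preclude declaring differences.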