908 results for Verification and validation technology


Relevance: 100.00%

Abstract:

This paper summarises test results that were used to validate a model and scale-up procedure for the high pressure grinding roll (HPGR) developed at the JKMRC by Morrell et al. [Morrell, Lim, Tondo, David, 1996. Modelling the high pressure grinding rolls. In: Mining Technology Conference, pp. 169-176.]. Verification of the model is based on results from four data sets that describe the performance of three industrial-scale units fitted with both studded and smooth roll surfaces. The industrial units are currently in operation within the diamond mining industry and are represented by De Beers, BHP Billiton and Rio Tinto. Ore samples from the De Beers and BHP Billiton operations were sent to the JKMRC for ore characterisation and HPGR laboratory-scale tests. Rio Tinto contributed a historical data set of tests completed during a previous research project. The results show that modelling of the HPGR process has matured to a point where the model may be used to evaluate new comminution circuits and to optimise existing ones. The model prediction of product size distribution is good and has been found to be strongly dependent on the characteristics of the material being tested. The prediction of throughput and corresponding power draw (based on throughput) is sensitive to inconsistent gap/diameter ratios observed between laboratory-scale tests and full-scale operations. (C) 2004 Elsevier Ltd. All rights reserved.

Relevance: 100.00%

Abstract:

Workflow systems have traditionally focused on so-called production processes, which are characterized by pre-definition, high volume, and repetitiveness. Recently, the deployment of workflow systems in non-traditional domains such as collaborative applications, e-learning and cross-organizational process integration has put forth new requirements for flexible and dynamic specification. However, this flexibility cannot be offered at the expense of control, a critical requirement of business processes. In this paper, we present a foundation set of constraints for flexible workflow specification. These constraints are intended to provide an appropriate balance between flexibility and control. The constraint specification framework is based on the concept of pockets of flexibility, which allows ad hoc changes and/or building of workflows for highly flexible processes. Basically, our approach is to provide the ability to execute on the basis of a partially specified model, where the full specification of the model is made at runtime and may be unique to each instance. The verification of dynamically built models is essential. Whereas ensuring that the model conforms to specified constraints does not pose great difficulty, ensuring that the constraint set itself does not carry conflicts and redundancy is an interesting and challenging problem. In this paper, we provide a discussion of both the static and dynamic verification aspects. We also briefly present Chameleon, a prototype workflow engine that implements these concepts. (c) 2004 Elsevier Ltd. All rights reserved.
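The static verification problem described in this abstract, checking a constraint set for conflicts and redundancy, can be illustrated with a minimal encoding. The following is a hypothetical sketch and not the Chameleon implementation: constraints are reduced to simple precedence pairs, a conflict is a cycle in the precedence graph, and a constraint is redundant when the transitive closure of the remaining constraints already implies it.

```python
def has_conflict(constraints):
    """Return True if precedence constraints ("a before b" pairs) contain a cycle."""
    graph = {}
    for a, b in constraints:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set())
    WHITE, GREY, BLACK = 0, 1, 2
    colour = {n: WHITE for n in graph}

    def dfs(n):
        colour[n] = GREY
        for m in graph[n]:
            if colour[m] == GREY:        # back edge: the constraints loop
                return True
            if colour[m] == WHITE and dfs(m):
                return True
        colour[n] = BLACK
        return False

    return any(colour[n] == WHITE and dfs(n) for n in graph)


def redundant(constraints):
    """Constraints already implied by the transitive closure of the others."""
    out = []
    for c in constraints:
        rest = [x for x in constraints if x != c]
        graph = {}
        for a, b in rest:
            graph.setdefault(a, set()).add(b)
        # reachability from c's source in the graph without c
        seen, stack = set(), [c[0]]
        while stack:
            n = stack.pop()
            for m in graph.get(n, ()):
                if m not in seen:
                    seen.add(m)
                    stack.append(m)
        if c[1] in seen:
            out.append(c)
    return out
```

Real workflow constraints (inclusion, exclusion, cardinality) need a richer encoding, but the same closure-based reasoning applies.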

Relevance: 100.00%

Abstract:

A complete workflow specification requires careful integration of many different process characteristics. Decisions must be made as to the definitions of individual activities, their scope, the order of execution that maintains the overall business process logic, the rules governing the discipline of work-list scheduling to performers, identification of time constraints, and more. The goal of this paper is to address an important issue in workflow modelling and specification: data flow, its modelling, specification and validation. Researchers have neglected this dimension of process analysis for some time, mainly focussing on structural considerations with limited verification checks. In this paper, we identify and justify the importance of data modelling in overall workflow specification and verification. We illustrate and define several potential data flow problems that, if not detected prior to workflow deployment, may prevent the process from executing correctly, cause it to execute on inconsistent data, or even lead to process suspension. A discussion of essential requirements of the workflow data model in order to support data validation is also given.
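One of the data flow problems this abstract alludes to, an activity reading a data item that no upstream activity has produced, can be detected with a simple forward pass over a sequential workflow. The encoding and function name below are illustrative assumptions, not the authors' model:

```python
def dataflow_problems(activities):
    """activities: ordered list of (name, reads, writes).
    Returns (activity, item) pairs where an item is read before
    any earlier activity in the sequence has written it."""
    available = set()   # data items produced so far
    missing = []
    for name, reads, writes in activities:
        for item in reads:
            if item not in available:
                missing.append((name, item))
        available |= set(writes)
    return missing
```

A production validator would also have to handle branching, parallelism and redundant or conflicting writes, but the forward-availability idea is the same.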

Relevance: 100.00%

Abstract:

The results of empirical studies are limited to particular contexts, difficult to generalise, and the studies themselves are expensive to perform. Despite these problems, empirical studies in software engineering can be made effective, and they are important to both researchers and practitioners. The key to their effectiveness lies in maximising the information that can be gained by examining existing studies, conducting power analyses for an accurate minimum sample size, and benefiting from previous studies through replication. This approach was applied in a controlled experiment examining the combination of automated static analysis tools and code inspection in the context of verification and validation (V&V) of concurrent Java components. The combination of these V&V technologies was shown to be cost-effective despite the limited size of the study, a result that contributes to research in V&V technology evaluation.
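The power analysis mentioned above can be sketched with the standard normal approximation for a two-sample comparison at a given standardized effect size (Cohen's d). The function below is an illustrative aid, not the study's actual procedure:

```python
from math import ceil
from statistics import NormalDist

def min_sample_size(effect_size, alpha=0.05, power=0.8):
    """Normal-approximation minimum n per group for a two-sample,
    two-sided comparison at standardized effect size d."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)   # critical value for the two-sided test
    z_beta = z(power)            # quantile matching the desired power
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)
```

For a medium effect (d = 0.5) at the conventional alpha = 0.05 and 80% power this gives 63 participants per group; the exact t-based answer is slightly larger.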

Relevance: 100.00%

Abstract:

Distance learning is growing and transforming educational institutions. The increasing use of distance learning by higher education institutions, and particularly community colleges, coupled with the higher level of student attrition in online courses than in traditional classrooms, suggests that increased attention should be paid to factors that affect online student course completion. The purpose of the study was to develop and validate an instrument to predict community college online student course completion based on faculty perceptions, yielding a prediction model of online course completion rates. Social Presence and Media Richness theories were used to develop a theoretically driven measure of online course completion. This research study involved surveying 311 community college faculty who taught at least one online course in the past 2 years. Email addresses of participating faculty were provided by two south Florida community colleges. Each participant was contacted through email, and a link to an Internet survey was given. The survey response rate was 63% (192 out of 303 available questionnaires). Data were analyzed through factor analysis, alpha reliability, and multiple regression. The exploratory factor analysis using principal component analysis with varimax rotation yielded a four-factor solution that accounted for 48.8% of the variance. Consistent with Social Presence theory, the factors, with their percent of variance in parentheses, were: immediacy (21.2%), technological immediacy (11.0%), online communication and interactivity (10.3%), and intimacy (6.3%). Internal consistency of the four factors was calculated using Cronbach's (1951) alpha, with reliability coefficients ranging between .680 and .828. Multiple regression analysis yielded a model that significantly predicted 11% of the variance of the dependent variable, the percentage of students who completed the online course.
As indicated in the literature (Johnson & Keil, 2002; Newberry, 2002), Media Richness theory appears to be closely related to Social Presence theory. However, elements from this theory did not emerge in the factor analysis.
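Cronbach's alpha, used above for the internal-consistency coefficients, can be computed directly from item-score columns. A minimal stdlib-only sketch; the data layout (one list of scores per item, same respondents in the same order) is an assumption for illustration:

```python
def cronbach_alpha(items):
    """items: list of item-score columns of equal length.
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # each respondent's total score across all items
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))
```

Two perfectly correlated items give alpha = 1.0; weaker inter-item correlation pulls the coefficient down.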

Relevance: 100.00%

Abstract:

International audience

Relevance: 100.00%

Abstract:

Computer simulation programs are essential tools for scientists and engineers to understand a particular system of interest. As expected, the complexity of the software increases with the depth of the model used. In addition to the exigent demands of software engineering, verification of simulation programs is especially challenging because the models represented are complex and ridden with unknowns that will be discovered by developers in an iterative process. To manage such complexity, advanced verification techniques for continually matching the intended model to the implemented model are necessary. Therefore, the main goal of this research work is to design a useful verification and validation framework that is able to identify model representation errors and is applicable to generic simulators. The framework that was developed and implemented consists of two parts. The first part is First-Order Logic Constraint Specification Language (FOLCSL) that enables users to specify the invariants of a model under consideration. From the first-order logic specification, the FOLCSL translator automatically synthesizes a verification program that reads the event trace generated by a simulator and signals whether all invariants are respected. The second part consists of mining the temporal flow of events using a newly developed representation called State Flow Temporal Analysis Graph (SFTAG). While the first part seeks an assurance of implementation correctness by checking that the model invariants hold, the second part derives an extended model of the implementation and hence enables a deeper understanding of what was implemented. The main application studied in this work is the validation of the timing behavior of micro-architecture simulators. The study includes SFTAGs generated for a wide set of benchmark programs and their analysis using several artificial intelligence algorithms. 
This work improves the computer architecture research and verification processes as shown by the case studies and experiments that have been conducted.
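The first part of the framework, checking model invariants against the event trace a simulator emits, can be sketched as follows. The invariant shown (an instruction must be fetched before it commits) and all names are illustrative assumptions, not the FOLCSL synthesizer's output:

```python
def check_invariants(trace, invariants):
    """trace: list of (event, payload) tuples emitted by a simulator.
    invariants: dict mapping invariant name -> predicate over the trace.
    Returns the names of all violated invariants."""
    return [name for name, pred in invariants.items() if not pred(trace)]


def fetch_before_commit(trace):
    """Example invariant: every committed instruction was fetched earlier."""
    fetched = set()
    for event, instr in trace:
        if event == "fetch":
            fetched.add(instr)
        elif event == "commit" and instr not in fetched:
            return False
    return True
```

The first-order-logic specification language described above would compile each formula into a trace predicate of this shape automatically.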

Relevance: 100.00%

Abstract:

Over 2 million Anterior Cruciate Ligament (ACL) injuries occur annually worldwide, resulting in considerable economic and health burdens (e.g., suffering, surgery, loss of function, risk for re-injury, and osteoarthritis). Current screening methods are effective, but they generally rely on expensive and time-consuming biomechanical movement analysis and are thus impractical solutions. In this dissertation, I report on a series of studies that begins to investigate one potentially efficient alternative to biomechanical screening, namely skilled observational risk assessment (e.g., having experts estimate risk based on observations of athletes' movements). Specifically, in Study 1 I discovered that ACL injury risk can be accurately and reliably estimated with nearly instantaneous visual inspection when observed by skilled and knowledgeable professionals. Modern psychometric optimization techniques were then used to develop a robust and efficient 5-item test of ACL injury risk prediction skill, the ACL Injury-Risk-Estimation Quiz (ACL-IQ). Study 2 cross-validated the results from Study 1 in a larger representative sample of both skilled (Exercise Science/Sports Medicine) and un-skilled (General Population) groups. In accord with research on human expertise, quantitative structural and process modeling of risk estimation indicated that superior performance was largely mediated by specific strategies and skills (e.g., ignoring irrelevant information), independent of domain-general cognitive abilities (e.g., mental rotation, general decision skill). These cognitive models suggest that ACL-IQ is a trainable skill, providing a foundation for future research and applications in training, decision support, and ultimately clinical screening investigations. Overall, I present the first evidence that observational ACL injury risk prediction is possible, including a robust technology for fast, accurate and reliable measurement, the ACL-IQ.
Discussion focuses on applications and outreach including a web platform that was developed to house the test, provide a repository for further data collection, and increase public and professional awareness and outreach (www.ACL-IQ.org). Future directions and general applications of the skilled movement analysis approach are also discussed.

Relevance: 100.00%

Abstract:

Efficient numerical models facilitate the study and design of solid oxide fuel cells (SOFCs), stacks, and systems. Whilst the accuracy and reliability of the computed results are usually sought by researchers, the corresponding modelling complexities can result in practical difficulties regarding implementation flexibility and computational cost. The main objective of this article is to adapt a simple but viable numerical tool for evaluation of our experimental rig. Accordingly, a model for a multi-layer SOFC surrounded by a constant-temperature furnace is presented, trained and validated against experimental data. The model consists of a four-layer structure including stand, two interconnects, and PEN (Positive electrode-Electrolyte-Negative electrode), each being approximated by a lumped parameter model. The heating process through the surrounding chamber is also considered. We used a set of V-I characteristics data for parameter adjustment, followed by model verification against two independent sets of data. The model results show good agreement with practical data, offering a significant improvement over reduced models in which the impact of external heat loss is neglected. Furthermore, thermal analysis for adiabatic and non-adiabatic processes is carried out to capture the thermal behaviour of a single cell, followed by a polarisation loss assessment. Finally, model-based design of experiment is demonstrated for a case study.
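The parameter-adjustment step against V-I characteristics can be illustrated with the simplest possible polarisation model, an ohmic-dominated straight line V = OCV - R*i fitted by least squares. This is a didactic sketch with hypothetical names, far simpler than the four-layer lumped thermal model described above:

```python
def fit_vi(currents, voltages):
    """Least-squares fit of the linear polarisation sketch V = ocv - r * i.
    Returns the fitted open-circuit voltage and area-specific resistance."""
    n = len(currents)
    mi = sum(currents) / n
    mv = sum(voltages) / n
    # slope of V vs i is -r, so negate the usual least-squares slope
    r = -sum((i - mi) * (v - mv) for i, v in zip(currents, voltages)) / \
        sum((i - mi) ** 2 for i in currents)
    ocv = mv + r * mi
    return ocv, r
```

Activation and concentration losses make real V-I curves nonlinear, so an actual SOFC model would fit those terms as well; the principle of adjusting parameters to one data set and verifying against independent data is unchanged.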

Relevance: 100.00%

Abstract:

This study aimed to identify the parameters related to the expression of reactivity in horses during handling and, based on these, to propose and validate a composite measure reactivity score scale to characterize horses' reactivity. To this end, the first stage (S1) proposed the scale and the second (S2) validated it. In S1, 364 Lusitano horses were evaluated: 188 were adult breeding mares (4–12 years old), and 176 were foals (males/females, aged from 2 months to 2 years). During hoof trimming, vermifuge application, and palpation, scores were assigned to behaviors of movement, ears and eyes position, breathing, vocalization, and urination. A response parameter called reactivity was attributed to each animal, ranging from score 1 (nonreactive/calm) to score 4 (very reactive/aggressive). The verification of the possible parameters (age, behavior) that explain the response parameter (reactivity) was carried out using an ordinal proportional odds model. Movement, breathing, ears and eyes position, vocalization, and age appear to explain the reactivity of horses during handling (P < .01). Therefore, based on these parameters, it was possible to propose two composite measure reactivity score scales: one to characterize the mares and another the foals. In S2, the proposed scale was validated by the simultaneous application of the Forced Human Approach Test, another commonly used test to evaluate reactivity in horses, with a correlation of 0.97 (P < .05). The assessment of the reactivity of horses during handling by a composite measure reactivity score scale is valid and easy to apply, without disrupting the daily routine, and it overrides the impact of individual differences.
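As a rough illustration of how a composite measure combines behavioral sub-scores into one category, the sketch below averages hypothetical 1-4 sub-scores and clamps the result to the 1-4 range. The equal weighting is an assumption for illustration only; the paper's scales are derived from the ordinal proportional odds model, not a plain average:

```python
def composite_reactivity(movement, breathing, ears_eyes, vocalization):
    """Hypothetical composite score: average the 1-4 behavioral sub-scores,
    round to the nearest whole category, and clamp to the 1-4 range."""
    subs = [movement, breathing, ears_eyes, vocalization]
    score = round(sum(subs) / len(subs))
    return max(1, min(4, score))
```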

Relevance: 100.00%

Abstract:

This manuscript reports the overall development of a Ph.D. research project during the “Mechanics and advanced engineering sciences” course at the Department of Industrial Engineering of the University of Bologna. The project is focused on the development of a combustion control system for an innovative Spark Ignited engine layout. In detail, the controller is oriented to manage a prototypal engine equipped with a Port Water Injection system. The water injection technology allows an increase in combustion efficiency due to the knock mitigation effect, which permits keeping the combustion phasing closer to the optimal position with respect to the traditional layout. At the beginning of the project, the effects and the possible benefits achievable by water injection were investigated in a focused experimental campaign. The data obtained by combustion analysis were then processed to design a control-oriented combustion model. The model identifies the correlation between Spark Advance, combustion phasing and injected water mass; two different strategies are presented, both based on an analytic and semi-empirical approach and therefore compatible with a real-time application. The model has been implemented in a combustion controller that manages water injection to reach the best achievable combustion efficiency while keeping knock levels under a pre-established threshold. Three different versions of the algorithm are described in detail. This controller was designed and pre-calibrated in a software-in-the-loop environment, and an experimental validation was later performed with a rapid control prototyping approach to highlight the performance of the system on a real set-up. To make the strategy implementable in an onboard application, an estimation algorithm for combustion phasing, necessary for the controller, was developed during the last phase of the Ph.D. course, based on accelerometric signals.

Relevance: 100.00%

Abstract:

The genera Cochliomyia and Chrysomya contain both obligate and saprophagous flies, which allows the comparison of different feeding habits between closely related species. Among the different strategies for comparing these habits is the use of qPCR to investigate the expression levels of candidate genes involved in feeding behavior. To ensure an accurate measure of the levels of gene expression, it is necessary to normalize the amount of the target gene with the amount of a reference gene having a stable expression across the compared species. Since there is no universal gene that can be used as a reference in functional studies, candidate genes for qPCR data normalization were selected and validated in three Calliphoridae (Diptera) species: Cochliomyia hominivorax Coquerel, Cochliomyia macellaria Fabricius, and Chrysomya albiceps Wiedemann. The expression stability of six genes (Actin, Gapdh, Rp49, Rps17, α-tubulin, and GstD1) was evaluated among species within the same life stage and between life stages within each species. The expression levels of Actin, Gapdh, and Rp49 were the most stable among the selected genes. These genes can be used as reliable reference genes for functional studies in Calliphoridae using similar experimental settings.
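The normalization and stability assessment described above can be sketched in a few lines: candidate reference genes are ranked by the spread of their Ct values across samples (lower spread suggesting more stable expression), and a target gene is normalized to a reference with the classic 2^-dCt transform. This is a simplified illustration with made-up Ct values, not the validation procedure used in the study:

```python
from statistics import pstdev

def stability_ranking(ct_values):
    """ct_values: dict mapping gene name -> list of Ct measurements
    across samples. Genes are ranked by Ct spread, most stable first."""
    return sorted(ct_values, key=lambda gene: pstdev(ct_values[gene]))

def relative_expression(ct_target, ct_reference):
    """Classic 2^-dCt normalisation of a target gene to a reference gene."""
    return 2 ** -(ct_target - ct_reference)
```

Dedicated tools such as geNorm or NormFinder use more elaborate stability measures, but the ranking idea is the same.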

Relevance: 100.00%

Abstract:

The fluid flow over bodies with complex geometry has been the subject of research by many scientists and has been widely explored experimentally and numerically. The present study proposes an Eulerian Immersed Boundary Method for flow simulations over stationary or moving rigid bodies. The proposed method allows the use of Cartesian meshes. Here, two-dimensional simulations of fluid flow over stationary and oscillating circular cylinders were used for verification and validation. Four different cases were explored: the flow over a stationary cylinder, the flow over a cylinder oscillating in the flow direction, the flow over a cylinder oscillating in the direction normal to the flow, and a cylinder with angular oscillation. The time integration was carried out by a classical 4th-order Runge-Kutta scheme, with a time step of the same order as the distance between two consecutive points in the x direction. High-order compact finite difference schemes were used to calculate spatial derivatives. The drag and lift coefficients, the lock-in phenomenon and vorticity contour plots were used for the verification and validation of the proposed method. Extending the current method to bodies with different geometries and to three-dimensional simulations is straightforward. The results obtained show good agreement with both numerical and experimental results, encouraging the use of the proposed method.
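The classical 4th-order Runge-Kutta scheme mentioned for time integration advances the solution of dy/dt = f(t, y) one step at a time. A minimal scalar sketch (the immersed-boundary forcing and spatial discretisation are omitted):

```python
def rk4_step(f, t, y, h):
    """One classical 4th-order Runge-Kutta step for dy/dt = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
```

For dy/dt = y with y(0) = 1, one step of size 0.1 reproduces e^0.1 to about 8 significant digits, illustrating the scheme's 4th-order accuracy.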