827 results for "Verification and validation technology"

Relevance:

100.00%

Publisher:

Abstract:

National Centre for Aquatic Animal Health, Cochin University of Science and Technology


Globalization and liberalization, with the entry of many prominent foreign manufacturers, changed the automobile scenario in India from the early 1990s. Manufacturers such as Ford, General Motors, Honda, Toyota, Suzuki, Hyundai, Renault, Mitsubishi, Benz, BMW, Volkswagen and Nissan set up manufacturing units in India in joint ventures with Indian counterpart companies, making use of the Foreign Direct Investment policy of the Government of India. These manufacturers began to win over Indian car customers with technologically innovative product features, quality and reliability. The multiplicity of choices available to Indian passenger car buyers drastically changed the car purchase scenario in India, and particularly in the State of Kerala, transforming the automobile scene from a sellers' market into a buyers' market. Car customers started developing personal preferences and purchasing patterns hitherto unknown in the Indian automobile segment. The main purpose of this paper is to identify the parameters that influence the consumer purchase behaviour of passenger car owners in the State of Kerala and to develop a framework, so that further research can build on the framework and the identified parameters.


Cochin University Of Science And Technology


The activated sludge system is the biological treatment most widely used worldwide for wastewater purification. Its performance depends on the correct operation of both the biological reactor and the secondary settler. When the sedimentation phase does not proceed correctly, non-settled biomass escapes with the effluent and impacts the receiving environment. Solids separation problems are currently one of the main causes of inefficiency in the operation of activated sludge systems worldwide. They include filamentous bulking, viscous bulking, biological foaming, dispersed growth, pin-point floc and uncontrolled denitrification. Separation problems generally originate in an imbalance between the main communities of microorganisms involved in biomass sedimentation: floc-forming bacteria and filamentous bacteria. Because of this microbiological origin, their identification and control is not an easy task for plant managers. Knowledge-Based Decision Support Systems (KBDSS) are a group of software tools characterized by their capacity to represent heuristic knowledge and handle large quantities of data. The objective of this thesis is the development and validation of a KBDSS specifically designed to support plant managers in the control of solids separation problems of microbiological origin in activated sludge systems. To achieve this main objective, the KBDSS must present the following characteristics: (1) its implementation must be feasible and realistic to guarantee correct operation; (2) its reasoning must be dynamic and evolving to adapt to the needs of the target domain; and (3) its reasoning must be intelligent.
First, to guarantee the feasibility of the system, a small-scale study (Catalonia) was carried out to determine the variables most commonly used for the diagnosis and monitoring of these problems and the most viable control methods, and to detect the main limitations the system would have to overcome. Results from previous applications have shown that the main limitation in the development of KBDSSs is the structure of the knowledge base (KB), where all the knowledge acquired about the domain is represented together with the reasoning processes to follow. In our case, given the dynamics of the domain, these limitations could be aggravated if the design were not optimal. To this end, the Domino Model was proposed as a tool for the conceptual design of the system. Finally, in line with the last objective of intelligent reasoning, an Expert System (based on expert knowledge) and a Case-Based Reasoning system (based on experience) were integrated as the main intelligent systems in charge of the reasoning of the KBDSS. Chapters 5 and 6 present, respectively, the development of the dynamic Expert System (ES) and of the temporal Case-Based Reasoning system, called the Episode-Based Reasoning System (EBRS). Chapter 7 gives details of the implementation of the overall system (KBDSS) in the G2 environment. Chapter 8 presents the results obtained during 11 months of validation, in which aspects such as the accuracy, capacity and usefulness of the system were validated both experimentally (prior to implementation) and through its real implementation at the Girona WWTP. Finally, Chapter 9 lists the main conclusions of this thesis.
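The hybrid reasoning the thesis combines, expert rules plus retrieval of past episodes, can be sketched in miniature. Everything below (thresholds, feature names, stored cases) is invented for illustration and is not taken from the thesis:

```python
# Toy sketch of the two reasoning modes integrated in the KBDSS:
# a rule-based expert system and case-based (experience) retrieval.

def expert_diagnosis(svi, filament_index):
    """Hypothetical expert rule: a high sludge volume index together with
    abundant filaments suggests filamentous bulking (thresholds illustrative)."""
    if svi > 150 and filament_index >= 4:
        return "filamentous bulking"
    if svi > 150:
        return "viscous bulking"
    return "normal operation"

def retrieve_similar_case(query, case_base):
    """Case-based step: return the stored episode closest to the query
    (Euclidean distance over the features present in the query)."""
    def dist(case):
        return sum((case[k] - query[k]) ** 2 for k in query) ** 0.5
    return min(case_base, key=dist)

cases = [
    {"svi": 180, "filament_index": 5, "action": "add chlorine to RAS"},
    {"svi": 90,  "filament_index": 1, "action": "no action"},
]
query = {"svi": 170, "filament_index": 4}
print(expert_diagnosis(**query))                      # rule-based conclusion
print(retrieve_similar_case(query, cases)["action"])  # experience-based suggestion
```

In the real system the two conclusions would be reconciled by the KBDSS rather than reported independently.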


It is a fact that functional verification (FV) is paramount within the hardware design cycle. With so many new techniques available today to help with FV, which should we really use? The answer is not straightforward and is often confusing and costly. The tools and techniques to be used in a project have to be decided upon early in the design cycle to get the best value from these new verification methods. This paper gives a quick survey, in the form of an overview of FV, establishes the difference between verification and validation, describes the bottlenecks that appear in the verification process, examines the challenges in FV, and presents current FV technologies and trends.
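The verification/validation distinction the paper establishes can be illustrated with a toy example (ours, not the paper's): verification checks the design against its specification, here exhaustively, which is feasible only for tiny designs, while validation would ask whether the specification itself matches the user's intent:

```python
# Toy functional verification: exhaustively check a 4-bit adder model
# against its arithmetic specification. All names here are illustrative.

def adder4(a, b):
    """Device under test: 4-bit adder returning (sum, carry_out)."""
    total = a + b
    return total & 0xF, (total >> 4) & 0x1

def verify_adder4():
    """Exhaustive functional verification against the spec: for every
    input pair, carry_out*16 + sum must equal a + b."""
    for a in range(16):
        for b in range(16):
            s, c = adder4(a, b)
            assert c * 16 + s == a + b, f"mismatch at {a}+{b}"
    return True

print(verify_adder4())  # True: the model meets its specification
```

Real FV replaces exhaustive loops with constrained-random stimulus, coverage metrics and formal methods, precisely because the input space of real designs cannot be enumerated.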


Throughout the industrial processes of sheet metal manufacturing and refining, shear cutting is widely used for its speed and cost advantages over competing cutting methods. Industrial shears may include some force measurement capability, but the measured force is most likely influenced by friction losses between the shear tool and the point of measurement, and in general does not show the actual force applied to the sheet. Well-defined shears and accurate measurements of force and shear tool position are important for understanding the influence of shear parameters. Accurate experimental data are also necessary for calibrating numerical shear models. Here, a dedicated laboratory set-up with well-defined geometry and movement in the shear, and high measurability in terms of force and geometry, is designed, built and verified. Parameters important to the shear process are studied with perturbation analysis techniques, and requirements on input parameter accuracy are formulated to meet experimental output demands. Input parameters in shearing are mostly geometric, but also include material properties and contact conditions. Based on the accuracy requirements, a symmetric experiment with internal balancing of forces is constructed to avoid guides and the corresponding friction losses. Finally, the experimental procedure is validated through shearing of a medium-grade steel. With the obtained set-up performance, force changes resulting from changes in the studied input parameters are distinguishable down to a level of 1%.
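The perturbation-analysis step, deriving input accuracy requirements from an output resolution target, can be sketched numerically. The force model and parameter values below are hypothetical stand-ins, not the paper's:

```python
# Sketch of perturbation analysis: estimate the relative sensitivity of the
# output force to each input parameter, so input accuracy requirements can
# be set from the 1% output resolution target. Toy model, illustrative only.

def shear_force(clearance, thickness, strength):
    """Hypothetical smooth model of maximum shear force."""
    return strength * thickness * (1.0 - 0.5 * clearance / thickness)

def relative_sensitivity(f, params, name, h=1e-6):
    """Central-difference estimate of d(ln F)/d(ln p) for parameter `name`."""
    p = dict(params)
    base = p[name]
    p[name] = base * (1 + h); hi = f(**p)
    p[name] = base * (1 - h); lo = f(**p)
    return (hi - lo) / (2 * h * f(**params))

params = {"clearance": 0.1, "thickness": 2.0, "strength": 400.0}
for name in params:
    s = relative_sensitivity(shear_force, params, name)
    # a 1% output resolution demands roughly 1%/|s| relative accuracy on this input
    print(f"{name}: sensitivity {s:+.3f}")
```

Parameters with |sensitivity| near 1 (here strength and thickness) need input accuracy comparable to the output target, while weakly coupled parameters tolerate coarser measurement.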


Objective: to translate the Venous Leg Ulcer Quality of Life Questionnaire (VLU-QoL), adapt it culturally to Brazilian Portuguese, and validate it with patients of the Hospital das Clínicas of the Botucatu Medical School (FMB), Universidade Estadual Paulista (Unesp). Methods: the questionnaire was translated by a professional translator and by two dermatologists specialized in venous ulcers (VU), and revised in a meeting of the three translators. The construct (VLU-QoL-Br) underwent pre-interviews with 10 VU patients for language adaptation. It was then applied to HC-Unesp patients, with a test-retest to verify its reproducibility. Results: 82 patients were evaluated, 56 (68%) of them female, with a mean age of 67.3 years. The questionnaire was translated, adapted and applied to the patients. The construct showed high internal consistency (alpha = 0.94) and adequate item-total correlation. In the 32 retests, an intraclass correlation for agreement of 0.78 (p < 0.01) was observed, indicating good reproducibility of the construct. Confirmatory factor analysis corroborated the dimensions of the original questionnaire: activities, psychological, and symptoms. VLU-QoL-Br scores were independently associated with total ulcer area and with lower schooling of the subjects (p < 0.01). Conclusion: the translation, adaptation and validation of the VLU-QoL-Br questionnaire showed good psychometric performance, allowing its clinical use in Brazil. It is important to evaluate its performance in other regions and in different samples of individuals.
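The internal-consistency statistic reported above (Cronbach's alpha = 0.94) is computed from item and total-score variances; the sketch below uses made-up item scores for illustration:

```python
# Cronbach's alpha from scratch: items is a list of per-item score lists,
# one inner list per questionnaire item, all over the same respondents.
# The scores below are invented, not the VLU-QoL-Br data.

def cronbach_alpha(items):
    def var(xs):  # sample variance (ddof = 1)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    k = len(items)
    totals = [sum(resp) for resp in zip(*items)]      # total score per respondent
    item_var = sum(var(i) for i in items)
    return k / (k - 1) * (1 - item_var / var(totals))

# Three illustrative items answered by five respondents
items = [
    [3, 4, 2, 5, 4],
    [3, 5, 2, 4, 4],
    [2, 4, 3, 5, 3],
]
print(round(cronbach_alpha(items), 2))  # 0.87
```

Values above roughly 0.9, like the 0.94 reported, indicate that the items measure a common construct consistently.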


High-resolution, well-calibrated records of lake sediments are critically important for quantitative climate reconstructions, but they remain a methodological and analytical challenge. While several comprehensive paleotemperature reconstructions have been developed across Europe, only a few quantitative high-resolution studies exist for precipitation. Here we present a calibration and verification study of lithoclastic sediment proxies from proglacial Lake Oeschinen (46°30′N, 7°44′E, 1,580 m a.s.l., north-west Swiss Alps) that are sensitive to rainfall for the period AD 1901–2008. We collected two sediment cores, one in 2007 and another in 2011. The sediments are characterized by two facies: (A) mm-laminated clastic varves and (B) turbidites. The annual character of the laminae couplets was confirmed by radiometric dating (210Pb, 137Cs) and independent flood-layer chronomarkers. Individual varves consist of a dark sand-size spring-summer layer enriched in siliciclastic minerals and a lighter clay-size calcite-rich winter layer. Three subtypes of varves are distinguished: Type I with a 1–1.5 mm fining upward sequence; Type II with a distinct fine-sand base up to 3 mm thick; and Type III containing multiple internal microlaminae caused by individual summer rainstorm deposits. Delta-fan surface samples and sediment trap data fingerprint different sediment source areas and transport processes from the watershed and confirm the instant response of sediment flux to rainfall and erosion. Based on a highly accurate, precise and reproducible chronology, we demonstrate that sediment accumulation (varve thickness) is a quantitative predictor for cumulative boreal alpine spring (May–June) and spring/summer (May–August) rainfall (r_MJ = 0.71, r_MJJA = 0.60, p < 0.01). Bootstrap-based verification of the calibration model reveals root mean squared errors of prediction (RMSEP_MJ = 32.7 mm, RMSEP_MJJA = 57.8 mm) on the order of 10–13% of mean MJ and MJJA cumulative precipitation, respectively. These results highlight the potential of the Lake Oeschinen sediments for high-resolution reconstructions of past rainfall conditions in the northern Swiss Alps, central and eastern France and south-west Germany.
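The bootstrap verification idea, fitting the calibration on resampled years and scoring prediction error on the held-out (out-of-bag) years, can be sketched with synthetic data; the numbers below are invented, not the Lake Oeschinen record:

```python
# Sketch of bootstrap RMSEP for a linear calibration of cumulative rainfall
# on varve thickness. Synthetic toy data, illustrative only.
import random

def fit_line(xs, ys):
    """Ordinary least squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return b, my - b * mx

def bootstrap_rmsep(xs, ys, n_boot=500, seed=1):
    """Refit on bootstrap resamples; collect squared prediction errors on
    the out-of-bag observations; return the root mean squared error."""
    random.seed(seed)
    sq_errs = []
    idx = range(len(xs))
    for _ in range(n_boot):
        sample = [random.choice(list(idx)) for _ in idx]
        oob = [i for i in idx if i not in sample]
        if not oob:
            continue
        b, a = fit_line([xs[i] for i in sample], [ys[i] for i in sample])
        sq_errs += [(ys[i] - (a + b * xs[i])) ** 2 for i in oob]
    return (sum(sq_errs) / len(sq_errs)) ** 0.5

# synthetic varve thickness (mm) and cumulative May-June rainfall (mm)
thick = [1.0, 1.4, 0.8, 2.1, 1.7, 1.2, 0.9, 1.9, 1.5, 1.1]
rain  = [180, 230, 150, 330, 280, 200, 160, 300, 250, 190]
print(round(bootstrap_rmsep(thick, rain), 1), "mm")
```

Dividing the RMSEP by the mean rainfall gives the relative error figure (the 10–13% quoted in the abstract).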


The technique of Abstract Interpretation has allowed the development of very sophisticated global program analyses which are at the same time provably correct and practical. We present in a tutorial fashion a novel program development framework which uses abstract interpretation as a fundamental tool. The framework uses modular, incremental abstract interpretation to obtain information about the program. This information is used to validate programs, to detect bugs with respect to partial specifications written using assertions (in the program itself and/or in system libraries), to generate and simplify run-time tests, and to perform high-level program transformations such as multiple abstract specialization, parallelization, and resource usage control, all in a provably correct way. In the case of validation and debugging, the assertions can refer to a variety of program points such as procedure entry, procedure exit, points within procedures, or global computations. The system can reason with much richer information than, for example, traditional types. This includes data structure shape (including pointer sharing), bounds on data structure sizes, and other operational variable instantiation properties, as well as procedure-level properties such as determinacy, termination, nonfailure, and bounds on resource consumption (time or space cost). CiaoPP, the preprocessor of the Ciao multi-paradigm programming system, which implements the described functionality, will be used to illustrate the fundamental ideas.
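In the same tutorial spirit, the core mechanism can be shown on the classic sign domain; this toy example is ours, and CiaoPP works with far richer abstract domains than this:

```python
# Abstract interpretation over the sign domain {NEG, ZERO, POS, TOP}:
# execute a tiny program on abstract values instead of concrete numbers.

NEG, ZERO, POS, TOP = "neg", "zero", "pos", "top"

def abs_sign(n):
    return ZERO if n == 0 else (POS if n > 0 else NEG)

def abs_mul(a, b):
    if ZERO in (a, b):
        return ZERO
    if TOP in (a, b):
        return TOP
    return POS if a == b else NEG

def abs_add(a, b):
    if a == ZERO:
        return b
    if b == ZERO:
        return a
    return a if a == b else TOP  # pos + neg: sign unknown, go to top

# Abstractly execute  y = x * x; z = y + 1  for every possible sign of x
for x_sign in (NEG, ZERO, POS):
    y = abs_mul(x_sign, x_sign)
    z = abs_add(y, abs_sign(1))
    print(x_sign, "->", z)  # z is provably positive for every input sign
```

The analysis proves, without running the program on any concrete input, that z is always positive, the same kind of (much richer) property CiaoPP derives to validate assertions and simplify run-time tests.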


Spanish wheat (Triticum spp.) landraces have a considerable polymorphism, containing many unique alleles, relative to other collections. The existence of a core collection is a favored approach for breeders to efficiently explore novel variation and enhance the use of germplasm. In this study, the Spanish durum wheat (Triticum turgidum L.) core collection (CC) was created using a population structure–based method, grouping accessions by subspecies and allocating the number of genotypes among populations according to the diversity of simple sequence repeat (SSR) markers. The CC of 94 genotypes was established, which accounted for 17% of the accessions in the entire collection. An alternative core collection (CH), with the same number of genotypes per subspecies and maximizing the coverage of SSR alleles, was assembled with the Core Hunter software. The quality of both core collections was compared with a random core collection and evaluated using geographic, agromorphological, and molecular marker data not previously used in the selection of genotypes. Both core collections had a high genetic representativeness, which validated their sampling strategies. Geographic and agromorphological variation, phenotypic correlations, and gliadin alleles of the original collection were more accurately depicted by the CC. Diversity arrays technology (DArT) markers revealed that the CC included genotypes less similar than the CH. Although more SSR alleles were retained by the CH (94%) than by the CC (91%), the results showed that the CC was better than CH for breeding purposes.
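The allele-coverage objective behind a Core Hunter-style core can be sketched as a greedy selection over toy genotypes; the real Core Hunter optimisation is more sophisticated than this:

```python
# Greedy allele-coverage core selection: pick genotypes until every observed
# SSR allele is represented. Toy data, illustrative only.

def greedy_core(genotypes):
    """genotypes: dict name -> set of marker alleles carried.
    Returns a small core covering the union of all alleles."""
    remaining = set().union(*genotypes.values())
    core = []
    pool = dict(genotypes)
    while remaining:
        # pick the genotype that adds the most still-uncovered alleles
        best = max(pool, key=lambda g: len(pool[g] & remaining))
        core.append(best)
        remaining -= pool.pop(best)
    return core

toy = {
    "G1": {"a1", "a2", "b1"},
    "G2": {"a1", "b2", "c1"},
    "G3": {"a2", "b1", "c1"},
    "G4": {"b2", "c2"},
}
core = greedy_core(toy)
covered = set().union(*(toy[g] for g in core))
print(core, covered == set().union(*toy.values()))
```

Maximising raw allele coverage, as this sketch does, explains the CH's higher allele retention (94% vs. 91%), while the CC's structure-based sampling better preserved the geographic and agromorphological patterns.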


The study of digital competence remains an issue of interest for both the scientific community and the supranational political agenda. This study uses the Delphi method to validate the design of a questionnaire to determine the perceived importance of digital competence in higher education. The questionnaire was constructed from different framework documents in digital competence standards (NETS, ACLR, UNESCO). The triangulation of non-parametric techniques made it possible to consolidate the results obtained through the Delphi panel, the suitability of which was highlighted through the expert competence index (K). The resulting questionnaire emerges as a good tool for undertaking future national and international studies on digital competence in higher education.
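The expert competence index (K) used to screen Delphi panellists is commonly computed as the mean of a self-assessed knowledge coefficient and an argumentation coefficient; the sketch below follows that common formulation, which the abstract itself does not spell out, so treat it as an assumption:

```python
# Common formulation of the expert competence index for Delphi panels:
# K = (kc + ka) / 2, with kc (knowledge) and ka (argumentation) on a 0-1
# scale. Threshold and example values are illustrative.

def competence_index(kc, ka):
    if not (0 <= kc <= 1 and 0 <= ka <= 1):
        raise ValueError("coefficients must lie in [0, 1]")
    return (kc + ka) / 2

def keep_expert(kc, ka, threshold=0.8):
    """Panels often retain only experts with high competence (K >= 0.8)."""
    return competence_index(kc, ka) >= threshold

print(round(competence_index(0.9, 0.8), 2))  # 0.85
print(keep_expert(0.9, 0.8))                 # True
print(keep_expert(0.6, 0.7))                 # False
```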


This paper summarises test results that were used to validate a model and scale-up procedure for the high pressure grinding roll (HPGR) developed at the JKMRC by Morrell et al. [Morrell, Lim, Tondo, David, 1996. Modelling the high pressure grinding rolls. In: Mining Technology Conference, pp. 169-176.]. Verification of the model is based on four data sets that describe the performance of three industrial-scale units fitted with both studded and smooth roll surfaces. The industrial units are currently in operation within the diamond mining industry, represented by De Beers, BHP Billiton and Rio Tinto. Ore samples from the De Beers and BHP Billiton operations were sent to the JKMRC for ore characterisation and HPGR laboratory-scale tests; Rio Tinto contributed a historical data set of tests completed during a previous research project. The results show that modelling of the HPGR process has matured to a point where the model may be used to evaluate new comminution circuits and to optimise existing ones. The model's prediction of product size distribution is good and has been found to be strongly dependent on the characteristics of the material tested. The prediction of throughput, and of the corresponding power draw (based on throughput), is sensitive to inconsistent gap/diameter ratios observed between laboratory-scale tests and full-scale operations.
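A continuity-style throughput relation of the kind used in HPGR scale-up can be sketched as follows; this is an illustrative simplification, not the JKMRC model, and all numbers are invented:

```python
# Hedged sketch of an HPGR throughput/specific-energy calculation: material
# is assumed to leave the working gap at roll peripheral speed with the
# density of the compacted flake. Illustrative values only.

def hpgr_throughput(roll_length_m, gap_m, roll_speed_ms, flake_density_kgm3):
    """Mass flow (t/h) through the working gap."""
    q_kgs = flake_density_kgm3 * roll_length_m * gap_m * roll_speed_ms
    return q_kgs * 3.6  # kg/s -> t/h

def specific_energy(power_kw, throughput_tph):
    """Specific grinding energy (kWh/t) from net power draw and throughput."""
    return power_kw / throughput_tph

q = hpgr_throughput(0.5, 0.02, 1.5, 2400.0)
print(round(q, 1), "t/h")
print(round(specific_energy(250.0, q), 2), "kWh/t")
```

Because throughput scales directly with the gap, any inconsistency in gap/diameter ratio between laboratory and full-scale units propagates straight into the throughput and power predictions, the sensitivity the paper reports.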


Workflow systems have traditionally focused on so-called production processes, which are characterized by pre-definition, high volume, and repetitiveness. Recently, the deployment of workflow systems in non-traditional domains such as collaborative applications, e-learning and cross-organizational process integration has put forth new requirements for flexible and dynamic specification. However, this flexibility cannot be offered at the expense of control, a critical requirement of business processes. In this paper, we present a foundation set of constraints for flexible workflow specification, intended to provide an appropriate balance between flexibility and control. The constraint specification framework is based on the concept of pockets of flexibility, which allows ad hoc changes and/or building of workflows for highly flexible processes. Basically, our approach is to provide the ability to execute on the basis of a partially specified model, where the full specification is made at runtime and may be unique to each instance. The verification of dynamically built models is essential. Whereas ensuring that the model conforms to specified constraints does not pose great difficulty, ensuring that the constraint set itself carries no conflicts or redundancy is an interesting and challenging problem. We provide a discussion of both the static and dynamic verification aspects, and briefly present Chameleon, a prototype workflow engine that implements these concepts.
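The static-verification problem raised here, checking the constraint set itself for conflicts and redundancy, can be illustrated with simple "A before B" ordering constraints (a deliberately reduced constraint language, not the paper's):

```python
# With ordering constraints as (before, after) pairs, a conflict shows up as
# a cycle in the transitive closure, and a redundancy as a constraint already
# implied by the closure of the remaining ones. Toy example, illustrative only.

def closure(pairs):
    """Transitive closure of a set of (before, after) pairs."""
    c = set(pairs)
    changed = True
    while changed:
        changed = False
        for a, b in list(c):
            for x, y in list(c):
                if b == x and (a, y) not in c:
                    c.add((a, y))
                    changed = True
    return c

def find_conflicts(pairs):
    """Activities required to precede themselves, i.e. caught in a cycle."""
    return sorted({a for a, b in closure(pairs) if a == b})

def find_redundant(pairs):
    """Constraints implied by the transitive closure of the others."""
    return [p for p in pairs
            if p in closure([q for q in pairs if q != p])]

cs = [("A", "B"), ("B", "C"), ("A", "C"), ("C", "A")]
print(find_conflicts(cs))   # the cycle A->B->C->A makes every node conflict
print(find_redundant(cs))   # ("A", "C") is already implied by A->B->C
```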


A complete workflow specification requires careful integration of many different process characteristics. Decisions must be made as to the definitions of individual activities, their scope, the order of execution that maintains the overall business process logic, the rules governing the discipline of work-list scheduling to performers, the identification of time constraints, and more. The goal of this paper is to address an important issue in workflow modelling and specification: data flow, its modelling, specification and validation. Researchers have neglected this dimension of process analysis for some time, mainly focusing on structural considerations with limited verification checks. In this paper, we identify and justify the importance of data modelling in overall workflow specification and verification. We illustrate and define several potential data flow problems that, if not detected prior to workflow deployment, may prevent the process from executing correctly, cause it to execute on inconsistent data, or even lead to process suspension. A discussion of the essential requirements the workflow data model must meet in order to support data validation is also given.
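Two data-flow anomalies of the kind the paper defines can be detected mechanically; the workflow model below is a toy reduction (ordered activities with read/write sets), not the paper's full model:

```python
# Detect "missing data" (an activity reads a variable no earlier activity
# produced) and "redundant data" (a variable written but never read).
# Activity names and data items are invented for illustration.

def validate_dataflow(activities):
    """activities: ordered list of (name, reads, writes) tuples."""
    written, read_somewhere, missing = set(), set(), []
    for name, reads, writes in activities:
        for v in reads:
            read_somewhere.add(v)
            if v not in written:
                missing.append((name, v))   # read before any write
        written |= set(writes)
    redundant = sorted(written - read_somewhere)  # written, never consumed
    return missing, redundant

wf = [
    ("enter_order",  [],                  ["order"]),
    ("check_credit", ["customer"],        ["rating"]),   # 'customer' never produced
    ("approve",      ["order", "rating"], ["decision"]),
]
missing, redundant = validate_dataflow(wf)
print(missing)    # [('check_credit', 'customer')]
print(redundant)  # ['decision']
```

Catching the missing 'customer' input at specification time is exactly the kind of pre-deployment check the paper argues for, since at runtime it would stall or corrupt the process.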