13 results for mixed verification methods

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevância:

80.00%

Publicador:

Resumo:

The goal of this Master's thesis is to study and develop verification methods for the manufacturing concepts of a plastic product, and to create a model that improves the production process from the pre-production phase onwards up to the start of mass production (ramp-up). The thesis also aims to serve as a means of communication, raising the organization's awareness of the importance of the pre-production phase. The work builds on the author's earlier study, "The effects of learning from the pre-production process on the start of mass production", carried out as a special assignment in 2006. The main research methods were process mapping, benchmarking, expert interviews, and analysis of completed projects. The result is an operating model that closely matches the model that emerged during the process mapping. Its central idea is to tie the verification of the manufacturing concept to the maturity of the product's mechanical design. The objectives for the whole project are derived from the ramp-up targets. The most critical product and process are selected for verification; the Quality Function Deployment (QFD) method is used as the theoretical framework for this selection. As further work, the thesis proposes developing ramp-up targets and metrics that allow the project to be monitored and steered from the very beginning. Further research is also needed on the design and optimization of process parameters during the pre-production phase.
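The QFD-based selection of the most critical product and process can be sketched as a weighted scoring step: requirements derived from the ramp-up targets are weighted, and each candidate is rated against them. The following is a hypothetical illustration; all names, weights and ratings are invented, not taken from the thesis.

```python
# Hypothetical QFD-style prioritization sketch: ramp-up requirements are
# weighted, each candidate product/process is rated against them, and the
# candidate with the highest weighted score is picked for verification.

def qfd_score(weights, ratings):
    """Weighted sum of requirement ratings for one candidate."""
    return sum(w * r for w, r in zip(weights, ratings))

def most_critical(weights, candidates):
    """Name of the candidate with the highest QFD score."""
    return max(candidates, key=lambda name: qfd_score(weights, candidates[name]))

# Requirement weights: e.g. ramp-up speed, yield, tooling risk (1-5 importance)
weights = [5, 3, 4]
candidates = {                      # ratings on the usual 1/3/9 QFD scale
    "cover_molding":  [9, 3, 3],
    "frame_assembly": [3, 9, 1],
    "hinge_process":  [3, 3, 9],
}
print(most_critical(weights, candidates))   # cover_molding
```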

Relevância:

80.00%

Publicador:

Resumo:

The development of correct programs is a core problem in computer science. Although formal verification methods for establishing correctness with mathematical rigor are available, programmers often find them difficult to put into practice. One hurdle is deriving the loop invariants and proving that the code maintains them. So-called correct-by-construction methods aim to alleviate this issue by integrating verification into the programming workflow. Invariant-based programming is a practical correct-by-construction method in which the programmer first establishes the invariant structure and then incrementally extends the program in steps, adding code and proving after each addition that the code is consistent with the invariants. In this way, the program is kept internally consistent throughout its development, and the construction of the correctness arguments (proofs) becomes an integral part of the programming workflow. A characteristic of the approach is that programs are described as invariant diagrams, a graphical notation similar to the state charts familiar to programmers.

Invariant-based programming is a new method that has not yet been evaluated in large-scale studies. The most important prerequisite for feasibility on a larger scale is a high degree of automation. The goal of the Socos project has been to build tools that assist the construction and verification of programs using the method. This thesis describes the implementation and evaluation of a prototype tool in the context of the Socos project. The tool supports drawing the diagrams, automatic derivation and discharging of verification conditions, and interactive proofs; it is used to develop programs that are correct by construction. The tool consists of a diagrammatic environment connected to a verification condition generator and an existing state-of-the-art theorem prover. Its core is a semantics for translating diagrams into verification conditions, which are sent to the underlying theorem prover. We describe a concrete method for 1) deriving sufficient conditions for the total correctness of an invariant diagram; 2) sending the conditions to the theorem prover for simplification; and 3) reporting the results of the simplification to the programmer in a way that is consistent with the invariant-based programming workflow and allows errors in the program specification to be detected efficiently. The tool uses an efficient automatic proof strategy to prove as many conditions as possible automatically and lets the remaining conditions be proved interactively. It is based on the verification system PVS and uses the SMT (Satisfiability Modulo Theories) solver Yices as a catch-all decision procedure. Conditions that are not discharged automatically may be proved interactively using the PVS proof assistant.

The programming workflow is very similar to the process by which a mathematical theory is developed inside a computer-supported theorem prover environment such as PVS. With the aid of the tool, the programmer reduces a large verification problem into a set of smaller problems (lemmas), and can substantially improve the degree of proof automation by developing specialized background theories and proof strategies to support the specification and verification of a specific class of programs. We demonstrate this workflow by describing in detail the construction of a verified sorting algorithm.

Tool-supported verification often has little to no presence in computer science (CS) curricula. Furthermore, program verification is frequently introduced as an advanced and purely theoretical topic that is not connected to the workflow taught in the early, practically oriented programming courses. Our hypothesis is that verification could be introduced early in CS education, and that verification tools could be used in the classroom to support the teaching of formal methods. A prototype of Socos has been used in a course at Åbo Akademi University targeted at first- and second-year undergraduate students. We evaluate the use of Socos in the course as part of a case study carried out in 2007.
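The invariant-based workflow described above can be illustrated on a small scale. This sketch is not the Socos diagram notation: runtime assertions stand in for the proof obligations that a tool like Socos would discharge formally, but the order of development is the same, with the invariant stated first and checked after every addition of code.

```python
# Minimal illustration of the invariant-first workflow: state the invariant,
# then add code and check after each step that the invariant still holds.
# Runtime assertions stand in for formally discharged proof obligations.

def isqrt(n: int) -> int:
    """Largest r with r*r <= n, developed around an explicit invariant."""
    assert n >= 0                      # precondition
    r = 0
    # Invariant: r*r <= n  (established by the initialization above)
    assert r * r <= n
    while (r + 1) * (r + 1) <= n:
        r += 1
        assert r * r <= n              # invariant preserved by the loop body
    # Invariant plus the negated loop guard give the postcondition:
    assert r * r <= n < (r + 1) * (r + 1)
    return r

print(isqrt(10))   # 3
```

The postcondition falls out mechanically: at loop exit the invariant `r*r <= n` still holds and the guard `(r+1)*(r+1) <= n` is false, which together pin down `r` exactly.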

Relevância:

80.00%

Publicador:

Resumo:

Active magnetic bearings offer many advantages that have brought new applications to industry. However, like all new technology, active magnetic bearings also have downsides, and one of these is the low level of standardization. This thesis studies mainly the ISO 14839 standard, and more specifically its system verification methods. These verification methods are applied in a practical test on an existing active magnetic bearing system. The system is simulated in Matlab using a rotor-bearing dynamics toolbox; this study does not include the exact simulation code or a direct algebraic calculation. However, it provides proof that the standardized methods can be applied to practical problems.
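One verification method of the kind ISO 14839 covers is evaluating the peak of the closed-loop sensitivity function and classifying it into a stability zone. The sketch below is a hedged illustration only: the zone limits (A < 3, B < 4, C < 5, roughly 9.5/12/14 dB) are quoted from memory of ISO 14839-3 and should be checked against the standard, and the loop gain is a toy transfer function, not a real AMB model.

```python
# Hedged sketch of an ISO 14839-3 style check: the peak of the closed-loop
# sensitivity |S(jw)| = |1/(1 + L(jw))| is compared against zone limits.
# Zone limits and the loop gain L(s) below are illustrative assumptions.
import numpy as np

def peak_sensitivity(loop_gain, freqs):
    """Max of |1/(1 + L(jw))| over the given frequency grid [rad/s]."""
    s = 1j * freqs
    return float(np.max(np.abs(1.0 / (1.0 + loop_gain(s)))))

def iso_zone(peak):
    """Classify a sensitivity peak into an ISO 14839-3 style zone."""
    for zone, limit in (("A", 3.0), ("B", 4.0), ("C", 5.0)):
        if peak < limit:
            return zone
    return "D"

# Toy PD-controlled loop gain (illustrative numbers only)
L = lambda s: 8e4 * (1 + 0.002 * s) / (s**2 + 20 * s)
w = np.linspace(1.0, 5e3, 20000)
peak = peak_sensitivity(L, w)
print(iso_zone(peak))
```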

Relevância:

30.00%

Publicador:

Resumo:

A laboratory-scale former is essential for imitating the papermaking process. Although various formers exist in the paper industry, there is still room for a laboratory-scale papermaking method positioned between a pilot machine and the traditional laboratory hand-sheet mould. A former that reproduces conditions and phenomena similar to real papermaking has been developed, and its operation has been tested at Nalco's Papermaking Centre of Excellence in Espoo. The former is connected to Nalco's approach-system simulator, so the hydro-chemical phenomena produced with the simulator can now also be tested from the sheets. The device consists of a headbox and a wire section. From the headbox the stock flows onto the wire, which moves forward on the belts of a belt conveyor. The jet-to-wire ratio can be changed either by altering the flow rate or the wire speed, or by adjusting the headbox slice opening. Testing showed that the former works well technically and that the results are repeatable and consistent. The fibers in the sheets are oriented, and the formation and the tensile strength ratio MD/CD depend strongly on the jet-to-wire ratio, just as on real paper machines.
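The jet-to-wire ratio that the former adjusts can be sketched with a small calculation: the jet velocity follows from the volumetric flow through the slice opening, and the ratio is that velocity divided by the wire speed. The dimensions and speeds below are made-up illustrative values, not measurements from the device.

```python
# Sketch of the jet-to-wire ratio, adjustable via flow rate, wire speed,
# or headbox slice opening. All numeric values here are illustrative.

def jet_speed(flow_m3_s: float, slice_height_m: float, width_m: float) -> float:
    """Jet velocity [m/s] from volumetric flow through the slice opening."""
    return flow_m3_s / (slice_height_m * width_m)

def jet_to_wire(flow_m3_s, slice_height_m, width_m, wire_speed_m_s):
    """Ratio of jet velocity to wire speed."""
    return jet_speed(flow_m3_s, slice_height_m, width_m) / wire_speed_m_s

# Example: 0.006 m^3/s through a 10 mm x 0.3 m slice onto a wire at 2.2 m/s
ratio = jet_to_wire(0.006, 0.010, 0.30, 2.2)
print(round(ratio, 3))   # 0.909
```

A ratio below or above 1 drags or rushes the fibers relative to the wire, which is what drives the fiber orientation and the MD/CD strength ratio mentioned above.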

Relevância:

30.00%

Publicador:

Resumo:

Nowadays software testing and quality assurance are highly valued in the software development process. Software testing is not a single concrete discipline: it is the process of validation and verification that starts from the idea of a future product and ends with the end of the product's maintenance. Industry places great importance on software testing methods and tools that can be applied at different testing phases. The initial objectives of this thesis were to provide a sufficient literature review of the different testing phases and, for each phase, to define a method that can be used effectively to improve software quality. The testing phases chosen for study are unit testing, integration testing, functional testing, system testing, acceptance testing and usability testing. The research showed that many software testing methods can be applied at the different phases, and that in most cases the choice of method should depend on the software type and its specification. For each phase, the thesis identifies a characteristic problem, suggests a method that can help eliminate it, and describes that method in detail.
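The earliest of the phases listed above, unit testing, can be illustrated with a minimal example: the function under test is exercised in isolation against known inputs and expected outputs. The function and the test cases are invented for illustration and do not come from the thesis.

```python
# Minimal unit-test illustration: a small pure function exercised in
# isolation with known inputs and expected outputs.
import unittest

def normalize_email(addr: str) -> str:
    """Trim surrounding whitespace and lower-case an e-mail address."""
    return addr.strip().lower()

class TestNormalizeEmail(unittest.TestCase):
    def test_strips_and_lowercases(self):
        self.assertEqual(normalize_email("  User@Example.COM "),
                         "user@example.com")

    def test_is_idempotent(self):
        once = normalize_email(" A@B.C ")
        self.assertEqual(normalize_email(once), once)

# Run the suite without exiting the interpreter.
unittest.main(argv=["example"], exit=False)
```

Integration and system tests follow the same assert-on-expected-behaviour pattern, but exercise assembled components and whole deployments instead of a single function.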

Relevância:

30.00%

Publicador:

Resumo:

The aim of this work is to study the effect of different fuel mixtures on the operation of a circulating fluidized bed (CFB) boiler. The applicability of the heat balance modeling software IPSEpro for simulating CFB boiler operation is also investigated. The work discusses various boiler types and methods of boiler operation. Fuel properties and their possible influence on boiler efficiency are described. Various biofuel types that can be used in combination with other fuels are presented, together with some examples of fuel mixtures in use. A CFB boiler model was constructed in IPSEpro and applied to analyze boiler operation outside design conditions. The simulations study the effect of different load levels and moisture contents of the fuel mixture.
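One basic calculation behind such heat-balance simulations is the lower heating value of a moist fuel mixture: LHV_ar = LHV_dry · (1 − w) − h_vap · w, where w is the moisture fraction and h_vap ≈ 2.443 MJ/kg is the evaporation heat of water near 25 °C. The sketch below uses illustrative fuel data, not values from the thesis, and is not the IPSEpro model itself.

```python
# Hedged sketch: lower heating value (as received) of a moist fuel mixture.
# LHV_ar = LHV_dry*(1 - w) - h_vap*w; fuel data below are illustrative.

H_VAP = 2.443  # MJ per kg of water evaporated (approx., at 25 C)

def lhv_as_received(lhv_dry_mj_kg: float, moisture_frac: float) -> float:
    """Heating value [MJ/kg] of moist fuel per kg as received."""
    return lhv_dry_mj_kg * (1.0 - moisture_frac) - H_VAP * moisture_frac

def mixture_lhv(fuels):
    """Mass-weighted LHV of a mixture: list of (mass_frac, lhv_dry, moisture)."""
    return sum(x * lhv_as_received(lhv, w) for x, lhv, w in fuels)

# 70 % coal (28 MJ/kg dry, 10 % moisture) + 30 % wood chips (19 MJ/kg dry, 45 %)
mix = [(0.7, 28.0, 0.10), (0.3, 19.0, 0.45)]
print(round(mixture_lhv(mix), 2))   # 20.27
```

Raising the moisture content lowers the as-received heating value twice over: less dry fuel per kilogram, plus heat spent evaporating the water, which is why moisture is a key simulation parameter.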

Relevância:

30.00%

Publicador:

Resumo:

The papermaking industry has continuously developed intelligent solutions to characterize the raw materials it uses, to control the manufacturing process robustly, and to guarantee the desired quality of the end product. Thanks to much-improved imaging techniques and image-based analysis methods, it has become possible to look inside the manufacturing pipeline and to propose more effective alternatives to human expertise. This study focuses on the development of image analysis methods for the pulping stage of papermaking. Pulping starts with wood disintegration and the forming of the fiber suspension, which is subsequently bleached, mixed with additives and chemicals, and finally dried and shipped to the papermaking mills. At each stage of the process it is important to analyze the properties of the raw material to guarantee product quality. As the first contribution of this thesis, a framework for fiber characterization based on microscopic images is proposed in order to evaluate the properties of fibers, the main component of the pulp suspension. The framework computes fiber length and curl index values that correlate well with the ground truth. The second contribution, a bubble detection method, was developed to estimate the gas volume at the delignification stage of the pulping process from high-resolution in-line imaging. The gas volume was estimated accurately and the solution enabled just-in-time process termination, although accurate estimation of bubble size categories remained challenging. As the third contribution, optical flow computation was studied, and the methods were successfully applied to estimating pulp flow velocity from double-exposed images. Finally, a framework for classifying dirt particles in dried pulp sheets, including semisynthetic ground truth generation, feature selection, and a performance comparison of state-of-the-art classification techniques, was proposed as the fourth contribution. The framework was successfully tested on semisynthetic and real-world pulp sheet images. Together, these four contributions assist in developing integrated, factory-level vision-based process control.
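The fiber curl index mentioned in the first contribution is commonly defined as the contour length of the fiber divided by its end-to-end distance, minus one. The sketch below computes it from an ordered list of centerline points; the point lists are synthetic, whereas a real pipeline would extract them from the segmented microscopic image.

```python
# Sketch of the fiber curl index: curl = (contour length / end-to-end
# distance) - 1, computed from ordered centerline points of a fiber.
import math

def curl_index(points):
    """Curl index of a fiber given its ordered centerline points (x, y)."""
    contour = sum(math.dist(p, q) for p, q in zip(points, points[1:]))
    end_to_end = math.dist(points[0], points[-1])
    return contour / end_to_end - 1.0

straight = [(0, 0), (1, 0), (2, 0)]
bent = [(0, 0), (1, 1), (2, 0)]        # same endpoints, longer path
print(round(curl_index(straight), 3))  # 0.0
print(round(curl_index(bent), 3))      # 0.414
```

A perfectly straight fiber scores 0, and the index grows as the fiber curls, which is why it correlates with the ground-truth shape measurements cited above.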

Relevância:

30.00%

Publicador:

Resumo:

The capabilities, and thus the design complexity, of VLSI-based embedded systems have increased tremendously in recent years, riding the wave of Moore's law. Time-to-market requirements are also shrinking, posing challenges to designers, who in turn seek to adopt new design methods to increase their productivity. In answer to these pressures, modern systems have moved towards on-chip multiprocessing technologies, and new on-chip multiprocessing architectures have emerged to exploit the tremendous advances in fabrication technology. Platform-based design is a possible solution to these challenges. The principle behind the approach is to separate the functionality of an application from the organization and communication architecture of the hardware platform at several levels of abstraction. Existing design methodologies for platform-based design do not provide full automation at every level of the design process, and the co-design of platform-based systems sometimes leads to sub-optimal systems. In addition, the design productivity gap in multiprocessor systems remains a key challenge under existing design methodologies. This thesis addresses these challenges and discusses the creation of a development framework for platform-based system design in the context of the SegBus platform, a distributed communication architecture. The research aims to provide automated procedures for platform design and application mapping. Structural verification support is also featured, ensuring correct-by-design platforms. The solution is based on a model-based process: both the platform and the application are modeled using the Unified Modeling Language. The thesis develops a Domain Specific Language to support platform modeling based on a corresponding UML profile, and Object Constraint Language constraints are used to support structurally correct platform construction. An emulator is introduced to allow performance estimation of the solution, as accurate as possible, at high abstraction levels. VHDL code is automatically generated in the form of "snippets" to be employed in the arbiter modules of the platform, as required by the application. The resulting framework is applied in building an actual design solution for an MP3 stereo audio decoder application.

Relevância:

30.00%

Publicador:

Resumo:

The purpose of this study was to examine and expand understanding concerning young Finnish registered nurses (RNs) with an intention to leave the profession and the related variables, specifically when that intention has emerged before the age of 30. The overall goal of the study was to develop a conceptual model of young RNs' intention to leave the profession. Suggestions are presented for policymakers, nurse leaders and nurse managers on how to retain more young RNs in the nursing workforce, along with suggestions for future nursing research. Phase I consists of two sequential integrative literature reviews of 75 empirical articles concerning nurses' intention to leave the profession. In phase II, data had been collected as part of the Nurses' Early Exit (NEXT) study, using the BQ-12 structured postal questionnaire. A total of 147 young RNs participated in the study, and the data were analysed with statistical methods. In phase III, firstly, an in-depth interpretive case study was conducted in order to understand how young RNs explain and make sense of their intention to leave the profession. The data in this study consisted of longitudinal career stories by three young RNs and were analysed using narrative holistic-content and thematic methods. Secondly, a total of 15 young RNs were interviewed in order to explore in depth their experiences concerning organizational turnover and their intention to leave the profession. These data were analysed using conventional content analysis. Based on earlier research, empirical research on young RNs' intention to leave the profession is scarce. Nurses' intention to leave the profession has mainly been studied with quantitative descriptive studies conducted with survey questionnaires, and the quality of previous studies varies considerably. Moreover, nurses' intention to leave the profession seems to be driven by a number of variables.

According to the survey study, 26% of young RNs had often considered giving up nursing completely and starting a different kind of job during the course of the previous year. Many different variables were associated with an intention to leave the profession (e.g. personal burnout, job dissatisfaction). According to the in-depth inquiries, poor nursing practice environments and a nursing career as a 'second-best' or serendipitous career choice were themes associated with young RNs' intention to leave the profession. In summary, young RNs' intention to leave the profession is a complex phenomenon with multiple associated variables. These findings suggest that policymakers, nurse leaders and nurse managers should enable improvements in nursing practice environments in order to retain more young RNs. Such improvements can include, for example, adequate staffing levels, balanced nursing workloads, measures to reduce work-related stress, and possibilities for advancement and development. Young RNs' requirements to provide high-quality and ethical nursing care must be recognized in society and in health-care organizations. Moreover, sufficient mentoring and orientation programmes should be provided for all graduate RNs. Future research is needed into whether the motive for choosing a nursing career affects the length of tenure in the profession. Both quantitative and in-depth research is needed for the comprehensive development of nursing-turnover research.

Relevância:

30.00%

Publicador:

Resumo:

Preparative liquid chromatography is one of the most selective separation techniques in the fine chemical, pharmaceutical, and food industries. Several process concepts have been developed and applied to improve the performance of classical batch chromatography. The most powerful approaches include various single-column recycling schemes, counter-current and cross-current multi-column setups, and hybrid processes where chromatography is coupled with other unit operations such as crystallization, a chemical reactor, and/or a solvent removal unit. To fully utilize the potential of stand-alone and integrated chromatographic processes, efficient methods are needed for selecting the best process alternative as well as the optimal operating conditions. In this thesis, a unified method is developed for the analysis and design of the following single-column fixed bed processes and the corresponding cross-current schemes: (1) batch chromatography, (2) batch chromatography with an integrated solvent removal unit, (3) mixed-recycle steady state recycling chromatography (SSR), and (4) mixed-recycle steady state recycling chromatography with solvent removal from the fresh feed, the recycle fraction, or the column feed (SSR-SR). The method is based on the equilibrium theory of chromatography under the assumption of negligible mass transfer resistance and axial dispersion. The design criteria are given in a general, dimensionless form that is formally analogous to the form applied widely in the so-called triangle theory of counter-current multi-column chromatography. Analytical design equations are derived for binary systems that follow the competitive Langmuir adsorption isotherm model. For this purpose, the existing analytic solution of the ideal model of chromatography for binary Langmuir mixtures is completed by deriving the missing explicit equations for the height and location of the pure first-component shock in the case of a small feed pulse. It is thus shown that the entire chromatographic cycle at the column outlet can be expressed in closed form. The developed design method allows predicting the feasible range of operating parameters that lead to the desired product purities. It can be applied to calculate first estimates of the optimal operating conditions, to analyse process robustness, and to evaluate different process alternatives at an early stage. The design method is used to analyse the possibility of enhancing the performance of conventional SSR chromatography by integrating it with a solvent removal unit. It is shown that the amount of fresh feed processed during a chromatographic cycle, and thus the productivity of the SSR process, can be improved by removing solvent. The maximum solvent removal capacity depends on the location of the solvent removal unit and on the physical solvent removal constraints, such as solubility, viscosity, and/or osmotic pressure limits. Usually, the most flexible option is to remove solvent from the column feed. The applicability of the equilibrium design to real, non-ideal separation problems is evaluated by means of numerical simulations. Due to the assumption of infinite column efficiency, the developed design method is most applicable to high performance systems where thermodynamic effects are predominant, while significant deviations are observed under highly non-ideal conditions. The findings based on the equilibrium theory are applied to develop a shortcut approach for the design of chromatographic separation processes under strongly non-ideal conditions with significant dispersive effects. The method is based on a simple procedure applied to a single conventional chromatogram. The applicability of the approach to the design of batch and counter-current simulated moving bed processes is evaluated with case studies. It is shown that the shortcut approach performs better the higher the column efficiency and the lower the purity constraints are.
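The competitive Langmuir isotherm underlying the design equations has the form q_i = a_i·c_i / (1 + b_1·c_1 + b_2·c_2) for a binary mixture. The sketch below evaluates it for illustrative parameter values; the thesis derives design criteria for any binary mixture obeying this model, and the numbers here are not from the thesis.

```python
# Sketch of the competitive Langmuir adsorption isotherm for a binary
# mixture: q_i = a_i*c_i / (1 + b_1*c_1 + b_2*c_2). Parameters illustrative.

def langmuir_binary(c1, c2, a=(2.0, 3.0), b=(0.05, 0.10)):
    """Solid-phase loadings (q1, q2) for liquid concentrations c1, c2."""
    denom = 1.0 + b[0] * c1 + b[1] * c2
    return a[0] * c1 / denom, a[1] * c2 / denom

q1, q2 = langmuir_binary(10.0, 5.0)
print(round(q1, 3), round(q2, 3))   # 10.0 7.5
```

The shared denominator is what makes the isotherm competitive: loading of each component is suppressed by the presence of the other, which is the nonlinearity the equilibrium-theory design equations have to handle.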

Relevância:

30.00%

Publicador:

Resumo:

Consumer neuroscience (neuromarketing) is an emerging field of marketing research that uses brain imaging techniques to study the neural conditions and processes underlying consumption. The purpose of this study was to map this fairly new and growing field in Finland by studying the opinions of both Finnish consumers and marketing professionals towards it, comparing those opinions to the current consumer neuroscience literature, and on that basis evaluating the usability of brain imaging techniques as a marketing research method. A mixed methods research design was chosen for this study. Quantitative data were collected from 232 consumers and 28 marketing professionals by means of online surveys. Both respondent groups either held neutral opinions or lacked knowledge about the four themes chosen for the study: the benefits, limitations and challenges, ethical issues, and future prospects of consumer neuroscience. Qualitative interview data were collected from two individuals at Finnish neuromarketing companies to deepen the insights gained from the quantitative research. The four interview themes were the same as in the surveys, and the interviewees' answers were mostly in line with the current literature, although more optimistic about the future of the field. The interviews also exposed a gap between academic consumer neuroscience research and practical-level applications. The results of this study suggest that many challenges remain unresolved and that the relevant populations either hold neutral opinions or lack information about consumer neuroscience. Practical-level applications are, however, already being used successfully, and this new field of marketing research is growing both globally and in Finland.
