31 results for Cleaning validation
Abstract:
This thesis presents an approach for formulating and validating a space-averaged drag model for coarse mesh simulations of gas-solid flows in fluidized beds using the two-fluid model. Proper modeling of fluid dynamics is central to understanding any industrial multiphase flow. The gas-solid flows in fluidized beds are heterogeneous and are usually simulated with an Eulerian description of the phases. Such a description requires fine meshes and small time steps for proper prediction of the hydrodynamics. This constraint on mesh and time step size results in a large number of control volumes and long computational times, which are unaffordable for simulations of large scale fluidized beds. Without proper closure models, coarse mesh simulations of fluidized beds do not give reasonable results: a coarse mesh fails to resolve the mesoscale structures and predicts uniform solids concentration profiles. For a circulating fluidized bed riser, such profiles result in a higher drag force between the gas and solid phases and an overestimated solids mass flux at the outlet. There is thus a need to formulate closure correlations that accurately predict the hydrodynamics on coarse meshes. This thesis uses the space averaging modeling approach to formulate closure models for coarse mesh simulations of gas-solid flow in fluidized beds with Geldart group B particles. In formulating the closure correlation for the space-averaged drag model, the main modeling parameters were found to be the averaging size, the solid volume fraction, and the distance from the wall. The closure model for the gas-solid drag force was formulated and validated for coarse mesh simulations of the riser, which verified this modeling approach. Coarse mesh simulations using the corrected drag model resulted in lower values of solids mass flux. Such an approach is a promising tool for formulating appropriate closure models that can be used in coarse mesh simulations of large scale fluidized beds.
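As a rough illustration of how a space-averaged (filtered) drag correction of this kind is typically applied in a coarse-mesh two-fluid simulation, the sketch below scales a standard homogeneous drag coefficient by a heterogeneity factor that depends on the averaging (filter) size, the solid volume fraction and the distance from the wall. The functional forms, coefficients and names are invented for illustration only; the thesis derives its own correlation.

```python
import numpy as np

def wen_yu_drag(eps_s, rho_g, mu_g, d_p, u_slip):
    """Homogeneous (microscopic) Wen-Yu style gas-solid drag coefficient per
    unit volume; a textbook baseline, not the thesis' closure."""
    eps_g = 1.0 - eps_s
    re_p = eps_g * rho_g * d_p * abs(u_slip) / mu_g
    cd = 24.0 / max(re_p, 1e-6) * (1.0 + 0.15 * re_p**0.687) if re_p < 1000 else 0.44
    return 0.75 * cd * eps_s * eps_g * rho_g * abs(u_slip) / d_p * eps_g**-2.65

def heterogeneity_factor(eps_s, filter_size, wall_dist, d_p):
    """Hypothetical correction factor in (0, 1]: drag is reduced more for larger
    averaging (filter) sizes, intermediate solid fractions and positions far
    from the wall, mimicking unresolved mesoscale clustering."""
    size_effect = 1.0 - np.exp(-0.5 * d_p / max(filter_size, d_p))
    conc_effect = 4.0 * eps_s * (0.6 - eps_s) / 0.36      # peaks at mid-range solid fraction
    wall_effect = 1.0 - 0.5 * np.exp(-wall_dist / (10.0 * filter_size))
    return float(np.clip(1.0 - 0.8 * conc_effect * wall_effect * (1.0 - size_effect), 0.05, 1.0))

# Example: corrected interphase drag for one coarse cell
beta_micro = wen_yu_drag(eps_s=0.05, rho_g=1.2, mu_g=1.8e-5, d_p=3e-4, u_slip=1.5)
beta_coarse = beta_micro * heterogeneity_factor(0.05, filter_size=0.02, wall_dist=0.1, d_p=3e-4)
print(f"microscopic drag {beta_micro:.0f}, filtered drag {beta_coarse:.0f} N·s/m^4")
```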
Abstract:
In this thesis, a model called CFB3D is validated for oxygen combustion in a circulating fluidized bed boiler. The first part of the work consists of a literature review in which circulating fluidized bed and oxygen combustion technologies are studied. In addition, the modeling of circulating fluidized bed furnaces is discussed and the currently available industrial scale three-dimensional furnace models are presented. The main features of the CFB3D model are presented along with the theories and equations related to the model parameters used in this work. The second part of the work consists of the actual research and modeling work, including measurements, model setup, and modeling results. The objective of this thesis is to study how well the CFB3D model works with oxygen combustion compared to air combustion in a circulating fluidized bed boiler, and which model parameters need to be adjusted when changing from air to oxygen combustion. The study is performed by modeling two air combustion cases and two oxygen combustion cases with comparable boiler loads. The cases were measured at the Ciuden 30 MWth Flexi-Burn demonstration plant in April 2012. The modeled furnace temperatures match the measurements as well in the oxygen combustion cases as in the air combustion cases, but the modeled gas concentrations differ from the measurements clearly more in the oxygen combustion cases. However, the same model parameters are optimal for both air and oxygen combustion cases. When the boiler load is changed, some combustion and heat transfer related model parameters need to be adjusted. To improve the accuracy of the modeling results, a better flow dynamics model should be developed in CFB3D. Additionally, more measurements are needed from the lower furnace to find the best model parameters for each case. The validation work needs to be continued in order to improve the modeling results and model predictability.
Abstract:
In today's logistics environment, there is a tremendous need for accurate cost information and cost allocation. Companies searching for a proper solution often come across activity-based costing (ABC) or one of its variations, which utilizes cost drivers to allocate the costs of activities to cost objects. To allocate costs accurately and reliably, the selection of appropriate cost drivers is essential for realizing the benefits of the costing system. The purpose of this study is to validate the transportation cost drivers of a Finnish wholesaler company and ultimately to select the best possible driver alternatives for the company. The use of cost driver combinations as an alternative is also studied. The study is conducted as a part of the case company's applied ABC project, using statistical research as the main research method, supported by a theoretical, literature-based method. The main research tools featured in the study are simple and multiple regression analyses, which, together with the practicality analysis based on the literature and observations, form the basis for the proposed methods. The results suggest that the most appropriate cost driver alternatives are delivery drops and internal delivery weight. The use of cost driver combinations is not recommended, as they do not provide substantially better results while at the same time increasing measurement costs, complexity and the effort of use. The use of internal freight cost drivers is also questionable, as the results indicate a weakening trend in their cost allocation capability towards the end of the period. Therefore, more research on internal freight cost drivers should be conducted before taking them into use.
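To make the regression-based driver validation concrete, the sketch below regresses a monthly transportation cost on candidate cost drivers and compares their explanatory power; the data, column names and driver coefficients are invented for illustration and do not come from the case company.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 36                                                   # e.g. three years of monthly data
delivery_drops = rng.integers(800, 1500, n).astype(float)
delivery_weight = rng.uniform(50_000, 120_000, n)        # internal delivery weight, kg
cost = 12.0 * delivery_drops + 0.05 * delivery_weight + rng.normal(0, 2_000, n)

def fit(y, driver_columns, names):
    """Ordinary least squares fit of cost against one or more candidate drivers."""
    X = sm.add_constant(np.column_stack(driver_columns))
    model = sm.OLS(y, X).fit()
    print(f"{' + '.join(names):30s} R^2 = {model.rsquared:.3f}")
    return model

fit(cost, [delivery_drops], ["delivery drops"])                    # simple regression
fit(cost, [delivery_weight], ["delivery weight"])                  # simple regression
fit(cost, [delivery_drops, delivery_weight], ["drops", "weight"])  # driver combination
```

A driver combination is only worth adopting if its R^2 clearly exceeds that of the best single driver, since the combination also raises measurement costs and complexity.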
Abstract:
A capillary electrophoresis method originally designed for the analysis of monosaccharides was validated using reference solutions of polydatin. The validation was conducted by determining the LOD and LOQ concentration levels and the range of linearity, and by determining the levels of uncertainty with respect to repeatability and reproducibility. The reliability of the obtained results is also discussed. A guide with recommendations concerning the validation and overall design of CE analysis sequences is also produced as a result of this study.
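As a sketch of one common way to derive the LOD and LOQ from a calibration line (ICH-style: LOD = 3.3·s/slope and LOQ = 10·s/slope, with s the residual standard deviation), the example below uses invented polydatin concentrations and peak areas; the thesis reports its own experimentally determined values.

```python
import numpy as np

conc = np.array([5.0, 10.0, 25.0, 50.0, 100.0])           # mg/L, reference solutions (invented)
area = np.array([1.1e4, 2.2e4, 5.3e4, 1.08e5, 2.15e5])    # detector response (invented)

slope, intercept = np.polyfit(conc, area, 1)               # linear calibration
residuals = area - (slope * conc + intercept)
s_res = np.sqrt(np.sum(residuals**2) / (len(conc) - 2))    # residual standard deviation

lod = 3.3 * s_res / slope
loq = 10.0 * s_res / slope
print(f"slope = {slope:.1f}, LOD ≈ {lod:.2f} mg/L, LOQ ≈ {loq:.2f} mg/L")
```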
Abstract:
A web service is a software system that provides a machine-processable interface to other machines over a network using different Internet protocols. Web services are increasingly used in industry to automate tasks and to offer services to a wider audience. The REST architectural style aims at producing scalable and extensible web services using technologies that play well with the existing tools and infrastructure of the web. It provides a uniform set of operations that can be used to invoke the CRUD interface (create, retrieve, update and delete) of a web service. The stateless behavior of the service interface requires that every request to a resource is independent of the previous ones, which facilitates scalability. Automated systems, e.g., hotel reservation systems, provide advanced scenarios for stateful services that require a certain sequence of requests to be followed in order to fulfill the service goals. Designing and developing such services for advanced scenarios under REST constraints requires rigorous approaches capable of creating web services that can be trusted for their behavior. Systems that can be trusted for their behavior can be termed dependable systems. This thesis presents an integrated design, analysis and validation approach that helps the service developer to create dependable and stateful REST web services. The main contribution of this thesis is a novel model-driven methodology to design behavioral REST web service interfaces and their compositions. The behavioral interfaces provide information on which methods can be invoked on a service and on the pre- and post-conditions of these methods. The methodology uses the Unified Modeling Language (UML) as the modeling language, which has a wide user base and mature tools that are continuously evolving. We have used UML class diagrams and UML state machine diagrams with additional design constraints to provide the resource and behavioral models, respectively, for designing REST web service interfaces. These service design models serve as a specification document, and the information presented in them has manifold applications. The service design models also contain information about the time and domain requirements of the service, which helps in requirement traceability, an important part of our approach. Requirement traceability helps in capturing faults in the design models and in other elements of the software development environment by tracing the unfulfilled requirements of the service back and forth. Information about service actors is also included in the design models; it is required for authenticating service requests by authorized actors, since not all types of users have access to all the resources. In addition, by following our design approach, the service developer can ensure that the designed web service interfaces are REST compliant. The second contribution of this thesis is the consistency analysis of behavioral REST interfaces. To overcome the inconsistency problem and design errors in our service models, we have used semantic technologies. The REST interfaces are represented in the Web Ontology Language, OWL 2, so that they can be part of the semantic web. These interfaces are checked with OWL 2 reasoners for unsatisfiable concepts, which result in implementations that fail. This work is fully automated thanks to the implemented translation tool and the existing OWL 2 reasoners.
The third contribution of this thesis is the verification and validation of REST web services. We have used model checking techniques with the UPPAAL model checker for this purpose. Timed automata are generated from the UML-based service design models with our transformation tool and are verified for basic characteristics such as deadlock freedom, liveness, reachability and safety. The implementation of a web service is tested using a black-box testing approach. Test cases are generated from the UPPAAL timed automata and, using the online testing tool UPPAAL TRON, the service implementation is validated at runtime against its specifications. Requirement traceability is also addressed in our validation approach, with which we can see which service goals are met and trace back the unfulfilled service goals to detect faults in the design models. A final contribution of the thesis is the implementation of behavioral REST interfaces and service monitors from the service design models. The partial code generation tool creates code skeletons of REST web services with method pre- and post-conditions. The preconditions of the methods constrain the user to invoke the stateful REST service under the right conditions, and the postconditions constrain the service developer to implement the right functionality. The details of the methods can be manually inserted by the developer as required. We do not target complete automation because we focus only on the interface aspects of the web service. The applicability of the approach is demonstrated with a pedagogical example of a hotel room booking service and a relatively complex worked example of a holiday booking service taken from an industrial context. The former example presents a simple explanation of the approach, and the latter shows how stateful and timed web services offering complex scenarios and involving other web services can be constructed using our approach.
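As a minimal, hand-written illustration of what such generated skeletons with pre- and post-conditions can look like, the sketch below guards a stateful booking resource with assertions; the names, contract style and target language are hypothetical and not taken from the thesis' code generator.

```python
class BookingResource:
    """Hypothetical skeleton of a stateful REST resource; the method body is
    the part a developer would fill in, and the assertions play the role of
    the generated pre- and post-conditions."""

    def __init__(self):
        self.state = "created"                 # created -> reserved -> confirmed

    def put_reservation(self, room_id: str, nights: int) -> dict:
        # Precondition: constrains when the client may invoke the method.
        assert self.state == "created", "PUT /reservation allowed only in state 'created'"
        assert nights > 0, "nights must be positive"

        # Body: functionality to be implemented by the service developer.
        reservation = {"room": room_id, "nights": nights, "status": "reserved"}
        self.state = "reserved"

        # Postcondition: constrains what the implementation must guarantee.
        assert self.state == "reserved" and reservation["status"] == "reserved"
        return reservation

resource = BookingResource()
print(resource.put_reservation("A-101", 2))    # a second call would violate the precondition
```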
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Fluid particle breakup and coalescence are important phenomena in a number of industrial flow systems. This study deals with a gas-liquid bubbly flow in a wastewater cleaning application. A three-dimensional geometric model of a dispersion water system was created in the ANSYS CFD meshing software. A numerical study of the system was then carried out by means of unsteady simulations performed in the ANSYS FLUENT CFD software. A single-phase water flow case was set up to calculate the entire flow field using the RNG k-epsilon turbulence model based on the Reynolds-averaged Navier-Stokes (RANS) equations. The bubbly flow case was based on a coupled computational fluid dynamics and population balance model (CFD-PBM) approach. Bubble breakup and coalescence were considered in order to determine the evolution of the bubble size distribution. The obtained results are considered steps toward optimization of the cleaning process and will be analyzed in order to make the process more efficient.
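The sketch below shows the population-balance idea behind the CFD-PBM approach in its simplest possible form: the bubble number density in discrete volume classes evolves through coalescence and equal-size binary breakup with constant kernels on an equal-volume grid. ANSYS FLUENT's actual PBM uses physical breakup and coalescence closures, so this stands only as an illustration of the bookkeeping.

```python
import numpy as np

n_classes = 20
N = np.zeros(n_classes)            # number density per class; class i has volume (i + 1) * dv
N[4] = 1.0                         # start with all bubbles in one mid-size class
C, S = 0.05, 0.02                  # constant coalescence and breakup rate constants
dt, steps = 0.1, 200

for _ in range(steps):
    dN = np.zeros(n_classes)
    # Coalescence: classes i and j (volumes i+1 and j+1) form class i + j + 1.
    for i in range(n_classes):
        for j in range(i, n_classes):
            rate = C * N[i] * N[j] * (0.5 if i == j else 1.0)
            dN[i] -= rate
            dN[j] -= rate
            k = i + j + 1
            if k < n_classes:
                dN[k] += rate
    # Binary breakup: a class with even volume splits into two equal halves.
    for k in range(1, n_classes, 2):            # volume k + 1 is even
        half = (k + 1) // 2 - 1                 # index of the half-volume class
        dN[k] -= S * N[k]
        dN[half] += 2.0 * S * N[k]
    N += dt * dN                                # explicit Euler step

mean_vol = np.sum((np.arange(n_classes) + 1) * N) / max(np.sum(N), 1e-12)
print(f"mean bubble volume after {steps} steps: {mean_vol:.2f} (in units of dv)")
```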
Abstract:
This thesis concentrates on the validation of the generic thermal hydraulic computer code TRACE against the challenges of the VVER-440 reactor type. The capability of the code to model the VVER-440 geometry and the thermal hydraulic phenomena specific to this reactor design has been examined and demonstrated to be acceptable. The main challenge in VVER-440 thermal hydraulics appeared in the modelling of the horizontal steam generator; the major challenge here is not in the code physics or numerics but in the formulation of a representative nodalization structure. Another VVER-440 speciality, the hot leg loop seals, challenges system codes functionally in general, but proved readily representable. Computer code models have to be validated against experiments to achieve confidence in them. When a new computer code is to be used for nuclear power plant safety analysis, it must first be validated against a large variety of different experiments. The validation process has to cover both the code itself and the code input. Uncertainties of different natures are identified in the different phases of the validation procedure and can even be quantified. This thesis presents a novel approach to input model validation and uncertainty evaluation in the different stages of the computer code validation procedure. It also demonstrates that in safety analysis there are inevitably significant uncertainties that are not statistically quantifiable; they need to be, and can be, addressed by other, less simplistic means, ultimately relying on the competence of the analysts and on the capability of the community to support the experimental verification of analytical assumptions. This method essentially complements the commonly used uncertainty assessment methods, which usually rely on statistical methods alone.
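For the purely statistical side of uncertainty evaluation mentioned above, a common textbook tool in best-estimate safety analyses is the Wilks order-statistics formula for the number of code runs needed to claim a one-sided tolerance limit; it is shown here only as a generic illustration, not as a procedure taken from this thesis.

```python
import math

def wilks_first_order(coverage: float, confidence: float) -> int:
    """Smallest number of random code runs n such that the maximum of the n
    outputs bounds the 'coverage' quantile with probability 'confidence'."""
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

print(wilks_first_order(0.95, 0.95))   # 59 runs for a 95 % / 95 % one-sided limit
```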
Abstract:
Coronary artery disease is an atherosclerotic disease which leads to narrowing of the coronary arteries, deteriorated myocardial blood flow and myocardial ischaemia. In acute myocardial infarction, a prolonged period of myocardial ischaemia leads to myocardial necrosis, and the necrotic myocardium is replaced with scar tissue. Myocardial infarction results in various changes in cardiac structure and function over time, which result in “adverse remodelling”. This remodelling may result in a progressive worsening of cardiac function and the development of chronic heart failure. In this thesis, we developed and validated three different large animal models of coronary artery disease, myocardial ischaemia and infarction for translational studies. In the first study, the coronary artery disease model combined induced diabetes and hypercholesterolemia. In the second study, myocardial ischaemia and infarction were caused by a surgical method, and in the third study by catheterisation. For model characterisation, we used non-invasive positron emission tomography (PET) methods to measure myocardial perfusion, oxidative metabolism and glucose utilisation. Additionally, cardiac function was measured by echocardiography and computed tomography. To study the metabolic changes that occur during atherosclerosis, the hypercholesterolemic and diabetic model was used with [18F]fluorodeoxyglucose ([18F]FDG) PET imaging. The coronary occlusion models were used to evaluate metabolic and structural changes in the heart and the cardioprotective effects of levosimendan during post-infarction cardiac remodelling. The large animal models were also used in testing novel radiopharmaceuticals for myocardial perfusion imaging. In the coronary artery disease model, we observed atherosclerotic lesions that were associated with focally increased [18F]FDG uptake. In the heart failure models, chronic myocardial infarction led to worsening of systolic function, cardiac remodelling and decreased efficiency of the cardiac pumping function. Levosimendan therapy reduced post-infarction myocardial infarct size and improved cardiac function. The novel 68Ga-labeled radiopharmaceuticals tested in this study were not successful for the determination of myocardial blood flow. In conclusion, diabetes and hypercholesterolemia lead to the development of early phase atherosclerotic lesions. Coronary artery occlusion produced considerable myocardial ischaemia and later infarction, followed by myocardial remodelling. The experimental models evaluated in these studies will enable further studies of disease mechanisms, new radiopharmaceuticals and interventions in coronary artery disease and heart failure.
Abstract:
Wood-based bioprocesses represent one of the most promising fields of interest in the circular economy. Expanding the use of wood raw material in sustainable industrial processes is acknowledged on both a global and a regional scale. This thesis concerns the application of a capillary zone electrophoresis (CZE) method with the aim of monitoring wood-based bioprocesses. The range of detectable carbohydrate compounds is expanded to furfural and polydatin in aqueous matrices. The experimental portion has been conducted on a laboratory scale with samples imitating process samples. The thesis presents a novel strategy for uncertainty evaluation via in-house validation. The focus of the work is on the uncertainty factors of the CZE method. The CZE equipment is sensitive to ambient conditions; therefore, proper validation is essential for robust application. This thesis introduces a tool for process monitoring of modern bioprocesses. As a result, it is concluded that the applied CZE method provides additional results for the analysed samples and that the profiling approach is suitable for detecting changes in process samples. The CZE method shows significant potential in process monitoring because of its capability to detect clusters of carbohydrate-related compounds simultaneously. The clusters can be used as summary terms indicating process variation and drift.
Abstract:
Book review
Abstract:
The objective of this work is to study the flow behavior and to support the design of an air cleaner by dynamic simulation. In the paper printing industry, it is necessary to monitor the quality of the paper while it is being produced. During production, the quality of the paper can be monitored by a camera. It is therefore necessary to keep the camera lens clean, as wood particles may fall from the paper and settle on the lens. In this work, the behavior of the air flow and its effect on the particles at different inlet angles are simulated. Geometries with different inlet angles for the single-channel and double-channel cases were constructed using ANSYS CFD software, and all the simulations were performed in ANSYS Fluent. The simulation results of the single-channel and double-channel cases revealed significant differences in the flow behavior and the particle velocity. The main conclusions of this work are the following. 1) For the single-channel case, the best angle was 0 degrees; in that case the air flow keeps away from the lens 60% of the particles that would otherwise settle on it. 2) For the double-channel case, the best solution was found when the angle of the first inlet was 0 degrees and the angle of the second inlet was 45 degrees; in that case the airflow keeps away from the lens 91% of the particles that would otherwise settle on it.
Abstract:
Prostate cancer (PCa) has emerged as the most commonly diagnosed lethal cancer in European men. PCa is a heterogeneous cancer that in the majority of cases is slow growing; consequently, these patients would not need any medical treatment. Currently, the measurement of prostate-specific antigen (PSA) from blood by immunoassay, followed by digital rectal examination and a pathological examination of prostate tissue biopsies, are the most widely used methods in the diagnosis of PCa. These methods suffer from a lack of sensitivity and specificity that may cause either missed cancers or overtreatment as a consequence of over-diagnosis. Therefore, more reliable biomarkers are needed for a better discrimination between indolent and potentially aggressive cancers. The aim of this thesis was the identification and validation of novel biomarkers for PCa. The mRNA expression level of 14 genes, including AMACR, AR, PCA3, SPINK1, TMPRSS2-ERG, KLK3, ACSM1, CACNA1D, DLX1, LMNB1, PLA2G7, RHOU, SPON2, and TDRD1, was measured by truly quantitative reverse transcription PCR in different prostate tissue samples from men with and without PCa. For the last eight genes, their function in PCa progression was studied by specific siRNA knockdown in PC-3 and VCaP cells. The results from radical prostatectomy and cystoprostatectomy samples showed statistically significant overexpression of all the target genes except KLK3 in men with PCa compared with men without PCa. Statistically significant differences were also observed between low and high Gleason grade tumors (for PLA2G7), PSA relapse versus no relapse (for SPON2), and low versus high TNM stages (for CACNA1D and DLX1). Functional studies and siRNA silencing results revealed a cytotoxic effect of the knockdown of DLX1, PLA2G7, and RHOU, and altered tumor cell invasion on PLA2G7, RHOU, ACSM1, and CACNA1D knockdown in 3D conditions. In addition, effects on tumor cell motility were observed after silencing PLA2G7 and RHOU in 2D monolayer cultures. Altogether, these findings indicate the possibility of utilizing these new markers as diagnostic and prognostic markers, and they may also represent therapeutic targets for PCa.
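As an illustrative sketch (with invented numbers) of the kind of group comparison behind a statement such as "statistically significant overexpression in men with PCa", the example below applies a nonparametric Mann-Whitney test to two sets of relative expression levels; the thesis reports its own assays and statistics.

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(1)
expr_pca = rng.lognormal(mean=2.0, sigma=0.5, size=30)      # relative mRNA level, PCa samples (invented)
expr_benign = rng.lognormal(mean=1.0, sigma=0.5, size=30)   # relative mRNA level, benign samples (invented)

stat, p = mannwhitneyu(expr_pca, expr_benign, alternative="greater")
print(f"median PCa {np.median(expr_pca):.1f} vs benign {np.median(expr_benign):.1f}, p = {p:.2e}")
```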
Abstract:
The traditional business models and the traditionally successful development methods that were distinctive to the industrial era do not satisfy the needs of modern IT companies. Due to the rapid nature of IT markets, the uncertainty of new innovations' success and the overwhelming competition with established companies, startups need to make quick decisions and eliminate wasted resources more effectively than ever before. There is a need for an empirical basis on which to build business models, as well as to evaluate the presumptions regarding value and profit. Less than ten years ago, the Lean software development principles and practices became widely known in academic circles. Those practices help startup entrepreneurs to validate their learning, test their assumptions and become more dynamic and flexible. What is special about today's software startups is that they are increasingly individual. Quantitative research studies are available regarding the details of Lean startups. Broad research with hundreds of companies presented in a few charts is informative, but a detailed study of fewer examples gives an insight into the way software entrepreneurs see the Lean startup philosophy and how they describe it in their own words. This thesis focuses on the early phases of Lean software startups, namely Customer Discovery (discovering a valuable solution to a real problem) and Customer Validation (being in a good market with a product which satisfies that market). The thesis first offers a sufficiently compact insight into the Lean software startup concept for a reader who is not previously familiar with the term. The Lean startup philosophy is then put to a real-life test, based on interviews with four Finnish Lean software startup entrepreneurs. The interviews reveal 1) whether the Lean startup philosophy is actually valuable for them, 2) how the theory can be practically implemented in real life, and 3) whether theoretical Lean startup knowledge compensates for a lack of entrepreneurship experience. The reader becomes familiar with the key elements and tools of Lean startups, as well as their mutual connections. The thesis explains why Lean startups waste less time and money than many other startups. The thesis, especially its research sections, aims at providing data and analysis simultaneously.