969 results for Validation Measures
Abstract:
In this thesis, a model called CFB3D is validated for oxygen combustion in a circulating fluidized bed boiler. The first part of the work consists of a literature review in which circulating fluidized bed and oxygen combustion technologies are studied. In addition, the modeling of circulating fluidized bed furnaces is discussed and the currently available industrial-scale three-dimensional furnace models are presented. The main features of the CFB3D model are presented along with the theories and equations related to the model parameters used in this work. The second part of the work consists of the actual research and modeling work, including measurements, model setup, and modeling results. The objective of this thesis is to study how well the CFB3D model works with oxygen combustion compared to air combustion in a circulating fluidized bed boiler, and which model parameters need to be adjusted when changing from air to oxygen combustion. The study is performed by modeling two air combustion cases and two oxygen combustion cases with comparable boiler loads. The cases were measured at the Ciuden 30 MWth Flexi-Burn demonstration plant in April 2012. The modeled furnace temperatures match the measurements equally well in the oxygen and air combustion cases, but the modeled gas concentrations deviate from the measurements clearly more in the oxygen combustion cases. However, the same model parameters are optimal for both air and oxygen combustion cases. When the boiler load is changed, some combustion- and heat-transfer-related model parameters need to be adjusted. To improve the accuracy of the modeling results, a better flow dynamics model should be developed for CFB3D. Additionally, more measurements are needed from the lower furnace to find the best model parameters for each case. The validation work needs to be continued in order to improve the modeling results and the model's predictive capability.
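As an illustration of the model-to-measurement comparison described above, the sketch below computes the bias and RMSE between modeled and measured furnace temperature profiles. The numbers are hypothetical placeholders, not the Ciuden data, and the actual validation metrics used in the thesis are not specified in the abstract.

```python
import numpy as np

# Hypothetical measured and modeled furnace temperatures (deg C) at a few
# probe elevations; the real Ciuden measurement data are not reproduced here.
measured = np.array([880.0, 865.0, 850.0, 842.0, 835.0])
modeled = np.array([890.0, 860.0, 855.0, 838.0, 830.0])

bias = np.mean(modeled - measured)                   # systematic over/under-prediction
rmse = np.sqrt(np.mean((modeled - measured) ** 2))   # overall deviation

print(f"bias = {bias:+.1f} deg C, RMSE = {rmse:.1f} deg C")
```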
Abstract:
The aim of this work is to describe the risk management measures for five intermediate chemicals used at the Stora Enso Imatra mills. The intermediate chemicals are black liquor, green liquor, white liquor, sodium bisulfite and sodium sulfite. These chemicals have already been registered with ECHA, and in connection with the registration a description of the risk management measures must also be submitted to ECHA. The work begins by describing the regulations relevant to this study and the authorities that supervise the use and manufacture of chemicals within the European Union. This is followed by a general account of the registration criteria for intermediate chemicals. The remainder of the work comprises a description of the risk management measures for each chemical. The risk management measures cover technical containment, procedural and control technologies, management practices, personnel training, and the transport of the intermediate chemicals. The properties of each chemical are also described, and a short process description of the manufacture and use of the chemicals is provided to aid understanding.
Abstract:
One of the problems that slows the development of off-line programming is the low static and dynamic positioning accuracy of robots. Robot calibration improves the positioning accuracy and can also be used as a diagnostic tool in robot production and maintenance. A large number of robot measurement systems are now available commercially, yet there is a dearth of systems that are portable, accurate and low cost. In this work, a measurement system that can fill this gap in local calibration is presented. The measurement system consists of a single CCD camera with a wide-angle lens mounted on the robot tool flange, and uses space resection models to measure the end-effector pose relative to a world coordinate system, taking radial distortions into account. Scale factors and the image center are obtained with innovative techniques that make use of a multiview approach. The target plate consists of a grid of white dots printed on black photographic paper and mounted on the sides of a 90-degree angle plate. Results show that the achieved average accuracy varies from 0.2 mm to 0.4 mm at distances from the target of 600 mm to 1000 mm, respectively, with different camera orientations.
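The abstract does not give the exact distortion model used in the thesis, but radial distortion is commonly handled with a polynomial (Brown-type) correction about the image center. A minimal sketch, with hypothetical coefficients and image geometry:

```python
import numpy as np

def undistort(points, center, k1, k2=0.0):
    """Correct radial lens distortion with a two-term polynomial model.

    points : (N, 2) pixel coordinates; center : (cx, cy) image center;
    k1, k2 : radial distortion coefficients (assumed known here)."""
    p = np.asarray(points, dtype=float) - center
    r2 = np.sum(p ** 2, axis=1, keepdims=True)      # squared radius per point
    corrected = p * (1.0 + k1 * r2 + k2 * r2 ** 2)  # radial correction factor
    return corrected + center

# Hypothetical values: a 640x480 sensor with mild barrel distortion.
grid = [[100.0, 120.0], [320.0, 240.0], [600.0, 50.0]]
print(undistort(grid, center=(320.0, 240.0), k1=-2.0e-7))
```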
Abstract:
Eutrophication caused by anthropogenic nutrient pollution has become one of the most severe threats to water bodies. Nutrients enter water bodies from atmospheric precipitation, industrial and domestic wastewaters, and surface runoff from agricultural and forest areas. As point pollution has been significantly reduced in developed countries in recent decades, agricultural non-point sources have been increasingly identified as the largest source of nutrient loading in water bodies. In this study, Lake Säkylän Pyhäjärvi and its catchment are studied as an example of a long-term, voluntary-based, co-operative model of lake and catchment management. Lake Pyhäjärvi is located in the centre of an intensive agricultural area in southwestern Finland. More than 20 professional fishermen operate in the lake area, and the lake is used as a drinking water source and for various recreational activities. Lake Pyhäjärvi is a good example of a large and shallow lake that suffers from eutrophication and is subject to measures to improve this undesired state under changing conditions. Climate change is one of the most important challenges faced by Lake Pyhäjärvi and other water bodies. The results show that climatic variation affects the amounts of runoff and nutrient loading and their timing during the year. The findings from the study area concerning warm winters and their influence on nutrient loading are in accordance with the IPCC scenarios of future climate change. In addition to nutrient reduction measures, the restoration of food chains (biomanipulation) is a key method in water quality management. The food-web structure in Lake Pyhäjärvi has, however, become disturbed due to mild winters, short ice cover and low fish catches. Ice cover that enables winter seining is extremely important to the water quality and ecosystem of Lake Pyhäjärvi, as the vendace stock is one of the key factors affecting the food web and the state of the lake. New methods for the reduction of nutrient loading and the treatment of runoff waters from agriculture, such as sand filters, were tested in field conditions. The results confirm that the filter technique is an applicable method for nutrient reduction, but further development is needed. The ability of sand filters to absorb nutrients can be improved with nutrient-binding compounds, such as lime. Long-term hydrological, chemical and biological research and monitoring data on Lake Pyhäjärvi and its catchment provide a basis for water protection measures and improve our understanding of the complicated physical, chemical and biological interactions between the terrestrial and aquatic realms. In addition to measurements carried out in field conditions, Lake Pyhäjärvi and its catchment were studied using various modelling methods. In the calibration and validation of the models, long-term and wide-ranging time series data proved to be valuable. Collaboration between researchers, modellers and local water managers further improves the reliability and usefulness of the models. Lake Pyhäjärvi and its catchment can also be regarded as a good research laboratory from the point of view of the Baltic Sea. The main problem in both is eutrophication caused by excess nutrients, and nutrient loading has to be reduced – especially from agriculture. Mitigation measures are also similar in both cases.
Abstract:
In today's logistics environment, there is a tremendous need for accurate cost information and cost allocation. Companies searching for a proper solution often come across activity-based costing (ABC) or one of its variations, which utilizes cost drivers to allocate the costs of activities to cost objects. In order to allocate costs accurately and reliably, the selection of appropriate cost drivers is essential for realizing the benefits of the costing system. The purpose of this study is to validate the transportation cost drivers of a Finnish wholesaler company and ultimately to select the best possible driver alternatives for the company. The use of cost driver combinations as an alternative is also studied. The study is conducted as part of the case company's applied ABC project, using statistical research as the main research method, supported by a theoretical, literature-based method. The main research tools featured in the study are simple and multiple regression analyses, which, together with a practicality analysis based on the literature and observations, form the basis for the recommendations. The results suggest that the most appropriate cost driver alternatives are delivery drops and internal delivery weight. The use of cost driver combinations is not recommended, as they do not provide substantially better results while increasing measurement costs, complexity and the burden of use at the same time. The use of internal freight cost drivers is also questionable, as the results indicate a weakening trend in their cost allocation capabilities towards the end of the period. Therefore, more research on internal freight cost drivers should be conducted before taking them into use.
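As a sketch of the driver-validation idea, the snippet below compares the explanatory power (R²) of two candidate transportation cost drivers in simple linear regressions. All data are hypothetical placeholders; the case company's actual figures and regression setup are not reproduced here.

```python
import numpy as np

def r_squared(x, y):
    """Coefficient of determination for a simple linear regression of y on x."""
    slope, intercept = np.polyfit(x, y, 1)
    residuals = y - (slope * x + intercept)
    ss_res = np.sum(residuals ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Hypothetical monthly data: transportation cost vs. two candidate drivers.
cost = np.array([52.0, 61.0, 58.0, 70.0, 66.0, 74.0])    # cost, k EUR
drops = np.array([410.0, 480.0, 455.0, 560.0, 520.0, 590.0])   # delivery drops
weight = np.array([1.9, 2.3, 2.1, 2.6, 2.5, 2.8])        # delivery weight, kt

for name, driver in [("delivery drops", drops), ("delivery weight", weight)]:
    print(f"{name}: R^2 = {r_squared(driver, cost):.3f}")
```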
Abstract:
The objective of this study was to optimize and validate the solid-liquid extraction (ESL) technique for the determination of picloram residues in soil samples. At the optimization stage, the optimal conditions for the extraction of soil samples were determined using univariate analysis. The soil/extraction solution ratio, type and duration of agitation, and ionic strength and pH of the extraction solution were evaluated. Based on the optimized parameters, the following extraction and analysis method for picloram was developed: weigh 2.00 g of soil, dried and sieved through a 2.0 mm mesh sieve, add 20.0 mL of 0.5 mol L-1 KCl, shake the bottle in a vortex mixer for 10 seconds to form a suspension, and adjust to pH 7.00 with 0.1 mol L-1 KOH. Homogenize the system in a shaker for 60 minutes and then let it stand for 10 minutes. The bottles are centrifuged for 10 minutes at 3,500 rpm. After the soil particles have settled and the supernatant extract has cleared, an aliquot is withdrawn and analyzed by high performance liquid chromatography. The optimized method was validated by determining the selectivity, linearity, detection and quantification limits, precision and accuracy. The ESL methodology was efficient for the analysis of residues of the pesticide studied, with recovery percentages above 90%. The limits of detection and quantification were 20.0 and 66.0 mg kg-1 soil for the PVA soil, and 40.0 and 132.0 mg kg-1 soil for the VLA soil. The coefficients of variation (CV) were 2.32 and 2.69 for the PVA and TH soils, respectively. The methodology resulted in low organic solvent consumption and cleaner extracts, and no purification steps were required before chromatographic analysis. The parameters evaluated in the validation process indicated that the ESL methodology is efficient for the extraction of picloram residues in soils, with low limits of detection and quantification.
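The abstract does not detail how the detection and quantification limits were derived; a common (ICH-style) approach estimates them from the calibration curve as 3.3 and 10 times the residual standard deviation divided by the slope. A minimal sketch under that assumption, with hypothetical calibration data:

```python
import numpy as np

# Hypothetical calibration data: picloram concentration vs. detector response
# (units assumed for illustration; these are not the study's data).
conc = np.array([0.0, 50.0, 100.0, 200.0, 400.0])    # concentration
area = np.array([1.2, 105.0, 212.0, 420.0, 845.0])   # peak area, arbitrary units

slope, intercept = np.polyfit(conc, area, 1)
# Residual standard deviation of the fit (ddof=2 for a two-parameter model).
residual_sd = np.std(area - (slope * conc + intercept), ddof=2)

lod = 3.3 * residual_sd / slope    # ICH-style detection limit
loq = 10.0 * residual_sd / slope   # ICH-style quantification limit
print(f"LOD = {lod:.1f}, LOQ = {loq:.1f} (same units as conc)")
```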
Abstract:
A capillary electrophoresis method originally designed for the analysis of monosaccharides was validated using reference solutions of polydatin. The validation was conducted by determining the LOD and LOQ concentration levels and the range of linearity, and by determining the levels of uncertainty with respect to repeatability and reproducibility. The reliability of the results obtained is also discussed. A guide with recommendations concerning the validation and overall design of analysis sequences with CE is also produced as a result of this study.
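A minimal sketch of how repeatability and reproducibility can be expressed as relative standard deviations of replicate injections; the values below are hypothetical, not the polydatin data:

```python
import numpy as np

# Hypothetical replicate peak areas for one reference solution: six repeat
# injections in a single run (repeatability) and run means from five
# different days (reproducibility).
within_run = np.array([10.2, 10.4, 10.1, 10.3, 10.5, 10.2])
day_means = np.array([10.3, 10.6, 10.1, 10.4, 10.2])

def rsd(x):
    """Relative standard deviation in percent (sample standard deviation)."""
    return 100.0 * np.std(x, ddof=1) / np.mean(x)

print(f"repeatability RSD   = {rsd(within_run):.2f} %")
print(f"reproducibility RSD = {rsd(day_means):.2f} %")
```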
Abstract:
A web service is a software system that provides a machine-processable interface to other machines over the network using different Internet protocols. Web services are increasingly used in industry to automate different tasks and offer services to a wider audience. The REST architectural style aims at producing scalable and extensible web services using technologies that play well with the existing tools and infrastructure of the web. It provides a uniform set of operations that can be used to invoke a CRUD interface (create, retrieve, update and delete) of a web service. The stateless behavior of the service interface requires that every request to a resource is independent of the previous ones, which facilitates scalability. Automated systems, e.g., hotel reservation systems, provide advanced scenarios for stateful services that require a certain sequence of requests to be followed in order to fulfill the service goals. Designing and developing such services for advanced scenarios under REST constraints requires rigorous approaches that are capable of creating web services that can be trusted for their behavior. Systems that can be trusted for their behavior can be termed dependable systems. This thesis presents an integrated design, analysis and validation approach that helps the service developer create dependable and stateful REST web services. The main contribution of this thesis is a novel model-driven methodology to design behavioral REST web service interfaces and their compositions. The behavioral interfaces provide information on what methods can be invoked on a service and on the pre- and post-conditions of these methods. The methodology uses the Unified Modeling Language (UML) as the modeling language, which has a wide user base and mature tools that are continuously evolving. We have used the UML class diagram and the UML state machine diagram with additional design constraints to provide resource and behavioral models, respectively, for designing REST web service interfaces. These service design models serve as a specification document, and the information presented in them has manifold applications. The service design models also contain information about the time and domain requirements of the service, which helps in requirement traceability, an important part of our approach. Requirement traceability helps in capturing faults in the design models and other elements of the software development environment by tracing the unfulfilled requirements of the service back and forth. Information about service actors is also included in the design models; it is required for authenticating service requests by authorized actors, since not all types of users have access to all the resources. In addition, by following our design approach, the service developer can ensure that the designed web service interfaces are REST compliant. The second contribution of this thesis is the consistency analysis of behavioral REST interfaces. To overcome inconsistency problems and design errors in our service models, we have used semantic technologies. The REST interfaces are represented in the Web Ontology Language OWL 2, and can thus be part of the semantic web. These interfaces are checked with OWL 2 reasoners for unsatisfiable concepts, which would result in implementations that fail. This work is fully automated thanks to the implemented translation tool and the existing OWL 2 reasoners.
The third contribution of this thesis is the verification and validation of REST web services. We have used model checking techniques with the UPPAAL model checker for this purpose. Timed automata are generated from the UML-based service design models with our transformation tool and are verified for basic characteristics such as deadlock freedom, liveness, reachability and safety. The implementation of a web service is tested using a black-box testing approach. Test cases are generated from the UPPAAL timed automata, and using the online testing tool UPPAAL TRON, the service implementation is validated at runtime against its specifications. Requirement traceability is also addressed in our validation approach, with which we can see which service goals are met and trace unfulfilled service goals back to faults in the design models. A final contribution of the thesis is an implementation of behavioral REST interfaces and service monitors from the service design models. The partial code generation tool creates code skeletons of REST web services with method pre- and post-conditions. The preconditions of the methods constrain the user to invoke the stateful REST service under the right conditions, and the postconditions constrain the service developer to implement the right functionality. The details of the methods can be inserted manually by the developer as required. We do not target complete automation because we focus only on the interface aspects of the web service. The applicability of the approach is demonstrated with a pedagogical example of a hotel room booking service and a relatively complex worked example of a holiday booking service taken from an industrial context. The former presents a simple explanation of the approach, and the latter shows how stateful and timed web services offering complex scenarios and involving other web services can be constructed using our approach.
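To make the pre-/post-condition idea concrete, here is a hand-written sketch in the spirit of the generated skeletons. The class, method and state names are hypothetical illustrations; the thesis's actual generated code is not reproduced here.

```python
class BookingService:
    """Sketch of a stateful REST resource with explicit pre/post-conditions.

    The thesis generates such skeletons from UML models; this hand-written
    Python analogue (names hypothetical) only illustrates the contract style."""

    def __init__(self):
        self.state = "created"   # resource state from the behavioral model

    def put_confirm(self, payment_id):
        # Precondition: a booking can only be confirmed from the 'created' state.
        assert self.state == "created", "precondition violated: wrong state"
        assert payment_id is not None, "precondition violated: missing payment"

        self.state = "confirmed"  # method body to be filled in by the developer

        # Postcondition: the method must leave the resource confirmed.
        assert self.state == "confirmed", "postcondition violated"
        return {"status": self.state}

service = BookingService()
print(service.put_confirm(payment_id="pay-42"))
```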
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
The purpose of this Master's thesis is to develop a method for monitoring the impurities contained in an inhalation product. The method is developed for an ultra high performance liquid chromatography (UHPLC) instrument on the basis of an existing HPLC impurity method. The new method is developed in order to shorten the analysis time and improve the resolution of the separation. The literature part of the thesis briefly introduces inhalation products and the fundamentals of liquid chromatography. High performance liquid chromatography and the parameters used to support the analysis are explained together with their formulas. The literature part focuses on the course of impurity method development and the tests performed in method validation. In the applied part, a UHPLC impurity method is developed for an inhalation product with two active ingredients under development at Orion Oyj, following the method development framework presented in the literature part. The course of the method development and the selection of the analysis conditions are presented in outline, after which the developed method is validated according to the validation plan drawn up for it. The developed UHPLC impurity method turned out to be 16 minutes shorter than the corresponding HPLC impurity method, and the resolution of the analysis also improved significantly. After repeated injections, however, tailing appeared in the chromatogram, leading to a significant loss of resolution. The tailing was suspected to be caused by one of the active ingredients binding to the column material, but the measures proposed to prevent this were not compatible with the developed method.
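The separation resolution referred to above is conventionally computed from retention times and baseline peak widths as Rs = 2(t2 - t1)/(w1 + w2). A minimal sketch with hypothetical peak data (not the Orion method's values):

```python
def resolution(t1, w1, t2, w2):
    """Chromatographic resolution from retention times and baseline peak widths:
    Rs = 2 * (t2 - t1) / (w1 + w2). Rs >= 1.5 is usually taken as baseline
    separation."""
    return 2.0 * (t2 - t1) / (w1 + w2)

# Hypothetical adjacent peak pair from an impurity run (times in minutes).
print(f"Rs = {resolution(t1=4.8, w1=0.20, t2=5.3, w2=0.22):.2f}")
```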
Abstract:
The objective of the present study was to validate the transit-time technique for long-term measurements of iliac and renal blood flow in rats. Flow measured with the ultrasonic probes was confirmed ex vivo using excised arteries perfused at varying flow rates. An implanted 1-mm probe accurately reproduced different patterns of flow relative to pressure in freely moving rats and accurately quantitated the resting iliac flow value (on average 10.43 ± 0.99 ml/min, or 2.78 ± 0.3 ml min-1 100 g body weight-1). The measurements were stable over an experimental period of one week but were affected by probe size (resting flows were underestimated by 57% with a 2-mm probe when compared with a 1-mm probe) and by anesthesia (in the same rats, iliac flow was reduced by 50-60% when compared to the conscious state). Instantaneous changes of iliac and renal flow during exercise and recovery were accurately measured by the transit-time technique. Iliac flow increased instantaneously at the beginning of mild exercise (from 12.03 ± 1.06 to 25.55 ± 3.89 ml/min at 15 s) and showed a smaller increase when exercise intensity increased further, reaching a plateau of 38.43 ± 1.92 ml/min at the 4th min of moderate exercise. In contrast, the exercise-induced reduction of renal flow was smaller and slower, with 18% and 25% decreases at mild and moderate exercise intensities, respectively. Our data indicate that transit-time flowmetry is a reliable method for long-term and continuous measurements of regional blood flow at rest and can be used to quantitate the dynamic flow changes that characterize exercise and recovery.
Abstract:
This thesis concentrates on the validation of the generic thermal hydraulic computer code TRACE against the challenges of the VVER-440 reactor type. The capability of the code to model the VVER-440 geometry and the thermal hydraulic phenomena specific to this reactor design has been examined and demonstrated to be acceptable. The main challenge in VVER-440 thermal hydraulics appeared in the modelling of the horizontal steam generator. The major challenge here is not in the code physics or numerics but in the formulation of a representative nodalization structure. Another VVER-440 speciality, the hot leg loop seals, challenges system codes functionally in general, but proved readily representable. Computer code models have to be validated against experiments to achieve confidence in them. When a new computer code is to be used for nuclear power plant safety analysis, it must first be validated against a large variety of different experiments. The validation process has to cover both the code itself and the code input. Uncertainties of different nature are identified in the different phases of the validation procedure and can even be quantified. This thesis presents a novel approach to input model validation and uncertainty evaluation in the different stages of the computer code validation procedure. It also demonstrates that in safety analysis there are inevitably significant uncertainties that are not statistically quantifiable; they need to be, and can be, addressed by other, less simplistic means, ultimately relying on the competence of the analysts and the capability of the community to support the experimental verification of analytical assumptions. This approach essentially complements the commonly used uncertainty assessment methods, which usually rely on statistical methods alone.
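The abstract does not name the statistical method used, but statistical uncertainty statements in nuclear safety analysis are often based on Wilks-type nonparametric tolerance limits. Assuming that approach, the minimum number of code runs for a one-sided, first-order limit can be sketched as follows:

```python
import math

def wilks_n(coverage=0.95, confidence=0.95):
    """Smallest sample size for a one-sided, first-order Wilks tolerance limit:
    the largest of N runs bounds the given coverage with the given confidence,
    i.e. the smallest N with 1 - coverage**N >= confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

print(wilks_n())            # 59, the classic 95%/95% result
print(wilks_n(0.95, 0.99))  # 90 runs for 99% confidence
```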
Abstract:
Targeted measures bring the greatest benefits for environmental protection in agriculture: the final report of the TEHO Plus project (2011–2014). Particular priority areas of the project activities included creating and testing a training package for agri-environment advisers, farm-specific advisory services and the exploitation of experiences of advice provision, putting together an information package on agri-environment issues, farm-level experiments, and the development of water quality monitoring. The recommendations issued by advisers on targeting environmental measures were based on the use of geographical information material and nutrient balances. The project was implemented in cooperation by the Southwest Finland Centre for Economic Development, Transport and the Environment, MTK-Satakunta and MTK-Varsinais-Suomi, and it received funding from the Ministry of Agriculture and Forestry and the Ministry of the Environment. Participating farmers, with whom environmental advice was developed and experiments were carried out, were important partners for the project. The operating area of the project was Satakunta and Varsinais-Suomi in Southwest Finland, but its outcomes can be exploited nationally; a national perspective was ensured by close cooperation with actors in different provinces. This final report describes the project's experiences and outcomes, including environmental advisory services, the training of agri-environment advisers, farm visits and the feedback received from farmers on agri-environmental advice, the development of water quality monitoring, experiments, and project work.
Abstract:
The phonological loop is a component of the working memory system specifically involved in the processing and manipulation of limited amounts of information of a sound-based phonological nature. Phonological memory can be assessed by the Children's Test of Nonword Repetition (CNRep) in English speakers, but not in Portuguese speakers, due to phonotactic differences between the two languages. The objectives of the present study were: 1) to develop the Brazilian Children's Test of Pseudoword Repetition (BCPR), a Portuguese version of the CNRep, and 2) to validate the BCPR by correlation with the Auditory Digit Span Test from the Stanford-Binet Intelligence Scale. The BCPR and Digit Span were assessed in 182 children aged 4-10 years, 84 from Minas Gerais State (42 from a rural region) and 98 from the city of São Paulo. Subject age and word length effects were observed, with repetition accuracy declining as a function of the number of syllables in the pseudowords. Correlations between the BCPR and Digit Span forward (r = 0.50; P ≤ 0.01) and backward (r = 0.43; P ≤ 0.01) were found, and partial correlation indicated that higher BCPR scores were associated with higher Digit Span scores. The BCPR appeared to depend more on schooling, while Digit Span was more related to development. The results demonstrate that the BCPR is a reliable measure of phonological working memory, similar to the CNRep.
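The partial correlation reported above can be illustrated by regressing a confounder (e.g. age) out of both scores and correlating the residuals. A minimal sketch with hypothetical data; the study's 182-child dataset is not reproduced:

```python
import numpy as np

def partial_corr(x, y, z):
    """Partial correlation of x and y controlling for z, computed by
    correlating the residuals of linear fits of x and y on z."""
    rx = x - np.polyval(np.polyfit(z, x, 1), z)
    ry = y - np.polyval(np.polyfit(z, y, 1), z)
    return np.corrcoef(rx, ry)[0, 1]

# Hypothetical scores: BCPR, Digit Span and age for six children.
bcpr = np.array([12.0, 15.0, 14.0, 18.0, 20.0, 22.0])
span = np.array([3.0, 4.0, 4.0, 5.0, 6.0, 6.0])
age = np.array([4.0, 5.0, 6.0, 7.0, 9.0, 10.0])

print(f"r(BCPR, span | age) = {partial_corr(bcpr, span, age):.2f}")
```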
Abstract:
A gravimetric method was evaluated as a simple, sensitive, reproducible, low-cost alternative for quantifying the extent of brain infarct after occlusion of the medial cerebral artery in rats. In ether-anesthetized rats, the left medial cerebral artery was occluded for 1, 1.5 or 2 h by inserting a 4-0 nylon monofilament suture into the internal carotid artery. Twenty-four hours later, the brains were processed for histochemical triphenyltetrazolium chloride (TTC) staining and quantitation of the ischemic infarct. In each TTC-stained brain section, the ischemic tissue was dissected with a scalpel and fixed in 10% formalin at 0°C until its total mass could be estimated. The mass (mg) of the ischemic tissue was weighed on an analytical balance and compared to its volume (mm³), estimated either by plethysmometry using platinum electrodes or by computer-assisted image analysis. Infarct size measured by the weighing method (mg) and reported as a percentage (%) of the affected (left) hemisphere correlated closely with the volume (mm³, also reported as %) estimated by computerized image analysis (r = 0.88; P < 0.001; N = 10) or by plethysmometry (r = 0.97-0.98; P < 0.0001; N = 41). This degree of correlation was maintained between different experimenters. The method was also sensitive enough to detect the effect of different ischemia durations on infarct size (P < 0.005; N = 23) and the effect of drug treatments in reducing the extent of brain damage (P < 0.005; N = 24). The data suggest that, in addition to being simple and low cost, the weighing method is a reliable alternative for quantifying brain infarct in animal models of stroke.
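A minimal sketch of the weighing method's arithmetic: infarct mass expressed as a percentage of the affected hemisphere and correlated with an image-analysis volume estimate. All numbers are hypothetical placeholders, not the study's measurements:

```python
import numpy as np

# Hypothetical masses (mg) from TTC-stained sections of the left hemisphere.
infarct_mass = np.array([38.0, 52.0, 45.0, 61.0])
hemisphere_mass = np.array([610.0, 598.0, 605.0, 620.0])

# Infarct size as a percentage of the affected hemisphere, by mass.
infarct_pct = 100.0 * infarct_mass / hemisphere_mass

# Hypothetical image-analysis volume estimates (% of hemisphere), same brains.
volume_pct = np.array([6.5, 8.9, 7.2, 10.1])

# Agreement between the two quantification methods.
r = np.corrcoef(infarct_pct, volume_pct)[0, 1]
print(f"infarct by mass: {infarct_pct.round(1)} %, r(mass, volume) = {r:.2f}")
```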