10 results for bioanalytical method validation
in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Abstract:
Wood-based bioprocesses are among the most promising fields of interest in the circular economy. Expanding the use of wood raw material in sustainable industrial processes is acknowledged on both a global and a regional scale. This thesis concerns the application of a capillary zone electrophoresis (CZE) method with the aim of monitoring wood-based bioprocesses. The range of detectable carbohydrate-related compounds is extended to furfural and polydatin in aqueous matrices. The experimental portion was conducted on a laboratory scale with samples imitating process samples. The thesis presents a novel strategy for uncertainty evaluation via in-house validation, with the focus on the uncertainty factors of the CZE method. Because the CZE equipment is sensitive to ambient conditions, proper validation is essential for robust application. The thesis thus introduces a tool for monitoring modern bioprocesses. It is concluded that the applied CZE method provides additional information on the analysed samples and that the profiling approach is suitable for detecting changes in process samples. The CZE method shows significant potential in process monitoring because of its capability to detect carbohydrate-related compound clusters simultaneously. The clusters can be used as summary terms indicating process variation and drift.
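As an illustration of the kind of in-house uncertainty evaluation referred to above, the minimal sketch below combines a within-laboratory reproducibility component and a bias component into an expanded uncertainty in a Nordtest-style fashion; the function name and example values are hypothetical and not taken from the thesis.

```python
import math

def expanded_uncertainty(s_within_lab: float, u_bias: float, k: float = 2.0) -> float:
    """Combine within-laboratory reproducibility and bias components into an
    expanded uncertainty U = k * sqrt(u_Rw**2 + u_bias**2)."""
    u_combined = math.sqrt(s_within_lab**2 + u_bias**2)
    return k * u_combined

# Hypothetical relative components (in %) for a CZE analyte
print(f"U = {expanded_uncertainty(3.5, 2.1):.1f} % (k = 2)")
```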
Abstract:
The particle size of the raw material is a key material parameter in pharmaceutical development. The particle size of the drug substance affects many important properties of the drug product, for example its bioavailability. This master's thesis focused on determining the particle size of powdered drug substances by laser diffraction. The method is based on the fact that the angular distribution of the intensity of light scattered by the particles depends on the particle size distribution. The literature part of the thesis presented the theory of the laser diffraction method, as well as the PIDS (Polarization Intensity Differential Scattering) technique, which can be used in conjunction with laser diffraction. Of the analysis methods based on other principles, microscopy and a method based on measuring the aerodynamic time of flight were reviewed. The literature part also presented the most common ways of reporting particle size. The aim of the experimental part was to develop and validate a laser diffraction-based particle size determination method for a specific drug substance. Method development was carried out using a Beckman Coulter LS 13 320 laser diffraction analyser, which allows the PIDS technique to be used alongside the laser diffraction technique. Method development began with the assessment that the drug substance in question is best measured dispersed in a liquid. Based on solubility, an aqueous solution saturated with the drug substance was chosen as the dispersion medium. The use of a dispersing agent and an ultrasonic bath was found necessary when dispersing the drug substance into the saturated aqueous solution. Finally, the stirring speed of the sample delivery unit was adjusted to a suitable level. In the validation phase, the developed method was found to be well suited to the drug substance, and the results were found to be accurate and repeatable. The method was also insensitive to small disturbances.
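The common ways of reporting particle size mentioned above typically include volume-weighted percentiles such as d10, d50 and d90. The sketch below interpolates these from a binned volume distribution; the bin centres and volume fractions are made-up illustrative values, not results from the thesis.

```python
import numpy as np

def volume_percentiles(bin_centres_um, volume_fractions, probs=(10, 50, 90)):
    """Interpolate dXX percentiles from a volume-weighted size distribution."""
    cum = 100.0 * np.cumsum(volume_fractions) / np.sum(volume_fractions)
    return {f"d{p}": float(np.interp(p, cum, bin_centres_um)) for p in probs}

# Hypothetical bin centres (micrometres) and volume fractions
sizes = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0, 100.0])
vols = np.array([0.02, 0.08, 0.20, 0.30, 0.25, 0.10, 0.05])
print(volume_percentiles(sizes, vols))  # e.g. {'d10': 2.0, 'd50': 8.3, 'd90': 35.0}
```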
Abstract:
A capillary electrophoresis method originally designed for the analysis of monosaccharides was validated using reference solutions of polydatin. The validation was conducted by determining the LOD and LOQ concentration levels and the range of linearity, and by estimating the uncertainty with respect to repeatability and reproducibility. The reliability of the obtained results is also discussed. A guide with recommendations concerning the validation and overall design of CE analysis sequences is also produced as a result of this study.
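As a hedged illustration of how LOD and LOQ levels can be estimated from a linear calibration (the common ICH-style 3.3σ/S and 10σ/S rule; the thesis may have used a different procedure), a minimal sketch with hypothetical calibration data:

```python
import numpy as np

def lod_loq_from_calibration(conc, signal):
    """Estimate LOD and LOQ as 3.3*s/S and 10*s/S, where S is the calibration
    slope and s is the residual standard deviation of the linear fit."""
    conc, signal = np.asarray(conc, float), np.asarray(signal, float)
    slope, intercept = np.polyfit(conc, signal, 1)
    residuals = signal - (slope * conc + intercept)
    s_res = np.sqrt(np.sum(residuals**2) / (len(conc) - 2))
    return 3.3 * s_res / slope, 10.0 * s_res / slope

# Hypothetical standard concentrations (mg/L) and peak areas
lod, loq = lod_loq_from_calibration([5, 10, 20, 40, 80], [12.1, 24.8, 49.5, 101.2, 198.7])
print(f"LOD ~ {lod:.1f} mg/L, LOQ ~ {loq:.1f} mg/L")
```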
Abstract:
The aim of the work was to develop characterisation methods for predicting the attrition of limestone and fuel ash in the furnace of a circulating fluidised bed (CFB) boiler. Characterising and modelling solids behaviour makes it possible to refine the prediction of furnace heat transfer and ash split. The partly experimental characterisation methods are based on limestone attrition in a laboratory-scale fluidised quartz tube reactor and on ash attrition in a rotary mill. The characterisation methods take into account the varying operating conditions of commercial-scale CFB boilers. The methods were validated using balances measured from commercial-scale CFB boilers and modelled with a fraction-wise solids model. Despite the small number of validation balances, error analyses indicated that the characterisation methods are reasonable. Development and refinement of the characterisation methods will continue.
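A minimal sketch of the kind of fraction-wise comparison used when validating such methods against measured balances: relative errors per particle-size fraction between measured and modelled solids flows. The fraction boundaries and flow values below are hypothetical.

```python
def fractionwise_relative_error(measured, modelled):
    """Relative error per size fraction between measured and modelled mass flows."""
    return {frac: (modelled[frac] - flow) / flow for frac, flow in measured.items()}

# Hypothetical ash mass flows (kg/s) per particle-size fraction
measured = {"<45 um": 1.8, "45-125 um": 2.6, ">125 um": 0.9}
modelled = {"<45 um": 2.0, "45-125 um": 2.4, ">125 um": 1.1}
for frac, err in fractionwise_relative_error(measured, modelled).items():
    print(f"{frac}: {err:+.1%}")
```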
Abstract:
Validation and verification operations encounter various challenges in the product development process. Requirements for a faster development cycle place new demands on the component development process, and verification and validation usually represent the largest activities, consuming up to 40-50 % of R&D resources. This research studies validation and verification as part of the case company's component development process. The target is to define a framework that can be used to improve the evaluation and development of validation and verification capability in display module development projects. The definition and background of validation and verification are studied, together with theories of project management, systems, organisational learning and causality. The framework and key findings of the research are presented, and a feedback system based on the framework is defined and implemented at the case company. The research is divided into a theory part, conducted as a literature review, and an empirical part, conducted as a case study; constructive and design research methods are used. A framework for capability evaluation and development was defined and developed as a result of this research. A key finding is that a double-loop learning approach, combined with the validation and verification V+ model, enables the definition of a feedback reporting solution. As additional results, some minor changes to the validation and verification process were proposed. A few concerns are expressed about the validity and reliability of the study, the most important being the selected research method and the selected model itself: the final state can be normative, as the researcher may set the study results before the actual study and, in the initial state, describe expectations for the study. Finally, the reliability and validity of this work are discussed.
Abstract:
The aim of the present study was to demonstrate the wide applicability of the novel photoluminescent labels called upconverting phosphors (UCPs) in proximity-based bioanalytical assays. The exceptional features of the lanthanide-doped inorganic UCP compounds stem from their capability for photon upconversion resulting in anti-Stokes photoluminescence at visible wavelengths under near-infrared (NIR) excitation. Major limitations related to conventional photoluminescent labels are avoided, rendering the UCPs a competitive next-generation label technology. First, the background luminescence is minimized due to total elimination of autofluorescence. Consequently, improvements in detectability are expected. Second, at the long wavelengths (>600 nm) used for exciting and detecting the UCPs, the transmittance of sample matrixes is significantly greater in comparison with shorter wavelengths. Colored samples are no longer an obstacle to the luminescence measurement, and more flexibility is allowed even in homogeneous assay concepts, where the sample matrix remains present during the entire analysis procedure, including label detection. To transform a UCP particle into a biocompatible label suitable for bioanalytical assays, it must be colloidal in an aqueous environment and covered with biomolecules capable of recognizing the analyte molecule. At the beginning of this study, only UCP bulk material was available, and it was necessary to process the material to submicrometer-sized particles prior to use. Later, the ground UCPs, with irregular shape, wide size-distribution and heterogeneous luminescence properties, were substituted by a smaller-sized spherical UCP material. The surface functionalization of the UCPs was realized by producing a thin hydrophilic coating. Polymer adsorption on the UCP surface is a simple way to introduce functional groups for bioconjugation purposes, but possible stability issues encouraged us to optimize an optional silica-encapsulation method which produces a coating that is not detached in storage or assay conditions. An extremely thin monolayer around the UCPs was pursued due to their intended use as short-distance energy donors, and much attention was paid to controlling the thickness of the coating. The performance of the UCP technology was evaluated in three different homogeneous resonance energy transfer-based bioanalytical assays: a competitive ligand binding assay, a hybridization assay for nucleic acid detection and an enzyme activity assay. To complete the list, a competitive immunoassay has been published previously. Our systematic investigation showed that a nonradiative energy transfer mechanism is indeed involved, when a UCP and an acceptor fluorophore are brought into close proximity in aqueous suspension. This process is the basis for the above-mentioned homogeneous assays, in which the distance between the fluorescent species depends on a specific biomolecular binding event. According to the studies, the submicrometer-sized UCP labels allow versatile proximity-based bioanalysis with low detection limits (a low-nanomolar concentration for biotin, 0.01 U for benzonase enzyme, 0.35 nM for target DNA sequence).
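The distance dependence underlying these proximity-based assays is commonly described by the Förster relation E = 1/(1 + (r/R0)^6). The sketch below evaluates this relation for a hypothetical Förster radius; it is a generic illustration of the distance dependence, not a model fitted to the UCP data in the thesis.

```python
def energy_transfer_efficiency(r_nm: float, r0_nm: float) -> float:
    """Förster-type nonradiative energy-transfer efficiency versus donor-acceptor distance."""
    return 1.0 / (1.0 + (r_nm / r0_nm) ** 6)

# Efficiency falls off steeply around the Förster radius (hypothetical R0 = 5 nm)
for r in (2.5, 5.0, 7.5, 10.0):
    print(f"r = {r:4.1f} nm -> E = {energy_transfer_efficiency(r, 5.0):.3f}")
```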
Abstract:
In today's logistics environment, there is a tremendous need for accurate cost information and cost allocation. Companies searching for a proper solution often come across activity-based costing (ABC) or one of its variations, which use cost drivers to allocate the costs of activities to cost objects. The selection of appropriate cost drivers is essential for allocating costs accurately and reliably and for realising the benefits of the costing system. The purpose of this study is to validate the transportation cost drivers of a Finnish wholesaler company and ultimately to select the best possible driver alternatives for the company. The use of cost driver combinations as an alternative is also studied. The study is conducted as part of the case company's applied ABC project, using statistical research as the main research method supported by a theoretical, literature-based method. The main research tools featured in the study are simple and multiple regression analyses, which, together with a practicality analysis based on the literature and observations, form the basis for the proposed methods. The results suggest that the most appropriate cost driver alternatives are delivery drops and internal delivery weight. The use of cost driver combinations is not recommended, as it does not provide substantially better results while at the same time increasing measurement costs, complexity and the effort of use. The use of internal freight cost drivers is also questionable, as the results indicate a weakening trend in their cost allocation capability towards the end of the period. Therefore, more research on internal freight cost drivers should be conducted before taking them into use.
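A minimal sketch of the kind of regression-based driver validation described above: fitting transport costs against a candidate driver by least squares and comparing drivers by R². The figures are made-up illustrative data, not the case company's.

```python
import numpy as np

def driver_r_squared(driver_matrix, costs):
    """Fit costs = X.b + e by least squares and return R² as a driver-quality measure."""
    X = np.column_stack([np.ones(len(costs)), np.asarray(driver_matrix, float)])
    y = np.asarray(costs, float)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    fitted = X @ beta
    ss_res = np.sum((y - fitted) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Hypothetical weekly data: delivery drops vs. observed transport cost
drops = [[120], [95], [150], [80], [132]]
cost = [5400, 4300, 6600, 3700, 5900]
print(f"R² (delivery drops): {driver_r_squared(drops, cost):.3f}")
```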
Abstract:
A web service is a software system that provides a machine-processable interface to other machines over the network using different Internet protocols. Web services are increasingly used in industry to automate different tasks and to offer services to a wider audience. The REST architectural style aims at producing scalable and extensible web services using technologies that play well with the existing tools and infrastructure of the web. It provides a uniform set of operations that can be used to invoke a CRUD interface (create, retrieve, update and delete) of a web service. The stateless behavior of the service interface requires that every request to a resource be independent of the previous ones, which facilitates scalability. Automated systems, e.g., hotel reservation systems, provide advanced scenarios for stateful services that require a certain sequence of requests to be followed in order to fulfill the service goals. Designing and developing such services for advanced scenarios with REST constraints requires rigorous approaches capable of creating web services that can be trusted for their behavior; systems that can be trusted for their behavior can be termed dependable systems. This thesis presents an integrated design, analysis and validation approach that helps the service developer create dependable and stateful REST web services. The main contribution of this thesis is a novel model-driven methodology for designing behavioral REST web service interfaces and their compositions. The behavioral interfaces provide information on what methods can be invoked on a service and the pre- and post-conditions of these methods. The methodology uses the Unified Modeling Language (UML) as the modeling language; UML has a wide user base and mature tools that are continuously evolving. We have used UML class diagrams and UML state machine diagrams with additional design constraints to provide resource and behavioral models, respectively, for designing REST web service interfaces. These service design models serve as a specification document, and the information presented in them has manifold applications. The service design models also contain information about the time and domain requirements of the service, which helps in requirement traceability, an important part of our approach. Requirement traceability helps in capturing faults in the design models and other elements of the software development environment by tracing unfulfilled requirements of the service back and forth. Information about service actors is also included in the design models; it is required for authenticating service requests by authorized actors, since not all types of users have access to all resources. In addition, following our design approach, the service developer can ensure that the designed web service interfaces will be REST compliant. The second contribution of this thesis is the consistency analysis of the behavioral REST interfaces. To overcome the inconsistency problem and design errors in our service models, we have used semantic technologies. The REST interfaces are represented in the Web Ontology Language (OWL 2), so that they can be part of the semantic web. These interfaces are used with OWL 2 reasoners to check for unsatisfiable concepts, which result in implementations that fail. This work is fully automated thanks to the implemented translation tool and the existing OWL 2 reasoners.
The third contribution of this thesis is the verification and validation of REST web services. We have used model checking techniques with the UPPAAL model checker for this purpose. Timed automata are generated from the UML-based service design models with our transformation tool and verified for basic characteristics such as deadlock freedom, liveness, reachability and safety. The implementation of a web service is tested using a black-box testing approach. Test cases are generated from the UPPAAL timed automata and, using the online testing tool UPPAAL TRON, the service implementation is validated at runtime against its specifications. Requirement traceability is also addressed in our validation approach, with which we can see which service goals are met and trace unfulfilled service goals back to faults in the design models. A final contribution of the thesis is an implementation of behavioral REST interfaces and service monitors from the service design models. The partial code generation tool creates code skeletons of REST web services with method pre- and post-conditions. The preconditions constrain the user to invoke the stateful REST service under the right conditions, and the post-conditions constrain the service developer to implement the right functionality. The details of the methods can be manually inserted by the developer as required. We do not target complete automation because we focus only on the interface aspects of the web service. The applicability of the approach is demonstrated with a pedagogical example of a hotel room booking service and a relatively complex worked example of a holiday booking service taken from an industrial context. The former example presents a simple explanation of the approach, and the latter worked example shows how stateful and timed web services offering complex scenarios and involving other web services can be constructed using our approach.
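As a hedged illustration of the kind of interface skeleton with pre- and post-conditions described above (hand-written here, not output of the thesis's generation tool; the resource, state and method names are hypothetical), a minimal stateful booking resource might look like this:

```python
from enum import Enum, auto

class BookingState(Enum):
    CREATED = auto()
    CONFIRMED = auto()
    CANCELLED = auto()

class BookingResource:
    """Sketch of a stateful REST resource with pre- and post-condition guards."""

    def __init__(self):
        self.state = BookingState.CREATED

    def put_confirm(self):
        # precondition: only a freshly created booking may be confirmed
        if self.state is not BookingState.CREATED:
            return 409, "Conflict: booking is not in CREATED state"
        self.state = BookingState.CONFIRMED
        # postcondition: the resource must now be CONFIRMED
        assert self.state is BookingState.CONFIRMED
        return 200, "Booking confirmed"

    def delete(self):
        # precondition: a cancelled booking cannot be cancelled again
        if self.state is BookingState.CANCELLED:
            return 409, "Conflict: booking already cancelled"
        self.state = BookingState.CANCELLED
        return 200, "Booking cancelled"
```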
Abstract:
This thesis concentrates on the validation of the generic thermal hydraulic computer code TRACE against the challenges posed by the VVER-440 reactor type. The code's capability to model the VVER-440 geometry and the thermal hydraulic phenomena specific to this reactor design has been examined and demonstrated to be acceptable. The main challenge in VVER-440 thermal hydraulics appeared in the modelling of the horizontal steam generator; the major difficulty here is not in the code physics or numerics but in the formulation of a representative nodalization structure. Another VVER-440 speciality, the hot leg loop seals, challenges system codes in general, but proved readily representable. Computer code models have to be validated against experiments to achieve confidence in them. When a new computer code is to be used for nuclear power plant safety analysis, it must first be validated against a large variety of different experiments. The validation process has to cover both the code itself and the code input. Uncertainties of different natures are identified in the different phases of the validation procedure and can even be quantified. This thesis presents a novel approach to input model validation and uncertainty evaluation in the different stages of the computer code validation procedure. It also demonstrates that in safety analysis there are inevitably significant uncertainties that are not statistically quantifiable; they need to be, and can be, addressed by other, less simplistic means, ultimately relying on the competence of the analysts and the capability of the community to support the experimental verification of analytical assumptions. This approach essentially complements the commonly used uncertainty assessment methods, which are usually conducted using only statistical methods.
Abstract:
Coronary artery disease is an atherosclerotic disease that leads to narrowing of the coronary arteries, deteriorated myocardial blood flow and myocardial ischaemia. In acute myocardial infarction, a prolonged period of myocardial ischaemia leads to myocardial necrosis, and the necrotic myocardium is replaced with scar tissue. Myocardial infarction causes various changes in cardiac structure and function over time that result in "adverse remodelling". This remodelling may result in a progressive worsening of cardiac function and the development of chronic heart failure. In this thesis, we developed and validated three different large animal models of coronary artery disease, myocardial ischaemia and infarction for translational studies. In the first study, the coronary artery disease model combined induced diabetes and hypercholesterolemia. In the second study, myocardial ischaemia and infarction were caused by a surgical method, and in the third study by catheterisation. For model characterisation, we used non-invasive positron emission tomography (PET) methods to measure myocardial perfusion, oxidative metabolism and glucose utilisation. Additionally, cardiac function was measured by echocardiography and computed tomography. To study the metabolic changes that occur during atherosclerosis, the hypercholesterolemic and diabetic model was used with [18F]fluorodeoxyglucose ([18F]FDG) PET imaging. The coronary occlusion models were used to evaluate metabolic and structural changes in the heart and the cardioprotective effects of levosimendan during post-infarction cardiac remodelling. The large animal models were also used in testing novel radiopharmaceuticals for myocardial perfusion imaging. In the coronary artery disease model, we observed atherosclerotic lesions that were associated with focally increased [18F]FDG uptake. In the heart failure models, chronic myocardial infarction led to worsening of systolic function, cardiac remodelling and decreased efficiency of cardiac pumping function. Levosimendan therapy reduced post-infarction myocardial infarct size and improved cardiac function. The novel 68Ga-labeled radiopharmaceuticals tested in this study were not successful for the determination of myocardial blood flow. In conclusion, diabetes and hypercholesterolemia lead to the development of early-phase atherosclerotic lesions. Coronary artery occlusion produced considerable myocardial ischaemia and later infarction, followed by myocardial remodelling. The experimental models evaluated in these studies will enable further studies of disease mechanisms, new radiopharmaceuticals and interventions in coronary artery disease and heart failure.