951 results for Methods validation
Abstract:
A gas chromatographic method has been developed for the assay of fluvastatin sodium (FLU). FLU was silylated with N,O-bis(trimethylsilyl)trifluoroacetamide containing 1% trimethylchlorosilane at 90 ºC for 30 min and analysed on a DB-1 column by capillary gas chromatography with flame ionization detection. The method was validated. The assay was linear over the concentration range of 10.0 to 50.0 µg mL-1. The limit of detection and the limit of quantitation were 1.0 and 3.0 µg mL-1, respectively. The recoveries of the FLU derivatives were in the range of 99.25-99.80%. In inter-day and intra-day analyses, the relative standard deviation (%) and the relative mean error (%) were found to be between 0.20 and 0.80% and between -0.20 and 0.75%, respectively. The developed method was successfully applied to determine the FLU content in a tablet formulation. The results were statistically compared with those obtained by the official method, and no significant difference was found between the two methods. The method can therefore be recommended for the quality control assay of FLU in the pharmaceutical industry.
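The precision and accuracy figures quoted above (relative standard deviation and relative mean error) follow directly from replicate assay results. A minimal sketch of how such figures can be computed, using hypothetical replicate concentrations rather than the study's data:

```python
import statistics

def precision_accuracy(measured, nominal):
    """Relative standard deviation (%) and relative mean error (%)
    of replicate assay results against a nominal concentration."""
    mean = statistics.mean(measured)
    rsd = 100 * statistics.stdev(measured) / mean   # precision (RSD %)
    rme = 100 * (mean - nominal) / nominal          # accuracy / bias (RME %)
    return rsd, rme

# Hypothetical intra-day replicates (µg mL-1) at a nominal 30.0 µg mL-1 level
rsd, rme = precision_accuracy([30.1, 29.9, 30.2, 29.8, 30.1], 30.0)
print(f"RSD = {rsd:.2f}%, relative mean error = {rme:.2f}%")
```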
Abstract:
A dissolution test for the in vitro evaluation of tablet dosage forms containing 10 mg of rupatadine was developed and validated by RP-LC. A discriminatory dissolution method was established using the paddle apparatus at a stirring rate of 50 rpm with 900 mL of deaerated 0.01 M hydrochloric acid. The proposed method was validated, yielding acceptable results for the parameters evaluated, and was applied to the quality control analysis of rupatadine tablets and to the evaluation of the formulation during an accelerated stability study. Moreover, quantitative analyses were also performed to compare the applicability of the RP-LC and LC-MS/MS methods.
Abstract:
Three simple, sensitive, economical and reproducible spectrophotometric methods (A, B and C) are described for the determination of mesalamine in the pure drug as well as in tablet dosage forms. Method A is based on the reduction of tungstate and/or molybdate in Folin-Ciocalteu's reagent; method B is based on the reaction between the diazotized drug and α-naphthol; and method C is based on the reaction of the drug with vanillin in acidic medium. Under optimum conditions, mesalamine could be quantified in the concentration ranges of 1-30, 1-15 and 2-30 µg mL-1 by methods A, B and C, respectively. All the methods were applied to the determination of mesalamine in tablet dosage forms, and the results of the analyses were validated statistically.
Abstract:
Two analytical methods were validated for the determination of trichlorophenols, tetrachlorophenols and pentachlorophenol in drinking water. The limits of quantification were at least ten times lower than the maximum permissible levels set by Brazilian legislation, which are 200 ng mL-1 for 2,4,6-trichlorophenol and 9 ng mL-1 for pentachlorophenol. Chlorophenol levels were determined in tap water collected in the Municipality of Rio de Janeiro. 2,4,6-Trichlorophenol residues were detected in 36% of the samples, at concentrations varying from 0.008 to 0.238 ng mL-1. All other analytes were below the limit of quantification. The validated methods proved suitable for application in routine quality control.
Abstract:
A simple, precise, specific, repeatable and discriminating dissolution test for primaquine (PQ) matrix tablets was developed and validated according to ICH and FDA guidelines. Two UV assay methods were validated for the determination of PQ released in 0.1 M hydrochloric acid and water media. Both methods were linear (R² > 0.999), precise (R.S.D. < 1.87%) and accurate (97.65-99.97%). Dissolution efficiency (69-88%) and the equivalence of formulations (f2) were assessed in the different media and apparatuses tested (basket/100 rpm and paddle/50 rpm). The discriminating condition was 900 mL of aqueous medium, basket at 100 rpm and sampling times of 1, 4 and 8 h. The repeatability (R.S.D. < 2.71%) and intermediate precision (R.S.D. < 2.06%) of the dissolution method were satisfactory.
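For readers unfamiliar with the quantities reported above, the similarity factor f2 and the dissolution efficiency can be computed as in the following sketch; the profiles and sampling times are hypothetical, not the study's data:

```python
import math

def f2_similarity(reference, test):
    """Similarity factor f2 for two dissolution profiles sampled at the
    same time points (% dissolved); f2 >= 50 suggests equivalent profiles."""
    n = len(reference)
    mean_sq_diff = sum((r - t) ** 2 for r, t in zip(reference, test)) / n
    return 50 * math.log10(100 / math.sqrt(1 + mean_sq_diff))

def dissolution_efficiency(times, dissolved):
    """Dissolution efficiency (%): area under the dissolution curve
    (trapezoidal rule) relative to 100% dissolution over the same interval."""
    auc = sum((dissolved[i] + dissolved[i + 1]) / 2 * (times[i + 1] - times[i])
              for i in range(len(times) - 1))
    return 100 * auc / (100 * (times[-1] - times[0]))

# Hypothetical reference and test profiles sampled at 0, 1, 4 and 8 h
times = [0, 1, 4, 8]
ref, tst = [0, 35, 70, 92], [0, 38, 68, 90]
print(f"f2 = {f2_similarity(ref[1:], tst[1:]):.1f}, "
      f"DE = {dissolution_efficiency(times, tst):.1f}%")
```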
Abstract:
Nowadays, software testing and quality assurance are highly valued in the software development process. Software testing is not a single concrete discipline; it is a process of validation and verification that starts with the idea of the future product and ends with the end of the product's maintenance. The importance of software testing methods and tools that can be applied in different testing phases is strongly emphasized in industry. The initial objectives of this thesis were to provide a sufficient literature review of the different testing phases and, for each phase, to define a method that can be effectively used for improving software quality. The software testing phases chosen for study are: unit testing, integration testing, functional testing, system testing, acceptance testing and usability testing. The research showed that there are many software testing methods that can be applied in the different phases and that, in most cases, the choice of method should depend on the software type and its specification. For each phase, the thesis identifies a problem specific to that phase, suggests a method that can help to eliminate it, and describes that method in detail.
Abstract:
Machine learning provides tools for the automated construction of predictive models in data-intensive areas of engineering and science. The family of regularized kernel methods has in recent years become one of the mainstream approaches to machine learning, due to a number of advantages the methods share. The approach provides theoretically well-founded solutions to the problems of under- and overfitting, allows learning from structured data, and has been empirically demonstrated to yield high predictive performance on a wide range of application domains. Historically, the problems of classification and regression have gained the majority of attention in the field. In this thesis we focus on another type of learning problem: learning to rank. In learning to rank, the aim is to learn, from a set of past observations, a ranking function that can order new objects according to how well they match some underlying criterion of goodness. As an important special case of the setting, we recover the bipartite ranking problem, corresponding to maximizing the area under the ROC curve (AUC) in binary classification. Ranking applications appear in a large variety of settings; examples encountered in this thesis include document retrieval in web search, recommender systems, information extraction and automated parsing of natural language. We consider the pairwise approach to learning to rank, where ranking models are learned by minimizing the expected probability of ranking any two randomly drawn test examples incorrectly. The development of computationally efficient kernel methods based on this approach has in the past proven to be challenging. Moreover, it is not clear which techniques for estimating the predictive performance of learned models are the most reliable in the ranking setting, and how these techniques can be implemented efficiently. The contributions of this thesis are as follows. First, we develop RankRLS, a computationally efficient kernel method for learning to rank that is based on minimizing a regularized pairwise least-squares loss. In addition to training methods, we introduce a variety of algorithms for tasks such as model selection, multi-output learning and cross-validation, based on computational shortcuts from matrix algebra. Second, we improve the fastest known training method for the linear version of the RankSVM algorithm, which is one of the most well-established methods for learning to rank. Third, we study the combination of the empirical kernel map and reduced set approximation, which allows the large-scale training of kernel machines using linear solvers, and propose computationally efficient solutions for cross-validation when using this approach. Next, we explore the problem of reliable cross-validation when using AUC as a performance criterion, through an extensive simulation study. We demonstrate that the proposed leave-pair-out cross-validation approach leads to more reliable performance estimation than commonly used alternative approaches. Finally, we present a case study on applying machine learning to information extraction from biomedical literature, which combines several of the approaches considered in the thesis. The thesis is divided into two parts: Part I provides the background for the research work and summarizes the most central results, while Part II consists of the five original research articles that are the main contribution of this thesis.
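The leave-pair-out estimator mentioned above can be illustrated with a naive sketch; the thesis' own RankRLS machinery avoids retraining for every pair through matrix-algebra shortcuts, whereas the learner below is a placeholder (a linear ridge classifier from scikit-learn) chosen only for illustration:

```python
import numpy as np
from sklearn.linear_model import RidgeClassifier

def leave_pair_out_auc(X, y, make_model):
    """Leave-pair-out cross-validation estimate of AUC: for each
    positive-negative pair, train on the remaining examples and check
    whether the positive example is scored above the negative one."""
    pos, neg = np.where(y == 1)[0], np.where(y == 0)[0]
    correct = ties = 0
    for i in pos:
        for j in neg:
            mask = np.ones(len(y), dtype=bool)
            mask[[i, j]] = False                      # hold out the pair
            model = make_model().fit(X[mask], y[mask])
            s_i, s_j = model.decision_function(X[[i, j]])
            correct += s_i > s_j
            ties += s_i == s_j
    return (correct + 0.5 * ties) / (len(pos) * len(neg))

# Hypothetical usage with a linear ridge model as the scoring function:
# auc = leave_pair_out_auc(X, y, lambda: RidgeClassifier(alpha=1.0))
```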
Abstract:
The scarcity of long-term series of sediment-related variables has led watershed managers to apply mathematical models to simulate sediment fluxes. Because of the high effort required to install and maintain sedimentological gauges, tracers have been pointed out as an alternative for validating soil redistribution modelling. In this study, the 137Cs technique was used to assess the performance of the WASA-SED model at the Benguê watershed (933 km²) in the Brazilian semiarid region. Qualitatively, good agreement was found between the 137Cs technique and the WASA-SED model results. Quantitatively, however, large differences of up to two orders of magnitude were found between the two methods. Among the uncertainties inherent to the 137Cs technique, the definition of the reference inventory seems to be a major source of imprecision. In addition, estimations of water and sediment fluxes with mathematical models usually also present high uncertainty, contributing to the quantitative differences between the soil redistribution estimates of the two methods.
Abstract:
Objective: To develop and validate an instrument for measuring the acquisition of technical skills in conducting operations of increasing difficulty, for use in General Surgery Residency (GSR) programs. Methods: We built a surgical skills assessment tool containing 11 operations at increasing levels of difficulty. For instrument validation we used the face validity method. Through an electronic survey tool (SurveyMonkey®) we sent a questionnaire to Full and Emeritus members of the Brazilian College of Surgeons (CBC), all holders of the CBC Specialist Title. Results: Of the 307 questionnaires sent, we received 100 responses. For the analysis of the collected data we used Cronbach's alpha. We observed that, in general, the overall alpha presented values near or greater than 0.70, indicating good consistency for assessing the points of interest. Conclusion: The evaluation instrument built was validated and can be used as a method of assessing technical skill acquisition in General Surgery Residency programs in Brazil.
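Cronbach's alpha, used above as the consistency measure, can be computed from the respondent-by-item score matrix; a minimal sketch with hypothetical ratings (not the survey's data):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()   # sum of the item variances
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of the total scores
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical ratings: 5 respondents scoring 4 questionnaire items (1-5 scale)
ratings = [[4, 5, 4, 4], [3, 4, 3, 3], [5, 5, 4, 5], [2, 3, 2, 3], [4, 4, 4, 4]]
print(f"alpha = {cronbach_alpha(ratings):.2f}")   # values near or above 0.70 indicate good consistency
```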
Abstract:
The analysis of faecal glucocorticoid metabolites provides a powerful noninvasive tool for monitoring adrenocortical activity in wild animals. However, differences in the metabolism and excretion of these substances make validation obligatory for each species and sex investigated. Although maned wolves (Chrysocyon brachyurus) are the largest canids in South America, their behaviour and physiology are poorly known and they are at risk in the wild. Two methods for measuring glucocorticoid metabolites in maned wolves were validated: a radioimmunoassay and an enzyme immunoassay. An ACTH challenge was used to demonstrate that changes in adrenal function are reflected in faecal glucocorticoid metabolites. Our results suggest that both methods enable a reliable assessment of stress hormones in maned wolves while avoiding the short-term rises in glucocorticoid concentrations caused by handling and restraint. These methods can serve as a valuable tool in studies of stress and conservation in this wild species.
Abstract:
A web service is a software system that provides a machine-processable interface to other machines over a network using different Internet protocols. Web services are increasingly used in industry to automate different tasks and to offer services to a wider audience. The REST architectural style aims at producing scalable and extensible web services using technologies that play well with the existing tools and infrastructure of the web. It provides a uniform set of operations that can be used to invoke a CRUD interface (create, retrieve, update and delete) of a web service. The stateless behavior of the service interface requires that every request to a resource be independent of previous ones, which facilitates scalability. Automated systems, e.g. hotel reservation systems, provide advanced scenarios for stateful services that require a certain sequence of requests to be followed in order to fulfill the service goals. Designing and developing such services for advanced scenarios under REST constraints requires rigorous approaches capable of creating web services that can be trusted for their behavior. Systems that can be trusted for their behavior can be termed dependable systems. This thesis presents an integrated design, analysis and validation approach that helps the service developer create dependable, stateful REST web services. The main contribution of this thesis is a novel model-driven methodology for designing behavioral REST web service interfaces and their compositions. The behavioral interfaces provide information on which methods can be invoked on a service and on the pre- and post-conditions of these methods. The methodology uses the Unified Modeling Language (UML) as the modeling language, which has a wide user base and mature tools that are continuously evolving. We have used the UML class diagram and the UML state machine diagram, with additional design constraints, to provide the resource and behavioral models, respectively, for designing REST web service interfaces. These service design models serve as a specification document, and the information presented in them has manifold applications. The service design models also contain information about the time and domain requirements of the service, which supports requirement traceability, an important part of our approach. Requirement traceability helps in capturing faults in the design models and in other elements of the software development environment by tracing the unfulfilled requirements of the service back and forth. The design models also include information about service actors, which is required for authenticating service requests by authorized actors, since not all types of users have access to all resources. In addition, by following our design approach, the service developer can ensure that the designed web service interfaces are REST compliant. The second contribution of this thesis is the consistency analysis of behavioral REST interfaces. To overcome the inconsistency problem and design errors in our service models, we have used semantic technologies. The REST interfaces are represented in the Web Ontology Language, OWL 2, so that they can be part of the semantic web. These interfaces are used with OWL 2 reasoners to check for unsatisfiable concepts, which would result in implementations that fail. This work is fully automated thanks to the implemented translation tool and the existing OWL 2 reasoners.
The third contribution of this thesis is the verification and validation of REST web services. We have used model checking techniques with the UPPAAL model checker for this purpose. Timed automata are generated from the UML-based service design models with our transformation tool and are verified for basic properties such as deadlock freedom, liveness, reachability and safety. The implementation of a web service is tested using a black-box testing approach. Test cases are generated from the UPPAAL timed automata and, using the online testing tool UPPAAL TRON, the service implementation is validated at runtime against its specifications. Requirement traceability is also addressed in our validation approach, with which we can see which service goals are met and trace unfulfilled service goals back to faults in the design models. A final contribution of the thesis is the implementation of behavioral REST interfaces and service monitors from the service design models. The partial code generation tool creates code skeletons of REST web services with method pre- and post-conditions. The pre-conditions of the methods constrain the user to invoke the stateful REST service under the right conditions, and the post-conditions constrain the service developer to implement the right functionality. The details of the methods can be inserted manually by the developer as required. We do not target complete automation because we focus only on the interface aspects of the web service. The applicability of the approach is demonstrated with a pedagogical example of a hotel room booking service and with a relatively complex worked example of a holiday booking service taken from an industrial context. The former example presents a simple explanation of the approach, and the latter shows how stateful and timed web services offering complex scenarios and involving other web services can be constructed using our approach.
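The generated skeletons themselves are not shown in the abstract; the following is only an illustrative Python sketch of what a stateful resource with method pre- and post-conditions might look like (the booking resource, its states and its method names are invented for this example and are not the thesis' generated output):

```python
class BookingError(Exception):
    """Raised when a request violates the behavioral interface of the service."""

class RoomBooking:
    """Illustrative sketch of a stateful booking resource whose methods
    enforce pre- and post-conditions."""

    def __init__(self):
        self.state = "created"

    def confirm(self, payment_ok: bool):
        # Precondition: only a freshly created booking with valid payment may be confirmed.
        if self.state != "created" or not payment_ok:
            raise BookingError(f"confirm not allowed in state '{self.state}'")
        # ... the developer fills in the actual business logic here ...
        self.state = "confirmed"
        # Postcondition: the resource must now expose the confirmed state.
        assert self.state == "confirmed"

    def cancel(self):
        # Precondition: a cancelled booking cannot be cancelled again.
        if self.state == "cancelled":
            raise BookingError("cancel not allowed: booking already cancelled")
        self.state = "cancelled"
        # Postcondition: the resource must now expose the cancelled state.
        assert self.state == "cancelled"
```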
Abstract:
This thesis concentrates on the validation of the generic thermal hydraulic computer code TRACE against the challenges of the VVER-440 reactor type. The code's capability to model the VVER-440 geometry and the thermal hydraulic phenomena specific to this reactor design has been examined and demonstrated to be acceptable. The main challenge in VVER-440 thermal hydraulics appeared in the modelling of the horizontal steam generator; the major difficulty here lies not in the code physics or numerics but in the formulation of a representative nodalization structure. Another VVER-440 speciality, the hot leg loop seals, challenges system codes functionally in general, but proved readily representable. Computer code models have to be validated against experiments to achieve confidence in them. When a new computer code is to be used for nuclear power plant safety analysis, it must first be validated against a large variety of different experiments. The validation process has to cover both the code itself and the code input. Uncertainties of different natures are identified in the different phases of the validation procedure and can even be quantified. This thesis presents a novel approach to input model validation and uncertainty evaluation in the different stages of the computer code validation procedure. The thesis also demonstrates that in safety analysis there are inevitably significant uncertainties that are not statistically quantifiable; they need to be, and can be, addressed by other, less simplistic means, ultimately relying on the competence of the analysts and the capability of the community to support the experimental verification of analytical assumptions. This method essentially complements the commonly used uncertainty assessment methods, which are usually conducted using only statistical means.
Abstract:
Coronary artery disease is an atherosclerotic disease that leads to the narrowing of coronary arteries, deteriorated myocardial blood flow and myocardial ischaemia. In acute myocardial infarction, a prolonged period of myocardial ischaemia leads to myocardial necrosis, and the necrotic myocardium is replaced with scar tissue. Myocardial infarction results in various changes in cardiac structure and function over time, referred to as "adverse remodelling". This remodelling may result in a progressive worsening of cardiac function and the development of chronic heart failure. In this thesis, we developed and validated three different large animal models of coronary artery disease, myocardial ischaemia and infarction for translational studies. In the first study, the coronary artery disease model combined induced diabetes and hypercholesterolemia. In the second study, myocardial ischaemia and infarction were induced by a surgical method, and in the third study by catheterisation. For model characterisation, we used non-invasive positron emission tomography (PET) methods to measure myocardial perfusion, oxidative metabolism and glucose utilisation. Additionally, cardiac function was measured by echocardiography and computed tomography. To study the metabolic changes that occur during atherosclerosis, a hypercholesterolemic and diabetic model was used with [18F]fluorodeoxyglucose ([18F]FDG) PET imaging. Coronary occlusion models were used to evaluate metabolic and structural changes in the heart and the cardioprotective effects of levosimendan during post-infarction cardiac remodelling. The large animal models were also used in the testing of novel radiopharmaceuticals for myocardial perfusion imaging. In the coronary artery disease model, we observed atherosclerotic lesions that were associated with focally increased [18F]FDG uptake. In the heart failure models, chronic myocardial infarction led to a worsening of systolic function, cardiac remodelling and a decreased efficiency of the cardiac pumping function. Levosimendan therapy reduced post-infarction myocardial infarct size and improved cardiac function. The novel 68Ga-labeled radiopharmaceuticals tested in this study were not successful for the determination of myocardial blood flow. In conclusion, diabetes and hypercholesterolemia lead to the development of early-phase atherosclerotic lesions. Coronary artery occlusion produced considerable myocardial ischaemia and later infarction, followed by myocardial remodelling. The experimental models evaluated in these studies will enable further studies of disease mechanisms, new radiopharmaceuticals and interventions in coronary artery disease and heart failure.
Abstract:
The aim of this thesis is to propose a novel control method for teleoperated electrohydraulic servo systems that implements a reliable haptic sense in the interaction between the human and the manipulator, and ideal position control in the interaction between the manipulator and the task environment. The proposed method has the characteristics of a universal technique that is independent of the actual control algorithm and can be applied together with other suitable control methods as a real-time control strategy. The motivation for developing this control method is the need for a reliable real-time controller for teleoperated electrohydraulic servo systems that provides highly accurate position control based on joystick inputs with haptic capabilities. The contribution of the research is that the proposed control method combines a directed random search method with a real-time simulation to develop an intelligent controller in which each generation of parameters is tested on-line by the real-time simulator before being applied to the real process. The controller was evaluated on a hydraulic position servo system. The simulator of the hydraulic system was built based on the Markov chain Monte Carlo (MCMC) method. A Particle Swarm Optimization algorithm combined with the foraging behaviour of E. coli bacteria was utilized as the directed random search engine. The control strategy allows the operator to be plugged into the work environment dynamically and kinetically, which helps to ensure that the system has a haptic sense with high stability, without abstracting away the dynamics of the hydraulic system. The new control algorithm provides asymptotically exact tracking of both the position and the contact force. In addition, this research proposes a novel method for the re-calibration of multi-axis force/torque sensors. The method offers several improvements over traditional methods: it can be used without dismantling the sensor from its application, it requires a smaller number of standard loads for calibration, and it is more cost-efficient and faster than traditional calibration methods. The proposed method was developed in response to re-calibration issues with the force sensors utilized in teleoperated systems, where the aim was to avoid dismantling the sensors from their applications for calibration. A major complication with many manipulators is the difficulty of accessing them when they operate inside an inaccessible environment, especially if that environment is harsh, such as a radioactive area. The proposed technique is based on design-of-experiments methodology. It has been successfully applied to different force/torque sensors, and this research presents an experimental validation of the calibration method on one of the force sensors to which it has been applied.
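As an illustration of the "directed random search tested against a simulator" idea described above, the sketch below shows a plain particle swarm optimizer over controller parameters; the bacterial-foraging hybridization and the actual hydraulic simulator of the thesis are not reproduced, and the cost function and bounds are placeholders:

```python
import numpy as np

def pso_tune(cost, bounds, n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain particle swarm optimization over a box of controller parameters.
    `cost` is assumed to score a candidate against a simulator before the
    parameters are ever applied to the real process."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    x = rng.uniform(lo, hi, size=(n_particles, len(lo)))   # particle positions
    v = np.zeros_like(x)                                    # particle velocities
    pbest, pbest_cost = x.copy(), np.array([cost(p) for p in x])
    gbest = pbest[pbest_cost.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        costs = np.array([cost(p) for p in x])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = x[improved], costs[improved]
        gbest = pbest[pbest_cost.argmin()].copy()
    return gbest

# Hypothetical use: tune PID gains against a simulated tracking-error cost
# best_gains = pso_tune(simulated_tracking_error, bounds=[(0, 50), (0, 10), (0, 5)])
```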
Abstract:
Decaffeinated coffee accounts for 10 percent of coffee sales worldwide; it is preferred by consumers who wish to avoid, or are sensitive to, the effects of caffeine. This article presents an analytical comparison of capillary electrophoresis (CE) and high performance liquid chromatography (HPLC) methods for residual caffeine quantification in decaffeinated coffee, in terms of validation parameters, costs, analysis time, composition and treatment of the residues generated, and caffeine quantification in 20 commercial samples. Both methods showed suitable validation parameters, and the caffeine contents determined by the two methods did not differ statistically. The main advantage of the HPLC method was its 42-fold lower detection limit. Nevertheless, the CE detection limit was still 115-fold lower than the limit allowed by Brazilian law. The CE analyses were 30% faster, the reagent costs were 76.5-fold lower, and the volume of residues generated was 33-fold lower. Therefore, the CE method proved to be a valuable analytical tool for this type of analysis.