968 results for Method validation
Abstract:
This study describes a simple, fast and reproducible RP-HPLC-UV method, in a gradient system, for the quantification of reserpine in Rauvolfia sellowii stem bark. The analyses were carried out on a C18 column with a water-acetonitrile mobile phase; separations were completed in 10 min at a flow rate of 1.0 mL min-1, 25 °C, with detection at 268 nm. The validation data showed that the method was specific, accurate, precise and robust. Results were linear over a range of 0.625-40.0 μg mL-1, and the mean recovery was 95.1%. The amount of reserpine found in the dried stem bark was 0.01% (m/m).
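A minimal sketch of the calibration arithmetic behind such an HPLC-UV quantification, assuming hypothetical peak areas (the standard concentrations match the reported linear range; everything else is illustrative, not the study's data):

```python
import numpy as np

# Hypothetical calibration data: reserpine standards (ug/mL) vs. HPLC-UV peak areas.
std_conc = np.array([0.625, 1.25, 2.5, 5.0, 10.0, 20.0, 40.0])
peak_area = np.array([31.2, 62.9, 124.8, 251.0, 499.5, 1003.1, 1998.4])

# Least-squares calibration line: area = slope * conc + intercept.
slope, intercept = np.polyfit(std_conc, peak_area, 1)
r = np.corrcoef(std_conc, peak_area)[0, 1]

# Back-calculate the concentration of a sample extract from its peak area.
sample_area = 310.0
sample_conc = (sample_area - intercept) / slope  # ug/mL in the injected extract

# Mean recovery from a spiked sample: found / added * 100
# (95.1% is the figure reported for this method).
added, found = 10.0, 9.51
recovery = found / added * 100

print(f"r = {r:.4f}, sample = {sample_conc:.2f} ug/mL, recovery = {recovery:.1f}%")
```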
Abstract:
Caesalpinia peltophoroides is a domesticated tree found in Brazil. An analytical method was needed to determine the content of total polyphenols (TP) in this herbal drug. The pre-analytical method was standardized for analysis time, wavelength, and the most suitable reference standard; the optimum conditions were 30 min, 760 nm, and pyrogallol, respectively. Under these conditions, the validated UV/Vis spectrophotometric method proved reliable for TP determination in the crude extract and semipurified fractions from C. peltophoroides. Standardization is required for every herbal drug, and this method proved to be linear, precise, accurate, reproducible, robust, and easy to perform.
Abstract:
To support topical delivery studies of glycoalkaloids, an HPLC-UV analytical method was developed and validated for the determination of solasonine (SN) and solamargine (SM) in different skin layers, as well as in a topical formulation. The method was linear within the ranges 0.86 to 990.00 µg/mL for SN and 1.74 to 1000.00 µg/mL for SM (r = 0.9996). Moreover, the recoveries for both glycoalkaloids were higher than 88.94% from skin samples and 93.23% from the topical formulation. The method developed is reliable and suitable for topical delivery skin studies and for determining the content of SN and SM in topical formulations.
Abstract:
A simple, precise, specific, repeatable and discriminating dissolution test for primaquine (PQ) matrix tablets was developed and validated according to ICH and FDA guidelines. Two UV assay methods were validated for determination of PQ released in 0.1 M hydrochloric acid and in water media. Both methods were linear (R² > 0.999), precise (R.S.D. < 1.87%) and accurate (97.65-99.97%). Dissolution efficiency (69-88%) and equivalence of formulations (f2) were assessed in the different media and apparatuses tested (basket/100 rpm and paddle/50 rpm). The discriminating condition was 900 mL of aqueous medium, basket at 100 rpm, and sampling times at 1, 4 and 8 h. Repeatability (R.S.D. < 2.71%) and intermediate precision (R.S.D. < 2.06%) of the dissolution method were satisfactory.
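The f2 equivalence factor and dissolution efficiency used above are standard, well-defined quantities; the sketch below computes both from hypothetical release profiles sampled at the reported 1, 4 and 8 h time points:

```python
import numpy as np

def f2_similarity(ref, test):
    """FDA/EMA similarity factor f2; profiles with f2 >= 50 are considered equivalent."""
    ref, test = np.asarray(ref, float), np.asarray(test, float)
    msd = np.mean((ref - test) ** 2)  # mean squared difference across time points
    return 50 * np.log10(100 / np.sqrt(1 + msd))

def dissolution_efficiency(t, released):
    """DE (%): area under the release curve relative to the 100% rectangle."""
    t, released = np.asarray(t, float), np.asarray(released, float)
    return np.trapz(released, t) / (100 * (t[-1] - t[0])) * 100

# Hypothetical % released at the reported sampling times (1, 4 and 8 h).
t = [1, 4, 8]
reference = [30, 65, 90]
test = [28, 65, 88]
print(f"f2 = {f2_similarity(reference, test):.1f}")   # >= 50 -> equivalent profiles
print(f"DE = {dissolution_efficiency(t, reference):.1f}%")
```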
Abstract:
A statistical mixture-design technique was used to study the effects of different solvents and their mixtures on the yield, total polyphenol content, and antioxidant capacity of the crude extracts from the bark of Schinus terebinthifolius Raddi (Anacardiaceae). The experimental results and their response-surface models showed that ternary mixtures with equal portions of all the three solvents (water, ethanol and acetone) were better than the binary mixtures in generating crude extracts with the highest yield (22.04 ± 0.48%), total polyphenol content (29.39 ± 0.39%), and antioxidant capacity (6.38 ± 0.21). An analytical method was developed and validated for the determination of total polyphenols in the extracts. Optimal conditions for the various parameters in this analytical method, namely, the time for the chromophoric reaction to stabilize, wavelength of the absorption maxima to be monitored, the reference standard and the concentration of sodium carbonate were determined to be 5 min, 780 nm, pyrogallol, and 14.06% w v-1, respectively. UV-Vis spectrophotometric monitoring of the reaction under these conditions proved the method to be linear, specific, precise, accurate, reproducible, robust, and easy to perform.
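A hedged sketch of the kind of mixture-design fit described above, using a standard Scheffé quadratic model over a simplex-centroid design for water/ethanol/acetone; the design points are canonical, but the yield values are hypothetical:

```python
import numpy as np

# Simplex-centroid design for water (x1), ethanol (x2), acetone (x3):
# pure solvents, binary 1:1 blends, and the ternary 1:1:1 centroid.
X = np.array([
    [1, 0, 0], [0, 1, 0], [0, 0, 1],
    [.5, .5, 0], [.5, 0, .5], [0, .5, .5],
    [1/3, 1/3, 1/3],
])
# Hypothetical extraction yields (%) at each design point.
y = np.array([12.1, 15.4, 14.0, 17.8, 16.9, 18.5, 22.0])

# Scheffe quadratic model:
# y = b1*x1 + b2*x2 + b3*x3 + b12*x1*x2 + b13*x1*x3 + b23*x2*x3
x1, x2, x3 = X.T
M = np.column_stack([x1, x2, x3, x1*x2, x1*x3, x2*x3])
coef, *_ = np.linalg.lstsq(M, y, rcond=None)

# Predicted response at the ternary centroid (equal portions of all solvents).
c = np.array([1/3, 1/3, 1/3])
pred = coef @ np.array([c[0], c[1], c[2], c[0]*c[1], c[0]*c[2], c[1]*c[2]])
print(coef.round(2), f"centroid prediction = {pred:.1f}%")
```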
Abstract:
This article describes the isolation and identification of flavonoids in the hydroethanolic extract of the aerial parts of Tonina fluviatilis and the evaluation of their antiradical activity. An HPLC-DAD method was developed and validated for detecting and quantifying flavonoids in hydroethanolic extracts. The flavonoids identified and quantified in the extract were 6,7-dimethoxyquercetin-3-O-β-D-glucopyranoside (1), 6-hydroxy-7-methoxyquercetin-3-O-β-D-glucopyranoside (2), and 6-methoxyquercetin-3-O-β-D-glucopyranoside (3). The developed method presented good validation parameters, showing that the results are consistent and can be used to ensure reliable quantification of these constituents in the extracts. Compounds 2 and 3 showed strong antiradical activity when compared with the positive controls (quercetin and gallic acid).
Abstract:
Although several methods have been reported in the literature for determining olanzapine in biological fluids, all of them require large plasma volumes. Therefore, the purpose of this work was to develop an LC-MS/MS method using a small plasma volume (0.1 mL) for application in a preclinical pharmacokinetic investigation. The method was linear over the concentration range of 10-1000 ng mL-1. Extraction recoveries, stability, and validation parameters were evaluated, and the results were within the acceptable limits of international guidelines. A significant decrease in clearance led to a significant 2.26-fold increase in the AUC0-6h of olanzapine-loaded lipid-core nanocapsules compared with free olanzapine.
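As an illustration of the non-compartmental arithmetic implied here, the sketch below computes AUC0-6h by the trapezoidal rule and an apparent clearance from a hypothetical concentration-time profile (the dose and concentrations are not from the study):

```python
import numpy as np

# Hypothetical plasma concentration-time profile (0-6 h) after a single dose.
t = np.array([0, 0.25, 0.5, 1, 2, 4, 6])            # h
conc = np.array([0, 180, 420, 610, 480, 210, 90])   # ng/mL

# Non-compartmental AUC(0-6h) by the linear trapezoidal rule.
auc_0_6 = np.trapz(conc, t)  # ng*h/mL

# Apparent clearance CL/F = Dose / AUC: a lower clearance raises AUC
# proportionally, consistent with the reported 2.26-fold AUC increase.
dose_ng = 1_000_000  # hypothetical 1 mg dose expressed in ng
cl = dose_ng / auc_0_6  # mL/h
print(f"AUC(0-6h) = {auc_0_6:.0f} ng*h/mL, CL/F = {cl:.0f} mL/h")
```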
Abstract:
The quantitative structure-property relationship (QSPR) for the boiling point (Tb) of polychlorinated dibenzo-p-dioxins and polychlorinated dibenzofurans (PCDD/Fs) was investigated. The molecular distance-edge vector (MDEV) index was used as the structural descriptor, and the quantitative relationship between the MDEV index and Tb was modeled using multivariate linear regression (MLR) and an artificial neural network (ANN). Leave-one-out cross-validation and external validation were carried out to assess the prediction performance of the developed models. For the MLR method, the prediction root mean square relative errors (RMSRE) of leave-one-out cross-validation and external validation were 1.77 and 1.23, respectively; for the ANN method, they were 1.65 and 1.16. A quantitative relationship between the MDEV index and the Tb of PCDD/Fs was thus demonstrated, and both MLR and ANN are practicable for modeling it. The developed MLR and ANN models can be used to predict the Tb of PCDD/Fs, and the Tb of each PCDD/F was predicted accordingly.
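A minimal sketch of the leave-one-out cross-validation and RMSRE metric used to assess such models; the MDEV descriptor values and boiling points below are hypothetical stand-ins for the real PCDD/F data set:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

# Hypothetical MDEV descriptor values and boiling points (C) for a few congeners.
X = np.array([[1.2], [1.8], [2.4], [3.1], [3.9], [4.6], [5.2], [6.0]])
tb = np.array([420.0, 447.0, 465.0, 489.0, 510.0, 528.0, 537.0, 551.0])

# Leave-one-out cross-validation: each compound is predicted by a model
# trained on all the other compounds.
pred = cross_val_predict(LinearRegression(), X, tb, cv=LeaveOneOut())

# Root mean square relative error (%), the performance metric used in the study.
rmsre = np.sqrt(np.mean(((pred - tb) / tb) ** 2)) * 100
print(f"LOO-CV RMSRE = {rmsre:.2f}%")
```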
Abstract:
Objective: To develop and validate an instrument for measuring the acquisition of technical skills in conducting operations of increasing difficulty, for use in General Surgery Residency (GSR) programs. Methods: We built a surgical skills assessment tool containing 11 operations at increasing levels of difficulty. For instrument validation we used the face validity method. Through an electronic survey tool (SurveyMonkey®) we sent a questionnaire to Full and Emeritus members of the Brazilian College of Surgeons (CBC), all holders of the CBC Specialist Title. Results: Of the 307 questionnaires sent, we received 100 responses. For the analysis of the collected data we used Cronbach's alpha test. In general, the overall alpha was near or above 0.70, indicating good internal consistency for the points of interest assessed. Conclusion: The evaluation instrument built was validated and can be used as a method of assessing technical skill acquisition in General Surgery Residency programs in Brazil.
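Cronbach's alpha has a standard definition, alpha = k/(k-1) * (1 - sum of item variances / variance of the total score); a small sketch with hypothetical questionnaire ratings:

```python
import numpy as np

def cronbach_alpha(scores):
    """scores: respondents x items matrix of questionnaire ratings."""
    scores = np.asarray(scores, float)
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of each respondent's total
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical responses: 5 surgeons rating 4 items on a 1-5 scale.
ratings = [
    [4, 5, 4, 4],
    [3, 4, 3, 3],
    [5, 5, 4, 5],
    [2, 3, 2, 3],
    [4, 4, 4, 4],
]
print(f"alpha = {cronbach_alpha(ratings):.2f}")  # >= 0.70 suggests good consistency
```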
Abstract:
The demand for more efficient manufacturing processes has been increasing in the last few years. The cold forging process is a possible answer, because it allows the production of parts with a good surface finish and good mechanical properties. Nevertheless, cold forming sequence design is very empirical and based on the designer's experience. Computational modeling of each forming stage by the finite element method can make the sequence design faster and more efficient, reducing the use of conventional trial-and-error methods. In this study, a commercial general-purpose finite element package, ANSYS, was used to model a forming operation. Models were developed to simulate the ring compression test and a basic forming operation (upsetting) that appears in most cold forging part sequences; the simulated upsetting operation is one stage of the manufacturing process for automotive starter parts. Experiments were carried out to obtain the stress-strain curve of the material, the material flow during the simulated stage, and the required forming force; these results served as numerical model input data and as validation of the model results. The comparison between experimental and numerical results confirms the potential of the developed methodology for die filling prediction.
Abstract:
In today's logistics environment, there is a tremendous need for accurate cost information and cost allocation. Companies searching for a proper solution often come across activity-based costing (ABC) or one of its variations, which utilize cost drivers to allocate the costs of activities to cost objects. To allocate costs accurately and reliably, the selection of appropriate cost drivers is essential for realizing the benefits of the costing system. The purpose of this study is to validate the transportation cost drivers of a Finnish wholesaler company and ultimately select the best possible driver alternatives for the company; the use of cost driver combinations is also studied. The study was conducted as part of the case company's applied ABC project, with statistical research as the main method, supported by a theoretical, literature-based method. The main research tools are simple and multiple regression analyses, which, together with a practicality analysis based on the literature and observations, form the basis for the advanced methods. The results suggest that the most appropriate cost driver alternatives are delivery drops and internal delivery weight. The use of cost driver combinations is not recommended, as they do not provide substantially better results while increasing measurement costs, complexity and workload at the same time. The use of internal freight cost drivers is also questionable, as the results indicate a weakening trend in their cost allocation capabilities towards the end of the period; more research on internal freight cost drivers should therefore be conducted before taking them into use.
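A hedged sketch of the regression comparison described: fit each candidate driver against transportation cost, compare R², and try a multiple-regression combination. All figures below are hypothetical, not the case company's data:

```python
import numpy as np

# Hypothetical monthly data: candidate cost drivers vs. transportation cost (EUR).
delivery_drops = np.array([120, 150, 90, 200, 170, 140, 160, 110])
delivery_weight = np.array([3.1, 4.0, 2.2, 5.6, 4.3, 3.5, 4.1, 2.9])  # tonnes
cost = np.array([4800, 6100, 3500, 8200, 6700, 5600, 6400, 4300])

def r_squared(x, y):
    """Coefficient of determination of a simple linear regression of y on x."""
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    return 1 - resid.var() / y.var()

# A driver that explains more of the cost variation allocates more accurately.
for name, x in [("delivery drops", delivery_drops),
                ("delivery weight", delivery_weight)]:
    print(f"{name}: R^2 = {r_squared(x, cost):.3f}")

# Multiple regression with both drivers (a "cost driver combination").
M = np.column_stack([delivery_drops, delivery_weight, np.ones_like(cost)])
coef, *_ = np.linalg.lstsq(M, cost, rcond=None)
pred = M @ coef
print(f"combination: R^2 = {1 - (cost - pred).var() / cost.var():.3f}")
```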
Abstract:
The objective of this study was to optimize and validate a solid-liquid extraction (ESL) technique for the determination of picloram residues in soil samples. At the optimization stage, the optimal conditions for extraction of soil samples were determined using univariate analysis; the soil/solution ratio, type and time of agitation, and ionic strength and pH of the extraction solution were evaluated. Based on the optimized parameters, the following extraction and analysis method was developed: weigh 2.00 g of soil, dried and passed through a 2.0 mm mesh sieve; add 20.0 mL of 0.5 mol L-1 KCl; vortex the bottle for 10 seconds to form a suspension and adjust it to pH 7.00 with 0.1 mol L-1 KOH; homogenize in a shaker for 60 minutes and then let the system stand for 10 minutes; centrifuge the bottles for 10 minutes at 3,500 rpm. After the soil particles have settled and the supernatant extract has been cleaned, an aliquot is withdrawn and analyzed by high performance liquid chromatography. The optimized method was validated by determining selectivity, linearity, detection and quantification limits, precision and accuracy. The ESL methodology was efficient for the analysis of residues of the pesticide studied, with recovery percentages above 90%. The limits of detection and quantification were 20.0 and 66.0 mg kg-1 soil for the PVA soil, and 40.0 and 132.0 mg kg-1 soil for the VLA soil. The coefficients of variation (CV) were 2.32 and 2.69 for the PVA and TH soils, respectively. The methodology resulted in low organic solvent consumption and cleaner extracts, and no purification steps were required before chromatographic analysis. The parameters evaluated in the validation process indicate that the ESL methodology is efficient for the extraction of picloram residues in soils, with low limits of detection and quantification.
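For reference, detection and quantification limits of this kind are commonly estimated from the calibration line via the ICH Q2(R1) expressions LOD = 3.3 sigma/S and LOQ = 10 sigma/S (sigma: residual standard deviation; S: slope); a sketch with hypothetical calibration data, not the study's:

```python
import numpy as np

# Hypothetical calibration of picloram in soil extract: concentration vs. peak area.
conc = np.array([0.1, 0.25, 0.5, 1.0, 2.5, 5.0])   # mg L-1
area = np.array([10.8, 26.1, 51.9, 104.5, 259.0, 521.3])

slope, intercept = np.polyfit(conc, area, 1)
resid = area - (slope * conc + intercept)
sigma = resid.std(ddof=2)  # residual standard deviation of the calibration line

# ICH Q2(R1) calibration-curve estimates.
lod = 3.3 * sigma / slope
loq = 10 * sigma / slope
print(f"LOD = {lod:.3f} mg L-1, LOQ = {loq:.3f} mg L-1")

# Recovery check on a spiked blank soil: found / added * 100
# (the method reported recoveries above 90%).
added, found = 1.00, 0.93
print(f"recovery = {found / added * 100:.0f}%")
```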
Abstract:
A web service is a software system that provides a machine-processable interface to other machines over the network using different Internet protocols. Web services are increasingly used in industry to automate different tasks and offer services to a wider audience. The REST architectural style aims at producing scalable and extensible web services using technologies that play well with the existing tools and infrastructure of the web. It provides a uniform set of operations that can be used to invoke a CRUD interface (create, retrieve, update and delete) of a web service. The stateless behavior of the service interface requires that every request to a resource be independent of the previous ones, which facilitates scalability. Automated systems, e.g., hotel reservation systems, provide advanced scenarios for stateful services that require a certain sequence of requests to be followed in order to fulfill the service goals. Designing and developing such services for advanced scenarios under REST constraints requires rigorous approaches capable of creating web services that can be trusted for their behavior; systems that can be trusted for their behavior can be termed dependable systems. This thesis presents an integrated design, analysis and validation approach that helps the service developer create dependable, stateful REST web services. The main contribution of this thesis is a novel model-driven methodology to design behavioral REST web service interfaces and their compositions. The behavioral interfaces provide information on which methods can be invoked on a service and on the pre- and post-conditions of these methods. The methodology uses the Unified Modeling Language (UML) as the modeling language, which has a wide user base and mature, continuously evolving tools. We use the UML class diagram and the UML state machine diagram, with additional design constraints, to provide the resource and behavioral models, respectively, for designing REST web service interfaces. These service design models serve as a specification document, and the information presented in them has manifold applications. The design models also contain information about the time and domain requirements of the service, which supports requirement traceability, an important part of our approach: requirement traceability helps capture faults in the design models and in other elements of the software development environment by tracing unfulfilled service requirements back and forth. Information about service actors is also included in the design models; it is required for authenticating service requests by authorized actors, since not all types of users have access to all resources. In addition, by following our design approach, the service developer can ensure that the designed web service interfaces are REST compliant. The second contribution of this thesis is the consistency analysis of behavioral REST interfaces. To overcome inconsistency problems and design errors in our service models, we use semantic technologies: the REST interfaces are represented in the Web Ontology Language, OWL 2, and can thus be part of the semantic web. These interfaces are checked with OWL 2 reasoners for unsatisfiable concepts, which would result in implementations that fail. This work is fully automated thanks to the implemented translation tool and the existing OWL 2 reasoners.
The third contribution of this thesis is the verification and validation of REST web services. We use model checking techniques with the UPPAAL model checker for this purpose. Timed automata are generated from the UML-based service design models with our transformation tool and are verified for basic characteristics such as deadlock freedom, liveness, reachability and safety. The implementation of a web service is tested using a black-box testing approach: test cases are generated from the UPPAAL timed automata and, using the online testing tool UPPAAL TRON, the service implementation is validated at runtime against its specifications. Requirement traceability is also addressed in our validation approach, with which we can see which service goals are met and trace unfulfilled service goals back to faults in the design models. A final contribution of the thesis is the implementation of behavioral REST interfaces and service monitors from the service design models. The partial code generation tool creates code skeletons of REST web services with method pre- and post-conditions: the preconditions constrain the user to invoke the stateful REST service under the right conditions, and the postconditions constrain the service developer to implement the right functionality. The details of the methods can be inserted manually by the developer as required; we do not target complete automation because we focus only on the interface aspects of the web service. The applicability of the approach is demonstrated with a pedagogical example of a hotel room booking service and a relatively complex worked example of a holiday booking service taken from an industrial context. The former presents a simple explanation of the approach, and the latter shows how stateful and timed web services offering complex scenarios and involving other web services can be constructed using our approach.
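A minimal, hypothetical sketch of what such a skeleton with pre- and post-conditions could look like (plain Python assertions standing in for the generated contract checks; the resource and states echo the hotel booking example, but the code is illustrative, not the thesis's generated output):

```python
from enum import Enum

class BookingState(Enum):
    AVAILABLE = "available"
    RESERVED = "reserved"
    PAID = "paid"

class RoomBooking:
    """Stateful REST resource: requests are only valid in certain states."""

    def __init__(self):
        self.state = BookingState.AVAILABLE

    def put_reservation(self, guest):
        # Precondition: a room can only be reserved while it is available.
        assert self.state is BookingState.AVAILABLE, "409 Conflict: not available"
        self.guest = guest
        self.state = BookingState.RESERVED
        # Postcondition: the resource has moved to the RESERVED state.
        assert self.state is BookingState.RESERVED

    def post_payment(self, amount):
        # Precondition: payment is only accepted for a reserved room.
        assert self.state is BookingState.RESERVED, "409 Conflict: not reserved"
        self.amount_paid = amount
        self.state = BookingState.PAID
        # Postcondition: the resource has moved to the PAID state.
        assert self.state is BookingState.PAID

booking = RoomBooking()
booking.put_reservation("alice")
booking.post_payment(100.0)
print(booking.state)  # BookingState.PAID
```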
Abstract:
The objective of the present study was to validate the transit-time technique for long-term measurements of iliac and renal blood flow in rats. Flow measured with the ultrasonic probes was confirmed ex vivo using excised arteries perfused at varying flow rates. An implanted 1-mm probe accurately reproduced different patterns of flow relative to pressure in freely moving rats and accurately quantitated the resting iliac flow value (on average 10.43 ± 0.99 ml/min, or 2.78 ± 0.3 ml min-1 100 g body weight-1). The measurements were stable over an experimental period of one week but were affected by probe size (resting flows were underestimated by 57% with a 2-mm probe compared with a 1-mm probe) and by anesthesia (in the same rats, iliac flow was reduced by 50-60% compared with the conscious state). Instantaneous changes of iliac and renal flow during exercise and recovery were accurately measured by the transit-time technique. Iliac flow increased instantaneously at the beginning of mild exercise (from 12.03 ± 1.06 to 25.55 ± 3.89 ml/min at 15 s) and showed a smaller further increase as exercise intensity rose, reaching a plateau of 38.43 ± 1.92 ml/min at the 4th min of moderate exercise. In contrast, the exercise-induced reduction of renal flow was smaller and slower, with 18% and 25% decreases at mild and moderate exercise intensities, respectively. Our data indicate that transit-time flowmetry is a reliable method for long-term, continuous measurements of regional blood flow at rest and can be used to quantitate the dynamic flow changes that characterize exercise and recovery.
Abstract:
This thesis concentrates on the validation of the generic thermal hydraulic computer code TRACE under the challenges of the VVER-440 reactor type. The capability of the code to model the VVER-440 geometry and the thermal hydraulic phenomena specific to this reactor design was examined and demonstrated to be acceptable. The main challenge in VVER-440 thermal hydraulics appeared in the modelling of the horizontal steam generator, where the major difficulty lies not in the code physics or numerics but in the formulation of a representative nodalization structure. Another VVER-440 peculiarity, the hot leg loop seals, challenges system codes functionally in general, but proved readily representable. Computer code models have to be validated against experiments to achieve confidence in them. When a new computer code is to be used for nuclear power plant safety analysis, it must first be validated against a large variety of experiments, and the validation process has to cover both the code itself and the code input. Uncertainties of different natures are identified, and can even be quantified, in the different phases of the validation procedure. This thesis presents a novel approach to input model validation and uncertainty evaluation in the different stages of the computer code validation procedure. It also demonstrates that in safety analysis there are inevitably significant uncertainties that are not statistically quantifiable; these need to be, and can be, addressed by other, less simplistic means, ultimately relying on the competence of the analysts and the capability of the community to support the experimental verification of analytical assumptions. This approach essentially complements the commonly used uncertainty assessment methods, which usually rely on statistical methods alone.