967 results for bioanalytical method validation

Objective: To develop and validate an instrument for measuring the acquisition of technical skills in conducting operations of increasing difficulty, for use in General Surgery Residency (GSR) programs. Methods: We built a surgical skills assessment tool containing 11 operations at increasing levels of difficulty. For instrument validation we used the face validity method. Through an electronic survey tool (SurveyMonkey®) we sent a questionnaire to Full and Emeritus members of the Brazilian College of Surgeons (CBC), all holders of the CBC Specialist Title. Results: Of the 307 questionnaires sent, we received 100 responses. For the analysis of the collected data we used Cronbach's alpha. We observed that, in general, the overall alpha presented values near or above 0.70, indicating good consistency of the instrument for assessing its points of interest. Conclusion: The evaluation instrument built was validated and can be used as a method for assessing technical skill acquisition in General Surgery Residency programs in Brazil.
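Cronbach's alpha, used above to gauge the internal consistency of the 11-item instrument, is computed as alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch of that calculation; the response matrix below is a hypothetical stand-in, not the study's data:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = scores.shape[1]                         # number of items
    item_vars = scores.var(axis=0, ddof=1)      # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of respondents' totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical example: 100 respondents rating 11 operations on a 5-point scale
rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(100, 11)).astype(float)
print(f"alpha = {cronbach_alpha(responses):.2f}")
```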

The demand for more efficient manufacturing processes has been increasing in the last few years. The cold forging process is presented as a possible solution because it allows the production of parts with a good surface finish and good mechanical properties. Nevertheless, the design of cold forming sequences is very empirical and based on the designer's experience. Computational modeling of each forming process stage by the finite element method can make sequence design faster and more efficient, decreasing the use of conventional "trial and error" methods. In this study, a commercial general-purpose finite element package (ANSYS) was applied to model a forming operation. Models were developed to simulate the ring compression test and a basic forming operation (upsetting) that appears in most cold forging part sequences. The simulated upsetting operation is one stage of the manufacturing process of automotive starter parts. Experiments were performed to obtain the stress-strain curve of the material, the material flow during the simulated stage, and the required forming force. These experiments provided the numerical model input data and the validation of the model results. The comparison between experimental and numerical results confirms the potential of the developed methodology for die-filling prediction.
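Forming simulations of this kind take a flow stress (true stress versus true strain) curve as material input. One common way to condense compression-test data into such an input is a power-law (Hollomon) fit, sigma = K * eps^n; the abstract does not state which form was used, so the sketch below, with invented data points, is only an illustration of the general step:

```python
import numpy as np

# Hypothetical true strain / true stress (MPa) pairs from a compression test
strain = np.array([0.02, 0.05, 0.10, 0.20, 0.40, 0.60])
stress = np.array([310.0, 365.0, 420.0, 490.0, 575.0, 630.0])

# Hollomon hardening law, sigma = K * eps**n, is linear in log-log space
n, log_k = np.polyfit(np.log(strain), np.log(stress), 1)
k = np.exp(log_k)
print(f"sigma = {k:.0f} * eps^{n:.3f}  (MPa)")

# Tabulated curve in the point-wise form most FE codes accept as input
for eps in np.linspace(0.01, 1.0, 20):
    print(f"{eps:.3f}\t{k * eps**n:.1f}")
```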

In today's logistics environment there is a tremendous need for accurate cost information and cost allocation. Companies searching for a proper solution often come across activity-based costing (ABC) or one of its variations, which uses cost drivers to allocate the costs of activities to cost objects. To allocate costs accurately and reliably, selecting appropriate cost drivers is essential for reaping the benefits of the costing system. The purpose of this study is to validate the transportation cost drivers of a Finnish wholesaler company and ultimately to select the best possible driver alternatives for the company. The use of cost driver combinations as an alternative is also studied. The study is conducted as part of the case company's applied ABC project, using statistical research as the main research method, supported by a theoretical, literature-based method. The main research tools featured in the study are simple and multiple regression analyses, which, together with a practicality analysis based on the literature and on observations, form the basis for the advanced methods. The results suggest that the most appropriate cost driver alternatives are delivery drops and internal delivery weight. The use of cost driver combinations is not recommended, as they do not provide substantially better results while simultaneously increasing measurement costs, complexity and the burden of use. The use of internal freight cost drivers is also questionable, as the results indicate a weakening trend in their cost allocation capabilities towards the end of the period. Therefore, more research on internal freight cost drivers should be conducted before taking them into use.
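The driver validation described above rests on regressing observed transportation costs on candidate drivers and comparing explanatory power. A minimal sketch of that style of analysis with statsmodels; the driver names follow the abstract, but all figures are invented placeholders:

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical monthly observations: cost (EUR), delivery drops, delivery weight (kg)
costs = np.array([8200.0, 9100, 7800, 10400, 9900, 8700, 11200, 10100])
drops = np.array([410.0, 455, 390, 520, 495, 435, 560, 505])
weight = np.array([61e3, 68e3, 57e3, 79e3, 74e3, 64e3, 85e3, 76e3])

# Simple regression, one driver at a time; R-squared indicates allocation power
for name, driver in [("delivery drops", drops), ("delivery weight", weight)]:
    model = sm.OLS(costs, sm.add_constant(driver)).fit()
    print(f"{name}: R2 = {model.rsquared:.3f}")

# Multiple regression: a driver combination, to test whether it adds enough
combo = sm.OLS(costs, sm.add_constant(np.column_stack([drops, weight]))).fit()
print(f"combination: adjusted R2 = {combo.rsquared_adj:.3f}")
```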

The objective of this study was to optimize and validate a solid-liquid extraction (ESL) technique for the determination of picloram residues in soil samples. At the optimization stage, the optimal conditions for extraction of the soil samples were determined using univariate analysis. The soil/extraction solution ratio, the type and time of agitation, and the ionic strength and pH of the extraction solution were evaluated. Based on the optimized parameters, the following extraction and analysis method for picloram was developed: weigh 2.00 g of soil, dried and sieved through a 2.0 mm mesh sieve; add 20.0 mL of 0.5 mol L-1 KCl; shake the flask in a vortex mixer for 10 seconds to form a suspension, and adjust to pH 7.00 with 0.1 mol L-1 KOH. Homogenize the system in a shaker for 60 minutes and then let it stand for 10 minutes. The flasks are centrifuged for 10 minutes at 3,500 rpm. After settling of the soil particles and cleaning of the supernatant extract, an aliquot is withdrawn and analyzed by high-performance liquid chromatography. The optimized method was validated by determining selectivity, linearity, detection and quantification limits, precision and accuracy. The ESL methodology was efficient for the analysis of picloram residues, with recoveries above 90%. The limits of detection and quantification were 20.0 and 66.0 mg kg-1 of soil for the PVA soil, and 40.0 and 132.0 mg kg-1 of soil for the VLA soil. The coefficients of variation (CV) were 2.32 and 2.69 for the PVA and TH soils, respectively. The methodology resulted in low organic solvent consumption and cleaner extracts, and no purification steps were required before chromatographic analysis. The parameters evaluated in the validation process indicated that the ESL methodology is efficient for the extraction of picloram residues in soils, with low limits of detection and quantification.
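Detection and quantification limits of the kind reported above are commonly estimated from the calibration curve as LOD = 3.3*s/S and LOQ = 10*s/S, where s is the standard deviation of the response and S the slope; the abstract does not state how these limits were derived, so the following sketch, with invented calibration data, only illustrates that common convention:

```python
import numpy as np

# Hypothetical calibration: spiked concentration (mg kg-1) vs detector response
conc = np.array([0.0, 50.0, 100.0, 200.0, 400.0])
resp = np.array([1.8, 98.5, 201.3, 396.0, 805.2])

slope, intercept = np.polyfit(conc, resp, 1)
residual_sd = np.std(resp - (slope * conc + intercept), ddof=2)  # s of the fit

lod = 3.3 * residual_sd / slope   # limit of detection
loq = 10.0 * residual_sd / slope  # limit of quantification
print(f"LOD = {lod:.1f} mg kg-1, LOQ = {loq:.1f} mg kg-1")
```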

A web service is a software system that provides a machine-processable interface to other machines over the network, using different Internet protocols. Web services are increasingly used in industry to automate different tasks and to offer services to a wider audience. The REST architectural style aims at producing scalable and extensible web services using technologies that play well with the existing tools and infrastructure of the web. It provides a uniform set of operations that can be used to invoke the CRUD interface (create, retrieve, update and delete) of a web service. The stateless behavior of the service interface requires that every request to a resource be independent of the previous ones, which facilitates scalability. Automated systems, e.g., hotel reservation systems, provide advanced scenarios for stateful services that require a certain sequence of requests to be followed in order to fulfill the service goals. Designing and developing such services for advanced scenarios under REST constraints requires rigorous approaches capable of creating web services that can be trusted for their behavior; systems that can be trusted for their behavior can be termed dependable systems. This thesis presents an integrated design, analysis and validation approach that helps the service developer create dependable, stateful REST web services. The main contribution of this thesis is a novel model-driven methodology for designing behavioral REST web service interfaces and their compositions. A behavioral interface specifies which methods can be invoked on a service and the pre- and post-conditions of these methods. The methodology uses the Unified Modeling Language (UML) as the modeling language, which has a wide user base and mature, continuously evolving tools. We use the UML class diagram and the UML state machine diagram, with additional design constraints, to provide the resource and behavioral models, respectively, for designing REST web service interfaces. These service design models serve as a specification document, and the information presented in them has manifold applications. The design models also capture the time and domain requirements of the service, which supports requirement traceability, an important part of our approach. Requirement traceability helps capture faults in the design models and in other elements of the software development environment by tracing the unfulfilled requirements of the service back and forth. Information about service actors is also included in the design models; it is required for authenticating service requests by authorized actors, since not all types of users have access to all resources. In addition, by following our design approach, the service developer can ensure that the designed web service interfaces are REST compliant. The second contribution of this thesis is consistency analysis of the behavioral REST interfaces. To overcome inconsistency problems and design errors in our service models, we use semantic technologies: the REST interfaces are represented in the web ontology language OWL 2, so that they can be part of the semantic web, and OWL 2 reasoners are used to check for unsatisfiable concepts, which would result in failing implementations. This step is fully automated thanks to the implemented translation tool and the existing OWL 2 reasoners.
The third contribution of this thesis is the verification and validation of REST web services, using model checking techniques with the UPPAAL model checker. Timed automata are generated from the UML-based service design models with our transformation tool and verified for basic properties such as deadlock freedom, liveness, reachability and safety. The implementation of a web service is then tested using a black-box approach: test cases are generated from the UPPAAL timed automata and, using the online testing tool UPPAAL TRON, the service implementation is validated at runtime against its specification. Requirement traceability is also addressed in our validation approach, with which we can see which service goals are met and trace unfulfilled service goals back to faults in the design models. A final contribution of the thesis is the implementation of behavioral REST interfaces and service monitors from the service design models. The partial code generation tool creates code skeletons of REST web services with method pre- and post-conditions: the preconditions constrain the user to invoke the stateful REST service under the right conditions, and the postconditions constrain the service developer to implement the right functionality. The details of the methods can be inserted manually by the developer as required; we do not target complete automation because we focus only on the interface aspects of the web service. The applicability of the approach is demonstrated with a pedagogical example of a hotel room booking service and a relatively complex worked example of a holiday booking service taken from an industrial context. The former presents a simple explanation of the approach, while the latter shows how stateful and timed web services offering complex scenarios and involving other web services can be constructed using our approach.
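The thesis's generated skeletons target its own tool chain; purely as a hypothetical illustration of a stateful REST resource guarded by pre- and post-condition checks, here is a sketch in Flask (the framework, route names and state machine below are our assumptions, not the thesis's output):

```python
from flask import Flask, jsonify

app = Flask(__name__)
# A single hypothetical booking resource with an explicit state machine
booking = {"state": "created"}          # created -> confirmed -> checked_in
TRANSITIONS = {"confirm": ("created", "confirmed"),
               "checkin": ("confirmed", "checked_in")}

def transition(action: str):
    before, after = TRANSITIONS[action]
    # Precondition: the stateful service may only be invoked in the right state
    if booking["state"] != before:
        return jsonify(error=f"precondition failed: state is {booking['state']}"), 409
    booking["state"] = after
    # Postcondition: the implementation must have produced the declared state
    assert booking["state"] == after, "postcondition violated"
    return jsonify(booking), 200

@app.put("/booking/confirmation")
def confirm():
    return transition("confirm")

@app.put("/booking/checkin")
def checkin():
    return transition("checkin")

if __name__ == "__main__":
    app.run()
```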

The objective of the present study was to validate the transit-time technique for long-term measurements of iliac and renal blood flow in rats. Flow measured with the ultrasonic probes was confirmed ex vivo using excised arteries perfused at varying flow rates. An implanted 1-mm probe reproduced with accuracy different patterns of flow relative to pressure in freely moving rats and accurately quantitated the resting iliac flow (on average 10.43 ± 0.99 ml/min, or 2.78 ± 0.3 ml min-1 100 g body weight-1). The measurements were stable over an experimental period of one week but were affected by probe size (resting flows were underestimated by 57% with a 2-mm probe compared with a 1-mm probe) and by anesthesia (in the same rats, iliac flow was reduced by 50-60% compared with the conscious state). Instantaneous changes of iliac and renal flow during exercise and recovery were accurately measured by the transit-time technique. Iliac flow increased instantaneously at the beginning of mild exercise (from 12.03 ± 1.06 to 25.55 ± 3.89 ml/min at 15 s) and showed a smaller increase when exercise intensity increased further, reaching a plateau of 38.43 ± 1.92 ml/min at the 4th min of moderate exercise. In contrast, the exercise-induced reduction of renal flow was smaller and slower, with 18% and 25% decreases at mild and moderate exercise intensities, respectively. Our data indicate that transit-time flowmetry is a reliable method for long-term and continuous measurement of regional blood flow at rest and can be used to quantitate the dynamic flow changes that characterize exercise and recovery.

This thesis concentrates on the validation of the generic thermal hydraulic computer code TRACE against the challenges of the VVER-440 reactor type. The capability of the code to model the VVER-440 geometry and the thermal hydraulic phenomena specific to this reactor design was examined and demonstrated to be acceptable. The main challenge in VVER-440 thermal hydraulics appeared in the modelling of the horizontal steam generator; the major difficulty there lies not in the code physics or numerics but in the formulation of a representative nodalization structure. Another VVER-440 specialty, the hot leg loop seals, challenges system codes functionally in general, but proved readily representable. Computer code models have to be validated against experiments to achieve confidence in them. When a new computer code is to be used for nuclear power plant safety analysis, it must first be validated against a large variety of different experiments, and the validation process has to cover both the code itself and the code input. Uncertainties of different nature are identified in the different phases of the validation procedure and can in some cases be quantified. This thesis presents a novel approach to input model validation and uncertainty evaluation in the different stages of the computer code validation procedure. It also demonstrates that in safety analysis there are inevitably significant uncertainties that are not statistically quantifiable; these need to be, and can be, addressed by other, less simplistic means, ultimately relying on the competence of the analysts and on the capability of the community to support the experimental verification of analytical assumptions. This method essentially complements the commonly used uncertainty assessment methods, which usually rely on statistical methods alone.

Coronary artery disease is an atherosclerotic disease that leads to narrowing of the coronary arteries, deteriorated myocardial blood flow and myocardial ischaemia. In acute myocardial infarction, a prolonged period of myocardial ischaemia leads to myocardial necrosis, and the necrotic myocardium is replaced with scar tissue. Myocardial infarction results in various changes in cardiac structure and function over time, a process known as "adverse remodelling", which may lead to a progressive worsening of cardiac function and to the development of chronic heart failure. In this thesis, we developed and validated three different large animal models of coronary artery disease, myocardial ischaemia and infarction for translational studies. In the first study, the coronary artery disease model combined induced diabetes and hypercholesterolemia. In the second study, myocardial ischaemia and infarction were produced by a surgical method, and in the third study by catheterisation. For model characterisation, we used non-invasive positron emission tomography (PET) methods to measure myocardial perfusion, oxidative metabolism and glucose utilisation. Additionally, cardiac function was measured by echocardiography and computed tomography. To study the metabolic changes that occur during atherosclerosis, the hypercholesterolemic and diabetic model was studied with [18F]fluorodeoxyglucose ([18F]FDG) PET imaging. The coronary occlusion models were used to evaluate metabolic and structural changes in the heart and the cardioprotective effects of levosimendan during post-infarction cardiac remodelling, and the large animal models were used in testing novel radiopharmaceuticals for myocardial perfusion imaging. In the coronary artery disease model, we observed atherosclerotic lesions associated with focally increased [18F]FDG uptake. In the heart failure models, chronic myocardial infarction led to worsening of systolic function, cardiac remodelling and decreased efficiency of the cardiac pumping function. Levosimendan therapy reduced post-infarction myocardial infarct size and improved cardiac function. The novel 68Ga-labeled radiopharmaceuticals tested in this study were not successful for the determination of myocardial blood flow. In conclusion, diabetes and hypercholesterolemia led to the development of early-phase atherosclerotic lesions, and coronary artery occlusion produced considerable myocardial ischaemia and, later, infarction followed by myocardial remodelling. The experimental models evaluated in these studies will enable further studies of disease mechanisms, new radiopharmaceuticals and interventions in coronary artery disease and heart failure.

The present report describes the development of a technique for automatic wheezing recognition in digitally recorded lung sounds. The method is based on the extraction and processing of spectral information from the respiratory cycle and on the use of these data for user feedback and automatic recognition. The respiratory cycle is first pre-processed in order to normalize its spectral information, and its spectrogram is then computed. The spectrogram image is then processed by a two-dimensional convolution filter and a half-threshold, in order to increase the contrast and to isolate its highest-amplitude components, respectively. To generate more compressed data for automatic recognition, the spectral projection of the processed spectrogram is computed and stored as an array. The highest-magnitude values of the array and their respective spectral values are then located and used as inputs to a multi-layer perceptron artificial neural network, which yields an automatic indication of the presence of wheezes. For validation of the methodology, lung sounds recorded from three different repositories were used. The results show that the proposed technique achieves 84.82% accuracy in the detection of wheezing for an isolated respiratory cycle and 92.86% accuracy when detection is carried out using groups of respiratory cycles obtained from the same person. The system also presents the original recorded sound and the post-processed spectrogram image so that users can draw their own conclusions from the data.
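A rough sketch of this pipeline follows; the kernel size, the exact normalization, the threshold level and the number of peaks are our own assumptions, since the abstract does not give them, and the training data below are random placeholders:

```python
import numpy as np
from scipy.signal import spectrogram, convolve2d
from sklearn.neural_network import MLPClassifier

def wheeze_features(cycle: np.ndarray, fs: int, n_peaks: int = 4) -> np.ndarray:
    # Normalize the respiratory cycle, then compute its spectrogram
    cycle = cycle / (np.max(np.abs(cycle)) + 1e-12)
    freqs, _, sxx = spectrogram(cycle, fs=fs, nperseg=256)
    # 2-D convolution (3x3 averaging kernel, an assumption) to smooth/raise contrast
    sxx = convolve2d(sxx, np.ones((3, 3)) / 9.0, mode="same")
    # Half-threshold: keep only components above half of the maximum amplitude
    sxx[sxx < 0.5 * sxx.max()] = 0.0
    # Spectral projection: collapse the time axis into a single array
    proj = sxx.sum(axis=1)
    # Locate the highest-magnitude values and their spectral positions
    idx = np.argsort(proj)[-n_peaks:]
    return np.concatenate([proj[idx], freqs[idx]])

# Hypothetical training: features from labelled cycles feed an MLP classifier
rng = np.random.default_rng(1)
X = np.array([wheeze_features(rng.standard_normal(8000), fs=8000) for _ in range(20)])
y = rng.integers(0, 2, size=20)  # 1 = wheeze present (placeholder labels)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000).fit(X, y)
```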

The Caco-2 cell line has been used as a model to predict the in vitro permeability of the human intestinal barrier. The predictive potential of the assay relies on an appropriate in-house validation of the method. The objective of the present study was to develop a single HPLC-UV method for the identification and quantitation of marker drugs and to determine the suitability of the Caco-2 cell permeability assay. A simple chromatographic method was developed for the simultaneous determination of both passively transported drugs (propranolol, carbamazepine, acyclovir, and hydrochlorothiazide) and actively transported drugs (vinblastine and verapamil). Separation was achieved on a C18 column with step-gradient elution (acetonitrile and an aqueous solution of ammonium acetate, pH 3.0) at a flow rate of 1.0 mL/min and UV detection at 275 nm, with a total run time of 35 min. The method was validated and found to be specific, linear, precise, and accurate. This chromatographic system can readily be used on a routine basis, and its use can be extended to other permeability models. The results obtained in the Caco-2 bi-directional transport experiments confirmed the validity of the assay, given that high and low permeability profiles were identified and P-glycoprotein functionality was established.
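Bi-directional Caco-2 results of this kind are usually reduced to an apparent permeability coefficient, Papp = (dQ/dt)/(A*C0), and an efflux ratio Papp(B to A)/Papp(A to B) to flag P-glycoprotein substrates. The abstract does not show the calculation, so the following is a generic sketch with invented numbers:

```python
def papp(dq_dt: float, area_cm2: float, c0: float) -> float:
    """Apparent permeability (cm/s): dQ/dt in ug/s, area in cm2, C0 in ug/mL."""
    return dq_dt / (area_cm2 * c0)

# Hypothetical transport rates for a verapamil-like P-gp substrate
area, c0 = 1.12, 10.0                 # Transwell area (cm2), donor conc (ug/mL)
papp_ab = papp(2.0e-4, area, c0)      # apical -> basolateral
papp_ba = papp(1.4e-3, area, c0)      # basolateral -> apical

efflux_ratio = papp_ba / papp_ab      # a ratio > 2 commonly suggests active efflux
print(f"Papp A->B = {papp_ab:.2e} cm/s, efflux ratio = {efflux_ratio:.1f}")
```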

Graphite furnace atomic absorption spectrometry (GF AAS) was the technique chosen by the inorganic contamination laboratory (INCQ/FIOCRUZ) to be validated and applied in routine analysis for arsenic detection and quantification. Selectivity, linearity, sensitivity, and the detection and quantification limits, as well as accuracy and precision, were studied and optimized under Stabilized Temperature Platform Furnace (STPF) conditions. The limit of detection obtained was 0.13 µg.L-1 and the limit of quantification was 1.04 µg.L-1, with an average precision for total arsenic of less than 15% and an accuracy of 96%. To quantify the chemical species As(III) and As(V), an ion-exchange resin (Dowex 1X8, Cl- form) was used, and the physical-chemical parameters were optimized, resulting in recoveries of 98% for As(III) and 90% for As(V). The method was applied to groundwater, mineral water, and hemodialysis purified water samples. All results obtained were lower than the maximum limits established by the Brazilian regulations in effect: 50, 10, and 5 µg.L-1 for total As, As(III), and As(V), respectively. All results were statistically evaluated.

A simple and low-cost method to determine volatile contaminants in post-consumer recycled PET flakes was developed and validated, using headspace dynamic concentration and gas chromatography with flame ionization detection (HDC-GC-FID). The analytical parameters evaluated by using surrogates included the correlation coefficient, detection limit, quantification limit, accuracy, intra-assay precision, and inter-assay precision. In order to compare the efficiency of the proposed method with recognized automated techniques, post-consumer PET packaging samples collected in Brazil were used, and GC-MS was used to confirm the identity of the substances identified in the PET packaging. Some of the identified contaminants were estimated in the post-consumer material at concentrations higher than 220 ng.g-1. The findings of this work corroborate data available in the scientific literature and point to the suitability of the proposed analytical method.

The sugars in apple juice attest to its authenticity and contribute to its sensory and nutritional properties. The aim of this study was to develop and validate a simple analytical method, using high-performance liquid chromatography with refractive index detection (HPLC-RI), to determine and quantify the sugars sucrose, D-glucose and D-fructose and the polyol D-sorbitol in apple juices, as well as to analyze the juices of the Fuji Suprema and Lis Gala cultivars at three ripening stages. The analytical performance parameters evaluated indicated that the method was specific for the compounds analyzed, and the calibration curves of the sugars were linear, with high correlation coefficients (close to 1.0). The limits of detection and quantification are consistent with the recommendations available in the literature for this type of matrix. Sample preparation is simple and generates a small amount of residue. Over 70% of the sugars were determined in the juices of apples at the pre-ripe stage, with an increase during senescence. The method is applicable to the determination of sugars in juices and to the evaluation of apple ripening.
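Quantification with HPLC-RI of this kind typically relies on external calibration: fit a line of peak area against standard concentration, check its correlation coefficient, then invert the line for the sample. A minimal sketch under that assumption; the areas and concentrations below are invented:

```python
import numpy as np

# Hypothetical external calibration for D-glucose: standard conc (g/L) vs peak area
std_conc = np.array([1.0, 2.5, 5.0, 10.0, 20.0])
peak_area = np.array([10450.0, 26010.0, 52230.0, 104100.0, 208900.0])

slope, intercept = np.polyfit(std_conc, peak_area, 1)
r = np.corrcoef(std_conc, peak_area)[0, 1]   # correlation coefficient of the curve
print(f"calibration: area = {slope:.0f}*conc + {intercept:.0f}, r = {r:.4f}")

# Invert the calibration curve to quantify an unknown juice sample
sample_area = 61800.0
print(f"sample D-glucose = {(sample_area - intercept) / slope:.2f} g/L")
```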

Different methods aiming at the optimal use of specialized radio-frequency coils in magnetic resonance imaging are developed and validated. First, it is shown that an alternative method for combining the signals from the different receive channels of a coil array leads to a significant reduction of the noise-induced bias in diffusion images, in comparison with the commonly used sum-of-squares method. This reduction of the noise bias improves the accuracy of the estimation of various diffusion and diffusion tensor parameters. Moreover, it is shown that this method can be used with a regular, unaccelerated acquisition, but also in the presence of parallel imaging. Second, the benefits of using an intravascular imaging coil are studied. A phantom study shows that intravascular magnetic resonance imaging has the potential to significantly improve the geometric accuracy of vascular morphological measurements, compared with the results obtained with classical surface coils, and that a geometric accuracy comparable to that obtained with an intravascular ultrasound probe can be reached. In addition, several protocols based on a balanced steady-state free-precession acquisition are compared in order to highlight relations between the parameters used and the geometric accuracy obtained; in particular, dependencies between the vessel size, the signal-to-noise ratio at the vascular wall, the spatial resolution and the achieved geometric accuracy are demonstrated. Along the same lines, it is shown that the use of an intravascular coil markedly improves the visualization of the lumen of a vascular stent: when used together with a balanced steady-state free-precession sequence with a specially selected flip angle, intravascular magnetic resonance imaging completely eliminates the limitations normally caused by the radio-frequency shielding effect of the stent.
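The sum-of-squares combination mentioned above takes the root of the summed squared channel magnitudes, which rectifies noise and produces a positive bias in low-signal (e.g., strongly diffusion-weighted) images. The small illustration below shows that bias on signal-free data; the thesis's alternative combination is not reproduced here, and an idealized phase-corrected average merely stands in for it:

```python
import numpy as np

rng = np.random.default_rng(2)
n_channels, n_pixels, sigma = 8, 100_000, 1.0

# Complex Gaussian noise on each receive channel (a signal-free region)
noise = sigma / np.sqrt(2) * (
    rng.standard_normal((n_channels, n_pixels))
    + 1j * rng.standard_normal((n_channels, n_pixels))
)

# Sum-of-squares combination: sqrt of the summed squared channel magnitudes
sos = np.sqrt((np.abs(noise) ** 2).sum(axis=0))
print(f"SoS noise floor: {sos.mean():.3f} (true signal is 0)")

# An idealized coherent (phase-corrected, equal-weight) combination averages the
# complex samples first, so the residual bias in the magnitude is much smaller
coherent = np.abs(noise.mean(axis=0))
print(f"Coherent-combination noise floor: {coherent.mean():.3f}")
```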

The aim of this study was to evaluate the performance qualifications of the FlexiWare® system in conscious male Sprague Dawley rats and Cynomolgus monkeys, as well as in conscious and anesthetized Beagle dogs, following the administration of products with known pharmacological activity. The products used included albuterol, administered by inhalation, and methacholine and remifentanil, administered intravenously; an intravenously administered saline solution was used as the control substance. Different variables served to evaluate the response of the animals (rats, dogs, monkeys), comprising the respiratory rate (RR), tidal volume (TV) and minute ventilation (MV). Additional parameters were evaluated in the rat: the inspiratory (IT) and expiratory (ET) times, the time to peak expiratory flow, the peak inspiratory and expiratory flows, the inspiratory:expiratory ratio (I:E), the ratio of inspiration to total breath (I:TB), and the mid-tidal expiratory flow (EF50). The results obtained showed that the FlexiWare® system was sufficiently sensitive and specific to detect, in the animal species used, the bronchodilator, bronchoconstrictor and central depressant effects of the test substances. It could be part of the methods (ICH 2000) used in safety pharmacology for the evaluation of pharmacological substances acting on the respiratory system of laboratory animals. The animal species used seemed to adapt readily to the restraint procedures. The parameters evaluated, RR, TV and MV, judiciously complemented by the flow variables, allowed the characterization of the animals' responses to the administration of pharmacological products with known effects. The addition of timing parameters was not essential for detecting drug effects, but it offers complementary tools for interpreting physiological changes. In the conscious rat, however, the evaluation period should not extend beyond two hours post-treatment. These studies constitute an evaluation of the performance qualifications of this device and demonstrated, in an original manner, its concurrent validity, in terms of accuracy (sensitivity and specificity) and reliability, for different variables and different species.
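The core plethysmography variables above are related by MV = RR × TV and can be derived from a respiratory flow trace by segmenting breaths and integrating inspiratory flow. A simplified, hypothetical sketch of that derivation on a synthetic trace; real systems use calibrated signals and far more robust breath detection:

```python
import numpy as np

def ventilatory_params(flow: np.ndarray, fs: float):
    """Derive RR (breaths/min), TV (L) and MV (L/min) from a flow trace (L/s).

    Breaths are delimited by zero-crossings into inspiration; this is a
    simplified segmentation for illustration only."""
    signs = np.sign(flow)
    starts = np.where((signs[:-1] <= 0) & (signs[1:] > 0))[0]
    if len(starts) < 2:
        raise ValueError("need at least two breaths")
    tidal_volumes = []
    for a, b in zip(starts[:-1], starts[1:]):
        inspiration = np.clip(flow[a:b], 0, None)    # inspiratory flow only
        tidal_volumes.append(inspiration.sum() / fs) # integrate flow to volume
    duration_min = (starts[-1] - starts[0]) / fs / 60
    rr = (len(starts) - 1) / duration_min            # breaths per minute
    tv = float(np.mean(tidal_volumes))               # litres
    return rr, tv, rr * tv                           # MV = RR * TV

# Synthetic 10 s trace at 90 breaths/min (a rat-like rate), sampled at 1 kHz
fs = 1000.0
t = np.arange(0, 10, 1 / fs)
rr, tv, mv = ventilatory_params(0.02 * np.sin(2 * np.pi * 1.5 * t), fs)
print(f"RR = {rr:.0f} bpm, TV = {tv * 1000:.1f} mL, MV = {mv * 1000:.0f} mL/min")
```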