885 results for validation of methods


Relevance:

100.00%

Publisher:

Abstract:

A gas chromatographic method has been developed for the assay of fluvastatin sodium (FLU). FLU was silylated with N,O-bis(trimethylsilyl)trifluoroacetamide-1% trimethylchlorosilane at 90 ºC for 30 min and analysed on a DB-1 column by capillary gas chromatography with flame ionization detection. The method was validated. The assay was linear over the concentration range of 10.0 to 50.0 µg mL-1. The limit of detection and the limit of quantitation were 1.0 and 3.0 µg mL-1, respectively. The recoveries of the FLU derivatives were in the range of 99.25-99.80%. In inter-day and intra-day analyses, the relative standard deviation (%) and the relative mean error (%) were 0.20-0.80% and -0.20 to 0.75%, respectively. The developed method was successfully applied to determine the FLU content in a tablet formulation. The results were statistically compared with those obtained by the official method, and no significant difference was found between the two methods. Therefore, the method can be recommended for the quality control assay of FLU in the pharmaceutical industry.
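
As a rough illustration of the accuracy and precision figures reported above, the following sketch (with hypothetical peak areas and concentrations, not the paper's data) computes recovery, relative standard deviation and relative mean error from replicate measurements against a linear calibration:

```python
import numpy as np

# Hypothetical calibration standards (µg/mL) and detector responses (peak areas)
conc = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
area = np.array([152.0, 301.0, 455.0, 602.0, 749.0])

# Least-squares calibration line: area = slope*conc + intercept
slope, intercept = np.polyfit(conc, area, 1)

# Replicate measurements of a 30 µg/mL quality-control sample (hypothetical)
qc_area = np.array([452.0, 449.0, 454.0, 451.0, 453.0])
qc_conc = (qc_area - intercept) / slope

recovery = 100.0 * qc_conc.mean() / 30.0                 # accuracy, %
rsd = 100.0 * qc_conc.std(ddof=1) / qc_conc.mean()       # precision (RSD), %
rme = 100.0 * (qc_conc.mean() - 30.0) / 30.0             # relative mean error, %
print(f"recovery = {recovery:.2f}%, RSD = {rsd:.2f}%, RME = {rme:.2f}%")
```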

Relevance:

100.00%

Publisher:

Abstract:

A dissolution test for the in vitro evaluation of tablet dosage forms containing 10 mg of rupatadine was developed and validated by RP-LC. A discriminatory dissolution method was established using the paddle apparatus at a stirring rate of 50 rpm with 900 mL of deaerated 0.01 M hydrochloric acid. The proposed method was validated, yielding acceptable results for the parameters evaluated, and was applied to the quality control analysis of rupatadine tablets and to the evaluation of the formulation during an accelerated stability study. Moreover, quantitative analyses were performed to compare the applicability of the RP-LC and LC-MS/MS methods.

Relevance:

100.00%

Publisher:

Abstract:

A simple, precise, specific, repeatable and discriminating dissolution test for primaquine (PQ) matrix tablets was developed and validated according to ICH and FDA guidelines. Two UV assay methods were validated for the determination of PQ released in 0.1 M hydrochloric acid and water media. Both methods were linear (R²>0.999), precise (R.S.D.<1.87%) and accurate (97.65-99.97%). Dissolution efficiency (69-88%) and equivalence of the formulations (f2) were assessed in the different media and apparatuses tested (basket at 100 rpm and paddle at 50 rpm). The discriminating condition was 900 mL of aqueous medium, basket at 100 rpm and sampling times at 1, 4 and 8 h. The repeatability (R.S.D.<2.71%) and intermediate precision (R.S.D.<2.06%) of the dissolution method were satisfactory.
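
The two profile-comparison metrics mentioned above can be computed in a few lines. The sketch below, with hypothetical dissolution profiles rather than the paper's data, evaluates the f2 similarity factor and a trapezoid-based dissolution efficiency:

```python
import numpy as np

def f2_similarity(ref, test):
    """Similarity factor f2 (%) between reference and test dissolution profiles."""
    ref, test = np.asarray(ref, float), np.asarray(test, float)
    msd = np.mean((ref - test) ** 2)                  # mean squared difference
    return 50.0 * np.log10(100.0 / np.sqrt(1.0 + msd))

def dissolution_efficiency(t, released):
    """DE (%) = area under the dissolution curve / area of 100% release over the same interval."""
    t, released = np.asarray(t, float), np.asarray(released, float)
    auc = np.sum((released[1:] + released[:-1]) / 2.0 * np.diff(t))   # trapezoidal rule
    return 100.0 * auc / (100.0 * (t[-1] - t[0]))

# Hypothetical cumulative % released at 1, 4 and 8 h for two matrix-tablet batches
t = [1, 4, 8]
ref = [30, 65, 90]
test = [28, 65, 92]
print(f"f2 = {f2_similarity(ref, test):.1f}  (>= 50 suggests equivalent profiles)")
print(f"DE(reference) = {dissolution_efficiency(t, ref):.1f}%")
```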

Relevance:

100.00%

Publisher:

Abstract:

The scarcity of long-term series of sediment-related variables has led watershed managers to apply mathematical models to simulate sediment fluxes. Because of the high effort required for the installation and maintenance of sedimentological gauges, tracers have been proposed as an alternative for validating soil redistribution modelling. In this study, the 137Cs technique was used to assess the performance of the WASA-SED model at the Benguê watershed (933 km²) in the Brazilian semiarid region. Qualitatively, good agreement was found between the 137Cs technique and the WASA-SED model results. Quantitatively, however, large differences of up to two orders of magnitude were found between the two methods. Among the uncertainties inherent to the 137Cs technique, the definition of the reference inventory seems to be a major source of imprecision. In addition, estimates of water and sediment fluxes obtained with mathematical models usually also carry high uncertainty, contributing to the quantitative differences between the soil redistribution estimates of the two methods.
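
For readers unfamiliar with how a measured 137Cs inventory is converted into a soil redistribution rate, the sketch below applies one of the simplest published conversion models (the proportional model). The abstract does not state which conversion model was used in the study, and the inventory, bulk density and plough depth below are hypothetical, so this is purely illustrative:

```python
# Minimal sketch of the proportional 137Cs conversion model; all values are made up.
def erosion_rate_proportional(inv, inv_ref, bulk_density=1300.0, plough_depth=0.20,
                              years_since_fallout=50.0):
    """Net soil loss (t ha^-1 yr^-1) from a measured 137Cs inventory (Bq m^-2) and the
    local reference inventory, assuming uniform mixing within the plough layer."""
    x = 100.0 * (inv_ref - inv) / inv_ref          # % reduction relative to the reference
    return 10.0 * plough_depth * bulk_density * x / (100.0 * years_since_fallout)

# A point with 25% less 137Cs than the reference inventory -> ~13 t ha^-1 yr^-1
print(erosion_rate_proportional(inv=1500.0, inv_ref=2000.0))
```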

Relevance:

100.00%

Publisher:

Abstract:

Objective: To develop and validate an instrument for measuring the acquisition of technical skills in conducting operations of increasing difficulty, for use in General Surgery Residency (GSR) programs. Methods: We built a surgical skills assessment tool containing 11 operations at increasing levels of difficulty. For instrument validation we used the face validity method. Through an electronic survey tool (SurveyMonkey®) we sent a questionnaire to Full and Emeritus members of the Brazilian College of Surgeons (CBC), all bearers of the CBC Specialist Title. Results: Of the 307 questionnaires sent, we received 100 responses. For the analysis of the collected data we used Cronbach's alpha. We observed that, in general, the overall alpha values were close to or greater than 0.70, indicating good consistency for assessing the points of interest. Conclusion: The evaluation instrument was validated and can be used to assess technical skill acquisition in General Surgery Residency programs in Brazil.
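
Cronbach's alpha, used above to gauge internal consistency, can be computed directly from the respondent-by-item score matrix. A minimal sketch with made-up ratings:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                               # number of items
    item_vars = scores.var(axis=0, ddof=1).sum()      # sum of item variances
    total_var = scores.sum(axis=1).var(ddof=1)        # variance of respondents' total scores
    return (k / (k - 1.0)) * (1.0 - item_vars / total_var)

# Hypothetical ratings: 6 respondents x 4 items on a 1-5 scale
ratings = [[4, 5, 4, 4],
           [3, 4, 3, 4],
           [5, 5, 5, 4],
           [2, 3, 2, 3],
           [4, 4, 5, 5],
           [3, 3, 3, 2]]
print(f"alpha = {cronbach_alpha(ratings):.2f}")
```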

Relevance:

100.00%

Publisher:

Abstract:

A web service is a software system that provides a machine-processable interface to other machines over the network using different Internet protocols. Web services are increasingly used in industry to automate different tasks and to offer services to a wider audience. The REST architectural style aims at producing scalable and extensible web services using technologies that play well with the existing tools and infrastructure of the web. It provides a uniform set of operations that can be used to invoke the CRUD interface (create, retrieve, update and delete) of a web service. The stateless behavior of the service interface requires that every request to a resource is independent of the previous ones, which facilitates scalability. Automated systems, e.g., hotel reservation systems, provide advanced scenarios for stateful services that require a certain sequence of requests to be followed in order to fulfill the service goals. Designing and developing such services for advanced scenarios under REST constraints requires rigorous approaches capable of creating web services that can be trusted for their behavior. Systems that can be trusted for their behavior can be termed dependable systems. This thesis presents an integrated design, analysis and validation approach that helps the service developer create dependable and stateful REST web services. The main contribution of this thesis is a novel model-driven methodology to design behavioral REST web service interfaces and their compositions. The behavioral interfaces provide information on which methods can be invoked on a service and on the pre- and post-conditions of these methods. The methodology uses the Unified Modeling Language (UML) as the modeling language, which has a wide user base and mature, continuously evolving tools. We use the UML class diagram and the UML state machine diagram, with additional design constraints, to provide the resource and behavioral models, respectively, for designing REST web service interfaces. These service design models serve as a specification document, and the information presented in them has manifold applications. The service design models also contain information about the time and domain requirements of the service, which supports requirement traceability, an important part of our approach. Requirement traceability helps in capturing faults in the design models and in other elements of the software development environment by tracing the unfulfilled requirements of the service back and forth. Information about service actors is also included in the design models; it is required for authenticating service requests by authorized actors, since not all types of users have access to all resources. In addition, by following our design approach, the service developer can ensure that the designed web service interfaces are REST compliant. The second contribution of this thesis is the consistency analysis of behavioral REST interfaces. To overcome the inconsistency problem and design errors in our service models, we use semantic technologies. The REST interfaces are represented in the Web Ontology Language, OWL 2, and can thus be part of the semantic web. These interfaces are checked with OWL 2 reasoners for unsatisfiable concepts, which would result in implementations that fail. This work is fully automated thanks to the implemented translation tool and the existing OWL 2 reasoners.
The third contribution of this thesis is the verification and validation of REST web services. We use model checking techniques with the UPPAAL model checker for this purpose. Timed automata are generated from the UML-based service design models with our transformation tool and are verified for basic properties such as deadlock freedom, liveness, reachability and safety. The implementation of a web service is tested using a black-box testing approach. Test cases are generated from the UPPAAL timed automata and, using the online testing tool UPPAAL TRON, the service implementation is validated at runtime against its specification. Requirement traceability is also addressed in our validation approach, with which we can see which service goals are met and trace unfulfilled service goals back to faults in the design models. A final contribution of the thesis is the implementation of behavioral REST interfaces and service monitors from the service design models. The partial code generation tool creates code skeletons of REST web services with method pre- and post-conditions. The preconditions of methods constrain the user to invoke the stateful REST service under the right conditions, and the postconditions constrain the service developer to implement the right functionality. The details of the methods can be inserted manually by the developer as required. We do not target complete automation because we focus only on the interface aspects of the web service. The applicability of the approach is demonstrated with a pedagogical example of a hotel room booking service and a relatively complex worked example of a holiday booking service taken from an industrial context. The former example presents a simple explanation of the approach, and the latter worked example shows how stateful and timed web services offering complex scenarios and involving other web services can be constructed using our approach.
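
As an informal illustration of what a behavioral REST interface with method pre- and post-conditions can look like in code, the hand-written sketch below is in the spirit described above; the resource, states and method names are hypothetical and it is not the thesis' generated output:

```python
# Hypothetical sketch of a stateful REST resource with pre- and post-condition checks.
class PreconditionError(Exception): ...
class PostconditionError(Exception): ...

class BookingResource:
    """Stateful resource: a booking moves from 'created' to 'confirmed'."""

    def __init__(self):
        self.state = "created"

    def put_confirm(self):
        # Precondition: confirmation is only allowed for a freshly created booking.
        if self.state != "created":
            raise PreconditionError("PUT /booking/confirm requires state == 'created'")
        self.state = "confirmed"       # developer fills in the real functionality here
        # Postcondition: the service must leave the booking confirmed.
        if self.state != "confirmed":
            raise PostconditionError("state must be 'confirmed' after PUT /booking/confirm")
        return {"status": self.state}

booking = BookingResource()
print(booking.put_confirm())           # {'status': 'confirmed'}
```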

Relevance:

100.00%

Publisher:

Abstract:

Prostate cancer (PCa) has emerged as the most commonly diagnosed lethal cancer in European men. PCa is a heterogeneous cancer that is slow growing in the majority of cases; consequently, many of these patients would not need any medical treatment. Currently, the measurement of prostate-specific antigen (PSA) in blood by immunoassay, followed by digital rectal examination and pathological examination of prostate tissue biopsies, are the most widely used methods in the diagnosis of PCa. These methods suffer from a lack of sensitivity and specificity that may cause either missed cancers or overtreatment as a consequence of over-diagnosis. Therefore, more reliable biomarkers are needed for a better discrimination between indolent and potentially aggressive cancers. The aim of this thesis was the identification and validation of novel biomarkers for PCa. The mRNA expression levels of 14 genes, including AMACR, AR, PCA3, SPINK1, TMPRSS2-ERG, KLK3, ACSM1, CACNA1D, DLX1, LMNB1, PLA2G7, RHOU, SPON2, and TDRD1, were measured by truly quantitative reverse transcription PCR in different prostate tissue samples from men with and without PCa. For the last eight genes, their function in PCa progression was studied by specific siRNA knockdown in PC-3 and VCaP cells. The results from radical prostatectomy and cystoprostatectomy samples showed statistically significant overexpression of all target genes except KLK3 in men with PCa compared with men without PCa. Statistically significant differences were also observed between low and high Gleason grade tumors (for PLA2G7), PSA relapse versus no relapse (for SPON2), and low versus high TNM stages (for CACNA1D and DLX1). Functional studies and siRNA silencing results revealed a cytotoxic effect of the knockdown of DLX1, PLA2G7, and RHOU, and altered tumor cell invasion upon PLA2G7, RHOU, ACSM1, and CACNA1D knockdown in 3D conditions. In addition, effects on tumor cell motility were observed after silencing PLA2G7 and RHOU in 2D monolayer cultures. Altogether, these findings indicate the possibility of utilizing these new markers as diagnostic and prognostic markers, and they may also represent therapeutic targets for PCa.

Relevance:

100.00%

Publisher:

Abstract:

Atherosclerosis is a disease that, through the accumulation of lipid plaques, causes hardening of the arterial wall and narrowing of the lumen. These lesions are generally located in the coronary, carotid, aortic, renal, digestive and peripheral arterial segments. With regard to peripheral involvement, lesions of the lower limbs are particularly frequent. The severity of these arterial lesions is usually assessed by the degree of stenosis (>50% reduction of the luminal diameter) on angiography, magnetic resonance imaging (MRI), computed tomography or ultrasound. However, to plan a surgical intervention, a 3D representation of the arterial geometry is preferable. Cross-sectional imaging methods (MRI and computed tomography) perform very well in generating good-quality three-dimensional images, but their use is expensive and invasive for patients. 3D ultrasound is a very promising imaging avenue for the localization and quantification of stenoses. This imaging modality offers distinct advantages, such as convenience, low cost for a non-invasive diagnosis (without irradiation or nephrotoxic contrast agent), and the option of Doppler analysis to quantify blood flow. Since medical robots have already been used successfully in surgery and orthopedics, our team designed a new robotic 3D ultrasound system to detect and quantify lower-limb stenoses. With this new technology, a radiologist manually teaches the robot an ultrasound sweep of the vessel of interest. The robot then repeats the learned trajectory with very high precision, simultaneously controls the ultrasound image acquisition process at a constant sampling step, and safely maintains the force applied by the probe on the patient's skin. Consequently, reconstructing a 3D arterial geometry of the lower limbs with this system could allow highly reliable localization and quantification of stenoses. The objective of this research project was therefore to validate and optimize this robotic 3D ultrasound imaging system. The reliability of a 3D geometry reconstructed in a robotic frame of reference depends strongly on the positioning accuracy and on the calibration procedure. Accordingly, the positioning accuracy of the robotic arm was evaluated throughout its workspace with a phantom specifically designed to simulate the configuration of lower-limb arteries (article 1 - chapter 3). In addition, a Z-shaped crossed-wire phantom was designed to ensure accurate calibration of the robotic system (article 2 - chapter 4). These optimal methods were used to validate the system for clinical application and to find the transformation that converts the coordinates of the 2D ultrasound image into the Cartesian frame of the robotic arm. From these results, any object scanned by the robotic system can be characterized for an adequate 3D reconstruction. Vascular phantoms compatible with several imaging modalities were used to simulate different arterial representations of the lower limbs (article 2 - chapter 4, article 3 - chapter 5).
Validation of the reconstructed geometries was performed through comparative analyses. The accuracy in localizing and quantifying stenoses with this robotic 3D ultrasound imaging system was also determined. These evaluations were carried out in vivo to assess the potential of using such a system in a clinical setting (article 3 - chapter 5).
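
To give a concrete idea of the calibration result mentioned above (mapping 2D ultrasound image coordinates into the robot's Cartesian frame), here is a minimal sketch using homogeneous transforms; the pixel scale and the matrices are placeholders, not the system's actual calibration:

```python
import numpy as np

def pixel_to_robot(u, v, scale_mm, T_probe_image, T_robot_probe):
    """Map a pixel (u, v) of a 2D ultrasound image into 3D robot coordinates (mm)."""
    p_image = np.array([u * scale_mm[0], v * scale_mm[1], 0.0, 1.0])  # homogeneous point
    return (T_robot_probe @ T_probe_image @ p_image)[:3]

scale = (0.2, 0.2)                      # mm per pixel (hypothetical)
T_probe_image = np.eye(4)               # calibration transform (e.g., from a Z-wire phantom)
T_robot_probe = np.eye(4)               # current probe pose reported by the robot
T_robot_probe[:3, 3] = [120.0, 35.0, 250.0]
print(pixel_to_robot(256, 300, scale, T_probe_image, T_robot_probe))
```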

Relevance:

100.00%

Publisher:

Abstract:

CO, O3, and H2O data in the upper troposphere/lower stratosphere (UTLS) measured by the Atmospheric Chemistry Experiment Fourier Transform Spectrometer (ACE-FTS) on Canada’s SCISAT-1 satellite are validated using aircraft and ozonesonde measurements. In the UTLS, validation of chemical trace gas measurements is a challenging task due to small-scale variability in the tracer fields, strong gradients of the tracers across the tropopause, and scarcity of measurements suitable for validation purposes. Validation based on coincidences therefore suffers from geophysical noise. Two alternative methods for the validation of satellite data are introduced, which avoid the usual need for coincident measurements: tracer-tracer correlations, and vertical tracer profiles relative to tropopause height. Both are increasingly being used for model validation as they strongly suppress geophysical variability and thereby provide an “instantaneous climatology”. This allows comparison of measurements between non-coincident data sets, which yields information about the precision and a statistically meaningful error assessment of the ACE-FTS satellite data in the UTLS. By defining a trade-off factor, we show that the measurement errors can be reduced by including more measurements obtained over a wider longitude range into the comparison, despite the increased geophysical variability. Applying the methods then yields the following upper bounds to the relative differences in the mean found between the ACE-FTS and SPURT aircraft measurements in the upper troposphere (UT) and lower stratosphere (LS), respectively: for CO ±9% and ±12%, for H2O ±30% and ±18%, and for O3 ±25% and ±19%. The relative differences for O3 can be narrowed down by using a larger dataset obtained from ozonesondes, yielding a high bias in the ACE-FTS measurements of 18% in the UT and relative differences of ±8% for measurements in the LS. When taking into account the smearing effect of the vertically limited spacing between measurements of the ACE-FTS instrument, the relative differences decrease by 5–15% around the tropopause, suggesting a vertical resolution of the ACE-FTS in the UTLS of around 1 km. The ACE-FTS hence offers unprecedented precision and vertical resolution for a satellite instrument, which will allow a new global perspective on UTLS tracer distributions.
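
A minimal sketch of the coincidence-free comparison idea: bin two independent data sets by altitude relative to the tropopause and take the relative difference of the bin means. The synthetic ozone values below are stand-ins, not ACE-FTS or SPURT data:

```python
import numpy as np

def relative_difference_profile(z_rel_a, x_a, z_rel_b, x_b, edges):
    """Relative difference (%) of bin means between two non-coincident data sets,
    binned by altitude relative to the tropopause."""
    rel_diff = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mean_a = x_a[(z_rel_a >= lo) & (z_rel_a < hi)].mean()
        mean_b = x_b[(z_rel_b >= lo) & (z_rel_b < hi)].mean()
        rel_diff.append(100.0 * (mean_a - mean_b) / mean_b)
    return np.array(rel_diff)

rng = np.random.default_rng(0)
z_sat, z_air = rng.uniform(-3, 3, 500), rng.uniform(-3, 3, 2000)   # km relative to tropopause
o3_sat = 200 + 80 * z_sat + rng.normal(0, 30, 500)                 # ppbv, synthetic
o3_air = 195 + 80 * z_air + rng.normal(0, 30, 2000)
print(relative_difference_profile(z_sat, o3_sat, z_air, o3_air, np.arange(-3, 4)))
```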

Relevance:

100.00%

Publisher:

Abstract:

The estimation of the long-term wind resource at a prospective site based on a relatively short on-site measurement campaign is an indispensable task in the development of a commercial wind farm. The typical industry approach is based on the measure-correlate-predict (MCP) method where a relational model between the site wind velocity data and the data obtained from a suitable reference site is built from concurrent records. In a subsequent step, a long-term prediction for the prospective site is obtained from a combination of the relational model and the historic reference data. In the present paper, a systematic study is presented where three new MCP models, together with two published reference models (a simple linear regression and the variance ratio method), have been evaluated based on concurrent synthetic wind speed time series for two sites, simulating the prospective and the reference site. The synthetic method has the advantage of generating time series with the desired statistical properties, including Weibull scale and shape factors, required to evaluate the five methods under all plausible conditions. In this work, first a systematic discussion of the statistical fundamentals behind MCP methods is provided and three new models, one based on a nonlinear regression and two (termed kernel methods) derived from the use of conditional probability density functions, are proposed. All models are evaluated by using five metrics under a wide range of values of the correlation coefficient, the Weibull scale, and the Weibull shape factor. Only one of all models, a kernel method based on bivariate Weibull probability functions, is capable of accurately predicting all performance metrics studied.
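
The two published reference models are simple enough to sketch directly. The example below fits a linear-regression MCP and a variance-ratio MCP on synthetic concurrent records and applies them to a synthetic long-term reference series (all data made up; the paper's kernel methods are not reproduced here):

```python
import numpy as np

def linear_regression_mcp(ref, site):
    """Fit site = intercept + slope*ref by least squares; return a long-term predictor."""
    slope, intercept = np.polyfit(ref, site, 1)
    return lambda ref_longterm: intercept + slope * ref_longterm

def variance_ratio_mcp(ref, site):
    """Slope chosen to preserve the site variance instead of minimizing squared error."""
    slope = site.std(ddof=1) / ref.std(ddof=1)
    intercept = site.mean() - slope * ref.mean()
    return lambda ref_longterm: intercept + slope * ref_longterm

rng = np.random.default_rng(1)
ref_concurrent = rng.weibull(2.0, 2000) * 8.0                        # concurrent reference speeds
site_concurrent = np.clip(0.9 * ref_concurrent + rng.normal(0, 1.2, 2000), 0, None)
ref_longterm = rng.weibull(2.0, 20000) * 8.0                         # long-term reference record

for name, fit in [("regression", linear_regression_mcp), ("variance ratio", variance_ratio_mcp)]:
    predict = fit(ref_concurrent, site_concurrent)
    lt = predict(ref_longterm)
    print(f"{name}: long-term mean = {lt.mean():.2f} m/s, std = {lt.std(ddof=1):.2f} m/s")
```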

Relevance:

100.00%

Publisher:

Abstract:

Dietary assessment in older adults can be challenging. The Novel Assessment of Nutrition and Ageing (NANA) method is a touch-screen computer-based food record that enables older adults to record their dietary intakes. The objective of the present study was to assess the relative validity of the NANA method for dietary assessment in older adults. For this purpose, three studies were conducted in which a total of ninety-four older adults (aged 65–89 years) used the NANA method of dietary assessment. On a separate occasion, participants completed a 4 d estimated food diary. Blood and 24 h urine samples were also collected from seventy-six of the volunteers for the analysis of biomarkers of nutrient intake. The results from all the three studies were combined, and nutrient intake data collected using the NANA method were compared against the 4 d estimated food diary and biomarkers of nutrient intake. Bland–Altman analysis showed a reasonable agreement between the dietary assessment methods for energy and macronutrient intake; however, there were small, but significant, differences for energy and protein intake, reflecting the tendency for the NANA method to record marginally lower energy intakes. Significant positive correlations were observed between urinary urea and dietary protein intake using both the NANA and the 4 d estimated food diary methods, and between plasma ascorbic acid and dietary vitamin C intake using the NANA method. The results demonstrate the feasibility of computer-based dietary assessment in older adults, and suggest that the NANA method is comparable to the 4 d estimated food diary, and could be used as an alternative to the food diary for the short-term assessment of an individual’s dietary intake.
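
For reference, a Bland-Altman agreement analysis of the kind used above reduces to the mean difference between methods and its 95% limits of agreement; the paired energy intakes below are hypothetical:

```python
import numpy as np

# Hypothetical paired daily energy intakes (MJ/d) from the two methods
nana = np.array([7.9, 8.4, 6.8, 9.1, 7.2, 8.8, 7.5, 6.9])
diary = np.array([8.2, 8.6, 7.1, 9.0, 7.6, 9.1, 7.9, 7.0])

diff = nana - diary
bias = diff.mean()                                   # mean difference between methods
sd = diff.std(ddof=1)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)           # 95% limits of agreement
print(f"bias = {bias:.2f} MJ/d, limits of agreement = {loa[0]:.2f} to {loa[1]:.2f} MJ/d")
```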

Relevance:

100.00%

Publisher:

Abstract:

Intercomparison and evaluation of the global ocean surface mixed layer depth (MLD) fields estimated from a suite of major ocean syntheses are conducted. Compared with the reference MLDs calculated from individual profiles, MLDs calculated from monthly mean and gridded profiles show negative biases of 10–20 m in early spring, related to the re-stratification process of relatively deep mixed layers. The vertical resolution of the profiles also influences the MLD estimation. MLDs are underestimated by approximately 5–7 (14–16) m with a vertical resolution of 25 (50) m when the criterion of potential density exceeding the 10-m value by 0.03 kg m−3 is used for the MLD estimation. Using the larger criterion (0.125 kg m−3) generally reduces the underestimation. In addition, positive biases greater than 100 m are found in wintertime subpolar regions when MLD criteria based on temperature are used. Biases of the reanalyses are due to both model errors and errors related to differences between the assimilation methods. The results show that these errors are partially cancelled out through ensemble averaging. Moreover, the bias in the ensemble mean field of the reanalyses is smaller than in the observation-only analyses. This is largely attributed to the comparatively higher resolution of the reanalyses. The robust reproduction of both the seasonal cycle and the interannual variability by the ensemble mean of the reanalyses indicates the great potential of the ensemble mean MLD field for investigating and monitoring upper-ocean processes.
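
The density-threshold criterion mentioned above is straightforward to apply to a single profile. A minimal sketch, with a synthetic density profile rather than reanalysis output:

```python
import numpy as np

def mixed_layer_depth(depth, sigma, criterion=0.03, ref_depth=10.0):
    """Shallowest depth (m) where potential density exceeds its value at ref_depth by criterion."""
    sigma_ref = np.interp(ref_depth, depth, sigma)
    exceed = (depth > ref_depth) & (sigma >= sigma_ref + criterion)
    if not exceed.any():
        return depth[-1]                       # mixed layer reaches the bottom of the profile
    i = np.argmax(exceed)                      # first level satisfying the criterion
    # linear interpolation between the level above and the first exceeding level
    return np.interp(sigma_ref + criterion, sigma[i - 1:i + 1], depth[i - 1:i + 1])

depth = np.arange(0, 500, 5.0)                                          # m
sigma = 25.5 + 0.0005 * depth + 0.8 / (1 + np.exp(-(depth - 80) / 10))  # kg m^-3, synthetic
print(f"MLD (0.03 criterion)  = {mixed_layer_depth(depth, sigma):.1f} m")
print(f"MLD (0.125 criterion) = {mixed_layer_depth(depth, sigma, criterion=0.125):.1f} m")
```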

Relevance:

100.00%

Publisher:

Abstract:

The importance of nutrient intakes in osteoporosis prevention and treatment is widely recognized. The objective of the present study was to develop and validate an FFQ for women with osteoporosis. The questionnaire was composed of 60 items, separated into 10 groups. The relative validation was accomplished through comparison of a 3-day food record (3DR) with the FFQ. The 3DR was applied to 30 elderly women with confirmed osteoporosis, and after 45 days the FFQ was administered. Statistical analysis comprised the Kolmogorov-Smirnov test, Student's t test and Pearson's correlation coefficient. The agreement between the two methods was evaluated by the frequency of similar classification into quartiles and by the Bland-Altman method. No significant differences between the methods were observed for the mean intakes of the evaluated nutrients, except for carbohydrate and magnesium. Pearson correlation coefficients were positive and statistically significant for all nutrients. The overall proportion of subjects classified in the same quartile by the two methods was on average 50.01%, and in opposite quartiles 0.47%. For calcium intake, only 3% of subjects were classified in opposite extreme quartiles by the two methods. The Bland-Altman analysis demonstrated that the differences obtained by the two methods for each subject were well distributed around the mean difference, and that the disagreement increases as the mean intake increases. These results indicate that the FFQ for elderly women with osteoporosis presented here is highly acceptable and is an accurate method that can be used in large-scale or clinical studies for the evaluation of nutrient intakes in a similar population.
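
The quartile cross-classification used above can be sketched as follows; the calcium intakes are simulated, not the study's data:

```python
import numpy as np

def quartile_agreement(a, b):
    """Share (%) of subjects classified in the same and in opposite quartiles by two methods."""
    qa = np.searchsorted(np.quantile(a, [0.25, 0.5, 0.75]), a)   # quartile index 0-3
    qb = np.searchsorted(np.quantile(b, [0.25, 0.5, 0.75]), b)
    same = np.mean(qa == qb)
    opposite = np.mean(np.abs(qa - qb) == 3)                     # 1st vs 4th quartile
    return 100.0 * same, 100.0 * opposite

rng = np.random.default_rng(2)
ffq = rng.normal(900, 250, 30)                       # calcium intake by FFQ (mg/d), simulated
record = ffq + rng.normal(0, 120, 30)                # calcium intake by 3-day record (mg/d)
same, opposite = quartile_agreement(ffq, record)
print(f"same quartile: {same:.0f}%, opposite quartile: {opposite:.0f}%")
```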

Relevance:

100.00%

Publisher:

Abstract:

Background: Oxidative modification of low-density lipoprotein (LDL) plays a key role in the pathogenesis of atherosclerosis. LDL(-) is present in the blood plasma of healthy subjects and at higher concentrations in diseases with high cardiovascular risk, such as familial hypercholesterolemia or diabetes. Methods: We developed and validated a sandwich ELISA for LDL(-) in human plasma using two monoclonal antibodies against LDL(-) that do not bind to native LDL, extensively copper-oxidized LDL or malondialdehyde-modified LDL. The characteristics of assay performance, such as the limits of detection and quantification, accuracy, and inter- and intra-assay precision, were evaluated. Linearity, interference and stability tests were also performed. Results: The calibration range of the assay is 0.625-20.0 mU/L at a 1:2000 sample dilution. ELISA validation showed intra- and inter-assay precision and recovery within the required limits for immunoassays. The limits of detection and quantification were 0.423 mU/L and 0.517 mU/L LDL(-), respectively. The intra- and inter-assay coefficients of variation ranged from 9.5% to 11.5% and from 11.3% to 18.9%, respectively. Recovery of LDL(-) ranged from 92.8% to 105.1%. Conclusions: This ELISA represents a very practical tool for measuring LDL(-) in human blood for widespread research and clinical use. Clin Chem Lab Med 2008; 46: 1769-75.
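
Intra- and inter-assay precision of the kind reported above can be computed from replicate runs of a control sample. A minimal sketch with hypothetical LDL(-) values:

```python
import numpy as np

# Hypothetical LDL(-) concentrations (mU/L): three runs, four replicates of the same control
runs = np.array([[5.1, 5.3, 4.9, 5.2],     # run 1
                 [5.6, 5.4, 5.8, 5.5],     # run 2
                 [4.8, 5.0, 4.9, 5.1]])    # run 3

intra_cv = 100.0 * (runs.std(axis=1, ddof=1) / runs.mean(axis=1)).mean()  # mean within-run CV
inter_cv = 100.0 * runs.mean(axis=1).std(ddof=1) / runs.mean()            # between-run CV
print(f"intra-assay CV = {intra_cv:.1f}%, inter-assay CV = {inter_cv:.1f}%")
```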

Relevance:

100.00%

Publisher:

Abstract:

Policy hierarchies and automated policy refinement are powerful approaches to simplify the administration of security services in complex network environments. A crucial issue for the practical use of these approaches is to ensure the validity of the policy hierarchy, i.e., since the policy sets for the lower levels are automatically derived from the abstract policies (defined by the modeller), we must be sure that the derived policies uphold the high-level ones. This paper builds upon previous work on Model-based Management, particularly on the Diagram of Abstract Subsystems approach, and goes further to propose a formal validation approach for the policy hierarchies yielded by the automated policy refinement process. We establish general validation conditions for a multi-layered policy model, i.e., necessary and sufficient conditions that a policy hierarchy must satisfy so that the lower-level policy sets are valid refinements of the higher-level policies according to the criteria of consistency and completeness. Relying upon the validation conditions and upon axioms about the model's representativeness, two theorems are proved to ensure compliance between the resulting system behaviour and the abstract policies that are modelled.
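
As a loose, informal illustration only (not the paper's formal model or its validation conditions), the toy check below tests a derived low-level rule set for consistency and completeness against an abstract permission policy:

```python
# Toy example: does a derived rule set uphold an abstract allow-policy?
from itertools import product

# Abstract policy: which (user group, service) pairs must be allowed.
high_level_allow = {("staff", "mail"), ("staff", "web"), ("guests", "web")}

# Derived low-level rules produced by an (assumed) refinement step.
low_level_allow = {("staff", "mail"), ("staff", "web"), ("guests", "web")}
low_level_deny = {("guests", "mail")}

groups, services = {"staff", "guests"}, {"mail", "web"}
universe = set(product(groups, services))

# Consistency: no derived rule permits something the abstract policy does not allow.
consistent = low_level_allow <= high_level_allow
# Completeness: everything the abstract policy requires is permitted at the low level,
# and every remaining combination is explicitly denied (default-deny closure).
complete = high_level_allow <= low_level_allow and (universe - low_level_allow) <= low_level_deny

print(f"consistent = {consistent}, complete = {complete}")
```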