961 results for Precision-recall analysis


Relevance:

30.00%

Publisher:

Abstract:

A flow injection method for the quantitative analysis of vancomycin hydrochloride, C₆₆H₇₅Cl₂N₉O₂₄·HCl (HVCM), based on the reaction with copper(II) ions, is presented. HVCM forms a lilac-blue complex with copper ions at pH ≅ 4.5 in aqueous solutions, with maximum absorption at 555 nm. The detection limit was estimated at about 8.5×10⁻⁵ mol L⁻¹, the quantitation limit at about 2.5×10⁻⁴ mol L⁻¹, and about 30 determinations can be performed per hour. The accuracy of the method was tested through recovery procedures in the presence of four different excipients, in the proportion 1:1 w/w. The results were compared with those obtained with the batch spectrophotometric and HPLC methods. Statistical comparison was done using Student's t-test. Complete agreement was found at the 95% confidence level between the proposed flow injection and the batch spectrophotometric methods, which present similar precision (RSD: 2.1% vs. 1.9%).
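The statistical comparison described above is a standard paired Student's t-test between two analytical methods; a minimal sketch follows, with hypothetical concentration values rather than data from the study:

```python
# Paired t-test between two analytical methods on the same samples.
# Concentration values are hypothetical, not data from the study.
import numpy as np
from scipy import stats

flow_injection = np.array([4.98, 5.03, 5.10, 4.95, 5.07])  # mmol/L, hypothetical
batch_spectro  = np.array([5.01, 5.00, 5.12, 4.97, 5.04])  # mmol/L, hypothetical

t_stat, p_value = stats.ttest_rel(flow_injection, batch_spectro)
# At the 95% confidence level the methods agree if p > 0.05.
print(f"t = {t_stat:.3f}, p = {p_value:.3f}, agree: {p_value > 0.05}")

# Relative standard deviation (RSD), the precision figure quoted above.
rsd = lambda x: 100 * x.std(ddof=1) / x.mean()
print(f"RSD: {rsd(flow_injection):.1f}% vs. {rsd(batch_spectro):.1f}%")
```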

Relevance:

30.00%

Publisher:

Abstract:

Radiostereometric analysis (RSA) is a highly accurate method for the measurement of in vivo micromotion of orthopaedic implants. Validation of the RSA method is a prerequisite for performing clinical RSA studies. Only a limited number of studies have utilised the RSA method in the evaluation of migration and inducible micromotion during fracture healing. Volar plate fixation of distal radial fractures has increased in popularity. There is still very little prospective randomised evidence supporting the use of these implants over other treatments. The aim of this study was to investigate the precision, accuracy, and feasibility of using RSA in the evaluation of healing in distal radius fractures treated with a volar fixed-angle plate. A physical phantom model was used to validate the RSA method for simple distal radius fractures. A computer simulation model was then used to validate the RSA method for more complex interfragmentary motion in intra-articular fractures. A separate pre-clinical investigation was performed in order to evaluate the possibility of using novel resorbable markers for RSA. Based on the validation studies, a prospective RSA cohort study of fifteen patients with plated AO type-C distal radius fractures with a 1-year follow-up was performed. RSA was shown to be highly accurate and precise in the measurement of fracture micromotion using both physical and computer simulated models of distal radius fractures. Resorbable RSA markers demonstrated potential for use in RSA. The RSA method was found to have a high clinical precision. The fractures underwent significant translational and rotational migration during the first two weeks after surgery, but not thereafter. Maximal grip caused significant translational and rotational interfragmentary micromotion. This inducible micromotion was detectable up to eighteen weeks, even after the achievement of radiographic union. The application of RSA in the measurement of fracture fragment migration and inducible interfragmentary micromotion in AO type-C distal radius fractures is feasible but technically demanding. RSA may be a unique tool in defining the progress of fracture union.
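At its core, RSA estimates the rigid-body motion of a marker cloud (implanted tantalum beads) between examinations. A minimal sketch of that computation using the Kabsch (SVD) algorithm is given below; the marker coordinates are hypothetical, and this is not the clinical RSA software used in the study:

```python
# Rigid-body fitting: given 3-D marker positions of a fragment at two
# exams (N x 3 arrays), find rotation R and translation t mapping one
# set onto the other. Coordinates below are hypothetical.
import numpy as np

def rigid_body_motion(P, Q):
    """Return R (3x3) and t (3,) minimising ||R @ P_i + t - Q_i||."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

markers_exam1 = np.array([[0., 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]])
markers_exam2 = markers_exam1 + np.array([0.12, -0.05, 0.30])  # pure shift
R, t = rigid_body_motion(markers_exam1, markers_exam2)
print("fragment translation (mm):", np.round(t, 3))
```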

Relevance:

30.00%

Publisher:

Abstract:

Local head losses must be considered to properly estimate the maximum length of drip irrigation laterals. The aim of this work was to develop a model, based on dimensional analysis, for calculating head loss along laterals while accounting for in-line drippers. Several measurements were performed with 12 emitter models to obtain the experimental data required for developing and assessing the model. Based on the Camargo & Sentelhas coefficient, the model showed excellent precision and accuracy in estimating head loss. The deviation between estimated and observed head loss values increased with the head loss itself, reaching a maximum of 0.17 m. The maximum relative error was 33.75%, and only 15% of the data set presented relative errors higher than 20%. Neglecting local head losses overestimated the maximum lateral length by 19.48% for pressure-compensating drippers and by 16.48% for non-pressure-compensating drippers.
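As a rough illustration of the hydraulics involved (not the dimensional-analysis model fitted in the paper), the sketch below sums Darcy-Weisbach friction losses and a local loss at each in-line emitter along a lateral; all pipe and emitter parameters are hypothetical:

```python
# Friction loss (Darcy-Weisbach, Blasius friction factor for smooth
# pipes) plus a local loss K*v^2/2g at each in-line emitter.
import math

def lateral_head_loss(q_emitter, n_emitters, spacing, d_pipe, k_local,
                      nu=1.004e-6, g=9.81):
    """Total head loss (m), accumulated segment by segment from the far end."""
    area = math.pi * d_pipe ** 2 / 4
    h_total, flow = 0.0, 0.0
    for _ in range(n_emitters):
        flow += q_emitter                  # discharge grows toward the inlet
        v = flow / area
        reynolds = v * d_pipe / nu
        f = 0.316 * reynolds ** -0.25 if reynolds > 2300 else 64 / max(reynolds, 1e-9)
        h_total += f * (spacing / d_pipe) * v ** 2 / (2 * g)  # friction loss
        h_total += k_local * v ** 2 / (2 * g)                 # emitter local loss
    return h_total

# 100 emitters of 4 L/h, 0.3 m apart, on a 16 mm lateral; K = 0.4 assumed
print(f"total head loss: {lateral_head_loss(4 / 3.6e6, 100, 0.3, 0.016, 0.4):.3f} m")
```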

Relevance:

30.00%

Publisher:

Abstract:

Transportation of fluids is one of the most common and energy-intensive processes in the industrial and HVAC sectors. Pumping systems are frequently subject to engineering malpractice when dimensioned, which can lead to poor operational efficiency. Moreover, pump monitoring requires dedicated measuring equipment, which implies costly investment. Inefficient pump operation and improper maintenance can increase energy costs substantially and even lead to pump failure. A centrifugal pump is commonly driven by an induction motor. Driving the induction motor with a frequency converter can diminish the energy consumption of pump drives and provide better process control. In addition, modern frequency converters can estimate induction machine signals, dispensing with the use of sensors. If the estimates are accurate enough, a pump can be modelled and integrated into the frequency converter control scheme. This opens the possibility of joint motor and pump monitoring and diagnostics, allowing the detection of reliability-reducing operating states that can lead to additional maintenance costs. The goal of this work is to study the accuracy of the rotational speed, torque and shaft power estimates calculated by a frequency converter. Laboratory tests were performed to observe estimate behaviour in both steady-state and transient operation. An induction machine driven by a vector-controlled frequency converter, coupled with another induction machine acting as the load, was used in the tests. The estimated quantities were obtained through the frequency converter's Trend Recorder software. A high-precision HBM T12 torque-speed transducer was used to measure the actual values of the aforementioned variables. The effect of the flux optimization energy-saving feature on estimate quality was also studied. A processing function was developed in MATLAB to compare the obtained data. The results confirm the suitability of this particular converter to provide estimates accurate enough for pumping applications.
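The thesis mentions a MATLAB processing function for comparing converter estimates with transducer measurements; a minimal Python equivalent is sketched below, with placeholder arrays standing in for time-aligned estimated and measured signals:

```python
# Compare estimated vs. measured signals (torque, speed or shaft power).
import numpy as np

def estimate_errors(estimated, measured):
    """Return mean relative error (%) and RMS error of an estimate."""
    estimated, measured = np.asarray(estimated), np.asarray(measured)
    rel = 100 * (estimated - measured) / measured
    rmse = np.sqrt(np.mean((estimated - measured) ** 2))
    return rel.mean(), rmse

torque_est  = [20.1, 19.8, 20.4, 20.2]   # N*m, hypothetical converter estimates
torque_meas = [20.0, 20.0, 20.1, 20.3]   # N*m, hypothetical T12 readings
mre, rmse = estimate_errors(torque_est, torque_meas)
print(f"mean relative error {mre:+.2f} %, RMSE {rmse:.3f} N*m")
```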

Relevance:

30.00%

Publisher:

Abstract:

The choice of industrial development options and the corresponding allocation of research funds are becoming more and more difficult because of increasing R&D costs and pressure for shorter development periods. Forecasting research progress is based on the analysis of publication activity in the field of interest, as well as on the dynamics of its change. Moreover, the allocation of funds is hindered by the exponential growth in the number of publications and patents: thematic clusters become more and more difficult to identify, and their evolution hard to follow. Existing approaches to structuring a research field and identifying its development are very limited; they do not identify thematic clusters with adequate precision, and the identified trends are often ambiguous. There is therefore a clear need for methods and tools that can identify developing fields of research. The main objective of this thesis is to develop tools and methods that help identify promising research topics in the field of separation processes. Two structuring methods and three approaches for identifying development trends are proposed. The proposed methods have been applied to the analysis of research on distillation and filtration. The results show that the developed methods are universal and could be used to study various fields of research. The identified thematic clusters and the forecasted trends of their development were confirmed in almost all tested cases, which supports the universality of the proposed methods. The results allow the identification of fast-growing scientific fields as well as topics characterized by stagnant or diminishing research activity.
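One simple way to flag fast-growing versus stagnant topics, in the spirit of the trend analysis described above (though not the thesis's actual method), is a log-linear fit of yearly publication counts; the counts below are hypothetical:

```python
# Annual growth rate from a log-linear fit of publication counts.
import numpy as np

def growth_rate(years, counts):
    """Slope of log(count) vs. year (e.g. 0.10 means roughly +10 %/yr)."""
    slope, _ = np.polyfit(years, np.log(counts), 1)
    return slope

years = np.arange(2000, 2010)
distillation = [120, 118, 125, 121, 119, 123, 120, 122, 118, 121]  # stagnant
membranes    = [40, 46, 52, 61, 70, 82, 95, 110, 128, 148]         # growing
for name, counts in [("distillation", distillation), ("membranes", membranes)]:
    print(f"{name}: {growth_rate(years, counts):+.3f} per year")
```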

Relevance:

30.00%

Publisher:

Abstract:

This Master's thesis is dedicated to investigating and testing conventional and non-conventional Kramers-Kronig relations on simulated and experimentally measured spectra, for both linear and nonlinear optical spectral data. Particular attention is paid to a new method for obtaining the complex refractive index from a transmittance spectrum without direct information on the sample thickness. The latter method is coupled with terahertz time-domain spectroscopy and Kramers-Kronig analysis applied to test the validity of the complex refractive index. The precision of the data inversion is evaluated by the root-mean-square error. The methods are tested over different spectral ranges, and their future implementation is considered.
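For reference, a generic numerical Kramers-Kronig inversion is sketched below (this is a textbook construction, not the thesis's thickness-free method): the real refractive-index change is recovered from a synthetic extinction spectrum using Maclaurin's formula, which evaluates the principal-value integral on opposite-parity grid points so the singularity is never sampled:

```python
# Numerical Kramers-Kronig transform with synthetic (Lorentzian) data.
import numpy as np

def kk_real_from_imag(w, k):
    """n(w) - 1 = (2/pi) * P-integral of w'k(w') / (w'^2 - w^2) dw'."""
    dw = w[1] - w[0]
    out = np.zeros_like(w)
    for i in range(len(w)):
        j = np.arange(1 - i % 2, len(w), 2)   # points of opposite parity
        out[i] = (2 / np.pi) * 2 * dw * np.sum(w[j] * k[j] /
                                               (w[j] ** 2 - w[i] ** 2))
    return out

w = np.linspace(0.01, 10.0, 2000)
k = 0.5 / ((w - 5.0) ** 2 + 0.25)             # synthetic absorption line
n1 = kk_real_from_imag(w, k)
print(f"max |n - 1| = {np.abs(n1).max():.4f}")
```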

Relevance:

30.00%

Publisher:

Abstract:

This thesis concerns the analysis of epidemic models. We adopt the Bayesian paradigm and develop suitable Markov chain Monte Carlo (MCMC) algorithms. This is done by considering the 1995 Ebola outbreak in the Democratic Republic of Congo, former Zaïre, as a case of SEIR epidemic models. We model the Ebola epidemic deterministically using ODEs and stochastically through SDEs to take into account a possible bias in each compartment. Since the model has unknown parameters, we use different methods to estimate them, such as least squares, maximum likelihood and MCMC. The motivation for choosing MCMC over other existing methods in this thesis is its ability to tackle complicated nonlinear problems with a large number of parameters. First, in a deterministic Ebola model, we compute the likelihood function by the sum-of-squared-residuals method and estimate parameters using the LSQ and MCMC methods. We sample parameters and then use them to calculate the basic reproduction number and to study the disease-free equilibrium. From the chain sampled from the posterior, we run convergence diagnostics and confirm the viability of the model. The results show that the Ebola model fits the observed onset data with high precision, and all the unknown model parameters are well identified. Second, we convert the ODE model into an SDE Ebola model. We compute the likelihood function using the extended Kalman filter (EKF) and estimate parameters again. The motivation for using the SDE formulation here is to consider the impact of modelling errors; moreover, the EKF approach allows us to formulate a filtered likelihood for the parameters of such a stochastic model. We use the MCMC procedure to attain the posterior distributions of the parameters of the drift and diffusion parts of the SDE Ebola model. In this thesis, we analyse two cases: (1) the model error covariance matrix of the dynamic noise is close to zero, i.e. only a small stochasticity is added into the model; the results are then similar to the ones obtained from the deterministic Ebola model, even though the methods of computing the likelihood function differ; (2) the model error covariance matrix is different from zero, i.e. a considerable stochasticity is introduced into the Ebola model, accounting for the situation where we know that the model is not exact. As a result, we obtain parameter posteriors with larger variances. Consequently, the model predictions show larger uncertainties, in accordance with the assumption of an incomplete model.
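A compact sketch of the deterministic half of this pipeline is shown below: an SEIR ODE model, a sum-of-squared-residuals (SSR) likelihood against onset data, and random-walk Metropolis sampling. The data, error variance and rate constants are hypothetical placeholders, not the 1995 outbreak data:

```python
# SEIR model + SSR likelihood + random-walk Metropolis (toy example).
import numpy as np
from scipy.integrate import solve_ivp

def seir(t, y, beta, sigma, gamma, N):
    S, E, I, R = y
    return [-beta * S * I / N, beta * S * I / N - sigma * E,
            sigma * E - gamma * I, gamma * I]

def ssr(params, t_obs, i_obs, N=5000.0):
    beta, gamma = params                     # sigma fixed: assumed incubation
    sol = solve_ivp(seir, (0, t_obs[-1]), [N - 1, 0, 1, 0],
                    args=(beta, 1 / 9.4, gamma, N), t_eval=t_obs)
    return np.sum((sol.y[2] - i_obs) ** 2)

t_obs = np.arange(0.0, 60.0, 5.0)
i_obs = 30 * np.exp(-((t_obs - 30) / 12) ** 2)   # hypothetical epidemic curve

rng = np.random.default_rng(0)
theta, s2, chain = np.array([0.3, 0.2]), 4.0, []  # (beta, gamma), error var
cur = ssr(theta, t_obs, i_obs)
for _ in range(1000):
    prop = theta + rng.normal(0, 0.01, 2)         # random-walk proposal
    if np.all(prop > 0):
        new = ssr(prop, t_obs, i_obs)
        if np.log(rng.random()) < (cur - new) / (2 * s2):  # Gaussian errors
            theta, cur = prop, new
    chain.append(theta)
chain = np.array(chain[200:])                     # drop burn-in
print("posterior mean (beta, gamma):", chain.mean(axis=0))
print("basic reproduction number R0 ~", (chain[:, 0] / chain[:, 1]).mean())
```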

Relevance:

30.00%

Publisher:

Abstract:

Castor bean cropping has great social and economic value, but its production has been affected by factors such as the low quality of the seeds used for sowing. Quick and precise evaluation of seed quality by the X-ray test is known to be an effective method for evaluating seed lots, but little is known about the correlation between the type of radiographic image and seed quality. The potential of X-ray analysis as a marker of seed physiological quality, and as a first step towards the use of computer-assisted image analysis, was investigated using castor bean seeds of different cultivars. The seeds were classified according to the internal morphology visualized in the radiographs and subjected to germination, seedling emergence and seedling growth rate tests. It was possible to identify the different types of internal tissue, as well as morphological and physical damage, in castor bean seeds using the X-ray test. Tissues generating translucent images, embryo deformation, and tissues with less than 50% of the endosperm reserves or with spots negatively affected the physiological potential of the seed lots. Radiographic analysis is effective as an instrument to improve castor bean seed lot quality. This non-destructive analysis allows the prediction of seedling performance and enables the selection of high-quality seeds under the standards of sustainable and precision agriculture.

Relevance:

30.00%

Publisher:

Abstract:

A flow injection hydride generation direct current plasma atomic emission spectrometric (FI-HG-DCP-AES) method was developed for the determination of lead at the ng mL⁻¹ level. Potassium ferricyanide (K₃Fe(CN)₆) was used along with sodium tetrahydroborate(III) (NaBH₄) to produce plumbane (PbH₄) in an acid medium. The design of a gas-liquid separator (hydride generator) was tested, and the parameters of the flow injection system were optimized to achieve a good detection limit and sample throughput. The technique gave a detection limit of 0.7 ng mL⁻¹ (3σb). The precision at the 20 ng mL⁻¹ level was 1.6% RSD with 11 measurements (n = 11). The sample loop volume was 500 µL, and a sample throughput of 120 h⁻¹ was achieved. The transition elements Fe(II), Fe(III), Cd(II), Co(II), Mn(II), Ni(II) and Zn(II) do not interfere in this method, but 1 mg L⁻¹ Cu(II) will suppress 50% of the signal from a sample containing 20 ng mL⁻¹ Pb. The method was successfully applied to determine lead in a calcium carbonate (CaCO₃) matrix of banded coral skeletons from Si-Chang Island in Thailand.
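The quoted figures of merit follow from standard definitions (detection limit as three standard deviations of the blank divided by the calibration slope; precision as relative standard deviation); a small sketch with hypothetical readings:

```python
# 3*sigma_b detection limit and RSD from hypothetical blank/standard data.
import numpy as np

blank_signals = np.array([0.010, 0.012, 0.009, 0.011, 0.010, 0.011])  # a.u.
slope = 0.0048                 # calibration slope, a.u. per ng/mL (hypothetical)
lod = 3 * blank_signals.std(ddof=1) / slope

std_20 = np.array([20.3, 19.9, 20.1, 20.4, 19.8, 20.2, 20.0, 19.7,
                   20.3, 20.1, 19.9])     # n = 11 replicates at 20 ng/mL
rsd = 100 * std_20.std(ddof=1) / std_20.mean()
print(f"LOD = {lod:.2f} ng/mL, RSD at 20 ng/mL = {rsd:.1f}% (n = {len(std_20)})")
```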

Relevance:

30.00%

Publisher:

Abstract:

This study was done to test the effectiveness of the Precision Fluency Shaping Program in controlling stuttering behaviour in adults. Two sites were chosen, each using the Precision Fluency Shaping Program to treat stuttering. At each clinic, a Speech Pathologist made a random selection of the subjects' pre- and post-therapy video-taped interviews, totalling 20 in all. During the interviews, the clients were asked questions and read a short passage to determine the frequency of stuttering in natural conversation and in reading. Perceptions of Stuttering Inventory questionnaires were also filled in before and after therapy. Two judges were trained to identify stuttering behaviour and were given an inter-rater reliability test at selected intervals throughout the study. Protocols made of each interview tape were scored for (a) stuttering behaviour and (b) words spoken or read. A repeated-measures analysis of variance was used to compare before and after scores for conversations, readings, and the Perceptions of Stuttering Inventory, to determine whether the Precision Fluency Shaping Program controlled stuttering behaviour significantly. A Pearson correlation test was also administered to determine whether a relationship existed between the Perceptions of Stuttering Inventory and (i) conversation and (ii) reading scores.
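With only two time points (pre/post), a repeated-measures ANOVA reduces to a paired t-test, and the Pearson correlation is a direct library call; a sketch with hypothetical stuttering frequencies and inventory scores:

```python
# Paired t-test (pre vs. post) and Pearson correlation, hypothetical data.
import numpy as np
from scipy import stats

pre  = np.array([12.1, 8.4, 15.2, 9.9, 11.3, 7.8, 13.5, 10.2, 9.1, 14.0])
post = np.array([ 3.2, 2.1,  4.8, 2.5,  3.9, 1.9,  4.1,  3.0, 2.2,  4.4])
t, p = stats.ttest_rel(pre, post)
print(f"pre vs. post therapy: t = {t:.2f}, p = {p:.4f}")

psi = np.array([28, 35, 22, 40, 31, 27, 38, 25, 30, 33])  # hypothetical PSI
r, p_r = stats.pearsonr(psi, pre)
print(f"PSI vs. conversation stuttering: r = {r:.2f}, p = {p_r:.4f}")
```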

Relevance:

30.00%

Publisher:

Abstract:

Affiliation: Margaret Cargo : Département de médecine sociale et préventive, Faculté de médecine, Université de Montréal

Relevance:

30.00%

Publisher:

Abstract:

Frameworks and libraries are indispensable to today's software systems. When they evolve, it is often tedious and costly for developers to update their code. Consequently, approaches have been proposed to help developers migrate their code. Generally, these approaches cannot automatically identify one-replaced-by-many and many-replaced-by-one method change rules. Moreover, they often trade off recall against precision in their results by using one or more experimentally determined thresholds. We present AURA (AUtomatic change Rule Assistant), a novel hybrid approach that combines call dependency analysis and text similarity analysis to overcome these limitations. We implemented AURA in Java and compared its results on five frameworks with those of three previous approaches by Dagenais and Robillard, M. Kim et al., and Schäfer et al. The comparison shows that, on average, the recall of AURA is 53.07% higher than that of the other approaches, with similar precision (0.10% lower).
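AURA combines call-dependency analysis with text similarity; the fragment below sketches only the text-similarity half, ranking candidate replacement methods for a removed method by lexical closeness of their signatures. The method names are illustrative, not taken from the evaluated frameworks:

```python
# Rank candidate replacements for a removed method by string similarity.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a, b).ratio()

removed = "Widget.getPreferredSize()"
candidates = ["Widget.computePreferredSize(int,int)",
              "Widget.getBounds()",
              "Layout.computeSize(Widget)"]
for cand in sorted(candidates, key=lambda c: -similarity(removed, c)):
    print(f"{similarity(removed, cand):.2f}  {cand}")
```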

Relevance:

30.00%

Publisher:

Abstract:

Software is in constant evolution, requiring continuous maintenance and development. It undergoes changes throughout its life, whether through the addition of new features or the fixing of bugs in the code. As software evolves, its architecture tends to degrade over time and becomes less adaptable to new user requirements; it grows more complex and harder to maintain. In some cases, developers prefer to redesign the architecture from scratch rather than prolong its life, which causes a significant increase in development and maintenance costs. Developers must therefore understand the factors that drive architecture degradation, so that they can take proactive measures that ease future changes and slow the degradation. Architecture degradation occurs when developers who do not understand the original design make changes to the software. On the one hand, making changes without understanding their impact can introduce bugs and lead to the premature retirement of the software. On the other hand, developers who lack the knowledge and/or experience to solve a design problem may introduce design defects, which make software harder to maintain and evolve. Consequently, developers need mechanisms to understand the impact of a change on the rest of the software, and tools to detect design defects so that they can be corrected. In this thesis, we propose three main contributions. The first contribution concerns the assessment of software architecture degradation. This assessment uses a diagram-matching technique, applied to diagrams such as class diagrams, to identify structural changes between several versions of a software architecture. This step requires identifying class renamings, so the first step of our approach identifies class renamings during the evolution of the software architecture. The second step matches several versions of an architecture to identify its stable parts and its degrading parts; we propose bit-vector and clustering algorithms to analyse the correspondence between versions of an architecture. The third step measures the degradation of the architecture during the evolution of the software; we propose a set of metrics on the stable parts of the software to evaluate this degradation. The second contribution relates to change impact analysis in software. In this context, we present a new metaphor inspired by seismology: our approach treats a change to a class as an earthquake that propagates through the software along a long chain of intermediate classes.
Our approach combines the analysis of structural dependencies between classes with the analysis of their history (co-change relations) to measure the extent of change propagation in the software, i.e., how a change propagates from the modified class to other classes of the software. The third contribution concerns the detection of design defects. We propose a metaphor inspired by the natural immune system. Like any living creature, system designs are exposed to diseases, which are design defects; detection approaches are the defence mechanisms of system designs. A natural immune system can detect similar pathogens with good precision. This good precision has inspired a family of classification algorithms called artificial immune systems (AIS), which we use to detect design defects. The contributions were evaluated on object-oriented open-source software, and the results obtained allow us to draw the following conclusions: • The Tunnel Triplets Metric (TTM) and Common Triplets Metric (CTM) give developers good indications of architecture degradation. A decreasing TTM indicates that the original design of the architecture has degraded; a stable TTM indicates the stability of the original design, meaning that the system is adapted to new user requirements. • Seismology is a useful metaphor for change impact analysis. Indeed, changes propagate through systems like earthquakes: the impact of a change is greatest around the changed class and decreases progressively with the distance from that class. Our approach helps developers identify the impact of a change. • The immune system is a useful metaphor for the detection of design defects. Experimental results showed that the precision and recall of our approach are comparable to or better than those of existing approaches.
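A toy rendering of the seismology metaphor described above: change impact radiates from the modified class through the dependency graph, attenuating with each hop. The graph and attenuation factor are illustrative, not the thesis's combined structural/co-change model:

```python
# Breadth-first propagation of change impact with per-hop attenuation.
from collections import deque

deps = {"A": ["B", "C"], "B": ["D"], "C": ["D", "E"], "D": [], "E": []}

def change_impact(graph, epicentre, attenuation=0.5):
    impact, queue = {epicentre: 1.0}, deque([epicentre])
    while queue:
        cls = queue.popleft()
        for neigh in graph[cls]:
            score = impact[cls] * attenuation
            if score > impact.get(neigh, 0.0):   # keep the strongest wave
                impact[neigh] = score
                queue.append(neigh)
    return impact

print(change_impact(deps, "A"))   # impact decays with distance from A
```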

Relevance:

30.00%

Publisher:

Abstract:

Triple quadrupole mass spectrometers coupled with high-performance liquid chromatography are workhorses in quantitative bioanalysis, providing substantial benefits including reproducibility, sensitivity and selectivity for trace analysis. Selected Reaction Monitoring allows targeted assay development, but the data sets generated contain very limited information. Data mining and analysis of non-targeted high-resolution mass spectrometry profiles of biological samples offer the opportunity to perform more exhaustive assessments, including quantitative and qualitative analysis. The objectives of this study were to test method precision and accuracy, to statistically compare bupivacaine drug concentrations in real study samples, and to verify whether high-resolution, accurate-mass data collected in scan mode permit retrospective data analysis, more specifically the extraction of metabolite-related information. The precision and accuracy data obtained with both instruments were equivalent. Overall, accuracy ranged from 106.2% to 113.2% and precision from 1.0% to 3.7%. Statistical comparison using a linear regression between both methods revealed a coefficient of determination (R²) of 0.9996 and a slope of 1.02, demonstrating a very strong correlation between the methods. Individual sample comparisons showed differences from -4.5% to 1.6%, well within the accepted analytical error. Moreover, post-acquisition extracted ion chromatograms at m/z 233.1648 ± 5 ppm (M-56) and m/z 305.2224 ± 5 ppm (M+16) revealed the presence of desbutyl-bupivacaine and three distinct hydroxylated bupivacaine metabolites. Post-acquisition analysis allowed us to produce semi-quantitative evaluations of the concentration-time profiles of the bupivacaine metabolites.
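The cross-instrument comparison above is a straight-line fit between paired measurements; a minimal sketch with hypothetical bupivacaine concentrations:

```python
# Linear regression between two quantitation methods, hypothetical data.
import numpy as np

qqq  = np.array([10.2, 25.1, 49.8, 101.3, 250.6, 499.0])   # ng/mL, QqQ
hrms = np.array([10.4, 25.3, 50.9, 103.0, 254.8, 508.1])   # ng/mL, HRMS

slope, intercept = np.polyfit(qqq, hrms, 1)
r2 = np.corrcoef(qqq, hrms)[0, 1] ** 2
diff_pct = 100 * (hrms - qqq) / qqq                         # per-sample bias
print(f"slope {slope:.3f}, R^2 {r2:.4f}, bias {diff_pct.min():+.1f}% "
      f"to {diff_pct.max():+.1f}%")
```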

Relevance:

30.00%

Publisher:

Abstract:

Targeted peptide methods generally use HPLC-MS/MRM approaches. Depending on the instrumental resolution, interferences may occur when analysing complex biological matrices. HPLC-MS/MRM3 is a technique that provides significantly better selectivity than the HPLC-MS/MRM assay. HPLC-MS/MRM3 allows detection and quantitation by enriching standard MRM with secondary product ions generated within the linear ion trap. Substance P (SP) and neurokinin A (NKA) are tachykinin peptides playing a central role in pain transmission. The objective of this study was to verify whether HPLC-MS/MRM3 could provide significant advantages over a more traditional HPLC-MS/MRM assay for the quantification of SP and NKA in rat spinal cord. The results suggest that reconstructed MRM3 chromatograms display significant improvements, with nearly complete elimination of interfering peaks, but the sensitivity (i.e. signal-to-noise ratio) was severely reduced. The precision (%CV) observed was between 3.5% and 24.1% using HPLC-MS/MRM and in the range of 4.3% to 13.1% with HPLC-MS/MRM3, for SP and NKA. The observed accuracy was within 10% of the theoretical concentrations tested. HPLC-MS/MRM3 may improve an assay's ability to detect differences between samples by significantly reducing potential interferences and, consequently, instrumental errors.