930 results for Precision-recall analysis
Abstract:
A new procedure for determining eleven organochlorine pesticides in soils using microwave-assisted extraction (MAE) and headspace solid-phase microextraction (HS-SPME) is described. The pesticides studied were mirex, α- and γ-chlordane, p,p'-DDT, heptachlor, heptachlor epoxide isomer A, γ-hexachlorocyclohexane, dieldrin, endrin, aldrin and hexachlorobenzene. The HS-SPME was optimized with respect to the most important parameters, such as extraction time, sample volume and temperature. The present analytical procedure requires a reduced volume of organic solvents and avoids the need for extract clean-up steps. Under the optimized conditions, the limits of detection of the method ranged from 0.02 to 3.6 ng/g, intermediate precision ranged from 14 to 36% (as CV%), and recoveries from 8 to 51%. The proposed methodology can be used for rapid screening of soils for the presence of the selected pesticides, and was applied to landfill soil samples.
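As a rough illustration of how the figures of merit cited above are commonly computed (a minimal sketch with hypothetical numbers, not data from this study):

```python
import statistics

# Hypothetical replicate measurements (ng/g) of a spiked soil sample.
replicates = [4.1, 3.6, 4.8, 3.9, 4.5]
spike_level = 10.0   # known amount added (ng/g)
blank_sd = 0.12      # standard deviation of blank signals (ng/g)

# Limit of detection, commonly estimated as 3x the blank standard deviation.
lod = 3 * blank_sd

# Intermediate precision expressed as a coefficient of variation (CV%).
cv_percent = 100 * statistics.stdev(replicates) / statistics.mean(replicates)

# Recovery: fraction of the spiked amount actually measured.
recovery_percent = 100 * statistics.mean(replicates) / spike_level

print(f"LOD = {lod:.2f} ng/g, CV = {cv_percent:.1f}%, recovery = {recovery_percent:.0f}%")
```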
Abstract:
A procedure for the determination of seven indicator PCBs in soils and sediments using microwave-assisted extraction (MAE) and headspace solid-phase microextraction (HS-SPME) prior to GC-MS/MS is described. The HS-SPME was optimized with respect to the most important parameters, such as extraction time, sample volume and temperature. The adopted methodology reduces the consumption of organic solvents and the analysis run time. Under the optimized conditions, the method detection limit ranged from 0.6 to 1 ng/g when 5 g of sample was extracted, the precision on real samples ranged from 4 to 21%, and the recovery from 69 to 104%. The proposed method, whose validation included the analysis of a certified reference material, can be extended to several other PCBs and used in monitoring soils or sediments for the presence of PCBs.
Abstract:
The case describes the development of an internationalization plan for MyFARM, a service of Deimos Engenharia, under the GloCal Radar. This space engineering company hired Lisbon Consulting Company to undertake the project and overcome its lack of market orientation. The consultants' analysis identified Stevens County, Kansas, as the market with the highest potential for MyFARM, and a suitable entry strategy and adaptation of the service to the local market were proposed. The case culminates with the Board of Directors discussing the viability of implementing the consultants' recommendations to begin diversifying the company's revenue streams.
Abstract:
Chromogenic immunohistochemistry (IHC) is omnipresent in cancer diagnosis, but it has also been criticized for its technical limitations in quantifying the level of protein expression on tissue sections, thus potentially masking clinically relevant data. Shifting from qualitative to quantitative, immunofluorescence (IF) has recently gained attention, yet the question of how precisely IF can quantify antigen expression remains unanswered, in particular regarding its technical limitations and applicability to multiple markers. Here we introduce microfluidic precision IF, which accurately quantifies the target expression level on a continuous scale based on microfluidic IF staining of standard tissue sections and low-complexity automated image analysis. We show that the level of HER2 protein expression, continuously quantified using microfluidic precision IF in 25 breast cancer cases, including several cases with equivocal IHC results, can predict the number of HER2 gene copies as assessed by fluorescence in situ hybridization (FISH). Finally, we demonstrate that the working principle of this technology is not restricted to HER2 but can be extended to other biomarkers. We anticipate that our method has the potential to provide automated, fast and high-quality quantitative in situ biomarker data from low-cost immunofluorescence assays, as increasingly required in the era of individually tailored cancer therapy.
Abstract:
A flow injection hydride generation direct current plasma atomic emission spectrometric (FI-HG-DCP-AES) method was developed for the determination of lead at the ng ml⁻¹ level. Potassium ferricyanide (K3Fe(CN)6) was used along with sodium tetrahydroborate(III) (NaBH4) to produce plumbane (PbH4) in an acid medium. The design of a gas-liquid separator (hydride generator) was tested and the parameters of the flow injection system were optimized to achieve a good detection limit and sample throughput. The technique developed gave a detection limit of 0.7 ng ml⁻¹ (3σ). The precision at the 20 ng ml⁻¹ level was 1.6% RSD (n = 11). The volume of the sample loop was 500 µl. A sample throughput of 120 h⁻¹ was achieved. The transition elements Fe(II), Fe(III), Cd(II), Co(II), Mn(II), Ni(II) and Zn(II) do not interfere in this method, but 1 mg l⁻¹ Cu(II) will suppress 50% of the signal from a sample containing 20 ng ml⁻¹ Pb. This method was successfully applied to determine lead in the calcium carbonate (CaCO3) matrix of banded coral skeletons from Si-Chang Island in Thailand.
Abstract:
This study was done to test the effectiveness of the Precision Fluency Shaping Program in controlling stuttering behaviour in adults. Two sites were chosen, each using the Precision Fluency Shaping Program to treat stuttering. At each clinic, a Speech Pathologist made a random selection of the subjects' pre- and post-therapy video-taped interviews, totalling 20 in all. During the interviews, the clients were asked questions and read a short passage to determine the frequency of stuttering in natural conversation and in reading. Perceptions of Stuttering Inventory questionnaires were also filled in before and after therapy. Two judges were trained to identify stuttering behaviour, and were given an inter-rater reliability test at selected intervals throughout the study. Protocols, made of each interview tape, were scored for (a) stuttering behaviour and (b) words spoken or read. An Analysis of Variance Repeated Measures Test was used to compare before and after scores of conversations, readings, and the Perceptions of Stuttering Inventory to determine whether the Precision Fluency Shaping Program controlled stuttering behaviour significantly. A Pearson R Correlation Test was also administered to determine if a relationship existed between Perceptions of Stuttering Inventory scores and (i) conversation and (ii) reading scores.
Abstract:
Affiliation: Margaret Cargo: Département de médecine sociale et préventive, Faculté de médecine, Université de Montréal
Abstract:
Frameworks and libraries are indispensable to today's software systems. When they evolve, it is often tedious and costly for developers to update their code. Consequently, approaches have been proposed to help developers migrate their code. In general, these approaches cannot automatically identify one-replaced-by-many and many-replaced-by-one method change rules. Moreover, they often trade recall against precision in their results by relying on one or more experimentally chosen thresholds. We present AURA (AUtomatic change Rule Assistant), a novel hybrid approach that combines call dependency analysis and text similarity analysis to overcome these limitations. We implemented AURA in Java and compared its results on five frameworks with three previous approaches, by Dagenais and Robillard, M. Kim et al., and Schäfer et al. The comparison shows that, on average, AURA's recall is 53.07% higher than that of the other approaches, with similar precision (0.10% lower).
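For reference, precision and recall in this kind of rule-mining evaluation are usually computed from the sets of reported and ground-truth rules; a minimal sketch (names and numbers are illustrative, not from the paper):

```python
def precision_recall(reported: set, correct: set) -> tuple[float, float]:
    """Precision and recall of a set of reported change rules
    against a ground-truth set of correct rules."""
    true_positives = len(reported & correct)
    precision = true_positives / len(reported) if reported else 0.0
    recall = true_positives / len(correct) if correct else 0.0
    return precision, recall

# Illustrative example: 8 of 10 reported rules are correct,
# out of 12 rules in the ground truth.
reported = {f"rule{i}" for i in range(10)}
correct = {f"rule{i}" for i in range(2, 14)}
p, r = precision_recall(reported, correct)
print(f"precision = {p:.2f}, recall = {r:.2f}")
```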
Abstract:
Software systems are in constant evolution, requiring continuous maintenance and development. They undergo changes throughout their lives, whether through the addition of new features or the correction of bugs in the code. As these systems evolve, their architectures tend to degrade over time and become less adaptable to new user requirements; they become more complex and harder to maintain. In some cases, developers prefer to redesign these architectures from scratch rather than prolong their lives, which leads to a substantial increase in development and maintenance costs. Developers must therefore understand the factors that lead to architectural degradation, so that they can take proactive measures that ease future changes and slow the degradation. Architectural degradation occurs when developers who do not understand the original design of the software make changes to it. On the one hand, making changes without understanding their impact can introduce bugs and lead to the premature retirement of the software. On the other hand, developers who lack knowledge and/or experience in solving a design problem can introduce design defects, which make software harder to maintain and evolve. Developers therefore need mechanisms to understand the impact of a change on the rest of the software, and tools to detect design defects so that they can be corrected. This thesis makes three main contributions. The first contribution concerns the assessment of the degradation of software architectures. The assessment uses a diagram-matching technique, applied to diagrams such as class diagrams, to identify structural changes between several versions of a software architecture. Because this step requires identifying class renamings, the first step of our approach identifies class renamings during the evolution of the software architecture. The second step matches several versions of an architecture to identify its stable parts and those that are degrading; we propose bit-vector and clustering algorithms to analyze the correspondence between versions. The third step measures the degradation of the architecture during the evolution of the software; we propose a set of metrics over the stable parts of the software to assess this degradation. The second contribution relates to change impact analysis. In this context, we present a new metaphor, inspired by seismology, to identify the impact of changes: our approach treats a change to a class as an earthquake that propagates through the software along a long chain of intermediary classes. Our approach combines structural dependency analysis of classes with an analysis of their history (co-change relations) to measure the extent of change propagation in the software, i.e., how a change propagates from the modified class to other classes. The third contribution concerns the detection of design defects. We propose a metaphor inspired by the natural immune system: like any living creature, a system's design is exposed to diseases, namely design defects, and detection approaches act as defense mechanisms for system designs. A natural immune system can detect similar pathogens with good precision; this has inspired a family of classification algorithms, called artificial immune systems (AIS), which we use to detect design defects. The contributions were evaluated on open-source object-oriented systems, and the results allow us to draw the following conclusions:
• The Tunnel Triplets Metric (TTM) and Common Triplets Metric (CTM) give developers good indications of architectural degradation. A decrease in TTM indicates that the original design of the architecture has degraded; a stable TTM indicates that the original design is stable, meaning the system accommodates new user requirements.
• Seismology is a useful metaphor for change impact analysis. Changes propagate through systems like earthquakes: the impact of a change is greatest around the changed class and decreases progressively with distance from that class. Our approach helps developers identify the impact of a change.
• The immune system is a useful metaphor for the detection of design defects. Our experiments showed that the precision and recall of our approach are comparable to or better than those of existing approaches.
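As a loose illustration of the seismology metaphor described above (a minimal sketch under our own assumptions, not the thesis's actual algorithm), change impact can be modeled as a score that decays with dependency distance from the modified class:

```python
from collections import deque

def impact_scores(dependencies: dict[str, list[str]], changed: str,
                  decay: float = 0.5) -> dict[str, float]:
    """Breadth-first propagation of a change: each class receives a score
    that shrinks geometrically with its distance from the changed class."""
    scores = {changed: 1.0}
    queue = deque([changed])
    while queue:
        cls = queue.popleft()
        for neighbor in dependencies.get(cls, []):
            if neighbor not in scores:
                scores[neighbor] = scores[cls] * decay
                queue.append(neighbor)
    return scores

# Hypothetical class-dependency graph.
deps = {"Order": ["Invoice", "Customer"], "Invoice": ["Printer"], "Customer": []}
print(impact_scores(deps, "Order"))
# {'Order': 1.0, 'Invoice': 0.5, 'Customer': 0.5, 'Printer': 0.25}
```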
Abstract:
Triple quadrupole mass spectrometers coupled with high-performance liquid chromatography are workhorses of quantitative bioanalysis. They provide substantial benefits, including reproducibility, sensitivity and selectivity, for trace analysis. Selected Reaction Monitoring allows targeted assay development, but the data sets generated contain very limited information. Data mining and analysis of non-targeted high-resolution mass spectrometry profiles of biological samples offer the opportunity to perform more exhaustive assessments, including quantitative and qualitative analysis. The objectives of this study were to test method precision and accuracy, statistically compare bupivacaine drug concentrations in real study samples, and verify whether high-resolution, accurate-mass data collected in scan mode permit retrospective data analysis, more specifically the extraction of metabolite-related information. The precision and accuracy data obtained with the two instruments were equivalent. Overall, accuracy ranged from 106.2 to 113.2% and precision from 1.0 to 3.7%. A statistical comparison using linear regression between the two methods revealed a coefficient of determination (R2) of 0.9996 and a slope of 1.02, demonstrating a very strong correlation. Individual sample comparisons showed differences from -4.5% to 1.6%, well within the accepted analytical error. Moreover, post-acquisition extracted ion chromatograms at m/z 233.1648 ± 5 ppm (M-56) and m/z 305.2224 ± 5 ppm (M+16) revealed the presence of desbutyl-bupivacaine and three distinct hydroxylated bupivacaine metabolites. Post-acquisition analysis allowed us to produce semiquantitative evaluations of the concentration-time profiles of bupivacaine metabolites.
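A cross-method comparison of this kind is typically done by regressing the concentrations from one method against the other; a minimal sketch with hypothetical values (not data from the study):

```python
import numpy as np

# Hypothetical paired bupivacaine concentrations (ng/ml) from two methods.
triple_quad = np.array([12.1, 25.4, 49.8, 101.3, 202.7])
high_res = np.array([12.3, 25.0, 50.6, 102.9, 205.1])

# Least-squares fit: high_res = slope * triple_quad + intercept.
slope, intercept = np.polyfit(triple_quad, high_res, 1)
r = np.corrcoef(triple_quad, high_res)[0, 1]

# Per-sample percent difference between the two methods.
pct_diff = 100 * (high_res - triple_quad) / triple_quad

print(f"slope = {slope:.3f}, intercept = {intercept:.2f}, R^2 = {r**2:.4f}")
print("percent differences:", np.round(pct_diff, 1))
```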
Abstract:
Targeted peptide methods generally use HPLC-MS/MRM approaches. Although dependent on the instrumental resolution, interferences may occur when analyzing complex biological matrices. HPLC-MS/MRM3 is a technique that provides significantly better selectivity than an HPLC-MS/MRM assay. HPLC-MS/MRM3 enables detection and quantitation by enriching standard MRM with secondary product ions generated within the linear ion trap. Substance P (SP) and neurokinin A (NKA) are tachykinin peptides that play a central role in pain transmission. The objective of this study was to verify whether HPLC-MS/MRM3 could provide significant advantages over a more traditional HPLC-MS/MRM assay for the quantification of SP and NKA in rat spinal cord. The results suggest that reconstructed MRM3 chromatograms show significant improvements, with nearly complete elimination of interfering peaks, but the sensitivity (i.e., signal-to-noise ratio) was severely reduced. The precision (%CV) observed was between 3.5% and 24.1% using HPLC-MS/MRM and in the range of 4.3% to 13.1% with HPLC-MS/MRM3, for SP and NKA. The observed accuracy was within 10% of the theoretical concentrations tested. HPLC-MS/MRM3 may improve assay sensitivity to detect differences between samples by significantly reducing potential interferences and therefore instrumental errors.
Abstract:
It is widely known that a significant fraction of bits are useless or even unused during program execution. Bit-width analysis aims to find the minimum number of bits needed for each variable in a program while ensuring execution correctness, thereby saving resources. In this paper, we propose a static analysis method for bit-widths in general applications, which approximates conservatively at compile time and is independent of runtime conditions. While most related work focuses on integer applications, our method is also tailored and applicable to floating-point variables, and, combined with precision analysis, could be extended to transform floating-point numbers into fixed-point numbers. We use more precise representations for the value ranges of both scalar and array variables, carrying out element-level analysis for arrays. We also suggest an alternative to the standard fixed-point iterations in bi-directional range analysis. These techniques are implemented on the Trimaran compiler infrastructure and evaluated on a set of benchmarks.
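The core idea can be illustrated as follows (a minimal sketch under our own assumptions, not the paper's algorithm): once a conservative value range is known for a variable, its minimum width follows directly from the range bounds:

```python
def bits_needed(lo: int, hi: int) -> int:
    """Minimum number of bits for an integer variable whose value is
    statically known to lie in [lo, hi] (two's complement if lo < 0)."""
    if lo >= 0:
        return max(hi.bit_length(), 1)  # unsigned representation
    # Signed: enough magnitude bits for both bounds, plus a sign bit.
    magnitude = max((-lo - 1).bit_length(), hi.bit_length())
    return magnitude + 1

print(bits_needed(0, 255))     # 8 (e.g. a loop counter bounded by 255)
print(bits_needed(-128, 127))  # 8 (fits a signed byte)
```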
Abstract:
Precision of released figures is not only an important quality feature of official statistics; it is also essential for a good understanding of the data. In this paper we present a case study of how precision can be conveyed when the multivariate nature of the data has to be taken into account. In the official release of the Swiss earnings structure survey, the total salary is broken down into several wage components. We follow Aitchison's approach to the analysis of compositional data, which is based on log-ratios of components. We first present different multivariate analyses of the compositional data, whereby the wage components are broken down by economic activity class. We then propose a number of ways to assess precision.
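Aitchison's log-ratio approach maps compositions (positive parts summing to a total) into unconstrained real space before applying standard multivariate methods; a minimal sketch of the centred log-ratio (clr) transform (illustrative shares, not survey data):

```python
import numpy as np

def clr(composition: np.ndarray) -> np.ndarray:
    """Centred log-ratio transform: log of each part relative to the
    geometric mean, mapping the simplex into real coordinates."""
    logs = np.log(composition)
    return logs - logs.mean()

# Hypothetical wage decomposition: base salary, bonus, allowances (shares).
wage_shares = np.array([0.80, 0.15, 0.05])
print(clr(wage_shares))  # coordinates sum to zero by construction
```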
Abstract:
The variogram is essential for local estimation and mapping of any variable by kriging, and it must usually be estimated from sample data. The sampling density is a compromise between precision and cost, but it must be dense enough to encompass the principal spatial sources of variance. A nested, multi-stage sampling scheme, with separating distances increasing in geometric progression from stage to stage, will do that. The data may then be analyzed by a hierarchical analysis of variance to estimate the components of variance for every stage, and hence every lag. By accumulating the components, starting from the shortest lag, one obtains a rough variogram for modest effort. For balanced designs the analysis of variance is optimal; for unbalanced ones, however, these estimators are not necessarily the best, and analysis by residual maximum likelihood (REML) will usually be preferable. The paper summarizes the underlying theory and illustrates its application with data from three surveys: one in which the design had four stages and was balanced, and two implemented with unbalanced designs to economize when there were more stages. A Fortran program is available for the analysis of variance, and code for the REML analysis is listed in the paper.
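The accumulation step described above amounts to a running sum of the estimated variance components from the shortest lag upward, giving the rough variogram value at each lag; a minimal sketch (hypothetical components, not survey results):

```python
from itertools import accumulate

# Hypothetical variance components from a hierarchical ANOVA, ordered
# from the shortest sampling stage (smallest lag) to the largest.
lags_m = [1, 10, 100, 1000]        # separating distances (m)
components = [0.8, 0.5, 0.3, 0.1]  # estimated variance components

# Rough variogram: cumulative sum of components up to each lag.
semivariances = list(accumulate(components))
for lag, gamma in zip(lags_m, semivariances):
    print(f"gamma({lag} m) ≈ {gamma:.1f}")
```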
Abstract:
The technology for site-specific application of nitrogen (N) fertilizer has exposed a gap in our knowledge about the spatial variation of soil mineral N, and of the N that will become available during the growing season, within arable fields. Spring mineral N and potentially available N were measured in an arable field together with gravimetric water content, loss on ignition, crop yield, percentages of sand, silt and clay, and elevation, to describe their spatial variation geostatistically. The areas with a larger clay content had larger values of mineral N, potentially available N, loss on ignition and gravimetric water content; the converse was true for the areas with more sandy soil. The results suggest that the spatial relations between mineral N and loss on ignition, gravimetric water content, soil texture, elevation and crop yield, and between potentially available N and loss on ignition and silt content, could be used to indicate their spatial patterns. Variable-rate nitrogen fertilizer application would be feasible in this field because of the spatial structure and the magnitude of variation of mineral N and potentially available N.