867 results for measurement and metrology
Abstract:
We report a measurement of the ratio of the tt̅ to Z/γ* production cross sections in √s = 1.96 TeV pp̅ collisions using data corresponding to an integrated luminosity of up to 4.6 fb⁻¹, collected by the CDF II detector. The tt̅ cross section ratio is measured using two complementary methods, a b-jet tagging measurement and a topological approach. By multiplying the ratios by the well-known theoretical Z/γ*→ll cross section predicted by the standard model, the extracted tt̅ cross sections are effectively insensitive to the uncertainty on luminosity. A best linear unbiased estimate is used to combine both measurements with the result σ_tt̅ = 7.70 ± 0.52 pb, for a top-quark mass of 172.5 GeV/c².
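The two ratio measurements are combined with a best linear unbiased estimate (BLUE). The sketch below shows a minimal BLUE combination of two correlated measurements of the same quantity; the input values, uncertainties, and correlation are illustrative placeholders, not the values used in the CDF analysis.

```python
import numpy as np

# Minimal BLUE combination of two measurements of the same quantity.
# The numbers below are illustrative placeholders, not the CDF inputs.
x = np.array([7.8, 7.6])       # two sigma_ttbar measurements (pb)
sigma = np.array([0.6, 0.7])   # their total uncertainties (pb)
rho = 0.3                      # assumed correlation between the two methods

# Covariance matrix of the two measurements
C = np.array([[sigma[0]**2, rho * sigma[0] * sigma[1]],
              [rho * sigma[0] * sigma[1], sigma[1]**2]])

# BLUE weights: w = C^-1 1 / (1^T C^-1 1)
Cinv = np.linalg.inv(C)
ones = np.ones(2)
w = Cinv @ ones / (ones @ Cinv @ ones)

combined = w @ x                                     # combined cross section
combined_err = np.sqrt(1.0 / (ones @ Cinv @ ones))   # its uncertainty

print(f"sigma_ttbar = {combined:.2f} +/- {combined_err:.2f} pb, weights = {w}")
```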
Abstract:
Accurate mass flow measurement is very important in various monitoring and control applications. This paper proposes a novel method of fluid flow measurement that compensates the pressure drop across the ends of the measuring unit using a compensating pump. The pressure drop due to the flow is balanced by a feedback control loop, making this a null-deflection type of measurement. Because the insertion of such a measuring unit does not affect the functioning of the system, the method is also non-disruptive. The implementation and design of such a unit are discussed. The system is modeled and simulated using the bond graph technique and is experimentally validated. (C) 2009 Elsevier Ltd. All rights reserved.
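As a rough illustration of the null-deflection idea, the sketch below simulates a simple integral feedback loop that drives the residual pressure drop across the measuring unit to zero by adjusting a compensating pump, after which the flow is inferred from the pump command. All plant parameters and the controller gain are invented for illustration and are not taken from the paper's bond graph model.

```python
# Illustrative null-deflection flow measurement: an integral controller
# adjusts a compensating pump until the pressure drop across the measuring
# unit is balanced. All parameters are invented for illustration only.
R_unit = 2.0e5      # assumed hydraulic resistance of the unit [Pa/(m^3/s)]
Q_true = 1.0e-4     # unknown flow to be measured [m^3/s]
K_i = 0.5           # integral gain of the feedback loop [1/s]
dt = 0.01           # time step [s]

dp_pump = 0.0       # pressure supplied by the compensating pump [Pa]
for _ in range(10_000):
    dp_net = R_unit * Q_true - dp_pump   # residual pressure drop (deflection)
    dp_pump += K_i * dp_net * dt         # integral action drives dp_net -> 0

# At null deflection the pump pressure balances R_unit * Q, so the flow
# estimate follows from the pump command and the known resistance.
Q_est = dp_pump / R_unit
print(f"estimated flow = {Q_est:.3e} m^3/s (true {Q_true:.3e})")
```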
Abstract:
The Digital Speckle Correlation Method (DSCM) is a useful tool for whole-field deformation measurement and has been applied in recent years to analyse the deformation field of rock materials. In this paper, a Geo-DSCM system is designed and used to analyse more complicated problems in rock mechanics, such as damage evolution and failure processes. A weighted correlation equation is proposed to improve the accuracy of displacement measurement on a heterogeneous deformation field. In addition, a data acquisition system is described that can synchronize with the test machine and capture speckle images at various speeds during the experiment. To verify the Geo-DSCM system, the failure process of a borehole rock structure is inspected and the evolution of the deformation localization is analysed. It is shown that the deformation localization generally initiates at the vulnerable area of the rock structure but may develop in a very complicated way.
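The paper's weighted correlation equation is not reproduced here; as an indication of the general form such a criterion can take, a weighted zero-normalised cross-correlation between a reference subset f and a deformed subset g may be written as below, where the weights w_i would emphasise the more reliable speckle points. This is a generic formulation given for orientation, not necessarily the one proposed in the paper.

```latex
C_w \;=\; \frac{\sum_{i} w_i \,\bigl(f_i - \bar{f}_w\bigr)\bigl(g_i - \bar{g}_w\bigr)}
               {\sqrt{\sum_{i} w_i \,\bigl(f_i - \bar{f}_w\bigr)^{2}}\;
                \sqrt{\sum_{i} w_i \,\bigl(g_i - \bar{g}_w\bigr)^{2}}},
\qquad
\bar{f}_w = \frac{\sum_i w_i f_i}{\sum_i w_i},\quad
\bar{g}_w = \frac{\sum_i w_i g_i}{\sum_i w_i}.
```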
Abstract:
Introduction: In this study, colloidal gold nanoparticles and the precipitation of an insoluble product formed by HRP-biocatalyzed oxidation of 3,3'-diaminobenzidine (DAB) in the presence of H2O2 were used to enhance the signal obtained from a surface plasmon resonance (SPR) biosensor.
Methods: The colloidal gold nanoparticles were synthesized as described by Turkevitch et al., and their surfaces were first functionalized with HS(CH2)11(OCH2CH2)3COOH (OEG3-COOH) by a self-assembly technique. Those OEG3-COOH-functionalized nanoparticles were then covalently conjugated with horseradish peroxidase (HRP) and an anti-IgG antibody (specific to the Fc portion of all human IgG subclasses) to form an enzyme-immunogold complex. Characterization was performed by several methods: UV-Vis absorption, dynamic light scattering (DLS), transmission electron microscopy (TEM) and FTIR. The as-prepared enzyme-immunogold complex was applied to enhance an SPR immunoassay. The sensor chip used in the experiment was constructed using a 1:10 molar ratio of HS(CH2)11(OCH2CH2)6COOH and HS(CH2)11(OCH2CH2)3OH. The capture protein, GAD65 (an autoantigen recognized by anti-GAD antibodies, i.e. autoantibodies, in the sera of insulin-dependent diabetes mellitus patients), was immobilized onto the 1:10 surface via biotin-streptavidin interaction.
Results and conclusions: We report the influence of the gold nanoparticles and of enzyme precipitation on the enhancement of the SPR signal. The gold nanoparticles gave an enhancement consistent with previous studies, while enzyme precipitation using the DAB substrate was applied for the first time and greatly amplified the SPR detection. As a result, anti-GAD antibodies could be detected at the pg/ml level, a sensitivity far higher than that of commercial ELISA detection kits. This study indicates another way to enhance SPR measurement, and it is generally applicable to other SPR-based immunoassays.
Abstract:
Doctoral thesis, Philosophy, Department of Management Science, University of Strathclyde, 2004
Abstract:
This paper analyzes the measurement of the diversity of sets based on the dissimilarity of the objects contained in the set. We discuss axiomatic approaches to diversity measurement and examine the considerations underlying the application of specific measures. Our focus is on descriptive issues: rather than assuming a specific ethical position or restricting attention to properties that are appealing in specific applications, we address the foundations of the measurement issue as such in the context of diversity.
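As one simple illustration of a dissimilarity-based diversity measure (given here for orientation only, not necessarily one endorsed or axiomatised in the paper), the diversity of a finite set S can be taken as the aggregate pairwise dissimilarity of its elements,

```latex
D(S) \;=\; \sum_{\substack{\{x,y\} \subseteq S \\ x \neq y}} d(x, y),
```

where d is a dissimilarity function on pairs of objects. The axiomatic approaches discussed in the paper examine which properties (for example, how a measure should respond when an object is added to S) candidate measures of this kind satisfy.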
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Evaluations of measurement invariance provide essential construct validity evidence. However, the quality of such evidence is partly dependent upon the validity of the resulting statistical conclusions. The presence of Type I or Type II errors can render measurement invariance conclusions meaningless. The purpose of this study was to determine the effects of categorization and censoring on the behavior of the chi-square/likelihood ratio test statistic and two alternative fit indices (CFI and RMSEA) under the context of evaluating measurement invariance. Monte Carlo simulation was used to examine Type I error and power rates for the (a) overall test statistic/fit indices, and (b) change in test statistic/fit indices. Data were generated according to a multiple-group single-factor CFA model across 40 conditions that varied by sample size, strength of item factor loadings, and categorization thresholds. Seven different combinations of model estimators (ML, Yuan-Bentler scaled ML, and WLSMV) and specified measurement scales (continuous, censored, and categorical) were used to analyze each of the simulation conditions. As hypothesized, non-normality increased Type I error rates for the continuous scale of measurement and did not affect error rates for the categorical scale of measurement. Maximum likelihood estimation combined with a categorical scale of measurement resulted in more correct statistical conclusions than the other analysis combinations. For the continuous and censored scales of measurement, the Yuan-Bentler scaled ML resulted in more correct conclusions than normal-theory ML. The censored measurement scale did not offer any advantages over the continuous measurement scale. Comparing across fit statistics and indices, the chi-square-based test statistics were preferred over the alternative fit indices, and ΔRMSEA was preferred over ΔCFI. Results from this study should be used to inform the modeling decisions of applied researchers. However, no single analysis combination can be recommended for all situations. Therefore, it is essential that researchers consider the context and purpose of their analyses.
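For readers unfamiliar with the change-in-fit comparisons referred to above, the sketch below shows a generic likelihood ratio (chi-square difference) test between two nested multiple-group CFA models, for example a configural versus a metric invariance model. The fit values are hypothetical, and the scaled-difference corrections required for Yuan-Bentler ML or WLSMV estimation are omitted.

```python
from scipy.stats import chi2

# Hypothetical fit results for two nested multiple-group CFA models.
chisq_configural, df_configural = 112.4, 48   # less restrictive model
chisq_metric, df_metric = 131.9, 54           # loadings constrained equal

# Naive chi-square difference test (valid for normal-theory ML; scaled
# estimators such as Yuan-Bentler ML or WLSMV require corrected differences).
delta_chisq = chisq_metric - chisq_configural
delta_df = df_metric - df_configural
p_value = chi2.sf(delta_chisq, delta_df)

print(f"delta chi-square = {delta_chisq:.1f} on {delta_df} df, p = {p_value:.3f}")
# A small p-value suggests the equality constraints worsen fit,
# i.e. evidence against measurement invariance at this level.
```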
Abstract:
Although brand authenticity is gaining increasing interest in consumer behavior research and managerial practice, literature on its measurement and contribution to branding theory is still limited. This article develops an integrative framework of the concept of brand authenticity and reports the development and validation of a scale measuring consumers' perceived brand authenticity (PBA). A multi-phase scale development process resulted in a 15-item PBA scale measuring four dimensions: credibility, integrity, symbolism, and continuity. This scale is reliable across different brands and cultural contexts. We find that brand authenticity perceptions are influenced by indexical, existential, and iconic cues, with the influence of some of the latter moderated by consumers' level of marketing skepticism. Results also suggest that PBA increases emotional brand attachment and word-of-mouth, and that it drives brand choice likelihood through self-congruence for consumers high in self-authenticity.
Abstract:
Free drug measurement and pharmacodynamic markers provide the opportunity for a better understanding of drug efficacy and toxicity. High-performance liquid chromatography (HPLC)-mass spectrometry (MS) is a powerful analytical technique that could facilitate the measurement of free drug and these markers. Currently, there are very few published methods for the determination of free drug concentrations by HPLC-MS. The development of atmospheric pressure ionisation sources, together with on-line microdialysis or on-line equilibrium dialysis and column switching techniques, has reduced sample run times and increased assay efficiency. The availability of such methods will aid in drug development and the clinical use of certain drugs, including anti-convulsants, anti-arrhythmics, immunosuppressants, local anaesthetics, anti-fungals and protease inhibitors. The history of free drug measurement and an overview of the current HPLC-MS applications for these drugs are discussed. Immunosuppressant drugs are used as an example for the application of HPLC-MS in the measurement of drug pharmacodynamics. Potential biomarkers of immunosuppression that could be measured by HPLC-MS include purine nucleoside/nucleotides, drug-protein complexes and phosphorylated peptides. At the proteomic level, two-dimensional gel electrophoresis combined with matrix-assisted laser desorption/ionisation time-of-flight (TOF) MS is a powerful tool for identifying proteins involved in the response to inflammatory mediators. (C) 2003 Elsevier Science B.V. All rights reserved.
Abstract:
The state of the art in productivity measurement and analysis shows a gap between simple methods having little relevance in practice and sophisticated mathematical theory which is unwieldy for strategic and tactical planning purposes, particularly at company level. An extension is made in this thesis to the method of productivity measurement and analysis based on the concept of added value, appropriate to those companies in which the materials, bought-in parts and services change substantially and a number of plants and inter-related units are involved in providing components for final assembly. Reviews and comparisons of productivity measurement dealing with alternative indices and their problems have been made, and appropriate solutions put forward for productivity analysis in general and the added value method in particular. Based on this concept and method, three kinds of computerised models, two of them deterministic (sensitivity analysis and deterministic appraisal) and the third stochastic (risk simulation), have been developed to cope with the planning of productivity and productivity growth with reference to the changes in their component variables, ranging from a single value to a class interval of values of a productivity distribution. The models are designed to be flexible and can be adjusted according to the available computer capacity, expected accuracy and presentation of the output. The stochastic model is based on the assumption of statistical independence between individual variables and the existence of normality in their probability distributions. The component variables have been forecast using polynomials of degree four. This model is tested by comparing its behaviour with that of a mathematical model using real historical data from British Leyland, and the results were satisfactory within acceptable levels of accuracy. Modifications to the model and its statistical treatment have been made as required. The results of applying these measurement and planning models to British motor vehicle manufacturing companies are presented and discussed.
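The stochastic risk-simulation model described above assumes statistically independent, normally distributed component variables; a minimal sketch of that idea applied to an added-value productivity ratio is given below. The variable names, means, and standard deviations are invented for illustration and do not reproduce the thesis's British Leyland data.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # number of Monte Carlo trials

# Illustrative, invented distributions for the component variables (in £m),
# assumed independent and normally distributed as in the stochastic model.
sales        = rng.normal(950.0, 60.0, n)   # sales revenue
bought_in    = rng.normal(610.0, 45.0, n)   # materials, bought-in parts and services
labour_costs = rng.normal(210.0, 15.0, n)   # employment costs

# Added value and an added-value productivity index for each trial
added_value = sales - bought_in
productivity = added_value / labour_costs

print(f"mean productivity index: {productivity.mean():.2f}")
print(f"5th-95th percentile range: "
      f"{np.percentile(productivity, 5):.2f} - {np.percentile(productivity, 95):.2f}")
```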