983 results for current measurement
Abstract:
National Highway Traffic Safety Administration, Washington, D.C.
Abstract:
"June 1997."
Abstract:
"June 1978."
Abstract:
Free drug measurement and pharmacodynamic markers provide the opportunity for a better understanding of drug efficacy and toxicity. High-performance liquid chromatography (HPLC)-mass spectrometry (MS) is a powerful analytical technique that could facilitate the measurement of free drug and these markers. Currently, there are very few published methods for the determination of free drug concentrations by HPLC-MS. The development of atmospheric pressure ionisation sources, together with on-line microdialysis or on-line equilibrium dialysis and column-switching techniques, has reduced sample run times and increased assay efficiency. The availability of such methods will aid in drug development and the clinical use of certain drugs, including anti-convulsants, anti-arrhythmics, immunosuppressants, local anaesthetics, anti-fungals and protease inhibitors. The history of free drug measurement and an overview of the current HPLC-MS applications for these drugs are discussed. Immunosuppressant drugs are used as an example of the application of HPLC-MS in the measurement of drug pharmacodynamics. Potential biomarkers of immunosuppression that could be measured by HPLC-MS include purine nucleosides/nucleotides, drug-protein complexes and phosphorylated peptides. At the proteomic level, two-dimensional gel electrophoresis combined with matrix-assisted laser desorption/ionisation time-of-flight (TOF) MS is a powerful tool for identifying proteins involved in the response to inflammatory mediators. (C) 2003 Elsevier Science B.V. All rights reserved.
Abstract:
We present a new model for the continuous measurement of a coupled quantum dot charge qubit. We model the effects of a realistic measurement, namely adding noise to, and filtering, the current through the detector. This is achieved by embedding the detector in an equivalent circuit for measurement. Our aim is to describe the evolution of the qubit state conditioned on the macroscopic output of the external circuit. We achieve this by generalizing a recently developed quantum trajectory theory for realistic photodetectors [P. Warszawski, H. M. Wiseman, and H. Mabuchi, Phys. Rev. A 65, 023802 (2002)] to treat solid-state detectors. This yields stochastic equations whose (numerical) solutions are the realistic quantum trajectories of the conditioned qubit state. We derive our general theory in the context of a low transparency quantum point contact. Areas of application for our theory and its relation to previous work are discussed.
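As a generic illustration of the quantum-trajectory formalism the abstract builds on, a standard conditioned-state equation for continuous measurement of sigma_z on a single qubit can be integrated by Euler-Maruyama. This sketch is not the paper's realistic-detector model with its external equivalent circuit; the rate, step size and initial state are invented for illustration.

```python
import numpy as np

# Euler-Maruyama integration of a textbook conditioned-state (quantum
# trajectory) equation for continuous sigma_z measurement at rate k.
# Generic illustration only; NOT the paper's realistic-detector model,
# and all parameters below are invented.
rng = np.random.default_rng(1)
k, dt, steps = 1.0, 1e-3, 20_000
x, y, z = 1.0, 0.0, 0.0              # start in the +sigma_x eigenstate

for _ in range(steps):
    dW = np.sqrt(dt) * rng.standard_normal()
    # Detector record (homodyne-style): dQ = 2*sqrt(k)*z*dt + dW.
    x, y, z = (x - 2*k*x*dt - 2*np.sqrt(k)*z*x*dW,
               y - 2*k*y*dt - 2*np.sqrt(k)*z*y*dW,
               z + 2*np.sqrt(k)*(1 - z**2)*dW)
    z = max(-1.0, min(1.0, z))       # clip tiny Euler overshoot

print(f"conditioned Bloch vector: ({x:.3f}, {y:.3f}, {z:.3f})")
```

Conditioning on the noisy record gradually collapses the qubit onto a sigma_z eigenstate (z drifts to +/-1) while the coherences x and y decay, which is the behaviour a realistic, filtered detector model must reproduce on average.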
Abstract:
Objectives: Cyclosporin is an immunosuppressant drug with a narrow therapeutic window. Trough and 2-h post-dose blood samples are currently used for therapeutic drug monitoring in solid organ transplant recipients. The aim of the current study was to develop a rapid HPLC-tandem mass spectrometry (HPLC-MS) method for the measurement of cyclosporin in whole blood that was not only suitable for the clinical setting but could also be considered a reference method. Methods: Blood samples (50 µL) were prepared by protein precipitation followed by C-18 solid-phase extraction, using d12-cyclosporin as the internal standard. Mass spectrometric detection was by selected reaction monitoring with an electrospray interface in positive ionization mode. Results: The assay was linear from 10 to 2000 µg/L (r² > 0.996, n = 9). Inter-day analytical recovery and imprecision using whole-blood quality control samples at 10, 30, 400, 1500, and 2000 µg/L were 94.9-103.5% and
Abstract:
Despite current imperatives to measure client outcomes, social workers have expressed frustration with the ability of traditional forms of quantitative methods to engage with complexity, individuality and meaning. This paper argues that the inclusion of a meaning-based as opposed to a function-based approach to quality of life (QOL) may offer a quantitative means of measurement that is congruent with social-work values and practice.
Abstract:
Circuit QED is a promising solid-state quantum computing architecture. It also has excellent potential as a platform for quantum control experiments, especially quantum feedback control. However, the current scheme for measurement in circuit QED has low efficiency and a low signal-to-noise ratio for single-shot measurements. The low quality of this measurement makes the implementation of feedback difficult. Here we propose two schemes for measurement in circuit QED architectures that can significantly improve the signal-to-noise ratio and potentially achieve quantum-limited measurement. Such measurements would enable the implementation of quantum feedback protocols, and we illustrate this with a simple entanglement-stabilization scheme.
Abstract:
This research extends the consumer-based brand equity measurement approach to the measurement of the equity associated with retailers. This paper also addresses some of the limitations associated with current retailer equity measurement such as a lack of clarity regarding its nature and dimensionality. We conceptualise retailer equity as a four-dimensional construct comprising retailer awareness, retailer associations, perceived retailer quality, and retailer loyalty. The paper reports the result of an empirical study of a convenience sample of 601 shopping mall consumers at an Australian state capital city. Following a confirmatory factor analysis using structural equation modelling to examine the dimensionality of the retailer equity construct, the proposed model is tested for two retailer categories: department stores and speciality stores. Results confirm the hypothesised four-dimensional structure.
Abstract:
Few educational campaigns have focused on bowel cancer, though studies have indicated that members of the community need and want current information about relevant issues. In order to facilitate research in this area, reliable and valid measures of community attitudes are needed. Content validity of a survey instrument was obtained through use of a Delphi process with Directors of Education from the Australia Cancer Council and focus group discussions with informed members of the public. The subsequent survey of community perceptions about colorectal cancer included a broad range of content areas related to the risk of bowel cancer, preventing and coping with bowel cancer and beliefs about susceptibility and severity. The construct validity of these content areas was investigated by use of a factor analysis and confirmation of an association with related predictor variables. Two measures related to personal influence and anticipated coping responses showed favourable psychometric properties, including moderate to high levels of internal consistency and test-retest reliability. A test of the concurrent validity of these measures requires further development of instruments related to colorectal cancer or adaptation of measures from other areas of health research. (C) 2000 Elsevier Science Ltd. All rights reserved.
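Internal consistency of the kind reported for these survey measures is commonly quantified with Cronbach's alpha. A minimal NumPy sketch follows, using simulated 5-point Likert responses driven by a shared latent attitude; the data and scale size are invented, not taken from the bowel-cancer survey.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) response matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Simulated 5-point Likert responses: a shared latent attitude plus
# item-level noise, rounded and clipped to the 1-5 scale. Invented data.
rng = np.random.default_rng(0)
trait = rng.normal(size=(200, 1))                       # shared attitude
noise = rng.normal(scale=0.8, size=(200, 4))            # item noise
responses = np.clip(np.round(3 + trait + noise), 1, 5)

alpha = cronbach_alpha(responses)
print(f"Cronbach's alpha = {alpha:.2f}")
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency; test-retest reliability would additionally require correlating repeated administrations of the same instrument.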
Abstract:
The conventional detection scheme for self-mixing sensors uses a photodiode integrated within the laser package to monitor the self-mixing signal. This arrangement can be simplified by obtaining the self-mixing signal directly across the laser diode itself, omitting the photodiode. This work reports on a Vertical-Cavity Surface-Emitting Laser (VCSEL) based self-mixing sensor that uses the laser junction voltage to obtain the self-mixing signal. We show that the same information can be obtained with only minor changes to the extraction circuitry, leading to potential cost savings through reduced component count and complexity, and a significant increase in bandwidth favouring high-speed modulation. Experiments using both photocurrent and voltage detection were carried out, and the results obtained show good agreement with theory.
Abstract:
Purpose – To investigate the impact of performance measurement in the strategic planning process. Design/methodology/approach – A large-scale survey was conducted online with Warwick Business School alumni. The questionnaire was based on the Strategic Development Process model by Dyson. The questionnaire was designed to map the current practice of strategic planning and to determine its most influential factors on the effectiveness of the process. All questions were closed-ended and a seven-point Likert scale was used. The independent variables were grouped into four meaningful factors by factor analysis (Varimax, coefficient of rotation 0.4). The factors produced were used to build stepwise regression models for the five assessments of the strategic planning process. Regression models were developed for the totality of the responses, comparing SMEs and large organizations and comparing organizations operating in slowly and rapidly changing environments. Findings – The results indicate that performance measurement stands as one of the four main factors characterising the current practice of strategic planning. This research has determined that complexity arising from organizational size and the rate of change in the sector creates variation in the impact of performance measurement in strategic planning. Large organizations and organizations operating in rapidly changing environments make greater use of performance measurement. Research limitations/implications – This research is based on subjective data; therefore the conclusions concern not the impact of the strategic planning process's elements on organizational performance achievements, but the success/effectiveness of the strategic planning process itself. Practical implications – This research raises a series of questions about the use and potential impact of performance measurement, especially in the categories of organizations that are not significantly influenced by its utilisation.
It contributes to the field of performance measurement impact. Originality/value – This research fills a gap in the literature concerning the lack of large-scale surveys on strategic development processes and performance measurement. It also contributes to the literature in this field by providing empirical evidence on the impact of performance measurement upon the strategic planning process.
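The variable-grouping step described above (factor analysis with Varimax rotation and a 0.4 loading cut-off) can be sketched in plain NumPy. The `varimax` helper implements the standard rotation algorithm, and the unrotated loading matrix is invented for illustration; neither comes from the survey itself.

```python
import numpy as np

# Minimal varimax rotation (standard SVD-based algorithm) used to group
# items into factors with a 0.4 loading cut-off. The loading matrix is
# an invented two-factor example, not the survey's data.
def varimax(loadings: np.ndarray, gamma: float = 1.0,
            max_iter: int = 100, tol: float = 1e-6) -> np.ndarray:
    p, k = loadings.shape
    rotation = np.eye(k)
    d = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        u, s, vt = np.linalg.svd(
            loadings.T @ (rotated ** 3 - (gamma / p) * rotated
                          @ np.diag((rotated ** 2).sum(axis=0))))
        rotation = u @ vt
        d_new = s.sum()
        if d_new < d * (1 + tol):
            break
        d = d_new
    return loadings @ rotation

loadings = np.array([[0.7, 0.3], [0.8, 0.2], [0.6, 0.4],
                     [0.3, 0.7], [0.2, 0.8], [0.4, 0.6]])
rotated = varimax(loadings)
for j in range(rotated.shape[1]):
    items = np.where(np.abs(rotated[:, j]) >= 0.4)[0].tolist()
    print(f"factor {j}: items loading >= 0.4 -> {items}")
```

Because the rotation matrix is orthogonal, each item's communality (row sum of squared loadings) is unchanged; rotation only redistributes loadings across factors to approach simple structure, which is what makes the 0.4 cut-off meaningful for assigning questionnaire items to factors.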
Abstract:
Fare, Grosskopf, Norris and Zhang developed a non-parametric productivity index, the Malmquist index, using data envelopment analysis (DEA). The Malmquist index is a measure of productivity progress (or regress), and it can be decomposed into components such as 'efficiency catch-up' and 'technology change'. However, the Malmquist index and its components are based on two periods of time, which can capture only part of the impact of investment in long-lived assets. The effects of lags in the investment process on the capital stock have been ignored in the current model of the Malmquist index. This paper extends the recent dynamic DEA model introduced by Emrouznejad and Thanassoulis and Emrouznejad to a dynamic Malmquist index. The paper shows that the dynamic productivity results for Organisation for Economic Cooperation and Development countries should reflect reality better than those based on the conventional model.
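The two-period decomposition mentioned above is M = EC × TC, built from four distance-function scores (each period's data evaluated against each period's frontier). A small numerical sketch with hypothetical scores, not the paper's OECD data, shows the decomposition agrees with the geometric-mean form of the index.

```python
import math

# Hypothetical output-distance-function scores from four DEA runs for one
# country (illustrative numbers only):
# first subscript = frontier period, second = data period.
d_t_t, d_t_t1 = 0.80, 0.95    # period-t frontier
d_t1_t, d_t1_t1 = 0.70, 0.85  # period-(t+1) frontier

efficiency_change = d_t1_t1 / d_t_t                              # catch-up
tech_change = math.sqrt((d_t_t1 / d_t1_t1) * (d_t_t / d_t1_t))   # frontier shift
malmquist = efficiency_change * tech_change

# Fare et al.'s geometric-mean form gives the same index value:
malmquist_direct = math.sqrt((d_t_t1 / d_t_t) * (d_t1_t1 / d_t1_t))
print(f"EC = {efficiency_change:.3f}, TC = {tech_change:.3f}, "
      f"M = {malmquist:.3f}")
```

A value of M above 1 indicates productivity progress between the two periods; the dynamic extension discussed in the abstract replaces these two-period distance scores with ones that account for investment lags in the capital stock.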
Abstract:
This study is concerned with several proposals concerning multiprocessor systems and with the various possible methods of evaluating such proposals. After a discussion of the advantages and disadvantages of several performance evaluation tools, the author concludes that simulation is the only tool powerful enough to develop a model of practical use in the design, comparison and extension of systems. The main aims of the simulation package developed as part of this study are cost effectiveness, ease of use and generality. The methodology on which the simulation package is based is described in detail. The fundamental principles are that model design should reflect actual systems design, that measuring procedures should be carried out alongside design, that models should be well documented and easily adaptable, and that models should be dynamic. The simulation package itself is modular, and in this way reflects current design trends. This approach also aids documentation and ensures that the model is easily adaptable. It contains a skeleton structure and a library of segments which can be added to, or directly swapped with, segments of the skeleton structure to form a model which fits a user's requirements. The study also contains the results of some experimental work carried out using the model, the first part of which tests the model's capabilities by simulating a large operating system, the ICL George 3 system; the second part deals with general questions and some of the many proposals concerning multiprocessor systems.
Abstract:
The aim of this study was to determine whether an ophthalmophakometric technique could offer a feasible means of investigating ocular component contributions to residual astigmatism in human eyes. Current opinion was gathered on the prevalence, magnitude and source of residual astigmatism. It emerged that a comprehensive evaluation of the astigmatic contributions of the eye's internal ocular surfaces and their respective axial separations (effectivity) had not been carried out to date. An ophthalmophakometric technique was developed to measure astigmatism arising from the internal ocular components. Procedures included the measurement of refractive error (infra-red autorefractometry), anterior corneal surface power (computerised video keratography), axial distances (A-scan ultrasonography) and the powers of the posterior corneal surface in addition to both surfaces of the crystalline lens (multi-meridional still flash ophthalmophakometry). Computing schemes were developed to yield the required biometric data. These included (1) calculation of crystalline lens surface powers in the absence of Purkinje images arising from its anterior surface, (2) application of meridional analysis to derive spherocylindrical surface powers from notional powers calculated along four pre-selected meridians, (3) application of astigmatic decomposition and vergence analysis to calculate contributions to residual astigmatism of ocular components with obliquely related cylinder axes, (4) calculation of the effect of random experimental errors on the calculated ocular component data. A complete set of biometric measurements was taken from both eyes of 66 undergraduate students. Effectivity due to corneal thickness made the smallest cylinder power contribution (up to 0.25DC) to residual astigmatism, followed by contributions of the anterior chamber depth (up to 0.50DC) and crystalline lens thickness (up to 1.00DC). In each case astigmatic contributions were predominantly direct.
More astigmatism arose from the posterior corneal surface (up to 1.00DC) and both crystalline lens surfaces (up to 2.50DC). The astigmatic contributions of the posterior corneal and lens surfaces were found to be predominantly inverse, whilst direct astigmatism arose from the anterior lens surface. Very similar results were found for right versus left eyes and males versus females. Repeatability was assessed on 20 individuals. The ophthalmophakometric method was found to be prone to considerable accumulated experimental errors. However, these errors are random in nature, so that group-averaged data were found to be reasonably repeatable. A further confirmatory study was carried out on 10 individuals, which demonstrated that biometric measurements made with and without cycloplegia did not differ significantly.
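The effectivity contributions quantified above come from stepping each meridional surface power through an axial separation with the standard vergence relation P' = P / (1 - (d/n)P). The sketch below shows how an anterior chamber depth alone alters the cylinder; the `effective_power` helper, the corneal powers and the chamber depth are illustrative assumptions, not the study's biometric data.

```python
# Vergence "effectivity" step: propagate a surface power P (dioptres)
# through an axial separation d (metres) in a medium of index n.
def effective_power(power_d: float, distance_m: float, ref_index: float) -> float:
    return power_d / (1.0 - (distance_m / ref_index) * power_d)

# Illustrative anterior corneal powers in two principal meridians,
# stepped through a 3.2 mm anterior chamber of aqueous (n ~ 1.336).
p_flat, p_steep = 43.0, 44.5
d, n = 0.0032, 1.336

astig_at_cornea = p_steep - p_flat
astig_at_lens = effective_power(p_steep, d, n) - effective_power(p_flat, d, n)
print(f"cylinder before: {astig_at_cornea:.2f} D, after: {astig_at_lens:.2f} D")
print(f"effectivity contribution: {astig_at_lens - astig_at_cornea:.3f} D")
```

With these assumed values the anterior chamber depth adds a few tenths of a dioptre of cylinder, the same order as the "up to 0.50DC" contribution the study attributes to that separation.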