909 results for measurement and metrology
Abstract:
Home blood pressure measurement − epidemiology and clinical use. Elevated blood pressure, the most significant global risk factor for premature death, cannot be detected or treated without accurate and practical methods of blood pressure measurement. Home blood pressure measurement has gained great popularity among patients. Physicians, however, have not yet fully accepted it, because sufficient evidence of its performance and benefits has been lacking. The aim of this study was to demonstrate that blood pressure measured at home (home BP) is more accurate than blood pressure measured conventionally in the office (office BP), and that it is also effective in clinical use. We investigated the use of home BP in the diagnosis and treatment of hypertension, and additionally examined its association with hypertensive target organ damage. The first study population, a representative sample of the Finnish adult population, comprised 2,120 subjects aged 45-74 years. Participants measured their home BP for one week and attended a health examination that included, in addition to a clinical examination and an interview, an electrocardiogram and office BP measurement. Carotid intima-media thickness (a measure of atherosclerosis) was also measured in 758 subjects, and pulse wave velocity (a measure of arterial stiffness) in 237. In the second study population, comprising 98 hypertensive patients, treatment was guided, according to randomization, either by ambulatory blood pressure (24-hour recording) or by home BP. Office BP was significantly higher than home BP (mean systolic/diastolic difference 8/3 mmHg), and agreement between the two methods in the diagnosis of hypertension was at best moderate (75%). Of the 593 subjects with elevated office BP, 38% had normal BP at home, i.e. so-called white-coat hypertension; hypertension can thus be overdiagnosed in one in three patients in a screening setting. White-coat hypertension was associated with mildly elevated blood pressure, low body mass index and non-smoking, but not with psychiatric morbidity. Nevertheless, white-coat hypertension does not appear to be an entirely harmless phenomenon and may predict future hypertension, as the cardiovascular risk factor profile of those affected lay between the profiles of normotensive and truly hypertensive subjects. Home BP was more strongly associated than office BP with hypertensive target organ damage (intima-media thickness, pulse wave velocity, and electrocardiographic left ventricular hypertrophy). Home BP was also effective in guiding antihypertensive treatment: adjusting drug treatment on the basis of home BP achieved blood pressure control as good as adjustment based on ambulatory BP, which has been regarded as the "gold standard" of blood pressure measurement. On the basis of this and earlier studies, home blood pressure measurement is a clear improvement over conventional office measurement. It is a practical, accurate and widely available method that may even become the first-line option in the diagnosis and treatment of hypertension.
A change in blood pressure measurement practice is needed, as the evidence suggests that office blood pressure measurement should be reserved for screening purposes only.
Abstract:
This paper analyzes the measurement of the diversity of sets based on the dissimilarity of the objects contained in the set. We discuss axiomatic approaches to diversity measurement and examine the considerations underlying the application of specific measures. Our focus is on descriptive issues: rather than assuming a specific ethical position or restricting attention to properties that are appealing in specific applications, we address the foundations of the measurement issue as such in the context of diversity.
Abstract:
Evaluations of measurement invariance provide essential construct validity evidence. However, the quality of such evidence is partly dependent upon the validity of the resulting statistical conclusions. The presence of Type I or Type II errors can render measurement invariance conclusions meaningless. The purpose of this study was to determine the effects of categorization and censoring on the behavior of the chi-square/likelihood ratio test statistic and two alternative fit indices (CFI and RMSEA) under the context of evaluating measurement invariance. Monte Carlo simulation was used to examine Type I error and power rates for the (a) overall test statistic/fit indices, and (b) change in test statistic/fit indices. Data were generated according to a multiple-group single-factor CFA model across 40 conditions that varied by sample size, strength of item factor loadings, and categorization thresholds. Seven different combinations of model estimators (ML, Yuan-Bentler scaled ML, and WLSMV) and specified measurement scales (continuous, censored, and categorical) were used to analyze each of the simulation conditions. As hypothesized, non-normality increased Type I error rates for the continuous scale of measurement and did not affect error rates for the categorical scale of measurement. Maximum likelihood estimation combined with a categorical scale of measurement resulted in more correct statistical conclusions than the other analysis combinations. For the continuous and censored scales of measurement, the Yuan-Bentler scaled ML resulted in more correct conclusions than normal-theory ML. The censored measurement scale did not offer any advantages over the continuous measurement scale. Comparing across fit statistics and indices, the chi-square-based test statistics were preferred over the alternative fit indices, and ΔRMSEA was preferred over ΔCFI. Results from this study should be used to inform the modeling decisions of applied researchers. However, no single analysis combination can be recommended for all situations. Therefore, it is essential that researchers consider the context and purpose of their analyses.
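To make the simulation design concrete, here is a minimal sketch of the data-generation step for one such condition, assuming a hypothetical single-factor model with six items, loadings of .70, five ordered categories, and asymmetric thresholds; all values are illustrative rather than the study's actual settings:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical condition: single-factor model, 6 items, loadings of .70.
n, n_items = 500, 6
loadings = np.full(n_items, 0.70)

# Latent factor scores plus unique errors, scaled so that the
# continuous item responses have unit variance.
eta = rng.standard_normal(n)
errors = rng.standard_normal((n, n_items)) * np.sqrt(1 - loadings**2)
y_continuous = eta[:, None] * loadings + errors

# Categorization: five ordered categories with asymmetric thresholds,
# which induces the non-normality the study manipulates.
thresholds = np.array([-0.5, 0.5, 1.2, 2.0])
y_categorical = np.digitize(y_continuous, thresholds)

# Censoring: clip the continuous responses at a floor and a ceiling.
y_censored = np.clip(y_continuous, -1.5, 1.5)

print(np.bincount(y_categorical.ravel()))  # skewed category frequencies
```

Each generated dataset would then be analyzed under every estimator/scale combination (e.g., ML treating y_categorical as continuous versus WLSMV treating it as ordinal), with Type I error and power tallied across replications.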
Abstract:
Although brand authenticity is gaining increasing interest in consumer behavior research and managerial practice, literature on its measurement and contribution to branding theory is still limited. This article develops an integrative framework of the concept of brand authenticity and reports the development and validation of a scale measuring consumers' perceived brand authenticity (PBA). A multi-phase scale development process resulted in a 15-item PBA scale measuring four dimensions: credibility, integrity, symbolism, and continuity. This scale is reliable across different brands and cultural contexts. We find that brand authenticity perceptions are influenced by indexical, existential, and iconic cues, and that the influence of some of the latter is moderated by consumers' level of marketing skepticism. Results also suggest that PBA increases emotional brand attachment and word-of-mouth, and that it drives brand choice likelihood through self-congruence for consumers high in self-authenticity.
Abstract:
Free drug measurement and pharmacodynamic markers provide the opportunity for a better understanding of drug efficacy and toxicity. High-performance liquid chromatography (HPLC)-mass spectrometry (MS) is a powerful analytical technique that could facilitate the measurement of free drug and these markers. Currently, there are very few published methods for the determination of free drug concentrations by HPLC-MS. The development of atmospheric pressure ionisation sources, together with on-line microdialysis or on-line equilibrium dialysis and column-switching techniques, has reduced sample run times and increased assay efficiency. The availability of such methods will aid drug development and the clinical use of certain drugs, including anti-convulsants, anti-arrhythmics, immunosuppressants, local anaesthetics, anti-fungals and protease inhibitors. The history of free drug measurement and an overview of current HPLC-MS applications for these drugs are discussed. Immunosuppressant drugs are used as an example of the application of HPLC-MS to the measurement of drug pharmacodynamics. Potential biomarkers of immunosuppression that could be measured by HPLC-MS include purine nucleosides/nucleotides, drug-protein complexes and phosphorylated peptides. At the proteomic level, two-dimensional gel electrophoresis combined with matrix-assisted laser desorption/ionisation time-of-flight (TOF) MS is a powerful tool for identifying proteins involved in the response to inflammatory mediators.
Abstract:
The state of the art in productivity measurement and analysis shows a gap between simple methods having little relevance in practice and sophisticated mathematical theory which is unwieldy for strategic and tactical planning purposes, particularly at company level. This thesis extends the method of productivity measurement and analysis based on the concept of added value, appropriate to those companies in which the materials, bought-in parts and services change substantially and a number of plants and inter-related units are involved in providing components for final assembly. Reviews and comparisons of productivity measurement dealing with alternative indices and their problems have been made, and appropriate solutions put forward for productivity analysis in general and the added value method in particular. Based on this concept and method, three kinds of computerised models have been developed to cope with the planning of productivity and productivity growth with reference to changes in their component variables, ranging from a single value to a class interval of values of a productivity distribution: two of them deterministic, called sensitivity analysis and deterministic appraisal, and the third stochastic, called risk simulation. The models are designed to be flexible and can be adjusted according to the available computer capacity, expected accuracy and presentation of the output. The stochastic model is based on the assumption of statistical independence between individual variables and the existence of normality in their probability distributions. The component variables have been forecast using polynomials of degree four. This model was tested by comparing its behaviour with that of a mathematical model using real historical data from British Leyland, and the results were satisfactory within acceptable levels of accuracy. Modifications to the model and its statistical treatment have been made as required. The results of applying these measurement and planning models to British motor vehicle manufacturing companies are presented and discussed.
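As a rough illustration of the risk-simulation idea, the following sketch forecasts two component variables with degree-four polynomials and samples an added-value productivity index under the thesis's stated assumptions of independent, normally distributed variables; all figures are invented, not British Leyland data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative history of two component variables of added value
# (e.g., sales revenue and bought-in materials), in arbitrary units.
years = np.arange(10)
sales = 100 + 3.0 * years + rng.normal(0, 2.0, 10)
materials = 60 + 1.5 * years + rng.normal(0, 1.5, 10)

def forecast(series, years, horizon):
    """Forecast a future value with a degree-four polynomial fit."""
    coeffs = np.polyfit(years, series, deg=4)
    return np.polyval(coeffs, years[-1] + horizon)

# Risk simulation: treat each forecast variable as an independent
# normal distribution and sample the added-value productivity index.
n_draws = 10_000
labour_cost = 25.0  # illustrative constant denominator

sales_draws = rng.normal(forecast(sales, years, 1),
                         sales.std(ddof=1), n_draws)
materials_draws = rng.normal(forecast(materials, years, 1),
                             materials.std(ddof=1), n_draws)

added_value = sales_draws - materials_draws
productivity = added_value / labour_cost  # added value per unit labour cost

print(f"median index: {np.median(productivity):.2f}")
print(f"90% interval: {np.percentile(productivity, [5, 95]).round(2)}")
```

Sampling the index rather than computing a single point estimate is what distinguishes the stochastic model from the two deterministic ones: the output is a distribution of productivity outcomes from which planning intervals can be read off.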
Abstract:
Guest editorial. Ali Emrouznejad is a Senior Lecturer at Aston Business School in Birmingham, UK. His research interests include performance measurement and management, efficiency and productivity analysis, and data mining. He has published widely in various international journals. He is an Associate Editor of the IMA Journal of Management Mathematics and Guest Editor of several special issues of journals including the Journal of the Operational Research Society, Annals of Operations Research, Journal of Medical Systems, and International Journal of Energy Sector Management. He is on the editorial board of several international journals and is co-founder of Performance Improvement Management Software. William Ho is a Senior Lecturer at Aston Business School. Before joining Aston in 2005, he worked as a Research Associate in the Department of Industrial and Systems Engineering at the Hong Kong Polytechnic University. His research interests include supply chain management, production and operations management, and operations research. He has published extensively in international journals such as Computers & Operations Research, Engineering Applications of Artificial Intelligence, European Journal of Operational Research, Expert Systems with Applications, International Journal of Production Economics, International Journal of Production Research, and Supply Chain Management: An International Journal. His first authored book was published in 2006. He is an Editorial Board member of the International Journal of Advanced Manufacturing Technology and an Associate Editor of the OR Insight Journal. Currently, he is a Scholar of the Advanced Institute of Management Research. Uses of frontier efficiency methodologies and multi-criteria decision making for performance measurement in the energy sector: this special issue presents holistic, applied research on performance measurement in energy sector management, aiming to bridge the gap between industry and academia. After a rigorous refereeing process, seven papers were included. The volume opens with five data envelopment analysis (DEA)-based papers. Wu et al. apply the DEA-based Malmquist index to evaluate the changes in relative efficiency and the total factor productivity of coal-fired electricity generation in 30 Chinese administrative regions from 1999 to 2007. Factors considered in the model include fuel consumption, labor, capital, sulphur dioxide emissions, and electricity generated. The authors find that the east provinces were relatively and technically more efficient, whereas the west provinces had the highest growth rate in the period studied. Ioannis E. Tsolas applies the DEA approach to assess the performance of Greek fossil fuel-fired power stations, taking undesirable outputs such as carbon dioxide and sulphur dioxide emissions into consideration. In addition, the bootstrapping approach is deployed to address the uncertainty surrounding DEA point estimates and to provide bias-corrected estimates and confidence intervals. The author finds that, in the sample, the non-lignite-fired stations are on average more efficient than the lignite-fired stations. Maethee Mekaroonreung and Andrew L. Johnson compare three DEA-based measures, which estimate production frontiers and evaluate the relative efficiency of 113 US petroleum refineries while accounting for undesirable outputs.
Three inputs (capital, energy consumption, and crude oil consumption), two desirable outputs (gasoline and distillate generation), and an undesirable output (toxic release) are considered in the DEA models. The authors find that refineries in the Rocky Mountain region performed best, and that about 60 percent of the refineries in the sample could improve their efficiency further. H. Omrani, A. Azadeh, S. F. Ghaderi, and S. Abdollahzadeh present an integrated approach, combining DEA, corrected ordinary least squares (COLS), and principal component analysis (PCA), to calculate the relative efficiency scores of 26 Iranian electricity distribution units from 2003 to 2006. Specifically, both DEA and COLS are used to check three internal consistency conditions, whereas PCA is used to verify and validate the final ranking results of either DEA (consistency) or DEA-COLS (non-consistency). Three inputs (network length, transformer capacity, and number of employees) and two outputs (number of customers and total electricity sales) are considered in the model. Virendra Ajodhia applies three DEA-based models to evaluate the relative performance of 20 electricity distribution firms from the UK and the Netherlands. The first is a traditional DEA model for analyzing cost-only efficiency. The second includes (inverse) quality by modelling total customer minutes lost as an input. The third is based on the idea of using total social costs, including the firm's private costs and the interruption costs incurred by consumers, as an input. Both energy delivered and number of consumers are treated as outputs in the models. After the five DEA papers, Stelios Grafakos, Alexandros Flamos, Vlasis Oikonomou, and D. Zevgolis present a multiple-criteria weighting approach to evaluate energy and climate policy. The proposed approach is akin to the analytic hierarchy process, consisting of pairwise comparisons, consistency verification, and criteria prioritization. Stakeholders and experts in the energy policy field are incorporated in the evaluation process through an interactive means of expressing their preferences verbally, numerically, and visually. A total of 14 evaluation criteria were considered and classified under four objectives: climate change mitigation, energy effectiveness, socioeconomic impact, and competitiveness and technology. Finally, Borge Hess applies the stochastic frontier analysis approach to analyze the impact of various business strategies, including acquisitions, holding structures, and joint ventures, on firm efficiency within a sample of 47 natural gas transmission pipelines in the USA from 1996 to 2005. The author finds no significant change in firm efficiency following an acquisition, and only weak evidence of efficiency improvements attributable to the new shareholder. Moreover, parent companies appear not to influence a subsidiary's efficiency positively, and the analysis shows a negative impact of joint ventures on the technical efficiency of the pipeline company. To conclude, we are grateful to all the authors for their contributions, and to all the reviewers for their constructive comments, which made this special issue possible. We hope that this issue will contribute significantly to performance improvement in the energy sector.
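For readers unfamiliar with the mechanics behind these papers, here is a minimal sketch of the classic input-oriented CCR envelopment model on which such DEA studies build, solved with scipy's linear programming routine; the plant data are invented for illustration, and the published models add undesirable outputs and other refinements:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU o.
    X: (m inputs, n DMUs), Y: (s outputs, n DMUs)."""
    m, n = X.shape
    s = Y.shape[0]
    # Decision variables: [theta, lambda_1 .. lambda_n]
    c = np.zeros(1 + n)
    c[0] = 1.0                      # minimize theta
    A_ub = np.zeros((m + s, 1 + n))
    b_ub = np.zeros(m + s)
    A_ub[:m, 0] = -X[:, o]          # sum_j lambda_j x_j <= theta * x_o
    A_ub[:m, 1:] = X
    A_ub[m:, 1:] = -Y               # sum_j lambda_j y_j >= y_o
    b_ub[m:] = -Y[:, o]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (1 + n))
    return res.fun                  # theta = 1 means efficient

# Illustrative data: 5 plants, inputs = (capital, fuel), output = electricity.
X = np.array([[100, 120,  90, 150, 110],
              [300, 280, 260, 400, 320]], dtype=float)
Y = np.array([[500, 520, 510, 600, 480]], dtype=float)

for o in range(X.shape[1]):
    print(f"plant {o}: efficiency = {ccr_efficiency(X, Y, o):.3f}")
```

A Malmquist index, as in Wu et al., is then built by evaluating each unit against the frontiers of two adjacent periods and taking the geometric mean of the resulting efficiency ratios.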
Abstract:
Measuring variations in efficiency and its extension, eco-efficiency, during a restructuring period in different industries has always been a point of interest for regulators and policy makers. This paper assesses the impacts of restructuring of procurement in the Iranian power industry on the performance of power plants. We introduce a new slacks-based model for Malmquist-Luenberger (ML) index measurement and apply it to the power plants to calculate the efficiency, eco-efficiency, and technological changes over the 8-year period (2003-2010) of restructuring in the power industry. The results reveal that although the restructuring had different effects on the individual power plants, the overall growth in the eco-efficiency of the sector was mainly due to advances in pure technology. We also assess the correlation between the efficiency and eco-efficiency of the power plants, which indicates a close relationship between the two measures, thus lending support to the incorporation of environmental factors in efficiency analysis.
Abstract:
This paper presents for the first time the concept of measurement assisted assembly (MAA) and outlines the research priorities for the realisation of this concept in industry. MAA denotes a paradigm shift in assembly for high-value and complex products and encompasses the development and use of novel metrology processes for the holistic integration and capability enhancement of key assembly and ancillary processes. A complete framework for MAA is detailed, showing how this can facilitate a step change in assembly process capability and efficiency for large and complex products, such as airframes, where traditional assembly processes exhibit the requirement for rectification and rework, use inflexible tooling and are largely manual, resulting in cost and cycle time pressures. The concept of MAA encompasses a range of innovative measurement-assisted processes which enable rapid part-to-part assembly, increased use of flexible automation, traceable quality assurance and control, reduced structure weight and improved levels of precision across the dimensional scales. A full-scale industrial trial of MAA technologies has been carried out on an experimental aircraft wing, demonstrating the viability of the approach, while studies within 140 smaller companies have highlighted the need for better adoption of existing process capability and quality control standards. The identified research priorities for MAA include the development of both frameless and tooling-embedded automated metrology networks. Other research priorities relate to the development of integrated dimensional variation management, thermal compensation algorithms, as well as measurement planning and inspection algorithms linking design to measurement and process planning.
Abstract:
In the Light Controlled Factory, part-to-part assembly and reduced weight will be enabled through the use of predictive fitting processes; low-cost, high-accuracy reconfigurable tooling will be made possible by active compensation; improved control will allow accurate robotic machining; and quality will be improved through the use of traceable, uncertainty-based quality control throughout the production system. A number of challenges must be overcome before this vision is realized: (1) controlling industrial robots for accurate machining; (2) compensation of measurements for thermal expansion; (3) compensation of measurements for refractive index changes; (4) development of embedded metrology tooling for in-tooling measurement and active tooling compensation; and (5) development of software for the planning and control of integrated metrology networks, based on quality control with uncertainty evaluation, together with control systems for predictive processes. This paper describes how these challenges are being addressed, in particular the central challenge of developing large volume measurement process models within an integrated dimensional variation management (IDVM) system.
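Challenge (2) is the most self-contained of these: a length measured on a warm shop floor must be scaled back to the 20 °C reference temperature of dimensional metrology. A minimal sketch, using a typical handbook expansion coefficient for aluminium and invented readings:

```python
# Typical handbook coefficient of thermal expansion (CTE) for aluminium.
ALPHA_ALUMINIUM = 23.1e-6   # 1/K, illustrative value
REFERENCE_TEMP = 20.0       # deg C, standard metrology reference temperature

def compensate_length(measured_mm: float, part_temp_c: float,
                      alpha_per_k: float = ALPHA_ALUMINIUM) -> float:
    """Scale a measured length back to its value at 20 deg C.

    Uses the first-order expansion model L(T) = L20 * (1 + alpha * (T - 20)),
    so L20 = L(T) / (1 + alpha * (T - 20)).
    """
    dt = part_temp_c - REFERENCE_TEMP
    return measured_mm / (1.0 + alpha_per_k * dt)

# A nominally 2 m aluminium component measured on a 28 deg C shop floor
# reads roughly 0.37 mm longer than its 20 deg C length.
print(compensate_length(2000.37, 28.0))  # ~2000.00 mm
```

In practice the part temperature itself carries uncertainty, which is why the paper couples such compensation with uncertainty evaluation rather than treating the corrected value as exact.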
Abstract:
In this study we examine the performance measurement and performance management practice of Hungarian companies, using data from the 2009 survey of the "Competing with the World" (Versenyben a világgal) research program. Our goal is to examine the background of decision support: to characterize companies' performance measurement practice and evaluate its consistency, including how the tendencies observed in our earlier, similar studies (1996, 1999 and 2004) have developed since. We evaluated corporate performance measurement practice (the information sources, performance indicators and analytical tools that executives consider important or useful and use regularly) with the analytical framework developed in our earlier research: orientation, balance, consistency, and supporting role. In assessing the role of the information system in supporting different activities, we also compared the views of the managers responsible for the different areas, and examined the effects of various company characteristics (company size, type of ownership, main activity, etc.).