972 results for Multi Measurement Mode


Relevance:

30.00%

Publisher:

Abstract:

Background: TID ratio indirectly reflects myocardial ischemia and is correlated with cardiac prognosis. We aimed at comparing the influence of three different software packages for the assessment of TID using Rb-82 cardiac PET/CT. Methods: In total, data of 30 patients were used, based on normal myocardial perfusion (SSS<3 and SRS<3) and stress myocardial blood flow (>2 mL/min/g) assessed by Rb-82 cardiac PET/CT. After reconstruction using 2D OSEM (2 iterations, 28 subsets) and 3-D filtering (Butterworth, order=10, ωc=0.5), data were automatically processed, and then manually processed to define identical basal and apical limits on both stress and rest images. TID ratios were determined with the Myometrix®, ECToolbox® and QGS® software packages. Comparisons used ANOVA, Student t-tests and the Lin concordance test (ρc). Results: All of the 90 processings were successfully performed. TID ratios were not statistically different between software packages when data were processed automatically (P=0.2) or manually (P=0.17). There was a slight but significant relative overestimation of TID with automatic processing in comparison to manual processing using ECToolbox® (1.07 ± 0.13 vs 1.00 ± 0.13, P=0.001) and Myometrix® (1.07 ± 0.15 vs 1.01 ± 0.11, P=0.003), but not using QGS® (1.02 ± 0.12 vs 1.05 ± 0.11, P=0.16). The best concordance was achieved between ECToolbox® and Myometrix® manual processing (ρc=0.67). Conclusion: Using either automatic or manual mode, TID estimation was not significantly influenced by software type. Using Myometrix® or ECToolbox®, TID was significantly different between automatic and manual processing, but not using QGS®. The software package should be accounted for when defining TID normal reference limits, as well as when used in multicenter studies. QGS® seemed to be the most operator-independent software package, while ECToolbox® and Myometrix® produced the closest results.
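The agreement statistic used above, Lin's concordance correlation coefficient (ρc), penalizes both imprecision and systematic bias between two measurement methods. A minimal sketch in Python; the paired TID values are illustrative, not taken from the study:

```python
from statistics import mean

def lin_ccc(x, y):
    """Lin's concordance correlation coefficient (rho_c) between two
    methods measuring the same quantity: 2*cov / (var_x + var_y + bias^2)."""
    mx, my = mean(x), mean(y)
    sx2 = sum((a - mx) ** 2 for a in x) / len(x)   # population variance of x
    sy2 = sum((b - my) ** 2 for b in y) / len(y)   # population variance of y
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)  # covariance
    return 2 * sxy / (sx2 + sy2 + (mx - my) ** 2)

# Hypothetical TID ratios from two software packages for five scans
a = [1.00, 1.05, 0.98, 1.10, 1.02]
b = [1.02, 1.04, 1.00, 1.08, 1.05]
print(round(lin_ccc(a, b), 3))
```

Unlike plain Pearson correlation, ρc drops below 1 whenever one package systematically over- or under-reads the other, which is exactly the inter-software bias the study probes.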

Relevance:

30.00%

Publisher:

Abstract:

Measuring school efficiency is a challenging task. First, a performance measurement technique has to be selected. Within Data Envelopment Analysis (DEA), one such technique, alternative models have been developed in order to deal with environmental variables. The majority of these models lead to diverging results. Second, the choice of input and output variables to be included in the efficiency analysis is often dictated by data availability. The choice of the variables remains an issue even when data is available. As a result, the choice of technique, model and variables is probably, and ultimately, a political judgement. Multi-criteria decision analysis methods can help the decision makers to select the most suitable model. The number of selection criteria should remain parsimonious and not be oriented towards the results of the models in order to avoid opportunistic behaviour. The selection criteria should also be backed by the literature or by an expert group. Once the most suitable model is identified, the principle of permanence of methods should be applied in order to avoid a change of practices over time. Within DEA, the two-stage model developed by Ray (1991) is the most convincing model which allows for an environmental adjustment. In this model, an efficiency analysis is conducted with DEA followed by an econometric analysis to explain the efficiency scores. An environmental variable of particular interest, tested in this thesis, consists of the fact that operations are held, for certain schools, on multiple sites. Results show that the fact of being located on more than one site has a negative influence on efficiency. A likely way to solve this negative influence would consist of improving the use of ICT in school management and teaching. Planning new schools should also consider the advantages of being located on a unique site, which allows reaching a critical size in terms of pupils and teachers. 
The fact that underprivileged pupils perform worse than privileged pupils has been public knowledge since Coleman et al. (1966). As a result, underprivileged pupils have a negative influence on school efficiency. This is confirmed by this thesis for the first time in Switzerland. Several countries have developed priority education policies in order to compensate for the negative impact of disadvantaged socioeconomic status on school performance. These policies have failed. As a result, other actions need to be taken. In order to define these actions, one has to identify the social-class differences which explain why disadvantaged children underperform. Childrearing and literacy practices, health characteristics, housing stability and economic security influence pupil achievement. Rather than allocating more resources to schools, policymakers should therefore focus on related social policies. For instance, they could define pre-school, family, health, housing and benefits policies in order to improve the conditions for disadvantaged children.
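In the special case of one input and one output under constant returns to scale, the first-stage DEA efficiency score reduces to each school's productivity ratio relative to the best observed ratio; multi-input/multi-output DEA instead solves a linear program per school. A toy sketch with invented data:

```python
def dea_ccr_single(inputs, outputs):
    """CCR (constant-returns-to-scale) DEA efficiency for the special
    case of one input and one output: each unit's output/input ratio
    divided by the best observed ratio. Scores lie in (0, 1]."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical schools: input = teaching hours, output = test-score points
hours = [100, 120, 80]
points = [500, 540, 480]
scores = dea_ccr_single(hours, points)
print([round(s, 3) for s in scores])
```

In Ray's two-stage approach described above, these scores would then be regressed on environmental variables (e.g. a multi-site dummy) to explain the efficiency differences.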

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: To develop a breath-hold method for black-blood viability imaging of the heart that may facilitate identifying the endocardial border. MATERIALS AND METHODS: Three stimulated-echo acquisition mode (STEAM) images were obtained almost simultaneously during the same acquisition using three different demodulation values. Two of the three images were used to construct a black-blood image of the heart. The third image was a T1-weighted viability image that enabled detection of hyperintense infarcted myocardium after contrast agent administration. The three STEAM images were combined into one composite black-blood viability image of the heart. The composite STEAM images were compared to conventional inversion-recovery (IR) delayed hyperenhanced (DHE) images in nine human subjects studied on a 3T MRI scanner. RESULTS: STEAM images showed black-blood characteristics and a significant improvement in the blood-infarct signal-difference to noise ratio (SDNR) when compared to the IR-DHE images (34 ± 4.1 vs. 10 ± 2.9, mean ± standard deviation (SD), P < 0.002). There was sufficient myocardium-infarct SDNR in the STEAM images to accurately delineate infarcted regions. The extracted infarcts demonstrated good agreement with the IR-DHE images. CONCLUSION: The STEAM black-blood property allows for better delineation of the blood-infarct border, which would enhance the fast and accurate measurement of infarct size.
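The SDNR figure of merit compared above is simply the signal difference between two tissues scaled by the noise standard deviation. A small worked sketch with illustrative numbers (arbitrary units, not values from the study):

```python
def sdnr(signal_a, signal_b, noise_sd):
    """Signal-difference-to-noise ratio between two tissues:
    |S_a - S_b| / sigma_noise."""
    return abs(signal_a - signal_b) / noise_sd

# Illustrative signal intensities: suppressed blood pool vs. bright infarct
blood, infarct, noise = 120.0, 950.0, 25.0
print(round(sdnr(blood, infarct, noise), 1))
```

A larger blood-infarct SDNR is what makes the subendocardial border easier to trace on the black-blood images.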

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: The association between smoking and total energy expenditure (TEE) is still controversial. We examined this association in a multi-country study where TEE was measured in a subset of participants by the doubly labeled water (DLW) method, the gold standard for this measurement. METHODS: This study includes 236 participants from five different African origin populations who underwent DLW measurements and had complete data on the main covariates of interest. Self-reported smoking status was categorized as either light (<7 cig/day) or high (≥7 cig/day). Lean body mass was assessed by deuterium dilution and physical activity (PA) by accelerometry. RESULTS: The prevalence of smoking was 55% in men and 16% in women with a median of 6.5 cigarettes/day. There was a trend toward lower BMI in smokers than non-smokers (not statistically significant). TEE was strongly correlated with fat-free mass (men: 0.70; women: 0.79) and with body weight (0.59 in both sexes). Using linear regression and adjusting for body weight, study site, age, PA, alcohol intake and occupation, TEE was larger in high smokers than in never smokers among men (difference of 298 kcal/day, p = 0.045) but not among women (162 kcal/day, p = 0.170). The association became slightly weaker in men (254 kcal/day, p = 0.058) and disappeared in women (-76 kcal/day, p = 0.380) when adjusting for fat-free mass instead of body weight. CONCLUSION: There was an association between smoking and TEE among men. However, the lack of an association among women, which may be partly related to the small number of smoking women, also suggests a role of unaccounted confounding factors.
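The adjusted smoker/non-smoker difference reported above comes from a linear regression controlling for covariates. The mechanics can be sketched with the normal equations; the data below are fabricated, and only body weight is used as a covariate:

```python
def ols(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y,
    solved with Gaussian elimination. X is a list of rows."""
    k = len(X[0])
    # Build X'X and X'y
    A = [[sum(X[r][i] * X[r][j] for r in range(len(X))) for j in range(k)]
         for i in range(k)]
    b = [sum(X[r][i] * y[r] for r in range(len(X))) for i in range(k)]
    # Forward elimination with partial pivoting
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back substitution
    beta = [0.0] * k
    for r in reversed(range(k)):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

# Hypothetical data: columns = [intercept, smoker(0/1), body weight (kg)]
X = [[1, 0, 60], [1, 0, 70], [1, 1, 65], [1, 1, 75], [1, 0, 80], [1, 1, 85]]
y = [2200, 2400, 2500, 2700, 2600, 2900]  # TEE in kcal/day
beta = ols(X, y)
print(round(beta[1]))  # weight-adjusted smoker effect (kcal/day)
```

Swapping the weight column for fat-free mass is exactly the sensitivity analysis that weakened the association in the study.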

Relevance:

30.00%

Publisher:

Abstract:

New-generation nebulizers are commonly used for the administration of salbutamol in mechanically ventilated patients, yet the different modes of administration and new devices have not been compared. We developed a liquid chromatography-tandem mass spectrometry method for the determination of concentrations as low as 0.05 ng/mL of salbutamol, corresponding to the desired plasma concentration after inhalation. Salbutamol separation was performed by reverse-phase HPLC. Analyte quantification was performed by electrospray ionization-triple quadrupole mass spectrometry using selected reaction monitoring detection in the positive mode. The method was validated over concentrations ranging from 0.05 to 100 ng/mL in plasma and from 0.18 to 135 ng/mL in urine. The method is precise, with mean inter-day coefficients of variation (CV%) within 3.1-8.3% in plasma and 1.3-3.9% in urine, as well as accurate. The proposed method was found to reach the sensitivity required for the evaluation of different nebulizers as well as nebulization modes. The present assay was applied to examine whether salbutamol urine levels, normalized to creatinine levels, correlated with the plasma concentrations. A suitable, convenient and noninvasive method of monitoring patients receiving salbutamol during mechanical ventilation could thus be implemented. Copyright © 2011 John Wiley & Sons, Ltd.
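The inter-day precision figures quoted above (CV%) are the standard deviation of repeated quality-control measurements expressed as a percentage of their mean. A quick sketch with hypothetical QC data:

```python
from statistics import mean, stdev

def cv_percent(values):
    """Coefficient of variation: sample SD as a percentage of the mean."""
    return 100 * stdev(values) / mean(values)

# Hypothetical QC concentrations (ng/mL) measured on five different days,
# near the 0.05 ng/mL lower limit of quantification
qc_days = [0.049, 0.052, 0.050, 0.047, 0.051]
print(round(cv_percent(qc_days), 1))
```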

Relevance:

30.00%

Publisher:

Abstract:

Raltegravir (RAL), maraviroc (MVC), darunavir (DRV), and etravirine (ETV) are new antiretroviral agents with significant potential for drug interactions. This work describes a sensitive and accurate liquid chromatography-tandem mass spectrometry (LC-MS/MS) method for the determination of plasma drug levels. Single-step extraction of RAL, MVC, DRV, ETV and ritonavir (RTV) from plasma (100 µl) is performed by protein precipitation using 600 µl of acetonitrile, after the addition of 100 µl of darunavir-d9 (DRV-d9) at 1000 ng/ml in MeOH/H2O 50/50 as internal standard (I.S.). The mixture is vortexed, sonicated for 10 min, vortex-mixed again and centrifuged. An aliquot of supernatant (150 µl) is diluted 1:1 with a mixture of 20 mM ammonium acetate/MeOH 40/60, and 10 µl is injected onto a 2.1 x 50 mm Waters Atlantis-dC18 3 µm analytical column. Chromatographic separations are performed using a gradient program with 2 mM ammonium acetate containing 0.1% formic acid and acetonitrile with 0.1% formic acid. Analyte quantification is performed by electrospray ionisation-triple quadrupole mass spectrometry using selected reaction monitoring detection in the positive mode. The method has been validated over the clinically relevant concentrations ranging from 12.5 to 5000 ng/ml, 2.5 to 1000 ng/ml, 25 to 10,000 ng/ml, 10 to 4000 ng/ml, and 5 to 2000 ng/ml for RAL, MVC, DRV, ETV and RTV, respectively. The extraction recovery for all antiretroviral drugs is always above 91%. The method is precise, with mean inter-day CV% within 5.1-9.8%, and accurate (range of inter-day deviation from nominal values -3.3 to +5.1%). In addition, our method enables the simultaneous assessment of raltegravir-glucuronide. This is the first analytical method allowing the simultaneous assay of antiretroviral agents targeted to four different steps of HIV replication.
The proposed method is suitable for the Therapeutic Drug Monitoring Service of these new regimen combinations administered as salvage therapy to patients having experienced treatment failure, and for whom exposure, tolerance and adherence assessments are critical.
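The accuracy criterion quoted above (inter-day deviation from nominal values of -3.3 to +5.1%) is the bias of the mean measured concentration relative to the spiked concentration. A minimal sketch with hypothetical validation runs:

```python
from statistics import mean

def percent_deviation(measured, nominal):
    """Accuracy as percent deviation of the mean measured concentration
    from the nominal (spiked) value; bioanalytical validation guidelines
    typically require this to stay within +/-15%."""
    return 100 * (mean(measured) - nominal) / nominal

# Hypothetical inter-day results for a 1000 ng/ml quality-control sample
runs = [985.0, 1020.0, 1008.0, 996.0]
print(round(percent_deviation(runs, 1000.0), 1))
```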

Relevance:

30.00%

Publisher:

Abstract:

Tumour immunologists strive to develop efficient tumour vaccination and adoptive transfer therapies that enlarge the pool of tumour-specific and -reactive effector T-cells in vivo. To assess the efficiency of the various strategies, ex vivo assays are needed for the longitudinal monitoring of the patient's specific immune responses, providing both quantitative and qualitative data. In particular, since tumour cell cytolysis is the end goal of tumour immunotherapy, routine immune monitoring protocols need to include a read-out for the cytolytic efficiency of Ag-specific cells. We propose to combine current immune monitoring techniques in a highly sensitive and reproducible multi-parametric flow-cytometry-based cytotoxicity assay that has been optimised to require low numbers of Ag-specific T-cells. The possibility of re-analysing those T-cells that have undergone lytic activity is illustrated by the concomitant detection of CD107a upregulation on the surface of degranulated T-cells. To date, the LiveCount Assay provides the only possibility of assessing the ex vivo cytolytic activity of low-frequency Ag-specific cytotoxic T-lymphocytes from patient material.
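A common read-out for flow-based killing assays of this kind (a generic formula, not necessarily the exact LiveCount computation) is percent specific lysis, derived from the live target cells recovered with and without effector cells:

```python
def percent_specific_lysis(live_targets_with_effectors, live_targets_alone):
    """Cytotoxicity read-out: the fraction of target cells lost when
    effector T-cells are present, relative to targets cultured alone."""
    return 100 * (1 - live_targets_with_effectors / live_targets_alone)

# Hypothetical live target-cell counts recovered by flow cytometry
print(round(percent_specific_lysis(3200, 8000), 1))
```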

Relevance:

30.00%

Publisher:

Abstract:

Arterial Spin Labeling (ASL) is a method to measure perfusion using magnetically labeled blood water as an endogenous tracer. Being fully non-invasive, this technique is attractive for longitudinal studies of cerebral blood flow in healthy and diseased individuals, or as a surrogate marker of metabolism. So far, ASL has been restricted mostly to specialist centers due to the generally low SNR of the method and potential issues with the user-dependent analysis needed to obtain quantitative measurements of cerebral blood flow (CBF). Here, we evaluated a particular implementation of ASL (called Quantitative STAR labeling of Arterial Regions, or QUASAR), a method providing user-independent quantification of CBF, in a large test-retest study across sites from around the world, dubbed "The QUASAR reproducibility study". Altogether, 28 sites located in Asia, Europe and North America participated, and a total of 284 healthy volunteers were scanned. Minimal operator dependence was assured by using an automatic planning tool, and its accuracy and potential usefulness in multi-center trials was evaluated as well. Accurate repositioning between sessions was achieved with the automatic planning tool, showing mean displacements of 1.87 ± 0.95 mm and rotations of 1.56 ± 0.66 degrees. Mean gray matter CBF was 47.4 ± 7.5 ml/100 g/min, with a between-subject standard deviation SD(b) = 5.5 ml/100 g/min and a within-subject standard deviation SD(w) = 4.7 ml/100 g/min. The corresponding repeatability was 13.0 ml/100 g/min and was found to be within the range of previous studies.
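The repeatability figure quoted above is consistent with the standard Bland-Altman repeatability coefficient, 1.96·√2 ≈ 2.77 times the within-subject standard deviation. A quick check using the study's own SD(w):

```python
import math

def repeatability_coefficient(sd_within):
    """Bland-Altman repeatability: the value below which the absolute
    difference between two measurements on the same subject is expected
    to lie with 95% probability, 1.96 * sqrt(2) * within-subject SD."""
    return 1.96 * math.sqrt(2) * sd_within

# SD(w) = 4.7 ml/100 g/min from the study
print(round(repeatability_coefficient(4.7), 1))
```

The printed value reproduces the reported repeatability of 13.0 ml/100 g/min.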

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: This descriptive article illustrates the application of Global Positioning System (GPS) professional receivers in the field of locomotion studies. The technological challenge was to assess the external mechanical work in outdoor walking. METHODS: Five subjects each walked five 5-min trials on an athletic track at different imposed stride frequencies (from 70 to 130 steps·min⁻¹). A differential GPS system (carrier-phase analysis) measured the variation of the position of the trunk at 5 Hz. A portable indirect calorimeter recorded breath-by-breath energy expenditure. RESULTS: For a walking speed of 1.05 ± 0.11 m·s⁻¹, the vertical lift of the trunk (43 ± 14 mm) induced a power of 46.0 ± 20.4 W. The average speed variation per step (0.15 ± 0.03 m·s⁻¹) produced a kinetic power of 16.9 ± 7.2 W. As compared with commonly admitted values, the energy exchange (recovery) between the two energy components was low (39.1 ± 10.0%), which induced an overestimated mechanical power (38.9 ± 18.3 W, or 0.60 W·kg⁻¹ body mass) and a high net mechanical efficiency (26.9 ± 5.8%). CONCLUSION: We assumed that the cause of the overestimation was an unwanted oscillation of the GPS antenna. It is concluded that GPS (in phase mode) is now able to record small body movements during human locomotion and constitutes a promising tool for gait analysis of outdoor unrestrained walking. However, the design of the receiver and the antenna must be adapted to human experiments, and a thorough validation study remains to be conducted.
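The "recovery" quantity above measures the inverted-pendulum exchange between potential and kinetic energy of the trunk. A sketch of the usual formula; plugging in the mean powers from the abstract gives a figure close to the reported 39.1% (means of ratios differ slightly from the ratio of means):

```python
def percent_recovery(w_pot, w_kin, w_ext):
    """Energy recovery of the pendulum-like exchange between potential
    and kinetic energy during walking:
    R = (Wpot + Wkin - Wext) / (Wpot + Wkin) * 100."""
    return 100 * (w_pot + w_kin - w_ext) / (w_pot + w_kin)

# Mean powers (W) reported in the abstract: vertical 46.0, kinetic 16.9,
# total external mechanical power 38.9
print(round(percent_recovery(46.0, 16.9, 38.9), 1))
```

Perfect pendular exchange (Wext = 0) would give 100% recovery; no exchange gives 0%.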

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND AND PURPOSE: Knowledge of cerebral blood flow (CBF) alterations in cases of acute stroke could be valuable in the early management of these cases. Among imaging techniques affording evaluation of cerebral perfusion, perfusion CT studies involve sequential acquisition of cerebral CT sections obtained in an axial mode during the IV administration of iodinated contrast material. They are thus very easy to perform in emergency settings. Perfusion CT values of CBF have proved to be accurate in animals, and perfusion CT affords plausible values in humans. The purpose of this study was to validate perfusion CT studies of CBF by comparison with the results provided by stable xenon CT, which have been reported to be accurate, and to evaluate acquisition and processing modalities of CT data, notably the possible deconvolution methods and the selection of the reference artery. METHODS: Twelve stable xenon CT and perfusion CT cerebral examinations were performed within an interval of a few minutes in patients with various cerebrovascular diseases. CBF maps were obtained from perfusion CT data by deconvolution using singular value decomposition and least mean square methods. The CBF values were compared with the stable xenon CT results in multiple regions of interest through linear regression analysis and bilateral t tests for matched variables. RESULTS: Linear regression analysis showed good correlation between perfusion CT and stable xenon CT CBF values (singular value decomposition method: R² = 0.79, slope = 0.87; least mean square method: R² = 0.67, slope = 0.83). Bilateral t tests for matched variables did not identify a significant difference between the two imaging methods (P > .1). Both deconvolution methods were equivalent (P > .1). The choice of the reference artery is a major concern and has a strong influence on the final perfusion CT CBF map.
CONCLUSION: Perfusion CT studies of CBF achieved with adequate acquisition parameters and processing lead to accurate and reliable results.
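The deconvolution at the heart of perfusion-CT processing can be illustrated in the noise-free case: the discrete convolution matrix built from the arterial input function (AIF) is lower-triangular Toeplitz, so it can be inverted by forward substitution. Clinical pipelines use regularised SVD instead, because plain inversion amplifies measurement noise. A toy sketch with made-up curves:

```python
def convolve_discrete(aif, residue, dt):
    """Tissue enhancement as a discrete convolution of the AIF with the
    CBF-scaled residue function R: C[t] = dt * sum_{k<=t} AIF[k]*R[t-k]."""
    n = len(aif)
    return [dt * sum(aif[k] * residue[t - k] for k in range(t + 1))
            for t in range(n)]

def deconvolve_forward(aif, tissue, dt):
    """Noise-free deconvolution by forward substitution over the
    lower-triangular convolution system; recovers CBF*R(t)."""
    n = len(tissue)
    r = [0.0] * n
    for t in range(n):
        tail = sum(aif[k] * r[t - k] for k in range(1, t + 1))
        r[t] = (tissue[t] / dt - tail) / aif[0]
    return r

# Made-up curves: CBF*R(t) peaks at its first sample, so CBF = max(r)
aif = [1.0, 2.0, 1.0, 0.5]
cbf_r = [0.6, 0.45, 0.3, 0.15]
tissue = convolve_discrete(aif, cbf_r, 1.0)
recovered = deconvolve_forward(aif, tissue, 1.0)
print([round(v, 2) for v in recovered])
```

With noisy clinical data this direct inversion blows up, which is why truncated/regularised SVD (one of the two methods compared in the study) is preferred.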

Relevance:

30.00%

Publisher:

Abstract:

We present a viscometric affinity biosensor that can potentially allow continuous multi-analyte monitoring in biological fluids such as blood or plasma. The sensing principle is based on the detection of viscosity changes of a polymeric solution which has a selective affinity for the analyte of interest. The chemico-mechanical sensor incorporates an actuating piezoelectric diaphragm, a sensing piezoelectric diaphragm and a flow-resisting microchannel for viscosity detection. A free-standing anodic aluminium oxide (AAO) porous nano-membrane is used as the selective interface. A glucose-sensitive sensor was fabricated and extensively assessed in buffer solution. The sensor's reversibility, stability and sensitivity were excellent for at least 65 hours. Results also showed a good degree of stability in a long-term measurement (25 days). The sensor behaviour was furthermore tested in fetal bovine serum (FBS). The results obtained for glucose sensing are very promising, indicating that the developed sensor is a candidate for continuous monitoring in biological fluids. Sensitive solutions for ionized calcium and pH are currently under development and should allow multi-analyte sensing in the near future.
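The channel-based viscosity read-out follows from laminar, pressure-driven flow: for a given pressure drop, flow through the microchannel falls in proportion to the solution's viscosity (Hagen-Poiseuille law). A hedged sketch; the dimensions are illustrative, not the device's actual geometry:

```python
import math

def poiseuille_flow(delta_p, radius, length, viscosity):
    """Volumetric flow through a circular channel under laminar flow
    (Hagen-Poiseuille): Q = pi * r^4 * dP / (8 * eta * L)."""
    return math.pi * radius**4 * delta_p / (8 * viscosity * length)

# Illustrative numbers: 100 um radius, 1 cm channel, 1 kPa drive,
# water-like viscosity of 1 mPa*s vs. an analyte-thickened 2 mPa*s
q_low = poiseuille_flow(1e3, 100e-6, 1e-2, 1e-3)
q_high = poiseuille_flow(1e3, 100e-6, 1e-2, 2e-3)
print(round(q_low / q_high, 1))
```

Doubling the viscosity halves the flow, which is the mechanical signal the sensing diaphragm picks up.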

Relevance:

30.00%

Publisher:

Abstract:

Due to the intense international competition, demanding and sophisticated customers, and diverse transforming technological change, organizations need to renew their products and services by allocating resources to research and development (R&D). Managing R&D is complex, but vital for many organizations to survive in the dynamic, turbulent environment. Thus, the increased interest among decision-makers towards finding the right performance measures for R&D is understandable. The measures or evaluation methods of R&D performance can be utilized for multiple purposes: for strategic control, for justifying the existence of R&D, for providing information and improving activities, as well as for the purposes of motivating and benchmarking. The earlier research in the field of R&D performance analysis has generally focused either on the activities and considerable factors and dimensions - e.g. strategic perspectives, purposes of measurement, levels of analysis, types of R&D or phases of the R&D process - prior to the selection of R&D performance measures, or on proposed principles or actual implementation of the selection or design processes of R&D performance measures or measurement systems. This study aims at integrating the consideration of essential factors and dimensions of R&D performance analysis into the developed selection processes of R&D measures, which have been applied in real-world organizations. The earlier models for corporate performance measurement that can be found in the literature are to some extent adaptable also to the development of measurement systems and the selection of measures in R&D activities.
However, it is necessary to emphasize the special aspects related to the measurement of R&D performance in a way that makes the development of new approaches, especially for R&D performance measure selection, necessary: First, the special characteristics of R&D - such as the long time lag between the inputs and outcomes, as well as the overall complexity and difficult coordination of activities - influence the R&D performance analysis problems, such as the need for more systematic, objective, balanced and multi-dimensional approaches for R&D measure selection, as well as the incompatibility of R&D measurement systems with other corporate measurement systems and vice versa. Second, the above-mentioned characteristics and challenges bring forth the significance of the influencing factors and dimensions that need to be recognized in order to derive the selection criteria for measures and choose the right R&D metrics, which is the most crucial step in the measurement system development process. The main purpose of this study is to support the management and control of the research and development activities of organizations by increasing the understanding of R&D performance analysis, clarifying the main factors related to the selection of R&D measures, and providing novel types of approaches and methods for systematizing the whole strategy- and business-based selection and development process of R&D indicators. The final aim of the research is to support the management in their decision-making on R&D with suitable, systematically chosen measures or evaluation methods of R&D performance. Thus, the emphasis in most sub-areas of the present research has been on the promotion of the selection and development process of R&D indicators with the help of different tools and decision support systems, i.e. the research has normative features, providing guidelines through novel types of approaches.
The gathering of data and the conducting of case studies in metal and electronics industry companies, in the information and communications technology (ICT) sector, and in non-profit organizations helped us to formulate a comprehensive picture of the main challenges of R&D performance analysis in different organizations, which is essential, as recognition of the most important problem areas is a very crucial element in the constructive research approach utilized in this study. Multiple practical benefits regarding the defined problem areas could be found in the various constructed approaches presented in this dissertation: 1) the selection of R&D measures became more systematic when compared to the empirical analysis, as it was common that no systematic approaches had been utilized in the studied organizations earlier; 2) the evaluation methods or measures of R&D chosen with the help of the developed approaches can be more directly utilized in decision-making, because of the thorough consideration of the purpose of measurement, as well as other dimensions of measurement; 3) more balance in the set of R&D measures was desired and gained through the holistic approaches to the selection processes; and 4) more objectivity was gained through organizing the selection processes, as the earlier systems were considered subjective in many organizations. Scientifically, this dissertation aims to make a contribution to the present body of knowledge of R&D performance analysis by facilitating dealing with the versatility and challenges of R&D performance analysis, as well as the factors and dimensions influencing the selection of R&D performance measures, and by integrating these aspects into the developed novel types of approaches, methods and tools in the selection processes of R&D measures, applied in real-world organizations.
In the whole research, facilitation of dealing with the versatility and challenges in R&D performance analysis, as well as the factors and dimensions influencing the R&D performance measure selection are strongly integrated with the constructed approaches. Thus, the research meets the above-mentioned purposes and objectives of the dissertation from the scientific as well as from the practical point of view.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Left atrial (LA) dilatation is associated with a large variety of cardiac diseases. Current cardiovascular magnetic resonance (CMR) strategies to measure LA volumes are based on multi-breath-hold multi-slice acquisitions, which are time-consuming and susceptible to misregistration. AIM: To develop a time-efficient single breath-hold 3D CMR acquisition and reconstruction method to precisely measure LA volumes and function. METHODS: A highly accelerated compressed-sensing multi-slice cine sequence (CS-cineCMR) was combined with a non-model-based 3D reconstruction method to measure LA volumes with high temporal and spatial resolution during a single breath-hold. This approach was validated in LA phantoms of different shapes and applied in 3 patients. In addition, the influence of slice orientation on accuracy was evaluated in the LA phantoms for the new approach in comparison with a conventional model-based biplane area-length reconstruction. As a reference in patients, a self-navigated high-resolution whole-heart 3D dataset (3D-HR-CMR) was acquired during mid-diastole to yield accurate LA volumes. RESULTS: Phantom studies: LA volumes were accurately measured by CS-cineCMR, with a mean difference of -4.73 ± 1.75 ml (-8.67 ± 3.54%, r2 = 0.94). For the new method, the calculated volumes were not significantly different when different orientations of the CS-cineCMR slices were applied to cover the LA phantoms. Long-axis slices "aligned" vs "not aligned" with the phantom long-axis yielded similar differences vs the reference volume (-4.87 ± 1.73 ml vs. -4.45 ± 1.97 ml, p = 0.67), as did short-axis slices "perpendicular" vs. "not perpendicular" to the LA long-axis (-4.72 ± 1.66 ml vs. -4.75 ± 2.13 ml; p = 0.98). The conventional biplane area-length method was susceptible to slice orientation (p = 0.0085 for the interaction of "slice orientation" and "reconstruction technique", 2-way ANOVA for repeated measures).
To use the 3D-HR-CMR as the reference for LA volumes in patients, it was validated in the LA phantoms (mean difference: -1.37 ± 1.35 ml, -2.38 ± 2.44%, r2 = 0.97). Patient study: The CS-cineCMR LA volumes of the mid-diastolic frame matched closely with the reference LA volume (measured by 3D-HR-CMR), with a difference of -2.66 ± 6.5 ml (3.0% underestimation; true LA volumes: 63 ml, 62 ml, and 395 ml). Finally, a high intra- and inter-observer agreement for maximal and minimal LA volume measurement is also shown. CONCLUSIONS: The proposed method combines a highly accelerated single-breath-hold compressed-sensing multi-slice CMR technique with a non-model-based 3D reconstruction to accurately and reproducibly measure LA volumes and function.
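The conventional model-based reconstruction that the new method is compared against is the biplane area-length formula, which estimates chamber volume from two orthogonal long-axis areas and the chamber length. A minimal sketch with illustrative measurements:

```python
def biplane_area_length_volume(area1, area2, length):
    """Biplane area-length estimate of a cardiac chamber volume from
    two orthogonal long-axis views: V = 0.85 * A1 * A2 / L.
    Assumes an ellipsoidal chamber, which is why the method is
    sensitive to slice orientation."""
    return 0.85 * area1 * area2 / length

# Illustrative atrial measurements: areas in cm^2, length in cm
print(round(biplane_area_length_volume(20.0, 22.0, 5.5), 1))
```

Because the formula assumes an idealized ellipsoid, misaligned imaging planes bias the result, consistent with the orientation sensitivity reported in the phantom experiments.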

Relevance:

30.00%

Publisher:

Abstract:

The short version of the Oxford-Liverpool Inventory of Feelings and Experiences (sO-LIFE) is a widely used measure assessing schizotypy. There is limited information, however, on how sO-LIFE scores compare across different countries. The main goal of the present study is to test the measurement invariance of sO-LIFE scores in a large sample of non-clinical adolescents and young adults from four European countries (UK, Switzerland, Italy, and Spain). The scores were obtained from validated versions of the sO-LIFE in the respective languages. The sample comprised 4190 participants (M = 20.87 years; SD = 3.71 years). The study of the internal structure, using confirmatory factor analysis, revealed that both the three-factor (i.e., positive schizotypy, cognitive disorganisation, and introvertive anhedonia) and four-factor (i.e., positive schizotypy, cognitive disorganisation, introvertive anhedonia, and impulsive nonconformity) models fitted the data moderately well. Multi-group confirmatory factor analysis showed that the three-factor model had partial strong measurement invariance across countries. Eight items were non-invariant across samples. Statistically significant differences in the mean scores of the sO-LIFE were found by country. Reliability scores, estimated with ordinal alpha, ranged from 0.75 to 0.87. Within the Item Response Theory framework, the sO-LIFE provides more information at the medium and high end of the latent trait. The current results provide further evidence in support of the psychometric properties of the sO-LIFE, provide new information about the cross-cultural equivalence of schizotypy, and support the use of this measure to screen for psychotic-like features and liability to psychosis in general population samples from different European countries.
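Reliability was estimated with ordinal alpha; the classical (Cronbach) version of the coefficient is straightforward to sketch, and ordinal alpha applies the same formula to a polychoric correlation matrix rather than raw covariances. Illustrative Likert data, not from the study:

```python
def cronbach_alpha(item_scores):
    """Classical Cronbach's alpha for a scale: item_scores is a list of
    columns, one per item, each holding that item's score for every
    respondent. alpha = k/(k-1) * (1 - sum(item variances)/var(totals))."""
    k = len(item_scores)
    n = len(item_scores[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var = sum(var(col) for col in item_scores)
    totals = [sum(col[i] for col in item_scores) for i in range(n)]
    return k / (k - 1) * (1 - item_var / var(totals))

# Hypothetical 4-item scale answered by six respondents (0-3 Likert)
items = [
    [0, 1, 2, 3, 2, 1],
    [1, 1, 2, 3, 2, 0],
    [0, 2, 2, 3, 1, 1],
    [1, 1, 3, 2, 2, 1],
]
print(round(cronbach_alpha(items), 2))
```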

Relevance:

30.00%

Publisher:

Abstract:

Although brand authenticity is gaining increasing interest in consumer behavior research and managerial practice, literature on its measurement and contribution to branding theory is still limited. This article develops an integrative framework of the concept of brand authenticity and reports the development and validation of a scale measuring consumers' perceived brand authenticity (PBA). A multi-phase scale development process resulted in a 15-item PBA scale measuring four dimensions: credibility, integrity, symbolism, and continuity. This scale is reliable across different brands and cultural contexts. We find that brand authenticity perceptions are influenced by indexical, existential, and iconic cues, whereby some of the latter's influence is moderated by consumers' level of marketing skepticism. Results also suggest that PBA increases emotional brand attachment and word-of-mouth, and that it drives brand choice likelihood through self-congruence for consumers high in self-authenticity.