8 results for Measurement theory
in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast
Abstract:
Since the early 1900s, some psychologists have attempted to establish their discipline as a quantitative science. In using quantitative methods to investigate their theories, they adopted their own special definition of measurement of attributes such as cognitive abilities, as though these were quantities of the type encountered in Newtonian science. Joel Michell has presented a carefully reasoned argument that psychological attributes lack additivity, and therefore cannot be quantities in the same way as the attributes of classical Newtonian physics. In the early decades of the 20th century, quantum theory superseded Newtonian mechanics as the best model of physical reality. This paper gives a brief, critical overview of the evolution of current measurement practices in psychology, and suggests the need for a transition from a Newtonian to a quantum theoretical paradigm for psychological measurement. Finally, a case study is presented that considers the implications of a quantum theoretical model for educational measurement. In particular, it is argued that, since the OECD’s Programme for International Student Assessment (PISA) is predicated on a Newtonian conception of measurement, this may constrain the extent to which it can make accurate comparisons of the achievements of different education systems.
Abstract:
Complex collaboration in rapidly changing business environments creates challenges for management capability in Utility Horizontal Supply Chains (UHSCs), particularly in deploying and evolving performance measures. The aim of the study is twofold. First, to explore how management capability can be developed and used to deploy and evolve Performance Measurement (PM), both across a UHSC and within its constituent organisations, drawing upon a theoretical nexus of Dynamic Capability (DC) theory and complementary Goal Theory. Second, to contribute to knowledge by empirically building theory using these constructs to show the management motivations and behaviours within PM-based DCs. The methodology uses an interpretive, theory-building, multiple-case approach (n=3) within a UHSC. The data collection methods include interviews (n=54), focus groups (n=10), document analysis and participant observation (reflective learning logs) over a five-year period, giving longitudinal data. The empirical findings lead to the development of a conceptual framework showing that management capabilities in driving PM deployment and evolution can be represented as multilevel renewal and incremental Dynamic Capabilities, which can be further understood in terms of motivation and behaviour through Goal-Theoretic constructs. In addition, three interrelated cross-cutting themes of management capabilities were identified: consensus building, goal setting and resource change. These management capabilities require carefully planned development and nurturing within the UHSC.
Abstract:
Purpose: Environmental turbulence, including rapid changes in technology and markets, has resulted in the need for new approaches to performance measurement and benchmarking. There is a need for studies that attempt to measure and benchmark upstream, leading or developmental aspects of organisations. Therefore, the aim of this paper is twofold. The first is to conduct an in-depth case analysis of lead performance measurement and benchmarking, leading to the further development of a conceptual model derived from the extant literature and initial survey data. The second is to outline future research agendas that could further develop the framework and the subject area.
Design/methodology/approach: A multiple case analysis involving repeated in-depth interviews with managers in organisational areas of upstream influence in the case organisations.
Findings: It was found that the effect of external drivers for lead performance measurement and benchmarking was mediated by organisational context factors, such as the level of progression in business improvement methods. Moreover, the business improvement methods used for this purpose, although typical, had been legitimated beyond their original purpose through the development of bespoke sets of lead measures.
Practical implications: Examples of methods and lead measures are given that can be used by organisations in developing a programme of lead performance measurement and benchmarking.
Originality/value: There is a paucity of in-depth studies relating to the theory and practice of lead performance measurement and benchmarking in organisations.
Abstract:
A method is discussed for measuring the acoustic impedance of tubular objects that gives accurate results for a wide range of frequencies. The apparatus that is employed is similar to that used in many previously developed methods; it consists of a cylindrical measurement duct fitted with several microphones, of which two are active in each measurement session, and a driver at one of its ends. The object under study is fitted at the other end. The impedance of the object is determined from the microphone signals obtained during excitation of the air inside the duct by the driver, and from three coefficients that are pre-determined using four calibration measurements with closed cylindrical tubes. The calibration procedure is based on the simple mathematical relationships between the impedances of the calibration tubes, and does not require knowledge of the propagation constant. Measurements with a cylindrical tube yield an estimate of the attenuation constant for plane waves, which is found to differ from the theoretical prediction by less than 1.4% in the frequency range 1 kHz-20 kHz. Impedance measurements of objects with abrupt changes in diameter are found to be in good agreement with multimodal theory.
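The closed cylindrical tubes used here as calibration references have a well-known theoretical input impedance from standard duct acoustics: for a lossless, rigidly closed tube, Z_in = -j Z0 cot(kL), with Z0 = rho*c/S the plane-wave characteristic impedance. The sketch below is an illustration of that textbook relationship only, not the paper's actual calibration code; the function name and parameter values are assumptions.

```python
import numpy as np

def closed_tube_impedance(f, L, radius, c=343.0, rho=1.2):
    """Lossless input impedance of a rigidly closed cylindrical tube:
    Z_in = -j * Z0 / tan(k * L), where Z0 = rho * c / S is the plane-wave
    characteristic impedance and k = 2*pi*f / c the wavenumber.
    (Illustrative only; real calibration must account for losses.)"""
    S = np.pi * radius**2          # duct cross-sectional area
    Z0 = rho * c / S               # characteristic impedance
    k = 2.0 * np.pi * np.asarray(f, dtype=float) / c
    return -1j * Z0 / np.tan(k * L)

# Near the quarter-wave resonance f = c/(4L) the input impedance
# magnitude collapses; well below it, the closed end looks nearly rigid.
f_quarter = 343.0 / (4 * 0.1)      # 0.1 m tube, assumed dimensions
print(abs(closed_tube_impedance(f_quarter, L=0.1, radius=0.005)))
print(abs(closed_tube_impedance(100.0, L=0.1, radius=0.005)))
```

Because the calibration in the paper relates the impedances of several such tubes to each other, the propagation constant itself drops out; the lossless formula above is only the idealised limit of that behaviour.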
Abstract:
Radiotherapy employs ionizing radiation to induce lethal DNA lesions in cancer cells while minimizing damage to healthy tissues. Due to their pattern of energy deposition, better therapeutic outcomes can, in theory, be achieved with ions than with photons. Antiprotons have been proposed to offer a further enhancement due to their annihilation at the end of the path. The work presented here aimed to establish and validate an experimental procedure for the quantification of plasmid and genomic DNA damage resulting from antiproton exposure. Immunocytochemistry was used to assess DNA damage in directly and indirectly exposed human fibroblasts irradiated in both the plateau and Bragg peak regions of a 126 MeV antiproton beam at CERN. Cells were stained post-irradiation with an anti-gamma-H2AX antibody. Quantification of the gamma-H2AX foci-dose relationship is consistent with a linear increase in the Bragg peak region. A qualitative analysis of the foci detected in the Bragg peak and plateau regions indicates significant differences, highlighting the different severity of DNA lesions produced along the particle path. Irradiation of desalted plasmid DNA with 5 Gy of antiprotons at the Bragg peak yielded a significant proportion of linear plasmid in the resulting solution.
Abstract:
Many different immunochemical platforms exist for the screening of naturally occurring contaminants in food, from low-cost enzyme-linked immunosorbent assays (ELISA) to expensive instruments such as optical biosensors based on the phenomenon of surface plasmon resonance (SPR). The primary aim of this study was to evaluate and compare a number of these platforms to assess their accuracy and precision when applied to naturally contaminated samples containing HT-2/T-2 mycotoxins. Other important factors considered were the speed of analysis, ease of use (sample preparation techniques and use of the equipment) and, ultimately, the cost implications. The three screening procedures compared were an SPR biosensor assay, a commercially available ELISA and an enzyme-linked immunomagnetic electrochemical array (ELIME array). The qualitative data for all methods demonstrated very good overall agreement with each other; however, on comparison with mass spectrometry confirmatory results, the ELISA and SPR assay performed slightly better than the ELIME array, exhibiting an overall agreement of 95.8% compared to 91.7%. Currently, SPR is more costly than the other two platforms and can only be used in the laboratory, whereas in theory both the ELISA and ELIME array are portable and can be used in the field, although ultimately this depends on the sample preparation techniques employed. Sample preparation techniques varied across the methods evaluated: the ELISA was the simplest to perform, followed by the SPR method. The ELIME array involved an additional clean-up step, thereby increasing both the time and cost of analysis; therefore, in its current format, field use would not be an option for the ELIME array. In relation to speed of analysis, the ELISA outperformed the other methods.
Abstract:
The outcomes of educational assessments undoubtedly have real implications for students, teachers, schools and education in the widest sense. Assessment results are, for example, used to award qualifications that determine future educational or vocational pathways of students. The results obtained by students in assessments are also used to gauge individual teacher quality, to hold schools to account for the standards achieved by their students, and to compare international education systems. Given the current high-stakes nature of educational assessment, it is imperative that the measurement practices involved have stable philosophical foundations. However, this paper casts doubt on the theoretical underpinnings of contemporary educational measurement models. Aspects of Wittgenstein’s later philosophy and Bohr’s philosophy of quantum theory are used to argue that a quantum theoretical rather than a Newtonian model is appropriate for educational measurement, and the associated implications for the concept of validity are elucidated. Whilst it is acknowledged that the transition to a quantum theoretical framework would not lead to the demise of educational assessment, it is argued that, where practical, current high-stakes assessments should be reformed to become as ‘low-stakes’ as possible. The paper also undermines some of the rhetoric in favour of high-stakes testing that has a tendency to afflict education.
Abstract:
Measurement of the dynamic properties of hydrogen and helium under extreme pressures is a key to understanding the physics of planetary interiors. The inelastic scattering signal from statically compressed hydrogen inside diamond anvil cells at 2.8 GPa and 6.4 GPa was measured at the Diamond Light Source synchrotron facility in the UK. The first direct measurement of the local field correction to the Coulomb interactions in degenerate plasmas was obtained from spectral shifts in the scattering data and compared to predictions by the Utsumi-Ichimaru theory for degenerate electron liquids.