968 results for Implicit calibration
Abstract:
The recent expansion of clinical applications for optical coherence tomography (OCT) is driving the development of approaches for consistent image acquisition. There is a simultaneous need for time-stable, easy-to-use imaging targets for the calibration and standardization of OCT devices. We present calibration targets consisting of three-dimensional structures etched into nanoparticle-embedded resin. Spherical iron oxide nanoparticles with a predominant particle diameter of 400 nm were homogeneously dispersed in a two-part polyurethane resin and allowed to harden overnight. These samples were then etched using a precision micromachining femtosecond laser with a center wavelength of 1026 nm, a repetition rate of 100 kHz and a pulse duration of 450 fs. A series of lines was etched in depth, varying the percentage of inscription energy and the speed of the translation stage moving the target with respect to the laser. Samples were imaged with a dual-wavelength spectral-domain OCT system, and the point-spread function of nanoparticles within the target was measured.
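The abstract does not describe how the point-spread function was quantified; a common approach is to estimate the full width at half maximum (FWHM) of the intensity profile across an isolated sub-resolution nanoparticle. A minimal sketch, with all names and the synthetic data invented for illustration:

```python
import numpy as np

def psf_fwhm(profile, pixel_size_um):
    """Estimate PSF full width at half maximum from a 1-D intensity
    profile across an isolated sub-resolution nanoparticle."""
    profile = np.asarray(profile, dtype=float)
    profile = profile - profile.min()        # remove the background floor
    half = profile.max() / 2.0
    above = np.nonzero(profile >= half)[0]   # samples at or above half maximum
    left, right = above[0], above[-1]

    def crossing(i):
        # linearly interpolate the half-maximum crossing between i and i + 1
        return i + (half - profile[i]) / (profile[i + 1] - profile[i])

    x_left = crossing(left - 1) if left > 0 else float(left)
    x_right = crossing(right) if right < len(profile) - 1 else float(right)
    return (x_right - x_left) * pixel_size_um

# synthetic Gaussian peak: sigma = 2 px at 1 um/px, so FWHM ~ 2.355 * 2 um
x = np.arange(41)
profile = np.exp(-0.5 * ((x - 20.0) / 2.0) ** 2)
fwhm = psf_fwhm(profile, pixel_size_um=1.0)
```

In practice such profiles would be extracted along the axial and lateral directions separately, since OCT resolution differs between the two.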
Abstract:
Leadership categorisation theory suggests that followers rely on a hierarchical cognitive structure in perceiving leaders and the leadership process, consisting of three levels: superordinate, basic and subordinate. The predominant view is that followers rely on Implicit Leadership Theories (ILTs) at the basic level in making judgments about managers. The thesis examines whether this presumption holds by proposing and testing two competing conceptualisations: the congruence between basic-level ILTs (general leader) and perceptions of the actual manager, and between subordinate-level ILTs (job-specific leader) and the actual manager. The conceptualisation at the job-specific level builds on context-related assertions of the ILT explanatory models: leadership categorisation, information processing and connectionist network theories. Further, the thesis addresses the effects of ILT congruence at the group level. The hypothesised model suggests that Leader-Member Exchange (LMX) acts as a mediator between ILT congruence and outcomes. Three studies examined the proposed model. The first was cross-sectional, with 175 students reporting on work experience during a 1-year industrial placement. The second was longitudinal, with a sample of 343 students engaging in a business simulation in groups with formal leadership. The final study was a cross-sectional survey across several organisations with a sample of 178. A novel approach was taken to congruence analysis: the hypothesised models were tested using Latent Congruence Modelling (LCM), which accounts for measurement error and overcomes most limitations of traditional approaches.
The first two studies confirm the traditionally theorised view that employees rely on basic-level ILTs in making judgments about their managers, with important implications, and show that LMX mediates the relationship between ILT congruence and work-related outcomes (performance, job satisfaction, well-being, task satisfaction, intragroup conflict, group satisfaction, team realness, team-member exchange, group performance). The third study confirms this with conflict, well-being, self-rated performance and commitment as outcomes.
Abstract:
Insights from the stream of research on knowledge calibration, which refers to the correspondence between accuracy and confidence in knowledge, enable a better understanding of consequences of inaccurate perceptions of managers. This paper examines the consequences of inaccurate managerial knowledge through the lens of knowledge calibration. Specifically, the paper examines the antecedent role of miscalibration of knowledge in strategy formation. It is postulated that miscalibrated managers who overestimate external factors and display a high level of confidence in their estimates are likely to enact strategies that are relatively more evolutionary and incremental in nature, whereas miscalibrated managers who overestimate internal factors and display a high level of confidence in their estimates are likely to enact strategies that are relatively more discontinuous and disruptive in nature. Perspectives from social cognitive theory provide support for the underlying processes. The paper, in part, explains the paradox of the prevalence of inaccurate managerial perceptions and efficacious performance. It also advances the literature on strategy formation through the application of the construct of knowledge calibration.
Abstract:
Calibration of consumer knowledge of the web refers to the correspondence between accuracy and confidence in knowledge of the web. Being well calibrated means that a person is realistic in assessing the level of knowledge he or she possesses. This study finds that involvement leads to better calibration and that calibration is higher for procedural knowledge and common knowledge than for declarative knowledge and specialized knowledge. Neither usage nor experience has any effect on calibration of knowledge of the web, and no difference in calibration is observed between genders. In agreement with previous findings, however, the study finds that males are more confident in their knowledge of the web. The results indicate that calibration may be more a function of knowledge-specific factors than of individual-specific factors. The study also identifies flow and frustration with the web as consequences of calibration of knowledge of the web and draws the attention of future researchers to these aspects.
Abstract:
For over 30 years, information-processing approaches to leadership, and more specifically Implicit Leadership Theories (ILTs) research, have contributed a significant body of knowledge on leadership processes in applied settings. A new line of research on Implicit Followership Theories (IFTs) has re-ignited interest in information-processing and socio-cognitive approaches to leadership and followership. In this review, we focus on organizational research on ILTs and IFTs and highlight their practical utility for the exercise of leadership and followership in applied settings. We clarify common misperceptions regarding the implicit nature of ILTs and IFTs, review both direct and indirect measures, synthesize current and ongoing research on ILTs and IFTs in organizational settings, address issues related to different levels of analysis in the context of leadership and follower schemas and, finally, propose future avenues for organizational research. © 2013 Elsevier Inc.
Abstract:
The article explores the possibilities of formalizing and explaining the mechanisms that support spatial and social perspective alignment sustained over the duration of a social interaction. The basic proposed principle is that in social contexts the mechanisms for sensorimotor transformations and multisensory integration (learn to) incorporate information relative to the other actor(s), similar to the "re-calibration" of visual receptive fields in response to repeated tool use. This process aligns or merges the co-actors' spatial representations and creates a "Shared Action Space" (SAS) supporting key computations of social interactions and joint actions; for example, the remapping between the coordinate systems and frames of reference of the co-actors, including perspective taking, the sensorimotor transformations required for jointly lifting an object, and the predictions of the sensory effects of such joint action. The social re-calibration is proposed to be based on common basis function maps (BFMs) and could constitute an optimal solution to sensorimotor transformation and multisensory integration in joint action or, more generally, in social interaction contexts. However, certain situations, such as discrepant postural and viewpoint alignment and associated differences in perspectives between the co-actors, could constrain the process quite differently. We discuss how alignment is achieved in the first place, and how it is maintained over time, providing a taxonomy of various forms and mechanisms of space alignment and overlap based, for instance, on automaticity vs. control of the transformations between the two agents. Finally, we discuss the link between low-level mechanisms for the sharing of space and high-level mechanisms for the sharing of cognitive representations. © 2013 Pezzulo, Iodice, Ferraina and Kessler.
Abstract:
Due to copyright restrictions, this item is only available for consultation at Aston University Library and Information Services, with prior arrangement.
Abstract:
After exogenously cueing attention to a peripheral location, the return of attention and response to the location can be inhibited. We demonstrate that these inhibitory mechanisms of attention can be associated with objects and can be automatically and implicitly retrieved over relatively long periods. Furthermore, we also show that when face stimuli are associated with inhibition, the effect is more robust for faces presented in the left visual field. This effect can be even more spatially specific, where most robust inhibition is obtained for faces presented in the upper as compared to the lower visual field. Finally, it is revealed that the inhibition is associated with an object’s identity, as inhibition moves with an object to a new location; and that the retrieved inhibition is only transiently present after retrieval.
Abstract:
The study examined the effect of the range of a confidence scale on consumer knowledge calibration, specifically whether a restricted-range scale (25%-100%) leads to differences in calibration compared to a full-range scale (0%-100%) for multiple-choice questions. A quasi-experimental study using student participants (N = 434) was employed. Data were collected from two samples: in the first sample (N = 167) a full-range confidence scale was used, and in the second sample (N = 267) a restricted-range scale was used. No differences were found between the two scales on knowledge calibration. Results from studies of knowledge calibration employing restricted-range and full-range confidence scales are thus comparable. © Psychological Reports 2014.
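The paper's exact calibration measure is not reproduced in the abstract; a standard operationalization in this literature is the over/underconfidence score, the mean stated confidence minus the proportion of items answered correctly. A minimal sketch with invented data:

```python
def overconfidence(confidences, correct):
    """Mean stated confidence minus proportion correct.
    Positive values indicate overconfidence, negative values
    underconfidence, and zero perfect calibration in this simple sense."""
    n = len(confidences)
    return sum(confidences) / n - sum(correct) / n

# restricted-range scale: 25%-100% responses mapped onto [0, 1]
conf = [c / 100.0 for c in [90, 70, 60, 80]]  # per-question confidence
hits = [1, 0, 1, 0]                            # graded answers (1 = correct)
bias = overconfidence(conf, hits)              # 0.75 - 0.5 = 0.25
```

On four-option multiple-choice items, 25% is the chance level, which is why a restricted 25%-100% scale is a defensible design choice in the first place.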
Abstract:
This thesis describes advances in the characterisation, calibration and data processing of optical coherence tomography (OCT) systems. Femtosecond (fs) laser inscription was used for producing OCT-phantoms. Transparent materials are generally inert to infrared radiation, but with fs lasers material modification occurs via non-linear processes when the highly focused light source interacts with the material. This modification is confined to the focal volume and is highly reproducible. In order to select the best inscription parameters, combinations of different inscription parameters were tested, using three fs laser systems with different operating properties, on a variety of materials. This facilitated the understanding of the key characteristics of the produced structures, with the aim of producing viable OCT-phantoms. Finally, OCT-phantoms were successfully designed and fabricated in fused silica. The use of these phantoms to characterise many properties (resolution, distortion, sensitivity decay, scan linearity) of an OCT system was demonstrated. Quantitative methods were developed to support the characterisation of an OCT system collecting images from phantoms and also to improve the quality of the OCT images. Characterisation methods include the measurement of the spatially variant resolution (point spread function (PSF) and modulation transfer function (MTF)), sensitivity and distortion. Processing of OCT data is a computationally intensive process. Standard central processing unit (CPU) based processing might take several minutes to a few hours to process acquired data, so data processing is a significant bottleneck. An alternative is to use expensive hardware-based processing such as field programmable gate arrays (FPGAs). However, graphics processing unit (GPU) based data processing methods have recently been developed to minimize this data processing and rendering time.
These processing techniques include standard-processing methods, a set of algorithms that process the raw interference data obtained by the detector and generate A-scans. The work presented here describes accelerated data processing and post-processing techniques for OCT systems. The GPU-based processing developed during the PhD was later implemented into a custom-built Fourier domain optical coherence tomography (FD-OCT) system. This system currently processes and renders data in real time; its processing throughput is currently limited by the camera capture rate. OCT-phantoms have been heavily used for the qualitative characterization and adjustment/fine-tuning of the operating conditions of the OCT system, and investigations are under way to characterize OCT systems using our phantoms. The work presented in this thesis demonstrates several novel techniques for fabricating OCT-phantoms and accelerating OCT data processing using GPUs. In the process of developing phantoms and quantitative methods, a thorough understanding and practical knowledge of OCT and fs laser processing systems was developed. This understanding led to several novel pieces of research that are not only relevant to OCT but have broader importance. For example, extensive understanding of the properties of fs-inscribed structures will be useful in other photonic applications such as the making of phase masks, waveguides and microfluidic channels. Acceleration of data processing with GPUs is also useful in other fields.
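The thesis's GPU pipeline is not reproduced here, but the standard-processing chain it accelerates, turning each spectral interferogram into an A-scan via background subtraction, windowing and a Fourier transform, can be sketched on the CPU with NumPy. Function and variable names are illustrative, and the input is assumed to be already resampled so that it is linear in wavenumber k:

```python
import numpy as np

def spectrum_to_ascan(fringe, background):
    """One step of standard FD-OCT processing: convert a spectral
    interferogram (assumed linear in wavenumber k) into a log-scaled A-scan."""
    signal = fringe - background                  # remove the DC / reference term
    signal = signal * np.hanning(len(signal))     # window to suppress side lobes
    depth_profile = np.fft.ifft(signal)           # k-space -> depth
    half = len(depth_profile) // 2                # keep one mirror-image half
    return 20.0 * np.log10(np.abs(depth_profile[:half]) + 1e-12)

# synthetic fringe from a single reflector: a cosine in k
n = 2048
k = np.arange(n)
background = np.ones(n)
fringe = background + 0.5 * np.cos(2.0 * np.pi * 300.0 * k / n)
ascan = spectrum_to_ascan(fringe, background)
peak_bin = int(np.argmax(ascan))   # the reflector appears near depth bin 300
```

The real-time gain in the thesis comes from running these same per-spectrum steps in parallel on the GPU, one thread block per A-scan, rather than from changing the algorithm itself.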
Abstract:
Nanoindentation has become a common technique for measuring the hardness and elastic-plastic properties of materials, including coatings and thin films. In recent years, different nanoindenter instruments have been commercialised and used for this purpose. Each instrument is equipped with its own analysis software for the derivation of the hardness and reduced Young's modulus from the raw data, mostly analysed through the Oliver and Pharr method. In all cases, the calibration of compliance and area function is mandatory. The present work illustrates and describes a calibration procedure and an approach to raw data analysis carried out for six different nanoindentation instruments through several round-robin experiments. Three different indenters were used (Berkovich, cube corner, spherical) and three standardised reference samples were chosen (hard fused quartz, soft polycarbonate, and sapphire). It was clearly shown that the use of these common procedures consistently limited the spread of the hardness and reduced Young's modulus data compared to the same measurements performed using instrument-specific procedures. The following recommendations for nanoindentation calibration must be followed: (a) use only sharp indenters, (b) set an upper cut-off value for the penetration depth below which measurements must be considered unreliable, (c) perform nanoindentation measurements with limited thermal drift, (d) ensure that the load-displacement curves are as smooth as possible, (e) perform stiffness measurements specific to each instrument/indenter couple, (f) use fused quartz (Fq) and sapphire (Sa) as calibration reference samples for stiffness and area function determination, (g) use a function, rather than a single value, for the stiffness and (h) adopt a unique protocol and software for raw data analysis in order to limit the data spread related to the instruments (i.e. the level of drift or noise, defects of a given probe) and to make the H and Er data intercomparable. © 2011 Elsevier Ltd.
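The abstract names the Oliver and Pharr method without giving its formulas; the core relations are standard: contact depth h_c = h_max - ε·P_max/S, hardness H = P_max/A(h_c), and reduced modulus E_r = (√π/2)·S/√A. A sketch using the ideal Berkovich area function A(h_c) = 24.5·h_c², with the input values invented for illustration (they are not data from the round-robin):

```python
import math

def oliver_pharr(p_max_mN, stiffness_mN_per_um, h_max_um,
                 frame_compliance_um_per_mN=0.0, epsilon=0.75):
    """Oliver-Pharr analysis with the ideal Berkovich area function
    A(hc) = 24.5 * hc**2; loads in mN and depths in um give H and Er in GPa."""
    # correct the measured contact stiffness for load-frame compliance
    s = 1.0 / (1.0 / stiffness_mN_per_um - frame_compliance_um_per_mN)
    h_c = h_max_um - epsilon * p_max_mN / s                  # contact depth, um
    area = 24.5 * h_c ** 2                                   # projected area, um^2
    hardness = p_max_mN / area                               # GPa
    e_r = (math.sqrt(math.pi) / 2.0) * s / math.sqrt(area)   # GPa
    return hardness, e_r

# illustrative values for a single indent
H, Er = oliver_pharr(p_max_mN=10.0, stiffness_mN_per_um=80.0, h_max_um=0.35)
```

Recommendations (e)-(g) above correspond to replacing the zero frame compliance and the ideal 24.5·h_c² area function here with per-instrument calibrated values, with the area function fitted on the Fq and Sa reference samples.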
Abstract:
This study extends previous research on intervertebral motion registration by means of 2D dynamic fluoroscopy to obtain a more comprehensive 3D description of vertebral kinematics. The problem of estimating the 3D rigid pose of a CT volume of a vertebra from its 2D X-ray fluoroscopy projection is addressed. 2D-3D registration is obtained by maximising a measure of similarity between Digitally Reconstructed Radiographs (obtained from the CT volume) and the real fluoroscopic projection. X-ray energy correction was performed. To assess the method, a calibration model was realised: a dry sheep vertebra was rigidly fixed to a frame of reference including metallic markers. Accurate measurement of 3D orientation was obtained via single-camera calibration of the markers and held as the true 3D vertebra position; the 3D pose of the vertebra was then estimated and the results compared. Error analysis revealed accuracy of the order of 0.1 degree for the rotation angles, of about 1 mm for displacements parallel to the fluoroscopic plane, and of the order of 10 mm for the orthogonal displacement. © 2010 P. Bifulco et al.
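The abstract does not say which similarity measure is maximised; normalized cross-correlation (NCC) is a typical choice for comparing a DRR against a fluoroscopic projection, since it is invariant to linear intensity changes between the two modalities. A sketch with random stand-in images (names are illustrative, not from the paper):

```python
import numpy as np

def ncc(drr, fluoro):
    """Normalized cross-correlation between a digitally reconstructed
    radiograph and a fluoroscopic projection of the same shape;
    1.0 is a perfect match, -1.0 a perfect contrast inversion."""
    a = drr - drr.mean()
    b = fluoro - fluoro.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

rng = np.random.default_rng(0)
image = rng.random((64, 64))
score_same = ncc(image, image)        # identical projections
score_inverted = ncc(image, -image)   # contrast-inverted projection
```

In a registration loop this score would be evaluated for each candidate 3D pose, with the DRR re-rendered from the CT volume at every iteration and the pose updated by a generic optimiser.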
Abstract:
2000 Mathematics Subject Classification: 54H25, 47H10.