988 results for Fantôme de calibration


Relevance: 20.00%

Publisher:

Abstract:

Insights from the stream of research on knowledge calibration, which refers to the correspondence between accuracy and confidence in knowledge, enable a better understanding of the consequences of managers' inaccurate perceptions. This paper examines the consequences of inaccurate managerial knowledge through the lens of knowledge calibration. Specifically, it examines the antecedent role of knowledge miscalibration in strategy formation. It is postulated that miscalibrated managers who overestimate external factors, and are highly confident in their estimates, are likely to enact strategies that are relatively more evolutionary and incremental, whereas miscalibrated managers who overestimate internal factors, and are highly confident in their estimates, are likely to enact strategies that are relatively more discontinuous and disruptive. Perspectives from social cognitive theory support the proposed underlying processes. The paper partly explains the paradox of the prevalence of inaccurate managerial perceptions alongside efficacious performance, and it advances the literature on strategy formation through the application of the knowledge-calibration construct.

Relevance: 20.00%

Publisher:

Abstract:

Calibration of consumer knowledge of the web refers to the correspondence between accuracy and confidence in knowledge of the web. Being well calibrated means that a person is realistic in assessing the level of knowledge he or she possesses. This study finds that involvement leads to better calibration, and that calibration is higher for procedural and common knowledge than for declarative and specialized knowledge. Neither usage nor experience has any effect on calibration of knowledge of the web, and no difference in calibration is observed between genders; in agreement with previous findings, however, males are more confident in their knowledge of the web. The results indicate that calibration may be more a function of knowledge-specific factors than of individual-specific factors. The study also identifies flow and frustration with the web as consequences of calibration of knowledge of the web and draws the attention of future researchers to these aspects.

Relevance: 20.00%

Publisher:

Abstract:

The article explores the possibilities of formalizing and explaining the mechanisms that support spatial and social perspective alignment sustained over the duration of a social interaction. The basic proposed principle is that in social contexts the mechanisms for sensorimotor transformations and multisensory integration (learn to) incorporate information relative to the other actor(s), similar to the "re-calibration" of visual receptive fields in response to repeated tool use. This process aligns or merges the co-actors' spatial representations and creates a "Shared Action Space" (SAS) supporting key computations of social interactions and joint actions: for example, the remapping between the coordinate systems and frames of reference of the co-actors, including perspective taking; the sensorimotor transformations required for jointly lifting an object; and the prediction of the sensory effects of such joint action. The social re-calibration is proposed to be based on common basis function maps (BFMs) and could constitute an optimal solution to sensorimotor transformation and multisensory integration in joint action or, more generally, in social interaction contexts. However, certain situations, such as discrepant postural and viewpoint alignment and the associated differences in perspective between the co-actors, could constrain the process quite differently. We discuss how alignment is achieved in the first place and how it is maintained over time, providing a taxonomy of forms and mechanisms of space alignment and overlap based, for instance, on the automaticity vs. control of the transformations between the two agents. Finally, we discuss the link between low-level mechanisms for the sharing of space and high-level mechanisms for the sharing of cognitive representations. © 2013 Pezzulo, Iodice, Ferraina and Kessler.
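The remapping between the co-actors' frames of reference can be illustrated with plain rigid-body transforms — a 2D stand-in for the basis-function maps the article proposes, with poses and the example point invented for illustration:

```python
import numpy as np

def pose_matrix(x, y, theta):
    """Homogeneous 2D rigid transform: actor's egocentric frame -> shared (world) frame."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0,  0, 1]])

def remap(point_in_a, pose_a, pose_b):
    """Express a point given in actor A's frame in actor B's frame,
    going through the shared space: A frame -> world -> B frame."""
    p = np.array([point_in_a[0], point_in_a[1], 1.0])
    world = pose_a @ p
    return (np.linalg.inv(pose_b) @ world)[:2]

# Two co-actors facing each other across a table, 2 m apart.
a = pose_matrix(0.0, 0.0, 0.0)    # actor A at the origin, facing +x
b = pose_matrix(2.0, 0.0, np.pi)  # actor B opposite, facing -x
# An object 1 m in front of A is also 1 m in front of B.
print(remap((1.0, 0.0), a, b))    # ≈ (1, 0) in B's egocentric frame
```

The same matrix machinery extends to 3D and to the automatic-vs.-controlled transformations the taxonomy distinguishes.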

Relevance: 20.00%

Publisher:

Abstract:

The recent expansion of clinical applications for optical coherence tomography (OCT) is driving the development of approaches for consistent image acquisition, and with it a need for time-stable, easy-to-use imaging targets for the calibration and standardization of OCT devices. We present calibration targets consisting of three-dimensional structures etched into nanoparticle-embedded resin. Spherical iron oxide nanoparticles with a predominant particle diameter of 400 nm were homogeneously dispersed in a two-part polyurethane resin and allowed to harden overnight. These samples were then etched using a precision micromachining femtosecond laser with a center wavelength of 1026 nm, a 100 kHz repetition rate and a 450 fs pulse duration. A series of lines was etched at varying depths, varying the percentage of inscription energy and the speed of the translation stage moving the target with respect to the laser. The samples were imaged with a dual-wavelength spectral-domain OCT system, and the point-spread function of nanoparticles within the target was measured.
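A typical use of an isolated sub-resolution reflector in such a target is estimating the system's point-spread function from an A-scan. A minimal sketch, assuming an isolated Gaussian-shaped peak; the synthetic signal and its 8 µm FWHM are invented for illustration:

```python
import numpy as np

def psf_fwhm(depth_um, intensity):
    """Estimate the FWHM of an isolated point-reflector peak by linear
    interpolation at the two half-maximum crossings (no fitting library)."""
    baseline = intensity.min()
    half = baseline + (intensity.max() - baseline) / 2.0
    idx = np.flatnonzero(intensity >= half)
    lo, hi = idx[0], idx[-1]

    def cross(j, k):  # interpolate the depth where the curve crosses `half`
        return depth_um[j] + (half - intensity[j]) * \
            (depth_um[k] - depth_um[j]) / (intensity[k] - intensity[j])

    return cross(hi, hi + 1) - cross(lo - 1, lo)

# Synthetic A-scan: a Gaussian peak with a known 8 um FWHM at 50 um depth.
z = np.linspace(0.0, 100.0, 2001)
sigma = 8.0 / (2.0 * np.sqrt(2.0 * np.log(2.0)))
a_scan = np.exp(-(z - 50.0) ** 2 / (2.0 * sigma ** 2))
print(round(psf_fwhm(z, a_scan), 2))  # ≈ 8.0
```

Repeating this for nanoparticles at different depths and lateral positions maps the spatially variant resolution of the instrument.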

Relevance: 20.00%

Publisher:

Abstract:

Due to copyright restrictions, this item is only available for consultation at Aston University Library and Information Services, with prior arrangement.

Relevance: 20.00%

Publisher:

Abstract:

Due to copyright restrictions, this item is only available for consultation at Aston University Library and Information Services, with prior arrangement.

Relevance: 20.00%

Publisher:

Abstract:

The study examined the effect of the range of a confidence scale on consumer knowledge calibration: specifically, whether a restricted-range scale (25%–100%) leads to differences in calibration compared with a full-range scale (0%–100%) for multiple-choice questions. A quasi-experimental study using student participants (N = 434) was employed. Data were collected from two samples: in the first (N = 167) a full-range confidence scale was used, and in the second (N = 267) a restricted-range scale was used. No differences in knowledge calibration were found between the two scales. Results from studies of knowledge calibration employing restricted-range and full-range confidence scales are thus comparable. © Psychological Reports 2014.
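Calibration in this literature is commonly scored as mean stated confidence minus proportion correct, with positive values indicating overconfidence. A minimal sketch of that score; the responses below are invented, and on a four-alternative item chance accuracy is 25%, which is what motivates the restricted 25%–100% scale:

```python
import numpy as np

def overconfidence(confidence, correct):
    """Calibration score: mean stated confidence minus proportion correct.
    Positive -> overconfident; zero -> well calibrated."""
    confidence = np.asarray(confidence, dtype=float)
    correct = np.asarray(correct, dtype=float)
    return float(confidence.mean() - correct.mean())

# Invented answers to four-alternative questions, confidence on a 0-1 scale.
conf = [0.90, 0.60, 0.40, 0.80]   # stated confidence per question
hits = [1, 1, 0, 0]               # 1 = answered correctly
print(overconfidence(conf, hits))  # ≈ 0.175, i.e. overconfident
```

Because the score is a difference of two means, shifting the admissible confidence range changes the inputs but not the scoring rule, which is why results from the two scale formats can be compared directly.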

Relevance: 20.00%

Publisher:

Abstract:

This thesis describes advances in the characterisation, calibration and data processing of optical coherence tomography (OCT) systems. Femtosecond (fs) laser inscription was used to produce OCT phantoms. Transparent materials are generally inert to infrared radiation, but with fs lasers material modification occurs via non-linear processes when the highly focused light source interacts with the material. This modification is confined to the focal volume and is highly reproducible. To select the best inscription parameters, combinations of different inscription parameters were tested, using three fs laser systems with different operating properties, on a variety of materials. This facilitated an understanding of the key characteristics of the produced structures, with the aim of producing viable OCT phantoms. OCT phantoms were then successfully designed and fabricated in fused silica, and their use to characterise several properties of an OCT system (resolution, distortion, sensitivity decay, scan linearity) was demonstrated.

Quantitative methods were developed to support the characterisation of an OCT system from phantom images and to improve the quality of the OCT images. The characterisation methods include measurement of the spatially variant resolution (point spread function (PSF) and modulation transfer function (MTF)), sensitivity and distortion.

Processing of OCT data is computationally intensive: standard central processing unit (CPU) based processing can take several minutes to a few hours, so data processing is a significant bottleneck. One alternative is expensive hardware-based processing such as field programmable gate arrays (FPGAs); more recently, however, graphics processing unit (GPU) based methods have been developed to minimise processing and rendering time. These include standard-processing methods — a set of algorithms that process the raw interference data recorded by the detector and generate A-scans. The work presented here describes accelerated data processing and post-processing techniques for OCT systems. The GPU-based processing developed during the PhD was later implemented in a custom-built Fourier-domain optical coherence tomography (FD-OCT) system, which now processes and renders data in real time; its throughput is currently limited by the camera capture rate.

The OCT phantoms have been used extensively for the qualitative characterisation and fine-tuning of the operating conditions of the OCT system, and investigations are under way to characterise OCT systems using them. The work presented in this thesis demonstrates several novel techniques for fabricating OCT phantoms and for accelerating OCT data processing using GPUs. In the process of developing the phantoms and quantitative methods, a thorough understanding and practical knowledge of OCT and fs laser processing systems was developed. This understanding led to several pieces of research of broader importance: extensive understanding of the properties of fs-inscribed structures will be useful in other photonic applications, such as the fabrication of phase masks, waveguides and microfluidic channels, and the acceleration of data processing with GPUs is useful in other fields.
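The "standard processing" chain (raw spectral interferogram → A-scan) can be sketched in a few lines. This is a hedged NumPy stand-in, not the thesis's implementation: sizes and the synthetic single-reflector interferogram are invented, and the same array code would run on a GPU by swapping NumPy for a drop-in GPU array library such as CuPy.

```python
import numpy as np

def spectra_to_ascans(raw):
    """Minimal FD-OCT standard-processing sketch: remove the DC offset,
    apodise with a Hann window, and FFT each spectral interferogram into
    a depth profile (A-scan), returned in dB."""
    raw = np.asarray(raw, dtype=float)        # shape: (n_ascans, n_pixels)
    window = np.hanning(raw.shape[1])
    dc_removed = raw - raw.mean(axis=1, keepdims=True)
    depth = np.fft.fft(dc_removed * window, axis=1)
    half = raw.shape[1] // 2                  # keep positive depths only
    return 20 * np.log10(np.abs(depth[:, :half]) + 1e-12)

# Synthetic interferogram: one reflector -> one fringe frequency -> one peak.
k = np.arange(1024)
spectrum = 1000 + 50 * np.cos(2 * np.pi * 40 * k / 1024)  # reflector at bin 40
frames = np.tile(spectrum, (8, 1))                        # 8 identical A-scans
ascans = spectra_to_ascans(frames)
print(int(np.argmax(ascans[0])))  # → 40
```

Real pipelines add steps this sketch omits — wavelength-to-wavenumber resampling, dispersion compensation and rendering — which is precisely where GPU acceleration pays off.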

Relevance: 20.00%

Publisher:

Abstract:

Nanoindentation has become a common technique for measuring the hardness and elastic-plastic properties of materials, including coatings and thin films. In recent years, different nanoindenter instruments have been commercialised and used for this purpose. Each instrument is supplied with its own analysis software for deriving the hardness and reduced Young's modulus from the raw data, mostly via the Oliver and Pharr method. In all cases, calibration of the compliance and of the area function is mandatory. The present work describes a calibration procedure and an approach to raw-data analysis carried out on six different nanoindentation instruments through several round-robin experiments. Three different indenters were used (Berkovich, cube corner, spherical), and three standardised reference samples were chosen (hard fused quartz, soft polycarbonate, and sapphire). Use of these common procedures consistently reduced the spread of the hardness and reduced Young's modulus data compared with the same measurements performed using instrument-specific procedures. The following recommendations for nanoindentation calibration must be followed: (a) use only sharp indenters; (b) set an upper cut-off value for the penetration depth below which measurements must be considered unreliable; (c) perform nanoindentation measurements with limited thermal drift; (d) ensure that the load-displacement curves are as smooth as possible; (e) perform stiffness measurements specific to each instrument/indenter couple; (f) use fused quartz (Fq) and sapphire (Sa) as calibration reference samples for stiffness and area-function determination; (g) use a function, rather than a single value, for the stiffness; and (h) adopt a unique protocol and software for raw-data analysis, in order to limit the data spread related to the instruments (i.e. the level of drift or noise, defects of a given probe) and to make the H and E_r data intercomparable.
© 2011 Elsevier Ltd.
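At its core, the Oliver and Pharr analysis reduces to three relations: contact depth h_c = h_max − ε·P_max/S, hardness H = P_max/A(h_c), and reduced modulus E_r = √π·S/(2β√A). A minimal sketch, with the caveat that the numbers are illustrative and that a real calibration replaces the ideal Berkovich area function with the instrument-specific fitted one:

```python
import math

def oliver_pharr(p_max, h_max, stiffness, area_fn, beta=1.05, epsilon=0.75):
    """Oliver & Pharr analysis of an indentation unloading curve (sketch).
    p_max: peak load; h_max: depth at peak load; stiffness: S = dP/dh at
    the start of unload; area_fn: contact depth -> projected contact area.
    Units must be consistent (e.g. mN and nm)."""
    h_c = h_max - epsilon * p_max / stiffness   # contact depth
    area = area_fn(h_c)
    hardness = p_max / area                     # H = P_max / A(h_c)
    e_r = math.sqrt(math.pi) * stiffness / (2.0 * beta * math.sqrt(area))
    return hardness, e_r

# Ideal Berkovich area function A(hc) = 24.5 * hc^2; all inputs invented.
H, E_r = oliver_pharr(p_max=10.0, h_max=200.0, stiffness=0.5,
                      area_fn=lambda hc: 24.5 * hc * hc)
```

Recommendation (g) above corresponds to `stiffness` being a function of depth rather than the single value used in this sketch.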

Relevance: 20.00%

Publisher:

Abstract:

This study extends previous research on intervertebral motion registration by means of 2D dynamic fluoroscopy to obtain a more comprehensive 3D description of vertebral kinematics. It addresses the problem of estimating the 3D rigid pose of a CT volume of a vertebra from its 2D X-ray fluoroscopic projection. 2D-3D registration is obtained by maximising a measure of similarity between Digitally Reconstructed Radiographs (obtained from the CT volume) and the real fluoroscopic projection; X-ray energy correction was performed. To assess the method, a calibration model was realised: a dry sheep vertebra was rigidly fixed to a frame of reference that included metallic markers. An accurate measurement of 3D orientation was obtained via single-camera calibration of the markers and held as the true 3D vertebra position; the vertebra's 3D pose was then estimated and the results compared. Error analysis revealed an accuracy of the order of 0.1 degree for the rotation angles, of about 1 mm for displacements parallel to the fluoroscopic plane, and of the order of 10 mm for the orthogonal displacement. © 2010 P. Bifulco et al.
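The similarity-maximisation step can be illustrated with a toy version of the pipeline: a parallel projection of a point cloud stands in for DRR generation from the CT volume, normalised cross-correlation is the similarity measure, and a brute-force search over one rotation angle stands in for the full 6-DOF optimisation. All shapes, poses and grid sizes are invented:

```python
import numpy as np

def ncc(a, b):
    """Normalised cross-correlation between two images (higher = more similar)."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float((a * b).mean())

def project(points, angle_deg):
    """Toy DRR: rotate a 3D point cloud about z, then collapse it onto a
    2D grid (parallel projection stands in for the real cone-beam geometry)."""
    t = np.deg2rad(angle_deg)
    rot = np.array([[np.cos(t), -np.sin(t), 0.0],
                    [np.sin(t),  np.cos(t), 0.0],
                    [0.0, 0.0, 1.0]])
    pts = points @ rot.T
    img, _, _ = np.histogram2d(pts[:, 1], pts[:, 2],
                               bins=32, range=[[-2, 2], [-2, 2]])
    return img

# Registration = find the rotation that best matches the "fluoroscopy".
rng = np.random.default_rng(1)
vertebra = rng.normal(size=(4000, 3)) * [1.0, 0.3, 0.5]  # anisotropic cloud
fluoro = project(vertebra, 20.0)                         # ground truth: 20 deg
angles = np.arange(0.0, 90.0, 2.0)
best = angles[np.argmax([ncc(project(vertebra, a), fluoro) for a in angles])]
print(best)  # → 20.0
```

A real implementation searches all six pose parameters with a continuous optimiser rather than a grid, but the objective — similarity between the DRR and the fluoroscopic image — is the same.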

Relevance: 20.00%

Publisher:

Abstract:

Most pavement design procedures incorporate reliability to account for the effect of uncertainty and variability in the design inputs on predicted performance. The load and resistance factor design (LRFD) procedure, which delivers an economical section while considering the variability of each design input separately, has been recognised as an effective tool for incorporating reliability into design procedures. This paper presents a new reliability-based calibration, in LRFD format, for a mechanics-based fatigue cracking analysis framework. It employs a two-component reliability analysis methodology that combines a central composite design-based response surface approach with a first-order reliability method. The calibration was achieved using a number of field pavement sections with well-documented performance histories and high-quality field and laboratory data. The effectiveness of the developed LRFD procedure was evaluated by performing pavement designs for various target reliabilities and design conditions. The results show excellent agreement between the target and actual reliabilities; they also make clear that more design features need to be included in the reliability calibration to minimise the deviation of the actual reliability from the target.
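The first-order reliability method summarises failure probability through the reliability index β, with P_f = Φ(−β); target-versus-actual comparisons like the one above are comparisons of these quantities. A minimal standard-library sketch (the 95% target is illustrative):

```python
import math

def failure_probability(beta):
    """FORM relation P_f = Phi(-beta): the standard normal tail below -beta."""
    return 0.5 * math.erfc(beta / math.sqrt(2.0))

def reliability_index(p_f):
    """Invert P_f = Phi(-beta) by bisection (no external libraries needed)."""
    lo, hi = -10.0, 10.0
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if failure_probability(mid) > p_f:
            lo = mid   # P_f too high -> need a larger beta
        else:
            hi = mid
    return (lo + hi) / 2.0

# A target reliability of 95% (P_f = 5%) corresponds to beta ≈ 1.645.
print(round(reliability_index(0.05), 3))  # → 1.645
```

In an LRFD calibration, the load and resistance factors are tuned so that designs produced with them achieve a β close to the target value across the design conditions of interest.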

Relevance: 20.00%

Publisher:

Abstract:

Auditor decisions regarding the causes of accounting misstatements can affect audit effectiveness and efficiency. Specifically, overconfidence in one's decision can lead to an ineffective audit, whereas underconfidence can lead to an inefficient audit. This dissertation explored the implications of providing various types of information cues to decision-makers in an analytical-procedure task and investigated the relationship between different types of evidence cues (confirming, disconfirming, redundant or non-redundant) and the reduction in calibration bias. Data were collected in a laboratory experiment with 45 accounting student participants, and the research questions were analysed using a 2 x 2 x 2 between-subject and within-subject analysis of covariance (ANCOVA). Results indicated that presenting subjects with information cues dissimilar to the choice they made is an effective intervention for reducing the overconfidence commonly found in decision-making. In addition, non-redundant information can help reduce a decision-maker's overconfidence/calibration bias on difficult (compared with easy) decision tasks.

Relevance: 20.00%

Publisher:

Abstract:

Current commercially available mimics contain varying amounts of either the actual explosive/drug or the chemical compound suspected to be of interest to biological detectors. As a result, there is significant interest in determining the dominant chemical odor signatures of the mimics, often referred to as pseudos, particularly as compared with the genuine contraband material. This dissertation discusses results obtained from the analysis of drug and explosive headspace related to the odor profiles recognized by trained detection canines. Analysis was performed using headspace solid-phase microextraction in conjunction with gas chromatography-mass spectrometry (HS-SPME-GC-MS). Upon determination of the specific odors, field trials were held using a combination of the target odors with COMPS. Piperonal was shown to be a dominant odor compound in the headspace of some ecstasy samples and a recognizable odor mimic for trained detection canines; it was also shown that detection canines could be imprinted on piperonal COMPS and correctly identify ecstasy samples at a threshold level of approximately 100 ng/s. Isosafrole and/or MDP-2-POH show potential as training-aid mimics for non-piperonal-based MDMA. Acetic acid was shown to be dominant in the headspace of heroin samples and verified as a dominant odor in commercial vinegar samples; however, no common secondary compound was detected in the headspace of either. Because of the similarities detected within the respective explosive classes, several compounds were chosen as explosive mimics. A single-based smokeless powder with a detectable level of 2,4-dinitrotoluene, a double-based smokeless powder with a detectable level of nitroglycerine, 2-ethyl-1-hexanol, DMNB, ethyl centralite and diphenylamine were shown to be accurate mimics for TNT-based explosives, NG-based explosives, plastic explosives, tagged explosives, and smokeless powders, respectively.

The combination of these six odors represents a comprehensive explosive odor kit with positive results for imprinting on detection canines. As a proof of concept, the chemical compound PFTBA showed promise as a possible universal, non-target odor compound for the comparison and calibration of detection canines and instrumentation. In a comparison study of shape versus vibration odor theory, the detection of d-methyl benzoate and methyl benzoate was explored using canine detectors. While the results did not overwhelmingly substantiate either theory, shape odor theory provides a better explanation of the canine and human-subject responses.

Relevance: 20.00%

Publisher:

Abstract:

Smokeless powder additives are usually detected by extraction from post-blast residues or unburned powder particles, followed by analysis using chromatographic techniques. This work presents the first comprehensive study of the detection of the volatile and semi-volatile additives of smokeless powders using solid-phase microextraction (SPME) as a sampling and pre-concentration technique. Seventy smokeless powders were studied using laboratory-based chromatography techniques and a field-deployable ion mobility spectrometer (IMS). The detection of diphenylamine, ethyl and methyl centralite, 2,4-dinitrotoluene, and diethyl and dibutyl phthalate by IMS, associating the presence of these compounds with smokeless powders, is also reported for the first time. A previously reported SPME-IMS analytical approach facilitates rapid sub-nanogram detection of the vapor-phase components of smokeless powders. A mass calibration procedure for the analytical techniques used in this study was developed: precise and accurate delivery of analyte masses in picoliter volumes was achieved using a drop-on-demand inkjet printing method. Absolute mass detection limits determined with this method ranged between 0.03–0.8 ng for GC-MS and between 0.03–2 ng for IMS across the various analytes of interest. Mass response graphs generated for the different detection techniques help determine the mass extracted from the headspace of each smokeless powder. The analyte mass present in the vapor phase was sufficient for a SPME fiber to extract most analytes at amounts above the detection limits of both chromatographic techniques and the ion mobility spectrometer. Analysis of the large number of smokeless powders revealed that diphenylamine was present in the headspace of 96% of the powders, ethyl centralite was detected in 47%, and methyl centralite was available for detection from headspace sampling in 8%.

Nitroglycerin was the dominant peak in the headspace of the double-based powders, and 2,4-dinitrotoluene, another important headspace component, was detected in 44% of the powders. The powders therefore have more than one headspace component, and detection of a combination of these compounds is achievable by SPME-IMS, leading to an association with the presence of smokeless powders.
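A mass-response graph of the kind described above supports detection-limit estimates. One common approach — a sketch, not the dissertation's exact procedure — fits a straight line to response versus delivered mass and takes LOD = 3·s_res/slope, where s_res is the residual standard deviation. The standard masses and peak areas below are invented:

```python
import numpy as np

def detection_limit(mass_ng, response):
    """Estimate an absolute detection limit from a linear mass-response
    graph using the common 3-sigma criterion: LOD = 3 * s_res / slope."""
    slope, intercept = np.polyfit(mass_ng, response, 1)
    residuals = np.asarray(response) - (slope * np.asarray(mass_ng) + intercept)
    s_res = residuals.std(ddof=2)   # two fitted parameters
    return 3.0 * s_res / slope

# Hypothetical GC-MS responses for standards delivered by drop-on-demand printing.
mass = np.array([0.05, 0.1, 0.2, 0.4, 0.8])      # delivered mass, ng
area = np.array([510, 1040, 1980, 4010, 7950])   # peak area, arbitrary units
lod = detection_limit(mass, area)
print(lod)  # well below a nanogram for this invented data set
```

The picoliter-scale drop-on-demand delivery is what makes the low end of such a calibration curve trustworthy: the delivered masses are known accurately even at sub-nanogram levels.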

Relevance: 20.00%

Publisher:

Abstract:

This research sought to understand the role that differentially assessed lands (lands in the United States given tax breaks in return for a guarantee that they remain in agriculture) play in influencing urban growth. Our method was to calibrate the SLEUTH urban growth model under two different conditions. The first used an excluded layer that ignored such lands, effectively rendering them available for development; the second treated those lands as totally excluded from development. Our hypothesis was that excluding those lands would yield better metrics of fit with past data. Our results validate this hypothesis: two different goodness-of-fit metrics both yielded higher values when differentially assessed lands were treated as excluded. This suggests that, at least in our study area, differential assessment, which protects farm and ranch lands for tenuous periods of time, has indeed allowed farmland to resist urban development. Including differentially assessed lands also yielded very different calibrated growth coefficients, as the model tried to account for the same growth patterns over two very different excluded areas; excluded-layer design can greatly affect model behavior. Since differentially assessed lands are quite common throughout the United States and are often ignored in urban growth modeling, these findings can assist other urban growth modelers in designing excluded layers that result in more accurate model calibration and thus forecasting.
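One goodness-of-fit metric commonly used with SLEUTH, the Lee-Sallee shape index, measures the spatial match between modeled and observed urban extents as intersection over union. A minimal sketch with two invented 4x4 "urban maps" illustrating how a different excluded layer can change the score:

```python
import numpy as np

def lee_sallee(modeled, actual):
    """Lee-Sallee shape index: area of intersection over area of union of
    the modeled and actual urban extents (1.0 = perfect spatial match)."""
    modeled = np.asarray(modeled, dtype=bool)
    actual = np.asarray(actual, dtype=bool)
    union = np.logical_or(modeled, actual).sum()
    return np.logical_and(modeled, actual).sum() / union if union else 1.0

# Invented maps: 1 = urban cell. Excluding protected farmland pushes modeled
# growth onto cells that match the observed urban map more closely.
actual = np.array([[1, 1, 0, 0],
                   [1, 1, 0, 0],
                   [0, 0, 0, 0],
                   [0, 0, 0, 0]])
run_ignoring_farmland = np.array([[1, 0, 0, 0],
                                  [0, 0, 0, 0],
                                  [0, 0, 1, 1],
                                  [0, 0, 1, 0]])
run_excluding_farmland = np.array([[1, 1, 0, 0],
                                   [1, 0, 0, 0],
                                   [0, 0, 0, 0],
                                   [0, 0, 1, 0]])
print(lee_sallee(run_ignoring_farmland, actual),    # 1/7 ≈ 0.143
      lee_sallee(run_excluding_farmland, actual))   # 3/5 = 0.6
```

This is the sense in which "higher values" in the calibration above indicate a better reproduction of historical growth.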