24 results for artifacts
Abstract:
Optical coherence tomography (OCT) systems are becoming more commonly used in biomedical imaging and, to enable continued uptake, a reliable method of characterizing their performance and validating their operation is required. This paper outlines the use of femtosecond laser subsurface micro-inscription techniques to fabricate an OCT test artifact for validating the resolution performance of a commercial OCT system. The key advantage of this approach is that, by exploiting nonlinear absorption, a three-dimensional grid of highly localized point and line defects can be written in clear fused-silica substrates.
Abstract:
This paper addresses the dearth of research into material artifacts and how they are engaged in strategizing activities. Building on the strategy-as-practice perspective, and the notion of epistemic objects, we develop a typology of strategy practices that shows how managers use material artifacts to strategize through a dual process of knowledge abstraction and substitution. Empirically, we study the practice of underwriting managers in reinsurance companies. Our findings first identify the artifacts – pictures, maps, data packs, spreadsheets and graphs – that these managers use to appraise reinsurance deals. Second, the analysis of each artifact's situated use led to the identification of five practices for doing strategy with artifacts: physicalizing, locating, enumerating, analyzing, and selecting. Last, we develop a typology that shows how practices vary in their level of abstraction from the physical properties of the risk being reinsured and unfold through a process of substitution. Our conceptual framework extends existing work in the strategy-as-practice field that calls for research into the role of material artifacts.
Abstract:
Recently, we have seen an explosion of interest in ontologies as artifacts to represent human knowledge and as critical components in knowledge management, the semantic Web, business-to-business applications, and several other application areas. Various research communities commonly assume that ontologies are the appropriate modeling structure for representing knowledge. However, little discussion has occurred regarding the actual range of knowledge an ontology can successfully represent.
Abstract:
Increasingly, the body of knowledge derived from strategy theory has been criticized because it is not actionable in practice, particularly under the conditions of a knowledge economy. Since strategic management is an applied discipline, this is a serious criticism. However, we argue that the theory-practice question is too simple. Accordingly, this paper expands this question by first outlining the theoretical criteria under which strategy theory is not actionable, and then outlining an alternative perspective on strategy knowledge in action, based upon a practice epistemology. The paper is in three sections. The first section explains two contextual conditions which impact upon strategy theory within a knowledge economy: environmental velocity and knowledge intensity. The impact of these contextual conditions upon the application of four different streams of strategy theory is examined. The second section suggests that the theoretical validity of these contextual conditions breaks down when we consider the knowledge artifacts, such as strategy tools and frameworks, which arise from strategy research. The third section proposes a practice epistemology for analyzing strategy knowledge in action that stands in contrast to more traditional arguments about actionable knowledge. From a practice perspective, strategy knowledge is argued to be actionable as part of the everyday activities of strategizing. © 2006 Elsevier Ltd. All rights reserved.
Abstract:
We present the prototype tool CADS* for the computer-aided development of an important class of self-* systems, namely systems whose components can be modelled as Markov chains. Given a Markov chain representation of the IT components to be included into a self-* system, CADS* automates or aids (a) the development of the artifacts necessary to build the self-* system; and (b) their integration into a fully-operational self-* solution. This is achieved through a combination of formal software development techniques including model transformation, model-driven code generation and dynamic software reconfiguration.
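The Markov-chain view of an IT component described above can be sketched in a few lines. This is an illustrative toy, not the CADS* tool itself: a component is modelled as a discrete-time Markov chain over assumed states OK / DEGRADED / FAILED (names and transition probabilities are invented for illustration), and its stationary distribution, the kind of quantity a self-* controller would reason about, is approximated by power iteration.

```python
# Hypothetical sketch (not the actual CADS* tooling): a self-* IT component
# modelled as a discrete-time Markov chain over states OK / DEGRADED / FAILED.

STATES = ["OK", "DEGRADED", "FAILED"]

# Row-stochastic transition matrix P[i][j] = Pr(next = j | current = i).
# The numbers are illustrative assumptions, not measured rates.
P = [
    [0.95, 0.04, 0.01],  # OK: mostly stays OK
    [0.30, 0.60, 0.10],  # DEGRADED: may recover or fail
    [0.50, 0.00, 0.50],  # FAILED: repaired back to OK half the time
]

def steady_state(P, iterations=1000):
    """Approximate the stationary distribution by power iteration."""
    pi = [1.0 / len(P)] * len(P)
    for _ in range(iterations):
        pi = [sum(pi[i] * P[i][j] for i in range(len(P)))
              for j in range(len(P))]
    return pi

pi = steady_state(P)
availability = pi[0] + pi[1]  # long-run fraction of time not FAILED
```

Solving pi = pi * P by hand for this matrix gives pi ≈ (0.877, 0.088, 0.035), so the sketched component is available about 96% of the time.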
Abstract:
We have optically simulated the performance of various apertures used in Coded Aperture Imaging. Coded pictures of extended and continuous-tone planar objects from the Annulus, Twin Annulus, Fresnel Zone Plate and the Uniformly Redundant Array have been decoded using a noncoherent correlation process. We have compared the tomographic capabilities of the Twin Annulus with the Uniformly Redundant Arrays based on quadratic residues and m-sequences. We discuss ways of reducing the 'd.c.' background of the various apertures used. The non-ideal system point-spread function inherent in a noncoherent optical correlation process produces artifacts in the reconstruction. Artifacts are also introduced as a result of unwanted cross-correlation terms from out-of-focus planes. We find that the URA based on m-sequences exhibits good spatial resolution and out-of-focus behaviour when imaging extended objects.
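The correlation decoding described above can be illustrated with a toy 1-D example (an assumption-laden sketch, not the authors' optical setup). A point source is imaged through a length-7 m-sequence coded aperture and recovered by circular correlation with a balanced decoding array; the two-valued periodic autocorrelation of the m-sequence makes the system point-spread function a delta, which is exactly the property that lets URAs avoid the reconstruction artifacts discussed in the abstract.

```python
# Toy 1-D coded-aperture imaging with an m-sequence aperture.
N = 7
aperture = [1, 1, 1, 0, 1, 0, 0]          # length-7 m-sequence (1 = open slit)
decoder = [2 * a - 1 for a in aperture]   # balanced +1/-1 decoding array

obj = [0, 0, 1, 0, 0, 0, 0]               # a single point source at index 2

# Coded image: circular convolution of the object with the aperture.
coded = [sum(obj[k] * aperture[(i - k) % N] for k in range(N))
         for i in range(N)]

# Decoding: circular correlation of the coded image with the decoding array.
decoded = [sum(coded[i] * decoder[(i - j) % N] for i in range(N))
           for j in range(N)]
# decoded peaks at index 2 with zero sidelobes: [0, 0, 4, 0, 0, 0, 0]
```

Because the ±1 version of the m-sequence has periodic autocorrelation 7 at zero lag and -1 elsewhere, the aperture/decoder pair yields a correlation of (7+1)/2 = 4 at zero lag and (-1+1)/2 = 0 at all other lags, so the point source is recovered exactly.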
Abstract:
Bilateral corneal blindness represents a quarter of the total blind, world-wide. The artificial cornea, in assorted forms, was developed to replace opaque non-functional corneas and to return sight in otherwise hopeless cases that were not amenable to corneal grafts, believed to be 2% of the corneal blind. Despite technological advances in materials design and tissue engineering, no artificial cornea has provided absolute, long-term success. Formidable problems exist, due to a combination of unpredictable wound healing and unmanageable pathology. To have a solid guarantee of reliable success, an artificial cornea must possess three attributes: an optical window to replace the opaque cornea; a strong, long-term union to surrounding ocular tissue; and the ability to induce desired host responses. One unique artificial cornea possesses all three functional attributes: the Osteo-odonto-keratoprosthesis (OOKP). The OOKP has a high success rate and can survive for up to twenty years, but it is complicated both in structure and in surgical procedure; it is expensive and not universally available. The aim of this project was to develop a synthetic substitute for the OOKP, based upon key features of the tooth and bone structure, thereby reducing surgical complexity and biological complications. Analysis of the biological effectiveness of the OOKP showed that the structure of bone was the most crucial component for implant retention. An experimental semi-rigid hydroxyapatite framework was fabricated with a complex bone-like architecture, which could be fused to the optical window. The first method for making such a framework was pressing and sintering of hydroxyapatite powders; however, it was not possible to fabricate a void architecture with the correct sizes and uniformity of pores. Ceramers were synthesised using alternative pore-forming methods, providing improved mechanical properties and stronger attachment to the plastic optical window.
Naturally occurring skeletal structures closely match the structural features of all forms of natural bone. Synthetic casts of desirable natural artifacts, such as coral and sponges, were fabricated using the replamineform process. The final method of construction bypassed ceramic fabrication in favour of pre-formed coral derivatives and focused on methods for polymer infiltration, adhesion and fabrication. Prototypes were constructed and evaluated; a fully penetrative synthetic OOKP analogue was fabricated according to the dimensions of the OOKP. Fabrication of the cornea-shaped synthetic OOKP analogue was also attempted.
Abstract:
A textural and microstructural study of a variety of zinc sulfide-containing ores has been undertaken, and the possible depositional and deformational controls of textural and microstructural development considered. Samples for the study were taken from both deformed and undeformed zinc ores of the Central U.S. Appalachians, and deformed zinc ores of the English Pennines. A variety of mineralogical techniques were employed, including transmitted and reflected light microscopy of etched and unetched material, transmission electron microscopy and electron microprobe analysis. For the Pennine zinc sulfides, spectroscopic, x-ray diffraction and fluid inclusion studies were also undertaken. Optical and electron optical examination of the Appalachian material confirmed the suitability of zinc sulfide for detailed study with such techniques. Growth and deformation-related microstructures could be distinguished from specimen-preparation-induced artifacts. A deformationally-induced lamelliform optical anisotropy is seen to be developed in areas hosting a dense planar microstructure of {111} twin- and slip-planes. The Pennine zinc sulfide texturally records a changing depositional environment. Thus, for example, delicately growth-zoned crystals are truncated and cross-cut by solution disconformities. Fluid inclusion studies indicate a highly saline (20-25 wt.% equiv. NaCl), low temperature (100-150°C) fluid. Texturally, two varieties of zinc sulfide can be recognised: a widely developed, iron-banded variety, and a paragenetically early variety, banded due to horizons rich in crystal defects and microscopic inclusions. The zinc sulfide takes the form of a disordered 3C-polytype, with much of the disorder being deformational in origin. Twin- and slip-plane fabrics are developed. A deformation-related optical anisotropy is seen to overprint growth-related anisotropy, along with cuprian alteration of certain {111} deformation planes.
Abstract:
Three experiments assessed the development of children's part and configural (part-relational) processing in object recognition during adolescence. In total, 312 school children aged 7-16 years and 80 adults were tested in 3-alternative forced choice (3-AFC) tasks. They judged the correct appearance of familiar animals, artifacts, and newly learned multipart objects, presented upright and inverted, which had been manipulated either in terms of individual parts or part relations. Manipulation of part relations was constrained to either metric (animals, artifacts, and multipart objects) or categorical (multipart objects only) changes. For animals and artifacts, even the youngest children were close to adult levels for the correct recognition of an individual part change. By contrast, it was not until 11-12 years of age that they achieved similar levels of performance with regard to altered metric part relations. For the newly learned multipart objects, performance was equivalent throughout the tested age range for upright-presented stimuli in the case of categorical part-specific and part-relational changes. In the case of metric manipulations, the results confirmed the data pattern observed for animals and artifacts. Together, the results provide converging evidence, with studies of face recognition, for a surprisingly late consolidation of configural-metric relative to part-based object recognition.
Abstract:
The open content creation process has proven itself to be a powerful and influential way of developing text-based content, as demonstrated by the success of Wikipedia and related sites. Distributed individuals independently edit, revise, or refine content, thereby creating knowledge artifacts of considerable breadth and quality. Our study explores the mechanisms that control and guide the content creation process and develops an understanding of open content governance. The repertory grid method is employed to systematically capture the experiences of individuals involved in the open content creation process and to determine the relative importance of the diverse control and guiding mechanisms. Our findings illustrate the important control and guiding mechanisms and highlight the multifaceted nature of open content governance. A range of governance mechanisms is discussed with regard to the varied levels of formality, the different loci of authority, and the diverse interaction environments involved. Limitations and opportunities for future research are provided.
Abstract:
We argue that, for certain constrained domains, elaborate model transformation technologies-implemented from scratch in general-purpose programming languages-are unnecessary for model-driven engineering; instead, lightweight configuration of commercial off-the-shelf productivity tools suffices. In particular, in the CancerGrid project, we have been developing model-driven techniques for the generation of software tools to support clinical trials. A domain metamodel captures the community's best practice in trial design. A scientist authors a trial protocol, modelling their trial by instantiating the metamodel; customized software artifacts to support trial execution are generated automatically from the scientist's model. The metamodel is expressed as an XML Schema, in such a way that it can be instantiated by completing a form to generate a conformant XML document. The same process works at a second level for trial execution: among the artifacts generated from the protocol are models of the data to be collected, and the clinician conducting the trial instantiates such models in reporting observations-again by completing a form to create a conformant XML document, representing the data gathered during that observation. Simple standard form management tools are all that is needed. Our approach is applicable to a wide variety of information-modelling domains: not just clinical trials, but also electronic public sector computing, customer relationship management, document workflow, and so on. © 2012 Springer-Verlag.
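The two-level instantiation described above (protocol → data model → observation document) can be sketched minimally. This is a hypothetical illustration, not the CancerGrid schemas: element and field names are invented, and the point is only that a completed form maps mechanically to a conformant XML document.

```python
# Hypothetical sketch of form-to-XML instantiation (element names invented;
# the real CancerGrid metamodel and schemas differ).
import xml.etree.ElementTree as ET

def observation_to_xml(trial_id, form_data):
    """Serialize one clinical observation (a dict of form fields) as XML."""
    root = ET.Element("observation", attrib={"trial": trial_id})
    for field, value in form_data.items():
        item = ET.SubElement(root, "item", attrib={"name": field})
        item.text = str(value)
    return ET.tostring(root, encoding="unicode")

xml_doc = observation_to_xml("TRIAL-001",
                             {"bloodPressure": 120, "weightKg": 71.5})
```

In the approach the abstract describes, an XML Schema generated from the trial protocol would then validate such documents, so standard form tools supply the entire execution pipeline.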
Abstract:
Many studies have accounted for whole body vibration effects in the fields of exercise physiology, sport and rehabilitation medicine. Generally, surface EMG is utilized to assess muscular activity during the treatment; however, large motion artifacts appear superimposed on the raw signal, making sEMG recordings unsuitable for analysis before artifact filtering. Sharp notch filters, centered at the vibration frequency and at its superior harmonics, have been used in previous studies to remove the artifacts [6, 10]. However, in getting rid of those artifacts, some true EMG signal is lost. The purpose of this study was to reproduce the effect of motor-unit synchronization on a simulated surface EMG during vibratory stimulation. In addition, the authors aimed to evaluate the EMG power percentage in the bands where motion-artifact components are also typically located. Model characteristics were defined to take into account two main aspects: the muscle motor units' (MUs) discharge behavior and the triggering effects that appear during local vibratory stimulation [7]. The inter-pulse interval was characterized by a polymodal distribution related to the MU discharge frequency (IPI 55-80 ms, σ = 12 ms) and to the correlation with the vibration period within the range of ±2 ms due to the vibration stimulus [1, 7]. The signals were simulated using different stimulation frequencies from 30 to 70 Hz. The percentage of the total simulated EMG power within narrow bands centered at the stimulation frequency and its superior harmonics (±1 Hz) was on average about 8% (±2.85) of the total EMG power. However, the artifact in those bands may contain more than 40% of the total power of the signal [6]. Our preliminary results suggest that analysis of muscular activity based on raw sEMG recordings and RMS evaluation during vibratory stimulation, without artifact removal, may lead to a serious overestimation of the muscular response.
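The sharp notch filtering cited above ([6, 10]) can be sketched with a standard second-order (biquad) notch: zeros on the unit circle at the vibration frequency, poles just inside at the same angle. Parameters here are illustrative assumptions, not the authors' exact filter design.

```python
# Minimal biquad notch-filter sketch (illustrative parameters).
import math

def notch_filter(x, f0, fs, r=0.98):
    """Remove a narrowband component at f0 Hz from samples x (rate fs Hz).
    r (pole radius, 0 < r < 1) sets the notch width: closer to 1 = narrower."""
    w0 = 2 * math.pi * f0 / fs
    b = [1.0, -2.0 * math.cos(w0), 1.0]          # zeros on the unit circle at ±w0
    a = [1.0, -2.0 * r * math.cos(w0), r * r]    # poles just inside, same angle
    y, x1, x2, y1, y2 = [], 0.0, 0.0, 0.0, 0.0
    for xn in x:
        yn = b[0] * xn + b[1] * x1 + b[2] * x2 - a[1] * y1 - a[2] * y2
        x2, x1, y2, y1 = x1, xn, y1, yn
        y.append(yn)
    return y

fs = 1000
t = [n / fs for n in range(2000)]
artifact = [math.sin(2 * math.pi * 40 * ti) for ti in t]  # 40 Hz vibration artifact
filtered = notch_filter(artifact, f0=40, fs=fs)
# steady-state output is near zero: the 40 Hz line is suppressed,
# while components away from 40 Hz pass almost unchanged
```

The trade-off the abstract points out is visible here: anything genuinely present in the EMG at 40 Hz is removed along with the artifact.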
Abstract:
This study aims to reproduce the effect of motor-unit synchronization on surface EMG recordings during vibratory stimulation, to highlight vibration-evoked muscle activity. The authors intended to evaluate, through numerical simulations, the changes in the surface EMG spectrum of muscles undergoing whole body vibration stimulation. In some specific bands, in fact, vibration-induced motion artifacts are also typically present. In addition, the authors meant to compare the simulated EMGs with real recordings in order to discriminate the effect of synchronization of motor unit discharges with vibration frequencies from motion artifacts. Computations were performed using a model derived from previous studies and modified to consider the effect of the vibratory stimulus, motor unit synchronization and the endplate-electrode relative position on the EMG signal. Results revealed that, in particular conditions, synchronization of MUs' discharges generates visible peaks at the stimulation frequency and its harmonics. However, only a part of the total power of surface EMGs might be enclosed within artifact-related bands (±1 Hz, centered at the stimulation frequency and its superior harmonics), even in the case of strong synchronization of motor unit discharges with the vibratory stimulus. © 2013 Elsevier Ireland Ltd.
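The band-power bookkeeping behind both abstracts above can be sketched as follows. This is a toy with invented numbers, not the authors' simulated EMG: a "signal" with one EMG-like component and one component locked to the stimulation frequency, and the fraction of its total power falling in the ±1 Hz bands at the stimulation frequency and its harmonics.

```python
# Toy band-power fraction at the stimulation frequency and harmonics.
import math

fs, n_samples = 1000, 1000
t = [n / fs for n in range(n_samples)]

f_stim = 40  # Hz, stimulation frequency (illustrative)
# One "EMG-like" component at 13 Hz plus one component synchronized to the
# stimulation frequency, equal amplitudes.
x = [math.sin(2 * math.pi * 13 * ti) + math.sin(2 * math.pi * f_stim * ti)
     for ti in t]

def bin_power(x, f, fs):
    """Mean-square power contributed by the DFT bin at frequency f (Hz)."""
    n = len(x)
    c = sum(xi * math.cos(2 * math.pi * f * i / fs) for i, xi in enumerate(x))
    s = sum(xi * math.sin(2 * math.pi * f * i / fs) for i, xi in enumerate(x))
    return 2 * (c * c + s * s) / (n * n)

total_power = sum(xi * xi for xi in x) / len(x)
band_power = sum(bin_power(x, h * f_stim + df, fs)
                 for h in (1, 2, 3) for df in (-1, 0, 1))
fraction = band_power / total_power  # 0.5 for this toy signal
```

For this toy signal exactly half the power sits in the artifact-related bands; the abstracts' point is that in real recordings one cannot tell, from those bands alone, how much is synchronized EMG and how much is motion artifact.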
Abstract:
The impact of whole body vibration (WBV; a vibration stimulus mechanically transferred to the body) on muscular activity and neuromuscular response has been widely studied, but without a standard protocol and using different kinds of exercises and parameters. In this study, we investigated how whole body vibration treatments affect the electromyographic signal of the rectus femoris during static and dynamic squat exercises. The aim was to identify squat exercise characteristics useful to maximize neuromuscular activation and hence improve training efficacy. Fourteen healthy volunteers performed both static and dynamic squat exercises with and without vibration treatment. Surface electromyographic signals of the rectus femoris were recorded during the whole exercise and processed to reduce artifacts and to extract root mean square values. Paired t-test results demonstrated an increase of the root mean square values (p<0.05) in both static and dynamic squat exercises with vibration, of 63% and 108% respectively. For each exercise, subjects rated their perceived exertion according to the Borg scale, but there were no significant changes in the perceived exertion rate between exercises with and without vibration. Finally, results from the analysis of electromyographic signals identified the static squat with WBV treatment as the exercise with the highest neuromuscular response. © 2012 IEEE.
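The root-mean-square extraction mentioned above can be sketched as a sliding RMS envelope. The window length and the stand-in signal below are illustrative choices, not the authors' processing parameters.

```python
# Minimal sliding-RMS envelope sketch (illustrative window length).
import math

def sliding_rms(x, window):
    """RMS of x over consecutive non-overlapping windows of `window` samples."""
    out = []
    for start in range(0, len(x) - window + 1, window):
        seg = x[start:start + window]
        out.append(math.sqrt(sum(v * v for v in seg) / window))
    return out

fs = 1000
t = [n / fs for n in range(2000)]
emg_like = [math.sin(2 * math.pi * 50 * ti) for ti in t]  # stand-in signal
envelope = sliding_rms(emg_like, window=200)
# each 200 ms window holds whole 50 Hz cycles, so every RMS value is 1/sqrt(2)
```

Comparing such RMS envelopes between the with- and without-vibration conditions is what produces effect sizes like the 63% and 108% increases reported above, which is also why artifact removal beforehand matters so much.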
Abstract:
The aim of this study is to highlight the relationship between muscle motion, generated by whole body vibration, and the corresponding electromyographic (EMG) activity, and to suggest a new method to customize the stimulation frequency. Simultaneous recordings of EMG and tri-axial accelerations of the quadriceps rectus femoris from fifteen subjects undergoing vibration treatments were collected. Vibrations were delivered via a sinusoidal oscillating platform at different frequencies (10-45 Hz). Muscle motion was estimated by processing the accelerometer data. Large EMG motion artifacts were removed using sharp notch filters centred at the vibration frequency and its superior harmonics. EMG-RMS values were computed and analyzed before and after artifact suppression to assess muscular activity. Muscle acceleration amplitude increased with frequency. Muscle displacements revealed a mechanical resonant-like behaviour of the muscle. Resonance frequencies and damping factors depended on the subject. Moreover, the RMS of the artifact-free EMG was found to be well correlated (R² = 0.82) with the actual muscle displacement, while the maximum of the EMG response was related to the mechanical resonance frequency of the muscle. Results showed that maximum muscular activity occurred at the mechanical resonance of the muscle itself. Assuming the hypothesis that muscle activation is proportional to muscle displacement, treatment optimization (i.e. choosing the best stimulation frequency) could be obtained by simply monitoring local acceleration, leading to a more effective muscle stimulation. Motion artifacts produced an overestimation of muscle activity, therefore their removal was essential. © 2009 IPEM.
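The step from accelerometer data to muscle displacement above has a simple closed form for a pure sinusoid: since d(t) = D·sin(2πft) gives a(t) = -(2πf)²·d(t), the peak displacement is the peak acceleration divided by (2πf)². The numbers below are illustrative, not the study's measurements.

```python
# Displacement amplitude from acceleration amplitude for sinusoidal vibration.
import math

def displacement_amplitude(accel_amp, freq_hz):
    """Peak displacement (m) of a pure sinusoid from peak acceleration (m/s^2)."""
    omega = 2 * math.pi * freq_hz
    return accel_amp / omega ** 2

# e.g. an assumed 30 m/s^2 (~3 g) measured on the muscle at a 30 Hz setting:
d = displacement_amplitude(30.0, 30.0)  # ~0.84 mm peak displacement
```

Tracking this displacement across platform frequencies is how a resonant-like peak, and hence a subject-specific best stimulation frequency, could be identified from acceleration alone.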