22 results for Artefacts
in Aston University Research Archive
Abstract:
Magnetoencephalography (MEG), a non-invasive technique for characterizing brain electrical activity, is gaining popularity as a tool for assessing group-level differences between experimental conditions. One method for assessing task-condition effects involves beamforming, where a weighted sum of field measurements is used to estimate activity on a voxel-by-voxel basis. However, this method has been shown to produce inhomogeneous smoothness differences as a function of signal-to-noise across a volumetric image, which can then produce false positives at the group level. Here we describe a novel method for group-level analysis with MEG beamformer images that utilizes the peak locations within each participant's volumetric image to assess group-level effects. We compared our peak-clustering algorithm with SnPM using simulated data. We found that our method was immune to artefactual group effects that can arise as a result of inhomogeneous smoothness differences across a volumetric image. We also applied our peak-clustering algorithm to experimental data and found that it identified regions corresponding to task-related regions reported in the literature. These findings suggest that our technique is a robust method for group-level analysis with MEG beamformer images.
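For readers unfamiliar with the weighted-sum estimate mentioned above, a minimal numpy sketch follows. The linearly constrained minimum variance (LCMV) weight formula used here is an assumption for illustration; the abstract does not name the beamformer variant.

```python
import numpy as np

def lcmv_voxel_estimate(data, leadfield, reg=0.05):
    """Estimate one voxel's activity as a weighted sum of field
    measurements. LCMV weights are assumed for illustration; the
    abstract does not name the beamformer variant.

    data      : (n_sensors, n_samples) field measurements
    leadfield : (n_sensors,) forward model for this voxel
    reg       : Tikhonov regularisation fraction
    """
    C = np.cov(data)  # sensor covariance
    C = C + reg * np.trace(C) / C.shape[0] * np.eye(C.shape[0])
    Ci = np.linalg.inv(C)
    # w = C^-1 L / (L^T C^-1 L): unit gain at the voxel, minimum variance
    w = Ci @ leadfield / (leadfield @ Ci @ leadfield)
    return w @ data  # estimated voxel time course

# Toy usage: 64 sensors, 1000 samples, random leadfield.
rng = np.random.default_rng(0)
est = lcmv_voxel_estimate(rng.standard_normal((64, 1000)),
                          rng.standard_normal(64))
```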
Abstract:
Optical coherence tomography (OCT) is a non-invasive three-dimensional imaging system that is capable of producing high-resolution in-vivo images. OCT is approved for use in clinical trials in Japan, the USA and Europe. For OCT to be used effectively in a clinical diagnosis, a method of standardisation is required to assess the performance across different systems. This standardisation can be implemented using highly accurate and reproducible artefacts for calibration both at installation and throughout the lifetime of a system. Femtosecond lasers can write highly reproducible and highly localised micro-structured calibration artefacts within transparent media. We report on the fabrication of high quality OCT calibration artefacts in fused silica using a femtosecond laser. The calibration artefacts were written in fused silica due to its high purity and ability to withstand high energy femtosecond pulses. An Amplitude Systemes s-Pulse Yb:YAG femtosecond laser with an operating wavelength of 1026 nm was used to inscribe three-dimensional patterns within the highly optically transmissive substrate. Four unique artefacts have been designed to measure a wide variety of parameters, including the point spread function (PSF), modulation transfer function (MTF), sensitivity, distortion and resolution - key parameters which define the performance of an OCT system. The calibration artefacts have been characterised using an optical microscope and tested on a swept-source OCT. The results demonstrate that the femtosecond laser inscribed artefacts have the potential to quantitatively and qualitatively validate the performance of any OCT system.
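Of the parameters listed, the MTF is conventionally obtained as the normalised Fourier transform of the measured PSF; the sketch below illustrates that relationship for a 1-D profile. This is an illustrative assumption, not the processing pipeline used in the paper.

```python
import numpy as np

def mtf_from_psf(psf, dx):
    """Return spatial frequencies and the 1-D MTF, computed as the
    normalised magnitude of the Fourier transform of the PSF.
    Illustrative sketch; not the processing used in the paper.

    psf : 1-D intensity profile across a point-like artefact
    dx  : sample spacing (e.g. micrometres)
    """
    otf = np.fft.rfft(psf)                 # optical transfer function
    mtf = np.abs(otf) / np.abs(otf[0])     # normalise to unity at DC
    freqs = np.fft.rfftfreq(len(psf), dx)  # cycles per unit length
    return freqs, mtf

# Toy usage: Gaussian PSF with 2 um FWHM sampled every 0.5 um.
x = np.arange(-32, 32) * 0.5
sigma = 2.0 / 2.355                        # FWHM -> standard deviation
freqs, mtf = mtf_from_psf(np.exp(-x**2 / (2 * sigma**2)), 0.5)
```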
Abstract:
Inductive reasoning is fundamental to human cognition, yet it remains unclear how we develop this ability and what might influence our inductive choices. We created novel categories in which crucial factors such as domain and category structure were manipulated orthogonally. We trained 403 4-9-year-old children to categorise well-matched natural kind and artefact stimuli with either featural or relational category structure, followed by induction tasks. This wide age range allowed for the first full exploration of the developmental trajectory of inductive reasoning in both domains. We found a gradual transition from perceptual to categorical induction with age. This pattern was stable across domains, but interestingly, children showed a category bias one year later for relational categories. We hypothesise that the ability to use category information in inductive reasoning develops gradually, but is delayed when children need to process and apply more complex category structures. © 2014 Taylor & Francis.
Abstract:
The ethnographic museum in the West has a long and troubling history. The display of 'exotic peoples' in travelling exhibitions began as early as the sixteenth century, but it was the mid and late nineteenth century that saw the great expansion of museums as sites to show artefacts collected - under anything but reputable circumstances - from what were considered the 'primitive', 'natural', or 'tribal' peoples of the world. Today the ethnographic museum is still a feature of large European cities, though faced with newly formulated dilemmas in the postcolonial world. For how can the material culture of a non-western people be collected and displayed in the West without its makers being translated into wordless and powerless objects of visual consumption? In national museums the processes of choosing, contextualizing and commentating exhibits help form national identity; in the ethnographic museum, similarly, they shape perceptions of the apparently distant Other. Like written ethnography, the museum is a 'translation of culture', with many of the associated problems traced by Talal Asad (1986). Like the written form, it has to represent the dialogic realities of cultural encounters in a fixed and intelligible form, to propose categories that define and order the material it has gathered. As the public face of academic ethnography, the museum interprets other cultures for the benefit of the general reader, and in that task museum practice, like all ethnography, operates within very specific historical and political parameters. How are museums in western Europe responding to the issues raised by critical ethnographers like James Clifford (1988), with their focus on the politics of representation? Is globalisation increasing the degree of accountability imposed on the ethnographic museum, or merely reinforcing older patterns? What opportunities and problems are raised by the use of more words - more 'translation' in the narrower sense - in ethnographic museums, and how do museums gain from introducing a reflexive and contextualizing concept of "thick translation" (Appiah 1993) into their work of interpretation?
Abstract:
This paper is predicated upon the assumption that the annual accounts of companies constitute important corporate artefacts in their own right that are both image- and rhetoric-intensive. The objective is to import concepts from structural poetics and bring them to bear on annual corporate reports. Understanding the poetics of corporate reporting in this manner is important because it allows the relationship between the audience and the authors of the script to be explored. We do this by analysing the annual reports of various water companies in the UK.
Abstract:
This thesis has focused on three key areas of interest for femtosecond micromachining and inscription. The first area is micromachining, where the work has focused on the ability to produce highly repeatable, high-precision machining of often extremely complex geometrical structures with little or no damage. High-aspect-ratio features have been demonstrated in transparent materials, metals and ceramics. Etch depth control was demonstrated, especially in the work on phase mask fabrication. Practical chemical sensing and microfluidic devices were also fabricated to demonstrate the capability of the techniques developed during this work. The second area is femtosecond inscription. Here, the work has utilised the non-linear absorption mechanisms associated with femtosecond pulse-material interactions to create highly localised refractive index changes in transparent materials, forming complex 3D structures. The techniques employed were then utilised in the fabrication of phase masks and Optical Coherence Tomography (OCT) phantom calibration artefacts, both of which show the potential to fill voids in the development of their fields. This is especially the case for the OCT phantoms, where no previous artefacts of known shape existed, allowing for the initial specification of parameters associated with the quality of the OCT machines being taken up across the world in industry and research. Finally, the third area of focus was the combination of all of the techniques developed in planar samples to create a range of artefacts in optical fibres. The development of techniques and methods for compensating for the geometrical complexities associated with working with cylindrical samples of varying refractive index allowed fundamental inscription parameters to be examined, structures for use as power monitors and polarisers to be created within optical fibres, and, finally, femtosecond inscription and ablation techniques to be combined to create a directional magnetic field sensor from an optical fibre coated in Terfenol-D. Through the development of understanding, practical techniques and equipment, the work presented here demonstrates several novel pieces of research in the field of femtosecond micromachining and inscription that have provided a broad range of related fields with practical devices that were previously unavailable or that would have taken great cost and time to produce.
Abstract:
This review is structured in three sections and provides a conceptual framework for the empirical analysis of strategy tools as they are used in practice. Examples of strategy tools include SWOT analysis and Porter’s Five Forces. Section one reviews empirical research into the use of strategy tools, classifying them according to variations in their use. Section two explains the concept of boundary objects as the basis for our argument that strategy tools may be understood as boundary objects. Boundary objects are artefacts that are meaningfully and usefully incorporated to enable sharing of information and transfer of knowledge across intra-organizational boundaries, such as laterally across different strategic business units or vertically across hierarchical levels. Section three draws the two bodies of literature together, conceptualizing strategy tools in practice as boundary objects. This review contributes to knowledge on using strategy tools in practice.
Abstract:
Information systems have developed to the stage that there is plenty of data available in most organisations, but there are still major problems in turning that data into information for management decision making. This thesis argues that the link between decision support information and transaction processing data should be through a common object model which reflects the real world of the organisation and encompasses the artefacts of the information system. The CORD (Collections, Objects, Roles and Domains) model is developed, which is richer in appropriate modelling abstractions than current object models. A flexible Object Prototyping tool based on a Semantic Data Storage Manager has been developed, which enables a variety of models to be stored and experimented with. A statistical summary table model, COST (Collections of Objects Statistical Table), has been developed within CORD and is shown to be adequate to meet the modelling needs of Decision Support and Executive Information Systems. The COST model is supported by a statistical table creator and editor, COSTed, which is also built on top of the Object Prototyper and uses the CORD model to manage its metadata.
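The abstract does not spell out the CORD abstractions, so purely as an illustration of the idea, a model distinguishing collections, objects, roles and domains, with a toy COST-style summary table, might be sketched as follows. All class and field names are hypothetical.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of CORD-style abstractions; names and structure
# are illustrative, not taken from the thesis.

@dataclass(frozen=True)
class Domain:                  # a pool of legal values, e.g. regions
    name: str
    values: frozenset

@dataclass
class Role:                    # a named way an object participates
    name: str
    domain: Domain

@dataclass
class BusinessObject:          # mirrors a real-world entity
    oid: int
    roles: dict = field(default_factory=dict)  # role name -> value

@dataclass
class Collection:              # a typed set of objects to summarise
    name: str
    members: list = field(default_factory=list)

    def summarise(self, role_name):
        """Toy COST-like aggregation: count members per role value."""
        table = {}
        for obj in self.members:
            key = obj.roles.get(role_name)
            table[key] = table.get(key, 0) + 1
        return table

region = Domain("region", frozenset({"north", "south"}))
sales = Collection("sales", [
    BusinessObject(1, {"region": "north"}),
    BusinessObject(2, {"region": "north"}),
    BusinessObject(3, {"region": "south"}),
])
print(sales.summarise("region"))  # {'north': 2, 'south': 1}
```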
Abstract:
The fossil arthropod Class Trilobita is characterised by the possession of a highly mineralised dorsal exoskeleton with an incurved marginal flange (doublure). This cuticle is usually the only part of the organism to be preserved. Despite the common occurrence of trilobites in Palaeozoic sediments, the original exoskeletal mineralogy has not been determined previously. Petrographic data involving over seventy trilobite species, ranging in age from Cambrian to Devonian, together with atomic absorption and stable isotope analyses, indicate a primary low-magnesian calcite composition. Trilobite cuticles exhibit a variety of preservational textures which are related to the different diagenetic realms through which they have passed. A greater knowledge of post-depositional processes and the specific features they produce has enabled post-mortem artefacts to be distinguished from primary cuticular microstructures. Alterations of the cuticle can either enhance or destroy primary features, and their effects are best observed in thin-sections, both under transmitted light and cathodoluminescence. Well-preserved trilobites often retain primary microstructures such as laminations, canals, and tubercles. These have been examined in stained thin-sections and by scanning electron microscopy, from as wide a range of trilobites as possible. Construction of sensory field maps has shown that although the basic organisation of the exoskeleton is the same in all trilobites, the types of microstructures found and their distribution are species-specific. The composition, microstructure, and architecture of the trilobite exoskeleton have also been studied from a biomechanical viewpoint. Total cuticle thickness, the relative proportions of the different layers, and the overall architecture all affected the mechanical properties of the exoskeleton.
Abstract:
Ambulatory EEG recording enables patients with epilepsy and related disorders to be monitored in an unrestricted environment for prolonged periods. Attacks can therefore be recorded, and EEG changes at the time can aid diagnosis. The relevant literature is reviewed and a study made of 250 clinical investigations. A study was also made of the artefacts encountered during ambulatory recording. Three quarters of referrals were for distinguishing between epileptic and non-epileptic attacks. Over 60% of patients showed no abnormality during attacks. In comparison with the basic EEG, the ambulatory EEG provided about ten times as much information. A preliminary follow-up study showed that results of ambulatory monitoring agreed with the final diagnosis in 8 of 12 patients studied. Of 10 patients referred for monitoring the occurrence of absence seizures, 8 showed abnormality during the basic EEG and 10 during the ambulatory EEG. Other patients were referred for sleep recording and to clarify the seizure type. An investigation into once daily (OD) versus twice daily (BD) administration of sodium valproate in patients with absence seizures showed that an OD regime was equally as effective as a BD regime. Circadian variations in spike and wave activity in patients on and off treatment were also examined. There was significant agreement between subjects on the time of occurrence of abnormality during sleep only. This pattern was not affected by treatment, nor was there any difference in the daily pattern of occurrence of abnormality between the two regimes. Overall, findings suggested that ambulatory monitoring was a valuable tool in the diagnosis and treatment of epilepsy which, with careful planning and patient selection, could be used in any EEG department and would benefit a wide range of patients.
Abstract:
In this paper we present a back-to-back comparison of quantitative phase and refractive index from a microscopic image of a waveguide previously obtained by Allsop et al. The paper also shows a microscopic image of the first three waveguides from the sample. Tomlins et al. have demonstrated the use of femtosecond-fabricated artefacts as OCT calibration samples. Here we present the use of femtosecond waveguides, inscribed with optimized parameters, to test and calibrate the sensitivity of OCT systems.
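For context, quantitative phase microscopy conventionally relates the measured phase delay of an inscribed waveguide to its mean refractive index change via the relation below; this standard formula is an assumption here, since the abstract does not state which relation was used.

```latex
% Standard quantitative-phase relation (an assumption; the abstract does
% not state the formula used): a phase delay \Delta\phi accumulated over
% an inscribed region of depth d at wavelength \lambda implies a mean
% refractive index change of
\[
  \Delta n = \frac{\lambda \, \Delta\phi}{2 \pi d}
\]
```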
Abstract:
The success of the Semantic Web, as the next generation of Web technology, can have a profound impact on the environment for formal software development. It allows both software engineers and machines to understand the content of formal models and supports more effective software design in terms of understanding, sharing and reuse in a distributed manner. To realise the full potential of the Semantic Web in formal software development, effectively creating proper semantic metadata for formal software models and their related software artefacts is crucial. In this paper, a methodology with tool support is proposed to automatically derive ontological metadata from formal software models and semantically describe them.
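As a hedged illustration of deriving ontological metadata from a software model (the paper's own tool and vocabulary are not reproduced here, so the namespace and property names below are hypothetical), one might emit RDF triples for each model element using rdflib:

```python
from rdflib import Graph, Literal, Namespace, RDF

# Hypothetical vocabulary; the paper's actual ontology is not shown here.
FM = Namespace("http://example.org/formal-model#")

def model_to_metadata(model_elements):
    """Derive simple semantic metadata (RDF triples) from a dict mapping
    model element names to their kinds, e.g. {'Account': 'Schema'}."""
    g = Graph()
    g.bind("fm", FM)
    for name, kind in model_elements.items():
        node = FM[name]
        g.add((node, RDF.type, FM[kind]))       # element kind as class
        g.add((node, FM.label, Literal(name)))  # human-readable label
    return g

g = model_to_metadata({"Account": "Schema", "withdraw": "Operation"})
print(g.serialize(format="turtle"))
```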
Abstract:
Models at runtime can be defined as abstract representations of a system, including its structure and behaviour, which exist in tandem with the given system during the actual execution time of that system. Furthermore, these models should be causally connected to the system being modelled, offering a reflective capability. Significant advances have been made in recent years in applying this concept, most notably in adaptive systems. In this paper we argue that a similar approach can also be used to support the dynamic generation of software artefacts at execution time. An important area where this is relevant is the generation of software mediators to tackle the crucial problem of interoperability in distributed systems. We refer to this approach as emergent middleware, representing a fundamentally new approach to resolving interoperability problems in the complex distributed systems of today. In this context, the runtime models are used to capture meta-information about the underlying networked systems that need to interoperate, including their interfaces and additional knowledge about their associated behaviour. This is supplemented by ontological information to enable semantic reasoning. This paper focuses on this novel use of models at runtime, examining in detail the nature of such runtime models coupled with consideration of the supportive algorithms and tools that extract this knowledge and use it to synthesise the appropriate emergent middleware.
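A minimal sketch of a causally connected runtime model, reflectively capturing the interface meta-information that an emergent middleware would reason over, might look like the following. All names are hypothetical; this is not the authors' implementation.

```python
import inspect

# Hypothetical sketch: a runtime model causally connected to a running
# component, capturing the interface meta-information that an emergent
# middleware would reason over. Not the authors' implementation.

class RuntimeModel:
    def __init__(self, system):
        self.system = system    # causal connection via shared reference
        self.interface = {}     # operation name -> parameter names

    def refresh(self):
        """Reflectively extract the component's current interface."""
        self.interface = {
            name: list(inspect.signature(fn).parameters)
            for name, fn in inspect.getmembers(self.system, callable)
            if not name.startswith("_")
        }

class PaymentService:           # toy networked component
    def pay(self, account, amount): ...
    def refund(self, account, amount): ...

model = RuntimeModel(PaymentService())
model.refresh()
print(model.interface)  # {'pay': ['account', 'amount'], 'refund': [...]}
```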
Abstract:
Computational reflection is a well-established technique that gives a program the ability to dynamically observe and possibly modify its behaviour. To date, however, reflection is mainly applied either to the software architecture or its implementation. We know of no approach that fully supports requirements reflection: that is, making requirements available as runtime objects. Although there is a body of literature on requirements monitoring, such work typically generates runtime artefacts from requirements, and so the requirements themselves are not directly accessible at runtime. In this paper, we define requirements reflection and a set of research challenges. Requirements reflection is important because software systems of the future will be self-managing and will need to adapt continuously to changing environmental conditions. We argue requirements reflection can support such self-adaptive systems by making requirements first-class runtime entities, thus endowing software systems with the ability to reason about, understand, explain and modify requirements at runtime. © 2010 ACM.
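As an illustrative sketch only (the paper sets out research challenges rather than an implementation), requirements as first-class runtime entities could be represented as objects whose satisfaction the running system can query and reason about:

```python
# Illustrative sketch of requirements reflection: requirements become
# first-class runtime objects the system can inspect and reason about.
# All names are hypothetical; the paper defines challenges, not code.

class Requirement:
    def __init__(self, rid, text, check):
        self.rid = rid
        self.text = text     # the requirement, inspectable at runtime
        self.check = check   # predicate over observed system state

    def satisfied(self, state):
        return self.check(state)

requirements = [
    Requirement("R1", "response time below 200 ms",
                lambda s: s["latency_ms"] < 200),
    Requirement("R2", "at least two replicas alive",
                lambda s: s["replicas"] >= 2),
]

state = {"latency_ms": 250, "replicas": 3}
for r in requirements:
    if not r.satisfied(state):
        # A self-adaptive system would reason about r.text here.
        print(f"{r.rid} violated: {r.text}")
```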
Abstract:
Modelling architectural information is particularly important because of the acknowledged crucial role of software architecture in raising the level of abstraction during development. In the model-driven engineering (MDE) area, the level of abstraction of models has frequently been related to low-level design concepts. However, model-driven techniques can be further exploited to model software artefacts that take into account the architecture of the system and its changes according to variations of the environment. In this paper, we propose model-driven techniques and dynamic variability as concepts useful for modelling the dynamic fluctuation of the environment and its impact on the architecture. Using the mappings from the models to implementation, generative techniques allow the (semi-)automatic generation of artefacts, making the process more efficient and promoting software reuse. The automatic generation of configurations and reconfigurations from models provides the basis for safer execution. The architectural perspective offered by the models shifts focus away from implementation details to the whole view of the system and its runtime change, promoting high-level analysis. © 2009 Springer Berlin Heidelberg.
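A toy illustration of the generative step described above, deriving a concrete (re)configuration from an architectural model under an observed environment variation, follows; the component and variant names are hypothetical, not the paper's tooling.

```python
# Toy sketch of (semi-)automatic generation of a configuration from an
# architectural model plus an observed environment variation.
# Component and variant names are hypothetical, not the paper's tooling.

ARCHITECTURE_MODEL = {
    "video_encoder": {"low_bandwidth": {"codec": "h264", "bitrate": 500},
                      "high_bandwidth": {"codec": "h264", "bitrate": 4000}},
    "cache": {"low_bandwidth": {"size_mb": 256},
              "high_bandwidth": {"size_mb": 64}},
}

def generate_configuration(model, environment):
    """Map each component to the variant matching the current
    environment, yielding a concrete (re)configuration."""
    return {component: variants[environment]
            for component, variants in model.items()}

print(generate_configuration(ARCHITECTURE_MODEL, "low_bandwidth"))
```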