679 results for WWII artefacts
Abstract:
The fossil arthropod Class Trilobita is characterised by the possession of a highly mineralised dorsal exoskeleton with an incurved marginal flange (doublure). This cuticle is usually the only part of the organism to be preserved. Despite the common occurrence of trilobites in Palaeozoic sediments, the original exoskeletal mineralogy has not been determined previously. Petrographic data involving over seventy trilobite species, ranging in age from Cambrian to Devonian, together with atomic absorption and stable isotope analyses, indicate a primary low-magnesian calcite composition. Trilobite cuticles exhibit a variety of preservational textures which are related to the different diagenetic realms through which they have passed. A greater knowledge of post-depositional processes and the specific features they produce has enabled post-mortem artefacts to be distinguished from primary cuticular microstructures. Alterations of the cuticle can either enhance or destroy primary features, and their effects are best observed in thin-sections, both under transmitted light and cathodoluminescence. Well-preserved trilobites often retain primary microstructures such as laminations, canals, and tubercles. These have been examined in stained thin-sections and by scanning electron microscopy, from as wide a range of trilobites as possible. Construction of sensory field maps has shown that although the basic organisation of the exoskeleton is the same in all trilobites, the types of microstructures found, and their distribution, are species-specific. The composition, microstructure, and architecture of the trilobite exoskeleton have also been studied from a biomechanical viewpoint. Total cuticle thickness and the relative proportions of the different layers, together with the overall architecture, all affected the mechanical properties of the exoskeleton.
Abstract:
Ambulatory EEG recording enables patients with epilepsy and related disorders to be monitored in an unrestricted environment for prolonged periods. Attacks can therefore be recorded, and the EEG changes at the time can aid diagnosis. The relevant literature is reviewed and a study made of 250 clinical investigations. A study was also made of the artefacts encountered during ambulatory recording. Three quarters of referrals were for distinguishing between epileptic and non-epileptic attacks. Over 60% of patients showed no abnormality during attacks. In comparison with the basic EEG, the ambulatory EEG provided about ten times as much information. A preliminary follow-up study showed that the results of ambulatory monitoring agreed with the final diagnosis in 8 of 12 patients studied. Of 10 patients referred for monitoring the occurrence of absence seizures, 8 showed abnormality during the basic EEG and 10 during the ambulatory EEG. Other patients were referred for sleep recording and to clarify the seizure type. An investigation into once-daily (OD) versus twice-daily (BD) administration of sodium valproate in patients with absence seizures showed that an OD regime was equally as effective as a BD regime. Circadian variations in spike-and-wave activity in patients on and off treatment were also examined. There was significant agreement between subjects on the time of occurrence of abnormality during sleep only. This pattern was not affected by treatment, nor was there any difference in the daily pattern of occurrence of abnormality between the two regimes. Overall, the findings suggested that ambulatory monitoring was a valuable tool in the diagnosis and treatment of epilepsy which, with careful planning and patient selection, could be used in any EEG department and would benefit a wide range of patients.
Abstract:
In this paper we present a back-to-back comparison of the quantitative phase and refractive index obtained from a microscopic image of a waveguide, as previously reported by Allsop et al. The paper also shows a microscopic image of the first three waveguides from the sample. Tomlins et al. have demonstrated the use of femtosecond-fabricated artefacts as OCT calibration samples. Here we present the use of femtosecond-inscribed waveguides, written with optimised parameters, to test and calibrate the sensitivity of OCT systems.
Abstract:
The success of the Semantic Web, as the next generation of Web technology, can have a profound impact on the environment for formal software development. It allows both software engineers and machines to understand the content of formal models, and supports more effective software design in terms of understanding, sharing and reuse in a distributed manner. To realise the full potential of the Semantic Web in formal software development, effectively creating proper semantic metadata for formal software models and their related software artefacts is crucial. In this paper, a methodology with tool support is proposed to automatically derive ontological metadata from formal software models and semantically describe them.
Abstract:
Models at runtime can be defined as abstract representations of a system, including its structure and behaviour, which exist in tandem with the given system during the actual execution time of that system. Furthermore, these models should be causally connected to the system being modelled, offering a reflective capability. Significant advances have been made in recent years in applying this concept, most notably in adaptive systems. In this paper we argue that a similar approach can also be used to support the dynamic generation of software artefacts at execution time. An important area where this is relevant is the generation of software mediators to tackle the crucial problem of interoperability in distributed systems. We refer to this approach as emergent middleware, representing a fundamentally new approach to resolving interoperability problems in the complex distributed systems of today. In this context, the runtime models are used to capture meta-information about the underlying networked systems that need to interoperate, including their interfaces and additional knowledge about their associated behaviour. This is supplemented by ontological information to enable semantic reasoning. This paper focuses on this novel use of models at runtime, examining in detail the nature of such runtime models coupled with consideration of the supportive algorithms and tools that extract this knowledge and use it to synthesise the appropriate emergent middleware.
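The mediator-synthesis idea described in this abstract can be illustrated with a minimal sketch. All names here (the two interface dictionaries, the ontology concept strings, the `synthesise_mediator` function) are invented for illustration and are not from the paper: two networked systems expose differently named but semantically equivalent operations, a runtime model records each interface with an ontological annotation, and a mediator mapping is derived by matching annotations rather than operation names.

```python
# Hypothetical runtime models: each system's interface, annotated with an
# ontology concept describing what the operation means.
system_a = {"getTemp": "ontology:ReadTemperature"}
system_b = {"read_temperature": "ontology:ReadTemperature"}

def synthesise_mediator(client_iface, service_iface):
    """Map each client operation to a service operation with the same concept."""
    mapping = {}
    for op, concept in client_iface.items():
        for svc_op, svc_concept in service_iface.items():
            if concept == svc_concept:  # semantic match, not name match
                mapping[op] = svc_op
    return mapping

mediator = synthesise_mediator(system_a, system_b)
print(mediator)  # {'getTemp': 'read_temperature'}
```

A real emergent-middleware system would of course also reason over behavioural descriptions and message formats; this sketch only shows the core idea of matching on shared semantics captured in runtime models.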
Abstract:
Computational reflection is a well-established technique that gives a program the ability to dynamically observe and possibly modify its behaviour. To date, however, reflection has mainly been applied either to the software architecture or to its implementation. We know of no approach that fully supports requirements reflection, that is, making requirements available as runtime objects. Although there is a body of literature on requirements monitoring, such work typically generates runtime artefacts from requirements, so the requirements themselves are not directly accessible at runtime. In this paper, we define requirements reflection and a set of research challenges. Requirements reflection is important because software systems of the future will be self-managing and will need to adapt continuously to changing environmental conditions. We argue that requirements reflection can support such self-adaptive systems by making requirements first-class runtime entities, thus endowing software systems with the ability to reason about, understand, explain and modify requirements at runtime. © 2010 ACM.
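The notion of a requirement as a first-class runtime entity can be sketched very simply. The class name, the requirement text and the latency threshold below are all hypothetical, chosen only to show the shape of the idea: a requirement is an object the running system can query against observed state, rather than a design-time document.

```python
# Sketch of "requirements reflection": a requirement becomes a runtime object
# that the system can inspect, evaluate and (in principle) modify.

class Requirement:
    def __init__(self, name, check):
        self.name = name
        self.check = check  # predicate over observed system state

    def satisfied(self, state):
        return self.check(state)

# A hypothetical response-time requirement, queried at runtime.
resp_time = Requirement("R1: respond within 200 ms",
                        lambda state: state["latency_ms"] <= 200)

print(resp_time.satisfied({"latency_ms": 150}))  # True
print(resp_time.satisfied({"latency_ms": 350}))  # False
```

A self-adaptive system could iterate over such objects after each monitoring cycle and trigger adaptation whenever `satisfied` returns `False`; the paper's research challenges concern doing this systematically, not this toy representation.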
Abstract:
Modelling architectural information is particularly important because of the acknowledged crucial role of software architecture in raising the level of abstraction during development. In the MDE area, the level of abstraction of models has frequently been tied to low-level design concepts. However, model-driven techniques can be further exploited to model software artefacts that take into account the architecture of the system and its changes according to variations of the environment. In this paper, we propose model-driven techniques and dynamic variability as concepts useful for modelling the dynamic fluctuation of the environment and its impact on the architecture. Using the mappings from the models to the implementation, generative techniques allow the (semi-)automatic generation of artefacts, making the process more efficient and promoting software reuse. The automatic generation of configurations and reconfigurations from models provides the basis for safer execution. The architectural perspective offered by the models shifts the focus away from implementation details to the whole view of the system and its runtime change, promoting high-level analysis. © 2009 Springer Berlin Heidelberg.
Abstract:
Textbooks are an integral part of structured syllabus coverage in higher education. The argument advanced in this article is that textbooks are not simply products of inscription and embodied scholarly labour for pedagogical purposes, but embedded institutional artefacts that configure entire academic subject fields. Empirically, this article shows the various ways that motives of the (non-) adoption of textbooks have field institutional configuration effects. The research contribution of our study is threefold. First, we re-theorise the textbook as an artefact that is part of the institutional work and epistemic culture of academia. Second, we empirically show that the vocabularies of motive of textbook (non-) adoption and rhetorical strategies form the basis for social action and configuration across micro, meso and macro field levels. Our final contribution is a conceptualization of the ways that textbook (non-) adoption motives ascribe meaning to the legitimating processes in the configuration of whole subject fields.
Abstract:
This chapter introduces activity theory as an approach for studying strategy as practice. Activity theory conceptualizes the ongoing construction of activity as a product of activity systems, comprising the actor, the community with which that actor interacts and those symbolic and material tools that mediate between actors, their community and their pursuit of activity. The focus on the mediating role of tools and cultural artefacts in human activity seems especially promising for advancing the strategy-as-practice agenda, for example as a theoretical resource for the growing interest in sociomateriality and the role of tools and artefacts in (strategy) practice (for example, Balogun et al. 2014; Lanzara 2009; Nicolini 2009; Spee and Jarzabkowski 2009; Stetsenko 2005). Despite its potential, in a recent review Vaara and Whittington (2012) identified only three strategy-as-practice articles explicitly applying an activity theory lens. In the wider area of practice-based studies in organizations, activity theory has been slightly more popular (for example, Blackler 1993; 1995; Blackler, Crump and McDonald 2000; Engeström, Kerosuo and Kajamaa 2007; Groleau 2006; Holt 2008; Miettinen and Virkkunen 2005). It still lags behind its potential, however, primarily because of its origins as a social psychology theory developed in Russia with little initial recognition outside the Russian context, particularly in the area of strategy and organization theory, until recently (Miettinen, Samra-Fredericks and Yanow 2009). This chapter explores activity theory as a resource for studying strategy as practice as it is socially accomplished by individuals in interaction with their wider social group and the artefacts of interaction. In particular, activity theory’s focus on actors as social individuals provides a conceptual basis for studying the core question in strategy-as-practice research: what strategy practitioners do. The chapter is structured in three parts. 
First, an overview of activity theory is provided. Second, activity theory as a practice-based approach to studying organizational action is introduced and an activity system conceptual framework is developed. Third, the elements of the activity system are explained in more detail and explicitly linked to each of the core SAP concepts: practitioners, practices and praxis. In doing so, links are made to existing strategy-as-practice research, with brief empirical examples of topics that might be addressed using activity theory. Throughout the chapter, we introduce key authors in the development of activity theory and its use in management and adjacent disciplinary fields, as further resources for those wishing to make greater use of activity theory.
Abstract:
As optical coherence tomography (OCT) becomes widespread, validation and characterization of systems become important. Reference standards are required to qualitatively and quantitatively compare the performance of different systems. This would also allow the performance degradation of a system over time to be monitored. In this report, the properties of femtosecond-inscribed structures from three different systems are analyzed for their suitability as OCT characterization artefacts (phantoms). The parameter test samples are directly inscribed inside transparent materials. The structures are characterized using an optical microscope and a swept-source OCT system. The high reproducibility of the inscribed structures shows high potential for producing multi-modality OCT calibration and characterization phantoms, such that a single artefact can be used to characterize multiple performance parameters such as resolution, linearity, distortion, and imaging depth. © 2012 SPIE.
Abstract:
Many software engineers have found it difficult to understand, incorporate and use different formal models consistently in the process of software development, especially for large and complex software systems. This is mainly due to the complex mathematical nature of formal methods and the lack of tool support. It is highly desirable to have software models and their related software artefacts systematically connected and used collaboratively, rather than in isolation. The success of the Semantic Web, as the next generation of Web technology, can have a profound impact on the environment for formal software development. It allows both software engineers and machines to understand the content of formal models, and supports more effective software design in terms of understanding, sharing and reuse in a distributed manner. To realise the full potential of the Semantic Web in formal software development, effectively creating proper semantic metadata for formal software models and their related software artefacts is crucial. This paper proposes a framework that allows users to interconnect knowledge about formal software models and other related documents using semantic technology. We first propose a methodology with tool support to automatically derive ontological metadata from formal software models and semantically describe them. We then develop a Semantic Web environment for representing and sharing formal Z/OZ models. Finally, a method with a prototype tool is presented to enhance semantic queries over software models and other artefacts. © 2014.
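Deriving ontological metadata from a formal model can be pictured as emitting RDF-style (subject, predicate, object) triples for each model element. The sketch below is purely illustrative: the dictionary shape, the `fm:` vocabulary and the `BirthdayBook` example schema are assumptions for this sketch, not the paper's actual metamodel or tooling for Z/OZ models.

```python
# Toy derivation of (subject, predicate, object) triples from a formal-model
# schema represented as a plain dictionary.
model = {
    "name": "BirthdayBook",
    "state": ["known", "birthday"],
    "operations": ["AddBirthday", "FindBirthday"],
}

def derive_triples(schema):
    """Emit one triple per schema fact, using an invented 'fm:' vocabulary."""
    subject = schema["name"]
    triples = [(subject, "rdf:type", "fm:Schema")]
    for var in schema["state"]:
        triples.append((subject, "fm:hasStateVariable", var))
    for op in schema["operations"]:
        triples.append((subject, "fm:hasOperation", op))
    return triples

for t in derive_triples(model):
    print(t)
```

Once model content is exposed as triples like these, generic Semantic Web machinery (triple stores, SPARQL-style queries) can search and link formal models alongside other development artefacts, which is the kind of semantic query the abstract refers to.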
Abstract:
Sucrose is used as a cryo-preservation agent on large mammalian eyes after formalin fixation and is shown to reduce freezing artefacts, allowing the collection of 12-μm-thick sections from these large aqueous samples. The suitability of this technique for use in MALDI imaging experiments is demonstrated by the acquisition of the first images of lipid distributions within whole sagittal porcine eye sections. © 2012 John Wiley & Sons, Ltd.
Abstract:
The current research activities of the Institute of Mathematics and Informatics at the Bulgarian Academy of Sciences (IMI-BAS) include the study and application of knowledge-based methods for the creation, integration and development of multimedia digital libraries with applications in cultural heritage. This report presents IMI-BAS's developments in digital library management systems and portals, i.e. the Bulgarian Iconographical Digital Library, the Bulgarian Folklore Digital Library and the Bulgarian Folklore Artery, developed during several national and international projects:
- "Digital Libraries with Multimedia Content and its Application in Bulgarian Cultural Heritage" (contract 8/21.07.2005 between the IMI-BAS and the State Agency for Information Technologies and Communications);
- FP6/IST/P-027451 project LOGOS "Knowledge-on-Demand for Ubiquitous Learning", EU FP6, IST, Priority 2.4.13 "Strengthening the Integration of the ICT research effort in an Enlarged Europe";
- NSF project D-002-189 SINUS "Semantic Technologies for Web Services and Technology Enhanced Learning";
- NSF project IO-03-03/2006 "Development of Digital Libraries and Information Portal with Virtual Exposition 'Bulgarian Folklore Heritage'".
The presented prototypes aim to provide flexible and effective access to multimedia presentations of cultural heritage artefacts and collections, maintaining different forms and formats of the digitised information content and rich functionality for interaction. The developments are the result of long-standing interests and work in technological developments in information systems, knowledge processing and content management systems.
The current research activities aim at creating innovative solutions for assembling multimedia digital libraries for collaborative use in a specific cultural heritage context, maintaining their semantic interoperability and creating new services for dynamic aggregation of their resources, access improvement, personalisation, intelligent curation of content, and content protection. The investigations are directed towards the development of distributed tools for aggregating heterogeneous content and ensuring semantic compatibility with the European digital library EUROPEANA, thus providing pan-European access to rich digitised collections of Bulgarian cultural heritage.
Abstract:
This thesis examines the influence of non-state actors on Polish-German relations by considering foreign policy-making towards Poland in Germany and vice versa. The approach chosen for this thesis is interdisciplinary and takes into consideration literature from domestic politics (Area Studies), Foreign Policy Analysis and International Relations (IR). The thesis argues that IR, by purely looking into the quality of inter-state relations, too often treats these relations as a result of policies emanating from the relevant governments, without considering the policies' background. Therefore, the thesis argues that it is necessary to engage with the domestic factors which might explain where foreign policies come from. It points out that non-state actors influence governments' choices by supplying resources, and by cooperating or competing with the government on an issue at stake. In order to determine the degree of influence that non-state actors can have on foreign policy-making, two variables are examined: the institutionalisation of the state relations in question; and the domestic structures of the relevant states. Specifically, the thesis examines the institutionalisation of Polish-German relations, and examines Germany's and Poland's domestic structures and their effect on the two states' foreign policy-making in general. Thereafter, the thesis uses case studies in order to unravel the influence of non-state actors on specific foreign policies. Three case studies are examined in detail: (i) Poland's EU accession negotiations with regard to the free movement of capital chapter of the acquis communautaire; (ii) Germany's EU 2004 Eastern Enlargement negotiations with regard to the free movement of workers chapter of the acquis communautaire; and (iii) Germany's decision to establish a permanent exhibition in Berlin that will depict the expulsions of millions of Germans from the East following WWII.