62 results for Tools and techniques


Relevance: 90.00%

Abstract:

Re-introduction is a technique widely used in the conservation of threatened bird species. With advances in aviculture, the use of captive-produced individuals as the release stock is becoming more commonplace; ideally, the survival of captive-produced, released individuals should be no different from that of their wild-bred counterparts. During the late 1980s the Critically Endangered Mauritius kestrel (Falco punctatus) was successfully re-introduced into the Bambous mountain range, Mauritius, some 30 years after its local extinction. Between 1987 and 2001 the developing population was closely monitored, enabling us to construct re-sighting histories for 88 released and 284 wild-bred kestrels. We used age-structured models in the survival analysis software program MARK to determine whether an individual's origin influenced its subsequent survival. Our analysis indicated no compelling evidence for reduced survival among juvenile captive-reared and released individuals, relative to their wild-bred counterparts, across the majority of cohorts, and only limited evidence of a cohort-specific effect. This study illustrates that, despite the lack of a formal experimental approach, it is still feasible to assess re-introduction outcomes and techniques.
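The abstract names the analysis but not its machinery. As a hedged sketch, the snippet below fits a minimal Cormack-Jolly-Seber model, the family to which MARK's survival models belong, with constant survival (phi) and re-sighting (p) probabilities estimated by maximum likelihood. The capture histories are invented, and MARK's actual age-structured, origin-specific parameterisations are not reproduced.

```python
# Minimal Cormack-Jolly-Seber (CJS) fit: constant survival (phi) and
# re-sighting probability (p). Illustrative data only, not the actual
# kestrel histories or MARK's age-structured models.
import numpy as np
from scipy.optimize import minimize

# Each row is one bird's re-sighting history over annual occasions (1 = seen).
histories = np.array([
    [1, 1, 0, 1, 0],
    [1, 0, 1, 1, 1],
    [1, 1, 1, 0, 0],
    [1, 0, 0, 0, 0],
], dtype=int)

def neg_log_lik(params, h):
    phi, p = 1 / (1 + np.exp(-params))           # logit -> probability
    n_occ = h.shape[1]
    ll = 0.0
    for row in h:
        first = np.argmax(row)                   # first capture occasion
        last = n_occ - 1 - np.argmax(row[::-1])  # last sighting occasion
        # Survive each interval; seen or missed at each later occasion.
        for t in range(first, last):
            ll += np.log(phi)
            ll += np.log(p) if row[t + 1] else np.log(1 - p)
        # chi: probability of never being seen again after `last`.
        chi = 1.0
        for _ in range(n_occ - 1 - last):
            chi = (1 - phi) + phi * (1 - p) * chi
        ll += np.log(chi)
    return -ll

fit = minimize(neg_log_lik, x0=[0.0, 0.0], args=(histories,),
               method="Nelder-Mead")
phi_hat, p_hat = 1 / (1 + np.exp(-fit.x))
print(f"phi = {phi_hat:.3f}, p = {p_hat:.3f}")
```

Testing an origin effect would then amount to giving released and wild-bred birds separate phi parameters and comparing the competing models, e.g. by AIC.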

Relevance: 90.00%

Abstract:

This document provides guidelines for fish stock assessment and fishery management using the software tools and other outputs developed by the United Kingdom's Department for International Development's Fisheries Management Science Programme (FMSP) from 1992 to 2004. It explains some key elements of the precautionary approach to fisheries management and outlines a range of alternative stock assessment approaches that can provide the information needed for such precautionary management. Four FMSP software tools, with which intermediate parameters, performance indicators and reference points may be estimated, are described: LFDA (Length Frequency Data Analysis), CEDA (Catch Effort Data Analysis), YIELD and ParFish (Participatory Fisheries Stock Assessment). The document also contains examples of the assessment and management of multispecies fisheries, the use of Bayesian methodologies, the use of empirical modelling approaches for estimating yields and for analysing fishery systems, and the assessment and management of inland fisheries. It also provides a comparison of length- and age-based stock assessment methods. A CD-ROM with the FMSP software packages CEDA, LFDA, YIELD and ParFish is included.
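As a rough sketch of what a catch-effort assessment tool such as CEDA does conceptually, the snippet below fits a Schaefer biomass-dynamics model to an invented catch/CPUE series and derives the MSY reference point (rK/4 for the Schaefer model). It is illustrative only and does not reproduce any FMSP package.

```python
# Illustrative Schaefer biomass-dynamics fit to catch-effort data,
# in the spirit of CEDA; the series below is invented.
import numpy as np
from scipy.optimize import minimize

catch = np.array([90, 110, 120, 100, 80, 70, 60, 65, 75, 80.0])
cpue  = np.array([1.00, 0.95, 0.85, 0.70, 0.62, 0.58, 0.60, 0.63, 0.66, 0.68])

def predicted_cpue(params):
    r, K, q = np.exp(params)       # log-parameterised to keep them positive
    B = K                          # assume unfished biomass at the start
    pred = []
    for C in catch:
        pred.append(q * B)                            # CPUE proxies biomass
        B = max(B + r * B * (1 - B / K) - C, 1e-6)    # Schaefer dynamics
    return np.array(pred)

def sse(params):
    return np.sum((np.log(cpue) - np.log(predicted_cpue(params))) ** 2)

fit = minimize(sse, x0=np.log([0.3, 2000.0, 0.0005]), method="Nelder-Mead")
r, K, q = np.exp(fit.x)
print(f"r={r:.3f}  K={K:.0f}  MSY={r * K / 4:.1f}")   # Schaefer MSY = rK/4
```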

Relevance: 90.00%

Abstract:

The human gut microbiota comprises a diverse microbial consortium closely co-evolved with the human genome and diet. The importance of the gut microbiota in regulating human health and disease has, however, been largely overlooked owing to the inaccessibility of the intestinal habitat, the complexity of the gut microbiota itself, and the fact that many of its members resist cultivation and are in fact new to science. With the emergence of 16S rRNA molecular tools and "post-genomics" high-resolution technologies for examining microorganisms as they occur in nature, without the need for prior laboratory culture, this limited view of the gut microbiota is rapidly changing. This review discusses the application of molecular microbiological tools to study the human gut microbiota in a culture-independent manner. Genomics or metagenomics approaches have a tremendous capability to generate compositional data and to measure the metabolic potential encoded by the combined genomes of the gut microbiota. Another post-genomics approach, metabonomics, has the capacity to measure the metabolic kinetics, or flux of metabolites, through an ecosystem at a particular point in time or over a time course. Metabonomics thus derives data on the function of the gut microbiota in situ and on how it responds to different environmental stimuli, e.g. substrates such as prebiotics, antibiotics and other drugs, and to disease. Recently these two culture-independent, high-resolution approaches have been combined into a single "transgenomic" approach, which allows changes in metabolite profiles within human biofluids to be correlated with metagenomic data on microbiota composition. Such approaches are providing novel insight into the composition, function and evolution of our gut microbiota.
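A minimal sketch of the "transgenomic" correlation step, assuming the simplest possible realisation: Spearman rank correlations between taxon relative abundances (metagenomic compositional data) and metabolite levels (metabonomic data). The data are random placeholders with one planted association.

```python
# Toy "transgenomic" correlation: Spearman correlations between taxon
# relative abundances (metagenomics) and metabolite levels (metabonomics).
# Data are random placeholders, not real profiles.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_samples = 30
taxa = rng.dirichlet(np.ones(5), size=n_samples)     # 5 taxa, rows sum to 1
metabolites = rng.lognormal(size=(n_samples, 3))     # 3 biofluid metabolites
metabolites[:, 0] += 2 * taxa[:, 0]                  # plant a real association

for i in range(taxa.shape[1]):
    for j in range(metabolites.shape[1]):
        rho, pval = spearmanr(taxa[:, i], metabolites[:, j])
        if pval < 0.05:          # in practice, correct for multiple testing
            print(f"taxon {i} ~ metabolite {j}: rho={rho:+.2f}, p={pval:.3f}")
```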

Relevance: 90.00%

Abstract:

The capability of a feature model of immediate memory (Nairne, 1990; Neath, 2000) to predict and account for the relationship between absolute and proportion scoring of immediate serial recall when memory load is varied (the list-length effect, LLE) is examined. The model correctly predicts the novel finding of an LLE in immediate serial-order memory similar to that observed with free recall and previously assumed to be attributable to the long-term memory component of that procedure (Glanzer, 1972). The usefulness of formal models as predictive tools, and the continuity between short-term serial-order memory and longer-term item memory, are considered.
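A toy simulation in the spirit of the feature-model account (not Nairne's actual parameterisation): items are binary feature vectors, traces are degraded by random overwriting, and "recall" selects the candidate most similar to each degraded trace. With more list items competing, the proportion recalled falls even as the absolute number recalled rises, which is the LLE pattern at issue.

```python
# Toy simulation in the spirit of the feature model: items are binary
# feature vectors, traces are degraded, and recall picks the candidate
# item most similar to each degraded trace. Not Nairne's parameterisation.
import numpy as np

rng = np.random.default_rng(1)
N_FEATURES, P_OVERWRITE, N_RUNS = 20, 0.35, 2000

def run_trial(list_length):
    items = rng.integers(0, 2, size=(list_length, N_FEATURES))
    # Degrade each trace: overwrite features at random with noise.
    mask = rng.random(items.shape) < P_OVERWRITE
    traces = np.where(mask, rng.integers(0, 2, size=items.shape), items)
    correct = 0
    for pos, trace in enumerate(traces):
        match = (items == trace).mean(axis=1)  # feature overlap per candidate
        if np.argmax(match) == pos:
            correct += 1
    return correct

for L in (2, 4, 6, 8):
    recalled = np.mean([run_trial(L) for _ in range(N_RUNS)])
    print(f"list length {L}: {recalled:.2f} recalled ({recalled / L:.0%})")
```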

Relevance: 90.00%

Abstract:

Increasingly, distributed systems are being used to host all manner of applications. While these platforms provide a relatively cheap and effective means of executing applications, so far there has been little work on developing tools and utilities that can help application developers understand problems with the supporting software or with the executing applications. To fully understand why an application executing on a distributed system is not behaving as expected, it is important to analyse not only the application but also the underlying middleware and the operating system; otherwise, issues could be missed, and overall performance profiling and fault diagnosis would certainly be harder. We believe that one approach to profiling and analysing distributed systems and their associated applications is via the plethora of log files generated at runtime. In this paper we report on a system, Slogger, that utilises various emerging Semantic Web technologies to gather the heterogeneous log files generated by the various layers in a distributed system and unify them in a common data store. Once unified, the log data can be queried and visualised in order to highlight potential problems or issues that may be occurring in the supporting software or the application itself.
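Slogger's schema is not given in the abstract; as a hedged sketch of the general approach, the snippet below uses Python's rdflib to turn log entries from different layers into RDF triples under a made-up vocabulary and to query them with SPARQL. Every namespace and predicate name here is hypothetical, not Slogger's actual schema.

```python
# Sketch of unifying heterogeneous log lines as RDF and querying with
# SPARQL; the vocabulary is invented for illustration.
from rdflib import Graph, Literal, Namespace, RDF

LOG = Namespace("http://example.org/log#")   # hypothetical namespace
g = Graph()

def add_entry(entry_id, layer, level, message, timestamp):
    node = LOG[entry_id]
    g.add((node, RDF.type, LOG.LogEntry))
    g.add((node, LOG.layer, Literal(layer)))      # app / middleware / OS
    g.add((node, LOG.level, Literal(level)))
    g.add((node, LOG.message, Literal(message)))
    g.add((node, LOG.timestamp, Literal(timestamp)))

# Entries originating from different layers, now in one store.
add_entry("e1", "middleware", "ERROR", "broker timeout", "2009-03-01T10:02:11")
add_entry("e2", "application", "WARN", "retrying job 42", "2009-03-01T10:02:12")

# Find all errors regardless of which layer produced them.
q = """SELECT ?m ?t WHERE {
         ?e a log:LogEntry ; log:level "ERROR" ;
            log:message ?m ; log:timestamp ?t . }"""
for msg, ts in g.query(q, initNs={"log": LOG}):
    print(ts, msg)
```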

Relevance: 90.00%

Abstract:

A quasi-optical de-embedding technique for characterizing waveguides is demonstrated using wide-band time-resolved terahertz spectroscopy. A transfer-function representation is adopted for the description of the signal at the input and output ports of the waveguides. The time-domain responses were discretized, and the waveguide transfer function was obtained through a parametric approach in the z-domain after describing the system with an AutoRegressive with eXogenous input (ARX) model, as well as with a state-space model. Prior to the identification procedure, filtering was performed in the wavelet domain to minimize both signal distortion and the noise propagating in the ARX and subspace models. The optimal filtering procedure used in the wavelet domain for the recorded time-domain signatures is described in detail. The effect of filtering prior to the identification procedures is elucidated with the aid of pole-zero diagrams. Models derived from measurements of terahertz transients in a precision WR-8 waveguide adjustable short are presented.
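As a hedged illustration of the parametric z-domain identification step, the snippet below estimates a low-order ARX model by least squares on synthetic input/output data; the paper's model orders, terahertz measurements and wavelet pre-filtering are not reproduced.

```python
# Least-squares fit of an ARX(2,2) model:
#   y[k] = -a1*y[k-1] - a2*y[k-2] + b1*u[k-1] + b2*u[k-2] + e[k]
# Illustrates the parametric z-domain identification step only.
import numpy as np

rng = np.random.default_rng(2)
n = 500
u = rng.standard_normal(n)                    # input: excitation signal
y = np.zeros(n)
for k in range(2, n):                         # synthetic "true" system
    y[k] = 1.2 * y[k-1] - 0.5 * y[k-2] + 0.3 * u[k-1] + 0.1 * u[k-2]
y += 0.01 * rng.standard_normal(n)            # measurement noise

# Build the regression matrix from lagged outputs and inputs.
Phi = np.column_stack([-y[1:n-1], -y[0:n-2], u[1:n-1], u[0:n-2]])
theta, *_ = np.linalg.lstsq(Phi, y[2:n], rcond=None)
a1, a2, b1, b2 = theta
print(f"a1={a1:.3f} a2={a2:.3f} b1={b1:.3f} b2={b2:.3f}")
# Poles of the identified transfer function in the z-domain,
# the quantities inspected on the pole-zero diagrams:
print("poles:", np.roots([1, a1, a2]))
```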

Relevance: 90.00%

Abstract:

Eye gaze is seen as critical for efficient collaboration between participants. Video conferencing either does not attempt to support eye gaze (e.g. AccessGrid) or only approximates it in round-table conditions (e.g. life-size telepresence). Immersive collaborative virtual environments represent remote participants through avatars that follow their tracked movements. By additionally tracking people's eyes and representing their movement on their avatars, the line of gaze can be faithfully reproduced rather than approximated. This paper presents the results of initial work that tested whether the focus of gaze could be more accurately gauged when tracked eye movement was added to the head movement of an avatar observed in an immersive virtual environment. An experiment was conducted to assess the difference between users' ability to judge what objects an avatar is looking at with only head movements displayed, the eyes remaining static, and with both eye gaze and head movement information displayed. The results show that eye gaze is of vital importance to subjects' correctly identifying what a person is looking at in an immersive virtual environment. This is followed by a description of the work now being undertaken following these positive results. We discuss the integration of an eye tracker more suitable for immersive mobile use, and the software and techniques developed to convert the user's real-world eye movements into calibrated eye gaze in an immersive virtual world. This is to be used in the creation of an immersive collaborative virtual environment supporting eye gaze and in its ongoing experiments.
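A minimal sketch of the geometric step this work implies: composing a tracked head orientation with a tracked in-head eye rotation to obtain a world-space gaze ray for the avatar. The angles and conventions are placeholders, not the paper's tracker integration.

```python
# Composing tracked head orientation with tracked eye rotation to obtain
# a world-space gaze ray. Angles are made-up illustrative values.
import numpy as np

def yaw_pitch_matrix(yaw, pitch):
    """Rotation matrix for yaw about +y then pitch about +x (radians)."""
    cy, sy, cp, sp = np.cos(yaw), np.sin(yaw), np.cos(pitch), np.sin(pitch)
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    return Ry @ Rx

head = yaw_pitch_matrix(np.radians(30), np.radians(-10))   # from head tracker
eye = yaw_pitch_matrix(np.radians(-5), np.radians(3))      # from eye tracker
forward = np.array([0.0, 0.0, -1.0])                       # camera convention

# Head-only gaze (eyes static) vs. head + eye gaze:
gaze_head_only = head @ forward
gaze_full = head @ eye @ forward
angle = np.degrees(np.arccos(np.clip(gaze_head_only @ gaze_full, -1, 1)))
print(f"eye rotation shifts the gaze ray by {angle:.1f} degrees")
```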

Relevance: 90.00%

Abstract:

Identity issues are under-explored in construction management. We provide a brief introduction to the organization-studies literature on subjectively construed identities, focusing on discourse, agency, relations of power and identity work. The construction management literature is examined for identity concerns as they relate to construction managers, centred on (1) professionalism; (2) ethics; (3) relational aspects of self-identity; (4) competence, knowledge and tools; and (5) national culture. Identity, we argue, is a key performance issue and needs to be accounted for in explanations of the success and failure of projects. Our overriding concern is to raise identity issues in order to demonstrate their importance to researchers in construction management and to spark debate. The purpose of this work is not to provide answers or to propose prescriptive models, but to explore ideas, raise awareness and generate questions for further programmatic research. To this end, we promote empirical work and theorizing by outlining elements of a research agenda which argues that 'identity' is a potentially generative theme for scholars in construction management.

Relevance: 90.00%

Abstract:

There are a number of challenges associated with managing knowledge and information in construction organizations delivering major capital assets. These include ever-increasing volumes of information, the loss of people to retirement or to competitors, the continuously changing nature of information, the lack of methods for eliciting useful knowledge, the development of new information technologies, and changes in management and innovation practices. Existing tools and methodologies for valuing intangible assets in fields such as engineering, project management, finance and accounting do not fully address the issues associated with the valuation of information and knowledge. Information is rarely recorded in a way that allows a document to be valued, either when produced or when subsequently retrieved and re-used. In addition, there is a wealth of tacit personal knowledge which, if codified into documentary information, may prove very valuable to operators of the finished asset or to future designers. This paper addresses the problem of information overload and identifies the differences between data, information and knowledge. An exploratory study was conducted with a leading construction consultant, examining three perspectives (business, project management and document management) through structured interviews, and specifically how to value information in practical terms. Major challenges in information management are identified. A through-life Information Evaluation Methodology (IEM) is presented to reduce information overload and to make information more valuable in the future.

Relevance: 90.00%

Abstract:

The fabrication and characterization of micromachined reduced-height air-filled rectangular waveguide components suitable for integration are reported in this paper. The lithographic technique used permits structures with heights of up to 100 μm to be constructed in a repeatable manner. Waveguide S-parameter measurements at frequencies between 75 and 110 GHz using a vector network analyzer demonstrate low-loss propagation in the TE10 mode, reaching 0.2 dB per wavelength. Scanning electron microscope photographs of conventional and micromachined waveguides show that the fabrication technique can provide a superior surface finish to that possible with commercially available components. In order to circumvent problems in efficiently coupling free-space propagating beams to the reduced-height G-band waveguides, as well as to characterize them using quasi-optical techniques, a novel integrated micromachined slotted horn antenna has been designed and fabricated. E-, H-, and D-plane far-field antenna pattern measurements at different frequencies using a quasi-optical setup show that the fabricated structures are optimized for 180-GHz operation, with an E-plane half-power beamwidth of 32° elevated 35° above the substrate, a symmetrical H-plane pattern with a half-power beamwidth of 23°, and a maximum D-plane cross-polar level of -33 dB. Far-field pattern simulations using HFSS show good agreement with experimental results.
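For context, the standard TE10 rectangular-waveguide formulas can be evaluated directly. The snippet below uses an illustrative WR-10-like broad-wall width to match the 75-110 GHz measurement band; the micromachined guides' actual dimensions are not taken from the paper.

```python
# Standard TE10 rectangular-waveguide formulas, evaluated for an
# illustrative broad-wall width (WR-10-like, matching the 75-110 GHz
# measurement band); not the paper's exact dimensions.
import numpy as np

c = 299_792_458.0           # speed of light, m/s
a = 2.54e-3                 # broad-wall width, m (illustrative)
f = 100e9                   # operating frequency, Hz

f_c = c / (2 * a)                                   # TE10 cutoff frequency
lam0 = c / f                                        # free-space wavelength
lam_g = lam0 / np.sqrt(1 - (f_c / f) ** 2)          # guide wavelength
print(f"cutoff {f_c/1e9:.1f} GHz, guide wavelength {lam_g*1e3:.2f} mm")
# A quoted loss of 0.2 dB per wavelength then corresponds to:
print(f"{0.2 / (lam_g * 1e3):.3f} dB/mm at {f/1e9:.0f} GHz")
```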

Relevance: 90.00%

Abstract:

As digital technologies become widely used in designing buildings and infrastructure, questions arise about their impacts on construction safety. This review explores relationships between construction safety and digital design practices with the aim of fostering and directing further research. It surveys state-of-the-art research on databases, virtual reality, geographic information systems, 4D CAD, building information modelling and sensing technologies, finding various digital tools for addressing safety issues in the construction phase but few tools to support design for construction safety. It also considers a literature on safety-critical, digital and design practices that raises a general concern about 'mindlessness' in the use of technologies, and that has implications for the emerging research agenda around construction safety and digital design. Bringing these strands of literature together suggests new kinds of interventions, such as the development of tools and processes for using digital models to promote mindfulness through multi-party collaboration on safety.

Relevance: 90.00%

Abstract:

Models play a vital role in supporting a range of activities in numerous domains. We rely on models to support the design, visualisation, analysis and representation of parts of the world around us, and significant research effort has accordingly been invested in numerous areas of modelling, including support for model semantics, dynamic states and behaviour, and temporal data storage and visualisation. Whilst these efforts have increased our capabilities and allowed us to create increasingly powerful software-based models, the process of developing models, supporting tools and/or data structures remains difficult, expensive and error-prone. In this paper we define, from the literature, the key factors in assessing a model's quality and usefulness: semantic richness, support for dynamic states and object behaviour, and temporal data storage and visualisation. We also identify a number of shortcomings in both existing modelling standards and model development processes, and propose a unified generic process to guide users through the development of semantically rich, dynamic and temporal models.
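As a minimal sketch of one ingredient the paper calls for, temporal data storage, the snippet below gives a model object a time-stamped state history that can be queried "as of" any moment. The design is entirely illustrative and is not the proposed unified process.

```python
# Minimal temporal data storage for model objects: each object keeps a
# time-stamped state history and can be queried "as of" any moment.
# Illustrative only; states are assumed recorded in chronological order.
from bisect import bisect_right
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class TemporalObject:
    name: str
    _times: list = field(default_factory=list)    # sorted timestamps
    _states: list = field(default_factory=list)   # parallel state dicts

    def set_state(self, when: datetime, **state):
        self._times.append(when)
        self._states.append(state)

    def state_at(self, when: datetime):
        """Return the state in force at `when`, or None before any record."""
        i = bisect_right(self._times, when)
        return self._states[i - 1] if i else None

door = TemporalObject("door-17")
door.set_state(datetime(2012, 1, 1), status="designed")
door.set_state(datetime(2012, 6, 1), status="installed", fire_rating="FD30")
print(door.state_at(datetime(2012, 3, 15)))   # {'status': 'designed'}
```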