836 results for location-based media
Abstract:
Norms constitute a powerful coordination mechanism among heterogeneous agents. In this paper, we propose a rule language to specify and explicitly manage the normative positions of agents (permissions, prohibitions and obligations), with which distinct deontic notions and their relationships can be captured. Our rule-based formalism includes constraints for greater expressiveness and precision, and allows electronic institutions to be supplemented (and implemented) with norms. We also show how some normative aspects are given a computational interpretation. © 2008 Springer Science+Business Media, LLC.
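The normative positions described above can be pictured as data plus a checking rule. The following is a minimal sketch, not the paper's formalism: the `Norm` class, its fields, and the example agents and actions are all illustrative names, and the constraint is reduced to a simple predicate over a state dictionary.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Set, Tuple

@dataclass(frozen=True)
class Norm:
    """One normative position over an action (illustrative encoding)."""
    modality: str                       # "permission" | "prohibition" | "obligation"
    agent: str                          # addressee of the norm
    action: str                         # regulated action
    condition: Callable[[Dict], bool] = lambda state: True  # activating constraint

def violations(norms: List[Norm], state: Dict,
               performed: Set[Tuple[str, str]]) -> List[Norm]:
    """Return breached prohibitions and unmet obligations.

    `performed` holds (agent, action) pairs observed in this state.
    Permissions never generate violations, so they are skipped.
    """
    out = []
    for n in norms:
        if not n.condition(state):
            continue  # norm not active under the current constraints
        act = (n.agent, n.action)
        if n.modality == "prohibition" and act in performed:
            out.append(n)
        elif n.modality == "obligation" and act not in performed:
            out.append(n)
    return out
```

For example, a prohibition on bidding after an auction closes and an unfulfilled obligation to pay would both be reported as violations when checked against the observed actions.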
Abstract:
This paper describes a data model for content representation of temporal media in an IP-based sensor network. The model is formed by introducing the idea of semantic roles from linguistics into the underlying concepts of formal event representation, with the aim of developing a common event model. The architecture of a prototype system for a multi-camera surveillance system based on the proposed model is described. The important aspects of the proposed model are its expressiveness, its ability to model the content of temporal media, and its suitability for use with a natural language interface. It also provides a platform for temporal information fusion, as well as for organizing sensor annotations with the help of ontologies.
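A semantic-role event record of the kind described could be sketched as a small data class. This is an assumption-laden illustration, not the paper's model: the role names (`agent`, `patient`, `location`), the time representation, and the rendering method are all invented for the example.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Event:
    """An annotated event using linguistic semantic roles (illustrative)."""
    action: str                         # predicate, e.g. "enters"
    agent: Optional[str] = None         # who performs the action
    patient: Optional[str] = None       # what the action affects
    location: Optional[str] = None      # where it happens
    start: Optional[float] = None       # offset (s) into the media segment
    end: Optional[float] = None
    source: str = ""                    # originating sensor/camera id

    def to_sentence(self) -> str:
        # A natural-language surface form, the kind a language
        # interface could match queries against.
        parts = [self.agent or "someone", self.action]
        if self.patient:
            parts.append(self.patient)
        if self.location:
            parts.append(f"at {self.location}")
        return " ".join(parts)
```

A camera annotation such as `Event(action="enters", agent="person-7", location="lobby", source="cam-2")` then renders as "person-7 enters at lobby", which suggests how the same structure can serve both fusion and a natural language interface.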
Abstract:
This article explores the different ways that film-makers and historians approach the narrating of the past. It draws upon a collaborative, practice-based case study of a feature film project, The enigma of Frank Ryan, in order to explore the role of the history film as a vehicle for extending historical understanding. In the dialogue between film-maker and historian, a range of issues regarding the import of the history film for the practice or 'poetics' of history is explored.
Abstract:
We analyze the effect of different pulse shaping filters on orthogonal frequency division multiplexing (OFDM) based wireless local area network (LAN) systems in this paper. In particular, the performance of square-root raised cosine (RRC) pulses with different rolloff factors is evaluated and compared. This work provides some guidance on how to choose RRC pulses in practical WLAN systems, e.g., the selection of rolloff factor, truncation length, oversampling rate, quantization levels, etc.
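The design parameters the abstract lists (rolloff factor, truncation length, oversampling rate) map directly onto the construction of a discrete RRC pulse. The sketch below generates a truncated, unit-energy RRC pulse from the standard closed-form impulse response, with the two singular points handled explicitly; it is a generic textbook construction, not the filter used in the paper.

```python
import numpy as np

def rrc_pulse(beta: float, span: int, sps: int) -> np.ndarray:
    """Truncated square-root raised cosine pulse.

    beta : rolloff factor (0 < beta <= 1)
    span : truncation length in symbol periods (total window)
    sps  : oversampling rate (samples per symbol)
    """
    T = 1.0  # symbol period (normalised)
    t = np.arange(-(span * sps) // 2, (span * sps) // 2 + 1) / sps
    h = np.zeros_like(t)
    for i, ti in enumerate(t):
        if np.isclose(ti, 0.0):
            # limit of the formula at t = 0
            h[i] = 1.0 - beta + 4.0 * beta / np.pi
        elif np.isclose(abs(ti), T / (4.0 * beta)):
            # removable singularity at t = +/- T/(4*beta)
            h[i] = (beta / np.sqrt(2.0)) * (
                (1 + 2 / np.pi) * np.sin(np.pi / (4 * beta))
                + (1 - 2 / np.pi) * np.cos(np.pi / (4 * beta)))
        else:
            num = (np.sin(np.pi * ti * (1 - beta))
                   + 4 * beta * ti * np.cos(np.pi * ti * (1 + beta)))
            den = np.pi * ti * (1 - (4 * beta * ti) ** 2)
            h[i] = num / den
    return h / np.sqrt(np.sum(h ** 2))  # unit-energy normalisation
```

Sweeping `beta` and `span` over candidate values and measuring the resulting out-of-band energy or adjacent-channel leakage is the kind of trade-off study the paper's guidance addresses.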
Abstract:
The doubly-fed induction generator (DFIG) now represents the dominant technology in wind turbine design. One consequence of this is limited damping and inertial response during transient grid disturbances. A 'decoupled' strategy is therefore proposed to operate the DFIG grid-side converter (GSC) as a static synchronous compensator (STATCOM) during a fault, supporting the local voltage, while the DFIG operates as a fixed-speed induction generator (FSIG) providing an inertial response. The modeling aspects of the decoupled control strategy, the selection of protection control settings, the significance of the fault location and operation at sub- and super-synchronous speeds are analyzed in detail. In addition, a case study is developed to validate the proposed strategy under different wind penetration levels. The simulations show that suitable configuration of the decoupled strategy can be deployed to improve system voltage stability and inertial response for a range of scenarios, especially at high wind penetration. The conclusions are placed in the context of the practical limitations of the technology employed and the system conditions.
Abstract:
This paper considers the enhancement of loss-of-mains detection by use of a differential rate-of-change-of-frequency (ROCOF) relay to reduce nuisance tripping and improve sensitivity to small excursions in frequency. The telecommunications media which might carry the differential ROCOF signal are reviewed, with a focus on channel latency, bandwidth and security.
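The differential principle is what reduces nuisance tripping: a system-wide frequency excursion produces similar ROCOF at both ends of the channel, whereas islanding produces a local ROCOF the remote reference does not share. The sketch below illustrates that comparison only; the threshold, hold time and first-difference estimator are illustrative values, not the relay settings studied in the paper.

```python
import numpy as np

def rocof(freq: np.ndarray, dt: float) -> np.ndarray:
    """Rate of change of frequency (Hz/s) by first difference."""
    return np.diff(freq) / dt

def differential_rocof_trip(f_local: np.ndarray, f_remote: np.ndarray,
                            dt: float, threshold: float = 0.4,
                            hold: int = 5) -> bool:
    """Trip when |local ROCOF - remote ROCOF| exceeds `threshold` Hz/s
    for `hold` consecutive samples (all numbers illustrative)."""
    diff = np.abs(rocof(f_local, dt) - rocof(f_remote, dt))
    run = 0
    for over in diff > threshold:
        run = run + 1 if over else 0
        if run >= hold:
            return True
    return False
```

A shared generation-loss event moves both frequency traces together and does not trip, while a ramp seen only at the local measurement does, which is the discrimination a plain (non-differential) ROCOF relay cannot make.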
Abstract:
This paper discusses methods of using the Internet as a communications medium between distributed generator sites to provide new forms of loss-of-mains protection. An analysis of the quality of the communications channels between several nodes on the network was carried out experimentally. It is shown that Internet connections in urban environments are already capable of supporting real-time power system protection, whilst rural Internet connections are borderline suitable but could not yet be recommended as a primary method of protection. Two strategies for providing loss-of-mains protection across Internet protocol are considered: broadcast of a reference frequency or phasor representing the utility, and an Internet-based inter-tripping scheme.
Abstract:
Purpose: Polymorphisms in the vitamin D receptor (VDR) gene may be of etiological importance in determining cancer risk. The aim of this study was to assess the association between common VDR gene polymorphisms and esophageal adenocarcinoma (EAC) risk in an all-Ireland population-based case-control study. Methods: EAC cases, and controls frequency-matched by age and gender, recruited between March 2002 and December 2004 throughout Ireland were included. Participants were interviewed, and a blood sample was collected for DNA extraction. Twenty-seven single nucleotide polymorphisms in the VDR gene were genotyped using Sequenom or TaqMan assays, while the poly(A) microsatellite was genotyped by fluorescent fragment analysis. Unconditional logistic regression was applied to assess the association between VDR polymorphisms and EAC risk. Results: A total of 224 cases of EAC and 256 controls were involved in the analyses. After adjustment for potential confounders, TT homozygotes at rs2238139 and rs2107301 had significantly reduced risks of EAC compared with CC homozygotes. In contrast, SS alleles of the poly(A) microsatellite had significantly elevated risks of EAC compared with SL/LL alleles. However, following permutation analyses to adjust for multiple comparisons, no significant associations were observed between any VDR gene polymorphism and EAC risk. Conclusions: VDR gene polymorphisms were not significantly associated with EAC development in this Irish population. Confirmation is required from larger studies. © Springer Science+Business Media, LLC 2011.
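The basic association measure underlying a case-control analysis like this is the odds ratio from a 2x2 genotype-by-status table. The sketch below computes a crude (unadjusted) odds ratio with a Woolf 95% confidence interval; the study itself used unconditional logistic regression with confounder adjustment, and the cell counts in the example are invented, not data from the study.

```python
import math
from typing import Tuple

def odds_ratio(case_exposed: int, case_unexposed: int,
               ctrl_exposed: int, ctrl_unexposed: int
               ) -> Tuple[float, Tuple[float, float]]:
    """Crude odds ratio and Woolf (log-based) 95% CI for a 2x2 table.

    'Exposed' here would be carrying the genotype of interest,
    e.g. TT vs CC homozygotes at a given SNP.
    """
    a, b, c, d = case_exposed, case_unexposed, ctrl_exposed, ctrl_unexposed
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)
```

A confidence interval spanning 1.0, as in the hypothetical table below, corresponds to the "no significant association" conclusion the permutation-adjusted analysis reached.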
Abstract:
Research into localization has produced a wealth of algorithms and techniques to estimate the location of wireless network nodes; however, the majority of these schemes do not explicitly account for non-line-of-sight conditions. Disregarding this common situation reduces their accuracy and their potential for exploitation in real-world applications. This is a particular problem for personnel tracking, where the user's body itself will inherently cause time-varying blocking according to their movements. Using empirical data, this paper demonstrates that, by accounting for non-line-of-sight conditions and using received signal strength based Monte Carlo localization, meter-scale accuracy can be achieved for a wrist-worn personnel tracking tag in a 120 m indoor office environment. © 2012 IEEE.
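Received-signal-strength Monte Carlo localization maintains a particle cloud over candidate positions, weights particles by how well a propagation model explains the measured RSS, and resamples. The sketch below is a generic particle filter with a log-distance path-loss model and a single Gaussian noise term; the paper's contribution (modelling the time-varying body blocking) is not reproduced here, and the path-loss constants, noise level and step size are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def rss_model(d: np.ndarray, p0: float = -40.0, n: float = 2.5) -> np.ndarray:
    """Log-distance path loss: expected RSS (dBm) at distance d (m).
    p0 (RSS at 1 m) and n (path-loss exponent) are illustrative values."""
    return p0 - 10.0 * n * np.log10(np.maximum(d, 0.1))

def mcl_step(particles: np.ndarray, anchors: np.ndarray,
             rss_meas: np.ndarray, sigma: float = 4.0,
             step: float = 0.5) -> np.ndarray:
    """One Monte Carlo localization update.

    particles : (N, 2) candidate positions
    anchors   : (M, 2) fixed receiver positions
    rss_meas  : (M,) measured RSS in dBm
    """
    # motion model: random walk, since the tracked person moves
    particles = particles + rng.normal(0.0, step, particles.shape)
    # likelihood of the measured RSS at each particle (Gaussian in dB)
    d = np.linalg.norm(particles[:, None, :] - anchors[None, :, :], axis=2)
    resid = rss_meas[None, :] - rss_model(d)
    w = np.exp(-0.5 * np.sum((resid / sigma) ** 2, axis=1))
    w /= w.sum()
    # resample in proportion to the weights
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]
```

The position estimate at each step is the particle mean; accounting for body blocking would amount to replacing the single-Gaussian residual model with a state-dependent (line-of-sight vs blocked) likelihood.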
Abstract:
Two models that can predict the voltage-dependent scattering from liquid crystal (LC) based reflectarray cells are presented. The validity of both numerical techniques is demonstrated using measured results in the frequency range 94-110 GHz. The most rigorous approach models, for each voltage, the inhomogeneous and anisotropic permittivity of the LC as a stratified medium in the direction of the biasing field. This accounts for the different tilt angles of the LC molecules inside the cell, calculated from the solution of the elastic problem. The other model is based on an effective homogeneous permittivity tensor that corresponds to the average tilt angle along the longitudinal direction for each biasing voltage. In this model, convergence problems associated with the longitudinal inhomogeneity are avoided, and the computational efficiency is improved. Both models provide a correspondence between the reflection coefficient (losses and phase-shift) of the LC-based reflectarray cell and the value of the biasing voltage, which can be used to design beam-scanning reflectarrays. The accuracy and the efficiency of both models are also analyzed and discussed.
Abstract:
Periodic monitoring of structures such as bridges is necessary as their condition can deteriorate due to environmental conditions and ageing, causing the bridge to become unsafe. This monitoring, so-called Structural Health Monitoring (SHM), can give an early warning if a bridge becomes unsafe. This paper investigates an alternative wavelet-based approach for the monitoring of bridge structures which consists of the use of a vehicle fitted with accelerometers on its axles. A simplified vehicle-bridge interaction model is used in theoretical simulations to examine the effectiveness of the approach in detecting damage in the bridge. The accelerations of the vehicle are processed using a continuous wavelet transform, allowing a time-frequency analysis to be performed. This enables the identification of both the existence and location of damage from the vehicle response. Based on this analysis, a damage index is established. A parametric study is carried out to investigate the effect of parameters such as the bridge span length, vehicle speed, vehicle mass, damage level, signal noise level and road surface roughness on the accuracy of the results. In addition, a laboratory experiment is carried out to validate the results of the theoretical analysis and assess the ability of the approach to detect changes in the bridge response.
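The core signal-processing step is the continuous wavelet transform of the axle acceleration: a localized irregularity in the response (such as the axle crossing a damaged section) shows up as a peak in the small-scale wavelet coefficients at the corresponding instant, and the crossing time maps to a position along the span. The sketch below is a plain-NumPy Morlet CWT on a synthetic signal, not the paper's vehicle-bridge interaction model or damage index; the scale, centre frequency and test signal are all illustrative.

```python
import numpy as np

def morlet_cwt(signal: np.ndarray, scales, dt: float,
               w0: float = 6.0) -> np.ndarray:
    """Magnitude of a continuous wavelet transform with a Morlet
    mother wavelet (direct-convolution sketch; scales in seconds)."""
    n = len(signal)
    t = (np.arange(n) - n // 2) * dt
    out = np.empty((len(scales), n))
    for i, s in enumerate(scales):
        # complex Morlet: oscillation under a Gaussian envelope
        wavelet = np.exp(1j * w0 * t / s) * np.exp(-0.5 * (t / s) ** 2)
        wavelet /= np.sqrt(s)
        out[i] = np.abs(np.convolve(signal, np.conj(wavelet)[::-1],
                                    mode="same"))
    return out
```

On a smooth low-frequency acceleration trace with one superimposed spike, the small-scale coefficient magnitude peaks at the spike's sample index, which is the time-frequency localization the damage index builds on.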
Abstract:
This paper presents the results of an experimental investigation carried out to verify the feasibility of a 'drive-by' approach which uses a vehicle instrumented with accelerometers to detect and locate damage in a bridge. In theoretical simulations, a simplified vehicle-bridge interaction model is used to investigate the effectiveness of the approach in detecting damage in a bridge from vehicle accelerations. For this purpose, the accelerations are processed using a continuous wavelet transform, and damage indicators are evaluated and compared. Alternative statistical pattern recognition techniques are incorporated to allow for repeated vehicle passes. Parameters such as vehicle speed, damage level, location and road roughness are varied in simulations to investigate their effect. A scaled laboratory experiment is carried out to assess the effectiveness of the approach in a more realistic environment, considering a number of bridge damage scenarios.
Abstract:
1. Quantitative reconstruction of past vegetation distribution and abundance from sedimentary pollen records provides an important baseline for understanding long-term ecosystem dynamics and for the calibration of earth system process models, such as regional-scale climate models widely used to predict future environmental change. Most current approaches assume that the amount of pollen produced by each vegetation type, usually expressed as a relative pollen productivity term, is constant in space and time.
2. Estimates of relative pollen productivity can be extracted from extended R-value analysis (Parsons and Prentice, 1981) using comparisons between pollen assemblages deposited into sedimentary contexts, such as moss polsters, and measurements of the present-day vegetation cover around the sampled location. The vegetation survey method has been shown to have a profound effect on estimates of model parameters (Bunting and Hjelle, 2010); therefore, a standard method is an essential prerequisite for testing some of the key assumptions of pollen-based reconstruction of past vegetation, such as the assumption that relative pollen productivity is effectively constant in space and time within a region or biome.
3. This paper systematically reviews the assumptions and methodology underlying current models of pollen dispersal and deposition, and thereby identifies the key characteristics of an effective vegetation survey method for estimating relative pollen productivity in a range of landscape contexts.
4. It then presents the methodology used in a current research project, developed during a practitioner workshop. The method selected is pragmatic, designed to be replicable by different research groups, usable in a wide range of habitats, and requiring minimum effort to collect adequate data for model calibration rather than representing some ideal or required approach. Using this common methodology will allow project members to collect multiple measurements of relative pollen productivity for major plant taxa from several northern European locations in order to test the assumption of uniformity of these values within the climatic range of the main taxa recorded in pollen records from the region.
Abstract:
Previous studies on work instruction delivery for complex assembly tasks have shown that the mode and delivery method for the instructions in an engineering context can influence both build time and product quality. The benefits of digital, animated instructional formats compared to static pictures and text-only formats have already been demonstrated. Although pictograms have found applications for relatively straightforward operations and activities, their applicability to relatively complex assembly tasks has yet to be demonstrated. This study compares animated instructions and pictograms for the assembly of an aircraft panel. Based around a series of build experiments, the work records build time as well as the number of media references to measure and compare build efficiency. The number of build errors and the time required to correct them is also recorded. The experiments included five participants completing five builds over five consecutive days for each media type. Results showed that on average the total build time was 13.1% lower for the group using animated instructions. The benefit of animated instructions on build time was most prominent in the first three builds; by build four this benefit had disappeared. There was a similar number of instructional references for the two groups over the five builds, but the pictogram users required considerably more references during build 1. There were more errors among the group using pictograms, requiring more time for corrections during the build.
Abstract:
This paper reports the synthesis of dendrons containing a spermine unit at their focal point. The dendritic branching is based on L-lysine building blocks and has terminal oligo(ethyleneglycol) units on the surface. As a consequence of the solubilising surface groups, these dendrons have high solubility in solvents with widely different polarities (e.g., dichloromethane and water). The protonated spermine unit at the focal point is an effective anion-binding fragment and, as such, these dendrons are able to bind to polyanions. This paper demonstrates that polyanions can be bound both in dichloromethane (using a dye solubilisation assay) and in water (competitive ATP binding assay). In organic media the dendritic branching appears to have a proactive effect on the solubilisation of the dye, with more dye being solubilised by higher generations of dendron. On the other hand, in water the degree of branching has no impact on the anion-binding process. We propose that in this case the spermine unit is effectively solvated by the bulk solvent and the dendritic branching does not need to play an active role in assisting solubility. Dendritic effects on anion binding have therefore been elucidated in different solvents. The dendritic branching plays a proactive role in providing the anion-binding unit with good solubility in apolar solvent media.