25 results for multi-modal interaction

Relevance: 80.00%

Publisher:

Abstract:

Anti-spoofing is attracting growing interest in biometrics, given the variety of fake materials and new means of attacking biometric recognition systems. New unseen materials continuously challenge state-of-the-art spoofing detectors, calling for additional systematic approaches to anti-spoofing. Incorporating liveness scores into the biometric fusion process can enhance recognition accuracy, but traditional sum-rule based fusion algorithms are known to be highly sensitive to single spoofed instances. This paper investigates 1-median filtering as a spoofing-resistant generalised alternative to the sum-rule, targeting the problem of partial multibiometric spoofing where m out of n biometric sources to be combined are attacked. Augmenting previous work, this paper investigates the dynamic detection and rejection of liveness-recognition pair outliers for spoofed samples in a true multi-modal configuration, with its inherent challenge of normalisation. As a further contribution, a bootstrap aggregating (bagging) classifier for fingerprint spoof detection is presented. Experiments on the latest face video databases (Idiap Replay-Attack Database and CASIA Face Anti-Spoofing Database) and a fingerprint spoofing database (Fingerprint Liveness Detection Competition 2013) illustrate the efficiency of the proposed techniques.
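For scalar match scores, the contrast between sum-rule fusion and the median-based alternative can be sketched in a few lines. The function names and score values below are illustrative, not taken from the paper; for one-dimensional scores the 1-median reduces to the ordinary median:

```python
from statistics import mean, median

def sum_rule(scores):
    # Classic sum-rule fusion: the average of the per-source match scores.
    return mean(scores)

def median_rule(scores):
    # Median-based fusion (the scalar case of 1-median filtering):
    # a single inflated score from a spoofed source cannot drag the
    # fused value far from the genuine scores.
    return median(scores)

# Three genuine sources plus one spoofed source with an inflated score.
genuine = [0.62, 0.58, 0.65]
spoofed = genuine + [0.99]

print(sum_rule(spoofed))     # pulled upward by the spoofed instance
print(median_rule(spoofed))  # stays close to the genuine scores
```

The median stays near the genuine scores as long as fewer than half of the n combined sources are spoofed, which is the intuition behind its robustness in the m-out-of-n attack setting.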

Relevance: 30.00%

Publisher:

Abstract:

Objectives: To assess the potential source of variation that the surgeon may add to patient outcome in a clinical trial of surgical procedures. Methods: Two large (n = 1380) parallel multicentre randomized surgical trials, involving 43 surgeons, were undertaken to compare laparoscopically assisted hysterectomy with conventional methods of abdominal and vaginal hysterectomy. The primary end point of the trial was the occurrence of at least one major complication. Patients were nested within surgeons, giving the data set a hierarchical structure. A total of 10% of patients had at least one major complication, that is, a sparse binary outcome variable. A mixed logistic regression model (with logit link function) was used to model the probability of a major complication, with surgeon fitted as a random effect. Models were fitted by maximum likelihood in SAS®. Results: There were many convergence problems. These were resolved using a variety of approaches, including: treating all effects as fixed for the initial model building; modelling the variance of a parameter on a logarithmic scale; and centring continuous covariates. The initial model building indicated no significant 'type of operation' by surgeon interaction effect in either trial; the 'type of operation' term was highly significant in the abdominal trial, and the 'surgeon' term was not significant in either trial. Conclusions: The analysis did not find a surgeon effect, but it is difficult to conclude that there was no difference between surgeons. The statistical test may have lacked sufficient power; the variance estimates were small with large standard errors, indicating that their precision may be questionable.
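The model structure can be sketched as follows. The coefficient values are illustrative only (a baseline intercept chosen to give roughly the reported 10% complication rate), not estimates from the trials:

```python
import math

def complication_probability(beta0, beta_treatment, treatment, surgeon_effect):
    # Mixed logistic model with logit link:
    # logit(p) = beta0 + beta_treatment * treatment + u_surgeon,
    # where u_surgeon is the surgeon's random effect.
    eta = beta0 + beta_treatment * treatment + surgeon_effect
    return 1.0 / (1.0 + math.exp(-eta))

# Illustrative values: logit(0.1) is about -2.197, giving a ~10% baseline
# complication rate; the treatment effect and surgeon effect are made up.
p = complication_probability(-2.197, 0.5, 1, 0.1)
```

The surgeon term u_j is what the random effect estimates; the convergence fixes the abstract mentions (centring covariates, log-scale variance) operate on how this model is fitted, not on its structure.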

Relevance: 30.00%

Publisher:

Abstract:

This paper addresses the requirements for a Workflow Management System intended to automate the production and distribution chain for cross-media content, which is by nature multi-partner and multi-site. It advocates requirements for ontology-based object lifecycle tracking within workflow integration by identifying the various types of interfaces, object life cycles and workflow interaction environments within the AXMEDIS Framework.

Relevance: 30.00%

Publisher:

Abstract:

The coordination of design is a multi-faceted problem in construction. In design interactions in particular, the real-time coordination of design activity is a persistent concern. The use of objects to coordinate design activity is studied as it happens in interactions between an architect and a building user group, in a setting where maintaining awareness of the design situation is important. An account is provided of the ways in which this was accomplished and how design activity is coordinated through interactional practices. The empirical analyses examine design interaction from an ethnomethodological/conversation analysis (EM/CA) informed perspective, considering: the ways in which mutual orientation to design issues is accomplished; how objects can provide a resource for the recognition of the activities of others; and the ways in which objects might be observable as momentarily intelligible. Subtle interactional practices involving talk, gesture and gaze were some of the small ways in which mutual orientation to the design actions of others became observable. The production of actions sequentially, in response to another's action, marked the real-time coordination of design moves in this setting. The relevance of accounts of micro-interaction to developing an understanding of design activity and how it is coordinated is considered.

Relevance: 30.00%

Publisher:

Abstract:

Where users interact in a distributed virtual environment, the actions of each user must be observed by peers with sufficient consistency, and within a limited delay, so as not to be detrimental to the interaction. The consistency control issue may be split into three parts: update control; consistent enactment and evolution of events; and causal consistency. The delay in the presentation of events, termed latency, depends primarily on the network propagation delay and the consistency control algorithms. The latency induced by the consistency control algorithm, in particular causal ordering, is proportional to the number of participants. This paper describes how the effect of network delays may be reduced and introduces a scalable solution that provides sufficient consistency control while minimising its effect on latency. The principles described have been developed at Reading over the past five years. Similar principles are now emerging in the simulation community through the HLA standard. This paper attempts to validate the suggested principles within the schema of distributed simulation and virtual environments, and to compare and contrast them with those described in the HLA definition documents.
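The abstract does not give the paper's algorithm, but causal ordering is commonly enforced with vector clocks, which also shows why the induced latency grows with the number of participants: the clock carries one entry per participant. A minimal illustrative sketch (names are not taken from the paper):

```python
def deliverable(msg_clock, sender, local_clock):
    """An update from `sender` is deliverable when it is the next event
    from that sender and every event it causally depends on (entries for
    the other peers) has already been seen locally."""
    for peer, count in msg_clock.items():
        if peer == sender:
            # Must be exactly the next event from the sender.
            if count != local_clock.get(peer, 0) + 1:
                return False
        elif count > local_clock.get(peer, 0):
            # Depends on an event from another peer we have not seen yet.
            return False
    return True

local = {"A": 2, "B": 1, "C": 0}

# The next update from A is deliverable immediately.
assert deliverable({"A": 3, "B": 1, "C": 0}, "A", local)
# An update from C depending on B's second event must be delayed.
assert not deliverable({"A": 2, "B": 2, "C": 1}, "C", local)
```

Each delayed delivery adds to latency on top of network propagation delay, which is the trade-off between causal consistency and responsiveness that the paper's scalable solution targets.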

Relevance: 30.00%

Publisher:

Abstract:

We consider two weakly coupled systems and adopt a perturbative approach based on the Ruelle response theory to study their interaction. We propose a systematic way of parameterizing the effect of the coupling as a function of only the variables of a system of interest. Our focus is on describing the impacts of the coupling on the long-term statistics rather than on the finite-time behavior. By direct calculation, we find that, at first order, the coupling can be surrogated by adding a deterministic perturbation to the autonomous dynamics of the system of interest. At second order, there are additionally two separate and very different contributions. One is a term taking into account the second-order contributions of the fluctuations in the coupling, which can be parameterized as a stochastic forcing with given spectral properties. The other is a memory term, coupling the system of interest to its previous history through the correlations of the second system. If these correlations are known, this effect can be implemented as a perturbation with memory on the single system. To treat this case, we present an extension of Ruelle's response theory able to deal with integral operators. We discuss our results in the context of other methods previously proposed for disentangling the dynamics of two coupled systems. We emphasize that our results do not rely on assuming a time-scale separation and, if such a separation exists, can be used equally well to study the statistics of the slow variables and that of the fast variables. By recursively applying the technique proposed here, we can treat the general case of multi-level systems.
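The structure described above can be written schematically. The notation here is illustrative, following the shape of the expansion the abstract describes rather than the paper's exact symbols:

```latex
% Dynamics of the system of interest x, coupled with strength \varepsilon
% to a second system y:
\dot{x} = F(x) + \varepsilon \, \Psi(x, y)

% Surrogate dynamics, up to second order in \varepsilon: a deterministic
% mean term M, a stochastic forcing \sigma(x)\,\eta(t) with prescribed
% spectral properties, and a memory term built from the correlations of
% the second system via a kernel h.
\dot{x} = F(x) + \varepsilon \, M(x)
        + \varepsilon^{2} \, \sigma(x)\,\eta(t)
        + \varepsilon^{2} \int_{0}^{\infty} h\bigl(\tau, x(t-\tau)\bigr)\,\mathrm{d}\tau
```

The first-order term M(x) is the deterministic perturbation mentioned above; the noise term captures the second-order fluctuations of the coupling; and the integral is the non-Markovian memory term coupling the system to its own history.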

Relevance: 30.00%

Publisher:

Abstract:

Analysis of single forcing runs from CMIP5 (the fifth Coupled Model Intercomparison Project) simulations shows that the mid-twentieth century temperature hiatus, and the coincident decrease in precipitation, are likely to have been influenced strongly by anthropogenic aerosol forcing. Models that include a representation of the indirect effect of aerosol better reproduce inter-decadal variability in historical global-mean near-surface temperatures, particularly the cooling in the 1950s and 1960s, compared to models with representation of the aerosol direct effect only. Models with the indirect effect also show a more pronounced decrease in precipitation during this period, which is in better agreement with observations, and greater inter-decadal variability in the inter-hemispheric temperature difference. This study demonstrates the importance of representing aerosols, and their indirect effects, in general circulation models, and suggests that inter-model diversity in aerosol burden and representation of aerosol–cloud interaction can produce substantial variation in simulations of climate variability on multi-decadal timescales.

Relevance: 30.00%

Publisher:

Abstract:

The Hadley Centre Global Environmental Model (HadGEM) includes two aerosol schemes: the Coupled Large-scale Aerosol Simulator for Studies in Climate (CLASSIC) and the new Global Model of Aerosol Processes (GLOMAP-mode). GLOMAP-mode is a modal aerosol microphysics scheme that simulates not only aerosol mass but also aerosol number, represents internally mixed particles, and includes aerosol microphysical processes such as nucleation. In this study, both schemes provide hindcast simulations of natural and anthropogenic aerosol species for the period 2000–2006. HadGEM simulations of aerosol optical depth using GLOMAP-mode compare better than CLASSIC against a data-assimilated aerosol re-analysis and ground-based aerosol observations. Because of differences in wet deposition rates, the GLOMAP-mode sulphate aerosol residence time is two days longer than that of CLASSIC sulphate aerosol, whereas the black carbon residence time is much shorter. As a result, CLASSIC underestimates aerosol optical depths in continental regions of the Northern Hemisphere and likely overestimates absorption in remote regions. Aerosol direct and first indirect radiative forcings are computed from simulations of aerosols with emissions for the years 1850 and 2000. In 1850, GLOMAP-mode predicts lower aerosol optical depths and higher cloud droplet number concentrations than CLASSIC. Consequently, simulated clouds are much less susceptible to natural and anthropogenic aerosol changes when the microphysical scheme is used. In particular, the response of cloud condensation nuclei to an increase in dimethyl sulphide emissions becomes a factor of four smaller. The combined effect of different 1850 baselines, residence times, and abilities to affect cloud droplet number leads to substantial differences in the aerosol forcings simulated by the two schemes. GLOMAP-mode finds a present-day direct aerosol forcing of −0.49 W m−2 on a global average, 72% stronger than the corresponding forcing from CLASSIC.
This difference is compensated by changes in the first indirect aerosol forcing: the forcing of −1.17 W m−2 obtained with GLOMAP-mode is 20% weaker than with CLASSIC. The results suggest that mass-based schemes such as CLASSIC lack the necessary sophistication to provide realistic input to aerosol–cloud interaction schemes. Furthermore, the importance of the 1850 baseline highlights that model skill in predicting present-day aerosol does not guarantee reliable forcing estimates. These findings suggest that the more complex representation of aerosol processes in microphysical schemes improves the fidelity of simulated aerosol forcings.
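The CLASSIC forcings implied by these percentages can be back-calculated, assuming "72% stronger" means the GLOMAP-mode magnitude is 1.72 times CLASSIC's, and "20% weaker" means it is 0.80 times CLASSIC's:

```python
# GLOMAP-mode forcings quoted in the abstract (W m^-2, global average).
glomap_direct = -0.49    # present-day direct aerosol forcing
glomap_indirect = -1.17  # first indirect aerosol forcing

# Implied CLASSIC forcings under the ratio interpretation above.
classic_direct = glomap_direct / 1.72     # roughly -0.28 W m^-2
classic_indirect = glomap_indirect / 0.80 # roughly -1.46 W m^-2
```

This makes the compensation explicit: GLOMAP-mode's stronger direct forcing is offset by an indirect forcing roughly 0.3 W m^-2 weaker in magnitude than CLASSIC's.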

Relevance: 30.00%

Publisher:

Abstract:

The EU FP7 Project MEGAPOLI: "Megacities: Emissions, urban, regional and Global Atmospheric POLlution and climate effects, and Integrated tools for assessment and mitigation" (http://megapoli.info) brings together leading European research groups, state-of-the-art scientific tools and key players from non-European countries to investigate the interactions among megacities, air quality and climate. MEGAPOLI bridges the spatial and temporal scales that connect local emissions, air quality and weather with global atmospheric chemistry and climate. The proposed concept of multi-scale integrated modelling of megacity impacts on air quality and climate, and vice versa, is discussed in the paper. It requires considering different spatial and temporal dimensions: time scales from seconds and hours (to understand the interaction mechanisms) up to years and decades (to consider climate effects); spatial resolutions, with model down- and up-scaling from street to global scale; and two-way interactions between meteorological and chemical processes.