130 results for feature representation
Abstract:
The Technology Acceptance Model (TAM) posits that Perceived Ease of Use (PEOU) and Perceived Usefulness (PU) influence the ‘intention to use’. The Post-Acceptance Model (PAM) posits that continued use is influenced by prior experience. To study the factors that influence how professionals use complex systems, we develop a tentative research model that builds on PAM and TAM. Specifically, we include PEOU and the construct ‘Professional Association Guidance’. We postulate that feature usage is enhanced when professional associations influence PU by highlighting additional benefits. We explore the theory in the context of post-adoption use of Electronic Medical Records (EMRs) by primary care physicians in Ontario. The methodology can be extended to other professional environments, and we suggest directions for future research.
Abstract:
We present a new sparse shape modeling framework based on the Laplace-Beltrami (LB) eigenfunctions. Traditionally, the LB eigenfunctions are used as a basis for intrinsically representing surface shapes by forming a Fourier series expansion. To reduce high-frequency noise, only the first few terms are used in the expansion and the higher-frequency terms are simply discarded. However, some lower-frequency terms may not contribute significantly to reconstructing the surfaces. Motivated by this observation, we propose to retain only the significant eigenfunctions by imposing an l1-penalty. The new sparse framework also avoids the additional surface-based smoothing often used in the field. The proposed approach is applied to investigate the influence of age (38-79 years) and gender on amygdala and hippocampus shapes in the normal population. In addition, we show how emotional response is related to the anatomy of these subcortical structures.
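The l1-penalised selection of significant basis terms described above can be illustrated with soft thresholding, the proximal operator of the l1-norm. The sketch below is a minimal one-dimensional analogue: a cosine basis stands in for the LB eigenfunctions, and the threshold value is illustrative, not taken from the paper.

```python
import numpy as np

def soft_threshold(c, lam):
    """Proximal operator of the l1-norm: shrinks coefficients toward zero,
    zeroing out those with magnitude below lam."""
    return np.sign(c) * np.maximum(np.abs(c) - lam, 0.0)

# Toy 1-D analogue: a "surface" signal expanded in a cosine basis
# (standing in for the LB eigenfunctions of a real surface mesh).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
basis = np.array([np.cos(np.pi * k * t) for k in range(20)]).T  # 200 x 20

true_c = np.zeros(20)
true_c[[0, 3, 7]] = [1.0, 0.5, 0.25]          # only a few terms matter,
signal = basis @ true_c + 0.01 * rng.standard_normal(200)  # plus noise

# Least-squares expansion coefficients, then l1 shrinkage keeps only the
# significant terms -- including a low-frequency term (k=0) and skipping
# insignificant low-frequency ones, unlike plain truncation.
c_ls, *_ = np.linalg.lstsq(basis, signal, rcond=None)
c_sparse = soft_threshold(c_ls, 0.05)
print("retained basis indices:", np.nonzero(c_sparse)[0])
```

Unlike truncating the series at a fixed frequency, the thresholded solution keeps whichever terms carry signal, which is the behaviour the abstract attributes to the l1-penalty.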
Abstract:
As a major mode of intraseasonal variability that interacts with weather and climate systems on a near-global scale, the Madden–Julian Oscillation (MJO) is a crucial source of predictability for numerical weather prediction (NWP) models. Despite its global significance and comprehensive investigation, improvements in the representation of the MJO in an NWP context remain elusive. However, recent modifications to the model physics in the ECMWF model led to advances in the representation of atmospheric variability and the unprecedented propagation of the MJO signal through the entire integration period. In light of these recent advances, a set of hindcast experiments has been designed to assess the sensitivity of MJO simulation to the formulation of convection. Through the application of established MJO diagnostics, it is shown that the improvements in the representation of the MJO can be directly attributed to the modified convective parametrization. More specifically, the improvements are attributed to the move from a moisture-convergent to a relative-humidity-dependent formulation for organized deep entrainment. It is concluded that, in order to understand the physical mechanisms through which a relative-humidity-dependent formulation for entrainment led to an improved simulation of the MJO, a more process-based approach should be taken. The application of process-based diagnostics to the hindcast experiments presented here will be the focus of Part II of this study.
Abstract:
This paper explores the development of multi-feature classification techniques used to identify tremor-related characteristics in Parkinsonian patients. Local field potentials were recorded from the subthalamic nucleus and the globus pallidus internus of eight Parkinsonian patients through the implanted electrodes of a deep brain stimulation (DBS) device prior to device internalization. A range of signal processing techniques was evaluated with respect to tremor detection capability and used as inputs to a multi-feature neural network classifier to identify Parkinsonian tremor activity. The results of this study show that a trained multi-feature neural network is able, under certain conditions, to achieve excellent detection accuracy on patients unseen during training. Overall, tremor detection accuracy was mixed, although an accuracy of over 86% was achieved in four of the eight patients.
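The multi-feature neural network classifier described above can be sketched generically. The numpy example below is a stand-in only: synthetic feature vectors replace the LFP-derived features, and a small one-hidden-layer network with hand-written backpropagation replaces whatever architecture the paper used; all sizes and values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for per-window LFP features (e.g. band power, variance);
# the paper's actual feature extraction from subthalamic recordings is not
# reproduced here. Labels: 1 = tremor window, 0 = non-tremor window.
n, d = 400, 4
X = rng.standard_normal((n, d))
y = (X @ np.array([1.5, -1.0, 0.5, 0.0])
     + 0.3 * rng.standard_normal(n) > 0).astype(float)

# Minimal one-hidden-layer network trained with full-batch gradient descent.
h, lr = 8, 1.0
W1 = 0.1 * rng.standard_normal((d, h)); b1 = np.zeros(h)
W2 = 0.1 * rng.standard_normal(h); b2 = 0.0
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(2000):
    a = np.tanh(X @ W1 + b1)              # hidden activations
    p = sigmoid(a @ W2 + b2)              # per-window "tremor" probability
    g = (p - y) / n                       # cross-entropy gradient wrt output logit
    gW2, gb2 = a.T @ g, g.sum()
    ga = np.outer(g, W2) * (1.0 - a**2)   # backpropagate through tanh
    gW1, gb1 = X.T @ ga, ga.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

a = np.tanh(X @ W1 + b1)                  # final forward pass
p = sigmoid(a @ W2 + b2)
acc = float(np.mean((p > 0.5) == y))
print(f"training accuracy: {acc:.2f}")
```

In the study itself, accuracy was assessed on patients held out of training, which is the harder and clinically relevant test; the sketch only shows the classifier mechanics.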
Abstract:
This chapter looks into the gap between presentational realism and the representation of physical experience in Werner Herzog's work so as to retrieve the indexical trace, or the absolute materiality of death. To that end, it draws links between Herzog and other directors akin to realism in its various forms, including surrealism. In particular, it focuses on François Truffaut and Glauber Rocha, representing respectively the Nouvelle Vague and the Cinema Novo, whose works had a decisive influence on Herzog's aesthetic choices, to the point of originating distinct phases of his output. The analyses, though restricted to a small number of films, intend to re-evaluate Herzog's position within, and contribution to, film history.
Abstract:
A representation of the conformal mapping g of the interior or exterior of the unit circle onto a simply connected domain Ω as a boundary integral in terms of f|∂Ω is obtained, where f := g⁻¹. A product integration scheme for the approximation of the boundary integral is described and analysed. An ill-conditioning problem related to the domain geometry is discussed. Numerical examples confirm the conclusions of this discussion and support the analysis of the quadrature scheme.
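As a minimal illustration of how such a boundary-integral representation in terms of f|∂Ω can arise (shown for the interior case via the classical Cauchy-formula route; this is not necessarily the paper's specific representation, and its product-integration scheme is not reproduced):

```latex
% g is analytic on the unit disc, so Cauchy's integral formula gives
\[
  g(z) \;=\; \frac{1}{2\pi i} \oint_{|w|=1} \frac{g(w)}{w - z}\, dw,
  \qquad |z| < 1 .
\]
% Since f := g^{-1}, the boundary values of g entering the integral are
% determined by f|_{\partial\Omega}: as \zeta traverses \partial\Omega,
% w = f(\zeta) traverses the unit circle, with
\[
  g\bigl(f(\zeta)\bigr) \;=\; \zeta, \qquad \zeta \in \partial\Omega .
\]
```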
Abstract:
The goal of this article is to make an epistemological and theoretical contribution to the nascent field of third language (L3) acquisition and show how examining L3 development can offer a unique view into longstanding debates within L2 acquisition theory. We offer the Phonological Permeability Hypothesis (PPH), which maintains that examining the development of an L3/Ln phonological system and its effects on a previously acquired L2 phonological system can inform contemporary debates regarding the mental constitution of postcritical period adult phonological acquisition. We discuss the predictions and functional significance of the PPH for adult SLA and multilingualism studies, detailing a methodology that examines the effects of acquiring Brazilian Portuguese on the Spanish phonological systems learned before and after the so-called critical period (i.e., comparing simultaneous versus successive adult English-Spanish bilinguals learning Brazilian Portuguese as an L3).
Abstract:
A number of recent studies demonstrate that bilinguals with languages that differ in grammatical and lexical categories may shift their cognitive representation of those categories towards that of monolingual speakers of their second language. The current paper extended that investigation to the domain of colour in Greek–English bilinguals with different levels of bilingualism, and English monolinguals. Greek differentiates the blue region of colour space into a darker shade called ble and a lighter shade called ghalazio. Results showed a semantic shift of category prototypes with level of bilingualism and acculturation, while the way bilinguals judged the perceptual similarity between within- and cross-category stimulus pairs depended strongly on the availability of the relevant colour terms in semantic memory, and the amount of time spent in the L2-speaking country. These results suggest that cognition is tightly linked to semantic memory for specific linguistic categories, and to cultural immersion in the L2-speaking country.
Abstract:
Voluntary selective attention can prioritize different features in a visual scene. The frontal eye fields (FEF) are one potential source of such feature-specific top-down signals, but causal evidence for influences on visual cortex (as has been shown for "spatial" attention) has remained elusive. Here, we show that transcranial magnetic stimulation (TMS) applied to right FEF increased blood oxygen level-dependent (BOLD) signals in visual areas processing the "target feature" but not in "distracter feature"-processing regions. TMS-induced BOLD signals increased in motion-responsive visual cortex (MT+) when motion was attended in a display with moving dots superimposed on face stimuli, but in the face-responsive fusiform face area (FFA) when faces were attended. These TMS effects on BOLD signals in both regions were negatively related to performance on the motion task, supporting the behavioral relevance of this pathway. Our findings provide new causal evidence for a role of the human FEF in the control of nonspatial "feature"-based attention, mediated by dynamic influences on feature-specific visual cortex that vary with the currently attended property.
Abstract:
Introduction. Feature usage is a prerequisite to realising the benefits of investments in feature-rich systems. We propose that conceptualising the dependent variable 'system use' as 'level of use' and specifying it as a formative construct has greater value for measuring the post-adoption use of feature-rich systems. We then validate the content of the construct as a first step in developing a research instrument to measure it. The context of our study is the post-adoption use of electronic medical records (EMRs) by primary care physicians. Method. Initially, a literature review of the empirical context defines the scope based on prior studies. Having identified core features from the literature, we refine them further with the help of experts in a consensus-seeking process that follows the Delphi technique. Results. The methodology was successfully applied to EMRs, which were selected as an example of feature-rich systems. A review of EMR usage and regulatory standards provided the feature input for the first round of the Delphi process. A panel of experts then reached consensus after four rounds, identifying ten task-based features that would serve as indicators of level of use. Conclusions. To study why some users deploy more advanced features than others, theories of post-adoption require a rich formative dependent variable that measures level of use. We have demonstrated that a context-sensitive literature review followed by refinement through a consensus-seeking process is a suitable methodology to validate the content of this dependent variable. This is the first step of instrument development, prior to statistical confirmation with a larger sample.