942 results for multi-value


Relevance:

30.00%

Publisher:

Abstract:

Nowadays, multi-touch devices (MTD) can be found in all kinds of contexts. In the learning context, the availability of MTD leads many teachers to use them in their classrooms, to support students' use of the devices, or to assume that they will enhance learning processes. Despite the rising interest in MTD, few studies have examined their impact in terms of performance or the suitability of the technology for the learning context. Even if using touch-sensitive screens rather than a mouse and keyboard seems the easiest and fastest way to carry out common learning tasks (for instance, web surfing), we notice that the use of MTD may lead to a less favourable outcome. The complexity of generating accurate finger gestures and the split attention this requires (a multi-tasking effect) make interacting with a touch-sensitive screen through gestures more difficult than traditional laptop use. More precisely, it is hypothesized that efficacy and efficiency decrease, as do the available cognitive resources, making users' task engagement more difficult. Furthermore, the study presented here takes into account the moderating effect of previous experience with MTD. Two key factors from technology-adoption theories were included in the study: familiarity and self-efficacy with the technology. Sixty university students, invited to a usability lab, were asked to perform information-search tasks on an online encyclopaedia. The tasks were created so as to exercise the most commonly used mouse actions (e.g. right click, left click, scrolling, zooming, keyword entry). Two conditions were created: (1) MTD use and (2) laptop use (with keyboard and mouse). The cognitive load, self-efficacy, familiarity and task engagement scales were adapted to the MTD context. Furthermore, eye-tracking measurement offers additional information about user behaviour and cognitive load. Our study aims to clarify some important aspects of MTD usage and its added value compared to a laptop in a student learning context. More precisely, the outcomes will clarify the suitability of MTD for the processes at stake and the role of previous knowledge in the adoption process, as well as offer some interesting insights into the user experience with such devices.

Over the last decade, multi-touch devices (MTD) have spread across a range of contexts. In the learning context, MTD accessibility leads more and more teachers to use them in their classrooms, assuming that they will improve learning activities. Despite this growing interest, only a few studies have focused on the impact of MTD use in terms of performance and suitability in a learning context. However, even if using touch-sensitive screens rather than a mouse and keyboard seems the easiest and fastest way to carry out common learning tasks (for instance, web surfing), we notice that the use of MTD may lead to a less favorable outcome. More precisely, tasks that require users to generate complex and/or less common gestures may increase extraneous cognitive load and impair performance, especially for intrinsically complex tasks. It is hypothesized that task and gesture complexity will tax users' cognitive resources and decrease task efficacy and efficiency. Because MTD are supposed to be more appealing, it is also assumed that they will affect cognitive absorption. The present study also takes users' prior knowledge of MTD use and gestures into account by treating experience with MTD as a moderator. Sixty university students were asked to perform information-search tasks on an online encyclopedia. Tasks were set up so that users had to generate the most commonly used mouse actions (e.g. left/right click, scrolling, zooming, text entry). Two conditions, MTD use and laptop use (with mouse and keyboard), were created in order to compare the two devices. An eye-tracking device was used to measure users' attention and cognitive load. Our study sheds light on some important aspects of MTD use and its added value compared to a laptop in a student learning context.

A novel multi-scale seamless model of brittle-crack propagation is proposed and applied to the simulation of fracture growth in a two-dimensional Ag plate with macroscopic dimensions. The model represents crack propagation at the macroscopic scale as the drift-diffusion motion of the crack tip alone. The diffusive motion is associated with the crack-tip coordinates in position space, and reflects the oscillations observed in the crack velocity once it exceeds its critical value. The model couples the crack dynamics at the macroscale and nanoscale via an intermediate mesoscale continuum. The finite-element method is employed to make the transition from the macroscale to the nanoscale by computing the continuum-based displacements of the atoms at the boundary of an atomic lattice embedded within the plate and surrounding the tip. Molecular dynamics (MD) simulation then drives the crack tip forward, producing the tip's critical velocity and diffusion constant. These are then used in the Ito stochastic calculus to make the reverse transition from the nanoscale back to the macroscale. The MD-level modelling is based on a many-body potential. The model successfully reproduces the crack-velocity oscillations, the roughening transitions of the crack surfaces, and the macroscopic crack trajectory. The implications for 3-D modelling are discussed.
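The macroscale drift-diffusion step can be sketched as a simple Euler-Maruyama update, with the crack tip drifting at its critical velocity while diffusing with a constant D. This is only a one-dimensional illustrative sketch under invented parameter values; in the paper, the critical velocity and diffusion constant come from the MD simulation.

```python
import math
import random

def simulate_crack_tip(v_c, D, dt, n_steps, seed=0):
    """Advance a 1-D crack-tip coordinate by Euler-Maruyama:
    drift at the critical velocity v_c plus diffusion with constant D."""
    rng = random.Random(seed)
    x = 0.0
    path = [x]
    for _ in range(n_steps):
        # dx = v_c * dt + sqrt(2 * D * dt) * N(0, 1)
        x += v_c * dt + math.sqrt(2.0 * D * dt) * rng.gauss(0.0, 1.0)
        path.append(x)
    return path

# Illustrative parameters only (not taken from the paper).
path = simulate_crack_tip(v_c=1.0, D=0.05, dt=1e-3, n_steps=1000)
```

With these parameters the tip drifts roughly one unit over the simulated interval, with Gaussian scatter around that mean set by D.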

Purpose: The purpose of this paper is to investigate the impact of different agency practices on agency fees, business efficiency, and housing market liquidity. Design/methodology/approach: The paper studies the effect of sole and multiple agency practices on estate agent efficiency, housing market liquidity, and commission fee levels. The analysis uses survey data from 2000 to 2006 to investigate the different agency practices across England and Wales and their effect on estate agency business efficiency, housing market liquidity, selling price, and fee levels. Findings: The empirical analysis confirms that agency practice has a locality bias; that is, some regions are more likely to adopt sole agency practice than others. Estate agents with a sole agency practice charge a lower agency fee, help clients achieve a better selling price, and are more efficient, whereas multiple agency practice facilitates liquidity in the housing market but experiences a higher fall-through rate. Research limitations/implications: The research focuses on estate agents rather than consumers, owing to limitations of the data, which come from a research project on transaction costs designed prior to this analysis. Originality/value: Little other research has investigated residential estate agency practice and its impact on the housing market in England and Wales over the past three decades. The findings are a useful guide for practitioners to better understand the issues associated with different agency practices and should enhance business efficiency and performance.

Long-term biological time-series in the oceans are relatively rare. Using the two longest of these we show how the information value of such ecological time-series increases through space and time in terms of their potential policy value. We also explore the co-evolution of these oceanic biological time-series with changing marine management drivers. Lessons learnt from reviewing these sequences of observations provide valuable context for the continuation of existing time-series and perspective for the initiation of new time-series in response to rapid global change. Concluding sections call for a more integrated approach to marine observation systems and highlight the future role of ocean observations in adaptive marine management.

This paper examines the process of creating and exploiting synergies between business units of a multi-unit corporation and the creation of internal value by combining and exploiting knowledge. It offers a framework to create and manage such synergies and undertakes an empirical test through in-depth study across three business units of Royal Vopak, a Dutch-based global multi-unit corporation. Finally, it offers lessons for corporate managers trying to create and manage cross-unit synergies.

Increasingly, infrastructure providers are supplying the cloud marketplace with storage and on-demand compute resources to host cloud applications. From an application user's point of view, it is desirable to identify the most appropriate set of available resources on which to execute an application. Resource choice can be complex and may involve comparing available hardware specifications, operating systems, value-added services, such as network configuration or data replication, and operating costs, such as hosting cost and data throughput. Providers' cost models often change, and new commodity cost models, such as spot pricing, have been introduced to offer significant savings. In this paper, a software abstraction layer is used to discover infrastructure resources for a particular application, across multiple providers, using a two-phase constraints-based approach. In the first phase, a set of possible infrastructure resources is identified for a given application. In the second phase, a heuristic is used to select the most appropriate resources from the initial set. For some applications a cost-based heuristic is most appropriate; for others a performance-based heuristic may be used. A financial services application and a high-performance computing application are used to illustrate the execution of the proposed resource discovery mechanism. The experimental results show that the proposed model can dynamically select an appropriate set of resources matching the application's requirements.
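The two-phase approach can be sketched as a hard-constraint filter followed by a pluggable ranking heuristic. The resource fields, offer names, and constraint values below are hypothetical, not the paper's actual resource model.

```python
def discover_resources(resources, constraints, score):
    """Phase 1: keep resources satisfying all hard constraints.
    Phase 2: rank the survivors with a pluggable heuristic (lowest score first)."""
    feasible = [r for r in resources if all(c(r) for c in constraints)]
    return sorted(feasible, key=score)

# Hypothetical provider offerings (names and fields are illustrative only).
offers = [
    {"name": "a1", "cores": 8,  "mem_gb": 32, "cost_per_hr": 0.40},
    {"name": "b2", "cores": 16, "mem_gb": 64, "cost_per_hr": 0.90},
    {"name": "c3", "cores": 4,  "mem_gb": 16, "cost_per_hr": 0.15},
]

# Phase-1 constraints: at least 8 cores and 32 GB of memory.
constraints = [lambda r: r["cores"] >= 8, lambda r: r["mem_gb"] >= 32]

# Phase-2 heuristic: cheapest first (a performance heuristic could sort on
# cores instead, matching the cost-based vs performance-based choice above).
best = discover_resources(offers, constraints, score=lambda r: r["cost_per_hr"])[0]
```

Swapping the `score` function is all it takes to switch between the cost-based and performance-based heuristics the paper describes.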

A key pathological feature of late-onset Alzheimer's disease (LOAD) is the abnormal extracellular accumulation of the amyloid-β (Aβ) peptide. Thus, altered Aβ degradation could be a major contributor to the development of LOAD. Variants in the gene encoding the Aβ-degrading enzyme angiotensin-1 converting enzyme (ACE) therefore represent plausible candidates for association with LOAD pathology and risk. Following Alzgene meta-analyses of all published case-control studies, the ACE variants rs4291 and rs1800764 showed significant association with LOAD risk. Furthermore, ACE haplotypes are associated with both plasma ACE levels and LOAD risk. We tested three ACE variants (rs4291, rs4343, and rs1800764) for association with LOAD in ten Caucasian case-control populations (n = 8,212). No association was found using multiple logistic models (all p > 0.09). We found no population heterogeneity (all p > 0.38) or evidence for association with LOAD risk following meta-analysis of the ten populations for rs4343 (OR = 1.00), rs4291 (OR = 0.97), or rs1800764 (OR = 0.99). Although we found no haplotypic association in our complete dataset (p = 0.51), a significant global haplotypic p-value was observed in one population (p = 0.007) due to an association of the H3 haplotype (OR = 0.72, p = 0.02) and a trend towards an association of H4 (OR = 1.38, p = 0.09) and H7 (OR = 2.07, p = 0.08), although these did not survive Bonferroni correction. Previously reported associations of ACE variants with LOAD will be diminished following this study. At best, ACE variants have modest effect sizes, which are likely part of a complex interaction between genetic, phenotypic and pharmacological effects that would be undetected in traditional case-control studies.
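The per-variant pooling step can be illustrated with a standard inverse-variance fixed-effect meta-analysis of odds ratios on the log scale. The abstract does not state the exact model used, and the numbers below are illustrative only, not the study's data.

```python
import math

def fixed_effect_or(ors, ses):
    """Pool per-study odds ratios with inverse-variance weights on the
    log scale (fixed-effect model). ses are standard errors of log(OR)."""
    logs = [math.log(o) for o in ors]
    weights = [1.0 / se ** 2 for se in ses]
    pooled_log = sum(w * l for w, l in zip(weights, logs)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return math.exp(pooled_log), pooled_se

# Three hypothetical studies with near-null effects, as in the abstract's ORs.
pooled_or, se = fixed_effect_or([0.95, 1.02, 1.00], [0.10, 0.08, 0.12])
```

A pooled OR close to 1 with a small standard error is exactly the pattern the abstract reports for rs4343, rs4291, and rs1800764.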

Multiscale micro-mechanics theory is extensively used for predicting the material response and for damage analysis of a unidirectional lamina using a representative volume element (RVE). This paper presents an RVE-based approach to characterize the material response of a multi-fibre cross-ply laminate, considering the effect of matrix damage and fibre-matrix interfacial strength. The framework of homogenization theory for periodic media has been used for the analysis of a 'multi-fibre multi-layer representative volume element' (M2RVE) representing the cross-ply laminate. The non-homogeneous stress-strain fields within the M2RVE are related to the average stresses and strains by using the Gauss theorem and the Hill-Mandel strain energy equivalence principle. The interfacial bonding strength affects the in-plane shear stress-strain response significantly. The material response predicted by the M2RVE is in good agreement with the experimental results available in the literature. The maximum difference between the shear stress predicted using the M2RVE and the experimental results is ~15% for a bonding strength of 30 MPa at a strain value of 1.1%.
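The step relating the non-homogeneous RVE fields to macroscopic quantities is, at its core, a volume-weighted average over the RVE (the continuous Gauss-theorem version of the sum below). A minimal sketch with made-up element values, not the paper's finite-element data:

```python
def volume_average(field_values, element_volumes):
    """Volume-weighted average of an element-wise field over the RVE:
    <f> = sum(f_e * V_e) / sum(V_e), the discrete analogue of (1/V) ∫ f dV."""
    total_v = sum(element_volumes)
    return sum(f * v for f, v in zip(field_values, element_volumes)) / total_v

# Three hypothetical elements with stresses in MPa and relative volumes.
avg_stress = volume_average([100.0, 120.0, 80.0], [1.0, 2.0, 1.0])
```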

A binding protein displaying broad-spectrum cross-reactivity within the sulfonamide group was used in conjunction with a sulfonamide-specific sensor chip and a surface plasmon resonance biosensor to develop a rapid broad-spectrum screening assay for sulfonamides in porcine muscle. Results for 40 samples were available in just over 5 h after the completion of a simple sample preparation protocol. Twenty sulfonamide compounds were detected. Acetylated metabolites were not recognised by the binding protein. The limit of detection (mean minus three times the standard deviation, n = 20) was calculated to be 16.9 ng g(-1) in tissue samples. Intra-assay precision (n = 10) was calculated at 4.3 %CV for a sample spiked at 50 ng g(-1) with sulfamethazine, 3.6 %CV for a sample spiked at 100 ng g(-1) with sulfamethazine, 7.2 %CV for a sample spiked at 50 ng g(-1) with sulfadiazine and 3.1 %CV for a sample spiked at 100 ng g(-1) with sulfadiazine. Inter-assay precision (n = 3) was calculated at 9.7 %CV for a sample spiked at 50 ng g(-1) with sulfamethazine, 3.8 %CV for a sample spiked at 100 ng g(-1) with sulfamethazine, 3.5 %CV for a sample spiked at 50 ng g(-1) with sulfadiazine and 2.8 %CV for a sample spiked at 100 ng g(-1) with sulfadiazine. © 2004 Elsevier B.V. All rights reserved.
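The detection-limit and precision figures follow simple formulas: the LOD here is the mean response minus three standard deviations (appropriate for an inhibition-style assay where signal falls with analyte concentration), and %CV is the relative standard deviation of replicate measurements. A sketch with invented replicate readings, not the paper's raw data:

```python
import statistics

def limit_of_detection(negative_responses):
    """LOD as mean minus three standard deviations of negative-control
    responses (inhibition assay: signal decreases as analyte increases)."""
    m = statistics.mean(negative_responses)
    s = statistics.stdev(negative_responses)
    return m - 3 * s

def percent_cv(replicates):
    """Coefficient of variation of replicate measurements, in percent."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Illustrative replicate readings only.
cv = percent_cv([49.1, 50.6, 50.3, 48.8, 51.2])
lod = limit_of_detection([20.0, 21.5, 19.2, 20.8, 19.9, 21.0])
```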

Integrating evidence from multiple domains is useful in prioritizing disease candidate genes for subsequent testing. We ranked all known human genes (n = 3819) under linkage peaks in the Irish Study of High-Density Schizophrenia Families using three different evidence domains: 1) a meta-analysis of microarray gene expression results using the Stanley Brain collection, 2) a schizophrenia protein-protein interaction network, and 3) a systematic literature search. Each gene was assigned a domain-specific p-value and ranked after evaluating the evidence within each domain. For comparison with this ranking process, a large-scale candidate gene hypothesis was also tested by including genes with Gene Ontology terms related to neurodevelopment. Subsequently, genotypes of 3725 SNPs in 167 genes from a custom Illumina iSelect array were used to evaluate the top-ranked vs. hypothesis-selected genes. Seventy-three genes were both highly ranked and involved in neurodevelopment (category 1), while 42 and 52 genes were exclusive to neurodevelopment (category 2) or highly ranked (category 3), respectively. The most significant associations were observed in the genes PRKG1, PRKCE, and CNTN4, but no individual SNPs were significant after correction for multiple testing. Comparison of the approaches showed an excess of significant tests using the hypothesis-driven neurodevelopment category. Random selection of similarly sized genes from two independent genome-wide association studies (GWAS) of schizophrenia showed that this excess was unlikely to arise by chance. In a further meta-analysis of three GWAS datasets, four candidate SNPs reached nominal significance. Although gene ranking using integrated sources of prior information did not enrich for significant results in the current experiment, gene selection using an a priori hypothesis (neurodevelopment) was superior to random selection. As such, further development of gene-ranking strategies using more carefully selected sources of information is warranted.
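The abstract does not name the rule used to combine the domain-specific p-values into a single gene rank; one common choice, shown here purely as an illustration with hypothetical genes and p-values, is Fisher's combining statistic:

```python
import math

def fisher_statistic(pvalues):
    """Fisher's combining statistic: -2 * sum(ln p) over evidence domains.
    Larger values mean stronger combined evidence."""
    return -2.0 * sum(math.log(p) for p in pvalues)

# Hypothetical genes with one p-value per evidence domain
# (expression meta-analysis, protein interaction network, literature search).
genes = {
    "GENE_A": [0.01, 0.20, 0.05],
    "GENE_B": [0.50, 0.60, 0.40],
    "GENE_C": [0.03, 0.02, 0.10],
}

# Rank genes by combined evidence, strongest first.
ranking = sorted(genes, key=lambda g: fisher_statistic(genes[g]), reverse=True)
```

Note how GENE_C, with moderate evidence in every domain, outranks GENE_A, whose evidence is concentrated in one domain — the kind of cross-domain integration the study aimed for.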

In this paper we make use of the 9-year-old wave of the Growing Up in Ireland study to analyse multidimensional deprivation in Ireland. The Alkire and Foster adjusted head count ratio approach (AHCR; 2007, 2011a, 2011b) applied here constitutes a significant improvement on union and intersection approaches and allows for the decomposition of multidimensional poverty in terms of dimensions and sub-groups. The approach involves censoring the data such that deprivations count only for those above the specified multidimensional threshold, leading to a stronger set of interrelationships between deprivation dimensions. Our analysis shows that the composition of the adjusted head count ratio is influenced by a range of socio-economic factors. For less-favoured socio-economic groups, dimensions relating to material deprivation are disproportionately represented, while for the more advantaged groups, those relating to behavioural and emotional issues and social interaction play a greater role. Notwithstanding such variation in composition, our analysis showed that the AHCR varied systematically across categories of household type, and across the social class, education and age group of the primary caregiver. Furthermore, these variables combined in a cumulative manner. The most systematic variation was in the head count of those above the multidimensional threshold rather than in intensity conditional on being above that cut-off point. Without seeking to arbitrate on the relative value of composite indices versus disaggregated profiles, our analysis demonstrates that there is much to be gained from adopting an approach with clearly understood axiomatic properties. Doing so allows one to evaluate the consequences of the measurement strategy employed for the understanding of levels of multidimensional deprivation, the nature of such deprivation profiles and socio-economic risk patterns. Ultimately, it permits an informed assessment of the strengths and weaknesses of the particular choices made.
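The censored-counting core of the Alkire-Foster adjusted head count ratio (M0 = H × A, headcount times average intensity among the poor) can be sketched in a few lines; the deprivation matrix below is a toy example, not Growing Up in Ireland data.

```python
def adjusted_headcount_ratio(deprivation_matrix, k):
    """Alkire-Foster M0 = H * A with equal dimension weights.
    Rows are people, columns are 0/1 deprivation indicators; deprivations
    of anyone deprived in fewer than k dimensions are censored (ignored)."""
    n = len(deprivation_matrix)
    d = len(deprivation_matrix[0])
    poor = [row for row in deprivation_matrix if sum(row) >= k]
    if not poor:
        return 0.0
    h = len(poor) / n                                    # headcount ratio H
    a = sum(sum(row) for row in poor) / (len(poor) * d)  # average intensity A
    return h * a

# Four people, three dimensions, multidimensional cutoff k = 2:
# person 2 is deprived in one dimension only, so that deprivation is censored.
m0 = adjusted_headcount_ratio(
    [[1, 1, 0],
     [0, 0, 1],
     [1, 1, 1],
     [0, 0, 0]], k=2)
```

Here H = 2/4 and A = 5/6, so M0 = 5/12; the decomposability by dimension and sub-group mentioned above follows from this additive structure.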

In this paper, a multi-level wordline driver scheme is presented to improve SRAM read and write stability while lowering power consumption during hold operation. The proposed circuit applies a shaped wordline voltage pulse during read mode and a boosted wordline pulse during write mode. During read, the applied shaped pulse is held at the nominal voltage for a short period of time, whereas for the remaining access time the wordline voltage is reduced to a lower level. This pulse results in an improved read noise margin without any degradation in access time, which is explained by examining the dynamic and nonlinear behavior of the SRAM cell. Furthermore, during hold mode, the wordline voltage starts from a negative value and rises to zero, resulting in a lower leakage current compared to a conventional SRAM. Our simulations using the TSMC 65 nm process show that the proposed wordline driver yields a 2X improvement in static read noise margin, while the write margin is improved by 3X. In addition, the total leakage of the proposed SRAM is reduced by 10%, and the total power is improved by 12% in the worst-case scenario of a single SRAM cell. The total area penalty is 10% for a 128Kb standard SRAM array.

We report the characterization of a new eight-allele microsatellite (D3S621) isolated from a human chromosome 3 library. Two-point and multi-locus genetic linkage analyses have shown D3S621 to co-segregate with the previously mapped RP4 (theta m = 0.12, Zm = 4.34) and with other genetic markers on the long arm of the chromosome, including D3S14 (R208) (theta m = 0.00, Zm = 15.10), D3S47 (C17) (theta m = 0.11, Zm = 4.95), Rho (theta m = 0.07, Zm = 1.37), D3S21 (L182) (theta m = 0.07, Zm = 2.40) and D3S19 (U1) (theta m = 0.13, Zm = 2.78). This highly informative marker, with a polymorphic information content of 0.78, should be of considerable value in extending the linkage data for autosomal dominant retinitis pigmentosa with respect to loci on the long arm of chromosome 3.
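The Zm values above are LOD scores, Z(θ) = log10[L(θ)/L(0.5)]. For the simplest phase-known case with r recombinants in n informative meioses this reduces to a one-line likelihood ratio; the counts below are illustrative only, not the study's pedigree data.

```python
import math

def lod_score(theta, recombinants, meioses):
    """Two-point LOD score for phase-known meioses:
    Z(theta) = log10[ theta**r * (1 - theta)**(n - r) / 0.5**n ]."""
    n, r = meioses, recombinants
    if theta == 0.0 and r > 0:
        return float("-inf")  # any recombinant rules out theta = 0
    likelihood = theta ** r * (1.0 - theta) ** (n - r)  # note 0**0 == 1
    return math.log10(likelihood / 0.5 ** n)

# Illustrative counts: 2 recombinants observed in 20 informative meioses,
# scored at a recombination fraction theta of 0.1.
z = lod_score(0.1, 2, 20)
```

A score above 3 (as here, z ≈ 3.2) is the conventional threshold for declaring linkage, the standard against which the reported Zm values are read.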

Purpose
The study contributes to the literature on public value and performance by examining politicians' and managers' perspectives, investigating the importance they attach to the different facets of performance information (i.e. budgetary, accrual-based and non-financial information (NFI)).

Design/methodology/approach
We survey politicians and managers in all Italian municipalities of at least 80,000 inhabitants.

Findings
Overall, NFI is more appreciated than financial information (FI). Moreover, budgetary accounting is preferred to accrual accounting. Politicians’ and managers’ preferences are generally aligned.

Research limitations/implications
NFI as a measure of public value is not an alternative, but rather a complement, to FI. The latter remains a fundamental element of public sector accounting due to its role in resource allocation and control.

Practical implications
The preference for NFI over FI, and for budgetary over accrual accounting, suggests that the current predominant emphasis on (accrual-based) financial reporting might be misplaced.

Originality/value
Public value and performance are multi-faceted concepts. They can be captured by different types of information and evaluated according to different criteria, which also depend on the category of stakeholders or users assessing public performance. So far, most of the literature has considered the financial and non-financial facets of performance as virtually separate. Similarly, in practice, financial management tends to be decoupled from non-financial performance management. However, this research shows that only by considering their joint interactions can we achieve an accurate representation of what public value really is.