990 results for Virtual elements


Relevance:

20.00%

Publisher:

Abstract:

Yeoman, A., Urquhart, C. & Sharp, S. (2003). Moving Communities of Practice forward: the challenge for the National electronic Library for Health and its Virtual Branch Libraries. Health Informatics Journal, 9(4), 241-252. Previously appeared as a conference paper at the iSHIMR2003 conference (Proceedings of the Eighth International Symposium on Health Information Management Research, June 1-3, 2003, Borås, Sweden). Sponsorship: NHS Information Authority/National electronic Library for Health.

Relevance:

20.00%

Publisher:

Abstract:

This paper describes an experiment developed to study the performance of virtual agent animated cues within digital interfaces. Increasingly, agents are used in virtual environments as part of the branding process and to guide user interaction. However, the level of agent detail required to establish and enhance efficient allocation of attention remains unclear. Although complex agent motion is now possible, it is costly to implement and so should only be routinely implemented if a clear benefit can be shown. Previous methods of assessing the effect of gaze-cueing as a solution to scene complexity have relied principally on two-dimensional static scenes and manual peripheral inputs. Two experiments were run to address the question of agent cues in human-computer interfaces. Both experiments measured the efficiency of agent cues by analyzing participant responses, by gaze in the first and by touch in the second. In the first experiment, an eye-movement recorder was used to directly assess the immediate overt allocation of attention by capturing the participant's eye fixations following presentation of a cueing stimulus. We found that a fully animated agent could speed up user interaction with the interface. When user attention was directed using a fully animated agent cue, users responded 35% faster than with stepped 2-image agent cues and 42% faster than with a static 1-image cue. The second experiment recorded participant responses on a touch screen using the same agent cues. Analysis of the touch inputs confirmed the results of the gaze experiment: the fully animated agent again produced the shortest response times, although the differences between conditions were smaller. Responses to the fully animated agent were 17% and 20% faster than with the 2-image and 1-image cues, respectively. These results inform techniques aimed at engaging users' attention in complex scenes, such as computer games and digital transactions within public or social interaction contexts, by demonstrating the benefits of dynamic gaze and head cueing directly on users' eye movements and touch responses.

Relevance:

20.00%

Publisher:

Abstract:

Urquhart, C. (editor for JUSTEIS team), Spink, S., Thomas, R., Yeoman, A., Durbin, J., Turner, J., Armstrong, A., Lonsdale, R. & Fenton, R. (2003). JUSTEIS (JISC Usage Surveys: Trends in Electronic Information Services) Strand A: survey of end users of all electronic information services (HE and FE), with Action research report. Final report 2002/2003 Cycle Four. Aberystwyth: Department of Information Studies, University of Wales Aberystwyth with Information Automation Ltd (CIQM). Sponsorship: JISC

Relevance:

20.00%

Publisher:

Abstract:

IEEE Transactions on Knowledge and Data Engineering, vol. 15, no. 5, pp. 1338-1343, 2003.

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a new approach to window-constrained scheduling, suitable for multimedia and weakly-hard real-time systems. We originally developed an algorithm, called Dynamic Window-Constrained Scheduling (DWCS), that attempts to guarantee that no more than x out of y deadlines are missed for real-time jobs such as periodic CPU tasks or delay-constrained packet streams. While DWCS is capable of generating a feasible window-constrained schedule that utilizes 100% of resources, it requires all jobs to have the same request periods (or intervals between successive service requests). We describe a new algorithm, called Virtual Deadline Scheduling (VDS), that provides window-constrained service guarantees to jobs with potentially different request periods, while still maximizing resource utilization. VDS attempts to service m out of k job instances by their virtual deadlines, which may be some finite time after the corresponding real-time deadlines. Even so, VDS is capable of outperforming DWCS and similar algorithms when servicing jobs with potentially different request periods. Additionally, VDS is able to limit the extent to which a fraction of all job instances are serviced late. Results from simulations show that VDS can provide better window-constrained service guarantees than other related algorithms, while still having as good or better delay bounds for all scheduled jobs. Finally, an implementation of VDS in the Linux kernel compares favorably against DWCS for a range of scheduling loads.
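
As a rough Python illustration of the central idea, the sketch below defers each job instance's deadline by however many instances its m-out-of-k window can still afford to skip, then services the job with the earliest such "virtual deadline". The class and parameter names are assumptions made for this example; it is a toy sketch, not the published VDS algorithm.

```python
# Toy sketch of m-out-of-k virtual deadlines, not the published VDS algorithm.
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    m: int           # instances that must be serviced...
    k: int           # ...out of every k consecutive instances
    period: float    # interval between successive instances
    release: float   # release time of the current instance
    missed: int = 0  # instances already missed in the current window

    def real_deadline(self) -> float:
        return self.release + self.period

    def virtual_deadline(self) -> float:
        # Instances this window that may still be skipped without violating m-out-of-k.
        slack = (self.k - self.m) - self.missed
        # The current instance can be deferred by up to `slack` periods past its real deadline.
        return self.real_deadline() + max(slack, 0) * self.period

def pick_next(jobs: list[Job]) -> Job:
    """Earliest-virtual-deadline-first among pending jobs."""
    return min(jobs, key=lambda j: j.virtual_deadline())

if __name__ == "__main__":
    jobs = [Job("video", m=3, k=4, period=10.0, release=0.0, missed=1),
            Job("audio", m=1, k=2, period=4.0, release=0.0)]
    print(pick_next(jobs).name)  # "audio": its virtual deadline (8.0) precedes video's (10.0)
```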

Relevance:

20.00%

Publisher:

Abstract:

With the increased use of "Virtual Machines" (VMs) as vehicles that isolate applications running on the same host, it is necessary to devise techniques that enable multiple VMs to share underlying resources both fairly and efficiently. To that end, one common approach is to deploy complex resource management techniques in the hosting infrastructure. Alternatively, in this paper, we advocate the use of self-adaptation in the VMs themselves based on feedback about resource usage and availability. Consequently, we define a "Friendly" VM (FVM) to be a virtual machine that adjusts its demand for system resources so that they are both efficiently and fairly allocated to competing FVMs. Such properties are ensured using one of many provably convergent control rules, such as AIMD. By adopting this distributed, application-based approach to resource management, it is not necessary to make assumptions about the underlying resources or about the requirements of FVMs competing for these resources. To demonstrate the elegance and simplicity of our approach, we present a prototype implementation of our FVM framework in User-Mode Linux (UML), an implementation that consists of less than 500 lines of code changes to UML. We present an analytic, control-theoretic model of FVM adaptation, which establishes convergence and fairness properties. These properties are also backed up with experimental results using our prototype FVM implementation.
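
The AIMD control rule mentioned above can be sketched in a few lines of Python: a VM's demand grows additively while the shared resource appears uncontended and is cut multiplicatively when contention is signalled, which is the mechanism that drives competing adopters toward a fair allocation. The parameter names and the feedback signal below are illustrative assumptions, not the FVM framework's actual interface.

```python
# Minimal AIMD sketch; alpha, beta, and the contention signal are assumptions.
def aimd_step(demand: float, contended: bool,
              alpha: float = 1.0, beta: float = 0.5,
              cap: float = 100.0) -> float:
    if contended:
        return demand * beta           # multiplicative decrease under contention
    return min(demand + alpha, cap)    # additive increase otherwise

# Toy trace: two "friendly" VMs starting from unequal demands drift toward a fair share.
d1, d2 = 80.0, 10.0
for _ in range(50):
    contended = (d1 + d2) > 100.0      # feedback: combined demand exceeds capacity
    d1 = aimd_step(d1, contended)
    d2 = aimd_step(d2, contended)
print(round(d1, 1), round(d2, 1))
```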

Relevance:

20.00%

Publisher:

Abstract:

The Java programming language has been widely described as secure by design. Nevertheless, a number of serious security vulnerabilities have been discovered in Java, particularly in the component known as the Bytecode Verifier. This paper describes a method for representing Java security constraints using the Alloy modeling language. It further describes a system for performing a security analysis on any block of Java bytecodes by converting the bytes into relation initializers in Alloy. Any counterexamples found by the Alloy analyzer correspond directly to insecure code. Analysis of a real-world malicious applet is given to demonstrate the efficacy of the approach.
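
Purely as a hypothetical sketch of the encoding step, the Python below pairs each byte of a bytecode stream with an opcode mnemonic, producing tuples of the kind that could seed a relation in a formal model. The opcode table is truncated, operand bytes are not decoded, and the paper's actual Alloy relation layout is not reproduced here.

```python
# Hypothetical encoding sketch; truncated opcode table, operands left undecoded.
OPCODES = {0x2a: "aload_0", 0xb7: "invokespecial", 0xb1: "return"}

def bytecode_to_tuples(code: bytes) -> list[tuple[int, str]]:
    """Pair each byte offset with an opcode mnemonic."""
    return [(i, OPCODES.get(b, f"op_{b:#04x}")) for i, b in enumerate(code)]

# A toy byte sequence (operand bytes omitted for brevity):
print(bytecode_to_tuples(bytes([0x2a, 0xb7, 0xb1])))
# -> [(0, 'aload_0'), (1, 'invokespecial'), (2, 'return')]
```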

Relevance:

20.00%

Publisher:

Abstract:

The Internet and World Wide Web have had, and continue to have, an incredible impact on our civilization. These technologies have radically influenced the way that society is organised and the manner in which people around the world communicate and interact. The structure and function of individual, social, organisational, economic and political life begin to resemble the digital network architectures upon which they are increasingly reliant. It is increasingly difficult to imagine how our ‘offline’ world would look or function without the ‘online’ world; it is becoming less meaningful to distinguish between the ‘actual’ and the ‘virtual’. Thus, the major architectural project of the twenty-first century is to “imagine, build, and enhance an interactive and ever changing cyberspace” (Lévy, 1997, p. 10). Virtual worlds are at the forefront of this evolving digital landscape. Virtual worlds have “critical implications for business, education, social sciences, and our society at large” (Messinger et al., 2009, p. 204). This study focuses on the possibilities of virtual worlds in terms of communication, collaboration, innovation and creativity. The concept of knowledge creation is at the core of this research. The study shows that scholars increasingly recognise that knowledge creation, as a socially enacted process, goes to the very heart of innovation. However, efforts to build upon these insights have struggled to escape the influence of the information processing paradigm of old and have failed to move beyond the persistent but problematic conceptualisation of knowledge creation in terms of tacit and explicit knowledge. Based on these insights, the study leverages extant research to develop the conceptual apparatus necessary to carry out an investigation of innovation and knowledge creation in virtual worlds. The study derives and articulates a set of definitions (of virtual worlds, innovation, knowledge and knowledge creation) to guide research. The study also leverages a number of extant theories in order to develop a preliminary framework to model knowledge creation in virtual worlds. Using a combination of participant observation and six case studies of innovative educational projects in Second Life, the study yields a range of insights into the process of knowledge creation in virtual worlds and into the factors that affect it. The study’s contributions to theory are expressed as a series of propositions and findings and are represented as a revised and empirically grounded theoretical framework of knowledge creation in virtual worlds. These findings highlight the importance of prior related knowledge and intrinsic motivation in terms of shaping and stimulating knowledge creation in virtual worlds. At the same time, they highlight the importance of meta-knowledge (knowledge about knowledge) in terms of guiding the knowledge creation process whilst revealing the diversity of behavioural approaches actually used to create knowledge in virtual worlds. This theoretical framework is itself one of the chief contributions of the study, and the analysis explores how it can be used to guide further research in virtual worlds and on knowledge creation. The study’s contributions to practice are presented as an actionable guide to stimulate knowledge creation in virtual worlds. This guide utilises a theoretically based classification of four knowledge-creator archetypes (the sage, the lore master, the artisan, and the apprentice) and derives an actionable set of behavioural prescriptions for each archetype. The study concludes with a discussion of its implications for future research.

Relevance:

20.00%

Publisher:

Abstract:

This Portfolio of Exploration (PoE) tracks a transformative learning developmental journey directed at changing meaning-making structures and mental models within an innovation practice. The explicit purpose of the Portfolio is to develop new and different perspectives that enable the handling of new and more complex phenomena through self-transformation and increased emotional intelligence development. The Portfolio provides a response to the question: ‘What are the key determinants that enable a Virtual Team (VT) to flourish, where flourishing means developing and delivering on the firm’s innovative imperatives?’ Furthermore, the PoE is structured as an investigation into how higher-order meaning making promotes ‘entrepreneurial services’ within an intra-firm virtual team, with a secondary aim of identifying how reasoning about trust influences KGPs to exchange knowledge. I have developed a framework which specifically focuses on the effectiveness of any firm’s Virtual Team (VT) through transforming the meaning making of the VT participants. I hypothesized that it is the way KGPs make meaning (reasoning about trust) which differentiates the firm as a growing firm in the sense of Penrosean resources: ‘inducement to expand and a limit of expansion’ (1959). Reasoning about trust is used as a higher-order meaning-making concept in line with Kegan’s (1994) conception of complex meaning making, which combines ideas and data in ways that transform meaning and leads participants to find new ways of knowledge generation. Simply put, it is the VT participants who develop higher-order meaning making who hold the capabilities to transform the firm from within, providing a unique competitive advantage that enables the firm to grow.

Relevance:

20.00%

Publisher:

Abstract:

The computational detection of regulatory elements in DNA is a difficult but important problem impacting our progress in understanding the complex nature of eukaryotic gene regulation. Attempts to utilize cross-species conservation for this task have been hampered both by evolutionary changes of functional sites and poor performance of general-purpose alignment programs when applied to non-coding sequence. We describe a new and flexible framework for modeling binding site evolution in multiple related genomes, based on phylogenetic pair hidden Markov models which explicitly model the gain and loss of binding sites along a phylogeny. We demonstrate the value of this framework for both the alignment of regulatory regions and the inference of precise binding-site locations within those regions. As the underlying formalism is a stochastic, generative model, it can also be used to simulate the evolution of regulatory elements. Our implementation is scalable in terms of numbers of species and sequence lengths and can produce alignments and binding-site predictions with accuracy rivaling or exceeding current systems that specialize in only alignment or only binding-site prediction. We demonstrate the validity and power of various model components on extensive simulations of realistic sequence data and apply a specific model to study Drosophila enhancers in as many as ten related genomes and in the presence of gain and loss of binding sites. Different models and modeling assumptions can be easily specified, thus providing an invaluable tool for the exploration of biological hypotheses that can drive improvements in our understanding of the mechanisms and evolution of gene regulation.
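
A minimal sketch, under assumed notation, of the gain/loss component the abstract describes: a binding site is either absent (0) or present (1), gained at rate lam and lost at rate mu along a branch of length t, which gives the standard two-state transition probabilities computed below. This illustrates the idea only; it is not the paper's full phylogenetic pair HMM.

```python
# Two-state gain/loss model along a branch; lam, mu, and t are assumed notation.
import math

def gain_loss_probs(lam: float, mu: float, t: float):
    """Return the 2x2 transition matrix P(t) for the absent(0)/present(1) chain."""
    total = lam + mu
    decay = math.exp(-total * t)
    pi1 = lam / total                   # stationary probability of "present"
    p01 = pi1 * (1.0 - decay)           # absent  -> present (gain over the branch)
    p10 = (1.0 - pi1) * (1.0 - decay)   # present -> absent  (loss over the branch)
    return [[1.0 - p01, p01],
            [p10, 1.0 - p10]]

P = gain_loss_probs(lam=0.2, mu=0.5, t=1.0)
print(f"P(gain over branch) = {P[0][1]:.3f}, P(loss over branch) = {P[1][0]:.3f}")
```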

Relevance:

20.00%

Publisher:

Abstract:

In the ancient and acidic Ultisol soils of the Southern Piedmont, USA, we studied changes in trace element biogeochemistry over four decades, a period during which formerly cultivated cotton fields were planted with pine seedlings that grew into mature forest stands. In 16 permanent plots, we estimated 40-year accumulations of trace elements in forest biomass and O horizons (between 1957 and 1997), and changes in bioavailable soil fractions indexed by extractions of 0.05 mol/L HCl and 0.2 mol/L acid ammonium oxalate (AAO). Element accumulations in 40-year tree biomass plus O horizons totaled 0.9, 2.9, 4.8, 49.6, and 501.3 kg/ha for Cu, B, Zn, Mn, and Fe, respectively. In response to this forest development, samples of the upper 0.6 m of mineral soil archived in 1962 and 1997 followed one of three patterns. (1) Extractable B and Mn were significantly depleted, by -4.1 and -57.7 kg/ha with AAO, depletions comparable to accumulations in biomass plus O horizons, 2.9 and 49.6 kg/ha, respectively. Tree uptake of B and Mn from mineral soil greatly outpaced resupplies from atmospheric deposition, mineral weathering, and deep-root uptake. (2) Extractable Zn and Cu changed little during forest growth, indicating that nutrient resupplies kept pace with accumulations by the aggrading forest. (3) Oxalate-extractable Fe increased substantially during forest growth, by 275.8 kg/ha, about 10-fold more than accumulations in tree biomass (28.7 kg/ha). The large increases in AAO-extractable Fe in surficial 0.35-m mineral soils were accompanied by substantial accretions of Fe in the forest's O horizon, by 473 kg/ha, amounts that dwarfed inputs via litterfall and canopy throughfall, indicating that forest Fe cycling is qualitatively different from that of other macro- and micronutrients. Bioturbation of surficial forest soil layers cannot account for these fractions and transformations of Fe, and we hypothesize that the secondary forest's large inputs of organic additions over four decades have fundamentally altered soil Fe oxides, potentially altering the bioavailability and retention of macro- and micronutrients, contaminants, and organic matter itself. The wide range of responses among the ecosystem's trace elements illustrates the great dynamics of the soil system over time scales of decades.

Relevance:

20.00%

Publisher:

Abstract:

Virtual contemporaries, Sergei Rachmaninoff and Sergei Prokofiev were both pianists steeped in the traditions of Russian pianism; recordings of both pianist-composers playing their own works are available. Although the composers can be perceived as having little in common, in fact both composed in classical forms, both had a strong lyrical sense and both had an unbreakable connection with their Russian heritage. Rachmaninoff was the last great representative of Russian late Romanticism as well as one of the finest pianists of his generation. He cultivated a sweepingly passionate and melodious idiom, with pronounced lyrical quality, expressive breadth and structural ingenuity. Prokofiev, on the other hand, tried to push the Russian Romantic traditions to a point of exacerbation and caricature before experimenting with various kinds of modernism. Stressing simplicity, he helped invent Neo-Classicism. His melodies are essentially tonal with wide skips and sweeping long lines. Harmonically, he used triadic harmony full of dissonances, strange inversions, unusual spacings, and jarring juxtapositions. Writing in classical forms, he incorporated rhythmic vitality and lyrical elements into his music. I have chosen to perform five works by each composer, written in a variety of genres, including the sonata, the toccata, variations, and the concerto. I have also divided the pieces into three recital programs to show the idiosyncratic characteristics of the composers. I have endeavored to select pieces based on the technical and artistic challenges that they offer, thereby allowing me to grow as a pianist and an artist. My goal is to gain a thorough understanding of not only the pieces but also the musical styles of both composers.

Relevance:

20.00%

Publisher:

Abstract:

© 2015 IEEE. In virtual reality applications, there is an aim to provide real-time graphics which run at high refresh rates. However, there are many situations in which this is not possible due to simulation or rendering issues. When running at low frame rates, several aspects of the user experience are affected. For example, each frame is displayed for an extended period of time, causing a high persistence image artifact. The effect of this artifact is that movement may lose continuity, and the image jumps from one frame to another. In this paper, we discuss our initial exploration of the effects of high persistence frames caused by low refresh rates and compare this condition to high frame rates and to a technique we developed to mitigate the effects of low frame rates. In this technique, the low frame rate simulation images are displayed with low persistence by blanking out the display during the extra time such an image would otherwise be displayed. In order to isolate the visual effects, we constructed a simulator for low and high persistence displays that does not affect input latency. A controlled user study comparing the three conditions for the tasks of 3D selection and navigation was conducted. Results indicate that the low persistence display technique may not negatively impact user experience or performance as compared to the high persistence case. Directions for future work on the use of low persistence displays for low frame rate situations are discussed.
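
A hypothetical sketch of the blanking idea in Python: when the simulation produces frames more slowly than the display refreshes, show each fresh frame for a single refresh and blank the remaining refreshes instead of persisting the stale image. The function name and rates are assumptions for illustration, not the paper's implementation.

```python
# Toy low-persistence refresh scheduler; names and rates are illustrative assumptions.
def schedule_refreshes(sim_hz: int, refresh_hz: int, n_refreshes: int):
    """For each display refresh, decide whether to show a fresh simulation
    frame or blank the panel rather than persist the previous image."""
    shown = 0
    actions = []
    for r in range(n_refreshes):
        # A new simulation frame is due once r / refresh_hz >= shown / sim_hz.
        if r * sim_hz >= shown * refresh_hz:
            actions.append((r, "show"))
            shown += 1
        else:
            actions.append((r, "blank"))  # low persistence: leave the panel dark
    return actions

# 15 Hz simulation on a 60 Hz display: each "show" is followed by three "blank"s.
print(schedule_refreshes(sim_hz=15, refresh_hz=60, n_refreshes=8))
```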

Relevance:

20.00%

Publisher:

Abstract:

This dissertation project explores some of the technical and musical challenges that face pianists in a collaborative role, specifically those challenges that may be considered virtuosic in nature. The material was chosen from the works of Rachmaninoff and Ravel because of the technically and musically demanding yet idiomatic piano writing. This virtuosic piano writing also extends into the collaborative repertoire. The pieces were also chosen to demonstrate these virtuosic elements in a wide variety of settings. Solo piano pieces were chosen to provide a point of departure, and the programmed works ranged from vocal works and two-piano music to sonatas and a piano trio. The recitals were arranged to demonstrate as much contrast as possible while being grouped by composer. The first recital was performed on April 24, 2009. This recital featured five songs of Rachmaninoff, as well as three solo piano preludes and his Suite No. 2 for two pianos. The second recital occurred on November 16, 2010. This recital featured the music of both Rachmaninoff and Ravel, as well as a short lecture introducing the solo work “Ondine” from Gaspard de la nuit by Ravel. Following the lecture were the Cinq mélodies populaires grecques, and the program closed with the substantial Rachmaninoff Sonata for Cello and Piano. The final program was given on October 10, 2011. This recital featured the music of Ravel, and it included his Sonata for Violin and Piano, the Debussy Nocturnes transcribed for two pianos by Ravel, and the Piano Trio. The inclusion of a transcription of a work by another composer highlights Ravel’s particular style of writing for the piano. All of these recitals were performed at the Gildenhorn Recital Hall in the Clarice Smith Performing Arts Center at the University of Maryland. The recitals are recorded on compact discs, which can be found in the Digital Repository at the University of Maryland (DRUM).

Relevance:

20.00%

Publisher:

Abstract:

At the beginning of the twentieth century, composers Béla Bartók and Zoltán Kodály collected thousands of folksongs from the rural regions of Hungary. In an effort to preserve a part of their culture that they feared would be lost, they not only transcribed and catalogued these folksongs, but also incorporated the folk traditions they encountered into their own compositional style. This dissertation deals with violin music written by Bartók, Kodály and their Hungarian contemporaries that have in common the use of rhythms, modes, melodies, figurations and playing techniques sourced in folk traditions. The use of the Hungarian folk idiom in classical music was not exclusive to the twentieth century. From the late eighteenth century until the first decades of the twentieth century, composers utilized aspects of a popular eighteenth-century form of Hungarian folk music called verbunkos. What makes the use of folk music unique in the twentieth century is that, thanks to the work of Bartók and Kodály, composers found inspiration in the more authentic “peasant music.” Unlike the popular, urban verbunkos music, peasant music was the product of the more secluded village-music tradition, largely untouched by the influences of city life. In addition to stimulating a new focus on peasant music, Bartók and Kodály fully assimilated the folk idiom into their compositional toolkits, creating a new style of folk-inspired art music that influenced a generation of Hungarian composers. The new style included characteristic elements of both peasant music and the verbunkos tradition, such as ancient modes and scales, accompanimental and melodic rhythmic patterns, ornamentation, and phrase structures sourced in folk song. To demonstrate the implementation of the folk idiom by twentieth-century Hungarian composers, three recital programs were given at the University of Maryland that included works by Béla Bartók, Sándor Veress, Leo Weiner, Zoltán Kodály, Ernő Dohnányi, Zoltán Székely and György Kurtág. The works can be divided into two main categories: settings or transcriptions of folk material (e.g. Bartók’s Hungarian Folksongs) and compositions using classical forms that include the Hungarian folk idiom (e.g. Bartók’s Contrasts). Recital collaborators include Li-Tan Hsu, Evelyn Elsing, Elizabeth Brown, Shelby Sender and Samantha Angelo.